First Test Answers Math 130 Linear Algebra
D Joyce, October 2012

Scale. 86–100 A, 71–85 B, 50–70 C. Median 88 (A-).

1. [10] Name two subspaces of R^2 whose union is not a subspace of R^2. (No need to give a proof.)

There aren't many subspaces of R^2. There's the trivial subspace that consists of the origin 0 alone, and there's the entire plane R^2, but those won't help in giving an answer here. The rest of the subspaces of R^2 are the straight lines through 0. Any two of these will do. A line through the origin is a subspace of the plane, yet the union of two lines is not a subspace. The reason is that if you add a vector on one line to a vector on the other, their sum will not be on either line. That means the union is not a subspace.

2. [12] Explain in a sentence or two why it is the case that if S is a linearly independent set of vectors in a vector space, then any subset S' of S is also a linearly independent set of vectors in that vector space.

A set of vectors is independent if the only linear combination of them that is 0 is the trivial linear combination. If that's true for linear combinations in a set S, it will automatically be true for any linear combination in a subset S' of S, since a linear combination of S' is a linear combination of S. An explanation will have to describe what it means for a set of vectors to be linearly independent (or, if you prefer, what it means for them to be dependent). Without the connection to the meaning of the concept, an explanation would be incomplete.

3. [25]

a. [5] Write down the augmented matrix for this system.

    [ 1   1   3   2 |  7 ]
    [ 2  -1   0   4 |  8 ]
    [ 0   3   6   0 |  6 ]

b. [10] Use row operations to convert this matrix to echelon or reduced echelon form. (Your choice, but show your work.)

    [ 1   1   3   2 |  7 ]
    [ 0  -3  -6   0 | -6 ]
    [ 0   3   6   0 |  6 ]

    [ 1   1   3   2 |  7 ]
    [ 0   1   2   0 |  2 ]
    [ 0   0   0   0 |  0 ]

    [ 1   0   1   2 |  5 ]
    [ 0   1   2   0 |  2 ]
    [ 0   0   0   0 |  0 ]

c. [10] Describe all the solutions for the original system of linear equations.

One way to describe the solutions is to say the general solution is

    (x, y, z, w) = (5 - z - 2w, 2 - 2z, z, w)

where z and w can be any scalars. You could also write the general solution as

    (x, y, z, w) = (5, 2, 0, 0) + z(-1, -2, 1, 0) + w(-2, 0, 0, 1)

where, again, z and w can be any scalars. You could also parametrize the solutions in terms of x and y instead of z and w, or, for that matter, any two of the four variables x, y, z, and w.

4. [25] Using only the axioms of vector spaces pertaining to addition prove the law of cancellation for addition: u + v = u + w implies v = w.
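The row reduction in problem 3 can be checked mechanically. Here is a small sympy sketch; the entries of the augmented matrix are inferred from the reduction steps and the general solution (the minus signs were evidently lost in typesetting), so treat the matrix as a reconstruction rather than the test's official statement:

```python
from sympy import Matrix, symbols, expand

# Augmented matrix for problem 3 (signs inferred from the reduced form).
M = Matrix([[1,  1, 3, 2, 7],
            [2, -1, 0, 4, 8],
            [0,  3, 6, 0, 6]])

R, pivots = M.rref()
print(R)        # rows (1, 0, 1, 2, 5), (0, 1, 2, 0, 2), (0, 0, 0, 0, 0)
print(pivots)   # (0, 1): x and y are the pivot variables

# Verify the general solution (x, y, z, w) = (5 - z - 2w, 2 - 2z, z, w).
z, w = symbols('z w')
A, b = M[:, :4], M[:, 4]
sol = Matrix([5 - z - 2*w, 2 - 2*z, z, w])
print((A * sol - b).applyfunc(expand))  # zero vector: the solution checks out
```

The free variables z and w correspond exactly to the non-pivot columns reported by rref.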
The axioms of vector spaces pertaining to addition are

1. Vector addition is commutative: v + w = w + v for all vectors v and w;

2. Vector addition is associative: (u + v) + w = u + (v + w) for all vectors u, v, and w;

3. There is a vector, denoted 0 and called the zero vector, such that v + 0 = v = 0 + v for each vector v;

4. For each vector v, there is another vector, denoted -v, such that v + (-v) = 0.

In your proof, do not use the operation of subtraction, but use the fourth axiom as it is stated.

You can write your proof in several different forms. I'll give a narrative proof, a two-column proof, and an equational proof. The proofs I have here aren't the only ones. There are different orders in which you can use the axioms, and the equations don't have to be the ones shown here.

Narrative proof. Suppose that

    u + v = u + w.

By axiom 4, there is a vector -u. Add it to both sides of the equation on the left to get

    (-u) + (u + v) = (-u) + (u + w).

By associativity, axiom 2, we can change the placement of parentheses to get

    ((-u) + u) + v = ((-u) + u) + w,

and by commutativity, axiom 1, that becomes

    (u + (-u)) + v = (u + (-u)) + w.

But axiom 4 says u + (-u) = 0, so that equation simplifies to

    0 + v = 0 + w,

which by axiom 3 further simplifies to v = w. Thus, we have shown that u + v = u + w implies v = w.  q.e.d.

Two-column proof. You give the assertions in the left column and abbreviated reasons in the right column. These require the reader to interpret what you say. In this particular proof, each line follows from the previous one, but in more complicated proofs a line may follow from more than one of the previous lines, and for those, you should state which lines are used. Also, this is a proof of a simple implication of the form P => Q, so it's enough to state P on the first line and end with Q.

    1. u + v = u + w                        Given
    2. (-u) + (u + v) = (-u) + (u + w)      Ax 4
    3. ((-u) + u) + v = ((-u) + u) + w      Ax 2
    4. (u + (-u)) + v = (u + (-u)) + w      Ax 1
    5. 0 + v = 0 + w                        Ax 4
    6. v = w                                Ax 3

Equational proof. Some proofs, like this one, can be written as one long, continued equation. They're easy to check, but they usually have to be constructed from other proofs. When you read one you're often puzzled as to where it came from. Here's what this proof looks like as an equation where each equality has an associated justification.

    v = 0 + v              by axiom 3
      = (u + (-u)) + v     by axiom 4
      = ((-u) + u) + v     by axiom 1
      = (-u) + (u + v)     by axiom 2
      = (-u) + (u + w)     given
      = ((-u) + u) + w     by axiom 2
      = (u + (-u)) + w     by axiom 1
      = 0 + w              by axiom 4
      = w                  by axiom 3

5. [28; 4 points each] True/false.

a. The solutions to a system of linear equations form a vector space. False. That's only true when the linear equations are homogeneous.

b. Every subspace of a vector space must contain the zero vector, 0. True. Vector spaces have to have 0.
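Part a can be illustrated numerically. A minimal numpy sketch, using the coefficient matrix inferred for problem 3's system (s1 and s2 are just two instances of its general solution): the solution set of a non-homogeneous system is not closed under addition, so it cannot be a vector space.

```python
import numpy as np

# Coefficients of the non-homogeneous system from problem 3, as inferred:
#   x + y + 3z + 2w = 7,   2x - y + 4w = 8,   3y + 6z = 6.
A = np.array([[1,  1, 3, 2],
              [2, -1, 0, 4],
              [0,  3, 6, 0]])
b = np.array([7, 8, 6])

s1 = np.array([5, 2, 0, 0])   # general solution with z = 0, w = 0
s2 = np.array([4, 0, 1, 0])   # general solution with z = 1, w = 0
print(A @ s1, A @ s2)         # both print [7 8 6]: both solve the system
print(A @ (s1 + s2))          # [14 16 12] = 2b: s1 + s2 is NOT a solution
```

With b = 0 the sum would again be a solution, which is exactly why the statement in part a is true only for homogeneous systems.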
c. The set of all polynomials with real coefficients forms a vector space over the field R. True. The vector space is infinite dimensional, but it is a vector space.

d. Every symmetric matrix is an upper triangular matrix. False. The only symmetric matrices that are upper triangular are the diagonal matrices.

e. If two nonzero vectors are linearly dependent, then one is a scalar multiple of the other. True. However, three nonzero vectors can be linearly dependent without any of them being scalar multiples of any of the others.

f. If the dimension of a vector space V is m, and the dimension of a vector space W is n, then the dimension of the product space V × W is mn. False. The dimension of the product is m + n. For example, R^2 × R^1 is isomorphic to R^3, and its dimension is 3, not 2.

g. If there are n independent vectors in a vector space, then every spanning set for that vector space has at least n vectors. True. The n independent vectors cannot all be linear combinations of fewer than n vectors.
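Both halves of part e can be checked with matrix ranks. A short numpy sketch (the particular vectors are arbitrary examples, not from the test): a set of column vectors is dependent exactly when the rank of the matrix they form is less than the number of vectors.

```python
import numpy as np

# Two nonzero dependent vectors: one must be a scalar multiple of the other,
# so the matrix with them as columns has rank 1.
u = np.array([1.0, 2.0, 3.0])
v = 2.5 * u
print(np.linalg.matrix_rank(np.column_stack([u, v])))     # 1: dependent

# But three nonzero vectors can be dependent with no pair being multiples:
a = np.array([1.0, 0.0])
b = np.array([0.0, 1.0])
c = np.array([1.0, 1.0])   # c = a + b, yet c is not a multiple of a or b
print(np.linalg.matrix_rank(np.column_stack([a, b, c])))  # 2 < 3: dependent
```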