Definition. (Subtle) A sequence of elements v_{1}, v_{2}, ..., v_{r} in a vector space V is linearly dependent if there exists a sequence of scalars a_{1}, a_{2}, ..., a_{r}, not all zero, such that
a_{1} v_{1} + a_{2} v_{2} + ... + a_{r} v_{r} = 0.
NFR: Finitely many vectors are linearly dependent if some non-trivial linear combination of them vanishes.
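For example, in R^{2} the sequence (1, 2), (2, 4) is linearly dependent, since 2(1, 2) + (-1)(2, 4) = (0, 0) is a vanishing linear combination whose coefficients are not all zero.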
Definition. (Subtle) A sequence of elements v_{1}, v_{2}, ..., v_{r} in a vector space V is linearly independent if it is not linearly dependent. NFR: Finitely many vectors are linearly independent if the only vanishing linear combination of them is the trivial linear combination.
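For instance, in R^{2} the sequence (1, 0), (0, 1) is linearly independent: if a_{1}(1, 0) + a_{2}(0, 1) = (0, 0), then (a_{1}, a_{2}) = (0, 0), so the only vanishing linear combination is the trivial one.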
Definition. A sequence of elements v_{1}, v_{2}, ..., v_{r} in a vector space V spans V or generates V if for each element x of V one can find scalars a_{1}, a_{2}, ..., a_{r} such that
x = a_{1} v_{1} + a_{2} v_{2} + ... + a_{r} v_{r}.
NFR: A given sequence of vectors spans a vector space if every vector is some linear combination of them.
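For example, the sequence (1, 0), (0, 1), (1, 1) spans R^{2}, since any (x_{1}, x_{2}) can be written as x_{1}(1, 0) + x_{2}(0, 1) + 0(1, 1). Note that a spanning sequence need not be linearly independent: here (1, 0) + (0, 1) + (-1)(1, 1) = (0, 0).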
Theorem. If V is a vector space, v_{1}, v_{2}, ..., v_{r} is some linearly independent sequence in V, and w_{1}, w_{2}, ..., w_{s} is some spanning sequence in V, then r <= s. NFR: In a given vector space the number of elements in any linearly independent sequence is less than or equal to the number in any spanning sequence.
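For instance, since (1, 0), (0, 1) spans R^{2}, the theorem says that no linearly independent sequence in R^{2} has more than two elements; any three or more vectors in R^{2} must be linearly dependent.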
Definition. A sequence of elements v_{1}, v_{2}, ..., v_{r} in a vector space V is a basis of V if it spans V and if it is linearly independent.
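For example, (1, 0), (0, 1) is a basis of R^{2}: it is linearly independent and it spans R^{2}, as checked above. More generally, the columns of the n \times n identity matrix form a basis of R^{n}, as the exercises below ask you to verify.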
Corollary. Any two bases of a vector space must have the same number of elements. (A basis is both linearly independent and spanning, so for two bases with r and s elements the theorem gives both r <= s and s <= r.)
Definition. If a vector space V has a basis, as defined above, then the number of elements in any basis of V is called the dimension of V.
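For example, R^{n} has dimension n, since the columns of the n \times n identity matrix form a basis with n elements.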
Show that the columns of the n \times n identity matrix are linearly independent.
Show that the columns of the n \times n identity matrix span R^{n}.
Show that the vectors (0, 1, 3), (2, 0, 1), (4, 1, 0) form a basis of R^{3}.
Explain why the three columns of a 2 \times 3 matrix can never be linearly independent.
Explain why the non-zero rows of a matrix in row echelon form are always linearly independent.
What happens with a system of m linear equations in n variables if the rows of its coefficient matrix are linearly dependent but the rows of its augmented matrix are linearly independent?
Could it happen with a system of m linear equations in n variables that the rows of its coefficient matrix are linearly independent while the rows of its augmented matrix are linearly dependent?