How Do You Know if a Nonsquare Matrix Is Linearly Independent
Objectives
- Understand the concept of linear independence.
- Learn two criteria for linear independence.
- Understand the relationship between linear independence and pivot columns / free variables.
- Recipe: test whether a set of vectors is linearly independent / find an equation of linear dependence.
- Picture: whether a set of vectors in R^2 or R^3 is linearly independent or not.
- Vocabulary words: linear dependence relation / equation of linear dependence.
- Essential vocabulary words: linearly independent, linearly dependent.
Sometimes the span of a set of vectors is "smaller" than you would expect from the number of vectors, as in the picture below. This means that (at least) one of the vectors is redundant: it can be removed without affecting the span. In the present section, we formalize this idea in the notion of linear independence.
Definition
A set of vectors {v1, v2, ..., vk} is linearly independent if the vector equation

x1v1 + x2v2 + ... + xkvk = 0

has only the trivial solution x1 = x2 = ... = xk = 0. The set is linearly dependent otherwise.

In other words, {v1, v2, ..., vk} is linearly dependent if there exist numbers x1, x2, ..., xk, not all equal to zero, such that

x1v1 + x2v2 + ... + xkvk = 0.

This is called a linear dependence relation or equation of linear dependence.
Note that linear dependence and linear independence are notions that apply to a collection of vectors. It does not make sense to say things like "this vector is linearly dependent on these other vectors," or "this matrix is linearly independent."
Example (Checking linear dependence)
Example (Checking linear independence)
Example (Vector parametric form)
The above examples lead to the following recipe.
Recipe: Checking linear independence
A set of vectors {v1, v2, ..., vk} is linearly independent if and only if the vector equation

x1v1 + x2v2 + ... + xkvk = 0

has only the trivial solution, if and only if the matrix equation Ax = 0 has only the trivial solution, where A is the matrix with columns v1, v2, ..., vk.

This is true if and only if A has a pivot position in every column.

Solving the matrix equation Ax = 0 will either verify that the columns are linearly independent, or will produce a linear dependence relation by substituting any nonzero values for the free variables.

(Recall that Ax = 0 has a nontrivial solution if and only if A has a column without a pivot: see this observation in Section 2.4.)
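The recipe above can be sketched in code. The following is a minimal illustration (hypothetical helper names, exact rational arithmetic via the standard library's Fraction, not part of the original text): row reduce the matrix whose columns are the given vectors and check whether every column holds a pivot.

```python
from fractions import Fraction

def pivot_columns(rows):
    """Row reduce a copy of the matrix (given as a list of rows)
    and return the indices of its pivot columns."""
    m = [[Fraction(x) for x in row] for row in rows]
    pivots, r = [], 0
    for c in range(len(m[0])):
        # Find a row at or below r with a nonzero entry in column c.
        pr = next((i for i in range(r, len(m)) if m[i][c] != 0), None)
        if pr is None:
            continue  # no pivot here: column c corresponds to a free variable
        m[r], m[pr] = m[pr], m[r]
        for i in range(r + 1, len(m)):
            f = m[i][c] / m[r][c]
            m[i] = [a - f * b for a, b in zip(m[i], m[r])]
        pivots.append(c)
        r += 1
    return pivots

def is_linearly_independent(vectors):
    """The vectors are independent iff the matrix with those
    vectors as columns has a pivot position in every column."""
    rows = [list(t) for t in zip(*vectors)]  # make the vectors the columns
    return len(pivot_columns(rows)) == len(vectors)
```

For instance, `is_linearly_independent([[1, 0], [0, 1]])` is True, while three vectors in R^2 (the columns of a wide matrix) always give False.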
Suppose that A has more columns than rows. Then A cannot have a pivot in every column (it has at most one pivot per row), so its columns are automatically linearly dependent.
A wide matrix (a matrix with more columns than rows) has linearly dependent columns.
For example, four vectors in R^3 are automatically linearly dependent. Note that a tall matrix may or may not have linearly independent columns.
Facts about linear independence
- Two vectors are linearly dependent if and only if they are collinear, i.e., one is a scalar multiple of the other.
- Any set containing the zero vector is linearly dependent.
- If a subset of {v1, v2, ..., vk} is linearly dependent, then {v1, v2, ..., vk} is linearly dependent as well.
Proof
- If v2 = cv1, then cv1 - v2 = 0, so {v1, v2} is linearly dependent. In the other direction, if x1v1 + x2v2 = 0 with x1 nonzero (say), then v1 = -(x2/x1)v2.
- It is easy to produce a linear dependence relation if one vector is the zero vector: for instance, if v1 = 0, then 1·v1 + 0·v2 + ... + 0·vk = 0.
- After reordering, we may suppose that {v1, v2, ..., vr} is linearly dependent, with r < k. This means that there is an equation of linear dependence x1v1 + x2v2 + ... + xrvr = 0, with at least one of x1, x2, ..., xr nonzero. This is also an equation of linear dependence among {v1, v2, ..., vk}, since we can take the coefficients of v(r+1), ..., vk to all be zero.

With regard to the first fact, note that the zero vector is a multiple of any vector, so it is collinear with any other vector. Hence facts 1 and 2 are consistent with each other.
In this subsection we give two criteria for a set of vectors to be linearly independent. Keep in mind, however, that the actual definition is above.
Theorem
A set of vectors is linearly dependent if and only if one of the vectors is in the span of the others.
Any such vector may be removed without affecting the span.
Proof

Suppose, for instance, that v3 is in Span{v1, v2, v4}, so we have an equation like v3 = av1 + bv2 + cv4. We can subtract v3 from both sides of the equation to get 0 = av1 + bv2 - v3 + cv4. This is a linear dependence relation. In this case, any linear combination of v1, v2, v3, v4 is already a linear combination of v1, v2, v4. Therefore, Span{v1, v2, v3, v4} is contained in Span{v1, v2, v4}. Any linear combination of v1, v2, v4 is also a linear combination of v1, v2, v3, v4 (with the v3-coefficient equal to zero), so Span{v1, v2, v4} is also contained in Span{v1, v2, v3, v4}, and thus they are equal. In the other direction, if we have a linear dependence relation like x1v1 + x2v2 + x3v3 + x4v4 = 0, then we can move any nonzero term to the left side of the equation and divide by its coefficient, which shows that one of the vectors is in the span of the others. We leave it to the reader to generalize this proof for any set of vectors.
Warning
In a linearly dependent set, it is not generally true that every vector is in the span of the others, only that at least one of them is.

For example, the set below is linearly dependent, but one of its vectors is not in the span of the other two. Also see the figure below.
The previous theorem makes precise in what sense a set of linearly dependent vectors is redundant.
Theorem (Increasing Span Criterion)
A set of vectors {v1, v2, ..., vk} is linearly independent if and only if, for every j, the vector vj is not in Span{v1, v2, ..., v(j-1)}.

Proof

It is equivalent to show that {v1, v2, ..., vk} is linearly dependent if and only if vj is in Span{v1, v2, ..., v(j-1)} for some j. The "if" implication is an immediate consequence of the previous theorem. Suppose then that {v1, v2, ..., vk} is linearly dependent. This means that some vj is in the span of the others. Choose the largest such j. We claim that this vj is in Span{v1, v2, ..., v(j-1)}. If not, then we have an equation

vj = x1v1 + ... + x(j-1)v(j-1) + x(j+1)v(j+1) + ... + xkvk

with not all of x(j+1), ..., xk equal to zero. Suppose for simplicity that xk is nonzero. Then we can rearrange:

vk = (1/xk)(vj - x1v1 - ... - x(j-1)v(j-1) - x(j+1)v(j+1) - ... - x(k-1)v(k-1)).

This says that vk is in the span of the others, which contradicts our assumption that vj is the last vector in the span of the others.
We can rephrase this as follows:

If you make a set of vectors by adding one vector at a time, and if the span got bigger every time you added a vector, then your set is linearly independent.
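The increasing span criterion can also be seen computationally. The following sketch (assumed helper names, not part of the original text) adds the vectors as columns one at a time and checks that the rank of the resulting matrix grows by one at every step, i.e., that each new vector enlarged the span.

```python
from fractions import Fraction

def rank(rows):
    """Rank of a matrix (a list of rows), computed by Gaussian
    elimination over exact rational arithmetic."""
    m = [[Fraction(x) for x in row] for row in rows]
    r = 0
    for c in range(len(m[0]) if m else 0):
        pr = next((i for i in range(r, len(m)) if m[i][c] != 0), None)
        if pr is None:
            continue
        m[r], m[pr] = m[pr], m[r]
        for i in range(r + 1, len(m)):
            f = m[i][c] / m[r][c]
            m[i] = [a - f * b for a, b in zip(m[i], m[r])]
        r += 1
    return r

def independent_by_increasing_span(vectors):
    """True iff the span strictly grows each time a vector is added,
    i.e. the rank climbs by one with every new column."""
    for k in range(1, len(vectors) + 1):
        rows = [list(t) for t in zip(*vectors[:k])]
        if rank(rows) != k:
            return False  # the k-th vector did not enlarge the span
    return True
```

This agrees with the pivot-in-every-column test: the span grew at every step exactly when every column of the final matrix is a pivot column.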
A set containing one vector v is linearly independent when v is nonzero, since the equation xv = 0 then implies x = 0.
A set of two noncollinear vectors is linearly independent:

- Neither vector is in the span of the other, so we can apply the first criterion.
- The span got bigger when we added the second vector, so we can apply the increasing span criterion.
The set of three vectors below is linearly dependent:

- The third vector is in the span of the first two, so we can apply the first criterion.
- The span did not increase when we added the third vector, so we can apply the increasing span criterion.

In the picture below, note that each vector is in the span of the other two, so we can remove any of the three vectors without shrinking the span.
Two collinear vectors are always linearly dependent:

- The second vector is in the span of the first, so we can apply the first criterion.
- The span did not increase when we added the second vector, so we can apply the increasing span criterion.

These three vectors are linearly dependent: indeed, the first two already form a linearly dependent set, so we can use the third fact.
Interactive: Linear independence of two vectors in R^2

Interactive: Linear dependence of three vectors in R^2
The two vectors below are linearly independent because they are not collinear.

The three vectors below are linearly independent: the span got bigger when we added the second vector, and again when we added the third, so we can apply the increasing span criterion.
The three coplanar vectors below are linearly dependent:

- The third vector is in the span of the first two, so we can apply the first criterion.
- The span did not increase when we added the third vector, so we can apply the increasing span criterion.

Note that three vectors are linearly dependent if and only if they are coplanar. Indeed, a set of three vectors is linearly dependent if and only if one vector is in the span of the other two, and that span is a plane (or a line, or the origin).

The four vectors below are linearly dependent: they are the columns of a wide matrix. Note, however, that not every one of them is contained in the span of the others. See this warning.
Interactive: Linear independence of two vectors in R^3

Interactive: Linear independence of three vectors in R^3
In light of this important note and this criterion, it is natural to ask which columns of a matrix are redundant, i.e., which columns we can remove without affecting the column span.
Theorem
Let v1, v2, ..., vk be vectors in R^n, and consider the matrix A with columns v1, v2, ..., vk.

Then we can delete the columns of A without pivots (the columns corresponding to the free variables) without changing Span{v1, v2, ..., vk}.

The pivot columns are linearly independent, so we cannot delete any more columns without changing the span.
Proof

If the matrix is in reduced row echelon form, then each column without a pivot is visibly in the span of the pivot columns, and the pivot columns are linearly independent. If the matrix is not in reduced row echelon form, then we row reduce. The vector equation among the original columns and the vector equation among the columns of the reduced matrix have the same solution set, as they come from row-equivalent matrices. We conclude that each non-pivot column of the original matrix is in the span of its pivot columns, and that the vector equation among the pivot columns has only the trivial solution.
Note that it is necessary to row reduce A to find which are its pivot columns. However, the span of the columns of the row reduced matrix is generally not equal to the span of the columns of A: one must use the pivot columns of the original matrix. See the theorem in Section 2.7 for a restatement of the above theorem.
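The theorem and the warning above can be sketched together in code (hypothetical helper names, exact arithmetic via Fraction): row reduce a copy of the matrix to locate the pivot columns, then keep those columns of the original matrix, not of the reduced one.

```python
from fractions import Fraction

def pivot_column_indices(rows):
    """Row reduce a copy of the matrix (a list of rows) and
    return the indices of its pivot columns."""
    m = [[Fraction(x) for x in row] for row in rows]
    pivots, r = [], 0
    for c in range(len(m[0])):
        pr = next((i for i in range(r, len(m)) if m[i][c] != 0), None)
        if pr is None:
            continue
        m[r], m[pr] = m[pr], m[r]
        for i in range(r + 1, len(m)):
            f = m[i][c] / m[r][c]
            m[i] = [a - f * b for a, b in zip(m[i], m[r])]
        pivots.append(c)
        r += 1
    return pivots

def spanning_columns(rows):
    """The pivot columns of the ORIGINAL matrix: a linearly
    independent set of columns with the same span as all the
    columns.  Note we index into `rows` (the original matrix),
    never into the row-reduced copy."""
    return [[row[c] for row in rows] for c in pivot_column_indices(rows)]
```

For the matrix with rows [1, 2, 3] and [2, 4, 6] (three collinear columns), only the first column is a pivot column, and it alone spans the column span.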
Example
Pivot Columns and Dimension

Let d be the number of pivot columns in the matrix A whose columns are v1, v2, ..., vk.

- If d = 1, then Span{v1, v2, ..., vk} is a line.
- If d = 2, then Span{v1, v2, ..., vk} is a plane.
- If d = 3, then Span{v1, v2, ..., vk} is a 3-space.
- Et cetera.

The number d is called the dimension. We discussed this notion in two important notes in Section 2.4. We will define this concept rigorously in Section 2.7.
Source: https://textbooks.math.gatech.edu/ila/linear-independence.html