How Do You Know if a Nonsquare Matrix Is Linearly Independent

Objectives
  1. Understand the concept of linear independence.
  2. Learn two criteria for linear independence.
  3. Understand the relationship between linear independence and pivot columns / free variables.
  4. Recipe: check if a set of vectors is linearly independent / find an equation of linear dependence.
  5. Picture: whether a set of vectors in R^2 or R^3 is linearly independent or not.
  6. Vocabulary words: linear dependence relation / equation of linear dependence.
  7. Essential vocabulary words: linearly independent, linearly dependent.

Sometimes the span of a set of vectors is "smaller" than you expect from the number of vectors, as in the picture below. This means that (at least) one of the vectors is redundant: it can be removed without affecting the span. In the present section, we formalize this idea in the notion of linear independence.

[Figure: Span{ v, w } with vectors v, w; Span{ u, v, w } with vectors u, v, w]

Figure 1. Pictures of sets of vectors that are linearly dependent. Note that in each example, one vector is in the span of the others, so it doesn't make the span bigger.
Definition

A set of vectors { v_1, v_2, ..., v_k } is linearly independent if the vector equation

x_1 v_1 + x_2 v_2 + ··· + x_k v_k = 0

has only the trivial solution x_1 = x_2 = ··· = x_k = 0. The set { v_1, v_2, ..., v_k } is linearly dependent otherwise.

In other words, { v_1, v_2, ..., v_k } is linearly dependent if there exist numbers x_1, x_2, ..., x_k, not all equal to zero, such that

x_1 v_1 + x_2 v_2 + ··· + x_k v_k = 0.

This is called a linear dependence relation or equation of linear dependence.

Note that linear dependence and linear independence are notions that apply to a collection of vectors. It does not make sense to say things like "this vector is linearly dependent on these other vectors," or "this matrix is linearly independent."
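To make the definition concrete, here is a minimal numerical sketch (using NumPy, which the text itself does not use): a set { v_1, ..., v_k } is linearly independent exactly when the matrix whose columns are those vectors has rank k. The vectors below are taken from a later example in this section.

```python
import numpy as np

# Vectors from a later example in this section; note that
# v3 = 2*v1 + 3*v2, so the set should be linearly dependent.
v1 = np.array([1.0, 2.0, -1.0])
v2 = np.array([7.0, 4.0, -2.0])
v3 = np.array([23.0, 16.0, -8.0])

A = np.column_stack([v1, v2, v3])
# Independent <=> the only solution of Ax = 0 is trivial <=> rank = #columns.
independent = np.linalg.matrix_rank(A) == A.shape[1]
print(independent)  # False
```

Here the rank is 2 rather than 3, reflecting the dependence relation v3 = 2 v1 + 3 v2.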

Example (Checking linear dependence)

Example (Checking linear independence)

Example (Vector parametric form)

The above examples lead to the following recipe.

Recipe: Checking linear independence

A set of vectors { v_1, v_2, ..., v_k } is linearly independent if and only if the vector equation

x_1 v_1 + x_2 v_2 + ··· + x_k v_k = 0

has only the trivial solution, if and only if the matrix equation Ax = 0 has only the trivial solution, where A is the matrix with columns v_1, v_2, ..., v_k:

A = ( v_1  v_2  ···  v_k ).

This is true if and only if A has a pivot position in every column.

Solving the matrix equation Ax = 0 will either verify that the columns v_1, v_2, ..., v_k are linearly independent, or will produce a linear dependence relation by substituting any nonzero values for the free variables.

(Recall that Ax = 0 has a nontrivial solution if and only if A has a column without a pivot: see this observation in Section 2.4.)
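The recipe can be sketched in code. This example uses SymPy's null space computation (an assumption of this sketch, not the text's hand method) on the same three columns as above: a nonzero solution of Ax = 0 is precisely a linear dependence relation.

```python
import sympy as sp

# Columns are the vectors (1, 2, -1), (7, 4, -2), (23, 16, -8).
A = sp.Matrix([[1, 7, 23],
               [2, 4, 16],
               [-1, -2, -8]])

null_basis = A.nullspace()  # basis for the solution set of Ax = 0
if null_basis:
    # Any nonzero solution gives an equation of linear dependence:
    # here -2*v1 - 3*v2 + 1*v3 = 0.
    print("dependence relation coefficients:", list(null_basis[0]))
else:
    print("columns are linearly independent")
```

The free variable x_3 is set to 1, and back-substitution yields x_1 = -2, x_2 = -3, exactly as the recipe describes.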

Suppose that A has more columns than rows. Then A cannot have a pivot in every column (it has at most one pivot per row), so its columns are automatically linearly dependent.

A wide matrix (a matrix with more columns than rows) has linearly dependent columns.

For example, four vectors in R^3 are automatically linearly dependent. Note that a tall matrix may or may not have linearly independent columns.
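A quick numerical illustration of the wide-matrix fact (a NumPy sketch; the random vectors are an assumption of the example): a 3x4 matrix has rank at most 3, so its four columns can never be independent.

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 4))  # four random vectors in R^3, as columns

# rank <= min(3, 4) = 3 < 4 columns, so the columns must be dependent.
rank = np.linalg.matrix_rank(A)
print(rank < A.shape[1])  # True, no matter which vectors were drawn
```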

Proof

  1. If v_1 = c v_2 then v_1 - c v_2 = 0, so { v_1, v_2 } is linearly dependent. In the other direction, if x_1 v_1 + x_2 v_2 = 0 with x_1 ≠ 0 (say), then v_1 = -(x_2 / x_1) v_2.
  2. It is easy to produce a linear dependence relation if one vector is the zero vector: for instance, if v_1 = 0 then

    1 · v_1 + 0 · v_2 + ··· + 0 · v_k = 0.

  3. After reordering, we may suppose that { v_1, v_2, ..., v_r } is linearly dependent, with r < k. This means that there is an equation of linear dependence

    x_1 v_1 + x_2 v_2 + ··· + x_r v_r = 0,

    with at least one of x_1, x_2, ..., x_r nonzero. This is also an equation of linear dependence among { v_1, v_2, ..., v_k }, since we can take the coefficients of v_{r+1}, ..., v_k to all be zero.

With regard to the first fact, note that the zero vector is a multiple of any vector, so it is collinear with any other vector. Hence facts 1 and 2 are consistent with each other.

In this subsection we give two criteria for a set of vectors to be linearly independent. Keep in mind, however, that the actual definition is above.

Proof

Suppose, for instance, that v_3 is in Span{ v_1, v_2, v_4 }, so we have an equation like

v_3 = 2 v_1 - (1/2) v_2 + 6 v_4.

We can subtract v_3 from both sides of the equation to get

0 = 2 v_1 - (1/2) v_2 - v_3 + 6 v_4.

This is a linear dependence relation.

In this example, any linear combination of v_1, v_2, v_3, v_4 is already a linear combination of v_1, v_2, v_4:

x_1 v_1 + x_2 v_2 + x_3 v_3 + x_4 v_4 = x_1 v_1 + x_2 v_2 + x_3 ( 2 v_1 - (1/2) v_2 + 6 v_4 ) + x_4 v_4 = ( x_1 + 2 x_3 ) v_1 + ( x_2 - (1/2) x_3 ) v_2 + ( x_4 + 6 x_3 ) v_4.

Therefore, Span{ v_1, v_2, v_3, v_4 } is contained in Span{ v_1, v_2, v_4 }. Any linear combination of v_1, v_2, v_4 is also a linear combination of v_1, v_2, v_3, v_4 (with the v_3-coefficient equal to zero), so Span{ v_1, v_2, v_4 } is also contained in Span{ v_1, v_2, v_3, v_4 }, and thus they are equal.

In the other direction, if we have a linear dependence relation like

0 = 2 v_1 - (1/2) v_2 + v_3 - 6 v_4,

then we can move any nonzero term to the left side of the equation and divide by its coefficient:

v_1 = (1/2) ( (1/2) v_2 - v_3 + 6 v_4 ).

This shows that v_1 is in Span{ v_2, v_3, v_4 }.

We leave it to the reader to generalize this proof for any set of vectors.
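The algebraic step above (moving a term across the equation and dividing by its coefficient) can be checked symbolically. This sketch uses SymPy scalars standing in for the vectors, which is legitimate here because the relation is linear in each vector:

```python
import sympy as sp

v2, v3, v4 = sp.symbols('v2 v3 v4')  # scalar stand-ins for the vectors

# Solve 0 = 2*v1 - (1/2)*v2 + v3 - 6*v4 for v1:
v1 = sp.Rational(1, 2) * (sp.Rational(1, 2) * v2 - v3 + 6 * v4)

# The original dependence relation holds identically:
residual = 2 * v1 - sp.Rational(1, 2) * v2 + v3 - 6 * v4
print(sp.simplify(residual))  # 0
```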

Warning

In a linearly dependent set { v_1, v_2, ..., v_k }, it is not generally true that any vector v_j is in the span of the others, only that at least one of them is.

For example, the set { (1, 0), (2, 0), (0, 1) } is linearly dependent, but (0, 1) is not in the span of the other two vectors. See also the figure below.

The previous theorem makes precise in what sense a set of linearly dependent vectors is redundant.

Proof

It is equivalent to show that { v_1, v_2, ..., v_k } is linearly dependent if and only if v_j is in Span{ v_1, v_2, ..., v_{j-1} } for some j. The "if" implication is an immediate consequence of the previous theorem. Suppose then that { v_1, v_2, ..., v_k } is linearly dependent. This means that some v_j is in the span of the others. Choose the largest such j. We claim that this v_j is in Span{ v_1, v_2, ..., v_{j-1} }. If not, then

v_j = x_1 v_1 + x_2 v_2 + ··· + x_{j-1} v_{j-1} + x_{j+1} v_{j+1} + ··· + x_k v_k

with not all of x_{j+1}, ..., x_k equal to zero. Suppose for simplicity that x_k ≠ 0. Then we can rearrange:

v_k = -(1/x_k) ( x_1 v_1 + x_2 v_2 + ··· + x_{j-1} v_{j-1} - v_j + x_{j+1} v_{j+1} + ··· + x_{k-1} v_{k-1} ).

This says that v_k is in the span of { v_1, v_2, ..., v_{k-1} }, which contradicts our assumption that v_j is the last vector in the span of the others.

We can rephrase this as follows:

If you make a set of vectors by adding one vector at a time, and if the span got bigger every time you added a vector, then your set is linearly independent.
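The increasing span criterion suggests a simple computational check (a NumPy sketch; the example vectors are hypothetical): add one vector at a time and see whether the rank of the accumulated matrix grows at each step.

```python
import numpy as np

# Hypothetical vectors in R^3; the third lies in the span of the first two.
vectors = [np.array([1.0, 0.0, 0.0]),
           np.array([0.0, 1.0, 0.0]),
           np.array([1.0, 1.0, 0.0])]

cols, prev_rank, grew_every_time = [], 0, True
for v in vectors:
    cols.append(v)
    rank = np.linalg.matrix_rank(np.column_stack(cols))
    grew_every_time &= (rank > prev_rank)  # did the span get bigger?
    prev_rank = rank

print(grew_every_time)  # False: the set is linearly dependent
```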

A set containing one vector { v } is linearly independent when v ≠ 0, since xv = 0 implies x = 0.

[Figure: Span{ v } with vector v]

A set of two noncollinear vectors { v, w } is linearly independent:

  • Neither is in the span of the other, so we can apply the first criterion.
  • The span got bigger when we added w, so we can apply the increasing span criterion.

[Figure: vectors v, w with Span{ v } and Span{ w }]

The set of three vectors { v, w, u } below is linearly dependent:

  • u is in Span{ v, w }, so we can apply the first criterion.
  • The span did not increase when we added u, so we can apply the increasing span criterion.

In the picture below, note that v is in Span{ u, w }, and w is in Span{ u, v }, so we can remove any of the three vectors without shrinking the span.

[Figure: vectors v, w, u with Span{ v }, Span{ w }, and Span{ v, w }]

Two collinear vectors are always linearly dependent:

  • w is in Span{ v }, so we can apply the first criterion.
  • The span did not increase when we added w, so we can apply the increasing span criterion.

[Figure: collinear vectors v, w on Span{ v }]

These three vectors { v, w, u } are linearly dependent: indeed, { v, w } is already linearly dependent, so we can use the third fact.

[Figure: vectors v, w, u with Span{ v }]

Interactive: Linear independence of two vectors in R^2

Interactive: Linear dependence of three vectors in R^2

The two vectors { v, w } below are linearly independent because they are not collinear.

[Figure: vectors v, w with Span{ v } and Span{ w }]

The three vectors { v, w, u } below are linearly independent: the span got bigger when we added w, then again when we added u, so we can apply the increasing span criterion.

[Figure: vectors v, w, u with Span{ v }, Span{ w }, and Span{ v, w }]

The three coplanar vectors { v, w, u } below are linearly dependent:

  • u is in Span{ v, w }, so we can apply the first criterion.
  • The span did not increase when we added u, so we can apply the increasing span criterion.

[Figure: vectors v, w, u with Span{ v }, Span{ w }, and Span{ v, w }]

Note that three vectors are linearly dependent if and only if they are coplanar. Indeed, { v , w , u } is linearly dependent if and only if one vector is in the span of the other two, which is a plane (or a line) (or { 0 } ).

The four vectors { v, w, u, x } below are linearly dependent: they are the columns of a wide matrix. Note however that u is not contained in Span{ v, w, x }. See the warning above.

[Figure: vectors v, w, u, x with Span{ v }, Span{ w }, and Span{ v, w }]

Figure 20. The vectors { v, w, u, x } are linearly dependent, but u is not contained in Span{ v, w, x }.

Interactive: Linear independence of two vectors in R^3

Interactive: Linear independence of three vectors in R^3

In light of this important note and this criterion, it is natural to ask which columns of a matrix are redundant, i.e., which we can remove without affecting the column span.

Proof

If the matrix is in reduced row echelon form:

A = [ 1 0 2 0 ]
    [ 0 1 3 0 ]
    [ 0 0 0 1 ]

then the column without a pivot is visibly in the span of the pivot columns:

(2, 3, 0) = 2 (1, 0, 0) + 3 (0, 1, 0) + 0 (0, 0, 1),

and the pivot columns are linearly independent:

(0, 0, 0) = x_1 (1, 0, 0) + x_2 (0, 1, 0) + x_4 (0, 0, 1) = (x_1, x_2, x_4), which forces x_1 = x_2 = x_4 = 0.

If the matrix is not in reduced row echelon form, then we row reduce:

A = [  1  7 23  3 ]   RREF   [ 1 0 2 0 ]
    [  2  4 16  0 ]  ----->  [ 0 1 3 0 ]
    [ -1 -2 -8  4 ]          [ 0 0 0 1 ]

The following two vector equations have the same solution set, as they come from row-equivalent matrices:

x_1 (1, 2, -1) + x_2 (7, 4, -2) + x_3 (23, 16, -8) + x_4 (3, 0, 4) = 0
x_1 (1, 0, 0) + x_2 (0, 1, 0) + x_3 (2, 3, 0) + x_4 (0, 0, 1) = 0.

We conclude that

(23, 16, -8) = 2 (1, 2, -1) + 3 (7, 4, -2) + 0 (3, 0, 4)

and that

x_1 (1, 2, -1) + x_2 (7, 4, -2) + x_4 (3, 0, 4) = 0

has only the trivial solution.

Note that it is necessary to row reduce A to find which are its pivot columns. However, the span of the columns of the row reduced matrix is generally not equal to the span of the columns of A: one must use the pivot columns of the original matrix. See the theorem in Section 2.7 for a restatement of the above theorem.
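This can be sketched with SymPy (an assumption of the sketch; the text does the row reduction by hand): find the pivot columns of the example matrix A, then express the non-pivot column of the original A in terms of the pivot columns of the original A, with coefficients read off from the RREF.

```python
import sympy as sp

A = sp.Matrix([[1, 7, 23, 3],
               [2, 4, 16, 0],
               [-1, -2, -8, 4]])

rref, pivots = A.rref()
print(pivots)  # (0, 1, 3): columns 1, 2, and 4 of A are the pivot columns

# The non-pivot column of the ORIGINAL matrix is a combination of the
# ORIGINAL pivot columns, with coefficients 2 and 3 taken from the RREF:
assert A[:, 2] == 2 * A[:, 0] + 3 * A[:, 1]
```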

Example

Pivot Columns and Dimension

Let d be the number of pivot columns in the matrix

A = ( v_1  v_2  ···  v_k ).

  • If d = 1 then Span{ v_1, v_2, ..., v_k } is a line.
  • If d = 2 then Span{ v_1, v_2, ..., v_k } is a plane.
  • If d = 3 then Span{ v_1, v_2, ..., v_k } is a three-space.
  • Et cetera.

The number d is called the dimension. We discussed this notion in two important notes in Section 2.4. We will define this concept rigorously in Section 2.7.
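In code, d is just the rank of A. This sketch (NumPy, with the matrix from the example above) computes it:

```python
import numpy as np

A = np.array([[1.0, 7.0, 23.0, 3.0],
              [2.0, 4.0, 16.0, 0.0],
              [-1.0, -2.0, -8.0, 4.0]])

# d = number of pivot columns = rank(A); here d = 3, so the columns
# span a three-space (in fact all of R^3).
d = np.linalg.matrix_rank(A)
print(d)  # 3
```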


Source: https://textbooks.math.gatech.edu/ila/linear-independence.html
