Mathematics from the eyes of a High School Student! – Bidit Acharya

One way of testing whether a set of vectors is linearly independent is to arrange the vectors as the columns of a matrix and compute its determinant. If the determinant of that matrix is 0, the set is linearly dependent; if the determinant is nonzero, the set is linearly independent.

Obviously this only works if the matrix so formed is a square matrix, where the number of rows equals the number of columns. This is the same as saying that the number of components of each vector in our set equals the total number of vectors in the set. While this might at first seem like a letdown, it isn’t totally useless, especially when it comes to determining whether the set is a basis for a particular space. What do I mean by that? Well, I’d recommend you continue reading!
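Before getting into the why, here is a minimal sketch of the technique itself in NumPy (the function name and tolerance are my own choices; floating-point determinants are rarely exactly zero, so we compare against a small threshold):

```python
import numpy as np

def is_independent(vectors, tol=1e-10):
    """Test linear independence of n vectors in R^n via the determinant.

    `vectors` is a list of n one-dimensional arrays, each of length n,
    which become the columns of a square matrix.
    """
    M = np.column_stack(vectors)
    # The determinant test only applies when the matrix is square.
    assert M.shape[0] == M.shape[1], "need n vectors, each with n components"
    return abs(np.linalg.det(M)) > tol

# Independent columns: det([[1, 1], [0, -2]]) = -2, which is nonzero
print(is_independent([np.array([1.0, 0.0]), np.array([1.0, -2.0])]))   # True

# Dependent columns: the second is 3 times the first, so det = 0
print(is_independent([np.array([1.0, 2.0]), np.array([3.0, 6.0])]))   # False
```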

I have been using this “technique” for quite some time in Linear Algebra classes at school, but mostly without really understanding what is going on underneath. At first sight this might seem like a “trick” of some sort, but as Qiaochu Yuan says, “In mathematics nothing is a trick if seen from a sufficiently high level”, this is no trick.

Well, from here on, I’ll try to be as mathematically formal as I can.

Let \mathcal A be a linearly dependent set, with cardinality n , of n-vectors. Now construct a matrix M whose columns are the vectors of \mathcal A . That is, if c_1,c_2,\ldots , c_n are the n-vector elements of \mathcal A , construct M such that,

M=\begin{bmatrix} c_1 & c_2 &\ldots & c_n \end{bmatrix}

Now let \nabla denote the determinant of M. Then,

\nabla =| ~c_1 ~c_2 ~\cdots ~c_n ~|

Now, because the set \mathcal A is a linearly dependent set, one of its vectors (without loss of generality, let’s say it’s the vector c_n ) can be represented as a linear combination of the other vectors in the set. That is,

c_n = a_1 c_1+a_2 c_2 + \ldots + a_{n-1} c_{n-1} ~~~ \text{for some } a_i \in \mathbb R .

Now, take the matrix M and subtract a_1c_1+a_2c_2 + \ldots +a_{n-1}c_{n-1} from the n \text{th} column (that is, from c_n ). This results in another matrix (say M' ) whose last column is the 0 vector.

So, we now have that

M'=\begin{bmatrix} c_1 & c_2 &\ldots & c_{n-1} & 0 \end{bmatrix}

Since one of the columns of M' is 0, it follows that \det(M') =0 .

Because the determinant of a matrix remains unchanged when a scalar multiple of one column (or row) is added to, or subtracted from, another column (or row), it thus follows that

\det(M) = \det(M') ~\implies ~\nabla=0
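The column operation above can also be checked numerically. A sketch, where the specific dependent set is my own example ( c_3 = 2c_1 + c_2 in \mathbb R^3 ):

```python
import numpy as np

# A dependent set in R^3: c3 is a linear combination of c1 and c2
c1 = np.array([1.0, 0.0, 2.0])
c2 = np.array([0.0, 1.0, 1.0])
c3 = 2.0 * c1 + 1.0 * c2

M = np.column_stack([c1, c2, c3])

# Subtract the same linear combination from the last column to form M'
M_prime = M.copy()
M_prime[:, 2] -= 2.0 * c1 + 1.0 * c2   # last column becomes the zero vector

# A zero column forces det(M') = 0, and the column operation leaves
# the determinant unchanged, so det(M) = 0 as well (up to roundoff).
print(np.linalg.det(M), np.linalg.det(M_prime))
```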

Therefore, we have just shown that if the columns of a matrix are linearly dependent, the determinant of that matrix must be zero. The converse is also true and shouldn’t be that difficult to show.

Now that it has been shown (not so rigorously, but bearably solidly) why this works, I’d like to focus on something I mentioned earlier in the post: the requirement that M be a square matrix. After reading the proof, it should be very clear why M must be square. Our technique uses the determinant, which is only defined for square matrices, and hence if M isn’t a square matrix, we cannot compute its determinant.

Now, when you think about what it actually means for M to be a square matrix, you realize a certain fact about this technique that is, in my opinion, worthy of a mention. Because M is a square matrix, the number of vectors in \mathcal A must equal the number of components in each vector \vec {c_i} \in \mathcal A . Otherwise, M wouldn’t be square. This means that if the n vectors in \mathcal A are linearly independent, they span the vector space \mathbb {R}^n . And if a set is linearly independent and spans a particular space, the set is said to be a basis for that space, right?

So, what do we have now? Our little “theorem” now extends to

If a set of n vectors is a basis for \mathbb {R}^n , then the determinant of the matrix whose columns are those basis vectors is a non-zero number. If the set isn’t a basis for \mathbb {R}^n , then the determinant of the matrix is 0 .
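This extended statement can be sketched as a small basis checker. The function name and the example sets are mine; note that a non-square matrix (the wrong number of vectors) simply means the set cannot be a basis of \mathbb R^n :

```python
import numpy as np

def is_basis(vectors, tol=1e-10):
    """Return True if the given vectors form a basis of R^n."""
    M = np.column_stack(vectors)
    if M.shape[0] != M.shape[1]:
        return False  # the determinant test needs exactly n vectors in R^n
    return abs(np.linalg.det(M)) > tol

# The standard basis of R^3: the matrix is the identity, det = 1
e = [np.array([1.0, 0.0, 0.0]),
     np.array([0.0, 1.0, 0.0]),
     np.array([0.0, 0.0, 1.0])]
print(is_basis(e))   # True

# Three coplanar vectors in R^3: the third is the sum of the first
# two, so the set is dependent, det = 0, and it is not a basis.
print(is_basis([np.array([1.0, 0.0, 0.0]),
                np.array([0.0, 1.0, 0.0]),
                np.array([1.0, 1.0, 0.0])]))   # False
```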

The converse of this statement is true too, and it shouldn’t be difficult to see why.


Comments on: "On Determinants of Matrices with Linearly Independent Columns (or Rows)" (3)

  1. The way you explained was great. Thanks for writing this post.

