The columns of a square matrix over a field are linearly independent if and only if the determinant of the matrix is nonzero

Let F be a field and let A = [A_1 | \cdots | A_n] be an n \times n matrix over F. Prove that the set of columns \{A_i\}_{i=1}^n is linearly independent if and only if \mathsf{det}\ A \neq 0.


Let B be the reduced row echelon form of A, and let P be invertible such that PA = B.

Suppose the columns of A are linearly independent. Since B = PA with P invertible, the columns B_i = PA_i satisfy exactly the same linear dependences as the columns of A, so B also has column rank n. The only n \times n matrix in reduced row echelon form of rank n is the identity, so B = I. Then 1 = \mathsf{det}(B) = \mathsf{det}(PA) = \mathsf{det}(P)\,\mathsf{det}(A), so \mathsf{det}(A) \neq 0.
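
As a concrete illustration of this direction (an added example, not part of the proof itself): over \mathbb{Q}, take A = \begin{bmatrix} 1 & 1 \\ 0 & 1 \end{bmatrix}, whose columns are linearly independent. Taking P = \begin{bmatrix} 1 & -1 \\ 0 & 1 \end{bmatrix} (subtract the second row from the first) gives PA = I = B, and then 1 = \mathsf{det}(P)\,\mathsf{det}(A) = 1 \cdot \mathsf{det}(A), so indeed \mathsf{det}(A) = 1 \neq 0.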

We prove the converse contrapositively. Suppose the columns of A are linearly dependent; then, by the same reasoning, the column rank of B is strictly less than n, so B has fewer than n pivots and thus a row of all zeros. Expanding \mathsf{det}(B) along that zero row gives \mathsf{det}(B) = 0. Since P is invertible, \mathsf{det}(P) \neq 0, and from \mathsf{det}(B) = \mathsf{det}(P)\,\mathsf{det}(A) we conclude \mathsf{det}(A) = 0. Thus if \mathsf{det}(A) \neq 0, then the columns of A are linearly independent.
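
And for this direction (again an added illustration): take A = \begin{bmatrix} 1 & 2 \\ 2 & 4 \end{bmatrix}, whose columns satisfy A_2 = 2A_1. Its reduced row echelon form is B = \begin{bmatrix} 1 & 2 \\ 0 & 0 \end{bmatrix}, whose second row is zero; expanding along that row gives \mathsf{det}(B) = 0, and directly \mathsf{det}(A) = 1 \cdot 4 - 2 \cdot 2 = 0.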
