Throughout this page, I will be using Laplace expansion repeatedly to calculate determinants of matrices. If you don't remember/know what that is, here are some resources:
Denote by $P_{ij}$ the elementary matrix obtained from the identity matrix by exchanging the $i$-th and $j$-th rows. Write $P_{ij}$ in matrix form. Compute $\det P_{ij}$. Prove that $\det P_{ij}^T = \det P_{ij}$. Prove that $P_{ij}^{-1} = P_{ij}$.
Denote by $D_i(c)$ the elementary matrix obtained from the identity matrix by multiplying the $i$-th row by a scalar $c$. Write $D_i(c)$ in matrix form. Compute $\det D_i(c)$. Prove that $\det D_i(c)^T = \det D_i(c)$. Prove that $D_i(c)$ is invertible if $c \neq 0$, with $D_i(c)^{-1} = D_i(1/c)$.
We denote by $E_{ij}(c)$ the elementary matrix obtained from the identity matrix by adding to the $i$-th row the $j$-th row multiplied by a scalar $c$. Write $E_{ij}(c)$ in matrix form. Compute $\det E_{ij}(c)$. Prove that $\det E_{ij}(c)^T = \det E_{ij}(c)$. Prove that $E_{ij}(c)^{-1} = E_{ij}(-c)$.
To compute $P_{ij}$, we can just figure out what it does to the standard basis $e_1, \dots, e_n$ of $\mathbb{R}^n$. If $k \neq i, j$, then the $i$-th and $j$-th rows of $e_k$ (or in other words, the $i$-th and $j$-th entries of $e_k$) are both $0$, so swapping those rows does nothing. I.e., $P_{ij}e_k = e_k$. However, if $k = i$, then we swap a $1$ in the $i$-th entry with a $0$ in the $j$-th entry, so $P_{ij}e_i = e_j$. Similarly, $P_{ij}e_j = e_i$. Thus, $P_{ij}$ looks like

$$P_{ij} = \begin{pmatrix}
1 & & & & & & \\
& \ddots & & & & & \\
& & 0 & \cdots & 1 & & \\
& & \vdots & \ddots & \vdots & & \\
& & 1 & \cdots & 0 & & \\
& & & & & \ddots & \\
& & & & & & 1
\end{pmatrix}$$
Most entries are $0$ except on the diagonal and the $1$'s that were moved off the diagonal. The off-diagonal $1$'s have indices $(i, j)$ and $(j, i)$.
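If you want to sanity-check this numerically, here's a quick NumPy sketch; the size $n = 5$ and the indices $i = 1$, $j = 3$ are arbitrary choices, and note that NumPy indexes from $0$ while the text counts from $1$:

```python
import numpy as np

n, i, j = 5, 1, 3            # arbitrary size and indices (0-based here, 1-based in the text)
P = np.eye(n)
P[[i, j]] = P[[j, i]]        # exchange rows i and j of the identity matrix

e = np.eye(n)                # column k of e is the standard basis vector e_k
assert np.array_equal(P @ e[:, i], e[:, j])   # P e_i = e_j
assert np.array_equal(P @ e[:, j], e[:, i])   # P e_j = e_i
assert np.array_equal(P @ e[:, 0], e[:, 0])   # e_k is fixed for k != i, j
```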
To compute the determinant, we perform Laplace expansion on the $i$-th column first. Assuming $i < j$, the only nonzero entry in that column is the $1$ in the $j$-th row, so this expansion gives

$$\det P_{ij} = (-1)^{j+i} \det M.$$

The matrix $M$ in the determinant is the minor we get from removing the $j$-th row and $i$-th column. Since the $j$-th row is below the top-right $1$ (the one at index $(i, j)$), its row index doesn't change. However, because the $i$-th column is to its left, the $1$ gets shifted to the left by $1$. Thus, in the minor, the top-right $1$ has index $(i, j - 1)$, and it is the only nonzero entry of the $(j-1)$-th column. Laplace expanding in the $(j-1)$-th column gives

$$\det P_{ij} = (-1)^{j+i} \cdot (-1)^{i+(j-1)} \det I_{n-2} = (-1)^{2(i+j)-1} = -1.$$

Here, we used the fact that the last remaining minor is the identity matrix $I_{n-2}$, whose determinant is $1$.
Next, $\det P_{ij}^T = \det P_{ij} = -1$, since $\det A^T = \det A$ is true for any matrix $A$ (or alternatively, you can say that $P_{ij}$ is symmetric, so it's equal to its transpose).
Lastly, to show that $P_{ij}$ is its own inverse, we can just check it on a basis. If $k \neq i, j$, then

$$P_{ij}^2 e_k = P_{ij}e_k = e_k.$$

And for $e_i$,

$$P_{ij}^2 e_i = P_{ij}e_j = e_i,$$

and similarly for $e_j$. Thus, $P_{ij}^2 = I_n$ holds on a basis, and by linearity, it holds everywhere, so $P_{ij}^{-1} = P_{ij}$.
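All three facts check out numerically as well; here's a quick NumPy verification with the same arbitrary choices of $n$, $i$, and $j$:

```python
import numpy as np

n, i, j = 5, 1, 3
P = np.eye(n)
P[[i, j]] = P[[j, i]]                                    # P_ij

assert np.isclose(np.linalg.det(P), -1.0)                # det P_ij = -1
assert np.isclose(np.linalg.det(P.T), np.linalg.det(P))  # det of the transpose agrees
assert np.array_equal(P @ P, np.eye(n))                  # P_ij is its own inverse
```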
Like before, we can compute $D_i(c)$ on a basis. If $k \neq i$, then $D_i(c)e_k = e_k$. Otherwise, $D_i(c)e_i = ce_i$, so

$$D_i(c) = \begin{pmatrix}
1 & & & & \\
& \ddots & & & \\
& & c & & \\
& & & \ddots & \\
& & & & 1
\end{pmatrix}$$

Here, $D_i(c)$ is $0$ except on the diagonal. At the $(i, i)$-th entry, it has a $c$, and it has a $1$ everywhere else on the diagonal. Since the determinant of an upper-triangular matrix is just the product of the diagonal entries (you can prove this by induction and Laplace expansion),

$$\det D_i(c) = c.$$
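A quick NumPy sanity check (the values $n = 5$, $i = 2$, $c = 7$ are arbitrary, and indices are $0$-based in code):

```python
import numpy as np

n, i, c = 5, 2, 7.0          # arbitrary size, index, and scalar
D = np.eye(n)
D[i, i] = c                  # multiply the i-th row of the identity by c

assert np.isclose(np.linalg.det(D), c)                    # det D_i(c) = c
assert np.isclose(np.linalg.det(D.T), np.linalg.det(D))   # symmetric, so same determinant
```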
Like before, $\det D_i(c)^T = \det D_i(c) = c$ since it's symmetric. Lastly, if $c \neq 0$, then we can just check that $D_i(1/c)$ is an inverse on a basis. If $k \neq i$, then

$$D_i(1/c)D_i(c)e_k = D_i(1/c)e_k = e_k,$$

and if $k = i$,

$$D_i(1/c)D_i(c)e_i = D_i(1/c)(ce_i) = \frac{c}{c}e_i = e_i.$$

Thus, $D_i(1/c)$ is a left-inverse for $D_i(c)$, and because $D_i(c)$ is a square matrix, it follows that it's also a right-inverse. In other words, $D_i(c)^{-1} = D_i(1/c)$.
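And the inverse formula checks out numerically too:

```python
import numpy as np

n, i, c = 5, 2, 7.0
D = np.eye(n); D[i, i] = c            # D_i(c)
Dinv = np.eye(n); Dinv[i, i] = 1 / c  # D_i(1/c)

assert np.allclose(Dinv @ D, np.eye(n))   # left-inverse
assert np.allclose(D @ Dinv, np.eye(n))   # ... and right-inverse, as expected
```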
I will assume that $i \neq j$. Otherwise, $E_{ii}(c) = D_i(1 + c)$, which is the previous case.
When $k \neq j$, the $j$-th row of $e_k$ is $0$, so adding a multiple of the $j$-th row of $e_k$ to anything doesn't change it. Thus, $E_{ij}(c)e_k = e_k$. On the other hand, if $k = j$, then we add $c$ to the $i$-th row of $e_j$, which gives $e_j + ce_i$. Hence,

$$E_{ij}(c) = \begin{pmatrix}
1 & & & & \\
& \ddots & & c & \\
& & \ddots & & \\
& & & \ddots & \\
& & & & 1
\end{pmatrix}$$

Like before, every entry is $0$ except on the diagonal, whose entries are all $1$'s, and except at the $(i, j)$-th entry, which is $c$. Assuming $i < j$, this is an upper-triangular matrix with all $1$'s on the diagonal, so its determinant is

$$\det E_{ij}(c) = 1.$$
Note that $E_{ij}(c)^T$ is a lower-triangular matrix, but the determinant of such a matrix has the same property: it's the product of the diagonal entries, so $\det E_{ij}(c)^T = 1$ also.
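Numerically, with arbitrary $n$, $i < j$, and $c$ ($0$-based indices in code):

```python
import numpy as np

n, i, j, c = 5, 1, 3, 7.0    # arbitrary, with i < j
E = np.eye(n)
E[i, j] = c                  # E_ij(c): the identity plus a c in entry (i, j)

assert np.isclose(np.linalg.det(E), 1.0)     # det E_ij(c) = 1
assert np.isclose(np.linalg.det(E.T), 1.0)   # same for the lower-triangular transpose
```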
Lastly, if $k \neq j$, then

$$E_{ij}(-c)E_{ij}(c)e_k = E_{ij}(-c)e_k = e_k,$$

and if $k = j$,

$$E_{ij}(-c)E_{ij}(c)e_j = E_{ij}(-c)(e_j + ce_i) = e_j + ce_i - ce_i = e_j.$$

Thus, $E_{ij}(-c)$ is a left-inverse, and since $E_{ij}(c)$ is a square matrix, it is invertible with $E_{ij}(c)^{-1} = E_{ij}(-c)$.
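A quick check of the inverse formula:

```python
import numpy as np

n, i, j, c = 5, 1, 3, 7.0
E = np.eye(n); E[i, j] = c    # E_ij(c)
F = np.eye(n); F[i, j] = -c   # E_ij(-c)

assert np.allclose(F @ E, np.eye(n))   # E_ij(-c) E_ij(c) = I
assert np.allclose(E @ F, np.eye(n))
```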
Let $A \in M_n(\mathbb{C})$, and define the matrix $\bar{A}$ via $(\bar{A})_{ij} = \overline{A_{ij}}$ for all $i, j$. Prove that $\det \bar{A} = \overline{\det A}$. Deduce that $\det A^* = \overline{\det A}$, where $A^* = \bar{A}^T$. Lastly, prove that if $A$ is unitary (i.e., $AA^* = I_n$), then $|\det A| = 1$.
The definition of the determinant we're using is

$$\det A = \sum_{i=1}^{n} (-1)^{i+j} A_{ij} \det \hat{A}_{ij},$$

where $j$ is any fixed column index and $\hat{A}_{ij}$ is the minor of $A$ we get from removing the $i$-th row and $j$-th column (i.e., the row and column that contain $A_{ij}$). Thus, if $A$ is an $n \times n$ matrix, then $\hat{A}_{ij}$ is an $(n-1) \times (n-1)$ matrix.
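This definition translates directly into a short (if wildly inefficient) recursive function. Here's a NumPy sketch that expands along the first column and checks the result against `np.linalg.det`; the test matrix is just a made-up example:

```python
import numpy as np

def det_laplace(A):
    """Determinant via Laplace expansion along the first column (j = 0 in 0-based terms)."""
    n = A.shape[0]
    if n == 1:
        return A[0, 0]
    total = 0.0
    for i in range(n):
        # Minor: delete the i-th row and the 0-th column (the ones containing A[i, 0]).
        minor = np.delete(np.delete(A, i, axis=0), 0, axis=1)
        total += (-1) ** i * A[i, 0] * det_laplace(minor)
    return total

A = np.array([[2.0, 1.0, 0.0],
              [1.0, 3.0, 4.0],
              [0.0, 5.0, 6.0]])
assert np.isclose(det_laplace(A), np.linalg.det(A))  # both give -10.0
```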
We prove this by induction. The base case is $n = 1$, where $A = (a)$ for some $a \in \mathbb{C}$. Then

$$\det \bar{A} = \bar{a} = \overline{\det A},$$

so the base case holds. For the inductive step, assume that $\det \bar{B} = \overline{\det B}$ for any $(n-1) \times (n-1)$ matrix $B$. We need to show it holds for any $n \times n$ matrix. Let $A$ be an $n \times n$ matrix. Then

$$\det \bar{A} = \sum_{i=1}^{n} (-1)^{i+j} \overline{A_{ij}} \det \overline{\hat{A}_{ij}},$$

since the $(i, j)$ minor of $\bar{A}$ is $\overline{\hat{A}_{ij}}$, the entrywise conjugate of the corresponding minor of $A$.
Note that $\hat{A}_{ij}$ is an $(n-1) \times (n-1)$ matrix, so by the inductive hypothesis,

$$\det \overline{\hat{A}_{ij}} = \overline{\det \hat{A}_{ij}}.$$
Also, recall that complex conjugation is additive and multiplicative: for $z, w \in \mathbb{C}$, we have $\overline{z + w} = \bar{z} + \bar{w}$ and $\overline{zw} = \bar{z}\bar{w}$. Lastly, note that $\overline{(-1)^{i+j}} = (-1)^{i+j}$ since it's a real number. Putting everything together,

$$\det \bar{A} = \sum_{i=1}^{n} (-1)^{i+j} \overline{A_{ij}} \, \overline{\det \hat{A}_{ij}} = \overline{\sum_{i=1}^{n} (-1)^{i+j} A_{ij} \det \hat{A}_{ij}} = \overline{\det A}.$$
Recall that $\det A^T = \det A$ for any matrix $A$. Thus,

$$\det A^* = \det \bar{A}^T = \det \bar{A} = \overline{\det A}.$$
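Both equalities are easy to sanity-check numerically, say on a random $4 \times 4$ complex matrix (the size and seed are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.normal(size=(4, 4)) + 1j * rng.normal(size=(4, 4))

detA = np.linalg.det(A)
assert np.isclose(np.linalg.det(np.conj(A)), np.conj(detA))    # det of conjugate = conjugate of det
assert np.isclose(np.linalg.det(np.conj(A).T), np.conj(detA))  # det A* = conjugate of det A
```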
Recall that if $A, B$ are square, then $\det(AB) = (\det A)(\det B)$. Thus, if we compute the determinant on both sides of $AA^* = I_n$, then

$$(\det A)(\det A^*) = 1.$$

But by the first part, $\det A^* = \overline{\det A}$, so

$$(\det A)\overline{\det A} = 1.$$
Lastly, recall that if $z \in \mathbb{C}$, then $z\bar{z} = |z|^2$, so

$$|\det A|^2 = 1,$$

which implies that $|\det A| = 1$.
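As a numerical sanity check, we can manufacture a unitary matrix from the QR decomposition of a random complex matrix (again, the size and seed are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(0)
Z = rng.normal(size=(4, 4)) + 1j * rng.normal(size=(4, 4))
Q, _ = np.linalg.qr(Z)                          # Q satisfies Q Q* = I, i.e. Q is unitary

assert np.allclose(Q @ np.conj(Q).T, np.eye(4))
assert np.isclose(abs(np.linalg.det(Q)), 1.0)   # |det Q| = 1
```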