The spectral theorem for Hermitian matrices
A spectral theorem is a theorem about the diagonalization of a matrix or linear operator. A matrix $A$ is diagonalizable if it can be written in the form $A = P D P^{-1}$, where $D$ is a diagonal matrix. In this article, I will explain what a Hermitian matrix is, derive some properties, and use them to prove a spectral theorem for Hermitian matrices.
In the rest of the article, I will use the usual inner product on the complex vector space $\mathbb{C}^n$:

$$\langle x, y \rangle = \sum_{i=1}^n x_i \overline{y_i}$$

and the corresponding norm:

$$\|x\| = \sqrt{\langle x, x \rangle}$$

We will often use that the inner product is linear in its first argument, and conjugate linear in its second:

$$\langle \alpha x, y \rangle = \alpha \langle x, y \rangle, \qquad \langle x, \alpha y \rangle = \overline{\alpha} \langle x, y \rangle$$
Here, $\overline{z}$ denotes the complex conjugate, which is defined by $\overline{a + bi} = a - bi$ for real $a$ and $b$. These are straightforward generalizations of the normal Euclidean inner product and norm on real vector spaces. In particular, this inner product equals the ‘normal’ inner product for real vectors.
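To make the conventions concrete, here is a small NumPy sketch (an illustration only, not part of the exposition) that checks these identities numerically. Note that NumPy's `np.vdot` conjugates its *first* argument, while our convention conjugates the second, hence the swapped argument order below.

```python
import numpy as np

x = np.array([1 + 2j, 3 - 1j])
y = np.array([2 - 1j, 1 + 1j])

# <x, y> = sum_i x_i * conj(y_i); np.vdot(y, x) conjugates y, matching this.
inner = np.sum(x * np.conj(y))
assert np.isclose(inner, np.vdot(y, x))

# ||x|| = sqrt(<x, x>)
norm = np.sqrt(np.sum(x * np.conj(x)).real)
assert np.isclose(norm, np.linalg.norm(x))

# Linear in the first argument, conjugate linear in the second.
a = 2 + 3j
assert np.isclose(np.sum((a * x) * np.conj(y)), a * inner)
assert np.isclose(np.sum(x * np.conj(a * y)), np.conj(a) * inner)
```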
Hermitian operators
Now, we are ready to define Hermitian operators:
Definition: A Hermitian or self-adjoint operator $A$ on a space $V$ with an inner product is an operator for which

$$\langle Ax, y \rangle = \langle x, Ay \rangle \quad \text{for all } x, y \in V$$
By this definition, symmetric matrices with real elements are Hermitian. However, for matrices with complex elements, the condition is slightly different due to the complex conjugation in the second argument of the inner product.
The conjugate transpose $A^*$ of a complex matrix $A$ is defined by $(A^*)_{ij} = \overline{A_{ji}}$.
Theorem: A matrix $A$ with complex elements is Hermitian if and only if $A = A^*$.
Proof: We have $\langle Ax, y \rangle = (Ax)^T \overline{y} = x^T A^T \overline{y}$ and $\langle x, Ay \rangle = x^T \overline{Ay} = x^T \overline{A} \, \overline{y}$, so $\langle Ax, y \rangle = \langle x, Ay \rangle$ if and only if $x^T A^T \overline{y} = x^T \overline{A} \, \overline{y}$. This equality can only hold for all $x$ and $y$ if $A^T = \overline{A}$. Taking transposes from both sides, we see that this holds if and only if $A = \overline{A}^T = A^*$.
I want to emphasize that Hermiticity can be seen as a generalization of symmetry: we have $A^* = A^T$ if $A$ is a matrix with real elements, so every symmetric matrix with real elements is Hermitian.
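As a quick numerical sanity check (a sketch assuming NumPy, not part of the proof), we can build a Hermitian matrix as $B + B^*$ and verify the defining property:

```python
import numpy as np

rng = np.random.default_rng(0)
B = rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4))
A = B + B.conj().T  # B + B* is Hermitian by construction

assert np.allclose(A, A.conj().T)  # A equals its conjugate transpose

# <Ax, y> == <x, Ay> for arbitrary vectors x and y
x = rng.standard_normal(4) + 1j * rng.standard_normal(4)
y = rng.standard_normal(4) + 1j * rng.standard_normal(4)
assert np.isclose(np.sum((A @ x) * np.conj(y)), np.sum(x * np.conj(A @ y)))
```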
The spectral theorem for Hermitian matrices
Hermitian matrices have some pleasing properties, which can be used to prove a spectral theorem.
Lemma: The eigenvalues of a Hermitian matrix are real.
Proof: Let $v$ be an eigenvector of $A$ with eigenvalue $\lambda$. Then

$$\lambda \langle v, v \rangle = \langle \lambda v, v \rangle = \langle Av, v \rangle = \langle v, Av \rangle = \langle v, \lambda v \rangle = \overline{\lambda} \langle v, v \rangle$$

Since $\langle v, v \rangle = \|v\|^2 \neq 0$, it follows that $\lambda = \overline{\lambda}$, so $\lambda$ must be real.
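This is easy to observe numerically (again a NumPy sketch): the general-purpose eigenvalue routine returns complex eigenvalues, but for a Hermitian input their imaginary parts vanish up to floating-point error.

```python
import numpy as np

rng = np.random.default_rng(1)
B = rng.standard_normal((5, 5)) + 1j * rng.standard_normal((5, 5))
A = B + B.conj().T  # Hermitian

eigvals = np.linalg.eigvals(A)       # general routine, complex output
assert np.allclose(eigvals.imag, 0)  # ...but the eigenvalues are real
```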
Recall that two vectors $x$ and $y$ are orthogonal if their inner product is zero, that is, $\langle x, y \rangle = 0$; that a set of vectors $v_1, \dots, v_k$ is orthogonal if every pair $v_i, v_j$ with $i \neq j$ is orthogonal; and that it is orthonormal if it is orthogonal and every vector has unit norm, that is, $\|v_i\| = 1$.
We will need some lemmas to prove the main result later on. The first is a simple result that states that vectors orthogonal to an eigenvector stay orthogonal to it when multiplied by $A$.
Lemma: If $x$ is orthogonal to an eigenvector $v$ of a Hermitian matrix $A$, then $Ax$ is orthogonal to $v$ as well.
Proof: Suppose that $\lambda$ is the eigenvalue associated to $v$. Then

$$\langle Ax, v \rangle = \langle x, Av \rangle = \langle x, \lambda v \rangle = \overline{\lambda} \langle x, v \rangle = 0$$

So $\langle Ax, v \rangle = 0$, which means that $Ax$ and $v$ are orthogonal.
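A small NumPy sketch (illustration only) of this lemma: take an eigenvector $v$, project it out of a random vector to get $x$ orthogonal to $v$, and check that $Ax$ is orthogonal to $v$ as well.

```python
import numpy as np

rng = np.random.default_rng(2)
B = rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4))
A = B + B.conj().T  # Hermitian

_, vecs = np.linalg.eigh(A)
v = vecs[:, 0]  # a unit-norm eigenvector of A

# Make x orthogonal to v by removing its component along v.
x = rng.standard_normal(4) + 1j * rng.standard_normal(4)
x = x - np.sum(x * np.conj(v)) * v

assert np.isclose(np.sum(x * np.conj(v)), 0)        # <x, v> = 0
assert np.isclose(np.sum((A @ x) * np.conj(v)), 0)  # <Ax, v> = 0 as well
```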
The second lemma is about the behavior of matrices with orthogonal rows.
Lemma: Let $V$ be a $k \times n$ matrix with orthonormal rows $v_1, \dots, v_k$, and let $\mathcal{V}$ be the space spanned by the columns of $V^*$. Then
- $V V^* = I_k$, the identity matrix of size $k \times k$
- $V^* V x = x$ for all $x \in \mathcal{V}$
Proof: The element at position $(i, j)$ of $V V^*$ is $\langle v_i, v_j \rangle$. By the orthonormality of $v_1, \dots, v_k$ it follows that this expression is $1$ when $i = j$ (that is, the element is on the diagonal), and $0$ otherwise. So $V V^*$ equals $I_k$, the identity matrix of size $k \times k$.

For the second result, assume that $x \in \mathcal{V}$. Then $x$ is a linear combination of the columns of $V^*$, so we can write $x = V^* c$ for some $c \in \mathbb{C}^k$. Then, using the first part of the lemma, we have:

$$V^* V x = V^* V V^* c = V^* I_k c = V^* c = x$$
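Numerically (a sketch assuming NumPy), we can obtain a matrix with orthonormal rows from a QR decomposition and check both parts of the lemma:

```python
import numpy as np

rng = np.random.default_rng(3)
n, k = 6, 3

M = rng.standard_normal((n, k)) + 1j * rng.standard_normal((n, k))
Q, _ = np.linalg.qr(M)  # Q is n x k with orthonormal columns
V = Q.conj().T          # so V = Q* is k x n with orthonormal rows

assert np.allclose(V @ V.conj().T, np.eye(k))  # V V* = I_k

# For x in the column space of V* (i.e. x = V* c), V* V acts as the identity.
c = rng.standard_normal(k) + 1j * rng.standard_normal(k)
x = V.conj().T @ c
assert np.allclose(V.conj().T @ V @ x, x)
```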
With these results we are finally ready to prove the existence of an orthogonal basis of eigenvectors.
Theorem: A Hermitian $n \times n$ matrix $A$ has $n$ mutually orthogonal eigenvectors.
Proof: We use induction on the number of orthogonal eigenvectors of $A$. The characteristic equation $\det(A - \lambda I) = 0$ is a complex polynomial equation of degree $n$, so it has a solution $\lambda \in \mathbb{C}$ by the fundamental theorem of algebra. That implies that for this $\lambda$, the matrix $A - \lambda I$ is singular, so there exists a $v_1 \neq 0$ such that $(A - \lambda I) v_1 = 0$. This implies that $A v_1 = \lambda v_1$, so we have a set of one eigenvector $\{v_1\}$, which is trivially orthogonal. This proves the base case.
For the induction step, assume the existence of $k$ (with $k < n$) orthogonal eigenvectors $v_1, \dots, v_k$. We then need to prove the existence of another eigenvector $v_{k+1}$ that is orthogonal to $v_1, \dots, v_k$. Let $u_1, \dots, u_{n-k}$ be an orthonormal basis of the space of vectors orthogonal to all the eigenvectors $v_1, \dots, v_k$, and let $U$ be the $(n-k) \times n$ matrix with $\overline{u_1}, \dots, \overline{u_{n-k}}$ as its rows, so that $u_1, \dots, u_{n-k}$ are the columns of $U^*$. (These rows are again orthonormal, since $\langle \overline{u_i}, \overline{u_j} \rangle = \overline{\langle u_i, u_j \rangle}$.) Now, $U A U^*$ is Hermitian, since $(U A U^*)^* = U A^* U^* = U A U^*$, so as we just proved for the base case, it must have at least one eigenvector $w$ with eigenvalue $\lambda$. So we have

$$U A U^* w = \lambda w$$
Multiplying both sides by $U^*$ on the left gives $U^* U A U^* w = \lambda U^* w$. Now define $v = U^* w$ and substitute to get

$$U^* U A v = \lambda v$$
Now, since $v$ is a linear combination of the columns of $U^*$, it is orthogonal to all the eigenvectors $v_1, \dots, v_k$. So, by the first lemma, $Av$ is also orthogonal to all these eigenvectors, which means that $Av$ is a linear combination of the columns of $U^*$ as well. By the second lemma, it follows that $U^* U A v = A v$. So we are left with

$$A v = \lambda v$$
So $v$ is an eigenvector of $A$. Note that $v \neq 0$, since $\|v\|^2 = w^* U U^* w = \|w\|^2 \neq 0$. Moreover, since $v = U^* w$, $v$ is a linear combination of $u_1, \dots, u_{n-k}$, so it is orthogonal to the eigenvectors $v_1, \dots, v_k$. Taking $v_{k+1} = v$ completes the induction step.
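The proof is constructive, and we can mimic it numerically. Below is a NumPy sketch of the construction (the function name `hermitian_eigenvectors` is mine, purely for illustration; in practice one would just call `np.linalg.eigh`). An SVD provides the orthonormal basis of the orthogonal complement that plays the role of the columns of $U^*$, here collected in a matrix $Z = U^*$:

```python
import numpy as np

def hermitian_eigenvectors(A):
    """Orthonormal eigenvectors of a Hermitian A via the inductive proof."""
    n = A.shape[0]
    vs = []  # orthonormal eigenvectors found so far
    for k in range(n):
        if k == 0:
            Z = np.eye(n, dtype=complex)  # base case: the whole space
        else:
            # Columns of Z: orthonormal basis of the orthogonal complement
            # of span{v_1, ..., v_k}, read off from an SVD.
            _, _, Vh = np.linalg.svd(np.array(vs).conj())
            Z = Vh[k:].conj().T
        # Z^H A Z plays the role of U A U* (with U = Z^H). It is Hermitian,
        # so it has an eigenvector w, and v = Z w is an eigenvector of A.
        _, W = np.linalg.eig(Z.conj().T @ A @ Z)
        v = Z @ W[:, 0]
        vs.append(v / np.linalg.norm(v))
    return np.array(vs).T  # eigenvectors as columns

rng = np.random.default_rng(4)
B = rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4))
A = B + B.conj().T                             # a random Hermitian matrix
V = hermitian_eigenvectors(A)
assert np.allclose(V.conj().T @ V, np.eye(4))  # columns are orthonormal
lams = np.diag(V.conj().T @ A @ V)             # corresponding eigenvalues
assert np.allclose(A @ V, V @ np.diag(lams))   # each column is an eigenvector
```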
Since $n$ mutually orthogonal nonzero vectors in $\mathbb{C}^n$ are linearly independent, these eigenvectors form a basis. Of course, it is now easy to make this basis orthonormal by scaling the vectors in the basis.
Corollary: A Hermitian matrix has a basis of orthonormal eigenvectors.
Proof: By the preceding theorem, there exists a basis of orthogonal eigenvectors of $A$. Denote this basis with $v_1, \dots, v_n$, and define $u_i = v_i / \|v_i\|$. Now,

$$\langle u_i, u_j \rangle = \frac{\langle v_i, v_j \rangle}{\|v_i\| \|v_j\|}$$

which is $1$ when $i = j$ and $0$ when $i \neq j$. So this basis is orthonormal.
Definition: A unitary matrix is a matrix $U$ for which $U^* U = U U^* = I$.
Theorem (Spectral theorem for Hermitian matrices): A Hermitian matrix $A$ can be written as

$$A = U \Lambda U^*$$

where $U$ is a unitary matrix, and $\Lambda$ is a diagonal matrix with real elements.
Proof: Let $u_1, \dots, u_n$ be an orthonormal basis of eigenvectors of $A$, and let $\lambda_1, \dots, \lambda_n$ be the corresponding eigenvalues, which are real by the earlier lemma. Now, take $U$ to be the matrix with $u_i$ as the $i$th column, and $\Lambda$ to be the matrix with $\lambda_i$ as the $i$th element on the diagonal.
To prove that $U$ is unitary, consider the element at position $(i, j)$ of the matrix $U^* U$. It is given by $\langle u_j, u_i \rangle$, which is $1$ when $i = j$ and $0$ otherwise. So the elements on the diagonal of $U^* U$ are one and the others zero, which means that $U^* U = I$. Furthermore, since $U$ is square, $U^*$ must then be the inverse of $U$. So $U U^* = I$ as well, and $U$ is unitary.
To prove that $A = U \Lambda U^*$, consider the effect of multiplying an eigenvector $u_i$ on the left by this expression:

$$U \Lambda U^* u_i = U \Lambda e_i = U (\lambda_i e_i) = \lambda_i u_i = A u_i$$

Here $e_i$ is the $i$th standard basis vector, and $U^* u_i = e_i$ by the orthonormality of the basis.
Since $u_1, \dots, u_n$ is a basis of $\mathbb{C}^n$, every vector $x \in \mathbb{C}^n$ can be written as a linear combination of the vectors $u_i$. So we have $U \Lambda U^* x = A x$ for every $x \in \mathbb{C}^n$. It follows that $A = U \Lambda U^*$.
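NumPy's `np.linalg.eigh` computes exactly such a decomposition for a Hermitian matrix, so we can check the theorem numerically (again a sketch, for illustration only):

```python
import numpy as np

rng = np.random.default_rng(5)
B = rng.standard_normal((5, 5)) + 1j * rng.standard_normal((5, 5))
A = B + B.conj().T          # a random Hermitian matrix

lam, U = np.linalg.eigh(A)  # real eigenvalues, eigenvectors as columns
Lam = np.diag(lam)

assert np.allclose(U.conj().T @ U, np.eye(5))  # U is unitary
assert np.allclose(U @ Lam @ U.conj().T, A)    # A = U Lambda U*
```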
With this, we have finally proved the spectral theorem for Hermitian matrices. While the theorem itself is certainly interesting enough to prove, the proof has other benefits as well. First, there is a spectral theorem for unitary matrices as well, and its proof is analogous to this one. Second, the spectral theorem for Hermitian matrices can be used to easily prove the existence of the singular value decomposition.