Linear algebra (Osnabrück 2024-2025)/Part II/Lecture 42
- Normal endomorphisms
Due to Theorem 34.2, an isometry over $\mathbb{C}$ possesses an orthonormal basis consisting of eigenvectors. Due to Theorem 41.11, a self-adjoint mapping (over $\mathbb{R}$ or $\mathbb{C}$) also has an orthonormal basis of eigenvectors. We are looking for a common generalization of these two statements over $\mathbb{C}$. The result is the spectral theorem for normal endomorphisms; see Theorem 42.9. As this theorem provides an equivalent characterization for the existence of an orthonormal basis of eigenvectors, a further generalization is not possible.
Let $V$ be a finite-dimensional $\mathbb{K}$-vector space, endowed with an inner product $\left\langle -,- \right\rangle$. An endomorphism
$\varphi \colon V \longrightarrow V$
is called normal if $\varphi$ and its adjoint endomorphism
$\hat{\varphi} \colon V \longrightarrow V$
commute. Thus, the condition is
$\varphi \circ \hat{\varphi} = \hat{\varphi} \circ \varphi .$
A self-adjoint endomorphism is normal. In the case of an isometry $\varphi$, the adjoint endomorphism is $\varphi^{-1}$, due to Example 41.2. Therefore, an isometry is normal. If the endomorphism is described by the matrix $M$ with respect to an orthonormal basis, then the condition for normality means
$M \overline{ M^{ \text{tr} } } = \overline{ M^{ \text{tr} } } M .$
In the two-dimensional real situation, for $M = \begin{pmatrix} a & b \\ c & d \end{pmatrix}$, the condition for normality is
$\begin{pmatrix} a & b \\ c & d \end{pmatrix} \begin{pmatrix} a & c \\ b & d \end{pmatrix} = \begin{pmatrix} a & c \\ b & d \end{pmatrix} \begin{pmatrix} a & b \\ c & d \end{pmatrix} ;$
looking at the entries, this translates to the conditions
$b^2 = c^2$
and
$ac + bd = ab + cd .$
Besides diagonal matrices and rotation matrices, also real matrices of the form
$\begin{pmatrix} a & b \\ -b & a \end{pmatrix}$
have this property.
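This condition is easy to check numerically. The following NumPy sketch (an illustration added here, not part of the lecture; the values of $a$ and $b$ are arbitrary) verifies that a real matrix of the form above commutes with its transpose, which is its adjoint in the real case:

```python
import numpy as np

# Ad-hoc example values for a and b; any real numbers work here.
a, b = 3.0, 5.0
M = np.array([[a, b],
              [-b, a]])

left = M @ M.T    # M composed with its adjoint (transpose in the real case)
right = M.T @ M   # adjoint composed with M

print(np.allclose(left, right))  # True: the matrix is normal
```

Both products equal $(a^2 + b^2)$ times the identity matrix, which makes the commutativity visible directly.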
Suppose that the linear mapping
$\varphi \colon \mathbb{C}^n \longrightarrow \mathbb{C}^n$
has an orthonormal basis (with respect to the standard inner product) consisting of eigenvectors; that means that the describing matrix is in diagonal form
$D = \begin{pmatrix} \lambda_1 & \cdots & 0 \\ \vdots & \ddots & \vdots \\ 0 & \cdots & \lambda_n \end{pmatrix} .$
Then the adjoint endomorphism is described, due to Example 41.4, by the complex-conjugated matrix
$\overline{D} = \begin{pmatrix} \overline{\lambda_1} & \cdots & 0 \\ \vdots & \ddots & \vdots \\ 0 & \cdots & \overline{\lambda_n} \end{pmatrix} .$
These two matrices commute; that is, we have a normal endomorphism.
The third property of the following lemma explains the word "normal".
Let $V$ be a finite-dimensional $\mathbb{K}$-vector space, endowed with an inner product $\left\langle -,- \right\rangle$. Let
$\varphi \colon V \longrightarrow V$
denote an endomorphism. Then the following properties are equivalent.
- $\varphi$ is normal.
- For all $u, v \in V$, the identity
$\left\langle \varphi(u), \varphi(v) \right\rangle = \left\langle \hat{\varphi}(u), \hat{\varphi}(v) \right\rangle$
holds.
- For all $v \in V$, the identity
$\Vert \varphi(v) \Vert = \Vert \hat{\varphi}(v) \Vert$
holds.
We have
$\left\langle \varphi(u), \varphi(v) \right\rangle = \left\langle u, \hat{\varphi}( \varphi(v)) \right\rangle ,$
and, using Lemma 41.7 (3), we also have
$\left\langle \hat{\varphi}(u), \hat{\varphi}(v) \right\rangle = \left\langle u, \varphi( \hat{\varphi}(v)) \right\rangle .$
If $\varphi$ and $\hat{\varphi}$ commute, then we also have
$\left\langle \varphi(u), \varphi(v) \right\rangle = \left\langle \hat{\varphi}(u), \hat{\varphi}(v) \right\rangle$
for arbitrary $u, v \in V$. On the other hand, if this holds, then
$\left\langle u, ( \hat{\varphi} \circ \varphi - \varphi \circ \hat{\varphi} )(v) \right\rangle = 0$
for all $u, v \in V$; therefore, the endomorphisms commute. Hence, (1) and (2) are equivalent. From (2) to (3) is just the restriction to the case $u = v$. But also (3) implies (2), because we can express the inner product with the norm alone, according to the polarization formula.
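Property (3) of the lemma can be illustrated numerically. The following NumPy sketch (the complex matrix is an ad-hoc example, chosen so that it is normal; it is not taken from the lecture) checks that $\Vert \varphi(v) \Vert = \Vert \hat{\varphi}(v) \Vert$ for a randomly chosen vector:

```python
import numpy as np

# Ad-hoc normal matrix: 2*I plus a matrix N with N N* = N* N = 2*I.
M = np.array([[2.0 + 1.0j, 1.0],
              [-1.0, 2.0 - 1.0j]])
M_adj = M.conj().T                        # the adjoint (conjugate transpose)

# Sanity check: M really is normal.
print(np.allclose(M @ M_adj, M_adj @ M))  # True

rng = np.random.default_rng(0)
v = rng.standard_normal(2) + 1j * rng.standard_normal(2)

# Property (3): applying M or its adjoint preserves the norm comparison.
print(np.isclose(np.linalg.norm(M @ v), np.linalg.norm(M_adj @ v)))  # True
```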
Let $V$ be a finite-dimensional $\mathbb{K}$-vector space, endowed with an inner product $\left\langle -,- \right\rangle$. Let
$\varphi \colon V \longrightarrow V$
denote an endomorphism. Then a linear subspace $U \subseteq V$ is $\varphi$-invariant if and only if the orthogonal complement
$U^{\perp}$
is invariant under $\hat{\varphi}$.

Suppose that $U$ is invariant under $\varphi$. Let $u \in U$ and $v \in U^{\perp}$. Then
$\left\langle u, \hat{\varphi}(v) \right\rangle = \left\langle \varphi(u), v \right\rangle = 0 ,$
so $\hat{\varphi}(v) \in U^{\perp}$. The reverse statement follows, because the situation is symmetric, due to Corollary 32.13 (3) and Lemma 41.7 (3).
Due to Exercise 42.7, for a normal endomorphism, $U^{\perp}$ is itself invariant under $\varphi$. However, this rests on Theorem 42.9 below.
Let $V$ be a finite-dimensional $\mathbb{K}$-vector space, endowed with an inner product $\left\langle -,- \right\rangle$. Let
$\varphi \colon V \longrightarrow V$
denote a normal endomorphism. Then
$\ker \varphi = \ker \hat{\varphi} .$

Proof
Let $V$ be a finite-dimensional $\mathbb{K}$-vector space, endowed with an inner product $\left\langle -,- \right\rangle$. Let
$\varphi \colon V \longrightarrow V$
denote a normal endomorphism. Then the following statements hold.
- $\lambda$ is an eigenvalue of $\varphi$ if and only if $\overline{\lambda}$ is an eigenvalue of $\hat{\varphi}$.
- A vector $v \in V$ is an eigenvector of $\varphi$ for the eigenvalue $\lambda$ if and only if $v$ is an eigenvector of $\hat{\varphi}$ for the eigenvalue $\overline{\lambda}$.

Let
$\psi = \varphi - \lambda \operatorname{Id}_V ;$
its kernel is the eigenspace of $\varphi$ for the eigenvalue $\lambda$. The adjoint endomorphism of $\psi$ is, using Lemma 41.7,
$\hat{\psi} = \hat{\varphi} - \overline{\lambda} \operatorname{Id}_V .$
Due to Exercise 42.26, $\psi$ is also normal. Therefore, due to Lemma 42.7, we have
$\ker ( \varphi - \lambda \operatorname{Id}_V ) = \ker ( \hat{\varphi} - \overline{\lambda} \operatorname{Id}_V ) .$
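The second statement can be checked numerically. In the following NumPy sketch (the rotation angle is an arbitrary choice; the example is not from the lecture), an eigenvector of a plane rotation, which is normal, for the eigenvalue $\lambda$ is verified to be an eigenvector of the adjoint for $\overline{\lambda}$:

```python
import numpy as np

theta = 0.7   # arbitrary rotation angle
M = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])   # a rotation, hence normal

eigvals, eigvecs = np.linalg.eig(M)
lam, v = eigvals[0], eigvecs[:, 0]

# v is an eigenvector of M for the eigenvalue lam ...
print(np.allclose(M @ v, lam * v))                    # True
# ... and of the adjoint M* for the conjugate eigenvalue.
print(np.allclose(M.conj().T @ v, np.conj(lam) * v))  # True
```

For a rotation, the adjoint is the inverse rotation, and the eigenvalues $e^{\pm i \theta}$ satisfy $\overline{\lambda} = \lambda^{-1}$, which matches the printed result.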
Let $V$ be a finite-dimensional complex vector space, endowed with an inner product $\left\langle -,- \right\rangle$. Let
$\varphi \colon V \longrightarrow V$
denote an endomorphism. Then $\varphi$ is normal if and only if there exists an orthonormal basis consisting of eigenvectors of $\varphi$.

Suppose first that $u_1 , \ldots , u_n$ is an orthonormal basis of $V$, where the $u_i$ are eigenvectors of $\varphi$. The describing matrix is a diagonal matrix; its diagonal entries are the eigenvalues. Due to Lemma 41.6, the adjoint endomorphism is described by the conjugated-transposed matrix. Hence, this is also a diagonal matrix; therefore, it commutes with the describing matrix of $\varphi$, and $\varphi$ is normal.
We prove the converse statement by induction on the dimension of $V$. So let $\varphi$ be normal. The one-dimensional case is clear. Because of the fundamental theorem of algebra, there exists an eigenvector $u$ of $\varphi$, and we may assume that it has norm $1$. Due to Lemma 42.8 (2), $u$ is also an eigenvector of $\hat{\varphi}$; thus, the line $U = \mathbb{C} u$ is invariant under $\hat{\varphi}$. This implies by Lemma 42.6 that $U^{\perp}$ is invariant under $\varphi$. Therefore, $\varphi$ is the direct sum of the restrictions $\varphi|_U$ and $\varphi|_{U^{\perp}}$. Hence, the restriction of $\varphi$ to $U^{\perp}$ is again normal, and the induction hypothesis yields the claim.
The preceding statement does not hold over the real numbers, as every plane rotation (with the exception of the identity and the point reflection) shows.
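The complex spectral theorem can be illustrated numerically: for a normal complex matrix with distinct eigenvalues, the normalized eigenvectors form an orthonormal basis, and conjugating by the matrix of eigenvectors diagonalizes the endomorphism. The matrix in the following NumPy sketch is an ad-hoc example, not taken from the lecture:

```python
import numpy as np

# Ad-hoc normal complex matrix with distinct eigenvalues 2 +/- i*sqrt(2).
M = np.array([[2.0 + 1.0j, 1.0],
              [-1.0, 2.0 - 1.0j]])
print(np.allclose(M @ M.conj().T, M.conj().T @ M))  # True: M is normal

# Eigenvectors for distinct eigenvalues of a normal matrix are orthogonal,
# and numpy returns them with unit norm, so U is (numerically) unitary.
eigvals, U = np.linalg.eig(M)

print(np.allclose(U.conj().T @ U, np.eye(2)))             # True: U is unitary
print(np.allclose(U.conj().T @ M @ U, np.diag(eigvals)))  # True: diagonalized
```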
- Principal axis transformation
We want to apply the results of the previous lecture to Hermitian forms. This technique is called principal axis transformation; these terms will become clear in the next lecture. As we are working over the complex numbers, we have to mention briefly that the concepts positive definite, negative definite, and type, which we have defined for a real-symmetric bilinear form, carry over directly to complex-Hermitian sesquilinear forms. In the same spirit, Sylvester's law of inertia holds again, with the same proof; also the minor criterion holds.
Let $V$ be a finite-dimensional complex vector space, endowed with a Hermitian sesquilinear form of type $(p,q)$. Then the Gram matrix of the form with respect to every orthogonal basis is a diagonal matrix with $p$ positive real and $q$ negative real entries.

This implies immediately that, if we start with a real-symmetric bilinear form on $\mathbb{R}^n$, and if we consider it as a Hermitian sesquilinear form on $\mathbb{C}^n$, then the real type coincides with the complex type. The big advantage of the complex situation is that the fundamental theorem of algebra is available, and this guarantees the existence of eigenvalues. Even if we know, as in Lemma 41.10, that all eigenvalues are real, their existence is only clear when working over the complex numbers.
Let $V$ be a finite-dimensional $\mathbb{K}$-vector space, endowed with an inner product. Let $f$ denote a Hermitian form on $V$, corresponding to the self-adjoint endomorphism
$\varphi \colon V \longrightarrow V$
in the sense of Lemma 41.12. Let $(p,q)$ be the type of $f$. Then $p$ is the number of positive eigenvalues, and $q$ is the number of negative eigenvalues of $\varphi$, where we have to take these numbers with their (algebraic or geometric) multiplicity.

According to Lemma 41.10, the characteristic polynomial of $\varphi$ splits into real linear factors. Let $\lambda_1 , \ldots , \lambda_k$ be the positive zeroes, and let $\mu_1 , \ldots , \mu_l$ be the negative zeroes. Due to Theorem 41.11, we have a direct sum decomposition
$V = \operatorname{Eig}_{\lambda_1}(\varphi) \oplus \cdots \oplus \operatorname{Eig}_{\lambda_k}(\varphi) \oplus \operatorname{Eig}_{\mu_1}(\varphi) \oplus \cdots \oplus \operatorname{Eig}_{\mu_l}(\varphi) \oplus \operatorname{Eig}_0(\varphi) ,$
which is orthogonal with respect to the inner product ($\operatorname{Eig}_0(\varphi)$ might be zero). For vectors $u$ and $v$ from different eigenspaces, we have
$f(u,v) = \left\langle \varphi(u), v \right\rangle = \lambda \left\langle u, v \right\rangle = 0 ;$
therefore, the eigenspaces are also orthogonal with respect to the form $f$. For
$v = v_1 + \cdots + v_k \in \operatorname{Eig}_{\lambda_1}(\varphi) \oplus \cdots \oplus \operatorname{Eig}_{\lambda_k}(\varphi)$
with $v \neq 0$, we have
$f(v,v) = \sum_{i = 1}^k \lambda_i \left\langle v_i, v_i \right\rangle > 0 .$
This means that on this linear subspace, the restricted form is positive definite; hence,
$p \geq \dim \left( \operatorname{Eig}_{\lambda_1}(\varphi) \oplus \cdots \oplus \operatorname{Eig}_{\lambda_k}(\varphi) \right) .$
If $p$ were strictly larger than this dimension, there would exist a $p$-dimensional linear subspace $W \subseteq V$ such that the restriction of $f$ to $W$ is positive definite. Because of Corollary 9.8, we have
$W \cap \left( \operatorname{Eig}_{\mu_1}(\varphi) \oplus \cdots \oplus \operatorname{Eig}_{\mu_l}(\varphi) \oplus \operatorname{Eig}_0(\varphi) \right) \neq 0 .$
This yields a contradiction, since the form is negative semidefinite on the right-hand space. Therefore,
$p = \dim \left( \operatorname{Eig}_{\lambda_1}(\varphi) \oplus \cdots \oplus \operatorname{Eig}_{\lambda_k}(\varphi) \right) .$
The argument for $q$ is the same.
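In coordinates, the theorem says that the type of a Hermitian form can be read off from the signs of the eigenvalues of its describing self-adjoint matrix, counted with multiplicity. A minimal NumPy sketch (the matrix is a hypothetical example, not from the lecture):

```python
import numpy as np

# Real symmetric (hence Hermitian) matrix describing a form on R^3.
A = np.array([[2.0, 1.0, 0.0],
              [1.0, 2.0, 0.0],
              [0.0, 0.0, -3.0]])

eigvals = np.linalg.eigvalsh(A)    # all eigenvalues are real: 1, 3, -3
p = int(np.sum(eigvals > 0))       # number of positive eigenvalues
q = int(np.sum(eigvals < 0))       # number of negative eigenvalues

print((p, q))  # (2, 1): the type of the form
```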
We are now able to present the proof of the eigenvalue criterion about the type of a real-symmetric bilinear form. This follows immediately from Theorem 42.11.
The following theorem is called the principal axis theorem.
Let $V$ be a finite-dimensional $\mathbb{K}$-vector space, endowed with an inner product. Let $f$ denote a Hermitian form on $V$. Then there exists an orthonormal basis of $V$ (with respect to the inner product), which is an orthogonal basis with respect to $f$.

Due to Lemma 41.12 (2) and Lemma 41.12 (4), we can write
$f(v,w) = \left\langle \varphi(v), w \right\rangle$
with some self-adjoint endomorphism
$\varphi \colon V \longrightarrow V .$
Because of Theorem 41.11, there exists an orthonormal basis $u_1 , \ldots , u_n$ consisting of eigenvectors of $\varphi$ with the eigenvalues $\lambda_1 , \ldots , \lambda_n$. For this basis, we have, for $i \neq j$,
$f(u_i, u_j) = \left\langle \varphi(u_i), u_j \right\rangle = \lambda_i \left\langle u_i, u_j \right\rangle = 0 .$
Therefore, this basis is also an orthogonal basis with respect to $f$.
In this context, the eigenlines of $\varphi$ are also called the principal axes of $f$, and the eigenvalues are also called the principal values.
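In coordinates, the principal axis transformation can be sketched as follows (a NumPy illustration; the Gram matrix $A$ is a hypothetical example): an orthonormal eigenbasis of the self-adjoint matrix $A$ turns the Gram matrix of the form into a diagonal matrix whose entries are the principal values.

```python
import numpy as np

# Gram matrix of the Hermitian form f(v, w) = <A v, w> on R^2 (ad-hoc example).
A = np.array([[3.0, 1.0],
              [1.0, 3.0]])          # real symmetric, hence self-adjoint

eigvals, U = np.linalg.eigh(A)      # columns of U: orthonormal eigenbasis

# Gram matrix of the form in the new basis: diagonal with the principal values.
gram_new = U.T @ A @ U
print(np.allclose(gram_new, np.diag(eigvals)))  # True
```

The columns of `U` are the directions of the principal axes, and `eigvals` (here $2$ and $4$) are the principal values.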