Linear algebra (Osnabrück 2024-2025)/Part II/Lecture 41


A bilinear form or a sesquilinear form on an $n$-dimensional $\mathbb{K}$-vector space $V$ is described, with respect to a given basis, by its Gram matrix. Also, a linear mapping from $V$ to $V$ is described by a matrix. Altogether, we have a correspondence (for a fixed basis)

$$\{ \text{sesquilinear forms on } V \} \longleftrightarrow \{ n \times n\text{-matrices over } \mathbb{K} \} \longleftrightarrow \{ \text{linear mappings } V \longrightarrow V \}.$$
On the left-hand side, properties like symmetric, hermitian, positive definite are relevant; on the right-hand side, we have eigenvalues, eigenspaces, characteristic polynomial. How are these two worlds related? In the next lectures we will deal with these questions. For this, we will work with a direct correspondence between the left-hand and the right-hand side which comes from an inner product, not from a fixed basis.



Adjoint endomorphism

Let $V$ be a $\mathbb{K}$-vector space, endowed with an inner product $\left\langle - , - \right\rangle$, and let

$$\varphi \colon V \longrightarrow V$$

denote an endomorphism. An endomorphism

$$\psi \colon V \longrightarrow V$$

is called adjoint to $\varphi$ if

$$\left\langle \varphi(v) , w \right\rangle = \left\langle v , \psi(w) \right\rangle$$

holds for all $v, w \in V$.

For an isometry

$$\varphi \colon V \longrightarrow V$$

on a Euclidean vector space $V$, the inverse mapping $\varphi^{-1}$ is the adjoint endomorphism. In this case, we have indeed

$$\left\langle \varphi(v) , w \right\rangle = \left\langle \varphi(v) , \varphi( \varphi^{-1}(w)) \right\rangle = \left\langle v , \varphi^{-1}(w) \right\rangle.$$

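This identity can be checked numerically. The following minimal sketch (not part of the lecture) uses numpy; the helper inner and all names are ad hoc and implement the convention of the lecture that the inner product is linear in the first and antilinear in the second argument.

```python
import numpy as np

# Inner product as in the lecture: linear in the first argument,
# antilinear in the second (over the reals this is just the dot product).
def inner(v, w):
    return np.sum(v * np.conj(w))

# A rotation of R^2 is an isometry with respect to the standard inner product.
theta = 0.7
Q = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
Q_inv = np.linalg.inv(Q)              # for an isometry this is also Q transposed

rng = np.random.default_rng(0)
v, w = rng.normal(size=2), rng.normal(size=2)

# <Q v, w> agrees with <v, Q^{-1} w>, so Q^{-1} is adjoint to Q.
print(np.isclose(inner(Q @ v, w), inner(v, Q_inv @ w)))   # True
```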
For a homothety

$$\varphi \colon V \longrightarrow V, \, v \longmapsto z v,$$

with scaling factor $z \in \mathbb{K}$ on a $\mathbb{K}$-vector space $V$, endowed with an inner product, the homothety with scaling factor $\overline{z}$ is the adjoint mapping. Indeed, we have

$$\left\langle z v , w \right\rangle = z \left\langle v , w \right\rangle = \left\langle v , \overline{z} w \right\rangle.$$
Suppose that for the linear mapping

$$\varphi \colon \mathbb{K}^n \longrightarrow \mathbb{K}^n,$$

there exists an orthonormal basis $u_1, \ldots, u_n$ (with respect to the standard inner product) consisting of eigenvectors; that is, the describing matrix with respect to this basis is in diagonal form

$$D = \begin{pmatrix} \lambda_1 & 0 & \cdots & 0 \\ 0 & \lambda_2 & \cdots & 0 \\ \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & \cdots & \lambda_n \end{pmatrix}.$$

Then the adjoint endomorphism is described by the complex-conjugated matrix

$$\overline{D} = \begin{pmatrix} \overline{\lambda_1} & 0 & \cdots & 0 \\ 0 & \overline{\lambda_2} & \cdots & 0 \\ \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & \cdots & \overline{\lambda_n} \end{pmatrix}.$$

Indeed, on one hand we have

$$\left\langle \varphi(u_i) , u_j \right\rangle = \left\langle \lambda_i u_i , u_j \right\rangle = \lambda_i \left\langle u_i , u_j \right\rangle,$$

and on the other hand we have

$$\left\langle u_i , \overline{\lambda_j} u_j \right\rangle = \lambda_j \left\langle u_i , u_j \right\rangle.$$

For $i \neq j$, we have $0$ on both sides, and for $i = j$, we have $\lambda_i$ on both sides.
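A corresponding numerical sketch for a complex diagonal matrix with respect to the standard basis of $\mathbb{C}^3$ (which is an orthonormal basis of eigenvectors); numpy-based, the eigenvalues and names are chosen ad hoc.

```python
import numpy as np

def inner(v, w):                        # linear in v, antilinear in w
    return np.sum(v * np.conj(w))

lam = np.array([2 + 1j, -3j, 1.5])      # eigenvalues lambda_1, ..., lambda_3
D = np.diag(lam)                        # describing matrix in diagonal form
D_conj = np.diag(np.conj(lam))          # complex-conjugated diagonal matrix

E = np.eye(3, dtype=complex)            # standard basis u_1, u_2, u_3
for i in range(3):
    for j in range(3):
        lhs = inner(D @ E[:, i], E[:, j])         # <phi(u_i), u_j>
        rhs = inner(E[:, i], D_conj @ E[:, j])    # <u_i, conj(D) u_j>
        assert np.isclose(lhs, rhs)
print("the conjugated diagonal matrix acts as the adjoint")
```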


Let $V$ be a finite-dimensional $\mathbb{K}$-vector space, endowed with an inner product $\left\langle - , - \right\rangle$, and let

$$\varphi \colon V \longrightarrow V$$

denote an endomorphism. Then there exists a uniquely determined adjoint endomorphism

$$\hat{\varphi} \colon V \longrightarrow V$$

of $\varphi$.

Let

$$\varphi \colon V \longrightarrow V$$

be given, and let $w \in V$ be fixed. Then, the mapping

$$V \longrightarrow \mathbb{K}, \, v \longmapsto \left\langle \varphi(v) , w \right\rangle,$$

is a linear form on $V$. Therefore, there exists (due to Corollary 38.6 in the real case; for the complex case see Exercise 41.16) a right gradient $\hat{w} \in V$ (uniquely determined by $\varphi$ and $w$) fulfilling

$$\left\langle \varphi(v) , w \right\rangle = \left\langle v , \hat{w} \right\rangle \text{ for all } v \in V.$$

We have to show that the assignment

$$V \longrightarrow V, \, w \longmapsto \hat{w},$$

is linear. We have

$$\left\langle v , \widehat{w_1 + w_2} \right\rangle = \left\langle \varphi(v) , w_1 + w_2 \right\rangle = \left\langle \varphi(v) , w_1 \right\rangle + \left\langle \varphi(v) , w_2 \right\rangle = \left\langle v , \hat{w}_1 \right\rangle + \left\langle v , \hat{w}_2 \right\rangle = \left\langle v , \hat{w}_1 + \hat{w}_2 \right\rangle.$$

As this holds for all $v \in V$, we have

$$\widehat{w_1 + w_2} = \hat{w}_1 + \hat{w}_2.$$

Moreover,

$$\left\langle v , \widehat{z w} \right\rangle = \left\langle \varphi(v) , z w \right\rangle = \overline{z} \left\langle \varphi(v) , w \right\rangle = \overline{z} \left\langle v , \hat{w} \right\rangle = \left\langle v , z \hat{w} \right\rangle.$$

As this holds for all $v \in V$, we get

$$\widehat{z w} = z \hat{w}.$$

As in the proof of this theorem, we denote the adjoint endomorphism of $\varphi$ by $\hat{\varphi}$. If we denote the assignment that sends a vector $w \in V$ to the linear form $v \mapsto \left\langle v , w \right\rangle$ by

$$\Psi \colon V \longrightarrow V^*,$$

then we have

$$\hat{\varphi} = \Psi^{-1} \circ \varphi^* \circ \Psi,$$

where

$$\varphi^* \colon V^* \longrightarrow V^*$$

is the dual mapping.


Let $V$ be a finite-dimensional $\mathbb{K}$-vector space, endowed with an inner product $\left\langle - , - \right\rangle$. Let

$$\varphi \colon V \longrightarrow V$$

be an endomorphism, and suppose that it is described by the matrix $M$ with respect to the orthonormal basis $u_1, \ldots, u_n$. Then the adjoint endomorphism

$$\hat{\varphi} \colon V \longrightarrow V$$

is described by the conjugate-transposed matrix $\overline{M}^{\text{tr}}$ with respect to this basis.

Let $u_1, \ldots, u_n$ be the orthonormal basis, and let

$$M = (a_{ij})_{ij}$$

and

$$N = (b_{ij})_{ij}$$

be the matrices of $\varphi$ and of $\hat{\varphi}$ with respect to this basis. This means that we have in particular

$$a_{ij} = \left\langle \varphi(u_j) , u_i \right\rangle$$

and

$$b_{ij} = \left\langle \hat{\varphi}(u_j) , u_i \right\rangle.$$

Due to adjointness, the relation

$$\left\langle \varphi(u_j) , u_i \right\rangle = \left\langle u_j , \hat{\varphi}(u_i) \right\rangle = \overline{ \left\langle \hat{\varphi}(u_i) , u_j \right\rangle }$$

holds. That is, $a_{ij} = \overline{b_{ji}}$ and, vice versa, $b_{ij} = \overline{a_{ji}}$; hence $N = \overline{M}^{\text{tr}}$.
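The lemma can be illustrated numerically with a non-standard orthonormal basis, obtained here from a QR decomposition; a numpy sketch with ad hoc names.

```python
import numpy as np

def inner(v, w):                        # linear in v, antilinear in w
    return np.sum(v * np.conj(w))

rng = np.random.default_rng(2)
n = 3
M = rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n))   # phi on C^n

# Columns of B form an orthonormal basis u_1, ..., u_n (QR of a random matrix).
B, _ = np.linalg.qr(rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n)))

A = np.conj(B).T @ M @ B               # describing matrix of phi w.r.t. B
M_adj = np.conj(M).T                   # adjoint of phi (standard basis)
N = np.conj(B).T @ M_adj @ B           # describing matrix of the adjoint w.r.t. B

# Entry formula a_{ij} = <phi(u_j), u_i> and the conjugate-transpose relation.
print(np.isclose(A[0, 2], inner(M @ B[:, 2], B[:, 0])))   # True
print(np.allclose(N, np.conj(A).T))                       # True
```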



Let $V$ be a finite-dimensional $\mathbb{K}$-vector space, endowed with an inner product $\left\langle - , - \right\rangle$. Then taking the adjoint endomorphism fulfills the following properties.
  1. $\widehat{ \varphi + \psi } = \hat{\varphi} + \hat{\psi}$.
  2. $\widehat{ z \varphi } = \overline{z} \, \hat{\varphi}$ for $z \in \mathbb{K}$.
  3. $\widehat{ \psi \circ \varphi } = \hat{\varphi} \circ \hat{\psi}$.
  4. $\hat{ \hat{\varphi} } = \varphi$.

(Here, $\varphi$ and $\psi$ denote endomorphisms on $V$.)

Proof
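On the level of describing matrices with respect to an orthonormal basis (Lemma 41.6), these rules can be cross-checked numerically; a numpy sketch with ad hoc names.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 3
adj = lambda X: np.conj(X).T            # matrix of the adjoint w.r.t. an ONB

M = rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n))   # phi
N = rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n))   # psi
z = 2 - 3j

print(np.allclose(adj(M + N), adj(M) + adj(N)))       # (phi + psi)^ = phi^ + psi^
print(np.allclose(adj(z * M), np.conj(z) * adj(M)))   # (z phi)^ = conj(z) phi^
print(np.allclose(adj(N @ M), adj(M) @ adj(N)))       # (psi o phi)^ = phi^ o psi^
print(np.allclose(adj(adj(M)), M))                    # (phi^)^ = phi
```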



Self-adjoint endomorphisms

Let $V$ be a $\mathbb{K}$-vector space, endowed with an inner product, and let

$$\varphi \colon V \longrightarrow V$$

be an endomorphism. Then $\varphi$ is called self-adjoint if

$$\left\langle \varphi(v) , w \right\rangle = \left\langle v , \varphi(w) \right\rangle$$

holds for all $v, w \in V$.

This property simply means that

$$\hat{\varphi} = \varphi.$$

A homothety is self-adjoint if and only if its scaling factor is real.


Let $V$ be a finite-dimensional $\mathbb{K}$-vector space, endowed with an inner product $\left\langle - , - \right\rangle$, and let

$$\varphi \colon V \longrightarrow V$$

denote an endomorphism. Then $\varphi$ is self-adjoint if and only if it is described by a Hermitian matrix with respect to some (every) orthonormal basis $u_1, \ldots, u_n$ of $V$.

If $\varphi$ is self-adjoint, then the statement follows from Lemma 41.6. If $\varphi$ is described by a Hermitian matrix $M$ with respect to an orthonormal basis, then, again by Lemma 41.6, the adjoint endomorphism $\hat{\varphi}$ is described by

$$\overline{M}^{\text{tr}} = M$$

with respect to this basis. Therefore, it coincides with $\varphi$.
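A quick numerical illustration (numpy, ad hoc names): a Hermitian matrix, viewed with respect to the standard orthonormal basis of $\mathbb{C}^3$, satisfies the defining equation of self-adjointness.

```python
import numpy as np

def inner(v, w):                        # linear in v, antilinear in w
    return np.sum(v * np.conj(w))

rng = np.random.default_rng(4)
n = 3
X = rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n))
H = X + np.conj(X).T                    # a Hermitian matrix: H^H = H

v = rng.normal(size=n) + 1j * rng.normal(size=n)
w = rng.normal(size=n) + 1j * rng.normal(size=n)

# <phi(v), w> = <v, phi(w)> for the endomorphism described by H.
print(np.isclose(inner(H @ v, w), inner(v, H @ w)))   # True
```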



Let $V$ be a $\mathbb{K}$-vector space, endowed with an inner product, and let

$$\varphi \colon V \longrightarrow V$$

be a self-adjoint endomorphism. Then the following statements hold.
  1. For a $\varphi$-invariant linear subspace $U \subseteq V$, also the orthogonal complement $U^{\perp}$ is $\varphi$-invariant.
  2. All eigenvalues of $\varphi$ are real.
  3. The eigenspaces for different eigenvalues are orthogonal to each other.
  4. Let $V$ be finite-dimensional. Then the characteristic polynomial of $\varphi$ splits into linear factors.
  1. Let $u \in U$, and $v \in U^{\perp}$. Because of the invariance of $U$, we have $\varphi(u) \in U$. Therefore,

    $$\left\langle u , \varphi(v) \right\rangle = \left\langle \varphi(u) , v \right\rangle = 0.$$

    This means that $\varphi(v)$ is orthogonal to $U$; thus, it belongs to $U^{\perp}$, and this means the invariance.

  2. This is only relevant in case $\mathbb{K} = \mathbb{C}$. Let $\lambda$ be an eigenvalue, and let $v \neq 0$ be an eigenvector, that is,

    $$\varphi(v) = \lambda v.$$

    We may assume that this eigenvector is normed. We have

    $$\lambda = \lambda \left\langle v , v \right\rangle = \left\langle \lambda v , v \right\rangle = \left\langle \varphi(v) , v \right\rangle = \left\langle v , \varphi(v) \right\rangle = \left\langle v , \lambda v \right\rangle = \overline{\lambda} \left\langle v , v \right\rangle = \overline{\lambda};$$

    therefore, $\lambda$ is real.

  3. Let $u$ be an eigenvector for the eigenvalue $\lambda$, and let $v$ be an eigenvector for the eigenvalue $\mu \neq \lambda$. Then

    $$\lambda \left\langle u , v \right\rangle = \left\langle \lambda u , v \right\rangle = \left\langle \varphi(u) , v \right\rangle = \left\langle u , \varphi(v) \right\rangle = \left\langle u , \mu v \right\rangle = \mu \left\langle u , v \right\rangle,$$

    using that $\mu$ is real by (2). This is only possible for $\left\langle u , v \right\rangle = 0$.

  4. We may assume that $V = \mathbb{R}^n$, endowed with the standard inner product. In case $\mathbb{K} = \mathbb{C}$, the statement is known; so suppose that $\mathbb{K} = \mathbb{R}$. We may consider the mapping also as a mapping from $\mathbb{C}^n$ to $\mathbb{C}^n$; its describing matrix is real and symmetric, hence Hermitian. This map is again self-adjoint, and the characteristic polynomial does not change. Therefore, it splits into linear factors over $\mathbb{C}$, and the zeroes are real by (2); hence, it also splits into linear factors over $\mathbb{R}$.
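Parts (2) and (3) can be observed numerically for a random Hermitian matrix; the following numpy sketch deliberately uses the generic eigenvalue solver, so realness and orthogonality are not built in.

```python
import numpy as np

rng = np.random.default_rng(5)
n = 4
X = rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n))
H = X + np.conj(X).T                    # Hermitian, hence self-adjoint

eigvals, eigvecs = np.linalg.eig(H)     # generic solver, no symmetry assumed

# (2) All eigenvalues are real (up to rounding).
print(np.allclose(eigvals.imag, 0))

# (3) Eigenvectors for different eigenvalues are orthogonal.
i, j = np.argmin(eigvals.real), np.argmax(eigvals.real)
print(np.isclose(np.vdot(eigvecs[:, i], eigvecs[:, j]), 0))
```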


The following statement is called the Spectral theorem for self-adjoint endomorphisms.


Let $V$ be a finite-dimensional $\mathbb{K}$-vector space, endowed with an inner product, and let

$$\varphi \colon V \longrightarrow V$$

be a self-adjoint endomorphism. Then there exists an orthonormal basis of $V$ consisting of eigenvectors of $\varphi$.

We do induction over the dimension of $V$. Because of Lemma 41.10 (4), $\varphi$ has an eigenvector $u$; we may assume that this vector is normed. Due to Lemma 41.10 (1), the orthogonal complement

$$(\mathbb{K} u)^{\perp}$$

is also $\varphi$-invariant. Hence, we have a direct sum decomposition

$$V = \mathbb{K} u \oplus (\mathbb{K} u)^{\perp}.$$

The restriction of $\varphi$ to $(\mathbb{K} u)^{\perp}$ is also self-adjoint. Therefore, the induction hypothesis yields the claim.

In particular, a self-adjoint endomorphism is diagonalizable.
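Numerically, such an orthonormal eigenbasis can be obtained with numpy's eigh, which is designed for Hermitian (self-adjoint) matrices; a minimal sketch with ad hoc names.

```python
import numpy as np

rng = np.random.default_rng(6)
n = 4
X = rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n))
H = X + np.conj(X).T                    # self-adjoint w.r.t. the standard ONB

eigvals, U = np.linalg.eigh(H)          # columns of U: orthonormal eigenbasis

print(np.allclose(np.conj(U).T @ U, np.eye(n)))             # orthonormal
print(np.allclose(H @ U, U @ np.diag(eigvals)))             # eigenvectors
print(np.allclose(np.conj(U).T @ H @ U, np.diag(eigvals)))  # diagonal form
```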



Self-adjoint endomorphisms and hermitian forms

Let $V$ be a $\mathbb{K}$-vector space, endowed with an inner product $\left\langle - , - \right\rangle$. An endomorphism

$$\varphi \colon V \longrightarrow V$$

induces, with the help of the inner product, a form defined by

$$f_{\varphi}(v,w) := \left\langle \varphi(v) , w \right\rangle.$$

The following properties hold for this form.


Let $V$ be a $\mathbb{K}$-vector space, endowed with an inner product $\left\langle - , - \right\rangle$. Then the following statements hold.
  1. The assignment

    $$\varphi \longmapsto f_{\varphi}, \text{ where } f_{\varphi}(v,w) = \left\langle \varphi(v) , w \right\rangle,$$

    assigns to an endomorphism a sesquilinear form. Hence, we obtain a mapping from the space of endomorphisms of $V$ to the space of sesquilinear forms on $V$.
  2. This assignment is linear; it is bijective if $V$ has finite dimension.
  3. Let $V$ be finite-dimensional. The endomorphism $\varphi$ is bijective if and only if $f_{\varphi}$ is not degenerate.
  4. Let $V$ be finite-dimensional. The endomorphism $\varphi$ is self-adjoint if and only if $f_{\varphi}$ is Hermitian.
  1. We have

    $$f_{\varphi}( z_1 v_1 + z_2 v_2 , w) = \left\langle \varphi( z_1 v_1 + z_2 v_2 ) , w \right\rangle = z_1 \left\langle \varphi( v_1 ) , w \right\rangle + z_2 \left\langle \varphi( v_2 ) , w \right\rangle = z_1 f_{\varphi}(v_1, w) + z_2 f_{\varphi}(v_2, w)$$

    and

    $$f_{\varphi}( v , z_1 w_1 + z_2 w_2 ) = \left\langle \varphi(v) , z_1 w_1 + z_2 w_2 \right\rangle = \overline{ z_1 } \left\langle \varphi(v) , w_1 \right\rangle + \overline{ z_2 } \left\langle \varphi(v) , w_2 \right\rangle = \overline{ z_1 } f_{\varphi}(v, w_1) + \overline{ z_2 } f_{\varphi}(v, w_2);$$

    that is, the assignment is linear in the first component, and antilinear in the second component. Therefore, $f_{\varphi}$ is a sesquilinear form.

  2. The linearity follows from the linearity of the inner product in the first component. In the finite-dimensional case, we have on the left-hand side and on the right-hand side vector spaces of dimension $n^2$ (where $n = \dim V$); therefore, it is enough to show injectivity. If $f_{\varphi}$ is the zero form, then $\left\langle \varphi(v) , w \right\rangle = 0$ for all $v, w \in V$. In particular, $\left\langle \varphi(v) , \varphi(v) \right\rangle = 0$ for all $v \in V$, which implies $\varphi = 0$.
  3. If $\varphi$ is not bijective, then let $v \neq 0$ with $\varphi(v) = 0$. Then, $f_{\varphi}(v,-) = \left\langle \varphi(v) , - \right\rangle$ is the zero mapping in the second component, and the form is degenerate. To prove the converse, suppose that $f_{\varphi}$ is degenerate. Then there exists a vector $v \neq 0$ such that $f_{\varphi}(v,-) = \left\langle \varphi(v) , - \right\rangle$ is the zero mapping. Since an inner product is nondegenerate, this implies $\varphi(v) = 0$, and $\varphi$ is not bijective.
  4. In the self-adjoint case, we have

    $$f_{\varphi}(w,v) = \left\langle \varphi(w) , v \right\rangle = \left\langle w , \varphi(v) \right\rangle = \overline{ \left\langle \varphi(v) , w \right\rangle } = \overline{ f_{\varphi}(v,w) },$$

    so that $f_{\varphi}$ is Hermitian. The converse follows from reading these equations in the reverse direction: if $f_{\varphi}$ is Hermitian, then

    $$\left\langle \varphi(v) , w \right\rangle = f_{\varphi}(v,w) = \overline{ f_{\varphi}(w,v) } = \overline{ \left\langle \varphi(w) , v \right\rangle } = \left\langle v , \varphi(w) \right\rangle,$$

    so that $\varphi$ is self-adjoint.
