Suppose $T \in \mathcal{L}(V, W)$. The adjoint of $T$ is the unique function $T^* \colon W \to V$ such that
$$\langle Tv, w \rangle = \langle v, T^* w \rangle \quad \text{for every } v \in V \text{ and every } w \in W.$$
Note that $T^* w$ can also be expressed as the unique vector in $V$ such that $\langle v, T^* w \rangle = \langle Tv, w \rangle$ for every $v \in V$ — the vector guaranteed by the Riesz representation theorem applied to the functional $v \mapsto \langle Tv, w \rangle$. It may perhaps be easier to think of the adjoint map this way, but it is often more useful to use the original definition.
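As a numerical sanity check (my own addition, not part of the notes): on $\mathbb{C}^n$ with the standard inner product, the adjoint of the map $v \mapsto Av$ is given by the conjugate transpose $A^*$ (a fact proved later in these notes), so we can verify the defining property directly.

```python
import numpy as np

rng = np.random.default_rng(0)

def ip(u, v):
    """Inner product <u, v>, linear in the first slot (the convention used here)."""
    return np.vdot(v, u)  # np.vdot conjugates its *first* argument

A = rng.normal(size=(3, 4)) + 1j * rng.normal(size=(3, 4))  # T : C^4 -> C^3
A_star = A.conj().T                                         # T^* : C^3 -> C^4

v = rng.normal(size=4) + 1j * rng.normal(size=4)
w = rng.normal(size=3) + 1j * rng.normal(size=3)

lhs = ip(A @ v, w)       # <Tv, w>
rhs = ip(v, A_star @ w)  # <v, T^* w>
assert np.isclose(lhs, rhs)
```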
Suppose $T \in \mathcal{L}(V, W)$. Then $T^* \in \mathcal{L}(W, V)$.
Proof:
An intuitive way to prove this would be to observe that $T^*$ is the composition of three functions that preserve vector addition: the Riesz map $w \mapsto \langle \cdot, w \rangle$ from $W$ to $W'$, the dual map $T' \colon W' \to V'$, and the inverse Riesz map from $V'$ back to $V$. Though the two Riesz maps do not preserve scaling — each conjugates scalars — the two conjugations cancel each other out. On the other hand, we can use the inner product definition to prove this. Let $w_1, w_2 \in W$, $\lambda \in \mathbb{F}$, and $v \in V$. Then
$$\langle v, T^*(w_1 + w_2) \rangle = \langle Tv, w_1 + w_2 \rangle = \langle Tv, w_1 \rangle + \langle Tv, w_2 \rangle = \langle v, T^* w_1 \rangle + \langle v, T^* w_2 \rangle = \langle v, T^* w_1 + T^* w_2 \rangle$$
and
$$\langle v, T^*(\lambda w_1) \rangle = \langle Tv, \lambda w_1 \rangle = \bar{\lambda} \langle Tv, w_1 \rangle = \bar{\lambda} \langle v, T^* w_1 \rangle = \langle v, \lambda T^* w_1 \rangle.$$
Because these equalities hold for every $v \in V$, we get $T^*(w_1 + w_2) = T^* w_1 + T^* w_2$ and $T^*(\lambda w_1) = \lambda T^* w_1$, so $T^*$ is linear.
Suppose $T \in \mathcal{L}(V, W)$. Then
a) $(S + T)^* = S^* + T^*$ for all $S \in \mathcal{L}(V, W)$
b) $(\lambda T)^* = \bar{\lambda} T^*$ for all $\lambda \in \mathbb{F}$
c) $(T^*)^* = T$
d) $(ST)^* = T^* S^*$ for all $S \in \mathcal{L}(W, U)$
e) $I^* = I$, where $I$ is the identity on $V$
f) If $T$ is invertible, then $T^*$ is invertible and $(T^*)^{-1} = (T^{-1})^*$
Proof:
a) For any $v \in V$ and $w \in W$,
$$\langle v, (S + T)^* w \rangle = \langle (S + T)v, w \rangle = \langle Sv, w \rangle + \langle Tv, w \rangle = \langle v, S^* w \rangle + \langle v, T^* w \rangle = \langle v, (S^* + T^*) w \rangle.$$
b) For any $v \in V$ and $w \in W$,
$$\langle v, (\lambda T)^* w \rangle = \langle \lambda T v, w \rangle = \lambda \langle Tv, w \rangle = \lambda \langle v, T^* w \rangle = \langle v, \bar{\lambda} T^* w \rangle.$$
c) Using the definition of the adjoint, $(T^*)^*$ sends each vector $v$ in $V$ to the unique vector $(T^*)^* v \in W$ such that
$$\langle T^* w, v \rangle = \langle w, (T^*)^* v \rangle \quad \text{for every } w \in W.$$
Using this fact, we observe the following:
$$\langle T^* w, v \rangle = \overline{\langle v, T^* w \rangle} = \overline{\langle Tv, w \rangle} = \langle w, Tv \rangle$$
shows that $(T^*)^* v = Tv$ for every $v \in V$, so $(T^*)^* = T$.
d) This follows from the fact that the dual of a composition reverses the order: $(ST)' = T' S'$. Alternatively, using inner products we have:
$$\langle v, (ST)^* w \rangle = \langle STv, w \rangle = \langle Tv, S^* w \rangle = \langle v, T^* S^* w \rangle$$
for every $v \in V$ and $w \in U$, so $(ST)^* = T^* S^*$.
e) This follows from the fact that the dual of the identity is the identity (because precomposing with the identity does not change your functional). Alternatively, we have the following using inner products:
$$\langle v, I^* w \rangle = \langle Iv, w \rangle = \langle v, w \rangle = \langle v, Iw \rangle$$
for every $v, w \in V$, so $I^* = I$.
f) This also follows from the inverse of the dual being the dual of the inverse for invertible linear maps. This can be shown with inner products, but we would only be using the following equalities, which prove this result (and similar equalities prove that the dual of the inverse is the inverse of the dual):
$$T^* (T^{-1})^* = (T^{-1} T)^* = I^* = I$$
and
$$(T^{-1})^* T^* = (T T^{-1})^* = I^* = I,$$
so $T^*$ is invertible with $(T^*)^{-1} = (T^{-1})^*$.
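A quick numerical check of parts d) and f) for matrices (my own addition), using the conjugate transpose as the matrix adjoint:

```python
import numpy as np

rng = np.random.default_rng(1)
adj = lambda M: M.conj().T  # matrix adjoint = conjugate transpose

S = rng.normal(size=(3, 3)) + 1j * rng.normal(size=(3, 3))
T = rng.normal(size=(3, 3)) + 1j * rng.normal(size=(3, 3))

# d) (ST)^* = T^* S^*
assert np.allclose(adj(S @ T), adj(T) @ adj(S))

# f) (T^*)^{-1} = (T^{-1})^*  (a random T is invertible with probability 1)
assert np.allclose(np.linalg.inv(adj(T)), adj(np.linalg.inv(T)))
```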
Suppose $T \in \mathcal{L}(V, W)$. Then
a) $\operatorname{null} T^* = (\operatorname{range} T)^\perp$
b) $\operatorname{range} T^* = (\operatorname{null} T)^\perp$
c) $\operatorname{null} T = (\operatorname{range} T^*)^\perp$
d) $\operatorname{range} T = (\operatorname{null} T^*)^\perp$
Proof:
We begin by proving part c): $\operatorname{null} T = (\operatorname{range} T^*)^\perp$, because for any $v \in V$,
$$v \in \operatorname{null} T \iff Tv = 0 \iff \langle Tv, w \rangle = 0 \text{ for all } w \in W \iff \langle v, T^* w \rangle = 0 \text{ for all } w \in W \iff v \in (\operatorname{range} T^*)^\perp.$$
Now, b) is the orthogonal complement of both sides of c) (using $(U^\perp)^\perp = U$ for a subspace $U$), and the remaining two statements are b) and c) with the switching of $T$ and $T^*$ (using $(T^*)^* = T$).
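As a numerical illustration of part a) (my own addition): for a matrix of known rank, the left singular vectors with zero singular value span $\operatorname{null} T^*$, and they are orthogonal to $\operatorname{range} T$.

```python
import numpy as np

rng = np.random.default_rng(2)

# A rank-2 map T : C^4 -> C^3, built as a product so the rank is known.
B = rng.normal(size=(3, 2)) + 1j * rng.normal(size=(3, 2))
C = rng.normal(size=(2, 4)) + 1j * rng.normal(size=(2, 4))
A = B @ C                    # matrix of T (rank 2)
A_star = A.conj().T          # matrix of T^*

U, s, Vh = np.linalg.svd(A)
null_Tstar = U[:, 2:]        # singular values s[2:] are (numerically) zero
assert np.allclose(s[2:], 0)
assert np.allclose(A_star @ null_Tstar, 0)      # these vectors lie in null T^*
assert np.allclose(null_Tstar.conj().T @ A, 0)  # and are orthogonal to range T
```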
The conjugate transpose of an $m \times n$ matrix $A$ is the $n \times m$ matrix $A^*$ obtained by interchanging rows and columns (the transpose) and conjugating each entry. Stated in matrix notation,
$$(A^*)_{j,k} = \overline{A_{k,j}}.$$
For any linear map $T \in \mathcal{L}(V, W)$ and orthonormal bases for $V$ and $W$, the matrix of $T^*$ with respect to these bases is the conjugate transpose of the matrix of $T$ with respect to these bases. Said with symbols, if $e_1, \dots, e_n$ and $f_1, \dots, f_m$ are orthonormal bases for $V$ and $W$ respectively, then
$$\mathcal{M}\big(T^*, (f_1, \dots, f_m), (e_1, \dots, e_n)\big) = \Big(\mathcal{M}\big(T, (e_1, \dots, e_n), (f_1, \dots, f_m)\big)\Big)^*.$$
Proof:
Let $e_1, \dots, e_n$ be the orthonormal basis of $V$, let $f_1, \dots, f_m$ be the orthonormal basis of $W$, let $A = \mathcal{M}(T)$, and let $B = \mathcal{M}(T^*)$. To find the $k$th column of $A$, we put $e_k$ into $T$ and find the coordinates of $Te_k$ in terms of $f_1, \dots, f_m$. Because we're working with orthonormal bases,
$$Te_k = \langle Te_k, f_1 \rangle f_1 + \dots + \langle Te_k, f_m \rangle f_m.$$
The $j$th coordinate of this column vector can be found by projection: $A_{j,k} = \langle Te_k, f_j \rangle$. Thus, $\langle Te_k, f_j \rangle$ is the $j$th entry of the $k$th column of $A$. Now observe what we get from using conjugate symmetry on this inner product:
$$\overline{\langle Te_k, f_j \rangle} = \langle f_j, Te_k \rangle = \langle T^* f_j, e_k \rangle.$$
By the same reasoning, $\langle T^* f_j, e_k \rangle$ is the $k$th coordinate of the $j$th column of $B$; that is, $B_{k,j} = \overline{A_{j,k}}$. Therefore $A$ and $B$ are conjugate transposes of each other.
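The entrywise formulas above can be checked numerically (my own addition): build $\mathcal{M}(T)$ and $\mathcal{M}(T^*)$ from inner products against random orthonormal bases and compare.

```python
import numpy as np

rng = np.random.default_rng(3)

def ip(u, v):
    return np.vdot(v, u)  # <u, v>, linear in the first slot

# T : C^3 -> C^2, given by a matrix in the standard bases.
T = rng.normal(size=(2, 3)) + 1j * rng.normal(size=(2, 3))

# Random orthonormal bases e_1..e_3 of C^3 and f_1, f_2 of C^2 (via QR).
E, _ = np.linalg.qr(rng.normal(size=(3, 3)) + 1j * rng.normal(size=(3, 3)))
F, _ = np.linalg.qr(rng.normal(size=(2, 2)) + 1j * rng.normal(size=(2, 2)))

# A_{j,k} = <T e_k, f_j>   and   B_{k,j} = <T^* f_j, e_k>
A = np.array([[ip(T @ E[:, k], F[:, j]) for k in range(3)] for j in range(2)])
B = np.array([[ip(T.conj().T @ F[:, j], E[:, k]) for j in range(2)] for k in range(3)])

assert np.allclose(B, A.conj().T)  # M(T^*) is the conjugate transpose of M(T)
```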
An operator $T \in \mathcal{L}(V)$ is self-adjoint if $T = T^*$.
Results for $\mathbb{F} = \mathbb{C}$:
Every eigenvalue of a self-adjoint operator is a real number.
Proof:
To prove this, we will show that conjugating an eigenvalue does not change it. Let $T \in \mathcal{L}(V)$ be self-adjoint and let $v \neq 0$ be an eigenvector with eigenvalue $\lambda$. Then
$$\lambda \|v\|^2 = \langle \lambda v, v \rangle = \langle Tv, v \rangle = \langle v, Tv \rangle = \langle v, \lambda v \rangle = \bar{\lambda} \|v\|^2.$$
Because $\|v\|^2 \neq 0$, $\lambda = \bar{\lambda}$, so $\lambda$ is real.
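A quick check of this with a random Hermitian matrix (my own addition):

```python
import numpy as np

rng = np.random.default_rng(4)

M = rng.normal(size=(4, 4)) + 1j * rng.normal(size=(4, 4))
H = M + M.conj().T  # H equals its conjugate transpose, so H is self-adjoint

eigvals = np.linalg.eigvals(H)
assert np.allclose(eigvals.imag, 0)  # every eigenvalue is real
```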
Suppose $V$ is a complex inner product space and $T \in \mathcal{L}(V)$. Then $\langle Tv, v \rangle = 0$ for every $v \in V$ if and only if $T = 0$.
Proof:
This proof uses a generalization of the polarization identity for inner products, which will first be proved as a claim. Claim: For any $T \in \mathcal{L}(V)$ and any $u, w \in V$,
$$\langle Tu, w \rangle = \frac{\langle T(u+w), u+w \rangle - \langle T(u-w), u-w \rangle + \langle T(u+iw), u+iw \rangle\, i - \langle T(u-iw), u-iw \rangle\, i}{4}.$$
Proof of claim: Take 4 times the right-hand side and expand each of the four inner products using linearity in the first slot and conjugate linearity in the second. You will see that the first terms (those of the form $\langle Tu, u \rangle$) cancel, since their coefficients $1, -1, i, -i$ sum to zero; the second terms (of the form $\langle Tu, w \rangle$) each carry coefficient $1$, because in the last two products the outer factors $i$ and $-i$ cancel against the conjugated scalars $\bar{i} = -i$ and $\overline{-i} = i$, so these terms sum to $4\langle Tu, w \rangle$, which is 4 times the left-hand side; the third terms (of the form $\langle Tw, u \rangle$) cancel, since their coefficients $1, 1, -1, -1$ sum to zero; and the fourth terms (of the form $\langle Tw, w \rangle$) cancel similarly to the first terms. There is a version with every term shown and color-coordinated cancellation in my notes, but the unabridged statement of this claim doesn't even fit in a regular computer browser, let alone the expanded versions. With the claim proved, we proceed: On the one hand, if $T$ is the zero map then $\langle Tv, v \rangle = 0$ for all $v \in V$. On the other hand, if $\langle Tv, v \rangle = 0$ for every $v \in V$, then every inner product on the right-hand side of the claim vanishes, so for any $u, w \in V$ we have $\langle Tu, w \rangle = 0$. Taking $w = Tu$ gives $\|Tu\|^2 = 0$ for every $u$, so $T = 0$.
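The claimed identity is easy to spot-check numerically (my own addition), which is reassuring given how error-prone the expansion is by hand:

```python
import numpy as np

rng = np.random.default_rng(5)

T = rng.normal(size=(3, 3)) + 1j * rng.normal(size=(3, 3))
u = rng.normal(size=3) + 1j * rng.normal(size=3)
w = rng.normal(size=3) + 1j * rng.normal(size=3)

def ip(a, b):
    return np.vdot(b, a)  # <a, b>, linear in the first slot

def q(x):                 # the quadratic form <Tx, x>
    return ip(T @ x, x)

rhs = (q(u + w) - q(u - w) + q(u + 1j * w) * 1j - q(u - 1j * w) * 1j) / 4
assert np.isclose(ip(T @ u, w), rhs)
```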
Let $V$ be a complex inner product space and $T \in \mathcal{L}(V)$. Then $T$ is self-adjoint if and only if $\langle Tv, v \rangle \in \mathbb{R}$ for every $v \in V$.
Proof:
$\Rightarrow$: This direction comes from conjugate symmetry. If $T$ is self-adjoint then for any $v \in V$,
$$\langle Tv, v \rangle = \langle v, T^* v \rangle = \langle v, Tv \rangle = \overline{\langle Tv, v \rangle},$$
so $\langle Tv, v \rangle$ equals its own conjugate and is therefore real.
$\Leftarrow$: Suppose that for every $v \in V$, $\langle Tv, v \rangle \in \mathbb{R}$. Then $\langle T^* v, v \rangle = \overline{\langle v, T^* v \rangle} = \overline{\langle Tv, v \rangle} = \langle Tv, v \rangle$, and hence
$$\langle (T - T^*) v, v \rangle = \langle Tv, v \rangle - \langle T^* v, v \rangle = 0$$
for every $v \in V$. Because we are working over $\mathbb{C}$, the previous result shows $T - T^*$ is the zero operator, and therefore $T = T^*$.
Result for $\mathbb{F} = \mathbb{R}$:
Suppose $T$ is a self-adjoint operator on a real inner product space $V$ with the property that $\langle Tv, v \rangle = 0$ for every $v \in V$. Then $T = 0$.
Proof:
Similarly to the complex case, we first prove a generalization of the real polarization identity. Claim: For every $u, w \in V$ and any self-adjoint $T \in \mathcal{L}(V)$,
$$\langle Tu, w \rangle = \frac{\langle T(u+w), u+w \rangle - \langle T(u-w), u-w \rangle}{4}.$$
Proof of claim: There is room to write out the expansion in the real case:
$$\langle T(u+w), u+w \rangle - \langle T(u-w), u-w \rangle = 2\langle Tu, w \rangle + 2\langle Tw, u \rangle = 4\langle Tu, w \rangle,$$
where the last equality uses $\langle Tw, u \rangle = \langle w, Tu \rangle = \langle Tu, w \rangle$ (self-adjointness and the symmetry of a real inner product).
With the claim proven, we can proceed. Let $u \in V$ and let $T$ be self-adjoint with $\langle Tv, v \rangle = 0$ for every $v \in V$. Then, similarly to the corresponding complex proof, we have
$$\langle Tu, w \rangle = \frac{\langle T(u+w), u+w \rangle - \langle T(u-w), u-w \rangle}{4} = 0 \quad \text{for every } w \in V.$$
Because $Tu$ is orthogonal to every $w \in V$ (in particular to $Tu$ itself), $Tu = 0$. Since $u$ was arbitrary, $T = 0$.
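A spot-check of the real claim with a random symmetric matrix (my own addition):

```python
import numpy as np

rng = np.random.default_rng(6)

M = rng.normal(size=(3, 3))
T = M + M.T              # a real self-adjoint (symmetric) matrix
u = rng.normal(size=3)
w = rng.normal(size=3)

lhs = (T @ u) @ w        # <Tu, w> (real dot product)
rhs = ((T @ (u + w)) @ (u + w) - (T @ (u - w)) @ (u - w)) / 4
assert np.isclose(lhs, rhs)
```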
An operator on an inner product space is normal if it commutes with its adjoint. Using symbols, $T \in \mathcal{L}(V)$ is normal if and only if
$$T T^* = T^* T.$$
Note: This means that every self-adjoint operator is normal.
Let $T \in \mathcal{L}(V)$. $T$ is normal if and only if
$$\|Tv\| = \|T^* v\|$$
for every $v \in V$.
Proof:
Note first that $T$ being normal is equivalent to the difference operator $T^*T - TT^*$ being the zero operator. This difference is called the commutator, and it is often compared to a trivial element to test whether or not two things commute. We also note that $T^*T$ and $TT^*$ are both self-adjoint, so the commutator is also self-adjoint, which we will need for the first implication in the case where $\mathbb{F} = \mathbb{R}$. We are now ready to begin:
$$T \text{ is normal} \iff T^*T - TT^* = 0 \iff \langle (T^*T - TT^*)v, v \rangle = 0 \text{ for every } v \in V \iff \|Tv\|^2 = \|T^*v\|^2 \text{ for every } v \in V,$$
where the second equivalence uses the two previous results (over $\mathbb{C}$ it holds for any operator; over $\mathbb{R}$ it requires the commutator to be self-adjoint), and the last equivalence uses $\langle T^*Tv, v \rangle = \langle Tv, Tv \rangle$ and $\langle TT^*v, v \rangle = \langle T^*v, T^*v \rangle$.
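Numerically (my own addition), a normal but non-self-adjoint operator can be built by unitarily diagonalizing complex eigenvalues, and the norm equality then holds:

```python
import numpy as np

rng = np.random.default_rng(7)

# A normal (but not self-adjoint) matrix: N = Q D Q^* with Q unitary
# and D diagonal with genuinely complex entries.
Q, _ = np.linalg.qr(rng.normal(size=(3, 3)) + 1j * rng.normal(size=(3, 3)))
D = np.diag(rng.normal(size=3) + 1j * rng.normal(size=3))
N = Q @ D @ Q.conj().T
assert np.allclose(N @ N.conj().T, N.conj().T @ N)  # N is normal

v = rng.normal(size=3) + 1j * rng.normal(size=3)
assert np.isclose(np.linalg.norm(N @ v), np.linalg.norm(N.conj().T @ v))
```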
Let $T$ be a normal operator on an inner product space $V$. Then
a) $\operatorname{null} T = \operatorname{null} T^*$
b) $\operatorname{range} T = \operatorname{range} T^*$
c) $V = \operatorname{null} T \oplus \operatorname{range} T$
d) $T - \lambda I$ is normal for every $\lambda \in \mathbb{F}$.
e) If $v \in V$ and $Tv = \lambda v$, then $T^* v = \bar{\lambda} v$.
Proof:
a) $Tv = 0$ if and only if $T^* v = 0$, because $\|Tv\| = \|T^* v\|$.
b) Because their kernels are the same, their ranges must be the same, as range is the orthogonal complement of the kernel of the adjoint: $\operatorname{range} T = (\operatorname{null} T^*)^\perp = (\operatorname{null} T)^\perp = \operatorname{range} T^*$.
c) This follows from the orthogonal complement decomposition into a direct sum: $V = \operatorname{null} T \oplus (\operatorname{null} T)^\perp = \operatorname{null} T \oplus \operatorname{range} T^* = \operatorname{null} T \oplus \operatorname{range} T$.
d) Because $T$ and $T^*$ commute, all of the terms in $(T - \lambda I)(T^* - \bar{\lambda} I)$ and $(T^* - \bar{\lambda} I)(T - \lambda I)$ commute with each other. Therefore $T - \lambda I$ and $(T - \lambda I)^* = T^* - \bar{\lambda} I$ commute.
e) Very similarly to part a), $(T - \lambda I)v = 0$ if and only if $(T^* - \bar{\lambda} I)v = 0$, because $T - \lambda I$ is normal and so $\|(T - \lambda I)v\| = \|(T^* - \bar{\lambda} I)v\|$.
Let $T$ be a normal operator on $V$. Then eigenvectors of $T$ with distinct eigenvalues are orthogonal.
Proof:
Let $u, v$ be eigenvectors of $T$ with distinct eigenvalues $\alpha, \beta$. Then, using part e) of the previous result to get $T^* v = \bar{\beta} v$,
$$(\alpha - \beta)\langle u, v \rangle = \langle \alpha u, v \rangle - \langle u, \bar{\beta} v \rangle = \langle Tu, v \rangle - \langle u, T^* v \rangle = 0.$$
Because $\alpha \neq \beta$, $\langle u, v \rangle = 0$.
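A concrete instance (my own addition), again building a normal matrix from a unitary and distinct eigenvalues:

```python
import numpy as np

rng = np.random.default_rng(8)

# A normal operator with distinct eigenvalues, via a unitary Q.
Q, _ = np.linalg.qr(rng.normal(size=(3, 3)) + 1j * rng.normal(size=(3, 3)))
eigs = np.array([1 + 2j, -0.5j, 3.0])
N = Q @ np.diag(eigs) @ Q.conj().T

u, v = Q[:, 0], Q[:, 1]             # eigenvectors for eigs[0] != eigs[1]
assert np.allclose(N @ u, eigs[0] * u)
assert np.allclose(N @ v, eigs[1] * v)
assert np.isclose(np.vdot(u, v), 0)  # the eigenvectors are orthogonal
```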
Result for :
Suppose $T$ is an operator on a complex inner product space $V$. Then $T$ is normal if and only if there exist commuting self-adjoint operators $A$ and $B$ such that $T = A + iB$.
Proof:
$\Rightarrow$: Suppose $T$ is normal. Set
$$A = \frac{T + T^*}{2} \quad \text{and} \quad B = \frac{T - T^*}{2i}.$$
Then both operators are self-adjoint, $A + iB = T$, and $AB - BA$ simplifies to
$$AB - BA = \frac{(T + T^*)(T - T^*) - (T - T^*)(T + T^*)}{4i} = \frac{2(T^*T - TT^*)}{4i} = \frac{T^*T - TT^*}{2i}.$$
In summary, $T$ and $T^*$ commuting made the difference $T^*T - TT^*$ zero, which in turn made the difference $AB - BA$ zero, so $A$ and $B$ commute.
$\Leftarrow$: Suppose $T = A + iB$ for commuting self-adjoint operators $A$ and $B$. Then $T^* = A^* - iB^* = A - iB$. Using a similar process to the other direction, we have
$$TT^* = (A + iB)(A - iB) = A^2 + B^2 + i(BA - AB)$$
and
$$T^*T = (A - iB)(A + iB) = A^2 + B^2 + i(AB - BA).$$
Subsequently, because $AB = BA$, both products equal $A^2 + B^2$, so $TT^* = T^*T$. Therefore $T$ is normal.
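The forward direction translates directly into a matrix computation (my own addition): the "real and imaginary parts" of a normal matrix are commuting Hermitian matrices.

```python
import numpy as np

rng = np.random.default_rng(9)

# A normal matrix T = Q D Q^* with Q unitary.
Q, _ = np.linalg.qr(rng.normal(size=(3, 3)) + 1j * rng.normal(size=(3, 3)))
T = Q @ np.diag(rng.normal(size=3) + 1j * rng.normal(size=3)) @ Q.conj().T

A = (T + T.conj().T) / 2        # A = (T + T^*)/2
B = (T - T.conj().T) / (2j)     # B = (T - T^*)/(2i)

assert np.allclose(A, A.conj().T)  # A is self-adjoint
assert np.allclose(B, B.conj().T)  # B is self-adjoint
assert np.allclose(A @ B, B @ A)   # A and B commute
assert np.allclose(A + 1j * B, T)  # T = A + iB
```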