An operator $T \in \mathcal{L}(V)$ is positive if $T$ is self-adjoint and $\langle Tv, v \rangle \geq 0$ for every $v \in V$.
An operator $R$ is a square root of an operator $T$ if $R^2 = T$.
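Both definitions can be checked numerically. The following is a minimal NumPy sketch (the helper names `is_positive` and `is_square_root` are illustrative, not from the text); it uses the fact, proven below as a) $\Leftrightarrow$ b), that for a self-adjoint matrix $\langle Tv, v \rangle \geq 0$ for all $v$ is equivalent to all eigenvalues being non-negative.

```python
import numpy as np

def is_positive(T, tol=1e-10):
    # Positive: T is self-adjoint and <Tv, v> >= 0 for every v.
    # For a self-adjoint T, the latter holds iff all eigenvalues are >= 0.
    if not np.allclose(T, T.conj().T):
        return False
    return bool(np.all(np.linalg.eigvalsh(T) >= -tol))

def is_square_root(R, T):
    # R is a square root of T if R^2 = T.
    return np.allclose(R @ R, T)

T = np.array([[2.0, 1.0],
              [1.0, 2.0]])          # self-adjoint, eigenvalues 1 and 3
s = np.sqrt(3.0)
R = np.array([[(s + 1) / 2, (s - 1) / 2],
              [(s - 1) / 2, (s + 1) / 2]])
print(is_positive(T), is_square_root(R, T))  # True True
```

Here $R$ was obtained by hand from the eigendecomposition of $T$; the general construction appears in the proof of c) $\Rightarrow$ d) below.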
Let $T \in \mathcal{L}(V)$. The following are equivalent.

a) $T$ is a positive operator.
b) $T$ is self-adjoint and all eigenvalues of $T$ are non-negative.
c) $T$ has a diagonal matrix consisting of only non-negative diagonal entries with respect to some orthonormal basis of $V$.
d) $T$ has a positive square root.
e) $T$ has a self-adjoint square root.
f) $T = R^* R$ for some $R \in \mathcal{L}(V)$.
Proof:
a) $\Rightarrow$ b): $T$ is self-adjoint by definition. To show that $T$ must have non-negative eigenvalues, we note that because $\langle Tv, v \rangle \geq 0$, for every eigenvector $v$ with $Tv = \lambda v$ this means that $\lambda$ and $\langle v, v \rangle$ must have the same sign, as $\langle Tv, v \rangle = \lambda \langle v, v \rangle = \lambda \|v\|^2$. Since $\|v\|^2 > 0$, it follows that $\lambda \geq 0$.

b) $\Rightarrow$ c): This follows from the spectral theorem, as $T$ being self-adjoint means there is a diagonal matrix for $T$ with respect to some orthonormal basis. The non-negative eigenvalues are, of course, the diagonal entries.

c) $\Rightarrow$ d): Since we have a diagonal matrix for $T$ with non-negative diagonal entries $\lambda_1, \ldots, \lambda_n$ by hypothesis, we can construct a diagonal matrix with the square roots $\sqrt{\lambda_1}, \ldots, \sqrt{\lambda_n}$ on the diagonal. This is a square root for the diagonal matrix, so its associated operator $R$ is a square root of $T$. Furthermore, $R$ is positive: its matrix is its own conjugate transpose, and for $v = a_1 e_1 + \cdots + a_n e_n$ every inner product of the form $\langle Rv, v \rangle = \sqrt{\lambda_1}\,|a_1|^2 + \cdots + \sqrt{\lambda_n}\,|a_n|^2$ is a sum of products of non-negative diagonal entries and squares, which is never negative.

d) $\Rightarrow$ e): This follows because positive operators are already self-adjoint. Note that the converse, on the other hand, is proven in five steps in this proof, via e) $\Rightarrow$ f) $\Rightarrow$ a) $\Rightarrow$ b) $\Rightarrow$ c) $\Rightarrow$ d).

e) $\Rightarrow$ f): Setting $R$ to a self-adjoint square root of $T$ that exists by hypothesis, we have $T = R^2 = R^* R$, since $R^* = R$.

f) $\Rightarrow$ a): Every operator of the form $R^* R$ is self-adjoint, since $(R^* R)^* = R^* (R^*)^* = R^* R$, even if $R$ is just a linear map. So we will check that $R^* R$ is positive. Let $v \in V$. Then $\langle R^* R v, v \rangle = \langle Rv, Rv \rangle = \|Rv\|^2 \geq 0$.
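The chain f) $\Rightarrow$ a) $\Rightarrow \cdots \Rightarrow$ d) can be traced numerically. A sketch, assuming NumPy's `eigh` plays the role of the spectral theorem (the random test matrix is an arbitrary choice for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))
T = A.T @ A                          # f): any T = A*A is positive

# b)/c): eigh diagonalizes the self-adjoint T in an orthonormal
# eigenbasis U, with real, non-negative eigenvalues w.
w, U = np.linalg.eigh(T)
assert np.all(w >= -1e-10)

# c) => d): put square roots of the diagonal entries on the diagonal.
R = U @ np.diag(np.sqrt(np.clip(w, 0.0, None))) @ U.T

assert np.allclose(R @ R, T)                     # R is a square root of T
assert np.allclose(R, R.T)                       # e): R is self-adjoint
assert np.all(np.linalg.eigvalsh(R) >= -1e-10)   # d): R is positive
```

The `clip` guards against tiny negative eigenvalues produced by floating-point roundoff; mathematically all $\lambda_j \geq 0$.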
Every positive operator on $V$ has a unique positive square root.
Proof:
With existence proven in the last result, we will prove uniqueness. The strategy for the proof will be in showing that the behavior of $R$ is uniquely determined on eigenvectors. Because $V$ has an orthonormal basis of eigenvectors of $T$, this will be sufficient. Suppose $R$ and $T$ are both positive operators with $R^2 = T$, and suppose $v$ is an eigenvector of $T$ with eigenvalue $\lambda \geq 0$. Let $e_1, \ldots, e_n$ be an orthonormal basis with respect to which $R$ has a diagonal matrix, and write $v = a_1 e_1 + \cdots + a_n e_n$. So
$$Rv = \lambda_1 a_1 e_1 + \cdots + \lambda_n a_n e_n,$$
where the $\lambda_j$'s are the non-negative diagonal entries of $R$'s matrix, and by extension
$$R^2 v = \lambda_1^2 a_1 e_1 + \cdots + \lambda_n^2 a_n e_n.$$

Because we also have $R^2 v = Tv = \lambda v = \lambda a_1 e_1 + \cdots + \lambda a_n e_n$, this gives us a comparison of coefficients for linearly independent vectors (which must be the same):
$$\lambda_j^2 a_j = \lambda a_j \quad \text{for each } j.$$

For all terms where $\lambda_j^2 \neq \lambda$, this means $a_j = 0$, leaving us with only those basis vectors $e_j$ that share the common eigenvalue $\lambda_j = \sqrt{\lambda}$. Therefore, $R$ must scale $v$ by the scalar $\sqrt{\lambda}$, uniquely determining $R$'s behavior on the eigenvectors of $T$ in $V$.
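Since the coefficient comparison forces $Rv = \sqrt{\lambda}\,v$ on every eigenvector, any two constructions of the positive square root must agree. A small NumPy sketch of this (the repeated eigenvalue $4$ and the rotation angle are arbitrary choices for illustration): even when an eigenspace of $T$ is degenerate, so that its orthonormal eigenbasis is far from unique, the resulting positive square root is the same.

```python
import numpy as np

rng = np.random.default_rng(1)
Q, _ = np.linalg.qr(rng.standard_normal((3, 3)))  # a random orthonormal basis
T = Q @ np.diag([4.0, 4.0, 1.0]) @ Q.T            # eigenvalue 4 is repeated

w, U = np.linalg.eigh(T)          # eigenvalues ascending: 1, 4, 4
R1 = U @ np.diag(np.sqrt(w)) @ U.T

# Rotate within the two-dimensional eigenspace of eigenvalue 4:
# the last two columns of U are replaced by another orthonormal basis of it.
c, s = np.cos(0.7), np.sin(0.7)
G = np.array([[1.0, 0.0, 0.0],
              [0.0,   c,  -s],
              [0.0,   s,   c]])
R2 = (U @ G) @ np.diag(np.sqrt(w)) @ (U @ G).T

assert np.allclose(R1 @ R1, T) and np.allclose(R2 @ R2, T)
assert np.allclose(R1, R2)        # the positive square root is unique
```

Both eigenbases diagonalize $R$, but on each eigenvector of $T$ the square root is forced to act as multiplication by $\sqrt{\lambda}$, so `R1` and `R2` coincide.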