User:IssaRice/Linear algebra/Classification of operators
Latest revision as of 13:45, 28 December 2021
Let <math>V</math> be a finite-dimensional inner product space, and let <math>T \colon V \to V</math> be a linear transformation. Then in the table below, the statements within the same row are equivalent. Below, we consider only complex operators, or the complexification of a real operator.
| Operator kind | Description in terms of eigenvectors | Description in terms of diagonalizability | Geometric interpretation | Algebraic property | Notes | Examples |
|---|---|---|---|---|---|---|
| <math>T</math> is diagonalizable | There exists a basis of <math>V</math> consisting of eigenvectors of <math>T</math> | <math>T</math> is diagonalizable (there exists a basis <math>\beta</math> of <math>V</math> with respect to which <math>[T]_\beta^\beta</math> is a diagonal matrix) | | | This basis is not unique: we can reorder the vectors, and we can scale each eigenvector by a non-zero scalar and still have an eigenvector. But there are at most <math>\dim V</math> distinct eigenvalues, so the diagonal matrix should be unique up to the order of its entries? This row holds even if <math>V</math> is merely a vector space over an arbitrary field. | If <math>T</math> is the identity map, then every non-zero vector <math>v \in V</math> is an eigenvector of <math>T</math> with eigenvalue <math>1</math>, because <math>Tv = 1v</math>. Thus every basis <math>\beta = (v_1,\ldots,v_n)</math> diagonalizes <math>T</math>, and the matrix of <math>T</math> with respect to <math>\beta</math> is the identity matrix. |
| <math>T</math> is normal | There exists an orthonormal basis of <math>V</math> consisting of eigenvectors of <math>T</math> | <math>T</math> is diagonalizable using an orthonormal basis | | <math>TT^* = T^*T</math> | A normal operator can additionally be written as <math>T = S + A</math>, where <math>S = (T + T^*)/2</math> is self-adjoint, <math>A = (T - T^*)/2</math> is anti-self-adjoint, and <math>S</math> and <math>A</math> are simultaneously diagonalizable using a single orthonormal basis | |
| <math>T</math> is self-adjoint (aka Hermitian) | There exists an orthonormal basis of <math>V</math> consisting of eigenvectors of <math>T</math> with real eigenvalues | <math>T</math> is diagonalizable using an orthonormal basis and the diagonal entries are all real | | <math>T = T^*</math> | | |
| <math>T</math> is anti-self-adjoint (aka skew-Hermitian or anti-Hermitian) | There exists an orthonormal basis of <math>V</math> consisting of eigenvectors of <math>T</math> with purely imaginary eigenvalues | <math>T</math> is diagonalizable using an orthonormal basis and the diagonal entries are all purely imaginary | | <math>T^* = -T</math> | | A 90-degree rotation of the plane? (Its complexification has eigenvalues <math>\pm i</math>.) |
| <math>T</math> is an isometry (aka unitary in a complex vector space, or orthogonal in a real vector space) | There exists an orthonormal basis of <math>V</math> consisting of eigenvectors of <math>T</math> whose eigenvalues all have absolute value 1 | <math>T</math> is diagonalizable using an orthonormal basis and the diagonal entries all have absolute value 1 | | <math>T^*T = TT^* = I</math>, i.e. <math>T^* = T^{-1}</math> | The orthonormal-eigenbasis characterization only works when the field of scalars is the complex numbers | |
| <math>T</math> is positive (positive semidefinite) | There exists an orthonormal basis of <math>V</math> consisting of eigenvectors of <math>T</math> with nonnegative real eigenvalues | <math>T</math> is diagonalizable using an orthonormal basis and the diagonal entries are all nonnegative real numbers | Polar decomposition says an arbitrary linear operator can be written as a positive operator followed by a rotation (isometry): the positive step stretches or shrinks along orthogonal directions, producing a tilted ellipse, and the isometry then rotates that ellipse. So a positive operator is simply one that does not require the second step: you can find orthogonal "coordinate axes" along which it scales. | <math>T = T^*</math> and <math>\langle Tv, v\rangle \ge 0</math> for all <math>v \in V</math> | | |
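The row-by-row characterizations above can be checked numerically. The following is a small sketch (not part of the original page, and using numpy matrices as stand-ins for operators): it verifies that self-adjoint matrices have real eigenvalues, anti-self-adjoint matrices have purely imaginary eigenvalues, unitary matrices have eigenvalues of absolute value 1, matrices of the form <math>B^*B</math> are positive, and a normal matrix's self-adjoint and anti-self-adjoint parts commute.

```python
import numpy as np

rng = np.random.default_rng(0)

def adj(M):
    """Adjoint (conjugate transpose) of a matrix."""
    return M.conj().T

n = 4
B = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))

# Split B into its self-adjoint part S and anti-self-adjoint part A (B = S + A).
S = (B + adj(B)) / 2
A = (B - adj(B)) / 2
assert np.allclose(adj(S), S) and np.allclose(adj(A), -A)
# Self-adjoint => real eigenvalues; anti-self-adjoint => purely imaginary.
assert np.allclose(np.linalg.eigvals(S).imag, 0.0)
assert np.allclose(np.linalg.eigvals(A).real, 0.0)

# A unitary matrix (QR factorization yields one) has eigenvalues of absolute value 1.
Q, _ = np.linalg.qr(B)
assert np.allclose(adj(Q) @ Q, np.eye(n))
assert np.allclose(np.abs(np.linalg.eigvals(Q)), 1.0)

# B*B is positive semidefinite: nonnegative real eigenvalues.
P = adj(B) @ B
evals = np.linalg.eigvals(P)
assert np.allclose(evals.imag, 0.0) and np.all(evals.real > -1e-10)

# A normal matrix N = Q D Q* (unitary Q, complex diagonal D) satisfies N N* = N* N,
# and its self-adjoint and anti-self-adjoint parts commute, so both are
# diagonalized by the same orthonormal basis (the columns of Q).
D = np.diag(rng.standard_normal(n) + 1j * rng.standard_normal(n))
N = Q @ D @ adj(Q)
assert np.allclose(N @ adj(N), adj(N) @ N)
Sn, An = (N + adj(N)) / 2, (N - adj(N)) / 2
assert np.allclose(Sn @ An, An @ Sn)
```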
Acknowledgments: Thanks to Philip B. for feedback on this page.
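As a worked complement to the polar-decomposition remark in the table above, the sketch below (an illustration added here, not from the original page) computes <math>T = UP</math> from the SVD <math>T = W \Sigma V^*</math>: taking <math>P = V \Sigma V^*</math> (positive) and <math>U = W V^*</math> (unitary), the operator first scales along orthogonal axes and is then rotated.

```python
import numpy as np

rng = np.random.default_rng(1)

def adj(M):
    """Adjoint (conjugate transpose) of a matrix."""
    return M.conj().T

n = 3
T = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))

# SVD: T = W @ diag(sigma) @ Vh, with W and Vh unitary, sigma >= 0.
W, sigma, Vh = np.linalg.svd(T)
P = adj(Vh) @ np.diag(sigma) @ Vh   # positive part: "stretch along orthogonal axes"
U = W @ Vh                          # isometry part: "rotate the resulting ellipse"

assert np.allclose(U @ P, T)                    # T = U P
assert np.allclose(adj(U) @ U, np.eye(n))       # U is unitary
assert np.allclose(adj(P), P)                   # P is self-adjoint...
assert np.all(np.linalg.eigvalsh(P) > -1e-10)   # ...with nonnegative eigenvalues
```

A positive operator is then exactly one whose polar decomposition needs no rotation, i.e. <math>U</math> can be taken to be the identity.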