User:IssaRice/Linear algebra/Trace of matrix equals sum of eigenvalues


This is exercise 1.11 in chapter 4 of Linear Algebra Done Wrong (p. 105).

Theorem. Let $A$ be an $n \times n$ matrix. Then $\operatorname{trace} A = \lambda_1 + \cdots + \lambda_n$, where $\lambda_1, \ldots, \lambda_n$ are the eigenvalues of $A$ (counting multiplicities).
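As a quick sanity check before the proof (not from the book; a minimal numerical sketch using numpy), we can compare the trace of a random matrix against the sum of its eigenvalues:

```python
import numpy as np

# Sanity check: trace equals the sum of the eigenvalues.
rng = np.random.default_rng(0)
A = rng.standard_normal((5, 5))
eigenvalues = np.linalg.eigvals(A)

# A real matrix can have complex eigenvalues, but they come in conjugate
# pairs, so their sum is real up to floating-point error.
assert np.isclose(eigenvalues.sum().imag, 0.0, atol=1e-10)
assert np.isclose(np.trace(A), eigenvalues.sum().real)
```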

Proof. We already know that $\det(A - \lambda I) = (\lambda_1 - \lambda) \cdots (\lambda_n - \lambda)$. Multiplying out the right hand side, the $\lambda^{n-1}$ term is $\lambda_1 (-\lambda)^{n-1} + \lambda_2 (-\lambda)^{n-1} + \cdots + \lambda_n (-\lambda)^{n-1} = (\lambda_1 + \cdots + \lambda_n)(-1)^{n-1} \lambda^{n-1}$.
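This coefficient extraction can be checked symbolically (a small sketch with sympy for $n = 3$, not part of the original page):

```python
import sympy as sp

# Symbolic sketch for n = 3: the lam^2 (i.e. lam^(n-1)) coefficient of
# (l1 - lam)(l2 - lam)(l3 - lam) equals (-1)^(n-1) * (l1 + l2 + l3).
lam, l1, l2, l3 = sp.symbols('lam l1 l2 l3')
product = sp.expand((l1 - lam) * (l2 - lam) * (l3 - lam))
coeff = product.coeff(lam, 2)
assert sp.simplify(coeff - (-1) ** 2 * (l1 + l2 + l3)) == 0
```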

Now consider equation 4.2 from the book (p. 89):

$$\det A = \sum_{\sigma \in \operatorname{Perm}(n)} a_{\sigma(1),1} a_{\sigma(2),2} \cdots a_{\sigma(n),n} \operatorname{sign}(\sigma)$$
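Equation 4.2 can be implemented directly. The brute-force helper below (a hypothetical illustration, exponential in $n$) is checked against numpy's determinant:

```python
import itertools
import numpy as np

def det_by_permutations(A):
    # Brute-force determinant: sum over all permutations sigma of
    # sign(sigma) * a[sigma(1),1] * ... * a[sigma(n),n].
    n = A.shape[0]
    total = 0.0
    for sigma in itertools.permutations(range(n)):
        term = np.prod([A[sigma[j], j] for j in range(n)])
        # sign(sigma) = (-1)^(number of inversions)
        inversions = sum(1 for i in range(n) for j in range(i + 1, n)
                         if sigma[i] > sigma[j])
        total += (-1) ** inversions * term
    return total

A = np.array([[2.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])
assert np.isclose(det_by_permutations(A), np.linalg.det(A))
```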

The matrix $A - \lambda I$ has the form:

$$\begin{pmatrix} a_{1,1} - \lambda & & * \\ & \ddots & \\ * & & a_{n,n} - \lambda \end{pmatrix}$$

Taking the identity permutation, we get the term $(a_{1,1} - \lambda) \cdots (a_{n,n} - \lambda)$. All the remaining permutations correspond to terms that take at most $n-2$ of the entries along the diagonal, so each such term (when multiplied out) has degree at most $n-2$ in $\lambda$. (Why $n-2$ rather than $n-1$? Because if in column $j$ we take the entry from row $k \ne j$, then when we get to column $k$ we cannot take its diagonal entry, which sits in row $k$, since a permutation takes at most one entry from each row.) A quick check of this counting argument is sketched below.
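The counting argument amounts to the fact that a non-identity permutation of $\{0, \ldots, n-1\}$ has at most $n-2$ fixed points; a brute-force sketch (illustrative, small $n$ only):

```python
import itertools

# Every non-identity permutation of {0, ..., n-1} fixes at most n-2
# points, so its term in equation 4.2 has at most n-2 diagonal factors,
# hence degree at most n-2 in lambda.
n = 6
identity = tuple(range(n))
for sigma in itertools.permutations(range(n)):
    fixed_points = sum(1 for j in range(n) if sigma[j] == j)
    assert sigma == identity or fixed_points <= n - 2
```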

So we can write $\det(A - \lambda I) = (a_{1,1} - \lambda) \cdots (a_{n,n} - \lambda) + q(\lambda)$, where $q(\lambda)$ has degree at most $n-2$.
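This decomposition can be verified symbolically for small $n$; the sketch below (sympy, $n = 3$, with illustrative symbol names) checks that the remainder $q$ indeed has degree at most $n-2$:

```python
import sympy as sp

# Verify det(A - lam*I) = (a00 - lam)...(a22 - lam) + q(lam)
# with deg q <= n - 2, for a fully symbolic 3x3 matrix.
lam = sp.symbols('lam')
n = 3
A = sp.Matrix(n, n, lambda i, j: sp.Symbol(f'a{i}{j}'))
char_poly = (A - lam * sp.eye(n)).det()
diag_product = sp.prod(A[i, i] - lam for i in range(n))
q = sp.expand(char_poly - diag_product)
assert sp.degree(q, lam) <= n - 2
```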

Now if we look at the $\lambda^{n-1}$ term in $(a_{1,1} - \lambda) \cdots (a_{n,n} - \lambda) + q(\lambda)$, we get $(a_{1,1} + \cdots + a_{n,n})(-1)^{n-1} \lambda^{n-1}$, since $q(\lambda)$ has degree at most $n-2$ and so contributes nothing to this coefficient. Comparing with the coefficient computed at the start of the proof, $\lambda_1 + \cdots + \lambda_n = a_{1,1} + \cdots + a_{n,n} = \operatorname{trace} A$ as required.
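The final coefficient comparison can also be checked numerically (a sketch using numpy's np.poly, which uses the opposite sign convention $\det(\lambda I - A)$ rather than $\det(A - \lambda I)$):

```python
import numpy as np

# np.poly(A) returns the coefficients of the monic characteristic
# polynomial det(lam*I - A); its lam^(n-1) coefficient is
# -(lambda_1 + ... + lambda_n) = -trace(A).
rng = np.random.default_rng(1)
A = rng.standard_normal((4, 4))
coeffs = np.poly(A)
assert np.isclose(coeffs[1], -np.trace(A))
assert np.isclose(coeffs[1], -np.linalg.eigvals(A).sum().real)
```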

See also