User:IssaRice/Linear algebra/Riesz representation theorem


Let's take the case where $V = \mathbb{R}^n$ and the inner product is the usual dot product. What does the Riesz representation theorem say in this case? It says that if we have a linear functional $T : \mathbb{R}^n \to \mathbb{R}$, then we can write $T$ as $Tv = v \cdot u$ for some vector $u \in \mathbb{R}^n$. But we already know (from the correspondence between matrices and linear transformations) that we can represent $T$ as a 1-by-$n$ matrix. And $v$ can be thought of as an $n$-by-1 matrix. And now the dot product is the same thing as matrix multiplication!
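To make this concrete, here is a small worked example; the particular functional is just an illustration, not anything from the theorem itself. Take $n = 3$ and
$$T(x_1, x_2, x_3) = 3x_1 - x_2 + 2x_3 = (x_1, x_2, x_3) \cdot (3, -1, 2),$$
so the representing vector is $u = (3, -1, 2)$. Note that its entries are just $u_i = T(e_i)$, the values of $T$ on the standard basis vectors.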

If $\sigma = (e_1, \ldots, e_n)$ is the standard basis, then $Tv = [Tv]_\sigma = [T]_\sigma^\sigma [v]_\sigma = [v]_\sigma \cdot [T]_\sigma^\sigma$, where in the last step the 1-by-$n$ matrix $[T]_\sigma^\sigma$ is regarded as a vector in $\mathbb{R}^n$ and the matrix product becomes a dot product.
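Continuing the illustrative functional from above (again, the specific numbers are just an example), the matrix of $T$ with respect to the standard basis is
$$[T]_\sigma^\sigma = \begin{bmatrix} 3 & -1 & 2 \end{bmatrix},$$
and for $v = (v_1, v_2, v_3)$ we get
$$[T]_\sigma^\sigma [v]_\sigma = \begin{bmatrix} 3 & -1 & 2 \end{bmatrix} \begin{bmatrix} v_1 \\ v_2 \\ v_3 \end{bmatrix} = 3v_1 - v_2 + 2v_3 = v \cdot u,$$
which is exactly the dot product with the Riesz vector $u = (3, -1, 2)$.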