User:IssaRice/Linear algebra/Riesz representation theorem

From Machinelearning

Let's take the case where $V = \mathbb{R}^n$ and the inner product is the usual dot product. What does the Riesz representation theorem say in this case? It says that if we have a linear functional $T : \mathbb{R}^n \to \mathbb{R}$, then we can write $T$ as $Tv = v \cdot u$ for some vector $u \in \mathbb{R}^n$. But we already know (from the correspondence between matrices and linear transformations) that we can represent $T$ as a 1-by-$n$ matrix. And $v$ can be thought of as an $n$-by-1 matrix. And now the dot product is the same thing as matrix multiplication!
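A minimal numerical sketch of this identification, using a made-up example functional on $\mathbb{R}^3$ (the specific functional $T(v) = 2v_1 - v_2 + 3v_3$ and the names below are illustrative assumptions, not from the text): applying $T$, multiplying by the 1-by-$n$ matrix, and taking the dot product with $u$ all give the same number.

```python
import numpy as np

# Hypothetical linear functional T : R^3 -> R, here T(v) = 2*v1 - v2 + 3*v3.
def T(v):
    return 2 * v[0] - 1 * v[1] + 3 * v[2]

# The Riesz vector u representing T, read off from the coefficients.
u = np.array([2.0, -1.0, 3.0])

v = np.array([1.0, 4.0, -2.0])

# T as a 1-by-n matrix acting on v as an n-by-1 matrix:
M = u.reshape(1, 3)                       # [T] as a 1-by-3 matrix
matrix_form = (M @ v.reshape(3, 1))[0, 0]  # 1-by-1 matrix, i.e. a scalar

# The dot product v . u gives the same number:
dot_form = np.dot(v, u)

assert np.isclose(T(v), matrix_form)
assert np.isclose(T(v), dot_form)
```

The 1-by-$n$ matrix and the vector $u$ have the same entries; the only difference is whether we multiply as matrices or dot as vectors.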

If $\sigma = (e_1, \ldots, e_n)$ is the standard basis of $\mathbb{R}^n$ and $(1)$ is the standard basis of $\mathbb{R}$, then $Tv = [Tv]_{(1)} = [T]_\sigma^{(1)} [v]_\sigma = [v]_\sigma \cdot [T]_\sigma^{(1)} = v \cdot [T]_\sigma^{(1)}$, where in the dot products the 1-by-$n$ matrix $[T]_\sigma^{(1)}$ is regarded as a vector in $\mathbb{R}^n$, and $[v]_\sigma = v$ since $\sigma$ is the standard basis.

So in the case $V = \mathbb{R}^n$ we can understand the Riesz representation theorem as saying something we already knew. What the Riesz representation theorem does is extend this same sort of "representability" to all finite-dimensional inner product spaces $V$ and all linear functionals $T : V \to \mathbf{F}$.
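The proof idea in the $\mathbb{R}^n$ case can also be run backwards: given only black-box access to a linear functional, the representing vector is recovered coordinate by coordinate as $u_i = T(e_i)$. A small sketch of this, again with an assumed example functional (the coefficients are illustrative):

```python
import numpy as np

def T(v):
    # Hypothetical linear functional on R^3.
    return 5 * v[0] + 0.5 * v[1] - 2 * v[2]

n = 3
# Recover the representing vector: its i-th coordinate is T applied
# to the i-th standard basis vector e_i (the rows of the identity matrix).
u = np.array([T(e) for e in np.eye(n)])

# Now T(v) = v . u for every v, by linearity.
rng = np.random.default_rng(0)
v = rng.random(n)
assert np.isclose(T(v), np.dot(v, u))
```

This is exactly the uniqueness half of the theorem in coordinates: the vector $u$ is pinned down by the values of $T$ on a basis.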

This video talks about this for $\mathbb{R}^2$.