User:IssaRice/Linear algebra/Riesz representation theorem

Let's take the case where <math>V = \mathbf R^n</math> and the inner product is the usual dot product. What does the Riesz representation theorem say in this case? It says that if we have a linear functional <math>T : \mathbf R^n \to \mathbf R</math>, then we can write <math>T</math> as <math>Tv = v\cdot u</math> for some vector <math>u \in \mathbf R^n</math>. But we already know (from the correspondence between matrices and linear transformations) that we can represent <math>T</math> as a 1-by-n matrix, and <math>v</math> can be thought of as an n-by-1 matrix. And now the dot product is the same thing as matrix multiplication: the row of <math>T</math>, read as a vector in <math>\mathbf R^n</math>, is exactly the <math>u</math> promised by the theorem!
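As a quick numerical sanity check (a minimal sketch, not part of the theorem; the particular entries and the use of numpy are arbitrary choices), we can pick a functional on <math>\mathbf R^3</math> as a 1-by-3 matrix and verify that the matrix product agrees with the dot product against the row read as a vector:

<syntaxhighlight lang="python">
import numpy as np

# A linear functional T : R^3 -> R, stored as a 1-by-3 matrix.
# The entries are arbitrary; any choice of row works.
T = np.array([[2.0, -1.0, 3.0]])

# The vector u promised by the Riesz representation theorem is just
# the row of T read as a vector in R^3.
u = T.reshape(-1)

v = np.array([5.0, 4.0, -2.0])

# Tv computed as a matrix product: (1-by-3) times (3-by-1) gives a 1-by-1 matrix.
Tv_as_matrix_product = (T @ v.reshape(-1, 1)).item()

# Tv computed as the dot product v . u.
Tv_as_dot_product = np.dot(v, u)

assert np.isclose(Tv_as_matrix_product, Tv_as_dot_product)
</syntaxhighlight>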


If <math>\sigma = (e_1, \ldots, e_n)</math> is the standard basis of <math>\mathbf R^n</math> and <math>(1)</math> is the standard basis of <math>\mathbf R</math>, then <math>Tv = [Tv]^{(1)} = [T]_{(1)}^\sigma [v]^\sigma = [T]_{(1)}^\sigma \cdot [v]^\sigma = [v]^\sigma \cdot [T]_\sigma^{(1)} = v \cdot [T]_\sigma^{(1)}</math>.
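To see where the entries of that matrix come from, here is another small sketch (again hypothetical, with <math>T</math> written as an ordinary Python function rather than given by the document): the matrix of <math>T</math> with respect to the standard bases has <math>j</math>-th entry <math>T e_j</math>, and that row, read as a vector <math>u</math>, satisfies <math>Tv = v \cdot u</math>:

<syntaxhighlight lang="python">
import numpy as np

# An arbitrary linear functional on R^3, written as a plain function.
def T(v):
    return 2.0 * v[0] - 1.0 * v[1] + 3.0 * v[2]

n = 3
standard_basis = np.eye(n)  # rows are e_1, ..., e_n

# The matrix of T with respect to the standard bases is the 1-by-n row
# whose j-th entry is T(e_j); read as a vector in R^n, this is u.
u = np.array([T(e) for e in standard_basis])

# For any v, computing Tv directly agrees with the dot product v . u.
v = np.array([5.0, 4.0, -2.0])
assert np.isclose(T(v), np.dot(v, u))
</syntaxhighlight>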
