User:IssaRice/Linear algebra/Riesz representation theorem

From Machinelearning
Let's take the case where <math>V = \mathbf R^n</math> and the inner product is the usual dot product. What does the Riesz representation theorem say in this case? It says that if we have a linear functional <math>T : \mathbf R^n \to \mathbf R</math>, then we can write <math>T</math> as <math>Tv = v\cdot u</math> for some vector <math>u \in \mathbf R^n</math>. But we already know (from the correspondence between matrices and linear transformations) that we can represent <math>T</math> as a 1-by-n matrix, and <math>v</math> can be thought of as an n-by-1 matrix. And now the dot product is the same thing as matrix multiplication!
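The identity above (dot product = 1-by-n matrix times n-by-1 matrix) can be checked numerically. A minimal sketch using NumPy, with arbitrarily chosen vectors:

```python
import numpy as np

# Two arbitrary vectors in R^4, chosen only for illustration.
u = np.array([1.0, 0.5, -2.0, 3.0])
v = np.array([2.0, -1.0, 0.0, 1.0])

# The dot product of v and u.
dot = np.dot(v, u)

# The same number as a matrix product: u viewed as a 1-by-4 matrix
# (a linear functional), v viewed as a 4-by-1 matrix (a column vector).
row = u.reshape(1, 4)
col = v.reshape(4, 1)
mat = (row @ col).item()

print(dot, mat)  # 2 - 0.5 + 0 + 3 = 4.5, both times
```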
If <math>\sigma = (e_1, \ldots, e_n)</math> is the standard basis, then <math>Tv = [Tv]^\sigma = [T]_\sigma^\sigma [v]^\sigma = [T]_\sigma^\sigma \cdot [v]^\sigma = [v]^\sigma \cdot [T]_\sigma^\sigma</math>, where in the last two steps we treat the 1-by-n matrix <math>[T]_\sigma^\sigma</math> as a vector in <math>\mathbf R^n</math>. In other words, the Riesz vector is <math>u = (Te_1, \ldots, Te_n)</math>.
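To make the formula <math>u = (Te_1, \ldots, Te_n)</math> concrete, here is a small NumPy sketch; the functional <math>T</math> (given by the row matrix below) is an arbitrary choice for illustration:

```python
import numpy as np

# An arbitrary linear functional T : R^3 -> R, represented by its
# 1-by-3 matrix in the standard basis: T(x, y, z) = 2x - y + 3z.
A = np.array([[2.0, -1.0, 3.0]])

def T(v):
    # Matrix multiplication: (1-by-3) times (3-by-1) gives a scalar.
    return (A @ v.reshape(3, 1)).item()

# The Riesz vector u is the matrix [T] read as a vector in R^3;
# equivalently, its j-th entry is T applied to the j-th basis vector.
u = A.flatten()
e = np.eye(3)
assert np.allclose(u, [T(e[j]) for j in range(3)])

# For any v, Tv equals the dot product v . u.
v = np.array([1.0, 4.0, -2.0])
print(T(v), np.dot(v, u))  # both equal 2*1 - 4 + 3*(-2) = -8.0
```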

Revision as of 17:30, 6 January 2019
