User:IssaRice/Chain rule proofs: Difference between revisions

From Machinelearning
Revision as of 03:41, 28 November 2018

Using Newton's approximation

Main idea

The main idea of using Newton's approximation to prove the chain rule is that since <math>f</math> is differentiable at <math>x_0</math> we have the approximation <math>f(x) \approx f(x_0) + f'(x_0)(x - x_0)</math> when <math>x</math> is near <math>x_0</math>. Similarly, since <math>g</math> is differentiable at <math>f(x_0)</math> we have the approximation <math>g(y) \approx g(f(x_0)) + g'(f(x_0))(y - f(x_0))</math> when <math>y</math> is near <math>f(x_0)</math>. Since <math>f</math> is differentiable at <math>x_0</math>, it is also continuous there, so we know that <math>f(x)</math> is near <math>f(x_0)</math> whenever <math>x</math> is near <math>x_0</math>. This allows us to substitute <math>y = f(x)</math> whenever <math>x</math> is near <math>x_0</math>. So we get

<math display="block">g(f(x)) \approx g(f(x_0)) + g'(f(x_0))(f(x) - f(x_0)) \approx g(f(x_0)) + g'(f(x_0))(f'(x_0)(x - x_0))</math>

Thus we get <math>g \circ f(x) \approx g \circ f(x_0) + g'(f(x_0))f'(x_0)(x - x_0)</math>, which is what the chain rule says.
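As a numerical sanity check of this main idea (not part of the original argument), one can watch the error of the composed linear approximation shrink faster than <math>|x - x_0|</math>. The functions <math>f(x) = x^2</math> and <math>g(y) = \sin y</math> below are illustrative assumptions, not from the article:

```python
import math

# Illustrative choices (assumptions, not from the article):
# f(x) = x^2 with f'(x) = 2x, and g(y) = sin(y) with g'(y) = cos(y)
f, fp = (lambda x: x * x), (lambda x: 2 * x)
g, gp = math.sin, math.cos

x0 = 1.0
L = gp(f(x0)) * fp(x0)  # the chain-rule slope g'(f(x0)) f'(x0)

for h in [1e-1, 1e-2, 1e-3]:
    x = x0 + h
    # error of the approximation g(f(x)) ~ g(f(x0)) + L (x - x0)
    err = abs(g(f(x)) - (g(f(x0)) + L * (x - x0)))
    # the ratio err / |x - x0| shrinks as x approaches x0
    print(h, err / abs(x - x0))
```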

Proof

We want to show <math>g \circ f</math> is differentiable at <math>x_0</math> with derivative <math>L := g'(f(x_0))f'(x_0)</math>. By Newton's approximation, this is equivalent to showing that for every <math>\epsilon > 0</math> there exists <math>\delta > 0</math> such that

<math display="block">|g \circ f(x) - (g \circ f(x_0) + L(x - x_0))| \leq \epsilon|x - x_0|</math>

whenever <math>|x - x_0| \leq \delta</math>. So let <math>\epsilon > 0</math>.
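Newton's approximation itself can be illustrated numerically: the linearization error shrinks faster than <math>|x - x_0|</math>, so any ratio bound <math>\epsilon</math> is met once <math>\delta</math> is small enough. The function <math>f(x) = e^x</math> below is an illustrative assumption, not from the article:

```python
import math

# Illustrative choice (assumption): f(x) = e^x, so f'(x0) = e^{x0}
x0 = 0.0
for h in [1e-1, 1e-2, 1e-3]:
    x = x0 + h
    # Newton error E(x, x0) = f(x) - (f(x0) + f'(x0)(x - x0))
    E = math.exp(x) - (math.exp(x0) + math.exp(x0) * (x - x0))
    # the ratio |E| / |x - x0| can be pushed below any eps by shrinking delta
    print(h, abs(E) / abs(x - x0))
```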

Now we do some algebraic manipulation. Write

<math display="block">g(y) = g(y_0) + g'(y_0)(y - y_0) + E_g(y,y_0)</math>

where <math>E_g(y,y_0) := g(y) - (g(y_0) + g'(y_0)(y - y_0))</math>. This holds for every <math>y \in Y</math>. Since <math>f(x) \in Y</math> we thus have

<math display="block">g(f(x)) = g(f(x_0)) + g'(f(x_0))(f(x) - f(x_0)) + E_g(f(x),f(x_0))</math>

Similarly write

<math display="block">f(x) = f(x_0) + f'(x_0)(x - x_0) + E_f(x,x_0)</math>

where <math>E_f(x,x_0) := f(x) - (f(x_0) + f'(x_0)(x - x_0))</math>.

Substituting this into the expression for <math>g(f(x))</math>, we get

<math display="block">\begin{align}g(f(x)) &= g(f(x_0)) + g'(f(x_0))(f'(x_0)(x - x_0) + E_f(x,x_0)) + E_g(f(x),f(x_0)) \\ &= g(f(x_0)) + g'(f(x_0))f'(x_0)(x - x_0) + g'(f(x_0))E_f(x,x_0) + E_g(f(x),f(x_0))\end{align}</math>

We can rewrite this as

<math display="block">g \circ f(x) - (g \circ f(x_0) + L(x - x_0)) = g'(f(x_0))E_f(x,x_0) + E_g(f(x),f(x_0))</math>
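Because this rewriting is an exact algebraic identity (the error terms are defined so that everything cancels), it can be checked numerically for any concrete differentiable functions. The choices <math>f(x) = x^2</math> and <math>g(y) = \sin y</math> below are illustrative assumptions, not from the article:

```python
import math

# Illustrative choices (assumptions, not from the article)
f, fp = (lambda x: x * x), (lambda x: 2 * x)  # f and f'
g, gp = math.sin, math.cos                    # g and g'

x0, x = 1.0, 1.3
y0 = f(x0)
L = gp(y0) * fp(x0)  # L := g'(f(x0)) f'(x0)

# error terms exactly as defined in the text
E_f = f(x) - (f(x0) + fp(x0) * (x - x0))
E_g = g(f(x)) - (g(y0) + gp(y0) * (f(x) - y0))

lhs = g(f(x)) - (g(f(x0)) + L * (x - x0))
rhs = gp(y0) * E_f + E_g
print(abs(lhs - rhs))  # zero up to floating-point rounding
```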

Thus our goal now is to show

<math display="block">|g'(f(x_0))E_f(x,x_0) + E_g(f(x),f(x_0))| \leq \epsilon|x - x_0|</math>

Old proof

Since <math>g</math> is differentiable at <math>y_0 := f(x_0)</math>, we know <math>g'(y_0)</math> is a real number, and we can write

<math display="block">g(y) = g(y_0) + g'(y_0)(y - y_0) + [g(y) - (g(y_0) + g'(y_0)(y - y_0))]</math>

(there is no magic: the terms just cancel out)

If we define <math>E_g(y,y_0) := g(y) - (g(y_0) + g'(y_0)(y - y_0))</math>, we can write

<math display="block">g(y) = g(y_0) + g'(y_0)(y - y_0) + E_g(y,y_0)</math>

Newton's approximation says that for any <math>\epsilon_2 > 0</math> there is a <math>\delta_2 > 0</math> such that <math>|E_g(y,y_0)| \leq \epsilon_2|y - y_0|</math> as long as <math>|y - y_0| \leq \delta_2</math>.

Since <math>f</math> is differentiable at <math>x_0</math>, it must also be continuous at <math>x_0</math>. This means we can keep <math>|f(x) - y_0| \leq \delta_2</math> as long as we keep <math>|x - x_0| \leq \delta'</math> for some sufficiently small <math>\delta' > 0</math>.

Since <math>f(x) \in Y</math> and <math>|f(x) - y_0| \leq \delta_2</math>, this means we can substitute <math>y = f(x)</math> and get

<math display="block">g(f(x)) = g(y_0) + g'(f(x_0))(f(x) - y_0) + E_g(f(x),y_0)</math>

Now we use the differentiability of f. We can write

<math display="block">f(x) = f(x_0) + f'(x_0)(x - x_0) + [f(x) - (f(x_0) + f'(x_0)(x - x_0))]</math>

Again, we can define <math>E_f(x,x_0) := f(x) - (f(x_0) + f'(x_0)(x - x_0))</math> and write this as

<math display="block">f(x) = f(x_0) + f'(x_0)(x - x_0) + E_f(x,x_0)</math>

Now we can substitute this into the expression for <math>g(f(x))</math> to get

<math display="block">g(f(x)) = g(y_0) + g'(f(x_0))(f'(x_0)(x - x_0) + E_f(x,x_0)) + E_g(f(x),f(x_0))</math>

where we have canceled out two terms using <math>f(x_0) = y_0</math>.

Thus we have

<math display="block">g(f(x)) = g(y_0) + g'(f(x_0))f'(x_0)(x - x_0) + [g'(f(x_0))E_f(x,x_0) + E_g(f(x),f(x_0))]</math>

We can write this as

<math display="block">(g \circ f)(x) - ((g \circ f)(x_0) + L(x - x_0)) = g'(f(x_0))E_f(x,x_0) + E_g(f(x),f(x_0))</math>

where <math>L := g'(f(x_0))f'(x_0)</math>. Now the left-hand side looks like the expression in Newton's approximation. This means that to show <math>g \circ f</math> is differentiable at <math>x_0</math>, we just need to show that <math>|g'(f(x_0))E_f(x,x_0) + E_g(f(x),f(x_0))| \leq \epsilon|x - x_0|</math>.

The stuff in square brackets is our "error term" for <math>g \circ f</math>. Now we just need to make sure it is small, even after dividing by <math>|x - x_0|</math>.

But <math>f</math> is differentiable at <math>x_0</math>, so by Newton's approximation, for any <math>\epsilon_1 > 0</math> there is a <math>\delta_1 > 0</math> such that <math>|E_f(x,x_0)| \leq \epsilon_1|x - x_0|</math> whenever <math>|x - x_0| \leq \delta_1</math>, hence

<math display="block">|g'(f(x_0))E_f(x,x_0)| \leq |g'(f(x_0))|\epsilon_1|x - x_0|</math>

We also have

<math display="block">|E_g(f(x),f(x_0))| \leq \epsilon_2|f(x) - f(x_0)| = \epsilon_2|f'(x_0)(x - x_0) + E_f(x,x_0)|</math>

We can bound this from above using the triangle inequality:

<math display="block">|E_g(f(x),f(x_0))| \leq \epsilon_2|f'(x_0)(x - x_0)| + \epsilon_2|E_f(x,x_0)| \leq \epsilon_2|f'(x_0)||x - x_0| + \epsilon_2\epsilon_1|x - x_0|</math>

Now we can just choose <math>\epsilon_1, \epsilon_2</math> small enough that <math>|g'(f(x_0))|\epsilon_1 + \epsilon_2|f'(x_0)| + \epsilon_2\epsilon_1 \leq \epsilon</math>.
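Collecting the two bounds, the error term is at most <math>(|g'(f(x_0))|\epsilon_1 + \epsilon_2|f'(x_0)| + \epsilon_2\epsilon_1)|x - x_0|</math>, so it suffices to make that constant at most <math>\epsilon</math>. One concrete way to do the bookkeeping (an illustrative sketch, not the article's own choice) is:

```python
# Illustrative bookkeeping (a sketch, not from the article): given eps > 0
# and the fixed constants A = |g'(f(x0))| and B = |f'(x0)|, pick eps1, eps2
# so that A*eps1 + eps2*B + eps2*eps1 <= eps.
def pick_eps(eps, A, B):
    eps1 = min(1.0, eps / (3 * (A + 1)))  # makes A*eps1 <= eps/3 and eps1 <= 1
    eps2 = eps / (3 * (B + 1))            # makes eps2*B <= eps/3 and eps2*eps1 <= eps/3
    assert A * eps1 + eps2 * B + eps2 * eps1 <= eps
    return eps1, eps2

print(pick_eps(0.5, A=2.0, B=3.0))
```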

Limits of sequences