
What does it mean for a polynomial to be the 'best' approximation of a function around a point?

I think I understand how to use Taylor polynomials to approximate sine. For instance, suppose
$$ \sin x \approx ax^2+bx+c $$
and we want the approximation to be particularly accurate when $x$ is close to $0$. We could adopt the following approach. When $x=0$, $\sin x = 0$, so we need $ax^2+bx+c=0$ at $x=0$, meaning that $c=0$. Therefore we get
$$ \sin x \approx ax^2+bx $$
If we want the first derivatives to match at $x=0$, then $\frac{d}{dx}(ax^2+bx)=2ax+b$ should equal $\cos 0=1$ there, so $b=1$:
$$ \sin x \approx ax^2+x $$
Finally, if we want the second derivatives to match at $x=0$, then $\frac{d^2}{dx^2}(ax^2+x)=2a$ should equal $-\sin 0=0$, so $a=0$. This leaves the small-angle approximation for sine:
$$ \sin x \approx x $$

All of this makes sense to me. What I don't understand is when people try to put this on a rigorous footing. I have often heard people say 'this shows that $x$ is the best quadratic approximation of $\sin x$ when $x$ is near $0$'. But what is meant by 'best' and 'near'? If the approximation suddenly became terrible at $x=0.5$, would that be considered close enough to $0$ for there to be a problem? It seems that there are formal definitions for these terms, but I don't know what they are.
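
To make the derivation above concrete, here is a small numerical check (a Python sketch, not part of the original question): it compares $\sin x$ with the approximation $x$ at a few sample points approaching $0$, alongside the quantity $x^3/6$ for reference. The sample points and the comparison column are illustrative choices of mine.

```python
import math

# Illustrative check: compare sin(x) with the approximation x derived
# above by matching the value, first derivative, and second derivative
# at 0. The column x**3/6 is the standard Taylor-remainder bound for
# sin, included only for comparison (an added illustration, not part of
# the original question).
for x in [0.5, 0.1, 0.01, 0.001]:
    error = abs(math.sin(x) - x)
    print(f"x = {x:<6}  |sin(x) - x| = {error:.3e}   x^3/6 = {x**3 / 6:.3e}")
```

Running this, the error at $x=0.5$ is already only about $0.02$, and as $x$ shrinks the error falls off roughly like $x^3/6$, i.e. faster than any quadratic term would; this numerical behaviour is the kind of thing the phrase 'best near $0$' is gesturing at, though it does not by itself supply the formal definition being asked about.
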



from Hot Weekly Questions - Mathematics Stack Exchange
Joe
