Dot Products and Approximations
1) Application: Function Approximation and Accuracy
Do you remember discovering the irrational number $\pi$?
How many digits did you end up learning?
No matter how many digits you learn, you never have perfect accuracy. But if you memorize 60 digits, then your level of accuracy is close to the ratio of the largest distance in the known universe to the smallest.
We would like to model (approximate) the function $f(x) = e^x$ on the interval $[-1,1]$ with polynomials. We'll start by limiting our degree to 2 (kind of like accuracy to 2 decimal places).
What do you think is the most accurate way to approximate $e^x$ with a quadratic function $ax^2 + bx + c$?
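One natural candidate is the degree-2 Taylor polynomial $1 + x + \tfrac{x^2}{2}$; another is the orthogonal projection of $e^x$ onto the quadratics with respect to the inner product $\langle f, g \rangle = \int_{-1}^{1} f\,g\,dx$. Here is a small numerical sketch (assuming numpy, and using the Legendre polynomials $P_0, P_1, P_2$ as an orthogonal basis, with $\|P_n\|^2 = \tfrac{2}{2n+1}$) comparing their worst-case errors on $[-1,1]$:

```python
import numpy as np

x = np.linspace(-1, 1, 2001)
f = np.exp(x)

def ip(u, v):
    # Inner product <u, v> = integral of u*v over [-1,1], trapezoid rule
    y = u * v
    return float(np.sum((y[:-1] + y[1:]) / 2) * (x[1] - x[0]))

# Taylor (Maclaurin) quadratic: 1 + x + x^2/2
taylor = 1 + x + x**2 / 2

# Projection onto span{P0, P1, P2} (Legendre polynomials)
proj = np.zeros_like(x)
for n, Pn in enumerate([np.ones_like(x), x, (3 * x**2 - 1) / 2]):
    proj += ip(f, Pn) / (2 / (2 * n + 1)) * Pn

taylor_err = np.max(np.abs(f - taylor))  # ≈ 0.218, worst at x = 1
proj_err = np.max(np.abs(f - proj))      # noticeably smaller
print(taylor_err, proj_err)
```

The Taylor polynomial is unbeatable right at $x=0$, but the projection spreads its error evenly and wins over the whole interval.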
2) Which Functions to Use
In the approximation above, the Taylor series centered at $x=0$ (also known as the Maclaurin series) was the best approximation if you are allowed to zoom in on the point $(0,1)$ as far as you need to beat any other function. But the projection function, while not as close at $x=0$, did a much better job over the whole interval $[-1,1]$.
So why don't we learn this method instead of Taylor series? There are two reasons. One is that the Taylor series is really important for convergence questions and has many extensions in later math classes. The second reason is why I didn't show you any calculations for the second approximation: the monomials $1, x, x^2$ are not orthogonal to each other under the inner product of our problem. This means some really tedious computations had to be done to produce orthogonal versions of them.
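To see the failure of orthogonality concretely, here is a small check (a sketch, using Python's `fractions` module for exact arithmetic) of the inner products $\langle x^m, x^n \rangle = \int_{-1}^{1} x^{m+n}\,dx$ among the monomials:

```python
from fractions import Fraction

def mono_ip(m, n):
    # <x^m, x^n> = integral of x^(m+n) over [-1,1]:
    # zero when m+n is odd (odd symmetry), else 2/(m+n+1)
    p = m + n
    return Fraction(2, p + 1) if p % 2 == 0 else Fraction(0)

print(mono_ip(0, 1))  # <1, x>   = 0:   orthogonal
print(mono_ip(1, 2))  # <x, x^2> = 0:   orthogonal
print(mono_ip(0, 2))  # <1, x^2> = 2/3: NOT orthogonal
```

Running Gram-Schmidt on $1, x, x^2$ with this inner product replaces $x^2$ by $x^2 - \tfrac{1}{3}$, and continuing the process produces (scalar multiples of) the Legendre polynomials.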
Why are they complicated? Imagine learning the first 3 digits of $\pi$, and then when you get to the 4th digit you are told that now you have to go back and change some of the previous digits to keep getting more accurate. Yuck! Watch the polynomial coefficients below to see what I mean.
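One way to watch this happen (a sketch, assuming numpy; `np.polyfit` performs a discrete least-squares fit over a dense grid, which approximates the continuous projection) is to fit $e^x$ on $[-1,1]$ with monomial-basis polynomials of increasing degree and watch the lower coefficients move:

```python
import numpy as np

x = np.linspace(-1, 1, 2001)
f = np.exp(x)

# Least-squares fits of increasing degree in the monomial basis.
# np.polyfit returns highest power first, so reverse to constant-first.
fits = {deg: np.polyfit(x, f, deg)[::-1] for deg in (1, 2, 3)}
for deg, coeffs in fits.items():
    print(deg, np.round(coeffs, 4))
# Raising the degree to 2 changes the constant term; raising it
# to 3 changes the linear term. The earlier "digits" don't stay put.
```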
3) Sine and Cosine to the Rescue
What we will be studying in the near future is function approximation in the same sense as the previous example: how well our approximation stays close to a given function over the whole interval, not just at a single point.
It turns out that while monomial terms like $x^n$ are not very fun to work with in this sense, functions of the form $\sin(nx)$ and $\cos(nx)$ work incredibly well for this very thing, and there are a lot of applications.
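A quick numerical check (a sketch, assuming numpy) of why these functions are so pleasant: distinct functions from the family $\sin(nx), \cos(nx)$ are orthogonal under the inner product $\int_{-\pi}^{\pi} f\,g\,dx$:

```python
import numpy as np

x = np.linspace(-np.pi, np.pi, 20001)

def ip(u, v):
    # Inner product <u, v> = integral of u*v over [-pi, pi], trapezoid rule
    y = u * v
    return float(np.sum((y[:-1] + y[1:]) / 2) * (x[1] - x[0]))

print(ip(np.sin(x), np.sin(2 * x)))      # ≈ 0: orthogonal
print(ip(np.sin(x), np.cos(3 * x)))      # ≈ 0: orthogonal
print(ip(np.sin(2 * x), np.sin(2 * x)))  # ≈ π: same "length" for each
```

Because of this orthogonality, adding a new $\sin(nx)$ or $\cos(nx)$ term never forces you to revisit the coefficients you already computed, which is exactly the property the monomials lacked.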