stoutfellow wrote, 2008-10-07 11:06 am
Ramble, Part 59: The Power of Series
I was going to talk about Joseph Fourier next, but I think I'd better drop back a bit and discuss the importance of power series in analysis first.
When Leonhard Euler introduced the concept of a function, his definition was more or less rule-based: a function f(x) was defined by giving a method of constructing f(x) from x. The easiest such rules are those given by polynomials - sums of constant multiples of powers of x, such as x² + 3x - 2 or 3x¹⁰⁰ + 24. Now, polynomials are nice. You can evaluate them with a four-function calculator, or with pen and paper; it may be tedious, but the procedure is straightforward and the results as exact as the inputs. Other sorts of function - the trig functions, exponential and logarithm functions, and so on - are much harder to work with, and in the pre-electronic age were generally dealt with by means of tables. Furthermore, polynomials are easy to manipulate. If, say, you multiply sin x by cos x, you get sin x cos x, and there's not much more to say about it; but the product of x² - x - 1 with x² + x - 3 simplifies quickly to x⁴ - 5x² + 2x + 3. (OK, so it doesn't look much simpler, but in terms of computation, with paper or calculator, it's a bit easier to deal with.)
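The "straightforward procedure" for multiplying polynomials is just a convolution of their coefficient lists. Here is a minimal sketch; the function name poly_mul and the lowest-power-first coefficient convention are my own illustration, not anything from the post.

```python
def poly_mul(p, q):
    """Multiply two polynomials given as coefficient lists [c0, c1, c2, ...],
    lowest power first. The product's coefficients are the convolution of
    the inputs' coefficients."""
    result = [0] * (len(p) + len(q) - 1)
    for i, a in enumerate(p):
        for j, b in enumerate(q):
            result[i + j] += a * b
    return result

# (x^2 - x - 1)(x^2 + x - 3) = x^4 - 5x^2 + 2x + 3
print(poly_mul([-1, -1, 1], [-3, 1, 1]))  # [3, 2, -5, 0, 1]
```

Reading the output lowest power first gives 3 + 2x - 5x² + 0x³ + x⁴, exactly the product computed in the paragraph above.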
Brook Taylor's discovery of Taylor polynomials and Taylor series made it possible, in many cases, to apply the ease of computation with polynomials to computations with other sorts of function. Basically, he showed that many familiar functions could be represented as power series - as limits of polynomial functions. A power series can be thought of as a polynomial with infinitely many terms, and manipulating them is no harder (in principle) than dealing with polynomials. (For example, sin x has Taylor series x - x³/6 + ..., and cos x is 1 - x²/2 + ...; the Taylor series of their product can be obtained by simple multiplication, beginning x - 2x³/3 + ....) Though this method does not allow for exact computation of the values of, e.g., sin x, still, it can be used to produce arbitrarily good approximations.
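The "arbitrarily good approximations" can be seen directly by summing partial Taylor series for sin x and watching the error shrink as terms are added. A small sketch, assuming the standard series sin x = x - x³/3! + x⁵/5! - ...; the function name sin_taylor is my own.

```python
import math

def sin_taylor(x, terms):
    """Sum the first `terms` nonzero terms of the Taylor series for sin x:
    x - x^3/3! + x^5/5! - ..."""
    total = 0.0
    for n in range(terms):
        total += (-1) ** n * x ** (2 * n + 1) / math.factorial(2 * n + 1)
    return total

# Each additional term shrinks the error at x = 1:
for terms in (1, 2, 3, 5):
    print(terms, abs(sin_taylor(1.0, terms) - math.sin(1.0)))
```

With just five nonzero terms the partial sum already agrees with sin 1 to better than eight decimal places, which is the practical payoff Taylor's representation offered to hand computation.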
Joseph Lagrange attempted to shore up the shaky foundations of Newtonian calculus by means of Taylor series - or, more generally, power series; rather than beginning with derivatives and constructing the Taylor series, he proposed starting with the representation of functions by power series and defining the derivative from that. The attempt was unsuccessful, since the core of the problem had to do with the inadequacy of Newton's treatment of limits, and limits are just as necessary for dealing with series as they are for derivatives.
Nonetheless, the treatment of functions by means of power series proved very useful. Functions which can be represented by power series are (if I may use the word again) nice; they are differentiable arbitrarily many times, their radius of convergence - and hence where singularities can occur - can be read off from the coefficients of the series, and they can be algebraically manipulated almost as easily as polynomials. In the hands of Karl Weierstrass in the nineteenth century, power series provided one of the two main ways of thinking about analysis in the complex numbers, and - to my mind - the more beautiful way.
However, the very niceness of this approach proved a handicap, as there are other functions, not representable in this way, that turn out to be useful in both pure and applied mathematics; it was the work of Joseph Fourier that showed the way to a more difficult, but more rewarding, line of attack.
Ramble Contents