Analytic functions (part 1)
July 9, 2012
I am obligated to mention that I made a mistake when talking about holomorphic functions. To check that $f = u + iv$ is holomorphic, you must check that $f$ is continuous and that the partials $u_x, u_y, v_x, v_y$ exist and satisfy the Cauchy-Riemann equations. I forgot to mention that $f$ needs to be continuous. The proof relies on the continuity, and I even used this fact in the examples.
Now let’s talk about analytic functions and power series. Let $\Omega$ be an open set in $\mathbb{C}$, and let $f : \Omega \to \mathbb{C}$. Then we say $f$ is analytic if, locally, it is given by a power series. This is to say, it doesn’t have to have a unique power series representation on all of $\Omega$, but if you pick a point $z_0 \in \Omega$, there’s a neighborhood around $z_0$ on which $f$ is given by a power series centered at $z_0$. By a power series centered at $z_0$, I mean that, in this small neighborhood of $z_0$,

$$f(z) = \sum_{n=0}^{\infty} a_n (z - z_0)^n$$
for coefficients $a_n \in \mathbb{C}$. Don’t forget that even though we can pretend, we really don’t add up infinitely many terms. What we have really done is defined the finite partial sum

$$S_N(z) = \sum_{n=0}^{N} a_n (z - z_0)^n,$$

and let

$$f(z) = \lim_{N \to \infty} S_N(z).$$
The distinction is important.
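To make the distinction concrete, here is a minimal sketch of partial sums at work, using the coefficients $a_n = 1/n!$ (the exponential series) as my own illustrative example; the name `partial_sum` is not from the post.

```python
import cmath
import math

def partial_sum(z, N):
    """S_N(z) = sum_{n=0}^{N} a_n * z^n with the illustrative choice a_n = 1/n!."""
    return sum(z**n / math.factorial(n) for n in range(N + 1))

z = 1 + 1j
# Each S_N is an honest finite sum; f(z) is *defined* as their limit.
approximations = [partial_sum(z, N) for N in (2, 5, 10, 20)]
print(approximations)
print(cmath.exp(z))  # the limit these partial sums approach
```

For this particular series the partial sums settle down very quickly, but the definition via $S_N$ and a limit is the same for any power series.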
But these power series don’t always converge. In the case of real power series, recall that

$$\frac{1}{1 - x} = \sum_{n=0}^{\infty} x^n,$$

but this only makes sense where the right-hand side converges (between $-1$ and $1$). A similar thing happens over $\mathbb{C}$.
For a power series like the one given above, define

$$R = \frac{1}{\limsup_{n \to \infty} |a_n|^{1/n}}.$$
The astute observer will recognize that this could be $\frac{1}{0}$ or $\frac{1}{\infty}$. In such a case, we just say that $R = \infty$ or $R = 0$, respectively. We call $R$ the radius of convergence, for reasons that the next theorem should make clear. For example, $R = \infty$ will mean the power series converges everywhere. There are a few points to note here. If the power series is finite (a polynomial), then for $n$ large enough, $a_n = 0$, meaning $R = \infty$. This makes sense, because polynomials are well defined on all of $\mathbb{C}$. Also, recognize that this definition is perfectly valid for real power series. In particular, for $\sum_{n=0}^{\infty} x^n$, the radius of convergence is $1$.
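A quick numerical sketch of this definition: compute $|a_n|^{1/n}$ for a few coefficient sequences and watch what the $\limsup$ should be. The coefficient choices below are my own examples, not from the post.

```python
import math

def nth_root_terms(a, n_max):
    """Return the sequence |a(n)|^(1/n) for n = 1 .. n_max."""
    return [abs(a(n)) ** (1.0 / n) for n in range(1, n_max + 1)]

geometric = nth_root_terms(lambda n: 2**n, 50)                    # limsup 2,  so R = 1/2
constant  = nth_root_terms(lambda n: 1, 50)                       # limsup 1,  so R = 1
exp_like  = nth_root_terms(lambda n: 1 / math.factorial(n), 50)   # limsup 0,  so R = infinity

print(geometric[-1], constant[-1], exp_like[-1])
```

The last sequence drifts toward $0$, matching the claim that the exponential-type series converges everywhere, while $a_n = 1$ gives the radius $1$ mentioned above for $\sum x^n$.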
Lemma 1 (Root Test) Let $\sum_{n=0}^{\infty} a_n (z - z_0)^n$ be a power series, and define

$$R = \frac{1}{\limsup_{n \to \infty} |a_n|^{1/n}}.$$

Then if $|z - z_0| < R$, the series converges. If $|z - z_0| > R$, it diverges. We make no claims about the case $|z - z_0| = R$.
Proof: We need to treat the cases $R = 0$ and $R = \infty$ separately. I won’t bother doing them at all, because they’re really much easier than the case where $0 < R < \infty$. Suppose first that $|z - z_0| > R$. Then $\limsup_{n \to \infty} |a_n|^{1/n} = \frac{1}{R} > \frac{1}{|z - z_0|}$, meaning there are infinitely many $n$ for which $|a_n|^{1/n} > \frac{1}{|z - z_0|}$. That is, there are infinitely many $n$ for which

$$|a_n (z - z_0)^n| > 1.$$
Thus, the terms in the sum do not converge to zero, so there is no way the sum could converge.
On the other hand, if $|z - z_0| < R$, then $\limsup_{n \to \infty} |a_n|^{1/n} < \frac{1}{|z - z_0|}$. From the definition of $\limsup$, this means there is some point after which all entries are less than $\frac{1}{|z - z_0|}$. More technically, there is some number $N$ such that for each $n \geq N$,

$$|a_n|^{1/n} < \frac{1}{|z - z_0|}.$$
We can even do a bit better than that. Just being less than $\frac{1}{|z - z_0|}$, the terms $|a_n|^{1/n}$ could get arbitrarily close to $\frac{1}{|z - z_0|}$. Of course, if that were the case, then $\frac{1}{|z - z_0|}$ would be the $\limsup$, which it’s not (it’s greater than the $\limsup$). So in fact there is some real number $r < \frac{1}{|z - z_0|}$ for which, whenever $n \geq N$,

$$|a_n|^{1/n} < r.$$
Now, it follows that $|a_n (z - z_0)^n| < (r |z - z_0|)^n$. Since $r |z - z_0| < 1$, we have shown that the terms in the power series (after a certain point) grow slower in magnitude than the terms of a geometric series of ratio less than one. Thus, by comparison with that convergent geometric series, $\sum_{n=0}^{\infty} a_n (z - z_0)^n$ must converge.
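Here is a small sketch of the comparison step in the simplest setting: take $a_n = 1$ (so $R = 1$) and a point with $|z - z_0| = 0.9 < R$. The terms $|a_n (z - z_0)^n|$ are themselves a geometric sequence with ratio less than one, so the partial sums converge to $\frac{1}{1 - 0.9} = 10$. The variable names are my own.

```python
# |a_n (z - z_0)^n| with a_n = 1 and |z - z_0| = 0.9 < R = 1
abs_z = 0.9
terms = [abs_z**n for n in range(200)]

# Accumulate the partial sums; they are bounded by the geometric limit.
partial_sums = []
total = 0.0
for t in terms:
    total += t
    partial_sums.append(total)

print(terms[-1])          # the terms shrink geometrically toward 0
print(partial_sums[-1])   # the partial sums approach 1 / (1 - 0.9) = 10
```

Outside the radius the picture flips: with $|z - z_0| > 1$ the terms $|z - z_0|^n$ blow up instead of shrinking, which is exactly the divergence half of the proof.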
This was a proof of the root test, and a first step towards what we’ll prove next time: that analytic functions are holomorphic.