July 3, 2012
Most courses in complex analysis start with a definition of $\mathbb{C}$, its topological structure, and so on. I'm going to assume you know that $\mathbb{C}$ is a field, that it is a complete metric space, and how all that junk works (i.e., how to multiply complex numbers, compute distances, etc.). You may also know that $\mathbb{C}$ is algebraically closed, but we'll prove that later. The most important thing to remember is that $\mathbb{C}$ and $\mathbb{R}^2$ are virtually identical. You can see the correspondence: $x + iy$ is like the point $(x, y)$. The only difference between the two is that I know how to multiply in $\mathbb{C}$, but I know how to do no such thing in $\mathbb{R}^2$. We'll be using this identification from time to time to pull theorems out of real analysis. But anyway. On to the complex stuff.
Firstly, I suppose we should define a derivative. If $U$ is an open subset of $\mathbb{C}$, and $f : U \to \mathbb{C}$, then we say $f$ is differentiable at $z_0 \in U$ if
$$\lim_{z \to z_0} \frac{f(z) - f(z_0)}{z - z_0}$$

exists. Notice that we require the limit to exist no matter which way we take $z$ to $z_0$. Straight lines, loopy spirals, whatever. They all have to converge to the same value. As you might expect, whatever complex number the limit might be, we say that it is $f'(z_0)$, or sometimes $\frac{df}{dz}(z_0)$.
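To see numerically why the "no matter which way" requirement has teeth, here's a quick sketch (helper names are my own, not from the post) comparing difference quotients along the real and imaginary directions. For $f(z) = z^2$ both directions agree; for $f(z) = \bar{z}$ they disagree, so conjugation is not complex differentiable anywhere.

```python
def diff_quotient(f, z0, h):
    """Difference quotient (f(z0 + h) - f(z0)) / h for a complex step h."""
    return (f(z0 + h) - f(z0)) / h

z0 = 1 + 1j
step = 1e-8

# f(z) = z^2 is complex differentiable: the quotient approaches 2*z0
# whether we step along the real axis or the imaginary axis.
square = lambda z: z * z
square_real = diff_quotient(square, z0, step)       # approximately 2 + 2j
square_imag = diff_quotient(square, z0, step * 1j)  # approximately 2 + 2j

# f(z) = conj(z) is NOT complex differentiable: stepping along the real
# axis gives 1, but stepping along the imaginary axis gives -1.
conj = lambda z: z.conjugate()
conj_real = diff_quotient(conj, z0, step)           # 1
conj_imag = diff_quotient(conj, z0, step * 1j)      # -1
```

The two values for $\bar{z}$ differ no matter how small the step, which is exactly the failure of the limit to exist.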
It’s worth mentioning as well that all the basic rules of computation go through exactly as you would expect. Differentiation is a linear operator; it satisfies the product rule, quotient rule, and chain rule. The proofs of these facts are all nearly identical to their counterparts over $\mathbb{R}$. Because of this, these facts don’t really qualify as complex analysis, and so I won’t bother to prove them here.
What I want to discuss is the question “how nice can a function be?” Of course, functions can be hideously discontinuous, but forget about those. In fact, let’s only worry about the functions that are continuous and differentiable. That’s a pretty nice class of functions that one might guess has some nice properties we could discover. What about functions that have two derivatives? That is, functions $f$ for which $f'$ and $f''$ both exist? Or $n$-th order derivatives for some $n$? Or functions that have derivatives of every order (i.e., are infinitely differentiable)?
This seems like more than we could ever want to say about functions, but it isn’t. Another nice property a function could have is that it’s given locally by a power series. That is, for $z$ close enough to some point $z_0$, we could have
$$f(z) = \sum_{n=0}^{\infty} a_n (z - z_0)^n,$$

where $a_n \in \mathbb{C}$ for each $n$. We call these functions complex analytic (or just analytic if the context is clear). Convince yourself (without really proving it) that convergent power series should be infinitely differentiable (right? They’re like infinite polynomials!).
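To make "given locally by a power series" concrete, here's a small sketch (the helper name is mine, not from the post) using the familiar series for the exponential, $e^z = \sum_{n \ge 0} z^n / n!$, centered at $z_0 = 0$, and checking its partial sums against Python's `cmath.exp`:

```python
import cmath
from math import factorial

def exp_partial_sum(z, terms):
    """Partial sum of the power series sum_{n=0}^{terms-1} z^n / n!."""
    return sum(z**n / factorial(n) for n in range(terms))

z = 0.5 + 0.3j
approx = exp_partial_sum(z, 20)  # 20 terms of the series
exact = cmath.exp(z)
# Near the center, the partial sums converge to exp(z) very quickly:
# the error after 20 terms is already below machine precision here.
```

Of course, one numerical check proves nothing, but it's a good sanity check on the idea that an analytic function is locally pinned down by its coefficients $a_n$.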
Over the next several posts, we’ll be proving the following fun fact:
Let $f : U \to \mathbb{C}$. The following are equivalent:
- $f$ is differentiable
- $f$ is infinitely differentiable
- $f$ is analytic
By the way, the theorems are automatically numbered, and I’m not sure yet how to control this. So in future posts, this might not be called “theorem 1.” Just a heads up.