Why don't power series seem to work for imaginary numbers some of the time? e.g. trying to expand ln(-1).
There could, though, be other ways of expanding out ln(-1) apart from substituting x = -2 into the ln(1 + x) expansion (which of course doesn't work, for the reason stated by Anonymous).
In other words, if you want a Taylor expansion for log which converges at x = i, you must expand about a point closer to i than to 0. You could try this, but doubtless the resulting series would be evil and you'd need to be a Professor of Combinatorics to evaluate the coefficient of a general term.
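For what it's worth, here is a sketch of one such expansion (a standard manipulation of my own, not established anywhere in this thread), and the coefficients come out less evil than feared: expanding log about a point a ≠ 0 just reuses the ln(1 + u) series with u = (z - a)/a.

```latex
% Expand log about a point a != 0 by reusing the ln(1 + u) series with u = (z - a)/a:
\log z = \log a + \log\!\left(1 + \frac{z - a}{a}\right)
       = \log a + \sum_{n=1}^{\infty} \frac{(-1)^{n+1}}{n} \left(\frac{z - a}{a}\right)^{n},
\qquad |z - a| < |a|.
```

Setting z = i, the convergence condition |i - a| < |a| says exactly that a must be closer to i than to 0.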
I'm still not sure I entirely understand the reason why this method doesn't work though.
OK, so the problem is the one Anonymous states. Quite simply, a Maclaurin series expansion won't always converge. Another example:
f(x) = 1/(1 + x)
So f^(n)(x) = (-1)^n n!/(1 + x)^(n+1)
So using the Maclaurin series we get:
f(x) = 1 - x + x^2 - ...
Now this is only convergent if modulus(x) < 1, yet 1/(1 + x) is defined everywhere except x = -1.
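A quick numerical sketch of my own (the helper partial_sum is just illustrative) makes the cutoff visible: the partial sums of 1 - x + x^2 - ... settle down to 1/(1 + x) for modulus(x) < 1 and blow up for modulus(x) > 1.

```python
# Partial sums of the Maclaurin series 1 - x + x^2 - ... versus 1/(1 + x).
# For |x| < 1 the partial sums approach 1/(1 + x); for |x| > 1 they blow up.

def partial_sum(x, n_terms):
    """Sum of the first n_terms of sum_{n>=0} (-x)^n."""
    return sum((-x) ** n for n in range(n_terms))

for x in (0.5, 0.9, 1.5):
    print(f"x = {x}: series -> {partial_sum(x, 50):.6g}, 1/(1+x) = {1 / (1 + x):.6g}")
```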
So the Maclaurin formula:
f(x) = f(0) + f'(0)x/1! + f''(0)x^2/2! + ...
only holds under certain conditions. I am not absolutely certain what these conditions are, but normally it is a safe bet that if the expansion converges and the function is nicely behaved, then the Maclaurin expansion will work.
(Obviously the definition of a "nicely behaved function" is one for which the Maclaurin expansion holds.)
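For the record, the precise condition (a standard textbook fact, not something established in this thread) comes from Taylor's theorem with remainder:

```latex
% Taylor's theorem with the Lagrange form of the remainder:
f(x) = \sum_{k=0}^{n} \frac{f^{(k)}(0)}{k!}\, x^{k} + R_n(x),
\qquad
R_n(x) = \frac{f^{(n+1)}(\xi)}{(n+1)!}\, x^{n+1}
\quad \text{for some } \xi \text{ between } 0 \text{ and } x,
```

and f equals its Maclaurin series exactly where R_n(x) -> 0. The series can even converge without converging to f: the classic example is f(x) = e^(-1/x^2) with f(0) = 0, whose Maclaurin series is identically zero.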
Taking liberties with notation, at x = -1, log(1 + x) = -infinity. So the function log has a 'singularity' at 0 in the complex plane.
If you draw a circle about the point 1 (the point about which we are expanding the Taylor series) through 0, then the power series converges only within the disc drawn.
Or at least if it does converge, it won't converge to the correct value.
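One can check this numerically (my own sketch; the function log_series is just illustrative): summing the series for log z about the point 1 at a point inside the disc |z - 1| < 1 reproduces cmath.log, while at a point outside the disc the partial sums blow up.

```python
import cmath

def log_series(z, n_terms=200):
    """Partial sum of log z expanded about 1: sum_{n>=1} (-1)^(n+1) (z-1)^n / n."""
    return sum((-1) ** (n + 1) * (z - 1) ** n / n for n in range(1, n_terms + 1))

inside = 1 + 0.5j    # |z - 1| = 0.5 < 1: inside the disc of convergence
outside = -1 + 0.0j  # |z - 1| = 2 > 1: outside the disc (this is ln(-1))

print(log_series(inside), cmath.log(inside))    # these agree
print(abs(log_series(outside)))                 # partial sums blow up
```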
I am not sure, but I don't think a series will converge at all outside its radius of convergence, will it?
James - I don't believe it does converge for x = -2. The series for ln(1 + x) is actually:
ln(1 + x) = x - x^2/2 + x^3/3 - ...
and if you substitute in x = -2, the terms are tending to infinity; a series whose terms don't tend to 0 cannot converge.
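Just to make that concrete (my own quick check): the n-th term of the series at x = -2 has magnitude 2^n/n, which grows without bound.

```python
# Magnitude of the n-th term of the ln(1 + x) series at x = -2: |(-2)^n / n| = 2^n / n.
for n in (1, 5, 10, 20, 40):
    print(n, 2 ** n / n)
```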
I think you're right actually: I think they do always diverge outside the radius of convergence; at a point on the boundary they may or may not - is that right?
Yeah, I reckon that's the way things are.
That they may or may not converge on the boundary is certainly true. (For example, the series for ln(1 + x) at x = 1 does converge, while the series for 1/(1 - x) at x = 1 doesn't.) Proving the other statement looks like an interesting challenge...
Aha - the ratio test gives a straightforward way of proving James' assertion.
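A sketch of that argument (my own reconstruction, assuming the coefficient ratio has a limit): if |a_(n+1)/a_n| -> 1/R, then for |x| > R the ratio of successive terms of the series satisfies

```latex
\left| \frac{a_{n+1} x^{n+1}}{a_n x^n} \right|
  = \left| \frac{a_{n+1}}{a_n} \right| |x|
  \longrightarrow \frac{|x|}{R} > 1,
```

so the terms |a_n x^n| eventually increase, cannot tend to 0, and the series diverges. (In full generality, where the ratio limit need not exist, the same conclusion comes from the root test via the Cauchy-Hadamard formula for R.)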