Series for Γ'(x)/Γ(x)
By Brad Rodgers on Wednesday, June 20,
2001 - 02:26 am:
What is the series expansion for Γ'(x)/Γ(x)? Please explain how you obtain it.
Thanks,
Brad
[Editor: See Gamma and
Beta functions for a discussion of the above
function.]
By David Loeffler on Thursday, June 21,
2001 - 10:31 pm:
according to MAPLE.
(MAPLE also offers
, which can quite easily be proved
from the recurrence for the gamma function. So for a positive integer n, Γ'(n)/Γ(n) is the sum of the first n-1 terms of the harmonic series minus γ.)
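[Editor: For reference, the standard series for the digamma function ψ(x) = Γ'(x)/Γ(x), presumably (up to the form of the summand) the expansion MAPLE returns, is

    \psi(x) = \frac{\Gamma'(x)}{\Gamma(x)} = -\gamma + \sum_{k=0}^{\infty} \left( \frac{1}{k+1} - \frac{1}{k+x} \right).

Taking logarithmic derivatives in the recurrence Γ(x+1) = xΓ(x) gives ψ(x+1) = ψ(x) + 1/x, which for a positive integer n collapses the series to

    \psi(n) = -\gamma + \sum_{k=1}^{n-1} \frac{1}{k},

i.e. the first n-1 terms of the harmonic series minus the Euler-Mascheroni constant γ.]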
By Michael Doré on Friday, June 22, 2001 - 10:21 pm:
Interesting. Well we know that:
(this is provable by taking logs with Euler's infinite sine product and then
differentiating).
If you break up each term in the series using a geometric progression you
arrive at:
Now using David's formula we get:
Also:
So it looks to me like:
Is this true?
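[Editor: The cotangent expansion in question is presumably the standard partial-fraction series obtained by taking the logarithmic derivative of Euler's product sin(πx) = πx ∏_{n≥1} (1 - x²/n²):

    \pi \cot(\pi x) = \frac{1}{x} + \sum_{n=1}^{\infty} \frac{2x}{x^2 - n^2}.

"Breaking up each term using a geometric progression" then presumably means expanding 2x/(x²-n²) = -(2x/n²)(1 + x²/n² + x⁴/n⁴ + ...), valid for |x| < 1.]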
By David Loeffler on Friday, June 22, 2001
- 11:10 pm:
Are you sure about that expansion of the cot function? There
seems to be some problem with the constants (did you forget the
negative n terms in the product?) The result is actually
(again with some help from MAPLE and its ever-useful series() function.)
Can anyone see how we would prove that
? If we could show
that
it would follow (and this avoids using
any complicated properties of gamma). Any ideas?
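[Editor: A standard identity in this neighbourhood, possibly the result MAPLE produced, is the reflection formula

    \psi(1-x) - \psi(x) = \pi \cot(\pi x),

obtained by taking the logarithmic derivative of Γ(x)Γ(1-x) = π/sin(πx). Judging from the next post, the limit David wants proved is presumably

    \gamma = \lim_{n \to \infty} \left( 1 + \frac{1}{2} + \cdots + \frac{1}{n} - \ln n \right),

which, since ψ(n+1) = -γ + 1 + 1/2 + ... + 1/n, is the same as showing ψ(n+1) - ln n → 0.]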
By David Loeffler on Friday, June 22, 2001
- 11:32 pm:
Err - sorry to answer my own question, but I have found an
argument that sort of suggests why it might be true but an
analyst would probably pick holes in it very rapidly.
We have
By Stirling's formula, n! is asymptotically √(2πn)·(n/e)^n, so this is
so
as required.
Any ideas how we make this properly rigorous?
David
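[Editor: Here is a quick numerical sanity check in Python (using the mpmath library), not the rigorous argument David asks for, assuming the limit under discussion is H_n - ln n → γ. The last printed column shrinks roughly like 1/(2n).]

    # Hypothetical check: assumes the statement being discussed is that
    # the harmonic numbers H_n satisfy H_n - ln n -> gamma.
    from mpmath import mp, harmonic, log, euler

    mp.dps = 25                      # work with 25 significant digits
    for n in (10, 1000, 10**6):
        a_n = harmonic(n) - log(n)   # H_n - ln n
        print(n, a_n, a_n - euler)   # gap to Euler's constant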
By Brad Rodgers on Saturday, June 23, 2001
- 05:14 am:
I doubt that this shows any promise, but it's at least a bit interesting:
From
;
which by integration by parts,
As
And as
(sparing the case of
)
We know that
Which (aside from being a peculiar result), if someone can relate it to
,
would give us a reasonably rigorous proof.
David, what's Stirling's Theorem?
Thanks,
Brad
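[Editor: Brad's starting point is presumably the integral definition

    \Gamma(t) = \int_0^{\infty} x^{t-1} e^{-x} \, dx,

from which, differentiating under the integral sign and setting t = 1,

    \Gamma'(1) = \int_0^{\infty} e^{-x} \ln x \, dx = -\gamma,

a classical way of writing γ as an integral; the series Brad arrives at is taken up in the posts below.]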
By Michael Doré on Saturday, June 23, 2001 - 12:34 pm:
Stirling's theorem is: n! is asymptotic to √(2πn)·(n/e)^n.
(We say a_n is asymptotic to b_n if and only if a_n/b_n → 1 as n → ∞.)
Proof of Stirling's theorem is as follows.
For
we have:
Factorise:
Divide through by
which is positive:
Integrate this from 0 to
to get:
(*)
(Can you see why this step is valid? It may help to draw a diagram.)
Now let
.
Using (*) we get:
It follows that for
we have:
Therefore
is decreasing and if you add the inequality to itself for
successive values of
you obtain:
so
is bounded below. But it is a standard result of real analysis (a consequence of the completeness axiom) that any sequence which is bounded below and decreasing is convergent. Hence
for some
. We then have:
so
is asymptotic to
(**)
for some
. It suffices to show that
. To do this let
. Integration by parts gives
.
So we have:
and
Since
is decreasing we get:
Hence
as
. Plug in (**) and you get
as required.
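[Editor: The decreasing bounded sequence here is presumably d_n = ln(n!) - (n + 1/2) ln n + n, whose convergence gives (**): n! is asymptotic to C n^{n+1/2} e^{-n} for some constant C. The final step reads like the classical Wallis-integral argument: with I_n = ∫_0^{π/2} sin^n x dx, integration by parts gives I_n = ((n-1)/n) I_{n-2}, hence

    I_{2n} = \frac{(2n)!}{4^n (n!)^2} \cdot \frac{\pi}{2}, \qquad I_{2n+1} = \frac{4^n (n!)^2}{(2n+1)!},

and since I_n is decreasing, I_{2n+1}/I_{2n} → 1. Substituting the asymptotic form (**) into this ratio forces C = √(2π).]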
By David Loeffler on Saturday, June 23,
2001 - 11:24 pm:
Brad,
You have lost a ln somewhere, since your series seems to converge to
. In fact it converges alarmingly fast, with error about
for
.
(As for the presentation of the proof I think you have to let the upper
limit of the integrals be
, then let
at the end, otherwise
you are adding a series of terms all of which are actually infinity. But
that's a minor quibble.)
However, to show rigorously that your series is
looks like it
will be very difficult as so little is known about
other than its
definition. That is why I was forced to prove that
above
without actually mentioning
in the main body of the proof.
By David Loeffler on Saturday, June 23,
2001 - 11:54 pm:
Sorry, please ignore my comment about the presentation of the
proof as that's what you've already done.
By Brad Rodgers on Sunday, June 24, 2001 -
01:56 am:
Are you sure I've forgotten a ln somewhere? When you try the sum for t=10, it works very well with the ln in there, but for t=100, no such luck (though it doesn't seem to work any better without the ln). I wouldn't expect the series to converge all that rapidly. I'll double-check my work though.
Just wondering, what's unrigorous about your proof? Now that I understand Stirling's theorem, it seems perfectly fine to me (I'm certainly no analyst, though).
Thanks,
Brad
By Brad Rodgers on Sunday, June 24, 2001 -
03:31 am:
Sorry for the earlier post; I did forget to put in a ln: where one evaluates the integral from infinity to 0 of 1/t, two ln(infinity)'s are produced. So the result is

Brad
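[Editor: The corrected result is presumably (a rearrangement of) the classical exponential-integral identity

    \sum_{a=1}^{\infty} \frac{(-1)^{a+1} t^a}{a \cdot a!} - \ln t = \gamma + \int_t^{\infty} \frac{e^{-x}}{x} \, dx,

whose right-hand side tends to γ as t → ∞, the error term being of order e^{-t}/t. That would account both for the "alarmingly fast" convergence noted above and for the t^a/(a·a!) terms discussed in the next post.]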
By Brad Rodgers on Sunday, June 24, 2001 -
03:34 am:
It's interesting that the calculations are so close for t=10, but for t=100 they end up being so far off. It must be because we have to wait for the a!·a to overtake the t^a, and in the meantime the terms for t=100 are too large for my computer to handle.
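[Editor: The trouble Brad describes is catastrophic cancellation rather than slow convergence: for t = 100 the terms t^a/(a·a!) grow to roughly 10^42 before the factorial takes over, so in ordinary double precision the alternating sum loses every significant digit. A short Python illustration, assuming the series in question is the one in the identity above:]

    # Assumes the series under discussion is  sum_{a>=1} (-1)^(a+1) t^a/(a*a!) - ln t.
    import math
    from mpmath import mp, mpf, log, euler

    t = 100

    # Double precision: the terms near a ~ t are about 1e42, so the
    # alternating sum cancels away all 16 significant digits.
    s, term = 0.0, 1.0
    for a in range(1, 400):
        term = term * t / a              # t^a / a!
        s += (-1) ** (a + 1) * term / a  # (-1)^(a+1) t^a / (a * a!)
    print("double precision :", s - math.log(t))

    # The same sum carried to 80 significant digits survives the
    # roughly 42 digits lost to cancellation.
    mp.dps = 80
    s, term = mpf(0), mpf(1)
    for a in range(1, 400):
        term = term * t / a
        s += (-1) ** (a + 1) * term / a
    print("80-digit mpmath  :", s - log(t))
    print("Euler's constant :", +euler)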