Infinite Integrals
By Andrew Hodges on Sunday, June 24, 2001
- 10:46 am:
If you integrate 1/x from 0 to infinity, you will get infinity
as your answer, even though 1/x is asymptotic to the x-axis. The
normal distribution function however, which is also asymptotic to
the x-axis, has a definite value, 0.5, for the integral from 0 to
infinity. Is there a way of telling, just from their equations, which
asymptotic functions will have a definite (finite) value when integrated
out to infinity? Is it to do with the rate at which they approach the x-axis?
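(A quick way to see the contrast numerically, sketched in Python with SciPy; the function names and the particular upper limits below are just for illustration, not part of the question.)

    import math
    from scipy.integrate import quad

    # The standard normal density; integrating it from 0 to infinity
    # should give 0.5 (half of the total probability).
    def normal_pdf(x):
        return math.exp(-x * x / 2) / math.sqrt(2 * math.pi)

    value, err = quad(normal_pdf, 0, math.inf)
    print("integral of normal pdf from 0 to inf:", value)   # about 0.5

    # By contrast, the integral of 1/x from 1 to N is exactly ln N,
    # which grows without bound as N increases: there is no finite answer.
    for N in (10, 1_000, 1_000_000):
        print("integral of 1/x from 1 to", N, "=", math.log(N))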
By Dan Goodman on Sunday, June 24, 2001
- 03:12 pm:
First of all, the integral of 1/x from 0 to 1 is infinity,
so I'll assume we're talking about an integral from 1 to infinity (or 0.0001
to infinity, or whatever) rather than 0 to infinity. I don't know of any
hard and fast rule which you can apply to test if an integral is finite, but
there are some useful quick methods. For example, suppose g(x) is a
positive function such that ∫_1^∞ g(x) dx is infinite, and hence
∫_R^∞ g(x) dx is infinite for all R > 1. If you have a function f(x)
(I'll assume f(x) > 0) such that f(x) > g(x) for all x > R, for some
number R > 1, then

    ∫_1^∞ f(x) dx = ∫_1^R f(x) dx + ∫_R^∞ f(x) dx > ∫_R^∞ g(x) dx.

So if we can say that f(x) is bigger than g(x) for big enough x, then
∫_1^∞ f(x) dx is infinite. Conversely, suppose h(x) is a function such
that ∫_1^∞ h(x) dx is finite. If f(x) < h(x) for x > R, and f(x) < M for
x < R and some number M, then

    ∫_1^∞ f(x) dx = ∫_1^R f(x) dx + ∫_R^∞ f(x) dx < ∫_1^R M dx + ∫_R^∞ h(x) dx,

which is finite.
So, let's take the function f(x) = x^4 e^(-x) as an example. For large
enough R we have x^4 < e^(x/2) whenever x > R (do you know how to prove
this?), and so for x > R, f(x) < e^(x/2) e^(-x) = e^(-x/2), and we know
∫_1^∞ e^(-x/2) dx is finite. If x < R then x^4 < R^4 and e^(-x) < 1, so
f(x) < R^4. So ∫_1^∞ x^4 e^(-x) dx is finite (using the above).
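Here is a rough numerical sanity check of that argument in Python (the choice R = 27 is mine; any R beyond the point where x^4 drops below e^(x/2) would do):

    import math
    from scipy.integrate import quad

    f = lambda x: x**4 * math.exp(-x)

    # Direct numerical value of the integral from 1 to infinity.
    value, _ = quad(f, 1, math.inf)
    print("integral of x^4 e^(-x) from 1 to inf:", value)   # roughly 23.9

    # The comparison bound from the argument above, with R = 27:
    #   f(x) < R^4        for 1 <= x <= R  (since x^4 <= R^4 and e^(-x) < 1),
    #   f(x) < e^(-x/2)   for x > R        (since x^4 < e^(x/2) there).
    R = 27
    # Check x^4 < e^(x/2) for a range of x >= R, comparing logs to avoid overflow.
    assert all(4 * math.log(x) < x / 2 for x in range(R, 10_000))
    bound = (R - 1) * R**4 + quad(lambda x: math.exp(-x / 2), R, math.inf)[0]
    print("finite upper bound from the comparison:", bound)

The bound is enormous compared with the true value, but that doesn't matter: all the comparison has to do is show the integral is finite.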
Good functions h(x) whose integrals from 1 to infinity converge are x^t
for t < -1, e^(-x) and e^(-x^2). The best function g(x) whose integral
from 1 to infinity is infinite is 1/x. This collection of functions is
usually enough to prove that any integral of a positive function that you
come across converges or diverges. It's more difficult to show things
for functions which are sometimes positive and sometimes negative, but if
you can show that ∫_1^∞ |f(x)| dx is finite then also
∫_1^∞ f(x) dx is finite.
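A quick numerical illustration of that last remark (the example function sin(x)/x^2 is my own choice, picked because |sin(x)/x^2| <= 1/x^2, one of the convergent comparison functions above):

    import math
    from scipy.integrate import quad

    f = lambda x: math.sin(x) / x**2   # changes sign, but |f(x)| <= 1/x^2

    # Partial integrals settle down as the upper limit grows, which is what
    # you expect when the integral of |f| is finite.
    for N in (10, 100, 1000):
        value, _ = quad(f, 1, N, limit=1000)
        print("integral of sin(x)/x^2 from 1 to", N, "=", value)

    # The comparison integral: the integral of 1/x^2 from 1 to infinity is exactly 1.
    print("integral of 1/x^2 from 1 to inf:", quad(lambda x: 1 / x**2, 1, math.inf)[0])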
By Andrew Hodges on Sunday, June 24, 2001
- 09:37 pm:
Could you explain how to prove e^(x/2) >
x^4 for all x bigger than some number R?
By Dan Goodman on Monday, June 25, 2001 -
12:52 am:
Hmm, it depends on what you know. Do you
know that e^x = 1 + x + x^2/2 + ... + x^n/n! + ... ? If not, write back
and I'll see if I can prove it another way. If so, then for x > 0 we have
e^x > x^(n+1)/(n+1)! (since this is just one term of the series). If
x > (n+1)! then e^x > (x/(n+1)!) x^n > x^n, since x/(n+1)! > 1. So for
any integer n, if x > (n+1)! then e^x > x^n. Obviously if you are
thinking about x/2 instead of x, you have to make a slight change to that
argument.
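For completeness, here is one way that slight change can go, written out in LaTeX (the threshold 2^(n+1)(n+1)! is a crude bound of my own, not from the post above):

    % Apply the same one-term bound to e^{x/2}: for x > 0,
    %   e^{x/2} > (x/2)^{n+1}/(n+1)!.
    % Pulling a factor x^n out of the right-hand side shows when it beats x^n.
    \[
      e^{x/2} \;>\; \frac{(x/2)^{n+1}}{(n+1)!}
              \;=\; x^{n}\cdot\frac{x}{2^{\,n+1}(n+1)!}
              \;>\; x^{n}
      \qquad\text{whenever } x > 2^{\,n+1}(n+1)!.
    \]
    % For n = 4 this gives e^{x/2} > x^4 once x > 2^5 \cdot 5! = 3840.
    % That R is far from the smallest possible (the curves actually cross
    % for the last time near x \approx 26), but any R that works is enough
    % for the comparison argument.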