1. Basic Facts about the Gamma Function
The Gamma function is defined by the improper integral
Gamma(x) = INT_{0}^{INFTY}[t^{x} e^{-t} {dt}/{t} ] .
The integral is absolutely convergent for x >= 1 since
t^{x-1} e^{-t} <= e^{-t/2} , t ≫ 1
and INT_{0}^{INFTY}[e^{-t/2} dt ] is convergent. The preceding
inequality is valid, in fact, for all x. But for x < 1
the integrand becomes infinitely large as t approaches 0 through
positive values. Nonetheless, the limit
lim_{r --> 0+} INT_{r}^{1}[t^{x-1} e^{-t} dt ]
exists for x > 0 since
t^{x-1} e^{-t} <= t^{x-1}
for t > 0, and, therefore, the limiting value of the preceding integral
is no larger than that of
lim_{r --> 0+} INT_{r}^{1}[t^{x-1} dt ] = {1}/{x} .
Hence, Gamma(x) is defined by the
first formula above for all values x > 0 .
If one integrates by parts the integral
Gamma(x + 1) = INT_{0}^{INFTY}[t^{x} e^{-t} dt ] ,
writing
INT_{0}^{INFTY}[udv ] = u(INFTY)v(INFTY) - u(0)v(0) - INT_{0}^{INFTY}[vdu ] ,
with dv = e^{-t}dt and u = t^{x}, one obtains the functional
equation
Gamma(x+1) = x Gamma(x) , x > 0 .
Obviously, Gamma(1) = INT_{0}^{INFTY}[e^{-t} dt ] = 1 , and, therefore,
Gamma(2) = 1 · Gamma(1) = 1,
Gamma(3) = 2 · Gamma(2) = 2!,
Gamma(4) = 3 · Gamma(3) = 3!, …, and, finally,
Gamma(n + 1) = n!
for each integer n > 0.
Thus, the gamma function provides a way of giving a meaning to the
“factorial” of any positive real number.
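This correspondence is easy to confirm numerically. The sketch below uses Python's standard-library math.gamma, which implements this function, to check the factorial values and the functional equation at a few arbitrary sample points:

```python
import math

# math.gamma implements the Gamma function defined above.
# Gamma(n + 1) = n! for positive integers n:
for n in range(1, 8):
    assert math.isclose(math.gamma(n + 1), math.factorial(n))

# The functional equation Gamma(x + 1) = x * Gamma(x) at sample points:
for x in (0.5, 1.3, 2.7, 4.2):
    assert math.isclose(math.gamma(x + 1), x * math.gamma(x))
```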
Another reason for interest in the gamma function is its relation
to integrals that arise in the study of probability. The graph of the
function varphi defined by
varphi(t) = e^{-t^{2}}
is the famous “bell-shaped curve” of probability theory. It can be
shown that the anti-derivatives of varphi are not expressible in
terms of elementary functions. On the other hand,
Phi(x) = INT_{-INFTY}^{x}[varphi(t) dt ]
is, by the fundamental theorem of calculus, an anti-derivative of
varphi, and information about its values is useful.
One finds that
Phi(INFTY) = INT_{-INFTY}^{INFTY}[e^{-t^{2}} dt ] = Gamma(1/2)
by observing that
INT_{-INFTY}^{INFTY}[e^{-t^{2}} dt ] = 2 · INT_{0}^{INFTY}[e^{-t^{2}} dt ] ,
and that upon making the substitution t = u^{1/2} in the latter
integral, one obtains Gamma(1/2).
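Both relations can be verified numerically. The sketch below compares math.gamma(0.5) with sqrt(pi) and with a midpoint-rule approximation of the Gaussian integral; the truncation of the integral to [-10, 10] is an illustrative choice, since the tails beyond are negligible:

```python
import math

# Gamma(1/2) equals the Gaussian integral, which is sqrt(pi)
exact = math.gamma(0.5)
assert math.isclose(exact, math.sqrt(math.pi))

# midpoint-rule approximation of INT_{-INFTY}^{INFTY} e^{-t^2} dt,
# truncated to [-10, 10] where the tails are negligible
n, a, b = 200_000, -10.0, 10.0
h = (b - a) / n
approx = sum(math.exp(-((a + (i + 0.5) * h) ** 2)) for i in range(n)) * h
assert abs(approx - exact) < 1e-7
```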
To have some idea of the size of Gamma(1/2), it will be useful
to consider the qualitative nature of the graph of Gamma(x).
For that one wants to know the derivative of Gamma.
By definition Gamma(x) is an integral (a definite integral with
respect to the dummy variable t) of a function of x and
t. Intuition suggests that one ought to be able to find the
derivative of Gamma(x) by taking the integral (with respect to t)
of the derivative with respect to x of the integrand.
Unfortunately, there are examples where this fails to be correct; on
the other hand, it is correct in most situations where one is inclined
to do it. The methods required to justify “differentiation under the
integral sign” will be regarded as slightly beyond the scope of this
course. A similar stance will be adopted also for differentiation of
the sum of a convergent infinite series.
Since
{d}/{dx} t^{x} = t^{x}(log t) ,
one finds
{d}/{dx} Gamma(x) = INT_{0}^{INFTY}[t^{x} (log t) e^{-t} {dt}/{t} ] ,
and, differentiating again,
{d^{2}}/{dx^{2}} Gamma(x) = INT_{0}^{INFTY}[t^{x} (log t)^{2} e^{-t} {dt}/{t} ] .
One observes that in the integrals for both Gamma and the second
derivative Gamma^{″} the integrand is always positive. Consequently,
one has Gamma(x) > 0 and Gamma^{″}(x) > 0 for all x > 0. This
means that the derivative Gamma^{′} of Gamma is a strictly increasing
function; one would like to know where it becomes positive.
If one differentiates the functional equation
Gamma(x+1) = x Gamma(x) , x > 0 ,
one finds
psi(x+1) = {1}/{x} + psi(x) , x > 0 ,
where
psi(x) = {d}/{dx} logGamma(x) = {Gamma^{′}(x)}/{Gamma(x)} ,
and, consequently,
psi(n+1) = psi(1) + SUM_{k = 1}^{n}[{1}/{k}] .
Since the harmonic series diverges, its partial sum in the foregoing
line approaches INFTY as n --> INFTY. Inasmuch as
Gamma^{′}(x) = psi(x)Gamma(x), it is clear that Gamma^{′} approaches
INFTY as x --> INFTY since Gamma^{′} is steadily increasing
and its integer values (n-1)!psi(n) approach INFTY.
Because 2 = Gamma(3) > 1 = Gamma(2),
it follows that Gamma^{′} cannot be negative everywhere in the interval
2 <= x <= 3, and, therefore, since Gamma^{′} is increasing, Gamma^{′}
must be always positive for x >= 3. As a result, Gamma must be
increasing for x >= 3, and, since Gamma(n + 1) = n!, one sees
that Gamma(x) approaches INFTY as x --> INFTY.
It is also the case that Gamma(x) approaches INFTY as
x --> 0+. To see this one observes that the
integral from
0 to INFTY defining Gamma(x) is greater than the integral from
0 to 1 of the same integrand. Since e^{-t} >= 1/e for
0 <= t <= 1, one has
Gamma(x) > INT_{0}^{1}[(1/e)t^{x-1} dt] = (1/e) [{t^{x}}/{x}]_{t = 0}^{t = 1} = {1}/{ex} .
It then follows from the mean value theorem combined with the fact that
Gamma^{′} always increases that Gamma^{′}(x) approaches -INFTY
as x --> 0.
Hence, there is a unique number c > 0 for which Gamma^{′}(c) = 0,
and Gamma decreases steadily from INFTY to the minimum value
Gamma(c) as x varies from 0 to c and then increases to
INFTY as x varies from c to INFTY. Since
Gamma(1) = 1 = Gamma(2), the number c must lie in the interval
from 1 to 2 and the minimum value Gamma(c) must be less than 1.
Figure 1: Graph of the Gamma Function
Thus, the graph of Gamma (see Figure 1) is concave upward
and lies entirely in the first quadrant of the plane. It has the
y-axis as a vertical asymptote. It falls steadily for 0 < x < c
to a positive minimum value Gamma(c) < 1. For x > c the graph
rises rapidly.
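This qualitative picture can be probed numerically. The sketch below checks the lower bound near 0 and approximates the minimizer c by a crude grid search (an illustrative device, not part of the notes):

```python
import math

# Gamma blows up near 0: Gamma(x) > 1/(e*x) for small x > 0
for x in (0.001, 0.01, 0.1):
    assert math.gamma(x) > 1.0 / (math.e * x)

# grid search on (1, 2) for the minimizer c; crude but sufficient here
xs = [1.0 + i / 100_000 for i in range(1, 100_000)]
c = min(xs, key=math.gamma)
assert 1.0 < c < 2.0 and math.gamma(c) < 1.0
print(round(c, 4))  # the minimum is attained near x = 1.4616
```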
2. Product Formulas
It will be recalled, as one may show using l'Hôpital's Rule,
that
e^{-t} = lim_{n --> INFTY} (1 - {t}/{n})^{n} .
From the original formula for Gamma(x), using an interchange of limits
that in a more careful exposition would receive further comment, one has
Gamma(x) = lim_{n --> INFTY} Gamma(x,n) ,
where Gamma(x,n) is defined by
Gamma(x,n) = INT_{0}^{n}[t^{x-1} (1 - {t}/{n})^{n} dt] , n >= 1 .
The substitution in which t is replaced by nt leads to
the formula
Gamma(x,n) = n^{x} INT_{0}^{1}[t^{x-1} (1 - t)^{n} dt] .
This integral for Gamma(x,n) is amenable to integration by parts.
One finds thereby:
Gamma(x,n) = {1}/{x} ({n}/{n-1})^{x+1} Gamma(x+1,n-1) , n >= 2 .
For the smallest value of n, n = 1 , integration by parts yields:
Gamma(x,1) = {1}/{x(x+1)} .
Iterating n-1 times, one obtains:
Gamma(x,n) = n^{x} {n!}/{x(x+1)(x+2)⋯(x+n)} , n >= 1 .
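This closed form can be checked against direct numerical integration of the integral defining Gamma(x,n). The sketch below uses a midpoint rule; the sample values x = 1.5 and n = 10 are arbitrary illustrative choices:

```python
import math

def gamma_xn_closed(x, n):
    # Gamma(x,n) = n^x * n! / (x(x+1)...(x+n))
    denom = 1.0
    for k in range(n + 1):
        denom *= x + k
    return n ** x * math.factorial(n) / denom

def gamma_xn_integral(x, n, steps=200_000):
    # midpoint rule for n^x * INT_0^1 t^(x-1) (1-t)^n dt
    h = 1.0 / steps
    total = sum(((i + 0.5) * h) ** (x - 1) * (1.0 - (i + 0.5) * h) ** n
                for i in range(steps))
    return n ** x * total * h

x, n = 1.5, 10  # arbitrary sample values
assert abs(gamma_xn_closed(x, n) - gamma_xn_integral(x, n)) < 1e-4
```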
Thus, one arrives at the formula
Gamma(x) = lim_{n --> INFTY} n^{x} {n!}/{x(x+1)(x+2)⋯(x+n)} .
This last formula is not exactly in the form of an infinite product
PROD_{k = 1}^{INFTY}[p_{k}] = lim_{n --> INFTY} PROD_{k = 1}^{n}[p_{k}] .
But a simple trick enables one to maneuver it into such an infinite
product. One writes n + 1 as a “collapsing product”:
n+1 = {n+1}/{n}·{n}/{n-1}· ⋯ ·{3}/{2}·{2}/{1}
or
n+1 = PROD_{k = 1}^{n}[{k+1}/{k}] ,
and, taking the xth power, one has
(n+1)^{x} = PROD_{k = 1}^{n}[({k+1}/{k})^{x}] .
Since
lim_{n --> INFTY}{n^{x}}/{(n+1)^{x}} = 1 ,
one may replace the factor n^{x} by (n+1)^{x} in the last expression above
for Gamma(x) to obtain
Gamma(x) = {1}/{x} lim_{n --> INFTY} PROD_{k = 1}^{n}[{(1 + {1}/{k})^{x}}/{1 + {x}/{k}}] ,
or
Gamma(x) = {1}/{x} PROD_{k = 1}^{INFTY}[{(1 + {1}/{k})^{x}}/{1 + {x}/{k}}] .
The convergence of this infinite product for Gamma(x) when x > 0
is a consequence, through the various maneuvers performed, of the
convergence of the original improper integral defining Gamma(x) for
x > 0.
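Truncating the infinite product gives concrete approximations that can be compared with math.gamma; the cutoff of 100000 factors in the sketch below is an arbitrary choice:

```python
import math

def gamma_product(x, factors=100_000):
    # (1/x) * PROD_{k=1}^{factors} (1 + 1/k)^x / (1 + x/k)
    prod = 1.0
    for k in range(1, factors + 1):
        prod *= (1.0 + 1.0 / k) ** x / (1.0 + x / k)
    return prod / x

for x in (0.5, 1.5, 3.0):
    assert abs(gamma_product(x) - math.gamma(x)) < 1e-3
```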
It is now possible to represent logGamma(x) as the sum of an infinite
series by taking the logarithm of the infinite product formula. But first
it must be noted that
{(1+t)^{r}}/{1 + rt} > 0 for t > 0 , r > 0 .
Hence, the logarithm of each term in the preceding infinite product is
defined when x > 0.
Taking the logarithm of the infinite product one finds:
log Gamma(x) = - log x + SUM_{k = 1}^{INFTY}[u_{k}(x)] ,
where
u_{k}(x) = x log(1 + {1}/{k}) - log(1 + {x}/{k}) .
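Partial sums of this series can be checked numerically. The sketch below uses the terms u_k(x) = x log(1 + 1/k) - log(1 + x/k) from the product formula, compares with the standard-library math.lgamma (an implementation of log Gamma), and also probes the quadratic decay rate of the terms; the sample point x = 2.5 and the cutoffs are arbitrary:

```python
import math

def u(k, x):
    # u_k(x) = x*log(1 + 1/k) - log(1 + x/k); log1p keeps precision for large k
    return x * math.log1p(1.0 / k) - math.log1p(x / k)

x = 2.5  # arbitrary sample point
partial = -math.log(x) + sum(u(k, x) for k in range(1, 200_000))
assert abs(partial - math.lgamma(x)) < 1e-4  # lgamma computes log Gamma

# k^2 * u_k(x) approaches x(x-1)/2, which controls the convergence rate
assert abs(1_000_000 ** 2 * u(1_000_000, x) - x * (x - 1) / 2) < 1e-3
```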
It is, in fact, almost true that this series converges absolutely for
all real values of x. The only problem with non-positive
values of x lies in the fact that log(x) is meaningful only for
x > 0, and, therefore, log(1+x/k) is meaningful only for
k > |x|. For fixed x, if one excludes the finite set of terms
u_{k}(x) for which k <= |x|, then the remaining “tail” of
the series is meaningful and is absolutely convergent.
To see this one applies the “ratio
comparison test” which says that an infinite series converges absolutely
if the ratio of the absolute value of its general term to the general
term of a convergent positive series exists and is finite. For this
one may take as the “test series”, the series
SUM_{k = 1}^{INFTY}[{1}/{k^{2}}] .
Now as k approaches INFTY, t = 1/k approaches 0,
and so
lim_{k --> INFTY} {u_{k}(x)}/{1/k^{2}} = lim_{t --> 0+} {x log(1+t) - log(1+xt)}/{t^{2}}
= lim_{t --> 0+} {{x}/{1+t} - {x}/{1+xt}}/{2t}
= lim_{t --> 0+} {x[(1+xt) - (1+t)]}/{2t(1+t)(1+xt)}
= lim_{t --> 0+} {x(x-1)}/{2(1+t)(1+xt)} = {x(x-1)}/{2} .
Hence, the limit of |u_{k}(x)/k^{-2}| is |x(x-1)/2|, and the series
SUM[u_{k}(x)] is absolutely convergent for all real x. The absolute
convergence of this series foreshadows the possibility of defining
Gamma(x) for all real values of x other than non-positive integers.
This may be done, for example, by using the functional equation
Gamma(x+1) = x Gamma(x) ,
or
Gamma(x) = {1}/{x} Gamma(x + 1) ,
to define Gamma(x) for -1 < x < 0 and from there
to -2 < x < -1, etc.
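A sketch of this extension procedure in Python follows; math.gamma already accepts negative non-integer arguments, so it serves as a check on the recursion:

```python
import math

def gamma_ext(x):
    # Gamma(x) = Gamma(x + 1)/x, applied until the argument is positive;
    # Gamma has poles at the non-positive integers, which are excluded.
    if x > 0:
        return math.gamma(x)
    if x == int(x):
        raise ValueError("Gamma is undefined at non-positive integers")
    return gamma_ext(x + 1) / x

# math.gamma already handles negative non-integers, so it serves as a check
for x in (-0.5, -1.5, -2.3):
    assert math.isclose(gamma_ext(x), math.gamma(x))
```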
Taking the derivative of the series for logGamma(x) term-by-term
– once again a step that would receive justification in a more careful
treatment – and recalling the previous notation psi(x) for the
derivative of logGamma(x), one obtains
psi(x) = -{1}/{x} + lim_{n --> INFTY} SUM_{k = 1}^{n}[log(1 + {1}/{k}) - {1}/{x+k}]
= -{1}/{x} + lim_{n --> INFTY} { log(n+1) - SUM_{k = 1}^{n}[{1}/{x+k}] }
= -{1}/{x} + lim_{n --> INFTY} { log(n+1) - SUM_{k = 1}^{n}[{1}/{k}] + SUM_{k = 1}^{n}[{1}/{k} - {1}/{x+k}] }
= -{1}/{x} + lim_{n --> INFTY} { log(n+1) - SUM_{k = 1}^{n}[{1}/{k}] + x SUM_{k = 1}^{n}[{1}/{k(x+k)}] }
= -{1}/{x} - gamma + x SUM_{k = 1}^{INFTY}[{1}/{k(x+k)}] ,
where gamma denotes Euler's constant
gamma = lim_{n --> INFTY} ( SUM_{k = 1}^{n}[{1}/{k}] - log n ) .
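The series for psi can be checked against a central-difference approximation to the derivative of log Gamma; in the sketch below the hard-coded value of Euler's constant and the truncation point are the only outside inputs:

```python
import math

EULER_GAMMA = 0.5772156649015329  # Euler's constant (hard-coded value)

def psi_series(x, terms=200_000):
    # psi(x) = -1/x - gamma + x * SUM_{k>=1} 1/(k(x+k)), truncated
    s = sum(1.0 / (k * (x + k)) for k in range(1, terms + 1))
    return -1.0 / x - EULER_GAMMA + x * s

def psi_numeric(x, h=1e-5):
    # central-difference approximation to (log Gamma)'(x)
    return (math.lgamma(x + h) - math.lgamma(x - h)) / (2 * h)

for x in (1.0, 2.0, 3.5):
    assert abs(psi_series(x) - psi_numeric(x)) < 1e-4
```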
When x = 1 one has
psi(1) = -1 - gamma + SUM_{k = 1}^{INFTY}[{1}/{k(k+1)}] ,
and since
{1}/{k(k+1)} = {1}/{k} - {1}/{k+1} ,
this series collapses and, therefore, is easily seen to sum to 1.
Hence,
psi(1) = - gamma , psi(2) = psi(1) + 1/1 = 1 - gamma .
Since Gamma^{′}(x) = psi(x)Gamma(x), one finds:
Gamma^{′}(1) = - gamma
and
Gamma^{′}(2) = 1 - gamma .
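These two derivative values can be confirmed with a central-difference approximation to Gamma′; the step size h is an arbitrary choice:

```python
import math

EULER_GAMMA = 0.5772156649015329  # Euler's constant (hard-coded value)

def dgamma(x, h=1e-6):
    # central-difference approximation to Gamma'(x)
    return (math.gamma(x + h) - math.gamma(x - h)) / (2 * h)

assert abs(dgamma(1.0) - (-EULER_GAMMA)) < 1e-4    # Gamma'(1) = -gamma
assert abs(dgamma(2.0) - (1 - EULER_GAMMA)) < 1e-4  # Gamma'(2) = 1 - gamma
```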
These course notes were prepared while consulting standard references
in the subject.