Roots of 1 + x + x^2/2! + … + x^n/n! (1)

Problem. Given an integer n \ge 0, consider the polynomial \displaystyle p(x):=\sum_{k=0}^n \frac{x^k}{k!}=1+x+\frac{x^2}{2!} + \cdots + \frac{x^n}{n!}.
Show that
i) if n is even, p(x) > 0 for all x and so p(x) has no real root,
ii) if n is odd, p(x) has exactly one real root and the root is < -1 for n > 1,
iii) if n is even, \displaystyle p(x) \ge \frac{1}{n!} for all x.

Solution. By Taylor’s theorem, for every real number x, there exists c between 0 and x such that \displaystyle e^x=p(x)+\frac{e^c}{(n+1)!}x^{n+1} and so
\displaystyle p(x)=e^x - \frac{e^c}{(n+1)!}x^{n+1}. \ \ \ \ \ \ \ \ \ \ \ \ (*)

i) Clearly p(x) > 0 for all x \ge 0. For x  < 0, there exists x < c < 0 such that (*) holds. Since x < 0 and n+1 is odd, x^{n+1} < 0 and so \displaystyle p(x)=e^x - \frac{e^c}{(n+1)!}x^{n+1} > 0.

ii) If n=1, then p(x)=1+x and so x=-1 is the root of p(x). Suppose now that n > 1. Since n-1 is even, \displaystyle p'(x)= \sum_{k=0}^{n-1} \frac{x^k}{k!}  has no real root, by i). Thus p(x) has at most one real root, by Rolle’s theorem. Since \displaystyle \lim_{x\to-\infty}p(x)=-\infty, (because n is odd) and p(0)=1 > 0, \ p(x) has a root in the interval (-\infty, 0), by the intermediate value theorem. So we have proved that p(x) has a unique real root \alpha and \alpha < 0.
To prove that \alpha < -1, we only need to show that p(-1) > 0. By (*), there exists -1 < c < 0 such that \displaystyle p(-1)=e^{-1} - \frac{e^c}{(n+1)!} and so, since e^c < 1 and (n+1)! \ge 4! > 6 (recall that n is odd and n > 1, so n \ge 3), we have \displaystyle p(-1) > e^{-1}-\frac{1}{6} > 0.

iii) If n=0, then p(x)=1 and there’s nothing to prove. Suppose now that n > 0. By ii), p'(x) has a unique real root \alpha and \alpha \le -1. Since, by i), p''(x) > 0, \ p(x) attains its absolute minimum at x = \alpha. But \displaystyle p(x)=p'(x)+\frac{x^n}{n!} and p'(\alpha)=0, so \displaystyle p(\alpha)=\frac{\alpha^n}{n!}. Also, \alpha^n \ge 1 because n is even and \alpha \le -1. Thus \displaystyle p(x) \ge p(\alpha)=\frac{\alpha^n}{n!} \ge \frac{1}{n!}. \ \Box
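All three parts are easy to check numerically. Here is a quick Python sketch (the helper functions and the bisection root-finder are mine, not part of the solution):

```python
import math

def p(x, n):
    """Partial sum p(x) = sum_{k=0}^n x^k / k!, accumulated term by term."""
    total, term = 1.0, 1.0
    for k in range(1, n + 1):
        term *= x / k
        total += term
    return total

# Part iii): for even n, p(x) >= 1/n! on a sample grid (the minimum 1/n! is
# actually attained, e.g. p(-1) = 1/2 for n = 2, hence the small tolerance).
for n in (2, 4, 6):
    assert all(p(i / 10, n) >= 1 / math.factorial(n) - 1e-12
               for i in range(-100, 101))

# Part ii): for odd n, p is strictly increasing (p' > 0 by part i)), so
# bisection finds the unique real root; it should be < -1 for n > 1.
def root(n, lo=-50.0, hi=0.0):
    for _ in range(200):
        mid = (lo + hi) / 2
        if p(mid, n) < 0:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

for n in (3, 5, 7):
    r = root(n)
    assert r < -1 and abs(p(r, n)) < 1e-9
```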

In part (2) of this post, we are going to dig a little deeper and get better results.


The Stolz–Cesàro lemma

Problem (The Stolz–Cesàro lemma). Let \{a_n\}, \{b_n\} be two sequences of real numbers. Suppose that
i) b_n > 0 for all n and \{b_n\} is strictly increasing,
ii) \displaystyle \lim_{n\to\infty}b_n=\infty and \displaystyle \lim_{n\to\infty} \frac{a_n-a_{n-1}}{b_n-b_{n-1}}=\ell \in [-\infty,+\infty].
Show that \displaystyle \lim_{n\to\infty} \frac{a_n}{b_n}=\ell.

Solution. I’ll assume that \ell is a finite number; the proof for \ell = \pm \infty is similar (Exercise 1).
Let \epsilon > 0. Then, by the definition of limit, there exists an integer N such that if n > N, then
\displaystyle \ell - \epsilon < \frac{a_n-a_{n-1}}{b_n-b_{n-1}} < \ell + \epsilon. \ \ \ \ \ \ \ \ \ \ \ \ \ \ \ (1)
Since \{b_n\} is strictly increasing, b_n-b_{n-1} > 0 and so (1) gives \displaystyle (\ell - \epsilon)(b_n-b_{n-1}) < a_n-a_{n-1} < (\ell + \epsilon)(b_n-b_{n-1}),
for all n > N. Therefore
\displaystyle (\ell-\epsilon)\sum_{k=1}^{n-N} (b_{N+k}-b_{N+k-1})<\sum_{k=1}^{n-N}(a_{N+k}-a_{N+k-1}) < (\ell + \epsilon)\sum_{k=1}^{n-N} (b_{N+k}-b_{N+k-1})
Thus, since \displaystyle \sum_{k=1}^{n-N} (b_{N+k}-b_{N+k-1})=b_n-b_N and \displaystyle \sum_{k=1}^{n-N}(a_{N+k}-a_{N+k-1})=a_n-a_N, we get
\displaystyle (\ell-\epsilon)(b_n-b_N) < a_n-a_N < (\ell+\epsilon)(b_n-b_N),
which gives us
\displaystyle (\ell-\epsilon)\left(1-\frac{b_N}{b_n}\right)+\frac{a_N}{b_n} < \frac{a_n}{b_n} < (\ell + \epsilon)\left(1-\frac{b_N}{b_n}\right)+\frac{a_N}{b_n}. \ \ \ \ \ \ \ \ \ \ \ \  (2)
Since \displaystyle \lim_{n\to\infty}b_n=\infty, we have \displaystyle \lim_{n\to\infty} \frac{a_N}{b_n}=\lim_{n\to\infty}\frac{b_N}{b_n}=0 and so
\displaystyle \lim_{n\to\infty} (\ell \pm \epsilon)\left(1-\frac{b_N}{b_n}\right)+\frac{a_N}{b_n}=\ell \pm \epsilon.
Therefore if n is large enough, we will have
\displaystyle (\ell - \epsilon)\left(1-\frac{b_N}{b_n}\right)+\frac{a_N}{b_n} > \ell -2\epsilon, \ \  (\ell + \epsilon)\left(1-\frac{b_N}{b_n}\right)+\frac{a_N}{b_n} < \ell+2\epsilon
and then, by (2), \displaystyle \ell-2\epsilon < \frac{a_n}{b_n} < \ell + 2\epsilon, proving that \displaystyle \lim_{n\to\infty} \frac{a_n}{b_n}=\ell. \ \Box
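Here is a small numerical illustration of the lemma (Python; the choice a_n = 1^2 + \cdots + n^2, \ b_n = n^3 is just an example of mine):

```python
# Take a_n = 1^2 + ... + n^2 and b_n = n^3 (so b_n > 0, strictly increasing,
# and b_n -> infinity).  The difference quotient is n^2/(3n^2 - 3n + 1) -> 1/3,
# and the lemma then predicts a_n/b_n -> 1/3 as well.
def a(n): return sum(k * k for k in range(1, n + 1))
def b(n): return n ** 3

n = 10_000
diff_quot = (a(n) - a(n - 1)) / (b(n) - b(n - 1))
ratio = a(n) / b(n)
assert abs(diff_quot - 1 / 3) < 1e-4
assert abs(ratio - 1 / 3) < 1e-4
```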

Example 1. Let \{u_n\}, \{v_n\} be two sequences of real numbers and suppose that
i) v_n > 0 for all n and \displaystyle \sum_{n=1}^{\infty} v_n=\infty,
ii) \displaystyle \lim_{n\to\infty} \frac{u_n}{v_n}=\ell.
Show that \displaystyle \lim_{n\to\infty} \frac{u_1 + \cdots + u_n}{v_1 + \cdots + v_n}=\ell.

Solution. For n \ge 1, let \displaystyle a_n:=\sum_{k=1}^nu_k, \ b_n:=\sum_{k=1}^n v_k and apply the above problem. \Box

Example 2. Let \{x_n\} be a sequence of real numbers and suppose that \displaystyle \lim_{n\to\infty}x_n=x. Show that \displaystyle \lim_{n\to\infty} \frac{1}{n}\sum_{k=1}^n x_k=x and, if x_n > 0 for all n, \displaystyle \lim_{n\to\infty} \sqrt[n]{x_1 x_2 \cdots x_n}=x.

Solution. Both parts follow trivially from Example 1: for the first part, consider the sequences u_n:=x_n, \ v_n:=1 and for the second part, consider the sequences u_n:=\ln x_n, \ v_n:=1. \ \Box

Example 3 (see also this post!). Show that \displaystyle \lim_{n\to\infty} \frac{\sqrt[n]{n!}}{n}=\frac{1}{e}.

Solution. Let \displaystyle x_n:=\left(1+\frac{1}{n}\right)^n, \ n \ge 1, and apply the second part of Example 2. \Box

Example 4. Show that \displaystyle \lim_{n\to\infty} \frac{1}{n^n} \sum_{k=1}^n k^k = 1.

Solution. Let \displaystyle a_n:=\sum_{k=1}^n k^k, \ b_n=n^n, \ n \ge 1. Then \displaystyle \lim_{n\to\infty} \frac{a_n-a_{n-1}}{b_n-b_{n-1}}=\lim_{n\to\infty} \frac{n^n}{n^n-(n-1)^{n-1}}=\lim_{n\to\infty} \frac{1}{1-\left(1-\frac{1}{n}\right)^n(n-1)^{-1}}=1,
because \displaystyle \lim_{n\to\infty} \left(1-\frac{1}{n}\right)^n=e^{-1} and \displaystyle \lim_{n\to\infty} (n-1)^{-1}=0. The result now follows from the Stolz–Cesàro lemma. \Box
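A numerical check of Example 4 (a Python sketch of mine; each term is computed via logarithms so that k^k never overflows):

```python
import math

def a_over_b(n):
    """sum_{k=1}^n k^k / n^n, each term computed as exp(k ln k - n ln n)."""
    return sum(math.exp(k * math.log(k) - n * math.log(n))
               for k in range(1, n + 1))

vals = [a_over_b(n) for n in (10, 100, 1000)]
# The sums are always > 1 (the k = n term alone contributes 1) and
# should decrease toward the limit 1.
assert vals[0] > vals[1] > vals[2] > 1
assert abs(vals[2] - 1) < 1e-3
```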

Exercise 1. Prove the Stolz–Cesàro lemma for \ell = \pm \infty.

Exercise 2. Let \{x_n\} be a sequence of real numbers and suppose that \displaystyle \lim_{n\to\infty}nx_n=\ell. Show that \displaystyle \lim_{n\to\infty} \frac{1}{\ln n}\sum_{k=1}^nx_k=\ell.


Limit of integrals (8)

Problem 1. Consider the sequence \displaystyle I_n:=\frac{1}{n!}\int_n^{\infty}x^ne^{-x}dx. Show that \displaystyle \lim_{n\to\infty}I_n=\frac{1}{2}.

Solution. My solution is basically the same as how I proved Stirling’s formula, so you should consider this problem a nice exercise for that post.
We begin with the substitution x=n(1+t) to get \displaystyle I_n=\frac{n^{n+1}e^{-n}}{n!}\int_0^{\infty}e^{nf(t)}dt, where f(t):=\ln(1+t)-t. Now choose any real number a such that \displaystyle \frac{1}{3} < a < \frac{1}{2}. Then
\displaystyle I_n=\frac{n^{n+\frac{1}{2}}e^{-n}}{n!}\left(\sqrt{n}\int_0^{\frac{1}{n^a}}e^{nf(t)}dt + \sqrt{n}\int_{\frac{1}{n^a}}^{\infty}e^{nf(t)}dt \right). \ \ \ \ \ \ \ \ \ \ \ \ (\star)
Now, by Stirling’s formula, we have \displaystyle \lim_{n\to\infty} \frac{n^{n+\frac{1}{2}}e^{-n}}{n!}=\frac{1}{\sqrt{2\pi}}. Also, by parts ii) and iii) in the solution of Problem 2 of that post, \displaystyle \lim_{n\to\infty} \sqrt{n}\int_{\frac{1}{n^a}}^{\infty}e^{nf(t)}dt=0 and \displaystyle  \lim_{n\to\infty} \sqrt{n}\int_0^{\frac{1}{n^a}}e^{nf(t)}dt =\frac{\sqrt{2\pi}}{2}. The result now follows from (\star). \ \Box

Example. Show that \displaystyle \lim_{n\to\infty} e^{-n}\sum_{k=0}^n \frac{n^k}{k!}=\frac{1}{2}.

Solution. We show that \displaystyle e^{-n}\sum_{k=0}^n \frac{n^k}{k!}=\frac{1}{n!}\int_n^{\infty}x^ne^{-x}dx and so the limit is \displaystyle \frac{1}{2}, by Problem 1. To show that, consider the sequence \displaystyle c_m:=\frac{1}{m!}\int_n^{\infty}x^me^{-x}dx. Then c_0=e^{-n} and integration by parts with x^m=u, \ e^{-x}dx=dv gives \displaystyle c_m=\frac{n^m}{m!}e^{-n}+c_{m-1}. Thus \displaystyle c_m=e^{-n}\sum_{k=0}^m \frac{n^k}{k!} and the result follows. \Box
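Since the example gives a closed form for I_n, Problem 1 can be checked numerically without any quadrature (a Python sketch; the function name is mine):

```python
import math

def I(n):
    """I_n = (1/n!) * integral from n to infinity of x^n e^{-x} dx, computed
    through the closed form e^{-n} sum_{k=0}^n n^k/k! proved in the example;
    each term goes through logs (lgamma) so nothing overflows."""
    return sum(math.exp(k * math.log(n) - math.lgamma(k + 1) - n)
               for k in range(n + 1))

# I_n should decrease (Problem 2) toward the limit 1/2 (Problem 1).
assert I(10) > I(100) > I(400) > 0.5
assert abs(I(400) - 0.5) < 0.02
```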

Remark. If m \ge 0 is an integer independent of n, then \displaystyle \lim_{n\to\infty} e^{-n}\sum_{k=0}^m \frac{n^k}{k!}=0 because \displaystyle \sum_{k=0}^m \frac{n^k}{k!} is a polynomial in n and we know that if p(x) is a polynomial, then \displaystyle \lim_{x\to\infty} \frac{p(x)}{e^x}=0. This fact also follows, trivially, from \displaystyle e^{-n}\sum_{k=0}^m \frac{n^k}{k!}=\frac{1}{m!}\int_n^{\infty} x^me^{-x}dx, which we proved in the above example.

Problem 2. Show that the sequence in Problem 1 is decreasing.

Solution. Integration by parts with x^{n+1}=u and e^{-x}dx=dv gives \displaystyle (n+1)!I_{n+1}=\int_{n+1}^{\infty}x^{n+1}e^{-x}dx=(n+1)^{n+1}e^{-(n+1)}+(n+1)\int_{n+1}^{\infty}x^ne^{-x}dx
and thus
\displaystyle n!I_{n+1}=(n+1)^ne^{-(n+1)}+\int_{n+1}^{\infty}x^ne^{-x}dx. \ \ \ \ \ \ \ \ \ \ \ (\star \star)
Now see that the function g(x):=x^ne^{-x} is decreasing on the interval [n,n+1] and so \displaystyle (n+1)^ne^{-(n+1)}=g(n+1) < \int_n^{n+1}g(x) \ dx=\int_n^{n+1}x^ne^{-x}dx. Thus, by (\star \star), \displaystyle n!I_{n+1} <  \int_n^{n+1}x^ne^{-x}dx + \int_{n+1}^{\infty}x^ne^{-x}dx=\int_n^{\infty}x^ne^{-x}dx=n!I_n. \ \Box

Exercise 1. Show that the sequence \displaystyle J_n:= \frac{1}{n!}\int_0^n x^ne^{-x}dx is increasing and \displaystyle \lim_{n\to\infty}J_n=\frac{1}{2}.

Exercise 2. Show that \displaystyle e^n < 2\sum_{k=0}^n \frac{n^k}{k!} for all n.

Exercise 3. Let I_n and J_n be the sequences in Problem 1 and Exercise 1, respectively. True or false: the sequences n!I_n and n!J_n are both increasing?


The sequence n!^(1/n)/n

Consider the sequence \displaystyle u_n:=\frac{\sqrt[n]{n!}}{n}, \ n \ge 1. It is quite easy to show that \displaystyle \lim_{n\to\infty} u_n=e^{-1} (Problem 2, i)). A more interesting fact is that \{u_n\} is decreasing (Problem 2, iii)). But before proving that, we need to prove something that I gave as an exercise in this post.

Problem 1. Show that the sequence \displaystyle b_n:=\left(1+\frac{1}{n} \right)^{n+1}, \ n \ge 1, is decreasing and so \displaystyle b_n > e for all n \ge 1.

Solution. We have
\displaystyle (b_{n+1})^{\frac{1}{n+1}}=\left(1+\frac{1}{n+1} \right)^{\frac{n+2}{n+1}}=\left(1+\frac{1}{n+1} \right)\left(1+\frac{1}{n+1} \right)^{\frac{1}{n+1}}
and so, by Bernoulli’s inequality, \displaystyle (b_{n+1})^{\frac{1}{n+1}} < \left(1+\frac{1}{n+1} \right)\left(1+\frac{1}{(n+1)^2} \right) < \left(1+\frac{1}{n+1} \right)\left(1+\frac{1}{n^2+2n} \right) =1+\frac{1}{n}.
Thus \displaystyle b_{n+1} < \left(1+\frac{1}{n}\right)^{n+1}=b_n. \ \Box

Problem 2. Consider the sequence \displaystyle u_n:=\frac{\sqrt[n]{n!}}{n}, \ n \ge 1. Show that
i) \displaystyle \lim_{n\to\infty} u_n=e^{-1},
ii) \displaystyle e^{n-1} \ge \frac{n^n}{n!} for n \ge 1. The equality holds for n=1 only,
iii) \displaystyle \{u_n\} is decreasing.

Solution. i) Since \displaystyle \ln u_n=\frac{1}{n}\ln n! - \ln n=\frac{1}{n}\sum_{k=1}^n \ln \left(\frac{k}{n}\right) and \ln x is a continuous function, we have
\displaystyle \ln \left(\lim_{n\to\infty} u_n \right)=\lim_{n\to\infty} \ln u_n = \lim_{n\to\infty} \frac{1}{n}\sum_{k=1}^n \ln \left(\frac{k}{n}\right)=\int_0^1 \ln x \ dx=-1.

ii) For n=1 we have an equality.
First Solution. We use induction over n \ge 2. The inequality holds for n=2 because e > 2. Now suppose that it holds for n. Then, using Example 2 in this post, we have
\displaystyle e^n =e \cdot e^{n-1} > e \cdot \frac{n^n}{n!} > \left(1+\frac{1}{n}\right)^n \frac{n^n}{n!}=\frac{(n+1)^{n+1}}{(n+1)!}.
Second Solution. If n > 1, then \displaystyle e^n =\sum_{k=0}^{\infty} \frac{n^k}{k!} > \sum_{k=n-1}^{\infty} \frac{n^k}{k!}=\frac{n^n}{n!}s_n, where
\displaystyle s_n=2 + \sum_{k=1}^{\infty} \frac{n^k}{(n+1)(n+2) \cdots (n+k)}=2+\sum_{k=1}^{\infty} \frac{1}{(1+\frac{1}{n})(1+\frac{2}{n}) \cdots (1+\frac{k}{n})} > 2 + \sum_{k=1}^{\infty} \frac{1}{(k+1)!}=e.

iii) Using ii) and Problem 1, we have
\displaystyle \left(\frac{u_n}{u_{n+1}}\right)^{n(n+1)}=\frac{n!}{n^n}\left(1+\frac{1}{n}\right)^{n^2} > e^{1-n} \left(1+\frac{1}{n}\right)^{n^2-1} = e^{1-n} \left[\left(1+\frac{1}{n}\right)^{n+1}\right]^{n-1} > e^{1-n}e^{n-1}=1. \ \Box
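Parts i) and iii) are easy to check numerically via \displaystyle \ln u_n=\frac{1}{n}\ln n! - \ln n (Python sketch; lgamma avoids computing n! directly):

```python
import math

def u(n):
    """u_n = (n!)^(1/n) / n, computed via ln u_n = (ln n!)/n - ln n."""
    return math.exp(math.lgamma(n + 1) / n - math.log(n))

seq = [u(n) for n in range(1, 2001)]
assert all(x > y for x, y in zip(seq, seq[1:]))   # iii): strictly decreasing
assert abs(seq[-1] - 1 / math.e) < 2e-3           # i): tends to e^{-1}
```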

Remark. Unlike the inequality \displaystyle e^{n-1} > \frac{n^n}{n!}, the inequality  \displaystyle e^n > \frac{n^n}{n!} is trivial because \displaystyle e^n=\sum_{k=0}^{\infty} \frac{n^k}{k!} > \frac{n^n}{n!}.

Exercise. Show that \displaystyle \lim_{n\to\infty} \sqrt[n]{\sum_{k=0}^n \frac{n^k}{k!}}=e.


Irrationality of pi

We have already proved that e is irrational (see the example in this post!)
The fact that \pi is irrational, i.e. there are no integers a,b such that \displaystyle \pi=\frac{a}{b}, was first proved over two hundred years ago (by Johann Lambert, in 1761), and there are several proofs of this result; I like Ivan Niven’s proof best and that’s the one I’m going to explain here.

Throughout this post, f^{(k)}(x) is the k-th derivative of a function f(x).

Problem 1. Let P(x) be a polynomial of degree 2n. Show that
\displaystyle \int_0^{\pi} P(x) \sin x \ dx = \sum_{k=0}^n (-1)^k(P^{(2k)}(0)+P^{(2k)}(\pi)).

Solution. The proof is by induction over n. If n=0, then P(x)=p is a constant and so \displaystyle \int_0^{\pi} P(x) \sin x \ dx=p\int_0^{\pi} \sin x \ dx = 2p=P(0)+P(\pi).
Now suppose the equality in the problem holds for polynomials of degree 2n and let P(x) be a polynomial of degree 2n+2. Then using integration by parts with P(x)=u and \sin x \ dx = dv gives \displaystyle \int_0^{\pi} P(x) \sin x \ dx = P(0)+P(\pi)+\int_0^{\pi}P'(x)\cos x \ dx. \ \ \ \ \ \ \ \ \ \ \ \ \ (1)
Now, in (1), we use integration by parts again, this time with P'(x)=u and \cos x \ dx = dv to get \displaystyle \int_0^{\pi} P(x) \sin x \ dx = P(0)+P(\pi)-(P'(0)+P'(\pi)) + \int_0^{\pi} P''(x) \sin x \ dx. \ \ \ \ \ \ \ \ \ \ \ \ (2)
But since P''(x) is a polynomial of degree 2n, we can use our induction hypothesis to write \displaystyle \int_0^{\pi} P''(x) \sin x \ dx= \sum_{k=0}^n (-1)^k(P^{(2k+2)}(0)+P^{(2k+2)}(\pi)) and so (2) becomes
\displaystyle \int_0^{\pi} P(x) \sin x \ dx = P(0)+P(\pi)-(P'(0)+P'(\pi)) + \sum_{k=0}^n (-1)^k(P^{(2k+2)}(0)+P^{(2k+2)}(\pi))= \sum_{k=0}^{n+1} (-1)^k(P^{(2k)}(0)+P^{(2k)}(\pi)). \ \Box
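The identity in Problem 1 can be spot-checked numerically; here is a Python sketch for P(x)=x^2 (so n=1), using a hand-rolled composite Simpson rule (the helper names are mine):

```python
import math

def simpson(f, a, b, m=10_000):
    """Composite Simpson's rule on [a, b] with m (even) subintervals."""
    h = (b - a) / m
    s = f(a) + f(b)
    for i in range(1, m):
        s += (4 if i % 2 else 2) * f(a + i * h)
    return s * h / 3

# P(x) = x^2 has degree 2, i.e. n = 1, and the right-hand side of Problem 1 is
# (P(0) + P(pi)) - (P''(0) + P''(pi)) = (0 + pi^2) - (2 + 2) = pi^2 - 4.
lhs = simpson(lambda x: x * x * math.sin(x), 0.0, math.pi)
assert abs(lhs - (math.pi ** 2 - 4)) < 1e-8
```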

Problem 2. Let n,a,b be integers with n \ge 1 and b \ne 0. Let \displaystyle c:=\frac{a}{b} and \displaystyle P(x):=\frac{x^n(a-bx)^n}{n!}. Show that P^{(k)}(0) and P^{(k)}(c) are integers for all k \ge 0.

Solution. Since P(x) is a polynomial of degree 2n, we have P^{(k)}(x)=0 for all x and k > 2n and so there is nothing to prove for k > 2n.
Now, since P(x) is a polynomial of degree 2n, we have
\displaystyle P(x)=\sum_{k=0}^{2n} \frac{P^{(k)}(0)}{k!}x^k. \ \ \ \ \ \ \ \ \ \ \ \ \ (1)
On the other hand, using the binomial theorem, it is clear that
\displaystyle P(x)=\frac{x^n(a-bx)^n}{n!}=\sum_{k=n}^{2n}\frac{p_k}{n!}x^k, \ \ \ \ \ \ \ \ \ \ \ \ \ \ (2)
where p_k are some integers.
So (1) and (2) together give P^{(k)}(0)=0 for 0 \le k \le n-1 and \displaystyle P^{(k)}(0)=\frac{k!}{n!}p_k for n \le k \le 2n. Thus, since \displaystyle \frac{k!}{n!} is an integer for k \ge n, we have proved that P^{(k)}(0) is an integer for all k \ge 0.
Finally, since P(c-x)=P(x), we have (-1)^kP^{(k)}(c-x)=P^{(k)}(x) for all k \ge 0 and thus (-1)^kP^{(k)}(0)=P^{(k)}(c). So P^{(k)}(c) is an integer because we have already proved that P^{(k)}(0) is an integer. \Box

Problem 3. Show that \pi is irrational.

Solution (Ivan Niven). Suppose, to the contrary, that \pi is rational and put \displaystyle \pi=\frac{a}{b}, where a,b > 0 are integers. Let n \ge 1 be an integer and put
\displaystyle P(x):=\frac{x^n(a-bx)^n}{n!}.
Then by Problems 1 and 2, \displaystyle \int_0^{\pi} P(x) \sin x \ dx must be an integer. But we are now going to prove that if n is large enough, then \displaystyle \int_0^{\pi} P(x) \sin x \ dx is not an integer and this contradiction proves that our assumption that \pi is rational is false.
First note that, on the interval [0,\pi], we have
\displaystyle 0 \le P(x)\sin x \le P(x) =\frac{b^nx^n(\pi - x)^n}{n!} \le \frac{b^n \pi^{2n}}{n!}=\frac{(b \pi^2)^n}{n!}
and thus
\displaystyle 0 < \int_0^{\pi} P(x)\sin x \ dx \le \frac{(b \pi^2)^n}{n!} \pi. \ \ \ \ \ \ \ \ \ \ (*)
(Note that \displaystyle \int_0^{\pi} P(x)\sin x \ dx \ne 0 because P(x) \sin x is not identically zero on [0, \pi]).
But for every real number r, we have \displaystyle \lim_{n\to\infty} \frac{r^n}{n!}=0 because the series \displaystyle \sum_{n=0}^{\infty} \frac{r^n}{n!} is convergent (to e^r). So \displaystyle \lim_{n\to\infty} \frac{(b \pi^2)^n}{n!} \pi=0. Hence, if n is large enough, we will have \displaystyle \frac{(b \pi^2)^n}{n!} \pi  < 1 and thus, by (*), \displaystyle 0 < \int_0^{\pi} P(x)\sin x \ dx < 1. Therefore \displaystyle \int_0^{\pi} P(x)\sin x \ dx is not an integer if n is large enough. \Box

Remark. Another way to say that a real number t is irrational is to say that t is not the root of a polynomial of degree one with integer coefficients. If t is not a root of any polynomial with integer coefficients, then t is called transcendental. Clearly every transcendental number is irrational. It is known that both e and \pi are in fact transcendental.

Exercise 1. Let P(x) be a polynomial of degree 2n+1. Show that
\displaystyle \int_0^{\pi} P(x) \sin x \ dx = \sum_{k=0}^n (-1)^k(P^{(2k)}(0)+P^{(2k)}(\pi)).

Exercise 2. Show that the inequality (*) in the solution of Problem 3 can be improved by proving that \displaystyle \int_0^{\pi} P(x)\sin x \ dx \le \frac{(b \pi^2)^n}{4^n n!} \pi.


Limit of (1^n + 2^n + … + n^n)/n^n

We have all seen this easy exercise: show that \displaystyle \lim_{n\to\infty} \frac{1^{\alpha} + 2^{\alpha} + \cdots + n^{\alpha}}{n^{\alpha + 1}}=\frac{1}{\alpha +1}, where \alpha > -1 is a real constant, i.e. \alpha is independent of n. The proof is quite simple: we have \displaystyle \frac{1^{\alpha} + 2^{\alpha} + \cdots + n^{\alpha}}{n^{\alpha + 1}}= \frac{1}{n}\sum_{k=1}^n \left(\frac{k}{n}\right)^{\alpha}, which is a Riemann sum, and so the limit is \displaystyle \int_0^1 x^{\alpha} dx = \frac{1}{\alpha +1}.
But things get interesting when \alpha depends on n.
For example, what is \displaystyle \lim_{n\to\infty} \frac{1^n + 2^n + \cdots + n^n}{n^{n + 1}}  ? Well, the answer is 0 and that’s a trivial consequence of a much nicer result that we are going to prove in Problem 2. But first, we need to prove something, which is nice by itself.

Problem 1. If r is a real constant, show that \displaystyle \lim_{n\to\infty} \sum_{k=1}^{n-1} \frac{k^re^{-k}}{n-k}=0.

Solution. Choose an integer m \ge 0 such that r \le m. Then \displaystyle e^k > \frac{k^{m+1}}{(m+1)!} \ge \frac{k^{r+1}}{(m+1)!} and so
\displaystyle 0 <  \sum_{k=1}^{n-1} \frac{k^re^{-k}}{n-k} < (m+1)! \sum_{k=1}^{n-1} \frac{1}{k(n-k)}=\frac{2(m+1)!}{n}\sum_{k=1}^{n-1}\frac{1}{k} < \frac{2(m+1)!}{n}\sum_{k=1}^n\frac{1}{k}=\frac{2(m+1)!}{n}\left(\sum_{k=1}^n\frac{1}{k}-\ln n \right) + \frac{2(m+1)! \ln n}{n}.
Now the result follows from the facts that \displaystyle \lim_{n\to\infty} \frac{1}{n}\left(\sum_{k=1}^n\frac{1}{k}-\ln n \right)=0, because \displaystyle \sum_{k=1}^n\frac{1}{k}-\ln n converges to Euler’s constant \gamma, and \displaystyle \lim_{n\to\infty} \frac{\ln n}{n}=0. \ \Box

Problem 2. Let \displaystyle a_n:= \frac{1^n + 2^n + \cdots + n^n}{n^n}. Show that \displaystyle \lim_{n\to\infty} a_n=\frac{e}{e-1}.

Solution. By part i) and the Example in this problem, we have
\displaystyle e^{\frac{x}{1+x}} \le 1+ x \le e^x, \ \ \ \ \ \ \ \ \ \ \ \ \ \ (*)
for all real numbers x > -1. Thus
\displaystyle a_n =\sum_{k=1}^n \left(\frac{k}{n}\right)^n=\sum_{k=0}^{n-1} \left(1-\frac{k}{n}\right)^n\le \sum_{k=0}^{n-1}e^{-k} \le \sum_{k=0}^{\infty}e^{-k}=\frac{e}{e-1}.
On the other hand, again by (*), we have
\displaystyle a_n=\sum_{k=0}^{n-1} \left(1-\frac{k}{n}\right)^n \ge \sum_{k=0}^{n-1} e^{\frac{-kn}{n-k}}=\sum_{k=0}^{n-1}e^{-k} - \sum_{k=0}^{n-1} \left(1-e^{\frac{-k^2}{n-k}} \right)e^{-k} \ge \sum_{k=0}^{n-1}e^{-k} - \sum_{k=0}^{n-1} \frac{k^2e^{-k}}{n-k}.
So we have proved that \displaystyle \sum_{k=0}^{n-1}e^{-k} - \sum_{k=0}^{n-1} \frac{k^2e^{-k}}{n-k} \le a_n \le \frac{e}{e-1}. Hence, in order to complete the solution, we only need to show that \displaystyle \lim_{n\to\infty} \left(\sum_{k=0}^{n-1}e^{-k} - \sum_{k=0}^{n-1} \frac{k^2e^{-k}}{n-k}\right)=\frac{e}{e-1}
and that’s easy to do since \displaystyle \lim_{n\to\infty} \sum_{k=0}^{n-1}e^{-k}=\sum_{k=0}^{\infty}e^{-k}=\frac{e}{e-1} and, by Problem 1, \displaystyle \lim_{n\to\infty} \sum_{k=0}^{n-1} \frac{k^2e^{-k}}{n-k}=0. \ \Box
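A numerical check of Problem 2 (a Python sketch; the terms are computed via logarithms so that k^n never overflows):

```python
import math

def a(n):
    """a_n = (1^n + 2^n + ... + n^n) / n^n, each term via exp(n ln(k/n))."""
    return sum(math.exp(n * (math.log(k) - math.log(n)))
               for k in range(1, n + 1))

target = math.e / (math.e - 1)          # ~ 1.58198
# a_n <= e/(e-1) by the first half of the solution, and the gap closes:
assert a(100) < target
assert abs(a(10_000) - target) < 1e-3
```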

Remark. There’s a shorter proof of Problem 2 that uses Tannery’s theorem but I don’t discuss that theorem in this blog (I like my solution better!)

Exercise 1. If \alpha \le -1 is a real constant, what is \displaystyle \lim_{n\to\infty} \frac{1^{\alpha} + 2^{\alpha} + \cdots + n^{\alpha}}{n^{\alpha + 1}} ?

Exercise 2. Show that \displaystyle \lim_{n\to\infty}\sum_{k=0}^n \binom{n+1}{k}\frac{B_k}{n^k}=\frac{1}{e-1}, where B_k are the Bernoulli numbers.
Hint. Use Problem 2 and the problem in this post.

Exercise 3. Prove that \displaystyle \lim_{n\to\infty} \frac{1^n + 2^n + \cdots + n^n}{n^{n + 1}}=0 without using the result given in Problem 2.
Hint. You’ll only need the easy half of the solution of Problem 2.


Convergence of series (2)

Problem 1. Show that if \displaystyle \sum_{n=1}^{\infty} a_n, \ 0\le a_n < 1, converges, then \displaystyle \sum_{n=1}^{\infty} \frac{a_n}{1-a_n} converges too.

Solution. Since \displaystyle \lim_{n\to\infty} a_n=0, there exists an integer N > 0 such that \displaystyle a_n \le \frac{1}{2} for all n \ge N.  Then \displaystyle 1-a_n \ge \frac{1}{2} for n \ge N and so \displaystyle 0 \le \frac{a_n}{1-a_n} \le 2a_n for n \ge N. The result now follows from the comparison test. \Box

Problem 2. Show that
i) if \displaystyle \sum_{n=1}^{\infty} a_n, \ a_n \ge 0, converges, then \displaystyle \sum_{n=1}^{\infty} \ln(1+a_n) converges too.
ii) if \displaystyle \sum_{n=1}^{\infty} a_n, \ 0 \le a_n < 1, converges, then \displaystyle \sum_{n=1}^{\infty} \ln(1-a_n) converges too.

Solution. By the Example in this post, we have \displaystyle \frac{x}{x+1} \le \ln(x+1) \le x for all real numbers x > -1. So \displaystyle 0 \le \ln(1+a_n) \le a_n for a_n \ge 0 and \displaystyle 0 \le -\ln(1-a_n) \le \frac{a_n}{1-a_n} for 0 \le a_n < 1. The result now follows from the comparison test and Problem 1. \Box
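The comparison bounds that drive both solutions can be sanity-checked on a grid (Python; purely illustrative, with tiny tolerances for floating-point rounding):

```python
import math

# 0 <= a <= 1/2 is the regime used in Problem 1's tail estimate.
for i in range(500):
    a = i / 1000                                      # sample grid in [0, 0.499]
    assert 0 <= a / (1 - a) <= 2 * a + 1e-15          # Problem 1's bound
    assert -math.log(1 - a) <= a / (1 - a) + 1e-12    # Problem 2 ii)'s bound
    assert 0 <= math.log(1 + a) <= a + 1e-15          # Problem 2 i)'s bound
```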

Exercise 1. Show that if \displaystyle \sum_{n=1}^{\infty} a_n, \ a_n \ge 0, diverges, then \displaystyle \sum_{n=1}^{\infty} \frac{a_n}{a_n+1} diverges too.
Hint. Suppose that \displaystyle \sum_{n=1}^{\infty} \frac{a_n}{a_n+1} converges and then use Problem 1.

Exercise 2. Show that \displaystyle \sum_{n=1}^{\infty} \ln \left(\cos \left(\frac{1}{n}\right) \right) is convergent.


Integral of ln(x^2 + a)/(x^2 + 1)

We have already seen one example of evaluating definite integrals using the Leibniz integral rule. In this post, I give another one.

Problem. Show that \displaystyle \int_0^{\infty} \frac{\ln(x^2+a)}{x^2+1} \ dx = \pi \ln(\sqrt{a}+1) for all real constants a \ge 0.

Solution. Let \displaystyle f(a):=\int_0^{\infty} \frac{\ln(x^2+a)}{x^2+1} \ dx, \ a \ge 0.
First see that the equality in the problem holds for a=0 because if, in \displaystyle f(0)=2\int_0^{\infty} \frac{\ln x}{x^2+1} \ dx, we put \displaystyle x = \frac{1}{t}, then we will quickly get \displaystyle f(0)=0.
So, from now on, we will assume that a > 0. Now we use the definition of derivative to find f'(a), the derivative of f with respect to a. We have \displaystyle f'(a)=\lim_{h\to0} \frac{1}{h} \int_0^{\infty} \frac{1}{x^2+1} \ln \left(1+\frac{h}{x^2+a} \right)  dx. \ \ \ \ \ \ \ \ \ \ \ \ \ \ \ (1)
Now we have \displaystyle \frac{t}{t+1} \le \ln(1+t) \le t for all real numbers t > -1 (see the Example in this post), and thus
\displaystyle \frac{h}{x^2+a+h} \le  \ln \left(1+\frac{h}{x^2+a} \right) \le \frac{h}{x^2+a}. \ \ \ \ \ \ \ \ \ \ \ \ \ \ \ (2)
It now follows from (1) and (2) that
\displaystyle f'(a)=\int_0^{\infty} \frac{dx}{(x^2+a)(x^2+1)}=\frac{\pi}{2(a+\sqrt{a})}
and therefore
\displaystyle f(a)=\frac{\pi}{2} \int \frac{da}{a+\sqrt{a}}=\pi \ln(\sqrt{a}+1) + C, \ \ \ \ \ \ \ \ \ \ \ \ \ \ (3)
for some constant C.
But what is C? Well, if in (3) we, for example, put a = 1, then we’ll get
C=f(1)-\pi \ln 2. \ \ \ \ \ \ \ \ \ \ \ \ \ \ (4)
We also have \displaystyle f(1)=\int_0^{\infty} \frac{\ln(x^2+1)}{x^2+1} \ dx=\int_0^{\frac{\pi}{2}} \ln(\tan^2 \theta  + 1)  \ d \theta = -2 \int_0^{\frac{\pi}{2}} \ln(\cos \theta) \ d \theta=-2 \int_0^{\frac{\pi}{2}} \ln(\sin \theta) \ d \theta
and so f(1)=\pi \ln 2, by this problem. Thus, by (4), \ C=0 and we are done by (3). \ \Box

Example. Show that \displaystyle g(c):=\int_0^{\pi} \ln(c+\cos x) \ dx = \pi \ln \left(\frac{c+\sqrt{c^2-1}}{2}\right) for all real constants c \ge 1.

Solution. We first prove the equality for c=1. We have \displaystyle g(1)=\int_0^{\pi} \ln(1+\cos x) \ dx = \int_0^{\pi} \left(\ln2 + 2\ln \cos \left(\frac{x}{2}\right) \right) \ dx=\pi \ln 2 + 4 \int_0^{\frac{\pi}{2}} \ln \cos x \ dx \\ =\pi \ln 2 + 4 \int_0^{\frac{\pi}{2}} \ln \sin x \ dx.
Thus g(1)=-\pi \ln 2, by this problem, and we’re done in this case.
Now suppose that c > 1 and let \displaystyle \tan \left(\frac{x}{2}\right)=y. Then \displaystyle \cos x = \frac{1-y^2}{1+y^2} and \displaystyle dx = \frac{2dy}{1+y^2}. Hence, by the above problem, we have
\displaystyle g(c)=2 \int_0^{\infty} \frac{\ln(c-1) + \ln \left(y^2 + \frac{c+1}{c-1} \right) - \ln(y^2+1)}{y^2+1} \ dy=\pi \ln(c-1) + 2\pi \ln \left(\sqrt{\frac{c+1}{c-1}}+1\right)-2\pi \ln 2=\pi \ln \left(\frac{c+\sqrt{c^2-1}}{2}\right). \ \Box
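The Example is easy to verify numerically, since the interval is finite and the integrand is smooth for c > 1 (a Python sketch with a hand-rolled Simpson rule; the helper names are mine):

```python
import math

def simpson(f, a, b, m=20_000):
    """Composite Simpson's rule on [a, b] with m (even) subintervals."""
    h = (b - a) / m
    s = f(a) + f(b)
    for i in range(1, m):
        s += (4 if i % 2 else 2) * f(a + i * h)
    return s * h / 3

# Compare the integral g(c) against the closed form pi*ln((c + sqrt(c^2-1))/2).
for c in (1.5, 2.0, 5.0):
    lhs = simpson(lambda x: math.log(c + math.cos(x)), 0.0, math.pi)
    rhs = math.pi * math.log((c + math.sqrt(c * c - 1)) / 2)
    assert abs(lhs - rhs) < 1e-8
```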

Exercise 1. Show that \displaystyle \int_0^{\infty} \frac{dx}{(x^2+a)(x^2+1)}=\frac{\pi}{2(a+\sqrt{a})} for all real constants a > 0.

Exercise 2. For a real constant c \ge 1, show that  \displaystyle \int_0^{\frac{\pi}{2}} \ln(c^2-\sin^2x) \ dx=\int_0^{\pi} \ln(c+\cos x) \ dx.

Exercise 3. Evaluate \displaystyle \int_0^{\infty} \frac{\ln(x^2+a)}{x^2+b} \ dx, where a \ge 0 and b > 0 are real constants.


Bernoulli’s inequality

If x \ge 0 is a real number and n \ge 1 is an integer, then it is clear, from the binomial theorem, that \displaystyle (1+x)^n \ge 1+\binom{n}{1}x=1+nx. This is the trivial case of Bernoulli’s inequality.
Let’s extend that result. Let r, x be real numbers with x > -1 (we need the condition x > -1 to make sure that (1+x)^r is a real number). When do we have (1+x)^r \ge 1+rx or (1+x)^r \le 1+rx ? For x=0 or r=0,1, we have (1+x)^r=1+rx and so we will assume that x \ne 0 and r \ne 0,1.

Problem (Bernoulli’s inequality). Let r,x be real numbers with r \ne 0,1 and 0 \ne x > -1. Show that
i) if r > 1 or r < 0, then (1+x)^r > 1+rx.
ii) if 0 <  r < 1, then (1+x)^r < 1+rx.

Solution. Given a real number r, let f(x):=(1+x)^r. By Taylor’s theorem, for every nonzero x > -1, there exists c between 0 and x such that \displaystyle f(x)=f(0)+f'(0)x+\frac{f''(c)}{2}x^2.
Thus, since f(0)=1, \ f'(0)=r and f''(c)=r(r-1)(1+c)^{r-2}, we have
\displaystyle (1+x)^r=1+rx+\frac{r(r-1)}{2}(1+c)^{r-2}x^2
and hence
\displaystyle (1+x)^r -(1+rx)=\frac{r(r-1)}{2}(1+c)^{r-2}x^2. \ \ \ \ \ \ \ \ \ \ \ \ \ (*)
Now c > -1 because x > -1 and c is between 0 and x. Thus (1+c)^{r-2} > 0 and so, by (*), if r(r-1) > 0, i.e. if r > 1 or r < 0, then (1+x)^r-(1+rx) > 0 and if r(r-1) < 0, i.e. if 0 <  r < 1, then (1+x)^r-(1+rx) < 0. \ \Box
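A quick numerical spot-check of both cases (Python; the grid of sample points is arbitrary):

```python
# Bernoulli's inequality: sample x > -1 with x != 0, and both ranges of r.
for x in (-0.9, -0.5, -0.1, 0.1, 1.0, 3.0):
    for r in (-2.0, -0.5, 1.5, 3.0):          # r > 1 or r < 0: part i), strict >
        assert (1 + x) ** r > 1 + r * x
    for r in (0.25, 0.5, 0.75):               # 0 < r < 1: part ii), strict <
        assert (1 + x) ** r < 1 + r * x
```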

Example 1. Show that \displaystyle \lim_{r\to0+} \sum_{n=1}^{\infty} \frac{n^r}{n!}=e-1.

Solution. (I have also given this solution here) By part ii) of the above problem, 1 \le (n+1)^r \le 1+nr for n \ge 0 and real numbers 0 < r < 1. So for 0 < r < 1,
\displaystyle \sum_{n=0}^{\infty} \frac{1}{(n+1)!} \le \sum_{n=0}^{\infty}\frac{(n+1)^r}{(n+1)!} \le \sum_{n=0}^{\infty} \frac{1}{(n+1)!}+r \sum_{n=0}^{\infty} \frac{n}{(n+1)!}.
But \displaystyle \lim_{r\to0} r \sum_{n=0}^{\infty} \frac{n}{(n+1)!}=0, because \displaystyle \sum_{n=0}^{\infty} \frac{n}{(n+1)!} is convergent, and so, by the squeeze theorem, \displaystyle \lim_{r\to0+} \sum_{n=1}^{\infty} \frac{n^r}{n!}=\sum_{n=0}^{\infty} \frac{1}{(n+1)!}=e-1. \ \Box

Example 2. Show that the sequence \displaystyle a_n:=\left(1+\frac{1}{n}\right)^n, \ \ n\ge 1, is increasing and so a_n < e for n \ge 1.

Solution. We have \displaystyle \frac{a_{n+1}}{a_n}=\frac{\left(1+\frac{1}{n+1}\right)^{n+1}}{\left(1+\frac{1}{n}\right)^n}=\left(1+\frac{1}{n}\right)\left(\frac{1+\frac{1}{n+1}}{1+\frac{1}{n}}\right)^{n+1} = \left(1+\frac{1}{n}\right) \left(\frac{n^2+2n}{(n+1)^2} \right)^{n+1}=\left(1+\frac{1}{n}\right) \left(1-\frac{1}{(n+1)^2} \right)^{n+1}.
But by part i) of the above problem, \displaystyle  \left(1-\frac{1}{(n+1)^2} \right)^{n+1} > 1-\frac{1}{n+1} and thus \displaystyle \frac{a_{n+1}}{a_n} > \left(1+\frac{1}{n}\right)\left(1-\frac{1}{n+1}\right)=1. \ \Box

Example 3 (AM-GM inequality). Show that \displaystyle \frac{x_1+x_2 + \cdots + x_n}{n} \ge \sqrt[n]{x_1x_2 \cdots x_n} for any integer n \ge 1 and positive real numbers x_1, x_2, \cdots , x_n.

Solution. First note that, by part i) of the above problem, x^r \ge 1+r(x-1) for x > 0 and r \ge 1 (just change x to x-1).
Now, back to the solution, there’s nothing to prove for n=1 and so we’ll assume that n\ge 2. For integers 1 \le k \le n, let \displaystyle \sigma_k:=\frac{x_1+x_2 + \cdots + x_k}{k}. We want to show that \sigma_n^n \ge x_1x_2 \cdots x_n.
If k \ge 2, then, as we mentioned at the beginning of the solution, \displaystyle \left(\frac{\sigma_k}{\sigma_{k-1}}\right)^k \ge 1 + k \left(\frac{\sigma_k}{\sigma_{k-1}}-1 \right)=\frac{k\sigma_k-(k-1)\sigma_{k-1}}{\sigma_{k-1}}=\frac{x_k}{\sigma_{k-1}}
and so \displaystyle \sigma_k^k \ge x_k \sigma_{k-1}^{k-1}. Thus we have
\displaystyle  \sigma_n^n \ge x_n \sigma_{n-1}^{n-1} \ge x_nx_{n-1}\sigma_{n-2}^{n-2} \ge \cdots \ge x_nx_{n-1} \cdots x_2 \sigma_1=x_nx_{n-1} \cdots x_2x_1. \ \Box

Exercise.  Show that the sequence \displaystyle b_n:=\left(1+\frac{1}{n}\right)^{n+1}, \ \ n\ge 1, is decreasing and so \displaystyle \left(1+\frac{1}{n}\right)^n < e < \left(1+\frac{1}{n}\right)^{n+1} for n \ge 1.
Hint. You can always see this post for a solution.


Stirling’s formula (1)

Stirling’s formula (Problem 2) gives a useful asymptotic approximation for n!. Using this formula, we can easily evaluate many limits involving factorials.

Problem 1. Let a>0 and 0 < b < 1 be real numbers and consider the function \displaystyle f(t):=\ln(t+1)-t, \ t>-1.
Show that
i) \displaystyle f(t) < \frac{-b^{2a}}{2} for t \in (-1, -b^a] and \displaystyle f(t) < \frac{-b^a}{6}t for \displaystyle t \ge b^{a}.
ii) \displaystyle - \frac{t^2}{2} - \frac{b^{3a}}{3(1-b^a)^3} < f(t) < - \frac{t^2}{2} + \frac{b^{3a}}{3(1-b^a)^3} for \displaystyle t \in [-b^a, b^a].

Solution. By Taylor’s theorem, for every t > -1 there exists some c between 0 and t such that \displaystyle f(t)=f(0)+f'(0)t + \frac{f''(0)}{2}t^2+\frac{f'''(c)}{6}t^3. Thus
\displaystyle f(t)=-\frac{t^2}{2}+\frac{t^3}{3(c+1)^3}. \ \ \ \ \ \ \ \ \ \ \ \ \  (*)
Now back to the problem.
i) Let t \in (-1, -b^a]. Then, by (*), we have \displaystyle f(t) <\frac{-t^2}{2} \le \frac{-b^{2a}}{2}.
For t \ge b^a, first see that the function \displaystyle g(t):=\frac{f(t)}{t}, \ t > -1, is decreasing (Exercise!) and thus g(t) \le g(b^a), for all t \ge b^a.
On the other hand, we have from (*) that \displaystyle f(b^a) < -\frac{b^{2a}}{2}+ \frac{b^{3a}}{3} < -\frac{b^{2a}}{2}+\frac{b^{2a}}{3}=\frac{-b^{2a}}{6} and so \displaystyle \frac{f(t)}{t}=g(t) \le g(b^a) =\frac{f(b^a)}{b^a} < \frac{-b^a}{6}.

ii) By (*), we have \displaystyle \left|f(t)+\frac{t^2}{2}\right|=\frac{|t|^3}{3(c+1)^3} < \frac{b^{3a}}{3(1-b^a)^3} because |c|< |t| \le b^a. \ \Box

Problem 2. (Stirling’s formula) Show that \displaystyle \lim_{n\to\infty} \frac{n!}{n^{n+\frac{1}{2}}e^{-n}}=\sqrt{2\pi}.

Solution. We have \displaystyle n!=\int_0^{\infty}x^ne^{-x}dx (see Exercise 2 in this post for a hint). Now let’s make the substitution x=n(1+t) to get \displaystyle n!=n^{n+1}e^{-n}\int_{-1}^{\infty}e^{n(\ln(t+1)-t)}dt. Thus \displaystyle \frac{n!}{n^{n+\frac{1}{2}}e^{-n}}=\sqrt{n} \int_{-1}^{\infty} e^{n(\ln(t+1)-t)}dt. \ \ \ \ \ \ \ \ \ \ \ \ \ \ \ (1)
Now choose any real number a such that \displaystyle \frac{1}{3} < a < \frac{1}{2} and let f(t):=\ln(t+1)-t. Then we can rewrite (1) as
\displaystyle \frac{n!}{n^{n+\frac{1}{2}}e^{-n}}=\sqrt{n} \left(\int_{-1}^{\frac{-1}{n^a}}e^{nf(t)}dt + \int_{\frac{-1}{n^a}}^{\frac{1}{n^a}} e^{nf(t)}dt + \int_{\frac{1}{n^a}}^{\infty} e^{nf(t)}dt \right).
So we are done if we show that
i) \displaystyle \lim_{n\to\infty} \sqrt{n} \int_{-1}^{\frac{-1}{n^a}}e^{nf(t)}dt=0,
ii) \displaystyle \lim_{n\to\infty} \sqrt{n} \int_{\frac{1}{n^a}}^{\infty} e^{nf(t)}dt=0,
iii) \displaystyle \lim_{n\to\infty} \sqrt{n} \int_{\frac{-1}{n^a}}^{\frac{1}{n^a}} e^{nf(t)}dt=\sqrt{2\pi}.
Proof of i). By part i) of Problem 1, \displaystyle f(t) < \frac{-1}{2n^{2a}} for all \displaystyle -1<t \le \frac{-1}{n^a} and thus \displaystyle 0 < \sqrt{n} \int_{-1}^{\frac{-1}{n^a}}e^{nf(t)}dt \le \sqrt{n}\left(1-\frac{1}{n^a}\right)e^{\frac{-1}{2}n^{1-2a}}.
But since 1-2a > 0, we have \displaystyle \lim_{n\to\infty} \sqrt{n}\left(1-\frac{1}{n^a}\right)e^{\frac{-1}{2}n^{1-2a}}=0 and the result follows.
Proof of ii). By part i) of Problem 1, we have \displaystyle f(t) < \frac{-1}{6n^a}t for \displaystyle t \ge \frac{1}{n^a} and thus \displaystyle 0 < \sqrt{n} \int_{\frac{1}{n^a}}^{\infty} e^{nf(t)}dt \le \sqrt{n} \int_{\frac{1}{n^a}}^{\infty} e^{\frac{-1}{6}n^{1-a}t}dt =\frac{6\sqrt{n}}{n^{1-a}}e^{\frac{-1}{6}n^{1-2a}}.
But since 1-2a > 0, we have \displaystyle \lim_{n\to\infty} \frac{6\sqrt{n}}{n^{1-a}}e^{\frac{-1}{6}n^{1-2a}}=0 and the result follows.
Proof of iii). By Problem 1, ii), we have
\displaystyle -\frac{t^2}{2} - \frac{1}{3(n^a-1)^3} < f(t) < -\frac{t^2}{2} + \frac{1}{3(n^a-1)^3},
for all \displaystyle \frac{-1}{n^a} \le t \le \frac{1}{n^a} and thus
\displaystyle e^{\frac{-n}{3(n^a-1)^3}}\int_{\frac{-1}{n^a}}^{\frac{1}{n^a}}\sqrt{n} e^{\frac{-nt^2}{2}}dt \le \sqrt{n} \int_{\frac{-1}{n^a}}^{\frac{1}{n^a}} e^{nf(t)}dt \le e^{\frac{n}{3(n^a-1)^3}} \int_{\frac{-1}{n^a}}^{\frac{1}{n^a}} \sqrt{n}e^{\frac{-nt^2}{2}}dt. \ \ \ \ \ \ \ \ \ \ \ \ \ \ (2)
But clearly \displaystyle \lim_{n\to\infty} \frac{n}{3(n^a-1)^3}=0, because 1 < 3a, and so (2) gives \displaystyle \lim_{n\to\infty} \sqrt{n} \int_{\frac{-1}{n^a}}^{\frac{1}{n^a}} e^{nf(t)}dt=\lim_{n\to\infty} \int_{\frac{-1}{n^a}}^{\frac{1}{n^a}} \sqrt{n}e^{\frac{-nt^2}{2}}dt=2 \lim_{n\to\infty} \int_0^{\frac{1}{n^a}} \sqrt{n}e^{\frac{-nt^2}{2}}dt. \ \ \ \ \ \ \ \ \ \ \ \ \ \ \ (3)
The substitution \displaystyle \sqrt{\frac{n}{2}}t=s now gives
\displaystyle \int_0^{\frac{1}{n^a}} \sqrt{n}e^{\frac{-nt^2}{2}}dt=\sqrt{2}\int_0^{\sqrt{\frac{1}{2}n^{1-2a}}}e^{-s^2}ds.
But since 1-2a >0, we have \displaystyle \lim_{n\to\infty} \sqrt{\frac{1}{2}n^{1-2a}}=\infty and hence, by this post,
\displaystyle \lim_{n\to\infty}\int_0^{\frac{1}{n^a}} \sqrt{n}e^{\frac{-nt^2}{2}}dt=\sqrt{2} \int_0^{\infty}e^{-s^2}ds=\sqrt{\frac{\pi}{2}}.
Therefore, by (3),  \displaystyle \lim_{n\to\infty} \sqrt{n} \int_{\frac{-1}{n^a}}^{\frac{1}{n^a}} e^{nf(t)}dt=2\sqrt{\frac{\pi}{2}}=\sqrt{2\pi}. \ \Box
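Stirling's formula itself is easy to check numerically via \ln n! (a Python sketch; lgamma avoids computing n! directly):

```python
import math

def stirling_ratio(n):
    """n! / (n^(n + 1/2) e^{-n}), computed via lgamma to avoid overflow."""
    return math.exp(math.lgamma(n + 1) - (n + 0.5) * math.log(n) + n)

target = math.sqrt(2 * math.pi)
# The ratio decreases to sqrt(2*pi) (in fact it equals
# sqrt(2*pi) * e^{theta/(12n)} for some 0 < theta < 1, though the code
# below only checks monotonicity and the limit).
assert stirling_ratio(10) > stirling_ratio(100) > target
assert abs(stirling_ratio(10_000) - target) < 1e-4
```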
