The asymptotic expansion of an integral of an exponential function

What is the asymptotic expansion of $f(x) := \int_0^1 e^{-x(1-u^2)}\,du$?



The integrand is sharply peaked at $u=1$ and declines steeply away from it. I tried to transform $u$ into something suitable for the method of steepest descent, but have not found an appropriate transformation. Integration by parts has not yielded a satisfactory result, perhaps because I have not found the right parts to integrate.




2 Answers



The minimum of $1-u^2$ occurs at $u=1$, and near there we have $1-u^2 \approx 2(1-u)$, which is linear in the quantity $1-u$. This suggests we make the change of variables $1-u^2 = v$ (so that the exponent becomes linear in $v$), giving



$$
\int_0^1 e^{-x(1-u^2)}\,du = \frac{1}{2} \int_0^1 e^{-xv} (1-v)^{-1/2}\,dv.
$$
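
As a quick sanity check on this substitution (not needed for the argument), the two integrals can be compared numerically. Here is a minimal sketch assuming SciPy is available; `quad` copes with the integrable endpoint singularity at $v=1$:

```python
# Sanity check of the change of variables 1 - u^2 = v (a sketch using SciPy):
# both integrals should agree for any fixed x, up to quadrature error.
import numpy as np
from scipy.integrate import quad

x = 10.0
lhs, _ = quad(lambda u: np.exp(-x * (1.0 - u**2)), 0.0, 1.0)
rhs, _ = quad(lambda v: 0.5 * np.exp(-x * v) / np.sqrt(1.0 - v), 0.0, 1.0)
print(lhs, rhs)  # both are approximately 0.0530 for x = 10
```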



Then, following Watson's lemma, we can get the asymptotic expansion by expanding the subdominant factor, $(1-v)^{-1/2}$, around $v=0$ and integrating term-by-term from $v=0$ to $v=\infty$:



$$
\begin{align}
\int_0^1 e^{-x(1-u^2)}\,du &= \frac{1}{2} \int_0^1 e^{-xv} (1-v)^{-1/2}\,dv \\
&\approx \frac{1}{2} \sum_{k=0}^{\infty} \binom{-1/2}{k} (-1)^k \int_0^\infty e^{-xv} v^k\,dv \\
&= \frac{1}{2} \sum_{k=0}^{\infty} \binom{-1/2}{k} \frac{(-1)^k \, k!}{x^{k+1}}
\end{align}
$$



as $x \to \infty$, which matches the series given in the other answer.
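
For concreteness, the $k=0,1,2$ terms of this series are

$$
\frac{1}{2}\left( \frac{1}{x} + \frac{1/2}{x^2} + \frac{3/4}{x^3} + \cdots \right) = \frac{1}{2x} + \frac{1}{4x^2} + \frac{3}{8x^3} + \cdots.
$$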



Let's prove the special case of Watson's lemma we use in this answer. We'll assume that $g(v)$ is analytic at $v=0$ and that $\int_0^a \lvert g(v) \rvert\,dv$ exists (these are certainly true for $g(v) = (1-v)^{-1/2}$), and show that



$$
I(x) = \int_0^a e^{-xv} g(v)\,dv \approx \sum_{k=0}^{\infty} \frac{g^{(k)}(0)}{x^{k+1}}
$$



as $x \to \infty$. In other words, we will show that the asymptotic expansion for the integral can be obtained by expanding $g(v)$ in a Taylor series around $v=0$ and integrating term-by-term.
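
As a numerical illustration of this statement for the specific $g(v) = (1-v)^{-1/2}$ (a sketch assuming SymPy and SciPy; it is not part of the proof), one can compare the partial sums with the integral directly:

```python
# Illustrate Watson's lemma for g(v) = (1 - v)^(-1/2): compare the partial sum
# sum_{k=0}^{N} g^(k)(0) / x^(k+1) with the integral int_0^1 exp(-x v) g(v) dv.
import numpy as np
import sympy as sp
from scipy.integrate import quad

v = sp.symbols('v')
g = (1 - v) ** sp.Rational(-1, 2)
N, x = 5, 30.0

partial_sum = sum(float(sp.diff(g, v, k).subs(v, 0)) / x ** (k + 1) for k in range(N + 1))
integral, _ = quad(lambda t: np.exp(-x * t) / np.sqrt(1.0 - t), 0.0, 1.0)
print(partial_sum, integral)  # the difference is O(x^(-N-2)), negligible at x = 30
```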



Since $g(v)$ is analytic at $v=0$, there is a $\delta \in (0,a]$ such that $g^{(k)}(v)$ is analytic on $[0,\delta]$ for all $k$. Further, Taylor's theorem tells us that for any positive integer $N$ and any $v \in [0,\delta]$, there is a $v^* \in [0,v]$ for which



$$
g(v) = \sum_{k=0}^{N} \frac{g^{(k)}(0)}{k!} v^k + \frac{g^{(N+1)}(v^*)}{(N+1)!} v^{N+1}. \tag{1}
$$



We will split the integral $I(x)$ at this $\delta$ and estimate each piece separately. To this end, we define



$$
I(x) = \int_0^\delta e^{-xv} g(v)\,dv + \int_\delta^a e^{-xv} g(v)\,dv = I_1(x) + I_2(x).
$$



We only need a rough estimate on $I_2(x)$:



$$
\lvert I_2(x) \rvert \leq \int_\delta^a e^{-xv} \lvert g(v) \rvert\,dv \leq e^{-\delta x} \int_0^a \lvert g(v) \rvert\,dv,
$$



where we have assumed the last integral on the right is finite. Thus



$$
I(x) = I_1(x) + O(e^{-\delta x})
$$



as $x \to \infty$.



Now, from $(1)$ we have



$$
I_1(x) = \sum_{k=0}^{N} \frac{g^{(k)}(0)}{k!} \int_0^\delta e^{-xv} v^k\,dv + \frac{1}{(N+1)!} \int_0^\delta e^{-xv} g^{(N+1)}(v^*) v^{N+1}\,dv.
$$



The last integral can be bounded by



$$
\left\lvert \int_0^\delta e^{-xv} g^{(N+1)}(v^*) v^{N+1}\,dv \right\rvert \leq \left( \sup_{0 < v < \delta} \left\lvert g^{(N+1)}(v) \right\rvert \right) \int_0^\infty e^{-xv} v^{N+1}\,dv = \frac{\text{const.}}{x^{N+2}}.
$$



Thus



$$
I_1(x) = \sum_{k=0}^{N} \frac{g^{(k)}(0)}{k!} \int_0^\delta e^{-xv} v^k\,dv + O\!\left(x^{-N-2}\right)
$$



as $x \to \infty$. Finally, we reattach the tails to the integrals in the sum (the exponentially small tail term is bounded in the remark after the proof),



$$
\begin{align}
\int_0^\delta e^{-xv} v^k\,dv &= \int_0^\infty e^{-xv} v^k\,dv - \int_\delta^\infty e^{-xv} v^k\,dv \\
&= \frac{k!}{x^{k+1}} + O\!\left(e^{-\delta x}\right),
\end{align}
$$



and substitute these into the sum to get



$$
I_1(x) = \sum_{k=0}^{N} \frac{g^{(k)}(0)}{x^{k+1}} + O\!\left(x^{-N-2}\right)
$$



and hence



$$
I(x) = \sum_{k=0}^{N} \frac{g^{(k)}(0)}{x^{k+1}} + O\!\left(x^{-N-2}\right)
$$



as $x \to \infty$. Since $N$ was arbitrary, this is precisely the statement that $I(x)$ has the asymptotic expansion



$$
I(x) \approx \sum_{k=0}^{\infty} \frac{g^{(k)}(0)}{x^{k+1}}
$$



as $x \to \infty$.
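
Remark: the $O(e^{-\delta x})$ tail estimate used when reattaching the tails follows from a crude bound. Substituting $v = \delta + s$ and using $e^{-xs} \leq e^{-s}$ for $x \geq 1$ and $s \geq 0$,

$$
\int_\delta^\infty e^{-xv} v^k\,dv = e^{-\delta x} \int_0^\infty e^{-xs} (\delta + s)^k\,ds \leq e^{-\delta x} \int_0^\infty e^{-s} (\delta + s)^k\,ds,
$$

and the last integral is a finite constant depending only on $k$ and $\delta$.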





Thank you, Antonio Vargas. I tried the $v$ transformation earlier but did not proceed with the binomial expansion. The singularity at $v=1$, however, prevents the direct application of Watson's lemma. I have modified your answer with an integration by parts to remove the singularity there, but it does not seem to match the other answer, and I have not figured out why.
– Hans
Aug 22 at 8:46





@Hans Watson's lemma has no issue with the singularity at $v=1$. To apply it to an integral of the form $\int_0^1 e^{-xv} g(v)\,dv$, Watson's lemma only requires (for example) $g(v)$ to be analytic at $v=0$ and $\int_0^1 \lvert g(v) \rvert\,dv$ to exist. See the proof of Watson's lemma on the Wikipedia page (which I wrote). I have reverted the edits you made to my answer. It does match the other answer (I double-checked in Mathematica), so I'm not sure where you're going wrong.
– Antonio Vargas
Aug 23 at 0:17






Hmm, it's been a while since I wrote that proof, but in rereading it now it looks like it doesn't quite apply to the case in question. It can be modified to work on this case, however.
– Antonio Vargas
Aug 23 at 0:36





@Hans I've added a proof of Watson's lemma (only slightly modified from the one I linked) which applies directly to this specific question.
– Antonio Vargas
Aug 23 at 1:16





It looks good. Maybe you can incorporate your current more general proof into the Wikipedia article. Also, do you see why the integration by parts approach I took did not give the right answer?
– Hans
Aug 23 at 1:33



The integral can be evaluated in closed form in terms of the Dawson function $F$:

$$
f(x) = \frac{F(\sqrt{x})}{\sqrt{x}} \sim \frac{1}{2x} + \frac{1}{4x^2} + \frac{3}{8x^3} + \cdots
$$

Mathematica was used to perform the asymptotic expansion as $x \to \infty$.
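
As a quick numerical check (a sketch assuming SciPy, which exposes the Dawson function as `scipy.special.dawsn`), the closed form and the first three asymptotic terms can be compared against direct quadrature:

```python
# Compare direct quadrature, the closed form F(sqrt(x))/sqrt(x), and the first
# three asymptotic terms (a sketch using SciPy).
import numpy as np
from scipy.integrate import quad
from scipy.special import dawsn

x = 25.0
direct, _ = quad(lambda u: np.exp(-x * (1.0 - u**2)), 0.0, 1.0)
closed_form = dawsn(np.sqrt(x)) / np.sqrt(x)
series = 1.0 / (2.0 * x) + 1.0 / (4.0 * x**2) + 3.0 / (8.0 * x**3)
print(direct, closed_form, series)  # all agree to several digits at x = 25
```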





