Example of a non-negative discrete distribution where the mean (or another moment) does not exist?




I was doing some work in scipy, and a conversation came up with a member of the core scipy group about whether a non-negative discrete random variable can have an undefined moment. I think he is correct, but I do not have a proof handy. Can anyone prove this claim (or, if it is not true, disprove it)?



I don't have an example handy if the discrete random variable has support on $\mathbb{Z}$, but it seems that some discretized version of the Cauchy distribution should serve as an example of an undefined moment. The condition of non-negativity (perhaps including $0$) is what seems to make the problem challenging (at least for me).




4 Answers



Let the CDF $F$ equal $1-1/n$ at the integers $n=1,2,\ldots,$ piecewise constant everywhere else, and subject to all criteria to be a CDF. The expectation is



$$\int_0^\infty (1-F(x))\,\mathrm{d}x = 1/2 + 1/3 + 1/4 + \cdots$$



which diverges. In this sense the first moment (and therefore all higher moments) is infinite. (See remarks at the end for further elaboration.)



If you're uncomfortable with this notation, note that for $n=1,2,3,\ldots,$



$$\Pr{}_F(n) = \frac{1}{n} - \frac{1}{n+1}.$$



This defines a probability distribution since each term is positive and $$\sum_{n=1}^\infty \Pr{}_F(n) = \sum_{n=1}^\infty \left(\frac{1}{n} - \frac{1}{n+1}\right) = \lim_{n\to \infty} \left(1 - \frac{1}{n+1}\right) = 1.$$



The expectation is



$$\sum_{n=1}^\infty n\,\Pr{}_F(n) = \sum_{n=1}^\infty n\left(\frac{1}{n} - \frac{1}{n+1}\right) = \sum_{n=1}^\infty \frac{1}{n+1} = 1/2 + 1/3 + 1/4 + \cdots$$



which diverges.
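To see this numerically, here is a minimal sketch in plain Python (the function name `pmf` is my own, not from any library): the total probability telescopes to $1$ while the partial expectations grow like $\log N$.

```python
import math

def pmf(n):
    # Pr_F(n) = 1/n - 1/(n+1) = 1/(n*(n+1))
    return 1.0 / (n * (n + 1))

# The probabilities telescope to 1, but the partial sums of the
# expectation form a shifted harmonic series, growing like log(N).
for N in (10, 1_000, 100_000):
    total = sum(pmf(n) for n in range((1), N + 1))
    partial_mean = sum(n * pmf(n) for n in range(1, N + 1))
    print(f"N={N:>6}  total prob={total:.6f}  partial mean={partial_mean:.3f}")
```

The partial means increase without bound even as the total probability approaches $1$, which is exactly the divergence shown above.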



This way of expressing the answer makes it clear that all solutions are obtained by such divergent series. Indeed, if you would like the distribution to be supported on some subset of the positive values $x_1, x_2, \ldots, x_n, \ldots$ with probabilities $p_1, p_2, \ldots$ summing to unity, then for the expectation to diverge the series which expresses it, namely



$$(a_n) = (x_n\, p_n),$$



must have divergent partial sums.



Conversely, every divergent series $(a_n)$ of non-negative numbers is associated with many discrete positive distributions having divergent expectation. For instance, given $(a_n)$ you could apply the following algorithm to determine sequences $(x_n)$ and $(p_n)$. Begin by setting $q_n = 2^{-n}$ and $y_n = 2^n a_n$ for $n=1, 2, \ldots.$ Define $\Omega$ to be the set of all $y_n$ that arise in this way, index its elements as $\Omega=\{\omega_1, \omega_2, \ldots, \omega_i, \ldots\},$ and define a probability distribution on $\Omega$ by



$$\Pr(\omega_i) = \sum_{n \,\mid\, y_n = \omega_i} q_n.$$



This works because the sum of the $p_n$ equals the sum of the $q_n$, which is $1$, and $\Omega$ has at most a countable number of positive elements.



As an example, the series $(a_n) = (1, 1/2, 1, 1/2, \ldots)$ obviously diverges. The algorithm gives



$$y_1 = 2a_1 = 2;\quad y_2 = 2^2 a_2 = 2;\quad y_3 = 2^3 a_3 = 8;\ \ldots$$



Thus $$\Omega = \{2, 8, 32, 128, \ldots, 2^{2n+1}, \ldots\}$$



is the set of odd positive powers of $2$ and $$p_1 = q_1 + q_2 = 3/4;\quad p_2 = q_3 + q_4 = 3/16;\quad p_3 = q_5 + q_6 = 3/64;\ \ldots$$
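The construction above can be sketched in a few lines of Python (a hedged illustration; the function name and the dictionary representation are my own choices, not from any library):

```python
from collections import defaultdict

def series_to_distribution(a):
    """whuber's construction: given terms a_1, a_2, ... of a divergent
    non-negative series, set q_n = 2**-n and y_n = 2**n * a_n, then
    merge the mass q_n of equal values y_n into Pr(omega_i)."""
    prob = defaultdict(float)
    for n, a_n in enumerate(a, start=1):
        prob[2 ** n * a_n] += 2.0 ** (-n)
    return dict(prob)

# The example series (1, 1/2, 1, 1/2, ...), truncated to 8 terms:
dist = series_to_distribution([1, 0.5] * 4)
for omega in sorted(dist):
    print(omega, dist[omega])
```

With 8 terms the support is $\{2, 8, 32, 128\}$ and the probabilities $3/4, 3/16, 3/64, \ldots$ match the hand computation, up to the truncated tail of mass $2^{-8}$.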



When all the values are positive, there is no such thing as an "undefined" moment: moments all exist, but they can be infinite in the sense of a divergent sum (or integral), as shown at the outset of this answer.



Generally, all moments are defined for positive random variables, because the sum or integral that expresses them either converges absolutely or diverges (is "infinite"). In contrast, moments can become undefined for variables that take on both positive and negative values, because--by definition of the Lebesgue integral--the moment is the difference between a moment of the positive part and a moment of the absolute value of the negative part. If both of those are infinite, convergence is not absolute and you face the problem of subtracting an infinity from an infinity: that difference does not exist.





Does this argument give an example of an infinite moment or an undefined moment? I'm looking for an undefined moment. Maybe there is a subtlety of undefined versus infinite moments that I am missing to fully understand your answer.
– Lucas Roberts
Sep 14 '18 at 12:07





When all the values are positive, there is no such thing as an "undefined" moment: moments all exist, but they can be infinite.
– whuber
Sep 14 '18 at 14:36





All moments are defined for positive random variables. Some may be infinite, that's all. Moments can become undefined for variables that take on positive and negative values, because--by definition of the Lebesgue integral--the moment is the difference between a moment of the positive part and a moment of the absolute value of the negative part. If both those are infinite, you face the problem of subtracting an infinity from an infinity: that does not exist.
– whuber
Sep 14 '18 at 15:54





"All moments are defined for positive random variables. Some may be infinite, that's all." Given that the title of the question concerns moments not existing, I think a lot of this comment deserves to be edited into the answer!
– Silverfish
Sep 14 '18 at 16:50





I guess I could've found the answer buried in this post: stats.stackexchange.com/questions/243150/…
– Lucas Roberts
Sep 15 '18 at 18:12




Here's a famous example: let $X$ take the value $2^k$ with probability $2^{-k}$, for each integer $k\ge 1$. Then $X$ takes values in (a subset of) the positive integers; the total mass is $\sum_{k=1}^\infty 2^{-k}=1$, but its expectation is
$$E(X) = \sum_{k=1}^\infty 2^k\, P(X=2^k) = \sum_{k=1}^\infty 1 = \infty.
$$
This random variable $X$ arises in the St. Petersburg paradox.
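A small simulation in plain Python (names are illustrative) makes the paradox tangible: the running sample mean never settles down, because rare, enormous payoffs keep arriving.

```python
import random

def st_petersburg(rng):
    # Flip a fair coin until the first head; with k flips the payoff is
    # 2**k and P(X = 2**k) = 2**-k, so each k contributes 2**k * 2**-k = 1
    # to the expectation, which therefore diverges.
    k = 1
    while rng.random() < 0.5:
        k += 1
    return 2 ** k

rng = random.Random(0)
samples = [st_petersburg(rng) for _ in range(100_000)]
for m in (100, 10_000, 100_000):
    print(f"mean of first {m:>6} draws: {sum(samples[:m]) / m:.2f}")
```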





+1 I like this one for its historical and philosophical connections.
– whuber
Sep 14 '18 at 3:10





Paradox resolution: If you win ∞ you are crushed by the G forces.
– Joshua
Sep 15 '18 at 21:57



The zeta distribution is a fairly well-known discrete distribution on the positive integers that doesn't have finite mean (for $1<\theta\leq 2$):



$$P(X=x\mid\theta)=\frac{1}{\zeta(\theta)}\,x^{-\theta},\quad x=1,2,\ldots,\quad \theta>1$$



where the normalizing constant involves $\zeta(\cdot)$, the Riemann zeta function.



(Edit: the case $\theta=2$ is very similar to whuber's answer.)
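Here is a quick numerical check of the $\theta = 2$ case in plain Python (scipy users can get the same pmf from `scipy.stats.zipf`, whose shape parameter plays the role of $\theta$; the helper below is my own): $x \cdot P(X=x) = 6/(\pi^2 x)$, so the partial means are a scaled harmonic series.

```python
import math

ZETA_2 = math.pi ** 2 / 6  # zeta(2)

def zeta_pmf(x, theta=2, norm=ZETA_2):
    # P(X = x) = x**-theta / zeta(theta); here theta = 2.
    return x ** (-theta) / norm

# x * P(X = x) = 6 / (pi**2 * x), so the partial means grow like log(N).
for N in (100, 10_000, 1_000_000):
    partial_mean = sum(x * zeta_pmf(x) for x in range(1, N + 1))
    print(f"N={N:>7}  partial mean={partial_mean:.3f}")
```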



Another distribution with similar tail behaviour is the Yule-Simon distribution.



Another example would be the beta-negative binomial distribution with $0<\alpha\leq 1$:



$$P(X=x\mid\alpha,\beta,r)=\frac{\Gamma(r+x)}{x!\;\Gamma(r)}\,\frac{\mathrm{B}(\alpha+r,\beta+x)}{\mathrm{B}(\alpha,\beta)},\quad x=0,1,2,\ldots\quad \alpha,\beta,r>0$$



some discretized version of the Cauchy distribution



Yes, if you take $p(n)$ to be the average value of the Cauchy density over the interval around $n$, then clearly its zeroth moment is the same as that of the Cauchy distribution, and its first moment asymptotically approaches the first moment of the Cauchy distribution. As for "the interval around $n$", it doesn't really matter how you define it; take $(n-1,n]$, $[n,n+1)$, $[n-.5,n+.5)$, vel cetera, and it will work. For positive integers, you can also take $p(n) = \frac{6}{(n\pi)^2}$. The zeroth moment sums to one, and the first moment is the sum of $\frac{6}{n\pi^2}$, which diverges.
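As a concrete sketch in plain Python (the rounding discretization $[n-.5, n+.5)$ is just one of the equally good choices above; function names are mine), the positive part of the mean of a discretized standard Cauchy diverges logarithmically:

```python
import math

def cauchy_cdf(x):
    # CDF of the standard Cauchy distribution
    return 0.5 + math.atan(x) / math.pi

def discretized_cauchy_pmf(n):
    # Mass the standard Cauchy puts on the interval [n - 1/2, n + 1/2)
    return cauchy_cdf(n + 0.5) - cauchy_cdf(n - 0.5)

# n * p(n) ~ 1/(pi * n) for large n, so the positive part of the mean
# diverges; by symmetry so does the negative part, which is why the
# Cauchy mean is undefined rather than infinite.
for N in (100, 10_000, 100_000):
    pos_part = sum(n * discretized_cauchy_pmf(n) for n in range(1, N + 1))
    print(f"N={N:>6}  positive part of mean = {pos_part:.3f}")
```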



And in fact for any polynomial $p(n)$ of degree $k \geq 2$ that is positive on the positive integers (degree at least $2$ is needed for $\sum_n 1/p(n)$ to converge), there is some $c$ such that $\frac{c}{p(n)}$ sums to 1. If we then take the $k$th moment, it will diverge, because $n^k \cdot \frac{c}{p(n)}$ tends to a nonzero constant.
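As a quick sanity check in plain Python (taking $p(n) = n^3$; the normalizing constant is $1/\zeta(3)$, with $\zeta(3)$ Apéry's constant), the total probability converges while the partial sums of the third moment grow linearly:

```python
ZETA_3 = 1.2020569031595943  # zeta(3) = sum of n**-3 (Apery's constant)

def pmf3(n):
    # c / p(n) with p(n) = n**3 and c = 1 / zeta(3)
    return 1.0 / (ZETA_3 * n ** 3)

# Total probability converges to 1 ...
total = sum(pmf3(n) for n in range(1, 1_000_001))
print(f"total probability ~ {total:.8f}")

# ... but the partial sums of the 3rd moment equal N / zeta(3) exactly,
# so the 3rd moment diverges.
def third_moment_partial(N):
    return sum(n ** 3 * pmf3(n) for n in range(1, N + 1))

print(third_moment_partial(10), third_moment_partial(1_000))
```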


