Central moment


Moment of a random variable about its mean


In probability theory and statistics, a central moment is a moment of a probability distribution of a random variable about the random variable's mean; that is, it is the expected value of a specified integer power of the deviation of the random variable from the mean. The various moments form one set of values by which the properties of a probability distribution can be usefully characterized. Central moments are used in preference to ordinary moments, computed in terms of deviations from the mean instead of from zero, because the higher-order central moments relate only to the spread and shape of the distribution, rather than also to its location.


Sets of central moments can be defined for both univariate and multivariate distributions.




Contents





  • 1 Univariate moments

    • 1.1 Properties


    • 1.2 Relation to moments about the origin


    • 1.3 Symmetric distributions



  • 2 Multivariate moments


  • 3 See also


  • 4 References




Univariate moments


The nth moment about the mean (or nth central moment) of a real-valued random variable X is the quantity μ_n := E[(X − E[X])^n], where E is the expectation operator. For a continuous univariate probability distribution with probability density function f(x), the nth moment about the mean μ is



\mu_n = \operatorname{E}\left[(X - \operatorname{E}[X])^n\right] = \int_{-\infty}^{+\infty} (x - \mu)^n f(x)\,\mathrm{d}x.[1]

For random variables that have no mean, such as the Cauchy distribution, central moments are not defined.
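
As a concrete illustration (not part of the original text), the defining integral can be evaluated numerically for a specific density. The sketch below, assuming NumPy and SciPy are available, uses the normal density N(1.5, 2^2) as an arbitrary example and recovers its second and third central moments.

    import numpy as np
    from scipy.integrate import quad
    from scipy.stats import norm

    mu, sigma = 1.5, 2.0                      # arbitrary example density N(1.5, 2^2)
    f = norm(loc=mu, scale=sigma).pdf

    def central_moment(n):
        # mu_n = integral of (x - mu)^n f(x) over the real line
        value, _ = quad(lambda x: (x - mu)**n * f(x), -np.inf, np.inf)
        return value

    print(central_moment(2))   # ~4.0, the variance sigma^2
    print(central_moment(3))   # ~0.0, since the normal density is symmetric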


The first few central moments have intuitive interpretations (checked numerically after the list):


  • The "zeroth" central moment μ0 is 1.

  • The first central moment μ1 is 0 (not to be confused with the first raw moments or the expected value μ).

  • The second central moment μ2 is called the variance, and is usually denoted σ2, where σ represents the standard deviation.

  • The third and fourth central moments are used to define the standardized moments which are used to define skewness and kurtosis, respectively.
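
These interpretations can be checked empirically on a sample (an illustrative sketch assuming NumPy; the exponential sample with scale 2 is an arbitrary choice):

    import numpy as np

    rng = np.random.default_rng(0)
    x = rng.exponential(scale=2.0, size=100_000)    # arbitrary sample, variance 4

    def central_moment(sample, n):
        return np.mean((sample - sample.mean())**n)

    print(central_moment(x, 0))            # exactly 1 (zeroth central moment)
    print(central_moment(x, 1))            # ~0, up to floating-point error
    print(central_moment(x, 2), x.var())   # both estimate the variance (~4)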


Properties


The nth central moment is translation-invariant, i.e. for any random variable X and any constant c, we have


\mu_n(X + c) = \mu_n(X).

For all n, the nth central moment is homogeneous of degree n:


\mu_n(cX) = c^n \mu_n(X).
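
Both properties can be verified directly on a sample (an illustrative sketch assuming NumPy; the gamma sample, the shift c = 7 and the scale c = 3 are arbitrary choices):

    import numpy as np

    rng = np.random.default_rng(1)
    x = rng.gamma(shape=2.0, scale=1.0, size=200_000)   # arbitrary sample

    def mu_n(sample, n):
        return np.mean((sample - sample.mean())**n)

    n = 3
    print(mu_n(x + 7.0, n), mu_n(x, n))            # translation invariance: equal
    print(mu_n(3.0 * x, n), 3.0**n * mu_n(x, n))   # homogeneity of degree n: equal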

Only for n = 1, 2, or 3 do we have an additivity property for random variables X and Y that are independent:



\mu_n(X + Y) = \mu_n(X) + \mu_n(Y), provided n \in \{1, 2, 3\}.
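
For instance, for independent samples the third central moments add while the fourth do not, as a Monte Carlo check illustrates (a sketch assuming NumPy; the exponential and gamma inputs are arbitrary choices):

    import numpy as np

    rng = np.random.default_rng(2)
    N = 1_000_000
    X = rng.exponential(scale=1.0, size=N)        # independent inputs (arbitrary)
    Y = rng.gamma(shape=3.0, scale=0.5, size=N)

    def mu_n(sample, n):
        return np.mean((sample - sample.mean())**n)

    # Additivity holds for n = 3 ...
    print(mu_n(X + Y, 3), mu_n(X, 3) + mu_n(Y, 3))   # approximately equal
    # ... but fails for n = 4, where a cross term 6*mu_2(X)*mu_2(Y) appears.
    print(mu_n(X + Y, 4), mu_n(X, 4) + mu_n(Y, 4))   # noticeably different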

A related functional that shares the translation-invariance and homogeneity properties with the nth central moment, but continues to have this additivity property even when n ≥ 4, is the nth cumulant κ_n(X). For n = 1, the nth cumulant is just the expected value; for n = 2 or 3, the nth cumulant is just the nth central moment; for n ≥ 4, the nth cumulant is an nth-degree monic polynomial in the first n moments (about zero), and is also a (simpler) nth-degree polynomial in the first n central moments.
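
For illustration, the standard identities linking the first few cumulants to the central moments (well-known facts, stated here without derivation) are

    \kappa_1 = \operatorname{E}[X], \qquad \kappa_2 = \mu_2, \qquad \kappa_3 = \mu_3, \qquad \kappa_4 = \mu_4 - 3\mu_2^2.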



Relation to moments about the origin


Sometimes it is convenient to convert moments about the origin to moments about the mean. The general equation for converting the nth-order moment about the origin to the moment about the mean is


\mu_n = \operatorname{E}\left[\left(X - \operatorname{E}[X]\right)^n\right] = \sum_{j=0}^{n} \binom{n}{j} (-1)^{n-j} \mu'_j \mu^{n-j},

where μ is the mean of the distribution, and the moment about the origin is given by


\mu'_m = \int_{-\infty}^{+\infty} x^m f(x)\,dx = \operatorname{E}[X^m] = \sum_{j=0}^{m} \binom{m}{j} \mu_j \mu^{m-j}.

For the cases n = 2, 3, 4, which are of most interest because of their relations to variance, skewness, and kurtosis respectively, this formula becomes (noting that \mu = \mu'_1 and \mu'_0 = 1):



\mu_2 = \mu'_2 - \mu^2, which is commonly referred to as \operatorname{Var}(X) = \operatorname{E}[X^2] - \left(\operatorname{E}[X]\right)^2,
\mu_3 = \mu'_3 - 3\mu\mu'_2 + 2\mu^3,
\mu_4 = \mu'_4 - 4\mu\mu'_3 + 6\mu^2\mu'_2 - 3\mu^4.

... and so on,[2] following Pascal's triangle, i.e.


\mu_5 = \mu'_5 - 5\mu\mu'_4 + 10\mu^2\mu'_3 - 10\mu^3\mu'_2 + 4\mu^5,

because 5\mu^4\mu'_1 - \mu^5\mu'_0 = 5\mu^4\mu - \mu^5 = 5\mu^5 - \mu^5 = 4\mu^5.
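
The general conversion is mechanical; the sketch below (assuming NumPy, with an exponential sample as an arbitrary example) turns empirical raw moments into central moments via the binomial formula above and compares them with directly computed central moments.

    import numpy as np
    from math import comb

    rng = np.random.default_rng(3)
    x = rng.exponential(scale=2.0, size=500_000)     # arbitrary sample

    def central_from_raw(sample, n):
        # mu_n = sum_{j=0}^{n} C(n, j) (-1)^(n-j) mu'_j mu^(n-j)
        raw = [np.mean(sample**j) for j in range(n + 1)]   # mu'_0, ..., mu'_n
        mean = raw[1]
        return sum(comb(n, j) * (-1)**(n - j) * raw[j] * mean**(n - j)
                   for j in range(n + 1))

    def central_direct(sample, n):
        return np.mean((sample - sample.mean())**n)

    for n in (2, 3, 4, 5):
        print(n, central_from_raw(x, n), central_direct(x, n))   # each pair agrees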


The following sum is a random variable having a compound distribution


W = \sum_{i=1}^{M} Y_i,

where the Y_i are mutually independent random variables sharing the same common distribution and M is a random integer variable, independent of the Y_k, with its own distribution. The moments of W are obtained as[3]


\operatorname{E}[W^n] = \sum_{i=0}^{n} \operatorname{E}\left[\binom{M}{i}\right] \sum_{j=0}^{i} \binom{i}{j} (-1)^{i-j} \operatorname{E}\left[\left(\sum_{k=1}^{j} Y_k\right)^n\right],

where \operatorname{E}\left[\left(\sum_{k=1}^{j} Y_k\right)^n\right] is defined as zero for j = 0.
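
As an illustrative check of this formula (the specific distributions are our choice, not from the text): taking M ~ Poisson(λ) and Y_k ~ Exponential(1), both ingredients have closed forms, namely E[C(M, i)] = λ^i / i! and E[(Y_1 + ... + Y_j)^n] = Γ(j + n)/Γ(j), so the right-hand side can be compared with a Monte Carlo estimate of E[W^n] (a sketch assuming NumPy and SciPy):

    import numpy as np
    from math import comb, factorial
    from scipy.special import gammaln

    rng = np.random.default_rng(4)
    lam, n = 3.0, 4                      # Poisson rate and moment order (arbitrary)

    def E_binom_M(i):
        # E[C(M, i)] for M ~ Poisson(lam): i-th factorial moment / i! = lam^i / i!
        return lam**i / factorial(i)

    def E_sum_power(j, n):
        # E[(Y_1 + ... + Y_j)^n] for iid Exp(1): the sum is Gamma(j, 1), whose
        # n-th raw moment is Gamma(j + n) / Gamma(j); defined as 0 for j = 0.
        return 0.0 if j == 0 else float(np.exp(gammaln(j + n) - gammaln(j)))

    # Right-hand side of the displayed formula for E[W^n]
    formula = sum(E_binom_M(i) *
                  sum(comb(i, j) * (-1)**(i - j) * E_sum_power(j, n)
                      for j in range(i + 1))
                  for i in range(n + 1))

    # Monte Carlo estimate of E[W^n] for comparison
    M = rng.poisson(lam, size=200_000)
    W = np.array([rng.exponential(1.0, size=m).sum() for m in M])
    print(formula, np.mean(W**n))        # the two values should roughly agree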



Symmetric distributions


In a symmetric distribution (one that is unaffected by being reflected about its mean), all odd central moments equal zero, because in the formula for the nth moment, each term involving a value of X less than the mean by a certain amount exactly cancels out the term involving a value of X greater than the mean by the same amount.
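
The cancellation can be made explicit with the substitution x = \mu + u together with the symmetry f(\mu + u) = f(\mu - u) (a short derivation added here for clarity):

    \mu_n = \int_{-\infty}^{+\infty} (x - \mu)^n f(x)\,dx
          = \int_{-\infty}^{+\infty} u^n f(\mu + u)\,du
          = \int_{-\infty}^{+\infty} (-u)^n f(\mu - u)\,du
          = (-1)^n \int_{-\infty}^{+\infty} u^n f(\mu + u)\,du
          = (-1)^n \mu_n,

so that \mu_n = -\mu_n = 0 whenever n is odd.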



Multivariate moments


For a continuous bivariate probability distribution with probability density function f(x, y), the (j, k) moment about the mean μ = (μ_X, μ_Y) is


\mu_{j,k} = \operatorname{E}\left[(X - \operatorname{E}[X])^j (Y - \operatorname{E}[Y])^k\right] = \int_{-\infty}^{+\infty}\int_{-\infty}^{+\infty} (x - \mu_X)^j (y - \mu_Y)^k f(x, y)\,dx\,dy.
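
For instance, \mu_{1,1} is the covariance of X and Y; its sample version is immediate (a sketch assuming NumPy; the correlated bivariate normal sample is an arbitrary example):

    import numpy as np

    rng = np.random.default_rng(5)
    # arbitrary correlated bivariate normal sample with covariance 0.6
    x, y = rng.multivariate_normal(mean=[0.0, 1.0],
                                   cov=[[1.0, 0.6], [0.6, 2.0]],
                                   size=200_000).T

    def mu_jk(x, y, j, k):
        return np.mean((x - x.mean())**j * (y - y.mean())**k)

    print(mu_jk(x, y, 1, 1))               # ~0.6, the covariance
    print(np.cov(x, y, bias=True)[0, 1])   # same quantity via NumPy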


See also


  • Standardized moment

  • Image moment

  • Normal distribution § Moments


References




  1. ^ Grimmett, Geoffrey; Stirzaker, David (2009). Probability and Random Processes. Oxford, England: Oxford University Press. ISBN 978-0-19-857222-0.


  2. ^ "Central Moment". MathWorld. http://mathworld.wolfram.com/CentralMoment.html


  3. ^ Grubbström, Robert W.; Tang, Ou (2006). "The moments and central moments of a compound distribution". European Journal of Operational Research. 170: 106–119. doi:10.1016/j.ejor.2004.06.012.








