If both prior and likelihood are Gaussian, what can we say about the posterior? [closed]

























If X is a random variable that has a Gaussian prior and a Gaussian likelihood, what can be inferred about the posterior?

Since the posterior is proportional to prior × likelihood, and both of these are Gaussian, the posterior should also be Gaussian, but I am having trouble deriving it.

Tags: self-study, normal-distribution, maximum-likelihood, likelihood, conjugate-prior

























closed as unclear what you're asking by Xi'an, kjetil b halvorsen, mdewey, Ben, Sycorax Nov 19 '18 at 17:04


Please clarify your specific problem or add additional details to highlight exactly what you need. As it's currently written, it’s hard to tell exactly what you're asking. See the How to Ask page for help clarifying this question. If this question can be reworded to fit the rules in the help center, please edit the question.










  • Welcome to CV. Since you’re new here, you may want to take our tour, which has information for new users. Please add the [self-study] tag and read its wiki. Then tell us what you understand thus far, what you've tried & where you're stuck. We'll provide hints to help you get unstuck. Please make these changes as just posting your homework & hoping someone will do it for you is grounds for closing. If this is self-study rather than homework, let us know, and... it's still a good idea to show us what you've tried or explain specifically what you don't understand. – jbowman, Nov 10 '18 at 16:36

  • @Alex, do you mean to say "As posterior is proportional to prior*likelihood"? – curious_dan, Nov 10 '18 at 18:49

  • Possible duplicate of posterior Gaussian distribution, although we can probably reverse the duplicate and the target – Sycorax, Nov 19 '18 at 17:04

  • Another promising target is my own answer, stats.stackexchange.com/questions/124623/… – Sycorax, Nov 19 '18 at 17:32























asked Nov 10 '18 at 16:19 by Alex, edited Nov 10 '18 at 19:47














1 Answer
































Suppose the prior is $\mu \sim N(\mu_0, \tau^2)$ and the data are $x_1,\dots,x_n \mid \mu \sim N(\mu, \sigma^2)$ with $\sigma^2$ known, so the likelihood depends on $\mu$ only through $\overline x \mid \mu \sim N(\mu, \sigma^2/n)$.

The exponents in the prior density and the likelihood are added to each other (dropping the common factor $-\tfrac12$, which does not affect the argument):
$$
\frac{(\mu-\mu_0)^2}{\tau^2} + \frac{(\overline x - \mu)^2}{\sigma^2/n}
\quad = \quad \frac{(\sigma^2/n)(\mu-\mu_0)^2 + \tau^2(\overline x - \mu)^2}{\sigma^2\tau^2/n}. \tag{1}
$$

Now let's work on the numerator:
$$
\left(\frac{\sigma^2}{n}+\tau^2\right) \left(\mu^2 - 2\,\frac{(\sigma^2/n)\,\mu_0 + \tau^2\,\overline x}{\sigma^2/n+\tau^2}\,\mu + \text{``constant''} \right), \tag{2}
$$
where "constant" means a term not depending on $\mu$.

Now complete the square:
$$
\left(\mu - \frac{(\sigma^2/n)\,\mu_0 + \tau^2\,\overline x}{\sigma^2/n + \tau^2}\right)^2 + \text{``constant''}
$$
(where this "constant" will differ from the earlier one, but it just becomes part of the normalizing constant).

So the posterior density is
$$ \text{constant} \times \exp\Big( \text{negative constant} \times (\mu-\text{something})^2 \Big). $$

In other words, the posterior is Gaussian.

The posterior mean is a weighted average of the prior mean $\mu_0$ and the sample mean $\overline x$, with weights proportional to the reciprocals of the variances $\tau^2$ (for the prior) and $\sigma^2/n$ (for the sample mean), i.e. weights $1/\tau^2$ and $n/\sigma^2$.

– Michael Hardy, answered Nov 10 '18 at 17:00, edited Nov 12 '18 at 3:40
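As a quick numerical sanity check of this update, here is a minimal Python sketch; the values of $\mu_0$, $\tau$, $\sigma$, $n$, the simulated data, and the grid settings are arbitrary choices for illustration, not taken from the answer above. It computes the posterior mean and variance from the precision-weighted formulas and compares them with a brute-force evaluation of prior × likelihood on a grid.

    import numpy as np

    # Hypothetical test values: prior mu ~ N(mu0, tau^2); data x_1..x_n ~ N(mu, sigma^2)
    # with sigma known, so the likelihood depends on mu only through xbar ~ N(mu, sigma^2/n).
    mu0, tau = 1.0, 2.0        # prior mean and prior standard deviation
    sigma, n = 3.0, 10         # known data standard deviation and sample size
    rng = np.random.default_rng(0)
    x = rng.normal(1.8, sigma, size=n)
    xbar = x.mean()

    # Closed-form conjugate update: precisions (reciprocal variances) add, and the
    # posterior mean is the precision-weighted average of mu0 and xbar.
    prior_prec = 1.0 / tau**2
    data_prec = n / sigma**2
    post_var = 1.0 / (prior_prec + data_prec)
    post_mean = post_var * (prior_prec * mu0 + data_prec * xbar)

    # Brute-force check: evaluate log(prior * likelihood) on a grid of mu values,
    # normalize, and read off the mean and variance of the resulting density.
    mu = np.linspace(post_mean - 10.0, post_mean + 10.0, 20001)
    log_post = -(mu - mu0) ** 2 / (2 * tau**2) - (xbar - mu) ** 2 / (2 * sigma**2 / n)
    w = np.exp(log_post - log_post.max())
    w /= w.sum()
    grid_mean = np.sum(w * mu)
    grid_var = np.sum(w * (mu - grid_mean) ** 2)

    print(post_mean, grid_mean)   # the two means should agree to several decimals
    print(post_var, grid_var)     # likewise for the variances

The grid computation uses only prior × likelihood and never the closed form, so agreement between the two printed lines is the completing-the-square argument made concrete.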































  • However, just to note that this works when $\mu$ enters linearly in the quadratic form in the Gaussian density of $[X \mid \mu]$. If we have $x_i - f(\mu)$, with $f(\cdot)$ nonlinear, then we don't get a Gaussian posterior (see the sketch after this list). – Dimitris Rizopoulos, Nov 10 '18 at 18:21

  • @Henry: Yes. Fixed. – Michael Hardy, Nov 12 '18 at 3:41
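Picking up the point about nonlinear $f(\cdot)$, here is a minimal sketch of my own construction (the choice $f(\mu)=e^{\mu}$, the observation $y$, and $\sigma$ are arbitrary): it grid-evaluates the posterior for $y \mid \mu \sim N(e^{\mu}, \sigma^2)$ under a standard normal prior and reports its skewness, which would be essentially zero if the posterior were Gaussian.

    import numpy as np

    # Hypothetical nonlinear-mean model: prior mu ~ N(0, 1), one observation
    # y | mu ~ N(exp(mu), sigma^2).  Here mu enters the quadratic form nonlinearly,
    # so the posterior need not be Gaussian.
    sigma = 0.5
    y = 2.0
    mu = np.linspace(-5.0, 5.0, 40001)

    log_post = -mu**2 / 2 - (y - np.exp(mu)) ** 2 / (2 * sigma**2)
    w = np.exp(log_post - log_post.max())
    w /= w.sum()

    mean = np.sum(w * mu)
    var = np.sum(w * (mu - mean) ** 2)
    skew = np.sum(w * (mu - mean) ** 3) / var**1.5

    print(f"posterior mean {mean:.3f}, sd {var**0.5:.3f}, skewness {skew:.3f}")
    # A Gaussian posterior would have skewness ~ 0; a noticeably nonzero value
    # indicates the posterior is not Gaussian for this nonlinear mean function.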































