Conjugate Prior for Poisson Distribution

We derive the conjugate prior for a Poisson likelihood and compute the posterior distribution step by step.


Likelihood: Poisson Model

Suppose we observe counts

$$x_1, x_2, \dots, x_n \sim \text{Poisson}(\lambda)$$

Poisson PMF:

$$p(x \mid \lambda) = \frac{\lambda^{x} e^{-\lambda}}{x!}, \quad x = 0, 1, 2, \dots$$

For independent data:

$$L(\lambda) = \prod_{i=1}^{n} \frac{\lambda^{x_i} e^{-\lambda}}{x_i!} \;\propto\; \lambda^{\sum_i x_i}\, e^{-n\lambda}$$

Let

$$S = \sum_{i=1}^{n} x_i$$

Then

$$L(\lambda) \propto \lambda^{S} e^{-n\lambda}$$
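As a quick numerical check, the sketch below (Python, with hypothetical counts) confirms that the full likelihood differs from the kernel $\lambda^{S} e^{-n\lambda}$ only by a constant that does not depend on $\lambda$:

```python
import math

# Hypothetical data; any non-negative integer counts would do.
x = [3, 2, 4]
n, S = len(x), sum(x)

def poisson_pmf(k, lam):
    """Poisson PMF: lambda^k * exp(-lambda) / k!"""
    return lam**k * math.exp(-lam) / math.factorial(k)

def likelihood(lam):
    """Product of the individual Poisson PMFs."""
    return math.prod(poisson_pmf(k, lam) for k in x)

def kernel(lam):
    """Likelihood kernel lambda^S * exp(-n*lambda)."""
    return lam**S * math.exp(-n * lam)

# The ratio is 1 / prod(x_i!), the same for every lambda.
ratios = [likelihood(lam) / kernel(lam) for lam in (0.5, 2.0, 5.0)]
print(math.isclose(ratios[0], ratios[1]) and math.isclose(ratios[1], ratios[2]))  # True
```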

Goal: Find Conjugate Prior

A conjugate prior must have the same functional form in $\lambda$ as the likelihood.

Look at likelihood kernel:

$$\lambda^{S} e^{-n\lambda}$$

We want a prior of form:

$$p(\lambda) \propto \lambda^{\alpha-1} e^{-\beta\lambda}$$

This is exactly the Gamma distribution.


Choose Prior: Gamma Distribution

Let

$$\lambda \sim \text{Gamma}(\alpha, \beta)$$

PDF:

$$p(\lambda) = \frac{\beta^{\alpha}}{\Gamma(\alpha)}\, \lambda^{\alpha-1} e^{-\beta\lambda}$$

Ignoring constants:

$$p(\lambda) \propto \lambda^{\alpha-1} e^{-\beta\lambda}$$

Posterior via Bayes' Rule

$$p(\lambda \mid x) \propto L(\lambda)\, p(\lambda)$$

Substitute:

$$p(\lambda \mid x) \propto \lambda^{S} e^{-n\lambda} \cdot \lambda^{\alpha-1} e^{-\beta\lambda}$$

Combine powers:

$$p(\lambda \mid x) \propto \lambda^{S+\alpha-1} e^{-(n+\beta)\lambda}$$
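A one-line sanity check (Python, with arbitrary placeholder values for $S$, $n$, $\alpha$, $\beta$) confirms the exponent bookkeeping in this step:

```python
import math

S, n, alpha, beta = 9, 3, 2.0, 1.0  # arbitrary placeholder values

for lam in (0.5, 1.7, 3.2):
    # Product of the likelihood kernel and the prior kernel...
    lhs = (lam**S * math.exp(-n * lam)) * (lam**(alpha - 1) * math.exp(-beta * lam))
    # ...equals the combined kernel with added exponents.
    rhs = lam**(S + alpha - 1) * math.exp(-(n + beta) * lam)
    assert math.isclose(lhs, rhs)
print("combined kernel matches")
```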

Recognize Posterior Form

This matches the Gamma distribution:

$$\lambda \mid x \sim \text{Gamma}(\alpha + S,\ \beta + n)$$

Final Result

✅ Prior

$$\lambda \sim \text{Gamma}(\alpha, \beta)$$

✅ Posterior

$$\lambda \mid x \sim \text{Gamma}\!\left(\alpha + \textstyle\sum_i x_i,\ \beta + n\right)$$
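In code, the whole update is two additions. A minimal sketch (the function name `posterior_params` is our own):

```python
def posterior_params(alpha, beta, x):
    """Gamma-Poisson conjugate update: shape += total count, rate += sample size."""
    return alpha + sum(x), beta + len(x)

# Example with a Gamma(2, 1) prior and counts [3, 2, 4]:
print(posterior_params(2, 1, [3, 2, 4]))  # (11, 4)
```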

Interpretation

| Quantity | Meaning |
| --- | --- |
| $\alpha$ | prior pseudo-count of events |
| $\beta$ | prior exposure/time |
| $S = \sum x_i$ | observed total events |
| $n$ | number of observations |

Posterior = prior + data

$$\text{shape: } \alpha + S, \qquad \text{rate: } \beta + n$$

Posterior Mean

Gamma mean:

$$E[\lambda \mid x] = \frac{\alpha + S}{\beta + n}$$

This is a weighted average of the prior mean $\alpha/\beta$ and the sample mean $\bar{x} = S/n$:

$$E[\lambda \mid x] = \frac{\beta}{\beta + n} \cdot \frac{\alpha}{\beta} + \frac{n}{\beta + n} \cdot \bar{x}$$
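The weighted-average identity is easy to verify numerically; the sketch below uses hypothetical hyperparameters $\alpha = 2$, $\beta = 1$ and counts $[3, 2, 4]$:

```python
alpha, beta = 2.0, 1.0        # hypothetical prior hyperparameters
x = [3, 2, 4]
n, S = len(x), sum(x)

prior_mean = alpha / beta
sample_mean = S / n
w = beta / (beta + n)         # weight given to the prior mean

posterior_mean = (alpha + S) / (beta + n)
weighted = w * prior_mean + (1 - w) * sample_mean
print(posterior_mean, weighted)  # 2.75 2.75
```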

Numerical Example

Suppose:

  • Prior: $\lambda \sim \text{Gamma}(2, 1)$

  • Observations: $x = [3, 2, 4]$

Then:

$$S = 3 + 2 + 4 = 9, \quad n = 3$$

Posterior:

$$\lambda \mid x \sim \text{Gamma}(11, 4)$$

Posterior mean:

$$11/4 = 2.75$$
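To double-check without relying on conjugacy, one can integrate the unnormalized posterior kernel on a grid (a crude Riemann sum, assumed fine enough here) and recover the same mean:

```python
import math

# Unnormalized posterior kernel for a Gamma(2, 1) prior and data [3, 2, 4]:
# lambda^(S + alpha - 1) * exp(-(beta + n) * lambda) = lambda^10 * exp(-4 * lambda)
def kernel(lam):
    return lam**10 * math.exp(-4 * lam)

h = 0.001
grid = [i * h for i in range(1, 30001)]   # the tail beyond lambda = 30 is negligible
Z = sum(kernel(l) for l in grid) * h      # normalizing constant
mean = sum(l * kernel(l) for l in grid) * h / Z
print(round(mean, 3))  # 2.75
```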

Why Gamma is conjugate

Because both prior and likelihood contain:

$$\lambda^{(\cdot)}\, e^{-(\cdot)\lambda}$$

Multiplying them preserves this form, so the posterior remains Gamma.

This is the essence of conjugacy.
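Conjugacy also means updates compose: processing the data in one batch or in several sequential batches yields the same posterior, because each update just adds to the Gamma parameters. A small sketch with hypothetical numbers:

```python
def update(alpha, beta, batch):
    """One Gamma-Poisson conjugate update step."""
    return alpha + sum(batch), beta + len(batch)

# All data at once vs. two sequential batches:
all_at_once = update(2, 1, [3, 2, 4])
sequential = update(*update(2, 1, [3, 2]), [4])
print(all_at_once == sequential)  # True
```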


Summary

For Poisson likelihood, the Gamma prior is conjugate, and the posterior is obtained by simply adding observed counts to the prior parameters.
