MCMC in Bayesian Estimation

 


🎯 The problem in Bayesian estimation

In Bayesian inference we want:

Posterior = p(θ | data)

But this requires computing:

p(θ | D) = p(D | θ) p(θ) / ∫ p(D | θ) p(θ) dθ

👉 That denominator is often very hard or impossible to compute.
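For a one-dimensional parameter the denominator can still be brute-forced on a grid, which makes it concrete what we are trying to avoid. A minimal sketch, using the same coin setup as the worked example later in the post (10 tosses, 7 heads, uniform prior); with many parameters the grid blows up exponentially, which is exactly why MCMC is needed:

```python
import numpy as np
from math import comb

# Coin setup from the example later in the post: 10 tosses, 7 heads, uniform prior on p.
n, k = 10, 7

# Brute-force the evidence integral over [0, 1] on a fine grid.
grid = np.linspace(0.0, 1.0, 100_001)
likelihood = comb(n, k) * grid**k * (1.0 - grid)**(n - k)

# Uniform prior density = 1 and interval length = 1, so the integral
# is approximately the average of the likelihood over the grid.
evidence = likelihood.mean()

print(evidence)  # close to 1/11 ≈ 0.0909, the exact conjugate answer
```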


 What MCMC does (simple idea)

MCMC = a smart random sampling method

Instead of solving the posterior mathematically:

✔ It generates many random samples of the parameter
✔ These samples follow the posterior distribution
✔ Then we use those samples to estimate everything



Intuition with an example

Suppose:

  • You want to estimate the probability of heads, p

  • Posterior distribution is complicated

Instead of calculating the exact formula:

  1. Start with some guess, say p=0.5

  2. Randomly move to nearby values

  3. Accept moves that fit the data better

  4. Repeat thousands of times

After many steps:

👉 The visited values of p form the posterior distribution


Why this works (Markov Chain idea)

  • Each new value depends only on the current value

  • This creates a Markov chain

  • Designed so that after many steps:

distribution of visited points = posterior
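One step worth spelling out: why the intractable integral never appears. With a symmetric proposal, the Metropolis rule accepts a move from the current value θ to a proposed θ′ with probability

α = min(1, p(θ′ | D) / p(θ | D))

Because this is a ratio of posteriors for the same data, the normalizing denominator cancels, so only the unnormalized product p(D | θ) p(θ) is ever evaluated.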

 Procedure

  1. Start with initial parameter value θ0

  2. Propose a new value randomly

  3. Accept or reject based on posterior probability ratio

  4. Repeat many times

After sufficient iterations:

  • generated values behave like samples from the posterior

  • posterior mean ≈ sample average

  • uncertainty from spread of samples

Role in Bayesian estimation 

  • Allows inference when posterior has no closed form

  • Provides parameter estimates and credible intervals

  • Used in complex Bayesian models (hierarchical models, ML, neural networks)

 


What we do with those samples

From sampled values:

  • Posterior mean ≈ average of samples

  • Posterior variance ≈ variance of samples

  • Credible interval ≈ quantiles of samples

So Bayesian estimation becomes:

Estimation by simulation instead of calculus
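These summaries are one-liners in NumPy. A sketch, using direct draws from the coin example's Beta(8, 4) posterior as a stand-in for real MCMC output:

```python
import numpy as np

rng = np.random.default_rng(0)
# Stand-in for MCMC output: draws from the coin example's Beta(8, 4) posterior.
samples = rng.beta(8, 4, size=50_000)

post_mean = samples.mean()                               # posterior mean
post_var = samples.var()                                 # posterior variance
ci_low, ci_high = np.quantile(samples, [0.025, 0.975])   # 95% credible interval

print(post_mean, post_var, ci_low, ci_high)
```

The same three lines apply unchanged to actual MCMC samples once burn-in is discarded.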

Everyday analogy

Imagine:

  • Posterior = shape of a dark room

  • You cannot see the whole room

MCMC = walking randomly in the room

  • You spend more time in larger areas

  • By recording your positions, you learn the room's shape


Summary

MCMC is a method to sample from the posterior distribution in Bayesian inference when we cannot compute it directly.
These samples allow us to estimate parameters, uncertainty, and predictions.

Example with Python Code

Problem

  • Toss coin 10 times

  • Observe 7 heads

  • Want posterior of the probability of heads, p

Bayesian model

  • Prior: p ∼ Beta(1, 1) (uniform)

  • Likelihood: Binomial

  • True posterior:

p | D ∼ Beta(8, 4)
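Where Beta(8, 4) comes from: the Beta prior is conjugate to the binomial likelihood, so the posterior update is just adding the counts to the prior parameters. A quick sketch of the arithmetic:

```python
# Conjugate update: Beta(a, b) prior + k heads in n tosses -> Beta(a + k, b + n - k)
a0, b0 = 1, 1                # uniform prior Beta(1, 1)
n, k = 10, 7                 # observed data: 7 heads in 10 tosses

a, b = a0 + k, b0 + (n - k)  # posterior parameters: Beta(8, 4)
post_mean = a / (a + b)      # mean of a Beta(a, b) distribution

print(a, b, post_mean)  # 8 4 0.666...
```

This exact answer is what the MCMC histogram below is checked against.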

What MCMC does

  • Start with guess p = 0.5

  • Propose nearby random values

  • Accept values that fit data better

  • Repeat many times

  • Histogram of sampled values ≈ posterior

Result

  • Histogram = MCMC samples

  • Smooth curve = true Beta posterior

  • Means match closely → confirms sampling works


Conclusion:
MCMC converts Bayesian inference from a difficult integration problem into a sampling problem.



import numpy as np
import matplotlib.pyplot as plt

n, k = 10, 7  # 10 tosses, 7 heads

def posterior(p):
    """Unnormalized posterior: uniform prior × binomial likelihood."""
    if p <= 0 or p >= 1:
        return 0
    return p**k * (1 - p)**(n - k)

samples = []
p = 0.5  # initial guess
for _ in range(20000):
    proposal = np.random.normal(p, 0.08)  # symmetric random-walk proposal
    if posterior(p) > 0:
        a = min(1, posterior(proposal) / posterior(p))  # Metropolis acceptance ratio
    else:
        a = 1
    if np.random.rand() < a:
        p = proposal  # accept the move; otherwise stay at the current value
    samples.append(p)

samples = np.array(samples[2000:])  # discard burn-in
plt.hist(samples, bins=40, density=True)
plt.title("MCMC Posterior Samples for Coin Toss")
plt.xlabel("p")
plt.show()
print("Posterior mean ≈", samples.mean())

Posterior mean ≈ 0.6580497314180891
