📘 KL Divergence (Kullback–Leibler Divergence)
What is KL Divergence?
KL divergence measures how different one probability distribution is from another.
- P = true distribution (data / reality)
- Q = approximate or predicted distribution (model)
👉 It answers:
“How much information is lost when we use Q instead of P?”
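For discrete distributions, it is defined (using the natural log here, so values are in nats) as:

$$D_{KL}(P \parallel Q) = \sum_{x} P(x) \log \frac{P(x)}{Q(x)}$$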
Intuition (Simple Explanation)
- If P = Q → KL divergence = 0 (perfect match)
- If Q is very different from P → KL divergence is large
- It is not symmetric: $D_{KL}(P \parallel Q) \neq D_{KL}(Q \parallel P)$

So it is not a true distance; it is a directed divergence.
Coin Toss Example
- True distribution: a fair coin, e.g. P = (0.5, 0.5)
- Model prediction: a slightly biased estimate, e.g. Q = (0.6, 0.4)

✔ Small divergence → reasonable prediction
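Plugging the illustrative numbers above into the definition:

$$D_{KL}(P \parallel Q) = 0.5 \log\frac{0.5}{0.6} + 0.5 \log\frac{0.5}{0.4} \approx 0.0204 \text{ nats}$$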
Classification Example
True label = Cat
- Model A: puts most of its probability on Cat, e.g. (Cat: 0.9, Dog: 0.05, Bird: 0.05)
- Model B: spreads its probability more evenly, e.g. (Cat: 0.6, Dog: 0.2, Bird: 0.2)

Model A gives a smaller KL divergence → better model.
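Because the true distribution is one-hot, $P = (1, 0, 0)$, the sum collapses to the single term $-\log Q(\text{Cat})$. With the illustrative outputs above:

$$D_{KL}(P \parallel Q_A) = -\log 0.9 \approx 0.105, \qquad D_{KL}(P \parallel Q_B) = -\log 0.6 \approx 0.511$$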
Relation to Entropy & Cross-Entropy
Cross-entropy decomposes into entropy plus KL divergence:

$$H(P, Q) = H(P) + D_{KL}(P \parallel Q)$$

Thus:

$$D_{KL}(P \parallel Q) = H(P, Q) - H(P)$$

👉 KL divergence = extra loss due to wrong prediction
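A quick numerical check of this identity, as a minimal NumPy sketch using the illustrative coin distributions from above:

```python
import numpy as np

p = np.array([0.5, 0.5])  # true distribution (illustrative)
q = np.array([0.6, 0.4])  # model prediction (illustrative)

entropy = -np.sum(p * np.log(p))        # H(P)
cross_entropy = -np.sum(p * np.log(q))  # H(P, Q)
kl = np.sum(p * np.log(p / q))          # D_KL(P || Q)

# H(P, Q) = H(P) + D_KL(P || Q) should hold up to float error
print(np.isclose(cross_entropy, entropy + kl))  # True
```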
Uses of KL Divergence
✅ Machine Learning
- Training classification models
- Loss functions (cross-entropy minimization)
- Comparing probability outputs

✅ Bayesian Statistics
- Measuring the difference between prior and posterior
- Model selection
- Variational inference (approximate posterior)

✅ Information Theory
- Compression efficiency
- Coding cost difference

✅ Deep Learning Applications
- Variational Autoencoders (VAE)
- Knowledge distillation
- Reinforcement learning policy updates
Python Code Example
✔ Discrete KL divergence
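One minimal NumPy sketch (it assumes both inputs are normalized, and it handles the convention $0 \log 0 = 0$ by masking out zero-probability outcomes):

```python
import numpy as np

def kl_divergence(p, q):
    """Discrete KL divergence D_KL(P || Q) in nats.

    Assumes p and q are normalized, and q > 0 wherever p > 0.
    """
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    mask = p > 0  # 0 * log(0) is taken as 0
    return np.sum(p[mask] * np.log(p[mask] / q[mask]))

# Coin toss example (illustrative values)
print(kl_divergence([0.5, 0.5], [0.6, 0.4]))  # ≈ 0.0204
```

SciPy users can get the same value from scipy.stats.entropy(p, q), which computes relative entropy when given two distributions.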
✔ KL divergence for classification example
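Reusing kl_divergence from the sketch above, with a one-hot true label and the illustrative model outputs:

```python
# True label = Cat, as a one-hot distribution over (Cat, Dog, Bird)
p_true = [1.0, 0.0, 0.0]

q_model_a = [0.9, 0.05, 0.05]  # confident, close to the label
q_model_b = [0.6, 0.2, 0.2]    # less confident

print(kl_divergence(p_true, q_model_a))  # ≈ 0.105  (smaller → better)
print(kl_divergence(p_true, q_model_b))  # ≈ 0.511
```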
Important Properties
- Always non-negative: $D_{KL}(P \parallel Q) \ge 0$
- Equals 0 only when $P = Q$
- Not symmetric (a quick numeric check follows this list)
- Not a metric (it fails symmetry and the triangle inequality)
- Measures information loss
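The first three properties are easy to check with the kl_divergence sketch above (the distributions here are arbitrary illustrative values):

```python
p = [0.8, 0.2]
q = [0.4, 0.6]

print(kl_divergence(p, q))  # ≈ 0.3348 (non-negative)
print(kl_divergence(q, p))  # ≈ 0.3819 (a different value: not symmetric)
print(kl_divergence(p, p))  # 0.0      (zero only when P = Q)
```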
Summary
KL divergence measures the difference between two probability distributions $P$ and $Q$. It is defined as

$$D_{KL}(P \parallel Q) = \sum_{x} P(x) \log \frac{P(x)}{Q(x)}$$

It represents the information lost when $Q$ approximates $P$.
KL divergence is always non-negative and equals zero only when the distributions are identical.
Applications:
- Machine learning loss functions
- Bayesian inference and variational methods
- Comparing probability models
- Information theory and coding
Example: If the true distribution is a fair coin P = (0.5, 0.5) and the model predicts Q = (0.6, 0.4), as in the illustration above, KL divergence quantifies the prediction error.