Kullback-Leibler divergence + relative entropy

Layer 0 — Mathematics, in the information-theory subtree

KL(P||Q) = E_P[log(P/Q)]. Non-negative (Gibbs' inequality); zero iff P = Q almost everywhere. Foundational for statistical-inference + variational-methods + ML. Pinsker's inequality: TV(P, Q) ≤ √(KL(P||Q)/2). Cross-listed with L4 free-energy.
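A minimal numeric sketch (Python with NumPy; the helper names kl_divergence and total_variation are illustrative, not from any library): it evaluates the discrete form KL(P||Q) = Σ_x p(x) log(p(x)/q(x)) for two small distributions and checks Pinsker's bound TV ≤ √(KL/2).

```python
import numpy as np

def kl_divergence(p, q):
    """KL(P||Q) = sum over x of p(x) * log(p(x)/q(x)), discrete case.

    Requires q(x) > 0 wherever p(x) > 0; terms with p(x) = 0 contribute 0.
    """
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    mask = p > 0
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

def total_variation(p, q):
    """TV(P, Q) = (1/2) * sum over x of |p(x) - q(x)|."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    return 0.5 * float(np.sum(np.abs(p - q)))

p = [0.5, 0.3, 0.2]   # distribution P on a 3-symbol alphabet
q = [0.4, 0.4, 0.2]   # distribution Q on the same alphabet

kl = kl_divergence(p, q)
tv = total_variation(p, q)

print(f"KL(P||Q)     = {kl:.4f}")   # non-negative; 0 iff P = Q
print(f"TV(P, Q)     = {tv:.4f}")
print(f"sqrt(KL / 2) = {np.sqrt(kl / 2):.4f}")

# Pinsker's inequality: TV(P, Q) <= sqrt(KL(P||Q) / 2)
assert tv <= np.sqrt(kl / 2)
```

Note the standard conventions baked into the helper: terms with p(x) = 0 contribute 0, and KL(P||Q) = +∞ whenever P puts mass where Q does not.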
