Shannon entropy

Layer 0 — Mathematics, in the probability-statistics subtree

For a discrete random variable X with probability mass function p, the Shannon entropy is H(X) = −Σ p(x) log p(x), with the sum taken over the support of X. It measures the average uncertainty, or information content, of X in bits (log₂) or nats (ln). H(X) is non-negative; it is zero iff X is deterministic; and it is maximized by the uniform distribution, where it equals the log of the number of outcomes.
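The definition above can be sketched in a few lines of Python; the function name `shannon_entropy` and the example distributions are illustrative, not from any particular library:

```python
import math

def shannon_entropy(pmf, base=2):
    """H(X) = -sum p(x) log p(x); base 2 gives bits, base e gives nats.

    Terms with p(x) = 0 are skipped, following the convention 0 log 0 = 0.
    """
    return sum(-p * math.log(p, base) for p in pmf if p > 0)

# A fair coin is maximally uncertain for two outcomes: exactly 1 bit.
print(shannon_entropy([0.5, 0.5]))   # 1.0
# A deterministic variable carries no uncertainty.
print(shannon_entropy([1.0]))        # 0.0
# A biased coin carries strictly less than 1 bit (≈ 0.469).
print(shannon_entropy([0.9, 0.1]))
```

Passing `base=math.e` returns the same quantity in nats instead of bits.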

Related concepts
