For a discrete random variable X with probability mass function p, the Shannon entropy is H(X) = −Σ p(x) log p(x). It measures the average uncertainty, or information content, of X, in bits (log₂) or nats (ln). It is non-negative, and zero iff X is deterministic;…
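As a minimal sketch, the definition above can be computed directly from a probability vector; the function name and the convention that 0 · log 0 = 0 are assumptions for illustration:

```python
import math

def shannon_entropy(probs, base=2):
    """Shannon entropy H(X) = -sum over x of p(x) * log p(x).

    base=2 gives bits; base=math.e gives nats.
    Zero-probability outcomes contribute nothing, by the
    standard convention 0 * log 0 = 0.
    """
    return -sum(p * math.log(p, base) for p in probs if p > 0)

# A fair coin carries one bit of uncertainty; a deterministic
# variable (all mass on one outcome) carries none.
fair_coin = shannon_entropy([0.5, 0.5])      # 1.0 bit
certain = shannon_entropy([1.0])             # 0 bits
```

Clipping the sum to `p > 0` also makes the function safe on sparse distributions, where many outcomes have zero mass.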