Nature Loves Entropy. The normal, binomial, Poisson, and other distributions are members of a family, the EXPONENTIAL FAMILY. Nature loves the members of this family. Nature loves them because nature loves entropy, and all of the exponential family distributions are MAXIMUM ENTROPY distributions. -Richard McElreath
A maximum entropy distribution is the probability distribution that, among all distributions satisfying a given set of constraints, maximizes the entropy. Entropy, in this context, measures the uncertainty of a distribution over outcomes; equivalently, it measures how surprised we should expect to be by a draw from it.
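A minimal sketch of that idea (my addition, not from the original note): for a discrete distribution, entropy is highest when the outcomes are equally likely, and lower the more predictable the distribution is.

```r
# Shannon entropy in bits of a discrete probability vector
entropy <- function(p) -sum(ifelse(p > 0, p * log2(p), 0))

p_uniform <- rep(1/4, 4)                 # four equally likely outcomes
p_skewed  <- c(0.7, 0.2, 0.05, 0.05)     # same outcomes, more predictable

entropy(p_uniform)  # 2 bits: log2(4), the maximum for four outcomes
entropy(p_skewed)   # ~1.26 bits: less uncertainty, less entropy
```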
# Uniform: maximum entropy among all distributions on a bounded interval [a, b]
n <- 1000 # number of samples
a <- 0    # lower bound
b <- 1    # upper bound
samples_uniform <- runif(n, a, b)
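A quick check I added (not in the original note): on [0, 1] the uniform has differential entropy log(1 - 0) = 0 nats, and any other distribution on that interval does worse. Here it is compared with Beta(2, 2) using the Beta distribution's closed-form entropy.

```r
# Differential entropies in nats; the Beta(2, 2) entropy uses the
# standard closed form in terms of lbeta() and digamma().
h_uniform <- log(1 - 0)  # 0 nats for Uniform(0, 1)
a <- 2; b <- 2
h_beta <- lbeta(a, b) - (a - 1) * digamma(a) - (b - 1) * digamma(b) +
  (a + b - 2) * digamma(a + b)
h_uniform > h_beta  # TRUE: 0 vs ~-0.125 nats
```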
# Exponential: maximum entropy among distributions on (0, Inf) with a fixed mean
n <- 1000
rate <- 1 # rate parameter; the mean is 1 / rate
samples_exponential <- rexp(n, rate)
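Another check I added (not in the original note): among distributions on the positive reals with mean 1, the exponential has the highest differential entropy. Gamma(2, 2) also has mean 1, so it makes a fair comparison, using both distributions' closed-form entropies.

```r
h_exponential <- 1 - log(1)  # Exp(rate = 1): 1 nat

# Gamma(shape = k, rate = k) has mean 1; its closed-form entropy:
k <- 2
h_gamma <- k - log(k) + lgamma(k) + (1 - k) * digamma(k)  # ~0.884 nats

h_exponential > h_gamma  # TRUE: same mean, lower entropy for the Gamma
```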
# Gaussian: maximum entropy among distributions with a fixed mean and variance
n <- 1000
mu <- 0 # mean (named mu to avoid shadowing base::mean)
sd <- 1 # standard deviation
samples_gaussian <- rnorm(n, mu, sd)
# Boltzmann: maximum entropy over discrete energy levels with a fixed mean energy
n <- 1000
energies <- c(0, 1, 2)      # energy levels
probs <- exp(-energies)     # Boltzmann factors exp(-E), with kT = 1
probs <- probs / sum(probs) # normalize to a probability distribution
samples_boltzmann <- sample(energies, n, replace = TRUE, prob = probs)
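A check I added (not in the original note): among distributions over the same energy levels with the same mean energy, the Boltzmann weights have the highest entropy. Below, an alternative distribution is engineered to match the Boltzmann mean energy by spending the whole energy budget on level 2.

```r
energies <- c(0, 1, 2)
p_boltz <- exp(-energies) / sum(exp(-energies))
mean_energy <- sum(p_boltz * energies)

# Alternative with the same mean energy: mass only on levels 0 and 2
p_alt <- c(1 - mean_energy / 2, 0, mean_energy / 2)
stopifnot(abs(sum(p_alt * energies) - mean_energy) < 1e-12)

entropy_nats <- function(p) -sum(ifelse(p > 0, p * log(p), 0))
entropy_nats(p_boltz)  # ~0.83 nats
entropy_nats(p_alt)    # ~0.52 nats: same mean energy, lower entropy
```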
# Laplace: maximum entropy among distributions with a fixed mean absolute
# deviation about the location
install.packages("extraDistr") # provides rlaplace()
library(extraDistr)
n <- 1000
location <- 0
scale <- 1
samples_laplace <- rlaplace(n, location, scale)
# Bernoulli: maximum entropy over two outcomes with a fixed expected value
n <- 1000
prob_success <- 0.5
samples_bernoulli <- rbinom(n, 1, prob_success) # Bernoulli = binomial with size 1
# Multinomial: with size = 1, each draw is a one-hot categorical sample
n <- 1000
size <- 1
probs <- c(0.3, 0.5, 0.2)
samples_multinomial <- rmultinom(n, size, probs) # 3 x n matrix, one column per draw
# Dirichlet: each sample is a probability vector that sums to 1
install.packages("MCMCpack") # provides rdirichlet()
library(MCMCpack)
n <- 1000
alpha <- c(2, 3, 4) # concentration parameters
samples_dirichlet <- rdirichlet(n, alpha) # n x 3 matrix of probability vectors
Entropy Quotes
Information entropy corresponds to the expected number of yes-or-no questions. If we have to ask a lot of questions, the distribution is uncertain. -The Model Thinker by Scott E. Page
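A worked example of that correspondence (a standard illustration I added, not from the quoted book): for a dyadic distribution, the entropy in bits exactly equals the expected number of optimally chosen yes-or-no questions.

```r
p <- c(A = 1/2, B = 1/4, C = 1/4)
entropy_bits <- -sum(p * log2(p))  # 1.5 bits

# Optimal question strategy: "Is it A?" (1 question); if not, "Is it B?" (2)
questions <- c(A = 1, B = 2, C = 2)
expected_questions <- sum(p * questions)  # also 1.5

all.equal(entropy_bits, expected_questions)  # TRUE
```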
The battle to combat entropy by continually having to supply more energy for growth, innovation, maintenance, and repair, which becomes increasingly more challenging as the system ages, underlies any serious discussion of aging, mortality, resilience, and sustainability, whether for organisms, companies, or societies. -Scale by Geoffrey West, Location 397
I found it consoling after all these years to learn that writers are up against nothing less than the fundamental anarchy of the universe; entropy, prince of disorder, is sprinkling noise on everything we write. Ambiguity is noise. Redundancy is noise. Misuse of words is noise. Vagueness is noise. Jargon is noise. Pomposity is noise. Clutter is noise: all those unnecessary adjectives (“ongoing progress”), all those unnecessary adverbs (“successfully avoided”), all those unnecessary prepositions draped onto verbs (“order up”), all those unnecessary phrases (“in a very real sense”). -Writing to Learn by William Zinsser, Location 906