All articles tagged with "Mathematics"

Development of Information Theory
A guide to information theory — how Shannon defined information as the reduction of uncertainty, why the bit is the fundamental unit, how entropy quantifies information content and the limits of compression, how error correction works, and how information connects to physics and AI.
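To make the entropy idea concrete, here is a minimal Python sketch of Shannon's formula H = -sum(p * log2(p)); the four-symbol distribution is a made-up example, not data from the article.

```python
import math

def shannon_entropy(probabilities):
    # H(X) = -sum(p * log2(p)), in bits; terms with p == 0 contribute nothing.
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# A made-up four-symbol source: one common symbol, three rarer ones.
dist = [0.5, 0.25, 0.125, 0.125]
print(shannon_entropy(dist))  # 1.75 bits per symbol

# Shannon's source coding theorem: no lossless code can average fewer
# than 1.75 bits per symbol for this source.
```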
A comprehensive history of mathematics from Babylonian arithmetic and Euclid's geometry through calculus, non-Euclidean geometry, Cantor's infinities, Gödel's incompleteness theorems, and the unsolved problems of today.
Game theory is the mathematical study of strategic interaction. From the Prisoner's Dilemma to nuclear deterrence, this explainer covers Nash equilibria, cooperation, auctions, signaling, and why the theory changed economics, biology, and political science.
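For a concrete taste of a Nash equilibrium, here is a short Python sketch that brute-forces the pure-strategy equilibria of the Prisoner's Dilemma; the payoff numbers are illustrative, not taken from the article.

```python
from itertools import product

# Illustrative Prisoner's Dilemma payoffs (row player, column player);
# C = cooperate, D = defect, higher is better.
payoffs = {
    ("C", "C"): (3, 3),
    ("C", "D"): (0, 5),
    ("D", "C"): (5, 0),
    ("D", "D"): (1, 1),
}

def is_nash(profile):
    # Nash equilibrium: no player gains by unilaterally switching strategy.
    for player in (0, 1):
        for alternative in ("C", "D"):
            deviation = list(profile)
            deviation[player] = alternative
            if payoffs[tuple(deviation)][player] > payoffs[profile][player]:
                return False
    return True

print([p for p in product("CD", repeat=2) if is_nash(p)])
# [('D', 'D')]: both defect, even though mutual cooperation pays each player more.
```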
The butterfly effect describes how small changes in initial conditions can produce vastly different outcomes in complex systems. Learn the science behind chaos theory and where it applies.
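A minimal Python sketch of that sensitive dependence, using the logistic map as a stand-in for the complex systems the article discusses; the parameter and starting points are illustrative choices:

```python
# Iterate the logistic map x -> r * x * (1 - x), a standard toy model of chaos.
r = 4.0
x, y = 0.2, 0.2 + 1e-10  # two almost identical starting points

for step in range(1, 51):
    x = r * x * (1 - x)
    y = r * y * (1 - y)
    if step % 10 == 0:
        print(f"step {step}: |x - y| = {abs(x - y):.3e}")

# The tiny initial gap grows roughly exponentially until the two
# trajectories are completely uncorrelated.
```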
Exponential growth compounds on itself, doubling repeatedly. Learn why humans instinctively think linearly, how the chessboard problem illustrates exponentials, and how to reason better about them.
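The chessboard problem mentioned above makes a good worked example; this short Python sketch computes the total directly:

```python
# One grain on the first square, doubling on each of the 64 squares.
total = sum(2**square for square in range(64))
print(f"{total:,} grains")  # 18,446,744,073,709,551,615
assert total == 2**64 - 1   # the geometric series sums to 2^64 - 1

# Linear intuition (say, 1,000 grains per square) predicts 64,000 grains,
# off by roughly fourteen orders of magnitude.
```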