Using Decision Theory in Everyday Choices
Step-by-step guides and clear explanations of complex topics. Break down difficult concepts into understandable pieces with practical examples.
Some ideas are worth understanding deeply. Explainers break down complex topics, from cognitive biases and feedback loops to the Dunning-Kruger effect and confirmation bias, into clear, accessible explanations that build genuine understanding, not just surface familiarity.
This collection focuses on psychological phenomena, cognitive science, thinking frameworks, and decision-making processes. Each explainer answers the fundamental questions: How does this work? Why does it matter? When do you encounter it? How can you apply it?
What you'll find: Research-backed explanations, concrete examples from real life, practical applications, connections to related concepts, and insights that change how you see everyday thinking patterns.
Introduction to complex topics for those just getting started (10 articles)
Real-world examples and detailed case studies (10 articles)
Practical checklists, templates, and quick reference guides (10 articles)
Side-by-side comparisons of tools, approaches, and methodologies (10 articles)
Detailed explanations of how systems, tools, and processes function (10 articles)
Common mistakes, misconceptions, and lessons from failures (10 articles)
Detailed step-by-step guides and tutorials (10 articles)
Clear explanations of technical terms and jargon (10 articles)
Industry trends, data-driven insights, and analytical perspectives (10 articles)
Solutions to common problems and debugging approaches (0 articles)
Apply decision theory: list all options, define outcomes for each, assign probabilities to outcomes, calculate expected values, then choose highest value.
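The expected-value steps above can be sketched in a few lines of Python. The option names and probability/value numbers here are purely illustrative, not from any real dataset:

```python
# Hypothetical illustration: choosing between two job offers.
# Each option maps to a list of (probability, value) outcome pairs.
options = {
    "startup": [(0.3, 200_000), (0.7, 60_000)],   # big upside, likely modest pay
    "bigco":   [(0.9, 110_000), (0.1, 90_000)],   # stable either way
}

def expected_value(outcomes):
    """Sum of probability * value over all outcomes."""
    return sum(p * v for p, v in outcomes)

scores = {name: expected_value(outs) for name, outs in options.items()}
best = max(scores, key=scores.get)
print(best, scores[best])
```

Note that expected value is only one decision rule; it ignores risk tolerance, which is why the "startup" option can lose here despite its larger upside.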
Apply behavioral economics: recognize cognitive biases like anchoring and loss aversion, understand status quo bias, and design choices accounting for them.
Apply information theory: Entropy measures surprise and uncertainty. High entropy is informative, low is predictable. Remove redundancy, prioritize signal.
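The entropy idea in this snippet is easy to compute directly. A minimal sketch of Shannon entropy in bits, using only the standard library:

```python
import math

def entropy(probs):
    """Shannon entropy in bits: H = -sum(p * log2(p))."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin is maximally uncertain; a heavily biased coin is predictable.
print(entropy([0.5, 0.5]))    # 1.0 bit
print(entropy([0.99, 0.01]))  # ~0.08 bits
```

The predictable source carries almost no information per event, which is exactly why redundancy can be removed during compression.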
Apply network theory: weak ties connect different clusters bringing novel information. Strong ties provide reliable support and trust between close friends.
Make mental models actionable: Test predictions against reality, use them to guide decisions, identify blind spots, and refine through feedback loops.
Apply learning science: Spaced repetition at increasing intervals, retrieval practice testing yourself before reviewing, interleaving topics, elaboration.
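The spaced-repetition idea of reviewing at increasing intervals can be sketched as a toy scheduler. The doubling factor here is a simplification; real systems such as SM-2 also adjust intervals based on recall quality:

```python
from datetime import date, timedelta

def review_schedule(start, first_interval_days=1, factor=2, reviews=5):
    """Toy spaced-repetition schedule: each gap doubles (1, 2, 4, 8... days)."""
    dates, gap, current = [], first_interval_days, start
    for _ in range(reviews):
        current = current + timedelta(days=gap)
        dates.append(current)
        gap *= factor  # widen the gap after each successful review
    return dates

for d in review_schedule(date(2024, 1, 1)):
    print(d)
```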
Apply systems thinking: Map components and connections, identify feedback loops for growth/stability, find leverage points, test interventions, track effects.
Apply game theory: identify game type as zero-sum or positive-sum cooperation, map payoffs for each party, and find Nash equilibrium outcomes.
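Mapping payoffs and finding an equilibrium can be shown concretely for a 2x2 game. The payoff numbers below are the standard illustrative prisoner's dilemma values, not from the text:

```python
# Payoff matrix: (row player, column player). Strategies: 0 = cooperate, 1 = defect.
payoffs = {
    (0, 0): (3, 3), (0, 1): (0, 5),
    (1, 0): (5, 0), (1, 1): (1, 1),
}

def pure_nash(payoffs):
    """A cell is a pure-strategy Nash equilibrium if neither player
    can gain by deviating unilaterally."""
    equilibria = []
    for (r, c), (pr, pc) in payoffs.items():
        row_best = all(pr >= payoffs[(r2, c)][0] for r2 in (0, 1))
        col_best = all(pc >= payoffs[(r, c2)][1] for c2 in (0, 1))
        if row_best and col_best:
            equilibria.append((r, c))
    return equilibria

print(pure_nash(payoffs))  # [(1, 1)]: mutual defection
```

This is a zero-sum-flavored trap inside a positive-sum game: mutual cooperation pays more in total, but defection is each player's individually stable choice.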
Apply communication theory: senders encode messages, receivers decode them with different interpretations. Anticipate misunderstandings by checking meaning.
ML training: Initialize model with random weights, forward pass makes predictions, calculate loss measuring error, backpropagation updates weights, repeat.
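The initialize → forward → loss → update loop described here can be sketched without any ML library, fitting a single weight to y = 2x with plain gradient descent (the data and learning rate are illustrative):

```python
# Minimal training-loop sketch: fit y = 2x with one weight.
data = [(1.0, 2.0), (2.0, 4.0), (3.0, 6.0)]
w = 0.0    # initialize with an arbitrary weight
lr = 0.01  # learning rate

for epoch in range(200):
    grad, loss = 0.0, 0.0
    for x, y in data:
        pred = w * x          # forward pass: make a prediction
        err = pred - y
        loss += err * err     # squared-error loss measures how wrong we are
        grad += 2 * err * x   # gradient of the loss with respect to w
    w -= lr * grad            # update step: move w against the gradient
print(round(w, 3))  # 2.0
```

Real backpropagation does the same thing, but propagates gradients through many layers of weights instead of one.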
Search engines crawl pages by following links, index content by extracting text and metadata, then rank results using algorithms and relevance signals.
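The indexing stage of that crawl → index → rank pipeline can be sketched as a toy inverted index; the page names and text are made up for illustration:

```python
# Toy inverted index: map each word to the set of pages containing it.
pages = {
    "a.html": "systems thinking and feedback loops",
    "b.html": "decision theory under uncertainty",
    "c.html": "feedback and incentives in systems",
}

index = {}
for url, text in pages.items():
    for word in text.split():
        index.setdefault(word, set()).add(url)

def search(query):
    """Return pages containing every query word (no ranking step)."""
    sets = [index.get(w, set()) for w in query.split()]
    return set.intersection(*sets) if sets else set()

print(sorted(search("feedback systems")))  # ['a.html', 'c.html']
```

Ranking would then order this candidate set using relevance signals; the lookup itself is just set intersection over the index.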
Load balancers distribute incoming requests across servers using algorithms like round robin for fairness and least connections for optimal routing.
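Both algorithms named here fit in a few lines; the server names and connection counts are hypothetical:

```python
from itertools import cycle

# Round robin: hand each incoming request to the next server in turn.
servers = ["srv-a", "srv-b", "srv-c"]
next_server = cycle(servers)
assignments = [next(next_server) for _ in range(7)]
print(assignments)
# ['srv-a', 'srv-b', 'srv-c', 'srv-a', 'srv-b', 'srv-c', 'srv-a']

# Least connections: pick the server currently handling the fewest requests.
active = {"srv-a": 2, "srv-b": 0, "srv-c": 1}
target = min(active, key=active.get)
print(target)  # srv-b
```

Round robin is fair when requests cost roughly the same; least connections adapts when some requests are much slower than others.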
Encryption transforms plaintext into ciphertext using algorithms and keys. Intercepted data is useless without the key. Symmetric and asymmetric types exist.
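The plaintext-to-ciphertext transformation can be illustrated with a toy XOR cipher. This is a teaching sketch only, NOT a secure algorithm; it merely shows the symmetric property that one shared key both encrypts and decrypts:

```python
# Toy symmetric cipher (illustration only, NOT secure).
def xor_cipher(data: bytes, key: bytes) -> bytes:
    """XOR each byte of data with the repeating key; applying it twice
    with the same key recovers the original bytes."""
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

plaintext = b"meet at noon"
key = b"secret"
ciphertext = xor_cipher(plaintext, key)
assert ciphertext != plaintext                   # intercepted data is unreadable
assert xor_cipher(ciphertext, key) == plaintext  # same key reverses it
```

Real symmetric ciphers such as AES follow the same shape with far stronger transformations; asymmetric schemes split the key into public and private halves.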
DNS resolution: Browser checks cache, queries recursive resolver like Google DNS, resolver checks cache, then queries root nameservers to find IP addresses.
APIs define software communication contracts. REST APIs use endpoints like /users and HTTP methods like GET for read and POST for create operations.
Transactions treat operations as single units—all succeed or all fail. ACID properties: Atomicity (all-or-nothing), Consistency, Isolation, Durability.
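The all-or-nothing (atomicity) property can be demonstrated with Python's built-in sqlite3 module; the account names and amounts are invented for the sketch:

```python
import sqlite3

# Atomicity sketch: both legs of a transfer commit together or not at all.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE accounts (name TEXT PRIMARY KEY, balance INTEGER)")
conn.executemany("INSERT INTO accounts VALUES (?, ?)",
                 [("alice", 100), ("bob", 50)])
conn.commit()

try:
    with conn:  # opens a transaction; commits on success, rolls back on error
        conn.execute("UPDATE accounts SET balance = balance - 70 "
                     "WHERE name = 'alice'")
        raise RuntimeError("crash mid-transfer")  # simulate a failure
except RuntimeError:
    pass

# The debit was rolled back, so alice still has her full balance.
print(conn.execute("SELECT balance FROM accounts "
                   "WHERE name = 'alice'").fetchone()[0])  # 100
```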
Database indexes use B-tree structures maintaining sorted pointers to rows. Like book indexes, they enable fast lookups without scanning entire tables.
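The "sorted pointers to rows" idea can be sketched with a sorted list and binary search, standing in for the B-tree; the keys and row IDs are made up:

```python
import bisect

# Index sketch: sorted (key, row_id) pairs allow binary search
# instead of scanning every row.
index = sorted([(42, "row-7"), (7, "row-2"), (19, "row-5"), (88, "row-9")])
keys = [k for k, _ in index]

def lookup(key):
    """Binary-search the index; return the row id or None if absent."""
    i = bisect.bisect_left(keys, key)
    if i < len(index) and index[i][0] == key:
        return index[i][1]
    return None

print(lookup(19))  # row-5
print(lookup(50))  # None
```

A real B-tree keeps this sorted structure in shallow, wide pages so lookups touch only a few disk blocks, but the logarithmic-search intuition is the same.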
Containers use Linux namespaces for isolated processes and cgroups for resource limits. Lightweight virtualization with separate filesystems and networks.
Run premortem: Imagine project failed spectacularly, everyone brainstorms why it failed, share scenarios, identify common patterns, create mitigation plans.
Map complex systems: Identify key components, draw connections showing flows and dependencies, mark causal links, find feedback loops, test predictions.
Set stage for learning, gather data on what happened, generate insights through discussion, decide actions, close with commitments and appreciation.
Assess accuracy by verifying facts and cross-checking claims. Check source credibility and expertise. Identify potential biases in presentation.
Keep simple with three to five key metrics. Make actionable so measurement drives improvement. Align with goals avoiding distortion.
Measure current state. Define desired state. Create comparator showing gap between them. Design response actions closing gaps.
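That measure → compare → respond loop is the shape of a simple controller; a thermostat-style sketch with invented numbers:

```python
# Gap-closing loop: compare current state to desired state, respond to the gap.
target = 21.0    # desired state (degrees)
current = 17.0   # measured current state

for step in range(10):
    gap = target - current   # comparator: how far off are we?
    response = 0.5 * gap     # respond in proportion to the gap
    current += response      # acting closes part of the gap
print(round(current, 2))     # approaches 21.0
```

Each pass shrinks the remaining gap by half, so the system converges on the desired state instead of overshooting.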
List criteria for good decisions. Weight importance of each factor. Score options against criteria. Document rationale for future reference.
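Those four steps are a weighted decision matrix, which is short to implement. The criteria, weights, and option scores below are hypothetical:

```python
# Weighted decision matrix sketch: criteria weights sum to 1.0.
criteria = {"price": 0.5, "battery": 0.3, "weight": 0.2}
scores = {   # each option scored 1-10 against each criterion
    "model-x": {"price": 8, "battery": 6, "weight": 7},
    "model-y": {"price": 5, "battery": 9, "weight": 9},
}

def weighted_score(option):
    """Score each option as the weight-adjusted sum over all criteria."""
    return sum(criteria[c] * scores[option][c] for c in criteria)

ranked = sorted(scores, key=weighted_score, reverse=True)
for option in ranked:
    print(option, round(weighted_score(option), 2))
```

Writing the weights and scores down is the "document rationale" step: months later you can see exactly why one option won.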
Identify key components. Map relationships showing how parts connect. Test predictions against reality. Refine based on failures.
Map what's rewarded with bonuses and recognition. Identify what's punished with penalties and criticism. Compare stated versus actual incentives.
Aristotle developed logic and syllogism. Socrates questioned assumptions. Descartes emphasized doubt. Enlightenment valued reason over authority.
Decision theory origins: Bernoulli introduced expected utility in 1738. Von Neumann and Morgenstern developed game theory and axioms of rationality in 1944.
Behavioral economics origins: Simon introduced bounded rationality in the 1950s. Kahneman and Tversky revealed cognitive biases and heuristics in the 1970s.
Scientific management quantified work. Accounting standardized financial measurement. Modern analytics expanded to all aspects of organizational performance.
Argyris and Schon distinguished single-loop learning, which fixes errors, from double-loop learning, which questions underlying assumptions.
Scientific management optimized tasks. Human relations in 1930s emphasized social factors. Systems theory saw organizations as interconnected wholes.
Shannon's mathematical model treats communication as signal transmission. Encoding sends messages. Decoding receives them. Noise creates errors.
Learn systems thinking fundamentals—understanding interconnections, feedback loops, and why problems require holistic approaches beyond linear thinking.
Analyze incentive failures—real cases where reward systems created perverse behavior, gaming, and outcomes opposite to intent.
Practical decision checklist—systematic steps for making better decisions under uncertainty, reducing bias and improving outcomes.
Cognitive revolution in the 1950s rejected behaviorism. Computers provided metaphors. Focus shifted to mental processes and information processing.
Understand incentive mechanics—how rewards and punishments shape behavior, create alignment, and why incentive design matters.
Understand feedback loops—how outputs become inputs, creating reinforcing or balancing cycles that shape system behavior over time.
Compare automated and manual processes—understand when automation is worth the investment, when manual work is better, and the tradeoffs of each approach.
Learn how to systematically identify underlying causes of problems using techniques like Five Whys, fishbone diagrams, and causal mapping.
Trace the evolution of systems thinking from ancient philosophy through cybernetics to modern complexity science and organizational theory.
Learn ethics fundamentals—understanding right and wrong, ethical frameworks, and how to think through moral dilemmas systematically.
Use cognitive load theory to design better learning experiences, study methods, and instructional materials that respect working memory limits.
Explore the technical architecture of Git and distributed version control, from commits and branching to merging and conflict resolution.
Understand decision making fundamentals—from recognizing decisions to evaluating options, learning how good decisions are actually made.
Understand why communication fails—from encoding problems to context mismatches, learning common breakdown points and how to prevent them.
Shannon defined information as reduction of uncertainty. Bit is the fundamental unit. Entropy measures information content and compression limits.
Discover how learning actually works—from memory formation to skill acquisition, understanding evidence-based learning principles.
The Dunning-Kruger effect is a cognitive bias where people with limited knowledge or competence in a domain overestimate their own ability. The less you know, the harder it is to recognize what you're missing. This creates a paradox: incompetence hides itself. As people gain expertise, they become more aware of the gaps in their knowledge.
Confirmation bias is the tendency to search for, interpret, and remember information that confirms pre-existing beliefs while ignoring contradictory evidence. It affects decision-making by creating blind spots: you see what you expect to see rather than what's actually there. This bias is one of the most pervasive obstacles to clear thinking and rational judgment.
Feedback loops are mechanisms where outputs of a system circle back as inputs, creating self-reinforcing (positive feedback) or self-correcting (negative feedback) cycles. Positive loops amplify changes and drive exponential growth or collapse. Negative loops stabilize systems and maintain equilibrium. Understanding feedback loops is essential for systems thinking and recognizing patterns in complex environments.
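Both loop types can be simulated in a few lines; the interest rate, temperatures, and correction factor are illustrative numbers:

```python
# Reinforcing (positive) loop: output feeds back and amplifies itself.
balance = 100.0
for _ in range(10):
    balance += balance * 0.05        # interest becomes part of next period's input
print(round(balance, 2))             # grows exponentially: ~162.89

# Balancing (negative) loop: deviation from a setpoint is pushed back.
temp, setpoint = 30.0, 20.0
for _ in range(10):
    temp -= (temp - setpoint) * 0.5  # correction shrinks the deviation
print(round(temp, 2))                # settles toward 20.0
```

The first loop's gap from zero keeps growing; the second loop's gap from the setpoint keeps shrinking, which is the exponential-growth-versus-equilibrium distinction in miniature.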
Cognitive biases are systematic patterns of deviation from rationality in judgment and decision-making. They're mental shortcuts (heuristics) that help us process information quickly but can lead to errors. They matter because they're predictable, pervasive, and often invisible to the person experiencing them. Awareness of cognitive biases is the first step toward clearer thinking.
Availability bias (or availability heuristic) is the tendency to overestimate the likelihood of events that are easy to recall or imagine. If you can quickly think of examples, you assume they're common. This leads to distorted risk assessment: plane crashes feel more likely than car accidents because they're more memorable and widely covered in media, even though statistically car accidents are far more frequent.
Anchoring bias occurs when people rely too heavily on the first piece of information they encounter (the 'anchor') when making decisions. Even irrelevant numbers can influence subsequent judgments. In negotiations, the first offer sets an anchor that affects all following counteroffers. Awareness of anchoring helps you recognize when initial information is disproportionately influencing your thinking.
The sunk cost fallacy is the tendency to continue investing in something because you've already invested time, money, or effort, even when continuing no longer makes sense. Past costs are irrelevant to future decisions, but psychologically, we feel compelled to justify previous investments. Recognizing sunk costs helps you make decisions based on future value rather than past commitment.
Ready to apply what you've learned? Challenge yourself with interactive questions covering all explainers sub-topics. Choose between practice mode (10 questions with instant feedback) or test mode (20 questions with comprehensive results).