Update wiki/immediate/patterns-and-estimation.md
2bc21cb08a9a harrisonqian 2026-04-13 1 file
index 8eb0e41..827d841 100644
@@ -16,7 +16,7 @@ the actual answer is somewhere around 100-200. we're within a factor of 2, which
estimation is the skill of getting approximately right answers with minimal information. it's arguably more useful than precise calculation because in most real situations, you don't have precise inputs.
-in my math modeling competition work (HiMCM, MCM/ICM), the first step for every problem is estimation. before building a fancy model, you sanity-check: what's the right order of magnitude? if your model predicts that a city needs 50,000 ambulances, something is wrong. if it predicts 50, that might be reasonable. [engineering and modeling](/wiki/stem/engineering-and-modeling) always starts with this kind of gut-check.
+in my math modeling competition work (HiMCM, MCM/ICM), the first step for every problem is estimation. before building a fancy model, you sanity-check: what's the right order of magnitude? if your model predicts that a city needs 50,000 ambulances, something is wrong. if it predicts 50, that might be reasonable. [[engineering-and-modeling|engineering and modeling]] always starts with this kind of gut-check.
## order-of-magnitude reasoning
@@ -43,7 +43,7 @@ humans are pattern-recognition machines. we see faces in clouds, hear words in n
**when it fails**: seeing patterns in random noise. the "hot hand" debate in basketball raged for decades partly because humans are so eager to see streaks in random sequences. financial "technical analysis" finds patterns in stock charts that often aren't really there. conspiracy theories are pattern recognition run amok.
-the antidote is statistical thinking: asking "how likely would this pattern be by chance?" if you flip a coin 100 times, you'll almost certainly get a streak of 6 or 7 heads somewhere. that's not a pattern — it's expected randomness. this connects directly to [[immediate/probability-in-daily-life|probability]] — distinguishing signal from noise is fundamentally a probabilistic question.
+the antidote is statistical thinking: asking "how likely would this pattern be by chance?" if you flip a coin 100 times, you'll almost certainly get a streak of 6 or 7 heads somewhere. that's not a pattern — it's expected randomness. this connects directly to [[probability-in-daily-life|probability]] — distinguishing signal from noise is fundamentally a probabilistic question.
## mental math tricks
@@ -53,12 +53,12 @@ some useful estimation heuristics:
**dimensional analysis**: if your answer has the wrong units, it's wrong. this catches a surprising number of errors. "speed = distance × time" — nope, the units don't work out. must be distance / time.
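the unit check can even be mechanized. a toy sketch (the `Quantity` class is mine, tracking only exponents of meters and seconds):

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Quantity:
    """A value tagged with unit exponents: dims = (meter exp, second exp)."""
    value: float
    dims: tuple

    def __mul__(self, other):
        return Quantity(self.value * other.value,
                        (self.dims[0] + other.dims[0], self.dims[1] + other.dims[1]))

    def __truediv__(self, other):
        return Quantity(self.value / other.value,
                        (self.dims[0] - other.dims[0], self.dims[1] - other.dims[1]))

distance = Quantity(100.0, (1, 0))  # 100 m
time = Quantity(10.0, (0, 1))       # 10 s

speed = distance / time
assert speed.dims == (1, -1)        # m/s — the units work out

wrong = distance * time
assert wrong.dims == (1, 1)         # m·s — not a speed, so the formula is wrong
```

real unit libraries do exactly this bookkeeping; the point is that the wrong formula fails on units alone, before you ever look at the numbers.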
-**anchor and adjust**: start with something you know and adjust. "how tall is that building?" well, each floor is about 3 meters, I count 15 floors, so ~45 meters. you're using [arithmetic](/wiki/immediate/arithmetic-everywhere) and visual estimation together.
+**anchor and adjust**: start with something you know and adjust. "how tall is that building?" well, each floor is about 3 meters, I count 15 floors, so ~45 meters. you're using [[arithmetic-everywhere|arithmetic]] and visual estimation together.
**break it down**: any complex estimation becomes tractable when you decompose it into simpler estimates. this is the core of fermi estimation — you might be wrong on individual factors, but errors tend to cancel when you multiply several rough estimates together.
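the decomposition reads naturally as a chain of multiplications. a sketch using fermi's classic "piano tuners in chicago" question — every number below is a rough guess, and the structure matters more than any individual factor:

```python
# Fermi decomposition: each line is one rough, independently-checkable guess.
population = 3e6                 # people in the city
people_per_household = 2         # rough average
households_with_piano = 1 / 20   # maybe 5% of households
tunings_per_piano_per_year = 1   # once a year
tunings_per_tuner_per_day = 2    # ~2 hours each, plus travel
working_days_per_year = 250

pianos = population / people_per_household * households_with_piano
tunings_needed = pianos * tunings_per_piano_per_year          # demand side
tuner_capacity = tunings_per_tuner_per_day * working_days_per_year  # supply side
tuners = tunings_needed / tuner_capacity

print(round(tuners))  # → 150: tens-to-hundreds, not thousands
```

if any single guess is off by 2x, the answer moves by 2x — but because some guesses run high and others low, the product usually lands within the right order of magnitude.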
## the connection to abstraction
-estimation is where [layer 1](/wiki/immediate/counting-and-measurement) thinking meets [layer 3](/wiki/structural/the-organizational-lens) thinking. the act of estimating forces you to build a mental model: what are the relevant quantities? how do they relate? what's the structure of the problem?
+estimation is where [[counting-and-measurement|layer 1]] thinking meets [[the-organizational-lens|layer 3]] thinking. the act of estimating forces you to build a mental model: what are the relevant quantities? how do they relate? what's the structure of the problem?
-this is exactly what harrison means by "the organizational lens" — you're not just computing, you're *structuring your understanding* of a situation. a good fermi estimate reveals the key parameters of a system, which parameters matter most (sensitivity analysis), and which you can safely ignore. that's mathematical thinking at its most practical. pattern recognition itself is what [[structural/abstraction-as-power|abstraction]] formalizes — noticing that two different situations share the same structure is both the essence of estimation and the essence of abstract mathematics.
\ No newline at end of file
+this is exactly what harrison means by "the organizational lens" — you're not just computing, you're *structuring your understanding* of a situation. a good fermi estimate reveals the key parameters of a system, which parameters matter most (sensitivity analysis), and which you can safely ignore. that's mathematical thinking at its most practical. pattern recognition itself is what [[abstraction-as-power|abstraction]] formalizes — noticing that two different situations share the same structure is both the essence of estimation and the essence of abstract mathematics.
\ No newline at end of file