The Six Laws of System Failure

Pattern Classes

Every exhibit in this museum is an instance of a root failure mechanic. These six classes are the physics beneath the patterns. Learn them, and you can recognize failures you have never seen before.

The language changes. The framework changes. The pattern does not.

Law 0: Katie's Law — Laziness

Every system is shaped by the human drive to do less work. This is not a flaw. It is the economic force that produces all software — and all software failure.

Every pattern in this museum exists because a human made a reasonable choice to do less work. Two-digit years saved two columns on a punch card. String concatenation was faster than learning prepared statements. Trusting the cookie meant not building a token system. The six laws describe how systems fail. Katie's Law describes why humans build them that way. It is the gravity beneath the geology — the force that creates the layers this museum teaches you to read.

Boundary Collapse (6 exhibits)

When data crosses into a system that interprets structure without being constrained or transformed, it becomes executable.

Direct Injection
Reflected Execution
Format Dissolution
Memory Invasion
Object Resurrection
Instruction Confusion
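
The Direct Injection mechanic can be sketched in a few lines. This is a minimal illustration using an in-memory SQLite database with made-up table and column names: concatenated input is parsed as SQL structure and the boundary collapses, while a bound parameter stays data.

```python
import sqlite3

# In-memory database with one user row (table and values are illustrative).
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE users (name TEXT, secret TEXT)")
db.execute("INSERT INTO users VALUES ('alice', 's3cret')")

def find_user_unsafe(name):
    # Data crosses into SQL without being constrained: input becomes structure.
    query = f"SELECT name FROM users WHERE name = '{name}'"
    return db.execute(query).fetchall()

def find_user_safe(name):
    # A parameter placeholder keeps the input as data, never as SQL.
    return db.execute("SELECT name FROM users WHERE name = ?", (name,)).fetchall()

payload = "' OR '1'='1"
print(find_user_unsafe(payload))  # [('alice',)] - input became SQL, every row matches
print(find_user_safe(payload))    # [] - input stayed data, no user has that name
```

The same shape recurs across the other exhibits in this class: only the interpreter changes (shell, template engine, deserializer, LLM prompt).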
Ambient Authority (4 exhibits)

When a system trusts the presence of a credential instead of verifying the intent behind it, authentication becomes indistinguishable from authorization.

Confused Deputy
Predictable Identity
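
The Confused Deputy exhibit can be sketched with a cross-site request forgery, the classic case of ambient authority. The session store and cookie values below are hypothetical; the point is that the browser attaches the cookie to any request, so its presence proves nothing about intent, while a per-session token only appears in pages the application itself served.

```python
import hmac
import secrets

# Hypothetical session store: cookie value -> per-session CSRF token.
SESSIONS = {"cookie-abc": secrets.token_hex(16)}

def transfer_ambient(cookie):
    # Trusts the mere presence of a valid cookie. The browser attaches it
    # to requests triggered by any site: the browser is a confused deputy.
    return cookie in SESSIONS

def transfer_intentful(cookie, csrf_token):
    # Verifies intent: the token is only embedded in pages this app served,
    # so a forged cross-site request cannot supply it.
    expected = SESSIONS.get(cookie)
    return expected is not None and hmac.compare_digest(expected, csrf_token)

# A forged request carries the cookie (the browser adds it) but not the token.
print(transfer_ambient("cookie-abc"))             # True - the forgery succeeds
print(transfer_intentful("cookie-abc", "guess"))  # False - the forgery is blocked
```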
Transitive Trust (4 exhibits)

When a system inherits trust from a source it did not verify, the attack surface extends to everything that source touches.
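
A minimal sketch of the mechanic, using a hypothetical package mirror and made-up artifact names: a system that accepts whatever a trusted source serves inherits every compromise of that source, while pinning a hash out-of-band re-establishes trust at the boundary.

```python
import hashlib

# Hash pinned out-of-band, e.g. in a lockfile (artifact name is illustrative).
PINNED = {"libfoo-1.0": hashlib.sha256(b"official build").hexdigest()}

def install_transitive(name, payload):
    # Trust inherited from the mirror: whatever it serves is accepted,
    # so the attack surface is everything the mirror touches.
    return True

def install_verified(name, payload):
    # Trust re-established here: the artifact must match the pinned hash,
    # not merely arrive from a trusted source.
    return hashlib.sha256(payload).hexdigest() == PINNED.get(name)

tampered = b"official build plus a backdoor"
print(install_transitive("libfoo-1.0", tampered))  # True - compromise propagates
print(install_verified("libfoo-1.0", tampered))    # False - the chain is broken here
```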

Complexity Accretion (6 exhibits)

Systems do not become complex. They accumulate complexity — one reasonable decision at a time — until no single person can hold the whole in their head.

Temporal Coupling (4 exhibits)

Code that assumes sequential execution, stable state, or consistent timing will fail the moment concurrency, scale, or latency proves the assumption wrong.
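
The assumption of sequential execution can be made concrete with a check-then-act race. This sketch uses a barrier to force the worst interleaving deterministically; in production the same window is opened by ordinary scheduling, scale, or latency. Note that making the act atomic with a lock does not help, because the check it relies on is already stale.

```python
import threading

balance = 100
act_lock = threading.Lock()
barrier = threading.Barrier(2)  # forces the worst interleaving, deterministically

def withdraw_racy(amount):
    # Check-then-act: assumes nothing happens between the check and the act.
    global balance
    if balance >= amount:        # check
        barrier.wait()           # both threads pass the check before either acts
        with act_lock:
            balance -= amount    # act: atomic, but based on a stale check

threads = [threading.Thread(target=withdraw_racy, args=(100,)) for _ in range(2)]
for t in threads:
    t.start()
for t in threads:
    t.join()

print(balance)  # -100: both withdrawals saw "enough" funds
```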

Observer Interference (2 exhibits)

When the system that monitors health becomes a participant in the system it monitors, observation becomes a failure vector.
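
A toy sketch of the mechanic, with a hypothetical connection pool: a health probe that consumes a real connection and leaks it is not an observer but a participant, and once it exhausts the pool, real traffic fails and so does the next probe.

```python
class Pool:
    # Hypothetical fixed-size connection pool.
    def __init__(self, size):
        self.free = size

    def acquire(self):
        if self.free == 0:
            raise RuntimeError("pool exhausted")
        self.free -= 1

    def release(self):
        self.free += 1

def health_check(pool):
    # The probe takes a real connection: observation consumes capacity.
    pool.acquire()
    return "ok"  # never released: a leaky observer

pool = Pool(size=2)
health_check(pool)
health_check(pool)
print(pool.free)  # 0: the monitor alone has exhausted the resource it monitors
```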

Pattern Relationships

Patterns do not exist in isolation. Each connection reveals how one failure enables, evolves into, or mirrors another.

Cross-Domain Analog (9)
AI Bridge (4)
Enables (10)
Precursor (3)
The Insight

Mitigation does not remove a pattern. It relocates it. Each fix reduces exposure but introduces new assumptions. When those assumptions fail, the pattern re-emerges — often in a context the mitigation was not designed for. The AI era does not create new failure modes. It provides new execution contexts for the same six laws.