Origin
Alan Turing, 1936 ("On Computable Numbers"). Alonzo Church, 1936 (lambda calculus). Kurt Gödel, 1931 (incompleteness theorems). Stephen Cook, 1971 (NP-completeness). Andrey Kolmogorov, 1965 (algorithmic complexity).
The Principle
There are problems that no computer can solve. Not because our computers are too slow, not because our algorithms are unoptimized, not because we haven't tried hard enough — but because mathematical proof demonstrates that no algorithm, running on any machine, for any amount of time, can produce the correct answer for all inputs.
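The canonical example is the Halting Problem, and its impossibility proof is short enough to sketch in code. The following is a minimal illustration of Turing's diagonal argument, not a real verifier; the names `refute` and `paradox` are illustrative, and the candidate checkers passed in are deliberately trivial, since the point is that any checker fails:

```python
def refute(halts):
    """Given any candidate halt-checker halts(program) -> bool,
    construct a program that does the opposite of its prediction."""
    def paradox():
        # Ask the checker about ourselves, then contradict it.
        if halts(paradox):
            while True:      # checker said "halts" -> loop forever
                pass
        return               # checker said "loops" -> halt immediately

    if halts(paradox):
        # The checker predicts paradox halts; by construction it would
        # loop forever. We can't run it to completion, but the branch
        # above makes the contradiction explicit.
        return "wrong: predicted halt, but paradox loops forever"
    else:
        paradox()  # predicted to loop, yet this call returns
        return "wrong: predicted loop, but paradox halted"

# Any halt-checker, however it answers, is refuted:
print(refute(lambda p: True))   # wrong: predicted halt, but paradox loops forever
print(refute(lambda p: False))  # wrong: predicted loop, but paradox halted
```

A real `halts` would be a full program analyzer rather than a lambda, but the argument is indifferent to how clever it is: feed it `paradox` and it must answer incorrectly.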
There are problems that computers can solve but not quickly. The time required grows so fast with input size that the universe will end before the computation finishes. These are not engineering limitations. They are structural properties of the problems themselves.
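The arithmetic behind "the universe will end first" is easy to check. A back-of-envelope sketch, using round illustrative numbers (one billion checks per second, a universe age of roughly 13.8 billion years) against a brute-force search over 2^n candidates:

```python
SECONDS_PER_YEAR = 3.15e7      # ~365.25 days
UNIVERSE_AGE_YEARS = 1.38e10   # ~13.8 billion years
CHECKS_PER_SECOND = 1e9        # one billion candidates tested per second

def brute_force_years(n):
    """Years to enumerate all 2**n candidates at CHECKS_PER_SECOND."""
    return 2**n / CHECKS_PER_SECOND / SECONDS_PER_YEAR

for n in (40, 60, 80, 128):
    years = brute_force_years(n)
    print(f"n={n:3d}: {years:.2e} years "
          f"({years / UNIVERSE_AGE_YEARS:.1e} universe lifetimes)")
```

At n = 40 the search finishes in under an hour; at n = 60 it takes decades; at n = 128 it takes on the order of 10^22 years, hundreds of billions of universe lifetimes. Speeding the hardware up a millionfold only moves the wall a few values of n further out.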
Computability theory draws the line between possible and impossible. Complexity theory draws the line between feasible and infeasible. Every software engineer works within these lines, whether they know it or not.
Why This Is Bedrock
The Halting Problem means you cannot build a perfect bug detector: no program can decide, for all programs and all inputs, whether execution will ever finish. Exponential key spaces mean you cannot break modern encryption by brute force, and P vs NP is the open question of whether any shortcut around brute force exists. Big-O notation means your algorithm's performance at scale is determined by its mathematical structure, not by how fast your hardware is. These are not opinions. They are mathematics. They constrain every piece of software ever written, and ignoring them does not make them go away.
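The hardware-independence of Big-O can be made concrete by counting operations rather than seconds. A small sketch (the inputs sizes are arbitrary, chosen only to show the trend):

```python
import math

# Operation counts for two complexity classes at growing input sizes.
# Faster hardware shrinks the constant factor; it never changes the shape.
for n in (1_000, 1_000_000, 1_000_000_000):
    print(f"n={n:>13,}  n log n = {n * math.log2(n):.1e}  n^2 = {n**2:.1e}")

# At n = 10^9, the quadratic algorithm needs tens of millions of times
# more operations than the linearithmic one, and the gap keeps widening.
ratio = (10**9) ** 2 / (10**9 * math.log2(10**9))
print(f"n^2 / (n log n) at n = 1e9: {ratio:.1e}")
```

This is why an O(n log n) sort on a laptop beats an O(n^2) sort on a supercomputer once n is large enough: a constant-factor hardware advantage is eventually swallowed by a ratio that grows without bound.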
This pattern has been found in applications built by talented developers at respected organizations across every decade of software history. Its presence in a codebase is not a reflection of the developer who wrote it — it is a reflection of what that developer was taught, what tools they had, and the path that was easiest given what they were taught. The goal is not to find fault. The goal is to find the pattern — before it finds you.
Katie's Law: The developers were not wrong. The shortcut was not wrong. The context changed and the shortcut didn't.