Code that assumes sequential execution, stable state, or consistent timing will fail the moment concurrency, scale, or latency proves the assumption wrong.
Operations that work at small scale hit non-linear thresholds at production volume.
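One common shape of that threshold, sketched here with a hypothetical de-duplication routine (the function names and data are illustrative, not from any particular system): a linear scan hidden inside a loop is invisible at test size and quadratic at production size.

```python
def dedupe_slow(items):
    # `x not in seen` scans the whole list on every iteration:
    # O(n) per item, O(n^2) total. Fine for 100 records,
    # pathological for 10 million.
    seen = []
    out = []
    for x in items:
        if x not in seen:
            seen.append(x)
            out.append(x)
    return out

def dedupe_fast(items):
    # A set turns the membership check into an O(1) hash lookup,
    # making the whole pass O(n). Same result, different curve.
    seen = set()
    out = []
    for x in items:
        if x not in seen:
            seen.add(x)
            out.append(x)
    return out
```

Both functions return identical output on any input; only the growth rate differs, which is exactly why the slow one survives code review and testing.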
If a system checks a condition and then acts on it without holding a lock or using an atomic operation — if code that works on small data fails on large data — if behavior changes under load — the system is temporally coupled to assumptions about sequencing, scale, or speed.
Every system that operates across time — concurrent threads, distributed nodes, growing datasets, eventual consistency — contains temporal assumptions. The more distributed the system, the more assumptions it makes about time, and the more ways those assumptions can fail.
Both are temporal coupling failures — one assumes the century won't change, the other assumes the dataset won't grow. Same pattern: the code is correct for the present, fatal for the future.
Both are fixed-format data failures from the punch card era. Two-digit years lose century information. Unquoted fields lose delimiter boundaries. Same constraint: 80 columns, every character counts.
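The century loss is still reproducible with a modern standard library. Any parser handed a two-digit year has to guess the century; Python's `strptime` uses the POSIX pivot (00–68 becomes 20xx, 69–99 becomes 19xx), so a record written in 1958 parses a century wrong while one written in 1971 happens to parse correctly:

```python
from datetime import datetime

# Two-digit years force the parser to guess the century.
# CPython's %y directive maps 69-99 to 19xx and 00-68 to 20xx.
assert datetime.strptime("71", "%y").year == 1971  # 1971 record: lucky
assert datetime.strptime("58", "%y").year == 2058  # 1958 record: silently wrong
```

No exception is raised in either case — the data is well-formed, the parse succeeds, and the error surfaces only when someone compares the result against reality.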
Year
1950–1965
Context
Computers ran on punch cards. An IBM 80-column card was the universal storage medium — 80 characters per record, and every character was expensive. Storage was measured in kilobytes. Programmers optimized for space the way embedded developers today optimize for microseconds. When a program needed to store a year, the century was obvious — it was the twentieth. The first two digits were waste. Every system used two digits: 58, 63, 71.
Who Built This
Every programmer alive. COBOL programs at banks, FORTRAN programs at research labs, RPG programs on IBM System/3. This was not a pattern that propagated through tutorials. It was a constraint imposed by the physical medium. An 80-column punch card storing a date as MM/DD/YYYY used 10 columns — 12.5% of the entire record on a date field. MM/DD/YY used 8. Those two columns held payroll data.
Threat Model at Time
There was no threat model. The concern was fitting data onto a card. The year 2000 was further away than the entire history of commercial computing. No one writing COBOL in 1960 expected their code to still be running in 1999. They were wrong by forty years.
Why It Made Sense
Storage was physical and finite. Every byte had a cost measured in cardboard, tape, and core memory. Two-digit years were not a shortcut — they were the only rational choice given the constraints. The implicit assumption — that the century was always 19 — was true at the time and would remain true for decades. The assumption was correct. It was also temporary, and no one marked its expiration date.
This pattern has been found in applications built by talented developers at respected organizations across every decade of software history. Its presence in a codebase is not a reflection of the developer who wrote it — it is a reflection of what that developer was taught, the tools they had, and the path of least resistance those conditions created. The goal is not to find fault. The goal is to find the pattern — before it finds you.
Katie's Law: The developers were not wrong. The shortcut was not wrong. The context changed and the shortcut didn't.