When data crosses into a system that interprets structure without being constrained or transformed, it becomes executable.
Input writes past allocated boundaries, overwriting adjacent memory (control flow, return addresses).
If user input appears inside a query, command, template, object stream, or prompt without an intermediate representation that separates data from structure — you are not passing data. You are modifying structure.
Every new execution context recreates this pattern. SQL gave way to NoSQL, HTML to templates, cookies to JWTs, forms to APIs, prompts to agents. The language changes. The failure does not.
Buffer overflow was the original boundary collapse (1970s memory). SQL injection is the same mechanic in a higher-level execution context (1990s web).
Without memory protection, buffer overflows don't just corrupt local state — they can overwrite any memory in the system.
Year
1972–present
Context
C gave programmers direct access to memory. It was fast, efficient, and trusted the programmer completely. Functions like strcpy, gets, and sprintf copied data into fixed-size buffers with no bounds checking. If the data was larger than the buffer, it kept writing — overwriting the return address on the stack, local variables, and anything else in its path. The programmer was responsible for checking sizes. The programmer often didn't.
Who Built This
Systems programmers writing operating systems, network daemons, and server applications. C was the language of infrastructure. Every UNIX utility, every network service, every kernel module was written in C. The buffer overflow was not a mistake in one program — it was a structural property of the language.
Threat Model at Time
In the 1970s, there was no threat model. Computers were shared among trusted users. By the late 1980s, networks connected untrusted users to services written in C. The threat model changed. The code didn't.
Why It Made Sense
C's lack of bounds checking was a performance feature. Every bounds check is a branch instruction. On 1970s hardware, branches were expensive. Trusting the programmer to get the size right was a rational trade-off when every CPU cycle mattered and the adversary was physics, not hackers.
This pattern has been found in applications built by talented developers at respected organizations across every decade of software history. Its presence in a codebase is not a reflection of the developer who wrote it; it is a reflection of what that developer was taught, the tools they had, and the path that was easiest under those conditions. The goal is not to find fault. The goal is to find the pattern before it finds you.
Katie's Law: The developers were not wrong. The shortcut was not wrong. The context changed and the shortcut didn't.