“My goal wasn't to make a ton of money. It was to build good computers.”
The Story
In 1976, Steve Wozniak hand-built a computer on a single circuit board, typed a BASIC interpreter into it from memory, and showed it to his friend Steve Jobs. Jobs wanted to sell it. Wozniak wanted to give the schematics away at the Homebrew Computer Club. Jobs won that argument, and Apple Computer was born — but the tension between those two instincts defines everything that followed.
The Apple I was impressive. The Apple II was a revolution.
Wozniak designed the Apple II's entire architecture: the processor bus, the memory system, the video output, the sound, the expansion slots, the ROM firmware. He wrote Integer BASIC — a full programming language implementation — by hand, in machine code, on paper, then keyed it into the machine byte by byte. No assembler. No compiler. Just a man who could hold an entire system in his head and translate between logic and silicon without tooling.
Then came the floppy disk controller. In 1978, Wozniak designed the Disk II controller card for the Apple II. Existing floppy controllers used 50 to 60 chips. Wozniak's design used 8. He replaced complex hardware state machines with software running on the CPU, shifting work from expensive silicon to cheap code. The design was so minimal that other engineers at Apple struggled to understand how it worked — it exploited timing characteristics of the hardware that weren't in any datasheet. It was, by any engineering standard, a masterpiece of constraint-driven design.
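The core move — letting software shift registers and software timing do what dedicated chips did before — can be illustrated with a small sketch. This is not Wozniak's actual Disk II code; it is a hypothetical C model of one job his state machine performed: framing a serial bit stream into bytes, using the Apple II disk convention that every valid disk byte has its high bit set.

```c
/* Illustrative sketch only, not the real Disk II firmware: a software
 * shift register doing the byte-framing that earlier controllers
 * implemented with dedicated hardware. */
#include <stdint.h>

typedef struct {
    uint8_t shift;   /* software shift register, replaces a hardware one */
} soft_framer;

/* Feed one bit from the read head (MSB first). Returns 1 and stores the
 * framed byte in *out once the high bit reaches bit 7 -- the Apple II
 * convention that marks a complete disk byte. */
static int framer_feed(soft_framer *f, int bit, uint8_t *out)
{
    f->shift = (uint8_t)((f->shift << 1) | (bit & 1));
    if (f->shift & 0x80) {   /* high bit set: byte complete */
        *out = f->shift;
        f->shift = 0;        /* reset for the next byte */
        return 1;
    }
    return 0;
}
```

A hardware version of this loop is a shift register plus comparator logic — several chips. In software it is a few instructions, which is exactly the economics the paragraph above describes.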
The Apple II shipped in 1977 and stayed in production until 1993. Sixteen years. It launched the personal computing industry, created the home software market, put VisiCalc (the first spreadsheet program for personal computers) into businesses, and brought programming to a generation of engineers who would go on to build the internet.
Wozniak's direct involvement in Apple's engineering ended after a plane crash in 1981 and his subsequent departure from day-to-day operations. But his design philosophy — do more with less, understand every layer, make the hardware and software work as one system — echoed through decades of engineering culture.
Why They're in the Hall
Wozniak is the founding archetype that TechnicalDepth is built to honor: the Pioneer and Builder who designed the actual system while someone else got the magazine covers.
The Woz-and-Jobs duality is the creation myth of Silicon Valley, and it established a pattern that has repeated for fifty years: the builder and the seller, the engineer and the executive, the person who made the thing and the person who made the thing famous. TechnicalDepth exists because the builder's story is consistently undertold. Wozniak is Patient Zero for that phenomenon.
His engineering approach is a case study in every constraint that shaped early software. When memory was measured in kilobytes, every byte mattered. When chips cost real money, minimizing chip count was a design objective, not an aesthetic preference. Wozniak's floppy disk controller didn't use 8 chips because he was showing off. It used 8 chips because that's what the economics and physics demanded, and he was skilled enough to meet that demand.
The techniques he used — hand-assembled firmware, hardware-software co-design, exploiting timing margins, replacing hardware state machines with software — are the ancestors of every modern embedded systems practice. When a firmware engineer today writes a bit-banged protocol handler or a DMA controller driver, they're working in a tradition Wozniak helped establish.
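To make "bit-banged protocol handler" concrete, here is a minimal sketch of the technique: software generating a serial waveform bit by bit, the way firmware does when no UART peripheral is available. The GPIO layer is hypothetical — levels are recorded into an array here so the framing logic can run anywhere; real firmware would replace the array write with a pin write plus a cycle-accurate delay.

```c
/* Minimal bit-banged 8N1 UART transmit framing. Hypothetical example:
 * output levels go into a buffer instead of a real GPIO pin. */
#include <stdint.h>
#include <stddef.h>

#define FRAME_BITS 10   /* 1 start bit + 8 data bits + 1 stop bit */

/* Emit one 8N1 frame for `byte` into levels[] (0 = line low, 1 = line high).
 * Data bits go out LSB first, per the UART convention. */
static void uart_bitbang_frame(uint8_t byte, int levels[FRAME_BITS])
{
    size_t n = 0;
    levels[n++] = 0;                        /* start bit: pull the line low */
    for (int i = 0; i < 8; i++)
        levels[n++] = (byte >> i) & 1;      /* data bits, LSB first */
    levels[n++] = 1;                        /* stop bit: line idles high */
}
```

The tradition the paragraph describes is visible in the shape of the code: the protocol's timing and framing live entirely in software, and the "hardware" is reduced to a single output line.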
He's also a figure of unusual moral clarity in an industry that often lacks it. He gave away schematics. He funded public education. He has consistently said that engineering, not wealth, was the point. His famous line — that his goal was to build good computers — isn't false modesty. It's a statement of values that TechnicalDepth shares: the work matters more than the valuation.
In a museum dedicated to the patterns that make software succeed or fail, Wozniak represents the most fundamental pattern of all: deep understanding of every layer of the system you're building, from the transistor to the user interface. The Apple II worked because one person understood all of it. That kind of full-stack comprehension is rare now. TechnicalDepth argues it shouldn't be.
