
Sakana AI

@sakanaai

The Lab That Asked What Nature Would Build

2020s · 3 min read
We believe the path to truly intelligent systems lies not in simply scaling up existing architectures, but in learning from the principles that govern intelligence in nature.

The Story

Sakana AI was founded in Tokyo in 2023 by David Ha and Llion Jones — two figures whose credentials represent the full arc of modern deep learning. Ha led Google Brain's research team in Tokyo; Google Brain was one of the institutions that defined the scaling era. Jones is one of the eight authors of "Attention Is All You Need," the 2017 paper that introduced the transformer architecture and whose citation count is now effectively the citation count of the entire modern AI industry.

They left to ask a different question.

The scaling paradigm — more parameters, more data, more compute — has been extraordinarily productive. It has also concentrated AI research within a small number of institutions capable of financing the infrastructure. Sakana's founding thesis is that nature found a different path: intelligence emerged not from single massive systems but from collections of simpler agents, evolutionary selection, and emergent collective behavior. Schools of fish. Ant colonies. Immune systems. The name Sakana means "fish" in Japanese.
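The nature-inspired thesis can be made concrete with a toy evolutionary loop — a minimal sketch of selection and mutation over a population of simple candidates, not Sakana's actual method. All names and the toy objective are illustrative:

```python
import random

def fitness(candidate):
    # Toy objective: count of 1-bits. Stands in for any task score.
    return sum(candidate)

def evolve(pop_size=20, genome_len=16, generations=50, seed=0):
    rng = random.Random(seed)
    # A population of random bit strings: many simple agents, no gradients.
    population = [[rng.randint(0, 1) for _ in range(genome_len)]
                  for _ in range(pop_size)]
    for _ in range(generations):
        # Selection: keep the fitter half of the population (elitism).
        population.sort(key=fitness, reverse=True)
        survivors = population[: pop_size // 2]
        # Variation: each survivor spawns a child with one bit flipped.
        children = []
        for parent in survivors:
            child = parent[:]
            child[rng.randrange(genome_len)] ^= 1
            children.append(child)
        population = survivors + children
    return max(population, key=fitness)

best = evolve()
print(fitness(best))  # elitism makes the best score monotone; typically at or near 16
```

No single candidate is designed; the population discovers good solutions through repeated selection — the same logic, at vastly larger scale, behind the schools-of-fish framing.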

The most significant early result was the AI Scientist — a system that autonomously identifies research directions, runs experiments, analyzes results, writes papers, and performs peer review. It produced functional machine learning research papers end-to-end without human intervention. The quality is uneven and the papers require human review before publication. The implication — that the research pipeline itself may be automatable — is not.
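The closed loop described above can be sketched as a pipeline of model calls. This is a hypothetical skeleton under stated assumptions — the function names, prompts, and stubs are illustrative, not Sakana's code or API:

```python
def stub_llm(prompt: str) -> str:
    # Stand-in for a real language-model call.
    return f"<response to: {prompt[:40]}...>"

def run_experiment(script: str) -> str:
    # Stand-in for sandboxed execution of a generated experiment script.
    return "<metrics>"

def research_cycle(topic: str, llm=stub_llm) -> dict:
    idea = llm(f"Propose a testable ML research idea about {topic}.")
    script = llm(f"Write an experiment script that tests: {idea}")
    results = run_experiment(script)
    analysis = llm(f"Analyze these experimental results: {results}")
    draft = llm(f"Write a paper draft from: {idea} / {analysis}")
    review = llm(f"Critically peer-review this draft: {draft}")
    # The review can seed the next cycle's ideas, closing the loop.
    return {"idea": idea, "draft": draft, "review": review}
```

Every stage is a generate-and-check step; the human's role shrinks to gating what leaves the loop.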

Why They're in the Hall

Sakana belongs in the Hall as the most coherent counter-thesis to the dominant paradigm of the current era. While most major labs compete on parameter count, Sakana competes on architectural philosophy.

The evolutionary approach has a specific relevance to TechnicalDepth's core concern: systems that emerge from selection processes rather than explicit design tend to be more robust to novel inputs and less brittle under distribution shift. They also tend to be less interpretable, which is its own failure class. Sakana is running an experiment in whether the benefits outweigh the costs.

The Pattern

Sakana is a Pioneer operating at the boundary of the known design space. The AI Scientist is simultaneously a demonstration that research automation works and a case study in what happens when the Observer (the researcher) becomes part of the system being studied. A lab that uses AI to generate its own research agenda is in a feedback loop that has not been run long enough to know where it leads.

The transformer was invented by eight people. One of them left to see if nature could do it better.