
Google DeepMind

@googledeepmind

The Lab That Solved the Protein Problem

2020s · 2 min read
AlphaFold is the most significant achievement we've made.

The Story

Google DeepMind's lineage contains two of the most significant technical achievements in the history of AI: AlphaGo (2016), which defeated the world Go champion using a combination of tree search and deep learning in a game that experts said would not fall to AI for decades; and AlphaFold 2 (2020), which solved the protein folding problem — predicting a protein's 3D structure from its amino acid sequence — a challenge that had been open for fifty years and was considered one of biology's grand unsolved problems.

The Google Brain team, which merged with DeepMind in 2023 to form Google DeepMind, is also responsible for the Transformer architecture (Vaswani et al., "Attention Is All You Need," 2017): the paper on which every modern large language model is built, including GPT-4, Claude, Gemini, and LLaMA. The attention mechanism described in that paper is the substrate of the AI era.
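For readers who want to see what that substrate actually is, here is a minimal NumPy sketch of the scaled dot-product attention operation the paper introduced, softmax(QKᵀ/√d_k)V. The shapes and random inputs are illustrative, not taken from any real model.

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """softmax(Q K^T / sqrt(d_k)) V, the core operation of the Transformer."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                  # pairwise query-key similarities
    scores -= scores.max(axis=-1, keepdims=True)     # numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)   # row-wise softmax
    return weights @ V                               # weighted sum of value vectors

# Toy example: 3 tokens, each embedded in 4 dimensions.
rng = np.random.default_rng(0)
Q = rng.standard_normal((3, 4))
K = rng.standard_normal((3, 4))
V = rng.standard_normal((3, 4))
out = scaled_dot_product_attention(Q, K, V)
print(out.shape)  # (3, 4): one contextualized vector per token
```

Every token's output is a mixture of all tokens' values, weighted by learned relevance; stacking this operation is what lets the architecture scale from translation to general language capability.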

Why They're in the Hall

DeepMind belongs in the Hall for AlphaFold alone. The protein folding solution is not a product milestone. It is the kind of scientific contribution that opens entire research domains. Biologists now have access to predicted structures for 200 million proteins. Structure determinations that once required years of crystallography now take hours of inference.

The exhibit also includes Gemini, and the museum cannot ignore the Gemini image generation pause of 2024, when Google's multimodal model applied diversity corrections to historical prompts and produced absurd outputs: diverse groups of people in historical contexts where such diversity was not present, while other groups failed to generate accurately. The pause lasted six days. The failure was a Boundary Collapse at the RLHF layer: a correction designed to reduce historical bias overfit into a correction that introduced factual inaccuracy.

The Transformer Legacy

The authors of "Attention Is All You Need" did not foresee ChatGPT. The paper described a translation model. The recursive consequence — that the architecture would scale to general language capabilities — was an emergent property that nobody predicted from the architecture alone. This is the Complexity Accretion pattern applied to research: each incremental scaling experiment was rational and well-documented. The aggregate was not predicted.

AlphaFold is what happens when laziness is constructive: we built a machine to do the biology we didn't want to do by hand. The machine did fifty years of it in an afternoon.