The Soviet Oko satellite early-warning system misinterpreted sunlight reflecting off high-altitude clouds above a US missile base as five missile launches. The software was designed to detect launches but lacked filtering for atmospheric optical phenomena.
Petrov's decision was not officially recognized for years: he was neither rewarded nor punished, and the incident remained classified until 1998. Petrov received international recognition late in life and died in 2017. The incident is the most consequential software false positive in human history.
The Incident
On September 26, 1983 — three weeks after the Soviet Union shot down Korean Air Lines Flight 007, killing 269 people — tensions between the United States and the Soviet Union were at their highest point in decades. At 00:14 Moscow time, the Soviet Union's Oko nuclear early-warning satellite system reported that an intercontinental ballistic missile had been launched from the United States toward the Soviet Union.
Lt. Col. Stanislav Petrov was the duty officer at the Serpukhov-15 command center. His job was to monitor the satellite data and, if a launch was confirmed, report it to his superiors — who would then have minutes to decide whether to launch a retaliatory nuclear strike.
Within minutes, the system reported four more launches. Five missiles, incoming.
The Decision
Petrov did not report the launches as confirmed. He judged the detection to be a false alarm and reported a system malfunction instead. His reasoning was not based on the data the system was showing him — the data said five missiles were incoming. His reasoning was based on what the system couldn't tell him:
1. A genuine US first strike would involve hundreds or thousands of missiles, not five.
2. The satellite system was new and known to have reliability issues.
3. Ground-based radar had not confirmed any launches.
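Petrov's reasoning amounts to base-rate reasoning: an alarm from a failure-prone sensor, reporting an event with an extremely low prior probability, should barely move the estimate. A minimal sketch of that logic using Bayes' rule (every probability below is an invented, illustrative assumption, not a historical value, and `posterior_attack` is a hypothetical helper):

```python
# Illustrative only: Bayes' rule applied to the reasoning described above.
# All numbers are invented assumptions, not historical data.

def posterior_attack(prior_attack: float,
                     p_alarm_given_attack: float,
                     p_alarm_given_no_attack: float) -> float:
    """P(attack | alarm) via Bayes' rule."""
    p_alarm = (p_alarm_given_attack * prior_attack
               + p_alarm_given_no_attack * (1 - prior_attack))
    return p_alarm_given_attack * prior_attack / p_alarm

# Assumed values: a genuine first strike is extraordinarily unlikely a
# priori, and a new, unproven satellite system false-alarms fairly often.
p = posterior_attack(
    prior_attack=1e-6,            # assumed prior for a real strike
    p_alarm_given_attack=0.9,     # assumed sensor sensitivity
    p_alarm_given_no_attack=0.01, # assumed false-alarm rate
)
print(f"P(attack | alarm) = {p:.4%}")  # remains tiny despite the alarm
```

Under these assumed numbers the posterior stays below 0.01%: the alarm is far more likely to be one of the sensor's routine false positives than evidence of an actual launch, which is the shape of the judgment Petrov made informally.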
Petrov was correct. The satellite had misinterpreted sunlight reflecting off high-altitude clouds above a US missile base as the infrared signature of missile launches.
Why It Matters
A software false positive nearly triggered nuclear war. The system performed exactly as designed — it detected an infrared signature consistent with a missile launch and reported it. The system was not wrong about what it saw. It was wrong about what that meant. The difference between detection and interpretation is the difference between data and judgment. On September 26, 1983, one person's judgment — informed by skepticism of software output — prevented the deaths of millions.
This is the most consequential software bug in human history. Not because of what happened, but because of what didn't.