In the United States, the military services have played a seminal role in advancing and shaping developments in computer science, from the earliest days to the present. The ENIAC, completed in 1946, was the first large-scale, electronic, digital (though not stored-program) computer in the United States and was used to produce ballistics tables and refine hydrogen bomb designs.

IBM’s first entry into the commercial computer market, the Model 701 (also known as the “Defense Calculator”), was delivered just four years later, mainly in response to the start of the Korean War.

A third pole of innovation was emerging at universities, where sophisticated research machines would be built over the next decade. All three strands of military-oriented computing (government-sponsored, commercially produced, and academically guided) had the Cold War as their backdrop.

As the demands of an ever-more elaborate scientific and technical environment impinged on both Cold War rivals, the United States pursued digital technology with a vigor unmatched by any other nation. Demand for supercomputers, the highest-performance machines (a class exemplified by the CDC 6600, introduced in 1964), was almost entirely driven by the Cold War and made the U.S. the world leader in high-performance computing for decades.