Key Technological Trends Since World War Two
Issue Brief
In the years after World War Two, the industrial West simultaneously experienced three technological revolutions — a nuclear revolution, a biomedical revolution, and a computing revolution.
The interaction between these three hugely important developments largely defines the world of technology as we know it today.
All three postwar revolutions trace their origins to the breakthroughs in quantum theory achieved by people like Werner Heisenberg and Erwin Schroedinger in the 1920’s.
Quantum theory is basically a set of propositions about the nature of matter, focusing on the behavior of atomic and subatomic particles.
But because it seeks to explain the basic building blocks of the universe, it potentially has relevance to every other scientific field.
For example, even though Schroedinger was a physicist, he wrote a hugely influential book in 1944 entitled “What is Life?” that predicted the existence of molecule-based genetic codes for all life, based on the insights he had developed about other forms of matter in his work on quantum mechanics.
And because quantum theory generated critical insights into the behavior of electromagnetic forces, it provided much of the scientific foundation for breakthroughs in semiconductors, computing and networking technologies.
There’s an article in the New York Times today describing how quantum theory is contributing to the development of integrated circuits with sub-micron dimensions.
But of course, it is the nuclear revolution with which quantum physics is most closely associated in the popular mind.
I told you last time that I didn’t want to spend much time on nuclear technology because the story is well known and the technology itself is mature rather than emerging.
But to briefly recount, in 1945 the six-year Manhattan Project culminated in the use of two fission devices on the Japanese cities of Hiroshima and Nagasaki, killing between 68,000 and 140,000 people at Hiroshima, and between 38,000 and 70,000 people at Nagasaki.
In 1949 the Soviet Union detonated its own fission device, triggering a race to develop the first fusion, or thermonuclear, device.
Fission devices have a maximum potential yield of about 500 kilotons, but there is no limit to the explosive power of fusion devices, so when the U.S. and the Soviet Union tested H-bombs — as they were called — in 1952 and 1953, respectively, it was a very ominous development.
The following year the U.S. launched the first nuclear-powered submarine, revolutionizing undersea warfare by providing a virtually inexhaustible supply of energy that required refueling only every two decades or so.
Previous diesel-electric subs had depended on diesel engines to recharge electric batteries, which created a continuous requirement to resurface for oxygen to power the engines.
The advent of nuclear power meant subs could operate at high speed beneath the seas for weeks or even months at a time, without fear of exhausting their fuel.
But it was a different development that captured the popular imagination in the late 1950’s — the Soviet launch of Sputnik in 1957, the Earth’s first artificial satellite.
Sputnik itself was a very simple device; its real significance lay in the use of a powerful booster to launch it into orbit — precisely the sort of booster needed to deliver a nuclear warhead over intercontinental distances.
As a result of this unexpected leap forward in the Soviet space program, U.S. defense policy for the rest of the Eisenhower Administration was driven by fears that a “missile gap” might emerge favoring the Soviet Union.
No such gap in fact emerged — the U.S. fairly quickly gained a lead in missile and space-launch technology that it did not relinquish for a generation.
That lead not only allowed America to dominate the global marketplace for communications satellites, but also enabled it to land men on the Moon for the first time in 1969.
You know how the nuclear revolution turned out.
By 1980 both sides had equipped their increasingly sophisticated missiles, in silos on land and in submarines beneath the sea, with thousands of multiple independently targetable reentry vehicles — or “MIRV’s.”
The proliferation of so much destructive power overturned traditional concepts of warfighting and defense, substituting an uneasy but seemingly inescapable “balance of terror.”
After the collapse of communism in the early 1990’s, the long process of climbing down from the nuclear precipice commenced.
U.S. intelligence now estimates that by the end of the decade, the Russians will only have about a thousand usable intercontinental-range warheads left, due to arms control reductions and low investment in upkeep.
Unfortunately, there are still thousands of strategic and tactical nuclear warheads stored in various parts of the old Soviet Union, raising the very real possibility that the last act of the nuclear drama remains to be played.
While the nuclear revolution was running its course, a second revolution got under way in large part to support the defense establishment.
That was the revolution in computers, which traces its origins to ENIAC, the first general-purpose electronic digital computer, which began operating in 1946.
ENIAC was originally developed for Army Ordnance to assist in the preparation of artillery firing tables.
Although as big as a room, it had far less computing power than one of today’s laptops.
But ENIAC began a long-term relationship between the military and the scientific community that later led to many of the most important breakthroughs in computer processing and networking.
For example, the first major networked computer architecture was developed during the Eisenhower years to support SAGE — the Semi-Automatic Ground Environment that the military built as a defense against Soviet bomber attacks on the United States.
The technical challenges associated with fielding SAGE were so daunting that they led the Pentagon to make a vast investment in computer and networking technology at MIT’s Lincoln Labs, laying the foundation for much future work in cybernetics.
We’re going to talk about networking and network-centric warfare in far more detail next week.
But before leaving the subject for now, I’d like to note an interesting evolution.
Because of the demands of Cold-War defense, the U.S. in the 1950’s was spending about 10% of its gross national product on the military every year.
That bought the nation a huge arsenal, but it also built up something else — a worrisome phenomenon that President Eisenhower referred to in his farewell address as the “military-industrial complex.”
His warning against letting the complex become too powerful has become famous, but few people remember today that in the same speech he cautioned against the emergence of a scientific elite:
“In holding scientific research and discovery in respect, as we should, we must also be alert to the equal and opposite danger that public policy could itself become the captive of a scientific-technological elite.”
It isn’t hard to see why Eisenhower feared such an elite, because at the time high-tech was largely the province of a priesthood of physicists and engineers.