The Age Of Robotic Combat Systems Has Not Arrived
Issue Brief
Ever since Czech playwright Karel Capek introduced the concept of mechanical men in his 1921 drama, Rossum’s Universal Robots, prophets have been predicting the imminent arrival of machines that could take the place of people. Speculation about using robots in combat has been especially common. But like the time horizon for a human colony on Mars, the arrival of autonomous unmanned systems in combat keeps receding because the necessary technology is not within our grasp. Technology enthusiasts have solved this problem in popular culture by dumbing down the definition of “robot” to encompass systems that aren’t much smarter than a power mower, but that semantic trick doesn’t get us any closer to the world that Capek foresaw. If robotic warriors aren’t a total fantasy (to paraphrase Tommy Lee Jones in No Country for Old Men), they’ll do until a real fantasy arrives.
I thought about how unrealistic our technology aspirations are last week when John Bennett of Defense News shared with me some briefing charts entitled “Future Vertical Lift Technology Opportunities.” The charts were jointly produced by the robotics division of the National Defense Industrial Association and the Pentagon’s Joint Advanced Concepts office. It’s a fairly conventional brief until it gets to unmanned vehicles, and then it claims, “As ROBOTICS applications increase, AUTONOMY will increase.” It goes on to talk about “heterogeneous robots and sensors self tasking and teaming,” adapting to novelty “through learning and experience,” and behaving “appropriately in reaction to autonomous target and/or scene recognition.” These things are all supposed to happen with little human intervention, which is one reason why “minimal core bandwidth” will be required.
If this vision of future combat had been introduced by Karel Capek’s grandson in a remake of Rossum’s Universal Robots, my reaction would be, “Wow, what an imagination — I wish I could live to see it happen!” But as an operational concept the joint force should be pursuing in a period of declining defense outlays, it’s just nonsense — the kind of vision that results when you ask humanities majors to comment on engineering. The bedrock fallacy of such ideas is that they fail to grasp the complexity of the human brain, and thus underestimate the difficulty of building unmanned systems that can replace people in combat. So let’s look at a few facts about the human brain.
The human brain is the most complex object in the known universe. It is the end product of roughly 500 million years of evolution. The number of calculations the brain performs per second is estimated at between 10 to the 15th and 10 to the 19th power — in other words, at least 1,000,000,000,000,000 operations per second. Our fastest supercomputers operate at around 10 to the 16th power — the low end of that range. But they take up the space of a basketball court and consume megawatts of power, whereas the human brain weighs 2-3 pounds and consumes about 25 watts.
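The efficiency gap described above can be made concrete with some back-of-the-envelope arithmetic comparing operations per second per watt. The brain figures below come from the essay’s own estimates; the supercomputer figures (roughly 10 to the 16th operations per second at megawatt-class power draw, typical of top-ranked machines circa 2012) are illustrative assumptions, not measurements:

```python
# Rough back-of-the-envelope comparison of compute efficiency,
# measured in operations per second per watt. All figures are
# order-of-magnitude illustrations, not precise benchmarks.

BRAIN_OPS_PER_SEC = 1e15          # low end of the essay's 1e15-1e19 estimate
BRAIN_WATTS = 25                  # approximate power draw of the human brain

SUPERCOMPUTER_OPS_PER_SEC = 1e16  # assumed order of magnitude, top machine ~2012
SUPERCOMPUTER_WATTS = 1e7         # assumed megawatt-class power draw

brain_eff = BRAIN_OPS_PER_SEC / BRAIN_WATTS
machine_eff = SUPERCOMPUTER_OPS_PER_SEC / SUPERCOMPUTER_WATTS

print(f"Brain:         {brain_eff:.1e} ops/sec per watt")
print(f"Supercomputer: {machine_eff:.1e} ops/sec per watt")
print(f"Efficiency gap: about {brain_eff / machine_eff:,.0f}x")
```

Even granting the brain its most conservative speed estimate, this sketch puts it tens of thousands of times more energy-efficient than the machine — which is the crux of the payload problem for an autonomous combat vehicle.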
How likely is it that we could match the memory and computational power of the human brain in a payload suitable for an autonomous combat vehicle? And if we could, how would we generate the software needed to give it the situational awareness, judgment and values of a typical warfighter? The answer is, we can’t — which is why decades of work on artificial intelligence have failed to produce machines much smarter than a cockroach. Really. So while the notion of robotic warriors might suggest some interesting areas for science and technology research, don’t count on R2D2 to save your kids from the Next Big Threat.