Autonomous military robots that will fight future wars must be programmed to live by a strict warrior code, or the world risks untold atrocities at their steely hands, à la the Arnold Schwarzenegger hit Terminator, says a new US Navy report.
The stark warning -- which includes discussion of a Terminator-style scenario in which robots turn on their human masters -- is part of a hefty report funded by and prepared for the US Navy's high-tech and secretive Office of Naval Research.
The report, the first serious work of its kind on military robot ethics, envisages a fast-approaching era where robots are smart enough to make battlefield decisions that are at present the preserve of humans.
Eventually, it notes, robots could come to display significant cognitive advantages over human soldiers.
"There is a common misconception that robots will do only what we have programmed them to do," Patrick Lin, the chief compiler of the report, said. "Unfortunately, such a belief is sorely outdated, harking back to a time when... programs could be written and understood by a single person."
The reality, Dr Lin said, was that modern programs included millions of lines of code and were written by teams of programmers, none of whom knew the entire program.
Accordingly, no individual could accurately predict how the various portions of large programs would interact without extensive testing in the field -- an option that may either be unavailable or deliberately sidestepped by the designers of fighting robots.