(ORDO NEWS) — You might assume Hollywood can predict the future. Indeed, Robert Wallace, former head of the CIA’s Office of Technical Services and the American counterpart to MI6’s fictional Q, has recounted how Russian spies would watch the latest Bond movie to see what technology might be coming their way.
That makes Hollywood’s ongoing obsession with killer robots a real concern. The latest such film is Dolly, an upcoming courtroom drama about a sex robot, coming to Apple TV.
I never thought I’d write the phrase “courtroom drama about a sex robot”, but here it is. Based on a 2011 short story by Elizabeth Bear, the plot concerns a billionaire killed by a sex robot, which then asks a lawyer to defend its bloody actions.
Real killer robots
Dolly is the latest in a long line of killer robot movies, from HAL 9000 in Stanley Kubrick’s 2001: A Space Odyssey to Arnold Schwarzenegger’s T-800 robot in The Terminator.
Indeed, the conflict between robots and humans was at the center of the very first feature-length science fiction film, Fritz Lang’s 1927 classic Metropolis.
But almost all of these films are wrong.
Killer robots will not be sentient humanoid machines with evil intentions. That might make for a dramatic storyline and box office success, but such technology is decades, if not centuries, away.
Indeed, contrary to recent fears, robots may never become sentient.
What we have to worry about is much simpler technology. And that technology is already beginning to appear on battlefields in places like Ukraine and Nagorno-Karabakh.
The war has changed
Movies featuring much simpler armed drones, such as Angel Has Fallen (2019) and Eye in the Sky (2015), paint perhaps the most accurate picture of the real future of killer robots.
On the nightly news, we see how modern warfare is being transformed by increasingly autonomous drones, tanks, ships and submarines. These robots are only slightly more sophisticated than the ones you can buy at your local hobby store.
And, increasingly, the decisions to identify, track and destroy targets are being handed over to their algorithms.
This is taking the world to a dangerous place, with a host of moral, legal and technical problems. Such weapons will, for example, further aggravate an already troubled geopolitical situation. We can already see Turkey emerging as a major drone power.
And such weapons cross a moral red line, into a terrible and terrifying world where unaccountable machines decide who lives and who dies.
Robot makers, however, are starting to resist this future.
Obligation not to use as a weapon
Last week, six leading robotics companies pledged never to weaponize their robotic platforms.
The companies include Boston Dynamics, which makes Atlas, a humanoid robot capable of performing impressive backflips, and Spot, a robot dog that looks like it walked straight out of the television series Black Mirror.
This isn’t the first time robotics companies have spoken out about this troubling future.
Five years ago, I organized an open letter, signed by Elon Musk and more than 100 founders of other AI and robotics companies, calling on the UN to regulate the use of killer robots. The letter even knocked the Pope into third place for an arms control person of the year award.
However, leading robotics companies promising not to weaponize their platforms is more a signal of virtue than anything else.
For example, we have already seen third parties mount weapons on clones of Boston Dynamics’ Spot robot dog.
And such modified robots have proven effective in action: Iran’s leading nuclear scientist was killed by Israeli agents using a robot-controlled machine gun in 2020.
Collective action to protect our future
The only way to guard against this horrifying future is for countries to act collectively, as they did with chemical, biological and even nuclear weapons.
Such regulation will not be perfect, just as the regulation of chemical weapons is imperfect. But it will prevent arms companies from openly selling such weapons, and thereby limit their proliferation.
More important, then, than the robotics companies’ pledge is the fact that the UN Human Rights Council recently decided, unanimously, to investigate the human rights implications of new and emerging technologies such as autonomous weapons.
Several dozen countries have already called on the UN to regulate killer robots. The European Parliament, the African Union, the UN Secretary General, Nobel Peace Prize winners, church leaders, politicians and thousands of AI and robotics researchers like myself have called for regulation.
Australia is not, so far, among the countries that have supported these calls. But if you want to avoid that Hollywood future, you might raise the issue with your political representative the next time you see them.