Autonomous killer robots not such a good idea

A growing list of scientists and researchers, including such luminaries as Stephen Hawking, Elon Musk, Steve Wozniak and Noam Chomsky, have signed an open letter warning of the dangers of autonomous weapons development. The letter, created by the Future of Life Institute (FLI) and presented today at a conference in Buenos Aires, argues that it would be a profoundly bad idea for countries to pursue AI research aimed at creating autonomous weapons.

The letter states that, “[these weapons] might include, for example, armed quadcopters that can search for and eliminate people meeting certain pre-defined criteria, but do not include cruise missiles or remotely piloted drones for which humans make all targeting decisions,” and “Unlike nuclear weapons, they require no costly or hard-to-obtain raw materials, so they will become ubiquitous and cheap for all significant military powers to mass-produce.”

The letter also warns, “It will only be a matter of time until they appear on the black market and in the hands of terrorists, dictators wishing to better control their populace, warlords wishing to perpetrate ethnic cleansing, etc. Autonomous weapons are ideal for tasks such as assassinations, destabilizing nations, subduing populations and selectively killing a particular ethnic group. We therefore believe that a military AI arms race would not be beneficial for humanity.”

That last bit bears repeating.

“… a military AI arms race would not be beneficial for humanity.”

That’s a fairly soft statement for something that is actually quite chilling.

It’s not too hard to imagine a small drone packed with explosives crashing into the White House (or a crowded public space, or any other target). Perhaps the Secret Service could take one out, if it stationed round-the-clock expert skeet shooters with shotguns around the building. But push that scenario out just a bit and you might have a small squadron of a dozen deadly drones all attacking at once.

And I could also imagine an army of rat-sized robotic spiders swarming through a jungle village, a town or even a city firing poisoned darts at anyone carrying a gun (or at anything that moves).

No, I don’t think a military AI arms race would be beneficial for humanity.

Let’s just hope the race hasn’t already begun.

[If you would like to sign the open letter you can go to http://tinyurl.com/awletter.]
