Autonomous killer robots not such a good idea

A growing list of scientists and researchers, including such luminaries as Stephen Hawking, Elon Musk, Steve Wozniak, and Noam Chomsky, have signed an open letter warning of the dangers of autonomous weapons development. The letter, created by the Future of Life Institute (FLI) and presented today at a conference in Buenos Aires, argues that it would be a bad idea for countries to pursue AI research aimed at creating autonomous weapons.

The letter states that, “[these weapons] might include, for example, armed quadcopters that can search for and eliminate people meeting certain pre-defined criteria, but do not include cruise missiles or remotely piloted drones for which humans make all targeting decisions,” and “Unlike nuclear weapons, they require no costly or hard-to-obtain raw materials, so they will become ubiquitous and cheap for all significant military powers to mass-produce.”

The letter also warns, “It will only be a matter of time until they appear on the black market and in the hands of terrorists, dictators wishing to better control their populace, warlords wishing to perpetrate ethnic cleansing, etc. Autonomous weapons are ideal for tasks such as assassinations, destabilizing nations, subduing populations and selectively killing a particular ethnic group. We therefore believe that a military AI arms race would not be beneficial for humanity.”

That last bit bears repeating.

“… a military AI arms race would not be beneficial for humanity.”

That’s a fairly soft statement for something that is actually quite chilling.

It’s not too hard to imagine a small drone packed with explosives crashing into the White House (or a crowded public space, or any other target). Perhaps the Secret Service could take one out if they had round-the-clock expert skeet shooters with shotguns strategically placed around the building. But push that scenario out just a bit and you might have a small squadron of a dozen deadly drones all attacking at once.

And I could also imagine an army of rat-sized robotic spiders swarming through a jungle village, a town or even a city firing poisoned darts at anyone carrying a gun (or at anything that moves).

No, I don’t think a military AI arms race would be beneficial for humanity.

Let’s just hope the race hasn’t already begun.

[If you would like to sign the open letter you can go to]

Guy Wright

Guy Wright has been covering the technology space since the days when computers had cranks and networks were steam-powered. He has been a writer and editor for more years than he cares to admit.

