Monday, October 2, 2017

Tech Briefing: Killer Robots and Me

I remember taking control of my first killer robot. I woke up one warm Afghanistan morning and took a gander outside. When I saw our obsolete Humvees being replaced with state-of-the-art Mine Resistant Ambush Protected (MRAP) vehicles, I was ecstatic. As I approached one of the vehicles, I was even more in awe: the MRAP was equipped with a Common Remotely Operated Weapon Station (CROWS). I could not believe it! No longer would I have to stick my head out of a vehicle to provide protection for my comrades. Instead, I could now provide that same support from inside the vehicle, enhancing my safety and the safety of everyone else. I will never forget the comforting feeling of being able to fire a .50 caliber weapon from the comfort of my own seat, and how beneficial it was for my team. Yet many of the leading companies that develop artificial intelligence (AI) have just signed an open letter to the United Nations calling for such weapons to be eliminated.

It is quite possible that future soldiers will never experience the feeling of safety I felt on that Afghanistan morning. In a recent article for SciDevNet, Neena Bhandari writes, "Founders of leading robotics and AI companies from 26 countries have, in an open letter to the United Nations...called for an international treaty to ban killer robots." Among the key signatories are Elon Musk and Mustafa Suleyman, leading figures in AI and machine learning at Tesla and Google, respectively. This made me ask the question, "Are they trying to take away the weapon that made me feel safe?" The answer is no; well, kind of.

Machine learning and AI experts are not talking about the type of weapon I described in the opening paragraph. Their main concern surrounds weapons that fire completely on their own, without human interaction. The letter warns, "Once developed, they will permit armed conflict to be fought at a scale greater than ever, and timescales faster than humans can comprehend." The concern is not that these weapons might be used for good (eliminating ISIS, for example). Instead, these prominent industry leaders worry about AI weapons falling into the wrong hands, and about such weapons systems violating international law.

So, will the weapon I admired so much be taken out of commission? No, because it still requires a human to decide when to pull the trigger. That is not to say that other weapon systems, ones that probably provided even more security than my .50 caliber weapon, will not be eliminated. Other weapons that assured my security, weapons I could not even see, are in jeopardy. Is this something these tech giants should determine? Should the security of our U.S. forces be threatened? I think not. The security I felt that morning should be shared with the many soldiers who put themselves in harm's way every day.

Take a look at how the CROWS works!
