Robots have aided police officers for decades, becoming a regular fixture in operations ranging from disarming bombs to probing hostage situations. In recent years, however, robotics has advanced at an exponential pace, and that growth could turn our comrades in bionic arms into foes rather than friends.
Autonomous weapons are already on the rise. In 2014, for instance, a South African company began selling drones that can fire 80 pepper balls per second. Police in North Dakota, meanwhile, have received clearance to fly drones equipped with tear gas and Tasers. And if that isn’t unsettling enough, here’s another not-so-fun fact: Tasers, which are touted as safe but can actually trigger cardiac arrest, were linked to the deaths of approximately 540 Americans between 2001 and 2013.(1)
For now, these technologies depend on human operators to function. But autonomous weapons are already being deployed by militaries; Israel, for example, uses them to monitor its borders. A Texas company has also developed a drone that protects private property by hovering over the land and firing Taser darts at intruders until police arrive.(1)
A.I. makes a quantum leap
Artificial intelligence (A.I.) has made a quantum leap this year as well. Major companies like Google, Facebook and Microsoft all run their own A.I. research labs, which churn out a steady stream of projects and academic papers. Right now, the central challenge of A.I. is teaching machines to think for themselves.
Thinking machines and autonomous weapons are a combustible mix, one that could have dire consequences for humanity. Eventually, machines will be better at advancing A.I. than humans are. At that tipping point, “we may be surpassed by our own creations,” notes Douglas Mulhall in his book Our Molecular Future.
Humanity won’t be able to keep up with the exponential boom in A.I., creating an even larger chasm between robots and people. Engineers could attempt to program robots to value human life. The problem, however, is that a moral code binding robots has not yet been developed. How can we expect to program robots to value human life when society can’t even agree when human life begins and ends? In fact, many autonomous weapons currently in use are designed to exterminate human life — not protect it.
“What might be the result of this lack of preparation?” asks Mulhall. “As in the movie A.I., we might start killing our creations for fear that they will gain the upper hand. We may try to use them against each other as we’ve done for millennia with other weapons, but are they going to listen?”(1)
“When the Joint Chiefs tell an autonomous robot soldier to wipe out thousands of soldiers retreating along a desert road, will it perhaps refuse, or offer a better solution to humanely disable them — or, more disturbingly, agree that it’s a fitting way to avenge what we may have lost?”(1)
Science fiction meets the Cloud
Because these robots would be connected to the Cloud, they could keep tabs on large numbers of people from the air and from great distances. They could be used to attack virtually anyone on the planet, from the mightiest dictator to the lowliest civilian. These robots would also likely develop self-protection routines to anticipate future attacks, making them difficult to shut down.
Although scientists are advancing A.I. for the betterment of humanity, all technology is vulnerable to hackers, who could turn autonomous weapons to malicious purposes. Criminals have already used drones to commit theft and smuggle illegal narcotics. Ironically, the same kind of technology that protects Israel’s border is being used to smuggle drugs across the Mexican border.
A bill of human technological rights that safeguards civil liberties is needed most in a time of rapid technological growth. Otherwise, the rise of weaponized robots could lead to a gradual erosion of human rights. And if technology continues advancing at its current rate, that erosion might not be so gradual after all.