
Bots posing a legitimate threat on a global scale!


Third Revolution of Warfare

We have all seen sci-fi movies and read sci-fi stories about a robot uprising set to wipe out humanity, and while twenty years ago all of this might have seemed far-fetched, such a scenario now appears to be more reality than fiction. A recent example that was all over the news a couple of weeks ago was the language that two Facebook bots invented on their own in order to optimize their communication. The project was immediately halted due to the unforeseen consequence of having two programs create a whole new language without being explicitly tasked to do so. While this on its own is not a reason to worry, the implications are clear: technology is getting smarter at an astonishing rate, and we are not sure where this is going to lead us.

Of course, we are still not that close to creating an actual AI. However, we are slowly starting to get there. That said, even though there is still no Artificial Intelligence in the true sense of the word, researchers suggest that this does not mean there is no danger on the horizon. In fact, even without AI, certain forms of technology might represent a substantial risk to all of us. A recent warning addressed to the UN by a group of renowned specialists in the fields of robotics and AI states that there is a significant danger in the practice of repurposing different technologies into weapons and warfare tools. The reason is the possibility of such tech getting hacked and turned against its user. In the open letter sent to the UN, the team of specialists refers to the threat of using robots for military purposes as a “third revolution of warfare”.

Insider Threat

The potential scale of the hazard that robotics might represent to everyone is huge and should not be ignored. In fact, even if a robot is never made into a warfare tool, it could still endanger the everyday lives of its users if it gets taken over by a hacker with malicious intentions. It is not difficult to imagine the problems that could arise if someone manages to hack into this sort of tech, especially since bots are becoming more and more common in various aspects of people’s lives. For example, a lot of factories currently use so-called cobots, machines that automate and speed up production. While this is helpful and beneficial in general, all it takes to turn this technology against its users is a single exploit in the hands of the wrong person.

And when it comes to tech vulnerabilities, robots have quite a few of them. A principal security consultant at IOActive revealed that a recent research update had uncovered about 50 weak spots and vulnerabilities in the products of six of the largest robotics manufacturers. The potential for harm is immense! Not only factory cobots but also household bots such as the famous Pepper robot could represent a significant danger to anyone who uses them. For instance, Pepper might be hacked to record sound and video from around the house or to execute other tasks without the knowledge of its owner, and tens of thousands of such robots have already been sold to unsuspecting customers.

Should we be concerned?

While all of the aforementioned dreadful scenarios might seem rather disturbing, there is still time to avert such a threat if adequate precautions are taken. This, in fact, is the whole purpose of the open letter sent to the UN: to warn about the dangers that bots might represent if used carelessly, because there is certainly a large number of people out there who will try to exploit potential weaknesses in those bots’ systems. All in all, even if the danger is not yet immediate, it is imperative that both users and developers ensure the security of such technology’s software before putting it into use.
