Shadow Robot Company Director Rich Walker warned that the unregulated development of A.I technology could ultimately be catastrophic for mankind. During an interview with Express.co.uk, he argued that A.I tech used in conflicts could quickly be categorised as a war crime, similar to chemical and biological weapons. Mr Walker made it clear that tech experts are very much aware of the dangers of A.I in warfare and insisted that it should not be developed with that purpose in mind.
He said: “There are a number of organisations that are looking at what they call lethal autonomous weapon systems. This is artificial intelligence applied to war.
“Mostly they are saying that this is a bad idea, we shouldn’t be doing it.
“The positives of doing it are so small compared to the huge stack of negatives, so really we shouldn’t be doing this.
“But obviously, if you are trying to fight a struggle that you think is hopeless and someone offers you a piece of technology that is a game-changer for what you are doing, you will probably grab that and run with it.
“This is because you are fighting a struggle where you need all the advantages that you can get.
“I am lucky enough to not be in that position so I can sit over here and say we should not be developing that technology, we should make sure it is regulated and controlled.
“If that technology starts to be used we should treat it the same way we have treated other technologies people have used in desperate situations.
“We don’t allow people to use chemical weapons in war, we don’t allow people to use biological weapons and the use of land mines and cluster mines is heavily regulated or controlled.
“Firing at civilian populations is something we say is not part of war.
“Where does A.I fit into this mix? It looks like some of those things already.
“We have already decided that certain types of weapons are not acceptable, maybe artificial intelligence just fits into the ‘not acceptable’ bucket.”
Mr Walker was then asked whether we would need to see the negative impacts of A.I being used in war before it becomes properly regulated, similar to the mustard gas used in World War One.
He answered: “It is an interesting difference in two ways of looking at a problem.
“The typical American model is: let us see what goes wrong and what we can do about it.
“The European model is more along the lines of: we have asked some experts and they have said these are the things that could go wrong, and one example is so bad that we are going to make sure that it never happens.
“We have looked and added a precautionary principle: if it is that bad, let us not have that happen in the first place, because accidentally having that happen would be very bad.
“As I have gotten older I have moved away from ‘let’s give it a go and see what happens’ towards ‘let us see what could go wrong, and if what could go wrong is really bad, let us definitely not do that’.
“Overall with A.I and warfare, we can see some of the things that could go wrong so let us stop them from ever happening if we can.”
The Shadow Robot Company has focused on creating complex, dexterous robot hands that mimic human hands.
Despite its desire to innovate and establish new uses for A.I technology, the company intends to remain dedicated to ethical applications of artificial intelligence.
The robotics company uses tactical Telerobot technology to demonstrate how A.I programmes can work alongside human operators to create complex robotic relationships.
In the future, as it continues to develop its A.I programs, the Shadow Robot Company hopes to completely remove humans from dangerous situations in high-risk jobs and tasks such as nuclear waste handling and bomb disposal.