It’s no secret that Elon Musk has been skeptical of advances in artificial intelligence. He is now leading a group of experts who have signed an open letter to the United Nations calling for a ban on killer robots.
The consortium consists of 116 founders of robotics and artificial intelligence companies from 26 countries. Another prominent signatory alongside the SpaceX chief is Mustafa Suleyman, co-founder of Google’s DeepMind, which focuses on machine learning.
In the open letter, the founders warn that research on autonomous weapons could lead to a new revolution in warfare and prove deadlier than the conventional weapons we have today. Once this arms race begins, it will be difficult to contain, which is why they argue precautionary measures should be put in place now. An excerpt from the letter, in which they collectively voice their concerns, is given below:
“Once developed, lethal autonomous weapons will permit armed conflict to be fought at a scale greater than ever, and at timescales faster than humans can comprehend. These can be weapons of terror, weapons that despots and terrorists use against innocent populations, and weapons hacked to behave in undesirable ways.”
The letter is being released at the International Joint Conference on Artificial Intelligence (IJCAI) in Melbourne, where the consortium will put forward a motion to ban “morally wrong” lethal autonomous weapons under the UN Convention on Certain Conventional Weapons (CCW), which came into force in 1983.
The UK opposed a ban on lethal autonomous weapons in 2015, saying that existing laws already regulate such technology. However, an artificial intelligence professor from the University of New South Wales cautioned that this technology could be used to industrialize war, and that we must make decisions now if we want to avoid that future.
What are your thoughts on research in artificial intelligence and robotics? Do you think we need to be skeptical of advancements in this particular area? Let us know!
Source – The Guardian