The evolution of technology is captivating. AI is transforming every aspect of our lives and has become an integral part of sectors like healthcare, manufacturing, education, and politics. AI is also used extensively in the defense sector to strengthen national security. But the consequences of deploying autonomous 'Killer Robots' in the military have not yet been fully anticipated.
Killer Robots are autonomous weapon systems that can select and attack targets without human control. These weapon systems use lethal force without any direct command from a human operator. The technology is being implemented in different weapon platforms, including warships, fighter jets, and tanks.
Weaponized AI technologies form a major part of the defense strategies of China, Russia, the US, and Israel. The security strategies of these countries already involve weaponized AI, and they are developing autonomous robots capable of making decisions about life and death without any human intervention.
Over the past few years, United Nations delegations have been debating a ban on Killer Robots, formally known as lethal autonomous weapons systems (LAWS). These weapon systems pose a significant threat to civilian safety and well-being. It is widely argued that fully autonomous weapon systems would not be capable of meeting international humanitarian law and its standards, including the rules on proportionality, distinguishing between combatants and civilians, and assessing military necessity. Currently, these systems are employed to destroy enemy targets efficiently and to spare more of a country's own soldiers. But there are no clear guidelines specifying which of these AI systems can be used in action and which cannot. Until there are, they will keep posing a fundamental threat to the right to life and to human dignity.
In March 2020, a Kargu-2 quadcopter drone targeted withdrawing soldiers and convoys of the Libyan National Army, led by Khalifa Haftar, during a civil conflict with Libya's government forces. Reports say that this weapon system fell into the category of lethal autonomous weapons systems, or LAWS.
The Kargu-2 quadcopter is produced by STM, a Turkish military technology company. It was built for asymmetric warfare and operates in two modes: manual and autonomous. It can also be linked with other quadcopters to carry out swarming kamikaze attacks on enemy targets.
These AI-driven robots are weapons of mass destruction and terror. If these systems were hacked by cybercriminals, terrorists, or rogue states, there would be nothing to stop them from being used to suppress and attack civilians, and they could even lead to conventional warfare and genocide.
International humanitarian law rests on judgments made in actual battlefield situations, judgments that pre-programmed machines cannot make. With advancing technologies and the implementation of AI and machine learning in defense protocols, international humanitarian law must also evolve to keep pace with how warfare is conducted.
No machine should be entrusted with an autonomous decision about the life and death of humans. Machines that decide whether to end a life violate human dignity by reducing humans to mere objects.
Several human rights groups and non-governmental organizations have petitioned for a global ban on autonomous Killer Robots. Without strict guidelines, current military innovation will lead to an unwanted transformation in defense technologies, fueling an arms race and robot wars that jeopardize the lives of millions.