Autonomous Killer Robots Have Already Assassinated People!


Killer Robots are automated weapon systems that can select and attack their targets without human control

AI is extensively used in the defense sector to strengthen national security. But the consequences of deploying autonomous 'Killer Robots' in the military have not yet been fully anticipated. Killer Robots are automated weapon systems that can select and attack their targets without any human control, using lethal force without any direct command from a human operator. This AI technology is being implemented in weapon systems such as warships, fighter jets, and tanks.

Autonomous weapon systems, commonly known as killer robots, may have killed human beings for the first time last year, according to a recent United Nations Security Council report on the Libyan civil war. History could well identify this as the starting point of the next major arms race, one with the potential to be humanity's last. Autonomous weapon systems are robots with lethal weapons that can operate independently, selecting and attacking targets without a human weighing in on those decisions. Militaries around the world are investing heavily in autonomous weapons research and development.

Meanwhile, human rights and humanitarian organizations are racing to establish regulations and prohibitions on such weapons development. Without such checks, foreign policy experts warn, disruptive autonomous weapons technologies will dangerously destabilize current nuclear strategies, both because they could radically change perceptions of strategic dominance, increasing the risk of preemptive attacks, and because they could be combined with chemical, biological, radiological, and nuclear weapons themselves.

A specialist in human rights who focuses on the weaponization of artificial intelligence finds that autonomous weapons make the unsteady balances and fragmented safeguards of the nuclear world, such as the U.S. president's minimally constrained authority to launch a strike, even more unsteady and more fragmented.

Threats Posed by Killer Robots

Over the past few years, United Nations delegations have been debating a ban on Killer Robots, formally known as lethal autonomous weapons systems (LAWS). These weapon systems pose a significant threat to civilian safety and well-being. It is feared that fully autonomous weapon systems would not be capable of meeting international humanitarian law and standards, including the rules of proportionality, distinction between combatants and civilians, and military necessity. Currently, these systems are deployed to destroy enemy targets efficiently and to spare more of a country's own soldiers. But there are no clear guidelines on which of these AI systems can be used in action and which cannot. Therefore, they will continue to pose a fundamental threat to the right to life and human dignity.

In March 2020, a Kargu-2 quadcopter drone targeted retreating soldiers and convoys of the Libyan National Army, led by Khalifa Haftar, during the conflict with Libya's government forces. Reports say that this weapon system fell into the category of lethal autonomous weapons systems, or LAWS.

The Kargu-2 quadcopter was produced by STM, a Turkish military technology company. It was built for asymmetric warfare and operates in two modes: manual and autonomous. It can also be linked with other quadcopters to carry out swarming kamikaze attacks on enemy targets.

These AI-driven robots are weapons of mass destruction and terror. If these systems were hacked by cybercriminals, terrorists, or rogue states, there would be no stopping their use to suppress and attack civilians, potentially escalating into conventional warfare and genocide.


