Lethal Autonomous Weapons & Combat Drones Watch Thread

When will killer robots become a dominant force in world militaries?

2025: 1 vote (8%)
2030: 5 votes (38%)
2035: 4 votes (31%)
2040: 1 vote (8%)
2045: 2 votes (15%)

Total votes: 13

Yuli Ban
Posts: 4631
Joined: Sun May 16, 2021 4:44 pm

Lethal Autonomous Weapons & Combat Drones Watch Thread

Post by Yuli Ban »

Killer Robots | Human Rights Watch
Fully autonomous weapons, also known as "killer robots," would be able to select and engage targets without meaningful human control. Precursors to these weapons, such as armed drones, are being developed and deployed by nations including China, Israel, South Korea, Russia, the United Kingdom and the United States. There are serious doubts that fully autonomous weapons would be capable of meeting international humanitarian law standards, including the rules of distinction, proportionality, and military necessity, while they would threaten the fundamental right to life and principle of human dignity. Human Rights Watch calls for a preemptive ban on the development, production, and use of fully autonomous weapons. Human Rights Watch is a founding member and serves as global coordinator of the Campaign to Stop Killer Robots.



Drones may have attacked humans fully autonomously for the first time
Military drones may have autonomously attacked humans for the first time ever last year, according to a United Nations report. While the full details of the incident, which took place in Libya, haven’t been released and it is unclear if there were any casualties, the event suggests that international efforts to ban lethal autonomous weapons before they are used may already be too late.

The robot in question is a Kargu-2 quadcopter produced by STM.

Was a flying killer robot used in Libya? Quite possibly
Last year in Libya, a Turkish-made autonomous weapon—the STM Kargu-2 drone—may have “hunted down and remotely engaged” retreating soldiers loyal to the Libyan General Khalifa Haftar, according to a recent report by the UN Panel of Experts on Libya. Over the course of the year, the UN-recognized Government of National Accord pushed the general’s forces back from the capital Tripoli, signaling that it had gained the upper hand in the Libyan conflict, but the Kargu-2 signifies something perhaps even more globally significant: a new chapter in autonomous weapons, one in which they are used to fight and kill human beings based on artificial intelligence.

The Kargu is a “loitering” drone that can use machine learning-based object classification to select and engage targets, with swarming capabilities in development to allow 20 drones to work together. The UN report calls the Kargu-2 a lethal autonomous weapon. Its maker, STM, touts the weapon’s “anti-personnel” capabilities in a grim video showing a Kargu model in a steep dive toward a target in the middle of a group of manikins. (If anyone was killed in an autonomous attack, it would likely represent a historic first known case of artificial intelligence-based autonomous weapons being used to kill. The UN report heavily implies they were, noting that lethal autonomous weapons systems contributed to significant casualties of the manned Pantsir S-1 surface-to-air missile system, but is not explicit on the matter.)

Many people, including Stephen Hawking and Elon Musk, have said they want to ban these sorts of weapons, arguing they can’t distinguish between civilians and soldiers, while others say they’ll be critical in countering fast-paced threats like drone swarms and may actually reduce the risk to civilians because they will make fewer mistakes than human-guided weapons systems. Governments at the United Nations are debating whether new restrictions on combat use of autonomous weapons are needed. What the global community hasn’t done adequately, however, is develop a common risk picture. Weighing risk vs. benefit trade-offs will turn on personal, organizational, and national values, but determining where risk lies should be objective.

It’s just a matter of statistics.
We may be living in a brave new world right now.
And remember my friend, future events such as these will affect you in the future
caltrek
Posts: 6474
Joined: Mon May 17, 2021 1:17 pm

Re: Drones may have attacked humans fully autonomously for the first time

Post by caltrek »

If a Killer Robot Were Used, Would We Know?

https://thebulletin.org/2021/06/if-a-ki ... d-we-know/

Introduction:
A recent UN report on Libya implies—but does not explicitly state—that a Turkish Kargu-2 drone was used to attack humans autonomously using the drone’s artificial intelligence capabilities. I wrote about the event in the Bulletin, and the story went viral. The New York Times, NPR, Axios, Gizmodo, and a solid couple dozen more outlets in at least 15 languages all covered it. The intensity of the response surprised some experts, who noted that weapons that operate autonomously have been around for years. Perhaps the significance of the Libyan incident is socially symbolic—an event that draws sudden public attention to an issue brewing for a long time—not a Sputnik moment, but a Kargu-2 moment.

But most of the attention ignored a very obvious question: How do we know the Kargu-2 was used autonomously? The vagueness in the UN report allows multiple interpretations, defense companies exaggerate their products’ capabilities, and how best to define so-called lethal autonomous weapons is a hotly debated issue. The question has far-reaching implications beyond whether autonomous weapons were used in Libya. Groups like the Campaign to Stop Killer Robots seek comprehensive bans on autonomous weapons, yet a ban cannot be enforced unless some way exists to verify autonomous weapons use.

The reality is verification would be extremely difficult.
Don't mourn, organize.

-Joe Hill
Yuli Ban
Posts: 4631
Joined: Sun May 16, 2021 4:44 pm

Re: Lethal Autonomous Weapons Watch Thread

Post by Yuli Ban »

SGR-A1
The SGR-A1 is a type of sentry gun that was jointly developed by Samsung Techwin (now Hanwha Aerospace) and Korea University to assist South Korean troops in the Korean Demilitarized Zone. It is widely considered the first unit of its kind to have an integrated system that includes surveillance, tracking, firing, and voice recognition. While units of the SGR-A1 have reportedly been deployed, their number is unknown because the project is "highly classified".
And remember my friend, future events such as these will affect you in the future
caltrek
Posts: 6474
Joined: Mon May 17, 2021 1:17 pm

Re: Lethal Autonomous Weapons Watch Thread

Post by caltrek »

Flanker Fighter Appears Among Unmanned Aircraft At China's Secretive Test Base
by Tyler Rogoway
July 2, 2021

https://www.thedrive.com/the-war-zone/4 ... -test-base

Introduction:
(War Zone) The United States is publicly pursuing its Skyborg artificial intelligence program—among other associated initiatives—that is mainly focused on infusing unmanned combat aircraft with advanced autonomous capabilities, but it could also work as a copilot of sorts for manned platforms. Now there seem to be indications that China could be test-flying a similar concept on a J-16 Flanker fighter. This is part of a broader push by both countries' air arms to rapidly enhance their unmanned capabilities, including the development of manned-unmanned teaming and optionally manned air combat concepts.

The War Zone obtained a satellite image dated June 1 of China's secretive test base near Malan in Xinjiang province, which is known to be on the leading edge of the country's unmanned military aircraft development efforts. The photo shows a lineup of various drones being tested at the base, all of which have been previously identified, outside of the large unmanned aircraft hangars that line a long taxiway that services the western extension of the base.

These line-up displays of unmanned aircraft that the site is actively working to develop are hardly new. In fact, they have become an odd staple of the facility. What makes this lineup different from those in the past is that it includes a variant of a manned Flanker fighter among the unmanned aircraft. While manned fighters do visit the main ramp of the base fairly regularly for training and development purposes, we have never seen the base's unmanned aircraft intermixed with a manned aircraft like this before…
Don't mourn, organize.

-Joe Hill
Yuli Ban
Posts: 4631
Joined: Sun May 16, 2021 4:44 pm

Re: Lethal Autonomous Weapons Watch Thread

Post by Yuli Ban »

And remember my friend, future events such as these will affect you in the future
Yuli Ban
Posts: 4631
Joined: Sun May 16, 2021 4:44 pm

Re: Lethal Autonomous Weapons Watch Thread

Post by Yuli Ban »

First 'AI War': Israel Used World's First AI-Guided Swarm Of Combat Drones In Gaza Attacks
In the ongoing conflict between Israel and the Occupied Palestinian Territory, the Israel Defense Forces (IDF) has deployed AI and supercomputers to identify strike targets in what they are calling the first artificial intelligence (AI) war. In May this year, the IDF used a swarm of AI-guided drones and supercomputing to comb through data and identify new targets within the Gaza Strip. It's thought this is the first time a swarm of AI drones has been used in combat.

The use of AI in drone strikes has seen a surge in warzones, with a recent UN report revealing that an autonomous weaponized drone attacked Haftar Affiliated Forces in Libya last year, the first time an AI-guided drone identified and possibly attacked human targets without human input. Now, the technology appears to have found significant use in the Israel-Gaza conflict, which reportedly saw over 4,400 rockets fired into Israel and 1,500 strikes into Gaza in the 11 days of intense fighting in May.

The exploitation and use of AI effectively in war requires a lot of information. The machine learning systems need to be fed with data collected through satellites, aerial reconnaissance vehicles, and years of ground intel. With that, it can identify targets and predict when and where enemy attacks may occur.
And remember my friend, future events such as these will affect you in the future
Yuli Ban
Posts: 4631
Joined: Sun May 16, 2021 4:44 pm

Re: Lethal Autonomous Weapons Watch Thread

Post by Yuli Ban »

Don’t Arm Robots in Policing
Elected officials and local authorities across the United States and around the world should consider replicating an innovative legislative proposal that would prohibit police from arming robots used in their law enforcement operations.

The bill, introduced on March 18 by New York City council members Ben Kallos and Vanessa Gibson, would “prohibit the New York City Police Department (NYPD) from using or threatening to use robots armed with a weapon or to use robots in any manner that is substantially likely to cause death or serious physical injury.”

The proposed law comes after a social media outcry over the use of an unarmed 70-pound ground robot manufactured by Boston Dynamics in a policing operation last month in the Bronx. US Representative Alexandria Ocasio-Cortez criticized its deployment “for testing on low-income communities of color with under-resourced schools” and suggested the city should invest instead in education.
Boston Dynamics presents its SpotMini robot at a conference in Hanover, Germany, June 13, 2018. © 2018 Photo by Laura Chiesa/Pacific Press/Sipa via AP Images
And remember my friend, future events such as these will affect you in the future
Yuli Ban
Posts: 4631
Joined: Sun May 16, 2021 4:44 pm

Re: Lethal Autonomous Weapons Watch Thread

Post by Yuli Ban »

Killer Flying Robots Are Here. What Do We Do Now?
In the popular Terminator movies, a relentless super-robot played by Arnold Schwarzenegger tracks and attempts to kill human targets. It was pure science fiction in the 1980s. Today, killer robots hunting down targets have not only become reality, but are sold and deployed on the field of battle. These robots aren’t cyborgs, like in the movies, but autonomously operating killer drones. The new Turkish-made Kargu-2 quadcopter drone can allegedly autonomously track and kill human targets on the basis of facial recognition and artificial intelligence—a big technological leap from the drone fleets requiring remote control by human operators. A United Nations Security Council report claims the Kargu-2 was used in Libya to mount autonomous attacks on human targets. According to the report, the Kargu-2 hunted down retreating logistics and military convoys, “attack[ing] targets without requiring data connectivity between the operator and the munition.”

The burgeoning availability and rapidly expanding capabilities of drones pose urgent challenges to all of humanity. First, unless we agree to halt their development and distribution, autonomous killer drones like the Kargu-2 will soon be affordable and operable by anyone—from rogue states all the way down to minor criminal gangs and individual psychopaths. Second, swarms of killer drones may, through sheer numbers, render irrelevant the defenses against terrorist threats deployed by technologically advanced nations. Third, in creating a challenging new asymmetry in warfare, autonomous killer drones threaten to upset the balance of power that otherwise keeps the peace in various regions. The increasing ubiquity of affordable drones is an open invitation to one power or another to turn stable regions into battle zones.

The arrival and rapid proliferation of robot-like killer drones comes as no surprise.
Sailors move an X-47B combat drone aboard the aircraft carrier USS George H.W. Bush in the Atlantic Ocean on May 14, 2013. MASS COMMUNICATION SPECIALIST 2ND CLASS TIMOTHY WALTER/U.S. NAVY VIA GETTY IMAGES
And remember my friend, future events such as these will affect you in the future
caltrek
Posts: 6474
Joined: Mon May 17, 2021 1:17 pm

Re: Lethal Autonomous Weapons Watch Thread

Post by caltrek »

NF-16D VISTA Becomes X-62A, Paves Way for Skyborg Autonomous Flight Tests
by Giancarlo Casem
July 30, 2021

https://www.dvidshub.net/news/402134/nf ... ight-tests

Introduction:
(Defense Visual Information Distribution Service) The NF-16D Variable In-flight Simulator Aircraft (VISTA) has been redesignated as the X-62A, effective June 14, 2021.

The VISTA, which is operated by the Air Force Test Pilot School with the support of Calspan and Lockheed Martin, first flew in 1992 and has been a staple of the TPS curriculum. It has provided TPS students the ability to experience various flying conditions, including simulation of the characteristics of other aircraft.

“For more than two decades VISTA has been a vital asset for the USAF TPS and the embodiment of our goal to be part of the cutting edge of flight test and aerospace technology,” said William Gray, VISTA and TPS chief test pilot. “It has given almost a thousand students and staff members the opportunity to practice testing aircraft with dangerously poor flying qualities, and to execute risk-reduction flight test programs for advanced technologies.”

The VISTA is currently in the midst of an upgrade program which will fully replace the VISTA Simulation System (VSS). The upgrade program will also add a new system called the System for Autonomous Control of Simulation (SACS) to support autonomy testing for the Air Force Research Laboratory’s Skyborg program.

“The redesignation reflects the research done on the aircraft over the past almost 30 years, as well as acknowledges the major upgrade program that is ongoing to support future USAF autonomy testing,” said Dr. Chris Cotting, USAF TPS director of research.
caltrek's comment: What somewhat surprises me about this story is not so much the technology being described, but the relative transparency and matter-of-fact tone taken by the author.
Don't mourn, organize.

-Joe Hill
Yuli Ban
Posts: 4631
Joined: Sun May 16, 2021 4:44 pm

Re: Lethal Autonomous Weapons Watch Thread

Post by Yuli Ban »

It’s time to ban autonomous killer robots before they become a threat
With their first use on the battlefield, AI-powered drones are now a growing problem the world must face
The subject of autonomous killer robots exercises many technologists, politicians and human rights activists. Indeed, the Financial Times’s advice page for would-be opinion writers complains that, in their pitches, “lots of people spin doomsday scenarios about robots”. But now these robots are on the battlefield — and we need to do something about it.

A UN report on the Libya conflict has revealed that, for the first time, humans were “hunted down” and presumably killed by “lethal autonomous weapons systems such as the STM Kargu-2”, which were programmed to attack targets with no human control. The Kargu-2 is a plate-sized quadcopter equipped with cameras, onboard AI and a warhead of roughly 1kg, enough to kill a room full of people.

Meanwhile, in early June an Israeli newspaper reported that the Israel Defense Forces had begun using AI-controlled drone swarms to attack targets in Gaza during the recent conflict there.
And remember my friend, future events such as these will affect you in the future