This Israeli-made killer racing drone is a nightmare for some


Last week, an Israeli defense contractor painted a chilling picture. In a roughly two-minute YouTube video that resembles an action film, soldiers on duty are suddenly pinned down by enemy gunfire and call for help.

In response, a tiny drone launches from its mothership, zooms in behind the enemy soldiers and kills them with ease. While the scenario is staged, the drone — unveiled last week by Israel-based Elbit Systems — is not.

The Lanius, whose name refers to butcher birds in Latin, represents a new generation of drones: nimble, equipped with artificial intelligence, and capable of spotting and killing targets. The machine is based on the design of a racing drone, which allows it to maneuver into tight spaces such as alleys and small buildings.


The company’s promotional content touts the drone’s capabilities. After being sent into battle, the Lanius’ algorithm can map the scene and scan for people, distinguishing enemies from allies — and feeding all that data back to soldiers, who can then simply press a button to attack whomever they choose.

For weapons critics, this presents a nightmare scenario that could change the dynamics of war.

“This is extremely worrying,” said Catherine Connolly, a weapons expert at Stop Killer Robots, an advocacy group campaigning against autonomous weapons. “It’s basically just the machine deciding whether you live or die when we remove the human element of control over it.”

Representatives from Elbit Systems did not respond to a request for comment.

The use of drones in warfare has become commonplace. The United States’ drone arsenal is responsible for the deaths of enemy fighters and civilians in the Middle East. In Russia’s war against Ukraine, Moscow has deployed kamikaze drones capable of loitering over targets and striking without warning.

Drones large and small have had an impact on the war. In particular, the Turkish-made Bayraktar TB2 – a drone the size of a light aircraft, equipped with laser-guided missiles – has wreaked havoc on Russian tanks and trucks in Ukraine.


This makes drones an attractive market for weapons manufacturers.

Elbit Systems, headquartered in Haifa, Israel, says in promotional content that its Lanius is loaded with features that would be especially helpful in urban warfare, where troops often cannot see their enemy well.

According to its datasheet, the drone is palm-sized, approximately 11 inches by 6 inches, with a top speed of 45 miles per hour. It can fly for around seven minutes and can carry lethal and non-lethal payloads. It is unclear how potent the lethal payloads would be.

The drone is equipped with WLAN and radio technology for communication and maneuvers using GPS navigation. Its built-in artificial intelligence system can scan and map urban battlefields, returning a 3D map of the surroundings to soldiers.

According to the company, the drone’s autonomous software helps with “enemy detection and classification”, helpful in “deadly ambushes”.

The company notes that the drone cannot make its own decision to kill someone and requires a “human-in-the-loop” to make the decision and pull the trigger.


Still, Stop Killer Robots’ Connolly has numerous concerns.

The feature that requires a human to be involved in a decision to kill could probably be overridden, she said. “Modifying that would probably just take a software upgrade,” Connolly added. “There is … absolutely nothing to prevent the manufacturer from doing this, or for a military or other actor buying these systems to require it.”

The Lanius’ ability to use algorithms to distinguish enemies from allies is also worrying, she said. The public should know how the drone distinguishes between combatants and civilians, what data is used to train the system’s algorithm to make those calls, who labels that data, and what type of behavior is flagged as threatening, she said.

“It basically just shows that systems can now use an algorithm … to decide to take human lives,” she said.