Drone tech in Ukraine conflict could speed use of killer robots

KYIV, Ukraine (AP) — Advances in drones in Ukraine have accelerated a long-awaited technological trend that could soon bring the world’s first fully autonomous combat robot to the battlefield, ushering in a new era of war.

The longer wars go on, the more likely drones will be used to identify, select and strike targets without human help, according to military analysts, combatants and artificial intelligence researchers.

This would mark a revolution in military technology as profound as the introduction of the machine gun. Ukraine already has semi-autonomous attack drones and counter-drone weapons with artificial intelligence. Russia also claims to have artificial intelligence weapons, although these claims are unsubstantiated. However, there is no confirmed instance of a country sending fully autonomous killer robots into battle.

Experts say it may only be a matter of time before Russia or Ukraine or both deploy them.

“Many states are developing this technology,” said Zachary Kallenborn, a weapons innovation analyst at George Mason University. “Obviously, it’s not that difficult.”

The sense of inevitability extends to activists who have tried for years to ban killer robots but now believe they must settle for limiting the weapons’ offensive use.

Ukraine’s minister of digital transformation, Mykhailo Fedorov, agreed that fully autonomous killer drones are the “logical and inevitable next step” in weapons development. Ukraine has been doing “a lot of research and development in this direction,” he said.

“I think there is a lot of potential for this in the next six months,” Fedorov told The Associated Press in a recent interview.

Human fighters simply cannot process information and make decisions as quickly as machines, Yaroslav Honchar, a Ukrainian lieutenant colonel who works with combat drones and co-founder of the nonprofit Aerorozvidka, said in a recent interview near the front lines.

Ukrainian military leaders currently prohibit the use of fully autonomous lethal weapons, though that could change, he said.

“We haven’t crossed the line yet — and I say ‘yet’ because I don’t know what the future holds,” said Honchar, whose team pioneered drone innovation in Ukraine, turning cheap commercial drones into deadly weapons.

Russia could get autonomous AI from Iran or elsewhere. The long-range Shahed-136 explosive drone supplied by Iran has disabled Ukrainian power plants and terrorized civilians, but it’s not particularly smart. Iran also has other drones in its growing arsenal that are said to have AI capabilities.

Western manufacturers say that Ukraine could easily make its semi-autonomous weaponized drones fully autonomous so they can better withstand battlefield interference.

These drones include the US-made Switchblade 600 and the Polish Warmate, both of which currently require humans to select targets via live video. AI then finishes the job. Known technically as “loitering munitions,” these drones can hover over a target for minutes, waiting for a clean shot.

“The technology to enable fully autonomous missions with the Switchblade almost exists today,” said Wahid Nawabi, CEO of its maker, AeroVironment. But that will require a policy change — removing humans from the decision-making loop — that he estimates is three years away.

Drones can already use cataloged images to identify targets such as armored vehicles. But there is disagreement over whether the technology is reliable enough to ensure machines don’t make mistakes and take the lives of non-combatants.

The Associated Press asked the defense ministries of Ukraine and Russia whether they used autonomous weapons in their offensives — and whether they would agree not to use them if the other side did the same. Neither responded.

Nor would a fully autonomous AI attack necessarily be a first.

An inconclusive United Nations report suggests that killer robots made their debut in Libya’s internecine conflict in 2020, when Turkish-made Kargu-2 drones killed an unknown number of fighters in fully automatic mode.

A spokesman for manufacturer STM said the report was based on “speculative, unverified” information and “should not be taken seriously”. He told the AP that the Kargu-2 cannot strike a target unless its operator tells it to do so.

Fully autonomous AI is already helping to defend Ukraine. Utah-based Fortem Technologies provides the Ukrainian military with a drone hunting system that combines a small radar with an unmanned aerial vehicle, both powered by AI. The radar is designed to identify enemy drones, which the UAV then disables by firing nets at them — all without human assistance.

The number of AI-powered drones continues to grow. Israel has been exporting them for decades. Its radar-killing Harpy drone can hover over air defense radars for up to nine hours, waiting for them to activate.

Other examples include Beijing’s Blowfish-3 unmanned helicopter gunship. Russia has been developing a nuclear-tipped underwater artificial intelligence drone called Poseidon. The Dutch are currently testing a ground robot with a .50 caliber machine gun.

Honchar argues that Russia’s attacks on Ukrainian civilians show little respect for international law, and that if the Kremlin had lethal autonomous drones, it would already be using them.

“I don’t think they’ll have any scruples,” agrees Adam Bartosiewicz, vice president of the WB Group, which makes Warmate.

Artificial intelligence is a priority for Russia. President Vladimir Putin said in 2017 that whoever masters the technology will rule the world. In a Dec. 21 speech, he expressed confidence in the Russian arms industry’s ability to embed artificial intelligence into war machines, emphasizing that “the most effective weapon systems are those that can operate quickly and practically in automatic mode.”

Russian officials have claimed that their Lancet drone can operate completely autonomously.

“It’s not easy to know if and when Russia crossed that line,” said Gregory C. Allen, the former director of strategy and policy at the Pentagon’s Joint Artificial Intelligence Center.

Switching a drone from remote piloting to full autonomy might not even be noticeable. So far, drones capable of operating in both modes have performed better when piloted by humans, Allen said.

Stuart Russell, a professor at the University of California, Berkeley and a top AI researcher, said the technology isn’t particularly complicated. In the mid-2010s, colleagues he surveyed agreed that a graduate student could build an autonomous drone within a semester, “that would, say, find and kill a person inside a building,” he said.

Efforts to create international ground rules for military drone use have so far been fruitless. Nine years of informal UN talks in Geneva have yielded little progress, with major powers including the US and Russia opposing a ban. The previous session ended in December and no new round was scheduled.

Policymakers in Washington say they won’t agree to a ban because competitors developing drones can’t be trusted to use them ethically.

Toby Walsh, an Australian academic who shares Russell’s opposition to killer robots, wants consensus on certain restrictions, including a ban on systems that use facial recognition and other data to identify or attack individuals or groups of people.

“If we’re not careful, they will proliferate more easily than nuclear weapons,” said Walsh, author of “Machine Misbehaving.” “If you can make a robot kill one person, you can make it kill a thousand.”

Scientists also worry that AI weapons could be repurposed by terrorists. In one feared scenario, the U.S. military spends hundreds of millions of dollars writing code to power killer drones, only to have it stolen and copied, effectively handing terrorists the same weapon.

So far, the Pentagon has neither clearly defined “artificial intelligence autonomous weapons,” nor authorized the use of any such weapons by the U.S. military, said Allen, a former Defense Department official. Any proposed system must be approved by the chairman of the Joint Chiefs of Staff and two deputy secretaries.

That hasn’t stopped development across the United States, with projects underway at DARPA, military laboratories, academic institutions and the private sector.

The Pentagon is emphasizing the use of artificial intelligence to augment human fighters. The Air Force is working on ways to pair pilots with drone wingmen. Former Undersecretary of Defense Robert O. Work, a promoter of the idea, said in a report last month that once AI systems outperform humans, “it would be crazy not to adopt autonomous systems,” a threshold he said was crossed in 2015, when computer vision surpassed that of humans.

Humans are already sidelined in some defense systems. Israel’s Iron Dome missile defense system is authorized to fire automatically, although it is said to be monitored by a human who can intervene if the system tracks the wrong target.

Multiple countries and every branch of the U.S. military are developing drones that can strike in deadly synchronized swarms, according to Kallenborn, a George Mason fellow.

So will the war of the future be one of fighting to the last drone?

That’s what Putin predicted in a 2017 TV talk with engineering students: “When one side’s drone is destroyed by the other’s drone, it has no choice but to surrender.”


Frank Bajak reported from Boston. Associated Press reporters Tara Copp in Washington, Garance Burke in San Francisco and Suzan Fraser in Turkey contributed to this report.
