A New AI Arms Race Is Altering the War in the Region
"Such technology represents the future threat," warns Serhiy Beskrestnov, who has just obtained a newly intercepted Russian unmanned aerial vehicle.
This, he discovered, was far from a typical drone. Powered by artificial intelligence, it is able to find and strike targets autonomously.
He has inspected numerous unmanned systems in his role as a consultant to Ukraine's defence forces.
Unlike other models, it did not transmit or receive communications, meaning it could not be jammed.
Russian and Ukrainian forces have both been testing AI in this war, and are now using it for certain tasks: target identification, intelligence collection and de-mining.
For the nation's military, AI has become indispensable.
"Our military receives more than 50,000 video feeds [from the front line] each month, which are processed by AI," says the defence official.
"It enables us to rapidly handle huge volumes of information, pinpoint objectives and mark them on a map."
AI-Empowered Tech as a Strategic Tool
AI-empowered technology is seen as an instrument that can enhance military strategy, optimize assets and, ultimately, save lives.
But when it comes to unmanned weapons systems, it's revolutionizing the warzone.
The country's soldiers currently employ AI-powered software that enables drones to lock on to a target and then operate independently for the final stretch of the mission.
At that stage the drone's signal cannot be jammed, and destroying such a small flying object is not easy.
In the future, such technologies are likely to become fully autonomous weapons, able to detect and destroy targets on their own.
All a soldier must do is press a button on a smartphone app, notes Yaroslav Azhnyuk, head of a local developer.
It handles everything else, he says: locating the target, dropping explosives, assessing the damage and then returning to base.
"It doesn't demand piloting skills from the user," he adds.
Interceptor Systems and Upcoming Developments
Defensive drones with that kind of autonomous capability could significantly strengthen defences against enemy long-range strike drones, like the infamous Shaheds.
"A computer-guided self-operating unit can be superior to a human in so many ways," says Azhnyuk. "It is sharper. It detects the objective sooner than a human can. It is more agile."
The official says such a system is not yet available, but the country is close to completing its development. "They've partly implemented it in some devices," he says.
It's possible there will be thousands of such systems deployed by the end of 2026, predicts Azhnyuk.
Concerns and Risks of Full Autonomy
However, local developers are wary of relying fully on defence systems driven entirely by AI, with no human involvement. The danger is that AI might not distinguish a friendly soldier from a Russian combatant, since both may be wearing the same uniform, says Vadym, who asked to remain anonymous.
His firm makes remotely operated automatic weapons that use AI to detect people and track them. Because of worries about friendly fire, he says, they have not included an automatic shooting option.
"It can be activated, but we must get more experience and additional input from the military units to determine when it is safe to use this feature."
Ethical Issues and International Regulations
There are also fears that automated systems will violate the rules of war. How can they avoid harming civilians, or recognise soldiers who want to surrender?
According to the official, the final decision in such circumstances should rest with a human, even if AI would make it "simpler to choose". But there is no certainty that states or armed groups will adhere to international humanitarian norms.
That makes countering these systems all the more critical.
How can one halt a "swarm of drones" when electronic warfare or using aircraft, tanks or missiles proves useless?
The nation's highly successful "Spider Web" mission, in which a hundred drones attacked enemy air bases in June, is believed to have been supported by AI tools.
Many in Ukraine fear that the adversary may replicate that tactic, not just on the battlefront but further afield as well.
The country's leader recently warned the UN that AI was fuelling "the most destructive arms race in history."
He called for global rules on the use of AI in weapons, saying the issue is "just as urgent as preventing the spread of nuclear arms."