
Algorithmic Warfare: Industry Prepping AI Tech for Next-Gen Aircraft

Skyborg concept (AFRL image)

NATIONAL HARBOR, Maryland — The Air Force wants its sixth-generation fighter aircraft to have a squad of uncrewed systems flying at its side. Before the autonomous aircraft becomes a program of record, the aerospace industry is eager to tackle the challenges of manned-unmanned teaming.

Secretary of the Air Force Frank Kendall has pitched the Air Force’s Next-Generation Air Dominance program as a package deal of crewed and uncrewed systems. While the collaborative combat aircraft program isn’t funded to start until 2024, industry executives said they are gearing up their autonomous capabilities to expand the potential for manned and unmanned teaming.

While the service is certain that uncrewed aircraft are the future, there are no requirements in place yet, said Gen. Mark Kelly, commander of Air Combat Command. Discussions are ongoing about how the acquisition process will work, he said.

Autonomy is one of three must-haves for the system, along with resilient communications links and the authority for the aircraft to maneuver freely. More testing and experimentation will fill in the blanks, he said.

“I’m an advocate to iterate our way there because I think there’s so much we don’t know,” he said during a media roundtable at the Air and Space Forces Association’s annual conference in National Harbor, Maryland.

Operational tests for the collaborative combat aircraft will take place in two or three years, he said.

Industry needs to participate in the experimentation that will shape the autonomous capabilities, said Mike Atwood, senior director of the advanced programs group at General Atomics Aeronautical Systems.

One area for industry to navigate alongside the Air Force is how it will face other artificial intelligence-based systems, he said during a panel discussion at the conference. That challenge could shape the ethical limits of autonomous systems.

The ADAIR-UX program — an effort with General Atomics to develop an AI-piloted adversary aircraft for fighter jets to train against — will build awareness of the difficulty of facing AI as students at weapons schools practice against adversaries with lightning-fast decision making, he said.

“I think that will be maybe the Sputnik moment of cultural change, where we realize when we saw … F-22 and F-35s in the range, how challenging it is to go against that,” he said.

One recent advance in autonomous capabilities with potential for future AI-controlled aerial vehicles is reinforcement learning, he said. Using algorithms, an operator can define the world the machine is allowed to operate in and give it a set of actions. The machine can then learn on its own the possible combinations of those actions within that environment.
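As a rough illustration of the technique Atwood described — an operator-defined world and action set, with the machine learning the rest by trial and error — the following minimal sketch uses tabular Q-learning on a small grid. The environment, rewards and parameters here are assumptions made for this example and do not reflect General Atomics' or the Air Force's software.

    # Illustrative reinforcement learning: the operator defines the "world"
    # (a small grid the agent may not leave) and a fixed set of allowed
    # actions; the agent learns by trial and error which actions reach a goal.
    import random

    GRID = 5                                   # operator-defined world: a 5x5 grid
    ACTIONS = ["up", "down", "left", "right"]  # operator-defined action set
    MOVES = {"up": (0, 1), "down": (0, -1), "left": (-1, 0), "right": (1, 0)}
    GOAL = (4, 4)

    def step(state, action):
        """Apply an action, clamping the agent inside the allowed world."""
        dx, dy = MOVES[action]
        x = min(max(state[0] + dx, 0), GRID - 1)
        y = min(max(state[1] + dy, 0), GRID - 1)
        new_state = (x, y)
        reward = 1.0 if new_state == GOAL else -0.01  # reward the goal, discourage wandering
        return new_state, reward, new_state == GOAL

    # Q-table: the learned value of taking each action in each state.
    q = {((x, y), a): 0.0 for x in range(GRID) for y in range(GRID) for a in ACTIONS}
    alpha, gamma, epsilon = 0.1, 0.95, 0.2  # learning rate, discount, exploration rate

    for episode in range(2000):
        state, done = (0, 0), False
        while not done:
            # Explore occasionally; otherwise pick the best-known action.
            if random.random() < epsilon:
                action = random.choice(ACTIONS)
            else:
                action = max(ACTIONS, key=lambda a: q[(state, a)])
            new_state, reward, done = step(state, action)
            best_next = max(q[(new_state, a)] for a in ACTIONS)
            q[(state, action)] += alpha * (reward + gamma * best_next - q[(state, action)])
            state = new_state

    # After training, the greedy policy walks from the start to the goal.
    state, path = (0, 0), [(0, 0)]
    while state != GOAL and len(path) < 20:
        state, _, _ = step(state, max(ACTIONS, key=lambda a: q[(state, a)]))
        path.append(state)
    print(path)

The point of the sketch is the boundary-setting Atwood highlighted: the machine can only ever act within the world and the action set the operator has defined, but inside those limits it discovers its own behavior.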

This type of learning could reassure those with concerns about AI, especially as the military begins to test its largest class of unmanned aerial vehicles, he said. Setting limits on what the machine can do is comforting while still allowing the system to innovate, Atwood said.

“What we’re finding now in manned-unmanned teaming is the squadrons are ready to start accepting more degrees of freedom to the system — not just going in a circle, but maybe cueing mission systems, maybe doing electronic warfare [or] doing comms functionality,” he said.

He added that programs like Skyborg — the loyal wingman effort for which General Atomics provides core software — are advancing the autonomous capabilities needed for the aircraft of the future.

The AI-enabled system, which controls unmanned aerial vehicles, is slated to officially become a program of record in 2023. Skyborg, alongside the Air Force’s three other Vanguard programs, will provide data for efforts like the Next-Generation Air Dominance family of systems, according to the Air Force.

“I think we’re on the precipice of something very, very special with the collaborative combat aircraft,” Atwood said.

Lockheed Martin has been internally considering how it could pull off industry collaboration similar in scope to the Manhattan Project, said John Clark, vice president and general manager at Skunk Works.

Given an urgent, national need, companies could band together to create a capability in 12 to 18 months, he said.

“The environment is not quite to that point, but it’s maybe one day or one event away from having that sort of environment,” he said during the panel.

Clark said losing the AlphaDogfight competition — a series of trials run by the Defense Advanced Research Projects Agency in which AI agents flew simulated dogfights — prompted an examination of Lockheed’s AI boundaries. During the competition, Lockheed limited its AI agent to follow Air Force doctrine, while the winner — Heron Systems, since acquired by software company Shield AI — was more flexible.

The Air Force and industry need to discuss what the range of acceptable behavior is for AI and how to build trust within that range, he said.

“We’re going to have to fail a few times and learn from those failures and then move forward with, ‘This is the right way to go through it,’” he said. “I think that that’s the No. 1 thing that’s keeping us from being able to really make the leap forward with this technology.”

Industry also wants to emphasize the importance of science, technology, engineering and math education for future pilots and operators, said Ben Strausser, principal lead of mosaic autonomy research at General Dynamics Mission Systems.

“Other discussions have talked about the importance of STEM education and making sure we have that level of understanding of literacy, so when we want to communicate what our unmanned systems are doing, there’s a level of … understanding for what the semantic descriptions of those algorithms mean,” he added during the panel.

Topics: Air Power, Robotics and Autonomous Systems, Unmanned Air Vehicles