Last week, the U.S. Air Force announced that it had awarded indefinite-delivery, indefinite-quantity contracts to four companies to build the future Skyborg drone. The four firms (Northrop Grumman, Boeing, Kratos, and General Atomics) are among the United States’ most experienced aerospace companies.
What is it?
According to the Air Force, Skyborg is an “autonomy-focused capability that will enable the Air Force to operate and sustain low-cost, teamed aircraft that can thwart adversaries with quick, decisive actions in contested environments.”
The program essentially aims to get unmanned aerial systems airborne in support of pilot-centered operations. Skyborg will “provide them [Air Force pilots] with key data to support rapid, informed decisions. In this manner, Skyborg will provide manned teammates with greater situational awareness and survivability during combat missions.”
In an interview, the Air Force acquisitions chief, Will Roper, explained that Skyborg will bring “intelligent mass” to air battles by integrating artificial intelligence with piloted aircraft to maintain the United States’ competitive advantage in the air.
Low-cost, unmanned drones will change Air Force tactics in several ways. First, these drones could take on tedious, monotonous tasks like patrolling, expanding situational awareness and acting as a force multiplier by freeing up pilots for other tactical maneuvers. Drones like Skyborg could also protect high-value airframes like the F-22 Raptor or the F-35, and more importantly their pilots, by taking on the higher-risk missions those planes would normally fly.
“If successful, the program could lead to a family of airborne platforms that share a common intelligent nervous system that can react to our adversaries at machine speed,” Roper explained, describing a merging of man and machine. Though intriguing, the Skyborg program is not without its detractors.
Deciding on Decisions
In addition to simple, pre-programmed, playbook-style tactical maneuvers, Skyborg will also use artificial intelligence, in tandem with the data its networked drones collect, to execute advanced decision-making, potentially even firing on enemy drones or piloted aircraft.
One of the more controversial aspects of the program is just how much autonomy the Air Force would grant Skyborg and other unmanned aerial platforms. If armed, would they be permitted to press the fire button on their own?
Though Roper acknowledged the potential ethical pitfalls presented by armed autonomous drones, he maintained that any armed drones would adhere to the same ethical standards all American airmen adhere to. “Overall, just like any tool, autonomous UAVs give the air force more options to meet the commander’s intent within the rules of engagement. Our professional airmen have been making these ethical decisions since the air force was founded, and autonomous UAVs will not change that dynamic.”
The Future of Air Combat
Another open question is the risk of conflict escalation posed by relatively cheap, unmanned systems. Because they put neither pilots’ lives nor expensive airframes at stake, such systems may lower the threshold for entering combat. What that means for the future of aerial warfare remains to be seen.
Caleb Larson holds a Master of Public Policy degree from the Willy Brandt School of Public Policy. He lives in Berlin and writes on U.S. and Russian foreign and defense policy, German politics, and culture.