US Air Force Secretary Frank Kendall tests an AI-piloted fighter jet


US Air Force Secretary Frank Kendall flew aboard an experimental AI-piloted F-16 fighter jet during a test flight at Edwards Air Force Base in California.

The AI-controlled F-16, the X-62A VISTA (Variable In-flight Simulator Test Aircraft), engaged in an aerial dogfight with a human-piloted F-16.

The two jets flew within 1,000 feet of each other at speeds exceeding 550 miles per hour, performing complex high-speed maneuvers. 

Secretary Kendall, who experienced the AI-piloted jet firsthand during the hour-long flight, expressed the inevitability of AI weaponry: “It’s a security risk not to have it. At this point, we have to have it.”

The AI-piloted jet was pitted against a human-piloted adversary and attempted to force its opponent into vulnerable positions, as it would in a real dogfight.

At the end of the flight, Secretary Kendall declared he’d seen enough to trust AI on the battlefield. 

This follows a recent test of the same aircraft in a live dogfight situation.

Concern over AI weaponry grows

Unsurprisingly, the prospect of AI autonomously launching weapons without human intervention is immensely controversial.

At a recent international conference, “Humanity at the Crossroads: Autonomous Weapons Systems and the Challenge of Regulation,” in Vienna, attendees from 143 countries debated the urgent need to regulate the use of AI in autonomous weapon systems (AWS).

Austrian Foreign Minister Alexander Schallenberg warned that the world is facing an “Oppenheimer Moment,” referencing the development of the first atomic bomb. 

He stressed, “At least let us make sure that the most profound and far-reaching decision — who lives and who dies — remains in the hands of humans and not of machines.”

Others have questioned who would be responsible if AI weaponry went wrong. The manufacturer? The commander of the operation? Or someone further down the chain, pushing buttons and monitoring equipment?

AI isn't just being embedded in weapons; it's also being used for strategy. One recent study found that LLMs tend to escalate war games toward nuclear war.

Autonomous drones are already being deployed by both sides in the war in Ukraine, while Israeli forces are allegedly using AI to identify human targets in the war in Gaza.

Despite these concerns, the US Air Force has ambitious plans for an AI-enabled fleet consisting of more than 1,000 unmanned warplanes operational by 2028. 

The Air Force claims that no other country has an AI-piloted jet like VISTA; as far as Western sources can tell, China has no equivalent.

While China's AI technology is broadly on par with that of the US, there's no indication that it runs field tests like this.

As the US Air Force continues to push the boundaries of AI-powered aviation, the international community grapples with what this might mean for modern warfare. 

It’s precarious ground trodden only by science fiction plotlines, most of which don’t end favorably for anyone.