How Well Can AI Pick Targets From Satellite Photos? Army Test Aims to Find Out

Visible evidence of poor air quality appeared in satellite imagery of California on November 18, 2007. NASA image courtesy the MODIS Rapid Response Team at NASA GSFC

The Scarlet Dragon exercise is evaluating ways the service might put new tools to use in the very near term.

Two live GBU-32 bombs will be dropped on range targets selected by an artificial-intelligence tool on Thursday, part of a U.S. Army effort to see how AI might soon be used on the battlefield.

The F-35 strikes are part of the fourth iteration of the XVIII Airborne Corps' Scarlet Dragon exercise, which aims to test whether applying AI to multiple data streams can speed up finding and hitting pre-invasion targets. The test will be run by operators from the XVIII Airborne Corps at Fort Bragg.

The exercise uses the seed software behind Project Maven, the military’s flagship AI-for-targeting effort. But whereas Maven looked at full-motion video feeds from drones, the Army effort applies that same technology to satellite images. That opens up an opportunity to operate in a much larger area. The exercise spans multiple ranges from Virginia to Georgia, with thousands of potential targets spread over some 7,200 square kilometers. 
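Moving from drone video to satellite imagery changes the scale of the problem: a scene covering thousands of square kilometers is far too large for a detector to ingest at once, so it must first be cut into detector-sized pieces. As a rough, hypothetical sketch of that step (the chip size, overlap, and function names below are illustrative assumptions, not details from Project Maven or Scarlet Dragon), a pipeline might tile a large image like this:

```python
# Hypothetical illustration of tiling wide-area satellite imagery into
# overlapping fixed-size chips before running a computer-vision detector.
# Chip size and overlap are illustrative, not values from the exercise.

def tile_coords(width_px, height_px, chip=1024, overlap=128):
    """Yield (x, y) top-left corners of overlapping chips that cover an
    image of width_px x height_px pixels (assumed at least chip-sized)."""
    step = chip - overlap
    for y in range(0, max(height_px - overlap, 1), step):
        for x in range(0, max(width_px - overlap, 1), step):
            # Clamp the last row/column so chips never run off the edge.
            yield min(x, width_px - chip), min(y, height_px - chip)

# A modest 2,048 x 2,048-pixel scene breaks into a 3 x 3 grid of chips;
# a scene at sub-meter resolution over 7,200 square kilometers would
# produce tens of thousands, which is why automation matters here.
chips = list(tile_coords(2048, 2048))
```

The overlap between neighboring chips is there so that an object straddling a chip boundary still appears whole in at least one chip, a standard concern when chipping large imagery for detection.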

Early testing suggested the idea was sound. Last August, the XVIII Airborne Corps ran a preliminary test of the concept, and the results surpassed expectations, according to Col. Joe O’Callaghan, an XVIII Airborne Corps fire support officer.

“We conducted an experiment here to see how many targets we could find in an hour,” O’Callaghan said. But they found that “the metric wasn’t how many targets we could find in an hour, it was how many decisions we could make in an hour.”

His team made “exponentially more decisions than we even thought possible. So much that the number of decisions we were making was causing legacy computer systems to crash because we were exceeding their capability to take sequentially-derived targets. The volume of targets proceeding in parallel surpassed sequential technology,” he said.

Thursday’s experiment will also test the ability to rapidly share data across services and agencies; it will include elements from the Navy’s Fleet Forces Command and Naval Information Warfare Development Command; the Air Force’s 53rd Test Wing and 461st Air Control Wing; the National Reconnaissance Office; the National Geospatial-Intelligence Agency, or NGA; and more.

AI and other new technologies are bringing operators closer to NGA and other data suppliers. That’s because decisions that used to be made in big operations centers are increasingly moving down-range where lower-level commanders are able to exercise more control over targeting and other elements of warfare. 

“What NGA is doing now is helping us get to true sensor-to-shooter capability, shortening the distance between a combat support agency and a unit in the field by employing that AI and streamlining, matching collection strategies with targeting requirements,” said O’Callaghan.

Each of the military services has begun running exercises aimed at joint all-domain command and control, or JADC2. The Army’s Project Convergence exercise, whose second iteration will take place in November, has become the military’s flagship JADC2 effort. But while Project Convergence is meant to test new technologies to shape how the Army will wage war from 2030 onward, Scarlet Dragon focuses much more on how the service might use AI if it had to invade somewhere tonight. Still, the two efforts inform one another, said O’Callaghan.

Scarlet Dragon also shows how the military is beginning to tackle the challenge of fighting in places where there is no troop presence, so-called “over-the-horizon” operations, said O’Callaghan.

But the exercise’s most important objective is to train operators to think differently about how data will inform and accelerate operations. O’Callaghan called this essential: new AI capabilities are evolving faster than many predicted as new correlations are found across ever more widely available data.

“We’re not stuck on just computer vision. We’re starting to look at correlational AI to bring multiple factors together,” he said. “The second that these more general AI capabilities come into play, we are going to be combat-ready with them.”