Army headset’s latest version clears hurdle, but service wishlist remains long

Spc. Layne Alfieri of the U.S. Army's 10th Mountain Division wears an Integrated Visual Augmentation System (IVAS) 1.2 prototype during tests held by PEO-Soldier at Fort Drum, New York, on August 22, 2023. U.S. Army / Jason Amadi, PEO-Soldier

Service approves further development after Microsoft’s IVAS 1.2 proves lighter, crisper.

Soldiers who tried out the latest prototype of the U.S. Army’s futuristic infantry headset called it much improved over an earlier, heavier version. But the Army’s wishlist of capabilities for a soldier headset remains long and ambitious.  

From August 18 to 24, soldiers at Fort Drum, New York, tested version 1.2 of Microsoft’s Integrated Visual Augmentation System, or IVAS, which Army leaders hope will give forces a massive advantage in everything from training to mission rehearsal to combat. 

The testers “appreciated the improved form factor, with the flip-up heads-up display and significantly improved center of gravity/weight distribution,” said David Patterson, a spokesman for PEO-Soldier. He said the positive reviews led the Army Acquisition Executive to greenlight IVAS 1.2 for more development and a planned operational assessment in fiscal 2025.

The new version has a slimmer visor and a new display that help reduce eyestrain and motion sickness, Microsoft says, as well as better night vision, compatibility with weapons optics, and more stable and reliable software. The system can also incorporate imagery from a rifle’s digital sight, allowing soldiers to peer around corners without exposing themselves to fire.

Defense One tested the new IVAS in a simulated training exercise at one of Microsoft’s Virginia offices. Compared to the original version, the new one is indeed smaller and lighter. The night-vision picture is clearer and crisper; a new option to fuse visible-light and thermal data helps when thermal imagery alone doesn’t work well, such as through glass.

IVAS comes with an embedded training tool, the Squad Immersive Virtual Trainer, which the Army hopes will become central to the way troops learn and practice their craft. It allows soldiers to fight holographic adversaries in realistic settings, then rewatch and learn from their efforts, down to the position of the gun at different moments. (When Defense One ran through several simulations, we did indeed improve over time, yet repeatedly died, as we are not actually good at this stuff.)

Microsoft delivered 20 IVAS 1.2 headsets in August; a spokesman said the company expects to deliver more by January for testing and refinement.

The Fort Drum event was a critical test for a system that has seen delays, scaled-back expectations, and complaints about neck strain and glitchy software. In April, Army acquisition head Doug Bush told lawmakers that the Army “is absolutely prepared to end that arrangement and seek a new competition.”

Big Ambitions

But the program’s problems reflect the Army’s big ambitions for the headset as much as they do any actual hardware and software issues.

“It’s not ready,” PEO-SOF Warrior’s Col. Anh Ha said of IVAS in May. “But it’s not because of any type of problems with industry. The technology that [Army leaders] want to get after is not there yet…They want to make sure they get the best, but they are building some right now to understand what they can and can’t do in the realm of possibility, technically, for IVAS.”

The program owes its origins to an earlier vision of future tech: the “Iron Man” suit that then-SOCOM chief Adm. William McRaven announced in 2013. The armored exoskeleton was supposed to give the wearer super abilities, like running through walls and swatting away bullets. It was ultimately crushed by the limitations of physics, particularly battery technology. But a portion of the vision lived on in the concept of the hyper-enabled operator, supplied with data and optical gear to understand the battlefield better than his or her adversary.

The Army has relatively near-term goals for IVAS, or whatever AR headset it eventually adopts, such as pulling in video feeds from soldier-borne drones like the Teledyne FLIR Black Hornet. But ultimately, the service wants to use artificial intelligence to collect and disseminate battlefield data much more quickly and effectively, with data flowing both from and to the headsets, the IVAS program’s head-up-display leader said in May.

“As we speak, the Army intel data platform is undergoing an operational test both here in the National Capital Region and all over the world. And that's gonna get deployed early next year. So [Army Data Platform] is a big data platform hosted on the cloud, multiple security enclaves,” Robert Luke told a crowd in Tampa. “Right now we're in the tens of millions of individual files and data objects and multiple security lanes, dozens of pipelines increasing every day…We have a warehouse full of data that is timely. We have countless sensors adding to it, basically, by the minute, in real time, 24 hours a day. So it's pretty obvious to us early on that we needed to be able to effectively manage this giant volume and complexity of data. The only way to do that is through artificial intelligence.”

The Army doesn’t want AI to replace human decision-making, Luke said, so much as allow human operators to make better decisions faster with much more data. IVAS, ultimately, is just one of several points where soldiers will access information about the world around them, allowing them to act faster than their foes not only when they are dismounted but also when they are operating vehicles and weapons.

“At the tactical level,” he said, “on a fighting platform, with an open-systems architectural approach, I can now take a…data feed; I can link that to the appropriate antenna that is listening in the EW spectrum; I can fuse that with multispectral imagery coming from multiple optics on the platform, and now present a synchronized, multimodal, ‘Hey, this is what the battlefield looks like around you. I recognize these things are threats. Shoot this one first.’

“And the gun does it by itself because I’ve…accelerated this decision cycle, filtering multimodal information into that vehicle commander, you know, synthesized for him, and he’s just pressing the button because he agreed with the prioritization and the identification,” he said. “If you’re going to do that, you’re basically out-fighting someone else’s OODA loop and you’re going to win every day.”