Video Game Technology Tapped to Medically Train Astronauts for Future Moon and Mars Missions

NASA astronaut Andrew Morgan poses for a portrait with a stethoscope for medical checks inside the U.S. Destiny laboratory module after an exercise session. NASA

Imagine this nightmare scenario. A crew of NASA astronauts is aboard a spacecraft hurtling toward Mars on one of the first human-crewed missions to the red planet. Suddenly, one of the crew has a medical emergency. They might even pass out, so they can’t tell their fellow crewmembers what is wrong. Maybe they are having a heart attack, or an aneurysm, or maybe it’s something more mundane, like bad space ice cream. There is medical equipment on board the ship and a flight surgeon on call back on Earth, but, depending on when the emergency takes place during the journey, it might take a half hour or more for the crew to communicate with the doctor. By the time lifesaving medical advice is received, it might be too late.

One solution to this dilemma, which could become a very real possibility as NASA programs like Artemis get ready to send humans to the Moon and eventually Mars, would be to give astronauts both basic and advanced medical training. That way, they can triage or even begin treatment in emergencies long before word reaches Houston about the problem.

That was the goal of a 2019 grant award from the Translational Research Institute for Space Health, or TRISH, to build a virtual human simulation framework for NASA to help improve medical care during space missions. The end result had to be as realistic as possible, because ultimately, lives might depend on the quality of that training.

The award was won by a company called Level Ex, which uses video game technology to serve the medical community. From virtual reality surgery games to mobile phone titles that help train dermatologists using AI-generated skin diseases, Level Ex is no stranger to using entertaining, cutting-edge video game technology to train people about very serious subjects.

The program funded by the grant award is now complete and is scheduled to be part of the upcoming SpaceX Polaris Dawn mission that is set to orbit the Earth for several days in March.

NextGov caught up with Level Ex founder and CEO Sam Glassenberg to talk about making video games to support critical missions, medical training in space and how technology might take these very serious games and simulations to the next level.

NextGov: Can you first tell us a little bit about your background in video games and game technology?

Glassenberg: I started my career at LucasArts, where I created Star Wars games for the PlayStation 2 and Xbox consoles. From there I served as a leader of the DirectX Graphics team at Microsoft, where our focus was to advance the visual realism of video games across the industry. Prior to starting Level Ex, I was the CEO of the leading independent game publisher in Hollywood, releasing games based on popular films like The Hunger Games and Mission: Impossible. 

NextGov: When and how did you decide to found Level Ex and start making more serious games and simulations?

Glassenberg: I would love to say that this was some grand idea that I had, but honestly this company was founded completely by accident—aided by a bit of parental guilt. Coming from a family of doctors, I was always seen as the disgrace for bucking that trend and making video games for a living. My father kept pushing me to go to medical school until I was about 30 years old, and when that wasn’t feasible anymore he took a new approach. 

An anesthesiologist by trade, he called me up asking if I could put this gaming “nonsense” to good use. He wanted me to make a game to train his colleagues to perform a fiberoptic intubation—a tricky procedure that even experienced anesthesiologists can struggle with. He said, “Make me a game that can run on their phones—I don’t want to have to drag anyone to the training center.”

So that's what I did. I sat down for three weekends to throw this game together to get my father off of my back. Three years later, he called me to check on the game, and when I checked the Apple store, there were over 100,000 downloads from doctors, nurses and airway specialists. 

Clearly there was tremendous demand for this and an opportunity for some real impact.

NextGov: Why is it a good thing to use video game technology for medical training? How does typical video game technology compare with what is used by existing medical simulations?

Glassenberg: The video games industry is decades ahead of medical training in terms of graphics technology—even before you start talking about XR headsets and whatnot. We’ve been busy closing that gap—building on top of the latest game technology to create ultra-realistic patient scenarios. 

For example, we’ve built on top of video game skin rendering technology to create a platform that allows us to recreate any skin disease on skin of any color and any body part—and it’s indistinguishable from photographs. We’ve used graphics hardware honed for games to do everything from real-time fluid and tissue simulation to tracing ultrasound waves and X-rays.

We’ve also built upon cloud gaming technology to create the first and only cloud gaming platform for healthcare. This allows multiple medical professionals to remotely examine, diagnose and perform surgery on virtual patients simulated in the cloud. Since all the computing is cloud-based, users can access it from any phone, tablet or desktop web browser over Zoom or Microsoft Teams. There’s no need to install any application or have any additional hardware, like a VR headset, on hand. 

NextGov: Tell us about the TRISH grant program for NASA. How real did they need the game to be in order to help train astronauts? 

Glassenberg: Ultra-realism was vital to the Virtual Human Simulation Framework project, in order for users to accurately identify anatomy and medical conditions common in space. Such medical conditions can include Spaceflight Associated Neuro-ocular Syndrome, or SANS; cardiac rhythm problems and changes in the shape of the heart due to prolonged microgravity exposure, specifically atrial fibrillation; and adverse health effects due to host-microorganism interaction, specifically respiratory infections.

To this end, the project required aggregation and visualization of data from spaceflight medical research studies, terrestrial medical research and best practice guidelines. Level Ex has medical professionals on staff, a physician advisor and a large contributor network of hundreds of specialists across the globe. For this project, we also consulted with experts in ultrasound mechanics and clinical radiology from the University of Illinois Chicago, NASA and the KBR company.

Our research and development team created some of the most advanced real-time ultrasound simulation technology available to date and demonstrated its application within a virtual headset platform. Our tech essentially tricks graphics hardware that is designed to simulate visible light in video games to instead trace ultrasound waves, recreating the complex artifacts that result from everything from acoustic shadowing—because, yes, sound casts a shadow—to ringdown effects caused by ultrasound waves echoing inside tissues. 

To train crews on these types of risks, our training scenario provides users with an integrated clinical training environment that includes clinical decision mechanics, virtual astronaut models and real-time ultrasound simulation technology that offers the level of ultra-realism necessary to image and identify different conditions. 

NextGov: How is the Level Ex simulation and technology being used on the upcoming Polaris Dawn mission?

Glassenberg: Crews must be prepared to diagnose and treat themselves as medical issues arise during space exploration missions, and intelligent medical imaging solutions are integral in building non-physician astronauts’ understanding of changes happening inside the body when they are thousands of miles from home.

Building on our prior work and in collaboration with TRISH and KBR, the solution we built for the Polaris Dawn mission consists of two parts—both aimed at enabling astronauts to better monitor their health and maximize their safety in space. First, there is the pre-flight orientation and training guide to teach the crew how to use a handheld Butterfly iQ device for ultrasound imaging that will be onboard the spacecraft. Then, during the five-day orbit mission, the crew will use just-in-time training and procedural guidance that we created to perform the ultrasound procedures on themselves and collect data. In order to learn more about how the zero-gravity environment influences the human body, the crew will be tracking their blood flow patterns daily.

This experiment will also test the efficacy of using virtual training solutions like video games for just-in-time training on medical technology and procedures. 

NextGov: That all sounds really amazing. Do you think that video game technology and these kinds of realistic medical simulations will become a requirement for keeping future astronauts safe in space?

Glassenberg: If space travel is going to become commercialized, we’re going to need solutions to quickly train laypeople how to deal with medical emergencies in resource-constrained environments. It will be a long time before we see an MRI machine up in space, but with just-in-time training on a handheld ultrasound device, we can address some of deep space’s very specific healthcare training needs. 

There’s also the issue of training time. Astronauts trained for years for the eight-day Apollo missions. There’s no time in the training regimen for upcoming multiyear Mars missions to provide astronauts with a medical degree in space health. In many cases, astronauts will need to train independently on a just-in-time basis as issues arise. This virtual training platform is designed to do just that.

Whereas traditional in-person training often requires expensive simulation labs, which are not always accessible and may require travel, our video games can be accessed anywhere, at any time. Additionally, the virtual nature of our platform allows us to easily update and scale the games to address the latest advances in a particular procedure or tailor them to specific needs. 

John Breeden II is an award-winning journalist and reviewer with over 20 years of experience covering technology. He is the CEO of the Tech Writers Bureau, a group that creates technological thought leadership content for organizations of all sizes. Twitter: @LabGuys