World’s Fastest Academic Supercomputer Unveiled in Texas



The new system will power new discoveries for researchers across the U.S. and around the world.

The Texas Advanced Computing Center at the University of Texas at Austin unveiled Frontera—the fastest supercomputer at any university and the fifth most powerful supercomputing system in the world. 

Funded through a $60 million award from the National Science Foundation and officially launched Tuesday, the system will support U.S. and international research teams as they work to solve some of the world’s most massive advanced computational challenges.

“The system itself is a remarkable system,” John West, TACC’s director of strategic initiatives and co-principal investigator on Frontera, told Nextgov. “It’s an incredible opportunity for open science to have access to a resource at this scale so that investment by the National Science Foundation is going to be incredibly important for discovery and innovation going forward.”

West, who previously led the Defense Department’s high-performance computing modernization program and was once responsible for supercomputing research and development across the agency’s enterprise, explained that NSF works with a variety of cyberinfrastructure providers across the country because running such state-of-the-art systems is extremely resource- and facility-intensive, requiring a great deal of floor space and immense amounts of power and cooling.

TACC already has several large-scale computing systems—including the 19th fastest system in the world, Stampede2—that solve a variety of highly complex computational jobs. But Frontera—Spanish for "frontier" and an allusion to the title of a 1945 report to President Harry Truman that led to the creation of NSF—will power even more cutting-edge discoveries.

“Frontera is different,” West said. “Its audience is really those scientists that need the most capable computational resources, so it will run less of a mix of jobs, focusing instead on scientists at the very tip of computational capability that we can provide today.” 

Through a solicitation first awarded in 2018, the system aims to act as a resource not just to UT students but to the entire open science community, meeting the needs of some of the largest science and engineering computational experiments being performed. West and his team at TACC have been constructing the system all year. The system will operate for at least five years, and in that time it will likely be used by thousands of researchers across nearly all fields of science.

“The focus is on supporting the entire research enterprise so it is across all the scientific disciplines,” West said. “And this is not just a UT resource, this is a resource for scientists all over the world to use.”

Those who want to run research on the system—and who can prove that they require a computer at Frontera’s scale to solve their problems—will be selected to use it through a competitive application process. Machines of this size are specialized and complex to operate, so TACC keeps specialists on hand to work directly with the researchers who use it. Faculty from the university’s Oden Institute for Computational Engineering and Sciences, along with partners from other schools including the California Institute of Technology, Ohio State University, Princeton University, the University of Chicago and the University of Utah, will lead Frontera’s science applications and technology team.

“The idea here is not only [to] provide the machine but provide the expertise that science needs to make use of the machine,” he said.

Insiders expect the system’s outputs to have an impact on a wide range of scientific research, including predicting the trajectory and intensity of future storms; genomics and precision agriculture; energy research ranging from solar power to cleaner coal; gravitational wave modeling; and the use of modeling and deep learning to accelerate the development of new molecules for medicine and engineering—and beyond.

West noted that some early adopters began using the machine to power their own research this summer. For example, a chemist from the University of North Carolina used Frontera to run more than 3 million atomic force field calculations in less than 24 hours, which is considered “a major achievement” in high-speed quantum computation. 

“These users have been great for testing out new installations and testing out where the machine might need a little extra attention,” West said.

Because computational resources and instruments at this scale are enormously expensive and extremely difficult to run, West said the strongest deployment models require the resources and oversight that the government can provide, combined with the initiative and innovation of scientists across the country. As part of the solicitation, NSF also awarded TACC a planning grant for a leadership class computing facility that will be designed and deployed at some point in the future. 

It will aim to offer 10 times the power Frontera currently provides. 

“This really shows the depth of the commitment [NSF] has made to the open science community in terms of the computing assets that they are making available,” West said. “They are saying, ‘not only do you have a system now that’s the fifth largest in the world, but we are going to stay with it, and in five to seven years, you’re going to have a system that’s 10 times the capability.’”