Built by science for science
Characteristic science applications shape future of U.S. NSF Leadership-Class Computing Facility
University of Texas at Austin
image: Astrophysicist James Stone at the Institute for Advanced Study is seeking to understand central problems in astrophysics — how matter accretes and how stars form in the turbulent interstellar medium. Understanding accretion disks is crucial for decoding the universe’s most spectacular events. Image shows logarithmic fluid-frame mass density and gas temperature in a poloidal slice through a tilted (45°), sub-Eddington accretion disk around an a = 15/16 black hole.
Credit: DOI: 10.48550/arXiv.2409.16053
In 2026, U.S. scientists will gain a new powerhouse for discovery: Horizon, the largest open science supercomputer in the nation. With its extraordinary computing muscle, Horizon will accelerate basic research that drives innovation, fuels new industries, and strengthens the U.S. economy while probing the Universe’s greatest mysteries.
Thanks to the National Science Foundation’s support for basic and transformative research, the Texas Advanced Computing Center (TACC) began building the U.S. NSF Leadership-Class Computing Facility (LCCF) in 2024. This innovative facility represents the next leap in large-scale computing, built to tackle the most demanding scientific problems in the nation’s research portfolio during the next decade.
“Today, almost all advanced products are a result of basic research discoveries,” said NSF LCCF Principal Investigator Dan Stanzione, who also serves as executive director of TACC. “Basic questions in science can have trillion-dollar implications, so it’s critical to invest in them.”
The NSF elevated the LCCF to a full computing facility with sustained funding, placing it on par with other large, strategic scientific initiatives such as the James Webb Space Telescope and the IceCube Neutrino Observatory.
When fully operational, the LCCF will function as a distributed facility with many partners, each contributing unique computational power, data analytics expertise, software, services, education, and training. Together, they will empower breakthroughs that wouldn’t be possible without such scale and collaboration.
And history shows why that matters: NSF-backed research has seeded game-changing technologies, including the early innovations that led to the cellphone, manufacturing advances such as 3D printing, and the foundational AI research behind digital assistants, facial recognition, and image generators, to name a few.
Beyond Benchmarks
The LCCF’s design is being shaped from the ground up by science itself.
At its core is the Characteristic Science Applications (CSA) program — 11 cutting-edge software projects built to tackle critical challenges that will define the facility’s success.
“We’re building a system that’s not just powerful, but purpose-built for the challenges scientists face and for the way scientists work,” Stanzione said.
The CSA approach departs from the way supercomputers have historically been designed. Instead of building the system around abstract benchmarks such as the LINPACK test, which measures how fast a machine solves dense systems of linear equations, the CSAs are built around real science applications and are designed not to hit a particular benchmark score but to accomplish new science at unprecedented scale.
After all, top benchmark scores do not always translate to top performance on the complex problems researchers need to solve. What’s more, artificial intelligence is now a driving force in scientific modeling and simulation, transforming how researchers explore and solve complex problems.
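To make the contrast concrete, the sketch below shows what a LINPACK-style measurement boils down to: timing a dense linear solve and reporting a floating-point rate. It is a minimal, hypothetical example in Python/NumPy, not the actual HPL benchmark used for TOP500 rankings, and it says nothing about how a real application with irregular data access or heavy communication will behave.

```python
# Minimal sketch of what a LINPACK-style benchmark measures: the time to
# factor and solve a dense linear system A x = b, reported as a FLOP rate.
# This is an illustration in NumPy, not the HPL benchmark itself.
import time
import numpy as np

n = 4000                                  # matrix dimension (illustrative)
rng = np.random.default_rng(0)
A = rng.standard_normal((n, n))
b = rng.standard_normal(n)

start = time.perf_counter()
x = np.linalg.solve(A, b)                 # LU factorization + triangular solves
elapsed = time.perf_counter() - start

flops = (2.0 / 3.0) * n ** 3              # leading-order cost of LU factorization
rate = flops / elapsed / 1e9
residual = np.linalg.norm(A @ x - b)
print(f"n = {n}: {elapsed:.2f} s, ~{rate:.1f} GFLOP/s, residual {residual:.1e}")
```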
“In addition to ensuring that we assess the capability of the LCCF in terms of real science and not just abstract benchmarks, the team is leading community researchers through the process of gaining real-world experience tuning applications for the new architecture,” John West, NSF LCCF deputy director, explained. “Our experiences from these codes will then shape the training and resources we provide to users when the machine becomes available, helping to ensure that communities can get up to speed quickly on the new system.”
The LCCF is the next phase of the Frontera project, which launched in 2019 and for six consecutive years has ranked as the nation’s most powerful academic supercomputer.
The current design for the LCCF includes a supercomputing system with 10 times the computational performance of Frontera, along with storage systems and associated infrastructure; testbeds with innovative technologies; computational scientists who will work with facility users to transition science application codes to the new capability; and training to ensure that the workforce can make full use of the facility over the next decade.
TACC’s Vista system bridges the gap between the end of Frontera and the deployment of the LCCF Horizon supercomputer, expected to launch later in 2026.
TACC and the CSA science teams are learning from their experiences on Vista and Frontera and applying that knowledge to optimize their use of Horizon, which will have an architecture similar to Vista’s but at a much larger scale.
“We chose problems that would clearly demonstrate the impact of our architectural decisions and performance projections,” said John Cazes, director of high performance computing at TACC. “We also made it a priority to keep scientists closely engaged through long-term partnerships, ensuring we stay aligned with their evolving challenges and can adapt immediately when their needs change.”
Meet the 11 innovative CSA projects making an impact.
NAMD: Molecular Mechanisms of Viral Infection
Emad Tajkhorshid of the University of Illinois at Urbana-Champaign uses molecular dynamics to track how viruses change over time and space. The work relies on NAMD, an open-source software package developed at the university’s NIH Center for Macromolecular Modeling and Visualization that enables massively parallel simulations of complex biological systems.
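NAMD itself is a large parallel C++ code built on full molecular force fields. The toy sketch below only illustrates the core idea behind molecular dynamics: integrating Newton’s equations of motion for interacting particles with a velocity Verlet step, here using a simple Lennard-Jones pair potential as a stand-in.

```python
# Toy molecular dynamics: velocity Verlet integration for a handful of
# particles interacting through a Lennard-Jones pair potential. Illustrative
# only; production codes such as NAMD use full force fields, neighbor lists,
# and massively parallel domain decomposition.
import numpy as np

def pair_forces(pos, eps=1.0, sigma=1.0):
    """Brute-force O(N^2) Lennard-Jones forces (no cutoff, no periodic box)."""
    forces = np.zeros_like(pos)
    n = len(pos)
    for i in range(n):
        for j in range(i + 1, n):
            r = pos[i] - pos[j]
            d2 = r @ r
            inv6 = (sigma * sigma / d2) ** 3
            fmag = 24.0 * eps * (2.0 * inv6 * inv6 - inv6) / d2
            forces[i] += fmag * r
            forces[j] -= fmag * r
    return forces

def velocity_verlet(pos, vel, dt=0.002, steps=200, mass=1.0):
    """Advance positions and velocities with the velocity Verlet scheme."""
    f = pair_forces(pos)
    for _ in range(steps):
        pos = pos + vel * dt + 0.5 * (f / mass) * dt * dt
        f_new = pair_forces(pos)
        vel = vel + 0.5 * (f + f_new) / mass * dt
        f = f_new
    return pos, vel

# start 64 particles on a cubic lattice with spacing wider than the LJ minimum
grid = np.arange(4) * 1.5
pos = np.array([(x, y, z) for x in grid for y in grid for z in grid], float)
vel = np.zeros_like(pos)
pos, vel = velocity_verlet(pos, vel)
print("kinetic energy after 200 steps:", 0.5 * np.sum(vel ** 2))
```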
WESTPA: Application to Delta SARS-CoV-2 Spike Opening in Respiratory Aerosols
Chemist Lillian Chong of the University of Pittsburgh is leading a project that connects with research by Rommie Amaro of the University of California at San Diego, who built a 1 billion-atom model of the Delta strain of the coronavirus inside a respiratory aerosol.
Chong and Amaro previously showed that the weighted ensemble approach can efficiently capture rare events, making it possible to run detailed simulations of the spike protein opening in the original coronavirus strain. This research was part of an international collaboration that won the 2020 Gordon Bell Special Prize for COVID-19 research.
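The weighted ensemble method behind WESTPA runs many short trajectories in parallel, each carrying a statistical weight, and periodically splits trajectories that reach rarely visited regions while merging those in well-sampled ones so that total probability is conserved. The sketch below is a minimal, hypothetical illustration of that resampling loop on a 1D random walk; it is not WESTPA’s actual machinery, which manages ensembles of full molecular dynamics trajectories.

```python
# Minimal weighted-ensemble resampling sketch on a 1D random walk with a
# drift toward x = 0, so that reaching x = TARGET is a rare event. Each
# "walker" here is just (position, statistical weight).
import numpy as np

rng = np.random.default_rng(2)
TARGET = 4.0
N_BINS = 8
WALKERS_PER_BIN = 4
BIN_WIDTH = TARGET / N_BINS

walkers = [(0.0, 1.0 / 8)] * 8          # all probability starts at x = 0

for iteration in range(200):
    # 1) propagate every walker with a short stochastic "dynamics" segment
    walkers = [(float(np.clip(x + rng.normal(-0.05, 0.3), 0.0, TARGET)), w)
               for x, w in walkers]

    # 2) resample within each occupied bin: keep WALKERS_PER_BIN walkers,
    #    chosen in proportion to weight, while conserving the bin's weight
    bins = {}
    for x, w in walkers:
        b = min(int(x / BIN_WIDTH), N_BINS - 1)
        bins.setdefault(b, []).append((x, w))

    walkers = []
    for members in bins.values():
        bin_weight = sum(w for _, w in members)
        probs = np.array([w for _, w in members]) / bin_weight
        picks = rng.choice(len(members), size=WALKERS_PER_BIN, p=probs)
        walkers += [(members[k][0], bin_weight / WALKERS_PER_BIN) for k in picks]

probability_near_target = sum(w for x, w in walkers if x >= TARGET - BIN_WIDTH)
print(f"estimated probability in the last bin: {probability_near_target:.3e}")
```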
ChaNGa: Evolution of Baryons and Galaxies Across the Age of the Universe
Astrophysicist Tom Quinn of the University of Washington is unraveling how galaxies take shape, tracing their evolution across cosmic time and from the smallest structures to the grandest formations.
With the NSF LCCF, his team aims to build a unified model of galaxy formation, spanning everything from million-solar-mass galaxies to trillion-solar-mass clusters.
Enzo-E: Accelerating Cosmological Simulations of the First Galaxies Through Deep Learning
Astrophysicist Michael Norman of UC San Diego is simulating a vast population of distant, high-redshift galaxies, some of the earliest and most mysterious in the universe. Sparked by recent JWST discoveries that revealed an unexpectedly large number of these bright galaxies, Norman’s work seeks to unravel their true nature, one of astronomy’s biggest unanswered questions.
Athena++: Astrophysical Fluid Dynamics at Exascale
Astrophysicist James Stone at the Institute for Advanced Study is seeking to understand central problems in astrophysics — how matter accretes and how stars form in the turbulent interstellar medium. Understanding accretion disks is crucial for decoding the universe’s most spectacular events.
PSDNS: Large-Scale Direct Numerical Simulation (DNS)
Pui-Kuen Yeung of the Georgia Institute of Technology is tackling the “turbulence problem,” a top-tier grand challenge for the world’s most powerful supercomputers.
His previous simulations have traced the motion of more than a billion particles, precisely calculating the trajectory and acceleration of every fluid element as it traverses regions of extreme, short-lived fluctuations, a phenomenon known as Lagrangian intermittency in turbulence.
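PSDNS itself is a pseudo-spectral direct numerical simulation code run at enormous scale. The sketch below only illustrates the Lagrangian viewpoint such simulations enable: advecting tracer particles through a velocity field and recording the acceleration each one experiences along its path, here using a simple analytic vortex flow as a stand-in for a computed turbulence field.

```python
# Minimal sketch of Lagrangian particle tracking: advect tracer particles
# through a prescribed 2D velocity field (a steady Taylor-Green-like vortex
# array, standing in for a DNS velocity field) and record the acceleration
# each particle experiences along its trajectory.
import numpy as np

def velocity(p, t):
    """Analytic 2D velocity field u(x, y); steady, so t is unused here.
    A DNS code would interpolate from its computed turbulence field."""
    x, y = p[:, 0], p[:, 1]
    u = np.cos(x) * np.sin(y)
    v = -np.sin(x) * np.cos(y)
    return np.column_stack([u, v])

def track(particles, dt=0.01, steps=500):
    """Advance particles with midpoint (RK2) integration and log
    finite-difference accelerations along each trajectory."""
    accels = []
    vel_prev = velocity(particles, 0.0)
    for n in range(steps):
        t = n * dt
        k1 = velocity(particles, t)
        k2 = velocity(particles + 0.5 * dt * k1, t + 0.5 * dt)
        particles = particles + dt * k2
        vel_now = velocity(particles, t + dt)
        accels.append((vel_now - vel_prev) / dt)   # Lagrangian acceleration
        vel_prev = vel_now
    return particles, np.array(accels)

rng = np.random.default_rng(3)
particles = rng.uniform(0, 2 * np.pi, size=(1000, 2))   # 1,000 tracers
particles, accels = track(particles)
print("max |acceleration| seen along trajectories:", np.abs(accels).max())
```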
EPW: Quantum Materials Engineering at the Exascale
Sabyasachi Tiwari and Feliciano Giustino of UT Austin plan to scale up the EPW code by developing hierarchical MPI/OpenMP parallelization. EPW is a powerful open-source tool that excels at probing properties such as the electrical conductivity of semiconductors and the onset of superconductivity in quantum materials.
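EPW itself is written in Fortran; the sketch below is only a generic, hypothetical illustration of what hierarchical parallelization means, using mpi4py to split the global communicator into a node-local group (the level where OpenMP threading would operate in a Fortran/C code) and an inter-node group of leader ranks that distributes coarse-grained work units.

```python
# Generic sketch of hierarchical (two-level) parallelism with mpi4py.
# Run with, e.g.:  mpirun -n 8 python hierarchy.py
# Level 1 groups the ranks that share a node; level 2 links one leader rank
# per node. Node-local ranks here stand in for what OpenMP threads would do
# in a hybrid MPI/OpenMP code.
from mpi4py import MPI

world = MPI.COMM_WORLD

# Level 1: a communicator spanning the ranks on this shared-memory node
node_comm = world.Split_type(MPI.COMM_TYPE_SHARED)
node_rank = node_comm.Get_rank()

# Level 2: a communicator linking the leader (node_rank == 0) of every node
color = 0 if node_rank == 0 else MPI.UNDEFINED
leader_comm = world.Split(color, world.Get_rank())

# Distribute coarse work units (e.g., k-points) round-robin across nodes ...
N_UNITS = 16
if leader_comm != MPI.COMM_NULL:
    my_units = list(range(leader_comm.Get_rank(), N_UNITS, leader_comm.Get_size()))
else:
    my_units = None
my_units = node_comm.bcast(my_units, root=0)   # share the node's assignment

# ... then split each unit's inner work among the ranks on the node
for unit in my_units:
    inner_slice = node_rank
    # compute the contribution of (unit, inner_slice) here

world.Barrier()
if world.Get_rank() == 0:
    print(f"{world.Get_size()} ranks shared {N_UNITS} work units hierarchically")
```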
MuST: Electron Localization in Materials
Yang Wang and colleagues at the Pittsburgh Supercomputing Center are optimizing the open-source software package MuST (Multiple Scattering Theory). MuST is well suited for investigating solid-state materials with complex or disordered crystal structures.
SeisSol: Off-Fault Inelastic Processes and Fluid Effects in Earthquake Simulation
Geophysicist Alice Gabriel of UC San Diego leads the SeisSol project, which develops earthquake simulation software to model how seismic waves travel through 3D environments and how faults rupture during earthquakes.
AWP-ODC: Seismic Simulation for Hazard Management
Yifeng Cui of the San Diego Supercomputer Center is advancing earthquake modeling. His team is adapting AWP-ODC — a powerful seismic wave simulation tool widely used by Southern California Earthquake Center researchers — for the LCCF Horizon system. The team aims to improve statewide seismic hazard maps that predict ground motion across California for the next 50 years.
MILC: Lattice QCD for Flavor Physics
Physicist Steven Gottlieb of Indiana University is taking on a central challenge in high-energy physics: finding signs of physics beyond the Standard Model, either by discovering new subatomic particles or through precision tests of Standard Model predictions.
The latter approach is used by experiments such as the Large Hadron Collider beauty (LHCb) experiment at CERN, Japan’s Belle II experiment, and the Beijing Spectrometer III experiment, which are at the forefront of quark physics.
Powering Discovery, Improving Life
Scientific computing affects people’s lives in more ways than most people realize, even if they never step into a lab or touch a supercomputer: advancing medicine and health care, protecting the environment and public health, enhancing safety and infrastructure, enabling technological innovation, and improving everyday digital life.
“The transistor is the most economically important discovery of the 20th century,” Stanzione said. “This basic research led people to the discovery of semiconductor materials. The possible combinations that can lead to discovering new materials are infinite — you can’t do any of this without our ability to do computation and virtual experimentation.”
The NSF LCCF will collaborate with four distributed science centers to ensure that researchers across the country have access to its resources and services:
- Morehouse Supercomputing Facility
- National Center for Supercomputing Applications at the University of Illinois Urbana-Champaign
- Pittsburgh Supercomputing Center, a joint center of Carnegie Mellon University and the University of Pittsburgh
- San Diego Supercomputer Center at the University of California San Diego
In addition to the distributed centers deploying hardware resources, Ohio State University will advance the software stack for high performance networking, and Cornell University will aid in workforce development.