News Release

Throughput computing enables astronomers to use AI to decode iconic black holes

Peer-Reviewed Publication

Morgridge Institute for Research

Image: Artist's impression of a neural network that connects the observations (left) to the models (right). Credit: EHT Collaboration/Janssen et al.

MADISON — An international team of astronomers has used artificial intelligence (AI), training a neural network on millions of synthetic simulations, to tease out new cosmic curiosities about black holes, revealing that the one at the center of our Milky Way is spinning at nearly top speed.

These large ensembles of simulations were generated by throughput computing capabilities provided by the Center for High Throughput Computing (CHTC), a joint entity of the Morgridge Institute for Research and the University of Wisconsin-Madison. The astronomers published their results and methodology today in three papers in the journal Astronomy & Astrophysics.

High-throughput computing, celebrating its 40th anniversary this year, was pioneered by Wisconsin computer scientist Miron Livny. It’s a novel form of distributed computing that automates computing tasks across a network of thousands of computers, essentially turning a single massive computing challenge into a supercharged fleet of smaller ones. This computing innovation is helping fuel big-data discovery across hundreds of scientific projects worldwide, including searches for cosmic neutrinos, subatomic particles and gravitational waves, as well as efforts to unravel antibiotic resistance.

In 2019, the Event Horizon Telescope (EHT) Collaboration released the first image of a supermassive black hole at the center of the galaxy M87. In 2022, they presented the image of the black hole at the center of our Milky Way, Sagittarius A*. However, the data behind the images still contained a wealth of hard-to-crack information. An international team of researchers trained a neural network to extract as much information as possible from the data.

From a handful to millions

Previous studies by the EHT Collaboration used only a handful of realistic synthetic data files. Funded by the National Science Foundation (NSF) as part of the Partnership to Advance Throughput Computing (PATh) project, the Madison-based CHTC enabled the astronomers to feed millions of such data files into a so-called Bayesian neural network, which can quantify uncertainties. This allowed the researchers to make a much better comparison between the EHT data and the models.
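The papers describe the team's Zingularity framework in detail; as a loose illustration of the underlying idea of uncertainty-aware prediction, here is a minimal Monte Carlo dropout sketch. All names, data, and numbers below are invented for illustration; this is not the collaboration's actual code or model.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "synthetic observations": random features mapped to one target
# parameter (say, a stand-in for black hole spin). Purely illustrative.
X = rng.normal(size=(200, 4))
true_w = np.array([0.5, -1.0, 0.3, 0.8])
y = X @ true_w + 0.1 * rng.normal(size=200)

def predict_with_dropout(x, w, n_samples=500, drop_p=0.5):
    """Monte Carlo dropout: repeat the forward pass with random
    weight dropout and report the spread of the predictions."""
    preds = []
    for _ in range(n_samples):
        mask = rng.random(w.shape) > drop_p          # randomly drop weights
        preds.append(x @ (w * mask) / (1 - drop_p))  # rescale survivors
    preds = np.array(preds)
    return preds.mean(), preds.std()  # estimate + uncertainty

# Ordinary least squares as a stand-in for training the network
w_fit, *_ = np.linalg.lstsq(X, y, rcond=None)

mean, std = predict_with_dropout(X[0], w_fit)
```

The spread (`std`) plays the role of the quantified uncertainty: a Bayesian network reports not just a best-fit value but how confident it is in that value, which is what lets the researchers weigh models against the EHT data.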

Thanks to the neural network, the researchers now suspect that the black hole at the center of the Milky Way is spinning at nearly its top speed, with its rotation axis pointing toward Earth. In addition, the emission near the black hole is caused mainly by extremely hot electrons in the surrounding accretion disk rather than by a so-called jet. The magnetic fields in the accretion disk also appear to behave differently from what the usual theories of such disks predict.

"That we are defying the prevailing theory is of course exciting," says lead researcher Michael Janssen, of Radboud University Nijmegen, the Netherlands. "However, I see our AI and machine learning approach primarily as a first step. Next, we will improve and extend the associated models and simulations."

Impressive scaling

"The ability to scale up to the millions of synthetic data files required to train the model is an impressive achievement," adds Chi-kwan Chan, an Associate Astronomer of Steward Observatory at the University of Arizona and a longtime PATh collaborator. "It requires dependable workflow automation, and effective workload distribution across storage resources and processing capacity."

“We are pleased to see EHT leveraging our throughput computing capabilities to bring the power of AI to their science,” says Professor Anthony Gitter, a Morgridge Investigator and a PATh Co-PI. “Like in the case of other science domains, CHTC’s capabilities allowed EHT researchers to assemble the quantity and quality of AI-ready data needed to train effective models that facilitate scientific discovery.”

The NSF-funded Open Science Pool, operated by PATh, offers computing capacity contributed by more than 80 institutions across the United States. The Event Horizon black hole project performed more than 12 million computing jobs in the past three years.

“A workload that consists of millions of simulations is a perfect match for our throughput-oriented capabilities that were developed and refined over four decades,” says Livny, director of the CHTC and lead investigator of PATh. “We love to collaborate with researchers who have workloads that challenge the scalability of our services.”

Scientific papers referenced 

• Deep learning inference with the Event Horizon Telescope I. Calibration improvements and a comprehensive synthetic data library. By: M. Janssen et al. In: Astronomy & Astrophysics, 6 June 2025.

• Deep learning inference with the Event Horizon Telescope II. The Zingularity framework for Bayesian artificial neural networks. By: M. Janssen et al. In: Astronomy & Astrophysics, 6 June 2025.

• Deep learning inference with the Event Horizon Telescope III. Zingularity results from the 2017 observations and predictions for future array expansions. By: M. Janssen et al. In: Astronomy & Astrophysics, 6 June 2025.

 


Disclaimer: AAAS and EurekAlert! are not responsible for the accuracy of news releases posted to EurekAlert! by contributing institutions or for the use of any information through the EurekAlert system.