Data analysis and storage

The Atlas Computing Cluster is the world's largest and most powerful resource dedicated to gravitational wave searches and data analysis.

The Atlas computing cluster at the Max Planck Institute for Gravitational Physics in Hannover is the world's largest cluster for gravitational-wave data analysis.

Atlas belongs to Bruce Allen's "Observational Relativity and Cosmology" division at the Max Planck Institute for Gravitational Physics in Hannover. GEO600 data are transferred to servers in the Atlas cluster room, where they are stored on the cluster's hierarchical storage management system; other servers analyse part of the data for detector characterisation.

Atlas was officially launched in May 2008 with 1344 quad-core compute nodes. One month later it was ranked number 58 on the June 2008 Top 500 list of the world's fastest computers. At that time it was the sixth fastest computer in Germany.

In early 2020, the cluster was extended to more than 50,000 physical CPU cores (about 90,000 logical ones) in 3,000 compute servers. These servers range from 2,000 older systems with 4 CPU cores and 16 GB RAM each, through 550 systems with 28 CPU cores and 192 GB RAM, to the latest 444 systems with 64 CPU cores and 512 GB RAM each. In addition, about 350 specialized high-performance graphics processing unit (GPU) cards were installed alongside the existing set of about 2,000 for special-purpose computing. These additions raise the theoretical peak computing power of Atlas to more than 2 PFLOP/s.
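As a rough illustration of where such a peak figure comes from, the following Python sketch multiplies node counts by cores per node, clock rate, and floating-point operations per cycle. The node and core counts are taken from the figures above; the clock rates and FLOPs-per-cycle values are illustrative assumptions, not published specifications of the actual hardware, and the GPU cards would add further peak performance on top of this CPU-only estimate.

# Rough estimate of a cluster's theoretical CPU peak performance.
# Node and core counts follow the figures quoted above; the clock rates
# (GHz) and FLOPs per cycle per core are ASSUMED values for illustration
# only, not specifications of the actual Atlas hardware.
node_classes = [
    # (nodes, cores per node, assumed GHz, assumed FLOPs/cycle/core)
    (2000, 4,  2.5,  8),   # older 4-core systems
    (550,  28, 2.4, 16),   # 28-core systems
    (444,  64, 2.3, 16),   # newest 64-core systems
]

peak_flops = sum(nodes * cores * ghz * 1e9 * flops_per_cycle
                 for nodes, cores, ghz, flops_per_cycle in node_classes)

print(f"Estimated CPU-only theoretical peak: {peak_flops / 1e15:.2f} PFLOP/s")
# GPU cards contribute additional peak performance beyond this CPU figure.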
