Researchers at Argonne National Laboratory are using machine learning to investigate the mysteries of dark matter

TL;DR:

  • Scientists are using machine learning (ML) and supercomputing resources at Argonne National Laboratory to explore the mysteries of dark matter.
  • They’re developing ML algorithms to understand interactions between atomic nuclei and dark matter candidate particles.
  • The Aurora exascale supercomputer will enable calculations previously impossible on petascale machines.
  • The team’s ML software accelerates lattice quantum chromodynamics (LQCD) calculations, which are central to the research.
  • Intel compute engines and the oneAPI initiative enhance Aurora’s capabilities for deep learning and ML.
  • Access to an exascale supercomputer promises breakthroughs in nuclear physics and dark matter research.

Main AI News:

The enigma of dark matter remains one of science’s most stubborn puzzles, challenging our understanding of the universe’s fundamental workings. While scientists have made significant strides in deciphering particle interactions at atomic and subatomic scales, dark matter continues to evade conventional scientific instruments and computational methods.

However, a dedicated team of researchers, armed with cutting-edge supercomputing resources at the US Department of Energy’s Argonne National Laboratory, is embarking on an ambitious journey to unravel the secrets of dark matter. Leveraging the remarkable capabilities of machine learning, they are poised to unlock a new era of exploration on the Aurora exascale high-performance computer (HPC) system.

The Quest for Answers: Dark Matter Research Team

The research is led by co-principal investigators Dr. William Detmold and Dr. Phiala Shanahan of MIT. Complementing their expertise, the collaboration extends across institutions, with researchers from New York University and international partners, all united under the banner of the US Lattice Quantum Chromodynamics (USQCD) collaboration. This nationwide infrastructure, dedicated to LQCD hardware and software, underscores the project’s scale and ambition.

Crucial financial support for this endeavor comes from the US Department of Energy and the National Science Foundation, with software development bolstered by a grant from the Department of Energy’s Scientific Discovery through Advanced Computing (SciDAC) program. Notably, the project holds the distinction of being an awardee in the Argonne Leadership Computing Facility’s (ALCF) Early Science Program (ESP) for Aurora, further highlighting its strategic importance.

Unveiling the Standard Model and the Dark Matter Enigma

The foundation of particle physics rests upon the standard model, a comprehensive theory that describes the strong force, weak force, and electromagnetism, albeit excluding gravity. In this framework, protons and atomic nuclei are composed of quarks and gluons, the elementary building blocks of visible matter.

However, in the cosmic tapestry, dark matter remains an enigmatic thread, detected solely through its gravitational influence, eluding direct observation. Dr. Detmold emphasizes, “Our ESP team’s research centers on quantum chromodynamics (QCD), shedding light on how quarks interact within the atomic nucleus. Through LQCD simulations and contemporary physics experiments, we strive to unravel the atomic constituents and their potential interplay with dark matter.”

Machine Learning: Illuminating the Dark Path

To navigate the complex computations inherent in LQCD, the team has developed its own machine learning software. Dr. Detmold elaborates, “Certain segments of LQCD calculations present formidable computational bottlenecks. Our ML software is meticulously crafted to expedite HPC algorithms, particularly in matrix inversions and extensive linear algebra computations.”
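
One common pattern for the kind of matrix-inversion bottleneck Dr. Detmold describes is to let a trained model supply the starting guess for an iterative Krylov solver, so that fewer iterations are needed to reach a fixed tolerance. The sketch below is a hedged illustration of that idea, not the team’s actual software: the matrix standing in for a lattice operator, the `guess_net` model, and all sizes and tolerances are assumptions made for demonstration.

```python
import torch

def conjugate_gradient(A, b, x0, tol=1e-8, max_iters=500):
    """Plain conjugate gradient for a symmetric positive-definite A.

    Returns the solution and the number of iterations used."""
    x = x0.clone()
    r = b - A @ x                      # initial residual
    p = r.clone()
    rs = r @ r
    for i in range(max_iters):
        Ap = A @ p
        alpha = rs / (p @ Ap)
        x = x + alpha * p
        r = r - alpha * Ap
        rs_new = r @ r
        if rs_new.sqrt() < tol:        # converged to the requested tolerance
            return x, i + 1
        p = r + (rs_new / rs) * p
        rs = rs_new
    return x, max_iters

n = 256
A = torch.randn(n, n)
A = A @ A.T + n * torch.eye(n)         # SPD stand-in for a lattice operator
b = torch.randn(n)

# `guess_net` stands in for a model trained to map right-hand sides to
# approximate solutions; an untrained linear layer just shows the wiring.
guess_net = torch.nn.Linear(n, n, bias=False)

x_cold, iters_cold = conjugate_gradient(A, b, torch.zeros(n))
x_warm, iters_warm = conjugate_gradient(A, b, guess_net(b).detach())
print(f"cold start: {iters_cold} iterations, warm start: {iters_warm}")
```

With a trained `guess_net`, the warm start would cut the iteration count; the point here is only the wiring of a learned guess into a standard solver.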

This ML algorithm works alongside other software tools, including the USQCD libraries, TensorFlow, HDF5, and PyTorch. Employing a self-training approach, the model generates representative configurations of quarks and gluons, continuously learning to produce increasingly accurate samples.
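
As a heavily simplified illustration of that self-training loop (an assumption-laden sketch, not the collaboration’s code), the snippet below trains a single flow-style coupling layer to sample a toy two-dimensional scalar field from its Boltzmann distribution exp(-S); the real calculations involve four-dimensional QCD fields and far richer models. The phi^4 action, lattice size, and network are all illustrative choices.

```python
import torch
import torch.nn as nn

L = 8  # toy 2D lattice extent; the real calculations use far larger 4D lattices

def action(phi, m2=1.0, lam=0.5):
    """Euclidean phi^4 action on a periodic 2D lattice (illustrative stand-in)."""
    kinetic = sum(((phi - torch.roll(phi, 1, d)) ** 2).sum(dim=(1, 2)) for d in (1, 2))
    return 0.5 * kinetic + (m2 * phi ** 2 + lam * phi ** 4).sum(dim=(1, 2))

class AffineCoupling(nn.Module):
    """A single RealNVP-style coupling layer on a checkerboard mask."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1, padding_mode="circular"),
            nn.Tanh(),
            nn.Conv2d(16, 2, 3, padding=1, padding_mode="circular"),
        )
        mask = torch.zeros(L, L)
        mask[::2, ::2] = 1.0
        mask[1::2, 1::2] = 1.0  # freeze one checkerboard color, transform the other
        self.register_buffer("mask", mask)

    def forward(self, z):
        frozen = z * self.mask
        s, t = self.net(frozen.unsqueeze(1)).chunk(2, dim=1)
        s = s.squeeze(1) * (1 - self.mask)
        t = t.squeeze(1) * (1 - self.mask)
        phi = frozen + (1 - self.mask) * (z * torch.exp(s) + t)
        return phi, s.sum(dim=(1, 2))  # sample and log-det of the Jacobian

flow = AffineCoupling()
opt = torch.optim.Adam(flow.parameters(), lr=1e-3)
for step in range(200):                                  # self-training loop
    z = torch.randn(64, L, L)                            # draw from a simple prior
    phi, log_j = flow(z)
    log_q = (-0.5 * z ** 2).sum(dim=(1, 2)) - log_j      # model log-density, up to a constant
    loss = (log_q + action(phi)).mean()                  # reverse KL to exp(-S)/Z
    opt.zero_grad()
    loss.backward()
    opt.step()
```

A production sampler would stack many coupling layers with alternating masks and correct the proposals (for example, by reweighting or an accept/reject step) so that measured observables remain exact.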

Dr. Shanahan adds, “Our team pioneers novel machine-learning algorithms, empowering next-generation lattice QCD calculations in the realm of nuclear physics on Aurora.” Because LQCD operates on a four-dimensional spacetime grid, running it on Aurora poses a challenging numerical problem that has demanded innovative software development.
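
A rough back-of-the-envelope sketch conveys why four dimensions are demanding: in lattice QCD the gauge field assigns a 3×3 complex SU(3) matrix to every link of the 4D grid, so even one stored configuration is large. The lattice extents below are illustrative, not the team’s production sizes.

```python
import numpy as np

# One SU(3) matrix (3x3 complex) per link, four link directions per site.
T, X, Y, Z = 64, 32, 32, 32                     # illustrative lattice extents
links = np.zeros((4, T, X, Y, Z, 3, 3), dtype=np.complex128)
print(f"one gauge configuration: {links.nbytes / 1e9:.2f} GB")
# -> about 1.21 GB; an ensemble contains many such configurations
```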

A Journey Through the Space-Time Lattice

The numerical journey traverses a spacetime lattice (grid), unveiling the properties and interactions of nuclei, with a keen eye on their potential interactions with dark matter. The researchers begin with a modest lattice volume and progressively expand to larger dimensions, ultimately extrapolating their results to infinite volume.
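
That last step can be pictured with a small fit (a hedged sketch, not the collaboration’s analysis): for many observables the leading finite-volume corrections shrink exponentially with the lattice extent L, so results computed at several volumes can be fit and extrapolated to infinite L. The ansatz and the data points below are made-up demonstration values.

```python
import numpy as np
from scipy.optimize import curve_fit

def finite_volume(L, E_inf, c, m):
    """Ansatz: infinite-volume value plus an exponentially small correction."""
    return E_inf + c * np.exp(-m * L)

L_values = np.array([16.0, 24.0, 32.0, 48.0])       # lattice extents
E_values = np.array([1.180, 1.115, 1.092, 1.081])   # made-up demonstration values

params, _ = curve_fit(finite_volume, L_values, E_values, p0=[1.08, 1.0, 0.1])
E_inf, c, m = params
print(f"infinite-volume extrapolation: E_inf = {E_inf:.4f}")
```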

Aurora: Pioneering the Future

In anticipation of their future work on the Aurora supercomputer, the team has honed their expertise on a series of petascale supercomputers, including the ALCF’s Mira and Theta systems, Summit at Oak Ridge National Laboratory (ORNL), and Marconi at CINECA in Italy. Aurora’s architecture is optimized for deep learning, allowing the ML software stack to operate at an unprecedented scale.

This next-generation supercomputer features Intel compute engines, including the Intel Xeon CPU Max Series and the Intel Data Center GPU Max Series, alongside DAOS storage. Aurora also showcases the Intel-led, cross-industry oneAPI initiative, which streamlines application development across diverse computing architectures.
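
At the application level, targeting Intel GPUs from PyTorch can look like the minimal sketch below, assuming the Intel Extension for PyTorch (part of the oneAPI software ecosystem) is installed; the extension registers Intel GPUs under the `xpu` device name. This illustrates the programming model, not the team’s code.

```python
import torch

# Assumption: the Intel Extension for PyTorch package is installed;
# importing it registers Intel GPUs as the "xpu" device backend.
import intel_extension_for_pytorch as ipex

device = "xpu" if torch.xpu.is_available() else "cpu"
model = torch.nn.Linear(1024, 1024).to(device)
model = ipex.optimize(model)                  # extension-level optimizations
x = torch.randn(64, 1024, device=device)
y = model(x)                                  # runs on the Intel GPU when available
print(y.device)
```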

The Future Illuminated: Unveiling the Subatomic Mysteries

Dark matter research remains a formidable computational challenge, replete with unanswered questions. Under the Argonne ESP program, the research team is pioneering novel machine-learning algorithms poised to unveil the interactions between nuclei and a diverse array of dark matter candidates.

The advent of exascale HPC systems heralds a transformative era, empowering the team to explore realms previously inaccessible on petascale supercomputers. Dr. Detmold underscores the significance, stating that access to an exascale supercomputer will facilitate comparisons between numerical LQCD calculations, physical dark matter experiments, and predictions stemming from the standard model of particle physics. Dr. Shanahan encapsulates the optimism, declaring, “Aurora will usher in a new era, enabling the application of custom machine learning architectures in physics at an unprecedented scale. This holds the promise of unlocking nuclear physics calculations that have hitherto eluded traditional approaches—a watershed moment in scientific exploration.”

Conclusion:

The fusion of machine learning and high-performance computing on the Aurora system marks a significant leap in our ability to unravel the mysteries of dark matter. This technological advancement not only propels scientific discovery but also opens up opportunities for businesses and industries to leverage cutting-edge computing solutions for complex problem-solving and innovation.
