New AI method speeds up predictions of materials’ thermal properties by up to 1,000 times

  • Researchers from MIT and collaborators developed an AI method to predict materials’ thermal properties much faster.
  • Traditional methods for predicting heat transfer through materials are slow and complex, hindered by difficulties in modeling phonons.
  • The new machine-learning framework, VGNN, predicts phonon dispersion relations up to 1,000 times faster than conventional AI methods.
  • VGNN integrates virtual nodes into crystal structure graphs, enhancing flexibility and efficiency in predictions.
  • The method shows promise in accurately modeling complex alloy systems and predicting optical and magnetic properties.
  • Published in Nature Computational Science, the research highlights AI’s potential to revolutionize materials science and energy technology.

Main AI News:

Researchers from MIT and collaborators have developed a groundbreaking AI method that dramatically accelerates predictions of materials’ thermal properties. This innovation addresses a critical challenge in energy efficiency: approximately 70% of global energy output is lost as waste heat, highlighting the urgent need for more efficient thermal management in energy systems and electronics.

Traditional methods for predicting how heat moves through materials, particularly the thermal properties governed by phonon dispersion relations, have been notoriously slow and complex. Phonons, the quasiparticles that carry heat through a crystal lattice, interact in ways that are difficult to model accurately, posing a significant bottleneck in designing efficient energy systems and microelectronics.

Led by Mingda Li, along with researchers from MIT and institutions like Argonne National Laboratory, Harvard University, the University of South Carolina, Emory University, the University of California at Santa Barbara, and Oak Ridge National Laboratory, the team devised a novel machine-learning framework. This framework utilizes a virtual node graph neural network (VGNN), which enhances flexibility and efficiency in predicting high-dimensional properties like phonon dispersion relations. Compared to conventional AI approaches, VGNNs can predict these relations up to 1,000 times faster, potentially revolutionizing materials design and energy system optimization.

Their approach not only promises faster computations but also greater accuracy in modeling complex alloy systems, which are challenging for traditional methods. By integrating virtual nodes that represent phonons into the crystal structure graph, the VGNN circumvents the need for extensive computations, thus streamlining predictions without sacrificing precision.

Moreover, the VGNN technique shows promise beyond phonons, extending to predictions of optical and magnetic properties. This flexibility positions the technique as a versatile tool for accelerating research across multiple domains of materials science.

In the words of Li, “This method could pave the way for designing next-generation energy materials and microelectronics that operate more efficiently, significantly reducing thermal losses and enhancing overall performance.”

The research, published in Nature Computational Science, underscores the potential of AI-driven innovations to tackle longstanding challenges in materials science and energy technology, marking a significant step towards sustainable energy solutions and advanced electronic devices.

Li is joined on the paper by co-lead authors Ryotaro Okabe, a chemistry graduate student, and Abhijatmedhi Chotrattanapituk, an electrical engineering and computer science graduate student; Tommi Jaakkola, the Thomas Siebel Professor of Electrical Engineering and Computer Science at MIT; as well as others at MIT, Argonne National Laboratory, Harvard University, the University of South Carolina, Emory University, the University of California at Santa Barbara, and Oak Ridge National Laboratory.

Predicting phonons

Heat-carrying phonons are tricky to predict because they span an extremely wide frequency range, and the quasiparticles interact and travel at different speeds.

A material’s phonon dispersion relation is the relationship between energy and momentum of phonons in its crystal structure. For years, researchers have tried to predict phonon dispersion relations using machine learning, but there are so many high-precision calculations involved that models get bogged down.
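For context (standard lattice-dynamics background that the article does not spell out), the dispersion relation collects the phonon frequencies ω_ν(q) obtained from the eigenvalue problem of the dynamical matrix D(q), the mass-weighted Fourier transform of the interatomic force constants:

D(\mathbf{q})\, \mathbf{e}_{\nu}(\mathbf{q}) = \omega_{\nu}^{2}(\mathbf{q})\, \mathbf{e}_{\nu}(\mathbf{q})

Every wave vector q yields a full set of branch frequencies ω_ν(q), which is why the complete relation is such a high-dimensional target and why brute-force calculation is so expensive.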

“If you have 100 CPUs and a few weeks, you could probably calculate the phonon dispersion relation for one material. The whole community really wants a more efficient way to do this,” says Okabe.

The machine-learning models scientists often use for these calculations are known as graph neural networks (GNNs). A GNN converts a material’s atomic structure into a crystal graph comprising multiple nodes, which represent atoms, connected by edges, which represent the interatomic bonds between atoms.
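As a rough illustration (our sketch, not the authors' code), a crystal graph of this kind can be assembled from atomic positions by treating each atom as a node and adding an edge between any pair of atoms within a bonding cutoff; the cutoff value here is an arbitrary placeholder:

import numpy as np

def build_crystal_graph(positions, species, cutoff=3.0):
    """Build a simple crystal graph: nodes are atoms, edges link atom pairs
    closer than `cutoff` (in angstroms). Periodic boundary conditions and
    edge features are omitted to keep the sketch short."""
    positions = np.asarray(positions, dtype=float)
    nodes = list(species)                      # one node per atom, labeled by element
    edges = []
    for i in range(len(positions)):
        for j in range(i + 1, len(positions)):
            if np.linalg.norm(positions[i] - positions[j]) < cutoff:
                edges.append((i, j))           # edge ~ interatomic bond / neighbor pair
    return nodes, edges

# Example: a toy two-atom cell
nodes, edges = build_crystal_graph([[0.0, 0.0, 0.0], [0.0, 0.0, 2.0]], ["Ga", "As"])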

While GNNs work well for calculating many quantities, like magnetization or electrical polarization, they are not flexible enough to efficiently predict an extremely high-dimensional quantity like the phonon dispersion relation. Because phonons can propagate in any direction through the crystal, at any wave vector, the dispersion spans a huge momentum space that is hard to capture with a fixed graph structure.

To gain the flexibility they needed, Li and his collaborators devised virtual nodes.

They create what they call a virtual node graph neural network (VGNN) by adding a series of flexible virtual nodes to the fixed crystal structure to represent phonons. The virtual nodes enable the output of the neural network to vary in size, so it is not restricted by the fixed crystal structure.

Virtual nodes are connected to the graph in such a way that they can only receive messages from real nodes. The virtual nodes are updated as the model updates the real nodes during computation, but they do not affect the accuracy of the model.
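A minimal sketch of that one-way wiring, written here as an illustration rather than taken from the paper: virtual (phonon) nodes aggregate messages from real atom nodes, while the real-node update never sees the virtual nodes.

import numpy as np

def message_passing_step(real_feats, virt_feats, real_edges, real_to_virt_edges):
    """One simplified message-passing round (sum aggregation, no learned weights).
    real_feats: (N, d) array for atom nodes; virt_feats: (M, d) array for
    virtual phonon nodes. Virtual nodes only receive from real nodes, so
    adding them leaves the real-node computation untouched."""
    new_real = real_feats.copy()
    for i, j in real_edges:                      # real <-> real messages
        new_real[i] += real_feats[j]
        new_real[j] += real_feats[i]
    new_virt = virt_feats.copy()
    for i, v in real_to_virt_edges:              # real -> virtual only
        new_virt[v] += real_feats[i]             # virtual nodes listen, never reply
    return new_real, new_virt

# Toy usage: 2 atoms, 3 virtual phonon nodes, 4-dimensional features
real, virt = message_passing_step(np.zeros((2, 4)), np.zeros((3, 4)),
                                  [(0, 1)], [(0, 0), (1, 1), (1, 2)])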

“The way we do this is very efficient in coding. You just generate a few more nodes in your GNN. The physical location doesn’t matter, and the real nodes don’t even know the virtual nodes are there,” says Chotrattanapituk.

Cutting out complexity

Since it has virtual nodes to represent phonons, the VGNN can skip many complex calculations when estimating phonon dispersion relations, which makes the method more efficient than a standard GNN.

The researchers proposed three different versions of VGNNs with increasing complexity. Each can be used to predict phonons directly from a material’s atomic coordinates.

Because their approach has the flexibility to rapidly model high-dimensional properties, they can use it to estimate phonon dispersion relations in alloy systems. These complex combinations of metals and nonmetals are especially challenging for traditional approaches to model.

The researchers also found that VGNNs offered slightly greater accuracy when predicting a material’s heat capacity. In some instances, prediction errors were two orders of magnitude lower with their technique.

A VGNN could be used to calculate phonon dispersion relations for a few thousand materials in just a few seconds with a personal computer, Li says.

This efficiency could enable scientists to search a larger space when seeking materials with certain thermal properties, such as superior thermal storage, energy conversion, or superconductivity.
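Purely as an illustration of what such a search could look like, here is a hypothetical screening loop; predict_phonons and heat_capacity_from_phonons are placeholders standing in for a trained VGNN and a standard phonon post-processing step, not real APIs from the paper:

def screen_candidates(structures, predict_phonons, heat_capacity_from_phonons, threshold):
    """Keep structures whose predicted heat capacity exceeds a target value.
    Both callables are hypothetical placeholders: a trained VGNN model and a
    routine that integrates the predicted phonon spectrum."""
    hits = []
    for structure in structures:
        dispersion = predict_phonons(structure)   # fast per-material prediction
        if heat_capacity_from_phonons(dispersion) >= threshold:
            hits.append(structure)
    return hits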

Moreover, the virtual node technique is not exclusive to phonons, and could also be used to predict challenging optical and magnetic properties.

In the future, the researchers want to refine the technique so virtual nodes have greater sensitivity to capture small changes that can affect phonon structure.

“Researchers got too comfortable using graph nodes to represent atoms, but we can rethink that. Graph nodes can be anything. And virtual nodes are a very generic approach you could use to predict a lot of high-dimensional quantities,” Li says.

“The authors’ innovative approach significantly augments the graph neural network description of solids by incorporating key physics-informed elements through virtual nodes, for instance, informing wave-vector dependent band-structures and dynamical matrices,” says Olivier Delaire, associate professor in the Thomas Lord Department of Mechanical Engineering and Materials Science at Duke University, who was not involved with this work. “I find that the level of acceleration in predicting complex phonon properties is amazing, several orders of magnitude faster than a state-of-the-art universal machine-learning interatomic potential. Impressively, the advanced neural net captures fine features and obeys physical rules. There is great potential to expand the model to describe other important material properties: Electronic, optical, and magnetic spectra and band structures come to mind.”

Conclusion:

This breakthrough AI method not only significantly accelerates the prediction of materials’ thermal properties but also enhances the accuracy and scope of predictions compared with traditional methods. The advance is poised to reshape materials design for energy systems and microelectronics, offering a faster and more precise approach to tackling complex challenges in materials science. As industries increasingly prioritize efficiency and performance optimization, innovations like VGNNs could pave the way for transformative advances in sustainable energy solutions and advanced electronic devices.
