Shanghai Jiao Tong University designs an AI model that increases molecular dynamics computation efficiency by at least 100-fold

Recently, the Artificial Intelligence and Microstructure Laboratory at Shanghai Jiao Tong University introduced the Transformer algorithm from generative artificial intelligence to build an AI model called T-AIMD, offering a solution to the long-standing problem of high computational cost in molecular dynamics.

The T-AIMD model combines sequence features with physical descriptors (such as charge and temperature). In this way, the model not only learns the dynamic characteristics of the sequence but also incorporates the static properties of the material, enhancing its generalization ability and the accuracy of its predictions.
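The architecture itself is not reproduced in this article, but the idea of fusing a learned sequence representation with static descriptors can be illustrated with a minimal sketch, assuming PyTorch; the module and parameter names below are hypothetical rather than taken from the original code:

```python
# Minimal sketch (not the published implementation): fusing sequence features
# with static physical descriptors such as charge and temperature.
import torch
import torch.nn as nn

class SequenceDescriptorModel(nn.Module):
    def __init__(self, seq_dim=1, d_model=64, n_descriptors=2):
        super().__init__()
        self.embed = nn.Linear(seq_dim, d_model)        # embed each time step
        layer = nn.TransformerEncoderLayer(d_model=d_model, nhead=4,
                                           batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=2)
        # the regression head sees pooled sequence features plus static descriptors
        self.head = nn.Linear(d_model + n_descriptors, 1)

    def forward(self, seq, descriptors):
        # seq: (batch, time, seq_dim), e.g. a short displacement series
        # descriptors: (batch, n_descriptors), e.g. [charge, temperature]
        h = self.encoder(self.embed(seq))                 # (batch, time, d_model)
        pooled = h.mean(dim=1)                            # average over time
        fused = torch.cat([pooled, descriptors], dim=-1)  # dynamic + static features
        return self.head(fused)                           # predicted property

model = SequenceDescriptorModel()
print(model(torch.randn(8, 50, 1), torch.randn(8, 2)).shape)  # torch.Size([8, 1])
```

In such a design, the sequence branch captures the dynamic behavior while the concatenated descriptors supply the static material context, mirroring the combination described above.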

In addition, the model utilizes high-performance computing resources to support the training and computation of deep learning models. By optimizing the computational process with intelligent algorithms, it achieves rapid and accurate prediction of material properties.

Taking a material system containing 100 atoms as an example, if a 30-picosecond ab initio molecular dynamics (AIMD, also known as first-principles molecular dynamics) simulation is required, it would take two to three months to run on a high-performance central processing unit (CPU) computer.
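A back-of-the-envelope estimate shows where that cost comes from. The timestep and per-step DFT time below are assumed for illustration and are not figures from the study:

```python
# Rough cost estimate for AIMD on a ~100-atom system (assumed numbers):
# a 1 fs timestep and roughly 4 CPU-minutes per DFT force evaluation.
timestep_fs = 1.0
simulation_ps = 30.0
minutes_per_step = 4.0                                # assumed DFT cost per step

n_steps = simulation_ps * 1000.0 / timestep_fs        # 30,000 steps
total_days = n_steps * minutes_per_step / 60.0 / 24.0
print(f"{n_steps:.0f} steps -> about {total_days:.0f} days (~{total_days / 30:.1f} months)")
# about 83 days, i.e. roughly two to three months, consistent with the estimate above
```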


By learning the long-range diffusion behavior of atoms, the T-AIMD method can predict the ionic conductivity of any ion in any crystal structure in an extremely short time, achieving a speed-up of at least 100 times in the calculation of ionic conductivity.

By integrating machine learning algorithms with deep learning networks, T-AIMD can predict the outcome of an entire AIMD simulation from a small segment of sequence data, significantly shortening the experimental cycle and accelerating development in the materials and biological sciences.

The T-AIMD model can be applied across materials science. It is not limited to specific types of materials or structures and can be widely used to study various ionic conductors, including lithium-ion and magnesium-ion conductors.

When T-AIMD is extended to the field of molecular dynamics (MD), it has potential applications in various biological protein systems, such as drug development, protein structure prediction, cellular molecular dynamics, and biological macromolecular complexes.

Recently, the related paper titled "Transformer enables ion transport behavior evolution and conductivity regulation for solid electrolyte" was published in Energy Storage Materials (IF 18.9) [1].

Tao Kehao, a doctoral student at Shanghai Jiao Tong University, is the first author, and Professor Li Jinjin serves as the corresponding author.

Enhancing Molecular Dynamics Computation Efficiency by a Hundredfold

In recent years, the emergence of high-performance generative algorithms has marked the peak of the development of Generative Artificial Intelligence (GAI). In many fields, generative methods have gradually transitioned from random generation to targeted generation, and the maturity of GAI has greatly facilitated the development of fundamental disciplines.

At the same time, the advent of the Generative Pre-Trained Transformer (GPT) has brought new hope for solving many problems in materials science.

The "T" in the recently popular ChatGPT stands for Transformer. Due to the advantages of self-attention mechanisms, parallel computing, and positional encoding, the Transformer has almost outperformed traditional sequence neural networks in every aspect.MD and AIMD are indispensable methods for researchers to analyze the motion behavior of systems, solve problems related to system structure, properties, reaction mechanisms, and molecular interactions, and are widely applied in fields such as materials science, bioscience, and chemistry.

The advantage of AIMD is that it can provide high-precision simulation results. In AIMD simulations, the position and velocity of each atom evolve over time, and this information can be used to calculate the physical and chemical properties of substances, such as ionic conductivity.
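As a concrete illustration, the standard route from a trajectory to ionic conductivity runs through the mean squared displacement, the diffusion coefficient, and the Nernst-Einstein relation. The sketch below uses these textbook relations with synthetic data; it is not code from the paper:

```python
# Mean squared displacement -> diffusion coefficient -> Nernst-Einstein conductivity.
# Positions are assumed to be unwrapped (no periodic-boundary jumps).
import numpy as np

def msd(positions):
    """positions: (n_frames, n_ions, 3) in Angstrom; returns MSD(t) in Angstrom^2."""
    disp = positions - positions[0]                   # displacement from the first frame
    return (disp ** 2).sum(axis=-1).mean(axis=-1)     # average over ions

def diffusion_coefficient(msd_vals, dt_ps):
    """Fit MSD = 6 D t; returns D in cm^2/s (1 Angstrom^2/ps = 1e-4 cm^2/s)."""
    t = np.arange(len(msd_vals)) * dt_ps
    slope = np.polyfit(t, msd_vals, 1)[0]             # Angstrom^2 per ps
    return slope / 6.0 * 1e-4

def nernst_einstein_sigma(D_cm2_s, n_per_cm3, T_K, z=1):
    """sigma = n (z e)^2 D / (kB T), returned in S/cm."""
    e, kB = 1.602176634e-19, 1.380649e-23             # C, J/K
    return n_per_cm3 * (z * e) ** 2 * D_cm2_s / (kB * T_K)

# toy trajectory: 20 ions random-walking over 3000 frames with dt = 0.01 ps
traj = np.cumsum(np.random.normal(0, 0.01, (3000, 20, 3)), axis=0)
D = diffusion_coefficient(msd(traj), dt_ps=0.01)
print(nernst_einstein_sigma(D, n_per_cm3=2e22, T_K=300), "S/cm")
```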

However, its drawbacks cannot be ignored: computational and simulation costs are high, especially for large systems and long time scales. For decades, this issue has troubled researchers and restricted the development of new materials and the exploration of biological molecules.

To solve these problems, the researchers decided to try introducing the Transformer algorithm from GAI.

The Transformer is widely used for processing sequential data, such as text or time series, and its core is the self-attention mechanism.

Li Jinjin explained, "The Transformer is very well suited to processing long sequences and capturing long-term dependencies, which is particularly important for linking AIMD sequences."
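For readers unfamiliar with the mechanism, scaled dot-product self-attention can be written in a few lines; this is a generic illustration of the Transformer's core operation, not the T-AIMD implementation:

```python
# Scaled dot-product self-attention: every time step attends to every other one.
import numpy as np

def self_attention(X, Wq, Wk, Wv):
    """X: (seq_len, d_model); Wq/Wk/Wv: (d_model, d_k) projection matrices."""
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    scores = Q @ K.T / np.sqrt(K.shape[-1])           # pairwise similarity of time steps
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)    # softmax over the sequence
    return weights @ V                                # weighted mix of all time steps

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 16))                         # e.g. 50 time steps, 16 features
Wq, Wk, Wv = (rng.normal(size=(16, 8)) for _ in range(3))
print(self_attention(X, Wq, Wk, Wv).shape)            # (50, 8)
```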

Figure | T-AIMD network architecture diagram (Source: Energy Storage Materials)

In the T-AIMD model, the Transformer is used to learn and predict the diffusion behavior of atoms in solid-state electrolytes.

Specifically, the model first learns the diffusion patterns of atoms from a small amount of AIMD simulation data. Then, these learned patterns are used to predict the diffusion behavior of atoms on a longer time scale.
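Conceptually, the extrapolation step amounts to rolling a trained sequence model forward over its own predictions. The interface below is hypothetical (with a toy stand-in for the trained model) and is meant only to make the idea concrete:

```python
# Roll a trained sequence model forward: feed the last `window` points of an
# MSD-like series and append the model's prediction, step by step.
import numpy as np

def extrapolate(model, short_series, n_future, window=50):
    series = list(short_series)                       # values from a short AIMD run
    for _ in range(n_future):
        context = np.array(series[-window:])[None, :, None]   # shape (1, window, 1)
        series.append(float(model(context)))          # model predicts the next step
    return np.array(series)

# toy stand-in for a trained model: continue the most recent linear trend
toy_model = lambda ctx: 2 * ctx[0, -1, 0] - ctx[0, -2, 0]
print(extrapolate(toy_model, np.linspace(0.0, 1.0, 60), n_future=5)[-5:])
```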

The core advantage of this method is that it combines the accuracy of AIMD with the efficiency and powerful sequence-processing capability of the Transformer. T-AIMD takes a data-driven approach, relying on a large amount of training data to optimize the model parameters.

"In this way, the model can capture complex physical processes and quickly predict behavior under unknown conditions, which is crucial for accelerating material development and application," said Li Jinjin.

Solving the Problem of Time Consumption in Molecular Dynamics Calculations

During this research, challenges arose in several areas, including large-scale data processing and feature extraction, model training and optimization, multi-source data fusion, and model verification against experiment.

AIMD simulations generate an enormous volume of data, and the temporal nature of that data requires the model to capture time dependence effectively. Extracting useful features from the raw molecular dynamics data is therefore one of the challenges.

The researchers employed the Transformer architecture to process the sequential data, leveraging its self-attention mechanism to capture long-term dependencies. "We ensure the quality of the input data and the efficiency of model training by designing a specialized data preprocessing procedure, including data normalization and time-window segmentation," said Li Jinjin.
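A minimal sketch of that kind of preprocessing, with illustrative parameters rather than the paper's actual settings, might look as follows:

```python
# Normalize a 1-D time series and slice it into (input window, future target) pairs.
import numpy as np

def normalize(series):
    mean, std = series.mean(), series.std()
    return (series - mean) / (std + 1e-8), (mean, std)

def make_windows(series, window=50, horizon=10, stride=5):
    X, y = [], []
    for start in range(0, len(series) - window - horizon + 1, stride):
        X.append(series[start:start + window])                     # model input
        y.append(series[start + window:start + window + horizon])  # prediction target
    return np.array(X), np.array(y)

series, _ = normalize(np.cumsum(np.random.rand(2000)))    # toy MSD-like curve
X, y = make_windows(series)
print(X.shape, y.shape)                                   # (389, 50) (389, 10)
```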

On the other hand, deep learning models, especially those based on the Transformer, have extremely high demands for computational resources during training and are prone to overfitting.

To address the issue of resource requirements, researchers trained the model in parallel on a high-performance computing platform. To prevent overfitting, they utilized regularization techniques such as dropout and L2 regularization. In addition, various optimization algorithms were employed to enhance the model's convergence speed and stability.
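In a PyTorch-style setup, which is assumed here rather than stated in the paper, dropout and L2 regularization (via weight decay) can be combined roughly as follows:

```python
# Dropout inside the network plus an L2 penalty applied through weight decay.
import torch
import torch.nn as nn

model = nn.Sequential(
    nn.Linear(50, 128),
    nn.ReLU(),
    nn.Dropout(p=0.1),            # randomly zero activations during training
    nn.Linear(128, 1),
)
# weight_decay adds an L2 penalty on the parameters to each update
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-3, weight_decay=1e-4)

x, target = torch.randn(32, 50), torch.randn(32, 1)
loss = nn.functional.mse_loss(model(x), target)
optimizer.zero_grad()
loss.backward()
optimizer.step()
print(float(loss))
```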

The challenge in this study is also reflected in the integration of data; T-AIMD needs to handle data from different sources (such as sequence features and material descriptors), which have significantly different dimensions and properties.

In response, they developed a hybrid feature-extraction framework that can process time-series data and static material properties simultaneously. By designing a fusion layer to integrate these different features and experimentally evaluating different fusion strategies, they optimized the model's predictive capability.

In addition, comparing the T-AIMD model's predictions with actual experimental results is crucial for verifying their accuracy, which demands highly precise experimental design and data collection. The laboratory collaborated with researchers from Tsinghua University to carry out a series of experimental validations on solid-state electrolyte materials.

By comparing the model-predicted conductivity with the experimentally measured values, they continuously adjusted and optimized the model parameters. At the same time, they also utilized published literature data to further verify the model's generalization capability.

Unlike organic materials, the crystal structures that Li Jinjin's laboratory currently works on are difficult to encode effectively, and traditional computer languages struggle to capture the important properties of crystalline materials; this is one of the main reasons limiting the application of large language models and generative artificial intelligence to crystalline materials.

Therefore, they plan to build on T-AIMD to design the molecular structures of materials according to specific application requirements.

By optimizing machine learning and artificial intelligence algorithms, the model can work backwards from target functions (material properties such as ionic conductivity, thermal stability, and mechanical strength) to the optimal material structure. A large model that generates materials for specified functions in this way points the direction for the development of new materials.

Empowering Materials Science and Life Science with AI

Li Jinjin received her Ph.D. in Physics from Shanghai Jiao Tong University and, after completing postdoctoral research at the University of Illinois at Urbana-Champaign, worked as a researcher at the University of California, Santa Barbara.

With the development of artificial intelligence and supportive policies, she returned to Shanghai Jiao Tong University to establish the Artificial Intelligence and Micro-Structure Laboratory (AIMS-Lab), where she serves as laboratory director and doctoral supervisor.

The laboratory's AI for Science work focuses mainly on artificial intelligence materials informatics and artificial intelligence life informatics. In recent years, it has developed AlphaMat, an AI materials-informatics research and development platform, and AlphaBio, an artificial intelligence platform for the design and discovery of biomolecules.

The AlphaMat platform is a bridge connecting artificial intelligence with materials science. To date, it has integrated more than 50 AI models, over 200 material data post-processing and analysis tools, and a proprietary material property database with millions of entries, and it can predict more than 15 types of material properties (including formation energy, band gap, ionic conductivity, magnetism, and bulk modulus) [1-6].

Moreover, even users without programming experience can use the platform conveniently. Based on this software, the laboratory has discovered hundreds of new materials across various two-dimensional and three-dimensional systems, including lithium battery electrode materials, solid-state electrolytes, perovskite materials, and catalytic materials.

In the field of life sciences, the laboratory has developed a unique AI protein large model and a specialized model co-evolution platform called AlphaBio.

After pre-training, the AlphaBio large model powers nearly a hundred downstream specialized AI models, including AI protein function prediction models, AI protein folding and mutation prediction models, AI enzyme modification algorithms, AI force field development algorithms, and more [7-10].

AlphaBio is promoting the vertical application of AI in the field of life sciences, playing an important role in drug development, protein structure/function prediction, cellular molecular dynamics, and the analysis of biological macromolecular complexes.

The decision to return to China was closely tied to Li Jinjin's analysis and predictions of national and industry development. She said, "There is a gap between China and the United States in terms of computing power and data."

The advantage of the United States lies in having more Graphics Processing Unit (GPU) chips and more computing power with which to develop large models, including general-purpose AI for text, video, and images. It can also conveniently access a vast amount of internet data from around the world, especially in English, which far exceeds the total amount of Chinese-language data.

"China's advantages are also very prominent. Many Chinese scientific research teams are developing high-performance algorithms, striving to achieve high-performance AI predictions quickly through algorithm innovation in the case of insufficient computing power. For example, in this new research, we have improved molecular dynamics simulation by hundreds of times through the Transformer algorithm," said Li Jinjin.

In addition, China has a strong real economy spanning communications, infrastructure, light industry, heavy industry, and more. Each enterprise and team holds its own private data, which is not public.

Although such data cannot be used to train general-purpose large models like GPT, as more and more enterprises build their own specialized models, this data is expected to play a huge role, truly turning AI into a new form of productivity.

Li Jinjin pointed out that when data from the real economy, industry experience, application scenarios, and AI are combined, production efficiency will improve enormously, and this in turn will exert a strong pull on AI technology. The second half of artificial intelligence is not about AI itself but about the "+" in "AI+"; China's home-field advantage has only just begun.

The laboratory has been working closely with partner enterprises to empower large-scale industrial production through AI materials design and AI life-science design, optimizing production processes, reducing costs, and increasing efficiency.

Looking forward to the development of AI in conjunction with more disciplines, Li Jinjin expressed that in the future, the combination of quantum computing and AI is expected to solve complex system simulation problems that traditional computing cannot handle, thus opening a new chapter in scientific research.