How AI tools accelerate discovery and analysis in materials science


Aug 25, 2025

Artificial intelligence is accelerating material discovery and design by automating analysis, guiding experiments, and enabling predictive modeling across spectroscopy, microscopy, and synthesis.

(Nanowerk Spotlight) Materials do not invent themselves. Behind every new battery chemistry, solar absorber, superconducting wire, or wearable sensor lies a long chain of experiments, models, and failures. Discovering the right combination of elements and structure has historically required patience, luck, and exhaustive trial and error. Even with the rise of high-throughput computing and automated experiments, progress is often slowed by the complexity of material behavior, the scale of the design space, and the sheer volume of data modern instruments produce.

What is changing now is not just the speed of analysis or simulation; it is the logic behind the search. Machine learning and other artificial intelligence methods are being adapted to materials science not as replacements for theory or experimentation, but as new instruments of inference. These systems learn from examples rather than equations. They identify patterns in noisy spectral data, suggest promising crystal structures from incomplete information, and steer complex experiments in real time. Tasks that once relied on expert judgment and manual processing can now be partially or fully automated. What was once empirical and reactive is becoming predictive and model-driven.

The acceleration is not just hypothetical. AI is already being used to classify microstructures in microscopy images, reconstruct 3D scans from sparse measurements, and identify new materials with specific mechanical, thermal, or optical responses. Recent advances in generative modeling and language-based interfaces are further extending AI's reach into automated synthesis planning and interactive data interpretation. Tools originally developed for image recognition or natural language generation are being repurposed to work with molecular structures, diffraction patterns, and electron energy spectra.
A new review published in Advanced Intelligent Systems ("Applied Artificial Intelligence in Materials Science and Material Design") outlines this shift in detail. Authored by Emigdio Chávez-Angel and collaborators, the article explores how artificial intelligence is being integrated across experimental platforms and research workflows. Rather than treating AI as a single solution, the authors show how diverse tools, from deep neural networks to probabilistic models, are helping researchers ask better questions, extract more from their data, and navigate increasingly complex material systems.

(Left) Number of publications in the last 10 years including the keywords "Machine learning" + "Materials" (violet), "AI" + "Materials" (green), and "Neural network" + "Materials" (orange). (Right) Table of the number of publications in the period 2014–2024. All data in the figure and table are derived from Clarivate (Web of Science). © Clarivate 2024. (Image: Reprinted from DOI:10.1002/aisy.202400986, CC BY)

One area where these changes are especially visible is spectroscopy. Tools like Raman spectroscopy and infrared spectroscopy produce detailed signals about a material's composition and structure. However, analyzing these signals, especially in complex systems or under changing experimental conditions, can be tedious and error-prone. Neural networks, including convolutional models and encoder-decoder architectures, are now being used to clean up raw data, classify spectra, and detect peaks more efficiently. This allows scientists to interpret signals in real time and even monitor processes like strain, doping, and temperature as they happen.
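To make the data-processing step concrete, here is a minimal sketch of the kind of cleanup such networks automate: a synthetic Raman-like spectrum is smoothed and then scanned for peaks. The two Gaussian peaks, the noise level, and the moving-average filter standing in for a trained denoising model are all invented for illustration.

```python
import numpy as np

def denoise(spectrum, width=9):
    """Smooth a 1D spectrum with a moving-average kernel
    (a simple stand-in for a learned denoising network)."""
    kernel = np.ones(width) / width
    return np.convolve(spectrum, kernel, mode="same")

def find_peaks(spectrum, threshold=0.5):
    """Return indices of local maxima that rise above a threshold."""
    s = spectrum
    is_max = (s[1:-1] > s[:-2]) & (s[1:-1] > s[2:])
    return np.where(is_max & (s[1:-1] > threshold))[0] + 1

# Toy spectrum: two Gaussian "Raman peaks" plus measurement noise.
rng = np.random.default_rng(0)
x = np.linspace(0, 2000, 2001)   # wavenumber axis (cm^-1), illustrative
clean = np.exp(-((x - 520) / 15) ** 2) + 0.8 * np.exp(-((x - 1350) / 25) ** 2)
noisy = clean + rng.normal(0.0, 0.05, x.size)

peaks = find_peaks(denoise(noisy), threshold=0.3)
print(x[peaks])   # detected positions cluster near 520 and 1350 cm^-1
```

A trained convolutional model replaces both helper functions at once, learning the smoothing and the peak criteria from labeled examples instead of hand-set thresholds.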
In one example discussed in the paper, machine learning models were trained to interpret entire spectra instead of focusing on isolated peaks, allowing the FTIR signal from a polymer to be used as a spatially resolved thermometer. Explainable AI is also becoming important in spectroscopy. Unlike standard machine learning tools that operate as black boxes, explainable models highlight which parts of a spectrum contributed most to a given prediction. This helps researchers validate results and understand what the model is learning. One example featured in the paper used explainable techniques to identify critical vibrational modes in graphene oxide, linking spectral features to the degree of material reduction. In another, a fingerprint-based data augmentation strategy helped improve the classification of microplastic contaminants, while also clarifying which parts of the infrared spectrum carried the most weight.

Spectroscopy is just one domain. AI is also making a difference at large-scale experimental facilities like synchrotrons. These facilities use intense beams of light, including X-rays, to probe materials with high spatial and temporal resolution. But synchrotron experiments can generate massive volumes of data in short periods of time, creating new demands for rapid processing and interpretation. Machine learning models are now being used to automatically classify crystal structures from diffraction data, denoise tomographic scans, and align images before reconstruction. In one case, a generative model was used to reconstruct 3D tomographic images from incomplete data without needing extensive training datasets, reducing the burden on experimentalists and enabling faster feedback.

Real-time AI tools are especially valuable at synchrotron beamlines. Models can guide experiments as they run, identifying regions of interest, adjusting scanning resolution, or stopping measurements once target conditions are reached.
One system reviewed in the paper used a closed-loop feedback mechanism to control the growth of thin films, making adjustments based on real-time reflectometry and Bragg diffraction signals. Another applied deep learning to segment dendritic growth during X-ray radiography, enabling the study of complex solidification processes as they unfolded.

Some facilities have gone further, integrating AI into their control software. Researchers at the ALBA Synchrotron trained large language models on synthetic X-ray photoelectron spectroscopy data and built an AI assistant that helps interpret experimental results in natural language. The goal is not to replace expertise but to reduce the time spent on manual interpretation and allow scientists to interact with experiments using tools modeled after chat interfaces.

At smaller scales, scanning probe microscopy is also benefiting from AI. These instruments use physical probes to map surface properties at the nanometer scale. Atomic force microscopy (AFM), for example, provides topographical and mechanical information, while scanning tunneling microscopy (STM) gives insight into electronic structure. But like spectroscopy, these techniques generate large amounts of noisy data and require careful tip calibration and control. Neural networks are now used to correct image artifacts, identify defects, and even reconstruct 3D images from multiple scan angles. Several efforts have introduced autonomous scanning systems. In one case, deep learning models identified and tracked DNA molecules in AFM images over long time periods, reducing redundancy and enabling high-resolution imaging without human intervention. Another study used spiral scanning combined with compressive sensing and machine learning to speed up image acquisition while preserving detail. Researchers have also applied AI to detect when an STM tip degrades and needs to be conditioned, which is crucial for ensuring image quality in extended sessions.
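The closed-loop pattern running through these examples (measure, compare to a setpoint, adjust, repeat) can be sketched without any instrument hardware. In the toy loop below, a noisy linear function stands in for a reflectometry measurement of film growth rate, and a hand-tuned proportional controller plays the role of the learned control policy; all constants are invented.

```python
import numpy as np

rng = np.random.default_rng(1)
TARGET = 2.0                      # desired growth rate (nm/min), illustrative

def measure_rate(power):
    """Toy stand-in for a real-time reflectometry measurement:
    growth rate rises with source power, plus sensor noise."""
    return 0.05 * power + rng.normal(0.0, 0.02)

power = 10.0                      # initial source power (arbitrary units)
gain = 4.0                        # proportional gain, hand-tuned
for step in range(50):
    rate = measure_rate(power)    # measure
    error = TARGET - rate         # compare to setpoint
    power += gain * error         # adjust the control knob, then repeat
print(round(rate, 2))             # settles near TARGET
```

A real beamline controller replaces the linear toy with a learned model of the process and may adjust several knobs at once, but the measure-compare-adjust skeleton is the same.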
Electron microscopy is undergoing a similar transformation. Neural networks trained on large datasets can now denoise images, detect individual atoms, and reconstruct structures from limited data. Generative models and variational autoencoders are used not only to enhance image quality but also to generate synthetic data for training supervised models. These tools are especially valuable for studying beam-sensitive materials, where minimizing exposure is key. The paper highlights how high-dimensional data methods like 4D STEM tomography are being integrated with AI models to analyze material structure at atomic resolution.

Another area with growing attention is the design of new materials. Machine learning tools are being used not only to predict properties based on atomic structure but also to generate candidates that meet specific functional goals. Graph neural networks, which represent atoms and bonds as nodes and edges in a graph, are particularly well suited for this task. One platform developed by DeepMind used such networks to predict more than two million stable inorganic crystal structures, expanding known databases by an order of magnitude. Another initiative, Open Catalyst, provides deep learning tools for modeling reaction energetics at surfaces. These models have been trained on hundreds of millions of calculations and are designed to help identify catalysts for energy-relevant reactions.

Language models are also entering this space. A tool called CrystaLLM was trained on millions of crystallographic data files and can generate plausible new structures by predicting the next step in a crystal's description. Unlike standard graph models, it uses text-based representations and works with autoregressive logic similar to natural language generation. Other platforms such as ChemCrow use GPT-style models to assist in tasks like organic synthesis and drug discovery, providing guidance via natural language prompts.
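The graph encoding that underlies such models is easy to write down: atoms become feature vectors, bonds become edges, and each layer lets neighboring atoms exchange information. Below is a toy illustration in plain NumPy using a water molecule, invented two-number node features, and a single mean-aggregation step; it is not the architecture of any platform named above, which use learned weights and many stacked layers.

```python
import numpy as np

# Toy molecule (water): node features = [atomic number, bond count]
nodes = np.array([[8.0, 2.0],   # O
                  [1.0, 1.0],   # H
                  [1.0, 1.0]])  # H
edges = [(0, 1), (0, 2)]        # the two O-H bonds (undirected)

def message_pass(h, edges):
    """One round of mean-neighbor message passing: each node's new
    feature vector is the average over itself and its bonded neighbors."""
    neighbors = {i: [i] for i in range(len(h))}   # include self
    for a, b in edges:
        neighbors[a].append(b)
        neighbors[b].append(a)
    return np.stack([h[idx].mean(axis=0) for idx in neighbors.values()])

h1 = message_pass(nodes, edges)      # per-atom features after one layer
graph_vector = h1.mean(axis=0)       # pooled vector for the whole molecule
print(h1)
print(graph_vector)
```

After pooling, the single `graph_vector` can be fed to a standard regressor to predict a property of the whole structure, which is how such networks map from geometry to formation energy or stability.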
Finally, the review explores how AI is being applied to metamaterials, which are artificially structured materials that achieve properties not found in nature. Designing these systems is challenging because small changes in geometry can cause large shifts in behavior, and the simulations needed to test designs are often computationally expensive. AI provides two useful paths: forward design, where properties are predicted from structure, and inverse design, where a desired response is used to suggest a structure. By training on thousands of simulation results, models can learn to propose optimal designs with far less effort than traditional parameter sweeps.

These approaches are being used across electromagnetic, mechanical, thermal, and acoustic metamaterials. In one example, a deep learning model was used to design a metamaterial with a specific mechanical response. The structure was then 3D printed and tested, with experimental results closely matching the model's predictions. In another, AI was used to optimize the geometry of a phononic crystal to control sound wave propagation, a task that would have required extensive trial and error using conventional methods.

The review emphasizes that the future of AI in materials science lies in hybrid models that combine physics-based simulations with data-driven methods. These models are more interpretable, more robust, and better suited to cases where data is limited. Projects like the InCAEM initiative in Spain are beginning to build infrastructure that combines real-time experimentation, large-scale computing, and AI-assisted analysis to support such hybrid systems.

As artificial intelligence continues to evolve, its impact on materials science is becoming not just practical but foundational. The tools are not simply accelerating what scientists were already doing. They are changing how researchers formulate questions, run experiments, and interpret results.
This shift has the potential to make the process of discovering and designing materials more systematic, more scalable, and more efficient.

By Michael Berger – Michael is author of four books by the Royal Society of Chemistry:
Nano-Society: Pushing the Boundaries of Technology (2009),
Nanotechnology: The Future is Tiny (2016),
Nanoengineering: The Skills and Tools Making Technology Invisible (2019), and
Waste not! How Nanotechnologies Can Increase Efficiencies Throughout Society (2025)
Copyright © Nanowerk LLC
