Generative AI predicts antiferromagnets for ultrafast spintronics


Sep 30, 2025

A generative AI framework predicts stable antiferromagnets, identifying semiconductors and metals with properties suited for ultrafast spintronics and advancing systematic discovery of magnetic materials.

(Nanowerk Spotlight) Antiferromagnetic materials have attracted growing attention in condensed matter physics and device engineering because they promise fast and stable information processing. Unlike conventional ferromagnets, where atomic magnetic moments align in the same direction and produce stray fields, antiferromagnets cancel their magnetization through alternating orientations of neighboring spins. This property makes them resilient to external perturbations and suitable for high-density circuits.

The appeal has been clear for decades, yet finding and designing useful antiferromagnets has proved challenging. The interactions that govern their behavior are complex, the number of possible chemical combinations is vast, and traditional search methods struggle to keep pace. Theoretical tools such as density functional theory have been the mainstay for predicting magnetic order, but they face significant limitations. They can falter in handling strongly correlated electrons, and they are computationally intensive when applied across thousands of hypothetical compounds. Even when coupled with machine learning, these workflows tend to hover near known material prototypes rather than exploring the wider chemical landscape. This bottleneck has limited systematic discovery.

At the same time, advances in generative artificial intelligence have opened new possibilities. Models trained on existing crystal structures can learn the statistical rules of chemistry and propose new arrangements of atoms that are both plausible and diverse. These developments create a foundation for a more directed approach to material prediction.

A study published in Advanced Science (“A Generative Framework for Predicting Antiferromagnets”) describes such an approach. The researchers combine generative modeling, property prediction, and optimization into a single pipeline that can propose and test candidate antiferromagnets at scale.
Their framework is designed not only to generate crystals that obey chemical and physical constraints but also to steer the generation process toward properties associated with antiferromagnetic behavior. By linking each stage to the next, the method reduces wasted effort and allows a more systematic exploration of unfamiliar chemical systems.

[Figure] Overview of the generative framework for predicting antiferromagnets. The process begins with a crystal diffusion variational autoencoder trained to generate chemically valid structures. Candidate crystals are then passed to graph neural network predictors that estimate formation energy, magnetic moment, and electronic band gap. A genetic algorithm operates in the latent space to guide generation toward stable compounds with low net magnetization and appropriate electronic character. The final stage applies density functional theory to relax atomic positions, compare magnetic configurations, and confirm thermodynamic and electronic stability. Together these steps form a closed loop that links structure generation, rapid screening, optimization, and rigorous validation. (Image: Reprinted from DOI:10.1002/advs.202509488, CC BY)

At the center of the framework is a model known as a crystal diffusion variational autoencoder, or CDVAE. An autoencoder compresses information about a crystal into a compact numerical representation and then reconstructs the structure from that representation. The diffusion element introduces a process of gradually refining noisy guesses until they resemble valid crystals.

To improve reliability, the authors augment their training data by rotating the crystal lattices and atomic positions. This helps the model respect the symmetries of real materials. They also use a strategy called transfer learning: the model first trains on a large general dataset of about 45,000 crystals, then fine-tunes on a smaller set focused specifically on antiferromagnets.
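The rotation-augmentation idea can be sketched in a few lines. The snippet below is only illustrative, with a toy cubic cell and made-up function names; the article does not publish the authors' actual pipeline. Applying the same random rotation to the lattice vectors and the Cartesian atomic coordinates yields the same material in a new orientation, so all interatomic distances are preserved.

```python
import numpy as np

def random_rotation_matrix(rng):
    """Draw a random 3x3 proper rotation via QR decomposition."""
    q, r = np.linalg.qr(rng.normal(size=(3, 3)))
    q *= np.sign(np.diag(r))   # fix column signs for a uniform distribution
    if np.linalg.det(q) < 0:   # ensure det = +1 (rotation, not reflection)
        q[:, 0] *= -1
    return q

def augment(lattice, cart_coords, rng):
    """Rotate lattice vectors and atomic positions by the same rotation.

    Interatomic distances are unchanged, so the rotated crystal is the
    same material viewed in a different orientation."""
    R = random_rotation_matrix(rng)
    return lattice @ R.T, cart_coords @ R.T

rng = np.random.default_rng(0)
lattice = np.eye(3) * 4.2                 # toy cubic cell, 4.2 Å edge
coords = np.array([[0.0, 0.0, 0.0],
                   [2.1, 2.1, 2.1]])      # two atoms (illustrative)
new_lat, new_coords = augment(lattice, coords, rng)
```

Because the physics is rotation-invariant, each augmented copy is a "free" extra training example that teaches the model the symmetry rather than forcing it to learn it from limited magnetic data.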
This two-stage learning path allows the model to generate chemically valid and structurally consistent outputs, which would not be possible if it had been trained only on the limited magnetic dataset.

Generating structures is only the first step. To identify promising candidates, the researchers built fast predictors based on graph neural networks. In these models, atoms are represented as nodes with features such as atomic mass or size, while connections between atoms capture distances and bonding environments. Passing information along this graph allows the model to link local geometry with overall properties.

The team trained three predictors to estimate formation energy, magnetic moment, and electronic band gap. Formation energy reflects whether a compound is thermodynamically stable compared with its elemental constituents. Magnetic moment reveals whether magnetic orientations cancel out as expected for an antiferromagnet. Band gap distinguishes metals from semiconductors and insulators. These predictors can screen thousands of candidate crystals in seconds, narrowing the search to those most likely to succeed.

To guide the generator toward useful regions of chemical space, the authors introduced an optimization step based on a genetic algorithm. This method maintains a population of candidate crystals, scores them using the fast predictors, and then recombines and mutates their representations to create new generations. Candidates that better satisfy the design goals are more likely to survive and reproduce. The fitness function in this case favors crystals that are stable, have low net magnetic moment, and possess either metallic behavior or a small band gap suited for device applications. By repeating this cycle, the system enriches the pool of promising candidates with each iteration.

The final stage of the pipeline involves more rigorous validation with density functional theory.
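The genetic-algorithm loop described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: the surrogate `fitness` function below is a toy stand-in for the three trained graph neural network predictors, and the latent dimensionality and hyperparameters are invented for the example.

```python
import numpy as np

rng = np.random.default_rng(1)
DIM = 8  # latent dimensionality (illustrative)

def fitness(z):
    """Toy fitness: reward low 'formation energy', low 'net moment',
    and a small 'band gap' — stand-ins for the three GNN predictors."""
    formation_energy = np.sum(z**2)          # minimum at z = 0
    net_moment = abs(np.sum(z))
    band_gap_penalty = abs(np.mean(z) - 0.1)
    return -(formation_energy + net_moment + band_gap_penalty)

def evolve(pop, generations=50, elite=8):
    for _ in range(generations):
        scores = np.array([fitness(z) for z in pop])
        parents = pop[np.argsort(scores)[-elite:]]     # keep the fittest
        children = []
        while len(children) < len(pop) - elite:
            a, b = parents[rng.integers(elite, size=2)]
            mask = rng.random(DIM) < 0.5               # uniform crossover
            child = np.where(mask, a, b)
            child += rng.normal(scale=0.05, size=DIM)  # mutation
            children.append(child)
        pop = np.vstack([parents, children])
    return pop

pop0 = rng.normal(size=(40, DIM))      # initial population of latent vectors
pop_final = evolve(pop0)
best = max(pop_final, key=fitness)
```

Because selection, crossover, and mutation all happen on latent vectors rather than raw crystal structures, every child can be decoded back into a chemically plausible candidate by the generative model.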
These calculations relax the atomic positions, compare the relative energies of ferromagnetic and antiferromagnetic states, and analyze the electronic structure in detail. They also compute thermodynamic stability by measuring how close a compound lies to the so-called convex hull, which represents the set of most stable phases for a given chemical system. For selected cases, the team added spin–orbit coupling, which identifies the preferred orientation of magnetic moments, and used corrections for strong electron interactions in transition metals. This stage ensures that the candidates flagged by the fast predictors are indeed physically meaningful.

Applying the framework, the researchers generated about two thousand structures and screened them through the pipeline. Three antiferromagnetic semiconductors emerged as validated results: manganese sulfide (MnS), iron phosphate (FePO4), and manganese oxide (MnO). In each case the antiferromagnetic configuration was lower in energy than the ferromagnetic one, and the electronic band structures showed spin degeneracy consistent with antiferromagnetic order. The predicted band gaps were close to one electron volt for MnS and FePO4 and about one and a half electron volts for MnO, placing them in a range relevant for electronic devices. Thermodynamic analysis indicated that these compounds are stable, and the calculations pinpointed the crystallographic directions along which their spins prefer to align. Notably, the generated structures differed from those commonly listed in databases, suggesting that the model can move beyond simply reproducing known patterns.

To test the importance of the optimization step, the team also allowed the generator to produce structures without guidance and then screened a larger batch of five thousand. This run yielded two metallic antiferromagnets, lithium vanadium oxide (LiVO2) and lithium iron nitride (LiFeN).
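The convex-hull stability test mentioned above has a simple geometric meaning that a short sketch can make concrete. For a binary A–B system, plot formation energy per atom against composition; the lower convex hull of the known phases (plus the pure elements at zero) is the set of stable ground states, and a candidate's "energy above hull" is its vertical distance to that curve. The code below is a minimal illustration with invented numbers, not the DFT workflow of the paper.

```python
import numpy as np

def energy_above_hull(x, e, known_x, known_e):
    """Distance of a candidate (composition x, formation energy e per atom)
    above the lower convex hull built from known phases.

    The pure elements enter as reference points at 0 eV/atom."""
    pts = sorted(zip([0.0, 1.0, *known_x], [0.0, 0.0, *known_e]))
    hull = []  # lower hull via the monotone-chain construction
    for p in pts:
        while len(hull) >= 2:
            (x1, y1), (x2, y2) = hull[-2], hull[-1]
            # non-positive cross product: hull[-1] lies on or above the
            # chord from hull[-2] to p, so it is not a hull vertex
            if (x2 - x1) * (p[1] - y1) - (p[0] - x1) * (y2 - y1) <= 0:
                hull.pop()
            else:
                break
        hull.append(p)
    hx = [h[0] for h in hull]
    hy = [h[1] for h in hull]
    return e - np.interp(x, hx, hy)   # 0 means on the hull (stable)
```

For example, with one known stable phase at composition x = 0.5 and formation energy −1.0 eV/atom, a candidate at x = 0.25 with −0.2 eV/atom sits 0.3 eV/atom above the hull: it is metastable, because a mixture of the element and the known phase is lower in energy.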
Both showed antiferromagnetic order lower in energy than the ferromagnetic alternative, and their band structures crossed the Fermi level as expected for metals. Phonon calculations, which check vibrational stability, revealed no unstable modes. These results demonstrate that the generator can still find viable antiferromagnets without optimization, but the process is less efficient and tends to identify metals rather than small-gap semiconductors.

By comparing the two search strategies, the researchers showed that optimization steers discovery toward materials more suitable for applications where semiconducting behavior is essential. This highlights a central advantage of the pipeline: it does not simply generate random candidates but learns to bias exploration toward the regions of chemical space that matter for device design.

The broader significance of the work lies in its modular design. The generative model provides a flexible engine for proposing structures, the property predictors act as efficient filters, and the genetic algorithm offers a way to direct the search. Each component can be updated or replaced as better tools become available. For example, the authors note that adding explicit symmetry constraints could further improve the relevance of generated antiferromagnetic structures. The approach could also be adapted to other material classes where traditional search methods have struggled.

For the field of spintronics, the implications are clear. Antiferromagnets combine fast spin dynamics with robustness against external fields, making them attractive for ultrafast and densely packed devices. Yet their discovery has been slowed by the sheer size of the search space. This framework offers a reproducible route to explore beyond known prototypes and to identify both semiconducting and metallic antiferromagnets.
By coupling artificial intelligence with physical validation, the study demonstrates that new classes of magnetic materials can be uncovered in a systematic way rather than through trial and error.


By Michael Berger – Michael is the author of four books published by the Royal Society of Chemistry:
Nano-Society: Pushing the Boundaries of Technology (2009),
Nanotechnology: The Future is Tiny (2016),
Nanoengineering: The Skills and Tools Making Technology Invisible (2019), and
Waste not! How Nanotechnologies Can Increase Efficiencies Throughout Society (2025)
Copyright © Nanowerk LLC
