Researchers used China’s new Sunway supercomputer to scale neural network quantum chemistry to real molecular sizes, bridging AI and quantum science.
What if we could model the chemistry of life, the motion of every electron and the balance of every bond, not with future quantum systems but with existing supercomputing resources?
That is essentially what a team of Chinese researchers has begun to demonstrate with their work on neural-network quantum states (NNQS) using China’s newest Sunway supercomputer.
At a glance, it looks like another technical scaling milestone of the kind the machine has already demonstrated in Gordon Bell runs in years past. The project ran on 37 million processor cores and achieved 92% strong-scaling and 98% weak-scaling efficiency (meaning performance held nearly constant whether the researchers added more processors to the same problem or grew the problem size along with them).
In other words, that level of efficiency shows the algorithm and the machine working almost perfectly in sync, which is rare at this scale and essential for making quantum simulations this large practical.
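For readers who want the definitions behind those percentages, here is a small sketch in Julia (the language the team itself used, as discussed below). The runtimes are invented placeholders chosen to land near the reported figures, not measurements from the paper:

```julia
# Illustrative only: how strong- and weak-scaling efficiency are defined.
# The timings below are made-up placeholders, not numbers from the paper.

# Strong scaling: same total problem, more cores.
# efficiency = (T_base * cores_base) / (T_scaled * cores_scaled)
strong_efficiency(t_base, p_base, t_scaled, p_scaled) =
    (t_base * p_base) / (t_scaled * p_scaled)

# Weak scaling: problem size grows with core count, so ideal runtime is flat.
# efficiency = T_base / T_scaled
weak_efficiency(t_base, t_scaled) = t_base / t_scaled

# Hypothetical runs: 1,000 s on 1M cores vs. 29.4 s on 37M cores (strong),
# and 1,000 s vs. 1,020 s with the problem scaled up alongside cores (weak).
println(strong_efficiency(1000.0, 1_000_000, 29.4, 37_000_000))  # ≈ 0.92
println(weak_efficiency(1000.0, 1020.0))                         # ≈ 0.98
```

An efficiency of 1.0 would mean perfectly ideal scaling; 0.92 and 0.98 mean the machine wastes almost nothing as it grows.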
The researchers are showing that machine learning can be used to model complex quantum systems accurately enough to matter for real materials and molecules, and that it can be done on today’s largest classical supercomputers.
To understand why this matters, it helps to start with the machine itself.
The “New Sunway,” sometimes called Oceanlite, is China’s successor to the TaihuLight supercomputer. It runs on SW26010-Pro chips, each built from clusters of small compute cores that use local memory instead of cache, allowing extremely fine-grained control of data. Tens of thousands of these chips are linked together into a system with over forty million cores, capable of exascale performance, or about a billion billion calculations per second.
The thing is, this architecture is very good at regular, predictable work like the deep learning training loops found in modern AI. But quantum chemistry is a far tougher fit.
To simulate the quantum state of a molecule, researchers often use NNQS, a relatively recent technique that trains a neural network to represent the molecule’s wavefunction, in effect describing every possible way its electrons could arrange themselves and move around.
Doing this well requires generating enormous numbers of random samples from that wavefunction and, as the authors describe it, calculating something called the “local energy” for each sample. Both steps are computationally brutal, and both become uneven as the problem grows, because some configurations are more demanding to evaluate than others.
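To make that loop concrete, here is a heavily simplified sketch of the underlying variational Monte Carlo idea. The functions `psi` and `hamiltonian_row`, and the uniform sampling at the end, are toy stand-ins invented for illustration; the team’s actual NNQS-Transformer and molecular Hamiltonian are far more involved:

```julia
# A minimal sketch of variational Monte Carlo with a neural wavefunction.
# `psi` and `hamiltonian_row` are hypothetical stand-ins, not the paper's code.

# Amplitude of configuration x (a bitstring of occupied spin orbitals).
# Toy product ansatz with one learnable weight per orbital.
psi(x::Vector{Int}, w::Vector{Float64}) = exp(sum(w .* x))

# Nonzero Hamiltonian matrix elements <x|H|x'> connected to x. A real
# molecular Hamiltonian connects x to configurations differing by one or
# two occupations; here we fake a diagonal term plus single flips.
function hamiltonian_row(x::Vector{Int})
    entries = [(copy(x), Float64(sum(x)))]   # diagonal: electron count
    for i in eachindex(x)
        x2 = copy(x); x2[i] = 1 - x2[i]      # flip one orbital
        push!(entries, (x2, 0.1))            # constant off-diagonal coupling
    end
    return entries
end

# Local energy: E_loc(x) = sum over x' of <x|H|x'> * psi(x') / psi(x).
local_energy(x, w) = sum(h * psi(x2, w) / psi(x, w) for (x2, h) in hamiltonian_row(x))

# Real NNQS codes average E_loc over samples drawn from |psi|^2 via
# Markov-chain or autoregressive sampling; the uniform bitstrings below
# are a placeholder to keep the sketch short, not a correct estimator.
n_orbitals, n_samples = 8, 10_000
w = 0.01 .* randn(n_orbitals)
samples = [rand(0:1, n_orbitals) for _ in 1:n_samples]
energy = sum(local_energy(x, w) for x in samples) / n_samples
println("estimated energy: ", energy)
```

Notice where the unevenness comes from: `hamiltonian_row` can return very different numbers of terms for different configurations, so some samples cost far more than others.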
The Sunway team built a new data-parallel NNQS-Transformer designed to fit the machine’s layered architecture. Management cores handled communication, while lightweight compute cores did the heavy math inside their local memory. A smart load-balancing system kept every core busy, and the code was written in Julia for flexibility but tuned to squeeze maximum performance from the hardware.
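The paper’s balancing scheme is not spelled out here, but the general idea behind that kind of system can be sketched with a simple greedy heuristic: estimate each sample’s cost and always hand the next-heaviest task to the least-loaded worker. Everything below, including the per-sample `costs`, is a hypothetical illustration:

```julia
# A hedged sketch of cost-aware load balancing, not the team's actual scheme.
# `costs` stands in for a per-sample estimate, e.g. how many Hamiltonian
# terms a given electron configuration touches.

# Greedy assignment: give the next-heaviest task to the lightest worker.
function balance(costs::Vector{Float64}, n_workers::Int)
    loads = zeros(n_workers)
    assignment = [Int[] for _ in 1:n_workers]
    for i in sortperm(costs; rev=true)   # heaviest tasks first
        w = argmin(loads)                # currently lightest worker
        push!(assignment[w], i)
        loads[w] += costs[i]
    end
    return assignment, loads
end

costs = rand(10_000) .^ 2 .* 100         # skewed, uneven task costs
assignment, loads = balance(costs, 64)
println("load imbalance: ", maximum(loads) / (sum(loads) / length(loads)))
```

Keeping that imbalance ratio near 1.0 across tens of millions of cores is a large part of what the reported scaling efficiency reflects.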
The result is an algorithm that finally scales quantum chemistry to the same extreme levels as today’s largest AI models.
Earlier NNQS models could handle only tiny molecular systems, but this one simulated structures with up to 120 spin orbitals (the basic units describing how electrons occupy energy levels in atoms), a space of possible electron configurations far too large to enumerate directly.
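A quick back-of-the-envelope calculation shows why that number matters. Treating each spin orbital as one binary degree of freedom (occupied or empty, and ignoring particle-number constraints), the space of occupation patterns grows as 2^n:

```julia
# Rough scale only: the full space of occupation patterns for n spin
# orbitals grows as 2^n, which is why sampling replaces enumeration.
println(big(2)^120)   # ≈ 1.3 × 10^36 configurations for 120 spin orbitals
```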
It shows that neural networks can now tackle quantum problems large enough to matter for real chemistry and that Sunway can manage complex, irregular workloads once thought beyond its reach.
From a broader perspective, this matters because it hints at a bridge between classical and quantum computing. NNQS uses classical hardware to learn the behavior of quantum systems.
If these methods continue to scale, exascale supercomputers could become practical tools for discovering new materials or drugs long before useful quantum processors arrive.
The Sunway project also offers a glimpse of China’s steady progress in large-scale computing.
While technical details remain sparse, each major research paper demonstrates not just raw performance but an increasingly robust software environment. The use of open, portable languages like Julia alongside deeply tuned native code shows that this ecosystem is maturing beyond one-off benchmarks into a sustainable platform for science.
In the end, this work has all the exciting HPC trappings, record-breaking speed among them, but a closer look shows something deeper: the same architectures that train language models can also learn the hidden structure of matter itself.
If that trajectory continues, machines like Sunway could become laboratories for discovering the physical laws that quantum computers will someday harness directly.
If this kind of work keeps scaling up, the next big challenge will be moving and managing the data fast enough. Neural network quantum simulations create huge amounts of information as they run, and millions of processor cores have to share it constantly.
All of this, of course, means future systems will need even faster storage and smarter networks that can deliver data almost as quickly as it is computed.
The math may be solved, but keeping the data flowing will decide how far this approach can really go.