Supercomputers Predict New Turbulent Interactions in Fusion Plasmas

A high-resolution photo shows the inside of the Alcator C-Mod tokamak with a representative cross-section of a fusion plasma superimposed. The inset depicts the approximate extent of the plasma turbulence simulations performed as part of this work. These simulations clearly demonstrate the coexistence of long-wavelength blobs and short-wavelength “streamers” (the small, finger-like structures in the figure) that comprise the turbulence in the core of the experimental plasmas. Image courtesy of Nathan Howard.

For decades, developing practical fusion energy has been impeded by anomalously high levels of heat loss from magnetically confined plasmas. Using data from dedicated experiments, advanced codes, and supercomputing power, researchers are developing a greater understanding of the physics underlying these heat losses. By more completely capturing the dynamics of plasma turbulence across an unprecedented range of spatial and temporal scales, they have reproduced the levels of heat loss observed experimentally, which previous simulations could not.

The Impact

This work provides an explanation for a half-century-old mystery and represents a significant advance in our understanding of heat loss from fusion plasmas. Ultimately, these results will allow the development of more robust and reliable models that can be used to optimize the design and operation of fusion reactors and push the development of fusion energy forward.


A collaboration of researchers from MIT, the University of California San Diego, and General Atomics has performed the first set of realistic plasma turbulence simulations that simultaneously capture the dynamics of small- and large-scale turbulence in fusion plasmas. Motivated by the decades-old observation that measured electron heat losses from fusion plasmas exceed theoretical predictions, a set of dedicated experiments was performed on the Alcator C-Mod tokamak at the MIT Plasma Science and Fusion Center. Using the data from these experiments and more than 100 million CPU hours on National Energy Research Scientific Computing Center supercomputers, the researchers performed arguably the highest-fidelity plasma turbulence simulations to date.

These simulations demonstrated for the first time that large- and small-scale turbulence likely coexist in the core of typical fusion plasmas and exhibit significant nonlinear interactions. The synergistic interactions between large-scale turbulent eddies (generally assumed to be responsible for ion heat losses) and small-scale turbulent structures, known as electron temperature gradient (ETG) streamers, enhance the total heat loss from the plasma and provide an explanation for the anomalously high losses observed experimentally. This research demonstrated that only simulations that simultaneously capture both the small- and large-scale turbulence, as well as their interactions, can reproduce previously unexplained experimental quantities, providing strong evidence that the observations from simulation are indeed representative of reality. Most importantly, these simulations provide a clear path forward for developing models that will be used to improve the performance, operation, and design of future fusion reactors.