MIT scientists completed one of the most demanding calculations in fusion science: predicting the temperature and density profiles of a magnetically confined plasma via first-principles simulation of plasma turbulence. The researchers used an optimization methodology developed for machine learning to dramatically reduce the CPU time required while maintaining the accuracy of the solution.
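The core idea can be sketched as an optimization loop: treat the mismatch between the turbulent heat flux predicted at a candidate profile and the flux the heating sources must carry as a loss, and drive it to zero with a gradient-based update of the kind used to train machine-learning models. The snippet below is a toy illustration only, with assumptions: `turbulent_flux` is a hypothetical stand-in for an expensive gyrokinetic simulation (modeled here as stiff critical-gradient transport), and the learning rate, critical gradient, and stiffness values are invented for the example.

```python
# Toy stand-in for a turbulent-flux model: heat flux rises steeply once
# the normalized temperature gradient exceeds a critical value.
# (Hypothetical surrogate; in the real calculation a first-principles
# turbulence code evaluates this, at enormous CPU cost per call.)
def turbulent_flux(grad_T, crit=1.5, stiffness=4.0):
    return stiffness * max(grad_T - crit, 0.0) ** 2

# ML-style optimization: minimize the squared mismatch between the
# predicted flux and the target flux set by the heating sources, using
# gradient descent with a finite-difference derivative.
def match_flux(target_flux, grad_T=3.0, lr=0.005, tol=1e-8, max_iter=5000):
    for _ in range(max_iter):
        resid = turbulent_flux(grad_T) - target_flux
        if resid ** 2 < tol:           # loss = 0.5 * resid**2 is small enough
            break
        h = 1e-6                        # finite-difference step
        dflux = (turbulent_flux(grad_T + h) - turbulent_flux(grad_T - h)) / (2 * h)
        grad_T -= lr * resid * dflux    # descend the loss gradient
    return grad_T

grad = match_flux(target_flux=2.0)
print(round(turbulent_flux(grad), 3))  # converges to the target flux, 2.0
```

Because each call to the real flux model is so expensive, an optimizer that converges in few evaluations translates directly into large CPU-time savings, which is the point of borrowing machine-learning optimization machinery here.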