Description

Live data must be used more intimately and intensively in fusion science before they become stored data. Filesystems are now too small to accept the ever-growing extreme-scale simulation data. Experimental data require real-time analysis and feedback in well under a second to avoid catastrophic events and to improve experimental results. The data must be analyzed, managed, reduced, and visualized on the fly, in situ. While simulations or experiments are running, data need to be fed into machine-learning algorithms for feature detection and discovery, and utilized in transfer and "anchored" learning. Physics constraints must coexist in the in-situ data space. To combine exascale HPC capability with big experiments such as ITER, a federated data system needs to be developed. This talk will describe the necessity of, and examples for, collaborative development of these technologies in fusion science.