Macro and micro.
From the wonders of the universe to the intricacies of the atom, scientists use sophisticated simulations to further their research. And, not surprisingly, HPC is playing a key role – as it is, increasingly, in every scientific field.
As we explore the meaning of "HPC is Now" in the lead-up to SC19, let's learn how two leading scientists at opposite ends of the macro/micro spectrum are redefining their disciplines, in large part thanks to the evolution of HPC.
At the macro level – Katrin Heitmann is a physicist and computational scientist in the High Energy Physics Division at Argonne National Laboratory; a senior member of the Kavli Institute for Cosmological Physics at the University of Chicago; and a member of the Northwestern Argonne Institute of Science and Engineering (NAISE). Her research focuses on computational cosmology, in particular understanding the causes of the accelerated expansion of the universe.
At the micro level – John E. Stone is a senior research programmer with the Theoretical and Computational Biophysics Group and the NIH Center for Macromolecular Modeling and Bioinformatics, as well as associate director of the CUDA Center of Excellence at the University of Illinois at Urbana-Champaign. He is the lead developer of VMD, a high-performance molecular visualization tool used by researchers all over the world.
Michela: You work in disparate domains, but have HPC in common. What role does supercomputing play in your research?
Katrin: Today, we are observing galaxies with very large telescopes. And the larger the telescopes, the larger the supercomputers we need to run better simulations, especially to resolve faint galaxies. Each galaxy is embedded in a diffuse "halo," represented by a number of tracer particles. Our simulations follow the evolution of these tracer particles from very early times to today.
Years ago, these simulations were very crude, and took months to complete. Now, they are much more detailed and can be completed in days.
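The idea Katrin describes – following tracer particles under gravity from early times to today – is the heart of cosmological N-body simulation. The sketch below is purely illustrative (it is not the Argonne production code, and all names, units, and parameters are assumptions): a direct-sum gravity calculation with a leapfrog integrator, the basic scheme that large simulations accelerate with far more sophisticated methods and hardware.

```python
import numpy as np

# Illustrative sketch only, not a production cosmology code.
# Evolve N "tracer particles" under mutual Newtonian gravity (G = 1 units)
# with a leapfrog (kick-drift-kick) integrator.

def gravitational_acceleration(pos, masses, softening):
    """Direct-sum O(N^2) pairwise gravity with force softening."""
    diff = pos[None, :, :] - pos[:, None, :]       # (N, N, 3) separations
    dist2 = (diff ** 2).sum(-1) + softening ** 2   # softened squared distances
    inv_d3 = dist2 ** -1.5
    np.fill_diagonal(inv_d3, 0.0)                  # exclude self-interaction
    return (diff * (masses[None, :, None] * inv_d3[:, :, None])).sum(axis=1)

def leapfrog_step(pos, vel, masses, dt, softening=0.1):
    """Advance all particles by one time step (kick-drift-kick)."""
    acc = gravitational_acceleration(pos, masses, softening)
    vel_half = vel + 0.5 * dt * acc                # half kick
    pos_new = pos + dt * vel_half                  # drift
    acc_new = gravitational_acceleration(pos_new, masses, softening)
    vel_new = vel_half + 0.5 * dt * acc_new        # half kick
    return pos_new, vel_new

# Evolve a small random cluster of tracer particles.
rng = np.random.default_rng(0)
n = 64
pos = rng.normal(size=(n, 3))
vel = np.zeros((n, 3))
masses = np.ones(n)
for _ in range(100):
    pos, vel = leapfrog_step(pos, vel, masses, dt=0.01)
```

Production codes replace the O(N²) direct sum with tree or particle-mesh methods and distribute trillions of particles across a supercomputer, which is where the "larger telescopes, larger supercomputers" scaling Katrin mentions comes from.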
John: I work on applying parallel computing and high-fidelity renderings to solve large-scale molecular visualization and analysis problems. As computer scientists, our role is to make computers useful for others. So my research is centered on enabling scientific applications of HPC.
For over 20 years, our NIH center has focused on helping make molecular modeling faster to address huge computational challenges. If we solve them, there’s a benefit to human health.
Michela: What are some challenges in adapting your science to HPC technology (or vice versa)?
John: We’re seeing huge numbers of new HPC users who are just starting to kick the tires. But physicists and biologists don’t have time to become amateur software hackers. They simply need to deploy the technology. So there’s a gap to bridge.
If we can close the gap between what they’re accustomed to doing in their labs and what they need to make optimal use of supercomputer-class resources, they will have the methods they need to improve the quality of 3D atomic structures and other outcomes.
Michela: How have you seen the field evolve?
Katrin: Until recently, relatively few computational scientists used supercomputers. But now the user base is much bigger. Scientists in every area are more comfortable using these machines. The reason is simple — you can do better science when supercomputing is involved.
Michela: Regarding the SC conferences you’ve attended, what is different now versus years ago?
John: I began attending SC conferences in the 1990s, as a grad student. Those early machines were behind glass, accessible to a privileged few. They were cantankerous, unreliable, and difficult to use. People who used HPC were either computer scientists or researchers unafraid to become amateur computer scientists.
Today, supercomputing is best exploited by interdisciplinary teams. It’s commonplace to have computer scientists embedded with application domain experts, and that’s why a lot of HPC software efforts are successful.
Katrin: I’ve attended almost every year for about 15 years, and I’ve noticed several interesting trends over that time: the development of forward-looking architectures (which some thought were crazy at first); the opportunity to talk with vendors and see which ideas have survived and which haven’t; and the amazing technology developments for handling the ever-growing volumes of data we deal with.
Michela: What are your expectations for the future? Where is HPC headed?
Katrin: This is an exciting time for cosmology. We’re no longer as limited by computing power. The field is moving in two directions: higher resolution to capture more detail, and the addition of more physics to our simulations. Our simulations are now the same size as the surveys we carry out with actual telescopes, which is pretty amazing.
John: In five years’ time, my hope is that the combination of containerized HPC software, remote visualization, on-demand resource availability, and interactive supercomputing will close the gap so more non-traditional users can benefit from HPC.
Michela Taufer, PhD, General Chair, SC19
Michela Taufer is the Dongarra Professor in the Min H. Kao Department of Electrical Engineering & Computer Science, Tickle College of Engineering, University of Tennessee, Knoxville.