While the death rate from cancer has declined significantly over the past two decades, 17 million new cases of cancer were reported worldwide in 2018, and 9.5 million people died from the disease.1
Researchers around the globe are working steadily to identify new methods for detecting and treating all forms of cancer, and high-performance computing (HPC) is playing a critical role in their groundbreaking work.
Continuing our ongoing theme of how HPC is impacting various scientific fields, I have asked two leading computer scientists to share how they are using supercomputing in their research efforts with respect to cancer diagnosis and treatment.
Rick Stevens is Associate Laboratory Director for Computing, Environment and Life Sciences for the U.S. Department of Energy’s (DOE) Argonne National Laboratory. He is currently leader of Argonne’s Exascale Computing Initiative, and a Professor of Computer Science at the University of Chicago Physical Sciences Collegiate Division.
Amber Simpson is an Assistant Attending Computational Biologist in the Hepatopancreatobiliary Service in the Department of Surgery at Memorial Sloan-Kettering (MSK) Cancer Center. Specializing in medical image analysis and computer-aided surgery, her research group is focused on developing novel computing strategies for precision oncology.
Their exciting work further reveals the significance of our theme for SC19 in Denver: HPC is Now. HPC and researchers like Rick and Amber are helping improve lives today … and setting the stage for future breakthroughs in medicine and countless other fields.
Michela: Can you each provide a brief overview of your research?
Amber: My work is at the intersection of computation and medicine. To date, AI and machine learning have had little influence on the clinical care of patients. So, much of what I do is in image analysis – taking routinely acquired diagnostic images (like CT and MRI scans) and mining them for predictive information we can use to stratify oncology patients for treatment.
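The idea of mining images for predictive features can be sketched in a few lines. The sketch below is purely illustrative – the feature names, the 8-bin histogram, and the entropy cutoff are assumptions for demonstration, not the actual MSK pipeline – but it shows the general pattern: compute simple first-order statistics over a tumor region's voxel intensities, then use them to place a patient in a risk stratum.

```python
# Hypothetical sketch: first-order "radiomic" features from a tumor region,
# used to stratify patients. All names and thresholds are illustrative.
import math

def first_order_features(intensities):
    """Basic first-order statistics over tumor-region voxel intensities."""
    n = len(intensities)
    mean = sum(intensities) / n
    variance = sum((v - mean) ** 2 for v in intensities) / n
    # Shannon entropy over a coarse 8-bin histogram of intensities
    lo, hi = min(intensities), max(intensities)
    width = (hi - lo) / 8 or 1.0
    counts = [0] * 8
    for v in intensities:
        counts[min(int((v - lo) / width), 7)] += 1
    entropy = -sum((c / n) * math.log2(c / n) for c in counts if c)
    return {"mean": mean, "variance": variance, "entropy": entropy}

def stratify(features, entropy_cutoff=2.0):
    """Toy rule: more heterogeneous (higher-entropy) tumors -> 'high risk'."""
    return "high risk" if features["entropy"] >= entropy_cutoff else "low risk"

# Example: a homogeneous vs. a heterogeneous synthetic tumor region
homogeneous = [100.0] * 50 + [101.0] * 50
heterogeneous = [float(v) for v in range(100)]
print(stratify(first_order_features(homogeneous)))    # low risk
print(stratify(first_order_features(heterogeneous)))  # high risk
```

In practice such features would feed a trained statistical model rather than a hand-picked cutoff, but the stratification step – image region in, risk group out – has this shape.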
I believe my work – and Rick’s work – tries to bridge that gap between what a clinician can do today and what could be achieved through AI.
Rick: My primary interest is in how we can use advanced computing, including AI and machine learning, to develop new treatment strategies, drugs and ways of thinking about basic problems in biology.
Specifically, for cancer, we are trying to build predictive models of drug response: predicting which drugs a patient is more likely to respond to, searching for combinations of drugs that would work better together, and understanding how to prioritize experiments for expanding chemotherapy development. The machine learning models we’re developing are general, but they all have these and other potential uses.
Michela: How are you leveraging HPC in your work?
Rick: What we are doing falls into the realm of predictive oncology. We are building machine learning models that are trained on data that comes from lab experiments or patient records. That’s millions and millions of data points collected over the years.
It’s very compute-intensive. We’re developing an active learning loop, where each successive generation of experiments improves the models to make the next set of predictions. So, we’re constantly building thousands of models and testing them against each other, and using HPC to optimize the models.
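The active learning loop Rick describes can be sketched minimally. Everything here is a toy stand-in – the one-dimensional "experiment," the threshold model, and the uncertainty-based query rule are assumptions for illustration, not Argonne's actual models – but the loop structure is the point: train on current results, pick the experiment the model is least sure about, run it, and retrain on the grown dataset.

```python
# Illustrative sketch of an active learning loop (toy stand-ins throughout):
# train a simple model, query the most uncertain candidate, run the
# "experiment," and retrain each generation on the grown dataset.
import random

def run_experiment(x):
    """Stand-in for a wet-lab drug-response experiment (hypothetical oracle)."""
    return 1 if x > 0.6 else 0

def train(data):
    """Toy 'model': a threshold separating responders from non-responders."""
    pos = [x for x, y in data if y == 1]
    neg = [x for x, y in data if y == 0]
    if not pos or not neg:
        return 0.5
    return (min(pos) + max(neg)) / 2

def most_uncertain(candidates, threshold):
    """Query strategy: the candidate closest to the decision boundary."""
    return min(candidates, key=lambda x: abs(x - threshold))

random.seed(0)
labeled = [(0.1, 0), (0.9, 1)]                  # initial lab results
pool = [random.random() for _ in range(200)]    # untested candidate doses

for generation in range(10):                    # the active learning loop
    threshold = train(labeled)
    x = most_uncertain(pool, threshold)
    pool.remove(x)
    labeled.append((x, run_experiment(x)))      # new experiment refines the model

print(train(labeled))  # estimated boundary, near the true value of 0.6
```

At production scale, the "model" is thousands of competing deep learning models and each "experiment" is costly lab work, which is why HPC is needed to keep the loop turning; the control flow, though, is the same.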
Amber: The ability of a human to do these high-level interpretations — to computationally think about data and extract features or variables — is limited. A human can only hold about two to five variables in mind at any given time.
What HPC and machine learning and AI give us is an ability to consider everything, all at once, in the model. To look for patterns in the data. That’s what is so exciting right now; you could potentially find a pattern and then design a clinical trial around that pattern. Using HPC to uncover relationships in data that were previously unexploited is a really interesting area.
Michela: How has the field evolved since you began your research?
Amber: Ten years ago, the goal of many researchers was to get a paper published in a respected computing journal. That’s important, but the ivory tower of computing was never where I wanted to be.
I’ve always wanted to work in the middle ground, linking cancer research to the needs of clinicians. And what’s interesting right now is that those of us who are operating in the middle are becoming much more valued. What I find interesting is understanding how clinicians do their work, and thinking of ways to help them do that.
Rick: I’ve been involved in driving HPC initiatives with colleagues around the country for more than 20 years. We didn’t know then if it was even possible to get to exascale, but we’re there.
What is very much a pleasant surprise is how fast deep learning technology has become useful, and how fast it is still evolving. The landscape has completely changed. As it’s pushing into science, the computing community is finding really innovative ways of using machine learning.
Michela: What do you expect to see from HPC and cancer research in the next five years?
Rick: My hope is that with these big machines we can open up a lot of new ideas and approaches in cancer research. These are big, hard problems, and finding a way to make large scale systems work to solve them is very gratifying. And it continues to motivate us.
Amber: It takes a long time to have an influence on medicine. Anything we come up with today in our lab needs to be evaluated in a clinical trial format to show benefits; and, ultimately, those results may change the clinical management of patients.
We’re creating a lot of biomarkers, and we’re approaching the point of having clinical trials that assess their value. The point is to change how we treat patients. So far, with AI, we’re not there yet. But we’re getting closer.
Michela Taufer, PhD, General Chair, SC19
Michela Taufer is the Dongarra Professor in the Min H. Kao Department of Electrical Engineering & Computer Science, Tickle College of Engineering, University of Tennessee, Knoxville.