Description

As data sets from experimental user facilities and international instrument collaborations grow in both size and complexity, there is an urgent need for new capabilities to transfer, reduce, analyze, store, search, and curate data in order to facilitate scientific discovery. Supercomputing facilities around the world have begun to expand their services and provide new capabilities for both communities in support of experimental workflows, from preprocessing to complete analysis. This panel will offer a lively exchange between supercomputing facility staff and the application community, including users of experimental facilities. We will address the following questions:
- What are the biggest pain points for experimental scientists in using HPC?
- What steps are being taken to address these pain points?
- How do experimental facilities partner with HPC centers?
- How do we design a sustainable model for collaboration, one that incorporates both technology development and business considerations?