Future Trends in Nuclear Physics Computing

Given new experiments starting up and on the horizon, and data volumes growing rapidly even at small experiments, the Nuclear Physics community has in recent years been considering the next generation of data processing and analysis workflows that will maximize science output. One venue for this discussion has been the workshop series "Future Trends in Nuclear Physics Computing". The series was initiated by Jefferson Lab and is connected to the monthly "Software & Computing Round Table", jointly organized by Brookhaven National Laboratory, the HEP Software Foundation, and Jefferson Lab.

The design and development of software & computing constitute a cornerstone of the future success of the Nuclear Physics program. "Future Trends in Nuclear Physics Computing" will serve as a forum to discuss priorities for design and development as input for a community white paper to inform the next Long Range Plan for Nuclear Science. Each day of the workshop will have a theme to frame the discussion: "Where are we as a community?", "How can we make analysis easier?", and "How can we scale computing up and down?"

The third workshop was held online in 2020 with 207 participants. We focussed on the Nuclear Physics software & computing community itself. We identified what is unique about our community and discussed how we could strengthen common efforts and chart a path for software & computing in Nuclear Physics for the next ten years. 

The second workshop, held in 2017 with 56 participants, started with an afternoon symposium on forward-looking topics, followed by 2.5 days of plenary talks and discussions. We examined our hardware and software strategy on a ten-year horizon. Our goal was to work towards a common vision for software & computing in Nuclear Physics and to recommend future directions for development. In our discussions, we adopted a data perspective and focussed on three areas: resource management and the interplay of I/O, compute, and storage; machine learning for enhancing scientific productivity and appropriate task-based approaches; and software portability, reusability, and common infrastructure components.

In the first workshop, held in 2016 with 77 participants, we discussed trends in scientific computing and collected ideas on how to improve analysis workflows at existing Nuclear Physics programs and how to develop analysis techniques and tools for future projects such as the Electron-Ion Collider. We reviewed best practices in scientific computing and the future development of analysis techniques and tools, and we examined how emerging software & computing trends such as big data and machine learning could improve productivity at existing and future experiments.
