Jefferson Lab’s Center for Theoretical and Computational Physics is working with partner institutions to prepare for research in the age of quantum computing
Nuclear physicists have long pushed the envelope in supercomputing in efforts to shed light on the smallest bits of matter. Now, they’re taking aim at next-generation quantum computers in hopes of getting closer to a full understanding of the secret workings of the subatomic realm that gives rise to our visible universe.
Theoretical nuclear physicists based at the U.S. Department of Energy’s Thomas Jefferson National Accelerator Facility are now part of the newly created Co-design Center for Quantum Advantage (C2QA), one of five National Quantum Information Science (QIS) Research Centers established in support of the National Quantum Initiative Act to advance QIS technology.
The C2QA is an interdisciplinary, multi-institutional center that will focus on quantum computing. The center is led by DOE’s Brookhaven National Laboratory and includes national labs, research centers, universities and industrial partners who aim to build the fundamental tools necessary for scalable, distributed and fault-tolerant quantum computer systems.
“We’re happy to be a part of this center, which we hope will advance our computation efforts and research programs in nuclear physics,” said Jianwei Qiu, Jefferson Lab associate director for Theoretical and Computational Physics.
The center brings together world-leading experts in quantum information science, materials science, computer science and theory to resolve performance issues with today’s quantum computers by co-designing software and hardware.
According to Qiu, Jefferson Lab theorists will work with the center to develop the specialized algorithms, or software, that will be needed for quantum computation of observables in nuclear physics.
Today, theoretical nuclear physicists use lattice quantum chromodynamics for such computations. Quantum chromodynamics, or QCD, is the complex theory that describes how subatomic particles interact.
“Lattice QCD is a formulation of QCD that we use to calculate the properties of the nucleus to better understand the fundamental forces inside,” said Robert Edwards, a theoretical nuclear physicist in Jefferson Lab’s Center for Theoretical and Computational Physics.
However, QCD is so complex that some goals remain out of reach, even for lattice QCD running on ever more efficient supercomputers. These include calculating the complex interactions between subatomic particles that take place every millisecond in experiments conducted with Jefferson Lab’s Continuous Electron Beam Accelerator Facility, a DOE User Facility.
“What we could do on a quantum computer is, in fact, formulate the theory closer to how it’s written in textbooks,” Edwards said. “We already have a clear understanding of how we would make these calculations, but the difficulty at this stage is ensuring that the quantum computers can be scaled up far enough to make these calculations.”
Edwards, who will lead the Jefferson Lab effort in the C2QA, said that this opportunity to begin designing the software alongside development of the hardware will enable nuclear physicists to formulate and test the ever-more-complex algorithms that will be needed for these computations and ensure that they’re ready to go as the hardware is able to run them.
“There are quite a few steps that we can make along the way in the formulation of the algorithms,” he said. “What we’ve already done – those techniques and that understanding – is translatable to these quantum methods. But, we would use them to tackle something that we can’t do even with all the high power we’ve developed in current computers.”
DOE release: White House Office of Science and Technology Policy, National Science Foundation and Department of Energy Announce Over $1 Billion in Awards for Artificial Intelligence and Quantum Information Science Research Institutes
Contact: Kandice Carter, Jefferson Lab Communications Office, email@example.com