
  • Thomas Jefferson National Accelerator Facility (Jefferson Lab) provides scientists worldwide with access to its unique particle accelerator, known as the Continuous Electron Beam Accelerator Facility (CEBAF), to probe the most basic building blocks of matter by conducting research at the frontiers of nuclear physics (NP) and related disciplines. In addition, the lab capitalizes on its unique technologies and expertise to perform advanced computing and applied research with industry and university partners, and provides programs designed to help educate the next generation in science and technology.

    The majority of computational science activities at Jefferson Lab focus on these areas: large-scale, numerically intensive Lattice Quantum Chromodynamics (LQCD) calculations; modeling and simulation of accelerators and experiment detectors; fast data acquisition and streaming data readout; high-throughput computing for analysis of experimental data; and large-scale distributed data storage and management.

    Many Jefferson Lab scientists and staff members lead or actively participate in the computational efforts in the areas above. Among them are computer/computational scientists and computing professionals from the newly formed Computational Sciences and Technology (CST) division, physicists from the Physics Division and the Center for Theoretical and Computational Physics, and accelerator physicists from the Center for Advanced Studies of Accelerators (CASA). In addition, collaborations with university and industrial partners further research and development in computational science.

    Jefferson Lab maintains various state-of-the-art high-performance computing resources on site. CSGF students will use these resources to carry out their research in the specific areas described below:

    Accelerator Modeling

    CASA and the Jefferson Lab SRF Institute focus on advanced algorithms, such as fast multipole methods, for multiparticle accelerator dynamics simulations; artificial intelligence (AI) and machine learning (ML) applied to superconducting RF (SRF) accelerator operations; and integrated large- and multi-scale modeling of SRF accelerator structures. These areas will be an essential part of a national strategy to optimize DOE operational facility investments and to strengthen Jefferson Lab's core competency of world-leading SRF advanced design and facility operations. In particular, current active simulation projects such as electron cooling, intra-beam scattering, and coherent synchrotron radiation span diverse research domains, from numerical algorithm development to parallel computing.
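    To see why fast multipole methods matter for multiparticle simulations, consider the direct pairwise force sum they are designed to accelerate. The sketch below is purely illustrative (1D point charges, unit constants, an invented softening term), not lab code: it shows the O(N²) cost that FMM reduces to roughly O(N).

```python
# Illustrative direct O(N^2) pairwise force sum for N point charges in 1D.
# Units, charges, and the softening constant are all invented for this sketch.
import random

def direct_forces(positions, charges):
    """Direct Coulomb-like force sum: every pair interacts, O(N^2) work."""
    n = len(positions)
    forces = [0.0] * n
    for i in range(n):
        for j in range(n):
            if i == j:
                continue
            dx = positions[i] - positions[j]
            # inverse-square kick with a small softening term to avoid blowup
            forces[i] += charges[i] * charges[j] * dx / (abs(dx) ** 3 + 1e-3)
    return forces

random.seed(0)
pos = [random.uniform(-1.0, 1.0) for _ in range(100)]
q = [1.0] * 100
f = direct_forces(pos, q)
# Newton's third law: internal forces cancel, so the total is ~0
print(abs(sum(f)) < 1e-6)
```

    A fast multipole method replaces the inner loop over distant particles with truncated multipole expansions of groups of charges, which is what makes million-particle beam-dynamics runs tractable.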

    Streaming Data Readout

    With tremendous advances in microelectronics and computing technologies over the last decade, many nuclear physics and high-energy physics experiments are taking advantage of these developments by upgrading their existing triggered data acquisition to a streaming readout (SRO) model, whereby detectors are continuously read out in parallel streams of data. An SRO system, which can handle up to 100 Gb/s of data throughput, provides a pipelined data analysis model for nuclear physics experiments in which data are analyzed and processed in near real time. Jefferson Lab is leading a collaborative research and development effort to devise SRO systems not only for CEBAF 12 GeV experiments but also for the upcoming EIC facility. SRO development offers CSGF students exciting research areas such as network protocol design, high-speed data communication, high-performance data compression, and distributed computing.
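    The pipelined idea can be sketched in a few lines. This is a toy model with invented data shapes and thresholds, not Jefferson Lab's SRO software: a generator plays the role of one continuous readout stream, and a software filter stage replaces the hardware trigger by selecting interesting samples on the fly.

```python
# Toy sketch of streaming readout: a continuous sample stream feeding a
# pipelined software trigger. Fields and thresholds are invented.
import random

def detector_stream(n_samples, seed=1):
    """Simulate one parallel readout stream of raw detector samples."""
    rng = random.Random(seed)
    for t in range(n_samples):
        yield {"channel": rng.randrange(8),
               "timestamp": t,
               "amplitude": rng.gauss(10.0, 3.0)}

def software_trigger(stream, threshold=15.0):
    """Pipelined stage: keep only samples above an amplitude threshold."""
    for sample in stream:
        if sample["amplitude"] > threshold:
            yield sample

# Stages are generators, so samples flow through one at a time (near real
# time) rather than being buffered into a triggered snapshot.
kept = list(software_trigger(detector_stream(10_000)))
print(f"kept {len(kept)} of 10000 samples")
```

    In a real SRO system each stage would run on separate hardware connected by a high-speed network, but the composition of independent, continuously running stages is the same.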

    Physics Data Analysis

    Analysis of data from modern particle physics experiments uses technically advanced programming and computing techniques to handle large volumes of data. One must not only understand aspects of parallel programming in modern languages such as C/C++, Java, and Python, but also incorporate knowledge of experimental techniques, including error propagation and estimation, in order to properly interpret the results. This work ranges from writing a single algorithm used in event reconstruction, to using collections of algorithms written by others, to managing campaigns at HPC facilities that apply these algorithms to large datasets. Detector calibration and final physics analysis are also significant parts of the analysis chain. CSGF students could participate in any of these areas.
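    As a small, self-contained illustration of the error propagation mentioned above (the numbers and function name are invented for this example): for a ratio f = a/b of independent measurements, the relative uncertainties add in quadrature, (δf/f)² = (δa/a)² + (δb/b)².

```python
# Standard linear error propagation for f = a / b with independent
# uncertainties da and db. Inputs below are made-up example values.
import math

def ratio_with_error(a, da, b, db):
    """Return (f, df) for f = a / b, propagating independent errors."""
    f = a / b
    df = abs(f) * math.sqrt((da / a) ** 2 + (db / b) ** 2)
    return f, df

# e.g. a measured yield divided by a normalization factor
f, df = ratio_with_error(1200.0, 40.0, 60.0, 3.0)
print(f"f = {f:.2f} +/- {df:.2f}")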

    Machine Learning

    Rapid developments in hardware computational power and ever-increasing datasets have led to explosive growth in machine learning, especially deep learning, techniques. These techniques promise to change nearly every facet of modern life, and nuclear physics is no exception. At Jefferson Lab, machine learning is being developed for every step of the physics workflow. To deliver beam to the experimental halls, the accelerator relies on radio frequency (RF) cavities to accelerate the electrons. Occasionally these cavities, of which more than 400 are in operation around the accelerator, fault, disrupting the delivery of beam to experiments. AI is being developed and deployed to quickly identify and diagnose cavity faults. Experiments themselves are developing and deploying AI to monitor detector performance, decide which data to keep, reconstruct detector responses, simulate the detectors, and even analyze collected data. With the active development of machine learning tools and techniques, Jefferson Lab hopes to drive nuclear physics research forward, enabling physicists to more quickly obtain and analyze high-quality data.
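    A minimal sketch of the cavity-fault-detection idea follows. Everything here is synthetic and invented (the waveform model, the fault signature, the score threshold); the lab's actual systems use trained deep networks on real RF signals. The sketch only shows the core anomaly-detection pattern: score each waveform by its deviation from a normal-operation template and flag outliers.

```python
# Illustrative anomaly detection on synthetic RF-like cavity waveforms.
# A "fault" is modeled as an amplitude collapse mid-pulse; all parameters
# are invented for this sketch.
import math
import random

def make_waveform(fault, n=64, seed=0):
    """One noisy sinusoidal pulse; a fault collapses its second half."""
    rng = random.Random(seed)
    wave = [math.sin(2 * math.pi * i / n) + rng.gauss(0, 0.05) for i in range(n)]
    if fault:
        for i in range(n // 2, n):
            wave[i] *= 0.2
    return wave

def fault_score(wave, template):
    """Mean squared deviation from the normal-operation template."""
    return sum((w - t) ** 2 for w, t in zip(wave, template)) / len(wave)

n = 64
template = [math.sin(2 * math.pi * i / n) for i in range(n)]
normal = make_waveform(fault=False, seed=1)
faulty = make_waveform(fault=True, seed=2)
# Normal pulses score near the noise floor; faulted pulses score far above it.
print(fault_score(normal, template) < 0.01 < fault_score(faulty, template))
```

    Replacing the fixed template and threshold with a model learned from labeled fault archives is what turns this pattern into the ML systems described above.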

  • JLEIC Detector and IR Study Group

    • Brindza, Paul
    • Camsonne, Alexandre
    • Diefenthaler, Markus
    • Elouadrhiri, Latifa
    • Ent, Rolf
    • Fenker, Howard
    • Furletova, Julia
    • Gaskell, Dave
    • Hyde, Charles (ODU)
    • Horn, Tanja  (CUA)
    • Hoskins, Joshua (UVA)
    • Kalicy, Greg  (CUA)
    • Keppel, Cynthia
    • Lawrence, David
    • Lin, Fanglei
    • Montgomery, Rachel (Glasgow)
    • Morozov, Vasiliy
    • Nadel-Turonski, Pawel (Stony Brook)
    • Park, Kijun (HU)
    • Ploen, Christine (ODU)
    • Rossi, Patrizia
    • Sullivan, Michael (SLAC)
    • Ungaro, Maurizio
    • Wei, Guohui
    • Weiss, Christian
    • Yoshida, Rik
    • Zhao, Zhiwen (Duke)
    • Zhang, Yuhong

    The affiliations of the members are Jefferson Lab, unless otherwise noted.

     

  • EIC Center Advisory Board

    Name              Affiliation
    Alberto Accardi   Hampton University
    Peter Arnold      University of Virginia
    Ian Cloet         Argonne National Laboratory
    Rolf Ent          Jefferson Lab
    Keith Griffioen   College of William and Mary
    Charles Hyde      Old Dominion University
    Mark Pitt         Virginia Tech
    Christian Weiss   Jefferson Lab
    Yuhong Zhang      Jefferson Lab
  • How to use Superscripts, Subscripts, and Greek Characters, EIC² Ω

    To add Super or Sub Scripts to the page body:

    1. At the bottom of the text editor, make sure Text format is set to "Full HTML".
    2. Use the Superscript and Subscript buttons in the editor (second row, fifth and sixth buttons).
    3. Example: EIC² H₂O

    To add Super or Sub Scripts to the page title:

    1. To use a subscript in the title, use the <sub> tag, e.g., "This text contains <sub>subscript</sub> text."
    2. To use a superscript in the title, use the <sup> tag, e.g., "This text contains <sup>superscript</sup> text."

    To add Greek Characters to page body: 

    1. To add Greek characters, copy and paste them from https://www.w3schools.com/charsets/ref_utf_greek.asp (e.g., Ω β ψ).

    To add Greek Characters to page Title: 

    1. Copy and paste the characters into the title, just as in the step above.


  • Detector Testing Capabilities in Hall B

    Detector testing capabilities in Hall B (Summer 2020)

    Space available is indicated on the drawing: https://www.jlab.org/sites/default/files/eiccenter/HallBTesting.png

    - Areas A and B are for noninvasive or minimally invasive tests, which can run simultaneously with CLAS12; Area A is for small setups. Area C is for dedicated tests that cannot run simultaneously with CLAS12.

    - For parasitic, noninvasive, or minimally invasive tests:

    Small setups can be placed upstream of the CLAS12 target at the location of BAND; if BAND is in use, they can be placed just upstream or just downstream of it. They will see particles coming from the target at large angles. In principle, some small detectors can be placed in front of the FTOF, at the edges of the sectors where no LTCC or RICH is installed.

    - Larger setups can be installed on the platform between the forward carriage and the downstream alcove, with some modifications of the downstream beam line. In this case they will see particles going at small angles, and/or an additional scattering chamber with a thin target could be installed there.

    - For dedicated tests, in addition to this location, the space between the tagger and the solenoid can be used.

    There is also space downstream of the CLAS12 detectors. For example, behind the calorimeters there is enough space to test any type of muon detector (small ones can be mounted directly on the forward carriage, large ones on a platform downstream of it). Another good space is between the R3 drift chambers and the FTOF, for sectors that do not have an LTCC or RICH. In both places the particle type and momentum are known before the detector, making them good locations for tracking detector tests.

    Some caveats:

    - In location A the composition of particles is not well known. This location can be used to test detector operation in a magnetic field, and is suitable for small detectors only.

    - In location B a secondary target (foil or wire) can be installed, which allows the particle flux to be estimated. This location can be used, for instance, to test tracking detectors.

    There is also space between the CTOF/CND PMTs and BAND. It is a high-magnetic-field area close to the sensitive SVT electronics, so it is not well suited to testing detectors that require frequent access or may generate electronic noise.

    Engineering support: Bob Miller can provide engineering support and design stands for these experiments. A couple of stands left over from CLAS may also be useful.

    DAQ: Hall B staff can provide full support for the DAQ/trigger. Depending on the actual setup, some hardware may need to be purchased.

    HV power supplies to operate PMT-based detectors: some can be provided if the number of channels is not too large.