
Introduction - Specifications That Drive Design

A detailed investigation of QCD and confinement through the high-statistics photoproduction of mesons at Jefferson Lab has been proposed. This long-term experimental program would require the construction of a new experimental hall and a hermetic $4\pi$ detector capable of measuring energies and momenta of neutral and charged particles with very good resolution [1]. It is the goal of this workfest to identify the hardware and software requirements for the implementation of an efficient trigger and data acquisition system for this detector.

The Hall D detector is expected to run with tagged incident photon fluxes on target from $10^7$/s to $10^8$/s. With an incident electron beam energy of 12 GeV (3 $\mu$amps) and a $10^{-4}$ radiator, the total $\gamma p$ cross section leads to a hadronic rate into the detector (at the high flux) of around 360 kHz. By tagging the incident photon flux around the coherent bremsstrahlung peak (at 9 GeV), the rate of interesting events is reduced to about 14 kHz.
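As a sanity check on the numbers above, the tagged-peak rate is only a few percent of the total hadronic rate. A minimal sketch, using only the two rates quoted in the text:

```python
# Rates quoted in the text (high-flux running).
total_hadronic_rate = 360e3  # Hz, total gamma-p hadronic rate into the detector
tagged_rate = 14e3           # Hz, events tagged near the 9 GeV coherent peak

# Fraction of the hadronic rate that a tagger coincidence would select.
fraction = tagged_rate / total_hadronic_rate
print(f"tagged fraction: {fraction:.1%}")  # -> tagged fraction: 3.9%
```

This small fraction is why a trigger that selects on reconstructed photon energy, rather than a simple tagger coincidence, is needed at high intensity.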

The primary triggering difficulty that must be addressed is that, for the high-intensity experiments, a coincidence with the photon tagger becomes ineffective at discriminating the events of interest; hence, the incident photon energy must be reconstructed from the event in the detector. Ultimately, a full reconstruction of the event would be needed to identify the photon of interest. We propose to implement a two-stage trigger: one stage in hardware and one in software.
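Taking the reduction factors described in the following paragraphs at face value (3:1 in the hardware stage, 10:1 in the software stage), a back-of-envelope rate budget for the two-stage trigger looks like this (the intermediate rates quoted in the text are rounded somewhat differently, so treat these figures as order-of-magnitude only):

```python
# Two-stage trigger rate budget, using only rates and factors from this section.
total_hadronic = 360e3              # Hz, total hadronic rate at high flux
after_level1 = total_hadronic / 3   # 3:1 hardware (Level 1) reduction
after_level2 = after_level1 / 10    # 10:1 software (Level 2) reduction
print(f"into Level 2:   {after_level1 / 1e3:.0f} kHz")
print(f"out of Level 2: {after_level2 / 1e3:.0f} kHz")
```

The 12 kHz surviving both stages is in the same ballpark as the roughly 14 kHz of interesting tagged events quoted above.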

A Level 1 hardware trigger can be implemented using all prompt information in the detector, including track counts and energy sums. These can be gathered parasitically from the front-end data stream. If the data at the front end can be pipelined, a deadtimeless readout scheme can be implemented upon a positive Level 1 decision (within 1-2 $\mu$sec). In addition, by clocking changes in the detector on the order of its response (250 MHz), the Level 1 trigger itself can be effectively pipelined. The goal of Level 1 is a 3:1 reduction in the total hadronic rate, obtained by cutting those events generated by very low energy photons (less than 2 GeV). This would result in an input rate into Level 2 of around 160 kHz.
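The numbers above fix the minimum depth of such a front-end pipeline: at a 250 MHz clock, a 2 $\mu$sec Level 1 latency requires 500 cells of buffering per channel. A minimal sketch of the idea follows; the class and its interface are hypothetical (the real pipeline would live in front-end hardware, not software):

```python
from collections import deque

CLOCK_HZ = 250e6   # detector-response clock quoted in the text
LATENCY_S = 2e-6   # upper end of the 1-2 microsecond Level 1 decision window
DEPTH = int(round(CLOCK_HZ * LATENCY_S))  # samples that must be buffered: 500


class Level1Pipeline:
    """Hypothetical per-channel pipeline: samples circulate for one full
    Level 1 latency; on an accept, the correspondingly delayed sample is
    read out, so the front end never stops digitizing (deadtimeless)."""

    def __init__(self, depth=DEPTH):
        self.buf = deque(maxlen=depth)

    def clock(self, sample, level1_accept):
        # The sample emerging now entered `depth` ticks ago, i.e. exactly
        # one Level 1 latency in the past.
        delayed = self.buf[0] if len(self.buf) == self.buf.maxlen else None
        self.buf.append(sample)
        return delayed if level1_accept else None
```

Clocking 500 samples through and then asserting an accept returns the very first sample, one full latency after it entered, illustrating why the accept can be formed at leisure without losing data.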

Level 2 must make a more accurate cut on incident photon energy - on the order of a 10:1 reduction. To make such a cut, Level 2 must do a good job of reconstructing tracks through multiple detector elements. This stage is compute intensive and must come after ``event building''. To handle the 100 kHz input rate, events will have to be grouped in blocks (e.g. 64-256 events). Built event blocks can then be passed to the Level 3 processor farm. Use of an online farm gives the Hall D DAQ system both flexibility and scalability.
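The blocking step itself is simple; a minimal sketch follows, with the block size and input rate taken from the text (the generator interface is illustrative, not a proposed API):

```python
BLOCK_SIZE = 256        # upper end of the 64-256 range quoted above
INPUT_RATE_HZ = 100e3   # Level 2 input rate quoted above


def block_events(built_events, block_size=BLOCK_SIZE):
    """Group built events into fixed-size blocks for the Level 3 farm."""
    block = []
    for event in built_events:
        block.append(event)
        if len(block) == block_size:
            yield block
            block = []
    if block:  # flush any partial block at end of run
        yield block


# At 100 kHz and 256 events per block, the farm sees roughly 391 blocks/s.
print(f"{INPUT_RATE_HZ / BLOCK_SIZE:.0f} blocks/s to the Level 3 farm")
```

Blocking amortizes the per-transfer overhead of handing events to the farm: the farm schedules a few hundred block transfers per second instead of $10^5$ individual events.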

Running experiments at the higher luminosities has implications for the detector itself. First, for triggering and reconstruction without the tagger, the detector must be hermetic. The position and energy resolutions required of the tracking chambers and calorimeters are not too restrictive (200 $\mu$m, 5-10%), but the segmentation and response must be good enough to handle the high rates.




David Abbott
1/5/2000