August 4, 2015
For a laboratory, one of the primary roles of the staff of the Department of Energy is to provide oversight. Whatever we do is prescribed and proscribed by several layers of rules and regulations. Some of these are set by Congress, and these get built into the contract that Jefferson Science Associates has with the DOE. This contract is a living beast, and a senior member of the Thomas Jefferson Site Office, Wayne Skinner, is the Contracting Officer. When DOE orders change, there is an exchange with the lab (often with our Chief Financial Officer, Joe Scarcello) and the contract gets amended. Every time DOE sends more money, it is "put on the contract."
At a slightly less formal level, we get guidance in our Performance Evaluation Measurement Plan (the PEMP). The categories of PEMP evaluation may contain Notable Outcomes, which are specific performance goals associated with (to cite a current example) our handling of the Hall C Dipole and Hall B Torus magnets within the 12 GeV CEBAF Upgrade Project. In previous Montage articles, we have often discussed the Office of Science Project Reviews executed by the Office of Project Assessment; we typically get two of these each year for the upgrade project, and two or so each year for the Utilities Infrastructure Modernization plan. The individual program offices (Nuclear Physics, High Energy Physics, Basic Energy Sciences) also oversee their programs.
The Office of Nuclear Physics uses Operations Reviews, Science and Technology visits, and Science and Technology Reviews. The latter involve a team of our peers in accelerator science, nuclear physics and technology. Once upon a time they occurred annually; currently they are conducted every two to three years. We had one in 2012 and then again last week, hence this article.
All reviews and audits structure their reports in a particular way, and the conventions can differ from one review to another. In our science and project reviews, the review team picks out points that were made by the reviewees, or that appear in the briefing material, and that are strictly factual. These are called Findings and are neutral. When the review team wants to draw attention to something or to suggest a direction, it has two options. The first is to insert a Comment: "It could be very important to consider the possible consequences of actions A or B in response to eventuality C." The other is to make a Recommendation: "The laboratory should complete the mitigation plan for the problem Y it has encountered in Z before July next year." At this year's Science and Technology Review, we reported on the disposition of the 2012 recommendation. Often, before we get the final report, the reviewers write individual letters to the agency, which are transmitted to us in redacted form. The Findings, Comments, and Recommendations themselves are delivered verbally, then in writing, immediately following the review, from the whole panel and the DOE observers.
This past week was our chance as a laboratory to "strut our stuff" in the Science and Technology Review. Bob McKeown led the preparation. Several senior managers, division leaders and others coordinated the development of the agenda and of the presentations. Two complete days of dry runs took place, and many hours were spent in preparation. It sounds trivial, but it is important that the review agenda be adhered to as far as possible. This means that a talk has to be planned and executed, first and foremost, within its time slot. Often there is enough material to fill the available time several times over, so slashing and burning (triage, if you wish) is the order of the day.
Tuesday and part of Wednesday were filled with plenary presentations. We provided a tour of the lab, and then several more hours were spent in three separate breakout sessions, where we try to feature our younger scientists and engineers. Lots of talks, all well done; lots of beautiful information; opinions from the Chair of the Users Group Board of Directors and from the Chair of the PAC; no stone left unturned. There were also questions from Day 1 to be answered at 8 a.m. on Day 2, and questions from Day 2 to be answered at 8 a.m. on Day 3. The team came together in B207 to facilitate verbal exchanges and to collectively moderate their report. [Of course, I was not in the room, so a little of this is conjecture.]
At 1 p.m. on Thursday, July 30, Gulshan Rai, the Medium Energy Physics Program manager, presented the report. Section by section: Findings many, Comments numerous, Recommendations None; Findings many, Comments numerous, Recommendations None. And so it went, Experimental Program, Theory Program, Facility Operations and Future Facility Upgrades, Scientific and Technical Staff, Scientific Community Interactions, Management.
Recommendations None! Of course, we will work with the Comments, they will help us, but this was superb. Well Done.