# Introduction
This software takes raw data from the CAEN CoMPASS digitizer DAQ and produces an event-built data set for use in further physics analysis. It is intended for the DAQ setup of the Super-Enge Split-Pole Spectrograph (SESPS) and the Silicon Array for Branching Ratio Experiments (SABRE) at Florida State University. A more general skeleton code is provided in the [EventBuilder_Skeleton](https://github.com/sesps/EventBuilder_Skeleton) repository for use with other setups. This wiki goes through the concepts of event building, outlines some of the specific methods used in the code, and collects commonly asked questions and answers about using the software.
# Event Building
In this software, event building refers to the organization of several detector events (or "hits") into a single built event based on time correlations. This involves time-ordering the data, combining several data sources, and identifying and correlating hits across those sources. In general, time correlations are defined by a coincidence window: the time interval within which two hits are considered coincident.
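As a concrete illustration, a pairwise coincidence test might look like the following minimal sketch; the `Hit` structure and function name here are hypothetical, not the event builder's actual types.

```cpp
#include <cstdint>

// Hypothetical hit type for illustration only.
struct Hit {
    uint64_t timestamp; // digitizer timestamp (CoMPASS reports these in picoseconds)
    int board;
    int channel;
};

// Two hits are coincident if their timestamps differ by no more than the window.
bool IsCoincident(const Hit& a, const Hit& b, uint64_t window)
{
    uint64_t diff = (a.timestamp > b.timestamp) ? a.timestamp - b.timestamp
                                                : b.timestamp - a.timestamp;
    return diff <= window;
}
```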
Related to coincidence windows is the concept of dead time. Dead time refers to any time interval over which the physical detector, the DAQ hardware, or the DAQ software is blind to incoming data, during which data could potentially be lost. The goal of event building software is to accurately build events while minimizing the dead time injected into the detection system. For more details on dead time, its types, and its corrections, see Knoll, _Radiation Detection and Measurement_. In the context of event building, the larger the coincidence window, the greater the potential dead time incursion; this software, however, takes several steps to avoid dead time penalties.
The event building pipeline is as follows: time-shifting and time-ordering, then slow sorting, then fast sorting (optional), and finally basic analysis.
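For orientation, the stages might be chained as in the following structural sketch. Every name and signature here is a placeholder (stage bodies are omitted), not the event builder's actual API; each stage is sketched more concretely in the sections below.

```cpp
#include <string>
#include <vector>

// Placeholder types for illustration only.
struct Hit { unsigned long long timestamp; int board; int channel; };
using Event = std::vector<Hit>;

// Hypothetical stage signatures mirroring the pipeline order described above.
std::vector<Hit>   TimeShiftAndOrder(const std::vector<std::string>& channelFiles);
std::vector<Event> SlowSort(const std::vector<Hit>& hits, unsigned long long slowWindow);
std::vector<Event> FastSort(const std::vector<Event>& events, unsigned long long fastWindow);
void               Analyze(const std::vector<Event>& events);

void RunPipeline(const std::vector<std::string>& channelFiles,
                 unsigned long long slowWindow, unsigned long long fastWindow)
{
    auto hits   = TimeShiftAndOrder(channelFiles); // merge files, apply shifts
    auto events = SlowSort(hits, slowWindow);      // group hits into built events
    events      = FastSort(events, fastWindow);    // optional refinement stage
    Analyze(events);                               // basic physics analysis
}
```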
## Time-Shifting and Time-Ordering
As previously mentioned, minimizing dead time generally means minimizing the width of the coincidence windows. To this end, the event builder allows the user to apply shifts to the timestamps of hits on a channel-by-channel (or board-by-board) basis. By shifting one detector's timestamps so that two detectors are synchronized, the coincidence window needs only be wide enough to capture the timing distribution between the two detectors, rather than that distribution plus a constant time offset. Typically, in the SESPS-SABRE setup, shifts are applied to the anode and SABRE data so that these hits are synchronized with the scintillator data, making the delay-line data the only significant timing spread within the slow coincidence window.
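A minimal sketch of how such per-channel shifts could be applied is shown below; the `Hit` structure and the (board, channel)-keyed shift map are assumptions for illustration, not the event builder's actual shift-file format.

```cpp
#include <cstdint>
#include <map>
#include <utility>
#include <vector>

// Hypothetical hit type for illustration only.
struct Hit {
    uint64_t timestamp; // raw digitizer timestamp
    int board;
    int channel;
};

// Assumed shift table keyed by (board, channel); a board-level shift scheme
// would key on the board number alone. Shifts may be negative.
using ShiftMap = std::map<std::pair<int, int>, int64_t>;

void ApplyShifts(std::vector<Hit>& hits, const ShiftMap& shifts)
{
    for (Hit& hit : hits) {
        auto it = shifts.find({hit.board, hit.channel});
        if (it != shifts.end())
            hit.timestamp += it->second; // align this channel with the reference detector
    }
}
```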
Data from the CoMPASS DAQ is typically written as one raw binary file per digitizer channel (in the full SESPS-SABRE setup this results in around 145 files). Each of these files is by definition time-ordered; for event building, however, they must be merged into a single time-ordered stream so that coincidence analysis can be performed.
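Since each file is already ordered, combining them is a classic k-way merge. Below is a minimal in-memory sketch using a min-heap keyed on timestamp; the real code streams hits from the binary files rather than holding them all in memory, and the `Hit` structure is again illustrative.

```cpp
#include <cstddef>
#include <cstdint>
#include <queue>
#include <vector>

// Hypothetical hit type for illustration only.
struct Hit { uint64_t timestamp; int board; int channel; };

// Merge several individually time-ordered hit lists into one ordered stream.
// The min-heap always pops the earliest pending hit across all sources.
std::vector<Hit> MergeTimeOrdered(const std::vector<std::vector<Hit>>& sources)
{
    struct Cursor { std::size_t src; std::size_t idx; uint64_t time; };
    auto later = [](const Cursor& a, const Cursor& b) { return a.time > b.time; };
    std::priority_queue<Cursor, std::vector<Cursor>, decltype(later)> heap(later);

    for (std::size_t s = 0; s < sources.size(); ++s)
        if (!sources[s].empty())
            heap.push({s, 0, sources[s].front().timestamp});

    std::vector<Hit> merged;
    while (!heap.empty()) {
        Cursor c = heap.top();
        heap.pop();
        merged.push_back(sources[c.src][c.idx]);
        if (c.idx + 1 < sources[c.src].size())
            heap.push({c.src, c.idx + 1, sources[c.src][c.idx + 1].timestamp});
    }
    return merged;
}
```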
## Slow Sorting and Fast Sorting
Once the data is properly ordered in time, it must be sorted into built events. The most general pass is referred to by this code as slow sorting: all hits that fall within a single coincidence window are placed into one built event. This window is often referred to as the slow window; "slow" comes from the fact that it is the largest window used by the program, so this sorting takes place over the largest time scales.

There are a few important things to note about slow sorting in this program. Foremost, it does not have a master trigger: a hit from any detector channel can start an event. This is essentially a requirement for using the time-shifts outlined above, as well as for optimizing the slow window size. The window stays open until a hit is found whose timestamp falls outside the slow window; that hit starts a new built event, and the previous event is flushed to the next stage of the pipeline (see the sketch below). Also, the slow sort algorithm does _not_ discard any data. A built event consists of dynamically allocated arrays (read: std::vector), so in principle slow sorting incurs no intrinsic dead time other than from fragmentation of events.
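The flushing behavior described above might look like the following minimal sketch, assuming the slow window is measured from the first hit of each event; the types and names are illustrative, not the actual implementation.

```cpp
#include <cstdint>
#include <utility>
#include <vector>

// Hypothetical hit type for illustration only.
struct Hit { uint64_t timestamp; int board; int channel; };
using Event = std::vector<Hit>; // dynamically sized: no hits are discarded

// Group a time-ordered hit stream into built events: any hit within
// slowWindow of the event's first hit joins the event; the first hit
// beyond the window flushes the event and opens a new one.
std::vector<Event> SlowSort(const std::vector<Hit>& hits, uint64_t slowWindow)
{
    std::vector<Event> events;
    Event current;
    uint64_t startTime = 0;
    for (const Hit& hit : hits) {
        if (current.empty()) {
            startTime = hit.timestamp; // any channel can open the window
        } else if (hit.timestamp - startTime > slowWindow) {
            events.push_back(std::move(current)); // flush the finished event
            current.clear();
            startTime = hit.timestamp; // this hit starts the next event
        }
        current.push_back(hit);
    }
    if (!current.empty())
        events.push_back(std::move(current)); // flush the final event
    return events;
}
```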