High Level Triggering
Fred Wickens


High Level Triggering (HLT)
• Introduction to triggering and HLT systems
  – What is triggering
  – What is high level triggering
  – Why do we need it
• Case study of the ATLAS HLT (+ some comparisons with other experiments)
• Summary

Simple trigger for spark chamber set-up


Dead time
• Experiments are frozen from the trigger until the end of readout
  – Trigger rate with no dead time = R per second
  – Dead time per trigger = τ seconds
  – Elapsed time for 1 second of live time = 1 + Rτ seconds
  – Live-time fraction = 1/(1 + Rτ)
  – Real trigger rate = R/(1 + Rτ) per second
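The dead-time arithmetic above is simple enough to check directly. A minimal sketch (function names invented) following the slide's formulas:

```python
# Hedged sketch: live-time fraction and real trigger rate per the formulas above.
def live_fraction(raw_rate_hz: float, dead_time_s: float) -> float:
    """Fraction of wall-clock time the experiment is live: 1 / (1 + R*tau)."""
    return 1.0 / (1.0 + raw_rate_hz * dead_time_s)

def real_rate(raw_rate_hz: float, dead_time_s: float) -> float:
    """Recorded trigger rate: R / (1 + R*tau)."""
    return raw_rate_hz * live_fraction(raw_rate_hz, dead_time_s)

# Example: 1 kHz raw rate with 1 ms dead time per trigger
# -> 50% live time, so only 500 triggers/s are actually recorded.
print(live_fraction(1000, 1e-3))  # 0.5
print(real_rate(1000, 1e-3))      # 500.0
```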


Trigger systems in the 1980s and 90s
• Bigger experiments meant more data per event
• Higher luminosities meant more triggers per second
  – Both led to increased fractional dead time
• Use multi-level triggers to reduce dead time
  – First level: fast detectors, fast algorithms
  – Higher levels can use data from slower detectors and more complex algorithms to obtain better event selection/background rejection


Trigger systems in the 1990s and 2000s
• Dead time was not the only problem
• Experiments focussed on rarer processes
  – Need large statistics of these rare events
  – But it is increasingly difficult to select the interesting events
  – DAQ system (and off-line analysis capability) under increasing strain, limiting useful event statistics
• This is a major issue at hadron colliders, but will also be significant at the ILC
• Use the High Level Trigger to reduce the requirements for
  – The DAQ system
  – Off-line data storage and off-line analysis


Summary of ATLAS Data Flow Rates
• From detectors: > 10^14 bytes/s
• After Level-1 accept: ~10^11 bytes/s
• Into event builder: ~10^9 bytes/s
• Onto permanent storage: ~10^8 bytes/s (~10^15 bytes/year)
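The rates above imply large rejection factors between stages; a quick order-of-magnitude check (figures taken from the list above, which are themselves approximate):

```python
# Rough sketch of the stage-by-stage reduction implied by the ATLAS data-flow rates.
stages = {
    "detectors": 1e14,            # bytes/s off the detector
    "after Level-1": 1e11,        # bytes/s
    "into event builder": 1e9,    # bytes/s
    "to permanent storage": 1e8,  # bytes/s
}
names = list(stages)
for a, b in zip(names, names[1:]):
    print(f"{a} -> {b}: x{stages[a] / stages[b]:.0f} reduction")

# Overall reduction from detector to storage: a factor of a million.
print(stages["detectors"] / stages["to permanent storage"])  # 1000000.0
```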

TDAQ Comparisons

The evolution of DAQ systems

Typical architecture 2000+


Level 1 (sometimes called Level-0, as in LHCb)
• Time: one to a few microseconds
• Standard electronics modules for small systems
• Dedicated logic for larger systems
  – ASICs - Application Specific Integrated Circuits
  – FPGAs - Field Programmable Gate Arrays
• Reduced granularity and precision
  – calorimeter energy sums
  – tracking by masks
• Event data stored in front-end electronics (at the LHC a pipeline is used, as the interval between collisions is shorter than the Level-1 decision time)
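The pipeline remark can be made concrete: with the LHC numbers quoted later in these slides (25 ns bunch spacing, ~2.5 µs Level-1 latency), every front-end pipeline must be deep enough to hold each bunch crossing until the Level-1 decision arrives. A sketch:

```python
import math

# Pipeline depth needed so no crossing is lost before the Level-1 decision.
# Both numbers come from elsewhere in these slides.
bunch_spacing_ns = 25     # LHC bunch separation
lvl1_latency_us = 2.5     # ATLAS Level-1 latency (much of it cable delay)

depth = math.ceil(lvl1_latency_us * 1000 / bunch_spacing_ns)
print(depth)  # 100 -- each pipeline must hold at least ~100 bunch crossings
```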


Level 2
• 1) A few tens of microseconds (10-100 µs)
  – hardwired, fixed algorithm, adjustable parameters
• 2) A few milliseconds (1-100 ms)
  – dedicated microprocessors, adjustable algorithm
  – 3-D, fine-grain calorimetry
  – tracking, matching
  – topology
  – Different sub-detectors handled in parallel
  – Primitives from each detector may be combined in a global trigger processor or passed to the next level


Level 2 - cont'd
• 3) A few tens of milliseconds (10-100 ms) - the approach in 2008
  – Processor farm with Linux PCs
  – Partial events received over a high-speed network
  – Specialised algorithms
  – Each event allocated to a single processor, with a large farm of processors to handle the rate
  – If there is a separate Level 2, the data from each event are stored in many parallel buffers (each dedicated to a small part of the detector)


Level 3
• milliseconds to seconds
• processor farm
  – microprocessors/emulators/workstations - now standard server PCs
• full or partial event reconstruction
  – after event building (collection of all data from all detectors)
• Each event allocated to a single processor, with a large farm of processors to handle the rate


Summary of Introduction
• For many physics analyses, the aim is to obtain as high statistics as possible for a given process
  – We cannot afford to handle or store all of the data a detector can produce!
• What does the trigger do?
  – Select the most interesting events from the myriad of events seen, i.e. obtain better use of the limited output bandwidth
  – Throw away less interesting events
  – Keep all of the good events (or as many as possible)
  – But note: it must get it right - any good events thrown away are lost for ever!
• The high level trigger allows much more complex selection algorithms


Case study of the ATLAS HLT system
Concentrate on issues relevant for ATLAS (CMS faces very similar issues), but try to address some more general points


Starting points for any HLT system
• physics programme for the experiment - what are you trying to measure?
• accelerator parameters - what are the rates and time structure?
• detector and trigger performance - what data is available, and what trigger resources do we have to use it?


Physics at the LHC
Interesting events are buried in a sea of soft interactions: B physics, high-energy QCD jet production, top physics, Higgs production


The LHC and ATLAS/CMS
• The LHC has
  – design luminosity 10^34 cm^-2 s^-1 (in 2008, from 10^30 to 10^32?)
  – bunch separation 25 ns (bunch length ~1 ns)
• This results in
  – ~23 interactions / bunch crossing
  – ~80 charged particles (mainly soft pions) / interaction
  – ~2000 charged particles / bunch crossing
• Total interaction rate ~ 10^9 s^-1
  – b-physics: fraction ~10^-3, i.e. ~10^6 s^-1
  – t-physics: fraction ~10^-8
  – Higgs: fraction ~10^-11, i.e. ~10^-2 s^-1
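The "~23 interactions per bunch crossing" figure can be roughly reproduced from the design luminosity. Note that the inelastic cross-section (~70 mb) and the number of filled bunches (2808 of 3564 possible slots) are assumed inputs not stated on the slide:

```python
# Back-of-envelope check of "~23 interactions / bunch crossing".
# ASSUMPTIONS (not in the slides): sigma_inel ~ 70 mb; 2808 filled bunches.
luminosity = 1e34            # cm^-2 s^-1, LHC design value (from the text)
sigma_inel = 70e-27          # cm^2  (70 mb, assumed inelastic cross-section)

interaction_rate = luminosity * sigma_inel        # ~7e8 interactions/s (~10^9, as quoted)
filled_crossing_rate = 40e6 * 2808 / 3564         # ~31.5 MHz of filled crossings

print(interaction_rate / filled_crossing_rate)    # ~22, close to the ~23 quoted
```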


Physics programme
• Higgs signal extraction is important but very difficult
• There is also lots of other interesting physics
  – B physics and CP violation
  – quarks, gluons and QCD
  – top quarks
  – SUSY
  – 'new' physics
• The programme will evolve with luminosity, HLT capacity and understanding of the detector
  – low luminosity (2008-2009)
    • high-pT programme (Higgs etc.)
    • b-physics programme (CP measurements)
  – high luminosity (2010?)
    • high-pT programme (Higgs etc.)
    • searches for new physics


Trigger strategy at the LHC
• To avoid being overwhelmed, use signatures with small backgrounds
  – Leptons
  – High-mass resonances
  – Heavy quarks
• The trigger selection looks for events with:
  – isolated leptons and photons; taus; central and forward jets
  – high ET
  – missing ET


Example physics signatures
• Electron: 1e > 25 GeV or 2e > 15 GeV: Higgs (SM, MSSM), new gauge bosons, extra dimensions, SUSY, W, top
• Photon: 1γ > 60 GeV or 2γ > 20 GeV: Higgs (SM, MSSM), extra dimensions, SUSY
• Muon: 1μ > 20 GeV or 2μ > 10 GeV: Higgs (SM, MSSM), new gauge bosons, extra dimensions, SUSY, W, top
• Jet: 1j > 360, 3j > 150 or 4j > 100 GeV: SUSY, compositeness, resonances
• Jet > 60 GeV + ETmiss > 60 GeV: SUSY, exotics
• Tau > 30 GeV + ETmiss > 40 GeV: extended Higgs models, SUSY


Trigger/DAQ architecture
• 40 MHz input; three logical levels with hierarchical data flow (~1 PB/s equivalent at the front end)
• LVL1 - fastest: only calo and muon; hardwired; on-detector electronics with pipelines of ~2.5 µs
• LVL2 - local: LVL1 refinement + track association; event fragments buffered in parallel; ~40 ms
• LVL3 - full event: 'off-line' analysis; full event in a processor farm; ~4 s; output ~200 Hz of physics, ~300 MB/s

Selected (inclusive) signatures


Trigger design - Level-1
• Level-1
  – sets the context for the HLT
  – reduces triggers to ~75 kHz
  – has a very short time budget: a few µs (ATLAS/CMS ~2.5 µs - much of it used in cable delays!)
• Detectors used must provide data very promptly and must be simple to analyse
  – Coarse-grain data from calorimeters
  – Fast parts of the muon spectrometer (i.e. not the precision chambers)
  – NOT precision trackers - too slow, too complex
  – (LHCb does use some simple tracking data from its VELO detector to veto events with more than 1 primary vertex)
  – (CMS plans a track trigger for the sLHC - L1 time => ~6 µs)
  – The proposed FP420 detectors provide data too late


ATLAS Level-1 trigger system
• Calorimeter and muon
  – trigger on inclusive signatures
    • muons; em/tau/jet calo clusters; missing and sum ET
• Hardware trigger
  – Programmable thresholds
  – Selection based on multiplicities and thresholds


ATLAS em cluster trigger algorithm
"Sliding window" algorithm, repeated for each of ~4000 cells
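The slide does not spell the algorithm out, so the following is only an illustrative sketch of the sliding-window idea: sum the ET in a small window slid across a toy η-φ tower grid, keep windows over threshold, and use a local-maximum rule to avoid counting the same deposit several times. The window size, threshold and grid are invented for illustration; the real Level-1 algorithm also applies isolation rings and different cluster/core window sizes:

```python
# Illustrative sketch only - NOT the ATLAS implementation.
def sliding_window(towers, threshold):
    """Return (eta, phi, ET) for each 2x2 window over threshold that is a
    local maximum among neighbouring window positions."""
    n_eta, n_phi = len(towers), len(towers[0])

    def win(i, j):
        # 2x2 window sum; phi wraps around, eta does not
        return sum(towers[i + di][(j + dj) % n_phi] for di in (0, 1) for dj in (0, 1))

    clusters = []
    for i in range(n_eta - 1):
        for j in range(n_phi):
            et = win(i, j)
            if et < threshold:
                continue
            ok = True
            for i2 in range(max(0, i - 1), min(n_eta - 2, i + 1) + 1):
                for j2 in ((j - 1) % n_phi, j, (j + 1) % n_phi):
                    if (i2, j2) == (i, j):
                        continue
                    other = win(i2, j2)
                    # strict comparison against scan-earlier windows breaks
                    # ties, so a flat plateau yields a single cluster
                    earlier = (i2, j2) < (i, j)
                    if other > et or (earlier and other == et):
                        ok = False
            if ok:
                clusters.append((i, j, et))
    return clusters

# Toy 6x8 eta-phi grid with one hot 20 GeV trigger tower at (2, 3).
grid = [[0.0] * 8 for _ in range(6)]
grid[2][3] = 20.0
print(sliding_window(grid, threshold=15.0))  # [(1, 2, 20.0)]
```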


ATLAS Level-1 muon trigger
Measure the muon momentum with very simple tracking in a few planes of trigger chambers (RPC in the barrel, TGC in the endcaps)
• RPC: Resistive Plate Chambers
• TGC: Thin Gap Chambers
• MDT: Monitored Drift Tubes


Level-1 Selection
• The Level-1 trigger is an "or" of a large number of inclusive signals, set to match the current physics priorities and beam conditions
• The precision of cuts at Level-1 is generally limited
• Adjust the overall Level-1 accept rate (and the relative frequency of different triggers) by
  – Adjusting thresholds
  – Pre-scaling higher-rate triggers (e.g. only accept every 10th trigger of a particular type)
• Can be used to include a low rate of calibration events
• The menu can be changed at the start of a run
  – Pre-scale factors may change during the course of a run
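Pre-scaling itself is trivial to implement; a minimal sketch (hypothetical class, not ATLAS code) of a counter that accepts every Nth trigger of a given type:

```python
# Minimal sketch of pre-scaling: accept every Nth trigger of a given type.
class Prescaler:
    def __init__(self, factor: int):
        self.factor = factor  # the pre-scale factor N
        self.count = 0

    def accept(self) -> bool:
        self.count += 1
        return self.count % self.factor == 0

# Pre-scale factor 10 -> only every 10th trigger of this type is accepted,
# reducing that trigger's rate by exactly a factor of 10.
ps = Prescaler(10)
accepted = sum(ps.accept() for _ in range(1000))
print(accepted)  # 100
```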


Example Level-1 menu for L = 2 x 10^33
Level-1 signature: output rate (Hz)
• EM25i: 12000
• 2EM15i: 4000
• MU20: 800
• 2MU6: 200
• J200: 200
• 3J90: 200
• 4J65: 200
• J60 + XE60: 400
• TAU25i + XE30: 2000
• MU10 + EM15i: 100
• Others (pre-scaled, exclusive, monitor, calibration): 5000
• Total: ~25000


Trigger design - Level-2
• Level-2 reduces triggers to ~2 kHz
  – Note CMS does not have a physically separate Level-2 trigger, but the HLT processors include a first stage of Level-2 algorithms
• The Level-2 trigger has a short time budget
  – ATLAS: ~40 ms on average
  – Note: for Level-1 the time budget is a hard limit for every event; for the High Level Trigger it is the average that matters, so some events can take several times the average, provided they are a minority
• Full detector data is available, but to minimise the resources needed:
  – Limit the data accessed
  – Only unpack detector data when it is needed
  – Use information from Level-1 to guide the process
  – Analysis proceeds in steps, with the possibility to reject the event after each step
  – Use custom algorithms


Regions of Interest
• The Level-1 selection is dominated by local signatures (i.e. within a Region of Interest - RoI)
  – Based on coarse-granularity data from calo and mu only
• Typically there are 1-2 RoIs per event
• ATLAS uses RoIs to reduce the network bandwidth and processing power required
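Why RoIs matter for bandwidth can be estimated with numbers that appear elsewhere in these slides (1600 fragments of ~1 kB per event, Level-1 accept up to 100 kHz, RoI data being 1-2% of the event):

```python
# Rough estimate of the bandwidth saved by RoI-guided Level-2 readout.
event_size_bytes = 1600 * 1e3   # 1600 fragments of ~1 kB -> ~1.6 MB per event
lvl1_rate_hz = 100e3            # upper end of the Level-1 accept rate
roi_fraction = 0.015            # RoI data = 1-2% of the event (mid-range)

full_readout = event_size_bytes * lvl1_rate_hz  # if every L1 accept were fully built
roi_readout = full_readout * roi_fraction       # RoI-only readout for Level-2

print(full_readout / 1e9, "GB/s vs", roi_readout / 1e9, "GB/s")  # ~160 vs ~2.4
```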


Trigger design - Level-2 - cont'd
• Processing scheme
  – extract features from sub-detector data in each RoI
  – combine features from one RoI into an object
  – combine objects to test the event topology
• Precision of Level-2 cuts
  – The emphasis is on very fast algorithms with reasonable accuracy
    • They do not include many corrections which may be applied off-line
  – The calibrations and alignment available for the trigger are not as precise as the ones available off-line


ATLAS Trigger/DAQ architecture
• 40 MHz: calorimeter and muon trigger data into LVL1; front-end pipelines hold all data for 2.5 µs; ~1 PB/s equivalent
• LVL1 accept at 75 kHz: data flow via the Read-Out Drivers (RODs) and Read-Out Links into Read-Out Buffers (ROBs) in the Read-Out Sub-systems (ROSs)
• The RoI Builder (ROIB) passes RoIs to the LVL2 supervisors (L2SV); LVL2 processors (L2P) request RoI data (only 1-2% of the event, ~2 GB/s) from the ROSs and decide in ~10 ms
• LVL2 accept at ~2 kHz: the Event Builder (EB) assembles full events (~3 GB/s) for the Event Filter farm (EFP), which takes ~1 s per event
• Output: ~200 Hz, ~300 MB/s to storage


CMS Event Building
• CMS performs Event Building after Level-1
• This simplifies the architecture, but places much higher demands on the technology:
  – Network traffic ~100 GB/s
• They use Myrinet instead of GbE for the EB network
• They plan a number of independent slices, with a barrel shifter to switch to a new slice at each event
  – Time will tell which philosophy is better


Example: two-electron trigger
LVL1 triggers on two isolated e/γ clusters with pT > 20 GeV (possible signature: Z→ee)
HLT strategy:
• Validate step-by-step
• Check intermediate signatures
• Reject as early as possible
The sequential/modular approach facilitates early rejection. Starting from the Level-1 seed EM20i, each electron candidate is built up in steps:
• Step 1: cluster shape → ecand
• Step 2: track finding → e
• Step 3: pT > 30 GeV → e30
• Step 4: isolation → e30i
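The step-wise processing with early rejection can be sketched as a chain of increasingly expensive tests, where the first failed step ends the processing of that event. All step names and cut values here are invented for illustration:

```python
# Sketch of step-wise HLT processing with early rejection (names/cuts invented).
def cluster_shape_ok(ev): return ev.get("shower_shape", 0.0) > 0.9
def has_matching_track(ev): return ev.get("track_pt", 0.0) > 0.0
def pt_above_30(ev): return ev.get("cluster_et", 0.0) > 30.0
def isolated(ev): return ev.get("isolation_et", 99.0) < 5.0

# Ordered cheapest-first, so background is rejected with minimal work.
STEPS = [cluster_shape_ok, has_matching_track, pt_above_30, isolated]

def run_chain(event):
    """Return (accepted, steps_run): processing stops at the first failure."""
    for n, step in enumerate(STEPS, start=1):
        if not step(event):
            return False, n
    return True, len(STEPS)

# A background-like event failing the first (cheapest) step costs almost nothing:
print(run_chain({"shower_shape": 0.2}))  # (False, 1)
print(run_chain({"shower_shape": 0.95, "track_pt": 25.0,
                 "cluster_et": 35.0, "isolation_et": 2.0}))  # (True, 4)
```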


Trigger design - Event Filter / Level-3
• The Event Filter reduces triggers to ~200 Hz
• Event Filter time budget: ~4 s on average
• Full event data is available, but to minimise the resources needed:
  – Only unpack detector data when it is needed
  – Use information from Level-2 to guide the process
  – Analysis proceeds in steps, with the possibility to reject the event after each step
  – Use optimised off-line algorithms


Electron slice at the EF
• TrigCaloRec (wrapper of CaloRec) + EFCaloHypo
• EF tracking (wrapper of newTracking) + EFTrackHypo
• TrigEgammaRec (wrapper of EgammaRec, which matches electromagnetic clusters with tracks and builds egamma objects) + EFEgammaHypo

HLT Processing at LHCb


Trigger design - HLT strategy
• Level 2 - confirm Level 1; some inclusive, some semi-inclusive, some simple topology triggers; vertex reconstruction (e.g. two-particle mass cuts to select Z's)
• Level 3 - confirm Level 2; more refined topology selection; near off-line code


Example HLT menu for L = 2 x 10^33
HLT signature: output rate (Hz)
• e25i: 40
• 2e15i: <1
• gamma60i: 25
• 2gamma20i: 2
• mu20i: 40
• 2mu10: 10
• j400: 10
• 3j165: 10
• 4j110: 10
• j70 + xE70: 20
• tau35i + xE45: 5
• 2mu6 with vertex, decay-length and mass cuts (J/psi, psi', B): 10
• Others (pre-scaled, exclusive, monitor, calibration): 20
• Total: ~200


Example B-physics menu for 10^33
LVL1:
• MU6: rate 24 kHz (note there are large uncertainties in the cross-section)
• In case of larger rates use MU8 => ~1/2 the rate
• 2MU6
LVL2:
• Run muFast in the LVL1 RoI: ~9 kHz
• Run ID reconstruction in the muFast RoI: mu6 (combined muon & ID) ~5 kHz
• Run TrigDiMuon seeded by the mu6 RoI (or MU6)
• Make exclusive and semi-inclusive selections using loose cuts
  – B(mumu), B(mumu)X, J/psi(mumu)
• Run IDSCAN in the Jet RoI, make a selection for Ds(PhiPi)
EF:
• Redo muon reconstruction in the LVL2 (LVL1) RoI
• Redo track reconstruction in the Jet RoI
• Selections for B(mumu), B(mumuK*), B(mumuPhi), Bs→DsPhiPi etc.

LHCb Trigger Menu

Matching problem
(diagram: the on-line and off-line selection regions around the physics channel, within the background)


Matching problem (cont.)
• Ideally
  – off-line algorithms select a phase space which shrink-wraps the physics channel
  – trigger algorithms shrink-wrap the off-line selection
• In practice this doesn't happen
  – need to match the off-line algorithm selection
• For this reason many trigger studies quote the trigger efficiency w.r.t. events which pass the off-line selection
  – BUT off-line can change the algorithm, re-process and re-calibrate at a later stage
• SO, make sure the on-line algorithm selection is well known, controlled and monitored


Selection and rejection
• As selection criteria are tightened
  – background rejection improves
  – BUT event selection efficiency decreases
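This trade-off can be shown with a toy model (illustrative numbers only, not from the study on the next slide): signal and background modelled as Gaussians in some discriminant, with the cut scanned upwards:

```python
import math

# Toy model: tightening a cut raises background rejection but lowers efficiency.
def gauss_frac_above(cut, mean, sigma):
    """Fraction of a Gaussian distribution lying above 'cut'."""
    return 0.5 * math.erfc((cut - mean) / (sigma * math.sqrt(2)))

sig_mean, bkg_mean, sigma = 25.0, 15.0, 4.0  # invented discriminant shapes

for cut in (18.0, 22.0, 26.0):
    eff = gauss_frac_above(cut, sig_mean, sigma)                  # signal efficiency
    rej = 1.0 / max(gauss_frac_above(cut, bkg_mean, sigma), 1e-12)  # 1 / bkg acceptance
    print(f"cut={cut}: signal efficiency {eff:.2f}, background rejection x{rej:.0f}")
```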


Selection and rejection
• Example of an ATLAS Event Filter (i.e. Level-3) study of the effectiveness of various discriminants used to select 25 GeV electrons from a background of dijets


Other issues for the Trigger
• Efficiency and monitoring
  – In general need a high trigger efficiency
  – For many analyses also need a well-known efficiency
• Monitor the efficiency by various means
  – Overlapping triggers
  – Pre-scaled samples of triggers in tagging mode (pass-through)
• Final detector calibration and alignment constants are not available immediately - keep them as up-to-date as possible, and allow for the lower precision of the trigger cuts when defining trigger menus and in subsequent analyses
• Code used in the trigger needs to be very robust - low memory leaks, low crash rate, fast
• Beam conditions and HLT resources will evolve over several years (for both ATLAS and CMS)
  – In 2008 the luminosity is low, but the HLT capacity will also be < 50% of the full system (funding constraints)


Summary
• High-level triggers allow complex selection procedures to be applied as the data is taken
  – They thus allow large numbers of events to be accumulated, even in the presence of very large backgrounds
  – Especially important at the LHC, but significant at most accelerators
• The trigger stages - in the ATLAS example
  – Level 1 uses inclusive signatures
    • muons; em/tau/jet calo clusters; missing and sum ET
  – Level 2 refines the Level 1 selection and adds simple topology triggers, vertex reconstruction, etc.
  – Level 3 refines Level 2 and adds more refined topology selection
• Trigger menus need to be defined taking into account:
  – Physics priorities, beam conditions, HLT resources
  – Include items for monitoring trigger efficiency and calibration
• Must get it right - any events thrown away are lost for ever!

Additional Foils


The evolution of DAQ systems

ATLAS Detector

ATLAS event - tracker end-view


Trigger functional design
• Level 1: input 40 MHz, accept 75 kHz, latency 2.5 µs
  – Inclusive triggers based on fast detectors
  – Muon, electron/photon, jet, sum and missing ET triggers
  – Coarse(r) granularity, low(er) resolution data
  – Special-purpose hardware (FPGAs, ASICs)
• Level 2: input 75 (100) kHz, accept O(1) kHz, latency ~10 ms
  – Confirm Level 1 and add track information
  – Mainly inclusive but some simple event topology triggers
  – Full granularity and resolution available
  – Farm of commercial processors with special algorithms
• Event Filter: input O(1) kHz, accept O(100) Hz, latency ~seconds
  – Full event reconstruction
  – Confirm Level 2; topology triggers
  – Farm of commercial processors using near off-line code


ATLAS Trigger/DAQ data flow
• UX15 (detector cavern): ATLAS detector and first-level trigger; 1600 Read-Out Drivers (RODs); event data pushed at ≤100 kHz, 1600 fragments of ~1 kByte each
• USA15 (underground counting room): ~150 PCs acting as Read-Out Subsystems (ROSs), fed over dedicated Read-Out Links (VME); the Timing Trigger Control (TTC) system distributes the first-level accept
• SDX1 (surface building), dual-socket server PCs: the RoI Builder and LVL2 Supervisor, a ~500-node LVL2 farm, ~100 SubFarm Inputs (SFIs, the Event Builder), a ~1600-node Event Filter (EF) farm, and ~30 SubFarm Outputs (SFOs, local storage); a pROS stores the LVL2 output; all connected by Gigabit Ethernet switches
• Event data is pulled: partial events at ≤100 kHz (LVL2), full events at ~3 kHz (event building); accepted events flow at ~200 Hz, ~300 MB/s, to data storage at the CERN computer centre


Event's Eye View - step 1
• At each beam crossing, latch the data into the detector front end
• After processing, the data is put into many parallel pipelines - it moves along the pipeline at every bunch crossing, and falls out the far end after 2.5 µs
• The calo + muon trigger data is also sent to Level-1


Event's Eye View - step 2
• The Level-1 Central Trigger Processor combines the information from the muon and calo triggers and, when appropriate, generates the Level-1 Accept (L1A)
• The L1A is distributed in real time via the TTC system to the detector front-ends, which send the data of the accepted event to the detector RODs (Read-Out Drivers)
  – Note it must arrive before the data has dropped out of the pipeline - hence the hard deadline of 2.5 µs
  – The TTC system (Trigger, Timing and Control) is a CERN system used by all of the LHC experiments; it allows very precise real-time distribution of small data packets
• The detector RODs receive the data, process and reformat it as necessary, and send it via fibre links to the TDAQ ROSs


Event's Eye View - step 3
• At the L1A the different parts of LVL1 also send RoI data to the RoI Builder (RoIB), which combines the information and sends it as a single packet to a Level-2 Supervisor PC
  – The RoIB is implemented as a number of VME boards with FPGAs to identify and combine the fragments coming from the same event from the different parts of Level-1


ATLAS Level-2 Trigger - step 4
• The Region of Interest Builder (RoIB) passes the formatted RoI information to one of the LVL2 supervisors
• The LVL2 supervisor selects one of the processors in the LVL2 farm and sends it the RoI information
• The LVL2 processor requests data from the ROSs as needed (possibly in several steps), produces an accept or reject, and informs the LVL2 supervisor
• For an accept, the result of the processing is stored in the pseudo-ROS (pROS)
• Event data for Level-2 is pulled as partial events at ≤100 kHz; this reduces the network traffic to ~2 GB/s, c.f. ~150 GB/s if a full event build were done
• The LVL2 supervisor passes the decision to the DataFlow Manager (which controls Event Building)


ATLAS Event Building - step 5
• For each accepted event the DataFlow Manager selects a SubFarm Input (SFI) and sends it a request to take care of the building of a complete event
• The SFI sends requests to all ROSs for the data of the event to be built; full events are pulled at ~3 kHz
• Completion of building is reported to the DataFlow Manager
• For rejected events, and for events for which event building has completed, the DataFlow Manager sends "clears" to the ROSs (for 100-300 events together)
• Network traffic for Event Building is ~5 GB/s


ATLAS Event Filter - step 6
• A process (EFD) running in each Event Filter farm node collects each complete event from the SFI and assigns it to one of a number of Processing Tasks in that node
• The Event Filter uses more sophisticated algorithms (near or adapted off-line) and more detailed calibration data to select events based on the complete event data
• Accepted events are sent to a SubFarm Output (SFO) node to be written to disk


ATLAS Data Output - step 7
• The SFO nodes receive the final accepted events (~200 Hz) and write them to disk
• The events include "Stream Tags" to support multiple simultaneous files (e.g. express stream, calibration, B-physics stream, etc.)
• Files are closed when they reach 2 GB or at the end of the run
• Closed files are then transmitted via GbE to the CERN Tier-0 for off-line analysis


ATLAS HLT Hardware
First 4 racks of HLT processors; each rack contains
• ~30 HLT PCs (very similar to Tier-0/1 compute nodes)
• 2 Gigabit Ethernet switches
• a dedicated local file server

ATLAS TDAQ Barrack Rack Layout


Naming Convention
• First Level Trigger (LVL1) signatures in capitals, HLT signatures in lower case
• LVL1 / HLT: EM / e (electron), EM / g (photon), MU / mu (muon), HA / tau (tau), FJ / fj (forward jet), JE / je (jet energy), JT / jt (jet), XE / xe (missing energy)
• e.g. EM20I: name EM, threshold 20, I = isolated
• e.g. mu20i_passEF: name mu, threshold 20, i = isolated, _passEF = EF in tagging mode
• New in 13.0.30: the threshold is the cut value applied (previously it was the ~95% efficiency point)
• More details: https://twiki.cern.ch/twiki/bin/view/Atlas/TriggerPhysicsMenu


Min Bias Triggers
MinBias trigger available for the first time in 13.0.3
• Based on space-point (SP) counting
• Trigger if: >40 SCT space-points or >900 pixel clusters
• To be done: add the MBTS trigger
  – MBTS: scintillators on the inside of the endcap calorimeter giving LVL1 info


Electron menu coverage for L = 10^31 cm^-2 s^-1
16 LVL1 thresholds for EM (electron, photon) & HA (tau); EM thresholds include EM3, EM7, EM13I, EM18I, EM23I, EM100
• Single-electron triggers and pre-scaled triggers with HLT pass-through for commissioning needs (e.g. e15, e15i, e15_passHLT, e15i_passHLT, e20_passL2, e20_passEF): selections with isolated/non-isolated LVL1 thresholds; triggers with L2 and/or EF pass-through
• Low-pT single e-trigger (LVL1 pT ~7 GeV; e.g. e12): electrons from b, c decays (typically not well isolated); useful for E/p studies; needs tighter cuts to limit the rate
• Low-mass pairs (e.g. 2e5, 2e10, e5+e7, e5+e10): J/psi→ee, DY are sources of isolated electrons with large statistics, useful for calibration and efficiency extraction at low pT
• Low-medium-pT double/triple e-triggers (e.g. 2e10, 2e15, 3e15): Z→ee, SUSY, new phenomena
• High-pT single e-trigger (LVL1 pT ~18 GeV; e.g. e20, e20i, e25i, e15_xE20, e10_xE30): W→eν, Z→ee, top, SUSY, Higgs, exotics etc.; loose selections and lots of redundancy
• Very high-pT e-trigger (e.g. e105, em105_passHLT): exotics, new phenomena
Rates quoted for the main items: 5 Hz, 6 Hz, 5 Hz, 17 Hz


Photon menus for 10^31
• Low-pT items, HLT pre-scale 10 or 100 (g10, g15i): hadronic calibration, inclusive and di-photon cross-sections
• High-pT items, no prescale (g20, g20i, g25i): direct photon, hadronic calibration; 4 Hz
• Very high-pT items, non-isolated (g105, g120): exotics, SUSY, unknown, hadronic calibration; 7 Hz
• Multi-photon, no isolation, no HLT prescale (2g10, 2g15, 2g20, 3g10): di-photon cross-section, exotics, SUSY, calibration; 5 Hz
• Triggers for commissioning, with LVL1 prescale and HLT in tagging mode (em15_passHLT, em15i_passHLT, g10_passL2, g10_passEF): selections with/without L1 isolation; triggers with L2/EF pass-through
Total rate (including overlaps) ~10 Hz


Muon Triggers
Six LVL1 thresholds: MU4, MU6, MU10, MU15, MU20, MU40; isolation can be applied at the HLT
• Prescaled low-pT single μ (mu4, mu6): B-physics, J/psi→μμ, DY; 4 Hz
• Unprescaled low-pT dimuon (2mu4, mu4+mu6, 2mu6): 2.5 Hz
• Prescaled triggers with HLT pass-through (mu20i calculating but not applying isolation, mu20_passHLT): commissioning; 0.5 Hz
• High-pT triggers with/without isolation (mu10, mu15, mu20i, mu40, 2mu10, 2mu20): high-pT physics - Z→μμ, SUSY, Higgs, exotics etc.; 20 Hz


B-physics
LVL1 + muon at HLT:
• 2MU4: 2.5 Hz
• MU4 & MU6 pre-scaled: 4 Hz
LVL1 + ID & MU at HLT:
• mu4_DsPhiPi_FS, MU4_Jpsimumu_FS, MU4_Upsimumu_FS, MU4_Bmumu_FS, MU4_BmumuX_FS
• Loose selections, ~10 Hz


Tau Triggers
16 LVL1 thresholds for EM (electron, photon) & HA (tau); HA thresholds: HA5, HA6, HA9I, HA11I, HA16I, HA25I, HA40
• Single tau, prescaled: Z→ττ, preparation for 10^33; 15 Hz
• Single tau, unprescaled (tau45, tau45i): SUSY, charged Higgs; 5 Hz
• Very high-pT single tau (tau60, tau100): exotics and heavy Higgs
• Tau + MET (tau20i+xe30): W→τν at low luminosity; H→ττ, SUSY etc. at high luminosity; 5 Hz
• Di-tau (2tau25i, 2tau35i): H→ττ; 3 Hz
• tau + e, mu, jet (tau20i_e10, tau20i_mu10, tau20i_j70, tau20i_4j50, tau20i_bj18)


Single Jet Triggers
Strategy:
• Initially use the LVL1 selection with no active HLT selection, and the b-jet trigger in tagging mode
• 8 LVL1 jet thresholds:
  – Highest threshold un-prescaled; its value determined by rate considerations (aim for ~20 Hz)
  – Other thresholds set to equalise the bandwidth across the ET spectrum
  – Lowest threshold used to provide RoIs for the B-physics trigger


Jet Triggers (cont'd)
• Single jet (j5, j10, j18, j23, j35, j42, j70, j120, j200, j400): QCD, exotics
• Multi-jet (3J10, 4J10, 3J18, 3J23, 4J18, 4J23, 4J35): searches pp->XX, X->jj; top; SUSY
• Forward jets (FJ10, FJ18, FJ26, FJ65, 2FJ10, 2FJ26, 2FJ65, FJ65_FJ26): VBF
• Jet energy sum (JE280, JE340): SUSY
(plots: trigger rates for forward jets and for multi-jets)


B-jet Triggers
• Jets are tagged as b-jets at the HLT based on track information
• This will allow lower LVL1 jet thresholds to be used
• For initial running the b-jet triggers will be in tagging mode; the active selection will be switched on once the detector & trigger are understood

Missing ET, Total SumET
8 LVL1 missing-ET thresholds


Combined Triggers
The menu contains a large number of combined signatures:
• tau+e, tau+mu, e+mu (tau15i_e10, tau25i_mu6, tau20i_mu10, e10_mu6): tt, SUSY
• tau + missing ET (tau45_xe40, tau45i_xe20): W, tt, SUSY, exotics
• tau + jet (tau25i_j70): W, tt, SUSY, exotics
• mu + jet (mu4_j10): exotics
• jet + missing ET (j70_xe30): SUSY, exotics
Total rate: 46 Hz


Total Rates
Output rate (Hz) by level:
• LVL1: 47,000
• LVL2: 865
• EF: 200