LAr DCS HW On-Call Training

LAr DCS system overview
• 14 PVSS projects (PVSS is the main DCS tool for ATLAS) are running on 14 PCs.
• LAr DCS control and monitoring systems:
  - ROD system – Wiener VME crates
  - FEC system – low voltage and temperatures
  - HV system (8 projects)
• LAr DCS monitoring systems:
  - LAr temperature readout
  - LAr purity
• LAr Sub-detector Control Station (SCS) – integration of the LAr DCS subsystems using the FSM tool
• LAr FE cooling monitoring

ATLAS GCS and Local Control Stations
[Architecture diagram: the ATLAS GCS sits above the LAr Local Control Stations (station names PCATLLAR…, e.g. PCATLLARSCS), which are connected by CAN buses and CAN PSUs to the hardware in USA15 and UX15: 17 Wiener ROD crates, FEC LV & temperature ELMBs (ELMB x 58, 280 V PS), HEC LV boxes (x 8, 270 V PS), ISEG HV modules (151 modules in 5 racks), a purity crate with 12 purity boards, and temperature ELMBs (ECA: 36, ECC: 37).]

LAr DCS system integration
• The LAr DCS (as well as ATLAS as a whole) is represented by means of a finite state machine (FSM) hierarchy, which is operated by a DCS operator through an FSM screen and an alarm screen.
• In ATLAS the DCS is organized in three functional horizontal layers, and the FSM is the main tool for implementing the full control hierarchy (a conceptual sketch follows below).
• The LAr Sub-detector Control Station is the top of the LAr FSM tree.
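The behaviour of such a hierarchy can be illustrated with a minimal, purely conceptual sketch in Python (this is not the PVSS/JCOP FSM implementation; the class, command and node names are invented for illustration): commands propagate down from a parent node to its children, while states are summarized upwards.

```python
# Conceptual sketch of an FSM control hierarchy (not the PVSS/JCOP implementation).
# Commands propagate down the tree; the parent's state is a summary of its children.

class FsmNode:
    def __init__(self, name, children=None):
        self.name = name
        self.children = children or []
        self.state = "READY"

    def send_command(self, command):
        """Propagate a command (e.g. GOTO_READY, GOTO_SHUTDOWN) to all children."""
        for child in self.children:
            child.send_command(command)
        if not self.children:            # a device unit acts on the command itself
            self.state = "READY" if command == "GOTO_READY" else "SHUTDOWN"

    def summarize_state(self):
        """Simplified rule: a parent is READY only if every child is READY."""
        if self.children:
            child_states = {c.summarize_state() for c in self.children}
            self.state = "READY" if child_states == {"READY"} else "NOT_READY"
        return self.state

# Toy tree loosely following the slides: LAr -> partitions -> subsystems.
lar = FsmNode("LAr", [
    FsmNode("BARREL_A", [FsmNode("HV"), FsmNode("FEC"), FsmNode("ROD")]),
    FsmNode("EMEC_A",   [FsmNode("HV"), FsmNode("FEC")]),
])
lar.send_command("GOTO_READY")
print(lar.summarize_state())   # READY
```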

ATLAS FSM Architecture

LAr FSM hierarchy
[Tree diagram: the LAr node splits into EMEC_A, BARREL_A, HEC/FCAL_A, EMEC_C, BARREL_C, HEC/FCAL_C plus COOLING and INFRASTRUCTURE; each partition contains HV, ROD, COOLING and FEC branches. The HV branch descends via HV PS and CRATE_1…CRATE_N to HV_Sector_1…HV_Sector_N and finally to φ wedge_1…φ wedge_32.]

LAr main FSM panel

UI panel layout
[Screenshot: main LAr UI panel showing the HV φ wedges, EMB PS, cooling loops, ROD crates and FE crates.]

LAr partition UI panel
• φ granularity.
• All graphic objects are “connected” to FSM objects: the colour of a circle or wedge changes when the STATUS or STATE of the corresponding FSM object changes (a conceptual sketch follows below).
• A text label is displayed when the cursor points to a graphic object.
• The corresponding FSM object can be reached directly from the UI panel.
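A minimal Python sketch of what “connecting” a graphic object to an FSM object means in practice. This is illustrative only; in the real panels the connection is made with PVSS/WinCC OA callbacks, and the colours, state names and node path below are invented:

```python
# Conceptual sketch of "connecting" a graphic object to an FSM object
# (in the real panels this is done with PVSS/WinCC OA callbacks; names are invented).

STATE_COLOURS = {"READY": "green", "NOT_READY": "yellow", "ERROR": "red"}

class Wedge:
    """One φ wedge on the UI panel."""
    def __init__(self, name):
        self.name = name
        self.colour = "grey"

    def on_fsm_update(self, state, status):
        # Colour follows the STATE; an alarm STATUS overrides it.
        self.colour = "red" if status == "ERROR" else STATE_COLOURS.get(state, "grey")

    def tooltip(self):
        return f"{self.name}: double-click to open the FSM node"

wedge = Wedge("HV_Sector_1/phi_wedge_7")   # hypothetical node path
wedge.on_fsm_update("READY", "OK")
print(wedge.colour, "|", wedge.tooltip())
```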

ATLAS FSM
• A “STATE” and a “STATUS” are defined for each FSM node. They are two aspects that work in parallel and provide all the necessary information about the behaviour of any system at any level in the hierarchy (a sketch of this two-aspect model follows below).
• The STATE defines the operational mode of the system.
• The STATUS gives more detail about how well the system is working (i.e. it warns about the presence of errors). The STATUS is somewhat similar to the alert screen; having it displayed within the FSM makes it faster to find the information located in the PVSS panel of the element with an error.
https://edms.cern.ch/file/685114//FSM_INTEGRATION_GUIDELINE.pdf
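A hedged sketch of the two parallel aspects: STATE as the operational mode, STATUS as a severity-ordered error summary that propagates upwards as the worst value of the children. This is illustrative Python only; the exact state sets and propagation rules are defined in the FSM integration guideline linked above.

```python
# Conceptual sketch of the parallel STATE / STATUS aspects of an FSM node
# (illustrative only; the real behaviour is defined by the ATLAS FSM guidelines).

# STATUS values ordered by increasing severity, following the usual DCS convention.
STATUS_SEVERITY = ["OK", "WARNING", "ERROR", "FATAL"]

def summarize_status(child_statuses):
    """The parent's STATUS is the worst STATUS reported by any child."""
    worst = "OK"
    for s in child_statuses:
        if STATUS_SEVERITY.index(s) > STATUS_SEVERITY.index(worst):
            worst = s
    return worst

def summarize_state(child_states):
    """Simplified rule: the parent is READY only if all children are READY."""
    return "READY" if set(child_states) == {"READY"} else "NOT_READY"

# Example: one HV sector trips while the rest of the system is fine.
states   = ["READY", "READY", "NOT_READY"]
statuses = ["OK", "OK", "ERROR"]
print(summarize_state(states), summarize_status(statuses))   # NOT_READY ERROR
```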

LAr Alarm screen

Alarm handling
• Alarms from the PVSS alert configurations at the data-point level are displayed using the framework (FW) alarm screen and are intended to be used for detailed problem tracking and acknowledgement (an example alert-range configuration is sketched below).
• A simplified alarm-handling mechanism is introduced at the level of the FSM (the “STATUS”). The STATUS allows context-based signalling of problems and error tracking inside the control hierarchy directly on the FSM operator interface.
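As an illustration of what an alert configuration at the data-point level amounts to, here is a hedged sketch; the data-point name, ranges and limits are invented examples, not the real LAr settings (in PVSS the alert configuration is attached directly to the data point):

```python
# Conceptual sketch of a data-point alert configuration. The DP name, ranges and
# severities are invented examples, not the actual LAr limits.

FEC_WATER_TEMP_ALERT = {
    "datapoint": "ATLLARFEC/barrelA/crate_03.waterTemp",   # hypothetical DP name
    "unit": "degC",
    "ranges": [
        {"up_to": 25.0, "severity": None},       # normal operation, no alert
        {"up_to": 30.0, "severity": "warning"},
        {"up_to": None, "severity": "error"},    # above 30 degC
    ],
    "requires_acknowledge": True,
}

def severity_for(value, config=FEC_WATER_TEMP_ALERT):
    """Return the alert severity for a measured value according to the configured ranges."""
    for r in config["ranges"]:
        if r["up_to"] is None or value < r["up_to"]:
            return r["severity"]
    return None

print(severity_for(27.3))   # 'warning' in this invented example
```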

Access to the LAr DCS system
• System monitoring can be done using the LAr FSM UI panels: from ATN directly, from GPN via WTS (cerntsatldcs).
• A limited number of DCS actions can be performed from the FSM panels.
Any action must be approved by the LAr Run Coordinator !!!
An e-log entry must be sent before starting and after finishing !!!

Access control
• All users who hold the P1 roles DCS:LAR:admin/expert/observer or DCS:LARHV:admin/expert/observer can log in on the WTS.
• The ACR handshake mechanism is activated on the WTS shell: in order to log in on the FSM or (and) a local DCS PC, the access has to be confirmed by either the Shift Leader or the LAr Run Coordinator.
• If you have any DCS:LAR(xxx):expert role you can log in on the FSM, for example DCS:LARHECLV:expert.
• If you have the P1 roles DCS:LAR:admin/expert you can log in on the local DCS PCs (HECLV, SCS, FEC, TERMO, ROD, PURITY).
• If you have the P1 roles DCS:LARHV:admin/expert you can log in on the local LAr HV PCs.
P1 roles for HW on-call experts: DCS:LAR:observer, DCS:LAR:expert, DCS:LARHV:expert

LAr DCS sub-systems and recent problems (alarms)

ROD crate monitoring and control
• HW:
  - 17 Wiener VME crates (7 racks in USA15 L2) + 4 TTC crates in USA15 L1
  - 4 CANBUS lines to a PC (with one Kvaser card) installed in ROD rack 10-18 (USA15)
  - Dedicated cooling station
(A hedged CAN readout sketch follows below.)
https://atlasop.cern.ch/twiki/bin/view/Main/LAr.Rod.Backend
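For orientation only, here is a minimal sketch of polling one parameter over a CAN line with the python-can package and a Kvaser channel. The request/reply CAN IDs and the payload interpretation are invented placeholders, not the actual Wiener crate protocol, and in production the crates are read out through the OPC/PVSS chain rather than a script like this.

```python
# Minimal sketch of polling a crate parameter over one of the Kvaser CAN lines,
# using the python-can package. The CAN IDs and payload layout are invented
# placeholders; the real Wiener protocol and the OPC/PVSS chain are not shown.

import can

def read_crate_temperature(channel=0, request_id=0x123, reply_id=0x124):
    """Send a request frame and interpret the first two payload bytes as 0.1 degC units."""
    with can.Bus(interface="kvaser", channel=channel, bitrate=125000) as bus:
        bus.send(can.Message(arbitration_id=request_id, data=[0x01], is_extended_id=False))
        while True:
            msg = bus.recv(timeout=1.0)
            if msg is None:
                raise TimeoutError("no reply on the CAN line")
            if msg.arbitration_id == reply_id:
                raw = int.from_bytes(msg.data[:2], "big")
                return raw / 10.0

if __name__ == "__main__":
    print("PS temperature [degC]:", read_crate_temperature())
```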

ROD crate monitoring and control
• Points of attention:
  - Wiener PS temperature
  - SBC reset
  - Exchange of a Wiener PS

FEC Low Voltage and Temperature
• HW:
  - 58 identical systems (32 for the Barrel and 26 for the EC)
  - 280 V power supply in USA15
  - LV power supply in the Tile finger region
  - ELMB monitors the FEC voltages, the water temperature and the LV power supply
  - 12 CANBUS lines to a PC (with 4 Kvaser cards) installed in the LAr DCS rack: 2 CAN lines from each cryostat face to guarantee the readout, plus 4 lines in USA15 (for the 280 V PS)
  - 5 CAN power supply units (LAr DCS rack in USA15)

FEC Low Voltage and Temperature
• SW:
  - PVSS project “ATLLARFEC”, sys. # 55, sys. name = “ATLLARFEC”
  - JCOP Framework – ELMB, CAN PSU (only)
  - OPC CANopen server
  - DDC
• STATUS
(An ELMB conversion sketch follows below.)
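Downstream of the OPC CANopen server the ELMB readings arrive as raw ADC counts; the conversion to physical units is conceptually as simple as the hedged sketch below. The full-scale range, channel values and the linear temperature calibration are invented example numbers, not the real LAr ELMB configuration.

```python
# Hedged sketch: converting raw ELMB ADC counts into physical values.
# The 16-bit full-scale range and the linear temperature calibration are
# invented example numbers, not the actual LAr configuration.

ADC_FULL_SCALE_COUNTS = 65536        # assumed 16-bit ADC
ADC_RANGE_VOLTS = 5.0                # assumed full-scale input range

def counts_to_volts(raw_counts):
    """Convert a raw ADC reading to volts for the assumed input range."""
    return raw_counts / ADC_FULL_SCALE_COUNTS * ADC_RANGE_VOLTS

def volts_to_temperature(volts, v_at_0c=2.73, volts_per_degc=0.01):
    """Example linear sensor calibration (placeholder coefficients)."""
    return (volts - v_at_0c) / volts_per_degc

# Example: one FEC voltage channel and one water-temperature channel.
raw_lv, raw_temp = 52000, 36500
print("FEC LV  [V]   :", round(counts_to_volts(raw_lv), 3))
print("Water T [degC]:", round(volts_to_temperature(counts_to_volts(raw_temp)), 1))
```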

FEC Low Voltage and Temperature

LAr Purity
• HW:
  - 30 devices inside the cryostats
  - 12 analog boards in the front-end crates of the cryostats
  - A 3U crate housed in the DCS rack (6 boards)
  - After digitization and histogramming, the results are transferred via CANBUS to a PC

LAr Purity
• SW:
  - Readout with LabVIEW
  - PVSS project “ATLLARPUR”, sys. # 53, sys. name = “ATLLARPUR”, running on the same PC
  - OPC server for communication between LabVIEW and PVSS
• Point of attention:
  - Purity spikes (a spike-flagging sketch follows below)
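Since purity spikes are the listed point of attention, here is a hedged sketch of how isolated spikes in a purity trend could be flagged before being mistaken for a real purity change; the window size and threshold are invented examples, not an official procedure.

```python
# Hedged sketch: flagging isolated spikes in a purity trend. The window size and
# threshold are invented examples; real purity spikes are judged by the experts.

from statistics import median

def flag_spikes(values, window=5, threshold=0.2):
    """Return indices whose value deviates from the local median by more than `threshold`."""
    spikes = []
    half = window // 2
    for i, v in enumerate(values):
        neighbourhood = values[max(0, i - half): i + half + 1]
        if abs(v - median(neighbourhood)) > threshold:
            spikes.append(i)
    return spikes

# Example trend in arbitrary purity units with one obvious spike.
trend = [1.02, 1.01, 1.03, 1.75, 1.02, 1.01, 1.00]
print("spike indices:", flag_spikes(trend))   # [3]
```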

[Trend plots: FCAL current; purity, Barrel C 3.]

HEC LV system
• HW:
  - 270 V power supply in USA15
  - 4 HEC low-voltage power boxes per end-cap in the Tile finger region, 8 in total
  - 9 ELMBs per box (8 ELMBs are used for the control and measurement of the low-voltage regulators and one to monitor and control the power box)
  - 9 CANBUS lines (1 line per box + 1 for the 270 V PS)
  - Serial control lines for redundancy
• SW:
  - PVSS project “ATLLARHECLV”, sys. # 52, sys. name = “ATLLARHECLV”
  - OPC server