
ShakeAlert CISN Testing Center (CTC) Development
Philip Maechling, Maria Liukis, Thomas H. Jordan
Southern California Earthquake Center (SCEC)
14 October 2010
SCEC: An NSF + USGS Research Center

CTC Progress in 2010
1. Operated an algorithm evaluation system with California-based performance reports and raw data available (2008–present).
2. Changed our automated software testing infrastructure from a web-based (Joomla) system to a server-based (CSEP) system.
3. Added a ShakeMap RSS reader to CSEP for use as a source of authorized observational data that will be used to evaluate earthquake parameter and ground motion forecasts.
4. Implemented a prototype EEW forecast evaluation test that plots the PGV used in ShakeMaps for each event.
5. Began nightly automated retrieval of observational data from ShakeMap RSS and creation of observation-based ground motion maps.
6. Started implementation of the ground motion forecast evaluation defined in the 2008 CISN Testing Document.

EEW Testing Center Provides On-going Performance Evaluation
• Performance summaries available through login (www.scec.org/eew)
• Evaluation results for 2010 include 144 M 4+ earthquakes in the CA Testing Region
• Cumulative raw summaries (2008–present) posted at scec.usc.edu/scecpedia/Earthquake_Early_Warning

EEW Testing Center Provides On-going Performance Evaluation
Example performance information from the Algorithm Testing System for 2010:
• Total events M 4.0+ in the California Testing Region in 2010: 146
• Events M 4.0+ in the region in 2010 with EEW triggers: 57
• Events M 4.0+ in the region in 2010 with only good triggers: 45
• Events M 4.0+ in the region in 2010 with only missed triggers: 11
• Events M 4.0+ in the region in 2010 with both types of triggers
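As a hedged sketch of how per-event trigger results could be rolled up into counts like those above: the record layout and field names (`magnitude`, `good_triggers`, `missed_triggers`) below are illustrative placeholders, not the CTC's actual data model.

```python
# Minimal sketch: roll per-event trigger results up into summary counts.
# The event records and field names are hypothetical, not the CTC schema.
events = [
    {"id": "ev1", "magnitude": 4.2, "good_triggers": 3, "missed_triggers": 0},
    {"id": "ev2", "magnitude": 4.8, "good_triggers": 0, "missed_triggers": 2},
    {"id": "ev3", "magnitude": 5.1, "good_triggers": 2, "missed_triggers": 1},
    {"id": "ev4", "magnitude": 4.0, "good_triggers": 0, "missed_triggers": 0},
]

m4plus = [e for e in events if e["magnitude"] >= 4.0]
with_triggers = [e for e in m4plus if e["good_triggers"] or e["missed_triggers"]]
only_good = [e for e in with_triggers if e["missed_triggers"] == 0]
only_missed = [e for e in with_triggers if e["good_triggers"] == 0]
both_types = [e for e in with_triggers if e["good_triggers"] and e["missed_triggers"]]

print("Total events M 4.0+:", len(m4plus))
print("Events with EEW triggers:", len(with_triggers))
print("Events with only good triggers:", len(only_good))
print("Events with only missed triggers:", len(only_missed))
print("Events with both types of triggers:", len(both_types))
```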

CISN Testing Center (CTC) Forecast Evaluation Processing System
[Slide diagram: the CTC forecast evaluation processing flow. Elements shown include the ShakeMap RSS feed (ground motion observations) and the ANSS earthquake catalog (observed ANSS earthquake parameter and ground motion data); retrieve-data and filter-catalog steps producing a filtered earthquake catalog; ShakeAlert earthquake parameter and ground motion forecasts delivered as ShakeAlert forecast information from the CISN Decision Modules and CISN User Modules; evaluation tests comparing forecasts and observations; a load-reports step; and the CISN EEW Testing Center (CTC) and web site.]
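A minimal sketch of the retrieve-and-filter stage of this pipeline, assuming a generic RSS feed with `georss:point` locations and a magnitude embedded in each item title; the feed URL, item layout, region bounds, and threshold are illustrative assumptions, not the actual CSEP/CTC configuration.

```python
"""Sketch: fetch a ShakeMap-style RSS feed and keep events inside a testing region.
The feed URL, item fields, and region bounds are assumptions for illustration only."""
import re
import urllib.request
import xml.etree.ElementTree as ET

FEED_URL = "https://example.org/shakemap/rss.xml"   # placeholder, not the real feed
REGION = {"min_lat": 32.0, "max_lat": 42.0, "min_lon": -125.0, "max_lon": -114.0}
MIN_MAG = 3.0

def fetch_events(url):
    """Parse RSS items into (title, magnitude, lat, lon) tuples."""
    with urllib.request.urlopen(url) as resp:
        root = ET.fromstring(resp.read())
    ns = {"georss": "http://www.georss.org/georss"}
    events = []
    for item in root.iter("item"):
        title = item.findtext("title", default="")
        mag_match = re.search(r"M\s*(\d+\.\d+)", title)   # assume magnitude in the title
        point = item.findtext("georss:point", default="", namespaces=ns)
        if not mag_match or not point:
            continue
        lat, lon = (float(x) for x in point.split())
        events.append((title, float(mag_match.group(1)), lat, lon))
    return events

def in_region(lat, lon, region=REGION):
    return region["min_lat"] <= lat <= region["max_lat"] and \
           region["min_lon"] <= lon <= region["max_lon"]

if __name__ == "__main__":
    filtered = [e for e in fetch_events(FEED_URL)
                if e[1] >= MIN_MAG and in_region(e[2], e[3])]
    for title, mag, lat, lon in filtered:
        print(f"{title}: M{mag} at ({lat:.3f}, {lon:.3f})")
```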

CTC Progress in 2010
The current ShakeAlert CTC retrieves ShakeMap RSS data and plots observations for all M 3.0+ earthquakes in the California Testing Region, as shown in the example map on the slide.
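A sketch of the kind of observation map described here, using matplotlib with made-up station values; the station coordinates and PGV numbers are synthetic placeholders, not ShakeMap data.

```python
# Sketch: plot station PGV observations for one event as a colored scatter map.
# Station coordinates and PGV values below are synthetic, for illustration only.
import matplotlib.pyplot as plt

stations = [
    # (longitude, latitude, PGV in cm/s)
    (-116.80, 33.95, 4.1),
    (-117.10, 34.10, 2.3),
    (-116.50, 33.70, 7.8),
    (-117.40, 33.85, 1.2),
]
lons = [s[0] for s in stations]
lats = [s[1] for s in stations]
pgvs = [s[2] for s in stations]

fig, ax = plt.subplots()
sc = ax.scatter(lons, lats, c=pgvs, cmap="viridis", s=60)
ax.set_xlabel("Longitude")
ax.set_ylabel("Latitude")
ax.set_title("Observed PGV for one M 3.0+ event (synthetic example)")
fig.colorbar(sc, ax=ax, label="PGV (cm/s)")
plt.show()
```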

CTC Progress in 2010
The initial CTC evaluation test is defined in the 2008 CISN EEW Testing Document (as updated July 2010). The previous Algorithm Testing Center did not implement this summary. Access to ShakeMap RSS ground motion observations makes an automated implementation practical.

Related EEW Activities
1. Caltech helped us with an EEW analysis for the SCEC M8 scenario earthquake simulation, a north-to-south (Cholame to Bombay Beach) rupture. Triggers, the station list, and warning times at each station for the event are listed on a web page (http://scec.usc.edu/scecpedia/M8_EEW_Analysis).
2. Caltech Civil Engineering helped us with building response studies for this event using the Caltech Frame3D system, resulting in animations of an 18-story steel frame building at various sites in California.
3. A visual guide to the Modified Mercalli Intensity scale, using YouTube videos as examples of MMI levels, was posted at http://scec.usc.edu/scecpedia/SCEC_Visualization_Projects.

Scientific and Technical Coordination Issues
1. Prioritize forecast evaluation tests to be implemented in the CTC.
2. Coordinate ShakeAlert information transfer to the CTC.
3. SCEC science planning of EEW forecast evaluation experiments.
4. Use of EEW in time-dependent PSHA information.
5. Consider extending the ShakeMap format as a CAP-based forecast exchange format.
   – Send forecast information (and time of report) to produce:
   – ShakeMap intensity maps
   – ShakeMap uncertainty maps
6. Consider ShakeAlert interfaces to support comparative EEW performance tests. Provide access to information for each trigger (one possible record shape is sketched below):
   – Stations used in the trigger
   – Stations available when declaring the trigger
   – Software version declaring the trigger
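One possible shape for the per-trigger information requested in item 6, shown as a hypothetical record; the field names and values are invented for illustration and are not a ShakeAlert interface definition.

```python
# Hypothetical per-trigger record supporting comparative EEW performance tests.
# Field names are illustrative only; they are not a ShakeAlert interface spec.
from dataclasses import dataclass, field
from typing import List

@dataclass
class TriggerReport:
    trigger_id: str
    algorithm: str                 # which EEW algorithm declared the trigger
    software_version: str          # software version declaring the trigger
    stations_used: List[str] = field(default_factory=list)       # stations used in the trigger
    stations_available: List[str] = field(default_factory=list)  # stations available at declaration

report = TriggerReport(
    trigger_id="2010-10-14T00:00:00Z-0001",
    algorithm="example-algorithm",
    software_version="0.9.0",
    stations_used=["CI.PAS", "CI.USC"],
    stations_available=["CI.PAS", "CI.USC", "CI.RPV"],
)
print(report.software_version, len(report.stations_used), "of",
      len(report.stations_available), "available stations used")
```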

End

Proposed CTC Evaluation Tests

Design of an Experiment
Rigorous CISN EEW testing will involve the following definitions:
– Define a forecast
– Define the testing area
– Define the input data used in forecasts
– Define the reference observation data
– Define the measures of success for forecasts
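These five definitions can be collected into a single experiment configuration. The sketch below is one hypothetical way to encode them; all concrete values (region bounds, data sources, scoring description) are invented for illustration.

```python
# Sketch: encode the five experiment definitions as one configuration object.
# All concrete values (region bounds, feeds, measures) are illustrative only.
from dataclasses import dataclass

@dataclass
class EEWExperiment:
    forecast_source: str        # definition of the forecast being tested
    testing_area: tuple         # (min_lat, max_lat, min_lon, max_lon)
    input_data: str             # data stream the forecasts are allowed to use
    reference_data: str         # authorized observations used for scoring
    success_measure: str        # how forecast/observation agreement is scored

experiment = EEWExperiment(
    forecast_source="ShakeAlert earthquake parameter forecasts",
    testing_area=(32.0, 42.0, -125.0, -114.0),
    input_data="real-time CISN waveforms",
    reference_data="ANSS catalog and ShakeMap ground motions",
    success_measure="magnitude and location residuals vs. network solutions",
)
print(experiment)
```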

Proposed Performance Measures
Summary reports for each event with M ≥ M-min. The key document is the 3 March 2008 CISN testing document, which specifies six types of tests:
– Summary 1: Magnitude
– Summary 2: Location
– Summary 3: Ground Motion
– Summary 4: System Performance
– Summary 5: False Triggers
– Summary 6: Missed Triggers

Experiment Design
Summary 1.1: Magnitude X-Y Diagram
Measure of goodness: data points fall on the diagonal line.
Relevant: T2, T3, T4
Drawbacks: the timeliness element is not represented; it is unclear which estimate in the series of magnitude estimates should be used in the plot.
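A minimal sketch of the Summary 1.1 X-Y diagram, assuming paired (observed, forecast) magnitudes are already available; the magnitude values below are synthetic placeholders, not CTC results.

```python
# Sketch: Summary 1.1-style X-Y diagram of forecast vs. observed magnitude.
# The paired magnitudes are synthetic placeholders, not CTC results.
import matplotlib.pyplot as plt

observed = [4.1, 4.5, 5.0, 5.4, 6.0]
forecast = [3.9, 4.6, 4.8, 5.6, 5.7]

fig, ax = plt.subplots()
ax.scatter(observed, forecast)
lims = [min(observed + forecast) - 0.2, max(observed + forecast) + 0.2]
ax.plot(lims, lims, linestyle="--")   # diagonal: perfect agreement
ax.set_xlim(lims)
ax.set_ylim(lims)
ax.set_xlabel("Observed (network) magnitude")
ax.set_ylabel("EEW forecast magnitude")
ax.set_title("Magnitude X-Y diagram (synthetic example)")
plt.show()
```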

Experiment Design
Summary 1.2: Initial Magnitude Error by Magnitude
Measure of goodness: data points fall on the horizontal line.
Relevant: T2, T3, T4
Drawbacks: the timeliness element is not represented.

Experiment Design
Summary 1.3: Magnitude Accuracy by Update
Measure of goodness: data points fall on the horizontal line.
Relevant: T3, T4
Drawbacks: the timeliness element is not represented.

Experiment Design
Summary 2.1: Cumulative Location Errors
Measure of goodness: data points fall on the vertical zero line.
Relevant: T3, T4
Drawbacks: does not consider magnitude accuracy or timeliness.
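A sketch of how the location errors behind Summary 2.1 could be computed, using the standard haversine great-circle distance between forecast and network epicenters; the event pairs are synthetic and the cumulative presentation is just one choice.

```python
# Sketch: epicentral location error (km) between forecast and network epicenters,
# using the haversine great-circle distance. Event pairs below are synthetic.
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometers between two lat/lon points."""
    r = 6371.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

# (forecast_lat, forecast_lon, observed_lat, observed_lon) -- synthetic examples
pairs = [
    (34.05, -117.20, 34.00, -117.25),
    (33.50, -116.60, 33.56, -116.52),
    (35.10, -118.00, 35.05, -118.10),
]
errors = sorted(haversine_km(*p) for p in pairs)
for i, err in enumerate(errors, start=1):
    print(f"{err:6.2f} km   cumulative fraction {i / len(errors):.2f}")
```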

Experiment Design
Summary 2.2: Magnitude and Location Error by Time after Origin
Measure of goodness: data points fall on the horizontal zero line.
Relevant: T3, T4
Drawbacks: event-specific, not cumulative.

Experiment Design
Summary 3.1: Intensity Map Comparisons
Measure of goodness: the forecast map matches the observed map.
Relevant: T4
Drawbacks: not a quantitative result.

Experiment Design
Summary 3.2: Intensity X-Y Diagram
Measure of goodness: data points fall on the diagonal line.
Relevant: T1, T2, T4
Drawbacks: the timeliness element is not represented; it is unclear which estimate in the series of intensity estimates should be used in the plots (T3).

Experiment Design
Summary 3.3: Intensity Ratio by Magnitude
Measure of goodness: data points fall on the horizontal line.
Relevant: T1, T2, T4
Drawbacks: the timeliness element is not represented; it is unclear which intensity estimate in the series should be used in the plot.

Summary 3.3: Predicted to Observed Intensity Ratio by Distance and Magnitude
Measure of goodness: data points fall on the horizontal line.
Relevant: T1, T2, T4
Drawbacks: the timeliness element is not represented; it is unclear which intensity estimate in the series should be used in the plot.

Summary 3.3: Evaluate Conversion from PGV to Intensity
The group has proposed evaluating algorithms by comparing intensities, and they provide a formula for converting PGV to intensity.

Summary 3.4: Evaluate Conversion from PGV to Intensity
The group has proposed evaluating algorithms by comparing intensities, and they provide a formula for converting PGV to intensity.
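Such conversions are typically log-linear regressions of instrumental intensity on PGV (ShakeMap uses relations of the Wald et al., 1999 type). The sketch below shows the functional form only; the coefficients are placeholders, not the values given in the CISN testing document or in any published relation.

```python
# Sketch: log-linear conversion from PGV (cm/s) to an instrumental intensity,
# I = a * log10(PGV) + b. The coefficients here are placeholders; use whatever
# relation the CISN testing document actually specifies.
import math

def pgv_to_intensity(pgv_cm_s, a=3.5, b=2.4, i_min=1.0, i_max=10.0):
    """Convert PGV to intensity with a generic log-linear relation, clipped to [i_min, i_max]."""
    if pgv_cm_s <= 0.0:
        return i_min
    intensity = a * math.log10(pgv_cm_s) + b
    return max(i_min, min(i_max, intensity))

for pgv in (0.5, 2.0, 10.0, 50.0):
    print(f"PGV {pgv:5.1f} cm/s -> intensity {pgv_to_intensity(pgv):.1f}")
```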

Experiment Design
Summary 3.5: Statistical Error Distribution for Magnitude and Intensity
Measure of goodness: no missed events or false alarms in the testing area.
Relevant: T4
Drawbacks:

Experiment Design
Summary 3.6: Mean Time to First Location or Intensity Estimate (small blue plot)
Measure of goodness: the peak of the measures is at zero.
Relevant: T1, T2, T3, T4
Drawbacks: cumulative, and does not involve the accuracy of the estimates.
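A sketch of the Summary 3.6 statistic: the delay between each event's origin time and its first EEW location or intensity estimate, averaged over events. The times below are synthetic placeholders.

```python
# Sketch: mean time from event origin to the first EEW estimate (Summary 3.6).
# Origin and first-estimate times below are synthetic placeholders.
from datetime import datetime

events = [
    # (origin time, time of first location/intensity estimate)
    (datetime(2010, 4, 4, 22, 40, 42), datetime(2010, 4, 4, 22, 40, 50)),
    (datetime(2010, 6, 15, 4, 26, 58), datetime(2010, 6, 15, 4, 27, 5)),
    (datetime(2010, 7, 7, 23, 53, 33), datetime(2010, 7, 7, 23, 53, 46)),
]
delays = [(first - origin).total_seconds() for origin, first in events]
print("Per-event delays (s):", delays)
print("Mean time to first estimate (s):", sum(delays) / len(delays))
```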

Experiment Design
No examples exist yet for the System Performance summary, defined as Summary 4.1: ratio of reporting versus non-reporting stations.

Experiment Design
Summary 5.1: Missed Event and False Alarm Map
Measure of goodness: no missed events or false alarms in the testing area.
Relevant: T3, T4
Drawbacks: definitions for missed events and false alarms must be developed; does not reflect timeliness.

Experiment Design
Summary 5.2: Missed Event and False Alarm Map
Measure of goodness: no missed events or false alarms in the testing area.
Relevant: T3, T4
Drawbacks: definitions for missed events and false alarms must be developed; does not reflect timeliness.
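One way the needed definitions could be operationalized: associate each alert with a catalog event inside a time-and-distance window, then call unmatched catalog events "missed" and unmatched alerts "false alarms". The matching windows and records below are illustrative assumptions, not definitions from the CISN testing document.

```python
# Sketch: classify missed events and false alarms by matching alerts to catalog
# events within a time window and a distance window. Windows and data are
# illustrative assumptions only.
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometers between two lat/lon points."""
    r = 6371.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    a = math.sin(math.radians(lat2 - lat1) / 2) ** 2 + \
        math.cos(p1) * math.cos(p2) * math.sin(math.radians(lon2 - lon1) / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

TIME_WINDOW_S = 30.0     # assumed association window in seconds
DIST_WINDOW_KM = 100.0   # assumed association window in kilometers

# (id, origin time in seconds, lat, lon) -- synthetic catalog events and alerts
catalog = [("eq1", 100.0, 34.00, -117.25), ("eq2", 500.0, 33.50, -116.60)]
alerts = [("a1", 105.0, 34.05, -117.20), ("a2", 900.0, 36.00, -120.00)]

def matches(alert, event):
    return abs(alert[1] - event[1]) <= TIME_WINDOW_S and \
           haversine_km(alert[2], alert[3], event[2], event[3]) <= DIST_WINDOW_KM

missed = [e for e in catalog if not any(matches(a, e) for a in alerts)]
false_alarms = [a for a in alerts if not any(matches(a, e) for e in catalog)]
print("Missed events:", [e[0] for e in missed])
print("False alarms:", [a[0] for a in false_alarms])
```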

Experiment Design
Summary 6.1: Missed Event Map
Measure of goodness: no missed events in the testing region.
Relevant: T3, T4
Drawbacks: a missed event must be defined; does not indicate timeliness.

End
