AQ monitoring, data management – Austrian Experience

Content
- Legal requirements, institutional arrangements
- Data management
- Quality assurance
- Data aggregation
- Recommendations

Legal requirements, institutional arrangements

Legal framework for AQ monitoring, assessment, reporting, and management
Legal basis of monitoring, assessment and reporting at three levels:
1. EU Directives (2008/50/EC, 2004/107/EC) set the basic requirements; Commission Implementing Decision 2011/850/EU sets out the detailed reporting requirements
2. Austrian Ambient Air Quality Act: transposition into national legislation, with further administrative detail
3. Ordinance on air quality measurement: technical details on
   - minimum number and location of monitoring sites per zone
   - information of the public, reporting
   - QA/QC

Administrative responsibilities
Umweltbundesamt:
- national background monitoring network
- national reference laboratory for quality assurance
- nation-wide information of the public
- international data exchange, reporting
Governments of the 9 Federal Provinces:
- operation of the regional AQ monitoring networks, incl. quality assurance
- information of the public in their territory
- air quality management (AQ plans and programmes)

Zones and agglomerations
- Three agglomerations (>250,000 inhabitants): Vienna, Linz, Graz
- Zones for SO2, NOx, CO, PM10, PM2.5, B(a)P: non-agglomeration zones follow the administrative boundaries of the Federal Provinces
- Zones for ozone, benzene and heavy metals follow pollution patterns

Zones and agglomerations – ozone

AQ monitoring network – PM10 number of exceedance days

Data management in Austria

AQ data flows and management – overview
[Flow diagram] Monitoring station (continuous monitoring) → AQ data-base (raw data) → different validation levels → reports
- Information about the function of the station feeds the daily and monthly data validation
- Calibration results feed the final data validation (correction after calibration)

AQ data-base: AQ data management and dissemination
- Data checking tool: visualisation, setting validity and control flags, correction of data
- Near-real-time information of the public (www): daily AQ report, hourly ozone map, hourly ozone report, line graphs
- Monthly and annual AQ reports
- Submission to EC/EEA, EMEP, GAW

Data reporting, information of the public
- daily AQ reports
- information of the public about exceedances of limit, information or alert values
- monthly AQ reports
- annual AQ reports
- submission of AQ data to international data-bases (reporting to EU, EEA, EMEP)

EU e-reporting – the AQ data model (source: EEA)
Air quality data & meta-information:
- Zones (B)
- Assessment regimes (C)
- Assessment methods meta-data (D)
- Primary assessment data (E): primary up-to-date (UTD) data, primary validated data
- Aggregated assessment data (F)
- Attainment information (G)
Air quality management information:
- Air quality plans (H)
- Source apportionments (I)
- Scenarios (J)
- Measures (K)

Data quality assurance, quality control

Quality assurance: data checking and validation
1. Daily data check
   - identification of technical problems at the monitoring station (failure of the power supply or data transmission, defect of a monitoring device)
   - discussion of non-plausible values
   - flagging of erroneous values in the data-base
2. Monthly data check
   - flagging if necessary (based on information from station maintenance)
3. Annual data check
   - data correction according to calibration results if necessary

Daily data validation
- to identify data affected by technical problems at the monitoring station
- to flag these data as "not valid"
- to inform the responsible technician about technical problems (exchange, repair or maintenance of monitoring equipment)

Information flow for daily data validation
- Monitoring station → AQ data-base: raw AQ data, zero and span values, monitoring station control data (temperature, humidity, air flow), monitoring device status information
- Daily validation: plausibility check based on expert knowledge of "what is to be expected" at this station; setting validity and validation flags
- Outputs: statistical aggregation, visualisation, daily AQ report, near-real-time (NRT) AQ information, exceedance of information or alert values, up-to-date (UTD) data submission to the EEA

Quality assurance at the monitoring station
1. Daily automatic zero and span check (SO2, NO, CO: gas cylinder; ozone: calibrator)
2. Periodic station maintenance every 14 days:
   - manual zero and span check
   - exchange of filters, cleaning of the air inlet
   - repair or exchange of equipment if necessary
3. Calibration 4 times per year:
   - by transfer standard (SO2, NO, CO, O3)
   - flow check for PM devices
   - cleaning of the central air inlet
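To make the daily automatic zero and span check concrete, here is a minimal Python sketch that screens logged check values against acceptance limits; the tolerance values, field names and example readings are illustrative assumptions, not the limits used in the Austrian network.

```python
# Minimal sketch of screening daily zero/span check results (assumed tolerances).
from dataclasses import dataclass

@dataclass
class ZeroSpanCheck:
    station: str
    pollutant: str
    zero_reading: float    # instrument response to zero gas (ppb)
    span_reading: float    # instrument response to span gas (ppb)
    span_reference: float  # certified concentration of the span gas (ppb)

ZERO_TOLERANCE_PPB = 2.0   # hypothetical acceptance limit for the zero reading
SPAN_TOLERANCE_REL = 0.05  # hypothetical limit: 5 % relative span deviation

def screen(check: ZeroSpanCheck) -> list[str]:
    """Return the problems found; an empty list means the check passed."""
    problems = []
    if abs(check.zero_reading) > ZERO_TOLERANCE_PPB:
        problems.append(f"zero deviation {check.zero_reading:+.1f} ppb out of range")
    rel_dev = (check.span_reading - check.span_reference) / check.span_reference
    if abs(rel_dev) > SPAN_TOLERANCE_REL:
        problems.append(f"span deviation {rel_dev:+.1%} out of range")
    return problems

# Example: a NO monitor whose span response has drifted by about +8 %.
print(screen(ZeroSpanCheck("Station A", "NO", 0.5, 86.4, 80.0)))
```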

Information flow for monthly data validation
- Monitoring station: station maintenance (every second week) provides information about technical problems and zero/span checks
- The information already used for the daily data check is applied again (zero and span values, monitoring device status information)
- Monthly validation in the AQ data-base: setting validity and validation flags, correction of zero deviation
- Outputs: statistical aggregation, visualisation, monthly AQ report, update of UTD data to the EEA

Quarterly/annual data validation
Information available:
- zero and span deviation from the quarterly calibration by transfer standard
Additional information to be used:
- automatic zero and span checks
- manual zero and span checks from the biweekly station maintenance
- information about technical problems observed by the technician during the 2-weekly station maintenance and calibration
- status bits from the monitoring devices, which give information about specific problems

Information flow for quarterly/annual (= final) data validation
- Monitoring station: station calibration (every 3 months) by transfer standard provides the zero and span deviation
- Information from the daily and monthly data checks is used as well (zero and span values, monitoring device status information, information from station maintenance)
- Quarterly (= final) data validation in the AQ data-base: correction of zero or span deviation, setting validity and validation flags
- Outputs: statistical aggregation, visualisation, national annual AQ report, annual submission of the finally validated data to the EU and EMEP, exceedance/compliance information to the EU

Annual check of traceability
National reference laboratory:
- comparison of its reference standards with the reference standards of other national reference laboratories
- participation in European laboratory intercomparisons
Regional AQ monitoring networks:
- comparison of their standards with the reference standards of the national reference laboratory
- participation in national laboratory intercomparisons

Data aggregation

Information stored in the data-base
- corrected half-hour mean values
- original half-hour mean values
- validity flags
- data check flags
- information about corrections and changes: reason for the correction (from a predefined list), time, technician
- zero and span values
- status bits
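As an illustration only, the sketch below models one half-hour record carrying the items listed above; the field names and flag vocabulary are assumptions for the example, not the actual schema of the Austrian data-base.

```python
# Illustrative record: one half-hour value with original and corrected data,
# flags, correction audit trail, zero/span values and device status bits.
from dataclasses import dataclass, field
from datetime import datetime
from typing import Optional

@dataclass
class HalfHourRecord:
    station: str
    pollutant: str
    start: datetime
    original_value: float                      # as delivered by the station data logger
    corrected_value: Optional[float] = None    # after correction of zero/span deviation
    validity_flag: str = "valid"               # e.g. "valid", "not valid", "valid, data corrected"
    check_flags: list[str] = field(default_factory=list)  # daily/monthly/annual check level
    correction_reason: Optional[str] = None    # from a predefined list
    correction_time: Optional[datetime] = None
    technician: Optional[str] = None
    zero_value: Optional[float] = None         # daily automatic zero check
    span_value: Optional[float] = None         # daily automatic span check
    status_bits: int = 0                       # status word reported by the monitoring device

record = HalfHourRecord("Station A", "NO2", datetime(2014, 12, 2, 8, 0), original_value=23.4)
```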

Criteria for data aggregation
- laid down in Annexes VII and XI of the AQD (2008/50/EC)
- application of specific rounding rules
- certain data quality objectives apply (data capture, time coverage)
- values below the detection limit: see the guidance to e-reporting under Implementing Decision 2011/850/EU
- aggregation according to the definition of the threshold values (e.g. annual mean, number of daily mean values above a certain value, AOT40, …)
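The sketch below illustrates, under simplified assumptions, two of the aggregations mentioned above: an annual mean and the number of daily mean values above a threshold, computed from valid half-hour values. It deliberately omits the AQD's data-capture and rounding requirements, which would have to be applied in practice.

```python
# Simplified aggregation of half-hour means: annual mean and the number of
# days whose daily mean exceeds a threshold (data-capture checks and the
# rounding rules of the AQD are omitted in this sketch).
from collections import defaultdict
from datetime import datetime

def aggregate(values: list[tuple[datetime, float, bool]], daily_threshold: float):
    """values: (timestamp, concentration, is_valid) half-hour means."""
    valid = [(t, v) for t, v, ok in values if ok]
    annual_mean = sum(v for _, v in valid) / len(valid)

    per_day = defaultdict(list)
    for t, v in valid:
        per_day[t.date()].append(v)
    daily_means = {day: sum(vs) / len(vs) for day, vs in per_day.items()}
    exceedance_days = sum(1 for m in daily_means.values() if m > daily_threshold)
    return annual_mean, exceedance_days
```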

Recommendations
- automatic daily function control in the station (zero, span, calibration) and online transfer of this information
- calibration 4 times per year is important for the quality of the data
- traceability of standards is important for QA/QC
- documentation, also when not accredited
- meteorological parameters should be monitored at the stations as well
- automated, standard methods should be preferred wherever possible

Recommendations
- regular exchange within the country and between countries; get involved in AQUILA
- consider the future costs of monitoring methods, laboratory equipment and material from the beginning
- keep it simple: choose simple, cost-efficient methods and keep the number of stations low at the beginning
- preserve continuity of personnel and methods
- provide regular training for personnel

Further information
- e-reporting guidance
- Ambient Air Quality Directive (2008/50/EC)
- Review of European Air Quality policy

Contact & Information
Christian Nagl, +43-1-31304-5866, christian.nagl@umweltbundesamt.at
Umweltbundesamt, www.umweltbundesamt.at
Monitoring, analyzing and reporting on air quality in Turkmenistan, Ashgabat, 2-3 December 2014

Some further information

Data checking and validation – validity flags
Automatically created flags (station data logger):
- valid: more than 75% of the minute values available
- not valid: less than 75% of the minute values available
Manual data validation (data checking application):
- not valid: malfunction of a monitoring device (due to various reasons); not ambient air measured
- valid, data corrected: data manually altered (correction after calibration)
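A minimal sketch of the automatic rule stated above, assuming the logger reports how many of the 30 minute values of a half hour were available:

```python
# Automatic validity flag for a half-hour mean: "valid" only if more than
# 75 % of the minute values were available.
def validity_flag(minutes_available: int, minutes_total: int = 30) -> str:
    return "valid" if minutes_available / minutes_total > 0.75 else "not valid"

print(validity_flag(28))  # valid
print(validity_flag(20))  # not valid (only ~67 % of the minute values)
```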

Data checking and validation – data validation flags
- Step 0: "unchecked" (raw) data from the station data logger – used for the near-real-time information of the public (no exceedance of limit, information or alert values)
- Step 1: first plausibility check (daily) – used for the daily AQ report and the near-real-time information of the public in case of an exceedance of limit, information or alert values
- Step 2: plausibility check based on information from station maintenance (monthly) – used for the monthly AQ report
- Step 3: correction of zero or span deviation based on the calibration with the transfer standard (annually) – used for the annual AQ report and the data submission to international databases (EU/EEA, EMEP, GAW)

Daily data validation – problem solving
Problem: no data from the whole monitoring network
- Possible causes: data-base out of order; modems for receiving data out of order
- Possible actions: contact the responsible IT expert(s)
Problem: no data from one monitoring station
- Possible causes: electric power supply out of order; modem at the monitoring station out of order
- Possible actions: contact the electricity supplier; try to contact the modem; visit the monitoring site
Problem: no data from one monitoring device
- Possible causes: defect of the monitoring device (status bits can give detailed information)
- Possible actions: visit the monitoring site: repair or replace the monitoring device

Daily data validation – problem solving (continued)
Problem: non-plausible values
- Possible causes: defect of the monitoring device (status bits can give detailed information)
- Possible actions: set the validity flag "not valid"; visit the monitoring site: repair or replace the monitoring device
Problem: zero or span checks out of the allowed range
- Possible causes: problems with the monitoring device or the calibration equipment
- Possible actions: visit the monitoring site: repair or replace the monitoring device or the calibration equipment; calibration with an independent device
Problem: temperature or humidity inside the monitoring station out of range
- Possible causes: air conditioning out of order
- Possible actions: visit the monitoring site: repair the air conditioning

Data visualisation and checking application – data checking tool, example: visualisation of 6 time series

Data visualisation and checking application – example: tabular visualisation of half-hour mean values with validity flags („gültig“ = valid, „unter 90%“ = below 90%, „ungültig“ = invalid) and status bits during the zero/span check („SPAN“, „ZERO“)

Monthly data check – problem solving
Problem: non-plausible values (not caught by the daily data check)
- Possible causes: defect of the monitoring device or the air conditioning (status bits and information from the technician at station maintenance can give detailed information)
- Possible actions: set the validity flag "not valid"
Problem: AQ data slightly below zero
- Possible cause: small negative zero offset
- Possible action: set negative values to 0 (for publication in the monthly report)

Quarterly/annual data check – problem solving
Problem: zero or span deviation found at calibration
- Possible cause: zero or span drift of the monitoring device
- Possible action: correct the data according to the calibration results
Problem: AQ data slightly below 0 after the zero deviation correction
- Possible cause: small negative zero offset or drift
- Possible action: set negative values to 0

Correction of measurement data
Numerical correction of the measurement data is based on the zero and span deviation observed in the calibration with the transfer standard:
Xcorr(t) = k(t) * Xorig(t) + d(t)
(Xorig: original measurement value, Xcorr: corrected measurement value)
The calibration factor k and the zero offset d can be interpolated linearly over time.
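A minimal Python sketch of this correction, assuming k and d are known at two consecutive calibration dates and interpolated linearly in between (the function and variable names are illustrative):

```python
# Apply X_corr(t) = k(t) * X_orig(t) + d(t), with k and d interpolated
# linearly between two calibrations at t0 and t1.
from datetime import datetime

def interpolate(t: datetime, t0: datetime, v0: float, t1: datetime, v1: float) -> float:
    frac = (t - t0) / (t1 - t0)          # fraction of the calibration interval elapsed at t
    return v0 + frac * (v1 - v0)

def correct(x_orig: float, t: datetime,
            t0: datetime, k0: float, d0: float,
            t1: datetime, k1: float, d1: float) -> float:
    k = interpolate(t, t0, k0, t1, k1)   # calibration factor at time t
    d = interpolate(t, t0, d0, t1, d1)   # zero offset at time t
    return k * x_orig + d
```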

Automated outlier check
- identifies measurement values outside a pre-defined range
- produces a list of values which have to be checked manually for plausibility and possible technical problems
- validity flags are not set automatically
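A minimal sketch of such a range-based screening; the plausibility ranges per station type and pollutant are purely illustrative placeholders:

```python
# Range-based outlier screening: returns candidates for manual review only;
# validity flags are not set automatically (that remains an expert decision).
from datetime import datetime

# Illustrative plausibility ranges (µg/m³) per (station type, pollutant).
PLAUSIBLE_RANGE = {
    ("background", "PM10"): (0.0, 200.0),
    ("traffic", "NO2"): (0.0, 400.0),
}

def find_outliers(station_type: str, pollutant: str,
                  series: list[tuple[datetime, float]]) -> list[tuple[datetime, float]]:
    low, high = PLAUSIBLE_RANGE[(station_type, pollutant)]
    return [(t, v) for t, v in series if not (low <= v <= high)]
```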

Automated outlier check
Advantages:
- saves time through automated identification of outliers
- enables a fast survey of technical problems in a large network
Disadvantages:
- criteria for outliers have to be defined individually for different types of monitoring stations, depending on the "usual" pollution level
- cannot take extraordinary events into account (desert dust, extremely unfavourable dispersion conditions)
- cannot replace expert knowledge

Data flow for information of the public
- Monitoring station → regional network data-base → local/regional information on the www
- Daily data check/validation; information on technical problems at a monitoring station triggers maintenance or repair
- Regional data-base → national data-base → nation-wide information on the www and the EEA ozone web

Data visualisation and checking application
Data checking, flagging and correction is done with a data checking application which
- visualises AQ data from the data-base
- visualises function control values (zero, span) from the data-base
- enables setting data control flags
- enables correcting data (including changing data to "not valid")
- keeps the original data (after any correction)

Data visualisation and checking application – example: comparison of different pollutants; NOx (green) and PM10 (light blue) peaks coincide with each other and with low O3 (orange) concentrations

Data visualisation and checking application – example: comparison of NOx concentrations at three different stations

Data visualisation and checking application – example: visualisation of zero and span values (NO, ppb)

Data visualisation and checking application – example: tabular visualisation of half-hour mean values with validity flags („gültig“ = valid, „unter 90%“ = below 90%, „ungültig“ = invalid) and a status bit indicating a device defect („OUT OF SERVICE“)

Quarterly/annual data correction
Correction of the span deviation by the function Xcorr(t) = k(t) * Xorig(t) + d(t)
Example: a span deviation of 8% at the end of the period gives a factor of 1.08.
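As a worked illustration of the 8% example (the calibration dates and the original value are made up), the factor grows linearly from 1.00 at the previous calibration to 1.08 at the end of the quarter:

```python
# Worked example: span deviation of 8 % at the end of the quarter, so the
# factor k rises linearly from 1.00 to 1.08; the zero offset d is taken as 0.
from datetime import datetime

t0, t1 = datetime(2014, 1, 1), datetime(2014, 4, 1)   # calibration dates (illustrative)
t = datetime(2014, 2, 15)                             # half-hour value from mid-quarter
k = 1.00 + (t - t0) / (t1 - t0) * (1.08 - 1.00)       # linear interpolation of k
x_orig = 50.0                                         # made-up original half-hour value
x_corr = k * x_orig                                   # X_corr = k * X_orig + d, with d = 0
print(round(k, 2), round(x_corr, 1))                  # 1.04 52.0
```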

Data quality objectives
Criteria in Annex I of the AQD and Annex IV of the 4th Daughter Directive (2004/107/EC):
- Uncertainty (fixed measurements): 15% for SO2, NOx, CO; 25% for PM, Pb, C6H6
- Time coverage (continuous fixed measurements): 100%
  Time coverage = 100 * Nplanned / Nyear [%]
- Data capture (90% for fixed measurements):
  Data capture = 100 * Nvalid / Ntotal [%]
  Time lost to regular calibration or normal maintenance is not counted (about 5% of the measurement time), so the "effective" data capture criterion for continuous measurement is 85%.
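A minimal sketch of the two formulas for one year of half-hour values (the counts used in the example are illustrative):

```python
# Time coverage and data capture as defined above, in percent.
def time_coverage(n_planned: int, n_year: int) -> float:
    return 100.0 * n_planned / n_year

def data_capture(n_valid: int, n_total: int) -> float:
    return 100.0 * n_valid / n_total

n_year = 365 * 48                              # half-hour values in a non-leap year
print(time_coverage(n_year, n_year))           # 100.0 for continuous fixed measurement
print(round(data_capture(15200, n_year), 1))   # 86.8 -> above the "effective" 85 % criterion
```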

Rounding rules
- Data should be used with the same number of digits as obtained
- Commercial rounding: <0.5 is rounded down, ≥0.5 is rounded up
- Rounding should be the very last step before comparison with a threshold
- Round to the same numerical accuracy as the environmental objective
- Rounding if there is no environmental objective:

Value x          Number of decimals   Example before rounding   After rounding
x ≥ 10           integer              17.83                     18
1 ≤ x < 10       1 decimal            2.345                     2.3
0.1 ≤ x < 1      2 decimals           0.865                     0.87
0.01 ≤ x < 0.1   3 decimals           0.0419                    0.042
etc.
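A sketch of the magnitude-dependent rounding from the table above, using half-up ("commercial") rounding via the decimal module rather than Python's default banker's rounding:

```python
# Commercial (half-up) rounding to the number of decimals given in the table:
# integers for x >= 10, 1 decimal for 1 <= x < 10, 2 decimals for 0.1 <= x < 1, ...
from decimal import Decimal, ROUND_HALF_UP

def round_for_reporting(x: float) -> Decimal:
    decimals, threshold = 0, 10.0
    while abs(x) < threshold and decimals < 6:   # extend the table downwards as needed
        decimals += 1
        threshold /= 10.0
    quantum = Decimal(1).scaleb(-decimals)       # e.g. Decimal("0.01") for 2 decimals
    return Decimal(str(x)).quantize(quantum, rounding=ROUND_HALF_UP)

for v in (17.83, 2.345, 0.865, 0.0419):
    print(v, "->", round_for_reporting(v))       # 18, 2.3, 0.87, 0.042
```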

Data below detection limit

Meta-information (source: EEA)
- Zones (B): spatial delimitation, area, population
- Assessment regimes (C): link zone, pollutant and environmental objective (different types of limit values, target values, etc.) with the assessment type (fixed measurement, indicative measurement, modelling, objective estimation – depending on the pollution level)
- Assessment methods meta-data (D): information on monitoring networks, location and description of monitoring stations, measurement equipment, sampling and analytical equipment

Primary assessment data (E)
- E1a: validated primary assessment data – deadline 30 September of the following year
- E2a: preliminary up-to-date assessment data – hourly transmission; updates after further validation steps are possible
The dataflow comprises:
- reference to D (monitoring station)
- time period of the data set
- value with start and end time
- validity and validation status
- for finally validated data: information related to the data quality objectives

Aggregated assessment data (F)
- Statistics such as annual mean, daily mean, number of exceedances per year, AOT40
- Aggregation is done by the EEA based on the primary assessment data (E)
[Chart: aggregated statistics as a time series, 1999–2008]

Attainment information (G)
Legally relevant information by the Member States on attainment or exceedance of limit or target values per zone.
Information to be provided:
- numeric value of the exceedance or number of exceedances per year
- exceedance description: spatial delimitation of the exceedance area, affected population/ecosystem area
- any subtraction of contributions from winter sanding & salting and/or natural sources has to be documented
- final exceedance description
The attainment/exceedance information has to be provided by the Member State and is checked by the EEA against the aggregated data (F), which is based on the primary validated assessment data (E1).