Applications and the Grid: The European DataGrid

Applications and the Grid
The European DataGrid Project Team
http://www.eu-datagrid.org

DataGrid is a project funded by the European Union.

Overview
• An applications view of the Grid: Use Cases
• High Energy Physics
  - Why do we need to use Grids in HEP? A brief mention of the MONARC model and its evolution towards LCG
  - HEP data analysis and management, HEP requirements, and processing patterns
  - Testbed 1 and 2 validation: what has already been done on the testbed?
  - The current Grid-based distributed computing model of the HEP experiments
• Earth Observation
  - Mission and plans
  - What do typical Earth Observation applications do?
• Biology
  - dgBLAST

What all applications want from the Grid
• A homogeneous way of looking at a 'virtual computing lab' made up of heterogeneous resources, as part of a VO (Virtual Organisation) which manages the allocation of resources to authenticated and authorised users
• A uniform way of 'logging on' to the Grid
• Basic functions for job submission, data management and monitoring
• The ability to obtain resources (services) satisfying user requirements for data, CPU, software, turnaround…
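To make these basics concrete, here is a minimal sketch of an EDG-style job description and its submission. The JDL field names follow the Condor ClassAd syntax used by the EDG workload management system, and the exact commands varied between releases, so treat this as illustrative rather than definitive.

```python
# Sketch: write a minimal EDG-style JDL file -- what to run, what travels
# with the job, and what comes back. Field names follow the ClassAd-based
# JDL used by the EDG workload management system.
jdl = """\
Executable    = "/bin/hostname";
StdOutput     = "std.out";
StdError      = "std.err";
OutputSandbox = {"std.out", "std.err"};
"""

with open("hello.jdl", "w") as f:
    f.write(jdl)

# On an EDG User Interface machine this would then be submitted with
# something like:  dg-job-submit hello.jdl
```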

Common Applications Issues
• Applications are the end-users of the Grid: they are the ones finally making the difference
• All applications started modelling their usage of the Grid through use cases, a standard technique for gathering requirements in software development methodologies
• Use cases are narrative documents that describe the sequence of events of an actor using a system […] to complete processes
• What use cases are NOT:
  - the description of an architecture
  - the representation of an implementation

Why Use Cases?
(Layer diagram, from the applications domain down to the computer-scientist domain:)
• Applications domain: ALICE, ATLAS, CMS, LHCb, other HEP, other applications
• Domains interface: HEP Common Application Layer, …
• DataGRID middleware: PPDG, GriPhyN, EU-DataGRID
• Bag of services: GLOBUS, Condor-G, …
• OS & network services

Use Cases (HEPCAL)
http://lhcgrid.web.cern.ch/LHCgrid/SC2/RTAG4/finalreport.doc
• Authorisation and login: obtain Grid authorisation; ask for revocation of Grid authorisation; Grid login; browse Grid resources
• Job management: job catalogue update; job catalogue query; job submission; job output access or retrieval; error recovery for aborted or failing production jobs; job control; steer job submission; job resource estimation; job environment modification; job splitting; job monitoring; production job; analysis; simulation job
• Dataset (DS) and metadata management: DS metadata update; DS metadata access; dataset registration to the Grid; virtual dataset declaration; virtual dataset materialization; dataset upload; user-defined catalogue creation; dataset access; dataset transfer to non-Grid storage; dataset replica upload to the Grid; dataset access cost evaluation; dataset replication; dataset transformation; physical dataset instance deletion; dataset deletion (complete); user-defined catalogue deletion (complete); data retrieval from remote datasets; dataset verification; dataset browsing
• VO-wide and other: VO-wide resource reservation; VO-wide resource allocation to users; experiment software development for the Grid; condition publishing; software publishing; browse condition database

High Energy Physics applications

The LHC challenge
• HEP is carried out by a community of more than 10,000 users spread all over the world
• The Large Hadron Collider at CERN is the most challenging goal for the whole HEP community in the coming years
• It will test the Standard Model and the models beyond it (SUSY, GUTs) at an energy scale (7+7 TeV p-p) corresponding to the very first instants of the universe after the Big Bang (<10^-13 s), allowing the study of the quark-gluon plasma
• The LHC experiments will produce an unprecedented amount of data to be acquired, stored and analysed:
  - 10^10 collision events / year (+ the same again from simulation)
  - This corresponds to 3-4 PB of data / year / experiment (ALICE, ATLAS, CMS, LHCb)
  - Data rate (input to the data storage centre): up to 1.2 GB/s per experiment
  - Collision event records are large: up to 25 MB (real data) and 2 GB (simulation)
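A quick back-of-the-envelope check of these figures (a sketch: the ~0.35 MB average event size is implied by the quoted totals, not stated on the slide):

```python
# Rough consistency check of the LHC data-volume figures quoted above.
events_per_year = 1e10          # collision events / year / experiment
raw_event_size_mb = 0.35        # implied average event size (assumption)

volume_pb = events_per_year * raw_event_size_mb * 1e6 / 1e15   # bytes -> PB
print(f"~{volume_pb:.1f} PB/year/experiment")                  # ~3.5 PB

seconds_per_year = 1e7          # canonical 'accelerator year' of beam time
rate_gb_s = volume_pb * 1e15 / seconds_per_year / 1e9
print(f"~{rate_gb_s:.2f} GB/s sustained")   # ~0.35 GB/s on average,
# peaking towards the quoted 1.2 GB/s during data taking
```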

The LHC detectors
ATLAS, CMS, LHCb
• ~6-8 PetaBytes / year
• ~10^10 events / year
• ~10^3 batch and interactive users

Online system: a multi-level trigger filters out background and reduces the data volume:
• collision rate: 40 MHz (40 TB/sec)
• level 1 (special hardware): 75 kHz (75 GB/sec)
• level 2 (embedded processors): 5 kHz (5 GB/sec)
• level 3 (PCs): 100 Hz (100 MB/sec)
• then data recording & offline analysis
(les.robertson@cern.ch)
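Each trigger level trades rate for bandwidth at a roughly constant ~1 MB event record; a small sketch reproducing the numbers in the cascade above:

```python
# Rejection factors and bandwidths of the multi-level trigger chain above,
# assuming a ~1 MB event record at each level (round numbers from the slide).
stages = [
    ("collision rate", 40e6, "level 1 (special hardware)"),
    ("level 1 output", 75e3, "level 2 (embedded processors)"),
    ("level 2 output",  5e3, "level 3 (PCs)"),
    ("level 3 output",  1e2, "data recording"),
]
event_size_bytes = 1e6
for (name, rate_hz, stage), (_, next_rate, _) in zip(stages, stages[1:]):
    print(f"{name}: {rate_hz:.0e} Hz -> {stage} rejects "
          f"{rate_hz / next_rate:.0f}x, leaving {next_rate:.0e} Hz "
          f"({next_rate * event_size_bytes / 1e9:.3g} GB/s)")
```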

Data Handling and Computation for Physics Analysis
• The detector feeds the event filter (selection & reconstruction), producing raw data
• Reconstruction turns raw data into processed data (event summary data); events can later be reprocessed
• Batch physics analysis extracts analysis objects by physics topic; event simulation feeds the same chain
• Interactive physics analysis works on the analysis objects
(les.robertson@cern.ch)

Deploying the LHC Global Grid Service
The LHC Computing Centre hierarchy:
• Tier 0: CERN
• Tier 1 regional centres: e.g. France, Italy, UK, USA, Germany, Japan, and CERN
• Tier 2 centres, e.g. a grid for a regional group (Uni a, Uni n, Lab b, Lab c, …)
• Tier 3: physics department resources (Uni x, Uni y, Lab m, Uni b, …) and desktops, e.g. a grid for a physics study group
(les.robertson@cern.ch)

HEP Data Analysis and Datasets
• Raw data (RAW), ~1 MByte: hits, pulse heights
• Reconstructed data (ESD), ~100 kByte: tracks, clusters…
• Analysis Objects (AOD), ~10 kByte: physics objects, summarized, organized by physics topic
• Reduced AODs (TAGs), ~1 kByte: histograms, statistical data on collections of events
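The tiers imply a steep reduction in the data volume an analysis actually touches; the sketch below totals one year of 10^9 events at each tier, using the per-event sizes listed above:

```python
# Per-event sizes of the HEP data tiers listed above, and the total
# footprint of 1e9 events at each tier.
tiers = {"RAW": 1_000_000, "ESD": 100_000, "AOD": 10_000, "TAG": 1_000}  # bytes
events = 1e9
for name, size in tiers.items():
    print(f"{name}: {size * events / 1e15:.3f} PB for {events:.0e} events")
# RAW: 1 PB, ESD: 0.1 PB, AOD: 0.01 PB, TAG: 0.001 PB -- selection and
# analysis mostly run on AOD/TAG, with selective access back to ESD/RAW.
```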

HEP Data Analysis – processing patterns
• Processing is fundamentally independent ('embarrassingly parallel') due to the independent nature of 'events'
  - So we have the concepts of splitting and merging
  - Processing is organised into 'jobs' which each process N events (e.g. a simulation job organised in groups of ~500 events, taking ~1 day to complete on one node)
  - A processing run of 10^6 events would then involve 2,000 jobs, merging into a total set of 2 TByte
• Production processing is planned by experiment and physics-group data managers (this will vary from experiment to experiment)
  - Reconstruction processing (1-3 times a year, on 10^9 events)
  - Physics-group processing (~1/month), producing ~10^7 AOD+TAG
  - This may be distributed over several centres
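The splitting concept is simple enough to sketch directly: carve a run of 10^6 events into the ~500-event jobs used in the example above.

```python
# Split a production of n_events into jobs of events_per_job each,
# as in the example above (1e6 events -> 2,000 jobs of 500 events).
def split_production(n_events: int, events_per_job: int):
    """Yield (first_event, n_events_in_job) work units covering the run."""
    for first in range(0, n_events, events_per_job):
        yield first, min(events_per_job, n_events - first)

jobs = list(split_production(1_000_000, 500))
print(len(jobs))          # 2000 jobs
print(jobs[0], jobs[-1])  # (0, 500) (999500, 500)
```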

Processing Patterns (2)
• Individual physics analysis is by definition 'chaotic' (it follows the work patterns of individuals)
  - Hundreds of physicists distributed across an experiment may each want to access the central AOD+TAG and run their own selections; they will need very selective access to ESD+RAW data (for tuning algorithms, checking occasional events)
• This will require replication of AOD+TAG within the experiment, and selective replication of RAW+ESD
  - This will be a function of the processing and physics-group organisation in the experiment

Alice: AliEn-EDG integration
• EDG UI installation
• JDL translation
• Certificates
• Alice SE on EDG nodes
• Alice Data Catalogue access by EDG nodes
Components involved: AliEn CE and SE, EDG UI, EDG RB server, EDG CE and SE, WNs, and the Data Catalogue
(Cerello, Barbera, Buncic, Saiz, et al.)

What have the HEP experiments already done on the EDG testbeds 1.0 and 2.0?
• The EDG user community has actively contributed to the validation of the first and second EDG testbeds (Feb 2002 – Feb 2003)
• All four LHC experiments have run their software (at first in preliminary versions) to perform the basic operations supported by the testbed 1 features provided by the EDG middleware
• Validation included job submission (JDL), output retrieval, job status queries, basic data management operations (file replication, registration into replica catalogs), and checks for possible software dependency or incompatibility problems (e.g. missing libs, rpms)
• ATLAS, CMS and Alice have run intense production data challenges and stress tests during 2002 and 2003
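The validation cycle described here maps onto a short command sequence on an EDG User Interface node; the sketch below just prints the commands (the names are those of the EDG testbed tools, and the options and job-id handling are illustrative):

```python
# The basic validation cycle exercised by the experiments: submit a job
# described in JDL, poll its status, retrieve the output sandbox.
commands = [
    ["dg-job-submit", "myjob.jdl"],     # prints a Grid job identifier
    ["dg-job-status", "JOB_ID"],        # poll until the job reports Done
    ["dg-job-get-output", "JOB_ID"],    # fetch the std.out/std.err sandbox
]
for cmd in commands:
    print("$", " ".join(cmd))           # replace print with subprocess.run
                                        # on a real EDG User Interface node
```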

The CMS Stress Test
• CMS Monte Carlo production using the BOSS and IMPALA tools
  - Originally designed for submitting and monitoring jobs on a 'local' farm (e.g. PBS)
  - Modified to treat the Grid as a 'local farm'
• December 2002 to January 2003:
  - 250,000 events generated by job submission at 4 separate UIs
  - 2,147 event files produced
  - 500 GB of data transferred using automated grid tools during production, including transfers to and from the mass storage systems at CERN and Lyon
  - Efficiency of 83% for (small) CMKIN jobs, 70% for (large) CMSIM jobs

The CMS Stress Test – testbed resources

Site                 CE           CPUs   SE           Disk space (GB)
CERN                 lxshare0227  122    lxshare0393  100
                                         lxshare0384  1000 (=100*10)
CNAF                 testbed008   40+    grid007g     1000
RAL                  gppce05      16     gppse05      360
NIKHEF               tbn09        22     tbn03        35
Lyon                 ccgridli03   120    ccgridli08   400
                                         ccgridli07   200
Legnaro              cmsgrid001   50     cmsgrid002   513 (+513)
Padova               grid001      12     grid005      680
Ecole Polytechnique  polgrid1     4      polgrid2     220
Imperial College     gw39         16     fb00         450

CMS Stress Test: architecture of the system
• On the User Interface, IMPALA/BOSS create jobs from parameters held in the CMS RefDB and record them in the BOSS DB
• Jobs are submitted as JDL to the EDG Workload Management System, which locates the input data through the Replica Manager
• On the Worker Nodes, the CMS software runs with job output filtering and runtime monitoring, and registers the produced data
• CE/SE pairs at each site host the CMS software and the data (information is pushed or pulled between the components)

Main results and observations from the CMS work
• RESULTS
  - Could distribute and run the CMS software in the EDG environment
  - Generated ~250K events for physics with ~10,000 jobs in a 3-week period
• OBSERVATIONS
  - Were able to quickly add new sites to provide extra resources
  - Fast turnaround in bug fixing and installing new software
  - The test was labour intensive (since the software was developing and the overall system was initially fragile)
  - The new release EDG 2.0 should fix the major problems, providing a system suitable for full integration in distributed production

Earth Observation applications (WP 9)
• Global Ozone (GOME) satellite data processing and validation by KNMI, IPSL and ESA
• The DataGrid testbed provides a collaborative processing environment for 3 geographically distributed EO sites (Holland, France, Italy)

Earth Observation
ESA missions:
• about 100 GBytes of data per day (ERS 1/2)
• 500 GBytes per day for the next ENVISAT mission (2002)
DataGrid contributes to EO:
• enhance the ability to access high-level products
• allow reprocessing of large historical archives
• improve complex Earth-science applications (data fusion, data mining, modelling…)
(Federico Carminati, EU review presentation, 1 March 2002; source: L. Fusco, June 2001)

ENVISAT
• 3500 MEuro programme cost
• Launched on February 28, 2002
• 10 instruments on board
• 200 Mbps data rate to ground
• 400 TBytes of data archived/year
• ~100 "standard" products
• 10+ dedicated facilities in Europe
• ~700 approved science user projects

Earth Observation
• Two different GOME processing techniques will be investigated:
  - OPERA (Holland): tightly coupled, using MPI
  - NOPREGO (Italy): loosely coupled, using neural networks
• The results are checked by VALIDATION (France): satellite observations are compared against ground-based LIDAR measurements coincident in area and time

GOME Ozone Data Processing Model
• Level-1 data (raw satellite measurements) are analysed to retrieve actual physical quantities
• Level-2 data provide measurements of ozone within a vertical column of atmosphere at a given lat/lon location above the Earth's surface
• Coincident data consist of Level-2 data co-registered with LIDAR data (ground-based observations) and compared using statistical methods
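The "coincident in area and time" co-registration can be sketched as a window match between a Level-2 profile and a LIDAR measurement; the thresholds below are illustrative placeholders, not the project's actual criteria:

```python
from dataclasses import dataclass

@dataclass
class Measurement:
    lat: float      # degrees
    lon: float      # degrees
    time_h: float   # hours since some common epoch
    ozone: float    # column ozone value (arbitrary units here)

def coincident(level2: Measurement, lidar: Measurement,
               max_deg: float = 1.0, max_hours: float = 6.0) -> bool:
    """Co-registration test: same area and time, within illustrative windows."""
    return (abs(level2.lat - lidar.lat) <= max_deg
            and abs(level2.lon - lidar.lon) <= max_deg
            and abs(level2.time_h - lidar.time_h) <= max_hours)

# Pair a satellite profile with ground truth for the statistical comparison.
sat = Measurement(43.6, 1.4, 12.0, 305.0)
ground = Measurement(43.9, 1.2, 14.5, 298.0)
if coincident(sat, ground):
    print("validate: difference =", sat.ozone - ground.ozone)
```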

The EO data challenge: processing and validation of 1 year of GOME data
• Raw satellite data (Level 1) from the GOME instrument
• ESA / KNMI: process the raw GOME Level 1 data to ozone profiles (Level 2) with OPERA and NNO
• IPSL: validate the GOME ozone profiles against ground-based (LIDAR) measurements
• DataGrid: visualization

GOME Processing Steps (1-2)
• Step 1: transfer the Level 1 data to a Grid Storage Element
• Step 2: register the Level 1 data with the Replica Manager, replicating to other SEs if necessary
(The user submits from the User Interface; the Replica Manager records each replica in the Replica Catalog, with associated metadata, across the CE/SE pairs at the participating sites.)
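In catalogue terms, steps 1-2 boil down to recording a logical file name (LFN) against one or more physical replicas (PFNs). The class below is a hypothetical stand-in for the EDG Replica Manager bookkeeping, just to make the mapping concrete; the file names are invented:

```python
# Hypothetical, minimal model of steps 1-2: upload a file to an SE and
# record the LFN -> PFN mapping, then add replicas held on other SEs.
class ReplicaCatalog:
    def __init__(self):
        self.entries: dict[str, list[str]] = {}   # LFN -> [PFN, ...]

    def register(self, lfn: str, pfn: str) -> None:
        self.entries.setdefault(lfn, []).append(pfn)

    def replicas(self, lfn: str) -> list[str]:
        return self.entries.get(lfn, [])

catalog = ReplicaCatalog()
lfn = "lfn:gome-level1-1997-07-01"
catalog.register(lfn, "sfn://se01.nikhef.nl/gome/level1/1997-07-01.orbit")
catalog.register(lfn, "sfn://se02.ipsl.fr/gome/level1/1997-07-01.orbit")
print(catalog.replicas(lfn))   # both physical copies of the logical file
```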

GOME Processing Steps (3-4)
• Step 3: submit jobs to process the Level 1 data and produce Level 2 data
• Step 4: transfer the Level 2 data products to the Storage Element
(The JDL script goes from the User Interface to the Resource Broker, which searches the Information Index, resolves logical filenames (LFN) to physical filenames (PFN) via the Replica Catalog, and checks the user's certificate against the Certificate Authorities; the user can then request the job status and retrieve the results.)
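Step 3's job description would carry the data dependency, so the Resource Broker can steer the job towards a site already holding a replica. A sketch of such a JDL (field names in the style of EDG JDL; the file and catalogue addresses are illustrative):

```python
# Sketch of a step-3 job description: InputData lets the Resource Broker
# match the job to a site holding a replica of the Level 1 file.
# Field names are in the style of EDG JDL; all values are illustrative.
jdl = """\
Executable     = "opera_level1_to_level2.sh";
InputSandbox   = {"opera_level1_to_level2.sh"};
InputData      = "LF:gome-level1-1997-07-01";
ReplicaCatalog = "ldap://rc.example.org:9011/rc=GOME";
OutputSandbox  = {"std.out", "std.err"};
"""
with open("gome_step3.jdl", "w") as f:
    f.write(jdl)
# Submitted from the User Interface, e.g.:  dg-job-submit gome_step3.jdl
```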

GOME Processing Steps (5-6)
• Step 5: produce Level-2 / LIDAR coincident data and perform the VALIDATION
• Step 6: visualize the results

Biomedical Applications
• Genomics, post-genomics, and proteomics: explore strategies that facilitate the sharing of genomic databases, and test grid-aware algorithms for comparative genomics
• Medical image analysis: process the huge amount of data produced by digital imagers in hospitals
(Federico Carminati, EU review presentation, 1 March 2002)

Biology and Bio-informatics Applications
• The international community of biologists has a keen interest in using bio-informatics algorithms to perform research on the mapping of the human genomic code
• Biologists make use of large, geographically distributed databases of already mapped, identified protein sequences belonging to sequences of the human genetic code (DNA sequences)
• A typical goal of these algorithms is to analyse different databases, related to different samplings, to identify similarities or common parts
• dgBLAST (Basic Local Alignment Search Tool) is an example of such an application, seeking particular sequences of proteins or DNA in the genomic code

• Grid technology opens the perspective of large computational power and easy access to heterogeneous data sources
• A grid for health would provide a framework for sharing disk and computing resources, for promoting standards, and for fostering synergy between bio-informatics and medical informatics
• A first biomedical grid is being deployed by the DataGrid IST project
http://dbs.cordis.lu/fep-cgi/srchidadb?ACTION=D&SESSION=221592002-1018&DOC=27&TBL=EN_PROJ&RCN=EP_RCN_A:63345&CALLER=PROJ_IST

Biomedical requirements
• Large user community (thousands of users): anonymous / group login
• Data management: large volumes (a hospital can accumulate TBs of images in a year); data updates and data versioning
• Security: disk / network encryption
• High-priority jobs: privileged users
• Interactivity: communication between user interface and computation
• Parallelization: MPI site-wide / grid-wide; thousands of images operated on by tens of algorithms
• Pipeline processing: pipeline description language / scheduling
• Limited response time: fast queues

Diverse Users…
• Patient: has free access to own medical data
• Physician: has complete read access to patients' data; few persons have read/write access
• Researchers: may obtain read access to anonymous medical data for research purposes; nominative data should be blanked before transmission to these users
• Biologist: has free access to public databases; uses a web portal to access biology server services
• Chemical/pharmacological manufacturer: owns private data; needs to control the possible targets for data storage

…and data
• Biological data
  - Public and private databases
  - Very fast growth (doubles every 8-12 months)
  - Frequent updates (versioning)
  - Heterogeneous formats
• Medical data
  - Strong semantics
  - Distributed over imaging sites
  - Images and metadata

Web portals for biologists
• Biologists enter sequences through a web interface
• Pipelined execution of bio-informatics algorithms:
  - Genomics comparative analysis (thousands of files of ~GByte); a genome comparison takes days of CPU (~n^2)
  - Phylogenetics
  - 2D, 3D molecular structure of proteins…
• The algorithms are currently executed on local clusters
  - Big labs have big clusters…
  - But there is growing pressure on resources, and the Grid will help: more and more biologists compare larger and larger sequences (whole genomes)… to more and more genomes… with fancier and fancier algorithms!

Example Grid application for Biology: dgBLAST
• dgBLAST requires as input a given sequence (protein or DNA) to be searched, and a pointer to the set of databases to be queried
• Designed for high speed (a trade-off against sensitivity); uses a heuristic algorithm
• A score is assigned to every candidate sequence found, and the results are presented graphically
• Detects relationships among sequences which share only isolated regions of similarity
• blastn compares a nucleotide query sequence against a nucleotide sequence database
• blastp compares an amino-acid query sequence against a protein sequence database
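For orientation, the kind of search dgBLAST wraps looks like this on the command line, using the legacy NCBI blastall interface; the database and query file names are invented for the example:

```python
import subprocess

# Sketch of the searches dgBLAST distributes: a nucleotide query against a
# nucleotide database (blastn), and a protein query against a protein
# database (blastp), via the legacy NCBI 'blastall' command line.
def blast(program: str, database: str, query: str, out: str):
    subprocess.run(
        ["blastall", "-p", program, "-d", database, "-i", query, "-o", out],
        check=True,
    )

blast("blastn", "human_genome", "query_dna.fasta", "dna_hits.txt")
blast("blastp", "swissprot", "query_protein.fasta", "protein_hits.txt")
```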

The Visual DataGrid BLAST, a first genomics application on DataGrid
• A graphical interface to enter query sequences and select the reference database
• A script to execute the BLAST algorithm on the grid
• A graphical interface to analyse the results
• Accessible from the web portal genius.ct.infn.it


Other Medical Applications
• Complex modelling of anatomical structures: anatomical and functional models, parallelization
• Surgery simulation: realistic models, real-time constraints
• Simulation of MRIs: MRI modelling, artifact modelling, parallel simulation
• Mammography analysis: automatic pathology detection
• Shared and distributed data management: data hierarchy, dynamic indices, optimization, caching

Summary (1/2)
• HEP, EO and Biology users have a deep interest in the deployment and actual availability of the Grid, which boosts their computing power and data storage capacity in an unprecedented way
• They are currently evaluating the basic functionality of the tools and their integration into data processing schemes, and will move on to interactive analysis and more detailed interfacing via APIs
• Hopefully the experiments will do common work on interfacing applications to the Grid under the umbrella of LCG
  - The HEPCAL (Common Use Cases for a HEP Common Application Layer) work will be used as a basis for the integration of Grid tools into the LHC prototype
  - http://lcg.web.cern.ch/LCG/SC2/RTAG4
• There are many grid projects in the world and we must work together with them; in HEP, for example, DataTag, CrossGrid, NorduGrid and the US projects (GriPhyN, PPDG, iVDGL)

Summary (2/2)
Many challenging issues are facing us:
• strengthen effective massive productions on the EDG testbed
• keep up the pace with next-generation grid computing evolutions, implementing them or interfacing them to EDG
• further develop middleware components for all EDG workpackages to address growing user demands; EDG 2.0 will implement much new functionality

Acknowledgements and references
• Thanks to the following, who provided material and advice:
  - F Harris (WP8), I Augustin (WP8), J Linford (WP9), V Breton (WP10), J Montagnat (WP10), F Carminati (Alice), JJ Blaising (Atlas), C Grandi (CMS), M Frank (LHCb), N Brook (LHCb), P Hobson (CMS), L Robertson (LCG), D Duellmann (LCG/POOL), T Doyle (UK GridPP), O Maroney (RAL)
• Some interesting web sites and documents:
  - LHC Review: http://lhc-computing-review-public.web.cern.ch/lhc-computing-review-public/Public/Report_final.PDF (LHC Computing Review)
  - LCG: http://lcg.web.cern.ch/LCG/SC2/RTAG6 (model for regional centres); http://lcg.web.cern.ch/LCG/SC2/RTAG4 (HEPCAL Grid use cases)
  - GEANT: http://www.dante.net/geant/ (European research networks)
  - POOL: http://lcgapp.cern.ch/project/persist/
  - WP8: http://datagrid-wp8.web.cern.ch/DataGrid-WP8/; requirements: http://edmsoraweb.cern.ch:8001/cedar/doc.info?document_id=332409
  - WP9: http://styx.srin.esa.it/grid; requirements: http://edmsoraweb.cern.ch:8001/cedar/doc.info?document_id=332411
  - WP10: http://marianne.in2p3.fr/datagrid/wp10/; http://www.healthgrid.org; http://www.creatis.insa-lyon.fr/MEDIGRID/; requirements: http://edmsoraweb.cern.ch:8001/cedar/doc.info?document_id=332412