Evaluating Cross-language Information Retrieval Systems
Carol Peters, IEI-CNR
SPINN Seminar, Copenhagen, 26-27 October 2001


Outline
• Why IR System Evaluation is Important
• Evaluation Programs
• An Example

What is an IR System Evaluation Campaign?
• An activity that tests the performance of different systems on a given task (or set of tasks) under standard conditions
• Permits contrastive analysis of approaches/technologies

How well does a system meet the information need?
• System evaluation: how good are the document rankings?
• User-based evaluation: how satisfied is the user?

Why We Need Evaluation
• Evaluation permits hypotheses to be validated and progress to be assessed
• Evaluation helps to identify areas where more R&D is needed
• Evaluation saves developers time and money
CLIR systems are still at the experimental stage, so evaluation is particularly important!

CLIR System Evaluation is Complex
CLIR systems are built by integrating components and technologies:
• need to evaluate single components
• need to evaluate overall system performance
• need to distinguish methodological aspects from linguistic knowledge

Technology vs. Usage Evaluation
Usage evaluation:
• shows the value of a technology for the user
• determines the technology thresholds that are indispensable for specific usage
• provides directions for the choice of criteria for technology evaluation
The influence of language and culture on the usability of technology needs to be understood.

Organising an Evaluation Activity
• select control task(s)
• provide data to test and tune systems
• define the protocol and metrics to be used in assessing results
The aim is an objective comparison between systems and approaches.

Test Collection
• Set of documents: must be representative of the task of interest; must be large
• Set of "topics": statements of user needs from which the system data structure (the query) is extracted
• Relevance judgments: judgments vary by assessor, but there is no evidence that the differences affect the comparative evaluation of systems

Using Pooling to Create Large Test Collections
• Assessors create topics
• A variety of different systems retrieve the top 1000 documents for each topic
• Pools of unique documents are formed from all submissions; the assessors judge these for relevance
• Systems are evaluated using the relevance judgments
(Ellen Voorhees – CLEF 2001 Workshop)
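As a rough illustration of the pooling step, here is a minimal sketch in Python; the data structures and the POOL_DEPTH constant are invented for illustration and are not CLEF's actual tooling:

```python
# Minimal sketch of pooling, assuming each run maps topic IDs to a ranked
# list of document IDs. Names here are illustrative, not CLEF's tooling.
from typing import Dict, List, Set

POOL_DEPTH = 1000  # top-N documents contributed by each run

def form_pools(runs: List[Dict[str, List[str]]]) -> Dict[str, Set[str]]:
    """Merge the top POOL_DEPTH documents of every submitted run into one
    set of unique documents per topic, which the assessors then judge."""
    pools: Dict[str, Set[str]] = {}
    for run in runs:
        for topic_id, ranked_docs in run.items():
            pools.setdefault(topic_id, set()).update(ranked_docs[:POOL_DEPTH])
    return pools

# Two toy runs for one topic: the pool is the union of their unique documents.
run_a = {"41": ["doc1", "doc2", "doc3"]}
run_b = {"41": ["doc2", "doc4"]}
print(form_pools([run_a, run_b]))  # pool for topic '41' holds the 4 unique docs
```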

Cross-language Test Collections
Consistency is harder to obtain than for monolingual collections:
• parallel or comparable document collections
• multiple assessors per topic creation and per relevance assessment (for each language)
• must take care when comparing evaluations across languages (e.g., a cross-language run to a monolingual baseline)
Pooling is harder to coordinate:
• need large, diverse pools for all languages
• retrieval results are not balanced across languages
(Taken from Ellen Voorhees – CLEF 2001 Workshop)

Evaluation Measures
• Recall: measures the ability of a system to find all relevant items
  recall = (no. of relevant items retrieved) / (no. of relevant items in the collection)
• Precision: measures the ability of a system to find only relevant items
  precision = (no. of relevant items retrieved) / (total no. of items retrieved)
A recall-precision graph is used to compare systems.
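A minimal sketch of the two measures, assuming a run is a list of retrieved document IDs and the relevance judgments are a set of relevant document IDs (the sample data is invented):

```python
# Recall and precision exactly as defined above.
def recall(retrieved: list, relevant: set) -> float:
    """Fraction of all relevant items in the collection that were retrieved."""
    return len(relevant.intersection(retrieved)) / len(relevant)

def precision(retrieved: list, relevant: set) -> float:
    """Fraction of the retrieved items that are relevant."""
    return len(relevant.intersection(retrieved)) / len(retrieved)

relevant = {"d1", "d3", "d7"}
retrieved = ["d1", "d2", "d3"]
print(recall(retrieved, relevant))     # 2/3: found two of three relevant items
print(precision(retrieved, relevant))  # 2/3: two of three retrieved are relevant
```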

Main CLIR Evaluation Programs
• TIDES: sponsors TREC (Text REtrieval Conferences) and TDT (Topic Detection and Tracking); Chinese-English tracks in 2000; TREC focusing on English/French to Arabic in 2001
• NTCIR: National Institute of Informatics, Tokyo; Chinese-English and Japanese-English C-L tracks
• AMARYLLIS: focused on French; the 1998-99 campaign included a C-L track; the 3rd campaign begins Sept. 2001
• CLEF: Cross-Language Evaluation Forum; C-L evaluation for European languages

Cross-Language Evaluation Forum
• Funded by the DELOS Network of Excellence for Digital Libraries and the US National Institute of Standards and Technology (2000-2001)
• Extension of the CLIR track at TREC (1997-1999)
• Coordination is distributed: national sites for each language in the multilingual collection

CLEF Partners (2000-2001)
• Eurospider, Zurich, Switzerland (Peter Schäuble, Martin Braschler)
• IEEC-UNED, Madrid, Spain (Felisa Verdejo, Julio Gonzalo)
• IEI-CNR, Pisa, Italy (Carol Peters)
• IZ Sozialwissenschaften, Bonn, Germany (Michael Kluck)
• NIST, Gaithersburg MD, USA (Donna Harman, Ellen Voorhees)
• University of Hildesheim, Germany (Christa Womser-Hacker)
• University of Twente, The Netherlands (Djoerd Hiemstra)

CLEF - Main Goals
Promote research by providing an appropriate infrastructure for:
• CLIR system evaluation, testing and tuning
• comparison and discussion of results
• building of test suites for system developers

CLEF 2001 Task Description
Four main evaluation tracks in CLEF 2001:
• multilingual information retrieval
• bilingual IR
• monolingual (non-English) IR
• domain-specific IR
plus
• an experimental track for interactive C-L systems

CLEF 2001 Data Collection
• Multilingual comparable corpus of news agency and newspaper documents in six languages (DE, EN, FR, IT, NL, SP); nearly 1 million documents
• Common set of 50 topics (from which queries are extracted), created in 9 European languages (DE, EN, FR, IT, NL, SP, plus FI, RU, SV) and 3 Asian languages (JP, TH, ZH)

CLEF 2001 Creating the Queries
Example topic:
• Title: European Industry
• Description: What factors damage the competitiveness of European industry on the world's markets?
• Narrative: Relevant documents discuss factors that render European industry and manufactured goods less competitive with respect to the rest of the world, e.g., North America or Asia. Relevant documents must report data for Europe as a whole rather than for single European nations.
Queries are extracted from topics, using one or more fields (see the sketch below).
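As a sketch of how a query is extracted from one or more topic fields; the dict-based topic representation below is an assumption for illustration, since actual CLEF topics are distributed in a tagged text format:

```python
# Sketch of query extraction from topic fields. The dict representation is
# illustrative only; real CLEF topics use a tagged format not shown here.
topic = {
    "title": "European Industry",
    "description": "What factors damage the competitiveness of "
                   "European industry on the world's markets?",
}

def make_query(topic: dict, fields=("title",)) -> str:
    """Concatenate the selected topic fields into the query text:
    title-only runs give short queries, title+description longer ones."""
    return " ".join(topic[f] for f in fields)

print(make_query(topic))                            # title-only run
print(make_query(topic, ("title", "description")))  # title+description run
```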

CLEF 2001 Creating the Queries
• Distributed activity (Bonn, Gaithersburg, Pisa, Hildesheim, Twente, Madrid)
• Each group produced 13-15 candidate topics: 1/3 local, 1/3 European, 1/3 international
• Topic selection at a meeting in Pisa (50 topics)
• Topics were created in DE, EN, FR, IT, NL, SP and additionally translated into SV, RU, FI and TH, JP, ZH
• Cleanup after topic translation

CLEF 2001 Multilingual IR
Topics in any of DE, EN, FR, IT, FI, NL, SP, SV, RU, ZH, JP or TH are fed to the participant's cross-language information retrieval system, which searches the English, German, French, Italian and Spanish document collections and returns one result list of DE, EN, FR, IT and SP documents ranked in decreasing order of estimated relevance.

CLEF 2001 Bilingual IR
Task: query the English or Dutch target document collection
Goal: retrieve documents for the target language, presenting results in a ranked list
An easier task for beginners!

CLEF 2001 Monolingual IR
Task: querying document collections in FR|DE|IT|NL|SP
Goal: acquire a better understanding of language-dependent retrieval problems
• different languages present different retrieval problems
• issues involved include word order, morphology, diacritic characters and language variants

CLEF 2001 Domain-Specific IR
Task: querying a structured database from a vertical domain (social sciences) in German
• German/English/Russian thesaurus and English translations of document titles
• monolingual or cross-language task
Goal: understand the implications of querying in a domain-specific context

CLEF 2001 Interactive C-L
Task: interactive document selection in an "unknown" target language
Goal: evaluation of results presentation rather than of system performance

CLEF 2001: Participation
34 participants from 15 different countries, spanning North America, Asia and Europe

Details of Experiments

Track              # Participants   # Runs/Experiments
Multilingual             8                 26
Bilingual to EN         19                 61
Bilingual to NL          3                  3
Monolingual DE          12                 25
Monolingual ES          10                 22
Monolingual FR           9                 18
Monolingual IT           8                 14
Monolingual NL           9                 19
Domain-specific          1                  4
Interactive              3                  6

Runs per Topic Language [chart not reproduced]

Topic Fields [chart not reproduced]

CLEF 2001 Participation
CMU; Eidetica; Eurospider*; Greenwich U; HKUST; Hummingbird; IAI*; IRIT*; ITC-irst*; JHU-APL*; Kasetsart U; KCSL Inc.; Medialab; Nara Inst. of Tech.; National Taiwan U; OCE Tech. BV; SICS/Conexor; SINAI/U Jaen; Thomson Legal*; TNO TPD*; U Alicante; U Amsterdam; U Exeter; U Glasgow*; U Maryland* (interactive only); U Montreal/RALI*; U Neuchâtel; U Salamanca*; U Sheffield* (interactive only); U Tampere*; U Twente (*); UC Berkeley (2 groups)*; UNED (interactive only)
(* = also participated in 2000)

CLEF 2001 Approaches
All the traditional approaches were used:
• commercial MT systems (Systran, Babelfish, Globalink Power Translator); both query and document translation were tried
• bilingual dictionary look-up (on-line and in-house tools), as sketched below
• aligned parallel corpora (web-derived)
• comparable corpora (similarity thesaurus)
• conceptual networks (EuroWordNet, ZH-EN wordnet)
• multilingual thesaurus (domain-specific task)
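To illustrate the dictionary look-up approach, here is a toy sketch; the mini German-English lexicon is invented for illustration, and real systems use much larger resources and must cope with translation ambiguity:

```python
# Toy sketch of bilingual dictionary look-up for query translation.
# The lexicon below is invented for illustration only.
DE_EN = {
    "industrie": ["industry"],
    "europaeische": ["european"],
    "wettbewerb": ["competition", "contest"],  # an ambiguous entry
}

def translate_query(terms: list, lexicon: dict) -> list:
    """Replace each source term by all its dictionary translations;
    terms not in the lexicon are kept as-is (often proper nouns)."""
    translated = []
    for term in terms:
        translated.extend(lexicon.get(term.lower(), [term]))
    return translated

print(translate_query(["Europaeische", "Industrie", "Wettbewerb"], DE_EN))
# ['european', 'industry', 'competition', 'contest']
```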

CLEF 2001 Techniques Tested
Text processing for multiple languages:
• Porter stemmer, Inxight commercial stemmer, on-site tools
  – simple generic "quick & dirty" stemming
  – language-independent stemming
• separate stopword lists vs. a single list
• morphological analysis
• n-gram indexing, word segmentation, decompounding (e.g., Chinese, German); see the sketch below
• use of NLP methods, e.g., phrase identification, morphosyntactic analysis
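A minimal sketch of character n-gram indexing, the language-independent alternative to stemming mentioned above; n=4 and the sample words are illustrative choices:

```python
# Sketch of language-independent character n-gram indexing; n=4 is an
# illustrative choice. N-grams need no stemmer and help with compounds.
from collections import Counter

def char_ngrams(text: str, n: int = 4) -> Counter:
    """Index each token as overlapping character n-grams, with '_' marking
    word boundaries, instead of applying a language-specific stemmer."""
    grams = Counter()
    for token in text.lower().split():
        padded = f"_{token}_"
        if len(padded) <= n:
            grams[padded] += 1
        else:
            grams.update(padded[i:i + n] for i in range(len(padded) - n + 1))
    return grams

# A German compound shares many n-grams with its head word, so the two
# match without explicit decompounding.
print(char_ngrams("Industriepolitik") & char_ngrams("Industrie"))
```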

CLEF 2001 Techniques Tested
Cross-language strategies included:
• integration of methods (MT, corpora and MRDs)
• a pivot language to translate from L1 to L2 (DE to FR, SP, IT via EN)
• an n-gram based technique to match untranslatable words
• pre- and post-translation pseudo-relevance feedback (query expanded with frequently co-occurring terms); see the sketch below
• vector-based semantic analysis (query expanded with semantically similar terms)
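A rough sketch of pseudo-relevance feedback as described above; the search function, the parameters and the frequency-based term selection are assumptions (real systems typically use weighted expansion, e.g. Rocchio):

```python
# Sketch of pseudo-relevance feedback. `search` is any function returning
# ranked documents as token lists; all names here are illustrative.
from collections import Counter
from typing import Callable, List

def prf_expand(query: List[str],
               search: Callable[[List[str]], List[List[str]]],
               top_docs: int = 10, add_terms: int = 5) -> List[str]:
    """Assume the top-ranked documents are relevant and add their most
    frequent new terms to the query. Applied before translation this
    strengthens the translation step; applied after, the target search."""
    pseudo_relevant = search(query)[:top_docs]
    counts = Counter(term for doc in pseudo_relevant for term in doc
                     if term not in query)
    return query + [term for term, _ in counts.most_common(add_terms)]

fake_index = [["industry", "exports", "competitiveness"],
              ["industry", "labour", "costs"]]
print(prf_expand(["industry"], lambda q: fake_index, top_docs=2))
# ['industry', 'exports', 'competitiveness', 'labour', 'costs']
```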

CLEF 2001 Techniques Tested
• Different strategies were tried for merging results across collections
• This remains an unsolved problem (two simple baselines are sketched below)
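Two of the simplest merging baselines, sketched under the assumption that each per-language run is a ranked list of (document ID, score) pairs; both are known to be weak because scores from different collections are not directly comparable, which is part of why merging stays open:

```python
# Sketch of two naive merging baselines for multilingual retrieval; the
# run format and sample data are illustrative.
from itertools import zip_longest
from typing import Dict, List, Tuple

Run = List[Tuple[str, float]]  # ranked (doc_id, score) pairs

def round_robin(runs: Dict[str, Run]) -> List[str]:
    """Interleave the lists: rank 1 from each language, then rank 2, ..."""
    merged = []
    for tier in zip_longest(*runs.values()):
        merged.extend(item for item in tier if item is not None)
    return [doc_id for doc_id, _ in merged]

def raw_score(runs: Dict[str, Run]) -> List[str]:
    """Pool everything and sort by the systems' raw scores."""
    pooled = [item for run in runs.values() for item in run]
    return [doc_id for doc_id, _ in sorted(pooled, key=lambda x: x[1],
                                           reverse=True)]

runs = {"EN": [("e1", 12.0), ("e2", 7.0)], "DE": [("d1", 0.9), ("d2", 0.8)]}
print(round_robin(runs))  # ['e1', 'd1', 'e2', 'd2']
print(raw_score(runs))    # ['e1', 'e2', 'd1', 'd2'] - biased by score scales
```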

CLEF 2001 Workshop
• Results of the CLEF 2001 campaign were presented at the Workshop, 3-4 September 2001, Darmstadt, Germany
• 50 researchers and system developers from academia and industry participated
• Working Notes containing preliminary reports and statistics on the CLEF 2001 experiments were distributed

CLEF 2001 vs. CLEF 2000
• Most participants were back
• Less MT, more corpus-based methods
• People have really started to try each other's ideas/methods:
  – corpus-based approaches (parallel web, alignments)
  – n-grams
  – combination approaches

"Effect" of CLEF
• Many more European groups
• Dramatic increase in work on stemming/decompounding (for languages other than English)
• Work on mining the web for parallel texts
• Work on merging (breakthrough still missing?)
• Work on combination approaches

CLEF 2002
Accompanying Measure under the IST programme (Contract No. IST-2000-31002), October 2001
CLEF Consortium: IEI-CNR, Pisa; ELRA/ELDA, Paris; Eurospider, Zurich; UNED, Madrid; NIST, USA; IZ Sozialwissenschaften, Bonn
Associated Members: University of Hildesheim, University of Twente, University of Tampere (?)

CLEF 2002 Task Description
Similar to CLEF 2001:
• multilingual information retrieval
• bilingual IR (not to English!)
• monolingual (non-English) IR
• domain-specific IR
• interactive track
Plus a feasibility study for a spoken document track (within DELOS; results to be reported at CLEF)
Possible coordination with AMARYLLIS

CLEF 2002 Schedule
• Call for Participation - November 2001
• Document release - 1 February 2002
• Topic release - 1 April 2002
• Runs received - 15 June 2002
• Results communicated - 1 August 2002
• Papers for Working Notes - 1 September 2002
• Workshop - 19-20 September 2002

Evaluation - Summing Up
• System evaluation is not a competition to find the best system
• Evaluation provides an opportunity to test, tune and compare approaches in order to improve system performance
• An evaluation campaign creates a community interested in examining the same issues and comparing ideas and experiences

Cross-Language Evaluation Forum
For further information see: http://www.clef-campaign.org
or contact: Carol Peters, IEI-CNR - E-mail: carol@iei.pi.cnr.it