multipla-project.org
talks [MULTIPLA]
http://www.multipla-project.org/talks.html
Crossing the Boundaries of Domains and Languages. Philipp Cimiano, Towards the Multilingual Semantic Web. Invited talk at the Seminar on New Trends in Intelligent Systems and Soft Computing, University of Granada, February 2011. Philipp Sorg, Cross-lingual Information Retrieval based on Multiple Indexes. Talk at the CLEF Workshop, Corfu, Greece, October 2009. Philipp Sorg, Explicit vs. Latent Concept Models for Cross-Language Information Retrieval. Talk at the IJCAI 2009 Conference, Pasadena, CA, July 2009.
clef-clsr.umiacs.umd.edu
Cross-Language Speech Retrieval Track at CLEF 2007
https://clef-clsr.umiacs.umd.edu/index.html
Registration for CLEF 2007 is now open. Clarification on run submissions for the English task: for the Searching English task, you should submit runs for ALL 105 topics. Of these, 63 (63 qid.txt) are the 2006 training topics for which QRELS are available (these 63 topics should be used for system tuning); 33 (33 qid.txt) are the 2006 testing topics for which QRELS are available (you must NOT use these 33 queries for system tuning); and the remaining 9 topics have no QRELS available. Participation in t...
clef-clsr.umiacs.umd.edu
Cross-Language Speech Retrieval Track at CLEF 2007
https://clef-clsr.umiacs.umd.edu/data.html
For more details on how to participate and get the collections, see the Guidelines. An English and a Czech collection are used in the track this year. The format of each interview is consistent: each document carries a document identifier of the form VHF[IntCode]-[SegId].[SequenceNum], metadata about the entire interview, the full name of every person mentioned, thesaurus keywords assigned to the segment, an ASR transcript produced in 2003, and an ASR transcript produced in 2004. The Czech interviews use a n...
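The identifier scheme above can be illustrated with a small sketch. This is not code from the track itself; the regex, field names, and the sample identifier are assumptions made purely to show how a `VHF[IntCode]-[SegId].[SequenceNum]` string might be split into its three components.

```python
import re

# Pattern for CL-SR document identifiers of the form
# VHF[IntCode]-[SegId].[SequenceNum], as described in the data page.
# Treating all three fields as digit runs is an assumption.
DOCID_RE = re.compile(r"^VHF(?P<int_code>\d+)-(?P<seg_id>\d+)\.(?P<seq_num>\d+)$")

def parse_docid(docid: str) -> dict:
    """Split a CL-SR-style document identifier into its components."""
    m = DOCID_RE.match(docid)
    if m is None:
        raise ValueError(f"not a CL-SR document id: {docid!r}")
    return {name: int(value) for name, value in m.groupdict().items()}

# Hypothetical identifier, used only to illustrate the format:
print(parse_docid("VHF12345-067890.001"))
# {'int_code': 12345, 'seg_id': 67890, 'seq_num': 1}
```

A parser like this would let a retrieval system group segments by interview (`int_code`) or order them within an interview (`seq_num`), under the stated assumptions about the field contents.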
clef-clsr.umiacs.umd.edu
Cross-Language Speech Retrieval Track at CLEF 2006
https://clef-clsr.umiacs.umd.edu/2006
Website for CLEF-CLSR 2006. The goal of the CLEF Cross-Language Speech Retrieval (CL-SR) track is to develop and evaluate systems for ranked retrieval of spontaneous conversational speech. Participation in the track is very easy: at a minimum, teams can treat it as a simple CLIR (or even monolingual IR) task. One of our central goals is to create a community that has both interest in and experience with IR in collections of spontaneous conversational speech. Contact: Oard at umd.edu.
clef-clsr.umiacs.umd.edu
Cross-Language Speech Retrieval Track at CLEF 2005
https://clef-clsr.umiacs.umd.edu/2005
April 15, 2005: Version 2.0 of the CLSR Collection is now available; click here for more info. If you are experiencing any problems with the release, try our FAQs. February 15, 2005: Version 1.0 of the CLSR Collection is now available. The goal of the CLEF Cross-Language Speech Retrieval (CLSR) track is to develop and evaluate systems for ranked retrieval of spontaneous conversational speech. Contacts: Oard at umd.edu; Gareth.Jones at computing.dcu.ie; mailing list available at clef-clsr (at) umiacs.umd.edu.
koelle.blog.uni-hildesheim.de
Publications ("Veröffentlichungen") - Ralph Koelle's Weblog & Homepage
http://koelle.blog.uni-hildesheim.de/veroffentlichungen
Ralph Koelle's Weblog and Homepage. Ralph Koelle's Blog @ Uni Hildesheim. Dec 2nd, 2013 by koelle. Matthias Maifarth, Joachim Griesbaum, Ralph Kölle (2013). Mobile device usage in higher education. In: Claudia Bremer; Detlef Krömker (eds.), E-Learning zwischen Vision und Alltag (Medien in der Wissenschaft, vol. 64), Münster et al.: Waxmann, 332-337. Clemens Roth, Joachim Griesbaum, Ralph Kölle (2013). Was bedeutet 11.882 Bearbeitungen? ["What do 11,882 edits mean?"] Nadine Pietras, Ralph Kölle, Joachim Griesbaum (2013). Mehrwerte von Ad...
promise-noe.eu
CLEF
http://www.promise-noe.eu/clef-conference-series;jsessionid=03E33C7F7A55D0BF4CD1D908EE1CA108
The Cross-Language Evaluation Forum (CLEF) promotes R&D in multilingual information access by: developing an infrastructure for the testing, tuning and evaluation of information retrieval systems operating on European languages in both monolingual and cross-language contexts; and creating test suites of reusable data which can be employed by system developers for benchmarking purposes. http://www.clef-campaign.org/.
clef-qa.fbk.eu
Links - CLEF 2008
http://clef-qa.fbk.eu/2008/links.html
Evaluation Best Practice and Collaboration for Multilingual Information Access. Center for the Evaluation of Language and Communication Technologies. Fondazione Bruno Kessler - Centro per la ricerca scientifica e tecnologica. Istituto di Scienza e Tecnologie dell'Informazione "A. Faedo". National Institute of Standards and Technology.