Evaluation, Best Practices and Collaboration for Multilingual Information Access
Latest News
CLEF 2010: Padua, Italy September 2010

TrebleCLEF workshop at eChallenges : Best Practices for Multilingual Information Access Istanbul,...


Workshop on Novel Methodologies for Evaluation in Information Retrieval

European Conference for Information Retrieval - ECIR '08
Glasgow, United Kingdom, 30 March 2008


Photos of the workshop

Information retrieval is an empirical science; the field cannot move forward without means of evaluating the innovations that researchers devise.
However, the methodologies conceived in the early years of IR, and still used in today's evaluation campaigns, are starting to show their age, and new research is emerging on how to overcome the twin challenges of scale and diversity.

The methodologies used to build test collections in modern evaluation campaigns were originally conceived for collections of tens of thousands of documents. They were found to scale well, but potential flaws are emerging as test collections grow beyond tens of millions of documents. Continued research in this area is crucial if IR research is to keep evaluating large-scale search.
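To make the test-collection methodology concrete, here is a minimal illustrative sketch (not taken from the workshop page): a system's ranked output for a topic is scored against a set of human relevance judgements. The measure shown is average precision; the document identifiers and variable names are hypothetical.

```python
# Sketch of test-collection scoring: compare one ranked run for a topic
# against the documents judged relevant for that topic.

def average_precision(ranking, relevant):
    """Mean of the precision values at each rank where a relevant
    document appears, divided by the total number of relevant documents."""
    hits = 0
    precisions = []
    for rank, doc in enumerate(ranking, start=1):
        if doc in relevant:
            hits += 1
            precisions.append(hits / rank)
    return sum(precisions) / len(relevant) if relevant else 0.0

# A toy topic: three judged-relevant documents, one system ranking.
ranking = ["d3", "d1", "d7", "d2", "d9"]
relevant = {"d1", "d2", "d5"}
print(round(average_precision(ranking, relevant), 3))  # prints 0.333
```

Averaging this score over all topics in a collection gives mean average precision, one of the standard figures reported in evaluation campaigns; the scaling concern above is that judging enough documents per topic becomes infeasible as collections reach tens of millions of documents.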

With the rise of the large Web search engines, some believed that all search problems could be solved by a single engine retrieving from one vast data store. However, it is increasingly clear that retrieval is evolving not towards a monolithic solution, but towards a wide range of solutions tailored to different classes of information and different groups of users or organizations. Each tailored system requires a different mixture of component technologies combined in distinct ways, and each solution requires evaluation.

Programme Committee

The Workshop Chair is Mark Sanderson. Co-organisers are Martin Braschler, Nicola Ferro and Julio Gonzalo.