000 04710nam a2200601 i 4500
001 6812850
003 IEEE
005 20200413152902.0
006 m eo d
007 cr cn |||m|||a
008 110618s2011 cau foab 000 0 eng d
020 _a9781598299724 (electronic bk.)
020 _z9781598299717 (pbk.)
024 7 _a10.2200/S00368ED1V01Y201105ICR019
_2doi
035 _a(CaBNVSL)gtp00548366
035 _a(OCoLC)742535693
040 _aCaBNVSL
_cCaBNVSL
_dCaBNVSL
050 4 _aZA3075
_b.H275 2011
082 0 4 _a025.04
_222
100 1 _aHarman, D. K.
_q(Donna K.)
245 1 0 _aInformation retrieval evaluation
_h[electronic resource] /
_cDonna Harman.
260 _aSan Rafael, Calif. (1537 Fourth Street, San Rafael, CA 94901 USA) :
_bMorgan & Claypool,
_cc2011.
300 _a1 electronic text (x, 107 p.) :
_bdigital file.
490 1 _aSynthesis lectures on information concepts, retrieval, and services,
_x1947-9468 ;
_v# 19
538 _aMode of access: World Wide Web.
538 _aSystem requirements: Adobe Acrobat Reader.
500 _aPart of: Synthesis digital library of engineering and computer science.
500 _aSeries from website.
504 _aIncludes bibliographical references (p. 87-105).
505 0 _a1. Introduction and early history -- Introduction -- The Cranfield tests -- The MEDLARS evaluation -- The SMART system and early test collections -- The Comparative Systems Laboratory at Case Western Reserve University -- Cambridge and the "Ideal" Test Collection -- Additional work in metrics up to 1992 --
505 8 _a2. "Batch" Evaluation Since 1992 -- 2.1. Introduction -- 2.2. The TREC evaluations -- 2.3. The TREC ad hoc tests (1992-1999) -- Building the ad hoc collections -- Analysis of the ad hoc collections -- The TREC ad hoc metrics -- 2.4. Other TREC retrieval tasks -- Retrieval from "noisy" text -- Retrieval of non-English documents -- Very large corpus, web retrieval, and enterprise searching -- Domain-specific retrieval tasks -- Pushing the limits of the Cranfield model -- 2.5. Other evaluation campaigns -- NTCIR -- CLEF -- INEX -- 2.6. Further work in metrics -- 2.7. Some advice on using, building and evaluating test collections -- Using existing collections -- Subsetting or modifying existing collections -- Building and evaluating new ad hoc collections -- Dealing with unusual data -- Building web data collections --
505 8 _a3. Interactive Evaluation -- Introduction -- Early work -- Interactive evaluation in TREC -- Case studies of interactive evaluation -- Interactive evaluation using log data --
505 8 _a4. Conclusion -- Introduction -- Some thoughts on how to design an experiment -- Some recent issues in evaluation of information retrieval -- A personal look at some future challenges -- Bibliography -- Author's biography.
506 1 _aAbstract freely available; full-text restricted to subscribers or individual document purchasers.
510 0 _aCompendex
510 0 _aINSPEC
510 0 _aGoogle Scholar
510 0 _aGoogle Book Search
520 3 _aEvaluation has always played a major role in information retrieval, with early pioneers such as Cyril Cleverdon and Gerard Salton laying the foundations for most of the evaluation methodologies in use today. The retrieval community has been extremely fortunate to have such a well-grounded evaluation paradigm during a period when most of the human language technologies were just developing. This lecture explains where these evaluation methodologies came from and how they have continued to adapt to the vastly changed environment in the search engine world today. The lecture starts with a discussion of the early evaluation of information retrieval systems, beginning with the Cranfield testing in the early 1960s, continuing with the Lancaster "user" study for MEDLARS, and presenting the various test collection investigations by the SMART project and by groups in Britain.
530 _aAlso available in print.
588 _aTitle from PDF t.p. (viewed on June 18, 2011).
650 0 _aInformation retrieval
_xEvaluation.
650 0 _aInformation storage and retrieval systems
_xEvaluation.
653 _aEvaluation
653 _aTest collections
653 _aInformation retrieval
653 _aCranfield paradigm
653 _aTREC
776 0 8 _iPrint version:
_z9781598299717
830 0 _aSynthesis digital library of engineering and computer science.
830 0 _aSynthesis lectures on information concepts, retrieval, and services,
_x1947-9468 ;
_v# 19.
856 4 2 _3Abstract with links to resource
_uhttp://ieeexplore.ieee.org/servlet/opac?bknumber=6812850
999 _c561852
_d561852