Scientific Report No. ISR-13, Information Storage and Retrieval
Document Length (chapter)
E. M. Keen
Harvard University
Gerard Salton
Use, reproduction, or publication, in whole or in part, is permitted for any purpose of the United States Government.
Co-ordination of Two Terms

INPUT TEXT    AVERAGE   TOTAL       RELEVANT    NON-RELEVANT   RECALL*   PRECISION*
              LENGTH    DOCUMENTS   DOCUMENTS   DOCUMENTS      RATIO     RATIO
                        RETRIEVED   RETRIEVED   RETRIEVED

Title            7         623         111          512        56.1%     17.8%
Indexing 1      14       1,239         138        1,101        69.7%     11.1%
Indexing 2      22       2,029         162        1,867        81.8%      8.0%
Indexing 3      33       2,381         166        2,221        83.8%      7.0%
Abstract        60       2,820         166        2,654        83.8%      5.9%
Co-ordination of Four Terms

INPUT TEXT    AVERAGE   TOTAL       RELEVANT    NON-RELEVANT   RECALL*   PRECISION*
              LENGTH    DOCUMENTS   DOCUMENTS   DOCUMENTS      RATIO     RATIO
                        RETRIEVED   RETRIEVED   RETRIEVED

Title            7          47          29           18        14.6%     61.7%
Indexing 1      14         140          60           80        30.3%     42.9%
Indexing 2      22         295          87          208        43.9%     29.5%
Indexing 3      33         412          97          315        49.0%     23.5%
Abstract        60         524          92          432        46.5%     17.6%
* These ratios are computed using the aggregates ("micro" evaluation).
Figure 33.  Cranfield Project results comparing titles, abstracts, and indexing at three
levels of exhaustivity, using search term co-ordination of two and four terms,
Word Form Language 13a.
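
Because the ratios in Figure 33 are aggregate ("micro") values, the relevant and
retrieved document counts are summed over all search requests before the ratios are
formed: recall ratio = relevant documents retrieved / total relevant documents, and
precision ratio = relevant documents retrieved / total documents retrieved.  The short
Python sketch below reproduces the four-term figures from the tabulated counts; note
that the aggregate number of relevant documents is not given in the figure, so the
value of about 198 used here is an assumption inferred from the recall column
(e.g. 111 / 0.561 is approximately 198).

    # Sketch only (not part of the original report): recomputing the aggregate
    # ("micro") recall and precision ratios for the four-term table of Figure 33.
    # TOTAL_RELEVANT is an assumed value inferred from the recall column.

    TOTAL_RELEVANT = 198  # assumption: aggregate relevant documents, not shown in the figure

    # (input text, total documents retrieved, relevant documents retrieved)
    four_term_rows = [
        ("Title",       47, 29),
        ("Indexing 1", 140, 60),
        ("Indexing 2", 295, 87),
        ("Indexing 3", 412, 97),
        ("Abstract",   524, 92),
    ]

    for name, retrieved, relevant_retrieved in four_term_rows:
        recall = relevant_retrieved / TOTAL_RELEVANT   # aggregate recall ratio
        precision = relevant_retrieved / retrieved     # aggregate precision ratio
        print(f"{name:<11}  recall {recall:5.1%}   precision {precision:5.1%}")

Under that assumption the sketch gives 14.6% recall and 61.7% precision for titles, and
30.3% recall and 42.9% precision for Indexing 1, matching the figure to one decimal place.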