NIST Special Publication 500-215: The Second Text REtrieval Conference (TREC-2)
Appendix A: TREC-2 Results
National Institute of Standards and Technology
D. K. Harman
CLARIT - Carnegie Mellon University
Summary Statistics
Run Number: CLARTM - full, manual
Num of Queries: 50
Total number of documents over all queries
    Retrieved: 50000
    Relevant:  10489
    Rel_ret:   6785
Document Level Averages

    Precision
    At 5 docs:    0.6120
    At 10 docs:   0.5940
    At 15 docs:   0.5800
    At 20 docs:   0.5700
    At 30 docs:   0.5527
    At 100 docs:  0.4396
    At 200 docs:  0.3436
    At 500 docs:  0.2176
    At 1000 docs: 0.1357

    R-Precision (precision after R docs retrieved, where R is the
    number of relevant documents)
    Exact: 0.3642

Recall Level Averages

    Recall  Precision
    0.00    0.7191
    0.10    0.5825
    0.20    0.4987
    0.30    0.4505
    0.40    0.3848
    0.50    0.3295
    0.60    0.2739
    0.70    0.2109
    0.80    0.1683
    0.90    0.0951
    1.00    0.0126

    Average precision (non-interpolated) over all relevant docs: 0.3302
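The measures tabulated above can be sketched per query as follows. This is a hypothetical helper (not code from the TREC evaluation software), assuming a ranked list of document ids and the set of documents judged relevant for one topic:

```python
def evaluate(ranked, relevant):
    """Per-query evaluation measures as tabulated above.

    ranked   -- document ids in rank order (TREC runs submit up to 1000)
    relevant -- set of ids judged relevant; R = len(relevant)
    """
    R = len(relevant)
    hits = 0
    sum_prec = 0.0   # precision accumulated at each relevant document's rank
    prec_at = {}     # document-level averages: precision at fixed cutoffs
    for i, doc in enumerate(ranked, start=1):
        if doc in relevant:
            hits += 1
            sum_prec += hits / i
        if i in (5, 10, 15, 20, 30, 100, 200, 500, 1000):
            prec_at[i] = hits / i
    # R-Precision: precision after R documents retrieved
    r_prec = sum(1 for d in ranked[:R] if d in relevant) / R
    # Non-interpolated average precision over all relevant docs
    # (relevant documents never retrieved contribute zero)
    avg_prec = sum_prec / R
    return prec_at, r_prec, avg_prec
```

The figures printed in this appendix are these per-query values averaged over the query set.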
Fallout Level Averages

    Fallout  Recall
    0.000    0.3602
    0.020    0.4615
    0.040    0.5383
    0.060    0.5783
    0.080    0.6143
    0.100    0.6371
    0.120    0.6611
    0.140    0.6781
    0.160    0.6946
    0.180    0.7071
    0.200    0.7199

[Figure: NORMAL DEVIATE - Fallout-Recall plot; both axes run from -3 to 3 in normal deviates, labeled with the corresponding probabilities 0.001, 0.02, 0.16, 0.5, 0.84, 0.98, 0.999]
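The fallout levels and the normal-deviate axes rest on two standard definitions: fallout is the fraction of the collection's non-relevant documents that were retrieved, and the normal-deviate plot applies the inverse normal CDF to fallout and recall. A minimal sketch under those definitions (the numbers in the usage below are illustrative, not taken from this run):

```python
from statistics import NormalDist

def fallout(retrieved, rel_retrieved, relevant, collection_size):
    """Fallout: non-relevant retrieved / non-relevant in the collection."""
    return (retrieved - rel_retrieved) / (collection_size - relevant)

def normal_deviate(p):
    """Inverse normal CDF, used to rescale the plot's axes (0 < p < 1)."""
    return NormalDist().inv_cdf(p)
```

This is why the plot's axis labels pair each probability with a deviate: 0.16 falls near -1, 0.5 at 0, 0.84 near +1, 0.98 near +2, and so on.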
[Figure: Recall-Precision Curve; Precision (0 to 1) plotted against Recall (0 to 0.8)]
[Figure: Fallout-Recall Curve; Recall (0 to 1) plotted against Fallout x 100 (0 to 0.2)]