NIST Special Publication 500-215: The Second Text REtrieval Conference (TREC-2)
Appendix A: TREC-2 Results
National Institute of Standards and Technology
D. K. Harman
Results - ConQuest Software, Inc.
Summary Statistics
Run Number: CnQst1-full, manual
Num of Queries: 50
Total number of documents over all queries
    Retrieved: 50000
    Relevant:  10185
    Rel_ret:    6494
Document Level Averages
                  Precision
    At    5 docs   0.4480
    At   10 docs   0.4360
    At   15 docs   0.4427
    At   20 docs   0.4310
    At   30 docs   0.4393
    At  100 docs   0.3954
    At  200 docs   0.2997
    At  500 docs   0.1962
    At 1000 docs   0.1299

Recall Level Averages
    Recall    Precision
     0.00      0.6831
     0.10      0.4552
     0.20      0.4010
     0.30      0.3538
     0.40      0.2951
     0.50      0.2361
     0.60      0.1807
     0.70      0.1147
     0.80      0.0725
     0.90      0.0454
     1.00      0.0054

Average precision (non-interpolated) over all relevant docs: 0.2343
R-Precision (precision after R docs retrieved, where R is the number of relevant documents): Exact 0.2869
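For reference, these averages follow the standard conventions of the TREC evaluation software; the formulas below are a sketch of those conventions, not text from the original appendix. Let rel(i) = 1 if the document at rank i is relevant (0 otherwise), n be the number of documents retrieved, and R be the number of relevant documents for a topic:

\[
P@k = \frac{1}{k}\sum_{i=1}^{k}\mathrm{rel}(i), \qquad
\mathrm{AvgP} = \frac{1}{R}\sum_{k=1}^{n}\mathrm{rel}(k)\,P@k, \qquad
\text{R-Precision} = P@R .
\]

The interpolated precision reported at each recall level r is \(\max_{r' \ge r} P(r')\), the highest precision achieved at any recall of at least r.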
Fallout Level Averages
    Fallout (x100)    Recall
     0.000             0.3320
     0.020             0.4506
     0.040             0.5406
     0.060             0.5993
     0.080             0.6400
     0.100             0.6696
     0.120             0.6763
     0.140             0.6763
     0.160             0.6763
     0.180             0.6763
     0.200             0.6763

[Figure: Normal Deviate - Fallout-Recall (recall vs. fallout, both axes on a normal-deviate scale)]
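Fallout, tabulated above and plotted in the normal-deviate figure and in the fallout-recall curve below, is the fraction of the collection's non-relevant documents that have been retrieved. As a sketch of the usual definition (assumed, not quoted from this appendix):

\[
\mathrm{Fallout}@k = \frac{\#\{\text{non-relevant documents in the top }k\}}{\#\{\text{non-relevant documents in the collection}\}},
\]

with the normal-deviate plot showing \(\Phi^{-1}(\mathrm{Recall})\) against \(\Phi^{-1}(\mathrm{Fallout})\), where \(\Phi^{-1}\) is the inverse standard normal CDF.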
[Figure: Recall-Precision Curve (precision vs. recall; both axes 0 to 1)]
[Figure: Fallout-Recall Curve (recall vs. fallout x 100; fallout axis 0 to 0.2)]