SP500215
NIST Special Publication 500-215: The Second Text REtrieval Conference (TREC-2)
Appendix A: TREC-2 Results
National Institute of Standards and Technology
D. K. Harman
Results - University of California, Los Angeles
Summary Statistics
Run Number      uclaa1, category B, automatic
Num of Queries  50
Total number of documents over all queries
    Retrieved:  50000
    Relevant:   3929
    Rel_ret:    3204
Recall Level Averages
Recall    Precision
0.00      0.8126
0.10      0.6075
0.20      0.5292
0.30      0.4385
0.40      0.3861
0.50      0.3210
0.60      0.2767
0.70      0.2284
0.80      0.1646
0.90      0.0938
1.00      0.0163
Average precision over all relevant docs
    non-interpolated    0.3345
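The recall-level figures above follow the standard TREC conventions: interpolated precision at recall level r is the maximum precision observed at any recall of r or higher, and the non-interpolated average precision is the mean of the precision values at the rank of each relevant document (missed relevant documents contribute zero). A minimal single-query sketch of these two measures, using a made-up ranked list rather than any TREC data:

```python
def recall_level_averages(ranked, relevant, levels=None):
    """Interpolated precision at 11 recall levels plus non-interpolated
    average precision for a single query.

    ranked   -- list of document ids in retrieval order
    relevant -- set of relevant document ids
    """
    if levels is None:
        levels = [i / 10 for i in range(11)]  # 0.0, 0.1, ..., 1.0
    R = len(relevant)
    hits = 0
    points = []  # (recall, precision) measured at each relevant doc's rank
    for rank, doc in enumerate(ranked, start=1):
        if doc in relevant:
            hits += 1
            points.append((hits / R, hits / rank))
    # Non-interpolated average precision: relevant docs never retrieved
    # contribute zero, hence the division by R rather than len(points).
    avg_prec = sum(p for _, p in points) / R
    # Interpolated precision at level r = max precision at recall >= r.
    interp = {}
    for r in levels:
        candidates = [p for rec, p in points if rec >= r]
        interp[r] = max(candidates) if candidates else 0.0
    return interp, avg_prec
```

For example, with `ranked = ["d3", "d7", "d1", "d9", "d4"]` and `relevant = {"d3", "d1", "d4"}`, precision at the three relevant ranks is 1, 2/3, and 3/5, so the non-interpolated average precision is their mean. Averaging each figure over all 50 queries gives a table of the shape shown above.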
Fallout Level Averages
Fallout (x100)    Recall
0.00              0.3638
0.02              0.4872
0.04              0.5666
0.06              0.6114
0.08              0.6617
0.10              0.6944
0.12              0.7136
0.14              0.7335
0.16              0.7451
0.18              0.7604
0.20              0.7748
Document Level Averages
Precision
At 5 docs 0.5840
At 10 docs 0.5380
At 15 docs 0.4973
At 20 docs 0.4660
At 30 docs 0.4333
At 100 docs 0.3098
At 200 docs 0.2216
At 500 docs 0.1140
At 1000 docs 0.0641
R-Precision (precision after R docs retrieved, where R is the number of relevant documents)
    Exact    0.3629

[Figure: Recall-Precision Curve]
[Figure: Fallout-Recall Curve (recall vs. fallout x 100)]
[Figure: Normal Deviate plot, Fallout-Recall]
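The document-level measures and the fallout axis on the curves above can be sketched in a few lines. Precision at cutoff k counts relevant documents among the top k retrieved; R-Precision is precision at cutoff R; fallout is the fraction of the collection's non-relevant documents that have been retrieved. The function names and the tiny example below are illustrative, not the evaluation code used for TREC-2:

```python
def precision_at(ranked, relevant, k):
    """Precision after k documents retrieved."""
    return sum(1 for d in ranked[:k] if d in relevant) / k

def r_precision(ranked, relevant):
    """Precision after R docs retrieved, R = number of relevant docs."""
    return precision_at(ranked, relevant, len(relevant))

def fallout_at(ranked, relevant, k, collection_size):
    """Fraction of the collection's non-relevant documents found in the
    top k retrieved (hence the very small values plotted as fallout x 100)."""
    nonrel_retrieved = sum(1 for d in ranked[:k] if d not in relevant)
    return nonrel_retrieved / (collection_size - len(relevant))
```

With `ranked = ["a", "b", "c", "d"]`, `relevant = {"a", "c"}`, and a 10-document collection, precision at 2 docs and R-Precision are both 0.5, and fallout after 4 docs is 2/8 = 0.25. The appendix tables report each measure averaged over the 50 queries.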