NIST Special Publication 500-215: The Second Text REtrieval Conference (TREC-2)
Appendix A: TREC-2 Results
National Institute of Standards and Technology
D. K. Harman
Results - Verity, Inc.
Summary Statistics

Run Number: TOPIC2-full1 (manual)
Num of Queries: 50
Total number of documents over all queries
    Retrieved: 31624
    Relevant: 10185
    Rel_ret: 5614
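Read together, these totals give the run's overall recall: Rel_ret / Relevant = 5614 / 10185 ≈ 0.5512 of the judged-relevant documents appear somewhere in the retrieved sets.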
Recall Level Averages

Recall    Precision
 0.00      0.8491
 0.10      0.5786
 0.20      0.4943
 0.30      0.3906
 0.40      0.2774
 0.50      0.2062
 0.60      0.1221
 0.70      0.0663
 0.80      0.0353
 0.90      0.0137
 1.00      0.0000
Average precision over all relevant docs
    non-interpolated: 0.2464
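These are the standard TREC measures: interpolated precision at a recall level is the highest precision observed at any recall at or above that level, and non-interpolated average precision is the mean of the precision values at each relevant document retrieved (unretrieved relevant documents count as zero). The table reports means over the 50 topics. A minimal sketch of the per-topic computation, using hypothetical toy data rather than this run's output (trec_eval is the official tool):

    def recall_precision_points(ranking, relevant):
        """(recall, precision) after each relevant document is retrieved."""
        points, hits = [], 0
        for rank, doc in enumerate(ranking, start=1):
            if doc in relevant:
                hits += 1
                points.append((hits / len(relevant), hits / rank))
        return points

    def interpolated_precision(points, level):
        """Highest precision at any recall >= level (0.0 if never reached)."""
        return max((p for r, p in points if r >= level), default=0.0)

    def average_precision(points, total_relevant):
        """Mean precision over all relevant docs; missed docs contribute 0."""
        return sum(p for _, p in points) / total_relevant

    # Hypothetical toy data: a 10-document ranking, 4 relevant documents.
    ranking = ["d3", "d7", "d1", "d9", "d2", "d8", "d5", "d4", "d6", "d0"]
    relevant = {"d3", "d1", "d2", "d4"}
    points = recall_precision_points(ranking, relevant)
    for level in [i / 10 for i in range(11)]:  # 0.00, 0.10, ..., 1.00
        print(f"{level:.2f}  {interpolated_precision(points, level):.4f}")
    print(f"non-interpolated avg precision: "
          f"{average_precision(points, len(relevant)):.4f}")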
Document Level Averages
Precision
At 5 docs 0.6280
At 10 docs 0.5880
At 15 docs 0.5827
At 20 docs 0.5680
At 30 docs 0.5407
At 100 docs 0.4624
At 200 docs 0.3347
At 500 docs 0.2002
At 1000 docs 0.1123
R-Precision (precision after R docs retrieved, where R is the number of relevant documents)
    Exact: 0.3211
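The cutoff measures above have equally direct definitions: precision at k docs is the fraction of the top k retrieved documents that are relevant, and R-Precision sets k to R, the per-topic number of relevant documents. A sketch with the same hypothetical toy data as above:

    def precision_at(ranking, relevant, k):
        """Fraction of the first k retrieved documents that are relevant."""
        return sum(1 for doc in ranking[:k] if doc in relevant) / k

    def r_precision(ranking, relevant):
        """Precision after R docs retrieved, R = number of relevant docs."""
        return precision_at(ranking, relevant, len(relevant))

    ranking = ["d3", "d7", "d1", "d9", "d2", "d8", "d5", "d4", "d6", "d0"]
    relevant = {"d3", "d1", "d2", "d4"}
    print(precision_at(ranking, relevant, 5))  # 3 of top 5 relevant: 0.6
    print(r_precision(ranking, relevant))      # R = 4, top 4 hold 2: 0.5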
[Figure: Recall-Precision Curve. Precision (y-axis, 0 to 1) plotted against Recall (x-axis, 0 to 0.8).]
Fallout Level Averages

Fallout       Recall
[OCRerr]      0.3744
[OCRerr]      0.4689
[OCRerr]      0.5232
[OCRerr]      0.5446
[OCRerr]      0.5512
[OCRerr]      0.5512
[OCRerr]      0.5512
[OCRerr]      0.5512
[OCRerr]      0.5512
[OCRerr]      0.5512
[OCRerr]      0.5512
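Fallout is the fraction of the collection's nonrelevant documents that have been retrieved; recall climbs with fallout until it plateaus at the run's overall recall (the 0.5512 computed above). The normal-deviate plot maps both recall and fallout through the inverse normal CDF, which is why its tick labels pair probabilities from 0.001 to 0.999 with deviates from -3 to 3. A sketch of that mapping, assuming Python's statistics.NormalDist rather than anything in the original evaluation code:

    from statistics import NormalDist

    def fallout(nonrelevant_retrieved, nonrelevant_total):
        """Fraction of all nonrelevant documents that were retrieved."""
        return nonrelevant_retrieved / nonrelevant_total

    def normal_deviate(p):
        """Inverse normal CDF, the scale used on both plot axes."""
        return NormalDist().inv_cdf(p)

    # Each deviate tick on the plot pairs with a probability label:
    for z in range(-3, 4):
        print(z, round(NormalDist().cdf(z), 3))
    # -3 0.001, -2 0.023, -1 0.159, 0 0.5, 1 0.841, 2 0.977, 3 0.999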
[Figure: Fallout-Recall Curve. Recall (y-axis, 0 to 0.6) vs. Fallout x 100 (x-axis, 0 to 0.2).]

[Figure: Normal Deviate - Fallout-Recall. The same data on normal-deviate axes (-3 to 3), with tick labels giving the corresponding probabilities (0.001 to 0.999).]