NIST Special Publication 500-215: The Second Text REtrieval Conference (TREC-2)
Appendix A: TREC-2 Results
National Institute of Standards and Technology
D. K. Harman

Results - HNC, Inc.
Summary Statistics
Run Number: HNCad1 - full, automatic
Num of Queries: 50
Total number of documents over all queries
    Retrieved: 50000
    Relevant: 10785
    Rel_ret: 6944
Recall Level Averages
Recall Precision
0.00 0.7750
0.10 0.5762
0.20 0.4934
0.30 0.4215
0.40 0.3536
0.50 0.2672
0.60 0.2041
0.70 0.1271
0.80 0.0575
0.90 0.0215
1.00 0.0007
Average precision over all relevant docs
    non-interpolated: 0.2787
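The table above reports interpolated precision at the eleven standard recall levels, averaged over the 50 topics, and the non-interpolated figure is average precision taken over all relevant documents. The following is a minimal per-topic sketch of how these two measures are conventionally computed; the function name, document ids, and relevance judgments are illustrative and not part of the TREC evaluation software.

    # Per-topic sketch of the two averages above; the ranking and judgments
    # are made-up examples, not TREC data.

    def recall_level_and_average_precision(ranking, relevant):
        """ranking: doc ids in retrieval order; relevant: set of relevant doc ids."""
        R = len(relevant)
        hits = 0
        prec_points = []        # (recall, precision) at each relevant doc retrieved
        for rank, doc in enumerate(ranking, start=1):
            if doc in relevant:
                hits += 1
                prec_points.append((hits / R, hits / rank))
        # Non-interpolated average precision: mean of the precision values at
        # each relevant document; unretrieved relevant docs contribute zero.
        avg_prec = sum(p for _, p in prec_points) / R if R else 0.0
        # Interpolated precision at the 11 standard recall levels: the maximum
        # precision achieved at any recall point at or beyond that level.
        levels = [i / 10 for i in range(11)]
        interp = [max([p for r, p in prec_points if r >= lvl], default=0.0)
                  for lvl in levels]
        return levels, interp, avg_prec

    levels, interp, ap = recall_level_and_average_precision(
        ranking=["d3", "d7", "d1", "d9", "d4", "d2"],
        relevant={"d3", "d1", "d2", "d8"},
    )
    for lvl, p in zip(levels, interp):
        print(f"{lvl:.2f}  {p:.4f}")
    print("non-interpolated average precision:", round(ap, 4))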
Fallout Level Averages (recall at increasing fallout levels)
Recall
0.4087
0.5261
0.5867
0.6272
0.6533
0.6681
0.6720
0.6720
0.6720
Document Level Averages
Precision
At 5 docs 0.5560
At 10 docs 0.5680
At 15 docs 0.5507
At 20 docs 0.5470
At 30 docs 0.5367
At 100 docs 0.4520
At 200 docs 0.3652
At 500 docs 0.2247
At 1000 docs 0.1389
R-Precision (precision after R docs retrieved, where R is the number of relevant documents)
    Exact: 0.3474
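Precision at a document cutoff is the fraction of the top k retrieved documents that are relevant, and R-precision applies the same count after exactly R documents, R being the topic's number of relevant documents; per-topic values are then averaged over the 50 topics. A minimal sketch under those definitions follows; the function names and data are illustrative, not the program used to produce these tables.

    # Per-topic sketch of precision at a cutoff and R-precision; data and
    # function names are illustrative, not the TREC evaluation program.

    def precision_at(ranking, relevant, k):
        """Fraction of the top-k retrieved documents that are relevant."""
        return sum(1 for doc in ranking[:k] if doc in relevant) / k

    def r_precision(ranking, relevant):
        """Precision after exactly R documents, R = number of relevant docs."""
        R = len(relevant)
        return precision_at(ranking, relevant, R) if R else 0.0

    ranking = ["d3", "d7", "d1", "d9", "d4", "d2"]
    relevant = {"d3", "d1", "d2", "d8"}
    for k in (5, 10):
        print(f"Precision at {k} docs:", precision_at(ranking, relevant, k))
    print("R-Precision:", r_precision(ranking, relevant))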
[Figure: Recall-Precision Curve - precision (0 to 1) versus recall (0 to 1)]

[Figure: Fallout-Recall Curve - recall (0 to 1) versus fallout x 100 (0 to 0.2)]

[Figure: Normal Deviate Fallout-Recall - the fallout-recall curve replotted with both proportions on normal deviate scales (deviates -3 to 3, corresponding to proportions 0.001 to 0.999)]
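Fallout is the proportion of the collection's non-relevant documents that have been retrieved, so the fallout-recall curve traces recall against fallout as the retrieval cutoff deepens; the normal deviate plot rescales both proportions through the inverse of the standard normal distribution, which is why 0.02, 0.16, 0.5, 0.84, and 0.98 line up with deviates -2 through 2 on its axes. A minimal sketch of both ideas, with made-up counts and illustrative function names:

    # Sketch of fallout and of the normal deviate (probit) scaling; counts and
    # names are made-up examples, not TREC data.
    from statistics import NormalDist

    def fallout(nonrel_retrieved, nonrel_total):
        """Proportion of the collection's non-relevant documents retrieved."""
        return nonrel_retrieved / nonrel_total

    def normal_deviate(p):
        """Inverse standard normal CDF, mapping a proportion to a z-score."""
        return NormalDist().inv_cdf(p)

    # One (fallout, recall) point at some retrieval cutoff.
    f = fallout(nonrel_retrieved=1000, nonrel_total=500000)   # 0.002
    r = 0.65                                                  # recall at that cutoff
    print("fallout x 100:", f * 100)       # 0.2, the right edge of the curve's x-axis
    print("normal deviates:", normal_deviate(r), normal_deviate(f))
    # On deviate axes, 0.5 -> 0, 0.84 -> about 1, 0.98 -> about 2, 0.999 -> about 3,
    # matching the paired tick labels on the normal deviate plot.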