NIST Special Publication 500-215: The Second Text REtrieval Conference (TREC-2)
Appendix A: TREC-2 Results
National Institute of Standards and Technology
D. K. Harman
Results - Cornell University
Summary Statistics
Run Number: crnlV2-full1 (automatic)
Num of Queries: 50
Total number of documents over all queries
  Retrieved: 50000
  Relevant: 10785
  Rel_ret: 8018
Recall Level Averages
Recall Precision
0.00 0.7644
0.10 0.5209
0.20 0.4739
0.30 0.4388
0.40 0.3971
0.50 0.3544
0.60 0.2950
0.70 0.2241
0.80 0.1356
0.90 0.0661
1.00 0.0076
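The precision figures above are interpolated values at the 11 standard recall levels: the interpolated precision at recall r is the maximum precision achieved at any recall point at or above r. A minimal sketch of that computation, with a toy ranked list rather than data from this run (the function name and example documents are illustrative assumptions):

```python
def interpolated_precision_at_recall_levels(ranked, relevant, levels=None):
    """Interpolated precision at standard recall levels.

    ranked: list of document ids in rank order.
    relevant: set of relevant document ids for the query.
    """
    if levels is None:
        levels = [i / 10 for i in range(11)]  # 0.0, 0.1, ..., 1.0
    total_rel = len(relevant)

    # (recall, precision) after each retrieved document
    points = []
    hits = 0
    for i, doc in enumerate(ranked, start=1):
        if doc in relevant:
            hits += 1
        points.append((hits / total_rel, hits / i))

    # interpolated precision at r = max precision at any recall >= r
    result = []
    for r in levels:
        candidates = [p for (rec, p) in points if rec >= r]
        result.append(max(candidates) if candidates else 0.0)
    return result

# toy example: 2 relevant docs among 4 retrieved
prec = interpolated_precision_at_recall_levels(
    ["d1", "d2", "d3", "d4"], {"d1", "d3"})
```

Interpolation is why the table is monotonically non-increasing down the recall levels.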
Average precision over all relevant docs
  non-interpolated: 0.3163
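Non-interpolated average precision is the mean of the precision values at the rank of each relevant document, with relevant documents that are never retrieved contributing zero. A sketch under that definition (toy data, not from the crnlV2-full1 run):

```python
def average_precision(ranked, relevant):
    """Non-interpolated average precision over all relevant docs.

    Relevant documents absent from the ranking contribute 0,
    because we divide by the total number of relevant docs.
    """
    hits = 0
    precision_sum = 0.0
    for i, doc in enumerate(ranked, start=1):
        if doc in relevant:
            hits += 1
            precision_sum += hits / i
    return precision_sum / len(relevant)

# toy example: relevant docs found at ranks 1 and 3
ap = average_precision(["d1", "d2", "d3", "d4"], {"d1", "d3"})
```

Averaging this per-query value over all 50 queries gives the single figure reported above.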
Level Averages
  Recall
  0.4516
  0.5815
  0.6605
  0.7148
  0.7514
  0.7675
  0.7713
  0.7713
  0.7713
Document Level Averages
  Precision
At 5 docs 0.5480
At 10 docs 0.5220
At 15 docs 0.5187
At 20 docs 0.5130
At 30 docs 0.5013
At 100 docs 0.4434
At 200 docs 0.3789
At 500 docs 0.2529
At 1000 docs 0.1604
R-Precision (precision after R docs retrieved, where R is the number of relevant documents)
  Exact: 0.3640
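R-Precision, as defined above, is simply the precision at cutoff R, where R is the number of relevant documents for the query. A minimal sketch with illustrative toy data (names and documents are assumptions, not from this run):

```python
def r_precision(ranked, relevant):
    """Precision after R documents retrieved, R = number of relevant docs."""
    r = len(relevant)
    top_r = ranked[:r]
    return sum(1 for doc in top_r if doc in relevant) / r

# toy example: R = 2, and only one of the top 2 docs is relevant
rp = r_precision(["d1", "d2", "d3", "d4"], {"d1", "d3"})
```

Because the cutoff adapts to each query's number of relevant documents, R-Precision avoids the bias that a fixed cutoff has against queries with many (or few) relevant documents.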
[Figure: Normal Deviate Fallout-Recall plot]
[Figure: Recall-Precision Curve (Precision vs. Recall)]
[Figure: Fallout-Recall Curve (Recall vs. Fallout x 100)]
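The fallout axis of the fallout-recall curve measures the fraction of non-relevant documents in the collection that were retrieved. A sketch of that definition (the function name and toy numbers are illustrative assumptions):

```python
def fallout(ranked, relevant, collection_size):
    """Fraction of the collection's non-relevant docs that were retrieved."""
    nonrel_retrieved = sum(1 for doc in ranked if doc not in relevant)
    nonrel_total = collection_size - len(relevant)
    return nonrel_retrieved / nonrel_total

# toy example: 2 non-relevant docs retrieved out of 8 in the collection
f = fallout(["d1", "d2", "d3", "d4"], {"d1", "d3"}, collection_size=10)
```

Because TREC collections are large, fallout values are tiny, which is why the curve's axis is scaled as fallout x 100 and why a normal-deviate transform of both axes is also plotted.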