NIST Special Publication 500-215: The Second Text REtrieval Conference (TREC-2)
Appendix A: TREC-2 Results
National Institute of Standards and Technology
D. K. Harman
Results - Universitaet Dortmund
Summary Statistics
Run Number: dortV1-full, automatic
Num of Queries: 50
Total number of documents over all queries
Retrieved: 50000
Relevant: 10489
Rel_ret: 7386
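(The Retrieved total corresponds to the standard 1,000 ranked documents submitted per topic over the 50 topics (50 x 1000 = 50000); Rel_ret/Relevant = 7386/10489, roughly 0.70, so about 70% of the known relevant documents appear somewhere in the submitted rankings.)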
Recall Level Averages
Recall Precision
0.00 0.7864
0.10 0.6082
0.20 0.5373
0.30 0.5023
0.40 0.4446
0.50 0.3788
0.60 0.2997
0.70 0.2415
0.80 0.1467
0.90 0.0590
1.00 0.0049
Average precision over all
relevant docs
non-interpolated 0.3516
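These figures follow the usual TREC evaluation conventions: the precision reported at a recall level r is interpolated (the maximum precision at any point in the ranking whose recall is at least r), and the average precision over all relevant docs is non-interpolated (the mean of the precision values observed at each relevant document). A minimal per-query sketch of both computations follows; the appendix numbers are such per-query values averaged over the 50 topics, and the ranking/relevant inputs here are hypothetical, not data from this run.

    def eleven_point_and_avg_prec(ranking, relevant):
        # 11-point interpolated precision and non-interpolated average
        # precision for a single query, in the standard trec_eval style.
        R = len(relevant)
        rel_so_far = 0
        points = []                 # (recall, precision) after each retrieved doc
        sum_prec_at_rel = 0.0
        for i, doc in enumerate(ranking, start=1):
            if doc in relevant:
                rel_so_far += 1
                sum_prec_at_rel += rel_so_far / i   # precision at this relevant doc
            points.append((rel_so_far / R, rel_so_far / i))

        # Interpolated precision at level r = max precision at any recall >= r.
        levels = [k / 10 for k in range(11)]        # 0.0, 0.1, ..., 1.0
        interp = [max((p for r, p in points if r >= lvl), default=0.0)
                  for lvl in levels]

        avg_prec = sum_prec_at_rel / R              # non-interpolated average precision
        return levels, interp, avg_prec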
[OCRerr]t Level Averages
[OCRerr]       Recall
[OCRerr]0      0.3912
[OCRerr]0      0.4894
[OCRerr]       0.5521
[OCRerr]0      0.5974
[OCRerr]30     0.6316
[OCRerr]0      0.6602
[OCRerr]0      0.6863
[OCRerr]       0.7020
[OCRerr]50     0.7152
[OCRerr]30     0.7279
[OCRerr]       0.7389
Document Level Averages
              Precision
At 5 docs     0.6280
At 10 docs    0.5980
At 15 docs    0.5960
At 20 docs    0.5790
At 30 docs    0.5653
At 100 docs   0.4520
At 200 docs   0.3661
At 500 docs   0.2352
At 1000 docs  0.1477
R-Precision (precision after R docs retrieved, where R is the number of relevant documents)
Exact         0.3947
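Precision at a document cutoff is simply the fraction of the top k retrieved documents that are relevant, and R-Precision sets the cutoff to R, the number of relevant documents for the topic, so a perfect ranking would score 1.0. A small sketch under the same hypothetical inputs as above; the table values are again these per-topic numbers averaged over the 50 topics.

    def precision_at(k, ranking, relevant):
        # Fraction of the top-k retrieved documents that are relevant.
        return sum(1 for doc in ranking[:k] if doc in relevant) / k

    def r_precision(ranking, relevant):
        # Precision after exactly R documents have been retrieved,
        # where R is the number of relevant documents for the topic.
        return precision_at(len(relevant), ranking, relevant)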
[Figure: Recall-Precision Curve. Precision (0 to 1) versus Recall (0 to 1).]
[Figure: Fallout-Recall Curve (NORMAL DEVIATE - Fallout-Recall). Recall versus Fallout x 100 (0 to 0.2), with normal deviate probability scales (labels 0.001 to 0.98; deviates -3 to 3).]
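The fallout-recall curve plots recall against fallout, the fraction of the collection's non-relevant documents that have been retrieved, and the normal-deviate version of the plot rescales both axes through the inverse standard normal CDF, which is why probability labels such as 0.02, 0.16, 0.5, 0.84, and 0.98 line up with deviates of roughly -2 through 2. A minimal sketch of those two transformations, assuming hypothetical document counts:

    from statistics import NormalDist

    def fallout(nonrel_retrieved, nonrel_in_collection):
        # Fallout: fraction of the collection's non-relevant documents
        # that appear in the retrieved set.
        return nonrel_retrieved / nonrel_in_collection

    def normal_deviate(p):
        # Inverse standard normal CDF (probit); maps a recall or fallout
        # value in (0, 1) onto the normal-deviate axes of the plot,
        # e.g. 0.84 -> about 1.0 and 0.02 -> about -2.05.
        return NormalDist().inv_cdf(p)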