NIST Special Publication 500-215: The Second Text REtrieval Conference (TREC-2)
Appendix A: TREC-2 Results
National Institute of Standards and Technology
D. K. Harman
Results - Siemens Corporate Research, Inc.
Summary Statistics
Run Number    siemsl-full, automatic
Num of Queries 50
Total number of documents over all queries
Retrieved:    50000
Relevant:     10489
Rel_ret:      6666
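These three counts can be reproduced with a short script. The following is a minimal sketch, assuming the run is held as a dict mapping each query id to its ranked list of retrieved document ids and the relevance judgments (qrels) as a dict mapping each query id to the set of documents judged relevant; these are hypothetical in-memory structures, not the official TREC file formats.

def summary_statistics(run, qrels):
    # Retrieved: total documents returned over all queries
    retrieved = sum(len(docs) for docs in run.values())
    # Relevant: total judged-relevant documents over all queries
    relevant = sum(len(rel) for rel in qrels.values())
    # Rel_ret: relevant documents that were actually retrieved
    rel_ret = sum(len(set(docs) & qrels.get(q, set())) for q, docs in run.items())
    return retrieved, relevant, rel_ret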
Recall Level Averages
Recall Precision
0.00 0.1391
0.10 0.5851
0.20 0.5002
0.30 0.4330
0.40 0.3631
0.50 0.3050
0.60 0.2333
0.70 0.1555
0.80 0.0860
0.90 0.0214
1.00 0.0042
Average precision over all relevant docs
non-interpolated    0.2984
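As a rough illustration of how these recall-based figures are computed for a single query, here is a sketch assuming `ranking` is the ordered list of retrieved document ids and `relevant` is the set of judged-relevant ids for that query. The interpolated precision at a recall level is the maximum precision at any point with recall at or above that level; non-interpolated average precision averages the precision at each relevant document over all relevant documents. Averaging across the 50 queries, as done by the official evaluation program, is not shown.

def average_precision(ranking, relevant):
    # Non-interpolated average precision: mean of precision at each
    # relevant document, over all relevant docs (retrieved or not).
    hits, precision_sum = 0, 0.0
    for rank, doc in enumerate(ranking, start=1):
        if doc in relevant:
            hits += 1
            precision_sum += hits / rank
    return precision_sum / len(relevant) if relevant else 0.0

def interpolated_precision(ranking, relevant, levels=tuple(i / 10 for i in range(11))):
    # Interpolated precision at recall level r: the maximum precision
    # observed at any point with recall >= r.
    if not relevant:
        return [0.0 for _ in levels]
    points, hits = [], 0
    for rank, doc in enumerate(ranking, start=1):
        if doc in relevant:
            hits += 1
        points.append((hits / len(relevant), hits / rank))
    return [max((p for r, p in points if r >= level), default=0.0) for level in levels]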
Document Level Averages
Precision
At 5 docs 0.6080
At 10 docs 0.5960
At 15 docs 0.5693
At 20 docs 0.5450
At 30 docs 0.5260
At 100 docs 0.4010
At 200 docs 0.3229
At 500 docs 0.2092
At 1000 docs 0.1333
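A sketch of the document-level cutoffs, under the same assumed per-query inputs as above: each entry is the fraction of the first N retrieved documents that are relevant, dividing by N even when fewer than N documents were retrieved.

def precision_at_cutoffs(ranking, relevant,
                         cutoffs=(5, 10, 15, 20, 30, 100, 200, 500, 1000)):
    # Precision at N docs: relevant documents in the top N, divided by N.
    return {n: sum(1 for doc in ranking[:n] if doc in relevant) / n for n in cutoffs}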
R-Precision (precision after R docs retrieved, where R is the number of relevant documents)
Exact    0.3482
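Under the same assumed inputs, R-Precision is simply precision measured at the cutoff R, where R is the number of relevant documents for the query; a minimal sketch:

def r_precision(ranking, relevant):
    # Precision after exactly R documents, R = number of relevant docs.
    R = len(relevant)
    if R == 0:
        return 0.0
    return sum(1 for doc in ranking[:R] if doc in relevant) / R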
[Figure: Normal Deviate Fallout-Recall plot]
Fallout Level Averages
Fallout   Recall
0.0000    0.3305
0.0020    0.4414
0.0040    0.4944
0.0060    0.5411
0.0080    0.5731
0.0100    0.6022
0.0120    0.6232
0.0140    0.6427
0.0160    0.6628
0.0180    0.6759
0.0200    0.6865
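For the fallout-based figures, the textbook definition takes fallout as the number of non-relevant documents retrieved divided by the total number of non-relevant documents in the collection; the sketch below reports recall at the point where a given fallout level would be exceeded. The `collection_size` parameter is an assumption for illustration only; the official evaluation's handling of the judged pool and interpolation may differ.

def recall_at_fallout(ranking, relevant, collection_size, fallout_level):
    # Number of non-relevant documents the collection contains.
    nonrel_total = collection_size - len(relevant)
    # Non-relevant retrievals allowed before exceeding the fallout level.
    max_nonrel = fallout_level * nonrel_total
    hits = nonrel = 0
    for doc in ranking:
        if doc in relevant:
            hits += 1
        else:
            nonrel += 1
            if nonrel > max_nonrel:
                break
    return hits / len(relevant) if relevant else 0.0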
[Figure: Recall-Precision Curve (Precision vs. Recall)]
[Figure: Fallout-Recall Curve (Recall vs. Fallout x 100)]