NIST Special Publication 500-215: The Second Text REtrieval Conference (TREC-2)
Appendix A: TREC-2 Results
National Institute of Standards and Technology
D. K. Harman
Results - Mead Data Central, Inc.
Run Number
Total number of documents over all queries
    Retrieved:
    Relevant:   10785
    Rel_ret:    3282
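The three totals are simple tallies over all queries: how many documents the run returned in all, how many documents the relevance judgments mark as relevant, and how many of the retrieved documents are relevant (Rel_ret). A minimal sketch of that bookkeeping, assuming the run and the judgments are held in the illustrative dictionaries run and qrels (these names and data structures are assumptions, not the evaluation software used for this appendix):

    def totals(run, qrels):
        # run:   {topic: ranked list of retrieved docnos}
        # qrels: {topic: set of docnos judged relevant}
        retrieved = sum(len(ranked) for ranked in run.values())
        relevant  = sum(len(rel) for rel in qrels.values())
        rel_ret   = sum(len(set(ranked) & qrels.get(topic, set()))
                        for topic, ranked in run.items())
        return retrieved, relevant, rel_ret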
Recall Level Averages
Recall Precision
0.00 0.4228
0.10 0.1880
0.20 0.1385
0.30 0.1045
0.40 0.0840
0.50 0.0643
0.60 0.0445
0.70 0.0365
0.80 0.0183
0.90 0.0000
1.00 0.0000
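Each precision figure above is the interpolated precision at that recall level, averaged over the queries: for a single query, the interpolated precision at recall r is the highest precision observed at any point in the ranking whose recall is at least r. A minimal per-query sketch with illustrative names (ranking is the ordered list of retrieved docnos, relevant the set judged relevant); it is not the official evaluation code:

    def interp_precision_at_recall_levels(ranking, relevant):
        levels = [i / 10 for i in range(11)]           # 0.0, 0.1, ..., 1.0
        points = []                                    # (recall, precision) at each relevant hit
        hits = 0
        for rank, doc in enumerate(ranking, start=1):
            if doc in relevant:
                hits += 1
                points.append((hits / len(relevant), hits / rank))
        # Interpolated precision at level r: best precision at recall >= r (0 if unreachable).
        return [max((p for r, p in points if r >= level), default=0.0)
                for level in levels]

The zeros at recall 0.90 and 1.00 indicate that essentially no query reached those recall levels within its retrieved set.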
Average precision over all
relevant docs
    non-interpolated   0.0819
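This figure is the non-interpolated average precision: for each query, precision is taken at the rank of every relevant document retrieved, summed, and divided by the total number of relevant documents (so relevant documents that were never retrieved contribute zero); the per-query values are then averaged. A sketch under the same illustrative data layout:

    def average_precision(ranking, relevant):
        hits, total = 0, 0.0
        for rank, doc in enumerate(ranking, start=1):
            if doc in relevant:
                hits += 1
                total += hits / rank        # precision at this relevant document
        return total / len(relevant) if relevant else 0.0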
Fallout Level Averages
Fallout     Recall
            0.1242
            0.1144
            0.2072
            0.2337
            0.2571
            0.2740
            0.2805
            0.2805
            0.2805
            0.2805
            0.2805
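Fallout is the fraction of the non-relevant documents in the collection that the run has retrieved, so computing it needs the collection size as well as the judgments. A sketch of recall and fallout at a given cutoff, where collection_size is an assumed parameter and the exact fallout levels used for the table above did not survive the scan:

    def fallout_and_recall(ranking, relevant, cutoff, collection_size):
        top = ranking[:cutoff]
        rel_ret = sum(1 for doc in top if doc in relevant)
        nonrel_ret = len(top) - rel_ret
        total_nonrel = collection_size - len(relevant)
        return nonrel_ret / total_nonrel, rel_ret / len(relevant)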
Document Level Averages
Precision
At 5 docs 0.2520
At 10 docs 0.2320
At 15 docs 0.2267
At 20 docs 0.2200
At 30 docs 0.2073
At 100 docs 0.1688
At 200 docs 0.1383
At 500 docs 0.0954
At 1000 docs 0.0656
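Each row above is the precision after a fixed number of documents has been retrieved, averaged over the queries. Dividing by the cutoff itself (rather than by the number of documents actually returned) means a query with a short result list scores lower at the deeper levels. A per-query sketch with the same illustrative names:

    def precision_at(ranking, relevant, cutoff):
        top = ranking[:cutoff]
        # Divide by the cutoff, not by len(top), so short result lists score lower.
        return sum(1 for doc in top if doc in relevant) / cutoff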
R-Precision (precision after R docs retrieved, where R is
the number of relevant documents)
    Exact    0.1231
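In code the definition in the parenthesis is direct: truncate the ranking at R, the number of relevant documents for the query, and compute precision there (at that depth precision and recall coincide). A sketch, again with illustrative names:

    def r_precision(ranking, relevant):
        r = len(relevant)
        if r == 0:
            return 0.0
        return sum(1 for doc in ranking[:r] if doc in relevant) / r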
[Figure: Recall-Precision Curve; precision (0 to 1) plotted against recall (0 to 1)]
[Figure: Fallout-Recall Curve; recall plotted against fallout x 100 (0 to 0.2) on normal-deviate axes]
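The fallout-recall curve is drawn on normal-deviate (probit) axes: a proportion p is plotted at the value z for which the standard normal distribution function equals p, which is why tick labels of 0.001, 0.02, 0.16, 0.5, 0.84, and 0.98 sit at deviates of roughly -3, -2, -1, 0, +1, and +2. A small sketch of that mapping using the Python standard library; the clamping epsilon is an assumption to keep proportions of exactly 0 or 1 finite:

    from statistics import NormalDist

    def normal_deviate(p, eps=1e-6):
        # Probit transform: position of proportion p on a standard-normal axis.
        p = min(max(p, eps), 1.0 - eps)     # avoid infinite deviates at exactly 0 or 1
        return NormalDist().inv_cdf(p)

    # e.g. normal_deviate(0.16) is about -0.99 and normal_deviate(0.98) is about +2.05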