NIST Special Publication 500-215: The Second Text REtrieval Conference (TREC-2)
Appendix A: TREC-2 Results
National Institute of Standards and Technology
D. K. Harman
Results - The Environmental Research Institute of Michigan
Summary Statistics
Run Number: erimrl-full, automatic
Num of Queries: 50
Total number of documents over all queries
    Retrieved: 41829
    Relevant: 10489
    Rel_ret: 4152

[Figure: Recall-Precision Curve]
Recall Level Averages
Recall Precision
0.00 0.5948
0.10 0.3190
0.20 0.2334
0.30 0.1817
0.40 0.1280
0.50 0.0841
0.60 0.0441
0.70 0.0210
0.80 0.0115
0.90 0.0020
1.00 0.0020
Average precision over all relevant docs
Non-interpolated: 0.1219
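The non-interpolated average over all relevant documents reported above is computed per query as the mean of the precision values observed at the rank of each retrieved relevant document, with relevant documents that are never retrieved contributing zero. A minimal sketch of that computation (function and variable names are illustrative, not taken from the TREC evaluation software):

```python
def average_precision(ranked_rel, total_relevant):
    """Non-interpolated average precision for one query.

    ranked_rel: relevance flags for the retrieved documents, in rank order.
    total_relevant: number of relevant documents for the query; relevant
    documents never retrieved contribute zero to the average.
    """
    hits = 0
    precision_sum = 0.0
    for rank, is_rel in enumerate(ranked_rel, start=1):
        if is_rel:
            hits += 1
            precision_sum += hits / rank  # precision at this rank
    return precision_sum / total_relevant

# Toy ranking: relevant docs at ranks 1 and 3, one relevant doc missed.
print(average_precision([True, False, True], 3))  # (1/1 + 2/3) / 3
```

The figure reported in the table is this quantity averaged over the 50 queries.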
Fallout Level Averages
Fallout Recall
0.000 0.1771
0.020 0.2227
0.040 0.2581
0.060 0.2834
0.080 0.3169
0.100 0.3383
0.120 0.3545
0.140 0.3794
0.160 0.3886
0.180 0.3995
0.200 0.4104
Document Level Averages
Precision
At 5 docs 0.3720
At 10 docs 0.3580
At 15 docs 0.3347
At 20 docs 0.3340
At 30 docs 0.3113
At 100 docs 0.2304
At 200 docs 0.1802
At 500 docs 0.1203
At 1000 docs 0.0830
R-Precision (precision after R docs retrieved (where R is the number of relevant documents))
Exact: 0.1814
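As defined above, R-Precision is the precision after exactly R documents have been retrieved, R being the number of relevant documents for the query. A minimal sketch (names are illustrative, not from the TREC evaluation software):

```python
def r_precision(ranked_rel, total_relevant):
    """Precision after R documents retrieved, where R is the number of
    relevant documents for the query (ranked_rel holds 0/1 relevance
    flags for the retrieved documents, in rank order)."""
    r = total_relevant
    return sum(ranked_rel[:r]) / r

# Toy ranking with R = 4: 2 of the first 4 retrieved docs are relevant.
print(r_precision([1, 0, 1, 0, 1, 1], 4))  # 0.5
```

The "Exact" figure in the table is this value averaged over the 50 queries.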
[Figure: Normal Deviate - Fallout-Recall]

[Figure: Fallout-Recall Curve (Recall vs. Fallout x 100)]