SP500215
NIST Special Publication 500-215: The Second Text REtrieval Conference (TREC-2)
Appendix A: TREC-2 Results
National Institute of Standards and Technology
D. K. Harman
Results - Thinking Machines Corp.
Summary Statistics
Run Number:       TMC6-full, automatic
Num of Queries:   50
Total number of documents over all queries
    Retrieved:    46075
    Relevant:     10489
    Rel_ret:       6061
Recall Level Averages
Recall Precision
0.00 0.6276
0.10 0.4638
0.20 0.4142
0.30 0.3724
0.40 0.3194
0.50 0.2670
0.60 0.1950
0.70 0.1482
0.80 0.1123
0.90 0.0547
1.00 0.0211
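The precisions above are the standard 11-point interpolated values: at each recall level, the reported precision is the maximum precision achieved at that recall level or any higher one. A minimal sketch of that computation for a single query (hypothetical helper name; `ranked_rels` is an assumed 0/1 relevance list in rank order, `num_rel` the query's total relevant count):

```python
def eleven_point_interpolated(ranked_rels, num_rel):
    """Interpolated precision at recall 0.0, 0.1, ..., 1.0 for one query."""
    # precision/recall after each retrieved document
    points = []
    hits = 0
    for i, rel in enumerate(ranked_rels, start=1):
        hits += rel
        points.append((hits / num_rel, hits / i))
    levels = [l / 10 for l in range(11)]
    # interpolated precision at level r = max precision at any recall >= r
    return [max((p for r, p in points if r >= lvl), default=0.0)
            for lvl in levels]
```

The per-run tables in this appendix then average these per-query values over all 50 queries.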
Average precision over all
relevant docs
    non-interpolated    0.2553
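The average-precision figure above is the non-interpolated kind: for one query, the mean over all relevant documents of the precision at each rank where a relevant document appears (relevant documents never retrieved contribute zero). A sketch under the same assumed inputs as above:

```python
def average_precision(ranked_rels, num_rel):
    """Non-interpolated average precision for one query."""
    hits, total = 0, 0.0
    for i, rel in enumerate(ranked_rels, start=1):
        if rel:
            hits += 1
            total += hits / i  # precision at this relevant document's rank
    return total / num_rel     # unretrieved relevant docs count as zero
```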
Fallout Level Averages
Fallout    Recall
[OCRerr]   0.2924
[OCRerr]   0.3747
[OCRerr]   0.4267
[OCRerr]   0.4887
[OCRerr]   0.5237
[OCRerr]   0.5472
[OCRerr]   0.5666
[OCRerr]   0.5883
[OCRerr]   0.6021
[OCRerr]   0.6172
[OCRerr]   0.6274
Document Level Averages
Precision
At 5 docs 0.4320
At 10 docs 0.4180
At 15 docs 0.4187
At 20 docs 0.4200
At 30 docs 0.4060
At 100 docs 0.3396
At 200 docs 0.2839
At 500 docs 0.1868
At 1000 docs 0.1213
R-Precision (precision after R docs retrieved, where R is the number of relevant documents)

    Exact    0.2991
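Both the document-level figures and R-Precision are cutoff precisions: the fraction of the top k retrieved documents that are relevant, with k set to R for R-Precision. A minimal sketch (hypothetical function names, same assumed inputs as above):

```python
def precision_at(ranked_rels, k):
    """Precision over the top k retrieved documents."""
    return sum(ranked_rels[:k]) / k

def r_precision(ranked_rels, num_rel):
    """Precision after R = num_rel documents retrieved."""
    return precision_at(ranked_rels, num_rel)
```

Note that `precision_at` divides by k even when fewer than k documents were retrieved, treating the missing ranks as non-relevant, which matches the cutoff definition used here.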
[Figure: Recall-Precision Curve (Precision vs. Recall)]

[Figure: Fallout-Recall Curve, normal deviate scale (Recall vs. Fallout x 100)]
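The fallout-recall plot uses normal-deviate (probit) axes: each probability coordinate is passed through the inverse standard-normal CDF, so 0.16, 0.5, and 0.84 land at roughly -1, 0, and +1 standard deviates. A sketch of that axis transform, assuming Python's standard-library `statistics.NormalDist`:

```python
from statistics import NormalDist

def normal_deviate(p):
    """Map a probability (e.g. recall or fallout) to its standard-normal deviate."""
    return NormalDist().inv_cdf(p)
```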