NIST Special Publication 500-215: The Second Text REtrieval Conference (TREC-2)
Appendix A: TREC-2 Results
National Institute of Standards and Technology
D. K. Harman
Results - University of California, Berkeley
Summary Statistics
Run Number
Num of Queries    50
Total number of documents over all queries
    Retrieved:    50000
    Relevant:     10489
    Rel_ret:       7237
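(The 50,000 retrieved documents correspond to 1,000 evaluated documents per query over the 50 queries; the overall recall implied by these totals is Rel_ret / Relevant = 7237 / 10489 ≈ 0.69.)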
[Figure: Recall-Precision Curve; y-axis Precision (0 to 1), x-axis Recall (0 to 1).]
Recall Level Averages
Recall    Precision
0.00 0.8361
0.10 0.6331
0.20 0.5384
0.30 0.4781
0.40 0.4231
0.50 0.3611
0.60 0.2812
0.70 0.2400
0.80 0.1714
0.90 0.0928
1.00 0.0190
Average precision over all relevant docs
non-interpolated    0.3538
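The figures above are conventionally computed per query from the ranked list and then averaged over the 50 queries: precision at a recall level is interpolated as the maximum precision at any recall greater than or equal to that level, and non-interpolated average precision is the mean of the precision values at the ranks of the relevant documents. The following sketch illustrates these conventions; it is not the TREC evaluation program, and the document identifiers are hypothetical.

    # Sketch of the per-query computation behind the recall-level table and
    # the non-interpolated average precision above; appendix values are these
    # quantities averaged over 50 queries.

    def recall_precision_points(ranking, relevant):
        """(recall, precision) observed at the rank of each retrieved relevant doc."""
        points, hits = [], 0
        for rank, doc in enumerate(ranking, start=1):
            if doc in relevant:
                hits += 1
                points.append((hits / len(relevant), hits / rank))
        return points

    def interpolated_precision(points, level):
        """Precision at a recall level: max precision at any recall >= level."""
        eligible = [p for r, p in points if r >= level]
        return max(eligible) if eligible else 0.0

    def average_precision(points, num_relevant):
        """Non-interpolated AP: mean precision at each relevant document's rank;
        relevant documents that were never retrieved contribute zero."""
        return sum(p for _, p in points) / num_relevant

    # Toy example: four relevant documents, two of them retrieved.
    ranking = ["d3", "d7", "d1", "d9", "d4"]
    relevant = {"d3", "d9", "d2", "d8"}
    pts = recall_precision_points(ranking, relevant)
    print([round(interpolated_precision(pts, l / 10), 2) for l in range(11)])
    print(round(average_precision(pts, len(relevant)), 4))   # 0.375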
Fallout Level Averages
Fallout    Recall
0.00    0.3969
0.20    0.4892
0.40    0.5489
0.60    0.5881
0.80    0.6227
1.00    0.6458
1.20    0.6690
1.40    0.6864
1.60    0.7007
1.80    0.7158
2.00    0.7277
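For reference, fallout is the fraction of the non-relevant documents in the collection that have been retrieved:

    fallout = (number of non-relevant documents retrieved) / (total number of non-relevant documents)

so each row above gives the average recall reached by the time the ranking has accumulated the corresponding proportion of non-relevant documents.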
Document Level Averages
Precision
At 5 docs 0.6600
At 10 docs 0.6360
At 15 docs 0.6160
At 20 docs 0.6040
At 30 docs 0.5753
At 100 docs 0.4608
At 200 docs 0.3650
At 500 docs 0.2310
At 1000 docs 0.1447
R-Precision (precision after R docs retrieved (where R is the number of relevant documents))
Exact 0.3852
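As with the recall-level figures, the cutoff precisions and the R-precision are computed per query and averaged over the 50 queries. A minimal sketch of the per-query computation follows (illustrative only; the identifiers are hypothetical and this is not the TREC evaluation code):

    # Sketch of per-query cutoff precision and R-precision.

    def precision_at(ranking, relevant, k):
        """Fraction of the top-k positions occupied by relevant documents."""
        return sum(1 for doc in ranking[:k] if doc in relevant) / k

    def r_precision(ranking, relevant):
        """Precision after R documents are retrieved, R = number of relevant docs."""
        return precision_at(ranking, relevant, len(relevant))

    # Toy example: four relevant documents.
    ranking = ["d3", "d7", "d1", "d9", "d4", "d2"]
    relevant = {"d3", "d9", "d2", "d8"}
    print(precision_at(ranking, relevant, 5))   # 0.4
    print(r_precision(ranking, relevant))       # 0.5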
[Figure: NORMAL DEVIATE Fallout-Recall plot; axes marked in normal deviates (-3 to +3) with the corresponding probabilities (0.001 to 0.98).]
[Figure: Fallout-Recall Curve; y-axis Recall, x-axis Fallout x 100 (0 to 0.2).]
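The normal-deviate markings noted above are the usual probit scale: a proportion p is placed at the standard-normal deviate z satisfying Phi(z) = p, so 0.16, 0.50, 0.84 and 0.98 fall at z = -1, 0, +1 and +2. Plotted this way, fallout-recall curves tend to become approximately straight, which is why the alternative scale is sometimes shown.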