SP500215
NIST Special Publication 500-215: The Second Text REtrieval Conference (TREC-2)
Appendix A: TREC-2 Results
National Institute of Standards and Technology
D. K. Harman
Results - Bellcore
Summary Statistics
Run Number: lsiri-full, automatic
Num of Queries: 50
Total number of documents over all queries
    Retrieved: 50000
    Relevant: 10489
    Rel_ret: 6522
Recall Level Averages
Recall Precision
0.00 0.7569
0.10 0.5281
0.20 0.4302
0.30 0.3779
0.40 0.3222
0.50 0.2446
0.60 0.1935
0.70 0.1474
0.80 0.0847
0.90 0.0295
1.00 0.0043
Average precision over all relevant docs (non-interpolated): 0.266
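The recall-level averages and the non-interpolated average precision reported for this run are standard trec_eval-style measures computed from each query's ranked list. A minimal sketch of how such numbers are obtained (function names are illustrative, not trec_eval itself):

```python
def eleven_point_interpolated(ranked_rel, num_relevant):
    """ranked_rel: 0/1 relevance judgments in rank order.
    Returns precision interpolated at recall 0.0, 0.1, ..., 1.0."""
    # (recall, precision) after each relevant document retrieved
    points = []
    hits = 0
    for i, rel in enumerate(ranked_rel, start=1):
        if rel:
            hits += 1
            points.append((hits / num_relevant, hits / i))
    # interpolated precision at recall r = max precision at any recall >= r
    interp = []
    for level in range(11):
        r = level / 10
        ps = [p for rec, p in points if rec >= r]
        interp.append(max(ps) if ps else 0.0)
    return interp

def average_precision(ranked_rel, num_relevant):
    """Non-interpolated average precision over all relevant docs:
    mean of precision at each relevant document's rank."""
    hits, total = 0, 0.0
    for i, rel in enumerate(ranked_rel, start=1):
        if rel:
            hits += 1
            total += hits / i
    return total / num_relevant
```

The table values are these per-query numbers averaged over the 50 queries.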
Fallout Level Averages
(recall after N nonrelevant docs retrieved)
N      Recall
0      0.3042
20     0.4031
40     0.4660
60     0.5058
80     0.5365
100    0.5635
120    0.5838
140    0.6034
160    0.6276
180    0.6431
200    0.6554
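These fallout-recall figures appear to report recall at the point where a given number of nonrelevant documents has been retrieved, which is what the fallout-recall curve plots. A sketch under that reading (name and cutoff handling are illustrative assumptions):

```python
def recall_at_nonrel_cutoffs(ranked_rel, num_relevant, cutoffs):
    """For each N in cutoffs, recall over the longest ranking prefix
    containing at most N nonrelevant documents (assumed reading of
    the fallout-recall table)."""
    result = {}
    hits = nonrel = 0
    pending = sorted(cutoffs)
    for rel in ranked_rel:
        if rel:
            hits += 1
        else:
            # the prefix is about to gain its (N+1)-th nonrel doc:
            # record recall for every cutoff N we have just passed
            while pending and pending[0] < nonrel + 1:
                result[pending.pop(0)] = hits / num_relevant
            nonrel += 1
    for n in pending:  # ranking ended before reaching these cutoffs
        result[n] = hits / num_relevant
    return result
```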
Document Level Averages
Precision
At 5 docs 0.6120
At 10 docs 0.5480
At 15 docs 0.5387
At 20 docs 0.5110
At 30 docs 0.4880
At 100 docs 0.3798
At 200 docs 0.3037
At 500 docs 0.2011
At 1000 docs 0.1304
R-Precision (precision after R docs retrieved, where R is the number of relevant documents)
    Exact: 0.3050
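The R-Precision definition above reduces to a one-liner per query; a sketch (helper name is illustrative, not part of trec_eval):

```python
def r_precision(ranked_rel, num_relevant):
    """Precision after R docs retrieved, where R is the number of
    relevant documents for the query."""
    top_r = ranked_rel[:num_relevant]
    return sum(top_r) / num_relevant
```

The "Exact" figure is this value averaged over the 50 queries, with no interpolation.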
[Figure: Recall-Precision Curve (Precision vs. Recall)]
[Figure: Fallout-Recall Curve (Recall vs. Fallout x 100)]
[Figure: Normal Deviate, Fallout-Recall]