NIST Special Publication 500-215: The Second Text REtrieval Conference (TREC-2)
Appendix A: TREC-2 Results
National Institute of Standards and Technology
D. K. Harman
Results - CITRI, Royal Melbourne Institute of Technology
Summary Statistics
Run Number citri2-full, automatic
Num of Queries 50
Total number of documents over all queries
Retrieved: 50000
Relevant: 10785
Rel_ret: 7495
Recall Level Averages
Recall Precision
0.00 0.7856
0.10 0.5335
0.20 0.4611
0.30 0.4086
0.40 0.3526
0.50 0.2932
0.60 0.2306
0.70 0.1613
0.80 0.1053
0.90 0.0510
1.00 0.0097
Average precision over all relevant docs
non-interpolated 0.2814
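
Note: the figures above follow the standard TREC conventions: interpolated precision at a recall level is the maximum precision observed at any recall greater than or equal to that level, and non-interpolated average precision is the mean of the precision values at the ranks of the relevant documents. What follows is a minimal per-topic sketch of those two conventions, not the official TREC evaluation code; the ranking and relevance judgments are invented for illustration, and TREC reports these values averaged over all 50 topics.

# Minimal sketch of the per-topic TREC measures (invented example data).

def recall_precision_points(ranking, relevant):
    """(recall, precision) after each retrieved document."""
    points, hits = [], 0
    for i, doc in enumerate(ranking, start=1):
        if doc in relevant:
            hits += 1
        points.append((hits / len(relevant), hits / i))
    return points

def interpolated_precision(points, level):
    """Interpolated precision at a recall level: max precision at recall >= level."""
    return max((p for r, p in points if r >= level), default=0.0)

def average_precision(ranking, relevant):
    """Non-interpolated average precision over all relevant docs."""
    hits, total = 0, 0.0
    for i, doc in enumerate(ranking, start=1):
        if doc in relevant:
            hits += 1
            total += hits / i
    return total / len(relevant)

# Hypothetical example: a 10-document ranking with 4 relevant documents.
ranking = ["d3", "d9", "d1", "d7", "d2", "d8", "d5", "d6", "d4", "d0"]
relevant = {"d3", "d1", "d2", "d4"}

points = recall_precision_points(ranking, relevant)
for level in [i / 10 for i in range(11)]:  # the 11 recall levels 0.0 .. 1.0
    print(f"{level:.2f}  {interpolated_precision(points, level):.4f}")
print(f"non-interpolated AP: {average_precision(ranking, relevant):.4f}")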
Fallout Level Averages
Fallout Recall
0 0.4050
20 0.5313
40 0.6116
60 0.6684
80 0.7040
100 0.7216
120 0.7291
140 0.7291
160 0.7291
180 0.7291
200 0.7291
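
Note: fallout is the nonrelevant counterpart of recall: the fraction of all nonrelevant documents that has been retrieved at a given point in the ranking. A minimal sketch using the same invented example; the nonrelevant-collection size here is an assumption for illustration, not TREC data.

# Sketch of recall vs. fallout along a ranking (invented example data).

def recall_fallout_points(ranking, relevant, num_nonrelevant):
    """(recall, fallout) after each retrieved document."""
    points, rel_hits, nonrel_hits = [], 0, 0
    for doc in ranking:
        if doc in relevant:
            rel_hits += 1
        else:
            nonrel_hits += 1
        points.append((rel_hits / len(relevant),
                       nonrel_hits / num_nonrelevant))
    return points

# Hypothetical: 4 relevant docs, 996 nonrelevant docs in the collection.
ranking = ["d3", "d9", "d1", "d7", "d2"]
relevant = {"d3", "d1", "d2", "d4"}
for recall, fallout in recall_fallout_points(ranking, relevant, 996):
    print(f"recall {recall:.2f}  fallout x 100 {fallout * 100:.4f}")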
Document Level Averages
Precision
At 5 docs 0.5600
At 10 docs 0.5280
At 15 docs 0.5133
At 20 docs 0.5000
At 30 docs 0.4900
At 100 docs 0.4354
At 200 docs 0.3606
At 500 docs 0.2358
At 1000 docs 0.1499
R-Precision (precision after R docs retrieved, where R is the number of relevant documents)
Exact 0.3388
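
Note: R-Precision needs no interpolation: it is simply the precision after exactly R documents have been retrieved, where R is the topic's number of relevant documents. A minimal sketch, reusing the invented example above.

# Sketch of R-Precision (invented example data).

def r_precision(ranking, relevant):
    """Precision after R docs retrieved, where R = number of relevant docs."""
    r = len(relevant)
    return sum(1 for doc in ranking[:r] if doc in relevant) / r

ranking = ["d3", "d9", "d1", "d7", "d2", "d8", "d5", "d6", "d4", "d0"]
relevant = {"d3", "d1", "d2", "d4"}
print(f"R-Precision: {r_precision(ranking, relevant):.4f}")  # 0.5000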
[Figure: Normal Deviate Fallout-Recall curve (recall vs. fallout on normal deviate scales)]
[Figure: Recall-Precision Curve (interpolated precision vs. recall)]
[Figure: Fallout-Recall Curve (recall vs. fallout x 100)]