NIST Special Publication 500-215: The Second Text REtrieval Conference (TREC-2)
Appendix A: TREC-2 Results
National Institute of Standards and Technology
D. K. Harman
Results - City University, London
Summary Statistics
Run Number: cityr1-full, automatic
Num of Queries: 50
Total number of documents over all queries
    Retrieved: 50000
    Relevant: 10489
    Rel_ret: 6801

[Figure: Recall-Precision Curve]
Recall Level Averages
Recall Precision
0.00 0.1888
0.10 0.6028
0.20 0.5101
0.30 0.4380
0.40 0.3851
0.50 0.3070
0.60 0.2494
0.70 0.1826
0.80 0.1269
0.90 0.0606
1.00 0.0048
Average precision over all relevant docs (non-interpolated): 0.3149
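Both statistics above follow the standard TREC definitions: interpolated precision at each recall level is the maximum precision at any point whose recall meets or exceeds that level, and non-interpolated average precision is the mean of the precision values at the rank of each relevant document. The following is a minimal Python sketch of those definitions for a single query, assuming a ranked list of document IDs and the judged-relevant set; all names are illustrative, and this is not the trec_eval source:

def avg_precision(ranking, relevant):
    """Non-interpolated average precision: mean of the precision values
    at the rank of each relevant document (relevant documents never
    retrieved contribute zero)."""
    hits, total = 0, 0.0
    for rank, doc in enumerate(ranking, start=1):
        if doc in relevant:
            hits += 1
            total += hits / rank
    return total / len(relevant)

def interpolated_precision(ranking, relevant,
                           levels=tuple(i / 10 for i in range(11))):
    """Interpolated precision at the 11 standard recall levels: for each
    level, the maximum precision at any rank whose recall >= level."""
    hits = 0
    points = []  # (recall, precision) after each retrieved document
    for rank, doc in enumerate(ranking, start=1):
        if doc in relevant:
            hits += 1
        points.append((hits / len(relevant), hits / rank))
    return {level: max((p for r, p in points if r >= level), default=0.0)
            for level in levels}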
Fallout Level Averages
Fallout   Recall
0.000     0.3610
0.020     0.4444
0.040     0.5009
0.060     0.5433
0.080     0.5791
0.100     0.6074
0.120     0.6294
0.140     0.6476
0.160     0.6636
0.180     0.6784
0.200     0.6918
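Fallout is the standard ratio of non-relevant documents retrieved to the total number of non-relevant documents, and the table reports recall averaged over queries at fixed fallout cutoffs (the figure's "Fallout x 100" axis suggests the tabulated fallout values are in that scaled unit). A minimal sketch for one query, assuming the non-relevant document count is known; names are illustrative:

def recall_at_fallout(ranking, relevant, num_nonrelevant, level):
    """Recall at the deepest rank whose fallout stays within `level`.
    Fallout at rank k = (non-relevant docs in top k) / num_nonrelevant."""
    hits = best = 0
    for rank, doc in enumerate(ranking, start=1):
        if doc in relevant:
            hits += 1
        if (rank - hits) / num_nonrelevant <= level:
            best = hits
        else:
            break  # fallout never decreases as rank grows
    return best / len(relevant)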
Document Level Averages
Precision
At 5 docs 0.6280
At 10 docs 0.5940
At 15 docs 0.5867
At 20 docs 0.5690
At 30 docs 0.5333
At 100 docs 0.4316
At 200 docs 0.3385
At 500 docs 0.2154
At 1000 docs 0.1360
R-Precision (precision after R docs retrieved, where R is the number of relevant documents)
    Exact: 0.3607
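A minimal sketch of the definition quoted above, for one query (names illustrative):

def r_precision(ranking, relevant):
    """Precision after exactly R documents are retrieved, where R is
    the number of relevant documents for the query."""
    r = len(relevant)
    return sum(1 for doc in ranking[:r] if doc in relevant) / r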
[Figure: Normal Deviate - Fallout-Recall (recall vs. fallout on normal-deviate scales)]
[Figure: Fallout-Recall Curve (recall vs. fallout x 100)]