NIST Special Publication 500-215: The Second Text REtrieval Conference (TREC-2)
Appendix A: TREC-2 Results
National Institute of Standards and Technology
D. K. Harman
Results - University of Illinois at Chicago
Summary Statistics
Run Number:
Num of Queries: 50
Total number of documents over all queries
    Retrieved: 50000
    Relevant:  10489
    Rel_ret:   3067
Recall Level Averages
Recall    Precision
0.00      0.3965
0.10      0.2189
0.20      0.1543
0.30      0.1025
0.40      0.0807
0.50      0.0578
0.60      0.0361
0.70      0.0195
0.80      0.0088
0.90      0.0000
1.00      0.0000
Average precision over all rel docs
    non-interpolated 0.0780
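For reference, these are the standard TREC measures: precision is interpolated at each recall level as the maximum precision at any recall at or above it, and the non-interpolated average precision is the mean of the precisions at each relevant document retrieved, averaged over the 50 queries. A minimal per-query sketch (the ranked list below is a hypothetical example, not data from this run):

    def eleven_point_interpolated(ranked_rel, num_relevant):
        """Interpolated precision at recall 0.0, 0.1, ..., 1.0 for one query.
        ranked_rel: booleans, True where the retrieved doc is relevant."""
        points, rel_so_far = [], 0
        for rank, is_rel in enumerate(ranked_rel, start=1):
            if is_rel:
                rel_so_far += 1
                points.append((rel_so_far / num_relevant, rel_so_far / rank))
        levels = [i / 10 for i in range(11)]
        # Interpolated precision at r = max precision at any recall >= r
        return [(r, max([p for rec, p in points if rec >= r], default=0.0))
                for r in levels]

    def average_precision(ranked_rel, num_relevant):
        """Non-interpolated average precision over all relevant docs."""
        rel_so_far, total = 0, 0.0
        for rank, is_rel in enumerate(ranked_rel, start=1):
            if is_rel:
                rel_so_far += 1
                total += rel_so_far / rank
        return total / num_relevant   # unretrieved relevant docs count as 0

    # Hypothetical query: 10 docs retrieved, 4 relevant in the collection
    run = [True, False, True, False, False, True, False, False, False, False]
    print(eleven_point_interpolated(run, 4))
    print(average_precision(run, 4))   # (1/1 + 2/3 + 3/6) / 4 ~= 0.54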
Fallout Level Averages
Fallout (x100)   Recall
0.00             0.1231
0.02             0.1693
0.04             0.1977
0.06             0.2278
0.08             0.2414
0.10             0.2664
0.12             0.2814
0.14             0.2918
0.16             0.3115
0.18             0.3272
0.20             0.3373
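Fallout is the fraction of the collection's non-relevant documents that have been retrieved, so each row above pairs a fallout level with the average recall reached by that point in the ranking. A per-query sketch (collection size and judgments below are hypothetical):

    def recall_at_fallout(ranked_rel, num_relevant, collection_size, fallout_level):
        """Recall at the last rank where fallout stays within fallout_level."""
        num_nonrelevant = collection_size - num_relevant
        rel_so_far, nonrel_so_far, recall = 0, 0, 0.0
        for is_rel in ranked_rel:
            if is_rel:
                rel_so_far += 1
            else:
                nonrel_so_far += 1
                if nonrel_so_far / num_nonrelevant > fallout_level:
                    break
            recall = rel_so_far / num_relevant
        return recall

    # Hypothetical: 500,000-doc collection, 10 relevant docs for this query.
    # Fallout x 100 = 0.02 on the table's scale is an actual fallout of 0.0002.
    ranked = [True, False, True, False, False, True] + [False] * 994
    print(recall_at_fallout(ranked, 10, 500_000, 0.0002))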
Document Level Averages
Precision
At 5 docs 0.1960
At 10 docs 0.2020
At 15 docs 0.2133
At 20 docs 0.2090
At 30 docs 0.2053
At 100 docs 0.1652
At 200 docs 0.1311
At 500 docs 0.0876
At 1000 docs 0.0613
R-Precision (precision after R docs retrieved, where R is the number of relevant documents)
Exact 0.1440
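R-Precision requires no interpolation: for a query with R relevant documents, it is the precision after exactly R documents are retrieved. A minimal sketch (the 0.1440 above would be this value averaged over the 50 queries; the inputs below are hypothetical):

    def r_precision(ranked_rel, num_relevant):
        """Precision after R docs retrieved, R = number of relevant docs."""
        return sum(ranked_rel[:num_relevant]) / num_relevant

    # Hypothetical query with R = 4 relevant documents in the collection
    print(r_precision([True, False, True, False, False, True], 4))   # 2/4 = 0.5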
[Figure: Normal Deviate Fallout-Recall plot. Fallout (x-axis) and recall (y-axis) are shown on normal-deviate scales from -3 to 3, with probability ticks at 0.001, 0.02, 0.16, 0.5, 0.84, and 0.98.]
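The normal-deviate plot redraws the fallout-recall curve with both axes transformed by the inverse standard normal CDF, so the probability ticks 0.001, 0.02, 0.16, 0.5, 0.84, and 0.98 land at deviates of roughly -3, -2, -1, 0, +1, and +2. A sketch of the transform:

    from statistics import NormalDist

    def normal_deviate(p):
        """Map a probability (a recall or fallout value) to its normal deviate."""
        return NormalDist().inv_cdf(p)

    for p in (0.001, 0.02, 0.16, 0.5, 0.84, 0.98):
        print(f"{p:>6}: {normal_deviate(p):+.2f}")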
[Figure: Recall-Precision Curve. Precision (y-axis, 0 to 1) vs. recall (x-axis, 0 to 0.8).]
[Figure: Fallout-Recall Curve. Recall (y-axis, 0 to 1) vs. fallout x 100 (x-axis, 0 to 0.2).]