NIST Special Publication 500-215: The Second Text REtrieval Conference (TREC-2)
Appendix A: TREC-2 Results
National Institute of Standards and Technology
D. K. Harman
Swiss Federal Institute of Technology (ETH)
Summary Statistics
Run Number                  schaul-full, automatic
Num of Queries              50
Total number of documents over all queries
    Retrieved:              50000
    Relevant:               10785
    Rel_ret:                7081
Recall Level Averages
Recall    Precision
0.00 0.1692
0.10 0.4912
0.20 0.4231
0.30 0.3649
0.40 0.3025
0.50 0.2529
0.60 0.1887
0.70 0.1213
0.80 0.0127
0.90 0.0403
1.00 0.0000
Average precision over all relevant docs
    non-interpolated    0.2517
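The two precision summaries above are standard trec_eval-style quantities. A minimal sketch of how they can be computed from a single ranked list (the ranking and judgments below are invented for illustration, not TREC data):

```python
# Sketch: non-interpolated average precision and 11-point
# interpolated precision over recall levels 0.0, 0.1, ..., 1.0.

def average_precision(ranked_rel, num_relevant):
    """Mean of precision at each rank where a relevant doc occurs."""
    hits, total = 0, 0.0
    for i, rel in enumerate(ranked_rel, start=1):
        if rel:
            hits += 1
            total += hits / i
    return total / num_relevant

def interpolated_precision(ranked_rel, num_relevant, levels=11):
    """Interpolated precision at each recall level: the maximum
    precision at any rank whose recall meets or exceeds the level."""
    hits = 0
    points = []  # (recall, precision) observed at each rank
    for i, rel in enumerate(ranked_rel, start=1):
        if rel:
            hits += 1
        points.append((hits / num_relevant, hits / i))
    out = []
    for k in range(levels):
        level = k / (levels - 1)
        best = max((p for r, p in points if r >= level), default=0.0)
        out.append(best)
    return out

# 1 = relevant, 0 = non-relevant; 4 relevant docs exist for the query
run = [1, 0, 1, 0, 0, 1, 0, 0, 0, 1]
print(round(average_precision(run, 4), 4))  # 0.6417
```

By construction the interpolated values are non-increasing as recall grows, which is why a clean 11-point table normally falls monotonically.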
Fallout Level Averages
Recall
0.3620
0.4763
0.5547
0.6102
0.6507
0.6743
0.6771
0.6771
0.6771
0.6771
0.6771
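Fallout, plotted against recall in the curve at the bottom of this page, is the fraction of the collection's non-relevant documents that were retrieved. A minimal sketch with invented collection sizes (none of these numbers come from the tables on this page):

```python
# Sketch: recall and fallout at a retrieval cutoff.
# Fallout = non-relevant retrieved / non-relevant in the collection.

def recall_fallout(rel_retrieved, nonrel_retrieved,
                   total_relevant, collection_size):
    recall = rel_retrieved / total_relevant
    fallout = nonrel_retrieved / (collection_size - total_relevant)
    return recall, fallout

# e.g. 140 of 216 relevant docs found among 1000 retrieved,
# in a 740,000-document collection (illustrative numbers only)
r, f = recall_fallout(140, 860, 216, 740_000)
print(round(r, 4))       # 0.6481
print(round(f * 100, 6)) # fallout x 100, the x-axis unit of the curve
```

Because the collection dwarfs the retrieved set, fallout stays tiny, which is why the curve's x-axis is scaled as fallout x 100.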
Document Level Averages
Precision
At 5 docs 0.5360
At 10 docs 0.4960
At 15 docs 0.4800
At 20 docs 0.4720
At 30 docs 0.4513
At 100 docs 0.4000
At 200 docs 0.3302
At 500 docs 0.2158
At 1000 docs 0.1416
R-Precision (precision after R docs retrieved, where R is
the number of relevant documents)
    Exact    0.3148
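A minimal sketch of the R-precision definition just given (the ranked judgments are invented; with R = 4 relevant documents, precision is taken over the first 4 retrieved):

```python
# Sketch: R-precision = precision after R documents retrieved,
# where R is the number of relevant documents for the query.

def r_precision(ranked_rel, num_relevant):
    top_r = ranked_rel[:num_relevant]       # first R retrieved docs
    return sum(top_r) / num_relevant

run = [1, 0, 1, 1, 0, 0, 1, 0]  # 1 = relevant; 4 relevant in total
print(r_precision(run, 4))      # 0.75: 3 of the first 4 are relevant
```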
[Figure: Recall-Precision Curve. Precision (0 to 1) vs. Recall (0 to 1).]

[Figure: Fallout-Recall Curve, normal deviate scales. Recall (0.001 to 0.999) vs. Fallout x 100 (0 to 0.2).]