NIST Special Publication 500-215: The Second Text REtrieval Conference (TREC-2)
Appendix A: TREC-2 Results
National Institute of Standards and Technology
D. K. Harman
University of California, Berkeley
Summary Statistics
Run Number:      Brkly4-full, automatic
Num of Queries:  50
Total number of documents over all queries
    Retrieved:   50000
    Relevant:    10489
    Rel_ret:      6339
Recall Level Averages
Recall   Precision
0.00     0.7606
0.10     0.5512
0.20     0.4543
0.30     0.4004
0.40     0.3522
0.50     0.2797
0.60     0.2032
0.70     0.1665
0.80     0.1166
0.90     0.0631
1.00     0.0147

Average precision over all relevant docs
    non-interpolated    0.2905
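The two recall-precision averages above can be reproduced from a ranked result list and a set of relevance judgments. A minimal sketch (the ranking and judgments below are invented toy data, not taken from the Brkly4-full run):

```python
def avg_precision(ranking, relevant):
    """Non-interpolated average precision: mean of the precision at each
    relevant document's rank, averaged over all relevant documents
    (relevant docs never retrieved contribute zero)."""
    hits, total = 0, 0.0
    for rank, doc in enumerate(ranking, start=1):
        if doc in relevant:
            hits += 1
            total += hits / rank
    return total / len(relevant)

def interpolated_precision(ranking, relevant, levels):
    """Precision at standard recall levels, interpolated as the maximum
    precision observed at any recall point >= the level."""
    points, hits = [], 0  # (recall, precision) at each relevant hit
    for rank, doc in enumerate(ranking, start=1):
        if doc in relevant:
            hits += 1
            points.append((hits / len(relevant), hits / rank))
    return {level: max((p for r, p in points if r >= level), default=0.0)
            for level in levels}

# Toy example: 6 retrieved docs, 4 relevant docs (one never retrieved).
ranking = ["d3", "d7", "d1", "d9", "d4", "d2"]
relevant = {"d3", "d1", "d2", "d8"}
print(avg_precision(ranking, relevant))              # (1 + 2/3 + 1/2 + 0) / 4
levels = [i / 10 for i in range(11)]                 # 0.00, 0.10, ..., 1.00
print(interpolated_precision(ranking, relevant, levels))
```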
Document Level Averages
               Precision
At 5 docs        0.6080
At 10 docs       0.5920
At 15 docs       0.5707
At 20 docs       0.5580
At 30 docs       0.5160
At 100 docs      0.3934
At 200 docs      0.3117
At 500 docs      0.1984
At 1000 docs     0.1268
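The document-level averages are precision values at fixed retrieval cutoffs, averaged over queries. A per-query sketch with invented toy data:

```python
def precision_at(k, ranking, relevant):
    """Fraction of the top-k retrieved documents that are relevant.
    Dividing by k (not by the list length) means short result lists
    are penalized at deep cutoffs."""
    return sum(1 for doc in ranking[:k] if doc in relevant) / k

# Toy example: 2 of the top 5 documents are relevant.
ranking = ["d3", "d7", "d1", "d9", "d4"]
relevant = {"d3", "d1", "d2"}
print(precision_at(5, ranking, relevant))  # 2/5 = 0.4
```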
R-Precision (precision after R docs
retrieved (where R is the number of
relevant documents))
    Exact    0.3332

[Figure: Recall-Precision Curve -- precision vs. recall, 0.0 to 1.0]
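R-precision, as defined above, is precision measured at a cutoff equal to the query's own number of relevant documents. A sketch with invented toy data:

```python
def r_precision(ranking, relevant):
    """Precision after R documents retrieved, where R is the number of
    relevant documents for this query. At this cutoff, precision and
    recall are equal."""
    r = len(relevant)
    return sum(1 for doc in ranking[:r] if doc in relevant) / r

# Toy example: R = 3 relevant docs; 2 appear in the top 3 ranks.
ranking = ["d3", "d7", "d1", "d9"]
relevant = {"d3", "d1", "d2"}
print(r_precision(ranking, relevant))  # 2/3
```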
[Figure: Fallout-Recall Curve -- recall vs. fallout x 100, 0 to 0.2]

Fallout Level Averages
Fallout (x 100)   Recall
0.00              0.3395
0.02              0.4263
0.04              0.4791
0.06              0.5212
0.08              0.5500
0.10              0.5747
0.12              0.5939
0.14              0.6096
0.16              0.6255
0.18              0.6388
0.20              0.6530
[Figure: Normal Deviate -- Fallout-Recall, axes -3 to 3 with probability
labels 0.001, 0.02, 0.16, 0.5, 0.84, 0.98]
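The normal-deviate plot rescales each axis through the inverse CDF of the standard normal distribution (the probit transform), so a probability of 0.84 lands at roughly deviate +1 and 0.98 at roughly +2, matching the axis labels. A sketch using the Python standard library:

```python
from statistics import NormalDist

def probit(p):
    """Map a probability (e.g. a recall or fallout value) to its
    standard normal deviate via the inverse normal CDF."""
    return NormalDist().inv_cdf(p)

# Axis labels from the plot: probability -> approximate deviate.
for p in (0.02, 0.16, 0.5, 0.84, 0.98):
    print(f"{p:>5} -> {probit(p):+.2f}")
```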