NIST Special Publication 500-215: The Second Text REtrieval Conference (TREC-2)
Appendix A: TREC-2 Results
National Institute of Standards and Technology
D. K. Harman
Results - Bellcore
Summary Statistics
Run Number: lsir2-full, automatic
Num of Queries: 50
Total number of documents over all queries
    Retrieved: 50000
    Relevant: 10489
    Rel_ret: 1155
Recall Level Averages
Recall  Precision
0.00 0.8533
0.10 0.6270
0.20 0.5435
0.30 0.4122
0.40 0.4080
0.50 0.3546
0.60 0.2913
0.70 0.2032
0.80 0.1282
0.90 0.0717
1.00 0.0044
Average precision over all rel docs
non-interpolated  0.3442
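The non-interpolated average precision reported above is the mean, over all relevant documents for a query, of the precision at each rank where a relevant document is retrieved (relevant documents never retrieved contribute zero). A minimal sketch of the computation; the function name and doc-id list interface are illustrative assumptions, not part of the TREC appendix:

```python
def average_precision(ranked, relevant):
    """Non-interpolated average precision for one query.

    ranked: list of doc ids in retrieval order.
    relevant: collection of relevant doc ids.
    Sums precision at each rank holding a relevant document,
    then divides by the total number of relevant documents.
    """
    relevant = set(relevant)
    hits = 0
    precision_sum = 0.0
    for rank, doc in enumerate(ranked, start=1):
        if doc in relevant:
            hits += 1
            precision_sum += hits / rank
    return precision_sum / len(relevant) if relevant else 0.0
```

For example, with ranking ["d1", "d2", "d3", "d4"] and relevant set {"d1", "d3"}, the relevant documents appear at ranks 1 and 3, giving (1/1 + 2/3) / 2 = 5/6. The table value (0.3442) is this quantity averaged over all 50 queries.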
Fallout Level Averages
Recall
0.3478
0.4676
0.5269
0.5817
0.6136
0.6407
0.6632
0.6192
0.6973
0.7079
0.7174
Document Level Averages
Precision
At 5 docs 0.6960
At 10 docs 0.6660
At 15 docs 0.6253
At 20 docs 0.5960
At 30 docs 0.5740
At 100 docs 0.4524
At 200 docs 0.3562
At 500 docs 0.2274
At 1000 docs 0.1431
R-Precision (precision after R documents retrieved, where R is the number of relevant documents)
    Exact  0.3804
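R-Precision, as defined above, is simply precision measured at cutoff R, where R is the number of documents judged relevant for the query. A minimal sketch under the same illustrative doc-id list interface as assumed elsewhere:

```python
def r_precision(ranked, relevant):
    """Precision after R documents retrieved, where R is the
    number of relevant documents for the query."""
    relevant = set(relevant)
    r = len(relevant)
    if r == 0:
        return 0.0
    # Count relevant documents in the top R of the ranking.
    hits = sum(1 for doc in ranked[:r] if doc in relevant)
    return hits / r
```

For example, with ranking ["a", "b", "c", "d"] and relevant set {"a", "c"}, R = 2, the top 2 contain one relevant document, and R-Precision is 0.5. At this cutoff precision and recall are equal, which is why the table reports a single "Exact" value.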
[Figure: Normal Deviate Fallout-Recall plot]
[Figure: Recall-Precision Curve (x-axis: Recall, y-axis: Precision)]
[Figure: Fallout-Recall Curve (x-axis: Fallout x 100, y-axis: Recall)]
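The fallout-recall curve plots recall against fallout, the fraction of all non-relevant documents that were retrieved; the normal-deviate version rescales both axes by the inverse standard normal CDF, which tends to straighten such curves. A minimal sketch of the two quantities, using Python's standard-library `statistics.NormalDist`; the helper names are illustrative assumptions:

```python
from statistics import NormalDist

def fallout(num_nonrel_retrieved, total_nonrel):
    """Fallout: fraction of the collection's non-relevant
    documents that appear in the retrieved set."""
    return num_nonrel_retrieved / total_nonrel if total_nonrel else 0.0

def normal_deviate(p):
    """Map a proportion in (0, 1), such as recall or fallout,
    to its standard normal deviate via the inverse normal CDF.
    This is the axis transform of the normal-deviate plot."""
    return NormalDist().inv_cdf(p)
```

For example, a proportion of 0.5 maps to deviate 0, and the tick labels 0.02, 0.16, 0.84, and 0.98 on the normal-deviate axes correspond to deviates of roughly -2, -1, +1, and +2.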