NIST Special Publication 500-215: The Second Text REtrieval Conference (TREC-2)
Appendix A: TREC-2 Results
National Institute of Standards and Technology
D. K. Harman
Results - University of Central Florida
Summary Statistics
Run Number: UCFIS1 - category B, automatic
Num of Queries: 50
Total number of documents over all queries
    Retrieved: 50000
    Relevant: 2064
    Rel_ret: 102
Recall-Precision Curve
Recall Level Averages
Recall Precision
0.00 0.0891
0.10 0.0286
0.20 0.0062
0.30 0.0015
0.40 0.0015
0.50 0.0011
0.60 0.0000
0.70 0.0000
0.80 0.0000
0.90 0.0000
1.00 0.0000
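The recall-level figures above are interpolated precisions: at each recall cutoff, the value reported is the highest precision achieved at any recall point at or beyond that cutoff. A minimal sketch of that rule (function and document names are illustrative, not the actual TREC evaluation code):

```python
def interpolated_precision_at(recall_levels, ranked, relevant):
    """Interpolated precision at fixed recall levels for one query.

    ranked    -- document ids in retrieval order
    relevant  -- set of relevant document ids
    """
    r = len(relevant)
    hits = 0
    pr = []  # (recall, precision) after each retrieved document
    for i, doc in enumerate(ranked, start=1):
        if doc in relevant:
            hits += 1
        pr.append((hits / r, hits / i))
    # Interpolation: max precision at any recall >= the cutoff.
    return [max((p for rec, p in pr if rec >= level), default=0.0)
            for level in recall_levels]
```

The appendix tables average these per-query values over the 50 queries at the 11 standard cutoffs (0.00 through 1.00).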
Average precision over all relevant docs
    non-interpolated: 0.0060
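The non-interpolated average precision is the mean of the precision values observed at each relevant document's rank, counting zero for relevant documents never retrieved. A sketch with illustrative names:

```python
def average_precision(ranked, relevant):
    """Non-interpolated average precision for one query.

    Sums precision at the rank of each retrieved relevant document,
    then divides by the total number of relevant documents (so
    unretrieved relevant documents contribute zero).
    """
    hits = 0
    total = 0.0
    for i, doc in enumerate(ranked, start=1):
        if doc in relevant:
            hits += 1
            total += hits / i
    return total / len(relevant) if relevant else 0.0
```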
Document Level Averages
Precision
At 5 docs 0.0400
At 10 docs 0.0260
At 15 docs 0.0187
At 20 docs 0.0170
At 30 docs 0.0127
At 100 docs 0.0070
At 200 docs 0.0039
At 500 docs 0.0024
At 1000 docs 0.0020
R-Precision (precision after R docs retrieved, where R is the number of relevant documents)
    Exact: 0.0128
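The R-precision definition in the parenthetical reduces to a few lines; this sketch uses illustrative names:

```python
def r_precision(ranked, relevant):
    """Precision after R documents are retrieved, where R is the
    number of relevant documents for the query."""
    r = len(relevant)
    if r == 0:
        return 0.0
    return sum(1 for doc in ranked[:r] if doc in relevant) / r
```

Because the cutoff adapts to each query's number of relevant documents, R-precision equals 1.0 only for a perfect ranking of that query.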
[Plot: Recall-Precision Curve; precision (y-axis) vs. recall (x-axis)]
Fallout-Recall Curve
[Plot: recall (y-axis) vs. fallout (x-axis)]
Fallout Level Averages
Fallout (x 100)  Recall
0.00             0.0307
0.02             0.0339
0.04             0.0369
0.06             0.0314
0.08             0.0384
0.10             0.0427
0.12             0.0427
0.14             0.0427
0.16             0.0436
0.18             0.0436
0.20             0.0439
NORMAL DEVIATE - Fallout-Recall
[Plot: recall (y-axis, normal-deviate scale, -3 to 3) vs. fallout x 100 (x-axis, 0 to 0.2)]
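The normal-deviate axes map a proportion p to the inverse CDF of the standard normal (the probit transform), so that 0.5 sits at 0, roughly 0.84 at +1, and roughly 0.02 at -2. A sketch using the Python standard library (assuming Python 3.8+ for statistics.NormalDist):

```python
from statistics import NormalDist

def normal_deviate(p):
    """Probit transform: inverse CDF of the standard normal
    distribution, the scale used on both normal-deviate axes."""
    return NormalDist().inv_cdf(p)
```

On this scale a fallout-recall curve from scores with normally distributed relevant/nonrelevant populations plots as a straight line, which is why the appendix includes the transformed view.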