NIST Special Publication 500-215: The Second Text REtrieval Conference (TREC-2)
Appendix A: TREC-2 Results
National Institute of Standards and Technology
D. K. Harman
Results - CITRI, Royal Melbourne Institute of Technology
Summary Statistics
Run Number citri1-full, automatic
Num of Queries 50
Total number of documents over all queries
Retrieved: 50000
Relevant: 10785
Rel_ret: 7439
Recall Level Averages
Recall Precision
0.00 0.8023
0.10 0.5298
0.20 0.4570
0.30 0.4027
0.40 0.3482
0.50 0.2951
0.60 0.2271
0.70 0.1514
0.80 0.0887
0.90 0.0379
1.00 0.0000
Average precision over all relevant docs
non-interpolated 0.2804
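These two statistics follow the standard TREC definitions: the precision listed at recall level r is interpolated (the maximum precision observed at any recall >= r), and the average precision is non-interpolated (the mean of the precision values at the ranks of the relevant documents, with unretrieved relevant documents contributing zero). A minimal Python sketch of both computations for a single query; the document ids in the example are hypothetical:

    def eleven_point_and_avg_prec(ranking, relevant):
        """ranking: doc ids in rank order; relevant: set of relevant ids."""
        R = len(relevant)
        rel_so_far = 0
        points = []       # (recall, precision) at each relevant doc's rank
        prec_at_rel = []
        for rank, doc in enumerate(ranking, start=1):
            if doc in relevant:
                rel_so_far += 1
                points.append((rel_so_far / R, rel_so_far / rank))
                prec_at_rel.append(rel_so_far / rank)
        # Non-interpolated average precision: unretrieved relevant
        # documents count as precision 0.
        avg_prec = sum(prec_at_rel) / R
        # Interpolated precision at level r: best precision at recall >= r.
        eleven = [max([p for rec, p in points if rec >= lvl / 10], default=0.0)
                  for lvl in range(11)]
        return eleven, avg_prec

    # Hypothetical example: 3 relevant docs, 2 retrieved in the top 4.
    eleven, ap = eleven_point_and_avg_prec(["d3", "d1", "d7", "d2"],
                                           {"d1", "d2", "d9"})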
Fallout Level Averages
Fallout (x 100) Recall
0.00 0.3934
0.02 0.5301
0.04 0.6112
0.06 0.6630
0.08 0.6972
0.10 0.7158
0.12 0.7192
0.14 0.7192
0.16 0.7192
0.18 0.7192
0.20 0.7192
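Fallout is the fraction of the collection's nonrelevant documents that have been retrieved, so each row gives the recall reached, averaged over the 50 queries, by the time fallout reaches that level; the values plateau once the 1000-document retrieval cutoff caps recall. A sketch of the per-query quantity, assuming a hypothetical collection_size and taking the level in raw (unscaled) fallout:

    def recall_at_fallout(ranking, relevant, collection_size, level):
        """Recall at the deepest rank where fallout stays <= level;
        fallout = nonrelevant retrieved / nonrelevant in collection."""
        total_nonrel = collection_size - len(relevant)
        rel = nonrel = 0
        recall = 0.0
        for doc in ranking:
            if doc in relevant:
                rel += 1
            else:
                nonrel += 1
                if nonrel / total_nonrel > level:
                    break
            recall = rel / len(relevant)
        return recall

Averaging this over all queries at raw fallout levels 0.0000 through 0.0020 (0.00 through 0.20 after the x 100 scaling used on the curve's axis) would reproduce the column above.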
Document Level Averages
Precision
At 5 docs 0.5680
At 10 docs 0.5320
At 15 docs 0.5173
At 20 docs 0.5140
At 30 docs 0.4913
At 100 docs 0.4224
At 200 docs 0.3541
At 500 docs 0.2340
At 1000 docs 0.1488
R-Precision (precision after R docs retrieved, where R is the number of relevant documents)
Exact 0.3359
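Document-level precision and R-Precision are both cutoff measures, differing only in where the cutoff falls: a fixed depth versus the per-query number of relevant documents. A minimal sketch (function names are ours):

    def precision_at(ranking, relevant, k):
        """Fraction of the top k retrieved docs that are relevant;
        ranks past the end of the ranking count as nonrelevant."""
        return sum(1 for doc in ranking[:k] if doc in relevant) / k

    def r_precision(ranking, relevant):
        """Precision after R docs retrieved, R = number of relevant docs."""
        return precision_at(ranking, relevant, len(relevant))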
[Figure: Recall-Precision Curve. Precision (0-1) vs. recall (0-1).]
[Figure: Fallout-Recall Curve. Recall vs. fallout x 100 (0-0.2), plotted on normal deviate scales.]
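The normal deviate (probit) scale places a probability p at z, the inverse of the standard normal CDF evaluated at p, which is why the curve's tick labels 0.001, 0.02, 0.16, 0.5, 0.84, and 0.98 line up with the deviates -3 through +2. A short Python check of that mapping:

    from statistics import NormalDist

    # Probit transform behind the normal-deviate axes.
    for p in (0.001, 0.02, 0.16, 0.5, 0.84, 0.98):
        print(f"{p:>5} -> z = {NormalDist().inv_cdf(p):+.2f}")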