NIST Special Publication 500-215: The Second Text REtrieval Conference (TREC-2)
Appendix A: TREC-2 Results
National Institute of Standards and Technology
D. K. Harman

Results - Universitaet Dortmund
Summary Statistics

Run Number:      dortP1-full, automatic
Num of Queries:  50

Total number of documents over all queries
  Retrieved: 50000
  Relevant:  10489
  Rel_ret:    7148

[Figure: Recall-Precision Curve]
Recall Level Averages            Document Level Averages
  Recall    Precision              Precision
  0.00      0.7899                 At 5 docs:    0.6600
  0.10      0.6598                 At 10 docs:   0.6380
  0.20      0.5844                 At 15 docs:   0.6387
  0.30      0.5331                 At 20 docs:   0.6300
  0.40      0.4765                 At 30 docs:   0.6080
  0.50      0.4134                 At 100 docs:  0.4816
  0.60      0.3289                 At 200 docs:  0.3904
  0.70      0.2502                 At 500 docs:  0.2500
  0.80      0.1665                 At 1000 docs: 0.1550
  0.90      0.0801
  1.00      0.0041

Average precision over all relevant docs
  non-interpolated: 0.3800

R-Precision (precision after R docs retrieved, where R is the number of relevant documents)
  Exact: 0.4195
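The document-level and R-Precision figures above are cutoff measures over a ranked result list. A minimal sketch of how they are computed, using a small hypothetical ranked list and relevance set rather than the actual TREC-2 run data:

```python
def precision_at_k(ranked, relevant, k):
    """Fraction of the top-k retrieved documents that are relevant
    (the 'At k docs' rows in the Document Level Averages table)."""
    return sum(1 for doc in ranked[:k] if doc in relevant) / k

def r_precision(ranked, relevant):
    """Precision after R docs retrieved, where R is the number of
    relevant documents for the query (the 'Exact' R-Precision value)."""
    r = len(relevant)
    return precision_at_k(ranked, relevant, r)

# Hypothetical example: a 10-document ranked run, 5 relevant documents.
ranked = ["d3", "d7", "d1", "d9", "d2", "d8", "d4", "d6", "d5", "d0"]
relevant = {"d3", "d1", "d2", "d4", "d5"}

print(precision_at_k(ranked, relevant, 5))  # 3 of the top 5 are relevant -> 0.6
print(r_precision(ranked, relevant))        # R = 5, so same cutoff here -> 0.6
```

In the appendix tables these values are averaged over all 50 queries for each cutoff.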
Fallout Level Averages (Recall)
  0.3933  0.5213  0.5807  0.7370  0.7511  0.7666  0.7743
  [fallout levels not legible in the scan]

[Figure: NORMAL DEVIATE - Fallout-Recall]
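Fallout is the nonrelevant counterpart of recall: the fraction of all nonrelevant documents in the collection that the run retrieved. A sketch under the same hypothetical-data assumption (the collection size and relevance set below are illustrative, not from the TREC-2 collection):

```python
def recall(retrieved, relevant):
    """Fraction of all relevant documents that were retrieved."""
    return len(set(retrieved) & relevant) / len(relevant)

def fallout(retrieved, relevant, collection_size):
    """Fraction of all nonrelevant documents in the collection
    that were retrieved."""
    nonrel_total = collection_size - len(relevant)
    nonrel_retrieved = len(set(retrieved) - relevant)
    return nonrel_retrieved / nonrel_total

# Hypothetical: 1000-document collection, 4 relevant docs,
# run retrieves the first 100 documents d0..d99.
retrieved = [f"d{i}" for i in range(100)]
relevant = {"d1", "d2", "d500", "d900"}

print(recall(retrieved, relevant))          # 2 of 4 relevant retrieved -> 0.5
print(fallout(retrieved, relevant, 1000))   # 98 of 996 nonrelevant -> ~0.098
```

The fallout-recall curve plots recall against fallout as the cutoff deepens; the normal-deviate plot shows the same curve on axes transformed by the inverse normal distribution.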
[Figure: Fallout-Recall Curve (Recall vs. Fallout x 100)]