NIST Special Publication 500-215: The Second Text REtrieval Conference (TREC-2)
Appendix A: TREC-2 Results
National Institute of Standards and Technology
D. K. Harman
Results - University of Massachusetts at Amherst
Summary Statistics
Run Number: INQ001-full, automatic
Num of Queries 50
Total number of documents over all queries
Retrieved: 50000
Relevant: 10785
Rel_ret: 8281
Recall Level Averages
Recall Precision
0.00 0.8275
0.10 0.5979
0.20 0.5458
0.30 0.4832
0.40 0.4369
0.50 0.3758
0.60 0.3181
0.70 0.2382
0.80 0.1634
0.90 0.0825
1.00 0.0094
Average precision over all relevant docs
non-interpolated: 0.3556
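The recall-level table above lists interpolated precision at the 11 standard recall points, where the interpolated precision at recall r is the maximum precision observed at any recall >= r. A minimal sketch of the per-query computation, assuming the ranked document IDs and the set of relevant IDs are already in hand (function and variable names here are illustrative, not from the TREC evaluation software):

```python
def interpolated_precision_at_recall(ranked_ids, relevant_ids, levels=None):
    """Interpolated precision at standard recall levels:
    P_interp(r) is the maximum precision observed at any recall >= r."""
    if levels is None:
        levels = [i / 10 for i in range(11)]  # 0.0, 0.1, ..., 1.0
    relevant_ids = set(relevant_ids)
    hits = 0
    points = []  # (recall, precision) after each relevant doc is found
    for rank, doc in enumerate(ranked_ids, start=1):
        if doc in relevant_ids:
            hits += 1
            points.append((hits / len(relevant_ids), hits / rank))
    return [max((p for rec, p in points if rec >= r), default=0.0)
            for r in levels]
```

Averaging these 11-point vectors over the 50 queries yields a table like the one above.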
Fallout Level Averages
Fallout x 100   Recall
0.00    0.4769
0.02    0.6107
0.04    0.6913
0.06    0.7427
0.08    0.7713
0.10    0.7890
0.12    0.7933
0.14    0.7933
0.16    0.7933
0.18    0.7933
0.20    0.7933
Document Level Averages
Precision
At 5 docs 0.6160
At 10 docs 0.5960
At 15 docs 0.5880
At 20 docs 0.5840
At 30 docs 0.5733
At 100 docs 0.4934
At 200 docs 0.4099
At 500 docs 0.2665
At 1000 docs 0.1656
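The document-level averages above are plain precision-at-k values at fixed cutoffs. A sketch, assuming (as in the TREC tables) the denominator is always the cutoff k itself, even when fewer documents were retrieved:

```python
def precision_at_cutoffs(ranked_ids, relevant_ids,
                         cutoffs=(5, 10, 15, 20, 30, 100, 200, 500, 1000)):
    """Precision after k retrieved documents, for each cutoff k.
    Divides by k (TREC convention), not by the number actually retrieved."""
    relevant_ids = set(relevant_ids)
    return {k: sum(1 for d in ranked_ids[:k] if d in relevant_ids) / k
            for k in cutoffs}
```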
R-Precision (precision after R docs retrieved, where R is the number of relevant documents)
Exact: 0.3946
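The R-Precision measure defined above can be sketched per query as follows (illustrative names; a full evaluation would average this over all 50 queries):

```python
def r_precision(ranked_ids, relevant_ids):
    """Precision after R documents, where R is the number of relevant docs."""
    relevant_ids = set(relevant_ids)
    R = len(relevant_ids)
    if R == 0:
        return 0.0
    return sum(1 for d in ranked_ids[:R] if d in relevant_ids) / R
```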
[Figures: Recall-Precision Curve (Precision vs. Recall); Fallout-Recall Curve (Recall vs. Fallout x 100, 0-0.2); Normal Deviate Fallout-Recall plot]