SP500215
NIST Special Publication 500-215: The Second Text REtrieval Conference (TREC-2)
Appendix A: TREC-2 Results
National Institute of Standards and Technology
D. K. Harman
Results - University of Massachusetts at Amherst
Summary Statistics
Run Number: INQ003 - full, automatic
Num of Queries: 50
Total number of documents over all queries
  Retrieved: 50000
  Relevant: 10489
  Rel_ret: 1161
Recall Level Averages
Recall  Precision
0.00    0.7852
0.10    0.6315
0.20    0.5296
0.30    0.4762
0.40    0.4246
0.50    0.3621
0.60    0.2795
0.70    0.2309
0.80    0.1135
0.90    0.1016
1.00    0.0122
Average precision over all relevant docs
non-interpolated  0.3538
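As a minimal sketch of how the average precision figure above is defined (not NIST's actual evaluation code), the non-interpolated average precision for a single query can be computed from a ranked list; `ranking` and `relevant` here are hypothetical inputs, not TREC data:

```python
def average_precision(ranking, relevant):
    """Non-interpolated average precision for one query: the mean of
    precision@k taken at each rank k where a relevant document appears,
    normalized by the total number of relevant documents."""
    hits = 0
    precision_sum = 0.0
    for k, doc in enumerate(ranking, start=1):
        if doc in relevant:
            hits += 1
            precision_sum += hits / k
    return precision_sum / len(relevant) if relevant else 0.0

# Toy example: 2 relevant docs, found at ranks 1 and 3.
# precision@1 = 1/1, precision@3 = 2/3 -> AP = (1 + 2/3) / 2 = 5/6
ap = average_precision(["d1", "d9", "d4"], {"d1", "d4"})
```

The table's single number is this quantity averaged over all 50 queries.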
Fallout Level Averages
Fallout  Recall
0.00     0.3148
0.02     0.4880
0.04     0.5412
0.06     0.5823
0.08     0.6106
0.10     0.6341
0.12     0.6516
0.14     0.6780
0.16     0.6926
0.18     0.7063
0.20     0.7155
Document Level Averages
Precision
At 5 docs     0.6440
At 10 docs    0.6320
At 15 docs    0.6021
At 20 docs    0.5820
At 30 docs    0.5593
At 100 docs   0.4452
At 200 docs   0.3493
At 500 docs   0.2295
At 1000 docs  0.1432
R-Precision (precision after R docs retrieved, where R is the number of relevant documents)
Exact  0.3893

[Figure: Recall-Precision Curve - Precision (0 to 1) vs. Recall (0 to 1)]
[Figure: Fallout-Recall Curve - normal deviate scale, Recall vs. Fallout x 100 (0 to 0.2)]
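The R-Precision measure reported above can be sketched as follows; this is an illustrative implementation under the stated definition, with hypothetical `ranking` and `relevant` inputs rather than TREC data:

```python
def r_precision(ranking, relevant):
    """Precision after R documents are retrieved, where R is the number
    of relevant documents for the query."""
    r = len(relevant)
    if r == 0:
        return 0.0
    # Count relevant documents among the top R ranks.
    return sum(1 for doc in ranking[:r] if doc in relevant) / r

# Toy example: R = 3 relevant docs, 2 of them appear in the top 3 ranks.
rp = r_precision(["d1", "d7", "d3", "d2"], {"d1", "d2", "d3"})  # -> 2/3
```

Because the cutoff adapts to each query's number of relevant documents, R-Precision is then averaged over all queries to give the single "Exact" value in the table.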