NIST Special Publication 500-215: The Second Text REtrieval Conference (TREC-2)
Appendix A: TREC-2 Results
National Institute of Standards and Technology
D. K. Harman
University of California, Los Angeles
Summary Statistics

Run Number:     uclafi (category B, feedback, frozen evaluation)
Num of Queries: 50
Total number of documents over all queries
    Retrieved:  50473
    Relevant:   3929
    Rel_ret:    3047

Recall-Precision Curve: [figure] precision (y, 0 to 1) versus recall (x, 0 to 0.8).
Recall Level Averages

Recall  Precision
0.00    0.8125
0.10    0.5951
0.20    0.5218
0.30    0.4113
0.40    0.3484
0.50    0.2813
0.60    0.2289
0.70    0.1915
0.80    0.1351
0.90    0.0764
1.00    0.0105

Average precision over all relevant docs
    non-interpolated: 0.3091
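The 11-point table above follows the TREC convention of interpolated precision: the precision reported at recall level r is the maximum precision observed at any recall >= r, computed per query and then averaged over the 50 queries. A minimal per-query sketch, assuming the ranked results are given as a boolean list rather than the actual trec_eval input format:

```python
def precision_at_recall_levels(ranking, num_relevant):
    """Interpolated precision at the 11 standard recall levels
    (0.0, 0.1, ..., 1.0) for a single query.

    ranking: list of booleans, True where the document retrieved
             at that rank is relevant (hypothetical input format).
    num_relevant: total number of relevant documents for the query.
    """
    # Raw (recall, precision) points, one after each relevant document.
    points = []
    rel_so_far = 0
    for rank, is_rel in enumerate(ranking, start=1):
        if is_rel:
            rel_so_far += 1
            points.append((rel_so_far / num_relevant, rel_so_far / rank))

    # Interpolate: precision at level r = max precision at recall >= r.
    levels = [i / 10 for i in range(11)]
    interpolated = []
    for r in levels:
        candidates = [prec for rec, prec in points if rec >= r]
        interpolated.append(max(candidates) if candidates else 0.0)
    return interpolated
```

Averaging these 11-element vectors over all queries yields a table of the form shown above; the non-interpolated average precision line is a separate statistic (mean precision at each relevant document, not interpolated).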
Recall at document cutoffs (cutoff labels partly illegible in the scan; reconstructed as 100 to 600 docs in steps of 50 from the two legible entries):

At docs  Recall
100      0.3425
150      0.4483
200      0.5041
250      0.5489
300      0.5898
350      0.6216
400      0.6431
450      0.6653
500      0.6864
550      0.7030
600      0.7112
Document Level Averages

Precision
At 5 docs     0.5880
At 10 docs    0.5220
At 15 docs    0.4893
At 20 docs    0.4640
At 30 docs    0.4360
At 100 docs   0.2884
At 200 docs   0.1992
At 500 docs   0.1058
At 1000 docs  0.0609

R-Precision (precision after R docs retrieved, where R is the number of relevant documents)
    Exact:    0.3459
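The document-level precision figures and R-Precision are both cutoff precisions over a single ranked list per query, averaged over the 50 queries. A short sketch under the same assumed boolean-list input (illustrative function names, not the official trec_eval implementation):

```python
def precision_at_k(ranking, k):
    """Fraction of the top-k retrieved documents that are relevant.

    ranking: list of booleans, True at ranks holding relevant
             documents (hypothetical input format).
    """
    return sum(ranking[:k]) / k

def r_precision(ranking, num_relevant):
    """Precision after R documents retrieved, where R is the number
    of relevant documents for the query. At this cutoff recall and
    precision are equal, which makes R-Precision a convenient
    single-number summary.
    """
    return precision_at_k(ranking, num_relevant)
```

Each table entry above is the mean of the corresponding per-query value over all queries in the run.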
Fallout-Recall Curve: [figure] recall (y, 0 to 0.6) versus fallout x 100 (x, 0 to 0.2), plotted on a normal deviate scale (probability gridlines 0.001 to 0.98).