SP500215
NIST Special Publication 500-215: The Second Text REtrieval Conference (TREC-2)
Appendix A: TREC-2 Results
National Institute of Standards and Technology
D. K. Harman
Results - New York University
Summary Statistics
Run Number       nyuir2 - category B, automatic
Num of Queries   50
Total number of documents over all queries
    Retrieved:   49876
    Relevant:    3929
    Rel_ret:     3274
[Figure: Recall-Precision Curve]
Recall Level Averages
Recall    Precision
0.00      0.7528
0.10      0.5567
0.20      0.4721
0.30      0.4060
0.40      0.3617
0.50      0.3135
0.60      0.2703
0.70      0.2231
0.80      0.1667
0.90      0.0915
1.00      0.0154
Average precision over all
relevant docs
    non-interpolated   0.3111
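Non-interpolated average precision averages the precision observed at the rank of each relevant document, over all relevant documents for the query. A minimal sketch of that computation (not the official trec_eval code; the function and variable names here are illustrative):

```python
def average_precision(ranking, relevant):
    """ranking: document ids in retrieval order; relevant: set of relevant ids."""
    hits = 0
    total = 0.0
    for rank, doc in enumerate(ranking, start=1):
        if doc in relevant:
            hits += 1
            total += hits / rank  # precision at the rank of this relevant doc
    return total / len(relevant) if relevant else 0.0
```

For a ranking ["a", "b", "c", "d"] with relevant set {"a", "c"}, this gives (1/1 + 2/3) / 2, about 0.833.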
Fallout Level Averages
Fallout   Recall
  [..]    0.3407
  [..]    0.4886
  [..]    0.5620
  [..]    0.6193
  [..]    0.6569
  [..]    0.7012
  [..]    0.7202
  [..]    0.7435
  [..]    0.7572
  [..]    0.7732
  [..]    0.7842
(fallout levels illegible in the scan)
Document Level Averages
Precision
  At    5 docs   0.5360
  At   10 docs   0.4880
  At   15 docs   0.4693
  At   20 docs   0.4390
  At   30 docs   0.4067
  At  100 docs   0.3094
  At  200 docs   0.2139
  At  500 docs   0.1137
  At 1000 docs   0.0655
R-Precision (precision after R
docs retrieved, where R is the
number of relevant documents)
  Exact          0.3320
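Precision at a document cutoff and R-Precision can be sketched as follows (illustrative names, assuming a ranked list of document ids and the set of relevant ids):

```python
def precision_at(ranking, relevant, k):
    """Fraction of the top-k retrieved documents that are relevant."""
    return sum(1 for doc in ranking[:k] if doc in relevant) / k

def r_precision(ranking, relevant):
    """Precision after R documents, where R is the number of relevant docs."""
    return precision_at(ranking, relevant, len(relevant))
```

For a ranking ["a", "x", "b", "y"] with relevant set {"a", "b", "y"}, precision at 2 docs is 0.5 and R-Precision is 2/3.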
[Figure: Fallout-Recall Curve (recall vs. fallout x 100)]
[Figure: Fallout-Recall plotted on normal-deviate axes]
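The normal-deviate scale on the last plot maps a probability p (recall or fallout) to the z-score at which the standard normal CDF equals p, which is why tick probabilities such as 0.16, 0.84, and 0.98 line up with deviates -1, 1, and 2. A sketch of that mapping using Python's standard library (function name is illustrative):

```python
from statistics import NormalDist

def normal_deviate(p):
    """Return z such that the standard normal CDF at z equals p."""
    return NormalDist().inv_cdf(p)
```

For example, normal_deviate(0.84) is roughly 0.99, matching the tick near deviate 1.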