NIST Special Publication 500-215: The Second Text REtrieval Conference (TREC-2)
Appendix A: TREC-2 Results
National Institute of Standards and Technology
D. K. Harman
Thinking Machines Corp.
Summary Statistics
Run Number            TMC9-full, automatic
Num of Queries        50
Total number of documents over all queries
    Retrieved:    49998
    Relevant:     10185
    Rel_ret:      6181
Recall-Precision Curve (figure)
Recall Level Averages
Recall    Precision
0.00 0.5141
0.10 0.3692
0.20 0.3195
0.30 0.2104
0.40 0.2203
0.50 0.1113
0.60 0.1144
0.70 0.0682
0.80 0.0357
0.90 0.0104
1.00 0.0000
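The precision values at the 11 standard recall levels above are interpolated: precision at recall level r is taken as the maximum precision observed at any recall of r or higher. A minimal sketch of that computation, using a hypothetical ranking and relevance set (not data from this run):

```python
# Sketch of 11-point interpolated recall-precision computation.
# `ranking` and `relevant` below are hypothetical toy data.

def interpolated_precision(ranking, relevant, levels):
    """ranking: doc ids in rank order; relevant: set of relevant doc ids."""
    hits = 0
    points = []  # (recall, precision) after each retrieved document
    for i, doc in enumerate(ranking, start=1):
        if doc in relevant:
            hits += 1
        points.append((hits / len(relevant), hits / i))
    # interpolated precision at level r = max precision at any recall >= r
    result = []
    for r in levels:
        candidates = [p for rec, p in points if rec >= r]
        result.append(max(candidates) if candidates else 0.0)
    return result

levels = [i / 10 for i in range(11)]              # 0.00, 0.10, ..., 1.00
ranking = ["d1", "d2", "d3", "d4", "d5"]          # hypothetical run output
relevant = {"d1", "d3", "d5"}                     # hypothetical judgments
print(interpolated_precision(ranking, relevant, levels))
```

The interpolation makes the curve monotonically non-increasing, which is why the tabulated precision never rises as recall grows (apart from small averaging artifacts across the 50 queries).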
Average precision over all
relevant docs
    non-interpolated    0.1136
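Non-interpolated average precision is the mean of the precision at each relevant document's rank (counting zero for relevant documents never retrieved), averaged over all queries. A sketch with toy data, not TMC9's:

```python
# Sketch of non-interpolated average precision for a single query.
# Toy ranking and judgments; relevant docs not retrieved contribute 0
# because the sum only covers retrieved relevant docs while the
# denominator is the full relevant count.

def average_precision(ranking, relevant):
    hits = 0
    total = 0.0
    for i, doc in enumerate(ranking, start=1):
        if doc in relevant:
            hits += 1
            total += hits / i    # precision at this relevant doc's rank
    return total / len(relevant)

print(average_precision(["d1", "d2", "d3", "d4"], {"d1", "d3", "d4"}))
```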
Document Level Averages
Precision
At 5 docs 0.2800
At 10 docs 0.3180
At 15 docs 0.3320
At 20 docs 0.3450
At 30 docs 0.3501
At 100 docs 0.3144
At 200 docs 0.2661
At 500 docs 0.1840
At 1000 docs 0.1237
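Each document-level figure above is the fraction of the top N retrieved documents that are relevant, averaged over the 50 queries. A sketch for a single query with hypothetical data:

```python
# Sketch of precision at a document cutoff N (hypothetical data).

def precision_at(ranking, relevant, n):
    """Fraction of the top n retrieved documents that are relevant."""
    return sum(1 for doc in ranking[:n] if doc in relevant) / n

ranking = ["d1", "d2", "d3", "d4", "d5", "d6"]   # hypothetical run output
relevant = {"d1", "d4", "d6"}                    # hypothetical judgments
print(precision_at(ranking, relevant, 5))        # 2 of the top 5 are relevant
```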
R-Precision (precision after R docs retrieved, where R is the number of relevant documents)
    Exact    0.2436
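R-Precision, as defined above, is simply precision at a cutoff equal to the query's own number of relevant documents. A sketch with toy data:

```python
# Sketch of R-Precision: precision after R docs retrieved, where R is
# the number of relevant documents for the query (toy data).

def r_precision(ranking, relevant):
    r = len(relevant)
    return sum(1 for doc in ranking[:r] if doc in relevant) / r

print(r_precision(["d1", "d2", "d3", "d4"], {"d2", "d4"}))
```

Because the cutoff adapts to each query, R-Precision is less sensitive than fixed-cutoff precision to queries with very few or very many relevant documents.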
NORMAL DEVIATE - Fallout-Recall (figure)
Fallout Level Averages
Recall
0.2125
0.3846
0.4610
0.5126
0.5540
0.5827
0.5904
0.5904
0.5904
0.5904
0.5904
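Fallout, the quantity these recall averages are indexed by, is the fraction of all nonrelevant documents in the collection that have been retrieved at a given cutoff. A sketch with hypothetical counts:

```python
# Sketch of the fallout measure (hypothetical counts, not TMC9's).

def fallout(nonrel_retrieved, total_nonrel):
    """Fraction of the collection's nonrelevant docs that were retrieved."""
    return nonrel_retrieved / total_nonrel

# e.g. 100 nonrelevant docs retrieved from a pool of 40,000 nonrelevant docs
print(fallout(100, 40000))
```

On collections of this size, fallout stays tiny even for deep cutoffs, which is why the fallout axis on the curve below is scaled by 100.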
Fallout-Recall Curve (figure: recall vs. fallout x 100)
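The normal-deviate version of the fallout-recall plot places each probability p at the position of the standard normal inverse CDF at p, so probabilities 0.02, 0.16, 0.5, 0.84, and 0.98 fall near deviates -2, -1, 0, +1, and +2. A sketch of that axis transform:

```python
from statistics import NormalDist

# Sketch of the normal-deviate axis scale: each probability is mapped
# through the standard normal inverse CDF.

def deviate(p):
    return NormalDist().inv_cdf(p)

for p in (0.02, 0.16, 0.5, 0.84, 0.98):
    print(f"{p:4.2f} -> {deviate(p):+.2f}")
```

On these axes a system whose recall and fallout both follow normal distributions of document scores traces out a straight line, which makes runs easier to compare visually.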