NIST Special Publication 500-215: The Second Text REtrieval Conference (TREC-2)
Appendix A: TREC-2 Results
National Institute of Standards and Technology
D. K. Harman
Results - Thinking Machines Corp.
Summary Statistics
Run Number TMC1-full1, automatic
Num of Queries 50
Total number of documents over all queries
Retrieved: 46122
Relevant: 10489
Rel_ret: 5112
Recall Level Averages
Recall Precision
0.00 0.5085
0.10 0.3828
0.20 0.3318
0.30 0.3064
0.40 0.2632
0.50 0.1964
0.60 0.1581
0.70 0.1240
0.80 0.0184
0.90 0.0453
1.00 0.0202
Average precision over all
relevant docs
non-interpolated 0.2045
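
The non-interpolated average precision reported above is the standard measure used throughout this appendix: the precision at the rank of each relevant document retrieved, summed and divided by the total number of relevant documents for the query, then averaged over queries. A minimal sketch of that computation for a single query (the function name and ranked-list interface are illustrative, not taken from this appendix):

    def average_precision(ranked_doc_ids, relevant_ids):
        """Non-interpolated average precision for one query.

        ranked_doc_ids: documents in ranked retrieval order.
        relevant_ids:   set of all relevant documents for the query.
        """
        hits = 0
        precision_sum = 0.0
        for rank, doc_id in enumerate(ranked_doc_ids, start=1):
            if doc_id in relevant_ids:
                hits += 1
                precision_sum += hits / rank  # precision at this relevant doc
        # Relevant documents never retrieved contribute zero precision.
        return precision_sum / len(relevant_ids) if relevant_ids else 0.0
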
Document Level Averages
Precision
At 5 docs 0.3120
At 10 docs 0.3360
At 15 docs 0.3341
At 20 docs 0.3360
At 30 docs 0.3281
At 100 docs 0.2920
At 200 docs 0.2440
At 500 docs 0.1688
At 1000 docs 0.1142
R-Precision (precision after R docs retrieved, where R is the number of relevant documents)
Exact 0.2564
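
Per the definition above, R-Precision is the precision after exactly R documents have been retrieved, where R is the number of relevant documents for the query. A minimal sketch under the same illustrative interface as the earlier example:

    def r_precision(ranked_doc_ids, relevant_ids):
        """Precision at rank R, where R = number of relevant documents."""
        r = len(relevant_ids)
        if r == 0:
            return 0.0
        top_r = ranked_doc_ids[:r]
        return sum(1 for doc_id in top_r if doc_id in relevant_ids) / r
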
[Figure: Recall-Precision Curve - Precision vs. Recall]
Fallout Level Averages
Fallout Recall
0.00 0.2333
0.20 0.3224
0.40 0.3114
0.60 0.4123
0.80 0.4433
1.00 0.4121
1.20 0.4961
1.40 0.5158
1.60 0.5360
1.80 0.5482
2.00 0.5611
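
The fallout-recall figures pair recall with fallout, the fraction of non-relevant documents in the collection that have been retrieved (the curve below plots fallout scaled by 100). A minimal sketch of both quantities for one query, assuming the collection size is known and at least one relevant document exists; the parameter names are illustrative:

    def fallout_and_recall(ranked_doc_ids, relevant_ids, collection_size):
        """Fallout and recall after the full ranked list has been retrieved."""
        retrieved = len(ranked_doc_ids)
        rel_ret = sum(1 for doc_id in ranked_doc_ids if doc_id in relevant_ids)
        nonrel_in_collection = collection_size - len(relevant_ids)
        fallout = (retrieved - rel_ret) / nonrel_in_collection
        recall = rel_ret / len(relevant_ids)
        return fallout, recall
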
[Figure: Fallout-Recall Curve - Recall vs. Fallout x 100, plotted on normal deviate scales]