NIST Special Publication 500-215: The Second Text REtrieval Conference (TREC-2)
Appendix A: TREC-2 Results
National Institute of Standards and Technology
D. K. Harman
Results - Thinking Machines Corp.
Summary Statistics

Run Number: TMC8-full, automatic
Num of Queries: 50
Total number of documents over all queries
    Retrieved:    49998
    Relevant:     10185
    Rel_ret:       6497

[Figure: Recall-Precision Curve]
Recall Level Averages
    Recall    Precision
     0.00      0.5482
     0.10      0.4005
     0.20      0.3411
     0.30      0.2911
     0.40      0.2525
     0.50      0.1914
     0.60      0.1341
     0.70      0.0855
     0.80      0.0509
     0.90      0.0125
     1.00      0.0000
Average precision over all rel docs
    non-interpolated    0.1939
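The two tables above follow the standard trec_eval-style definitions: precision interpolated at the 11 standard recall levels, and non-interpolated average precision over all relevant documents. A minimal Python sketch of those computations for a single query, with illustrative names (`ranking`, `relevant`) that are not taken from the actual TREC evaluation software:

```python
def average_precision(ranking, relevant):
    """Non-interpolated average precision: mean of the precision values
    observed at the rank of each relevant document retrieved."""
    hits, total = 0, 0.0
    for rank, doc in enumerate(ranking, start=1):
        if doc in relevant:
            hits += 1
            total += hits / rank          # precision at this relevant doc
    return total / len(relevant) if relevant else 0.0

def interpolated_precision(ranking, relevant, levels=11):
    """Precision at the 11 standard recall levels (0.0, 0.1, ..., 1.0),
    interpolated as the maximum precision at any recall >= the level."""
    hits = 0
    pr = []                               # (recall, precision) at each hit
    for rank, doc in enumerate(ranking, start=1):
        if doc in relevant:
            hits += 1
            pr.append((hits / len(relevant), hits / rank))
    out = []
    for i in range(levels):
        level = i / (levels - 1)
        best = max((p for r, p in pr if r >= level), default=0.0)
        out.append(best)
    return out
```

In the tables these per-query values are then averaged over the 50 queries. Note that the interpolation rule is why precision at recall 0.00 is nonzero while precision at recall 1.00 can be 0.0000 (no query reaches full recall within the retrieved set).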
Fallout Level Averages
    Fallout     Recall
    [OCRerr]     0.2962
    [OCRerr]     0.4158
    [OCRerr]     0.4942
    [OCRerr]     0.5490
    [OCRerr]     0.5911
    [OCRerr]     0.6191
    [OCRerr]     0.6260
    [OCRerr]     0.6260
    [OCRerr]     0.6260
    [OCRerr]     0.6260
    [OCRerr]     0.6260
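Fallout is the fraction of the collection's nonrelevant documents that have been retrieved at a given cutoff, so recall-at-fallout saturates once all relevant documents found by the run lie above the cutoff. A minimal sketch of the fallout computation for one query; the function name and the `collection_size` parameter are illustrative, not from the TREC software:

```python
def fallout(ranking, relevant, collection_size, k):
    """Fallout after k documents retrieved: nonrelevant retrieved
    divided by the total number of nonrelevant documents."""
    nonrel_retrieved = sum(1 for doc in ranking[:k] if doc not in relevant)
    total_nonrel = collection_size - len(relevant)
    return nonrel_retrieved / total_nonrel
```

Because the collection is far larger than the 1000-document retrieval cutoff, fallout stays very small, which is why the curve below plots fallout x 100.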
Document Level Averages
    Precision
    At    5 docs    0.3520
    At   10 docs    0.3680
    At   15 docs    0.3733
    At   20 docs    0.3180
    At   30 docs    0.3820
    At  100 docs    0.3342
    At  200 docs    0.2835
    At  500 docs    0.1942
    At 1000 docs    0.1299

R-Precision (precision after R docs retrieved, where R is the number of relevant documents)
    Exact    0.2612
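The cutoff precisions and R-Precision above reduce to the same simple count; a minimal sketch for one query, with illustrative function names:

```python
def precision_at(ranking, relevant, k):
    """Precision after k documents retrieved."""
    hits = sum(1 for doc in ranking[:k] if doc in relevant)
    return hits / k

def r_precision(ranking, relevant):
    """Precision after R docs retrieved, where R is the number of
    relevant documents for this query."""
    r = len(relevant)
    return precision_at(ranking, relevant, r) if r else 0.0
```

R-Precision is the point where precision equals recall, which makes it a convenient single-number summary alongside the averaged cutoffs in the table.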
[Figure: Fallout-Recall Curve (normal deviate scales; horizontal axis is Fallout x 100, 0 to 0.2; vertical axis is Recall)]