NIST Special Publication 500-215: The Second Text REtrieval Conference (TREC-2)
Appendix A: TREC-2 Results
National Institute of Standards and Technology
D. K. Harman
Results - VPI&SU
Summary Statistics
Run Number:     VTcms2-full, manual
Num of Queries: 50
Total number of documents over all queries
    Retrieved: 50000
    Relevant:  10185
    Rel_ret:   1616
Recall Level Averages
Recall Precision
0.00 0.8693
0.10 0.6115
0.20 0.5409
0.30 0.4591
0.40 0.3834
0.50 0.3172
0.60 0.2451
0.70 0.1659
0.80 0.0909
0.90 0.0364
1.00 0.0000
Average precision over all relevant docs (non-interpolated): 0.320[OCRerr]
Document Level Averages
Precision
At 5 docs 0.6800
At 10 docs 0.6280
At 15 docs 0.6240
At 20 docs 0.6130
At 30 docs 0.5820
At 100 docs 0.4790
At 200 docs 0.3835
At 500 docs 0.2430
At 1000 docs 0.1535
R-Precision (precision after R docs retrieved, where R is the number of relevant documents)
    Exact: 0.3708
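The R-Precision measure defined above can be sketched in a few lines. This is a minimal illustration with a hypothetical ranked list and relevance set, not data from this run:

```python
# Sketch: R-Precision for a single query -- precision after R documents
# are retrieved, where R is the number of relevant documents.
# The document IDs below are hypothetical examples.

def r_precision(ranked_ids, relevant_ids):
    """Fraction of the top-R ranked documents that are relevant (R = |relevant_ids|)."""
    r = len(relevant_ids)
    if r == 0:
        return 0.0
    top_r = ranked_ids[:r]                     # cut the ranking at depth R
    hits = sum(1 for d in top_r if d in relevant_ids)
    return hits / r

# Hypothetical query: 4 relevant documents, 2 of them in the top 4 ranks.
ranked = ["d3", "d7", "d1", "d9", "d2", "d4"]
relevant = {"d1", "d2", "d4", "d9"}
print(r_precision(ranked, relevant))  # → 0.5
```

The reported "Exact" value is this quantity averaged over the 50 queries.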
Fallout Level Averages
Fallout  Recall
0.000    0.4410
0.020    0.5666
0.040    0.6420
0.060    0.6877
0.080    0.7179
0.100    0.7347
0.120    0.7414
0.140    0.7414
0.160    0.7414
0.180    0.7414
0.200    0.7414
[Figure: Recall-Precision Curve -- Precision (y-axis, 0 to 1) vs. Recall (x-axis, 0 to 1)]

[Figure: Fallout-Recall Curve -- Recall vs. Fallout x 100 (0 to 0.2), plotted on normal-deviate scales]