NIST Special Publication 500-215: The Second Text REtrieval Conference (TREC-2)
Appendix A: TREC-2 Results
National Institute of Standards and Technology
D. K. Harman
Results - HNC, Inc.
Summary Statistics
Run Number        HNCrt1-full, automatic
Num of Queries    50
Total number of documents over all queries
    Retrieved:    49970
    Relevant:     10489
    Rel_ret:      5833
Recall Level Averages
Recall Precision
0.00 0.7867
0.10 0.5873
0.20 0.5083
0.30 0.4187
0.40 0.3311
0.50 0.2691
0.60 0.1962
0.70 0.1130
0.80 0.0451
0.90 0.0180
1.00 0.0023
Average precision over all relevant docs
    non-interpolated    0.2810
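
The recall-level averages above follow the usual TREC conventions: precision is interpolated at the eleven recall points (interpolated precision at recall level r is the maximum precision observed at any recall >= r), while the average precision over all relevant documents is non-interpolated. The sketch below shows both computations for a single query; the function names and the ranking/relevance inputs are illustrative assumptions, not taken from the official evaluation code, and the per-query values would still have to be averaged over the 50 queries to reproduce the table.

def eleven_point_interpolated(ranking, relevant):
    """Interpolated precision at recall 0.0, 0.1, ..., 1.0 for one query.

    ranking  -- list of document ids in ranked order
    relevant -- set of relevant document ids (assumed non-empty)
    """
    precisions, recalls = [], []
    hits = 0
    for rank, doc in enumerate(ranking, start=1):
        if doc in relevant:
            hits += 1
        precisions.append(hits / rank)
        recalls.append(hits / len(relevant))
    points = []
    for level in [x / 10 for x in range(11)]:
        # interpolated precision = max precision at any recall >= level
        candidates = [p for p, r in zip(precisions, recalls) if r >= level]
        points.append(max(candidates) if candidates else 0.0)
    return points

def average_precision(ranking, relevant):
    """Non-interpolated average precision over all relevant docs:
    the mean of precision at each relevant document's rank, with
    relevant documents that were never retrieved contributing zero."""
    hits, total = 0, 0.0
    for rank, doc in enumerate(ranking, start=1):
        if doc in relevant:
            hits += 1
            total += hits / rank
    return total / len(relevant)
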
Fallout Level Averages
Fallout    Recall
0.000      0.3211
0.020      0.3893
0.040      0.4274
0.060      0.4554
0.080      0.4791
0.100      0.5037
0.120      0.5152
0.140      0.5260
0.160      0.5373
0.180      0.5458
0.200      0.5504
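
Fallout is the fraction of the nonrelevant documents that have been retrieved, so the fallout-recall table and curve report how much recall is reached as nonrelevant material accumulates. The scaling of the fallout axis on this page (the curve is labelled "Fallout x 100") and the denominator used by the official tables are not spelled out here, so the sketch below works with raw fallout and a caller-supplied count of nonrelevant documents; the function name and inputs are illustrative assumptions.

def recall_at_fallout_levels(ranking, relevant, num_nonrelevant, levels):
    """Best recall reached at or below each fallout level, for one query.

    fallout = nonrelevant retrieved so far / num_nonrelevant, where
    num_nonrelevant is whatever denominator the evaluation adopts
    (an assumption here, not taken from the TREC-2 evaluation code).
    """
    best = {}
    rel_seen = nonrel_seen = 0
    for doc in ranking:
        if doc in relevant:
            rel_seen += 1
        else:
            nonrel_seen += 1
        fallout = nonrel_seen / num_nonrelevant
        recall = rel_seen / len(relevant)
        for level in levels:
            # keep the highest recall observed while fallout stays <= level
            if fallout <= level:
                best[level] = max(best.get(level, 0.0), recall)
    return [best.get(level, 0.0) for level in levels]
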
Document Level Averages
Precision
At 5 docs 0.6120
At 10 docs 0.6020
At 15 docs 0.5853
At 20 docs 0.5740
At 30 docs 0.5480
At 100 docs 0.4228
At 200 docs 0.3223
At 500 docs 0.1924
At 1000 docs 0.1167
R-Precision (precision after R docs retrieved (where R is the number of relevant documents))
Exact 0.3383
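
Precision at a document cutoff and R-Precision both reduce to counting relevant documents in a prefix of the ranking; R-Precision simply sets the cutoff to R, the number of relevant documents for the query. A minimal per-query sketch, using the same illustrative ranking/relevance inputs as above:

def precision_at(ranking, relevant, k):
    """Fraction of the top k retrieved documents that are relevant
    (the divisor is k even if fewer than k documents were retrieved)."""
    return sum(1 for doc in ranking[:k] if doc in relevant) / k

def r_precision(ranking, relevant):
    """Precision after R documents retrieved, where R is the number
    of relevant documents for the query."""
    return precision_at(ranking, relevant, len(relevant))

Averaging these per-query values over the 50 queries gives the document-level and R-Precision averages reported above.
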
[Figure: Recall-Precision Curve (Precision vs. Recall)]
[Figure: Fallout-Recall Curve (Recall vs. Fallout x 100)]
[Figure: Normal Deviate Fallout-Recall plot (normal-deviate scale, -3 to 3)]