NIST Special Publication 500-215: The Second Text REtrieval Conference (TREC-2)
Appendix A: TREC-2 Results
National Institute of Standards and Technology
D. K. Harman
Results - Bellcore
Summary Statistics

Run Number: lsial-full, automatic
Num of Queries: 50
Total number of documents over all queries
    Retrieved: 50000
    Relevant: 10785
    Rel_ret: 4756

[Figure: Recall-Precision Curve]
Recall Level Averages
Recall Precision
0.00 0.5897
0.10 0.3100
0.20 0.2392
0.30 0.1929
0.40 0.1482
0.50 0.1077
0.60 0.0616
0.70 0.0257
0.80 0.0159
0.90 0.0000
1.00 0.0000
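The precision figures in the Recall Level Averages table are interpolated at the eleven standard recall levels. A minimal sketch of that interpolation rule, assuming one query's ranked list of binary relevance judgments (the function name and inputs are illustrative; the official tables were produced by the TREC evaluation software):

```python
def precision_at_recall_levels(rel_flags, num_relevant):
    """Interpolated precision at recall levels 0.0, 0.1, ..., 1.0.

    rel_flags: ranked list of 0/1 relevance judgments for one query.
    num_relevant: total number of relevant documents for that query.
    """
    # Collect a (recall, precision) point after each relevant document.
    points, hits = [], 0
    for rank, rel in enumerate(rel_flags, start=1):
        if rel:
            hits += 1
            points.append((hits / num_relevant, hits / rank))
    levels = [i / 10 for i in range(11)]
    # Interpolation rule: precision at level r is the maximum precision
    # observed at any recall >= r (0.0 if no such point exists).
    return [max((p for r, p in points if r >= lvl), default=0.0)
            for lvl in levels]
```

Averaging these eleven values over the 50 queries yields the per-level figures tabulated above.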
Average precision over all relevant docs
    non-interpolated: 0.1307
Fallout Level Averages
Fallout Recall
0 0.2200
20 0.3034
40 0.3601
60 0.4050
80 0.4350
100 0.4576
120 0.4642
140 0.4642
160 0.4642
180 0.4642
200 0.4642
Document Level Averages
Precision
At 5 docs 0.3720
At 10 docs 0.3340
At 15 docs 0.3200
At 20 docs 0.3210
At 30 docs 0.3060
At 100 docs 0.2664
At 200 docs 0.2092
At 500 docs 0.1406
At 1000 docs 0.0951
R-Precision (precision after R docs retrieved, where R is the number of relevant documents)
    Exact: 0.1937
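The R-Precision figure follows the definition given above, and the non-interpolated average precision is the mean of the precision at each relevant document's rank (unretrieved relevant documents contribute zero). A minimal sketch of both, assuming one query's ranked list of binary relevance judgments (function names are illustrative, not from the TREC software):

```python
def r_precision(rel_flags, num_relevant):
    """Precision after R = num_relevant documents are retrieved
    (the 'Exact' figure in the R-Precision box)."""
    return sum(rel_flags[:num_relevant]) / num_relevant

def average_precision(rel_flags, num_relevant):
    """Non-interpolated average precision over all relevant docs:
    precision at each relevant document's rank, summed, divided by
    the total number of relevant documents for the query."""
    hits, total = 0, 0.0
    for rank, rel in enumerate(rel_flags, start=1):
        if rel:
            hits += 1
            total += hits / rank
    return total / num_relevant
```

The page's figures (0.1937 and 0.1307) are these per-query values averaged over the 50 queries.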
[Figure: Fallout-Recall Curve, plotted on a normal deviate scale; axes: Fallout x 100 vs. Recall]