NIST Special Publication 500-215: The Second Text REtrieval Conference (TREC-2)
Appendix A: TREC-2 Results
National Institute of Standards and Technology
D. K. Harman
Results - University of Massachusetts at Amherst
Summary Statistics

Run Number: INQ004 - full, manual
Num of Queries: 50
Total number of documents over all queries
  Retrieved: 50000
  Relevant:  10489
  Rel_ret:   7386
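These totals determine the run's overall recall across all 50 queries: the fraction of judged-relevant documents that were actually retrieved, Rel_ret / Relevant. A minimal sketch of the arithmetic (values taken from the table above):

```python
relevant = 10489   # relevant documents over all 50 queries
rel_ret = 7386     # relevant documents actually retrieved

overall_recall = rel_ret / relevant
print(round(overall_recall, 4))  # -> 0.7042
```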
[Figure: Recall-Precision Curve (Precision vs. Recall)]

Recall Level Averages          Document Level Averages
Recall   Precision             Precision
0.00     0.8489                At    5 docs   0.6680
0.10     0.6450                At   10 docs   0.6580
0.20     0.5327                At   15 docs   0.6200
0.30     0.4821                At   20 docs   0.5940
0.40     0.4304                At   30 docs   0.5747
0.50     0.3633                At  100 docs   0.4474
0.60     0.3054                At  200 docs   0.3547
0.70     0.2325                At  500 docs   0.2360
0.80     0.1762                At 1000 docs   0.1477
0.90     0.1054
1.00     0.0151

Average precision over all relevant docs
  non-interpolated   0.3612
R-Precision (precision after R docs retrieved, where R is
the number of relevant documents)
  Exact              0.3951
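For readers unfamiliar with these measures, a minimal sketch of how precision at a document cutoff and R-Precision ("Exact" above) are computed from a ranked result list; all function and document names here are illustrative, not part of the TREC evaluation software:

```python
def precision_at_k(ranked_ids, relevant_ids, k):
    """Fraction of the top-k retrieved documents that are relevant."""
    top_k = ranked_ids[:k]
    hits = sum(1 for d in top_k if d in relevant_ids)
    return hits / k

def r_precision(ranked_ids, relevant_ids):
    """Precision after R documents are retrieved, where R is the
    number of relevant documents for the query."""
    r = len(relevant_ids)
    return precision_at_k(ranked_ids, relevant_ids, r)

# Toy example (not TREC data): 3 relevant documents.
ranked = ["d1", "d7", "d3", "d9", "d2"]
relevant = {"d1", "d3", "d2"}
print(precision_at_k(ranked, relevant, 5))  # 3 relevant in top 5 -> 0.6
print(r_precision(ranked, relevant))        # 2 relevant in top 3 -> 2/3
```

In the table, the per-cutoff values (At 5 docs, At 10 docs, ...) are these precisions averaged over the 50 queries.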
Fallout Level Averages

Fallout (x100)  Recall
0.000           0.3956
0.020           0.4966
0.040           0.5621
0.060           0.6083
0.080           0.6396
0.100           0.6649
0.120           0.6860
0.140           0.7025
0.160           0.7203
0.180           0.7313
0.200           0.7435
[Figure: Fallout-Recall Curve (Recall vs. Fallout x 100), shown on both linear and normal-deviate scales]