NIST Special Publication 500-215: The Second Text REtrieval Conference (TREC-2)
Appendix A: TREC-2 Results
National Institute of Standards and Technology
D. K. Harman
Results - Cornell University
Summary Statistics
Run Number: crnlR1, full, automatic
Number of Queries: 50
Total number of documents over all queries
    Retrieved: 50000
    Relevant: 10489
    Rel_ret: 7164
Recall Level Averages
Recall Precision
0.00 0.8268
0.10 0.7060
0.20 0.6282
0.30 0.5636
0.40 0.4900
0.50 0.4246
0.60 0.3416
0.70 0.2483
0.80 0.1543
0.90 0.0615
1.00 0.0062
Average precision over all relevant docs
    non-interpolated: 0.3952
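For readers reproducing these figures: both tables above report per-query measures averaged over the 50 queries. A minimal Python sketch of the per-query computation (illustrative names, not the official evaluation code; ranked_ids is one query's ranked list of document IDs and relevant_ids the set of judged-relevant IDs):

def average_precision(ranked_ids, relevant_ids):
    """Non-interpolated average precision over all relevant docs:
    precision at each relevant document's rank, summed and divided
    by the total number of relevant documents, so relevant documents
    that are never retrieved contribute zero."""
    hits = 0
    total = 0.0
    for rank, doc_id in enumerate(ranked_ids, start=1):
        if doc_id in relevant_ids:
            hits += 1
            total += hits / rank
    return total / len(relevant_ids)

def interpolated_precision_at_levels(ranked_ids, relevant_ids):
    """Interpolated precision at the 11 standard recall levels
    (0.0, 0.1, ..., 1.0): at each level, the maximum precision
    observed at any rank whose recall is >= that level."""
    levels = [i / 10 for i in range(11)]
    R = len(relevant_ids)
    hits = 0
    points = []  # (recall, precision) after each retrieved document
    for rank, doc_id in enumerate(ranked_ids, start=1):
        if doc_id in relevant_ids:
            hits += 1
            points.append((hits / R, hits / rank))
    return [max((p for r, p in points if r >= level), default=0.0)
            for level in levels]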
Document Level Averages
Precision
At 5 docs 0.6800
At 10 docs 0.6780
At 15 docs 0.6640
At 20 docs 0.6560
At 30 docs 0.6241
At 100 docs 0.4950
At 200 docs 0.3914
At 500 docs 0.2526
At 1000 docs 0.1553
R-Precision (precision after R docs retrieved, where R is the number of relevant documents)
    Exact: 0.4213
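The document-level figures and R-precision are both cutoff measures on the same ranked list; a sketch under the same assumptions as above (the Document Level Averages report the mean of precision_at over the 50 queries):

def precision_at(ranked_ids, relevant_ids, k):
    """Precision after k documents retrieved: the fraction of the
    top k that are relevant."""
    return sum(1 for d in ranked_ids[:k] if d in relevant_ids) / k

def r_precision(ranked_ids, relevant_ids):
    """Precision after R documents retrieved, where R is the number
    of relevant documents for the query."""
    return precision_at(ranked_ids, relevant_ids, len(relevant_ids))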
[Recall-Precision Curve: Precision (0 to 1) vs. Recall (0 to 1)]
Fallout Level Averages
Fallout Recall
0.000 0.4126
0.020 0.5260
0.040 0.6041
0.060 0.6485
0.080 0.6786
0.100 0.7056
0.120 0.7241
0.140 0.7433
0.160 0.7588
0.180 0.7685
0.200 0.7794

[Fallout-Recall Curve: Recall (0 to 1) vs. Fallout x 100 (0 to 0.2)]
[Normal Deviate - Fallout-Recall: recall vs. fallout probabilities plotted on normal-deviate scales, -3 to 3]
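Fallout complements recall by measuring the fraction of the collection's non-relevant documents that have been retrieved. A sketch of tracing a per-query fallout-recall curve (collection_size, i.e. the total number of documents in the collection, and the other names are illustrative assumptions):

def fallout_recall_curve(ranked_ids, relevant_ids, collection_size):
    """After each retrieved document, record fallout (non-relevant
    retrieved / total non-relevant in the collection) and recall
    (relevant retrieved / total relevant). The plotted fallout axis
    above is scaled by 100."""
    R = len(relevant_ids)
    nonrelevant = collection_size - R
    hits = 0
    curve = []
    for rank, doc_id in enumerate(ranked_ids, start=1):
        if doc_id in relevant_ids:
            hits += 1
        curve.append(((rank - hits) / nonrelevant, hits / R))
    return curve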