NIST Special Publication 500-215: The Second Text REtrieval Conference (TREC-2)
Appendix A: TREC-2 Results
National Institute of Standards and Technology
D. K. Harman
lsiasm - Bellcore
Summary Statistics

Run Number:     lsiasm-full, automatic
Num of Queries: 50
Total number of documents over all queries
    Retrieved: 50000
    Relevant:  10785
    Rel_ret:   7869
Recall Level Averages
Recall Precision
0.00 0.7127
0.10 0.5122
0.20 0.4698
0.30 0.4241
0.40 0.3753
0.50 0.3379
0.60 0.2758
0.70 0.2073
0.80 0.1331
0.90 0.0443
1.00 0.0000
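The interpolated precision at each recall level above is, per query, the maximum precision observed at any rank whose recall is at or beyond that level. A minimal sketch of the per-query computation (function and variable names are illustrative; the official figures come from the evaluation software, not this code):

```python
def eleven_point_interpolated(ranking, relevant):
    """Interpolated precision at recall 0.0, 0.1, ..., 1.0 for one query.

    Interpolated precision at recall level r is the maximum precision
    observed at any rank whose recall is >= r."""
    hits = 0
    points = []  # (recall, precision) after each retrieved document
    for rank, doc in enumerate(ranking, start=1):
        if doc in relevant:
            hits += 1
        points.append((hits / len(relevant), hits / rank))
    return [max((p for r, p in points if r >= lvl), default=0.0)
            for lvl in (i / 10 for i in range(11))]

# Toy ranking with relevant documents at ranks 1 and 3:
curve = eleven_point_interpolated(["d1", "d2", "d3"], {"d1", "d3"})
```

The table reports these eleven values averaged over the 50 queries.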
Average precision over all
relevant docs
    non-interpolated  0.3018
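Non-interpolated average precision is the mean, over a query's relevant documents, of the precision at each relevant document's rank, with relevant documents never retrieved contributing zero. A sketch under that definition (names are illustrative):

```python
def average_precision(ranking, relevant):
    """Non-interpolated average precision for one query."""
    if not relevant:
        return 0.0
    hits, total = 0, 0.0
    for rank, doc in enumerate(ranking, start=1):
        if doc in relevant:
            hits += 1
            total += hits / rank  # precision at this relevant doc's rank
    return total / len(relevant)  # unretrieved relevant docs contribute 0

# Relevant docs at ranks 1 and 3: (1/1 + 2/3) / 2
ap = average_precision(["d1", "d2", "d3"], {"d1", "d3"})
```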
Fallout Level Averages
Fallout  Recall
0.000    0.4272
0.020    0.5560
0.040    0.6362
0.060    0.6927
0.080    0.7323
0.100    0.7493
0.120    0.7540
0.140    0.7540
0.160    0.7540
0.180    0.7540
0.200    0.7540
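Fallout is the fraction of the collection's non-relevant documents that have been retrieved; the table gives recall averaged over queries at fixed fallout levels. A sketch of the per-query quantity (the collection size and names here are illustrative):

```python
def fallout(retrieved, relevant, collection_size):
    """Fraction of the collection's non-relevant documents
    that appear in the retrieved set."""
    nonrel_retrieved = sum(1 for d in retrieved if d not in relevant)
    nonrel_total = collection_size - len(relevant)
    return nonrel_retrieved / nonrel_total if nonrel_total else 0.0

# 2 of the collection's 10 non-relevant documents retrieved:
f = fallout(["d1", "d2", "d3"], {"d1"}, collection_size=11)
```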
Document Level Averages
Precision
At 5 docs 0.5240
At 10 docs 0.5020
At 15 docs 0.5027
At 20 docs 0.4950
At 30 docs 0.4873
At 100 docs 0.4306
At 200 docs 0.3716
At 500 docs 0.2469
At 1000 docs 0.1574
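Each row above is precision after a fixed number of retrieved documents, averaged over the 50 queries. Per query this is simply (an illustrative sketch):

```python
def precision_at(k, ranking, relevant):
    """Precision after the top k retrieved documents."""
    return sum(1 for d in ranking[:k] if d in relevant) / k

# 1 relevant document in the top 2 retrieved:
p = precision_at(2, ["d1", "d2", "d3"], {"d1", "d3"})
```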
R-Precision (precision after R
documents retrieved, where R is
the number of relevant documents)
    Exact  0.3580
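Following the parenthetical definition above, R-Precision can be sketched as follows (names are illustrative; the official value is produced by the evaluation software):

```python
def r_precision(ranking, relevant):
    """Precision after R documents are retrieved, where R is the
    number of relevant documents for the query."""
    r = len(relevant)
    if r == 0:
        return 0.0
    return sum(1 for d in ranking[:r] if d in relevant) / r

# R = 2 and the top 2 retrieved contain 1 relevant document:
rp = r_precision(["d1", "d2", "d3", "d4"], {"d1", "d3"})
```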
[Figure: Recall-Precision Curve (Precision vs. Recall)]

[Figure: Fallout-Recall Curve (Recall vs. Fallout x 100, axes on a normal-deviate scale)]