SP500215
NIST Special Publication 500-215: The Second Text REtrieval Conference (TREC-2)
Appendix A: TREC-2 Results
National Institute of Standards and Technology
D. K. Harman
Results - New York University
Summary Statistics
Run Number: nyuir1 - category B, automatic
Num of Queries: 50
Total number of documents over all queries
Retrieved: 50000
Relevant: 2064
Rel_ret: 1390
Recall Level Averages
Recall Precision
0.00 0.5400
0.10 0.3937
0.20 0.3423
0.30 0.2512
0.40 0.2263
0.50 0.2032
0.60 0.1614
0.70 0.1295
0.80 0.0905
0.90 0.0442
1.00 0.0284
Average precision over all relevant docs (non-interpolated): 0.2038
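The precision figures at the eleven recall levels above are interpolated values: the precision reported at recall level r is the maximum precision observed at any actual recall of r or higher in the ranking. A minimal sketch of that interpolation (this is an illustrative reimplementation, not the official evaluation code; the function name and toy inputs are assumptions):

```python
def interpolated_precision_at_recall_levels(ranked_rel_flags, num_relevant):
    """Interpolated precision at the 11 standard recall points 0.0, 0.1, ..., 1.0.

    ranked_rel_flags: 1/0 relevance of each retrieved doc, in rank order.
    num_relevant: total number of relevant documents for the query.
    """
    # (recall, precision) observed after each retrieved document
    points = []
    hits = 0
    for rank, is_rel in enumerate(ranked_rel_flags, start=1):
        hits += is_rel
        points.append((hits / num_relevant, hits / rank))
    levels = [i / 10 for i in range(11)]
    # Interpolated precision at level r = max precision at any recall >= r.
    return [max((p for r, p in points if r >= level), default=0.0)
            for level in levels]

# Toy ranking: relevant docs at ranks 1 and 3, out of 2 relevant total.
print(interpolated_precision_at_recall_levels([1, 0, 1, 0, 0], num_relevant=2))
```

With this toy ranking, precision is 1.0 at recall 0.5 (rank 1) and 2/3 at recall 1.0 (rank 3), so the interpolated curve is 1.0 through the 0.5 level and 2/3 thereafter.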
Recall Level Averages
Recall
0.2413
0.2950
0.3314
0.3126
0.3872
0.4046
0.4181
0.4328
0.4415
0.4600
0.4698
Document Level Averages
Precision
At 5 docs 0.3360
At 10 docs 0.3240
At 15 docs 0.2933
At 20 docs 0.2790
At 30 docs 0.2401
At 100 docs 0.1412
At 200 docs 0.0939
At 500 docs 0.0489
At 1000 docs 0.0218
R-Precision (precision after R docs retrieved, where R is the number of relevant documents)
Exact 0.2267

[Recall-Precision Curve: Precision vs. Recall]
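R-Precision, as defined above, is the precision after exactly R documents have been retrieved, where R is the number of relevant documents for the query. A minimal sketch (an illustrative reimplementation, not the official evaluation code; the document identifiers are hypothetical):

```python
def r_precision(ranked_doc_ids, relevant_doc_ids):
    """Precision over the first R retrieved docs, R = number of relevant docs."""
    relevant = set(relevant_doc_ids)
    r = len(relevant)
    if r == 0:
        return 0.0
    # Count relevant documents appearing in the top R ranks.
    hits = sum(1 for d in ranked_doc_ids[:r] if d in relevant)
    return hits / r

# Hypothetical example: 3 relevant docs, 2 of them in the top 3 ranks -> 2/3.
print(r_precision(["d1", "d5", "d2", "d9"], {"d1", "d2", "d7"}))
```

Averaging this exact value over the 50 queries yields the single "Exact" figure reported above.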
[Fallout-Recall Curve: Recall vs. Fallout x 100]
[Normal Deviate - Fallout-Recall]