NIST Special Publication 500-215: The Second Text REtrieval Conference (TREC-2)
Appendix A: TREC-2 Results
National Institute of Standards and Technology
D. K. Harman
Results - New York University
Summary Statistics
Run Number: nyuir3 - category B, automatic
Number of Queries: 50
Total number of documents over all queries
    Retrieved: 49811
    Relevant: 3929
    Rel_ret: 3281
Recall Level Averages
Recall Precision
0.00 0.7528
0.10 0.5514
0.20 0.4724
0.30 0.4076
0.40 0.3621
0.50 0.3142
0.60 0.2711
0.70 0.2237
0.80 0.1691
0.90 0.0916
1.00 0.0160
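The precision figures at the 11 standard recall levels above follow the usual TREC interpolation convention: precision at recall level r is the maximum precision observed at any recall point at or above r. A minimal sketch of that computation for a single query (the function name and flag representation are illustrative, not the actual TREC-2 evaluation code):

```python
def interpolated_precision_at_levels(ranked_rel_flags, num_relevant, levels=None):
    """Interpolated precision at standard recall levels for one query.

    ranked_rel_flags: booleans, True where the document retrieved at that
    rank is relevant (rank 1 first).  Precision at recall level r is taken
    as the maximum precision at any observed recall >= r (TREC convention).
    """
    if levels is None:
        levels = [i / 10 for i in range(11)]  # 0.0, 0.1, ..., 1.0
    # (recall, precision) after each rank where a relevant doc appears.
    points = []
    rel_so_far = 0
    for rank, is_rel in enumerate(ranked_rel_flags, start=1):
        if is_rel:
            rel_so_far += 1
            points.append((rel_so_far / num_relevant, rel_so_far / rank))
    result = []
    for level in levels:
        candidates = [p for r, p in points if r >= level]
        result.append(max(candidates) if candidates else 0.0)
    return result
```

The per-query curves are then averaged over the 50 queries to produce a table like the one above.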
Average precision (non-interpolated) over all rel docs: 0.3118
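The non-interpolated average precision reported above is, in standard TREC usage, the mean of the precision values at each rank where a relevant document is retrieved, with unretrieved relevant documents contributing zero. A minimal single-query sketch (helper name is illustrative):

```python
def average_precision(ranked_rel_flags, num_relevant):
    """Non-interpolated average precision for one query.

    Sums precision at each rank holding a relevant document, then divides
    by the total number of relevant documents, so relevant documents that
    were never retrieved count as zero.
    """
    if num_relevant == 0:
        return 0.0
    total = 0.0
    rel_so_far = 0
    for rank, is_rel in enumerate(ranked_rel_flags, start=1):
        if is_rel:
            rel_so_far += 1
            total += rel_so_far / rank
    return total / num_relevant
```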
Document Level Averages
Precision
At 5 docs 0.5360
At 10 docs 0.4880
At 15 docs 0.4101
At 20 docs 0.4410
At 30 docs 0.4080
At 100 docs 0.3094
At 200 docs 0.2140
At 500 docs 0.1140
At 1000 docs 0.0656
R-Precision (precision after R docs retrieved, where R is the number of relevant documents)
    Exact: 0.3321

[Figure: Recall-Precision Curve (Precision vs. Recall)]
[Figure: Fallout-Recall Curve (Recall vs. Fallout x 100)]
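The R-Precision defined above can be sketched for a single query as follows (a minimal illustration under the stated definition, not the actual evaluation code):

```python
def r_precision(ranked_rel_flags, num_relevant):
    """Precision after R documents retrieved, where R is the number of
    relevant documents for the query (ranked_rel_flags: booleans, True
    where the document at that rank is relevant, rank 1 first)."""
    if num_relevant == 0:
        return 0.0
    top_r = ranked_rel_flags[:num_relevant]
    return sum(top_r) / num_relevant
```

Because the cutoff equals the number of relevant documents, a perfect ranking scores 1.0; the reported figure is the average of this value over the 50 queries.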
[Figure: NORMAL DEVIATE - Fallout-Recall (recall vs. fallout on normal-deviate scales)]
Recall averages at 11 increasing fallout levels:
0.3412  0.4885  0.5629  0.6220  0.6586  0.6991  0.7206  0.7445  0.7581  0.7744  0.7866
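Fallout, the x-axis of the curves above, is the fraction of all nonrelevant documents that have been retrieved at a given cutoff. A minimal single-query sketch (function name and inputs are illustrative assumptions):

```python
def fallout(ranked_rel_flags, num_nonrelevant, cutoff):
    """Fallout at a rank cutoff: nonrelevant documents retrieved in the
    top `cutoff` ranks, divided by the total number of nonrelevant
    documents in the collection for this query."""
    if num_nonrelevant == 0:
        return 0.0
    nonrel_retrieved = sum(1 for is_rel in ranked_rel_flags[:cutoff] if not is_rel)
    return nonrel_retrieved / num_nonrelevant
```

Because the nonrelevant set is very large relative to the retrieved set, fallout values are tiny, which is why the curve's axis is scaled as "Fallout x 100".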