NIST Special Publication 500-215: The Second Text REtrieval Conference (TREC-2)
Appendix A: TREC-2 Results
National Institute of Standards and Technology
D. K. Harman
Results - Advanced Decision Systems
Summary Statistics
Run Number         ADS2 - ap only, automatic
Number of Queries  50
Total number of documents over all queries
    Retrieved:  33034
    Relevant:   5671
    Rel_ret:    1468
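For orientation, these totals imply an overall recall of about 0.26 and roughly 661 retrieved documents per topic; a quick check of that arithmetic (the derived figures are not reported on this page):

    # Derived from the summary statistics above; not values reported by TREC.
    retrieved = 33034   # documents retrieved over all 50 topics
    relevant = 5671     # relevant documents over all 50 topics
    rel_ret = 1468      # relevant documents actually retrieved

    overall_recall = rel_ret / relevant   # ~0.259
    avg_retrieved = retrieved / 50        # ~661 documents per topic
    print(f"overall recall {overall_recall:.3f}, retrieved per topic {avg_retrieved:.0f}")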
Recall-Precision Curve (figure: precision vs. recall)
Recall Level Averages
Recall Precision
0.00 0.2199
0.10 0.1572
0.20 0.1304
0.30 0.1160
0.40 0.0885
0.50 0.0884
0.60 0.0114
0.70 0.0612
0.80 0.0362
0.90 0.0201
1.00 0.0196
Average precision over all relevant docs
    non-interpolated   0.0821
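The recall-level figures above use the standard TREC interpolation (precision at a recall level is the maximum precision at any recall point at or above that level), and the non-interpolated average precision is the mean of the precision values observed as each relevant document is retrieved, with unretrieved relevant documents contributing zero, averaged over the 50 topics. A minimal per-topic sketch of those two computations (simplified Python; not the actual trec_eval implementation, and the function names are illustrative):

    def recall_precision_points(ranking, relevant_docs, total_relevant):
        """(recall, precision) after each relevant document retrieved."""
        points, rel_seen = [], 0
        for rank, doc in enumerate(ranking, start=1):
            if doc in relevant_docs:
                rel_seen += 1
                points.append((rel_seen / total_relevant, rel_seen / rank))
        return points

    def interpolated_precision(points, level):
        """Maximum precision at any recall >= level (0.0 if none)."""
        return max((p for r, p in points if r >= level), default=0.0)

    def average_precision(points, total_relevant):
        """Non-interpolated AP; unretrieved relevant docs count as 0."""
        return sum(p for _, p in points) / total_relevant

The per-topic values are then averaged over the 50 topics to produce the table.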
Fallout Level Averages
Fallout    Recall
0.000      0.0506
0.020      0.0817
0.040      0.1070
0.060      0.1248
0.080      0.1403
0.100      0.1568
0.120      0.1643
0.140      0.1687
0.160      0.1742
0.180      0.1801
0.200      0.1813
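Fallout is conventionally the fraction of all nonrelevant documents in the collection that have been retrieved; because TREC collections are large these values are very small, and the curve at the end of this page plots fallout multiplied by 100. A minimal sketch under that standard definition (the example numbers are illustrative, not taken from this run):

    def fallout(nonrel_retrieved, total_nonrelevant):
        """Fraction of the collection's nonrelevant documents retrieved."""
        return nonrel_retrieved / total_nonrelevant

    # Illustration: 200 nonrelevant documents retrieved out of 100,000
    # nonrelevant documents gives fallout 0.002, i.e. 0.2 on a "x 100" axis.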
Document Level Averages
               Precision
At 5 docs      0.1280
At 10 docs     0.1260
At 15 docs     0.1320
At 20 docs     0.1300
At 30 docs     0.1300
At 100 docs    0.1118
At 200 docs    0.0838
At 500 docs    0.0530
At 1000 docs   0.0294
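The document-level figures are precision after fixed cutoffs in the ranking, averaged over the 50 topics; a minimal per-topic sketch (illustrative code, not trec_eval itself):

    def precision_at(ranking, relevant_docs, k):
        """Fraction of the top-k retrieved documents that are relevant."""
        return sum(1 for doc in ranking[:k] if doc in relevant_docs) / k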
R-Precision (precision after R docs retrieved, where R is the number of relevant documents)
    Exact    0.1092
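R-Precision applies the same cutoff idea with the cutoff set per topic to R, that topic's number of relevant documents; a sketch under the same assumptions:

    def r_precision(ranking, relevant_docs):
        """Precision after R documents, where R = number of relevant docs."""
        r = len(relevant_docs)
        return sum(1 for doc in ranking[:r] if doc in relevant_docs) / r if r else 0.0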
Fallout-Recall Curve (figure: recall vs. fallout x 100, plotted on normal-deviate (probability) axes)
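The normal-deviate version of the fallout-recall curve replots both axes on an inverse-normal (probit) scale, so probability values such as 0.02, 0.16, 0.50, 0.84 and 0.98 fall at deviates of roughly -2, -1, 0, 1 and 2. A minimal sketch of that axis transform (assuming the standard probit mapping and using SciPy for the inverse normal CDF):

    from scipy.stats import norm

    def normal_deviate(p):
        """Map a recall or fallout value in (0, 1) to its normal deviate."""
        return norm.ppf(p)   # e.g. 0.16 -> about -0.99, 0.50 -> 0.0, 0.84 -> about 0.99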