NIST Special Publication 500-215: The Second Text REtrieval Conference (TREC-2)
Appendix A: TREC-2 Results
National Institute of Standards and Technology
D. K. Harman
University of Massachusetts at Amherst
Summary Statistics
Run Number: INQ002-full, manual
Num of Queries: 50
Total number of documents over all queries
    Retrieved: 50000
    Relevant: 10185
    Rel_ret: 8165
Recall Level Averages
    Recall   Precision
    0.00     0.7162
    0.10     0.6299
    0.20     0.5604
    0.30     0.4920
    0.40     0.4331
    0.50     0.3722
    0.60     0.3133
    0.70     0.2268
    0.80     0.1545
    0.90     0.0815
    1.00     0.0122
Average precision over all relevant docs
    non-interpolated: 0.3565
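The recall-level figures above are 11-point interpolated precision values of the kind produced by trec_eval-style scoring. As a minimal sketch of how such a table is typically derived from a single ranked list (hypothetical toy data, not the INQ002 run, and not the exact TREC-2 evaluation code):

```python
def interpolated_precision_at(recall_levels, ranked_rels, num_relevant):
    """Interpolated precision: at each recall level r, report the
    maximum precision achieved at any rank whose recall is >= r."""
    hits = 0
    points = []  # (recall, precision) after each retrieved document
    for rank, rel in enumerate(ranked_rels, start=1):
        hits += rel
        points.append((hits / num_relevant, hits / rank))
    result = []
    for level in recall_levels:
        candidates = [p for r, p in points if r >= level]
        result.append(max(candidates) if candidates else 0.0)
    return result

levels = [i / 10 for i in range(11)]  # 0.00, 0.10, ..., 1.00
# hypothetical ranked list: 1 = relevant, 0 = not; 3 relevant docs total
print(interpolated_precision_at(levels, [1, 0, 1, 0, 0, 1], 3))
```

Averaging these per-query vectors over the 50 queries yields a table like the one above; interpolation is why the precision column is non-increasing.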
Recall Level Averages
    Recall: 0.4860, 0.6161, 0.6885, 0.7359, 0.7639, 0.7802,
            0.7859, 0.7859, 0.7859, 0.7859, 0.7859
Document Level Averages
    Precision
    At    5 docs: 0.6000
    At   10 docs: 0.6160
    At   15 docs: 0.6040
    At   20 docs: 0.6140
    At   30 docs: 0.5880
    At  100 docs: 0.5058
    At  200 docs: 0.4089
    At  500 docs: 0.2626
    At 1000 docs: 0.1633
R-Precision (precision after R docs retrieved, where R is the number of relevant documents)
    Exact: 0.3954
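R-Precision, as defined above, needs only the ranked list and R for each query. A minimal sketch on hypothetical data (not tied to this run):

```python
def r_precision(ranked_rels, num_relevant):
    """Precision after exactly R documents retrieved, where R is the
    number of relevant documents for the query (1 = relevant, 0 = not)."""
    return sum(ranked_rels[:num_relevant]) / num_relevant

# hypothetical query with 4 relevant documents in the collection
print(r_precision([1, 1, 0, 1, 0, 0, 1], 4))  # 3 of the first 4 are relevant -> 0.75
```

The "Exact" value above is this quantity averaged over the 50 queries; because the cutoff adapts to R per query, R-Precision summarizes a run in one number without fixing a document cutoff.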
[Figure: Recall-Precision Curve; Precision vs. Recall, 0 to 1]
[Figure: Fallout-Recall Curve, plotted on normal deviate scales; Recall vs. Fallout x 100, 0 to 0.2]