NIST Special Publication 500-215: The Second Text REtrieval Conference (TREC-2)
Appendix A: TREC-2 Results
National Institute of Standards and Technology
D. K. Harman
"ults - New York University
Summary Statistics
Run Number: nyuir1 (category B, automatic)
Num of Queries: 50
Total number of documents over all queries
    Retrieved: 49884
    Relevant: 3929
    Rel_ret: 2983
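These pooled counts fix the run's overall recall and precision. A minimal sketch in Python using the figures above (variable names are illustrative, not from the TREC evaluation software):

    retrieved = 49884   # documents retrieved over all 50 queries
    relevant = 3929     # relevant documents over all queries
    rel_ret = 2983      # relevant documents actually retrieved

    overall_recall = rel_ret / relevant      # ~0.759
    overall_precision = rel_ret / retrieved  # ~0.060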
Recall Level Averages
Recall Precision
0.00 0.7013
0.10 0.4874
0.20 0.4326
0.30 0.3531
0.40 0.3016
0.50 0.2637
0.60 0.2175
0.70 0.1617
0.80 0.1176
0.90 0.0684
1.00 0.0102
Average precision over all relevant docs
    non-interpolated: 0.2649
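The figure above is the standard non-interpolated average precision: for each query, precision is recorded at the rank of every relevant document retrieved (relevant documents never retrieved contribute zero) and averaged over the relevant documents. A hedged one-query sketch in Python (names are illustrative, not from the TREC evaluation software):

    def average_precision(ranking, relevant):
        """ranking: doc ids in ranked order; relevant: set of relevant ids."""
        hits, precision_sum = 0, 0.0
        for rank, doc in enumerate(ranking, start=1):
            if doc in relevant:
                hits += 1
                precision_sum += hits / rank
        return precision_sum / len(relevant) if relevant else 0.0

Averaging this value over all 50 queries gives the 0.2649 reported above.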
Fallout Level Averages
Fallout (x 100)  Recall
0.00  0.2825
0.02  0.4257
0.04  0.4979
0.06  0.5494
0.08  0.5768
0.10  0.6112
0.12  0.6395
0.14  0.6574
0.16  0.6757
0.18  0.6905
0.20  0.7014
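The table pairs fallout levels with average recall; fallout is the fraction of the collection's non-relevant documents that have been retrieved. The 11 equally spaced fallout levels shown above are an assumption, chosen to match the 0 to 0.2 axis of the Fallout-Recall Curve below. A sketch of the measure under that definition, with illustrative names:

    def fallout(retrieved, relevant, collection_size):
        """retrieved, relevant: sets of doc ids for one query."""
        nonrel_retrieved = len(retrieved - relevant)
        nonrel_total = collection_size - len(relevant)
        return nonrel_retrieved / nonrel_total if nonrel_total else 0.0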
Document Level Averages
Precision
At 5 docs 0.4920
At 10 docs 0.4420
At 15 docs 0.4240
At 20 docs 0.4050
At 30 docs 0.3640
At 100 docs 0.2720
At 200 docs 0.1886
At 500 docs 0.1026
At 1000 docs 0.0597
R-Precision (precision after R docs retrieved, where R is the number of relevant documents)
    Exact: 0.3003
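Both the document-level precision figures and R-Precision are cutoff measures over a single ranking. A minimal Python sketch (names are illustrative, not from the TREC evaluation software):

    def precision_at_k(ranking, relevant, k):
        """Fraction of the top k retrieved documents that are relevant."""
        return sum(1 for doc in ranking[:k] if doc in relevant) / k

    def r_precision(ranking, relevant):
        """Precision after R documents, where R = number of relevant docs."""
        return precision_at_k(ranking, relevant, len(relevant)) if relevant else 0.0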
[Figure: Recall-Precision Curve. Precision (0 to 1) plotted against recall (0 to 1).]
[Figure: Fallout-Recall Curve. Recall, on a normal deviate scale (0.001 to .999), plotted against fallout x 100 (0 to 0.2).]
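The second curve is drawn on a normal deviate scale: each probability p is mapped to the standard normal deviate z satisfying Phi(z) = p, so 0.5 plots at 0, 0.84 near +1, 0.98 near +2, and 0.001/.999 near -3/+3, matching the axis labels of the original figure. A sketch of the transform using only the Python standard library:

    from statistics import NormalDist

    def normal_deviate(p):
        """Map a probability in (0, 1) to its standard normal deviate."""
        return NormalDist().inv_cdf(p)

    for p in (0.001, 0.02, 0.16, 0.5, 0.84, 0.98, 0.999):
        print(f"{p:>6} -> {normal_deviate(p):+.2f}")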