NIST Special Publication 500-215: The Second Text REtrieval Conference (TREC-2)
Appendix A: TREC-2 Results
D. K. Harman, editor
National Institute of Standards and Technology

Queens College, CUNY
Summary Statistics
Run Number:      pircs3-full, automatic
Num of Queries:  50
Total number of documents over all queries
    Retrieved:   50000
    Relevant:    10185
    Rel_ret:     7456
Recall Level Averages
Recall      Precision
0.00 0.7538
0.10 0.5529
0.20 0.4127
0.30 0.4088
0.40 0.3531
0.50 0.2994
0.60 0.2311
0.70 0.1677
0.80 0.1094
0.90 0.0551
1.00 0.0145
Average precision over all
relevant docs
    non-interpolated    0.2949
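
For reference, the figures above are the standard ranked-retrieval measures: interpolated precision at recall level r is the maximum precision at any recall point >= r, and non-interpolated average precision is the mean of the precision values at each relevant document's rank, with relevant documents never retrieved contributing zero. The published numbers average these per-query values over the 50 topics. A minimal Python sketch for a single query, with a hypothetical ranking and judgment set:

    def eleven_point_and_ap(ranking, relevant):
        """Interpolated precision at the 11 standard recall levels, plus
        non-interpolated average precision, for one query.

        ranking: document ids in rank order; relevant: set of relevant ids.
        Both inputs are hypothetical stand-ins for a real run and qrels."""
        hits, points = 0, []
        for rank, doc in enumerate(ranking, start=1):
            if doc in relevant:
                hits += 1
                points.append((hits / len(relevant), hits / rank))  # (recall, prec)
        # Non-interpolated AP: mean precision at relevant-document ranks,
        # counting relevant documents never retrieved as precision 0.
        ap = sum(p for _, p in points) / len(relevant)
        # Interpolated precision at level r: max precision at any recall >= r.
        eleven = [max((p for r, p in points if r >= lvl), default=0.0)
                  for lvl in (x / 10 for x in range(11))]
        return eleven, ap

    # Toy example: a 10-document ranking, 4 relevant documents in total.
    ranking = ["d3", "d7", "d1", "d9", "d4", "d2", "d8", "d5", "d6", "d0"]
    relevant = {"d3", "d9", "d2", "d5"}
    eleven, ap = eleven_point_and_ap(ranking, relevant)
    print([f"{p:.2f}" for p in eleven], f"AP = {ap:.4f}")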
Fallout Level Averages
Fallout (x100)   Recall
0.00             0.4118
0.02             0.5411
0.04             0.6092
0.06             0.6608
0.08             0.6990
0.10             0.7245
0.12             0.7288
0.14             0.7288
0.16             0.7288
0.18             0.7288
0.20             0.7288
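
Fallout is the fraction of the collection's nonrelevant documents that have been retrieved, so this table reads as the recall achieved by the time a given proportion of nonrelevant material has been admitted; the plateau at 0.7288 reflects the 1000-document-per-query retrieval cutoff. A rough sketch of recall-at-fallout for one query (the collection size, fallout levels, and inputs are illustrative, not the official trec_eval settings):

    def recall_at_fallout(ranking, relevant, num_nonrelevant, levels):
        """Recall reached before fallout exceeds each requested level, where
        fallout = nonrelevant retrieved / nonrelevant in the collection."""
        hits = nonrel = 0
        curve = [(0.0, 0.0)]  # (fallout, recall) after each retrieved doc
        for doc in ranking:
            if doc in relevant:
                hits += 1
            else:
                nonrel += 1
            curve.append((nonrel / num_nonrelevant, hits / len(relevant)))
        # Recall never decreases, so take the best point within each level.
        return [max(r for f, r in curve if f <= lvl) for lvl in levels]

    # Toy example: 2000-document collection, 4 relevant, top 10 shown.
    ranking = ["d3", "x1", "d9", "x2", "x3", "d2", "x4", "x5", "d5", "x6"]
    relevant = {"d3", "d9", "d2", "d5"}
    print(recall_at_fallout(ranking, relevant, num_nonrelevant=1996,
                            levels=[0.0005, 0.001, 0.002, 0.003]))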
Document Level Averages
                 Precision
At 5 docs 0.5640
At 10 docs 0.5560
At 15 docs 0.5467
At 20 docs 0.5460
At 30 docs 0.5200
At 100 docs 0.4466
At 200 docs 0.3660
At 500 docs 0.2342
At 1000 docs 0.1491
R-Precision (precision after R docs
retrieved (where R is the number of
relevant documents))
    Exact    0.3437
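
Both cutoff precision and R-Precision follow directly from a ranking plus its relevance judgments; R-Precision is simply precision at cutoff R. A minimal sketch with the same hypothetical inputs as above:

    def precision_at(ranking, relevant, k):
        """Fraction of the top-k retrieved documents that are relevant;
        retrieving fewer than k documents counts the gaps as nonrelevant."""
        return sum(1 for doc in ranking[:k] if doc in relevant) / k

    def r_precision(ranking, relevant):
        """Precision after R documents, R = total relevant for the query."""
        return precision_at(ranking, relevant, len(relevant))

    ranking = ["d3", "d7", "d1", "d9", "d4", "d2", "d8", "d5", "d6", "d0"]
    relevant = {"d3", "d9", "d2", "d5"}
    for k in (5, 10):
        print(f"P@{k:<3} = {precision_at(ranking, relevant, k):.4f}")
    print(f"R-Prec = {r_precision(ranking, relevant):.4f}")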
[Figure: Recall-Precision Curve. Precision (y, 0 to 1) plotted against recall (x, 0 to 1).]

[Figure: Fallout-Recall Curve, plotted on a normal deviate scale. Recall (y) against fallout x 100 (x, 0 to 0.2); probability ticks 0.02, 0.16, 0.84, 0.98, and 0.999 correspond to deviates -2 through +3.]
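
The normal deviate scale used in the second plot maps each probability p on an axis to the inverse of the standard normal CDF, Phi^-1(p), which is why the ticks 0.02, 0.16, 0.84, 0.98, and 0.999 sit near the deviates -2, -1, +1, +2, and +3. A small sketch of the transform using Python's statistics.NormalDist:

    from statistics import NormalDist

    ndist = NormalDist()  # standard normal distribution

    def deviate(p):
        """Normal deviate (probit) of probability p: Phi^-1(p)."""
        return ndist.inv_cdf(p)

    # Where the curve's probability ticks land on the deviate axis:
    for p in (0.02, 0.16, 0.5, 0.84, 0.98, 0.999):
        print(f"{p:>6} -> {deviate(p):+.2f}")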