NIST Special Publication 500-215: The Second Text REtrieval Conference (TREC-2)
Appendix A: TREC-2 Results
National Institute of Standards and Technology
D. K. Harman
Results - Carnegie Mellon University
Summary Statistics
Run Number       CLARTM-full1 (manual)
Num of Queries   50
Total number of documents over all queries
    Retrieved:   50000
    Relevant:    10785
    Rel_ret:     8229
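The summary statistics above imply an overall recall figure for the run. A minimal sketch (not part of the official trec_eval output) of that arithmetic, using the counts reported above:

```python
# Overall recall implied by the summary statistics: the fraction of
# all relevant documents (over all 50 queries) that were retrieved.
relevant = 10785   # relevant documents over all queries
rel_ret = 8229     # relevant documents actually retrieved

overall_recall = rel_ret / relevant
print(f"{overall_recall:.4f}")  # 0.7630
```

Note this pools counts over all queries; the per-query averages tabulated below weight each query equally, so they need not match this pooled figure exactly.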
Recall Level Averages
Recall    Precision
0.00 0.7455
0.10 0.5811
0.20 0.5240
0.30 0.4622
0.40 0.4135
0.50 0.3593
0.60 0.2983
0.70 0.2312
0.80 0.1516
0.90 0.0814
1.00 0.0062
Average precision over all relevant docs
    interpolated    0.3383
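The recall-level table above reports 11-point interpolated precision: at each recall level r, the best precision achieved at any recall of at least r. A minimal sketch of that computation for a single toy ranking (the ranking and relevant count are invented for illustration):

```python
# 11-point interpolated precision for one query's ranked list.
def interpolated_precision(ranking, num_relevant):
    """ranking: list of booleans (True = relevant) in rank order."""
    # (recall, precision) after each relevant document is retrieved
    points = []
    hits = 0
    for i, rel in enumerate(ranking, start=1):
        if rel:
            hits += 1
            points.append((hits / num_relevant, hits / i))
    # interpolated precision at level r: max precision at recall >= r
    levels = [round(0.1 * k, 1) for k in range(11)]
    out = {}
    for r in levels:
        ps = [p for (rec, p) in points if rec >= r]
        out[r] = max(ps) if ps else 0.0
    return out

# toy query with 4 relevant documents, 3 of them retrieved
example = [True, False, True, False, False, True]
print(interpolated_precision(example, 4))
```

In the appendix tables these per-query values are averaged over the 50 queries at each recall level.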
Fallout Level Averages
Fallout x 100    Recall
0.00             0.4521
0.02             0.5940
0.04             0.6130
0.06             0.7211
0.08             0.7696
0.10             0.7868
0.12             0.7932
0.14             0.7932
0.16             0.7932
0.18             0.7932
0.20             0.7932
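Fallout, the quantity paired with recall in the table above and in the fallout-recall curve later in this appendix, is the fraction of all nonrelevant documents that the system retrieved. A minimal sketch, with invented example numbers (the collection and relevant counts below are illustrative, not taken from this run):

```python
# Fallout: nonrelevant retrieved / total nonrelevant in the collection.
def fallout(nonrel_retrieved, nonrel_total):
    return nonrel_retrieved / nonrel_total

# e.g. 900 nonrelevant docs among 1000 retrieved, for a query with
# 200 relevant docs in a hypothetical 740,000-document collection
print(fallout(900, 740_000 - 200))
```

Because large collections make raw fallout tiny, the appendix plots fallout multiplied by 100, which is why the curve's x-axis only reaches 0.2.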
Document Level Averages
Precision
At 5 docs 0.5840
At 10 docs 0.5140
At 15 docs 0.5640
At 20 docs 0.5590
At 30 docs 0.5433
At 100 docs 0.4846
At 200 docs 0.3915
At 500 docs 0.2601
At 1000 docs 0.1646
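The document-level averages above are cutoff precisions: precision after the top k retrieved documents, averaged over the queries. A minimal sketch with invented toy rankings (1 = relevant):

```python
# Precision at cutoff k, and its average over a set of queries.
def precision_at_k(ranking, k):
    return sum(ranking[:k]) / k

def average_precision_at_k(rankings, k):
    return sum(precision_at_k(r, k) for r in rankings) / len(rankings)

rankings = [
    [1, 1, 0, 1, 0],  # query 1: 3 relevant in top 5
    [0, 1, 1, 0, 0],  # query 2: 2 relevant in top 5
]
print(average_precision_at_k(rankings, 5))  # (0.6 + 0.4) / 2 = 0.5
```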
R-Precision (precision after R docs retrieved, where R is the number of relevant documents)
    Exact    0.3141
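The R-Precision definition above can be sketched directly; the toy ranking is invented for illustration (1 = relevant):

```python
# R-Precision: precision after R documents are retrieved, where R is
# the number of relevant documents for the query.
def r_precision(ranking, num_relevant):
    top_r = ranking[:num_relevant]
    return sum(top_r) / num_relevant

# a query with R = 4 relevant documents
print(r_precision([1, 0, 1, 1, 0, 1], 4))  # 3 of the first 4 -> 0.75
```

At this cutoff precision and recall coincide, which makes R-Precision a convenient single-number summary per query.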
[Figure: Recall-Precision Curve. Precision (0 to 1) vs. Recall (0 to 1).]
[Figure: Fallout-Recall Curve. Recall vs. Fallout x 100 (0 to 0.2), plotted on normal-deviate axes.]