NIST Special Publication 500-215: The Second Text REtrieval Conference (TREC-2)
Appendix A: TREC-2 Results
National Institute of Standards and Technology
D. K. Harman
Results: Queens College, CUNY
Run Number: pircs1 (full, automatic)

[Figure: Recall-Precision Curve (Precision vs. Recall)]
Num of Queries: 50
Total number of documents over all queries
Retrieved: 50000
Relevant: 10489
Rel_ret: 5913
Recall Level Averages
Recall Precision
0.00 0.6360
0.10 0.4732
0.20 0.3947
0.30 0.3523
0.40 0.2999
0.50 0.2457
0.60 0.1911
0.70 0.1443
0.80 0.1094
0.90 0.0571
1.00 0.0087
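The recall level averages above are the standard 11-point interpolated figures: at each recall cutoff, the precision reported is the best precision achieved at that recall level or beyond. A minimal sketch of the computation for a single query (the `ranked_rel` boolean representation and the function name are illustrative assumptions, not the TREC evaluation software):

```python
def interpolated_precision_at_levels(ranked_rel, num_relevant):
    """11-point interpolated precision for one query.

    ranked_rel: booleans in rank order, True where the doc is relevant.
    num_relevant: total relevant docs for the query (retrieved or not).
    """
    # (recall, precision) after each relevant document is retrieved.
    points = []
    hits = 0
    for rank, rel in enumerate(ranked_rel, start=1):
        if rel:
            hits += 1
            points.append((hits / num_relevant, hits / rank))
    levels = [level / 10 for level in range(11)]  # 0.0, 0.1, ..., 1.0
    # Interpolated precision at a level: best precision at recall >= level.
    return [max((p for r, p in points if r >= level), default=0.0)
            for level in levels]
```

Averaging these 11-element vectors over the 50 queries gives a table of the form shown above.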
Average precision over all relevant docs
  non-interpolated    0.2488
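The non-interpolated figure is average precision: the precision at each relevant document's rank, averaged over all of the query's relevant documents, with relevant documents that are never retrieved contributing zero. A sketch for one query (`ranked_rel` is an assumed boolean list in rank order, not the TREC data format):

```python
def average_precision(ranked_rel, num_relevant):
    """Non-interpolated average precision for one query.

    Relevant documents that are never retrieved contribute precision 0,
    which is why the divisor is num_relevant, not the number of hits.
    """
    hits = 0
    precision_sum = 0.0
    for rank, rel in enumerate(ranked_rel, start=1):
        if rel:
            hits += 1
            precision_sum += hits / rank
    return precision_sum / num_relevant
```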
Fallout Level Averages
  Fallout (x 100)   Recall
  0.00              0.2840
  0.02              0.3622
  0.04              0.4133
  0.06              0.4448
  0.08              0.4690
  0.10              0.4951
  0.12              0.5229
  0.14              0.5603
  0.16              0.5746
  0.18              0.5853
  0.20              0.5942
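Fallout is the fraction of the collection's non-relevant documents that have been retrieved, so the fallout-recall curve has a point after every rank. A sketch for one query (the `collection_size` parameter and the per-query counting are assumptions made for illustration):

```python
def recall_fallout_points(ranked_rel, num_relevant, collection_size):
    """(fallout, recall) after each retrieved document for one query.

    The non-relevant count is inferred from collection_size, an
    assumption for illustration; TREC scales fallout by 100 for display.
    """
    num_nonrel = collection_size - num_relevant
    hits = 0
    points = []
    for rank, rel in enumerate(ranked_rel, start=1):
        if rel:
            hits += 1
        nonrel_retrieved = rank - hits
        points.append((nonrel_retrieved / num_nonrel, hits / num_relevant))
    return points
```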
Document Level Averages
  Precision
  At 5 docs      0.4920
  At 10 docs     0.4840
  At 15 docs     0.4547
  At 20 docs     0.4420
  At 30 docs     0.4147
  At 100 docs    0.3382
  At 200 docs    0.2758
  At 500 docs    0.1828
  At 1000 docs   0.1183
R-Precision (precision after R docs retrieved (where R is the number of relevant documents))
  Exact    0.2950

[Figure: Fallout-Recall Curve (Recall vs. Fallout x 100)]
[Figure: Normal Deviate Fallout-Recall Curve]
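The document level averages and the R-Precision figure are both cutoff measures on the same ranked list; a minimal sketch (illustrative names and representation, not the TREC evaluation software):

```python
def precision_at(ranked_rel, k):
    # Precision after the top k documents are retrieved.
    return sum(ranked_rel[:k]) / k

def r_precision(ranked_rel, num_relevant):
    # Precision after R documents, where R is the number of documents
    # relevant to the query, so perfect retrieval scores exactly 1.0.
    return precision_at(ranked_rel, num_relevant)
```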