SP500215
NIST Special Publication 500-215: The Second Text REtrieval Conference (TREC-2)
Appendix A: TREC-2 Results
National Institute of Standards and Technology
D. K. Harman
Results - Carnegie Mellon University
Summary Statistics

    Run Number:      CLARTA-full, automatic
    Num of Queries:  50

    Total number of documents over all queries
        Retrieved:  50000
        Relevant:   10785
        Rel_ret:     8109
Recall Level Averages

    Recall    Precision
    0.00      0.7581
    0.10      0.5676
    0.20      0.5038
    0.30      0.4489
    0.40      0.3938
    0.50      0.3455
    0.60      0.2806
    0.70      0.2172
    0.80      0.1463
    0.90      0.0827
    1.00      0.0070

    Average precision over all relevant docs, interpolated: 0.3264

Document Level Averages

    Precision
    At    5 docs:  0.5800
    At   10 docs:  0.5680
    At   15 docs:  0.5547
    At   20 docs:  0.5490
    At   30 docs:  0.5253
    At  100 docs:  0.4644
    At  200 docs:  0.3874
    At  500 docs:  0.2542
    At 1000 docs:  0.1622

    R-Precision (precision after R docs retrieved, where R is the
    number of relevant documents), exact: 0.3645
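The document-level and recall-level measures tabulated here are all derived from a ranked result list plus the relevance judgments. A minimal illustrative sketch of how such figures are computed (this is not the official TREC evaluation code, and the toy ranking below is invented for the example):

```python
def precision_at(ranked_rel, k):
    """Precision after k documents retrieved (1 = relevant, 0 = not)."""
    return sum(ranked_rel[:k]) / k

def r_precision(ranked_rel, num_relevant):
    """Precision after R docs retrieved, where R is the number of relevant docs."""
    return precision_at(ranked_rel, num_relevant)

def interpolated_precision(ranked_rel, num_relevant, recall_levels):
    """Interpolated precision at each recall level: the maximum precision
    achieved at any rank whose recall is >= that level."""
    pairs = []           # (recall, precision) after each rank
    hits = 0
    for i, rel in enumerate(ranked_rel, start=1):
        hits += rel
        pairs.append((hits / num_relevant, hits / i))
    out = []
    for level in recall_levels:
        candidates = [p for r, p in pairs if r >= level]
        out.append(max(candidates) if candidates else 0.0)
    return out

# Invented toy ranking with R = 4 relevant documents in total.
ranked = [1, 0, 1, 1, 0, 0, 1, 0, 0, 0]
R = 4
print(precision_at(ranked, 5))                            # precision at 5 docs
print(r_precision(ranked, R))                             # R-precision
print(interpolated_precision(ranked, R, [0.0, 0.5, 1.0])) # 3 recall levels
```

The tables in this appendix report these quantities averaged over the 50 queries; the interpolated average precision is likewise an average of interpolated precisions over the relevant documents of each query.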
[Recall-Precision Curve plot: precision vs. recall]
[Fallout-Recall Curve plot: recall vs. fallout x 100]

Fallout Level Averages

    Recall: 0.4407, 0.5756, 0.6613, 0.7169, 0.7520, 0.7732,
            0.7767, 0.7767, 0.7767, 0.7767, 0.7767
    [fallout-level labels illegible in the scanned original]
[Normal Deviate - Fallout-Recall plot]
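The normal-deviate plot re-expresses the fallout-recall curve on probit axes: each probability p (a recall or fallout value) is mapped to the standard normal deviate z satisfying Phi(z) = p, so deviates of roughly +/-2 correspond to probabilities of about 0.98 and 0.02, and 0 corresponds to 0.5. A sketch of the transform using the Python standard library (the probe values are chosen only to mirror typical axis labels):

```python
from statistics import NormalDist

def normal_deviate(p):
    """Map a probability (e.g. a recall or fallout value) to the standard
    normal deviate z with Phi(z) = p, i.e. the inverse normal CDF."""
    return NormalDist().inv_cdf(p)

# Probabilities near the usual probit axis labels and their deviates.
for p in (0.02, 0.16, 0.5, 0.84, 0.98):
    print(f"p = {p:5.2f}  ->  z = {normal_deviate(p):+.2f}")
```

Plotting on this scale stretches the extremes of the [0, 1] range, which makes differences between high-recall, low-fallout runs easier to see than on linear axes.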