NIST Special Publication 500-215: The Second Text REtrieval Conference (TREC-2)
Appendix A: TREC-2 Results
National Institute of Standards and Technology
D. K. Harman

Results - Cornell University
Summary Statistics

Run Number: crnlC1-full1 (automatic)
Num of Queries: 50
Total number of documents over all queries
    Retrieved: 50000
    Relevant: 10489
    Rel_ret: 7808

[Figure: Recall-Precision Curve -- precision (0 to 1) vs. recall (0 to 1)]
Recall Level Averages

Recall    Precision
0.00      0.8355
0.10      0.7202
0.20      0.6516
0.30      0.5816
0.40      0.5142
0.50      0.4518
0.60      0.3542
0.70      0.2519
0.80      0.1547
0.90      0.0677
1.00      0.0062
Average precision over all relevant docs
    non-interpolated: 0.4091
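The two averages above are the standard TREC measures: interpolated precision at fixed recall levels, and non-interpolated average precision over all relevant documents. A minimal sketch of both (illustrative only, with hypothetical document IDs; this is not the official trec_eval implementation):

```python
def interpolated_precision_at_recall_levels(ranking, relevant, levels=None):
    """11-point interpolated precision: at each recall level, take the
    maximum precision observed at any rank whose recall meets that level."""
    if levels is None:
        levels = [i / 10 for i in range(11)]  # 0.0, 0.1, ..., 1.0
    points = []  # (recall, precision) after each retrieved document
    hits = 0
    for rank, doc in enumerate(ranking, start=1):
        if doc in relevant:
            hits += 1
        points.append((hits / len(relevant), hits / rank))
    return [max((p for r, p in points if r >= level), default=0.0)
            for level in levels]

def average_precision(ranking, relevant):
    """Non-interpolated average precision: mean of the precision at each
    relevant document's rank, averaged over all relevant documents."""
    hits, total = 0, 0.0
    for rank, doc in enumerate(ranking, start=1):
        if doc in relevant:
            hits += 1
            total += hits / rank
    return total / len(relevant)
```

For example, with ranking `["d1", "d2", "d3", "d4"]` and relevant set `{"d1", "d3"}`, average precision is (1/1 + 2/3) / 2 = 5/6.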
Fallout Level Averages

Recall
0.4336
0.5602
0.6110
0.6592
0.6886
0.7144
0.7338
0.7489
0.7588
0.7681
0.7763
Document Level Averages

Precision
At    5 docs: 0.7240
At   10 docs: 0.7000
At   15 docs: 0.6840
At   20 docs: 0.6660
At   30 docs: 0.6240
At  100 docs: 0.5156
At  200 docs: 0.4104
At  500 docs: 0.2570
At 1000 docs: 0.1562
R-Precision (precision after R docs retrieved, where R is the number of relevant documents)

Exact: 0.4367

Fallout-Recall Curve
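The R-Precision definition above maps directly to a few lines of code (a minimal sketch with hypothetical document IDs, not the official evaluation code):

```python
def r_precision(ranking, relevant):
    """Precision after R documents retrieved, where R is the number of
    relevant documents for the query: hits in the top R, divided by R."""
    r = len(relevant)
    hits = sum(1 for doc in ranking[:r] if doc in relevant)
    return hits / r
```

For example, with ranking `["d1", "d2", "d3"]` and relevant set `{"d1", "d3"}`, R = 2 and only `d1` appears in the top 2, so R-Precision is 0.5.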
[Figure: Normal Deviate Fallout-Recall plot -- recall (normal-deviate scale, 0.001 to 0.98) vs. fallout x 100 (0 to 0.2)]
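The fallout axis of the curve above measures the fraction of the collection's non-relevant documents that have been retrieved. A minimal sketch of that standard definition (hypothetical document IDs and collection size; not the official evaluation code):

```python
def fallout(retrieved, relevant, collection_size):
    """Fallout: non-relevant documents retrieved, divided by the total
    number of non-relevant documents in the collection."""
    nonrel_retrieved = sum(1 for doc in retrieved if doc not in relevant)
    total_nonrel = collection_size - len(relevant)
    return nonrel_retrieved / total_nonrel
```

For example, retrieving `["d1", "d2", "d4"]` with relevant set `{"d1", "d3"}` in a 10-document collection retrieves 2 of the 8 non-relevant documents, for a fallout of 0.25.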