NIST Special Publication 500-215: The Second Text REtrieval Conference (TREC-2)
Appendix A: TREC-2 Results
Appendix
National Institute of Standards and Technology
D. K. Harman
Results - Carnegie Mellon University
Summary Statistics

Run Number: CLARTA-full, automatic
Num of Queries: 50

Total number of documents over all queries
    Retrieved: 50000
    Relevant:  10489
    Rel_ret:   6811

Recall Level Averages

    Recall   Precision
    0.00     0.8101
    0.10     0.5781
    0.20     0.4941
    0.30     0.4368
    0.40     0.3851
    0.50     0.3358
    0.60     0.2749
    0.70     0.2074
    0.80     0.1561
    0.90     0.0933
    1.00     0.0141

    Average precision (non-interpolated) over all relevant docs: 0.3269

Document Level Averages

    Precision
    At 5 docs:    0.6200
    At 10 docs:   0.6060
    At 15 docs:   0.5893
    At 20 docs:   0.5730
    At 30 docs:   0.5460
    At 100 docs:  0.4346
    At 200 docs:  0.3416
    At 500 docs:  0.2184
    At 1000 docs: 0.1362

R-Precision (precision after R docs retrieved, where R is the number of relevant documents)
    Exact: 0.3646

[Figure: Recall-Precision Curve (Precision vs. Recall)]
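The two single-number scores above, non-interpolated average precision over all relevant docs and exact R-precision, follow directly from a query's ranked list of relevance judgments. A minimal sketch in Python, using a toy ranking rather than data from the CLARTA run:

```python
def average_precision(ranked_rels, num_relevant):
    """Non-interpolated average precision: sum the precision at each
    rank where a relevant document appears, then divide by the total
    number of relevant documents (retrieved or not)."""
    hits = 0
    precision_sum = 0.0
    for rank, is_rel in enumerate(ranked_rels, start=1):
        if is_rel:
            hits += 1
            precision_sum += hits / rank
    return precision_sum / num_relevant

def r_precision(ranked_rels, num_relevant):
    """Precision after R documents retrieved, where R is the number
    of relevant documents for the query."""
    top_r = ranked_rels[:num_relevant]
    return sum(top_r) / num_relevant

# Toy ranking: 1 = relevant, 0 = not relevant; 3 relevant docs in total.
ranking = [1, 0, 1, 0, 0, 1]
print(average_precision(ranking, 3))  # (1/1 + 2/3 + 3/6) / 3, about 0.722
print(r_precision(ranking, 3))        # 2 relevant in the top 3, about 0.667
```

The per-query scores are then averaged over the 50 queries to produce the table entries.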
Fallout-Recall Level Averages

    Recall
    0.3545
    0.4571
    0.5408
    0.5801
    0.6132
    0.6422
    0.6603
    0.6822
    0.6959
    0.7096
    0.7172
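Recall and fallout are simple ratios over the relevance judgments: recall is the fraction of relevant documents that were retrieved, fallout the fraction of non-relevant documents that were retrieved. A minimal sketch, reusing the summary counts above for the recall example (the fallout counts are made up):

```python
def recall(rel_retrieved, total_relevant):
    """Fraction of the relevant documents that were retrieved."""
    return rel_retrieved / total_relevant

def fallout(nonrel_retrieved, total_nonrelevant):
    """Fraction of the non-relevant documents that were retrieved."""
    return nonrel_retrieved / total_nonrelevant

# From the summary statistics: 6811 of 10489 relevant docs retrieved.
print(recall(6811, 10489))   # about 0.649 over all retrieved docs

# Hypothetical counts for illustration only.
print(fallout(2, 1000))      # 0.002
```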
[Figure: Fallout-Recall Curve (Recall vs. Fallout x 100)]
[Figure: Normal Deviate - Fallout-Recall]
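The normal-deviate plot redraws the fallout-recall data with each probability axis mapped through the inverse of the standard normal CDF, which is why tick labels 0.02, 0.16, 0.5, 0.84, and 0.98 line up with deviates of roughly -2, -1, 0, 1, and 2. A sketch of the transform using Python's statistics.NormalDist (an assumption about tooling; the original plots were not produced this way):

```python
from statistics import NormalDist

def normal_deviate(p):
    """Map a probability (e.g. a recall or fallout value) to its
    standard normal deviate via the inverse standard normal CDF."""
    return NormalDist().inv_cdf(p)

# Probabilities from the plot's tick labels map to deviates of
# roughly -2, -1, 0, 1, 2.
for p in (0.02, 0.16, 0.5, 0.84, 0.98):
    print(round(normal_deviate(p), 2))
```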