NIST Special Publication 500-215: The Second Text REtrieval Conference (TREC-2)
Appendix A: TREC-2 Results
National Institute of Standards and Technology
D. K. Harman
Universitaet Dortmund
Summary Statistics
Run Number: dortQ2-full, automatic
Num of Queries: 50
Total number of documents over all queries
  Retrieved: 50000
  Relevant: 10785
  Rel_ret: 8259
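For orientation, these totals indicate that the top 1000 documents were evaluated for each of the 50 topics (50 x 1000 = 50000 retrieved), and that the run found Rel_ret / Relevant = 8259 / 10785, about 0.77, of all relevant documents within those rankings.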
Recall Level Averages
Recall    Precision
0.00 0.7673
0.10 0.5518
0.20 0.5060
0.30 0.4572
0.40 0.4114
0.50 0.3635
0.60 0.3048
0.70 0.2388
0.80 0.1565
0.90 0.0777
1.00 0.0086
Average precision over all relevant docs
non-interpolated    0.3340
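These recall-level figures follow the usual trec_eval conventions: interpolated precision at a recall level is the best precision achieved at any recall greater than or equal to that level, and non-interpolated average precision is the mean of the precision values at each relevant document's rank, with unretrieved relevant documents counting as zero. A minimal per-topic sketch in Python, assuming a ranked list of binary judgements (the function and variable names are illustrative, not the official evaluation code):

def recall_level_averages(ranking, num_relevant):
    # ranking: list of bools (True = relevant) in retrieval order.
    # Record (recall, precision) after each relevant document retrieved.
    points, rel_seen = [], 0
    for rank, is_rel in enumerate(ranking, start=1):
        if is_rel:
            rel_seen += 1
            points.append((rel_seen / num_relevant, rel_seen / rank))

    # Non-interpolated average precision: precision at each relevant
    # document's rank; relevant documents never retrieved contribute 0.
    avg_prec = sum(p for _, p in points) / num_relevant

    # Interpolated precision at the 11 standard recall levels:
    # the best precision achieved at any recall >= that level.
    interpolated = []
    for level in (i / 10 for i in range(11)):
        eligible = [p for r, p in points if r >= level]
        interpolated.append(max(eligible) if eligible else 0.0)
    return interpolated, avg_prec

The table above reports these per-topic quantities averaged over the 50 topics.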
Document Level Averages
Precision
At 5 docs 0.5560
At 10 docs 0.5440
At 15 docs 0.5427
At 20 docs 0.5440
At 30 docs 0.5213
At 100 docs 0.4626
At 200 docs 0.3959
At 500 docs 0.2607
At 1000 docs 0.1652
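Each "At N docs" entry is precision computed after the first N retrieved documents, again averaged over the 50 topics. A hedged one-topic sketch under the same assumptions as above:

def precision_at(ranking, cutoffs=(5, 10, 15, 20, 30, 100, 200, 500, 1000)):
    # Relevant documents in the top N divided by N; the denominator stays N
    # even if fewer than N documents were retrieved.
    return {n: sum(ranking[:n]) / n for n in cutoffs}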
R-Precision (precision after R docs retrieved, where R is the number of relevant documents)
Exact    0.3722
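R-Precision applies the same cutoff idea with a per-topic cutoff equal to that topic's number of relevant documents; a sketch under the same assumptions:

def r_precision(ranking, num_relevant):
    # Precision after exactly R = num_relevant documents.
    return sum(ranking[:num_relevant]) / num_relevant if num_relevant else 0.0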
[Figure: Recall-Precision Curve. Interpolated precision (0 to 1) plotted against recall (0 to 1).]
[Figure: Fallout-Recall Curve. Recall plotted against fallout x 100 (0 to 0.2), with both axes on normal-deviate scales.]
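Fallout is the fraction of non-relevant documents that were retrieved, and the second plot rescales both axes as normal deviates (probits). A minimal sketch of those two quantities, assuming binary judgements and using only the Python standard library (names are illustrative):

from statistics import NormalDist

def fallout(ranking, total_nonrelevant):
    # Fraction of all non-relevant documents for the topic that were retrieved.
    nonrelevant_retrieved = sum(1 for is_rel in ranking if not is_rel)
    return nonrelevant_retrieved / total_nonrelevant

def normal_deviate(p):
    # Probit transform used for the plot axes; clamp to keep inv_cdf finite.
    return NormalDist().inv_cdf(min(max(p, 1e-6), 1.0 - 1e-6))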