NIST Special Publication 500-215: The Second Text REtrieval Conference (TREC-2)
Appendix A: TREC-2 Results
Appendix
National Institute of Standards and Technology
D. K. Harman
Results - City University, London
Summary Statistics

Run Number: cityau-full1 (automatic)
Num of Queries: 50
Total number of documents over all queries
  Retrieved: 50000
  Relevant:  10785
  Rel_ret:    6412
Recall Level Averages

Recall   Precision
0.00     0.7148
0.10     0.4224
0.20     0.3528
0.30     0.2983
0.40     0.2623
0.50     0.2191
0.60     0.1835
0.70     0.1299
0.80     0.0931
0.90     0.0421
1.00     0.0000
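The recall-level figures above are standardly computed by interpolation: the precision at recall level r is taken as the maximum precision observed at any point with recall of at least r. A minimal sketch of that computation (the ranked-list input format here is a hypothetical illustration, not the actual TREC evaluation code):

```python
def recall_level_precisions(rel_flags, total_relevant, levels=None):
    """Interpolated precision at standard recall levels.

    rel_flags: 0/1 relevance judgments of the ranked list, in rank order.
    total_relevant: number of relevant documents for the query (R).
    """
    if levels is None:
        levels = [i / 10 for i in range(11)]  # 0.0, 0.1, ..., 1.0
    # (recall, precision) observed after each rank.
    points = []
    hits = 0
    for rank, rel in enumerate(rel_flags, start=1):
        hits += rel
        points.append((hits / total_relevant, hits / rank))
    # Interpolated precision at level r: best precision at recall >= r.
    return [max((p for rec, p in points if rec >= r), default=0.0)
            for r in levels]
```

For example, a three-document ranking with relevance flags [1, 0, 1] and two relevant documents in total yields 1.0 at levels 0.0 through 0.5 and 2/3 at levels 0.6 through 1.0.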
Average precision over all relevant docs
  non-interpolated: 0.2272
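The non-interpolated average over all relevant docs is the mean, taken over every relevant document, of the precision at that document's rank, with relevant documents that were never retrieved contributing zero. A sketch under the same hypothetical ranked-list input assumption as above:

```python
def average_precision(rel_flags, total_relevant):
    """Mean precision at the rank of each retrieved relevant document.

    rel_flags: 0/1 relevance judgments of the ranked list, in rank order.
    total_relevant: number of relevant documents for the query; relevant
    documents never retrieved implicitly contribute 0 to the mean.
    """
    hits = 0
    prec_sum = 0.0
    for rank, rel in enumerate(rel_flags, start=1):
        if rel:
            hits += 1
            prec_sum += hits / rank
    return prec_sum / total_relevant
```

With flags [1, 0, 1, 0] and three relevant documents (one never retrieved), the result is (1.0 + 2/3 + 0) / 3.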
Document Level Averages

              Precision
At    5 docs  0.5000
At   10 docs  0.4640
At   15 docs  0.4667
At   20 docs  0.4570
At   30 docs  0.4340
At  100 docs  0.3512
At  200 docs  0.2970
At  500 docs  0.1966
At 1000 docs  0.1282

R-Precision (precision after R docs retrieved, where R is the number of relevant documents)
  Exact: 0.2849
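R-Precision, as defined above, reduces to counting the relevant documents among the top R ranks. A sketch with the same hypothetical inputs:

```python
def r_precision(rel_flags, total_relevant):
    """Precision after R documents retrieved, R = number of relevant docs.

    rel_flags: 0/1 relevance judgments of the ranked list, in rank order.
    """
    top_r = rel_flags[:total_relevant]
    return sum(top_r) / total_relevant
```

For instance, with R = 3 and top-three flags [1, 0, 1], R-Precision is 2/3.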
[Figure: Recall-Precision Curve; Precision (y-axis, 0 to 1) vs. Recall (x-axis, 0 to 0.8)]
[Figure: Fallout-Recall Curve; Recall (y-axis, 0 to 1) vs. Fallout x 100 (x-axis, 0 to 0.2)]

Fallout Level Averages

  Recall
  0.3344
  0.4481
  0.5113
  0.5761
  0.6171
  0.6438
  0.6438

[Figure: Normal Deviate, Fallout-Recall]
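Fallout, plotted in the curve above, is the fraction of the collection's non-relevant documents that have been retrieved; pairing it with recall at each rank cutoff yields the points of the fallout-recall curve. A sketch with hypothetical inputs (the collection-level relevant and non-relevant counts are assumed known from the judgments):

```python
def fallout_recall_points(rel_flags, total_relevant, total_nonrelevant):
    """(fallout, recall) pairs after each rank of a judged ranking.

    rel_flags: 0/1 relevance judgments of the ranked list, in rank order.
    total_relevant / total_nonrelevant: collection-level counts for the query.
    """
    hits = 0
    points = []
    for rank, rel in enumerate(rel_flags, start=1):
        hits += rel
        nonrel_retrieved = rank - hits
        points.append((nonrel_retrieved / total_nonrelevant,
                       hits / total_relevant))
    return points
```

With flags [1, 0, 1], two relevant documents, and 100 non-relevant documents, the curve passes through (0.00, 0.5), (0.01, 0.5), and (0.01, 1.0).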