NIST Special Publication 500-215: The Second Text REtrieval Conference (TREC-2)
Appendix A: TREC-2 Results
National Institute of Standards and Technology
D. K. Harman
Results: City University, London
Summary Statistics
Run Number cityr2-full, automatic
Num of Queries 50
Total number of documents over all queries
Retrieved: 50000
Relevant: 10489
Rel_ret: 7134
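These totals fix the run's overall recall; a minimal Python sketch of the arithmetic, using only the counts reported above:

    # Overall recall implied by the summary statistics above.
    retrieved = 50000   # 50 queries x 1000 retrieved documents each
    relevant = 10489    # relevant documents over all 50 queries
    rel_ret = 7134      # relevant documents actually retrieved

    print(f"overall recall = {rel_ret / relevant:.4f}")  # 0.6801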
Recall Level Averages
Recall Precision
0.00 0.8313
0.10 0.6341
0.20 0.5491
0.30 0.4990
0.40 0.4356
0.50 0.3618
0.60 0.2908
0.70 0.2305
0.80 0.1551
0.90 0.0900
1.00 0.0085
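The precision values quoted at the 11 recall levels above are conventionally interpolated: precision at recall level r is taken as the maximum precision achieved at any recall greater than or equal to r, then averaged over queries. A minimal Python sketch of that rule for one query; the data points are illustrative, not taken from this run:

    def interpolated_precision(levels, points):
        """points: (recall, precision) pairs for one query.
        Interpolated precision at a level is the maximum precision
        over all points whose recall is >= that level."""
        result = []
        for level in levels:
            candidates = [p for r, p in points if r >= level]
            result.append(max(candidates) if candidates else 0.0)
        return result

    levels = [i / 10 for i in range(11)]           # 0.0, 0.1, ..., 1.0
    points = [(0.1, 0.9), (0.5, 0.5), (0.8, 0.2)]  # hypothetical query
    print(interpolated_precision(levels, points))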
Average precision over all
relevant docs
non-interpolated 0.3562
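The non-interpolated figure is the standard average precision: for each query, the exact precision at the rank of every relevant document is averaged, with relevant documents never retrieved contributing zero, and the per-query values are then averaged over the 50 queries. A sketch of the per-query computation, with illustrative identifiers:

    def average_precision(ranked_ids, relevant_ids):
        """Non-interpolated average precision for a single query."""
        relevant_ids = set(relevant_ids)
        hits, total = 0, 0.0
        for rank, doc_id in enumerate(ranked_ids, start=1):
            if doc_id in relevant_ids:
                hits += 1
                total += hits / rank  # precision at this relevant doc's rank
        return total / len(relevant_ids) if relevant_ids else 0.0

    # Relevant docs at ranks 1 and 3; a third is never retrieved.
    print(average_precision(["a", "b", "c"], {"a", "c", "z"}))  # (1 + 2/3) / 3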
Fallout Level Averages
Fallout   Recall
0.00      0.3949
0.20      0.4731
0.40      0.5331
0.60      0.5722
0.80      0.6050
1.00      0.6278
1.20      0.6510
1.40      0.6696
1.60      0.6851
1.80      0.7065
2.00      0.7181
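Fallout is the nonrelevant counterpart of recall: the fraction of all nonrelevant documents that has been retrieved at a given point in the ranking, so each row above pairs a fallout level with the average recall reached at that level (the scaling of the fallout column follows the original page). A Python sketch of the basic quantity, with illustrative names:

    def fallout_at_cutoff(ranked_ids, relevant_ids, num_nonrelevant, k):
        """Fraction of the collection's nonrelevant documents
        that appear in the top k of the ranking."""
        nonrel = sum(1 for d in ranked_ids[:k] if d not in relevant_ids)
        return nonrel / num_nonrelevant

    # 2 nonrelevant docs in the top 4, out of 1000 nonrelevant overall.
    print(fallout_at_cutoff(["a", "b", "c", "d"], {"a", "c"}, 1000, 4))  # 0.002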
Document Level Averages
             Precision
At 5 docs    0.6920
At 10 docs   0.6500
At 15 docs   0.6213
At 20 docs   0.6060
At 30 docs   0.5613
At 100 docs  0.4494
At 200 docs  0.3588
At 500 docs  0.2261
At 1000 docs 0.1427
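Each row above is precision at a fixed document cutoff, averaged over the 50 queries: the share of the top k retrieved documents that is relevant. A one-query sketch, with illustrative names:

    def precision_at_k(ranked_ids, relevant_ids, k):
        """Fraction of the top-k retrieved documents that are relevant;
        ranks past the end of the list count as nonrelevant."""
        hits = sum(1 for d in ranked_ids[:k] if d in relevant_ids)
        return hits / k

    print(precision_at_k(["a", "b", "c", "d", "e"], {"a", "c", "e"}, 5))  # 0.6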
R-Precision (precision after R docs retrieved,
where R is the number of relevant documents)
Exact 0.3877
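R-Precision evaluates each query at its own natural cutoff; at rank R, precision and recall coincide. A sketch under the definition quoted above, with illustrative names:

    def r_precision(ranked_ids, relevant_ids):
        """Precision after R documents are retrieved, where R is the
        number of relevant documents; equals recall at that cutoff."""
        r = len(relevant_ids)
        if r == 0:
            return 0.0
        hits = sum(1 for d in ranked_ids[:r] if d in relevant_ids)
        return hits / r

    # R = 3 relevant docs; 2 of the top 3 retrieved are relevant.
    print(r_precision(["a", "b", "c", "d"], {"a", "c", "z"}))  # 0.6667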
[Graph: Recall-Precision Curve (precision vs. recall).]
[Graph: Fallout-Recall Curve (recall vs. fallout x 100, 0 to 0.2).]
[Graph: Normal Deviate Fallout-Recall plot (recall vs. fallout, both axes on normal-deviate scales; probability ticks 0.001, 0.02, 0.16, 0.5, 0.84, 0.98 against deviates -3 to 2).]
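The normal-deviate plot redraws the fallout-recall curve with both axes passed through the inverse of the standard normal distribution function, which tends to straighten curves of this shape; the probability ticks on the original axes are exactly this mapping. A sketch of the transform, assuming SciPy is available:

    from scipy.stats import norm

    def normal_deviate(p):
        """Standard-normal deviate z with Phi(z) = p, applied to
        recall or fallout values before plotting."""
        return norm.ppf(p)

    # Reproduces the axis labels on the original plot.
    for p in (0.001, 0.02, 0.16, 0.5, 0.84, 0.98):
        print(f"{p:5.3f} -> deviate {normal_deviate(p):+.2f}")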