NIST Special Publication 500-215: The Second Text REtrieval Conference (TREC-2)
Appendix A: TREC-2 Results
National Institute of Standards and Technology
D. K. Harman
Results - University of California, Los Angeles
Summary Statistics
Run Number: uclaa2, category B, automatic
Num of Queries: 50
Total number of documents over all queries
    Retrieved: 50000
    Relevant:   3929
    Rel_ret:    3059
Recall Level Averages
Recall    Precision
 0.00      0.7464
 0.10      0.5880
 0.20      0.5024
 0.30      0.4058
 0.40      0.3429
 0.50      0.2698
 0.60      0.2168
 0.70      0.1803
 0.80      0.1209
 0.90      0.0695
 1.00      0.0098
Average precision over all relevant docs
    non-interpolated    0.2951
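The two averages above follow the standard definitions: interpolated precision at a recall level is the maximum precision at any recall point at or beyond that level, and non-interpolated average precision is the mean of the precision values at each relevant document retrieved. A minimal sketch for a single query (illustrative only; the official evaluation used NIST's own scoring code, and the list of booleans `rels` marking relevant ranks is a made-up example):

```python
def precision_recall_points(rels, num_rel):
    """(recall, precision) after each relevant document in the ranked list."""
    points, hits = [], 0
    for rank, is_rel in enumerate(rels, start=1):
        if is_rel:
            hits += 1
            points.append((hits / num_rel, hits / rank))
    return points

def interpolated_precision(points, level):
    """Max precision at any recall point >= level (standard interpolation)."""
    vals = [p for r, p in points if r >= level]
    return max(vals) if vals else 0.0

def averages(rels, num_rel):
    pts = precision_recall_points(rels, num_rel)
    levels = [i / 10 for i in range(11)]  # the 11 recall levels 0.0 .. 1.0
    interp = {lv: interpolated_precision(pts, lv) for lv in levels}
    # Non-interpolated: sum precision at each relevant doc, divide by num_rel
    # (unretrieved relevant docs contribute 0).
    avg_p = sum(p for _, p in pts) / num_rel
    return interp, avg_p

# Toy ranked list of 6 documents: relevant at ranks 1, 3, 6; 4 relevant total.
interp, avg_p = averages([True, False, True, False, False, True], num_rel=4)
```

Averaging each row over the 50 queries yields the table above.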
Document Level Averages
Precision
At 5 docs 0.5440
At 10 docs 0.5180
At 15 docs 0.4920
At 20 docs 0.4650
At 30 docs 0.4201
At 100 docs 0.2160
At 200 docs 0.1958
At 500 docs 0.1051
At 1000 docs 0.0612
R-Precision (precision after R docs retrieved, where R is the number of relevant documents)
    Exact: 0.3289
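R-Precision, as defined above, is just precision at a cutoff equal to the query's own number of relevant documents. A one-function sketch under the same ranked-list-of-booleans convention as before (an assumed representation, not the official scoring code):

```python
def r_precision(rels, num_rel):
    """Precision after R = num_rel documents retrieved."""
    cutoff = rels[:num_rel]  # top-R ranks; short lists simply score lower
    return sum(cutoff) / num_rel

# With 4 relevant docs, only the top 4 ranks matter: 3 of them are relevant.
rp = r_precision([True, False, True, True, False, True], num_rel=4)  # → 0.75
```

The "Exact" figure of 0.3289 is this value averaged over the 50 queries.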
[Figure: Recall-Precision Curve (Precision vs. Recall)]
[Figure: Fallout-Recall Curve (Recall vs. Fallout x 100)]
Fallout Level Averages
Fallout (x 100)    Recall
 0.00               0.3245
 0.02               0.4324
 0.04               0.4885
 0.06               0.5313
 0.08               0.5838
 0.10               0.6135
 0.12               0.6339
 0.14               0.6561
 0.16               0.6698
 0.18               0.6852
 0.20               0.7066
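Fallout is the fraction of all non-relevant documents in the collection that have been retrieved, so the fallout-recall curve pairs fallout with recall at each rank of the result list. A small sketch of those pairings, again assuming a hypothetical boolean ranked list rather than the actual evaluation code:

```python
def fallout_recall_points(rels, num_rel, num_nonrel):
    """(fallout, recall) after each rank in the result list."""
    pts, hits, misses = [], 0, 0
    for is_rel in rels:
        hits += is_rel        # relevant retrieved so far
        misses += not is_rel  # non-relevant retrieved so far
        pts.append((misses / num_nonrel, hits / num_rel))
    return pts

# Toy list: 2 relevant and 3 non-relevant documents in the collection.
pts = fallout_recall_points([True, False, True], num_rel=2, num_nonrel=3)
```

Because `num_nonrel` is huge for a full TREC collection, fallout values are tiny, which is why the plots scale the axis by 100.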
[Figure: Normal Deviate - Fallout-Recall (recall and fallout plotted on normal-deviate scales, -3 to 3)]