NIST Special Publication 500-215: The Second Text REtrieval Conference (TREC-2)
Appendix A: TREC-2 Results
National Institute of Standards and Technology
D. K. Harman
Results - Queens College, CUNY

Summary Statistics
Run Number        pircs4 - full, automatic
Num of Queries 50
Total number of documents over all queries
Retrieved: 50000
Relevant: 10785
Rel_ret: 7464
Recall Level Averages
Recall Precision
0.00 0.7621
0.10 0.5569
0.20 0.4773
0.30 0.4108
0.40 0.3554
0.50 0.3017
0.60 0.2343
0.70 0.1702
0.80 0.1128
0.90 0.0602
1.00 0.0154
Average precision over all
relevant docs
non-interpolated 0.2981
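As a reading aid only (not the evaluation code used for this run), the sketch below shows how the two statistics above are conventionally computed from a single ranked list: interpolated precision at recall level r is the highest precision attained at any recall >= r, and non-interpolated average precision is the mean of the precision values at each relevant document retrieved, with never-retrieved relevant documents contributing zero. The five-document ranking is invented for illustration; the appendix figures are these values averaged over the 50 queries.

def eval_ranking(ranking, num_relevant):
    """ranking: list of booleans (True = relevant), best rank first."""
    precisions = []    # precision at each relevant document retrieved
    points = []        # (recall, precision) after each retrieved document
    rel = 0
    for i, is_rel in enumerate(ranking, start=1):
        if is_rel:
            rel += 1
            precisions.append(rel / i)
        points.append((rel / num_relevant, rel / i))
    # Interpolated precision at level r: max precision at any recall >= r.
    levels = [round(0.1 * k, 1) for k in range(11)]
    interp = {r: max((p for rec, p in points if rec >= r), default=0.0)
              for r in levels}
    # Non-interpolated average precision over all relevant docs:
    # relevant docs never retrieved add nothing to the sum.
    avg_prec = sum(precisions) / num_relevant
    return interp, avg_prec

interp, ap = eval_ranking([True, False, True, True, False], num_relevant=4)
print(interp[0.5], round(ap, 4))   # 0.75 0.6042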
Fallout Level Averages
Fallout Recall
0.000 0.4140
0.020 0.5449
0.040 0.6128
0.060 0.6637
0.080 0.7008
0.100 0.7244
0.120 0.7293
0.140 0.7293
0.160 0.7293
0.180 0.7293
0.200 0.7293
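Fallout complements recall: it is the proportion of the collection's non-relevant documents that have been retrieved, so each row above reads as the recall achieved by the time fallout reaches the given level. The sketch below encodes one plausible reading of that table (best recall reached while fallout stays at or below each level); the exact TREC averaging procedure is not spelled out on this page, and the collection sizes and ranking below are invented.

def recall_at_fallout(ranking, num_relevant, num_nonrelevant, levels):
    """Best recall reached while fallout (non-relevant retrieved divided
    by non-relevant total) stays at or below each requested level.
    ranking: list of booleans, best rank first."""
    rel = nonrel = 0
    best = {f: 0.0 for f in levels}
    for is_rel in ranking:
        if is_rel:
            rel += 1
        else:
            nonrel += 1
        fallout = nonrel / num_nonrelevant
        recall = rel / num_relevant
        for f in levels:
            if fallout <= f:
                best[f] = max(best[f], recall)
    return best

print(recall_at_fallout([True, True, False, True, False],
                        num_relevant=3, num_nonrelevant=100,
                        levels=[0.00, 0.01, 0.02]))
# -> recall 2/3 at fallout 0.00, full recall once fallout reaches 0.01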
Document Level Averages
Precision
At 5 docs 0.5760
At 10 docs 0.5760
At 15 docs 0.5507
At 20 docs 0.5460
At 30 docs 0.5200
At 100 docs 0.4494
At 200 docs 0.3680
At 500 docs 0.2348
At 1000 docs 0.1493
R-Precision (precision after R docs retrieved, where R is the number of relevant documents)
Exact: 0.3467
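The definition in the parenthetical above is short enough to state in one line of code. A minimal sketch, on an invented ranking rather than run output:

def r_precision(ranking, num_relevant):
    """Precision after exactly R = num_relevant documents are retrieved."""
    return sum(ranking[:num_relevant]) / num_relevant   # booleans sum as 0/1

print(r_precision([True, False, True, True, False], num_relevant=4))   # 0.75

At this cutoff precision and recall are equal, which makes R-precision a convenient single-number summary alongside the averaged figures above.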
[Figure: Recall-Precision Curve. X-axis: Recall (0 to 0.8 shown); Y-axis: Precision (0 to 1).]
[Figure: Fallout-Recall Curve on normal-deviate axes ("NORMAL DEVIATE - Fallout-Recall"). X-axis: Fallout x 100 (0 to 0.2); Y-axis: Recall (0.02 to 0.98).]