NIST Special Publication 500-215: The Second Text REtrieval Conference (TREC-2)
Appendix A: TREC-2 Results
National Institute of Standards and Technology
D. K. Harman
Results - City University, London
Summary Statistics

Run Number:      citymf (full, feedback)
Num of Queries:  50
Total number of documents over all queries
    Retrieved:   51221
    Relevant:    10785
    Rel_ret:     6409

[Figure: Recall-Precision Curve (precision vs. recall)]

Recall Level Averages
Recall Precision
0.00 0.7328
0.10 0.4944
0.20 0.4180
0.30 0.3552
0.40 0.2809
0.50 0.2205
0.60 0.1584
0.70 0.0901
0.80 0.0482
0.90 0.0184
1.00 0.0000
Average precision over all relevant docs
    non-interpolated:   0.2324
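The recall-level precisions above are interpolated values. A minimal Python sketch of the standard 11-point interpolation for a single query (the doc ids in the comments and test are hypothetical, not taken from this run):

```python
def eleven_point_interpolated(ranking, relevant):
    """Interpolated precision at recall 0.0, 0.1, ..., 1.0 for one query.

    ranking: doc ids in rank order; relevant: set of relevant doc ids.
    Interpolated precision at level r is the maximum precision observed
    at any recall >= r.
    """
    total_relevant = len(relevant)
    hits = 0
    points = []  # (recall, precision) at each rank holding a relevant doc
    for rank, doc in enumerate(ranking, start=1):
        if doc in relevant:
            hits += 1
            points.append((hits / total_relevant, hits / rank))
    curve = []
    for level in (i / 10 for i in range(11)):
        candidates = [p for r, p in points if r >= level]
        curve.append(max(candidates) if candidates else 0.0)
    return curve
```

Averaging these per-query curves over the 50 topics gives a Recall Level Averages table; averaging the precision observed at each relevant document instead gives the non-interpolated average precision.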
[OCRerr] Level Averages
                 Recall
    [OCRerr]     0.3417
    20           0.4289
    [OCRerr]0    0.4898
    50           0.5366
    30           0.5682
    [OCRerr]0    0.5932
    [OCRerr]0    0.6012
    [OCRerr]     0.6012
    50           0.6012
    30           0.6012
    [OCRerr]0    0.6012
Document Level Averages
                  Precision
    At    5 docs  0.4920
    At   10 docs  0.5040
    At   15 docs  0.4827
    At   20 docs  0.4750
    At   30 docs  0.4680
    At  100 docs  0.4004
    At  200 docs  0.3162
    At  500 docs  0.1979
    At 1000 docs  0.1275
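Precision at a document cutoff is simply the fraction of the top k retrieved documents that are relevant. A sketch with hypothetical doc ids, using the convention that the denominator stays k even when fewer than k documents are retrieved:

```python
CUTOFFS = (5, 10, 15, 20, 30, 100, 200, 500, 1000)

def precision_at(ranking, relevant, cutoffs=CUTOFFS):
    # Precision at k = (number of relevant docs in the top k) / k.
    return {k: sum(1 for doc in ranking[:k] if doc in relevant) / k
            for k in cutoffs}
```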
R-Precision (precision after R docs retrieved, where R is the number of relevant documents)
    Exact:    0.2971
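The R-Precision defined above can be sketched in a few lines (doc ids hypothetical):

```python
def r_precision(ranking, relevant):
    # Precision after exactly R documents are retrieved, where R is the
    # number of relevant documents for the query.
    R = len(relevant)
    if R == 0:
        return 0.0
    return sum(1 for doc in ranking[:R] if doc in relevant) / R
```

At rank R the precision and recall values coincide, which is why a single "Exact" number summarizes this measure.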
[Figure: Fallout-Recall Curve (recall vs. fallout x 100, 0 to 0.2, on normal-deviate axes; z-ticks -3 to 3 labeled 0.001, 0.02, 0.16, 0.5, 0.84, 0.98)]
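The fallout-recall plot uses normal-deviate (probit) axes: each probability is drawn at its standard-normal z-score, so the label 0.84 sits at z = 1, 0.16 at z = -1, and so on. A sketch of that mapping using Python's standard library:

```python
from statistics import NormalDist

_STD_NORMAL = NormalDist()  # standard normal: mean 0, sigma 1

def to_normal_deviate(p):
    # Probability -> z-score (inverse of the standard normal CDF);
    # this is the coordinate a value occupies on a normal-deviate axis.
    return _STD_NORMAL.inv_cdf(p)

def from_normal_deviate(z):
    # z-score -> probability (the tick label printed beside each z).
    return _STD_NORMAL.cdf(z)
```

This transform spreads out the extreme probabilities (very low fallout, very high recall), which is why it was used for these curves.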