NIST Special Publication 500-215: The Second Text REtrieval Conference (TREC-2)
Appendix A: TREC-2 Results
National Institute of Standards and Technology
D. K. Harman
Results - Verity, Inc.

Summary Statistics
Run Number TOPIC2-full, manual
Num of Queries 50
Total number of documents over all queries
    Retrieved: 40869
    Relevant: 10489
    Rel_ret: 5600
[Graph: Recall-Precision Curve]

Recall Level Averages
Recall    Precision
0.00 0.7300
0.10 0.5963
0.20 0.4923
0.30 0.4160
0.40 0.3450
0.50 0.2139
0.60 0.1183
0.70 0.1350
0.80 0.0963
0.90 0.0289
1.00 0.0120
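The recall-level precisions above are conventionally interpolated: at each recall level, the reported precision is the maximum precision observed at any recall greater than or equal to that level. A minimal sketch of this interpolation, using hypothetical per-query (recall, precision) measurements (not data from this run):

```python
def interpolate(points, levels):
    """points: list of (recall, precision) pairs measured from a ranked run.
    Returns interpolated precision at each requested recall level:
    the max precision over all points with recall >= that level."""
    out = []
    for level in levels:
        candidates = [p for r, p in points if r >= level]
        out.append(max(candidates) if candidates else 0.0)
    return out

# Hypothetical measurements for one query
pts = [(0.25, 1.0), (0.5, 0.67), (0.75, 0.6), (1.0, 0.5)]
print(interpolate(pts, [0.0, 0.5, 1.0]))  # [1.0, 0.67, 0.5]
```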
Average precision over all relevant docs
    non-interpolated: 0.2830
Fallout Level Averages
Fallout   Recall
0.000     0.3293
0.020     0.4000
0.040     0.4399
0.060     0.4617
0.080     0.4843
0.100     0.4966
0.120     0.5094
0.140     0.5243
0.160     0.5355
0.180     0.5500
0.200     0.5631
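Fallout is the fraction of the nonrelevant documents in the collection that the system retrieved, so recall rises as the fallout level is relaxed. A minimal sketch with hypothetical counts (the function name is illustrative, not from the TREC evaluation software):

```python
def fallout(nonrel_retrieved, total_nonrel):
    """Fallout: fraction of all nonrelevant documents that were retrieved."""
    if total_nonrel == 0:
        return 0.0
    return nonrel_retrieved / total_nonrel

# Hypothetical: 30 of 10000 nonrelevant documents were retrieved
print(fallout(30, 10000))  # 0.003
```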
Document Level Averages
          Precision
At 5 docs 0.6080
At 10 docs 0.5800
At 15 docs 0.5513
At 20 docs 0.5390
At 30 docs 0.5221
At 100 docs 0.4028
At 200 docs 0.3048
At 500 docs 0.1805
At 1000 docs 0.1120
R-Precision (precision after R docs retrieved, where R is the number of relevant documents)
    Exact: 0.3418
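As a minimal sketch of the definition above, R-Precision for a single query can be computed from a ranked result list as follows; the ranking and relevance judgments here are hypothetical, not drawn from this run:

```python
def r_precision(ranked_doc_ids, relevant_doc_ids):
    """Precision after R documents are retrieved,
    where R is the number of relevant documents for the query."""
    r = len(relevant_doc_ids)
    if r == 0:
        return 0.0
    top_r = ranked_doc_ids[:r]
    hits = sum(1 for d in top_r if d in relevant_doc_ids)
    return hits / r

# Hypothetical query: 4 relevant docs, 3 of them in the top 4 ranks
ranking = ["d3", "d7", "d1", "d9", "d4", "d2"]
relevant = {"d1", "d3", "d7", "d2"}
print(r_precision(ranking, relevant))  # 0.75
```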
[Graph: Fallout-Recall Curve (Recall vs. Fallout x 100)]
[Graph: Normal Deviate - Fallout-Recall]