NIST Special Publication 500-215: The Second Text REtrieval Conference (TREC-2)
Appendix A: TREC-2 Results
National Institute of Standards and Technology
D. K. Harman
Siemens Corporate Research, Inc.
Summary Statistics
Run Number: siems3-full, automatic
Num of Queries: 50
Total number of documents over all queries
    Retrieved: 50000
    Relevant:  10785
    Rel_ret:   8065
Recall Level Averages
Recall    Precision
0.00      0.8164
0.10      0.5849
0.20      0.5262
0.30      0.4774
0.40      0.4262
0.50      0.3771
0.60      0.3111
0.70      0.2338
0.80      0.1362
0.90      0.0513
1.00      0.0000
Average precision over all rel docs
non-interpolated    0.3397
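The Recall Level Averages and the non-interpolated average precision above are standard trec_eval-style measures. As a sketch of how they are computed for a single query (an assumption about the exact evaluation code, which this appendix does not reproduce):

```python
def recall_precision(ranked_rel, num_rel):
    """Per-query recall/precision measures from a ranked list.

    ranked_rel -- list of booleans, True where the i-th retrieved doc is relevant
    num_rel    -- total number of relevant docs for the query
    """
    rel_so_far = 0
    points = []  # (recall, precision) observed at each relevant document
    for rank, is_rel in enumerate(ranked_rel, start=1):
        if is_rel:
            rel_so_far += 1
            points.append((rel_so_far / num_rel, rel_so_far / rank))
    # Non-interpolated average precision: mean of the precisions at each
    # relevant doc; unretrieved relevant docs contribute zero.
    avg_prec = sum(p for _, p in points) / num_rel
    # Interpolated precision at the 11 recall levels 0.0, 0.1, ..., 1.0:
    # the maximum precision observed at any recall >= that level.
    interp = [max((p for r, p in points if r >= i / 10), default=0.0)
              for i in range(11)]
    return interp, avg_prec
```

Averaging these per-query values over the 50 queries yields the eleven-point table and the 0.3397 figure reported above.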
Fallout Level Averages
Recall
0.4714
0.5999
0.6680
0.7207
0.7543
0.7720
0.7757
0.7757
0.7757
0.7757
0.7757
Document Level Averages
Precision
At 5 docs 0.5960
At 10 docs 0.5920
At 15 docs 0.5773
At 20 docs 0.5630
At 30 docs 0.5453
At 100 docs 0.4692
At 200 docs 0.3979
At 500 docs 0.2598
At 1000 docs 0.1613
R-Precision (precision after R docs retrieved, where R is the number of relevant documents)
Exact: 0.3931
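The Document Level Averages and the Exact R-Precision are cutoff measures over the same ranked list; a minimal per-query sketch under the same assumptions (boolean relevance judgments per retrieved document):

```python
def precision_at(ranked_rel, k):
    """Precision after k docs retrieved (the 'At k docs' rows above)."""
    return sum(ranked_rel[:k]) / k

def r_precision(ranked_rel, num_rel):
    """Precision after R docs retrieved, where R is the number of
    relevant documents for the query."""
    return sum(ranked_rel[:num_rel]) / num_rel
```

As with the recall-level measures, the reported figures are these values averaged over the 50 queries.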
[Figure: Recall-Precision Curve. Precision (y-axis, 0 to 1) plotted against recall (x-axis, 0 to 1).]

[Figure: Fallout-Recall Curve, plotted on normal deviate scales. Recall (y-axis) against fallout x 100 (x-axis, 0 to 0.2), with axis ticks at 0.02, 0.16, 0.5, 0.84, and 0.98 corresponding to deviates -2 through +2.]
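The normal deviate scale used by the fallout-recall plot places a probability p at the position of the standard normal inverse CDF, which is why tick labels such as 0.02, 0.16, 0.5, 0.84, and 0.98 sit at the deviates -2 through +2. A sketch of both quantities (function names are illustrative, not from the TREC-2 code):

```python
from statistics import NormalDist

def fallout(ranked_rel, num_nonrel):
    """Fallout after retrieving len(ranked_rel) docs: the fraction of the
    collection's nonrelevant documents that were retrieved."""
    return sum(1 for r in ranked_rel if not r) / num_nonrel

def normal_deviate(p):
    """Axis position of probability p on a normal deviate scale
    (standard normal inverse CDF)."""
    return NormalDist().inv_cdf(p)
```

On this scale a retrieval system whose scores for relevant and nonrelevant documents are normally distributed traces an approximately straight fallout-recall line.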