Scientific Report No. IRS-13 Information Storage and Retrieval
An Analysis of the Documentation Requests
E. M. Keen
Harvard University
Gerard Salton
The discussion of unclear requests in part D is closely linked to the relevance
decisions since, as J. O'Connor shows [1], relevance disagreements are often due
to unclear request forms; furthermore, since many requests that are thought to be
clear are in fact not, one is led to different request interpretations and hence
to different relevance decisions. Probably more examples exist than the five given
in part D, but by the stringent criteria for clarity suggested by O'Connor, many
real user requests would also be regarded as unclear.
Several of the requests deal with quite similar topics, and sometimes do not have
as many relevant documents in common as the requests suggest. Examples are requests
A15 and B8, B1 and B16, B9 and B11, and A5, B3 and B6.
A clear error of judgment is seen for document 7, where the photocomposition method
described for producing NASA's "Scientific and Technical Aerospace Reports" is
thought relevant to request A7, which demands documents on systems for producing
original papers by computer. The request preparer probably did not realize that
NASA STAR is not a series of original reports.
No specific examples have been found of documents that should have been
recognized as relevant, except in those cases where two or more requests
seem very similar, as noted.
5. Request Performance
A) General Performance Analysis Methods.
It was intended to divide the individual requests into three groups (an
illustrative sketch of such a grouping follows the list), namely:
a) Requests which perform badly on all processing options;
b) Requests which perform well on some options and badly on others;
c) Requests which perform well on all options.
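A minimal sketch of such a three-way grouping is given below. It assumes only that
each request has some numeric performance figure (for example a normalized recall
or precision value) for each processing option, and that a single cutoff separates
"good" from "bad" performance; the request identifiers, option names, and the 0.5
cutoff are illustrative assumptions, not values taken from this report.

    # Illustrative sketch only: classify requests by their performance
    # across processing options, using an assumed per-option score and
    # an assumed cutoff between "good" and "bad" performance.
    def classify_requests(scores, cutoff=0.5):
        bad_on_all, mixed, good_on_all = [], [], []
        for request, by_option in scores.items():
            results = [value >= cutoff for value in by_option.values()]
            if all(results):
                good_on_all.append(request)   # group (c)
            elif not any(results):
                bad_on_all.append(request)    # group (a)
            else:
                mixed.append(request)         # group (b)
        return bad_on_all, mixed, good_on_all

    # Hypothetical figures for three requests on three options.
    example = {
        "A5":  {"stem": 0.80, "thesaurus": 0.85, "phrases": 0.90},
        "B3":  {"stem": 0.20, "thesaurus": 0.75, "phrases": 0.60},
        "B16": {"stem": 0.10, "thesaurus": 0.15, "phrases": 0.25},
    }
    print(classify_requests(example))   # -> (['B16'], ['B3'], ['A5'])

A single fixed cutoff is of course a crude criterion; a ranking-based comparison
across options would serve the same purpose, but the three-way split above mirrors
the intended grouping of the requests.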