Guidelines for the TREC-2001 Video Track


Goal:

Data:

Tasks:

Motivations for initial task choices:

Example types of needs:

I'm interested in video material / information about:
  1. a specific person
     e.g., I want all the information you have on Ronald Reagan.
  2. one or more instances of a category of people
     e.g., Find me some footage of men wearing hardhats.
  3. a specific thing
     e.g., I'm interested in any material on Hoover Dam. I'm looking for a picture of the OGO satellite.
  4. one or more instances of a category of things
     e.g., I need footage of helicopters.
  5. a specific event/activity
     e.g., I'm looking for a clip of Ronald Reagan reading a speech about the space shuttle.
  6. one or more instances of a category of events/activities
     e.g., I want to include several different clips of rockets taking off. I need to explain what cavitation is all about.
  7. other?

Topics:

Evaluation:

Each participating group is allowed to submit the results from up to two system variants.

A submission checker program in Java is provided to find some basic errors in submissions, but it is the participating group's responsibility to submit well-formed data for evaluation. There is no guarantee that ill-formed submissions will be evaluated.
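The checker catches only basic structural problems, so it is worth verifying locally that a submission parses at all before sending it. The fragment below is not the NIST checker; it is a minimal Java sketch, assuming the submission is an XML file as implied by the DTDs referenced in the milestones below, that simply reports whether the file is well-formed (the class name and command-line usage are illustrative only).

    // Minimal well-formedness check for a submission file. Illustrative
    // only -- this is NOT the official NIST submission checker.
    import java.io.File;
    import javax.xml.parsers.DocumentBuilder;
    import javax.xml.parsers.DocumentBuilderFactory;
    import org.xml.sax.ErrorHandler;
    import org.xml.sax.SAXParseException;

    public class SubmissionWellFormednessCheck {
        public static void main(String[] args) throws Exception {
            DocumentBuilderFactory factory = DocumentBuilderFactory.newInstance();
            DocumentBuilder builder = factory.newDocumentBuilder();
            builder.setErrorHandler(new ErrorHandler() {
                public void warning(SAXParseException e)    { report("WARNING", e); }
                public void error(SAXParseException e)      { report("ERROR", e); }
                public void fatalError(SAXParseException e) throws SAXParseException {
                    report("FATAL", e);
                    throw e; // not well-formed: stop immediately
                }
                private void report(String level, SAXParseException e) {
                    System.err.println(level + " at line " + e.getLineNumber()
                            + ": " + e.getMessage());
                }
            });
            builder.parse(new File(args[0])); // throws if the file is not well-formed XML
            System.out.println(args[0] + " is well-formed.");
        }
    }

A clean run (java SubmissionWellFormednessCheck mysubmission.xml) shows only that the file is well-formed XML; validating against the appropriate DTD and meeting the rest of the track's requirements remain the submitter's responsibility.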

The software used to evaluate the submissions is available to researchers along with a high-level description of the various matching and scoring algorithms involved. This software was produced by NIST, an agency of the U.S. government, and by statute is not subject to copyright in the United States. Recipients of this software assume all responsibilities associated with its operation, modification and maintenance.
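The authoritative matching and scoring rules are the ones implemented in that software and documented in its high-level description. Purely to illustrate the general kind of matching involved, here is a small Java sketch that scores a list of detected shot boundaries against a reference list; the frame tolerance and the greedy one-to-one matching rule are assumptions chosen for the example, not the official definitions.

    // Illustrative precision/recall computation for detected vs. reference
    // shot boundaries. The tolerance and matching rule are assumptions for
    // this sketch, not NIST's scoring procedure.
    public class BoundaryScoreSketch {
        public static void main(String[] args) {
            int[] reference = {120, 450, 900, 1500};   // reference boundary frames (hypothetical)
            int[] detected  = {118, 452, 1200, 1501};  // system output frames (hypothetical)
            int tolerance   = 5;                       // assumed matching tolerance in frames

            boolean[] refMatched = new boolean[reference.length];
            int matches = 0;
            for (int d : detected) {
                for (int r = 0; r < reference.length; r++) {
                    if (!refMatched[r] && Math.abs(d - reference[r]) <= tolerance) {
                        refMatched[r] = true;   // each reference boundary matches at most once
                        matches++;
                        break;
                    }
                }
            }
            double precision = detected.length == 0 ? 0.0 : (double) matches / detected.length;
            double recall    = reference.length == 0 ? 0.0 : (double) matches / reference.length;
            System.out.printf("precision = %.2f, recall = %.2f%n", precision, recall);
        }
    }

For the hypothetical numbers above the sketch reports precision = 0.75 and recall = 0.75: three of the four detections fall within the tolerance of a reference boundary, and three of the four reference boundaries are found.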

Results for TREC-2001:

The evaluated submissions are available from the TREC archive. Tables of the measures calculated by NIST are also available.

Milestones for TREC-2001:

15. Jan
Revised proposal posted to the trecvid discussion list for comment.
15. Feb
Groups intending to participate send a short application to NIST.
 1. Mar
Participating groups post to the trecvid list an estimate of how many topics of what sorts they intend to contribute (e.g., known-item topics that will need a human in the loop, general search requests for interactive and fully automatic processing, known-item topics that can be handled fully automatically, etc.).
16. Apr
Participating groups submit planned test topics to NIST.

NIST will pool the submitted topics by type: interactive vs. batch, known-item search(es) vs. general search. Systems will be tested against the union of the appropriate submitted topics. If enough topics are created, a random sample can be distributed for use in system development.
 1. May
Remaining details of the guidelines completed, including the schedule for distribution of test topics and for evaluation.
18. May
Topic definitions frozen
15. June
Test set for shot boundary detection announced.
Submission checker program available
 1. August
Shot boundary detection submissions due at NIST for evaluation. Here is a DTD for shot boundary results on one file, one for results on multiple files, and a small example of what a site would send to NIST for evaluation. Please check your submission to see that it is well-formed.
17. Aug
General and known-item search submissions due at NIST for evaluation. Here is a DTD for search results on one topic, one for results on multiple topics, and a small example of what a site would send to NIST for evaluation. Please check your submission to see that it is well-formed.
 1. Oct
Results of shot boundary detection evaluation returned to participants
 5. Oct
Results of evaluations returned to participants
28. Oct
Conference notebook papers due at NIST
13.-16. Nov
TREC 2001 Conference at NIST in Gaithersburg, Md.
 4. Feb 2002
Final proceedings papers due at NIST

Guideline issues still to be resolved:

Contacts:


For further information contact Paul Over (over@nist.gov)