TRECVID 2018 guidelines

Ad-hoc Video Search

Task coordinators: Georges Quénot and George Awad

System task:

Data:

Collaborative annotation:

Sharing of components:

Participation types:

Training types:

Submissions:

Three main submission types will be accepted:

  1. Fully automatic (F) runs (no human input in the loop): The system takes a query as input and produces results without any human intervention.

  2. Manually-assisted (M) runs: A human formulates the initial query based on the topic and the query interface, not on knowledge of the collection or of search results. The system then takes the formulated query as input and produces results without further human intervention.

  3. Relevance-Feedback (R) runs: The system takes the official query as input and produces initial results; a human judge may then assess the top-5 results and feed this information back to the system, which produces the final set of results. This feedback loop is permitted exactly once (see the sketch after this list).
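
A minimal sketch of the R-run constraint, in Python; search(), rerank(), and ask_human() are hypothetical placeholders standing in for your own system and judge interface, not TRECVID-supplied code:

    def relevance_feedback_run(query, search, rerank, ask_human, limit=1000):
        initial = search(query, limit)      # automatic initial search
        top5 = initial[:5]                  # the judge may assess only the top-5 results
        judgments = {shot: ask_human(query, shot) for shot in top5}
        # The feedback loop is permitted exactly once: one rerank, no iteration.
        return rerank(query, initial, judgments)[:limit]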

Each team may submit a maximum of 4 prioritized runs per submission type, plus 2 additional runs if those are of the "no annotation" training type and the others are not. The submission formats are described below.

Please note: Only submissions which are valid when checked against the supplied DTDs will be accepted. You must check your submission before submitting it. NIST reserves the right to reject any submission which does not parse correctly against the provided DTD(s). Various checkers exist, e.g., Xerces-J: java sax.SAXCount -v YourSubmission.xml.
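
For example, a rough equivalent of the Xerces-J check can be sketched with Python's lxml library, assuming you have saved the NIST DTD locally next to your run file:

    from lxml import etree

    def check_run_file(xml_path, dtd_path="videoAdhocSearchResults.dtd"):
        dtd = etree.DTD(dtd_path)       # the NIST-supplied DTD, saved locally
        tree = etree.parse(xml_path)    # raises XMLSyntaxError if not well-formed
        if not dtd.validate(tree):      # structural validation against the DTD
            raise ValueError(str(dtd.error_log.filter_from_errors()))

    check_run_file("YourSubmission.xml")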

  • Participants will submit real results against the IACC.3 data in each run, for all and only the 30 Ad-hoc queries released by NIST, with at most 1000 shot IDs per query (see the sanity-check sketch after this list).

  • Here for download (right click and choose "display page source" to see the entire files) are a DTD for Ad-hoc search results of one main run, the container for one run, and a small example of what a site would send to NIST for evaluation. Please check all your submissions to see that they are well-formed.

  • Please submit each of your runs in a separate file, named to make clear which team has produced it. EACH file you submit should begin, as in the example submission, with the DOCTYPE statement:

    <!DOCTYPE videoAdhocSearchResults SYSTEM "https://www-nlpir.nist.gov/projects/tv2018/dtds/videoAdhocSearchResults.dtd">

    that refers to the DTD at NIST via a URL, and contains a videoAdhocSearchResults element even though only one run is included. Each submitted file must be compressed using exactly one of the following: gzip, tar, zip, or bzip2.

  • Remember to use the correct shot IDs in your submissions - the ones from the video segment elements in the mp7 files of the master shot reference, with the format shotFILENUMBER_SHOTNUMBER. Do not use the ID associated with the video (TRECVID2016_FILENUMBER) or the one associated with a keyframe (shotFILENUMBER_SHOTNUMBER_RKF); the sanity-check sketch after this list rejects both.

  • Submissions will be transmitted to NIST via a password-protected webpage.
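
A pre-submission sanity check along the following lines may help. The limits (30 queries, at most 1000 shot IDs each), the shot-ID pattern, and the accepted compression formats come from the rules above; the run representation (a dict mapping each query ID to its ranked shot IDs) is an assumption for illustration:

    import gzip, re, shutil

    # shotFILENUMBER_SHOTNUMBER; rejects video IDs and *_RKF keyframe IDs
    SHOT_ID = re.compile(r"^shot\d+_\d+$")

    def check_and_compress(run, xml_path):
        """run: dict mapping each query ID to its ranked list of shot IDs."""
        assert len(run) == 30, "results required for all and only the 30 queries"
        for query, shots in run.items():
            assert len(shots) <= 1000, f"{query}: more than 1000 shot IDs"
            bad = [s for s in shots if not SHOT_ID.match(s)]
            assert not bad, f"{query}: invalid shot IDs, e.g. {bad[:3]}"
        # Compress with one of the accepted formats (gzip shown here).
        with open(xml_path, "rb") as src, gzip.open(xml_path + ".gz", "wb") as dst:
            shutil.copyfileobj(src, dst)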

Evaluation:

All queries (approx. 30) will be evaluated by assessors at NIST after pooling and sampling.
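
As a rough picture of what pooling and sampling involve: the top-ranked shots from all submitted runs are unioned per query, and deeper strata are sampled at lower rates. The depths, stratum boundaries, and sampling rates below are illustrative placeholders, not NIST's actual parameters:

    import random

    def pool_for_assessment(runs, query,
                            strata=((1, 100, 1.0), (101, 500, 0.2))):
        """runs: dict mapping run IDs to {query: ranked list of shot IDs}."""
        pooled, seen = [], set()
        for lo, hi, rate in strata:   # e.g. 100% of ranks 1-100, 20% of 101-500
            stratum = set()
            for ranking in runs.values():
                for shot in ranking[query][lo - 1:hi]:
                    if shot not in seen:
                        stratum.add(shot)
            pooled.extend(random.sample(sorted(stratum), int(len(stratum) * rate)))
            seen |= stratum
        return pooled                 # the shots human assessors will judge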

Please note that NIST uses a number of rules in manual assessment of system output.

Measures:

  • Mean extended inferred average precision (mean xinfAP), which allows the sampling density to vary, e.g., so that it can be 100% in the top strata, which are most important for average precision.

  • As in past years, other detailed measures based on recall and precision will be provided by the sample_eval software.

  • Speed will also be measured: clock time per query search, reported in seconds (to one decimal place).
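
For the speed measure, a minimal sketch of per-query timing; search() is a placeholder for your own system's search call:

    import time

    def timed_search(search, query):
        start = time.perf_counter()
        results = search(query)   # placeholder for your system's search call
        elapsed = round(time.perf_counter() - start, 1)   # seconds, one decimal place
        return results, elapsed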

Issues: