Video is becoming a new means of documenting everything from recipes to how to change a car tire. This ever-expanding volume of multimedia video content necessitates the development of new technologies for retrieving relevant videos based solely on their audio and visual content. Participating MED teams will create a system that quickly finds events in a large collection of search videos.
Given an evaluation collection of videos (files) and a set of event kits, systems must provide a rank and a confidence score for each evaluation video indicating whether the video contains the event. Both the Pre-Specified and AdHoc Event tasks will be supported. NIST will create up to 10 new AdHoc event kits.
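The required system output described above can be sketched as follows. This is an illustrative sketch only, not the official MED submission format: the video IDs and scores are hypothetical, and the per-video confidence scores are assumed to come from some event-detection model outside the scope of this example.

```python
# Hypothetical per-video confidence scores (video_id -> confidence in [0, 1])
# that the video contains the target event; the scoring model producing
# these values is not shown here.
scores = {"vid_003": 0.91, "vid_001": 0.42, "vid_002": 0.77}

# Rank videos by descending confidence, as the task requires a rank and
# a confidence score for each evaluation video.
ranked = sorted(scores.items(), key=lambda kv: kv[1], reverse=True)
for rank, (video_id, confidence) in enumerate(ranked, start=1):
    print(rank, video_id, confidence)
```

The actual submission must, of course, follow the file format specified in the official evaluation plan.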
The development data will be the same as last year. The evaluation search collection will be the YFCC100M portion of last year's search collection, along with an additional 100,000 clips from YFCC100M.
Submissions will follow the MED '16 paradigm: each submission is a single tarball bundle with minimal hardware/runtime reporting. Each team may submit up to 5 Pre-Specified Event runs and up to 2 AdHoc Event runs. Each run must contain results for a given condition.
For each event, the submissions will be pooled across all runs and a sample will be judged by human assessors at NIST. Run-level effectiveness will be measured with mean inferred average precision. Details of the evaluation will be posted on the MED website.