CALL FOR PARTICIPATION in the
2014 TREC VIDEO RETRIEVAL EVALUATION (TRECVID 2014)
February 2014 - November 2014
Conducted by the National Institute of Standards and Technology (NIST)
with additional funding from other US government agencies
I n t r o d u c t i o n:
The TREC Video Retrieval Evaluation series (trecvid.nist.gov) promotes
progress in content-based analysis of and retrieval from digital video
via open, metrics-based evaluation. TRECVID is a laboratory-style
evaluation that attempts to model real world situations or significant
component tasks involved in such situations.
D a t a:
In TRECVID 2014 NIST will use at least the following data sets:
* IACC.2.A, IACC.2.B, and IACC.2.C
Three datasets - totaling approximately 7300 Internet Archive
videos (144GB, 600 hours) with Creative Commons licenses in
MPEG-4/H.264 format, with durations ranging from 10 seconds to 6.4
minutes and a mean duration of almost 5 minutes. Most videos will have
some metadata provided by the donor available, e.g., title, keywords,
and description.
* IACC.1.A, IACC.1.B, and IACC.1.C
Three datasets - totaling approximately 8000 Internet Archive videos
(160GB, 600 hours) with Creative Commons licenses in MPEG-4/H.264
format with durations between 10 seconds and 3.5 minutes. Most videos will
have some metadata provided by the donor available e.g., title,
keywords, and description
Approximately 3200 Internet Archive videos (50GB, 200 hours)
with Creative Commons licenses in MPEG-4/H.264 with durations
between 3.6 and 4.1 minutes. Most videos will have some metadata
provided by the donor available, e.g., title, keywords, and
description.
* BBC EastEnders
Approximately 244 video files (totaling 300GB, 464 hours) with
associated metadata, each containing a week's worth of BBC EastEnders
programs in MPEG-4/H.264 format.
* Gatwick and i-LIDS MCT airport surveillance video
The data consist of about 150 hours obtained from airport
surveillance video data (courtesy of the UK Home Office). The
Linguistic Data Consortium has provided event annotations for
the entire corpus. The corpus was divided into development and
evaluation subsets. Annotations for 2008 development and test
sets are available.
* HAVIC
HAVIC is a large collection of Internet multimedia constructed
by the Linguistic Data Consortium and NIST. Participants will
receive a new ~8,000 hour (200,000 clip) MED14 search collection
including a ~1,300 hour search subset for participants with
limited computing resources.
T a s k s:
In TRECVID 2014 NIST will evaluate systems on the following tasks
using the [data] indicated:
* SIN: Semantic indexing [IACC.2]
Automatic assignment of semantic tags to video segments can be
fundamental technology for filtering, categorization, browsing,
search, and other video exploitation. Technical issues to be
addressed include methods needed/possible as collection size and
diversity increase, when the number of features increases, and
when features are related by an ontology. The spatiotemporal
localization subtask will be continued, as will support for
measuring system progress over 3 years.
* SED: Surveillance video event detection (interactive) [i-LIDS]
Detecting human behaviors efficiently in vast amounts of
surveillance video, both retrospectively and in real time, is
fundamental technology for a variety of higher-level
applications of critical importance to public safety and
security. In this task participants will examine the performance
of interactive surveillance video search for a known set of
events. NIST is working on the possibility of adding new
annotations to additional Gatwick videos to create a new test
set - community involvement may be critical in this effort.
* INS: Instance search (interactive, automatic) [BBC EastEnders video]
An important need in many situations involving video collections
(archive video search/reuse, personal video organization/search,
surveillance, law enforcement, protection of brand/logo use) is
to find more video segments of a certain specific person,
object, or place, given a visual example. In 2014 this task will
continue much as in 2013, again using EastEnders videos but with
30 new topics.
* MED: Multimedia event detection [HAVIC]
Participants are tasked with building an automated system that
can determine whether an event is present anywhere in a video
clip using the content of the video clip only. The system gets
as inputs a set of "search" videos and an "event kit" (text and
video describing the event). The system computes an "event
score" (that gives the strength of evidence of the event) and an
optional "recounting" of the event in each search video in
the input set. The 2014 evaluation will consist of 30
"pre-specified" and 10 new "ad-hoc" event kits containing 100,
10, or 0 example event videos. Participants may choose to either
run their system on the full evaluation set of 200,000 clips
(8000 hours) or on a subset containing approximately 32,000 clips.
* MER: Multimedia event recounting [HAVIC]
Once a MED system locates a specific event in a video clip, a
user may wish to analyze the evidence indicating that the event
was present. Given an event kit, and a video clip that the MED
system has determined to contain the event, the MED system can
then produce a recounting that summarizes the key evidence of
the event in a form that is semantically meaningful to a
human. The MER evaluation will assess those recountings.
MER participants are those participants in the MED evaluation
whose systems also produce a recounting for each clip that their
MED system declares as containing an instance of the event of
interest.
In addition to the data, TRECVID will provide uniform scoring
procedures, and a forum for organizations interested in comparing
their approaches and results.
Participants will be encouraged to share resources and intermediate
system outputs to lower entry barriers and enable analysis of various
components' contributions and interactions.
* You are invited to participate in TRECVID 2014 *
The evaluation is defined by the Guidelines. A draft version is
available: http://www-nlpir.nist.gov/projects/tv2014/tv2014.html and
details will be worked out starting in February based in part on input
from the participants.
You should read the guidelines carefully before applying to
participate.
P l e a s e n o t e:
1) Dissemination of TRECVID work and results other than in the
(publicly available) conference proceedings is welcomed, but the
conditions of participation specifically preclude any advertising
claims based on TRECVID results.
2) All system results submitted to NIST are published in the
Proceedings and on the public portions of the TRECVID web site archive.
3) The workshop is open only to participating groups that submit
results for at least one task and to selected government personnel
from sponsoring agencies and data donors.
4) Each participating group is required to submit before the November
workshop a notebook paper describing their experiments and results.
This is true even for groups who may not be able to attend the
workshop.
5) It is the responsibility of each team contact to make sure that
information distributed via the call for participation and the
tv14.list email discussion list is disseminated to all team members
with a need to know. This includes information about deadlines and
restrictions on use of data.
6) By applying to participate you indicate your acceptance of the
above conditions and obligations.
T e n t a t i v e v e n u e / s c h e d u l e
To accommodate TRECVID participants who also plan to attend the ACM
Multimedia Conference in Orlando, Florida, 3-7 November, we are
working on holding TRECVID 2014 at the University of Central Florida
(Orlando), 10-12 November in collaboration with UCF's Center for
Research in Computer Vision. Plans for this change in venue should be
completed by the time the guidelines are final on 1 April.
A tentative schedule for the tasks is included in the Guidelines.
W o r k s h o p f o r m a t
Plans are for a 2 1/2 day workshop. Confirmation and details will be
provided to participants as soon as available.
The TRECVID workshop is used as a forum both for presentation of results
(including failure analyses and system comparisons), and for more
lengthy system presentations describing retrieval techniques used,
experiments run using the data, and other issues of interest to
researchers in information retrieval. As there is a limited amount of
time for these presentations, the evaluation coordinators and NIST
will determine which groups are asked to speak and which groups will
present in a poster session. Groups that are interested in having a
speaking slot during the workshop will be asked to submit a short
abstract before the workshop describing the experiments they
performed. Speakers will be selected based on these abstracts.
H o w t o r e s p o n d t o t h i s c a l l
Organizations wishing to participate in TRECVID 2014 must respond
to this call for participation by submitting an on-line application by
23 February. Only ONE APPLICATION PER TEAM please, regardless of how
many organizations the team comprises.
*PLEASE* only apply if you are able and fully intend to complete the
work for at least one task. Taking the data but not submitting any
runs threatens the continued operation of the workshop and the
availability of data for the entire community.
Here is the application URL:
You will receive an immediate response when your application is
received. NIST will respond with more detail to all applications
beginning just after the application deadline. At that point you'll be
given the active participant's userid and password, be subscribed to
the tv14.list email discussion list, and can participate in finalizing
the guidelines as well as sign up to get the data, which is controlled
by separate passwords.
T R E C V I D 2 0 1 4 e m a i l d i s c u s s i o n l i s t
The tv14.list email discussion list will serve as the main forum for
discussion and for dissemination of information about TRECVID 2014.
It is each participant's responsibility to monitor the tv14.list
postings. It accepts postings only from the email addresses used to
subscribe to it. At the bottom of the guidelines there is a link to
an archive of past postings, available using the active participant's
userid and password.
Q u e s t i o n s
Any administrative questions about conference participation,
application format/content, subscriptions to the tv14.list,
etc. should be sent to paul.over at nist.gov.
updated: Tuesday, 18-Feb-2014 07:27:55 EST