Class VideoSubmission

java.lang.Object
  |
  +--StoriesSegment
        |
        +--VideoSubmission

public class VideoSubmission
extends StoriesSegment


Field Summary
 
Fields inherited from class StoriesSegment
clippedRef, clippedSub, clippingPoints, deletion, insertion, lengthVideo, match, matchType, missingType, nomatchType, offsetAnalogMinusDigital, segmentation, segmentationBeforeClipping, treated
 
Constructor Summary
VideoSubmission(boolean typed, org.w3c.dom.Node video)
          Constructor of the class.
VideoSubmission(boolean typed, org.w3c.dom.Node video, java.util.Vector clippingPoints, java.util.Vector offsets)
          Constructor of the class.
 
Method Summary
 void addEndTime()
          Adds the endTime of the last story of the segmentation, i.e. the length of the MPEG-1 video
 java.util.Vector applyClippingPoints()
          Applies the clipping points to the current video.
 TimeBase getAlign()
          Returns the offset to add to the submission time (digital) to get the analog time.
 int getNbDetectedBoundaries(int fuzzyFactor)
          Returns the number of detected boundaries for this video.
 int getNumbersOfReferenceBoundariesDetectedOneFile(int fuzzyFactor)
          Returns the number of detected reference boundaries for this video.
 int getTotalTimeIdentifiedOf(java.lang.String type)
          Returns the total time of a specific story type.
 boolean getTyped()
          Tells whether the results for this video contain classification information
 java.lang.String getVideoFileName()
          Returns the name of the video associated with the current truth
 void load()
          Loads the video results from the XML file and fills the attributes of VideoSubmission
 void setTruthData(TruthData truth)
          Associates a TruthData object with the current video results
 
Methods inherited from class StoriesSegment
addOneStory, displaySegment, getBeginStoryForTime, getBeginTimeStory, getEndTimeStory, getMPEG1Length, getNumberOfBoundaries, getNumberOfEvaluatedBoundaries, getNumberOfStories, getTimeOfBoundary, getTotalTimeOneStory, getType, getType, getVideoLength, invertStoriesSegment, returnResultBoundaryDistribution, returnResultStoryDistribution, searchClippingPoints
 
Methods inherited from class java.lang.Object
clone, equals, finalize, getClass, hashCode, notify, notifyAll, toString, wait, wait, wait
 

Constructor Detail

VideoSubmission

public VideoSubmission(boolean typed,
                       org.w3c.dom.Node video,
                       java.util.Vector clippingPoints,
                       java.util.Vector offsets)
Constructor of the class.

Parameters:
typed - true if the run this video comes from is typed, false otherwise
video - XML Node associated with the current video results
clippingPoints - Vector containing all the clipping points
offsets - Vector containing all the offsets

VideoSubmission

public VideoSubmission(boolean typed,
                       org.w3c.dom.Node video)
Constructor of the class.

Parameters:
typed - true if the run this video comes from is typed, false otherwise
video - XML Node associated with the current video results
Method Detail

setTruthData

public void setTruthData(TruthData truth)
Associates a TruthData object with the current video results

Parameters:
truth - A truthData object

applyClippingPoints

public java.util.Vector applyClippingPoints()
Applies the clipping points to the current video. Note that it does not use the clipping points from the .txt file.

Specified by:
applyClippingPoints in class StoriesSegment
Returns:
Vector of Story clipped

 if ($offset < 0)
   # MPEG starts before the ground truth, so we have to remove initial stories
   # from the submission
   # remove stories in interval [0.0 -- -$offset]

       GGGGGGGG
   MMMMMMMMM
   DDDD
  (G is ground truth, M is MPEG, D is deleted)

 if (1800 - $offset < $length)
   # ground truth ends before the end of the video, so we have to remove final
   # stories from the submission
   # remove stories in interval [(1800 - $offset) -- $length]

  GGGGGGGG
  MMMMMMMMMMMMM
          DDDDD
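The two rules above might look like the following in Java. This is a minimal sketch under stated assumptions: the double[]{begin, end} story representation, the clip method name, and whole-story removal are simplifications of the actual API, which works on Story objects and clips them against the 1800-second ground-truth window.

```java
import java.util.ArrayList;
import java.util.List;

public class ClippingSketch {
    /**
     * Drop submitted stories that fall outside the 1800-second ground-truth window.
     * Each story is {begin, end} in submission (digital) time; offset = analog - digital.
     */
    static List<double[]> clip(List<double[]> stories, double offset, double videoLength) {
        List<double[]> kept = new ArrayList<>();
        for (double[] s : stories) {
            // MPEG starts |offset| seconds before the truth: drop stories ending in [0, -offset]
            if (offset < 0 && s[1] <= -offset) continue;
            // Truth ends before the video does: drop stories starting after (1800 - offset)
            if (1800 - offset < videoLength && s[0] >= 1800 - offset) continue;
            kept.add(s);
        }
        return kept;
    }
}
```

The real applyClippingPoints also truncates stories that straddle a window edge rather than simply dropping them; the sketch only shows where the two cut points come from.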
 

addEndTime

public void addEndTime()
Adds the endTime of the last story of the segmentation, i.e. the length of the MPEG-1 video

Specified by:
addEndTime in class StoriesSegment

getTyped

public boolean getTyped()
Tells whether the results for this video contain classification information

Returns:
boolean true if the run is typed and false if it is not.

getVideoFileName

public java.lang.String getVideoFileName()
Returns the name of the video associated with the current truth

Specified by:
getVideoFileName in class StoriesSegment
Returns:
String name of the file, e.g. 19980405_ABC.mpg

load

public void load()
Loads the video results from the XML file and fills the attributes of VideoSubmission


getNbDetectedBoundaries

public int getNbDetectedBoundaries(int fuzzyFactor)
Returns the number of detected boundaries for this video. It loops over the boundaries of the video and searches for a boundary in the associated truth that matches within ±fuzzyFactor. All times are converted to analog times before being compared.

Parameters:
fuzzyFactor - matching tolerance applied on each side of a boundary
Returns:
int number of detected boundaries
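The matching loop described above can be sketched as follows. The class and method names are illustrative, not the actual API; the real method reads its boundaries from the video and its associated TruthData, whereas here they are passed in as plain lists of times already converted to the analog scale.

```java
import java.util.List;

public class FuzzySketch {
    /** Count submitted boundaries that have a reference boundary within +-fuzzyFactor. */
    static int countDetected(List<Double> submitted, List<Double> reference, double fuzzyFactor) {
        int detected = 0;
        for (double s : submitted) {
            for (double r : reference) {
                if (Math.abs(s - r) <= fuzzyFactor) { // both times on the analog scale
                    detected++;
                    break; // count each submitted boundary at most once
                }
            }
        }
        return detected;
    }
}
```

getNumbersOfReferenceBoundariesDetectedOneFile is the symmetric loop, iterating over the reference boundaries and matching against the submission.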

getNumbersOfReferenceBoundariesDetectedOneFile

public int getNumbersOfReferenceBoundariesDetectedOneFile(int fuzzyFactor)
Returns the number of detected reference boundaries for this video. It loops over the boundaries of the associated TruthData object and searches for a boundary in the current object that matches within ±fuzzyFactor. All times are converted to analog times before being compared.

Parameters:
fuzzyFactor - matching tolerance applied on each side of a boundary
Returns:
int number of detected reference boundaries

getTotalTimeIdentifiedOf

public int getTotalTimeIdentifiedOf(java.lang.String type)
Returns the total time of a specific story type. This method calls the Fusion class

Parameters:
type - Type of the story, for TRECVID2003 we need the information on "news"
Returns:
int total time
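The accumulation itself is straightforward. Below is a sketch under the assumption that a story carries a type plus begin and end times; the Story class and totalTimeOf method are illustrative stand-ins, not the actual Fusion class API.

```java
import java.util.List;

public class TotalTimeSketch {
    static class Story {
        final String type;
        final double begin, end; // story boundaries in seconds
        Story(String type, double begin, double end) {
            this.type = type; this.begin = begin; this.end = end;
        }
    }

    /** Sum, in whole seconds, the durations of all stories of the given type. */
    static int totalTimeOf(List<Story> stories, String type) {
        double total = 0.0;
        for (Story s : stories) {
            if (s.type.equals(type)) total += s.end - s.begin;
        }
        return (int) total;
    }
}
```

For TRECVID 2003 the type of interest is "news", so the call would sum only the news stories of the video.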

getAlign

public TimeBase getAlign()
Returns the offset to add to the submission time (digital) to get the analog time. This lets us compare the reference and the submission on the same scale: the analog time scale.

Specified by:
getAlign in class StoriesSegment
Returns:
offset returned as a TimeBase object: offset = AnalogTime - DigitalTime
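Given that offset = AnalogTime - DigitalTime, converting a submission (digital) time to the analog scale is a single addition. A minimal sketch, with a plain double standing in for the TimeBase type and illustrative names:

```java
public class AlignSketch {
    // offset = AnalogTime - DigitalTime, so analog = digital + offset.
    static double toAnalog(double digitalSeconds, double offsetSeconds) {
        return digitalSeconds + offsetSeconds;
    }
}
```

With this convention both the reference and the submission boundaries end up on the analog scale before being compared.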