1
Dataset Production and Performance Evaluation for
Event Detection and Tracking
  • Paul Hosmer
  • Detection and Vision Systems Group

2
Outline
  • Defining a requirement
  • What to include in datasets
  • Constraints
  • Evaluation and Metrics
  • Case Study

3
Background
  • Intelligent Video
  • Started in the early 1990s
  • FABIUS
  • Amethyst
  • Through to 2000s
  • VMD (video motion detection) capability study
  • Standards-based evaluations

4
What did we want to achieve?
  • Test systems in a short period of time
  • Provide data and requirements to research
    community
  • Dataset production
  • Problem: what to include?

5
Scenario definition
  • What is an event?
  • Where does the scenario take place?
  • What challenges are posed by the environment?
  • Ask end users / gauge demand
  • Conduct capability study
  • Monitor environment, apply a priori knowledge

6
Scenario definition
  • Abandoned Baggage
  • When is an object abandoned?
  • What types of object?
  • Attributes of person?

7
Scenario definition
  • Abandoned object
  • During the current clip, a person has placed an
    object, which was in their possession when they
    entered the clip, onto the floor or a seat in the
    detection area
  • That person has left the detection area without
    the object
  • More than sixty seconds after leaving the
    detection area, that person has still not
    returned to the object
  • The object remains in the detection area.
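A minimal sketch of this rule as a predicate, assuming the clip annotations expose the owner's exit time and per-frame object presence; every name here is illustrative, not part of any published i-LIDS specification.

```python
# Hypothetical encoding of the four-clause abandoned-object rule above.
# Field names are assumptions, not the i-LIDS ground-truth schema.

ABANDONMENT_DELAY_S = 60.0  # owner must stay away this long

def is_abandoned(object_placed_in_area: bool,
                 owner_left_area: bool,
                 owner_exit_time: float,
                 current_time: float,
                 object_still_in_area: bool) -> bool:
    """Apply the four clauses of the abandoned-object definition."""
    return (object_placed_in_area                  # placed on floor/seat in area
            and owner_left_area                    # owner has left the area
            and current_time - owner_exit_time > ABANDONMENT_DELAY_S
            and object_still_in_area)              # object remains in the area
```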

8
Scenario definition
  • Key environmental factors
  • Lighting changes: film at dawn and dusk
  • Rain and snow
  • Night: headlights and low SNR

9
How much data?
  • Need to demonstrate performance on wide range of
    imagery
  • Statistical significance
  • Need a large training and test corpus: 100s of
    events
  • Unseen data for verification

10
Constraints
  • You can't always capture the event you want:
    simulation
  • Make simulations as close to the requirement as
    possible
  • Storage vs. image quality: what will you want to
    do with the data at a later time?
  • Cost: try to film as many variations/events as
    you can

11
Performance Evaluation
  • Importance of metrics: consistency across
    different evaluations
  • When is an event detected?
  • Real-time evaluation, 10x real time, or offline:
    which is most useful?
  • Statistically significant unseen dataset
  • Performance on training data does not tell you
    anything useful about robustness

12
How HOSDB does it
  • Simulate a real analogue CCTV system
  • 215,000 frames per scenario evaluation
  • Evaluation: 300 events
  • 60 s to alarm after the ground-truth (GT) alarm
    condition is satisfied
  • One figure of merit for ranking
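A minimal sketch of the 60 s alarm-matching step described on this slide, assuming GT onsets and system alarms are lists of timestamps in seconds; the greedy matching strategy and all names are assumptions, not the actual HOSDB scoring tool.

```python
# Count true positives, false positives, and misses under the rule that a
# system alarm within 60 s of a GT alarm onset counts as a detection.

ALARM_WINDOW_S = 60.0

def score_alarms(gt_onsets: list[float], alarms: list[float]):
    """Greedily match each GT onset to the first alarm inside its window."""
    unmatched = sorted(alarms)
    tp = 0
    for onset in sorted(gt_onsets):
        hit = next((a for a in unmatched
                    if 0.0 <= a - onset <= ALARM_WINDOW_S), None)
        if hit is not None:
            tp += 1
            unmatched.remove(hit)
    fn = len(gt_onsets) - tp      # missed events
    fp = len(unmatched)           # alarms matching no GT event
    return tp, fp, fn
```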

13
F1 score for event detection
F1 = (α + 1)RP / (R + αP)
α ranges from 0.35 to 75 depending on the scenario
and application
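A worked example of the weighted F-measure above; the precision and recall values plugged in are made up purely for illustration.

```python
# Weighted F1 from the slide: F1 = (α + 1)·R·P / (R + α·P).
# With α = 1 this reduces to the familiar 2RP / (R + P).

def weighted_f1(precision: float, recall: float, alpha: float) -> float:
    return (alpha + 1) * recall * precision / (recall + alpha * precision)

# Small α weights precision more heavily; large α weights recall.
print(weighted_f1(0.8, 0.6, alpha=0.35))   # precision-leaning scenario
print(weighted_f1(0.8, 0.6, alpha=75.0))   # recall-leaning scenario
```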
14
What about Tracking?
  • 5th i-LIDS scenario
  • Multiple Camera Tracking
  • Increasing interest from end users
  • Significant potential to enhance operator
    effectiveness and aid post-event investigation
  • The Problem
  • Unifying tracking labelling across multiple
    camera views
  • Dataset and Evaluation Problem
  • Synchronisation

15
Operational Requirement
  • Camera Requirements
  • Existing CCTV systems
  • Cameras are a mixture of overlapping and
    non-overlapping
  • Internal cameras are generally fixed and colour
  • Scene Contents
  • Scenes are likely to contain rest points
  • Varying traffic densities
  • Target Description
  • There may be multiple targets
  • Targets from a wide demographic

16
Imagery Collection
17
Imagery Collection
  • Location
  • Large Transport Hub (airport)
  • Targets
  • Varied Targets
  • Differing target behaviour
  • Varying crowd densities
  • Environment
  • Lighting changes
  • Filmed at Dawn, Day, Dusk and Night
  • Volume
  • 5 cameras
  • 1.35 Million frames
  • Single and multiple target
  • 1000 target events
  • 1TB external HDD

19
Dataset structure
Target Event Sets (TES): MCT01, MCT02, MCT03, ...
TES properties: daytime / night time, high density /
low density, etc.
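Purely as an illustration, the structure above might be represented as follows; the MCT set names come from the slide, but every property value shown is an assumption, not the actual dataset labelling.

```python
# Hypothetical per-set property table for the Target Event Sets.
target_event_sets = {
    "MCT01": {"time_of_day": "daytime", "crowd_density": "high"},
    "MCT02": {"time_of_day": "night",   "crowd_density": "low"},
    "MCT03": {"time_of_day": "dusk",    "crowd_density": "medium"},
}
```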
20
Performance Metric
F1 = 2RP / (R + P)
P = Overlapping Pixels / Total Track Pixels
R = Overlapping Pixels / Total Ground Truth Pixels
  • F1 must be greater than or equal to 0.25 for the
    track to be a True Positive
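A sketch of this tracking metric, under the assumption that a track's pixels are available as a set of (frame, x, y) tuples; only the formulas and the 0.25 threshold come from the slide, the representation is an assumption.

```python
# Pixel-overlap F1 between a system track and its ground-truth track.

def track_f1(track_pixels: set, gt_pixels: set) -> float:
    if not track_pixels or not gt_pixels:
        return 0.0
    overlap = len(track_pixels & gt_pixels)
    p = overlap / len(track_pixels)   # precision over the system track
    r = overlap / len(gt_pixels)      # recall over the ground truth
    return 2 * r * p / (r + p) if (r + p) else 0.0

def is_true_positive(track_pixels: set, gt_pixels: set) -> bool:
    return track_f1(track_pixels, gt_pixels) >= 0.25  # threshold from slide
```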


21
Summary
  • Performance evaluation is important
  • Evaluations need to use more data
  • With richer content
  • With widely accepted definitions and metrics
  • Demonstrate improved performance