Augmented Virtual Environments (AVE): Dynamic Event Visualization

Transcript and Presenter's Notes

1
Augmented Virtual Environments (AVE): Dynamic
Event Visualization
  • Ulrich Neumann, Suya You
  • Integrated Media Systems Center
  • Computer Science Department
  • University of Southern California
  • September 2003

2
Problem Statement
  • Imagine dozens of video/data streams from people,
    UAVs, and robot sensors distributed and moving
    through a scene
  • Problem: visualization as separate
    streams/images provides no integration of
    information, no high-level scene comprehension,
    and obstructs collaboration

3
A Simple Example: USC Campus
[Figure: three numbered video streams (1, 2, 3)
shown as separate views of the campus]
Visualization as separate streams provides no
integration of information, no high-level scene
comprehension, and obstructs collaboration
4
AVE: Fusion of 2D Video and 3D Models
  • A VE captures only a snapshot of the real world
    and therefore lacks any representation of
    dynamic events and activities occurring in the
    scene
  • The AVE approach uses sensor models and 3D
    models of the scene to integrate dynamic
    video/image data from different sources
  • Visualize all data in a single context to
    maximize collaboration and comprehension of the
    big picture
  • Address dynamic visualization and change
    detection

5
Research Highlights and Progress
We address basic algorithm research and the
technology barriers inherent in an AVE system
  • Integrated Modeling System
    • whole campus, semi-automated
    • feature finding and extraction
    • linear/non-linear element fitting
  • Capture System
    • real-time DV streams (<4)
  • Rendering System
    • real-time graphics HW produces 28 fps on a
      dual 2 GHz PC at 1280x1024
  • Image Analysis System
    • detection and tracking of moving objects
      (people, vehicles) and pseudo-models

6
Integrated Modeling System
  • Approach
    • Model reconstruction
      • Input: LiDAR point cloud
      • Output: 3D mesh model
      • Automated
    • Building extraction
      • Vegetation removal
      • Building detection
      • Model fitting
      • Semi-automated

7
Model Reconstruction from LiDAR
  • Model reconstruction
    • Grid re-sampling (range image)
    • Hole-filling (adaptive weighted interpolation;
      sketched below)
    • Tessellation (Delaunay triangulation, depth
      filter)
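
A minimal sketch of the hole-filling step, assuming a range image held in a numpy array with NaNs marking holes. The inverse-distance weighting and the growing window are our illustrative reading of "adaptive weighted interpolation", not necessarily the authors' exact method.

```python
import numpy as np

def fill_holes(range_img, max_window=7):
    """Fill NaN holes in a range image by inverse-distance-weighted
    averaging of valid neighbors, growing the window until enough
    support is found (sketch of adaptive weighted interpolation)."""
    out = range_img.copy()
    h, w = range_img.shape
    for r, c in np.argwhere(np.isnan(range_img)):
        for k in range(1, max_window + 1):        # adapt window size
            r0, r1 = max(r - k, 0), min(r + k + 1, h)
            c0, c1 = max(c - k, 0), min(c + k + 1, w)
            patch = range_img[r0:r1, c0:c1]
            rr, cc = np.mgrid[r0:r1, c0:c1]
            valid = ~np.isnan(patch)
            if valid.sum() >= 3:                   # enough valid support
                wgt = 1.0 / np.hypot(rr - r, cc - c)[valid]
                out[r, c] = np.sum(wgt * patch[valid]) / wgt.sum()
                break
    return out
```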

8
Reconstructed USC Campus Model
Reconstructed range image
Reconstructed 3D model
9
Model Needs to be Refined
  • LiDAR data are noisy and incomplete
  • Artifacts make the model hard to visualize
    and texture-map

10
Model Refinement and Extraction
  • Produces complete models and improves texture
    visualization
  • Remove vegetation and ground
  • Extract and refine building models
  • Semi-automated
  • Element-based approach
    • Supports linear and nonlinear (high-order)
      surface fitting
    • Models irregular shapes

11
Model Extraction
  • Segmentation and building extraction
    • Users define an area of interest (two or
      three points)
    • Edge and surface points are then automatically
      segmented

12
Model Fitting - linear
13
Model Fitting - nonlinear
Superquadric fitting via Levenberg-Marquardt
nonlinear optimization
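
A minimal sketch of superquadric fitting with Levenberg-Marquardt, assuming centered, axis-aligned surface points. The inside-outside residual is the standard superquadric formulation; scipy's `least_squares(method="lm")` stands in for whatever solver the authors actually used.

```python
import numpy as np
from scipy.optimize import least_squares

def superquadric_residuals(params, pts):
    """Inside-outside residuals for a superquadric:
    F = ((|x/a1|^(2/e2) + |y/a2|^(2/e2))^(e2/e1) + |z/a3|^(2/e1)),
    with F == 1 exactly on the surface."""
    a1, a2, a3, e1, e2 = params
    x, y, z = pts[:, 0] / a1, pts[:, 1] / a2, pts[:, 2] / a3
    f = (np.abs(x) ** (2 / e2) + np.abs(y) ** (2 / e2)) ** (e2 / e1) \
        + np.abs(z) ** (2 / e1)
    return f ** e1 - 1.0   # exponent e1 tames the residual's range

def fit_superquadric(pts):
    """Fit scales (a1, a2, a3) and shape exponents (e1, e2) to the
    segmented surface points (needs at least 5 points for 'lm')."""
    x0 = np.append(pts.max(axis=0) - pts.min(axis=0), [1.0, 1.0])
    sol = least_squares(superquadric_residuals, x0, args=(pts,),
                        method="lm")
    return sol.x
```
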
14
LA Natural History Museum (before model fitting)
15
LA Natural History Museum (after model fitting)
16
LA Natural History Museum (embedded)
17
USC Campus: University Park (reconstructed)
18
USC Campus: University Park (ground removed)
19
USC Campus: University Park (model fitting)
20
USC Campus: University Park (embedded)
21
USC Campus: University Park (with aerial photo texture)
22
USC Campus (close view)
23
AVE Sensor Models (Tracking)
  • Portable tracking package
    • DGPS (Z-Sensor base/mobile from Ashtech)
    • INS (IS300 from InterSense)
    • Stereo camera head (MEGA-D from Videre Design)
  • Real-time data acquisition and AR display
    • GPS: 1 Hz
    • INS: 150 Hz
    • Video: 30 Hz
  • Synchronize and fuse at the 30 Hz video rate
    (sketched below)
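
A minimal sketch of the synchronize-and-fuse step, assuming each sensor delivers timestamped samples that are resampled onto the 30 Hz video clock by linear interpolation. All names here are illustrative, not from the original system.

```python
import numpy as np

def resample_to_video(sensor_t, sensor_vals, video_t):
    """Linearly interpolate each sensor channel onto the video
    timestamps so every 30 Hz frame gets a synchronized sample.
    sensor_vals: (N, C) array of C channels sampled at times sensor_t."""
    return np.column_stack([
        np.interp(video_t, sensor_t, sensor_vals[:, i])
        for i in range(sensor_vals.shape[1])
    ])

# Usage sketch: attach a fused pose to every video frame.
# gps_30 = resample_to_video(gps_t, gps_xyz, video_t)   # 1 Hz -> 30 Hz
# ins_30 = resample_to_video(ins_t, ins_rpy, video_t)   # 150 Hz -> 30 Hz
# (Linear interpolation of Euler angles is naive; quaternion slerp
#  would be more correct for orientation.)
```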

24
Tracking Needs to be Stabilized
  • GPS/INS accuracy alone is not sufficient
  • The error is easily visible and undesirable
  • One degree of orientation error results in about
    11 pixels of alignment error in the image plane
    (see the arithmetic below)
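
A back-of-the-envelope check of that figure. The focal length is our assumption (roughly 630 pixels, plausible for the cameras described), not a number from the slides:

```python
import math

f_px = 630                          # assumed focal length in pixels
err = f_px * math.tan(math.radians(1.0))
print(f"{err:.1f} px")              # -> 11.0 px of image-plane error
```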

25
Camera Pose Stabilization
  • Vision tracking is used for pose refinement
  • Vision tracking is also essential to overcome
    GPS dropouts
  • Complementary vision tracker
    • Originally developed for feature
      auto-calibration (1999-2002)
    • Pose and 3D structure estimated
      simultaneously
    • Line (edge) and point features are used for
      tracking
    • Model-based approach

26
Model Based Tracking
  • Combines geometric and intensity constraints to
    establish accurate 2D-3D correspondences
  • Hybrid tracking strategy
    • GPS/INS data aid the vision tracking by
      reducing the search space and providing
      tolerance to interruptions
    • Vision corrects for drift and error
      accumulation
  • Extended Kalman Filter (EKF) framework
    (see the sketch below)
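
A minimal sketch of such a hybrid filter, assuming a constant-velocity position/velocity state predicted with INS acceleration and corrected by a vision-derived position measurement. With these linear models it reduces to a plain Kalman filter; the matrices and noise levels are placeholders, not the system's actual filter design.

```python
import numpy as np

class HybridPoseEKF:
    """Toy hybrid filter: GPS/INS drives prediction, vision corrects.
    State x = [position(3), velocity(3)]; vision measures position."""

    def __init__(self, dt=1.0 / 30):
        self.dt = dt
        self.x = np.zeros(6)
        self.P = np.eye(6)
        self.F = np.eye(6)
        self.F[:3, 3:] = dt * np.eye(3)        # constant-velocity model
        self.Q = 1e-3 * np.eye(6)              # process noise (placeholder)
        self.H = np.hstack([np.eye(3), np.zeros((3, 3))])
        self.R = 1e-2 * np.eye(3)              # vision noise (placeholder)

    def predict(self, ins_accel):
        """Propagate the state with the motion model plus INS input."""
        self.x = self.F @ self.x
        self.x[3:] += self.dt * ins_accel
        self.P = self.F @ self.P @ self.F.T + self.Q

    def update(self, vision_pos):
        """Correct drift with a vision-derived position measurement."""
        y = vision_pos - self.H @ self.x                  # innovation
        S = self.H @ self.P @ self.H.T + self.R
        K = self.P @ self.H.T @ np.linalg.inv(S)          # Kalman gain
        self.x = self.x + K @ y
        self.P = (np.eye(6) - K @ self.H) @ self.P
```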

27
Dynamic Image/Model Fusion
  • Update the sensor pose and image to paint the
    scene each frame
  • Compute the texture transformation during
    rendering of each frame (sketched below)
  • Dynamic control during a visualization session
    to reflect the most recent information
  • Supports 1-3 real-time video streams
  • Real-time rendering: graphics HW produces
    28 fps on a dual 2 GHz PC at 1280x1024
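
A minimal sketch of that texture transformation as standard projective texture mapping: compose the tracked camera pose with the intrinsics, project model vertices into the video frame, and use the result as texture coordinates. The function names and normalization are our own, assumed rather than taken from the system.

```python
import numpy as np

def projection_matrix(K, R, t):
    """3x4 world-to-image projection from intrinsics K and the
    tracked world-to-camera pose (R, t)."""
    return K @ np.hstack([R, t.reshape(3, 1)])

def project_to_uv(P, verts_w, width, height):
    """Project Nx3 world-space model vertices into [0,1]^2 texture
    coordinates of the current video frame."""
    homo = np.hstack([verts_w, np.ones((len(verts_w), 1))])
    pix = (P @ homo.T).T
    pix = pix[:, :2] / pix[:, 2:3]             # perspective divide
    return pix / np.array([width, height])     # normalize to [0,1]
```

Recomputing P from the latest tracked pose every frame is what lets the newest video continuously paint the static model.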

28
Dynamic Event Analysis and Modeling
  • Video analysis
    • Segmenting and tracking moving objects
      (people, vehicles) in the scene
  • Event modeling
    • Creating pseudo-3D animated models
    • Improving visualization and situational
      awareness

29
Tracking and Modeling Approach
  • Object detection
    • Background subtraction
    • A variable-length time-average background model
    • Morphological filtering
  • Object tracking
    • SSD correlation matching (a sketch of the
      detection and tracking steps follows)
  • Object modeling
    • Dynamic polygon model
    • 3D parameters (position, orientation, and size)
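
A minimal OpenCV sketch of the detection and tracking steps, assuming grayscale frames; the running-average weight and thresholds are placeholders standing in for the variable-length time-average model described above.

```python
import cv2
import numpy as np

def detect_moving_objects(frames, alpha=0.05, thresh=25):
    """Yield per-frame foreground masks via running-average background
    subtraction followed by morphological filtering."""
    bg = frames[0].astype(np.float32)
    kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (5, 5))
    for frame in frames[1:]:
        cv2.accumulateWeighted(frame.astype(np.float32), bg, alpha)
        diff = cv2.absdiff(frame, cv2.convertScaleAbs(bg))
        _, mask = cv2.threshold(diff, thresh, 255, cv2.THRESH_BINARY)
        mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, kernel)   # drop specks
        mask = cv2.morphologyEx(mask, cv2.MORPH_CLOSE, kernel)  # fill gaps
        yield mask

def ssd_track(frame, template):
    """Locate a tracked object by SSD correlation (TM_SQDIFF)."""
    scores = cv2.matchTemplate(frame, template, cv2.TM_SQDIFF)
    min_val, _, min_loc, _ = cv2.minMaxLoc(scores)
    return min_loc  # top-left corner of the best SSD match
```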

30
Tracking and Modeling Results
31
Integrated AVE Environment
  • An integrated visualization environment built in
    the IMSC laboratory
  • 8x10-foot acrylic back-projection screen
    (Panowall) with a stereo-glasses interface
  • Christie Mirage 2000 stereo cinema projector with
    HD SDI
  • 3rdTech ceiling tracker
  • A dual 2 GHz CPU computer (Dell) with an Nvidia
    Quadro FX 400 graphics card
  • Supports multiple DV video sources (<4) in
    real time (28 fps)

32
Integrated AVE Environment
Video demonstration
33
Interactions
  • Collaboration with Northrop Grumman (TRW)
    • Installed system (8/03) for demonstrations
  • Publications
    • IEEE CGA (accepted): Approaches to Large-Scale
      Urban Modeling
    • PRESENCE (accepted): Visualizing Reality in an
      Augmented Virtual Environment
    • IEEE CGA (accepted): Augmented Virtual
      Environments for Visualization of Dynamic
      Imagery
    • CGGM03: Urban Site Modeling From LiDAR
    • VR2003: Augmented Virtual Environments (AVE):
      Dynamic Fusion of Imagery and 3D Models
    • SIGMM03 (accepted): 3D Video Surveillance with
      Augmented Virtual Environments
  • Demos/proposals/talks
    • NIMA, NRO, ICT, Northrop Grumman, Lockheed
      Martin, HRL/DARPA, Olympus, Airborne1

34
Future Plan
  • Automated Modeling: automate segmentation,
    primitive selection, fitting, and fusion of
    imagery data
  • Real-time tracking of moving cameras: model-based
    tracking with fused gyro, GPS, and vision
  • Dynamic Modeling: classification and model
    fitting for moving objects
  • Texture Management: texture retention,
    progressive refinement
  • System Architecture: scalable video streams and
    rendering capability (PC clusters?)