Road Map Accuracy Evaluation - PowerPoint PPT Presentation

Slides: 96
Provided by: pmch

Transcript and Presenter's Notes

Title: Road Map Accuracy Evaluation

1
Road Map Accuracy Evaluation
  • Shashi Shekhar
  • Max Donath
  • Pi-Ming Cheng
  • Weili Wu
  • Xiaobin Ma
  • Research Project Team Meeting
  • (A New Approach to Assessing Road User Charges)
  • June 12th, 2002

2
Motivation
  • Observations
  • Each GIS dataset (e.g. roadmaps) can contain
    various errors
  • Failure to control / manage errors may limit or
    invalidate applications
  • Alternative Approach to Road User Charge
    Assessment
  • Evaluation of digital road map databases
  • Accuracy - Charges must be correct and complete
  • Coverage - System must be usable throughout the
    USA
  • Fairness - Errors by the charging system must
    not be spatially biased!
  • Goals
  • Develop content and quality requirements for
    digital GIS road maps
  • Recommend a cost-effective approach

3
Example Requirements
  • TIGER Accuracy Improvement Project (11/2/2001)
  • Target Date - 2010
  • Goals: Correctly place a mobile GPS-equipped
    computer
  • On correct side of street 100 percent of the time
  • In correct relationship to legal boundaries 100
    percent of time
  • Alternative Road User Charge System
  • Phase I: Correctly identify
  • Coarse jurisdiction (e.g. state)
  • Coarse road types (state, county, other public,
    private)
  • Phase II: Correctly identify
  • Finer jurisdictions (federal, state, county,
    city, private)
  • 9 road types (freeway, state highway, arterial,
    ...)

4
Example Requirement 2
  • Source: Forkenbrock, Hanley, Tech. Paper 4 (Nov.
    2001),
  • "GPS Accuracy Issues Related to the New Approach
    to Assessing Road User Charges"
  • Applications
  • Assessing road user charges
  • Congestion pricing, lane pricing
  • Roadmap wish list
  • Positions - lanes, roads
  • Attributes - classification, political
    jurisdiction
  • Accuracy wish-list
  • Positional accuracy - 1-2 meters for lanes, 30
    meters for roads
  • Assumes - Road separation less than 30 m is rare.
  • Can current GPS ... and GIS road files promise
    30 m accuracy?

5
Understanding Requirements
  • Which map accuracy?
  • Positional accuracy - horizontal, vertical
  • Other - attribute accuracy - not specified
  • What is positional accuracy (e.g. 30 m)?
  • Worst case - error always below the threshold?
  • Statistical, e.g. median, 90th-percentile?
  • What is the positional accuracy budget for
    roadmaps?
  • Roadmap budget = total positional accuracy
    budget - GPS accuracy budget
  • Total budget less than 30 m
  • GPS accuracy depends on location, weather,
  • Roadmap accuracy should be higher where GPS
    accuracy is lower!

6
Close Road Pair
  • Separation may be less than 30 m

7
Outline
  • Motivation
  • Background
  • Roadmap sources and components
  • Accuracy definition and components
  • Related Work
  • Our Approach
  • Preliminary results
  • Challenges

8
Roadmap Sources
  • Sources for navigable digital road maps
  • Public sector
  • State e.g., State DOT base maps
  • Federal TIGER file, USGS
  • Private
  • Navigable maps Tele Atlas, NavTek, GDT, PC Miler
  • Cartographic AAA, Rand McNally

9
Road Map Components
  • Position
  • latitude, longitude, altitude for intersections,
    shape points
  • center line for road segments
  • Attributes
  • Route attribute (name, type)
  • Topology
  • Route segment (direction, type, restrictions)
  • Routing attributes (intersections, turn
    restrictions)
  • Not widely available
  • position of lanes, political jurisdiction

10
Definitions
  • Accuracy
  • Closeness of estimates to true values
  • (or values accepted to be true)
  • the accuracy of the database may have little
    relationship to the accuracy of products computed
    from the database
  • Precision
  • number of decimal places (significant digits) in
    a measurement
  • Common practice
  • round down 1 decimal place below measurement
    precision

11
Components of Map Accuracy
  • Source Chrisman
  • Spatial data are of limited accuracy - inaccurate
    to some degree; the important questions are:
  • How to measure accuracy?
  • How to track the way errors are propagated
    through GIS operations?
  • Components of Data Quality
  • positional accuracy
  • attribute accuracy
  • logical consistency
  • completeness
  • lineage

12
Positional Accuracy - Definition
  • The closeness of location (coordinates)
    information to the true position
  • Measures of positional accuracy
  • Paper map - one line width or 0.5 mm
  • About 12 m on 1:24,000, or 125 m on 1:250,000
    maps
  • RMS error
  • 90th percentile, 95th percentile
  • Components of positional accuracy
  • Horizontal, Vertical
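The measures above (RMS error, 90th/95th percentile) can be sketched over a list of point displacements; a minimal illustration, with made-up error values in meters:

```python
import math

def rms_error(errors):
    """Root-mean-square of positional errors."""
    return math.sqrt(sum(e * e for e in errors) / len(errors))

def percentile(errors, p):
    """p-th percentile error, nearest-rank method (no interpolation)."""
    ordered = sorted(errors)
    rank = math.ceil(p / 100.0 * len(ordered))
    return ordered[rank - 1]

# Hypothetical positional errors (meters) at ten test points.
errors = [5.0, 12.0, 8.0, 30.0, 15.0, 9.0, 22.0, 11.0, 7.0, 18.0]
print(round(rms_error(errors), 2))   # 15.55
print(percentile(errors, 90))        # 22.0
```

Note the two summaries can differ a lot: one 30 m outlier inflates the RMS but not the 90th percentile.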

13
Framework to test positional accuracy
  • Compare with a reference source of higher
    accuracy
  • find a larger scale map
  • use the Global Positioning System (GPS)
  • use raw survey data
  • Use internal evidence
  • Indications of inaccuracy
  • Unclosed polygons, lines which overshoot or
    undershoot junctions
  • A measure of positional accuracy
  • The sizes of gaps, overshoots and undershoots
  • Compute accuracy from knowledge of the errors
  • Introduced by different sources, e.g.
  • 1 mm in source document
  • 0.5 mm in map registration for digitizing
  • 0.2 mm in digitizing

14
Attribute Accuracy
  • The closeness of attribute values to their true
    value
  • Measures depend on nature of the data
  • measurement error for continuous attributes
    (surfaces)
  • e.g. elevation accurate to 1 m
  • categorical attributes such as classified
    polygons
  • gross errors, such as a polygon classified as A
    when it should have been B,
  • e.g. land use is shopping center instead of golf
    course
  • Framework to test attribute accuracy
  • Create a misclassification matrix
  • Ideally, all points lie on the diagonal of the
    matrix
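A misclassification matrix of this kind can be sketched as a counter over (true, mapped) class pairs; the road classes and samples here are invented for illustration:

```python
from collections import Counter

def misclassification_matrix(true_vals, mapped_vals):
    """Count (true class, mapped class) pairs; diagonal = correct."""
    return Counter(zip(true_vals, mapped_vals))

# Hypothetical ground-truth vs. map-recorded road classes.
true_vals   = ["hwy", "hwy", "frontage", "frontage", "arterial"]
mapped_vals = ["hwy", "frontage", "frontage", "frontage", "arterial"]

matrix = misclassification_matrix(true_vals, mapped_vals)
correct = sum(n for (t, m), n in matrix.items() if t == m)
print(correct / len(true_vals))     # overall attribute accuracy: 0.8
print(matrix[("hwy", "frontage")])  # off-diagonal: hwy mislabeled once
```

Ideally all counts fall on the diagonal; each off-diagonal cell names a specific confusion (here, highway taken for frontage road).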

15
Logical Consistency
  • Internal consistency of the data structure
  • Particularly applies to topological consistency
  • Examples
  • Is the database consistent with its definitions?
  • If there are polygons, do they close?
  • Is there exactly one label within each polygon?
  • Are there nodes wherever arcs cross, or do arcs
    sometimes cross w/o forming nodes?
  • Do road-segments meet at intersections?

16
Completeness
  • The degree to which the data exhausts the
    universe of possible items
  • Up-to-date vs. complete
  • Examples
  • Are all possible objects included within the
    database?
  • Does the digital map cover all new developed
    area?

17
Lineage
  • A record of the data sources and of the
    operations which created the database
  • Examples
  • How was it digitized, from what documents?
  • When was the data collected?
  • What agency collected the data?
  • What steps were used to process the data?

18
Problem Definition
  • Given
  • A GIS roadmap dataset and a Gold Standard
  • Definition of accuracy
  • Find
  • Spatial Accuracy of the given GIS dataset
  • Objectives
  • Fair, reliable, tamper-proof, low cost
  • Constraints
  • Gold-standard accuracy is better than GIS
    dataset accuracy

19
Outline
  • Motivation, Background
  • Related Work
  • Topology, Attribute Accuracy (Navtek)
  • Positional Accuracy
  • Standards
  • Etak
  • GTAAT (GPS TIGER Accuracy Assessment Tool)
  • Our Approach
  • Preliminary results
  • Challenges

20
Attribute and Topology Accuracy
  • Claims - 97% accuracy (Navtech NTC)
  • What does it mean?
  • Accuracy = 100% - percent error
  • percent error = linear combination of 13
    component errors
  • Example components: segment existence, name,
    direction, speed, ownership (public/private),
    address range, prohibited maneuver, ...
  • Definitions for sampling
  • Metropolitan areas (MA): city and suburbs (US)
  • Primary sampling units (PSU): USGS 7.5 minute
    quadrangles (US)
  • Cells - each PSU has 25 cells (5 by 5 grid)
  • Subcells - each cell has 4 subcells (2 by 2 grid)
  • Samples: random cells from 6 PSUs per MA (150-200
    segments)
  • Pick all roads in classes 1, 2 and 3 (arterial)
  • Pick roads in class 4 (non-arterial) from a
    random subcell

21
Positional Accuracy Standards
  • 1947 US National Map Accuracy Standards (NMAS)
  • 90% of the tested points have errors below a
    threshold
  • Threshold 1/30 inch for scales larger than
    1:20,000
  • Threshold 1/50 inch for scales of 1:20,000 and
    smaller
  • Q? "How far out are the 10%?" "Where are the
    10%?"
  • e.g. all of the 10% of points off by several
    inches, all in one road
  • Am. Soc. for Photogram. and Remote Sensing
    (ASPRS)
  • 3 different thresholds (class A, B, C) for each
    scale
  • Dozen scales or so
  • US National Standard for Spatial Data Accuracy
    (NSSDA)
  • 95 percent of points have errors below the
    reported value
  • Relates to RMS error for a normal distribution
  • British Standard
  • RMS error
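The NSSDA link between the 95th percentile and RMS error can be checked analytically. Assuming a circular normal error model (independent N(0, σ) errors in x and y), the radial error is Rayleigh-distributed, so the ratio of its 95th percentile to its radial RMS is a fixed constant:

```python
import math

# Circular normal error: x, y ~ N(0, sigma) independently, so the radial
# error r = sqrt(x^2 + y^2) is Rayleigh(sigma), with
#   95th percentile = sigma * sqrt(-2 * ln(0.05))
#   radial RMS      = sigma * sqrt(2)
sigma = 1.0
p95 = sigma * math.sqrt(-2.0 * math.log(0.05))
rms = sigma * math.sqrt(2.0)
print(round(p95 / rms, 4))   # 1.7308, the NSSDA horizontal factor
```

This is why NSSDA can report a 95%-confidence accuracy value as a fixed multiple of the measured RMS error when errors are normally distributed.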

22
Etak Accuracy Assessment
  • June 1999 Announcement
    (www.etak.com/News/newmap.html)
  • Claims: Conforms to National Map Accuracy
    Standards (NMAS)
  • 70% of US population (1.6 million miles) at
    1:24,000 scale
  • Another 25% of US population at 1:100,000 scale
  • Geo-coding - 98% match rate
  • Interpretation 1
  • NMAS requires 90th percentile of error below 1/50
    inch
  • 40 feet (12.2 meters) at 1:24,000 scale
  • 166 feet (51 meters) at 1:100,000 scale
  • Interpretation 2
  • 70% population: metropolitan areas
  • Another 25% population: small towns
  • TIGER has 8.5 million miles of roads
  • Roads corrected are about 1/5th of TIGER roads!

23
TIGER file Accuracy Assessment
  • http://www.census.gov/geo/www/tiger/
  • Report: John S. Liadis, TIGER Operations Branch,
    Geography Division
  • Findings
  • Tested 6800 points across 8 sites, multiple
    sources
  • Mean error: 281 feet (about 90 meters)
  • Median error: 166 feet (about 50 meters)
  • Errors vary across locations (median from 30 m to
    160 m)
  • Errors vary across sources (median from 32 m to
    350 m)
  • 90th percentile errors (NMAS) are much worse!
  • 110 m - 400 m across different sources

24
GPS TIGER Accuracy Assessment Tool
  • GPS TIGER Accuracy Analysis Tools (GTAAT)
  • Calculates the distance and azimuth difference
  • Between the GPS collected point and the
    equivalent TIGER point
  • Indicates accuracy of some popular digital maps
  • Statistical approach
  • Visualization approach
  • Goals for TIGER Accuracy Improvement Project
    (11/2/2001)
  • Correctly place a mobile GPS-equipped computer
  • On correct side of street 100 percent of the time
  • In correct relationship to legal boundaries 100
    percent of time

25
GPS Tracks vs. Road Maps
  • Visualization Approach



Tiger-based Map
USGS Digital Map
26
GTAAT Workflow Diagram
27
GTAAT Process Diagram
28
GTAAT Report: GPS Data Cleaning
  • Post process collected GPS coordinates
  • Selective availability of the GPS signal
  • GPS satellite clock error
  • Ephemeris data error
  • Tropospheric delay
  • Unmodeled ionospheric delay
  • Differential corrections in post-processing
  • Remove errors common to both the reference and
    remote receivers
  • Do not correct multi-path or receiver noise
  • Trimble's Pathfinder Office 2.51 software used
  • Requires downloading data from a GPS base station
  • A local station is available

29
GTAAT GPS Source/Operation
Collected GPS anchor points by source or update
operation
(Red number: source code not used in the
source-by-source analysis)
30
GTAAT Ranking of road map quality
  • Median variance by source
  • Median distance difference between GPS and the
    corresponding TIGER feature, by operation (or
    source)

31
Accuracy Assessment in Road Map
  • GTAAT Statistical Approach
  • Test site: Windham County, VT (50025)
  • Result of distance by census

32
Accuracy Assessment in Road Map (2)
  • GTAAT Statistical Analysis: Site-by-Site
    Comparison
  • Test site: Maricopa County, AZ (04013)
  • Result of distance by tract

33
Limitations of Related Work
  • Limited to positional accuracy and lineage
  • Did not evaluate attribute accuracy, completeness
  • Positional accuracy measure is limited
  • No separation of lateral and longitudinal error
  • lateral error affects road determination
  • longitudinal error affects administrative zone
    determination
  • Not scalable to road networks
  • Point-to-point comparison is limited and slow
  • Did not model GPS accuracy
  • GPS accuracy = f(location, weather)

34
Outline
  • Motivation, Background
  • Related Work
  • Our Approach
  • Positional Accuracy
  • Map Matching Accuracy
  • Attribute Accuracy
  • Preliminary results
  • Challenges

35
Our Approach
  • Evaluate total system (GPS + roadmap)
  • Road classification accuracy
  • Evaluate road map component
  • Positional accuracy
  • Attribute accuracy

36
Positional Accuracy
  • Lateral accuracy
  • Definition: perpendicular (RMS) distance from GPS
    reading to center line of road in road map.
  • Longitudinal accuracy
  • Definition: horizontal distance from GPS reading
    to corresponding geodetic point.

Comment: Lateral error is more important when the
closest road is parallel; longitudinal error is
more important otherwise.
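The lateral error at a single reading is the perpendicular distance to the mapped center line. A minimal sketch in planar coordinates (assuming readings already projected to meters; the coordinates below are invented):

```python
import math

def lateral_distance(p, a, b):
    """Shortest distance from point p to the center-line segment a-b."""
    (px, py), (ax, ay), (bx, by) = p, a, b
    dx, dy = bx - ax, by - ay
    seg_len2 = dx * dx + dy * dy
    if seg_len2 == 0.0:                      # degenerate segment
        return math.hypot(px - ax, py - ay)
    # Project p onto the line through a-b, clamped to the segment.
    t = max(0.0, min(1.0, ((px - ax) * dx + (py - ay) * dy) / seg_len2))
    return math.hypot(px - (ax + t * dx), py - (ay + t * dy))

# A GPS reading 3 m off a straight east-west center line.
print(lateral_distance((5.0, 3.0), (0.0, 0.0), (10.0, 0.0)))   # 3.0
```

The lateral RMS over a whole track is then the RMS of this distance across all readings, taken against the corresponding mapped road.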
37
Positional Accuracy Measures
  • Point-based
  • Input: pairs of corresponding points on road map
    and gold standard
  • Output: RMS (distance between pairs)
  • Comment: poor scalability to large road networks
  • need to stop GPS vehicles at geodetic points
  • expensive and dangerous
  • Line-string based
  • Lateral error: RMS (shortest distance of GPS
    reading to center line of corresponding roads)

38
Methodology
Workflow (diagram): digital road map data; site
selection; subsets of road maps; gather GPS tracks
by driving vehicle; GPS logs; visualization tools;
overlay of road map and gold standard; assess
positional accuracy; statistical analysis
39
Map Matching
  • Garmin error circle on USA toposheet maps
    (Source: Garmin)
  • Risk of matching to incorrect road in map

40
Map Matching Accuracy
  • Map matching accuracy depends on
  • Positional accuracy, Attribute accuracy,
    Completeness
  • Map Matching Accuracy Measures
  • Miles misclassified
  • Number of road pairs closer than threshold (30 m)
  • Probability of misclassifying the road for a GPS
    reading

41
Methodology
Workflow (diagram): digital road map data; site
selection for misclassification accuracy; gather
gold standard values (e.g., site field survey,
aerial images); assess misclassification accuracy;
statistical analysis; visualization tool
42
Attribute Accuracy &amp; Completeness
  • Interesting Attributes
  • Economic attributes - administration zone(s),
    congestion zones
  • Route attribute - name, type, time restrictions
  • Route segment - direction, type (e.g. bridge),
    restrictions
  • Routing attributes - intersections, turn
    restrictions
  • Definition of Attribute Accuracy
  • Pr[value of an attribute for a given road segment
    is correct]
  • Definition of Completeness
  • Pr[a road segment is in the digital map]
  • Pr[attribute value is not defined for a road
    segment]
  • Scope
  • Small sample

43
Methodology
Workflow (diagram): digital road map data; site
selection for attribute accuracy; site selection
for completeness; gather gold standard values
(e.g., site field survey, aerial images); assess
attribute accuracy and completeness; statistical
analysis and visualization
44
Core Activities
  • Acquire digital road maps
  • Visualization
  • Select test sites
  • Gather gold standard data for test site
  • GPS tracks, Surveys, etc.
  • Compute accuracy measures
  • Statistical analysis

45
Outline
  • Motivation, Background
  • Related Work
  • Our Approach
  • Preliminary results
  • Map acquisition, visualization
  • Site selection
  • Gold standard data collection
  • Positional Accuracy
  • Map Matching Accuracy
  • Challenges

46
Progress
  • Acquire digital road maps
  • Obtained Etak 7-county MN metro map
  • Obtained basemaps (1997, 1999) from Mn/DOT
  • Purchasing two counties (Hennepin and St. Louis)
    from Etak/Tele Atlas
  • Gather gold standard data for test sites
  • Acquired sample GPS tracks from field survey
  • Visualization
  • Developed Java-based map access software
  • Reads digital map sources and GPS data
  • Displays overlay of these two sources
  • Visualizes error
  • Data Analysis

47
Progress Roadmap Acquisition
  • Sources for navigable digital road maps
  • Public sector: State DOT base maps, TIGER file,
    USGS
  • Private: Etak/TeleAtlas, NavTek, GDT, PC Miler
  • Acquisitions
  • Etak Minneapolis-St. Paul metropolitan area
    (7 counties)
  • Basemaps (1997, 1999) from Mn/DOT
  • Plans
  • St. Louis county (MN) from Etak/Tele Atlas
  • State and county boundaries
  • Attributes - Q? Which attributes are needed
    beyond road type, state name, and county name?

48
Progress Report: Visualization
  • Off the shelf
  • Arc/View
  • mapquest.com
  • Route guidance, Overlay (GPS, roadmaps)
  • Buffers
  • Custom (Java)
  • site selection
  • new accuracy metrics
  • Ex. Etak map for Twin Cities (7 counties)

49
Outline
  • Motivation, Background
  • Related Work
  • Our Approach
  • Preliminary results
  • Map acquisition, visualization
  • Site selection
  • Gold standard data collection
  • Positional Accuracy
  • Map Matching Accuracy
  • Challenges

50
Progress Report: Site Selection
  • Site Selection Goals
  • Map matching for a GPS reading (track)
  • Map Accuracy Positional, Attribute
  • GPS studies
  • Map matching for a GPS reading (track)
  • Challenge: small road separation, e.g. spaghetti
    junctions
  • Colocations, i.e. stretches of road pairs with
    different types
  • Map Accuracy Positional, Attribute
  • Road center lines, state boundaries, county
    boundaries
  • State names, county names, road types,
    public/private, ...
  • GPS Studies
  • Natural and urban canyons / valleys

51
Site Selection for Map Matching
  • Formulated as co-location pattern detection
  • Problem formulation
  • Given
  • 1)  A digital roadmap with a set of
    road-types
  • 2) A spatial neighbor relation R over
    locations, (e.g. buffer size S)
  • 3) Prevalence measure (e.g. Min length of
    co-located stretches)
  • Find
  • Stretches of subsets of road-types
    satisfying given threshold on lengths
  • Objectives
  • Correctness, completeness, computational
    efficiency
  • Constraints
  • 1)  R is symmetric and reflexive
  • 2) Monotonic prevalence measure

52
Approaches to Co-location Mining - 1
  • Prevalence Measure (PM)
  • Given a buffer size S
  • PM(A → B) = number of miles of B within
    buffer(A, S)
  • PM(B → A) = number of miles of A within
    buffer(B, S)
  • PM(A, B) = minimum( PM(A → B), PM(B → A) )

(Figure: Road A (type 1) with buffer(Road A, S);
Road B (type 2) shown green inside the buffer, red
outside)
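The prevalence measure above can be sketched by approximating "length within the buffer" as the length of those segments whose midpoints lie within S of the other road; the roads and buffer sizes below are invented for illustration:

```python
import math

def _seg_dist(p, a, b):
    """Distance from point p to segment a-b (planar coordinates)."""
    dx, dy = b[0] - a[0], b[1] - a[1]
    l2 = dx * dx + dy * dy
    if l2 == 0.0:
        return math.hypot(p[0] - a[0], p[1] - a[1])
    t = max(0.0, min(1.0, ((p[0] - a[0]) * dx + (p[1] - a[1]) * dy) / l2))
    return math.hypot(p[0] - (a[0] + t * dx), p[1] - (a[1] + t * dy))

def length_within_buffer(line_b, line_a, s):
    """Approximate length of line_b inside buffer(line_a, s): count a
    segment of B if its midpoint is within s of A."""
    total = 0.0
    for a, b in zip(line_b, line_b[1:]):
        mid = ((a[0] + b[0]) / 2.0, (a[1] + b[1]) / 2.0)
        if min(_seg_dist(mid, p, q) for p, q in zip(line_a, line_a[1:])) <= s:
            total += math.hypot(b[0] - a[0], b[1] - a[1])
    return total

def prevalence(road_a, road_b, s):
    """PM(A, B) = min(PM(A -> B), PM(B -> A))."""
    return min(length_within_buffer(road_b, road_a, s),
               length_within_buffer(road_a, road_b, s))

road_a = [(0.0, 0.0), (100.0, 0.0)]     # e.g. a highway center line
road_b = [(0.0, 10.0), (100.0, 10.0)]   # a parallel road, 10 m away
print(prevalence(road_a, road_b, 30.0))  # 100.0: co-located at S = 30
print(prevalence(road_a, road_b, 5.0))   # 0.0: not co-located at S = 5
```

Taking the minimum of the two directed measures makes PM symmetric, so a short road hugging a long one does not score higher than the shorter of the two.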
53
Approaches to Co-location Mining - 2
  • Identifying pairs: Brute Force Approach
  • Examine all pairs of roads with different road
    types
  • Compute prevalence measures
  • Select pairs with PM above threshold
  • Identifying pairs with computational efficiency
  • Reduce number of pairs examined
  • spatial join using a spatial index
  • Road pair picked only if a segment pair is close
    enough
  • Identifying spaghetti junctions
  • Use selected pairs to form triplets
  • Check prevalence measure to filter out triplets
  • Repeat for larger subsets
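The filtering step can be sketched with a simple grid index standing in for the spatial index: only road pairs whose points fall in the same or adjacent grid cells survive as candidates for the full prevalence computation. Road names and coordinates are invented:

```python
from collections import defaultdict
from itertools import combinations

def candidate_pairs(roads, cell):
    """Grid filter: keep road pairs with points in the same or adjacent
    cells; cell size should be on the order of the buffer size."""
    grid = defaultdict(set)
    for name, pts in roads.items():
        for x, y in pts:
            grid[(int(x // cell), int(y // cell))].add(name)
    pairs = set()
    for (cx, cy), names in grid.items():
        nearby = set(names)
        for dx in (-1, 0, 1):
            for dy in (-1, 0, 1):
                nearby |= grid.get((cx + dx, cy + dy), set())
        for a, b in combinations(sorted(nearby), 2):
            pairs.add((a, b))
    return pairs

roads = {"I35W":     [(0.0, 0.0), (100.0, 0.0)],
         "frontage": [(0.0, 20.0), (100.0, 20.0)],
         "far_road": [(0.0, 500.0), (100.0, 500.0)]}
print(candidate_pairs(roads, 30.0))   # only the close pair survives
```

A production version would index whole segments (not just vertices) in an R-tree, but the pruning idea is the same: never compute PM for pairs that cannot be close.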

54
How prevalent are Co-locations (by Road Type)?

(Chart: fraction of co-located road miles by road
type - low-speed ramp, high-speed ramp, interstate
hwy, primary state hwy, light duty, arterial,
collector, alley or unpaved road)
55
Result from Co-location Miner
(Map: colocations in red; buffer size 30 meters;
locations are highways; can be divided into 5
routes)
56
West Metro Route-Pair
West metro route: US169(south) → 394(east) →
100(south) → 62(east) → I35W(north). Along each
highway, two or more roads are close.
57
Routes for Side Road
  • Process
  • Consider route segments longer than 1 mile
  • Minimize breaks in driving, avoid excessive turns
    and road changes
  • Give both south- and north-bound (west- and
    east-bound) local routes for each highway route
    where close local routes exist
  • Example: West metro route
  • Begin near intersection of I94 and HWY 169
    (northwest corner)
  • Take I94, exit on Hemlock Lane N, head S to
    Magda Drive (County Hwy 130)
  • Access ramp (just across 64th Street or
    Lancaster Lane) to 169 south
  • Exit to Bass Lake Road west (County Hwy 10)
  • Turn left to Revere Lane N / Nathan Lane N; turn
    left to 56th Ave N
  • Right to Mendelssohn service road; right to
    Schmidt Lake Road (W)
  • Turn left to Nathan Lane N; turn left to
    Lancaster Lane N
  • Turn left to 36th Ave N (E); turn right to
    Kilmer Lane N
  • Turn right to 34th Ave N; left to Pilgrim Lane N
  • L to 30th Ave N; R to Independence Ave N; R to
    33rd Ave N
  • L to Hillsboro Ave N; across 36th Ave N to
    Jordan Ave N
  • To 40 1/2 Ave N (because there is not a long
    road ahead, we stop here)

58
Road Type Statistics for Test Routes
  • Road length ratio

(Chart: road length ratio by road type - high-speed
ramp, interstate hwy, primary state hwy, arterial,
collector, light duty, alley or unpaved road,
low-speed ramp)
59
Southwest Route-Pair
Southwest of Minneapolis test route 1:
394(east) → US169(south) → 7(east) → 100(south) →
494(east) → I35W(south). Along each highway, two or
more roads are close.
60
Route-Pair 55/94
Route 55/94: Olson Memorial Hwy(east) → 94(east).
Along each highway, two or more roads are close.
61
Route-Pair 694/94
62
Route-Pair 36/61
35W(north) → 36(east) → 61(north). Along each
highway, two or more roads are close.
63
Site Selection for Map Matching
  • Details of Route-pairs
  • Overview map using Java and mapquest
  • Detailed maps (12 - 15 segments) using mapquest
  • Details are useful for
  • Driving and GPS track data collection
  • Visual check of correctness of site selection
  • Note close frontage roads in most segments
  • Choice of tools
  • Note challenges in planning routes for frontage
    roads
  • Use PC Miler, Arc/View with manual annotations
  • Next slides show 3 things for each Route-Pair
  • Overview maps using Java and mapquest
  • 1 detailed map (other detailed maps are hidden)

64
West Metro Route (WMR)
West of Minneapolis test route1
US169(south)?394(east)?100(south)?62(east)?I35W(no
rth). Along each highway, two or more roads are
close.
65
Mapquest Map of WMR
66
WMR(a) 35W and Close Side Roads
67
WMR(b) 35W and Close Side Roads (Cont)
Four-road pattern: 17th St E, 94, 35W, 18th St E
Four-road pattern: 4th Ave S, 35W, 5th Ave S, 65
68
Progress Rep. Gold Standard Collection
  • GPS-equipped vehicle
  • Gold standard GPS (MS750) - centimeter accuracy
  • Other GPS for map matching
  • Each route-pair is driven multiple times
  • Highway routes (both directions)
  • Side-road routes (one direction due to tedious
    nature)
  • GPS tracks
  • Track files from each GPS
  • Files imported in GIS
  • Inspection of overlay (GPS track, roadmap) for
    sanity checks
  • Computation of accuracy metrics (GPS track,
    roadmap)

69
Gold Standard Sanity Check
  • Sanity Check before Detailed Analysis
  • Visual inspection to check alignment with routes
  • Eliminates major problems with GPS receiver
  • Roadmap accuracy - qualitative view
  • Procedure
  • Import Track files from gold standard GPS
  • Overlay on digital roadmap
  • Pi-Ming found a few initial issues and corrected
    those!

70
Outline
  • Motivation, Background
  • Related Work
  • Our Approach
  • Preliminary results
  • Map acquisition, visualization
  • Site selection
  • Gold standard data collection
  • Positional Accuracy
  • Map Matching Accuracy
  • Challenges

71
Positional Accuracy Measures
  • Point-based
  • Input: pairs of corresponding points on road map
    and gold standard
  • Output: RMS (distance between pairs)
  • Comment: poor scalability to large road networks
  • need to stop GPS vehicles at geodetic points
  • expensive and dangerous
  • Line-string based
  • Lateral error: RMS (shortest distance of GPS
    reading to center line of corresponding roads)
  • Buffer-based accuracy(Gold-std, Buffer-size) =
    (Length of Gold-std where correct road is within
    Buffer) / (Total length of Gold-std)
  • Choice: Buffer-based accuracy
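The chosen buffer-based measure can be sketched as the fraction of gold-standard track length whose mapped center line lies within the buffer. Coordinates and the buffer size are invented; a production version would use GIS buffer/intersect operations:

```python
import math

def _seg_dist(p, a, b):
    """Distance from point p to segment a-b (planar coordinates)."""
    dx, dy = b[0] - a[0], b[1] - a[1]
    l2 = dx * dx + dy * dy
    if l2 == 0.0:
        return math.hypot(p[0] - a[0], p[1] - a[1])
    t = max(0.0, min(1.0, ((p[0] - a[0]) * dx + (p[1] - a[1]) * dy) / l2))
    return math.hypot(p[0] - (a[0] + t * dx), p[1] - (a[1] + t * dy))

def buffer_accuracy(gold_track, map_line, buffer_size):
    """(Length of gold track whose segment midpoint is within the buffer
    of the mapped road) / (total length of gold track)."""
    inside = total = 0.0
    for a, b in zip(gold_track, gold_track[1:]):
        seg_len = math.hypot(b[0] - a[0], b[1] - a[1])
        total += seg_len
        mid = ((a[0] + b[0]) / 2.0, (a[1] + b[1]) / 2.0)
        if min(_seg_dist(mid, p, q)
               for p, q in zip(map_line, map_line[1:])) <= buffer_size:
            inside += seg_len
    return inside / total

gold_track = [(0.0, 0.0), (50.0, 0.0), (100.0, 80.0)]  # gold standard GPS
map_line   = [(0.0, 5.0), (100.0, 5.0)]                # mapped center line
# First leg stays inside the buffer of 30; the second drifts out.
print(round(buffer_accuracy(gold_track, map_line, 30.0), 3))   # 0.346
```

Unlike point-based RMS, this measure needs no stops at geodetic points: any driven track contributes its full length.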

72
Buffer Computation for Positional Accuracy
  • ArcInfo buffer computation
  • Input
  • Gold standard / MS750 road data
  • Road data from digital map
  • Buffer size parameter (30, 50, 100, 150, 200,
    300 feet)
  • Output: result buffer from which we can get
    intersection information

(Figure: buffer around gold standard GPS data; road
inside buffer shown green, outside red)
73
Methodology
  • Site selection: Route-pairs 1 - 5 for now
  • Future: broader sample of highways and
    non-highways as needed!

Workflow (diagram): digital road map data; site
selection; subsets of road maps; gather GPS tracks
by driving vehicle; GPS logs; visualization tools;
overlay of road map and gold standard; assess
positional accuracy; statistical analysis
74
Positional Accuracy with Buffer 30 feet
75
Positional Accuracy with Buffer 50 feet
76
Positional Accuracy with Buffer 100 feet
77
Positional Accuracy with Buffer 150 feet
78
Positional Accuracy with Buffer 200 feet
79
Positional Accuracy with Buffer 300 feet
80
Outline
  • Motivation, Background
  • Related Work
  • Our Approach
  • Preliminary results
  • Map acquisition, visualization
  • Site selection
  • Gold standard data collection
  • Positional Accuracy
  • Map Matching Accuracy
  • Challenges

81
Progress Rep. Map Matching Accuracy
  • Goals
  • Evaluate map matching algorithms
  • Different GPS receivers
  • Different roadmaps
  • Map matching algorithms
  • Traditional: current GPS point → nearest road in
    map
  • Context-aware: account for recent history
  • GPS Receivers
  • Gold standard - centimeter accuracy
  • Others - meter accuracy (ref. GPS accuracy
    assessment results)
  • Roadmaps
  • Navigable roadmaps - 10 m accuracy
  • TIGER file - lower positional accuracy
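The traditional matcher described above (snap the current GPS point to the nearest road) can be sketched as below; the road geometry is invented, and the second reading shows how a close road pair flips the match:

```python
import math

def _seg_dist(p, a, b):
    """Distance from point p to segment a-b (planar coordinates)."""
    dx, dy = b[0] - a[0], b[1] - a[1]
    l2 = dx * dx + dy * dy
    if l2 == 0.0:
        return math.hypot(p[0] - a[0], p[1] - a[1])
    t = max(0.0, min(1.0, ((p[0] - a[0]) * dx + (p[1] - a[1]) * dy) / l2))
    return math.hypot(p[0] - (a[0] + t * dx), p[1] - (a[1] + t * dy))

def match_road(gps_point, roads):
    """Traditional map matching: nearest center line to the GPS point."""
    def dist_to(name):
        line = roads[name]
        return min(_seg_dist(gps_point, a, b) for a, b in zip(line, line[1:]))
    return min(roads, key=dist_to)

roads = {"I35W":     [(0.0, 0.0), (100.0, 0.0)],
         "frontage": [(0.0, 25.0), (100.0, 25.0)]}
print(match_road((50.0, 10.0), roads))   # I35W
print(match_road((50.0, 14.0), roads))   # frontage: a 4 m shift flips it
```

With a 25 m separation, a combined GPS-plus-map error of a few meters is enough to assign the reading to the wrong road, which is exactly the close-road-pair risk the site selection targets.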

82
Workload to Evaluate Road Classification Accuracy
  • Given a digital map and all the GPS track data,
    convert them to a format understood by our
    visualization program.
  • To calculate the classification accuracy, we
    need to manually determine, for each GPS track,
    which part of the track corresponds to which
    real road recorded by the tester while driving.
    This operation includes the following steps:
  • Use our program to visualize the GPS track and
    all the recorded roads, approximately determine
    the mapping of GPS track sections to roads
  • Accurately determine the start and end point of a
    GPS track section that matches a real road, do
    this for all the recorded roads
  • Update the GPS data file to an intermediate
    format to reflect the matches in step (2)
  • Use our software to calculate the classification
    accuracy of a GPS track section against its
    matching road, do this for all the recorded
    roads
  • Gather the results generated from step (4),
    compute summaries in different ways (e.g.
    accuracy by road class), and generate reports.
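Step (5) above can be sketched as a per-class roll-up of the per-section results; the class names and lengths are invented for illustration:

```python
from collections import defaultdict

def accuracy_by_class(sections):
    """sections: (road_class, correctly_matched_length, total_length)
    tuples, one per GPS track section; returns accuracy per class."""
    matched = defaultdict(float)
    total = defaultdict(float)
    for road_class, m, t in sections:
        matched[road_class] += m
        total[road_class] += t
    return {c: matched[c] / total[c] for c in total}

# Hypothetical per-section results (lengths in miles).
sections = [("interstate", 9.0, 10.0),
            ("interstate", 8.0, 10.0),
            ("frontage", 3.0, 6.0)]
result = accuracy_by_class(sections)
print(result)   # {'interstate': 0.85, 'frontage': 0.5}
```

Aggregating by length rather than by section count keeps short sections from dominating the summary.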

83
Progress Rep. Map Matching Accuracy
  • Preliminary results
  • Traditional map matching
  • Gold standard GPS, navigable roadmaps
  • Shown on next 5 slides for 5 route-pairs
  • Blue = mismatch, Green = match
  • Bad news: lots of blue (about 1/3) except on the
    694-94 route
  • Interpretation
  • Map accuracy needs improvement to distinguish
    road types!
  • Phase II possible tasks
  • Use gold standard GPS
  • Improve positional accuracy of highways
  • Improve positional accuracy of roads in
    colocations

84
GPS MS750 track on West Metro Route
Red: No GPS fix or float segments. Green: GPS fix
or float, and correctly classified road. Blue: GPS
fix or float, but NOT correctly classified road.
85
GPS MS750 of Southwest Metro Route
86
GPS MS750 of Route 5594
87
GPS MS750 of Route 69494
88
GPS MS750 of Route 3661
89
Map Matching Accuracy
  • Statistical Summary
  • Preliminary results for a route
  • Naïve map matching is correct 2 out of 3 times
  • More details are being worked out
  • Computational bottleneck - Buffer based error
    definition

(Table legend: road type 1 = interstate, 2 = state
highway, 5 = light duty)
90
Map Matching Accuracy Vs. GPS Types
  • Preliminary results
  • Traditional map matching
  • Navigable roadmaps
  • Shown on next few slides
  • Blue = mismatch, Green = match, Red = GPS signal
    gap
  • Observations for lower accuracy GPS
  • Much less Red on lower accuracy GPS
  • Red converts to blue
  • Green and blue segments retain color
  • Hypothesis
  • Map mismatch possibly due to roadmap positional
    errors!

91
GPS MS750 of West Metro Route
Red: No GPS fix or float segments. Blue: GPS fix or
float, but not correctly classified road. Green:
GPS fix or float, and correctly classified road.
92
GPS AG132 of West Metro Route
93
GPS JRC of West Metro Route
94
GPS MS750 of Southwest Metro Route
95
GPS AG132 of Southwest Metro Route
96
GPS JRC of Southwest Metro Route
97
Outline
  • Motivation
  • Background
  • Related Work
  • Our Approach
  • Preliminary results
  • Conclusions and Challenges

98
Conclusions: Observations
  • Good news
  • Identification of state (or county) is feasible
  • Not-so-good news
  • Distinguishing highways from frontage roads
  • Map matching accuracy may be inadequate even
    with the best GPS
  • Separating highways from side roads is less
    reliable
  • Implications for Alternative Approach to Road
    User Charges
  • Inaccuracy - Charges may not be correct
  • Unfair - Charging errors may be spatially biased!

99
Conclusions: Recommendations
  • Positional and attribute accuracy requirements
    for digital road maps
  • Separate colocated road-type pairs (e.g. hwy,
    frontage roads)
  • Distinguish jurisdictions
  • A cost-effective approach
  • Identify trouble spots (e.g. colocations,
    jurisdiction boundaries)
  • Improve roadmap accuracy in trouble spots using
    GPS