1
How Well Do You Know Your Model?
  • A Methodology for Map Comparison-Based Model
    Validation

2
Introduction
  • Program and background

3
Why map comparison?
  • Good modelling practice
  • Verification
  • Global behaviour analysis
  • Sensitivity analysis
  • Calibration
  • Validation
  • For spatial models, the analytical techniques are
    not straightforward
  • All of these steps require some form of map
    comparison

4
Workshop program
  • Introduction
  • Program
  • Background
  • Part 1: Methods for map comparison
  • Measuring overlap
  • Measuring near-overlap
  • Measuring structural similarity
  • Part 2: Application of map comparison
  • Multi-scale, multi-criteria
  • Meaningful benchmarks
  • Significance
  • Case demonstration
  • Exercises: Map Comparison Kit software

5
Background: Modelling land use dynamics
Transition rule: at each step of the time loop, change
cells to the land use for which they have the highest
transition potential, until the demands are met
(sketched in code below).
MOLAND Urban Regional Growth Model, courtesy
IES, JRC, DG Research, EU
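A minimal sketch of such a demand-constrained allocation step (Python; my own illustration, not the MOLAND implementation, and the array layout and greedy ordering are assumptions):

```python
import numpy as np

def allocate_land_use(potential, demand):
    """Greedy allocation: assign each cell to the land use for which it has
    the highest remaining transition potential, until the demand (number of
    cells) for every land use is met. Assumes demand.sum() <= n_cells."""
    n_cells, n_uses = potential.shape
    result = np.full(n_cells, -1, dtype=int)     # -1 marks unallocated cells
    remaining = np.asarray(demand).copy()

    # Visit (cell, land use) pairs from highest to lowest potential.
    for flat in np.argsort(potential, axis=None)[::-1]:
        cell, use = divmod(flat, n_uses)
        if result[cell] == -1 and remaining[use] > 0:
            result[cell] = use
            remaining[use] -= 1
        if remaining.sum() == 0:
            break
    return result
```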
6
Models produce possible futures
MOLAND Urban Regional Growth Model, courtesy
IES, JRC, DG Research, EU
7
Used in tools for exploring Planning and Policy
options
8
The question of validity
  • State of the art, 2002:
  • Standard goodness-of-fit measures do not capture
    relevant aspects of our simulation results.
  • Therefore we rely on visual analysis and expert
    judgement
  • My PhD research:
  • Can we define metrics and procedures to
    objectively quantify the performance of spatial
    models?
  • We want to understand the nature, extent and
    spatial distribution of map similarities.
  • How can map similarities be interpreted in terms
    of model performance?

9
Is it possible?
  • Some rumours about validation:
  • One cannot prove anything, only falsify
  • No model can encompass everything, there is
    always a larger context
  • Our models are complex and chaotic, testing them
    against one trivial reality is meaningless
  • It's not about agents, but about their collective
    behaviour and emerging patterns
  • A model is valid if it serves a purpose
  • Our models are not intended as perfect and
    absolute truth
  • Model evaluation should match the purpose

10
Cell-by-cell overlap
  • Basic comparison operations

11
Measuring overlap
  • All cells on the map are either equal or not

Percentage correct: 79%
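As a minimal sketch (assuming the two maps are equally sized categorical rasters held as NumPy arrays), the cell-by-cell fraction correct is simply the share of matching cells:

```python
import numpy as np

def fraction_correct(map_a, map_b):
    """Share of cells whose categories are identical in both maps."""
    assert map_a.shape == map_b.shape
    return np.mean(map_a == map_b)

# Hypothetical example with two small categorical maps
a = np.array([[1, 1, 2, 2, 3],
              [1, 2, 2, 3, 3]])
b = a.copy()
b[0, 2] = 3                      # introduce one disagreement
print(f"Percentage correct: {100 * fraction_correct(a, b):.0f}%")  # 90%
```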
12
Separating location and quantity
Differences in location and in quantity: 5 differences
Only differences in location: 8 differences
13
Kappa statistic
  • Resolves the bias that maps with uneven category
    distributions appear more alike by chance
  • Controversial amongst remote sensers. Variations
    define the components of Kappa related to location
    and to quantity (histogram); see the formulas below.
  • PA: percentage of agreement
  • E(PA): expected PA, subject to the histograms
  • Max: maximal PA, subject to the histograms
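From these definitions, the usual Kappa and its location and histogram components can be written as below (reconstructed here as a reading aid; the slide itself does not show the formulas):

```latex
\kappa = \frac{PA - E(PA)}{1 - E(PA)}, \qquad
\kappa_{loc} = \frac{PA - E(PA)}{Max - E(PA)}, \qquad
\kappa_{histo} = \frac{Max - E(PA)}{1 - E(PA)}, \qquad
\kappa = \kappa_{loc} \cdot \kappa_{histo}
```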

14
Contingency table
  • Also known as confusion table/matrix or transition
    table/matrix (see the sketch below)
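A minimal sketch of building such a table for two categorical rasters and deriving PA, E(PA) and Kappa from it (NumPy only; contingency_table and kappa are illustrative names, and categories are assumed to be coded 0..n_cat-1):

```python
import numpy as np

def contingency_table(map_a, map_b, n_cat):
    """Cell counts per (category in map A, category in map B) pair."""
    idx = map_a.ravel() * n_cat + map_b.ravel()
    return np.bincount(idx, minlength=n_cat * n_cat).reshape(n_cat, n_cat)

def kappa(table):
    """Cohen's Kappa computed from a contingency table."""
    total = table.sum()
    pa = np.trace(table) / total                                       # observed agreement
    e_pa = (table.sum(axis=0) * table.sum(axis=1)).sum() / total ** 2  # chance agreement
    return (pa - e_pa) / (1 - e_pa)
```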

15
Per category
  • Temporarily reclassify the map per category for
    Kappa statistics per category (Monserud & Leemans)
  • Most useful when prioritizing calibration efforts

(Map legend: Open, Park, River, City)
16
The limits of cell-by-cell methods
17
Measuring near-overlap
  • Fuzziness of location and category

18
Fuzzy Set Map Comparison
  • Tolerance for confounding similar categories:
    fuzziness of category
  • Tolerance for small spatial differences:
    fuzziness of location
  • Overall map similarity: Fuzzy Kappa (sketched in
    code below)
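A rough sketch of the idea for a single cell (not the Map Comparison Kit's exact algorithm; the category similarity matrix, the exponential decay with a halving distance, and the two-way minimum are assumptions based on the description in these slides):

```python
import numpy as np

def fuzzy_cell_similarity(map_a, map_b, row, col, cat_sim, halving_dist, radius):
    """Fuzzy similarity of one cell: compare the crisp category in one map
    with the fuzzified neighbourhood of the other, both ways, and take the
    minimum of the two one-way similarities.

    cat_sim      : (n_cat, n_cat) category similarity matrix (fuzziness of category)
    halving_dist : distance at which neighbour influence is halved (fuzziness of location)
    """
    def one_way(src, dst):
        best = 0.0
        rows, cols = src.shape
        for dr in range(-radius, radius + 1):
            for dc in range(-radius, radius + 1):
                r, c = row + dr, col + dc
                if 0 <= r < rows and 0 <= c < cols:
                    decay = 2.0 ** (-np.hypot(dr, dc) / halving_dist)
                    best = max(best, decay * cat_sim[src[r, c], dst[row, col]])
        return best

    return min(one_way(map_a, map_b), one_way(map_b, map_a))
```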

19
Fuzziness of categories
(Figure: Original map vs. Categorical fuzzy map)
20
Fuzziness of location
Neighbour influence is set by a distance decay function
(Figure panels: Original map, Categorical fuzzy map,
Full fuzzy map)
21
Two-way comparison, step 1
Fuzzified map A is compared to the crisp (original)
map B, and vice versa: Similarity(A, B) =
Intersection(A, B).
(Figure panels: Original map A, Original map B, Full
fuzzy map A, Partial comparison)
22
Two-way comparison, step 2
(Figure panels: Originals, Partial comparison,
Comparison)
23
Complete process
(Diagram: maps A and B, their categorical fuzzy and
full fuzzy versions, the partial comparisons, and the
final comparison)
24
Fuzzy Set Map Comparison applied
Fuzzy Kappa: 0.49; Fraction correct: 0.91
25
Fuzzy Kappa per category
(Per-category similarity maps: Open, Park, City, River)
26
Omission and commission
  • Alternative use of categorical similarity
  • Only consider one type of error
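The slide refers to tuning the categorical similarity matrix so that only one error type counts; as a simpler crisp illustration of the omission/commission distinction, the per-category errors can also be read off the contingency table sketched earlier (reusing the hypothetical contingency_table orientation: rows = reference map, columns = simulated map):

```python
import numpy as np

def omission_commission(table):
    """Per-category error counts from a contingency table
    (rows = reference categories, columns = simulated categories)."""
    diag = np.diag(table)
    omission = table.sum(axis=1) - diag    # reference cells the model put elsewhere
    commission = table.sum(axis=0) - diag  # simulated cells that belong elsewhere
    return omission, commission
```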

27
Measuring structural similarity
  • Moving window approach to landscape structure

28
Comparison of map structure
  • Metrics of landscape configuration
  • Patch size
  • Edge
  • Metrics of landscape composition
  • Diversity
  • Number of classes

29
Balancing structure and overlap
30
Distance weighted moving window
Moving porthole with a fish-eye perspective
31
Smoothly from overlap to structure
(Figure: similarity plotted against the halving
distance, moving from local overlap via neighbourhood
to landscape-level structure)
32
Mean patch size
(Figure panels: Map, Porthole, Patch size, Difference)
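One possible way to compute a per-category mean patch size (my sketch using SciPy's connected-component labelling; the rook/queen choice is one of the subjective decisions listed two slides further on):

```python
import numpy as np
from scipy import ndimage

def mean_patch_size(categorical_map, category, queen=False):
    """Mean size (in cells) of the contiguous patches of one category."""
    mask = categorical_map == category
    # Rook contiguity = 4-connected, queen contiguity = 8-connected.
    structure = np.ones((3, 3)) if queen else ndimage.generate_binary_structure(2, 1)
    labels, n_patches = ndimage.label(mask, structure=structure)
    return mask.sum() / n_patches if n_patches else 0.0
```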
33
Shannon Diversity
(Figure panels: Map, Porthole, Diversity, Difference)
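The Shannon diversity index itself is standard; below is a small sketch for the categories inside one window, with the optional distance-decay weights ("porthole") being my assumption about how the weighting enters:

```python
import numpy as np

def shannon_diversity(window, weights=None):
    """Shannon diversity H = -sum(p_i * ln p_i) over the categories present
    in a window; optional weights give a distance-weighted (porthole) version."""
    window = np.asarray(window).ravel()
    weights = np.ones_like(window, float) if weights is None else np.asarray(weights, float).ravel()
    p = np.array([weights[window == c].sum() for c in np.unique(window)])
    p = p / p.sum()
    return float(-np.sum(p * np.log(p)))
```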
34
Subjective but non-trivial decisions
  • What is a patch
  • Rook / Queen contiguity
  • How to weight
  • Per cell / Per patch
  • How to deal with nodata
  • Sim(x, nodata) = Sim(nodata, y) = nodata
  • Neighbourhood settings
  • Distance decay
  • Radius
  • Aggregation

35
Methods of scaling
  • Moving window, optional distance decay
  • Aggregation
  • Hybrid moving window / aggregation

36
Aggregation versus moving window
37
Download materials
  • Map Comparison Kit
  • General: www.riks.nl/mck
  • Today's slides and exercises: www.riks.nl/news
  • Also consider the following:
  • Fractalyse - www.fractalyse.org
  • Cluster size distribution
  • Fractal dimension
  • Fragstats - http://www.umass.edu/landeco/research/fragstats/fragstats.html
  • Many metrics of landscape structure

38
Summary (Wednesday)
  • Evaluate models on those aspects that are
    important to the application of the model
  • Use map comparison techniques to extract the
    information at the appropriate level of
    abstraction
  • Consider quantity versus location
  • Consider near overlap
  • Consider similarities in spatial structure
  • Exercises
  • Get acquainted with software input, output and
    visualization
  • Planned for Friday
  • Apply methods in a systematic framework