Transcript and Presenter's Notes

Title: Relative merits of Eulerian vs. Individual-Based models of fish dynamics in patchy habitats (Albert J. Hermann)


1
Relative merits of Eulerian vs. Individual-Based models of fish dynamics in patchy habitats
Albert J. Hermann (1)
(1) Joint Institute for the Study of the Atmosphere and Ocean, University of Washington, P.O. Box 357941, Seattle, WA 98195, U.S.A.
Workshop on advancements in modelling physical-biological interactions in fish early-life history: recommended practices and future directions, 3-5 April 2006, Nantes, France
2
Many prey fields are patchy in space and time
  • Eddy-rich circulation fields
  • Nutrient sources locked to particular bathymetric features:
    • submarine canyons
    • submarine banks (tidal mixing)
    • islands
  • Intermittent coastal upwelling

3
Patchy food example: copepod density from an NPZ model (Hinckley et al.); maps shown for June 2001 and July 2001.
4
Why is this a problem for an IBM?
  • Too few individuals in an IBM may severely undersample a patchy prey field if each individual responds only to its local prey value.
  • Undersampling depends on the space/time statistics of the prey field AND on the paths of the individuals.
  • Avoiding undersampling may require a huge number of individuals and/or realizations to obtain a statistically meaningful result.
  • The Eulerian approach may even be superior in such undersampled cases (but sacrifices the nonlinearity captured by the IBM).

5
The undersampling problem
  • Undersampling: the sample spacing is too coarse to resolve the important spatial scales.
  • The formal definition from signal processing involves the Nyquist frequency (or wavelength):
  • a sine wave cannot be resolved with fewer than 2 samples per wavelength, but any longer wavelength can be resolved with that sampling density.

6
Simple sine wave example (figure)
  • Well sampled: the sine wave is captured.
  • Badly sampled: the sine wave is aliased into a longer wavelength.
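A minimal numerical sketch of this aliasing point (my own illustration; the wavelengths and sample spacing are arbitrary choices, not values from the slides): a sine sampled with fewer than 2 samples per wavelength cannot be distinguished, at the sample points, from a longer-wavelength sine.

```python
# Aliasing sketch: a sine of wavelength 3 sampled every 2 units
# (1.5 samples per wavelength, below the Nyquist limit of 2) matches a
# sine of wavelength 6 exactly at the sample points.
import numpy as np

true_wavelength = 3.0            # wavelength of the underlying signal
dx = 2.0                         # sample spacing (too coarse)
x = np.arange(0.0, 30.0, dx)     # sample locations

k_true = 1.0 / true_wavelength
# fold the true spatial frequency back into the resolvable band +/- 1/(2*dx)
k_alias = k_true - np.round(k_true * dx) / dx

signal_at_samples = np.sin(2 * np.pi * k_true * x)
aliased_at_samples = np.sin(2 * np.pi * k_alias * x)

print("apparent (aliased) wavelength:", abs(1.0 / k_alias))         # 6.0
print("max difference at the samples:",
      np.max(np.abs(signal_at_samples - aliased_at_samples)))       # ~0
```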
7
Lagrangian/Eulerian equivalence (Taylor, 1921)
An ensemble of particles, each subjected to a random walk (the Lagrangian approach), can generate statistics equivalent to turbulent eddy diffusion (the Eulerian approach):
K = <u'u'> TL
where u = velocity, R(t) = autocovariance of u, TL = Lagrangian decorrelation time, and K = equivalent Eulerian diffusivity.
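A minimal sketch of this equivalence (my own illustration, using the "typical oceanic values" from a later slide, TL = 3 days and u' = 0.05 m/s): particles carrying an exponentially decorrelating velocity spread, once t >> TL, like a tracer with diffusivity K = <u'u'> TL.

```python
# Taylor (1921) equivalence check: an ensemble of AR-1 (Ornstein-Uhlenbeck)
# velocity random walks spreads like diffusion with K = <u'u'> * TL.
import numpy as np

rng = np.random.default_rng(0)

TL = 3 * 86400.0                 # Lagrangian decorrelation time [s]
u_rms = 0.05                     # rms turbulent velocity [m/s]
K_theory = u_rms**2 * TL         # equivalent diffusivity [m^2/s], ~650

n_particles = 20000
dt = 3600.0                      # time step [s]
n_steps = 60 * 24                # integrate for 60 days (t >> TL)

a = np.exp(-dt / TL)                        # velocity memory per step
noise_amp = u_rms * np.sqrt(1.0 - a**2)     # keeps velocity variance = u_rms^2

x = np.zeros(n_particles)
u = u_rms * rng.standard_normal(n_particles)
for _ in range(n_steps):
    u = a * u + noise_amp * rng.standard_normal(n_particles)
    x += u * dt

t = n_steps * dt
K_empirical = np.var(x) / (2.0 * t)         # Var(x) ~ 2*K*t for t >> TL
print(f"K from <u'u'>*TL : {K_theory:.0f} m^2/s")
print(f"K from particles : {K_empirical:.0f} m^2/s")   # approaches K_theory
```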
8
Lagrangian IBM pros/cons
  • Lagrangian IBMs have a finite number of particles.
  • Spatially explicit individual-based models offer great flexibility in the specification of behaviors, especially those based on past history (e.g., gut fullness).
  • Stochastic events representing subgridscale encounters can be added.
  • An IBM retains the true nonlinearity of the individual, BUT
  • it may undersample the prey field and miss other rare events.

9
Eulerian pros/cons
  • The Eulerian framework is equivalent to an infinite number of individuals contained in spatial bins. In each of those bins the model will:
  • thoroughly blend the contained individuals
  • operate on the mean properties of the contained individuals (position, length, weight, etc.)
  • This gives a less accurate life history from the perspective of any one individual, BUT
  • a single Eulerian run is equivalent to the ensemble average of many Lagrangian realizations, and hence is more likely than an IBM to capture spatially rare encounters!

10
So, which is better?
  • The Lagrangian model is more realistic for a single individual, BUT it has a finite number of particles and so can miss rare encounters.
  • The Eulerian model represents the mean of a large ensemble of Lagrangian realizations, but is less realistic for any of the individuals.
  • Given finite computational resources, we want to estimate the relative costs based on the statistics of the circulation and prey fields.

11
Lagrangian dispersion kernel (LDK)
  • The Lagrangian dispersion kernel (LDK) of particles: the probability of finding an individual from one place and time at some later place and time.
  • LDK = P(xf, tf; xi, ti)
  • The LDK can be used to determine the ensemble-average density of particles (individuals) in space and time for a given release distribution (Siegel et al., MEPS, 2004).

12
How to get the LDK in practice?
  • Direct numerical simulation from a circulation model or IBM.
  • Measure/fit <u'u'> and TL, then run direct numerical trials with a random walk (a sketch follows this list).
  • NOTE: must correct for spatially dependent values (of <u'u'> and TL), else spurious convergence of particles results.
  • For simple cases the LDK can be computed analytically.
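A minimal sketch of the "direct numerical trials" option (my own illustration; K is constant here, so the correction for spatially dependent values noted above is not needed, and the 1 km binning grid is an arbitrary choice): release particles at a point, random-walk them with K = 600 m²/s, and bin their later positions to estimate the LDK.

```python
# Estimating the LDK by direct random-walk trials from a point release
# (constant K, so no correction for spatially varying diffusivity needed).
import numpy as np

rng = np.random.default_rng(1)

K = 600.0                        # equivalent diffusivity [m^2/s]
dt = 3600.0                      # time step [s]
n_steps = 5 * 24                 # follow the particles for 5 days
n_particles = 100000             # many trials -> a smooth kernel estimate

# 2-D random walk from a point release at the origin
step_std = np.sqrt(2.0 * K * dt)
xy = np.zeros((n_particles, 2))
for _ in range(n_steps):
    xy += step_std * rng.standard_normal((n_particles, 2))

# Bin positions on a 1 km grid covering a ~50 km box: the normalized
# histogram is the empirical P(xf, tf; xi = 0, ti = 0) per grid cell
edges = np.arange(-25.5e3, 26.5e3, 1e3)
hist, _, _ = np.histogram2d(xy[:, 0], xy[:, 1], bins=[edges, edges])
ldk_estimate = hist / n_particles
print("probability of the central 1 km cell after 5 days:",
      ldk_estimate[25, 25])
```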

13
Lagrangian/Eulerian equivalence
For simple random walks the LDK evolves like Eulerian diffusion: the probability of finding an individual at any particular place and time is just the equivalent tracer concentration at that location. The solution for an initial point source of N0 particles/tracer (in two horizontal dimensions) is
N(r, t) = N0 exp(-r² / (4 K t)) / (4 π K t)
where N = average density, t = time, K = equivalent diffusivity, and r = distance from the release.
14
LDK with typical oceanic values
  • TL = 3 days (from floats)
  • u' = 0.05 m/s
  • K = <u'u'> TL ≈ 600 m²/s
  • N0 = 50 individuals
  • Look at a box 50 km wide
  • Consider various prey patch sizes Lp
  • Plot the LDK expressed as the average number of particles per prey patch (a numerical sketch follows this list):
  • LDK = N(x,t) Lp²
  • LDK < 1 means that, on average, we won't find any particles in a prey patch at that location -> undersampled!
  • Overlay with the corresponding random particle density
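A re-computation of these numbers (my own sketch, evaluated only at the release point using the 2-D point-source solution above; the slides that follow show the full spatial maps, so the exact thresholds differ somewhat):

```python
# LDK per prey patch at the release point, for the patch sizes and times
# used on the following slides (2-D point-source solution assumed).
import numpy as np

N0 = 50                       # number of released individuals
K = 0.05**2 * 3 * 86400.0     # <u'u'> * TL = 648 m^2/s (~600 on the slide)
days = np.array([1.0, 5.0, 15.0])
patch_sizes = [12.5e3, 5.0e3, 2.5e3]    # large, medium, small Lp [m]

t = days * 86400.0
for Lp in patch_sizes:
    # density at the release point (r = 0) times the patch area Lp^2
    ldk = N0 / (4.0 * np.pi * K * t) * Lp**2
    status = ["undersampled" if v < 1 else "resolved" for v in ldk]
    summary = ", ".join(f"day {int(d)}: {v:.2f} ({s})"
                        for d, v, s in zip(days, ldk, status))
    print(f"Lp = {Lp/1e3:4.1f} km -> {summary}")
```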

15
Large (12.5 km) prey patch -> easily find the prey (maps for Day 1, Day 5, Day 15). But eventually the prey becomes undersampled anyway.
16
Medium (5 km) prey patch -> easily miss the prey (maps for Day 1, Day 5, Day 15). The prey at the patch never gets beyond undersampled.
17
Small (2.5 km) prey patch -> almost never find the prey! (maps for Day 1, Day 5, Day 15). Undersampled everywhere after one day!
18
Relative Costs
  • If we cannot afford to seed with enough particles to ensure that LDK > 1 in the relevant areas, we may need to reconsider the IBM approach.
  • Comparing costs in the simple case (a worked example follows this list):
  • the Lagrangian IBM needs a number of individuals that scales as Kt / Lp² (for large t)
  • the Eulerian model needs a number of gridpoints that scales as 1 / Lp² (for all t)
  • hence the Lagrangian/Eulerian cost ratio scales as Kt.
  • However, there may be other mitigating factors (the greater realism of the IBM) which overrule this.
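A small worked example of these scalings (my own illustration; only the proportionalities Kt/Lp², 1/Lp², and Kt come from the slide, while the constants, such as the 4π from the point-source solution, are assumptions of this sketch):

```python
# Cost scalings: particles needed to keep LDK >= 1 at the release point
# versus gridpoints needed to resolve patches of scale Lp in a 50 km box.
import numpy as np

K = 0.05**2 * 3 * 86400.0     # ~648 m^2/s, from the typical-values slide
L_domain = 50e3               # 50 km box
t = 15 * 86400.0              # 15 days

for Lp in (12.5e3, 5.0e3, 2.5e3):
    n_particles = 4.0 * np.pi * K * t / Lp**2      # ~ Kt / Lp^2
    n_gridpoints = (L_domain / Lp) ** 2            # ~ 1 / Lp^2
    ratio = n_particles / n_gridpoints             # ~ Kt (independent of Lp)
    print(f"Lp = {Lp/1e3:4.1f} km: particles ~ {n_particles:6.0f}, "
          f"gridpoints ~ {n_gridpoints:4.0f}, cost ratio ~ {ratio:.1f}")
```

The printed ratio is the same for every patch size and grows linearly with time, which is the Kt scaling stated above.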

19
Possible Solutions
  • Modify your definition of "local" food:
  • calculate/respond to the regional food environment of each individual, using a spatial weighting function (e.g., based on the correlation length scale of the patchy prey field); a sketch follows this list.
  • This is equivalent to using a low-pass filtered version of the prey field.
  • It makes the model less nonlinear, and hence less realistic.
  • Frequently reseed the IBM (tricky).
  • Start with a larger number of individuals.
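A minimal sketch of the spatial-weighting idea (my own illustration; the 2 km weighting scale and the synthetic prey field are assumptions, not values from the slides): the "regional" food seen by an individual is a Gaussian-weighted average of the gridded prey field around its position, i.e., a low-pass filtered version of that field.

```python
# "Regional food" via a Gaussian spatial weighting function, on a 1-D
# transect of a 50 km box (the prey field and 2 km scale are illustrative).
import numpy as np

rng = np.random.default_rng(2)

dx = 500.0                                # prey grid spacing [m]
x_grid = np.arange(0.0, 50e3, dx)         # transect of the 50 km box
prey = np.maximum(rng.normal(1.0, 1.0, x_grid.size), 0.0)   # patchy prey

def regional_food(x_individual, scale=2e3):
    """Gaussian-weighted (low-pass filtered) prey seen by one individual."""
    w = np.exp(-0.5 * ((x_grid - x_individual) / scale) ** 2)
    return np.sum(w * prey) / np.sum(w)

x_fish = 25e3                             # an individual in mid-box
print("local prey value   :", prey[int(round(x_fish / dx))])
print("regional prey value:", regional_food(x_fish))
```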

20
Conclusions
  • Many food environments (prey fields) are patchy in space and time; the same is true of encounters with predators and other events.
  • Too few individuals in an IBM may severely undersample the prey field if each individual responds only to its local prey value.
  • The Eulerian approach may be superior in such undersampled cases (but sacrifices the nonlinearity captured by the IBM).

21
Possible solutions for an IBM with an undersampled prey field
  • Calculate/respond to the regional food environment of each individual.
  • Frequently reseed the IBM.
  • Start with a larger number of individuals.
  • Wait for Moore's Law to catch up!