1
CS 591 Complex Adaptive Systems, Spring 2008
Measures of Complexity
Professor Melanie Moses, 1/28/07
2
Topics
  • Why do we need formal definitions of complexity?
  • Complex Adaptive Systems
  • Definitions of Complexity
  • Holland
  • Flake
  • Computational Definitions of complexity
  • Running time complexity
  • Kolmogorov (or Algorithmic) complexity
  • Contrast to Shannon information and randomness
  • Logical (thermodynamic) Depth
  • Others
  • Murray Gell-Mann
  • Stuart Kauffman
  • Wolfram and Langton's complexity of CAs

3
Why define complexity?
  • To estimate how long a particular system will
    take to solve a problem
  • To estimate difficulty in engineering complex
    systems
  • To understand the limits of prediction,
    approximation, and simulation
  • To answer fundamental scientific questions
  • How can complexity arise when entropy should
    always increase?
  • Does (why does) complexity increase through
    evolution?
  • "As evolution proceeded on the surface of the
    earth, there has been a progressive increase in
    size and complexity." J. T. Bonner, The Evolution
    of Complexity
  • "Evolution tends to push systems towards the edge
    of chaos, where complex, interesting behaviors
    such as life can occur." Chris Langton
  • "The universe, the biosphere, the econosphere
    have all become more complex." Stuart Kauffman
  • Are these statements correct? Can we quantify
    increases in complexity over time?

4
What are Complex Adaptive Systems?
  • Collections of agents
  • Molecules, cells, animals, planets, stars,
    economic agents.
  • Agents interact (locally) with one another and
    with their environment
  • No central controller.
  • Interaction rules may be trivial or nontrivial,
    i.e. nonlinear.
  • Chemical reactions, cellular interactions,
    mating, buy/sell decisions.
  • Unanticipated properties often result from the
    interactions
  • Immune system responses, flocks of animals,
    settlement patterns, earthquakes, speculative
    bubbles and crashes.
  • Agents adapt their behavior to other agents and
    environmental constraints
  • Imitation, adaptation, learning.
  • System behavior evolves over time
  • Rules change, unfolding of complex behavior.

5
Example Complex Adaptive Systems
  • Biological systems
  • Cells
  • Organs
  • Immune systems
  • Organisms
  • Ecosystems
  • Food webs
  • Symbioses
  • Colonies: ants, bees, termites, wasps, bacteria,
    corals
  • Social/Economic Systems
  • Nations
  • Firms
  • Households
  • Cities
  • Universities
  • Technological Systems
  • The Internet
  • Peer-to-peer networks
  • Botnets

6
Characteristics of Complex Systems
  • What makes a system complex?
  • Nonlinear interactions among components.
  • Multi-scale phenomena and hierarchy.
  • Evolution of underlying components and
    environments.
  • How to measure a system's complexity?
  • By its unpredictability?
  • By how difficult it is to describe?
  • Length of most concise description.
  • No single model is adequate to describe the
    system; the more models that are required, the
    more complex the system. (Lee Segel)
  • By measuring how long before it halts, if ever?
    By how long until it repeats itself?
  • Entropy?
  • Multiple levels of organization?
  • Number of interdependencies?
  • Is complexity inherent in the system or in our
    understanding of it?

7
Caveat
  • Some believe that there is no general science of
    complex systems
  • "It's becoming apparent that a theory of
    complexity, in some sense of a great and
    transforming body of knowledge that applies to a
    whole range of cases, may be untenable." Sir
    Robert May (2001)

8
Holland's Definition of Complexity
  • CAS: multiple, diverse agents; stimulus-response
    rules; and adaptation
  • Match environmental conditions at various time
    scales
  • Stimulus causes an immediate change in action
    (response)
  • Cumulative stimulus causes changes in rules
    (adaptation, learning, evolution)
  • 7 principles
  • Aggregation: how we model CAS; the emergent
    behavior of CAS
  • Tags: identifiers that allow interaction and
    aggregation of agents
  • Nonlinearity: especially in interactions, e.g.
    predator-prey; linear math does not apply
  • Flows over networks of interactions: multiplier
    effect vs. recycling effect
  • Diversity: results from progressive, distributed
    adaptation; perpetual novelty
  • Internal models (schema): internal structure
    changes in response to stimulus
  • Building blocks: repetition; higher-level laws
    derive from laws of building blocks

9
G. W. Flake's Definition of Complexity
  • Flake's scientific view:
  • An intermediate view between reductionism and
    holism
  • The interactions of agents bind one level of
    understanding to the next
  • Nature is frugal (Occam's Razor)
  • CAS characterized by
  • Emergence, self-similarity, unpredictability,
    self-organizing
  • Collections/Multiplicity/Parallelism
  • Iteration, Recursion, Feedback
  • Adaptation, Learning, Evolution
  • Convergence of the sciences, facilitated by
    computation

10
Representative Measures of Complexity
  • Computational complexity (Cook)
  • How long a program runs (or how much memory it
    uses).
  • Asymptotic.
  • Language complexity
  • Classes of languages that can be computed
    (recognized) by different kinds of abstract
    machines.
  • Decidability, computability.
  • Information-theoretic approaches
  • Algorithmic complexity (Solomonoff, Kolmogorov,
    and Chaitin)
  • Length of the shortest program that can produce
    the phenomenon.
  • Mutual information (many authors)
  • Logical depth (Bennett).
  • Thermodynamic depth (Lloyd and Pagels)
  • Effective complexity (Gell-Mann and Lloyd)

11
Computational Complexity
  • Introduced by Steve Cook (1971).
  • Asymptotic running time, and/or memory
    consumption of an algorithm.
  • Worst-case versus average-case (see the sketch at
    the end of this slide).
  • Important computational complexity classes
  • NP: can be verified in polynomial time; O(p(n))
    on a non-deterministic Turing Machine.
  • NC: polylogarithmic time, O(log^k n), using a
    polynomial number of parallel processors.
  • P: polynomial time using a single processor.
  • Polynomial-time (P) algorithms are O(p(n)) for
    some polynomial function p(n) = C n^k, where C
    and k are constants independent of n, and n is
    the size (length) of the input.
  • Drawbacks
  • Says nothing about transient behaviors, and many
    interesting systems never reach their asymptotic
    regime.
  • Constants can matter; special cases vs. worst
    cases; approximations may suffice.
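  • A minimal sketch (not from the slides; names and
    values are illustrative) of the worst-case vs.
    average-case distinction, using linear search: the
    worst case (an absent key) costs n comparisons,
    while a typical hit costs fewer, though both are
    O(n).

    #include <stdio.h>

    /* Returns the number of comparisons linear search performs. */
    static int linear_search(const int *a, int n, int key, int *found) {
        int comparisons = 0;
        *found = 0;
        for (int i = 0; i < n; i++) {
            comparisons++;
            if (a[i] == key) { *found = 1; break; }
        }
        return comparisons;
    }

    int main(void) {
        int a[8] = {3, 1, 4, 1, 5, 9, 2, 6}, found;
        /* Absent key: the worst case scans all 8 elements. */
        printf("worst: %d comparisons\n", linear_search(a, 8, 7, &found));
        /* Key at index 4: a typical case stops early. */
        printf("typical: %d comparisons\n", linear_search(a, 8, 5, &found));
        return 0;
    }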

12
Algorithmic Complexity (AC) (also known as
Kolmogorov-Chaitin complexity)
  • The Kolmogorov-Chaitin complexity K(x) is the
    length, in bits, of the smallest program that,
    when run on a Universal Turing Machine, outputs
    (prints) x and then halts.
  • Example: what is K(x) where x is the first 10
    even natural numbers? Where x is the first 5
    million even natural numbers?
  • Possible representations:
  • 0, 2, 4, 6, 8, 10, 12, 14, 16, 18, ..., (2n - 2)
  • for (j = 0; j < n; j++) printf("%d\n", j * 2);
    (a compilable version appears at the end of this
    slide)
  • How many bits?
  • Alternative 1: O(n log n) bits
  • Alternative 2: K(x) = O(log n) bits
  • Two problems
  • Calculating K(x) depends on the machine we have
    available (e.g., what if we have a machine with
    an instruction "print the first 10 even natural
    numbers"?)
  • In general, it is an uncomputable problem to
    determine K(x) for arbitrary x.
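  • A complete, compilable version of the one-line
    loop above (a sketch; the slide shows only the
    loop). The program text is fixed, and only the
    decimal encoding of n grows with the output,
    which is why K(x) = O(log n) for this sequence.

    #include <stdio.h>

    int main(void) {
        long n = 5000000;             /* first 5 million even numbers */
        for (long j = 0; j < n; j++)  /* the slide's loop, made explicit */
            printf("%ld\n", 2 * j);
        return 0;
    }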

13
Algorithmic Complexity cont.
  • AC formalizes what it means for a set of numbers
    to be compressible or incompressible (see the
    sketch at the end of this slide).
  • Data that are redundant can be more easily
    described and have lower AC.
  • Data that have no clear pattern and no easy
    algorithmic description have high AC.
  • What about random numbers? If a string is
    random, then it possesses no regularities:
  • K(x) is approximately the length of "print x",
    i.e., the length of x plus a constant.
  • The shortest program to produce x is to give the
    computer a copy of x and say "print this."
  • Implication The more random a system, the
    greater its AC.
  • AC is related to entropy
  • The entropy rate of a symbolic sequence measures
    the unpredictability (in bits per symbol) of the
    sequence.
  • The entropy rate is also known as the entropy
    density or the metric entropy.
  • The average growth rate of K(x) is equal to the
    entropy rate
  • For a sequence of n random variables, how does
    the entropy of the sequence grow with n?
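  • A toy illustration of compressible vs.
    incompressible data (an assumption: a naive
    run-length encoder stands in for a general
    compressor). The redundant string collapses to a
    few symbols; the patternless one gets longer
    under the same scheme.

    #include <stdio.h>
    #include <string.h>

    /* Length of a naive run-length encoding: count digit(s)
       plus the repeated character, once per run. */
    static int rle_length(const char *s) {
        int out = 0;
        for (int i = 0; s[i] != '\0'; ) {
            int run = 1;
            while (s[i + run] == s[i]) run++;
            out += (run >= 10 ? 2 : 1) + 1;
            i += run;
        }
        return out;
    }

    int main(void) {
        const char *redundant   = "aaaaaaaaaaaaaaaa";  /* one run of 16 */
        const char *patternless = "axq9zr2mwp7bk3vd";  /* sixteen runs of 1 */
        printf("redundant:   %d -> %d\n", (int)strlen(redundant),
               rle_length(redundant));    /* 16 -> 3  */
        printf("patternless: %d -> %d\n", (int)strlen(patternless),
               rle_length(patternless));  /* 16 -> 32 */
        return 0;
    }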

14
Measures of Complexity that Capture Properties
Distinct from Randomness
[Figure: structural complexity plotted against algorithmic complexity]
Measures of randomness do not capture structure
or organization.
15
Logical Depth
  • Bennett (1986, 1990)
  • The logical depth of x is the run time of the
    shortest program that will cause a UTM to produce
    x and then halt.
  • Logical depth is not a measure of randomness; it
    is small both for trivially ordered and for
    random strings.
  • Drawbacks
  • Uncomputable.
  • Loses the ability to distinguish between systems
    that can be described by computational models
    less powerful than Turing Machines (e.g.,
    finite-state machines).

16
Turing Machines
17
Shannon Information
  • Shannon entropy H measures basic information
    capacity.
  • For a random variable X with a probability mass
    function p(x), the entropy of X is defined as
    H(X) = -Σx p(x) log2 p(x).
  • Entropy is measured in bits.
  • H measures the average uncertainty in the random
    variable.
  • Example 1
  • Consider a random variable with uniform
    distribution over 32 outcomes.
  • To identify an outcome, we need a label that
    takes on 32 different values, e.g., 5-bit strings
    (see the numeric check at the end of this slide).
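  • A minimal numeric check of Example 1 (a sketch;
    compile with -lm). For the uniform distribution
    over 32 outcomes, H(X) evaluates to 5 bits,
    matching the 5-bit labels.

    #include <stdio.h>
    #include <math.h>

    /* H(X) = -sum_x p(x) log2 p(x), in bits; zero-probability
       terms contribute nothing and are skipped. */
    static double entropy_bits(const double *p, int n) {
        double h = 0.0;
        for (int i = 0; i < n; i++)
            if (p[i] > 0.0)
                h -= p[i] * log2(p[i]);
        return h;
    }

    int main(void) {
        double p[32];
        for (int i = 0; i < 32; i++)
            p[i] = 1.0 / 32.0;        /* uniform over 32 outcomes */
        printf("H = %.1f bits\n", entropy_bits(p, 32));  /* 5.0 */
        return 0;
    }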

18
  • More generally, the entropy of a random variable
    is a lower bound on the average number of bits
    required to represent the random variable.
  • The uncertainty (complexity) of a random variable
    can be extended to define the descriptive
    complexity of a single string.
  • E.g., Kolmogorov (or algorithmic) complexity is
    the length of the shortest computer program that
    prints out the string.
  • Entropy is the uncertainty of a single random
    variable.
  • Conditional entropy is the entropy of a random
    variable given another random variable.

19
Mutual Information
  • Measures the amount of information that one
    random variable contains about another random
    variable.
  • Mutual information is a measure of reduction of
    uncertainty due to another random variable.
  • That is, mutual information measures the
    dependence between two random variables.
  • It is symmetric in X and Y, and is always
    non-negative.
  • Recall: the entropy of a random variable X is
    H(X).
  • The conditional entropy of X given another
    random variable Y is H(X|Y).
  • The mutual information of two random variables X
    and Y is I(X;Y) = H(X) - H(X|Y) =
    Σx,y p(x,y) log2 [ p(x,y) / (p(x) p(y)) ]
    (computed in the sketch below).
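  • A minimal sketch of the formula (the joint
    distribution here is chosen purely for
    illustration): two perfectly correlated fair bits
    give I(X;Y) = 1 bit, the full entropy of either
    variable.

    #include <stdio.h>
    #include <math.h>

    int main(void) {
        /* Joint distribution p(x,y): X = Y over two fair outcomes. */
        double pxy[2][2] = { {0.5, 0.0}, {0.0, 0.5} };
        double px[2] = {0.0, 0.0}, py[2] = {0.0, 0.0}, mi = 0.0;

        /* Marginals p(x) and p(y). */
        for (int x = 0; x < 2; x++)
            for (int y = 0; y < 2; y++) {
                px[x] += pxy[x][y];
                py[y] += pxy[x][y];
            }
        /* I(X;Y) = sum_{x,y} p(x,y) log2( p(x,y) / (p(x) p(y)) ) */
        for (int x = 0; x < 2; x++)
            for (int y = 0; y < 2; y++)
                if (pxy[x][y] > 0.0)
                    mi += pxy[x][y] * log2(pxy[x][y] / (px[x] * py[y]));
        printf("I(X;Y) = %.1f bits\n", mi);  /* 1.0 */
        return 0;
    }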

20
Measures of Simplicity (Flake, ch. 9)
  • Algorithmic regularity as simplicity?
  • Things that can be compressed are simpler
  • Formally quantified as algorithmic complexity
    (AC), or Kolmogorov complexity: the shortest
    program to produce a result
  • Redundant or enumerable data are simple
  • AC depends on the computer (or complex system)
    that runs the program
  • Random data are complex because you cannot
    represent them exactly except by explicitly
    listing them
  • AC translates between time complexity and space
    complexity
  • Contrasts with statistical simplicity:
  • Random data can be simple because you can
    approximate them statistically
  • Coin tosses, random walks, Gaussian (normal)
    distributions
  • You can compress random numbers with statistical
    descriptions and only a few parameters

21
Other Definitions
  • Per Bak: Sandpile model and the "edge of chaos"
  • 1. A power-law size distribution of extinction
    events: many small and few large.
  • 2. A power-law lifetime distribution of species
    and genera.
  • 3. A power-law distribution of species per
    genus, genera per family, and so on.
  • Stuart Kauffman: the adjacent possible
  • Autonomous agents have to live the most complex
    game that they can
  • There is a tendency for self-constructing
    biospheres to enlarge their workspace, the
    dimensionality of their adjacent possible
  • Murray Gell-Mann: Effective Complexity
  • Complexity is highest for things that are neither
    strictly regular nor strictly random
  • Kolmogorov Complexity is based on the length of a
    concise description of a set
  • Effective complexity is based on the length of a
    concise description of a set's regularities

22
Neither random nor regular
  • Complex systems occur at a transition point
    between two extremes
  • Often the only way to predict a future state of a
    complex system is to simulate it
  • It cannot be described by a tractable formula or
    short program
  • It cannot be described statistically
  • Some cellular automata are complex the results
    of running them are neither regular nor random
  • Langton's lambda parameter and Wolfram's CA
    classification schemes go from regular to
    chaotic, with complex in the middle (see the
    sketch below)
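  • A minimal sketch of an elementary
    (one-dimensional, two-state) cellular automaton
    of the kind Wolfram classified. Rule 110 is a
    standard example of "complex" class-4 behavior;
    no shortcut formula is known for such rules, so
    predicting them means simulating them.

    #include <stdio.h>

    #define WIDTH 64

    int main(void) {
        unsigned char rule = 110;  /* Wolfram rule number */
        unsigned char cur[WIDTH] = {0}, next[WIDTH];
        cur[WIDTH / 2] = 1;        /* start from a single live cell */

        for (int t = 0; t < 20; t++) {
            for (int i = 0; i < WIDTH; i++)
                putchar(cur[i] ? '#' : '.');
            putchar('\n');
            /* Each cell's next state is the rule bit indexed by
               its three-cell neighborhood (wrapping at the edges). */
            for (int i = 0; i < WIDTH; i++) {
                int l = cur[(i + WIDTH - 1) % WIDTH];
                int c = cur[i];
                int r = cur[(i + 1) % WIDTH];
                next[i] = (rule >> ((l << 2) | (c << 1) | r)) & 1;
            }
            for (int i = 0; i < WIDTH; i++)
                cur[i] = next[i];
        }
        return 0;
    }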

23
Summary of Complexity Measures
  • Computational complexity
  • How many resources does it take to compute a
    function?
  • The language/machine hierarchy
  • How complex a machine is needed to compute a
    function?
  • Information-theoretic methods
  • Entropy
  • Algorithmic complexity
  • Mutual information
  • Logical depth
  • Run time of the shortest program that generates
    the phenomenon and halts.
  • Asymptotic behavior of dynamical systems
  • Fixed points, limit cycles, chaos.
  • Wolfram's CA classification: the outcome of a
    complex CA cannot be predicted any faster than
    it can be simulated.
  • Effective Complexity
  • Neither regular nor random

24
Suggested References
  • Computational Complexity by Papadimitriou.
    Addison-Wesley (1994).
  • Elements of Information Theory by Cover and
    Thomas. Wiley (1991).
  • Kauffman, At Home in the Universe (1996) and
    Investigations (2002).
  • Per Bak, How Nature Works: The Science of
    Self-Organized Criticality (1996).
  • Gell-Mann, The Quark and the Jaguar (1994)
  • Reading for Wednesday 1/30 (to be posted)
  • Lansing, Complex Adaptive Systems, Ann Rev
    Anthropology 2003
  • www.ic.arizona.edu/lansing/CompAdSys.pdf
  • Gell-Mann, What is Complexity? 1995
  • http://www.santafe.edu/mgm/complexity.html

26
What is a Random Variable?
  • A function defined on a sample space.
  • Should be called random function.
  • Independent variable is a point in a sample space
    (e.g., the outcome of an experiment).
  • A function of outcomes, rather than a single
    given outcome.
  • Probability distribution of the random variable
    X
  • Example
  • Toss 3 fair coins.
  • Let X denote the number of heads appearing.
  • X is a random variable taking on one of the
    values {0, 1, 2, 3}.
  • P{X=0} = 1/8, P{X=1} = 3/8, P{X=2} = 3/8,
    P{X=3} = 1/8 (enumerated in the sketch below).
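  • A minimal sketch of this example (enumerating the
    sample space directly): each of the 2^3 equally
    likely outcomes is a point in the sample space,
    and X maps each outcome to its number of heads.

    #include <stdio.h>

    int main(void) {
        int counts[4] = {0, 0, 0, 0};
        /* Each 3-bit pattern is one equally likely outcome;
           a set bit means that coin came up heads. */
        for (int outcome = 0; outcome < 8; outcome++) {
            int heads = 0;
            for (int coin = 0; coin < 3; coin++)
                heads += (outcome >> coin) & 1;
            counts[heads]++;   /* X(outcome) = number of heads */
        }
        for (int k = 0; k <= 3; k++)
            printf("P{X=%d} = %d/8\n", k, counts[k]);
        return 0;
    }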

27
Detour into Information Theory (cont.)
  • Example 2:
  • A horse race with 8 horses competing.
  • The probabilities of the 8 horses are 1/2, 1/4,
    1/8, 1/16, 1/64, 1/64, 1/64, 1/64.
  • Calculate the entropy H of the horse race (see
    the sketch at the end of this slide).
  • Suppose that we wish to send a (short) message to
    another person indicating which horse won the
    race.
  • Could send the index of the winning horse (3
    bits).
  • Alternatively, could use the following set of
    labels:
  • 0, 10, 110, 1110, 111100, 111101, 111110, 111111.
  • Average description length is 2 bits (instead of
    3).
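  • A minimal check of this example (a sketch;
    compile with -lm): with the probabilities above
    and the codeword lengths 1, 2, 3, 4, 6, 6, 6, 6,
    both the entropy and the average code length
    come out to exactly 2 bits.

    #include <stdio.h>
    #include <math.h>

    int main(void) {
        double p[8] = {1.0/2, 1.0/4, 1.0/8, 1.0/16,
                       1.0/64, 1.0/64, 1.0/64, 1.0/64};
        int len[8] = {1, 2, 3, 4, 6, 6, 6, 6};  /* |0|, |10|, |110|, ... */
        double h = 0.0, avg = 0.0;
        for (int i = 0; i < 8; i++) {
            h   -= p[i] * log2(p[i]);  /* entropy contribution */
            avg += p[i] * len[i];      /* expected description length */
        }
        printf("H = %.2f bits, average length = %.2f bits\n", h, avg);
        return 0;
    }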

28
  • the study of cellular automata, from the early
    discoveries of Stanislaw Ulam and John von
    Neumann through to John Conway's Game of Life and
    the extensive work of Stephen Wolfram, made it
    clear that complexity could be generated as an
    emergent feature of extended systems with simple
    local interactions.
  • Over a similar period of time, Benoît
    Mandelbrot's large body of work on fractals
    showed that much complexity in nature could be
    described by certain ubiquitous mathematical
    laws, while the extensive study of phase
    transitions carried out in the 1960s and '70s
    showed how scale invariant phenomena such as
    fractals and power laws emerged at the critical
    point between phases.

29
Computers are complex systems
  • Even though we build them, we don't necessarily
    understand how they work.
  • Complexity is overwhelming design strategies
    based on functional decomposition (reductionism):
  • Context-free languages.
  • Object-oriented programming.
  • Conventional parallel and concurrent programming
    models.
  • Need new methods for engineering nonlinear
    computations
  • Neural networks, Genetic algorithms, Simulated
    annealing.
  • Emergent computation?