1
Complex Adaptive Systems
August 11-15, 2003
Lecture 1: Introduction
Stephanie Forrest
Dept. of Computer Science
Univ. of New Mexico
Albuquerque, NM
http://cs.unm.edu/~forrest
forrest@cs.unm.edu
2
Course Topics
  • Monday
  • Lecture 1: Introduction to complex adaptive
    systems (Forrest)
  • Lecture 2: Modeling complex adaptive systems
    (Forrest)
  • Discussion: Biological modeling---problems and
    approaches (Forrest)
  • Tuesday
  • Lectures 3-4: Evolution, adaptation, social
    modeling (Forrest)
  • Discussion: Computer security and complex
    adaptive systems (Forrest)
  • Wednesday
  • Lecture 5: Cellular automata and the Game of Life
  • Lecture 6: Agent-based modeling and artificial
    life (Forrest)
  • Lecture 7: Power laws and complex systems (Moore)
  • Thursday
  • Lecture 8: Dynamical systems, stability,
    attractors, and chaos (Moore)
  • Lecture 9: Phase transitions in physics and
    computer science (Moore)
  • Discussion: Real-world problems (Moore)
  • Friday
  • Lecture 10: Intrinsic computation, structural
    complexity (Crutchfield)
  • Lecture 11: Modeling coordination (Crutchfield)

3
What are Complex Adaptive Systems?
  • Collections of agents
  • Molecules, cells, animals, nations, economic
    agents.
  • Agents interact (locally) with one another and
    with their environment
  • No central controller.
  • Interactions are nontrivial, i.e., nonlinear.
  • Chemical reactions, cellular interactions,
    mating, buy/sell decisions.
  • Unanticipated properties often result from the
    interactions
  • Immune system responses, flocks of animals,
    settlement patterns, earthquakes, speculative
    bubbles and crashes.
  • Agents adapt their behavior to other agents and
    environmental constraints
  • Imitation, adaptation, learning.
  • System behavior evolves over time
  • Rules change, unfolding of complex behavior.

4
Example Complex Adaptive Systems
  • Natural ecosystems
  • Economies
  • Social systems
  • Immune systems
  • The Internet and other computer systems

5
Caveat
  • Some people believe that there is no general
    science of complex systems
  • "It's becoming apparent that a theory of
    complexity, in some sense of a great and
    transforming body of knowledge that applies to a
    whole range of cases, may be untenable." --- Sir
    Robert May (2001)

6
Theme: Understanding Nature and Society Through
Computation
  • Using information processing methods to learn
    more about natural systems
  • Cognitive science.
  • Biological models (e.g., vaccine design,
    ecological models).
  • Lattice gas models in physics.
  • Prisoner's dilemma in social systems.
  • Contrast with computational science
  • Modeling nature as a mechanical device vs.
    modeling nature as an information system.
  • E.g., computational biology---genomics and
    protein folding.

7
Characteristics of Complex Systems
  • What makes a system complex?
  • Nonlinear interactions among components.
  • Multi-scale phenomena.
  • Evolution of underlying components and
    environments.
  • How to measure a system's complexity?
  • By its unpredictability?
  • By how difficult it is to describe?
  • Length of most concise description.
  • No single model is adequate to describe the
    system---the more models that are required, the
    more complex the system. (Lee Segel)
  • By measuring how long before it halts, if ever?
    By how long until it repeats itself?
  • Entropy?
  • Multiple levels of organization?
  • Number of interdependencies?
  • Is complexity inherent in the system or in our
    understanding of it?

8
Some Measures of Complexity
  • Computational complexity (Cook)
  • How long a program runs (or how much memory it
    uses).
  • Asymptotic.
  • Language complexity
  • Classes of languages that can be computed
    (recognized) by different kinds of abstract
    machines.
  • Decidability, computability.
  • Logical depth (Bennett).
  • Information-theoretic approaches
  • Algorithmic Complexity (Solomonoff, Kolmogorov,
    and Chaitin)
  • Length of the shortest program that can produce
    the phenomenon.
  • Kolmogorov complexity.
  • Mutual information (many authors)
  • Thermodynamic depth (Lloyd and Pagels)
  • Effective complexity (Gell-Mann and Lloyd)

9
Computational Complexity
  • Introduced by Steve Cook (1970).
  • Asymptotic running time, and/or memory
    consumption of an algorithm.
  • Worst-case versus average-case.
  • Important computational complexity classes
  • NP (can be verified in polynomial time; O(p(n))
    on a non-deterministic Turing Machine); see the
    verification sketch after this list.
  • NC (polylogarithmic time, O(log^k n), using a
    polynomial number of processors).
  • P (polynomial time using a single processor).
  • Here c and k are constants, and n is the size
    (length) of the input.
  • Polynomial-time algorithms (P) are O(p(n))
  • for some polynomial function p.
  • Drawbacks
  • Says nothing about transient behaviors. Many
    interesting systems never reach asymptopia.
  • The categorization is very coarse----in the real
    world, constants often matter.
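As a concrete illustration of the NP definition above (added here; not from the original slides), the C sketch below checks a proposed Subset-Sum solution. Finding a subset that sums to the target may take exponential time, but verifying a given candidate (the "certificate") is a single O(n) pass, which is exactly what membership in NP requires:

    #include <stdio.h>

    /* Verify a Subset-Sum certificate: chosen[i] says whether set[i]
       is in the claimed subset. The check runs in O(n) time. */
    int verify_subset_sum(const int *set, const int *chosen,
                          int n, int target)
    {
        int sum = 0;
        for (int i = 0; i < n; i++)
            if (chosen[i])
                sum += set[i];
        return sum == target;
    }

    int main(void)
    {
        int set[]    = {3, 34, 4, 12, 5, 2};
        int chosen[] = {0, 0, 1, 0, 1, 0};  /* claims {4, 5} sums to 9 */
        printf("certificate valid: %d\n",
               verify_subset_sum(set, chosen, 6, 9));
        return 0;
    }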

10
Computational Complexity Classes (from
Papadimitriou, 1994)
11
Algorithmic Complexity (AC)(also known as
Kolmogorov-Chaitin complexity)
  • The Kolmogorov-Chaitin complexity K(x) is the
    length, in bits, of the smallest program that,
    when run on a Universal Turing Machine, outputs
    (prints) x and then halts.
  • Example: What is K(x) where x is the first 10
    even natural numbers? Where x is the first 5
    million even natural numbers?
  • Possible representations:
  • 0, 2, 4, 6, 8, 10, 12, 14, 16, 18, ..., (2n - 2)
  • for (j = 0; j < n; j++) printf("%d\n", j * 2);
    (expanded into a runnable sketch after this list)
  • How many bits?
  • Alternative 1: O(n log n) (n numbers, each
    written in up to log n bits).
  • Alternative 2: K(x) = O(log n) (a fixed-length
    program plus the value of n).
  • Two problems:
  • Calculation of K(x) depends on the machine we
    have available (e.g., what if we have a machine
    with an instruction "print the first 10 even
    natural numbers"?)
  • In general, it is an incomputable problem to
    determine K(x) for arbitrary x.
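To make the second representation concrete, here is a minimal self-contained C version of the one-line generator above (the loop is from the slide; the surrounding boilerplate and the fixed n = 10 are added for illustration):

    #include <stdio.h>

    /* Generator for the first n even natural numbers. The program text
       has a fixed length; only the value of n varies, so describing x
       this way costs O(log n) bits rather than O(n log n). */
    int main(void)
    {
        int n = 10;  /* first 10 even natural numbers */
        for (int j = 0; j < n; j++)
            printf("%d\n", j * 2);  /* prints 0, 2, 4, ..., 2n - 2 */
        return 0;
    }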

12
Algorithmic Complexity cont.
  • AC formalizes what it means for a set of numbers
    to be compressible or incompressible.
  • Data that are redundant can be more easily
    described and have lower AC.
  • Data that have no clear pattern and no easy
    algorithmic description have high AC.
  • What about random numbers? If a string is
    random, then it possesses no regularities:
  • K(x) ≈ |x| + c, the length of a "print x" program.
  • The shortest program to produce x is to input to
    the computer a copy of x and say "print this."
  • Implication: The more random a system, the
    greater its AC.
  • AC is related to entropy:
  • The entropy rate of a symbolic sequence
    measures the unpredictability (in bits per
    symbol) of the sequence.
  • The entropy rate is also known as the entropy
    density or the metric entropy.
  • The average growth rate of K(x) is equal to the
    entropy rate.
  • For a sequence of n random variables, how does
    the entropy of the sequence grow with n?
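A worked answer for the simplest case (added here for concreteness): for n independent, identically distributed variables, H(X1, ..., Xn) = n H(X). With fair coin flips, H(X) = 1 bit, so the entropy grows as n bits, and K(x) of a typical n-flip record grows at about 1 bit per symbol; with a biased coin (p = 0.1), H(X) ≈ 0.47 bits, so typical sequences compress to roughly 0.47n bits.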

13
Measures of Complexity that Capture Properties
Distinct from Randomness
[Figure: structural complexity plotted against algorithmic complexity (randomness)]
  • Measures of randomness do not capture pattern,
    structure, correlation, or organization.
  • Mutual information, Wolfram's CA classification.
  • The edge of chaos.

14
Logical Depth and Turing Machines
  • Bennett (1986, 1990)
  • The Logical depth of x is the run time of the
    shortest program that will cause a UTM to produce
    x and then halt.
  • Logical depth is not a measure of randomness; it
    is small both for trivially ordered and for
    random strings.
  • Drawbacks
  • Uncomputable.
  • Loses the ability to distinguish between systems
    that can be described by computational models
    less powerful than Turing Machines (e.g.,
    finite-state machines).

15
Detour into Information Theory
  • Shannon entropy H measures basic information
    capacity:
  • For a random variable X with a probability mass
    function p(x), the entropy of X is defined as
    H(X) = - Σ_x p(x) log2 p(x).
  • Entropy is measured in bits.
  • H measures the average uncertainty in the random
    variable.
  • Example 1:
  • Consider a random variable with uniform
    distribution over 32 outcomes.
  • To identify an outcome, we need a label that
    takes on 32 different values, e.g., 5-bit
    strings.
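Worked out (added for concreteness): H(X) = - Σ_x (1/32) log2 (1/32) = log2 32 = 5 bits, so the entropy exactly matches the length of the 5-bit labels; the uniform distribution is the case of maximum uncertainty, where no shorter average description is possible.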

16
What is a Random Variable?
  • A function defined on a sample space.
  • Should be called a "random function."
  • Independent variable is a point in a sample space
    (e.g., the outcome of an experiment).
  • A function of outcomes, rather than a single
    given outcome.
  • Probability distribution of the random variable
    X
  • Example
  • Toss 3 fair coins.
  • Let X denote the number of heads appearing.
  • X is a random variable taking on one of the
    values {0, 1, 2, 3}.
  • P(X=0) = 1/8; P(X=1) = 3/8; P(X=2) = 3/8;
    P(X=3) = 1/8.
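The distribution can be checked by brute-force enumeration; this small C sketch (added for illustration) walks the 2^3 equally likely outcomes and tallies the number of heads:

    #include <stdio.h>

    int main(void)
    {
        /* Enumerate the 8 equally likely outcomes of tossing 3 fair
           coins; bit i of `outcome` encodes coin i (1 = heads). */
        int count[4] = {0};
        for (int outcome = 0; outcome < 8; outcome++) {
            int heads = (outcome & 1) + ((outcome >> 1) & 1)
                      + ((outcome >> 2) & 1);
            count[heads]++;
        }
        for (int k = 0; k <= 3; k++)
            printf("P(X=%d) = %d/8\n", k, count[k]);
        return 0;
    }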

17
Detour into Information Theory cont.
  • Example 2:
  • A horse race with 8 horses competing.
  • The probabilities of the 8 horses winning are
    (1/2, 1/4, 1/8, 1/16, 1/64, 1/64, 1/64, 1/64).
  • Calculate the entropy H of the horse race.
  • Suppose that we wish to send a (short) message to
    another person indicating which horse won the
    race.
  • Could send the index of the winning horse (3
    bits).
  • Alternatively, could use the following set of
    labels:
  • 0, 10, 110, 1110, 111100, 111101, 111110, 111111.
  • Average description length is 2 bits (instead of
    3).
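This C sketch (added here; the win probabilities above follow the Cover and Thomas horse-race example this slide draws on) verifies that H = 2 bits and that the variable-length code achieves that bound on average:

    #include <stdio.h>
    #include <math.h>

    int main(void)
    {
        /* Win probabilities for the 8 horses. */
        double p[8] = {1.0/2, 1.0/4, 1.0/8, 1.0/16,
                       1.0/64, 1.0/64, 1.0/64, 1.0/64};
        /* Codeword lengths for 0, 10, 110, 1110, 111100, ..., 111111. */
        int len[8]  = {1, 2, 3, 4, 6, 6, 6, 6};

        double H = 0.0, avg_len = 0.0;
        for (int i = 0; i < 8; i++) {
            H       += -p[i] * log2(p[i]);  /* Shannon entropy, in bits */
            avg_len +=  p[i] * len[i];      /* expected description length */
        }
        /* Both print 2.00: this code meets the entropy lower bound. */
        printf("H = %.2f bits, average code length = %.2f bits\n",
               H, avg_len);
        return 0;
    }

(Compile with -lm to link the math library.)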

18
Detour into Information Theory cont.
  • More generally,
  • the entropy of a random variable is a lower bound
    on the average number of bits required to
    represent the random variable.
  • The uncertainty (complexity) of a random variable
    can be extended to define the descriptive
    complexity of a single string.
  • E.g., Kolmogorov (or algorithmic) complexity is
    the length of the shortest computer program that
    prints out the string.
  • Entropy is the uncertainty of a single random
    variable.
  • Conditional entropy is the entropy of a random
    variable given another random variable.

19
Mutual Information
  • Measures the amount of information that one
    random variable contains about another random
    variable.
  • Mutual information is a measure of reduction of
    uncertainty due to another random variable.
  • That is, mutual information measures the
    dependence between two random variables.
  • It is symmetric in X and Y, and is always
    non-negative.
  • Recall: The entropy of a random variable X is
    H(X).
  • The conditional entropy of a random variable X
    given another random variable Y is H(X|Y).
  • The mutual information of two random variables X
    and Y is
  • I(X,Y) = H(X) + H(Y) - H(X,Y) = H(X) - H(X|Y).
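A worked example (added for concreteness): if X and Y are independent fair coins, H(X) = H(Y) = 1 and H(X,Y) = 2, so I(X,Y) = 1 + 1 - 2 = 0 bits; knowing Y says nothing about X. If instead Y is an exact copy of X, H(X,Y) = 1, so I(X,Y) = 1 + 1 - 1 = 1 bit; observing Y removes all uncertainty about X.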

20
Summary of Complexity Measures
  • Computational complexity
  • How many resources does it take to compute a
    function?
  • The language/machine hierarchy
  • How complex a machine is needed to compute a
    function?
  • Information-theoretic methods
  • Entropy
  • Algorithmic complexity
  • Mutual information
  • Logical depth
  • Run time of the shortest program that generates
    the phenomenon and halts.
  • Asymptotic behavior of dynamical systems
  • Fixed points, limit cycles, chaos.
  • Wolfram's CA classification.
  • Langton's lambda parameter.

21
Suggested References
  • Computational Complexity by Papadimitriou.
    Addison-Wesley (1994).
  • Elements of Information Theory by Cover and
    Thomas. Wiley (1991).