1
Computability of entropy and information in
classical Hamiltonian systems
  • Sungyun Kim
  • Asia-Pacific Center for Theoretical Physics

2
Motivation
  • Statistical physics is a physical theory built on partial information.
  • Entropy and information are central concepts in statistical physics.
  • The second law of thermodynamics states that entropy is nondecreasing during time evolution.
  • To what extent we can actually compute the entropy and information is an interesting question.

3
Outline
  • Definition of entropy and information for
    classical probability densities.
  • Properties of classical Hamiltonian time
    evolution.
  • Definitions and notions of computability.
  • An example with a computable initial probability density, Hamiltonian, and information, but a noncomputable time evolution of the entropy.

4
Entropy and Information
According to Shannon, the entropy is defined as the average of the negative logarithm of the probabilities. For an unknown binary digit X with probability ½ of being 0 and ½ of being 1, the entropy is log 2 = 1 bit (base 2). If the unknown digit is fixed, the entropy is reduced to 0 bits, and 1 bit of information is gained. The entropy can be viewed as a degree of uncertainty, a degree of freedom, or an information capacity, as in Y = 0.1001XXX…
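The two-outcome example above can be checked numerically. A minimal sketch (the function name is mine, not from the slides):

```python
import math

def shannon_entropy(probs, base=2):
    """Shannon entropy H = -sum_i p_i log p_i (zero-probability terms skipped)."""
    return -sum(p * math.log(p, base) for p in probs if p > 0)

# Unknown binary digit X with P(0) = P(1) = 1/2: entropy is log 2 = 1 bit.
fair_bit = shannon_entropy([0.5, 0.5])   # 1.0 bit
# A fixed digit has a single certain outcome: 0 bits, so 1 bit was gained.
fixed_bit = shannon_entropy([1.0])       # 0 bits
```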
Causes of entropy increase:
  • Entropy increase due to information loss.
  • Entropy increase due to an increase of the information capacity.
5
Entropy for a continuous probability distribution
For a probability distribution with continuous parameters, we discretize the distribution for the computation. The probability is defined through the discretized density and its support.
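A plausible form of the discretization (my reconstruction; the slide's own formula is not in the transcript): partition the support into cells C_i and set

```latex
p_i = \int_{C_i} \rho(x)\, dx, \qquad S = -\sum_i p_i \log p_i .
```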
6
Information (relative entropy)
The total information capacity is the maximum entropy attainable on the partition. With an equal division as the reference distribution, the information is the capacity minus the entropy: it is maximal for a discrete delta (all probability in one cell) and minimal, zero, for the uniform distribution.
The entropy defined here is a subjective quantity, depending on two things: the reference distribution and the coarse-grained partition.
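In symbols, a standard reading of these quantities (a hedged reconstruction following the usual definitions, since the slide's formulas are not in the transcript): with reference distribution q over N cells,

```latex
I(p\,\|\,q) = \sum_i p_i \log \frac{p_i}{q_i},
\qquad C = \log N \quad \text{(total information capacity)} .
```

For the equal division q_i = 1/N this gives I = log N − S, which is indeed maximal for a discrete delta and zero for the uniform distribution.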
7
Entropy and information in Hamiltonian time
evolution
In a classical Hamiltonian system, the probability density moves like an incompressible fluid: the measure is preserved, and if we follow the time evolution of a point, the probability density along its trajectory is constant.
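The incompressibility statement is Liouville's theorem; in standard form (not taken from the transcript):

```latex
\frac{d\rho}{dt} = \frac{\partial \rho}{\partial t} + \{\rho, H\} = 0 ,
```

so ρ is constant along trajectories and the phase-space measure is preserved.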
For a fixed grid, coarse graining is inevitable.
8
If we are looking through a fixed grid
Suppose that we divide the phase space into N cells at t = 0. After the discrete time evolution to t = 1, the cells are deformed. But if our grid is fixed, the information either stays the same or is lost during the time evolution, for a fixed discrete partition.
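The loss of information through a fixed grid can be illustrated numerically. This is a toy sketch, assuming the baker's map as a stand-in for a measure-preserving evolution (the map, grid size, and point cloud are my choices, not from the slides):

```python
import math
from collections import Counter

def baker(x, y):
    """Baker's map: an area-preserving map that stretches and folds the unit square."""
    if x < 0.5:
        return 2 * x, y / 2
    return 2 * x - 1, (y + 1) / 2

def coarse_entropy(points, n_cells=4):
    """Entropy (bits) of a point cloud histogrammed on a fixed n_cells x n_cells grid."""
    counts = Counter((min(int(x * n_cells), n_cells - 1),
                      min(int(y * n_cells), n_cells - 1)) for x, y in points)
    total = len(points)
    return -sum(c / total * math.log2(c / total) for c in counts.values())

# A deterministic cloud filling a single grid cell: coarse entropy starts at 0 bits.
pts = [(i / 128, j / 128) for i in range(32) for j in range(32)]
entropies = []
for _ in range(4):
    entropies.append(coarse_entropy(pts))
    pts = [baker(x, y) for x, y in pts]
entropies.append(coarse_entropy(pts))
# The measure-preserving evolution spreads the cloud over the fixed grid,
# so the coarse-grained entropy never decreases.
```

For this cloud the entropy climbs one bit per step toward the grid's capacity of log2 16 = 4 bits, illustrating the slide's point that a fixed partition can only keep or lose information.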
9
Afterthought
  • Information was lost due to the coarse graining.
  • No coarse graining, hence no information loss?
  • Is computation always possible?

Entropy calculated without coarse graining
The information is the same, but the total information capacity has increased.
Can we compute it?
10
History of computability
  • In the 1930s, Gödel showed that in any axiomatic system strong enough to express the natural numbers there exist unprovable statements.
  • Turing applied this to programs and showed that there exist problems not solvable by any algorithm (the halting problem).
  • In algorithmic information theory, G. Chaitin showed that with n bits of axioms it is impossible to prove that the complexity of a string exceeds n + c bits.
  • C. Moore showed that a Hamiltonian mapping can be mapped onto a Turing machine, and that the question of where the trajectories pass can be mapped onto the halting problem.
  • Examples of computable initial conditions with a noncomputable time evolution have been shown for the wave equation (Pour-El et al.).

11
Computability preliminaries
Pour-El and Richards, Computability in Analysis and Physics, Springer-Verlag (1989).
Computability of numbers
Definition 1
Definition 2
Definition 3
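The definitions themselves are missing from the transcript; in Pour-El and Richards' framework the notion of a computable real runs along these lines (a paraphrase, not the slide's wording):

```latex
x \in \mathbb{R} \text{ is computable if there is a computable sequence of rationals } \{r_k\}
\text{ with } |x - r_k| \le 2^{-k} \text{ for all } k .
```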
12
Computability of functions
Definition 4
Definition 5
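Again the statements are not in the transcript; the standard Pour-El–Richards notion (my paraphrase) is that f on [a, b] is computable if it maps computable sequences of points to computable sequences of values and is effectively uniformly continuous, i.e. there is a recursive d with

```latex
|x - y| \le 2^{-d(n)} \;\Longrightarrow\; |f(x) - f(y)| \le 2^{-n} .
```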
13
Recursively enumerable nonrecursive set
(We can compute a(0), a(1), a(2), … using a Turing machine.)
Theorem 1
There exists a set that some algorithm can enumerate, but whose complement no algorithm can enumerate.
Waiting lemma
14
Example and possible (?) examples of recursively enumerable nonrecursive sets
Turing's halting problem proved to be an unsolvable problem: there is no general algorithm for deciding whether a program will halt or not. The set of halting programs is a recursively enumerable nonrecursive set.
Cellular automata (?): there are cellular automata for which, given an initial pattern and the local rules, we can calculate the time evolution of the patterns, P = p(1), p(2), p(3), …. But the question of whether a given pattern p is in P or not is a very difficult one.
Mathematical conjectures: Goldbach's conjecture states that any even number larger than 2 can be expressed as the sum of two primes. For the set G = {n : n an even number larger than 2 satisfying Goldbach}, we can show 4, 6, 8, … ∈ G one by one, but no algorithm is known that generates the even numbers N > 2 not in G.
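Membership in G is checkable for any single even number, since the search is finite. A small sketch (function names are mine):

```python
def is_prime(n):
    """Trial-division primality test; enough for small n."""
    if n < 2:
        return False
    i = 2
    while i * i <= n:
        if n % i == 0:
            return False
        i += 1
    return True

def satisfies_goldbach(n):
    """Check whether even n > 2 is a sum of two primes -- a finite search."""
    return any(is_prime(p) and is_prime(n - p) for p in range(2, n // 2 + 1))

# Verifying membership number by number is easy:
members = [n for n in range(4, 21, 2) if satisfies_goldbach(n)]
# members == [4, 6, 8, 10, 12, 14, 16, 18, 20]
# ...but no algorithm is known that enumerates the even numbers NOT in G.
```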
In experiment: there are experimental ways of verifying the existence of a particle (e.g. by a photodetector), but it can be very hard to verify experimentally that a certain particle does not exist (magnetic monopole?).
15
Examples of noncomputable quantities
1. A noncomputable number
The sum x = Σ_n 2^(−a(n)), where a(n) enumerates a recursively enumerable nonrecursive set, is a noncomputable number. The sum is bounded and monotonically increasing, so the limit exists. But an effective determination of the limit is not possible: a guaranteed error bound on a partial sum would require deciding which integers are not in the set.
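The flavor of the construction can be seen with a toy stand-in for the enumeration a(n) (the list below is illustrative only; a genuine example needs a nonrecursive set):

```python
def partial_sum(enumeration, k):
    """Lower bound on x = sum_n 2**(-a(n)) from the first k enumerated terms."""
    return sum(2.0 ** -a for a in enumeration[:k])

a = [3, 1, 7, 4, 12, 6]  # illustrative stand-in for an enumeration of a set A
approximations = [partial_sum(a, k) for k in range(1, len(a) + 1)]
# Each partial sum is a computable lower bound, and the sequence increases
# monotonically toward the limit; but certifying an error bound 2**-n would
# require deciding which integers are NOT in A, which no algorithm can do
# when A is recursively enumerable but nonrecursive.
```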
2. A function not bounded by any recursive function
The function constructed here cannot be bounded by any recursive function. For each m, only a finite number of the a(n) are smaller than m, so for all x the sum converges. But to obtain all the a(n) smaller than m we must know the waiting function, and the function grows like an exponential of the waiting time.
16
3. A linear operator mapping a computable element to a noncomputable one
Consider a linear operator applied to a computable input: the input is computable, but its image under the operator is not.
In general, bounded linear operators preserve computability, but unbounded operators do not (Pour-El and Richards).
17
Construction of a computable initial information and Hamiltonian, but a noncomputable time evolution of the entropy
The initial Hamiltonian and its derivatives, the probability density, and the information are all computable.
18
The solution of the Hamiltonian equations of motion
19
The total information capacity increases with time, and can increase faster than any recursive function. This is an example where the probability and its measure are conserved, but the information capacity needed to describe the probability distribution with respect to the original partition may grow faster than any recursive function, so the information at later times is not computable.
20
Conclusion
  • In continuous systems, entropy and information are defined through a reference distribution and a partition.
  • Conservation of the probability and of its measure is not enough to guarantee the computability of the information.

21
References
  • K. Gödel, Monatshefte für Mathematik und Physik, 38, 173, 1931.
  • A. Turing, Proc. London Math. Soc., Ser. 2, 42, 230, 1937.
  • C. Shannon, Bell System Technical Journal, 27, 379, 1948.
  • E.T. Jaynes, Phys. Rev., 106, 620, 1957.
  • G.J. Chaitin, IEEE Transactions on Information Theory, IT-20, 10, 1974.
  • C. Moore, Phys. Rev. Lett., 64, 2354, 1990.
  • Z. Xia, Ann. Math., 135, 411, 1992.
  • M.B. Pour-El and J.I. Richards, Computability in Analysis and Physics, Springer-Verlag, 1989.
  • L.P. Hurd, J. Kari and K. Culik, Ergodic Theory and Dynamical Systems, 12, 255, 1992.
  • L. Matyas, T. Tel and J. Vollmer, Phys. Rev. E, 69, 016205, 2004.
  • D. Graça et al., Electronic Notes in Theoretical Computer Science, 202, 49, 2008.