BIO-INSPIRED AND COGNITIVE COMPUTING for data mining, tracking, fusion, financial prediction, language understanding, web search engines, and diagnostic modeling of cultures

Slides: 152
Provided by: leonidper

Transcript and Presenter's Notes


1
BIO-INSPIRED AND COGNITIVE COMPUTING for data
mining, tracking, fusion, financial prediction,
language understanding, web search engines, and
diagnostic modeling of cultures
IEEE 2007 Fall Short Course, Holiday Inn, Woburn,
MA, 7:00-9:00 pm, Nov. 8, 15, 22, Dec. 6
Leonid Perlovsky, Visiting Scholar, Harvard
University; Technical Advisor, AFRL
2
OUTLINE
  • 1. Cognition, Complexity, and Logic
  • 2. The Knowledge Instinct
  • -Neural Modeling Fields and Dynamic Logic
  • 3. Language
  • 4. Integration of cognition and language
  • 5. High Cognitive Functions
  • 6. Evolution of cultures
  • 7. Future directions

3
DETAILED OUTLINE
  • 1. Cognition: integration of real-time signals
    and a priori knowledge
  • 1.1. physics and mathematics of the mind
  • 1.2. genetic argument for the first principles
  • 1.3. the nature of understanding
  • 1.3.1. concepts: chair
  • 1.3.2. hierarchy
  • 1.4. combinatorial complexity (CC): a
    fundamental problem?
  • 1.5. CC since the 1950s
  • 1.6. CC vs. logic
  • 1.6.1. formal, multivalued, and fuzzy logics
  • 1.6.2. dynamic logic
  • 1.6.3. Aristotle vs. Gödel: Alexander the Great
  • 1.7. mathematics vs. mind
  • 1.8. structure of the mind: concepts, instincts,
    emotions, behavior
  • 1.9. the knowledge instinct
  • 1.9.1. need for learning
  • 1.9.2. knowledge emotion: aesthetic emotion
  • 2. Modeling Field Theory (NMF) of cognition

2.4. applications, examples, exercises
2.4.1. clustering
2.4.2. tracking and CRB
  - example of tracking below clutter
  - complexity, NMF vs. MHT
  - models
2.4.3. recognition
  - example of pattern in image below clutter
  - complexity, NMF vs. MHT
  - models
2.4.4. fusion
  - example of fusion, navigation, and detection below clutter
  - models
2.4.5. prediction
  - financial prediction
  - models
2.5. block-diagrams
2.7. hierarchical structure
3. Language
3.1. language acquisition and complexity
3.1.2. language separate from cognition
3.1.3. hierarchy of language
3.1.4. application - search engine based on understanding
3.2. NMF of language
3.2.1. differentiating non-differentiable qualitative functions
4
DETAILED OUTLINE: CONTINUATION
  • 4. Integration of cognition and language
  • 4.1. language vs. cognition
  • 4.2. past AI and Chomskyan linguistics
  • 4.3. integrated models
  • 4.3. integrated hierarchies
  • 4.4. Humboldt's inner linguistic form
  • 5. Prolegomena to a theory of the mind
  • 5.1. higher cognitive functions
  • 5.2. from Plato to Locke
  • 5.3. from Kant to Grossberg
  • 5.4. NMF vs. Buddhism
  • 5.5. NMF vs. neuro-biology
  • 5.5. NMF dynamics elementary thought process
  • 5.6. consciousness and unconscious
  • 5.7. aesthetics and beauty
  • 5.8. intuition
  • 5.9. why Adam was expelled from paradise
  • 5.9. symbols and signs

6. Evolution of Culture
6.1. Culture and language
6.2. KI: differentiation and synthesis
6.3. Spiritual cultural measurements
6.4. Mathematical modeling and predictions
6.4.1. dynamic culture
6.4.2. traditional culture
6.4.3. terrorists' consciousness
6.4.4. interacting cultures
6.5. Evolution of concepts and emotions
6.6. Creativity
6.7. Disintegration of cultures
6.8. Emotions in language
6.9. English vs. Arabic
6.10. Synthesis
6.11. Differentiation of emotions
6.12. Role of music in evolution of the mind and culture
7. Future directions; publications
7.1. Science and Religion
7.2. Predictions and testing
7.3. Future directions
7.4. Publications
5
INTRODUCTION
  • Nature of the mind

6
PHYSICS AND MATHEMATICS OF THE MIND: RANGE OF
CONCEPTS
  • Logic is sufficient to explain mind
  • Newell, Artificial Intelligence, 1980s
  • No new specific mathematical concepts are needed
  • Mind is a collection of ad-hoc principles,
    Minsky, 1990s
  • Specific mathematical constructs describe the
    multiplicity of mind phenomena
  • first physical principles of mind
  • Grossberg, Zadeh, Perlovsky,
  • Quantum computation
  • Hameroff, Penrose, Perlovsky,
  • New unknown yet physical phenomena
  • Josephson, Penrose

7
GENETIC ARGUMENTS FOR THE FIRST PRINCIPLES
  • Only 30,000 genes in the human genome
  • Only about 2% difference between humans and apes
  • Say, 1% difference between human and ape minds
  • Only about 300 proteins
  • Therefore, the mind has to utilize few inborn
    principles
  • If we count a protein per concept
  • If we count combinations: 300^300, unlimited =>
    all concepts and languages could have been
    genetically hard-wired (!?!)
  • Languages and concepts are not genetically
    hardwired
  • Because they have to be flexible and adaptive

8
COGNITION
  • Understanding the world around
  • Perception
  • Simple objects
  • Complex situations
  • Integration of real-time signals and existing (a
    priori) knowledge
  • From signals to concepts
  • From less knowledge to more knowledge

9
EXAMPLE
  • Example: this is a chair, it is for sitting
  • Identify objects
  • signals -> concepts
  • What in the mind helps us do this?
    Representations, models, ontologies?
  • What is the nature of representations in the
    mind?
  • Wooden chairs in the world, but no wood in the
    brain

10
VISUAL PERCEPTION
  • Neural mechanisms are well studied
  • Projection from retina to visual cortex
    (geometrically accurate)
  • Projection of memories-models
  • from memory to visual cortex
  • Matching sensory signals and models
  • In visual nerve more feedback connections than
    feedforward
  • matching involves complicated adaptation of
    models and signals
  • Difficulty
  • Associate signals with models
  • A lot of models (expected objects and scenes)
  • Many more combinations models <-> pixels
  • Association <-> adaptation
  • To adapt, signals and models should be associated
  • To associate, they should be adapted

11
ALGORITHMIC DIFFICULTIES A FUNDAMENTAL PROBLEM?
  • Cognition and language involve evaluating large
    numbers of combinations
  • Pixels -> objects -> scenes
  • Combinatorial Complexity (CC)
  • A general problem (since the 1950s)
  • Detection, recognition, tracking, fusion,
    situational awareness, language
  • Pattern recognition, neural networks, rule
    systems
  • Combinations of 100 elements are 100^100
  • This number exceeds the size of the Universe,
  • > all the events in the Universe during its
    entire life
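The scale of 100^100 can be verified in a few lines by working in log10; the ~10^80 atoms figure used for comparison is a commonly quoted order-of-magnitude estimate and an assumption of this sketch, not from the slides:

```python
import math

# Number of combinations of 100 elements: 100^100; work in log10 to avoid overflow
log10_cc = 100 * math.log10(100)
print(log10_cc)  # 200.0 -> 100^100 = 10^200

# Commonly quoted order-of-magnitude figure for comparison (assumption of this
# sketch): ~10^80 atoms in the observable Universe
log10_atoms = 80
print(log10_cc - log10_atoms)  # 120.0 -> 10^120 times larger
```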

12
COMBINATORIAL COMPLEXITY SINCE the 1950s
  • CC has been encountered for over 50 years
  • Statistical pattern recognition and neural
    networks: CC of learning requirements
  • Rule systems and AI, in the presence of
    variability: CC of rules
  • Minsky, 1960s: Artificial Intelligence
  • Chomsky, 1957: language mechanisms are rule
    systems
  • Model-based systems, with adaptive models: CC of
    computations
  • Chomsky, 1981: language mechanisms are model-based
    (rules and parameters)
  • Current ontologies, semantic web are
    rule-systems
  • Evolvable ontologies present a challenge

13
CC AND TYPES OF LOGIC
  • CC is related to formal logic
  • Law of excluded middle (or excluded third)
  • every logical statement is either true or false
  • Gödel proved that logic is illogical,
    inconsistent (1930s)
  • CC is Gödel's incompleteness in a finite system
  • Multivalued logic eliminated the law of excluded
    third
  • Still, the mathematics of formal logic
  • Excluded 3rd -> excluded (n+1)
  • Fuzzy logic eliminated the law of excluded
    third
  • Fuzzy logic systems are either too fuzzy or too
    crisp
  • Fitting fuzziness for every statement at
    every step => CC
  • Logic pervades all algorithms and neural networks
  • rule systems, fuzzy systems (degree of
    fuzziness), pattern recognition, neural networks
    (training uses logical statements)

14
LOGIC VS. GRADIENT ASCENT
  • Gradient ascent maximizes without CC
  • Requires continuous parameters
  • How to take gradients along association?
  • Datum X(n) belongs (or not?) to object m
  • It is a logical statement, discrete,
    non-differentiable
  • Models / ontologies require logic => CC
  • Multivalued logic does not lead to gradient
    ascent
  • Fuzzy logic uses continuous association
    variables, but no parameters to differentiate
  • A new principle is needed to specify gradient
    ascent along fuzzy associations: dynamic logic

15
DYNAMIC LOGIC
  • Dynamic Logic unifies formal and fuzzy logic
  • initial vague or fuzzy concepts dynamically
    evolve into formal-logic or crisp concepts
  • Dynamic logic
  • based on a similarity between models and signals
  • Overcomes CC of model-based recognition
  • fast algorithms

16
ARISTOTLE VS. GÖDEL: logic, forms, and language
  • Aristotle
  • Logic: a supreme way of argument
  • Forms: representations in the mind
  • Form-as-potentiality evolves into
    form-as-actuality
  • Logic is valid for actualities, not for
    potentialities (Dynamic Logic)
  • Thought: language and thinking are closely linked
  • Language contains the necessary uncertainty
  • From Boole to Russell: formalization of logic
  • Logicians eliminated from logic the uncertainty of
    language
  • Hilbert: formalize the rules of mathematical proofs
    forever
  • Gödel (the 1930s)
  • Logic is not consistent
  • Any statement can be proved true and false
  • Aristotle and Alexander the Great

17
OUTLINE
  • Cognition, complexity, and logic
  • Logic does not work, but the mind does
  • The Mind and Knowledge Instinct
  • Neural Modeling Fields and Dynamic Logic
  • Language
  • Integration of cognition and language
  • Higher Cognitive Functions
  • Future directions

18
STRUCTURE OF THE MIND
  • Concepts
  • Models of objects, their relations, and
    situations
  • Evolved to satisfy instincts
  • Instincts
  • Internal sensors (e.g. sugar level in blood)
  • Emotions
  • Neural signals connecting instincts and concepts
  • e.g. a hungry person sees food all around
  • Behavior
  • Models of goals (desires) and muscle-movement
  • Hierarchy
  • Concept-models and behavior-models are organized
    in a loose hierarchy

19
THE KNOWLEDGE INSTINCT
  • Model-concepts always have to be adapted
  • lighting, surroundings, new objects and situations
  • even when there are no concrete bodily needs
  • Instinct for knowledge and understanding
  • Increase similarity between models and the world
  • Emotions related to the knowledge instinct
  • Satisfaction or dissatisfaction
  • change in similarity between models and world
  • Not related to bodily instincts
  • harmony or disharmony (knowledge <-> world):
    aesthetic emotion

20
REASONS FOR PAST LIMITATIONS
  • Human intelligence combines conceptual
    understanding with emotional evaluation
  • A long-standing cultural belief that emotions are
    opposite to thinking and intellect
  • Stay cool to be smart
  • Socrates, Plato, Aristotle
  • Reiterated by founders of Artificial Intelligence:
    Newell, Minsky

21
Neural Modeling Fields (NMF)
  • A mathematical construct modeling the mind
  • Neural synaptic fields represent model-concepts
  • A loose hierarchy of more and more general
    concepts
  • At every level: bottom-up signals, top-down
    signals
  • At every level: concepts, emotions, models,
    behavior
  • Concepts become input signals to the next level

22
NEURAL MODELING FIELDS: basic two-layer mechanism,
from signals to concepts
  • Signals
  • Pixels or samples (from sensor or retina)
  • x(n), n = 1,...,N
  • Concept-Models (objects or situations)
  • Mm(Sm,n), parameters Sm, m = 1,...,M
  • Models predict expected signals from objects
  • Goal: learn object-models and understand signals
    (knowledge instinct)

23
THE KNOWLEDGE INSTINCT
  • The knowledge instinct: maximization of
    similarity between signals and models
  • Similarity between signals and models, L
  • L = l(X) = Π_n l(x(n))
  • l(x(n)) = Σ_m r(m) l(x(n) | Mm(Sm,n))
  • l(x(n) | Mm(Sm,n)) is a conditional similarity
    for x(n) given m
  • n are not independent, M(n) may depend on n
  • CC: L contains M^N items, all associations of
    pixels and models (LOGIC)

24
SIMILARITY
  • Similarity as likelihood
  • l(x(n) | Mm(Sm,n)) = pdf(x(n) | Mm(Sm,n)),
  • a conditional pdf for x(n) given m
  • e.g., Gaussian: pdf(X(n) | m) = G(X(n) | Mm, Cm) =
    (2π)^(-d/2) det(Cm)^(-1/2) exp(-Dmn^T Cm^(-1) Dmn / 2),
    Dmn = X(n) - Mm(n)
  • Note, this is NOT the usual Gaussian assumption
  • deviations from models, D, are random, not the
    data, X
  • multiple models m can model any pdf, not one
    Gaussian model
  • Use for sets of data points
  • Similarity as information
  • ln l(x(n) | Mm(Sm,n)) = |x(n)| ln pdf(x(n) |
    Mm(Sm,n)),
  • a mutual information in model m on data x(n)
  • L is the mutual information in all models about
    all data

25
DYNAMIC LOGIC (DL): non-combinatorial solution
  • Start with a set of signals and unknown
    object-models
  • any parameter values Sm
  • associate object-models with their contents (signal
    composition)
  • (1) f(m|n) = r(m) l(n|m) / Σ_m' r(m') l(n|m')
  • Improve parameter estimation
  • (2) Sm = Sm + a Σ_n f(m|n) [∂ln l(n|m)/∂Mm] [∂Mm/∂Sm]
  • (a determines speed of convergence)
  • learn signal-contents of objects
  • Continue iterations (1)-(2). Theorem: NMF is a
    converging system
  • - similarity increases on each iteration
  • - aesthetic emotion is positive during learning

26
OUTLINE
  • Cognition, complexity, and logic
  • Logic does not work, but the mind does
  • The Mind and Knowledge Instinct
  • Neural Modeling Fields and Dynamic Logic
  • Application examples
  • Language
  • Integration of cognition and language
  • Higher Cognitive Functions
  • Future directions

27
APPLICATIONS
  • Many applications have been developed
  • Government
  • Medical
  • Commercial (about 25 companies use this
    technology)
  • Sensor signals processing and object recognition
  • Variety of sensors
  • Financial market predictions
  • Market crash on 9/11 predicted a week ahead
  • Internet search engines
  • Based on text understanding
  • Evolving ontologies for Semantic Web
  • Every application needs models
  • Future: self-evolving models, integrated cognition
    and language

28
APPLICATION 1 - CLUSTERING
  • Find natural groups, or clusters, in data
  • Use Gaussian pdf and simple models
  • l(n|m) = (2π)^(-d/2) det(Cm)^(-1/2)
    exp(-Dmn^T Cm^(-1) Dmn / 2), Dmn = X(n) - Mm(n)
  • Mm(n) = Mm: each model has just 1 parameter, Sm =
    Mm
  • This is clustering with a Gaussian Mixture Model
  • For complex l(n|m), derivatives can be taken
    numerically
  • For simple l(n|m), derivatives can be taken
    manually
  • Simplification, not essential
  • Simplify the parameter estimation equation for
    Gaussian pdf and simple models
  • ∂ln l(n|m)/∂Mm = ∂(-Dmn^T Cm^(-1) Dmn / 2)/∂Mm =
    Cm^(-1) Dmn (C is symmetric)
  • Mm = Mm + a Σ_n f(m|n) Cm^(-1) Dmn

29
EXERCISE (1) HOME WORK
  • Code a simple NMF/DL for Gaussian Mixture
    clustering
  • (1) Simulate data
  • Specify true parameters, say
  • Dimensionality, d = 2
  • Number of classes, 3: m = 1, 2, 3
  • Rates, rm = 0.2, 0.3, 0.5
  • Number of samples, N = 100; Nm = N·rm = 20, 30, 50
  • Means, Mm: 2-d vectors, (0.1, 0.1), (0.2, 0.8),
    (0.8, 0.2)
  • Covariances, Cm: unit matrices
  • Call a function that generates Gaussian data
  • (2) Run NMF/DL code to estimate parameters (rm,
    Mm, Cm) and association probabilities f(m|n)
  • Initiate parameters to any values, say each r =
    1/3; means should not be the same, say M = (0,0),
    (0,1), (1,1); covariances should be initiated to
    larger values than uncertainties in the means
    (squared), say each C = diag(2,2).
  • Run iterations: estimate f(m|n), estimate
    parameters, until, say, (changes in M) < 0.01
  • (3) Plot results (2-d plots)
  • Plot data for each class in different
    colors/symbols
  • Plot means
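A minimal NumPy sketch of steps (1) and (2) of this homework (the iteration is mathematically the EM algorithm for a Gaussian mixture; function names, the random seed, and the small covariance regularization term are choices of this sketch, and the plotting step (3) is omitted):

```python
import numpy as np

def simulate_data(rng, N=100):
    """Step (1): 2-d Gaussian data with the true parameters from the slide."""
    rates = np.array([0.2, 0.3, 0.5])
    means = np.array([[0.1, 0.1], [0.2, 0.8], [0.8, 0.2]])
    counts = (N * rates).astype(int)                     # 20, 30, 50 samples
    return np.vstack([rng.normal(m, 1.0, size=(c, 2))    # unit covariances
                      for m, c in zip(means, counts)])

def nmf_dl_clustering(X, M=3, tol=0.01, max_iter=200):
    """Step (2): iterate association (1) and parameter estimation (2)."""
    N, d = X.shape
    r = np.full(M, 1.0 / M)                                   # rates r_m
    mu = np.array([[0.0, 0.0], [0.0, 1.0], [1.0, 1.0]])       # distinct means
    C = np.array([np.diag([2.0, 2.0]) for _ in range(M)])     # broad covariances
    for _ in range(max_iter):
        # (1) association probabilities f(m|n) = r_m l(n|m) / sum_m' r_m' l(n|m')
        l = np.empty((M, N))
        for m in range(M):
            D = X - mu[m]
            Cinv = np.linalg.inv(C[m])
            norm = (2 * np.pi) ** (-d / 2) / np.sqrt(np.linalg.det(C[m]))
            l[m] = norm * np.exp(-0.5 * np.einsum('nd,de,ne->n', D, Cinv, D))
        f = r[:, None] * l
        f /= f.sum(axis=0, keepdims=True)
        # (2) maximum-likelihood parameter updates
        mu_old = mu.copy()
        w = f.sum(axis=1)
        r = w / N
        mu = (f @ X) / w[:, None]
        for m in range(M):
            D = X - mu[m]
            C[m] = (f[m][:, None] * D).T @ D / w[m] + 1e-6 * np.eye(d)
        if np.abs(mu - mu_old).max() < tol:    # stop when changes in M < 0.01
            break
    return r, mu, C, f

rng = np.random.default_rng(0)
X = simulate_data(rng)
r, mu, C, f = nmf_dl_clustering(X)
```

Note that with unit covariances these three clusters overlap heavily, so the estimated means will be noisy; the point of the exercise is the converging iteration, not clean separation.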

30
APPLICATION 2 - TRACKING
  • 1) Example
  • 2) Complexity of computations
  • Exercise
  • Cramer-Rao Bounds

31
Example 2: GMTI Tracking and Detection Below
Clutter
DL starts with uncertain knowledge and converges
rapidly on the exact solution
18 dB improvement
32
EXAMPLE (2) note page
  • Detection and tracking targets below clutter: (a)
    true track positions in a 0.5km x 0.5km data set;
    (b) actual data available for detection and
    tracking (signal is below clutter,
    signal-to-clutter ratio is about 2dB for
    amplitude and 3dB for Doppler; 6 scans are shown
    on top of each other, each contains 500 data
    points). Dynamic logic operation: (c) an initial
    fuzzy model, the fuzziness corresponds to the
    uncertainty of knowledge; (d) to (h) show
    increasingly improved models at various
    iterations (total of 20 iterations). Between (c)
    and (d) the algorithm fits the data with one
    model, and uncertainty is somewhat reduced. There
    are two types of models: one uniform model
    describing clutter (it is not shown), and linear
    track models with large uncertainty; the number of
    track models, locations, and velocities are
    estimated from the data. Between (d) and (e) the
    algorithm tried to fit the data with more than
    one track-model and decided that it needs two
    models to understand the content of the data.
    Fitting with 2 tracks continues till (f); between
    (f) and (g) a third track is added. Iterations
    stopped at (h), when similarity stopped
    increasing. Detected tracks closely correspond to
    the truth (a). Complexity of this solution is
    low, about 10^6 operations. Solving this problem
    by MHT (template matching with evaluation of
    combinations of various associations; this is the
    standard state-of-the-art) would take about M^N =
    10^1700 operations, unsolvable.

33
TRACKING AND DETECTION BELOW CLUTTER (movie,
same as above)
DL starts with uncertain knowledge and, similarly
to the human mind, does not sort through all
possibilities, but converges rapidly on the exact
solution
3 targets, 6 scans, signal-to-clutter S/C =
-3.0 dB
34
TRACKING EXAMPLE: complexity and improvement
  • Technical difficulty
  • Signal/Clutter = -3 dB; standard tracking
    requires 15 dB
  • Computations, standard hypothesis testing:
    10^1700, unsolvable
  • Solved by Dynamic Logic
  • Computations: 2x10^7
  • Improvement: 18 dB

35
EXERCISE (2)
  • Develop NMF/DL for target tracking
  • Step 1 (and the only one)
  • Develop models for tracking
  • Model where do you expect to see the target
  • Parameters, Sm position xm and velocity vm
  • Mm(xm, vm , n) xm vm tn
  • Time, tn is not a parameter of the model, but
    data
  • More complex Keplerian trajectories
  • Note typical data are 2-d (Q,F) or 3-d (R,Q,F)
  • Models are 3-d (or 2-d, if sufficient)
  • 3-d models might be not estimatable from 2-d
    data
  • linear track in (x,y,z) over a short time appears
    linear in (Q,F)
  • then, use 2-d models, or additional data, or
    longer observation period

36
CRAMER-RAO BOUND (CRB)
  • Can a particular set of models be estimated from
    a particular (limited) set of data?
  • The question is not trivial
  • A simple rule-of-thumb: N(data points) >
    10·S(parameters)
  • In addition, use your mind: is there enough
    information in the data?
  • CRB: minimal estimation error (best possible
    estimation) for any algorithm, or neural networks,
    or ...
  • When there are many data points, CRB is a good
    measure (ML NMF)
  • When there are few data points (e.g. financial
    prediction) it might be difficult to assess
    performance
  • Actual errors >> CRB
  • Simple well-known CRB for averaging several
    measurements
  • st.dev(n) = st.dev(1)/√n
  • Complex CRB for tracking:
  • Perlovsky, L.I. (1997a). Cramer-Rao Bound for
    Tracking in Clutter and Tracking Multiple
    Objects. Pattern Recognition Letters, 18(3),
    pp. 283-288.
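The simple averaging bound st.dev(n) = st.dev(1)/√n is easy to check by Monte Carlo (the seed and sample sizes below are arbitrary choices of this sketch):

```python
import numpy as np

# Monte Carlo check of the averaging bound st.dev(n) = st.dev(1)/sqrt(n):
# the standard deviation of the mean of n unit-variance measurements.
rng = np.random.default_rng(1)
n, trials = 25, 20000
sample_means = rng.normal(0.0, 1.0, size=(trials, n)).mean(axis=1)
empirical = sample_means.std()
predicted = 1.0 / np.sqrt(n)      # st.dev(1)/sqrt(25) = 0.2
print(round(empirical, 3), predicted)
```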

37
EXERCISE (2 cont.) HOMEWORK
  • Homework: code NMF/DL for tracking, simulate
    data, run the code, and plot results
  • When simulating data, add sensor error and
    clutter
  • track + sensor errors: X(n,m) = xm + vm·tn +
    em,n
  • sensor error: use a sensor error model (or a simple
    Gaussian) for em,n
  • clutter: X(n, m=1) = ec,n; use a clutter model for
    ec,n, or a simple uniform or Gaussian; simulate the
    required number of clutter data points
  • Do not forget to add the clutter model to your NMF/DL
    code
  • l(n|m=clutter) = const; the only parameter, rc =
    expected proportion of clutter
  • Note: the resulting system is track-before-detect,
    or more accurately concurrent detection and
    tracking
  • does not require the standard procedure
  • (1) detect, (2) associate, (3) track
  • association and detection are obtained together
    with tracking
  • DL requires no combinatorial searches, which
    often limit track-before-detect performance
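A minimal 1-d sketch of this homework (one linear track model plus a uniform clutter model on [0,1]; the simulation parameters, variable names, and initialization are choices of this sketch, not from the slides). Convergence to the true track from every initialization is not guaranteed; dynamic logic therefore starts with a vague, large-variance track model:

```python
import numpy as np

rng = np.random.default_rng(2)

# --- Simulate data: linear track x = xm + vm*t plus uniform clutter, 6 scans ---
t_scan = np.arange(6.0)
x0_true, v_true, sensor_err = 0.15, 0.10, 0.01
track_x = x0_true + v_true * t_scan + rng.normal(0.0, sensor_err, 6)
clutter_per_scan = 10
t = np.concatenate([t_scan, np.repeat(t_scan, clutter_per_scan)])
x = np.concatenate([track_x, rng.uniform(0.0, 1.0, 6 * clutter_per_scan)])

# --- NMF/DL iterations: l(n|clutter) = const (uniform density 1 on [0,1]) ---
x0, v, s2 = 0.5, 0.0, 0.3**2        # vague initial track model
r_track = 0.5                        # initial expected proportion of track points
for _ in range(200):
    resid = x - (x0 + v * t)
    l_track = np.exp(-0.5 * resid**2 / s2) / np.sqrt(2.0 * np.pi * s2)
    # association probabilities f(track|n) via Bayes rule
    f = r_track * l_track / (r_track * l_track + (1.0 - r_track) * 1.0)
    W = f.sum()
    if W < 1e-9:                     # model drifted off all data; stop
        break
    # weighted least-squares update of (x0, v); weighted variance update of s2
    tb, xb = (f * t).sum() / W, (f * x).sum() / W
    v = (f * (t - tb) * (x - xb)).sum() / (f * (t - tb)**2).sum()
    x0 = xb - v * tb
    s2 = max((f * (x - x0 - v * t)**2).sum() / W, 1e-8)
    r_track = W / x.size
```

Detection and association fall out of the same iteration: after convergence, f gives the probability that each sample belongs to the track rather than to clutter.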

38
APPLICATION 3
  • FINDING PATTERNS IN IMAGES

39
IMAGE PATTERN BELOW NOISE
[Figure: object image alone, and object image + clutter (axes x, y)]
40
PRIOR STATE-OF-THE-ART: Computational complexity
Multiple Hypothesis Testing (MHT) approach: try
all possible ways of fitting the model to the data.
For a 100 x 100 pixel image:

Number of Objects | Number of Computations
        1         |        10^10
        2         |        10^20
        3         |        10^30
41
NMF MODELS
  • Information similarity measure
  • ln l(x(n) | Mm(Sm,n)) = |x(n)| ln pdf(x(n) |
    Mm(Sm,n))
  • n = (nx, ny)
  • Clutter concept-model (m=1)
  • pdf(X(n) | 1) = r1
  • Object concept-model (m=2)
  • pdf(x(n) | Mm(Sm,n)) = r2 G(X(n) | Mm(n,k), Cm)
  • Mm(n,k) = n0 + a·(k^2, k) (note: k, K require
    no estimation)

42
ONE PATTERN BELOW CLUTTER
[Figure: one pattern below clutter (axes X, Y); SNR = -2.0 dB]
43
DYNAMIC LOGIC WORKING
DL starts with uncertain knowledge and, similarly
to the human mind, converges rapidly on the exact
solution
  • Object invisible to the human eye
  • By integrating data with the knowledge-model, DL
    finds an object below noise

[Figure axes: y (m) Range, x (m) Cross-range]
44
MULTIPLE PATTERNS BELOW CLUTTER
Three objects in noise:

          object 1   object 2   object 3
SCR       -0.70 dB   -1.98 dB   -0.73 dB

[Figure: 3-object image, and 3-object image + clutter (axes x, y)]
45
IMAGE PATTERNS BELOW CLUTTER (dynamic logic
iterations; see note text)
46
IMAGE PATTERNS BELOW CLUTTER (dynamic logic
iterations; see note text)
Logical complexity: M^N = 10^5000, unsolvable; DL
complexity: 10^7; S/C improvement: 16 dB
47
APPLICATION (3) note page
  • "Smile" and "frown" patterns: (a) true "smile"
    and "frown" patterns shown without clutter; (b)
    actual image available for recognition (signal is
    below clutter, signal-to-clutter ratio is between
    -2dB and -0.7dB); (c) an initial fuzzy model, the
    fuzziness corresponds to the uncertainty of
    knowledge; (d) to (h) show increasingly improved
    models at various iterations (total of 22
    iterations). Between (d) and (e) the algorithm
    tried to fit the data with more than one model
    and decided that it needs three models to
    understand the content of the data. There are
    several types of models: one uniform model
    describing the clutter (it is not shown), and a
    variable number of blob models and parabolic
    models, whose number, locations, and curvatures
    are estimated from the data. Until about (g) the
    algorithm "thought" in terms of simple blob
    models; at (g) and beyond, the algorithm decided
    that it needs more complex parabolic models to
    describe the data. Iterations stopped at (h),
    when similarity stopped increasing. Complexity of
    this solution is moderate, about 10^10 operations.
    Solving this problem by template matching would
    take a prohibitive 10^30 to 10^40 operations. (This
    example is discussed in more detail in [i].)
    [i] Linnehan, R., Mutz, C., Perlovsky, L.I.,
    Weijers, B., Schindler, J., Brockett, R. (2003).
    Detection of Patterns Below Clutter in Images.
    Int. Conf. on Integration of Knowledge Intensive
    Multi-Agent Systems, Cambridge, MA, Oct. 1-3, 2003.

48
MULTIPLE TARGET DETECTION DL WORKING EXAMPLE
DL starts with uncertain knowledge and, similarly
to the human mind, does not sort through all
possibilities like an MHT, but converges rapidly
on the exact solution

[Figure axes: x, y]
49
COMPUTATIONAL REQUIREMENTS COMPARED
Dynamic Logic (DL) vs. Classical
State-of-the-art Multiple Hypothesis Testing
(MHT). Based on a 100 x 100 pixel image:

Number of Objects | Number of Computations, DL vs. MHT
        1         |   10^8   vs. 10^10
        2         |  2x10^8  vs. 10^20
        3         |  3x10^8  vs. 10^30

  • Previously un-computable (10^30) can now be
    computed (3x10^8)
  • This pertains to many complex
    information-finding problems

50
APPLICATION 4
  • SENSOR FUSION
  • Concurrent fusion, navigation, and detection
  • below clutter

51
SENSOR FUSION
  • The difficult part of sensor fusion is
    association of data among sensors
  • Which sample in one sensor corresponds to which
    sample in another sensor?
  • If objects can be detected in each sensor
    individually
  • Still the problem of data association remains
  • Sometimes it is solved through coordinate
    estimation
  • If 3-d coordinates can be estimated reliably in
    each sensor
  • Sometimes it is solved through tracking
  • If objects can be reliably tracked in each
    sensor => 3-d coordinates
  • If objects cannot be detected in each sensor
    individually
  • We have to find the best possible association
    among multiple samples
  • This is the most difficult case: concurrent
    detection, tracking, and fusion

52
NMF/DL SENSOR FUSION
  • NMF/DL for sensor fusion requires no new
    conceptual development
  • Multiple sensor data require multiple sensor
    models
  • Data: n -> (s,n); X(n) -> X(s,n)
  • Models: Mm(n) -> Mm(s,n)
  • pdf(n|m) is a product over sensors
  • This is a standard probabilistic procedure;
    another sensor is like another dimension
  • pdf(m|n) -> pdf(m|s,n)
  • Note: this solves the difficult problem of
    concurrent detection, tracking, and fusion
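The "product over sensors" step can be sketched in a few lines (the model prediction M_m, noise level, and array shapes below are illustrative assumptions of this sketch, and sensor noises are taken as independent):

```python
import numpy as np

# Fusion sketch: data X(s,n) from S sensors; the conditional similarity for
# model m is a product of per-sensor similarities (independent sensor noise).
rng = np.random.default_rng(5)
S, N = 3, 4
M_m = np.array([0.2, 0.4, 0.6])      # model prediction per sensor (assumed)
sigma = 0.1
X = M_m[:, None] + sigma * rng.normal(size=(S, N))   # S sensors x N samples

# per-sensor Gaussian similarities l(s,n|m), then product over sensors s:
l_s = np.exp(-0.5 * (X - M_m[:, None])**2 / sigma**2) / np.sqrt(2 * np.pi * sigma**2)
l_fused = l_s.prod(axis=0)           # l(n|m) = prod_s l(s,n|m)
```

Because the fused similarity is a per-sample product, complexity grows linearly with the number of sensors, which is what makes concurrent detection, tracking, and fusion tractable here.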

53
Source: UAS Roadmap 2005-2030
UNCLASSIFIED
54
CONCURRENT NAVIGATION, FUSION, AND DETECTION
  • multiple target detection and localization based
    on data from multiple micro-UAVs
  • A complex case
  • detection requires fusion (cannot be done with
    one sensor)
  • fusion requires exact target position estimation
    in 3-D
  • target position can be estimated by triangulation
    from multiple views
  • this requires exact UAV position
  • GPS is not sufficient
  • UAV position - by triangulation relative to
    known targets
  • therefore target detection and localization is
    performed concurrently with UAV navigation and
    localization, and fusion of information from
    multiple UAVs
  • Unsolvable using standard methods. Dynamic logic
    can solve it because computational complexity
    scales linearly with the number of sensors and
    targets

55
GEOMETRY: MULTIPLE TARGETS, MULTIPLE UAVS
UAV m: Xm = X0m + Vm·t, where Xm = (Xm, Ym, Zm)
UAV 1: X1 = X01 + V1·t, where X1 = (X1, Y1, Z1)
56
CONDITIONAL SIMILARITIES (pdf) FOR TARGET k
Data from UAV m, sample number n, where ßnm =
signature position and fnm = classification
feature vector.
Similarity for the data, given target k:
[equations not transcribed: signature-position term
and classification-features term]
Note: there is also a pdf for a single clutter
component, pdf(wnm | k=0), which is uniform in ßnm
and Gaussian in fnm.
57
Data Model and Likelihood Similarity
Total pdf of the data samples is the summation of
conditional pdfs (summation over targets plus
clutter): a mixture model. Parameters include
classification feature parameters, UAV parameters,
and target parameters.
58
Concurrent Parameter Estimation / Signature
Association (NMF iterations)
FIND THE SOLUTION FOR THE SET OF BEST PARAMETERS BY
ITERATING BETWEEN
parameter estimation and
association probability estimation (Bayes rule):
the probability that sample wnm was generated by
target k.
Note 1: bracket notation. Note 2: proven to
converge (cf. the EM algorithm). Note 3: the
minimum-MSE solution incorporates GPS measurements.
59
Sensor 1 (of 3) Models Evolve to Locate Target
Tracks in Image Data
60
Sensor 2 (of 3) Models Evolve to Locate Target
Tracks in Image Data
61
Sensor 3 (of 3) Models Evolve to Locate Target
Tracks in Image Data
62
NAVIGATION, FUSION, TRACKING, AND DETECTION (this
is the basis for the previous 3 figures, all
fused in x,y,z coordinates; double-click on the
blob to play the movie)
63
Model Parameters Iteratively Adapt to Locate the
Targets
Error vs. iteration (4 targets)
Estimated Target Position vs. iteration (4
targets)
64
Parameter Estimation Errors Decrease with
Increasing Number of UAVs in the Swarm
Error in parameter estimates vs. clutter level
and number of UAVs in the swarm:
target position,
UAV position
(Note: results are based upon Monte Carlo
simulations with synthetic data)
Error falls off as 1/√M, where M = number of UAVs
in the swarm
65
APPLICATION 5
  • DETECTION IN SEQUENCES OF IMAGES

66
DETECTION IN A SEQUENCE OF IMAGES
Signature, high noise level (SNR = -6 dB)
Signature, low noise level (SNR = 25 dB)
(the signature is present, but obscured by noise)
67
DETECTION IN IMAGE SEQUENCE: TEN ROTATION FRAMES
WERE USED
Iteration 10
Iteration 100
Iteration 400
  • Upon convergence of the model, important
    parameters are estimated, including center of
    rotation, which will next be used for spectrum
    estimation.
  • Four model components were used, including a
    uniform background component. Only one component
    became associated with point source.

Compare with Measured Image (w/o noise)
Iteration 600
68
TARGET SIGNATURE
69
APPLICATION 6
  • Radar Imaging through walls
  • - Inverse scattering problem
  • - Standard radar imaging algorithms (SAR) do not
    work because of multi-paths, refractions, clutter

70
SCENARIO
71
RADAR IMAGING THROUGH WALLS
Standard SAR imaging does not work because of
refraction, multi-paths, and clutter.
Estimated model; work in progress. Remains:
- increase convergence area
- increase complexity of scenario
- adaptive control of sensors
72
DYNAMIC LOGIC / NMF
INTEGRATED INFORMATION: objects,
relations, situations, behavior
Dynamic Logic: combining conceptual analysis
with emotional evaluation
MODELS: objects, relations, situations,
behavioral
Data and Signals
73
CLASSICAL METHODOLOGY: no closure
Result: conceptual objects
  • MODELS/templates
  • objects, sensors
  • physical models

Recognition
Input: world/scene
signals
Sensors / Effectors
74
NMF closure: basic two-layer hierarchy, signals
and concepts
Result: conceptual objects
Attention / Action
Correspondence / Similarity measures
signals Sim.signals
  • MODELS
  • objects, sensors
  • physical models

signals
Sensors / Effectors
Input: world/scene
75
APPLICATION 7
  • Prediction
  • - Financial prediction

76
PREDICTION
  • Simple linear regression
  • y(x) = A·x + b
  • Multi-dimensional regression: y, x, b are vectors,
    A is a matrix
  • Problem: given y, x, estimate A, b
  • Solution to linear regression (well known)
  • Estimate means <y>, <x>, and the x-y covariance
    matrices C
  • A = Cyx Cxx^(-1); b = <y> - A<x>
  • Difficulties
  • Non-linear y(x), unknown shape
  • y(x) changes regime (from up to down)
  • and this is the most important event (financial
    prediction)
  • Not sufficient data to estimate C
  • required: 10·dxy^3 data points, or more
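The covariance-based solution A = Cyx Cxx^-1, b = <y> - A<x> can be checked on simulated data (the true parameters, noise level, and sample size below are arbitrary choices of this sketch):

```python
import numpy as np

rng = np.random.default_rng(3)

# Simulate a 2-d multi-dimensional regression y = A x + b + noise
# (A_true, b_true, and the noise level are assumptions of this sketch)
A_true = np.array([[1.0, -2.0], [0.5, 3.0]])
b_true = np.array([0.3, -0.1])
x = rng.normal(size=(500, 2))
y = x @ A_true.T + b_true + 0.01 * rng.normal(size=(500, 2))

# Estimate means <x>, <y> and covariances, then A = Cyx Cxx^-1, b = <y> - A<x>
xm, ym = x.mean(axis=0), y.mean(axis=0)
dx, dy = x - xm, y - ym
Cxx = dx.T @ dx / len(x)
Cyx = dy.T @ dx / len(x)
A = Cyx @ np.linalg.inv(Cxx)
b = ym - A @ xm
```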

77
NMF/DL PREDICTION
  • General non-linear regression (GNLR)
  • y(x) = Σ_m f(m|n) ym(x) = Σ_m f(m|n) (Am·x + bm)
  • Am and bm are estimated similarly to A, b in linear
    regression, with the
    following change: all sums Σ_n (·) are changed into
    Σ_n f(m|n) (·)
  • For prediction, we remember that f(m|n) = f(m|x)
  • Interpretation
  • m are regimes or processes; f(m|x) determines the
    influence of regime m at point x (probability of
    process m being active)
  • Applications
  • Non-linear y(x), unknown shape
  • Detection of y(x) regime change (e.g. financial
    prediction or control)
  • Minimal number of parameters: 2 linear
    regressions; f(m|n) are functions of the same
    parameters
  • Efficient estimation (ML)
  • Potential for the fastest possible detection of a
    regime change

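The GNLR estimate can be sketched as an EM-style loop in which every sum of the linear-regression solution becomes an f(m|n)-weighted sum. The 1-D version below is a sketch under assumed conventions (Gaussian residuals, function name `gnlr_fit`), not the authors' implementation:

```python
import numpy as np

def gnlr_fit(x, y, M=2, iters=100, seed=0):
    """EM-style sketch of GNLR: y(x) = sum_m f(m|x) (a_m x + b_m), 1-D."""
    rng = np.random.default_rng(seed)
    N = len(x)
    a = rng.normal(size=M)
    b = rng.normal(size=M)
    r = np.full(M, 1.0 / M)           # regime priors r(m)
    s2 = np.full(M, y.var() + 1e-6)   # residual variance per regime
    for _ in range(iters):
        # E-step: f(m|n) ~ r(m) * Gaussian likelihood of the residual
        resid = y[None, :] - (a[:, None] * x[None, :] + b[:, None])
        ll = -0.5 * resid**2 / s2[:, None] - 0.5 * np.log(2 * np.pi * s2[:, None])
        w = r[:, None] * np.exp(ll - ll.max(axis=0))
        f = w / w.sum(axis=0)
        # M-step: linear-regression formulas with sums -> f-weighted sums
        for m in range(M):
            fw = f[m]
            W = fw.sum() + 1e-12
            mx = (fw * x).sum() / W
            my = (fw * y).sum() / W
            cxx = (fw * (x - mx)**2).sum() / W + 1e-12
            cyx = (fw * (x - mx) * (y - my)).sum() / W
            a[m] = cyx / cxx
            b[m] = my - a[m] * mx
            s2[m] = (fw * (y - a[m] * x - b[m])**2).sum() / W + 1e-9
            r[m] = W / N
    return a, b, r, f
```

On data with two linear regimes the soft weights f(m|n) separate the regimes, which is what makes regime-change detection possible in this framework.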
78
FINANCIAL PREDICTION Efficient Market Hypothesis
  • Efficient market hypothesis, strong form
  • no method of data processing or market analysis will bring an advantage over average market performance (only illegal trading on nonpublic material information will get one ahead of the market)
  • Reasoning: too many market participants will try the same tricks
  • Efficient market hypothesis, weak form
  • to get ahead of average market performance one has to do something better than the rest of the world: better math methods, better analysis, or something else (it is possible to get ahead of the market legally)

79
FINANCIAL PREDICTION: BASICS OF MATH. PREDICTION
  • Basic idea: train from t1 to t2, predict and trade on t2+1; then increment t1 → t1+1, t2 → t2+1
  • The number of data points between t1 and t2 should be >> the number of parameters
  • Decide on the frequency of trading; it should correspond to your psychological makeup and practical situation
  • E.g. day-trading has more potential for making (or losing) a lot of money fast, but requires a full-time commitment
  • Get past data, split into 3 sets: (1) development, (2) testing, (3) final test (best done in real time, with paper trades)
  • After much effort on (1), try on (2); if it works, try on (3)

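The train/trade/increment protocol above amounts to a walk-forward split; a minimal sketch (the function name and the (indices, trade-point) output format are assumptions):

```python
def walk_forward(n_points, train_len):
    """Yield (training indices, trading index) pairs: train on the window
    [t1, t1 + train_len), trade on the next point, then slide both ends
    of the window forward by one step."""
    for t1 in range(n_points - train_len):
        t2 = t1 + train_len
        yield list(range(t1, t2)), t2
```

This guarantees the trading point always lies strictly after the training window, which is exactly the look-ahead discipline the next slide warns about.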
80
DETAILS OF FINANCIAL PREDICTION
  • Develop the best mathematical technique for market prediction
  • Takes a lot of effort
  • Test in up and down markets
  • If your simulated portfolio goes up and up smoothly (you are constantly making money in computer simulation), look for an error
  • Simple errors in computing return
  • Include spread and commission in return
  • Illegal training: using (t2+1) information for training (or trading)
  • Technical details
  • What to optimize in training/development?
  • Performance (return or cumulative return): ROR = (end − start) / start
  • Sharpe (return/risk ratio): Sh = (ROR − ROR_market) / std(ROR_market)
  • Sh > 1: ok; Sh > 3: look for errors (beware)
  • Include a penalizing factor for free parameters (Akaike, Statistical Learning Theory (Vapnik), ridge regression)
  • Ridge regression: min Σt (y(t) − p(t))² + a Σt (p(t) − ⟨p(t)⟩)²

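The ROR and Sharpe-style quantities above are straightforward to code; a minimal numpy sketch (computing Sh from per-period returns is an assumption, since the slide does not fix the exact convention):

```python
import numpy as np

def ror(equity):
    """Rate of return: ROR = (end - start) / start."""
    return (equity[-1] - equity[0]) / equity[0]

def excess_sharpe(returns, market_returns):
    """Sh = (ROR - ROR_market) / std(ROR_market), from per-period returns.
    Per the slide's warning, Sh persistently above ~3 in simulation is a
    signal to hunt for bugs (look-ahead, missing spread/commission)."""
    returns = np.asarray(returns, dtype=float)
    market_returns = np.asarray(market_returns, dtype=float)
    return (returns.mean() - market_returns.mean()) / market_returns.std()
```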
81
FINANCIAL MARKET PREDICTION
82
BIOINFORMATICS
  • Many potential applications
  • combinatorial complexity of existing algorithms
  • Drug design
  • Diagnostics: which gene / protein is responsible
  • Pattern recognition
  • Identify a pattern of genes responsible for the
    condition
  • Relate sequence to function
  • Protein folding (shape)
  • Relate shape to conditions
  • Many basic problems are solved sub-optimally
    (combinatorial complexity)
  • Alignment
  • Dynamic system of interacting genes / proteins
  • Characterize
  • Relate to conditions

83
NMF/DL FOR COGNITIONSUMMARY
  • Cognition
  • Integrating knowledge and data / signals
  • Evolution from vague to crisp
  • Knowledge: concepts, models
  • Knowledge instinct: maximize similarity(models, data)
  • Aesthetic emotion: change in similarity
  • Emotional intelligence
  • combination of conceptual knowledge and emotional
    evaluation
  • Applications
  • Recognition, tracking, fusion, prediction

84
OUTLINE
  • Cognition, complexity, and logic
  • The Mind and Knowledge Instinct
  • Language
  • Integration of cognition and language
  • Higher Cognitive Functions
  • Future directions

85
LANGUAGE
  • Integration of language-data and language-models
  • Speech, text
  • Language acquisition / learning
  • Search engines
  • Language is similar to cognition
  • specific language data
  • NMF of language

86
LANGUAGE ACQUISITIONAND COMPLEXITY
  • Chomsky linguistics should study the mind
    mechanisms of language (1957)
  • Chomsky's language mechanisms
  • 1957: rule-based
  • 1981: model-based (rules and parameters)
  • Combinatorial complexity
  • For the same reason as all rule-based and
    model-based methods

87
HIERARCHY OF LANGUAGE
  • Speech is a (loose) hierarchy of objects
  • Words are made of language sounds, phonemes
  • Phrases are made of words
  • Text is a (loose) hierarchy of objects
  • Letters, words, phrases, …
  • Meanings of language objects
  • Language objects acquire meanings in the
    hierarchy
  • Phonemes acquire meaning in words
  • Words acquire meaning in phrases
  • Phrases acquire meaning in paragraphs, …

88
APPLICATION SEARCH ENGINE BASED ON UNDERSTANDING
  • Goal-instinct
  • Find conceptual similarity between a query and
    text
  • Analyze query and text in terms of concepts
  • Simple non-adaptive techniques
  • By keywords
  • By key-sentences: sets of words
  • Define a sequence of words (bag of words)
  • Compute coincidences between the bag and the
    document
  • Instead of the document use chunks of 7 or 10
    words
  • How to learn useful sentences?

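The chunk-based coincidence counting described above can be sketched as follows (the function name and the first-best tie-breaking are assumptions):

```python
def window_score(query_words, doc_words, window=10):
    """Count keyword coincidences between a query 'bag of words' and each
    sliding chunk of the document (chunks of 7 or 10 words instead of the
    whole document, per the slide).  Returns (best score, best position)."""
    bag = set(query_words)
    best_score, best_pos = -1, 0
    for i in range(max(1, len(doc_words) - window + 1)):
        chunk = doc_words[i:i + window]
        score = sum(1 for w in chunk if w in bag)
        if score > best_score:
            best_score, best_pos = score, i
    return best_score, best_pos
```

Scoring short chunks instead of whole documents rewards locality: query words that co-occur closely in the text count more than the same words scattered far apart.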
89
APPLICATION LEARN LANGUAGE UNDERSTANDING
  • Goal-instinct
  • Find conceptual similarity between a query and
    text
  • Analyze query and text in terms of concepts
  • Learn and identify model-concepts in texts
  • Words from a dictionary
  • Hierarchy
  • phrases made up of words, paragraphs of phrases
  • Language instinct knowledge instinct

90
OUTLINE
  • Cognition, complexity, and logic
  • The Mind and Knowledge Instinct
  • Language
  • - NMF of language
  • Integration of cognition and language
  • Higher Cognitive Functions
  • Future directions

91
NMF OF LANGUAGE: basic two-layer hierarchy, words
and phrase-concepts
  • Words and Concept-Models
  • words w(n), n = 1, …, N
  • model-phrases Mm(Sm, n), parameters Sm, m = 1, …, M
  • Simplistic bag-model: Mm(Sm, n) = Sm = {nm, wm,1, wm,2, …, wm,s}
  • nm is the position of the model center in the text, s/2 ≤ nm ≤ (N − s/2)
  • Goal: learn phrase-models
  • associate sequences n with models m and find parameters Sm
  • learn word-contents of phrases (and grammatical relationships)
  • Maximize similarity between words and models, L
  • Likelihood L = Πn l(w(n))
  • l(w(n)) = Σm r(m) l(w(n) | Mm(Sm, n))
  • CC: L contains M^N items, all associations of words and models

92
DYNAMIC LOGIC (non-combinatorial solution)
  • Start with a large body of text and unknown phrase-models
  • any parameter values Sm
  • associate fuzzy phrase-models with their contents (words):
  • (1) f(m|n) = r(m) l(n|m) / Σm' r(m') l(n|m')
  • Improve parameter estimation:
  • (2) Sm = Sm + a Σn f(m|n) [∂ln l(n|m) / ∂Mm] [∂Mm / ∂Sm]
  • (a determines the speed of convergence)
  • learn word-contents of phrases (and grammatical relationships)
  • Continue iterations (1)-(2). Theorem: NMF converges
  • - similarity increases on each iteration
  • - aesthetic emotion is positive during learning

93
DIFFERENTIATION OF QUALITATIVE FUNCTIONS
  • Differentiation of the bag-model
  • The bag-model is non-differentiable
  • This is a key point: learning non-differentiable models requires sorting through combinations
  • which leads to combinatorial complexity
  • How to differentiate non-continuous, non-differentiable, qualitative functions?
  • Also essential for the hierarchy
  • Higher levels are made up of bags of lower-level concepts

94
QUALITATIVE DERIVATIVE
  • Define fuzzy conditional partial similarity as
  • l(n|m) = Σs G(e(n, m, s), σm)
  • e(n, m, s) is the distance between n and the word wm,s in Mm(n) that matches w(n), counted in number of words (if there is no match, e(n, m, s) = S/2 + 1)
  • Fuzziness is determined by the phrase-model length S and the matching st. dev. σm = S/3
  • Parameter estimation
  • Initialize with a large S and any values for wm,s
  • On every iteration compute e(n, m, s) and σm
  • From every model delete the least likely word
  • Reduce the phrase length S by 1
  • Thus, the most likely words are gradually selected for each model
  • Details in Perlovsky, L.I. (2006). Symbols
    Integrated Cognition and Language. Book Chapter
    in A. Loula, R. Gudwin, J. Queiroz, eds.,
    Computational Semiotics. Idea Group, Hershey, PA.

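The pruning schedule (delete the least likely word, reduce S by 1) can be sketched as below; the `score` callback is a hypothetical stand-in for the data-driven word likelihood that the slide computes from e(n, m, s):

```python
def shrink_phrase_model(candidate_words, score, target_len):
    """Start with a long, fuzzy phrase-model and repeatedly delete the
    least likely word, reducing the phrase length S by one each time,
    so that the most likely words gradually survive in the model."""
    model = list(candidate_words)
    while len(model) > target_len:
        worst = min(model, key=score)   # least likely word under current data
        model.remove(worst)             # delete it; S -> S - 1
    return model
```

This greedy shrinkage is what replaces combinatorial search over word subsets: each iteration makes one locally best deletion instead of enumerating all subsets.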
95
OUTLINE
  • Cognition, complexity, and logic
  • The Mind and Knowledge Instinct
  • Language
  • Integration of cognition and language
  • Higher Cognitive Functions
  • Future directions

96
LANGUAGE vs. COGNITION
  • Nativists (since the 1950s)
  • - Language is a separate mind mechanism (Chomsky)
  • - Pinker: language instinct
  • Cognitivists (since the 1970s)
  • - Language depends on cognition
  • - Talmy, Elman, Tomasello
  • Evolutionists (since the 1980s)
  • - Hurford, Kirby, Cangelosi
  • - Language transmission between generations
  • - Co-evolution of language and cognition

97
WHAT WAS FIRST COGNITION OR LANGUAGE?
  • How do language and thoughts come together?
  • Conscious
  • final results: logical concepts
  • Language seems completely conscious
  • However, a child at 5 knows about "good" and "bad" guys
  • Are these conscious concepts?
  • Unconscious
  • fuzzy mechanisms of language and cognition
  • Logic
  • Same mechanisms for L. and C.
  • Did not work
  • Sub-conceptual, sub-conscious integration

98
INTEGRATEDLANGUAGE AND COGNITION
  • Where do language and cognition come together?
  • A fuzzy concept m has linguistic and cognitive-sensory models
  • Mm = {Mm^cognitive, Mm^language}
  • Language and cognition are fused at the fuzzy pre-conceptual level
  • before concepts are learned
  • Understanding language and sensory data
  • Initial models are fuzzy blobs
  • Language models have empty slots for cognitive models (objects and situations), and vice versa
  • Child's learning
  • Language participates in cognition, and vice versa
  • L. and C. help each other to be learned and understood
  • Help associating signals, words, models, and
    behavior

99
INNER LINGUISTIC FORM HUMBOLDT, the 1830s
  • In the 1830s Humboldt discussed two types of
    linguistic forms
  • words: the outer linguistic form (dictionary), a formal designation
  • and the inner linguistic form (???): creative, full of potential
  • This remained a mystery for rule-based AI, structural linguistics, and Chomskyan linguistics
  • rule-based approaches using the mathematics of logic draw no distinction between the formal and the creative
  • In NMF / DL there is a difference
  • static form of learned (converged) concept-models
  • dynamic form of fuzzy concepts, with creative
    learning potential, emotional content, and
    unconscious content

100
OUTLINE
  • Cognition, complexity, and logic
  • The Mind and Knowledge Instinct
  • Language
  • Integration of cognition and language
  • Higher Cognitive Functions
  • Future directions

101
HIGHER COGNITIVE FUNCTIONS
  • Abstract models are at higher levels of hierarchy
  • More vague-fuzzy, less conscious
  • At every level
  • Bottom-up signals are recognized lower-level concepts
  • Top-down signals are vague concept-models
  • Behavior-actions (including learning-adaptation)

102
TOWARD A THEORY OF THE MIND
  • From Plato to physics of the mind
  • A mathematical theory describing first
    principles of the mind
  • Corresponding to existing data and making
    testable predictions

103
FROM PLATO TO LOCKE
  • Realism
  • Plato ability for thinking is based on a priori
    Ideas
  • Aristotle
  • ability for thinking is based on a priori
    (dynamic) Forms
  • an a priori form-as-potentiality (fuzzy model)
    meets matter (signals) and becomes a
    form-as-actuality (a concept)
  • actualities obey logic, potentialities do not
  • Nominalism
  • Antisthenes (contemporary of Plato)
  • there are no a priori ideas, just names for
    similar things
  • Occam (14th c.)
  • Ideas are linguistic phenomena devoid of reality
  • Locke (17th c.)
  • A newborn mind is a blank slate

104
FROM KANT TO GROSSBERG
  • Kant three primary inborn abilities
  • Reason understanding (models of cognition)
  • Practical Reason behavior (models of behavior)
  • Judgment emotions (similarity)
  • We only know concepts, not things-in-themselves
  • Jung
  • Conscious concepts are developed based on
    inherited structures of the mind, archetypes,
    inaccessible to consciousness
  • Realists vs. nominalists: introverts vs. extroverts
  • Chomsky
  • Inborn structures, not general intelligence
  • Grossberg
  • Models attaining a resonant state (winning the
    competition for signals) reach consciousness
  • DL Aristotle
  • A-priori models are vague-fuzzy and unconscious
  • Understood models in a resonant state are
    crisper and more conscious

105
NMF AND BUDDHISM
  • Fundamental Buddhist notion of Maya
  • the world of phenomena, Maya, is meaningless
    deception
  • penetrates into the depths of perception and
    cognition
  • phenomena are not identical to things-in-themselves
  • Fundamental Buddhist notion of Emptiness
  • consciousness of bodhisattva wonders at
    perception of emptiness in any object (Dalai
    Lama 1993)
  • any object is first of all a phenomenon
    accessible to cognition
  • value of any object for satisfying the lower
    bodily instincts is much less than its value for
    satisfying higher needs, knowledge instinct
  • The Bodhisattva's consciousness is directed by the knowledge instinct
  • concentration on emptiness does not mean
    emotional emptiness, but the opposite, the
    fullness with highest emotions related to the
    knowledge instinct, beauty and spiritually
    sublime

106
MIND VS. BRAIN
  • We start understanding how to relate the mind to
    the brain
  • Which neural circuits of the brain implement
    which functions of the mind

107
NMF DYNAMICS
  • A large number of model-concepts compete for
    incoming signals
  • Uncertainty in models corresponds to uncertainty
    in associations f(mn)
  • Eventually, one model (m') wins the competition for a subset n' of input signals x(n), when parameter values match object properties, and the f(m'|n) values become close to 1 for n ∈ n' and 0 for n ∉ n'
  • Upon convergence, the entire set of input signals
    n is divided into subsets, each associated with
    one model-object
  • Fuzzy a priori concepts (unconscious) become
    crisp concepts (conscious)
  • dynamic logic, Aristotelian forms, Jungian
    archetypes, Grossberg resonance
  • Elementary thought process

108
CONSCIOUSNESS AND UNCONSCIOUS
  • Jung: conscious concepts and unconscious archetypes
  • Grossberg: models attaining a resonant state (winning the competition for signals) reach consciousness
  • NMF: fuzzy mechanisms (DL) are unconscious; crisp concept-models, adapted and matched to data, are conscious

109
AESTHETIC EMOTIONS
  • Not related to bodily satisfaction
  • Satisfy instincts for knowledge and language
  • learning concepts and learning language
  • Not just what artists do
  • Guide every perception and cognition process
  • Perceived as feeling of harmony-disharmony
  • satisfaction-dissatisfaction
  • Maximize similarity between models and world
  • between our understanding of how things ought to
    be and how they actually are in the surrounding
    world Kant aesthetic emotions

110
BEAUTY
  • Harmony is an elementary aesthetic emotion
  • higher aesthetic emotions are involved in the
    development of more complex higher models
  • The highest forms of aesthetic emotion, beautiful
  • related to the most general and most important
    models
  • models of the meaning of our existence, of our
    purposiveness
  • beautiful object stimulates improvement of the
    highest models of meaning
  • Beautiful reminds us of our purposiveness
  • Kant called beauty "aimless purposiveness", not related to bodily purposes
  • he was dissatisfied at not being able to give a positive definition: the knowledge instinct
  • the absence of a positive definition has remained a major source of confusion in philosophical aesthetics to this very day
  • Beauty is separate from sex, but sex makes use of
    all our abilities, including beauty

111
INTUITION
  • Complex states of perception-feeling of
    unconscious fuzzy processes
  • involves fuzzy unconscious concept-models
  • in process of being learned and adapted
  • toward crisp and conscious models, a theory
  • conceptual and emotional content is
    undifferentiated
  • such models satisfy or dissatisfy the knowledge
    instinct before they are accessible to
    consciousness, hence the complex emotional feel
    of an intuition
  • Artistic intuition
  • composer: sounds and their relationships to the psyche
  • painter: colors, shapes and their relationships to the psyche
  • writer: words and their relationships to the psyche

112
INTUITION Physics vs. Math.
  • Mathematical intuition is about
  • Structure and consistency within the theory
  • Relationships to a priori content of psyche
  • Physical intuition is about
  • The real world, first principles of its
    organization, and mathematics describing it
  • Beauty of a physical theory discussed by
    physicists
  • Related to satisfying knowledge instinct
  • the feeling of purpose in the world

113
WHY WAS ADAM EXPELLED FROM PARADISE?
  • God gave Adam the mind, but forbade him to eat from the Tree of Knowledge
  • All great philosophers and theologians from time immemorial have pondered this
  • Maimonides, 12th century
  • God wants people to think for themselves
  • Adam wanted ready-made knowledge
  • Thinking for oneself is difficult (this is our
    predicament)
  • Today we can approach this scientifically
  • Rarely