From Pixels to Propositions: Bridging the Gap from Sensor-Level Data to Cognitive-Level Knowledge

1
From Pixels to Propositions: Bridging the Gap
from Sensor-Level Data to Cognitive-Level
Knowledge
  • Kathryn Blackmond Laskey
  • Department of Systems Engineering & Operations
    Research
  • George Mason University

2
  • This presentation is dedicated to the memory of
    journalist Danny Pearl, murdered in Pakistan in
    February 2002, and to the pioneering research of
    his father Judea Pearl. Danny Pearl's spirit
    will live on in the work of those who apply his
    father's work to protect the open society for
    which he gave his life.

The Daniel Pearl Foundation (http://www.danielpearl.org) was formed in
memory of journalist Daniel Pearl to further the ideals that inspired
Daniel's life and work.
3
Representation: A Key Enabler
  • Performance of an intelligent system depends on a good
    representation of the problem space
  • Good representations for fusion must:
  • Capture important regularities in the domain
  • Capture how objects and processes give rise to
    observable evidence
  • Rest on a mathematically sound and scientifically
    principled logical foundation
  • The best and most efficient algorithm will
    produce bad results if you are solving the wrong
    problem
  • Type III error (solving the wrong problem) dwarfs
    Type I and Type II errors

"Everything is easy if you can find the right
representation" - Herbert A. Simon
4
  • Effective multi-source fusion
  • Depends on good representations
  • Requires integrating sensor inputs with
    information from other sources
  • Depends heavily on background knowledge and
    context

5
Models and Representations
  • Models represent systems and processes
  • We use models to answer questions about the real
    world
  • Goal: Build "good enough" models
  • "Good enough" depends on the purpose for which the
    model is used
  • Simplifications and inaccuracies don't matter if
    they don't affect results
  • Representations are approximations
  • Restricted set of variables
  • Unrealistic simplifications
  • Untested assumptions
  • Models are constructed from
  • Past data on system or related systems
  • Judgment of subject matter experts
  • Judgment of experienced model builders

6
(No Transcript)
7
(No Transcript)
8
Representing Representation
9
The Fusion Challenge
  • Fusion is the process of incorporating
    information from different sources into a single
    fused representation
  • Why fusion is difficult
  • Vast quantities of sensor information
  • Real-time processing requirements
  • Restrictions on weight, communication bandwidth
  • Need to integrate physical and geometrical models
    with qualitative knowledge
  • Noisy, unreliable, ambiguous data
  • Active attempts at deception
  • Requirement for robustness to new or little-known
    entities
  • Why fusion is important
  • Features that are meaningless in isolation are
    definitive in combination

"Data, data everywhere, and not the time to think"
10
Paradigm Shift in Computing
  • Old paradigm: Algorithms running on Turing
    machines
  • Deterministic steps transform inputs into outputs
  • Result is either right or wrong
  • Semantics based on Boolean logic
  • New paradigm: Economy of software agents running on a
    physical symbol system
  • Agents make decisions (deterministic or
    stochastic) to achieve objectives (see the sketch
    after this list)
  • Program replaced by dynamic system improving
    solution quality over time
  • Semantics based on decision theory / game theory
    / stochastic processes
  • Hardware realizations of physical symbol systems
  • Physical systems minimize action
  • Decision theoretic systems maximize utility /
    minimize loss
  • Hardware realization of physical symbol system
    maps action to utility
  • Programming languages are replaced by
    specification / interaction languages
  • Software designer specifies goals, rewards and
    information flows
  • Unified theory spans sub-symbolic to cognitive
    levels
  • Old paradigm is limiting case of new paradigm
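Below is a minimal sketch, not from the slides, of the decision-theoretic view of an agent: instead of executing a fixed procedure, it chooses whichever action maximizes expected utility under its current belief state. The names (expected_utility, choose_action) and the toy sensing problem are illustrative assumptions.

# Minimal sketch of a decision-theoretic agent (illustrative names and numbers,
# not from the presentation): the fixed "program" is replaced by choosing the
# action that maximizes expected utility under the current belief state.

def expected_utility(action, belief, utility):
    """Expected utility of an action under a belief distribution over states."""
    return sum(p * utility(state, action) for state, p in belief.items())

def choose_action(actions, belief, utility):
    """Decision-theoretic semantics: pick the action with highest expected utility."""
    return max(actions, key=lambda a: expected_utility(a, belief, utility))

# Toy sensing problem: decide whether to collect more data or declare a result.
belief = {"target_present": 0.3, "target_absent": 0.7}
actions = ["collect_more", "declare_target", "declare_clear"]

def utility(state, action):
    table = {
        ("target_present", "collect_more"): 1,
        ("target_absent", "collect_more"): -1,
        ("target_present", "declare_target"): 10,
        ("target_absent", "declare_target"): -20,
        ("target_present", "declare_clear"): -100,
        ("target_absent", "declare_clear"): 5,
    }
    return table[(state, action)]

print(choose_action(actions, belief, utility))  # 'collect_more' for this belief

In the limiting case where the belief puts all mass on one state and the choice is deterministic, this reduces to the old paradigm of a fixed mapping from inputs to outputs.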

11
No Computation Without Representation
  • "First figure out what you would do if
    computation were not an issue, and then figure
    out how to compute it."
  • Good representation provides theoretical basis
    for informed choices about computation
  • Good representation provides statistical basis
    for evaluating solution quality
  • Bad representation leads to failures you don't
    know are failures and wouldn't know how to fix if
    you did

Tod Levitt / Jay Kadane
12
Elements of Computational Representation
  • Vocabulary
  • Variables, constants, operators, punctuation
  • Syntax
  • Rules for composing legal expressions
  • Organization into higher level structures or
    patterns
  • Frames
  • Objects
  • Graphs
  • Proof rules (operational semantics)
  • Rules for deriving expressions from other
    expressions
  • Corresponds to operational semantics of computer
    language
  • Semantics - characterizes meaning of expressions
  • Ontology or theory of reference (denotational
    semantics)
  • Theory of truth (axiomatic semantics)

13
First-Order Logic
  • Vocabulary
  • Constants (stand for particular named objects)
  • Variables (stand for generic unnamed objects)
  • Functions (allow objects to be referred to
    indirectly)
  • Location(x)
  • MotherOf(y)
  • Predicates (represent hypotheses that can be true
    or false)
  • Guilty(s)
  • Near(John,GroceryStore32)
  • Connectives
  • Quantification, conjunction, disjunction,
    implication, negation, equality
  • Syntax
  • Atomic sentences
  • Composition rules for forming compound sentences
    from atomic sentences
  • Semantics
  • Tarski invented the standard semantics for
    first-order logic
  • Compositional meaning of sentence depends on
    meaning of parts
  • A valid sentence is true in all interpretations of
    a language; an unsatisfiable sentence cannot be true
    in any interpretation (a minimal encoding sketch
    follows this list)
  • Proof rules
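A minimal sketch, not from the slides, of how the vocabulary and syntax above can be encoded and given a Tarskian, compositional semantics: ground atomic sentences are looked up in an interpretation, and compound sentences get their truth value from their parts. The class and function names are illustrative.

# Illustrative encoding of first-order vocabulary and syntax (names are assumptions).
# An interpretation assigns truth values to ground atoms; compound sentences get
# their meaning compositionally from the meaning of their parts.

from dataclasses import dataclass

@dataclass(frozen=True)
class Atom:                       # atomic sentence, e.g. Near(John, GroceryStore32)
    predicate: str
    args: tuple

@dataclass(frozen=True)
class Not:
    sentence: object

@dataclass(frozen=True)
class And:
    left: object
    right: object

def holds(sentence, interpretation):
    """Evaluate a ground sentence against an interpretation: a set of true atoms."""
    if isinstance(sentence, Atom):
        return sentence in interpretation
    if isinstance(sentence, Not):
        return not holds(sentence.sentence, interpretation)
    if isinstance(sentence, And):
        return holds(sentence.left, interpretation) and holds(sentence.right, interpretation)
    raise TypeError(f"unknown sentence form: {sentence!r}")

near = Atom("Near", ("John", "GroceryStore32"))
guilty = Atom("Guilty", ("John",))
interpretation = {near}                    # Near(...) is true, Guilty(John) is not
print(holds(And(near, Not(guilty)), interpretation))   # True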

14
Privileged Status of FOL
  • Has been proposed as unifying language for
  • Defining extended logics
  • Interchanging knowledge
  • "FOL has enough expressive power to define all of
    mathematics, every digital computer that has ever
    been built, and the semantics of every version of
    logic, including itself." (Sowa, 2000)
  • Issues
  • Cannot express generalizations about sets,
    predicates, functions
  • Cannot represent gradations of plausibility
  • No built-in approaches to
  • Categories
  • Time and space
  • Causality
  • Action
  • Events
  • Value

15
Ontology
  • Categories of things that can exist in a domain
  • Organized hierarchically into types / subtypes
  • Objects of a given type have
  • Similar structure (part-whole composition)
  • Similar behavior (processes)
  • Similar associations
  • Subtypes can inherit structure, behavior, and
    associations from the supertype (see the sketch
    after this list)
  • Ontology describes
  • Types of entities in the domain
  • Attributes of entities
  • Relationships they can participate in
  • Ways to specify ontology
  • Formal - types defined by logical rules (usually
    FOL)
  • Informal - types specified via prototypical
    instances
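A brief illustrative sketch (the type names are invented here) of the type/subtype idea: subtypes inherit part-whole structure and behavior from their supertype and can add structure and associations of their own.

# Illustrative sketch (not from the slides) of a type/subtype hierarchy in which
# subtypes inherit structure (parts), behavior, and associations from the supertype.

class Vehicle:                       # supertype
    parts = ["engine", "chassis"]    # part-whole structure
    def move(self):                  # behavior
        return "moves along a route"

class TrackedVehicle(Vehicle):       # subtype inherits and extends structure
    parts = Vehicle.parts + ["tracks"]

class Tank(TrackedVehicle):          # deeper subtype adds its own structure
    parts = TrackedVehicle.parts + ["turret"]
    def associations(self):          # relationships the type can participate in
        return ["member_of_platoon", "observed_by_sensor"]

print(Tank().parts)   # ['engine', 'chassis', 'tracks', 'turret']
print(Tank().move())  # behavior inherited from Vehicle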

16
Requirements for New Paradigm Computational
Representation
  • Embrace uncertainty
  • Perform plausible reasoning
  • Learn from experience
  • Incorporate observation, historical data, expert
    knowledge
  • Explore multiple alternatives
  • Replace rote procedure with focus on attaining
    objectives
  • Trade off multiple objectives

17
Complementary Technologies
  • Traditional Logic-Based Artificial Intelligence
  • + Structured representations for symbolic
    knowledge
  • + Efficient methods for searching complex problem
    spaces
  • - Rudimentary and atheoretical methods for
    reasoning under uncertainty
  • Traditional Probability
  • - Rudimentary and unstructured knowledge
    representation
  • - Assumes all hypotheses are known in advance
  • + Theoretically justified and practically proven
    method for reasoning under uncertainty

18
Bayesian Networks
  • Language for representing knowledge about
    uncertain phenomena
  • Multiple hypotheses
  • Cause and effect relationships between evidence &
    hypotheses
  • Time evolution (dynamic Bayesian networks)
  • Architecture for efficient computation
  • Apply Bayes' rule to incorporate evidence (see the
    sketch below)
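A minimal two-node Bayesian network sketch, with made-up numbers, showing how Bayes' rule incorporates a single piece of evidence about a hypothesis.

# Minimal two-node Bayesian network (Hypothesis -> Evidence), illustrative numbers only.
# Incorporating evidence is an application of Bayes' rule:
#   P(H | e) = P(e | H) P(H) / sum_h P(e | h) P(h)

prior = {"vehicle": 0.2, "clutter": 0.8}      # P(H)
likelihood = {"vehicle": 0.9, "clutter": 0.1} # P(Evidence = hot-spot report | H)

def posterior(prior, likelihood):
    """Posterior over hypotheses after observing the evidence the likelihood describes."""
    unnormalized = {h: prior[h] * likelihood[h] for h in prior}
    z = sum(unnormalized.values())
    return {h: p / z for h, p in unnormalized.items()}

print(posterior(prior, likelihood))
# {'vehicle': 0.692..., 'clutter': 0.307...}: one report raises P(vehicle) from
# 0.2 to about 0.69; features weak in isolation add up in combination.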

19
Probabilistic Knowledge Representation
  • Bayesian networks are insufficiently expressive
    for general knowledge representation
  • Requirements for a probabilistic representation
  • Represent classes having multiple similar but
    non-identical instances
  • Represent hierarchical structure of classes
  • Represent relationships between classes
  • Represent different types of uncertainty
  • Attribute-value uncertainty
  • Type uncertainty
  • Association uncertainty
  • Existence uncertainty
  • Model uncertainty (structure and parameters)
  • Learn better representations (structure and
    parameters) as observations accrue

20
Emerging Directions in Knowledge Representation
  • Increasingly expressive languages for encoding
    probabilistic domain theories
  • Probabilistic versions of historically successful
    representation frameworks
  • Decision theoretic justification for why they
    work
  • Extend to incorporate uncertainty
  • Integrate with legacy systems
  • Graphical model semantics provides principled
    theoretical foundation to address key issues
  • Multi-resolution modeling: high-level summary is an
    (approximate) sufficient statistic for the relevant
    low-level sensor data (see the sketch after this list)
  • Distributed MS elements pass (approximate)
    sufficient statistics across communication
    pathways
  • Learning uses (approximate) Bayesian inference to
    refine structure & parameters as data accrue
  • Probabilistic semantics for model
    interoperability
  • Efficient exact and approximate computation
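An illustrative sketch, assuming a Gaussian observation model, of the sufficient-statistic idea behind multi-resolution and distributed fusion: a low-level element passes a three-number summary (count, sum, sum of squares) upstream instead of the raw sensor stream, and the fused summary yields the same estimates as pooling the raw data. The function names are assumptions.

# Illustrative sketch (not from the slides): for a Gaussian model, the sample count,
# sum, and sum of squares form a sufficient statistic, so a low-level node can pass
# three numbers upstream instead of the raw sensor stream.

def summarize(readings):
    """Sufficient statistic (n, sum, sum of squares) for a Gaussian likelihood."""
    n = len(readings)
    s = sum(readings)
    ss = sum(x * x for x in readings)
    return n, s, ss

def combine(stat_a, stat_b):
    """Fuse statistics from two distributed sensing elements."""
    return tuple(a + b for a, b in zip(stat_a, stat_b))

def mean_and_variance(stat):
    n, s, ss = stat
    mean = s / n
    return mean, ss / n - mean * mean

node_a = summarize([10.2, 9.8, 10.1])    # local, high-rate sensor data stays local
node_b = summarize([10.4, 10.0])
print(mean_and_variance(combine(node_a, node_b)))  # same answer as pooling raw data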

21
Multi-Entity Bayesian Network (MEBN) Logic
  • Represent knowledge as collection of partial
    Bayesian networks
  • Instantiate & compose into problem-specific
    models (see the sketch after this list)
  • MEBN is to BN as algebra is to arithmetic
  • Consistency constraints ensure existence of
    global probability distribution
  • Integrates classical first-order logic with
    probability
  • Predicates → Boolean random variables
  • Functions → Non-Boolean random variables
  • Existence results
  • MTheory implicitly represents coherent joint
    distribution on interpretations of associated
    first-order theory
  • Universal MTheory specifies a joint distribution on
    satisfiable first-order sentences and a conditional
    distribution given any consistent finite set of
    axioms
  • Provides logical basis for probabilistic
    databases (Probabilistic Relational Models
    research @ Stanford)
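A highly simplified sketch of the composition idea described above; the dictionary-based "fragment" here is only an illustration, not the MEBN formalism itself. Knowledge lives in partial-network templates over entity variables; binding the variables to named entities and taking the union of the instantiated pieces yields a problem-specific Bayesian network.

# Highly simplified sketch of the MEBN idea (illustrative, not the actual formalism):
# knowledge is stored as partial-network templates over entity variables; instantiating
# a template for specific entities and composing the pieces yields a problem-specific
# Bayesian network ("MEBN is to BN as algebra is to arithmetic").

vehicle_fragment = {
    "child": "Report({sensor},{vehicle})",
    "parents": ["Type({vehicle})", "Weather({region})"],
}

def instantiate(fragment, bindings):
    """Bind entity variables to named entities, producing concrete BN nodes and edges."""
    child = fragment["child"].format(**bindings)
    parents = [p.format(**bindings) for p in fragment["parents"]]
    return [(parent, child) for parent in parents]

# Compose a problem-specific model from two sensor reports about the same vehicle.
edges = []
edges += instantiate(vehicle_fragment, {"sensor": "S1", "vehicle": "V7", "region": "R3"})
edges += instantiate(vehicle_fragment, {"sensor": "S2", "vehicle": "V7", "region": "R3"})
print(edges)
# Both reports share the parent node Type(V7), so evidence from either report
# bears on the same hypothesis in the composed network.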

22
On-the-Fly BN Construction Example
23
Illustrative Applications
  • Identify & type groups of vehicles from
    individual reports
  • BN construction module takes inputs from
    (simulated) tracker
  • Ability to identify and type platoons is robust
    to
  • Missed tracks
  • Mis-association between closely spaced vehicles
  • Incorrect vehicle types or inability to type many
    vehicles
  • Spurious tracks
  • Information architecture for missile defense
  • Distributed Bayesian inference, value of
    information, optimal interceptor allocation
  • Slated for insertion into '06 build
  • Translation of user requirements into SRS
  • Proof of concept evaluated on HLA requirements
    document
  • Found requirements humans had missed

24
Summary: Advantages of MEBN Logic
  • Modular, object-oriented representation
  • Compose complex probability models from
    manageable sub-units
  • Implicitly represent consistent domain theory
    over unbounded number of entities
  • Constructed SSN approximates implicit model
  • MEBN theory provides metrics for estimating
    quality of approximation
  • Can balance fidelity to domain against
  • Knowledge engineering burden
  • Model construction resources
  • Inference resources
  • Learning ability
  • Probability and decision theory provide a unified
    modeling approach and semantics spanning JDL
    Levels 0 through 4
  • Combines logic & probability
  • Application experience to date is promising