NeuCom - A Neurocomputing Environment for Evolving Intelligence Nik Kasabov Knowledge Engineering and Discovery Research Institute Auckland University of Technology nkasabov@aut.ac.nz http://www.kedri.info - PowerPoint PPT Presentation

1
NeuCom - A Neurocomputing Environment for Evolving Intelligence
Nik Kasabov
Knowledge Engineering and Discovery Research Institute
Auckland University of Technology
nkasabov@aut.ac.nz
http://www.kedri.info
University of California at Berkeley, BISC FLINT-CIBI Workshop, December 2003
2
Data analysis, modeling and knowledge discovery
  • Modelling complex processes is a difficult task
  • Most existing techniques may not be appropriate
    for modelling complex, dynamic processes
  • A variety of methods needs to be developed and
    applied to a number of challenging real-world
    applications

3
Evolving Intelligence
  • A subset of AI
  • An information system that develops its structure
    and functionality in a continuous, self-organised,
    adaptive and interactive way from incoming
    information, possibly from many sources, and
    performs intelligent tasks; in doing so, the
    system acquires intelligence
  • EI is represented in this presentation by ECOS
    and their applications

4
NeuCom
  • Facilitates data analysis, data understanding,
    model creation and knowledge discovery
  • Data management and data ontology
  • Data analysis and feature extraction
    (statistical, PCA, clustering, SNR, etc.)
  • Data modeling and rule extraction
    (classification, prediction, optimisation)
  • Image recognition
  • Module integration
  • Free inspection copy from www.theneucom.com
    or www.kedri.info

5
Data Visualisation
  • Visualisation for data mining
  • Discovering patterns
  • What to visualise (dimensions, etc)?
  • How to visualise data?
  • Methods for visualisation
  • Visualisation of multi-dimensional data -
    Principal Component Analysis (PCA)
  • Visualisation of dynamic data
  • Time-space dilemma
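The PCA bullet above can be sketched in a few lines of NumPy: centre the data and project it onto the leading directions returned by an SVD. This is a minimal illustration on random data, not the NeuCom implementation; the function name is made up for the example.

```python
import numpy as np

def pca_project(X, n_components=2):
    """Project data onto its first principal components."""
    Xc = X - X.mean(axis=0)            # centre each feature
    # SVD of the centred data: rows of Vt are the principal directions,
    # ordered by decreasing singular value (i.e. explained variance)
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:n_components].T    # coordinates in the reduced space

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))          # 100 samples, 5 features
Z = pca_project(X, 2)
print(Z.shape)                         # (100, 2): ready to scatter-plot
```

Plotting the two columns of `Z` against each other is the standard way to visualise multi-dimensional data in 2D.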

6
Feature selection
  • Feature evaluation is the process of establishing
    how relevant the available features (variables)
    are to the problem at hand
  • Feature selection is the process of choosing the
    most appropriate features when creating a
    computational model
  • If a system evolves and learns in an on-line
    mode, features have to be evaluated in an on-line
    mode too, so that the most relevant features are
    in use at each point in time
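The signal-to-noise-ratio evaluation mentioned earlier can be illustrated for a two-class problem; the |mean difference| / (sum of standard deviations) form below is one standard SNR score, assumed here for illustration rather than taken from NeuCom itself.

```python
import numpy as np

def snr_scores(X, y):
    """SNR of each feature for a two-class problem:
    |mean_A - mean_B| / (std_A + std_B); higher = more discriminative."""
    A, B = X[y == 0], X[y == 1]
    return np.abs(A.mean(0) - B.mean(0)) / (A.std(0) + B.std(0) + 1e-12)

rng = np.random.default_rng(1)
# feature 0 separates the classes; feature 1 is pure noise
X = np.vstack([rng.normal(0, 1, (50, 2)), rng.normal([3, 0], 1, (50, 2))])
y = np.array([0] * 50 + [1] * 50)
scores = snr_scores(X, y)
print(scores.argsort()[::-1])          # features ranked most- to least-relevant
```

In an on-line setting the class means and standard deviations would be maintained incrementally and the ranking recomputed as each sample arrives.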

7
Modeling and Knowledge Discovery: clustering,
classification, prediction, optimisation, rule
extraction
  • Statistical methods (HMM, regression, SVM)
  • Case-based reasoning, e.g. K-nearest neighbour
      • simple and quick, but not very precise
  • Neural networks (MLP, RBF)
      • good learning, but no transparency (black box)
  • Knowledge-based neural networks (KBNN)
      • allow for rules to be extracted
  • ECOS, e.g. evolving connectionist systems
      • adaptation to various data is possible
      • classification, clustering, and rule (profile)
        extraction are all possible in one model
      • integrating existing knowledge and new data
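The K-nearest-neighbour method from the list is simple enough to sketch directly: a minimal majority-vote classifier on toy data, which also shows why it is quick but not very precise (there is no model beyond the stored samples).

```python
import numpy as np

def knn_predict(X_train, y_train, x, k=3):
    """Classify x by majority vote among its k nearest training samples."""
    d = np.linalg.norm(X_train - x, axis=1)   # Euclidean distances
    nearest = y_train[np.argsort(d)[:k]]      # labels of the k closest points
    return np.bincount(nearest).argmax()      # majority class

X_train = np.array([[0.0, 0.0], [0.1, 0.2], [1.0, 1.0], [0.9, 1.1]])
y_train = np.array([0, 0, 1, 1])
print(knn_predict(X_train, y_train, np.array([0.05, 0.1])))  # 0
print(knn_predict(X_train, y_train, np.array([0.95, 1.0])))  # 1
```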

8
Different learning algorithms
9
Statistical analysis and learning methods
  • Basic statistical methods, e.g. means, t-test,
    f-test, histograms
  • Signal-to-noise ratio
  • Principal component analysis
  • Regression analysis
  • Linear discriminant analysis
  • Clustering
  • Markov Models
  • SVM

10
Statistical Methods
  • Linear Discriminant Analysis (LDA)
  • Find a linear subspace that maximises class
    separability among the feature vector projections
    in the data space.
  • A popular separability criterion is the ratio of
    between-class scatter to within-class scatter
  • LDA seeks directions that are efficient for
    discrimination
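For two classes this criterion has a closed form: the Fisher direction w ∝ Sw⁻¹(m1 − m0), where Sw is the within-class scatter and m0, m1 are the class means. A minimal sketch on synthetic data (the function name and data are illustrative):

```python
import numpy as np

def fisher_direction(X, y):
    """Fisher LDA for two classes: w = Sw^-1 (m1 - m0) maximises the ratio
    of between-class to within-class scatter of the projections."""
    X0, X1 = X[y == 0], X[y == 1]
    m0, m1 = X0.mean(0), X1.mean(0)
    Sw = np.cov(X0, rowvar=False) + np.cov(X1, rowvar=False)  # within-class
    w = np.linalg.solve(Sw, m1 - m0)
    return w / np.linalg.norm(w)

rng = np.random.default_rng(2)
# two Gaussian classes separated along the first feature
X = np.vstack([rng.normal([0, 0], 1, (100, 2)), rng.normal([4, 0], 1, (100, 2))])
y = np.array([0] * 100 + [1] * 100)
w = fisher_direction(X, y)
print(w)   # points (approximately) along the axis separating the classes
```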

11
Support Vector Machines
  • Kernel method
  • The idea of the support vector machine is to map
    the training data into a higher-dimensional
    feature space via a kernel computation, and to
    construct a separating hyperplane with maximum
    margin there.
  • These kernel functions can be:
  • Polynomial function
  • Radial basis function
  • Linear function

12
Support Vector Machines
  • SVM computation
  • SVM computes a hyperplane for classification
    based on two facts:
  • Among all hyperplanes separating the data, there
    exists a unique one that has the maximum margin
    of separation between the classes.
  • The capacity of the classifier decreases with
    increasing margin.
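The three kernels listed on the previous slide can be written down directly. Each one yields a Gram matrix K[i, j] = k(xi, xj), which is all a maximum-margin optimiser needs; the default parameter values below are illustrative assumptions, and this is a sketch of the kernel idea rather than NeuCom's SVM code.

```python
import numpy as np

def linear_kernel(u, v):
    return u @ v

def polynomial_kernel(u, v, degree=3, c=1.0):
    return (u @ v + c) ** degree

def rbf_kernel(u, v, gamma=1.0):
    return np.exp(-gamma * np.sum((u - v) ** 2))

# A kernel implicitly maps data into a higher-dimensional feature space:
# the optimiser never sees that space, only the Gram matrix of the data.
X = np.array([[0.0, 1.0], [1.0, 0.0], [1.0, 1.0]])
K = np.array([[rbf_kernel(a, b) for b in X] for a in X])
print(K.round(3))   # symmetric, with ones on the diagonal
```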

13
Artificial neural networks (ANN) (connectionist
systems)
  • Computational models that mimic the nervous
    system in its main function of adaptive learning.
    ANN can learn from data and generalise
  • ANN are universal computational models
  • N.Kasabov, Foundations of Neural Networks, Fuzzy
    Systems and Knowledge Engineering, MIT Press, 1996
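A minimal example of the adaptive-learning claim: a 2-4-1 multilayer perceptron trained by backpropagation on XOR, the classic problem a single-layer network cannot learn. Plain NumPy with squared-error loss; the architecture, learning rate and epoch count are arbitrary choices for illustration.

```python
import numpy as np

rng = np.random.default_rng(3)
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
t = np.array([[0], [1], [1], [0]], dtype=float)   # XOR targets

W1, W2 = rng.normal(0, 1, (2, 4)), rng.normal(0, 1, (4, 1))  # 2-4-1 MLP
losses = []
for _ in range(2000):
    h = sigmoid(X @ W1)               # hidden activations
    y = sigmoid(h @ W2)               # network output
    losses.append(float(np.mean((y - t) ** 2)))
    # backpropagate the squared error through both layers
    d2 = (y - t) * y * (1 - y)
    d1 = (d2 @ W2.T) * h * (1 - h)
    W2 -= 0.5 * h.T @ d2
    W1 -= 0.5 * X.T @ d1
print(losses[0], losses[-1])          # the error shrinks as the network learns
```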

14
Knowledge-based ANN for learning and rule
extraction
  • Combine the strengths of different AI techniques,
    e.g. ANN and rule-based systems or fuzzy logic
  • FuNN (Kasabov et al, 1997)
  • Learning from data and rule extraction, e.g.:
  • R1: IF x1 is Small (DI11) and x2 is Small (DI21)
    THEN y is Small (CF1)
  • R2: IF x1 is Large (DI12) and x2 is Large (DI22)
    THEN y is Large (CF2)
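Rules R1 and R2 can be evaluated with standard fuzzy inference. The linear membership functions on [0, 1] and the Small = 0 / Large = 1 consequent centres below are assumptions made for illustration; FuNN learns its membership functions and weights from data.

```python
# Simple fuzzy evaluation of the two rules above.
def small(x): return max(0.0, 1.0 - x)   # membership of "Small" on [0, 1]
def large(x): return max(0.0, x)         # membership of "Large" on [0, 1]

def infer(x1, x2):
    a1 = min(small(x1), small(x2))       # firing strength of R1 (AND = min)
    a2 = min(large(x1), large(x2))       # firing strength of R2
    # defuzzify: weighted average of the consequents (Small = 0, Large = 1)
    return (a1 * 0.0 + a2 * 1.0) / (a1 + a2 + 1e-12)

print(infer(0.1, 0.2))   # close to 0 -> "Small"
print(infer(0.9, 0.8))   # close to 1 -> "Large"
```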

15
Dynamic ANN: Evolving Connectionist Systems (ECOS)
  • N. Kasabov, Evolving Connectionist Systems:
    Methods and Applications in Bioinformatics, Brain
    Study and Intelligent Machines, Springer Verlag,
    2002 (www.springer.de, www.amazon.co.jp)
  • Evolve in an open dimensional space
  • Learn in an on-line mode
  • Learn in a life-long mode
  • Learn as an individual and as evolutionary
    population systems
  • Have evolving structures
  • Facilitate dealing with different types of
    knowledge (memory-based, statistical and symbolic)

16
Evolving connectionist systems (ECOS)
  • ECOS are modular connectionist-based systems that
    evolve their structure and functionality in a
    continuous, self-organised, on-line, adaptive,
    interactive way from incoming information; they
    can process both data and knowledge in a
    supervised and/or unsupervised way.
  • ECOS as a whole
  • Neural network modules
  • Neurons

17
Knowledge-based learning and rule manipulation in
EFuNNs
  • Different types of rules (here spatial rules
    apply)
  • Rule insertion
  • Rule extraction
  • Example: gas furnace data with 2 inputs, CO2(t-1)
    and Meth(t-4), and 1 output, CO2(t); the rules
    extracted may be inserted into a new EFuNN
  • Fuzzy membership functions

18
Evolving Fuzzy Neural Networks (EFuNNs)
  • Grow and shrink
  • Feed-forward and feedback connections (not shown)
  • Fuzzy concepts may be used
  • Not limited in number and types of inputs,
    outputs, nodes, connections
  • On-line training - if possible, always add new
    data for a better generalisation
  • PEBL (Pacific Edge Biotechnology Ltd)
    (www.pebl.co.nz)

19
Dynamic Evolving Neuro-Fuzzy System (DENFIS) for
time series prediction, identification and control
  • Modeling, prediction and knowledge discovery
    from dynamic time series
  • Publication: Kasabov, N. and Song, Q., DENFIS:
    Dynamic Evolving Neural-Fuzzy Inference System
    and its Application for Time Series Prediction,
    IEEE Transactions on Fuzzy Systems, April 2002

20
Parameter optimisation of ECOS through
evolutionary algorithms
  • Nature optimises its parameters through
    evolution
  • Replication of individual ECOS systems and
    selection of:
  • the best one, or
  • the best m, averaged
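The replicate-and-select scheme can be sketched as a simple (μ+λ) evolutionary loop over one model parameter: mutate the population, keep the best, and report both the single best individual and the average of the best m. The toy fitness function, the optimum 0.42, and all parameter values below are illustrative assumptions.

```python
import numpy as np

def evolve_parameter(fitness, pop_size=20, generations=30, sigma=0.1):
    """Minimal evolutionary search for one parameter: replicate with
    mutation, then select; returns the best one and the best-3 average."""
    rng = np.random.default_rng(5)
    pop = rng.uniform(0, 1, pop_size)
    for _ in range(generations):
        children = pop + rng.normal(0, sigma, pop_size)   # replicate + mutate
        both = np.concatenate([pop, children])
        pop = both[np.argsort([fitness(p) for p in both])][:pop_size]  # select
    return pop[0], pop[:3].mean()

# toy fitness to minimise: distance of the parameter from an assumed optimum
best, best3 = evolve_parameter(lambda p: abs(p - 0.42))
print(best, best3)   # both converge near 0.42
```

In the ECOS setting the fitness would be the validation error of an ECOS trained with the candidate parameter value.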

21
Image and Signal Processing in NeuCom
  • Different types of visual information can be used
    to train a neural network for a subsequent recall
    on new images. This includes:
  • static images (e.g. OCR, cancer cells)
  • video data (moving fish, insects, animals,
    people)
  • live data (e.g. face recognition; see Peter
    Hwang in the picture)

22
Integrating different sources of information in
NeuCom
  • Through the Module Integration facility it is
    possible to integrate into one decision support
    system different modules developed in NeuCom that
    relate to the same problem but use different
    sources of information, including:
  • Numerical data, e.g. gene and clinical
    information
  • Text: references, publications, labels
  • Images (e.g. microscope images, phenotypic
    information)
  • Different types of visual information:
  • static images
  • video data
  • live data

23
Mixture of experts approach for the integration
of gene expression and clinical information
24
A case study Lymphoma Outcome Prognosis Gene
expression and clinical data integration (data
from M.Shipp et al,2002)
25
Data and model integration in ECOS
A case study of integrating a model M (a formula)
and a data set D through an ECOS. (a) A 3D plot of
data D0 (samples denoted as "o") generated from a
model M (formula)
y = 5.1x1 + 0.345x1^2 + 0.83x1 log10(x2) + 0.45x2
    + 0.57 exp(x2^0.2)
in the sub-space of the problem space defined by x1
and x2 both having values between 0 and 0.7, and
new data D (samples denoted by a second marker)
defined by x1 and x2 having values between 0.7 and 1
26
After integration, the system performs better on
the new data
  • The data clusters of D0 (the 7 clusters on the
    left, each defined by a cluster centre and a
    cluster area) and of the data D (the 2 upper
    right clusters) in the 2D input space of the x1
    and x2 input variables from fig. 2a are formed
    in a DENFIS ECOS trained with the data D0tr (56
    data samples randomly selected from D0) and then
    further trained with the data Dtr (25 samples
    randomly selected from D). The following
    parameter values were used: Rmax = 0.15;
    triangular membership functions, denoted in the
    rules by their left point, centre and right
    point for each variable
  • The test results of the initial model M (the
    dashed line) versus the new model Mnew (the
    dotted line) on the test data D0tst generated
    from M (the first 42 data samples) and on the
    new test data Dtst (the last 30 samples, the
    solid line). The new model Mnew performs well on
    both the old and the new test data, while the
    old model M fails on the new test data.

27
Prototype rules extracted from DENFIS and EFuNN
after model and data integration
 
  • Takagi-Sugeno fuzzy rules (DENFIS)
  • Rule 1: IF x1 is (-0.05, 0.05, 0.14) and x2 is
    (0.15, 0.25, 0.35) THEN y = 0.01 + 0.7x1 + 0.12x2
  • Rule 2: IF x1 is (0.02, 0.11, 0.21) and x2 is
    (0.45, 0.55, 0.65) THEN y = 0.03 + 0.67x1 + 0.09x2
  • Rule 3: IF x1 is (0.07, 0.17, 0.27) and x2 is
    (0.08, 0.18, 0.28) THEN y = 0.01 + 0.71x1 + 0.11x2
  • Rule 4: IF x1 is (0.26, 0.36, 0.46) and x2 is
    (0.44, 0.53, 0.63) THEN y = 0.03 + 0.68x1 + 0.07x2
  • Rule 5: IF x1 is (0.35, 0.45, 0.55) and x2 is
    (0.08, 0.18, 0.28) THEN y = 0.02 + 0.73x1 + 0.06x2
  • Rule 6: IF x1 is (0.52, 0.62, 0.72) and x2 is
    (0.45, 0.55, 0.65) THEN y = -0.21 + 0.95x1 + 0.28x2
  • Rule 7: IF x1 is (0.60, 0.69, 0.79) and x2 is
    (0.10, 0.20, 0.30) THEN y = 0.01 + 0.75x1 + 0.03x2
  • New rules:
  • Rule 8: IF x1 is (0.65, 0.75, 0.85) and x2 is
    (0.70, 0.80, 0.90) THEN y = -0.22 + 0.75x1 + 0.51x2
  • Rule 9: IF x1 is (0.86, 0.95, 1.05) and x2 is
    (0.71, 0.81, 0.91) THEN y = 0.03 + 0.59x1 + 0.37x2
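Such rules are directly executable as first-order Takagi-Sugeno inference, sketched below with Rule 1 and Rule 8 only, reading each consequent's flattened coefficients as y = a0 + a1·x1 + a2·x2. The product t-norm and weighted-average defuzzification are standard choices, assumed here rather than taken from the DENFIS source.

```python
def tri(x, a, b, c):
    """Triangular membership with left point a, centre b, right point c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

# (x1 triangle, x2 triangle, consequent coefficients [a0, a1, a2]),
# copied from Rule 1 and Rule 8 above
rules = [
    ((-0.05, 0.05, 0.14), (0.15, 0.25, 0.35), (0.01, 0.70, 0.12)),
    ((0.65, 0.75, 0.85), (0.70, 0.80, 0.90), (-0.22, 0.75, 0.51)),
]

def ts_infer(x1, x2):
    num = den = 0.0
    for t1, t2, (a0, a1, a2) in rules:
        w = tri(x1, *t1) * tri(x2, *t2)        # rule firing strength
        num += w * (a0 + a1 * x1 + a2 * x2)    # weighted local linear model
        den += w
    return num / den if den else None

print(ts_infer(0.05, 0.25))   # Rule 1 fires fully: 0.01 + 0.7*0.05 + 0.12*0.25
```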

 
28
Applications
  • Bioinformatics
  • Brain study and cognitive engineering
  • Decision support
  • Adaptive speech recognition and language
    processing
  • Adaptive image processing
  • Adaptive multi-modal information processing
  • Adaptive mobile robots
  • Adaptive learning agents
  • Other

29
References
  • N. Kasabov, Foundations of Neural Networks,
    Fuzzy Systems and Knowledge Engineering, MIT
    Press, Cambridge, MA, 1996 (www.mitpress.com)
  • N. Kasabov, Evolving Connectionist Systems:
    Methods and Applications in Bioinformatics,
    Brain Study and Intelligent Machines, Springer
    Verlag, London, New York, Heidelberg, 2002
    (www.springer.de)
  • www.kedri.info
  • www.theneucom.com
  • www.amazon.com