University of California at Berkeley, BISC FLINT-CIBI Workshop, December 2003
KNOWLEDGE-BASED NEURO-COMPUTING FOR DATA MINING AND KNOWLEDGE DISCOVERY IN BIOINFORMATICS
Nik Kasabov, nkasabov@aut.ac.nz
Knowledge Engineering and Discovery Research Institute (KEDRI)

Transcript and Presenter's Notes



1
University of California at Berkeley, BISC
FLINT-CIBI Workshop, December 2003
KNOWLEDGE-BASED NEURO-COMPUTING FOR DATA MINING
AND KNOWLEDGE DISCOVERY IN BIOINFORMATICS
Nik Kasabov, nkasabov@aut.ac.nz
Knowledge Engineering and Discovery Research
Institute (KEDRI), www.kedri.info
Auckland University of Technology (AUT), Auckland, NZ
2
Content
  • Bioinformatics: an area of emerging knowledge
  • CI for bioinformatics; knowledge-based neural
    networks (KBNN)
  • Knowledge and data integration in KBNN
  • Promoter recognition and splice junction
    identification
  • Gene expression analysis, modeling and profiling
  • Modeling of gene networks
  • Protein prediction
  • Medical decision support systems
  • Future directions
  • References

3
1. Bioinformatics: an area of emerging knowledge
  • Each cell of the body contains the whole DNA of
    the individual (about 40,000 genes in the human
    genome, each of them comprising from 50 to a
    million base pairs: A, T, C or G)
  • The main dogma in genetics: DNA -> RNA -> proteins
  • Transcription: DNA (about 5% of it) -> mRNA
  • DNA -> pre-RNA -> splicing -> mRNA (only the
    exons)
  • Translation: mRNA -> proteins
  • Proteins make cells alive and specialised (e.g.
    blue eyes)
  • Genome -> proteome
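The transcription and translation steps above can be sketched in a few lines; the codon table here is a tiny illustrative subset, not the full 64-codon genetic code.

```python
# Central dogma in miniature: DNA -> mRNA -> protein.
# Only four codons are listed, for illustration; '*' marks a stop codon.
CODON_TABLE = {"AUG": "M", "UUU": "F", "AAA": "K", "UAA": "*"}

def transcribe(dna):
    """DNA coding strand -> mRNA: replace T with U."""
    return dna.replace("T", "U")

def translate(mrna):
    """mRNA -> protein: read codons in frame until a stop codon."""
    protein = []
    for i in range(0, len(mrna) - 2, 3):
        aa = CODON_TABLE.get(mrna[i:i + 3], "?")
        if aa == "*":
            break
        protein.append(aa)
    return "".join(protein)

mrna = transcribe("ATGTTTAAATAA")   # -> "AUGUUUAAAUAA"
print(translate(mrna))              # -> "MFK"
```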

4
Bioinformatics
  • The area of science concerned with the
    development and application of methods, tools
    and systems for storing and processing
    biological information to facilitate knowledge
    discovery.
  • Interdisciplinary: information and computer
    science, molecular biology, biochemistry,
    genetics, physics, chemistry, health and
    medicine, mathematics and statistics,
    engineering, social sciences.
  • Biology, Medicine -> Information Science -> IT,
    Clinics, Pharmacy
  • Links to health informatics, clinical DSS, the
    pharmaceutical industry

5
Bioinformatics: challenging problems for computer
and information sciences
  • Discovering patterns (features) from DNA and RNA
    sequences (e.g. genes, promoters, ribosome
    binding sites, splice junctions)
  • Analysis of gene expression data and predicting
    protein abundance
  • Discovering gene networks: genes that are
    co-regulated over time
  • Protein discovery and protein function analysis
  • Predicting the development of an organism from
    its DNA code (?)
  • Modeling the full development (metabolic
    processes) of a cell (?)
  • Implications: health, social, ...

6
Problems in Computational Modeling for
Bioinformatics
  • An abundance of genome data, RNA data, protein
    data and metabolic pathway data is now available
    (see http://www.ncbi.nlm.nih.gov), and this is
    just the beginning of computational modeling in
    bioinformatics
  • Complex interactions:
  • between proteins, genes and the DNA code,
  • between the genome and the environment
  • much is yet to be discovered
  • Stability and repetitiveness: genes are
    relatively stable carriers of information.
  • Many sources of uncertainty:
  • Alternative splicing
  • Mutation in genes caused by ionising radiation
    (e.g. X-rays), chemical contamination,
    replication errors, viruses that insert genes
    into host cells, aging processes, etc.
  • Mutated genes express differently and cause the
    production of different proteins
  • It is extremely difficult to model dynamic,
    evolving processes

7
2. CI for Bioinformatics
  • Probabilistic methods
  • Case-based reasoning (e.g. k-NN, transductive
    reasoning)
  • Decision trees
  • Rule-based systems (propositional logic;
    Aristotle)
  • Fuzzy systems (Zadeh)
  • Neural networks (SOM, MLP, RBF, ART, ...)
  • Evolutionary computation (GA, ES, EP)
  • Hybrid systems (e.g. knowledge-based neural
    networks, neuro-fuzzy systems,
    neuro-fuzzy-genetic systems, ...)

8
Knowledge-based NN (KBNN)
  • Architectures, classified by the type of
    knowledge representation:
  • Probabilistic KBNN
  • Regression KBNN
  • Fuzzy neural networks (e.g. neo-fuzzy neuron:
    Yamakawa, Furuhashi; NEFCLASS: D. Nauck; FuNN,
    EFuNN: N. Kasabov)
  • Neuro-fuzzy systems (e.g. Bezdek; ANFIS: Jang;
    DENFIS: Kasabov and Song)
  • Neuro-fuzzy-genetic systems (X. Yao, D. Fogel,
    N. Kasabov)
  • Neuro-fuzzy systems with rough-sets
    initialization (S. Pal)
  • other hybrids
  • KBNN facilitate:
  • Learning from data
  • Rule extraction -> knowledge discovery
  • Integrating existing knowledge and new data
    (adaptation)

9
Example: Fuzzy Neural Network (FuNN)
  • Combines the strengths of different AI
    techniques, e.g. ANN and rule-based systems or
    fuzzy logic
  • FuNN (Kasabov et al., 1997, Information Sciences)
  • Learning from data and rule extraction, e.g.:
  • R1: IF x1 is Small (DI11) and x2 is Small (DI21)
    THEN y is Small (CF1)
  • R2: IF x1 is Large (DI12) and x2 is Large (DI22)
    THEN y is Large (CF2)
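Rules of this form can be exercised as a minimal fuzzy inference step. The sketch below assumes triangular membership functions and weighted-average defuzzification; the MF parameters and output centres are invented for illustration, not taken from FuNN.

```python
def tri(x, a, b, c):
    """Triangular membership function with feet a, c and peak b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def funn_infer(x1, x2):
    # R1: IF x1 is Small and x2 is Small THEN y is Small (centre 0.2)
    # R2: IF x1 is Large and x2 is Large THEN y is Large (centre 0.8)
    small = lambda x: tri(x, -0.5, 0.0, 0.5)
    large = lambda x: tri(x, 0.5, 1.0, 1.5)
    w1 = min(small(x1), small(x2))   # firing strength of R1
    w2 = min(large(x1), large(x2))   # firing strength of R2
    if w1 + w2 == 0:
        return None                  # no rule fires
    return (w1 * 0.2 + w2 * 0.8) / (w1 + w2)  # weighted defuzzification

print(round(funn_infer(0.1, 0.1), 2))   # -> 0.2 (only R1 fires)
```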

10
Different learning algorithms for KBNN
11
Inductive Learning framework
  • Inductive learning extrapolates from a given set
    of examples so that we can make accurate
    predictions about future examples.
  • Given a training set of positive and negative
    examples of a concept, construct a description
    that will accurately classify whether future
    examples are positive or negative. That is,
    learn a good estimate of the function f given a
    training set (x1, y1), (x2, y2), ..., (xn, yn)
    where each yi is either + (positive) or -
    (negative).
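The inductive setting above can be illustrated with the simplest possible learner: a one-parameter threshold classifier fitted once to the whole training set and then applied to any future example. The data and model family are invented for illustration.

```python
def learn_threshold(examples):
    """Inductive learning in miniature: from labeled examples (x, y),
    y in {'+', '-'}, pick the threshold that classifies the training
    set best; the single learned model then handles any future x."""
    best_t, best_acc = None, -1.0
    for t in sorted(x for x, _ in examples):
        acc = sum(('+' if x >= t else '-') == y
                  for x, y in examples) / len(examples)
        if acc > best_acc:
            best_t, best_acc = t, acc
    return lambda x: '+' if x >= best_t else '-'

f = learn_threshold([(0.1, '-'), (0.3, '-'), (0.7, '+'), (0.9, '+')])
print(f(0.8))  # -> '+'
```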

12
Transductive learning framework
  • Transductive learning is concerned with the
    estimation of a function in a single point of
    the space only. For every new input vector xi, a
    new model Mi is dynamically created from the
    samples closest to xi, to approximate the
    function in the locality of point xi.
  • Compared with inductive learning, transductive
    learning takes both labeled and unlabeled data
    into account.
  • Neuro-fuzzy method for transductive learning
    (Song and Kasabov, submitted to IEEE Tr FS, 2003)
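A sketch of the transductive idea: instead of one global model, a fresh local model Mi is built around each query point. Here the local model is assumed to be a distance-weighted average over the k nearest samples; Song and Kasabov's method uses a neuro-fuzzy local model, so this is only a stand-in.

```python
def transductive_predict(data, x_q, k=3):
    """For each query x_q, build a local model from the k nearest
    labeled samples only (transductive: one model per query point)."""
    neigh = sorted(data, key=lambda xy: abs(xy[0] - x_q))[:k]
    # local model: distance-weighted average of the neighbours' outputs
    ws = [1.0 / (abs(x - x_q) + 1e-6) for x, _ in neigh]
    return sum(w * y for w, (_, y) in zip(ws, neigh)) / sum(ws)

data = [(0.0, 0.0), (0.1, 0.2), (0.2, 0.4), (0.9, 1.8), (1.0, 2.0)]
print(round(transductive_predict(data, 0.95), 2))   # dominated by the two
                                                    # closest samples
```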

13
Evolving connectionist systems (ECOS)
  • ECOS are modular connectionist-based systems
    that evolve their structure and functionality in
    a continuous, self-organised, on-line, adaptive,
    interactive way from incoming information; they
    can process both data and knowledge in a
    supervised and/or unsupervised way.
  • N. Kasabov, Evolving Connectionist Systems:
    Methods and Applications in Bioinformatics,
    Brain Study and Intelligent Machines, Springer
    Verlag, 2002
  • "Throw the chemicals and let the system grow, is
    that what you are talking about, Nik?" (Walter
    Freeman, UC Berkeley, a comment at the Iizuka
    1998 conference)

14
Evolving Fuzzy Neural Networks (EFuNNs)
  • Learning is based on clustering in the input
    space and a function estimation for each cluster
  • Prototype rules represent the clusters and the
    functions associated with them
  • Different types of rules, e.g. Zadeh-Mamdani or
    Takagi-Sugeno
  • The system grows and shrinks in a continuous way
  • Feed-forward and feedback connections (not shown)
  • Fuzzy concepts may be used
  • Not limited in number and types of inputs,
    outputs, nodes, connections
  • On-line/off-line training
  • IEEE Tr SMC, 2001, N. Kasabov
  • ECF (evolving classifier function): a partial
    case of EFuNN with no output MF

15
Dynamic Evolving Neuro-Fuzzy System (DENFIS) for
time-series prediction, identification and control
  • Modeling, prediction and knowledge discovery
    from dynamic time series
  • Publication: Kasabov, N. and Song, Q., "DENFIS:
    Dynamic Evolving Neural-Fuzzy Inference System
    and its Application for Time Series Prediction",
    IEEE Transactions on Fuzzy Systems, April 2002

16
ECOS are based on clustering of input vectors
  • An evolving clustering process using ECM with
    consecutive examples x1 to x9 in a 2D space
    (Fig 2.7)
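The evolving clustering step can be sketched as a one-pass procedure over consecutive examples. This is a simplification of ECM: it updates only cluster centres and a maximum radius threshold, while the full algorithm also grows each cluster's own radius.

```python
def ecm(samples, rmax=0.3):
    """One-pass evolving clustering (ECM-style sketch): each incoming
    sample either joins the nearest cluster (if within rmax) or
    starts a new cluster."""
    centers, counts = [], []
    for x, y in samples:
        if centers:
            d, i = min((((x - cx) ** 2 + (y - cy) ** 2) ** 0.5, i)
                       for i, (cx, cy) in enumerate(centers))
        else:
            d, i = float('inf'), -1
        if d <= rmax:
            cx, cy = centers[i]
            n = counts[i]
            # move the centre towards the new sample (running mean)
            centers[i] = ((cx * n + x) / (n + 1), (cy * n + y) / (n + 1))
            counts[i] = n + 1
        else:
            centers.append((x, y))   # create a new cluster
            counts.append(1)
    return centers

pts = [(0.1, 0.1), (0.15, 0.12), (0.9, 0.9), (0.88, 0.92)]
print(len(ecm(pts)))  # -> 2
```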

17
Knowledge-based learning and rule extraction in
ECOS (EFuNN)
  • Different types of rules (here spatial rules
    apply)
  • Rule insertion
  • Rule extraction
  • Example: gas-furnace data; 2 inputs: CO2(t-1)
    and Meth(t-4); 1 output: CO2(t); the rules
    extracted may be inserted into a new EFuNN
  • Fuzzy membership functions

18
Parameter adaptation in ECOS through evolutionary
methods
  • Many modules are evolved simultaneously on the
    same data through a GA method
  • A chromosome for the ECOS: input features, error
    threshold (Errthr), maxRadius, MF type, learning
    rate, rule nodes, learning mode
  • After every N examples the ECOS are evaluated
    and the best one is selected for further
    development
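A toy version of this evolutionary parameter search, with the chromosome reduced to two of the listed parameters (error threshold and maximum radius) and an invented fitness function standing in for training and evaluating an ECOS module.

```python
import random

def evolve_params(fitness, n_pop=20, n_gen=30, seed=0):
    """GA sketch: each individual encodes two ECOS parameters; after
    each generation the best quarter is kept (elitism) and mutated."""
    rng = random.Random(seed)
    pop = [(rng.uniform(0.01, 0.5), rng.uniform(0.05, 1.0))
           for _ in range(n_pop)]
    for _ in range(n_gen):
        pop.sort(key=fitness)                 # lower fitness = better
        elite = pop[:n_pop // 4]
        pop = elite + [(max(1e-3, e[0] + rng.gauss(0, 0.02)),
                        max(1e-3, e[1] + rng.gauss(0, 0.05)))
                       for e in elite for _ in range(3)]
    return min(pop, key=fitness)

# toy fitness: pretend the best model needs errthr=0.1, maxradius=0.3
best = evolve_params(lambda p: (p[0] - 0.1) ** 2 + (p[1] - 0.3) ** 2)
print(round(best[0], 2), round(best[1], 2))
```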

19
3. Data and model integration in ECOS
A case study of integrating a model M (formula)
and a data set D through an ECOS. (a) A 3D plot of
data D0 (data samples denoted as "o") generated
from a model M (formula)
y = 5.1x1 + 0.345x1^2 + 0.83x1 log10(x2) + 0.45x2
    + 0.57 exp(x2 + 0.2)
in the sub-space of the problem space defined by
x1 and x2 both having values between 0 and 0.7,
and new data D defined by x1 and x2 having values
between 0.7 and 1.
20
After integration, the system performs better on
the new data
  • The data clusters of D0 (the 7 clusters on the
    left, each defined by a cluster centre and a
    cluster area) and of the data D (the 2 upper
    right clusters) in the 2D input space of the
    input variables x1 and x2 from fig. 2a are
    formed in a DENFIS ECOS trained with the data
    D0tr (56 data samples randomly selected from D0)
    and then further trained with the data Dtr (25
    samples randomly selected from D). The following
    parameter values were used: Rmax = 0.15;
    triangular membership functions, denoted in the
    rules as left point, centre and right point for
    each variable
  • The test results of the initial model M (the
    dashed line) versus the new model Mnew (the
    dotted line) on the test data D0tst generated
    from M (the first 42 data samples) and on the
    new test data Dtst (the last 30 samples, the
    solid line). The new model Mnew performs well on
    both the old and the new test data, while the
    old model M fails on the new test data.

21
Prototype rules extracted from DENFIS and EFuNN
after model and data integration
 
  • Takagi-Sugeno fuzzy rules (DENFIS):
  • Rule 1: IF x1 is (-0.05, 0.05, 0.14) and x2 is
    (0.15, 0.25, 0.35) THEN y = 0.01 + 0.7x1 + 0.12x2
  • Rule 2: IF x1 is (0.02, 0.11, 0.21) and x2 is
    (0.45, 0.55, 0.65) THEN y = 0.03 + 0.67x1 + 0.09x2
  • Rule 3: IF x1 is (0.07, 0.17, 0.27) and x2 is
    (0.08, 0.18, 0.28) THEN y = 0.01 + 0.71x1 + 0.11x2
  • Rule 4: IF x1 is (0.26, 0.36, 0.46) and x2 is
    (0.44, 0.53, 0.63) THEN y = 0.03 + 0.68x1 + 0.07x2
  • Rule 5: IF x1 is (0.35, 0.45, 0.55) and x2 is
    (0.08, 0.18, 0.28) THEN y = 0.02 + 0.73x1 + 0.06x2
  • Rule 6: IF x1 is (0.52, 0.62, 0.72) and x2 is
    (0.45, 0.55, 0.65) THEN y = -0.21 + 0.95x1 + 0.28x2
  • Rule 7: IF x1 is (0.60, 0.69, 0.79) and x2 is
    (0.10, 0.20, 0.30) THEN y = 0.01 + 0.75x1 + 0.03x2
  • New rules:
  • Rule 8: IF x1 is (0.65, 0.75, 0.85) and x2 is
    (0.70, 0.80, 0.90) THEN y = -0.22 + 0.75x1 + 0.51x2
  • Rule 9: IF x1 is (0.86, 0.95, 1.05) and x2 is
    (0.71, 0.81, 0.91) THEN y = 0.03 + 0.59x1 + 0.37x2
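Rules of this kind can be evaluated directly. The sketch below re-codes the first two rules and applies standard Takagi-Sugeno inference: triangular antecedent MFs given as (left, centre, right) points, firing strengths combined by product, and a weighted average of the linear consequents.

```python
def tri(x, a, b, c):
    """Triangular MF with feet a, c and peak b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

# First two extracted rules: (mf for x1, mf for x2), linear consequent
RULES = [
    (((-0.05, 0.05, 0.14), (0.15, 0.25, 0.35)),
     lambda x1, x2: 0.01 + 0.7 * x1 + 0.12 * x2),
    (((0.02, 0.11, 0.21), (0.45, 0.55, 0.65)),
     lambda x1, x2: 0.03 + 0.67 * x1 + 0.09 * x2),
]

def ts_infer(x1, x2):
    """Takagi-Sugeno inference: weighted average of rule consequents."""
    num = den = 0.0
    for (mf1, mf2), f in RULES:
        w = tri(x1, *mf1) * tri(x2, *mf2)   # firing strength
        num += w * f(x1, x2)
        den += w
    return num / den if den else None

print(round(ts_infer(0.05, 0.25), 3))   # only Rule 1 fires fully
```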

 
22
Regression-based KBNN
  • f1, f2, f3 and f4 are existing formulas used for
    GFR prediction
  • An MLP-type NN structure has these functions
    wired in, and also adapts them to new data
  • Input: x = (x1, x2, ..., x8)
  • Output: G = w0 + w1 f1 + w2 f2 + w3 f3 + w4 f4
  • Training: modify each formula incrementally
  • Experiments with GFR data:
  • Model   AAE   RMSE
  • MDRD    5.88  7.74
  • GFRNN   5.08  6.84
  • An improved (adapted) analytical representation
    is learned
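A sketch of the regression-KBNN idea: fixed "wired-in" formulas whose mixing weights are adapted to data, here by stochastic gradient descent. The formulas and data below are placeholders, not the real clinical GFR equations.

```python
# Stand-ins for the wired-in formulas f1..f4 (NOT the real GFR formulas)
f = [lambda x: x[0],
     lambda x: x[1] ** 0.5,
     lambda x: x[0] * x[1],
     lambda x: 1.0]

def train(data, lr=0.05, epochs=500):
    """Learn w0..w4 in G = w0 + sum(wi * fi(x)) by SGD on squared error."""
    w = [0.0] * 5
    for _ in range(epochs):
        for x, y in data:
            feats = [1.0] + [fi(x) for fi in f]
            err = sum(wi * fv for wi, fv in zip(w, feats)) - y
            w = [wi - lr * err * fv for wi, fv in zip(w, feats)]
    return w

# synthetic data where the true relation is y = 2*x0 + 0.5
data = [((a / 10, b / 10), 2 * (a / 10) + 0.5)
        for a in range(10) for b in range(10)]
w = train(data)

def pred(x):
    feats = [1.0] + [fi(x) for fi in f]
    return sum(wi * fv for wi, fv in zip(w, feats))

print(round(pred((0.5, 0.5)), 2))   # close to the target 1.5
```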

23
4. Promoter recognition and splice junction
identification
  • Promoter: DNA sequences in a gene that promote
    RNA synthesis by controlling:
  • 1) the nucleotide at which transcription
    initiates (+1 in the RNA)
  • 2) the frequency of transcription initiation -
    the promoter strength
  • The promoter defines the region where
    transcription will begin.

24
Promoter Recognition
  • During transcription from DNA to RNA, the
    operator (the transcribed region) consists of
    three types of segments:
  • Intron
  • Exon
  • UTR (untranslated region)
  • Promoter recognition means distinguishing the
    promoter region from introns, exons and UTRs.

25
Identify intron/exon splice junction
  • http://divcom.otago.ac.nz/infosci/kel/CBIIS/genin.html
  • EXTRACTION OF RULES
  • Rule 1: if ----------------------------AGGT-AG---
    ----------------------
    then EI (exon/intron junction)
  • Rule 8: if ------------------T------T-CAG--------
    ----------------------
    then IE (intron/exon junction)
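The extracted rules are positional patterns over a fixed-length window, with "-" acting as a wildcard. A matcher for them is a few lines; the sequence window below is invented purely to exercise it.

```python
def match(rule, seq):
    """Match an extracted splice-junction rule against a sequence
    window: '-' in the rule is a wildcard, letters must match."""
    return len(rule) == len(seq) and all(
        r == '-' or r == s for r, s in zip(rule, seq))

# Rule-1-style pattern around a candidate EI junction (window invented)
rule_ei = "----AGGT-AG-----"
window  = "CTGAAGGTAAGCCTTA"
print(match(rule_ei, window))  # -> True
```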

26
5. Gene Expression Data Analysis and Disease
Profiling
  • DNA analysis: large databases; data is always
    being added and modified; different sources of
    information
  • Marker and drug discovery

27
Gene expression data analysis, modelling and
knowledge discovery
  • Goal: identify a gene or a group of genes
    associated with the state of the cell (tissue),
    e.g. cancer.
  • A large number of genes (approx. 30,000) are
    expressed in a microarray (in vitro) from a
    single tissue.
  • It is difficult to find consistent patterns of
    gene expression for a class of tissue
  • After all, a microarray is just a snapshot of a
    few microseconds of what is happening in the
    cell
  • Genes interact - how do we find out about that?
  • Growing number of examples and growing
    complexity.
  • PEBL (www.pebl.co.nz, or www.peblnz.com)

28
Case study Outcome prognostic system for
treatment of DLBCL based on gene expression data
  • [1] Shipp, M. et al., "Diffuse large B-cell
    lymphoma outcome prediction by gene-expression
    profiling and supervised machine learning",
    Nature Medicine, vol. 8, no. 1, January 2002,
    68-74
  • [2] Alizadeh, A. et al., "Distinct types of
    diffuse large B-cell lymphoma identified by
    gene-expression profiling", Nature, vol. 403,
    February 2000, 503-511
  • Shipp's model: for each example i = 1..58 do:
    (1) take the example out; (2) select a set of
    genes through the signal-to-noise-ratio method;
    (3) train a model; (4) test the model on the
    left-out example; end. On average, 72% accuracy
    was achieved.
  • A classification/prognosis model was developed
    based on EFuNNs with 11 pre-selected inputs (the
    expression of the 11 genes selected by Shipp et
    al.) and two outputs (cured or fatal). The model
    was trained and tested with the leave-one-out
    method on the 58 data examples: 90% prognostic
    accuracy (88% for class cured and 92% for class
    fatal) (a demo)
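The leave-one-out protocol used in both studies can be sketched generically: each example is held out once, a model is trained on the rest and tested on the held-out example. The majority-class "model" below is only a stand-in for the gene-selection and EFuNN training steps.

```python
def leave_one_out(data, train_fn):
    """Hold out each of the n examples once; train on the remaining
    n-1 and test on the held-out one; return the fraction correct."""
    correct = 0
    for i in range(len(data)):
        held_x, held_y = data[i]
        model = train_fn(data[:i] + data[i + 1:])
        correct += (model(held_x) == held_y)
    return correct / len(data)

def train_majority(train):
    """Toy stand-in model: always predict the majority class."""
    labels = [y for _, y in train]
    maj = max(set(labels), key=labels.count)
    return lambda x: maj

data = ([(i, 'cured') for i in range(6)] +
        [(i, 'fatal') for i in range(3)])
print(round(leave_one_out(data, train_majority), 2))  # 6 of 9 correct
```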

29
Fuzzy representation of gene expression data in
ECOS
30
Prognostic Profiling Through Rule Extraction from
ECOS
  • IF X1 is (2, 0.84) and
    X2 is (2, 0.81) and
    X3 is (1, 0.91) and
    X4 is (1, 0.91) and
    X5 is (3, 0.91) and
    X6 is (1, 0.89) and
    X7 is () and
    X8 is (1, 0.91) and
    X9 is (1, 0.91) and
    X10 is (3, 0.87) and
    X11 is (1, 0.86)
    THEN Class is 1 - Fatal
  • IF X1 is (2, 0.60) and
    X2 is (1, 0.73) and
    X3 is (1, 0.91) and
    X4 is (3, 0.91) and
    X5 is (1, 0.64) and
    X6 is () and
    X7 is (2, 0.65) and
    X8 is (2, 0.90) and
    X9 is (1, 0.91) and
    X10 is (1, 0.62) and
    X11 is (1, 0.91)
    THEN Class is 2 - Cured
  • where X1, ..., X11 are known genes, and the
    linguistic labels are 1 - small, 2 - medium,
    3 - large
31
Dynamic modeling and knowledge discovery from
gene expression data - 14 types of cancer
  • A continuous flow of data
  • An adaptive "mother" model is created and
    updated over time: new data, new genes, new
    classes
  • At any time, an optimal simple model can be
    extracted and analysed
  • Rules are extracted and genes are analysed
  • Example: Ramaswamy's data (PNAS, January 2002)
    on 14 types of cancer

32
A specialized gene expression profiling SIFTWARE:
www.peblnz.com, www.kedri.info
33
6. Gene regulatory networks: what do we make of
the time-course gene expression data?
  • Case study:
  • Leukemia cell line U937 (experiments done at
    the NCI, NIH, Frederick, USA; Dr Dimitrov's lab)
  • Two different clones of the same cell line
    treated with retinoic acid
  • 12,680 genes expressed over time points:
  • 3 time points (the MINUS clone; the cell died)
    and
  • 6 time points (the PLUS clone; cancer)

34
The Goal Discover Gene Regulation Networks and
Gene State Transitions
35
Genetic networks and reverse engineering
  • GNs describe the regulatory interactions
    between genes
  • DNA transcription, RNA translation and protein
    folding and binding are all part of the process
    of gene regulation
  • Here we use only RNA gene expression data
  • Reverse engineering: from gene expression data
    to a GN
  • It is assumed that gene expression data reflect
    the underlying genetic regulatory network
  • Genes co-expressed over time: either one
    regulates the other, or both are regulated by
    the same other genes
  • Problems:
  • What is the time unit?
  • Appropriate data and a validation procedure are
    needed
  • Data is usually insufficient; special methods
    are needed
  • Correct interpretation of the models may
    generate new biological knowledge

36
Methods for reverse engineering and GRN modelling
  • Detecting gene relations from MEDLINE abstracts
  • Analytical modeling
  • ODE
  • Statistical pre-processing techniques.
    Correlation analysis.
  • Cluster analysis
  • Evolutionary computation, e.g. GA
  • Connectionist techniques (neural networks)
  • Evolving fuzzy connectionist techniques
  • Mixture of the above techniques

37
Cluster analysis of time course gene expression
data reduces the variables in the GRN
  • Genes that share similar functions usually show
    similar gene expression profiles and cluster
    together
  • Different clustering techniques:
  • Exact clusters vs fuzzy clusters
  • Pre-defined number of clusters vs evolving
  • Batch vs on-line
  • Using different similarity or correlation
    measures
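A greedy sketch of correlation-based clustering of time-course profiles, assuming Pearson correlation as the similarity measure and an invented threshold; each resulting cluster then replaces its member genes as a single GRN variable.

```python
def pearson(a, b):
    """Pearson correlation between two equal-length profiles."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    va = sum((x - ma) ** 2 for x in a) ** 0.5
    vb = sum((y - mb) ** 2 for y in b) ** 0.5
    return cov / (va * vb)

def cluster_profiles(profiles, thresh=0.9):
    """Greedy clustering: a gene joins the first cluster whose seed
    profile it correlates with above `thresh`, else starts a new one."""
    clusters = []
    for name, prof in profiles.items():
        for c in clusters:
            if pearson(prof, profiles[c[0]]) >= thresh:
                c.append(name)
                break
        else:
            clusters.append([name])
    return clusters

profiles = {"g1": [0.1, 0.5, 0.9],   # rising
            "g2": [0.2, 0.6, 1.0],   # rising (co-expressed with g1)
            "g3": [0.9, 0.5, 0.1]}   # falling
print(len(cluster_profiles(profiles)))  # -> 2
```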


38
Evolving fuzzy neural networks for GN modeling
(Kasabov and Dimitrov, ICONIP, 2002)
  • G(t) -> EFuNN -> G(t+dt)
  • On-line, incremental learning of a GN
  • Adding new inputs/outputs (new genes)
  • The rule nodes capture clusters of input genes
    that are related to the output genes
  • Rules can be extracted that explain the
    relationship between G(t) and G(t+dt), e.g.:
    IF g13(t) is High (0.87) and g23(t) is
    Low (0.9)
    THEN g87(t+dt) is High (0.6) and
    g103(t+dt) is Low
  • Varying the threshold gives stronger or weaker
    patterns of relationship

39
Rules extracted are turned into state transition
graphs
PLUS cell line
MINUS cell line
40
Using DENFIS (Dynamic Evolving Neuro-Fuzzy
Inference System) for GN modeling (IEEE Trans.
FS, April 2002)
  • DENFIS vs EFuNN
  • G(t) -> gj(t+dt)
  • Dynamic partitioning of the input space
  • Takagi-Sugeno fuzzy rules, e.g.:
  • If X1 is (0.63, 0.70, 0.76) and
  • X2 is (0.71, 0.77, 0.84) and
  • X3 is (0.71, 0.77, 0.84) and
  • X4 is (0.59, 0.66, 0.72)
  • then Y = 1.84 - 1.26X1 - 1.22X2
  • + 0.58X3 - 0.03X4

41
Using a GRN model to predict the expression of
genes at a future time
42
7. Proteins and protein structure prediction
  • The mRNA is translated into proteins
  • A protein is a sequence of amino acids, each of
    them encoded by a group of 3 nucleotides (a
    codon)
  • 20 amino acids altogether (A, C-H, I, K-N, P-T,
    V, W, Y)
  • Initiation and stop codons
  • Proteins have complex structures:
  • Primary (linear),
  • Secondary (3D, defining functionality),
  • Tertiary (energy-minimisation packing),
  • Quaternary (interaction between molecules)
  • The Protein Data Bank, www.rcsb.org - 100,000
    hits a day on average


43
Case study: Protein secondary structure prediction
  • Predicting the secondary structure from the
    primary one
  • Segments of a protein can have different
    shapes:
  • Helix
  • Sheet
  • Coil (loop)
  • An ANN is trained on existing data (T. Sejnowski
    et al.) to predict the shape of an arbitrary new
    segment (a window of 13 amino acids)
  • 273 inputs, 3 outputs, 18,000 examples for
    training
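The 273 inputs follow from the window encoding: 13 residues x 21 symbols (the 20 amino acids plus one symbol for a gap at sequence ends). A sketch of that encoding, assuming simple one-hot coding per position:

```python
AMINO = "ACDEFGHIKLMNPQRSTVWY"   # the 20 amino acids from the slide

def encode_window(window):
    """One-hot encode a 13-residue window: 13 positions x 21 symbols
    (20 amino acids + 1 gap/unknown) = 273 network inputs."""
    assert len(window) == 13
    vec = []
    for aa in window:
        one_hot = [0] * 21
        one_hot[AMINO.index(aa) if aa in AMINO else 20] = 1  # 20 = gap
        vec.extend(one_hot)
    return vec

v = encode_window("-ACDEFGHIKLMN")   # '-' pads the sequence start
print(len(v), sum(v))                # -> 273 13
```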

44
8. Medical decision support systems
  • A large amount of data is available in clinical
    practice
  • The need for intelligent decision support
    systems - a market demand
  • Web-based learning and decision support systems
  • Palmtops can be used to download and run an
    updated decision support system
  • Examples: cardiovascular risk analysis; trauma
    data analysis and prognosis; sarcoma prognostic
    systems
  • R. Walton, Q. Song, N. Kasabov, Information
    Sciences, submitted, 2003

45
Adaptive Renal Function Evaluation System
GFR-ECOS (Song, Ma, Marshal, Kasabov, Lancet,
2003, submitted)
46
9. Future directions
  • Problems from biology -> new methods for CI
  • Neuro-genetic modelling
  • Integrating biological data, CI, and ontology
    systems on the WWW (the cyber space), i.e. the
    integration of micro- and macro-world
    information processing
  • Personalised drug design and personalised
    medicine
  • Embedding multi-agent CI systems into a
    biological environment

47
Ontology Builders and Computational Intelligence
48
10. References
  • T. Akutsu, S. Miyano, and S. Kuhara,
    "Identification of genetic networks from a small
    number of gene expression patterns under the
    Boolean network model", Pacific Symposium on
    Biocomputing, vol. 4, pp. 17-28, 1999.
  • S. Ando, E. Sakamoto, and H. Iba, "Evolutionary
    Modelling and Inference of Genetic Network",
    Proceedings of the 6th Joint Conference on
    Information Sciences, March 8-12, pp. 1249-1256,
    2002.
  • P. D'Haeseleer, S. Liang, and R. Somogyi,
    "Genetic network inference: from co-expression
    clustering to reverse engineering",
    Bioinformatics, vol. 16, no. 8, pp. 707-726,
    2000.
  • N. Kasabov, Evolving Connectionist Systems:
    Methods and Applications in Bioinformatics,
    Brain Study and Intelligent Machines, Springer
    Verlag, 2002.
  • S. Kauffman, "The large scale structure and
    dynamics of gene control circuits: an ensemble
    approach", Journal of Theoretical Biology, vol.
    44, pp. 167-190, 1974.
  • J. R. Koza, W. Mydlowec, G. Lanza, J. Yu, and
    M. A. Keane, "Reverse Engineering of Metabolic
    Pathways from Observed Data using Genetic
    Programming", Pacific Symposium on Biocomputing,
    vol. 6, pp. 434-445, 2001.
  • A. Lindlof and B. Olsson, "Could
    Correlation-based Methods be used to Derive
    Genetic Association Networks?", Proceedings of
    the 6th Joint Conference on Information
    Sciences, March 8-12, pp. 1237-1242, 2002.
  • S. Liang, S. Fuhrman, and R. Somogyi, "REVEAL: A
    general reverse engineering algorithm for
    inference of genetic network architectures",
    Pacific Symposium on Biocomputing, vol. 3,
    pp. 18-29, 1998.
  • A. Mimura and H. Iba, "Inference of a Gene
    Regulatory Network by Means of Interactive
    Evolutionary Computing", Proceedings of the 6th
    Joint Conference on Information Sciences, March
    8-12, pp. 1243-1248, 2002.
  • P. A. Pevzner, Computational Molecular Biology:
    An Algorithmic Approach, MIT Press, 2000.
  • R. Somogyi, S. Fuhrman, and X. Wen, "Genetic
    network inference in computational models and
    applications to large-scale gene expression
    data", in Computational Modeling of Genetic and
    Biochemical Networks, J. Bower and H. Bolouri
    (eds.), MIT Press, pp. 119-157, 1999.
  • Z. Szallasi, "Genetic Network Analysis in Light
    of Massively Parallel Biological Data
    Acquisition", Pacific Symposium on Biocomputing,
    vol. 4, pp. 5-16, 1999.
  • J. Vohradsky, "Neural network model of gene
    expression", The FASEB Journal, vol. 15, March
    2001, pp. 846-854.

49
Future events
  • NZ Summer School of Bioinformatics, 9-12
    February 2004, Auckland, NZ (www.kedri.info)
  • ISMB 2004, August 2004, Glasgow
  • IJCNN 2004, 26-29 July 2004, Budapest
  • ICONIP 2004, 22-26 November 2004, Calcutta,
    India
  • NCEI'04, Neuro-Computing and Evolving
    Intelligence, 12-15 December 2004, Auckland, NZ

50
Acknowledgement - The Knowledge Engineering and
Discovery Research Institute (KEDRI), AUT, NZ,
http://www.kedri.info
  • Established March-June 2002.
  • Funded by AUT, NERF (FRST), and NZ industry.
  • Approx. NZ$0.7 million p.a.
  • 25 research staff and graduate students; 25
    associated researchers
  • Both fundamental and applied research (theory
    and practice)
  • 55 publications in 2002-03
  • 450 publications all together
  • Multicultural environment (12 ethnic origins)
  • Strong national and international collaboration:
  • NCI, NIH, Frederick, USA (Dr Dimitrov)
  • Japan: KIT, Ritsumeikan
  • UC Berkeley, U Cambridge (UK)
  • Singapore: NTU, InfoCom, EI-Asia Ltd
    (www.ei-asia.net)
  • PEBL (www.pebl.co.nz)
  • Many others