Transcript and Presenter's Notes

Title: Implementing Cognitive Radio


1
Implementing Cognitive Radio
  • How does a radio become cognitive?

2
Presentation Overview
  • Architectural Approaches
  • Observing the Environment
  • Autonomous Sensing
  • Collaborative Sensing
  • Radio Environment Maps and Observation Databases
  • Recognizing Patterns
  • Neural Nets
  • Hidden Markov Models
  • Making Decisions
  • Common Heuristic Approaches
  • Case-based Reasoning
  • Representing Information
  • A Case Study

3
Architectural Overview
  • What are the components of a cognitive radio and
    how do they relate to each other?

4
Strong Artificial Intelligence
  • Concept: Make a machine aware (conscious) of its
    environment and self-aware
  • A complete failure (probably a good thing)

5
Weak Artificial Intelligence
  • Concept: Develop powerful (but limited)
    algorithms that intelligently respond to sensory
    stimuli
  • Applications
  • Machine Translation
  • Voice Recognition
  • Intrusion Detection
  • Computer Vision
  • Music Composition

6
Implementation Classes
  • Weak cognitive radio
  • Radio's adaptations determined by hard-coded
    algorithms and informed by observations
  • Many may not consider this to be cognitive (see
    discussion related to Fig 6 in the 1900.1 draft)
  • Strong cognitive radio
  • Radio's adaptations determined by conscious
    reasoning
  • Closest approximation is the ontology-reasoning
    cognitive radios
  • In general, strong cognitive radios have the
    potential to achieve both much better and much
    worse behavior in a network

7
Weak/Procedural Cognitive Radios
  • Radio's adaptations determined by hard-coded
    algorithms and informed by observations
  • Many may not consider this to be cognitive (see
    discussion related to Fig 6 in the 1900.1 draft)
  • A function of the fuzzy definition
  • Implementations
  • CWT Genetic Algorithm Radio
  • MPRG Neural Net Radio
  • Multi-dimensional hill climbing (DoD LTS, Clancy)
  • Genetic Algorithm (Grambling)
  • Simulated Annealing/GA (Twente University)
  • Existing RRM Algorithms?

8
Strong Cognitive Radios
  • Radio's adaptations determined by some reasoning
    engine which is guided by its ontological
    knowledge base (which is informed by
    observations)
  • Proposed Implementations
  • CR One: model-based reasoning (Mitola)
  • Prolog reasoning engine (Kokar)
  • Policy reasoning (DARPA xG)

9
DFS in 802.16h
  • Drafts of 802.16h defined a generic DFS algorithm
    which implements observation, decision, action,
    and learning processes
  • Very simple implementation

[Flowchart: in-service monitoring of the operating channel
(observation); channel availability check on the next channel; on
detection, stop transmission and select and change to a new available
channel within a defined time, with a maximum transmission time
(decision, action); start a channel-exclusion timer and log channel
availability, keeping the channel unavailable for the channel-exclusion
time (learning); background in-service monitoring on non-operational
channels.]

Modified from Figure h1, IEEE 802.16h-06/010, "Draft IEEE Standard for
Local and metropolitan area networks Part 16: Air Interface for Fixed
Broadband Wireless Access Systems, Amendment for Improved Coexistence
Mechanisms for License-Exempt Operation," 2006-03-29
10
Example Architecture from CWT
[Diagram: cognitive engine processes (observation, orientation,
decision, action, learning) supported by models.]
11
Architecture Summary
  • Two basic approaches
  • Implement a specific algorithm or specific
    collection of algorithms which provide the
    cognitive capabilities
  • Specific Algorithms
  • Implement a framework which permits algorithms to
    be changed based on needs
  • Cognitive engine
  • Both implement the following processes
  • Observation, Decision, Action
  • Either approach could implement
  • Learning, Orientation
  • Negotiation, policy engines, models
  • Process boundaries may blur based on the
    implementation
  • Signal classification could be orientation or
    observation
  • Some processes are very complementary
  • Orientation and learning
  • Some processes make most intuitive sense with
    specific instantiations
  • Learning and case-based reasoning

12
Observations
  • How does the radio find out about its environment?

13
The Cognitive Radio and its Environment
14
Signal Detection
  • The optimal detection technique is the matched filter
  • While sometimes useful, the matched filter may not be
    practical for cognitive radio applications, as the
    signals may not be known
  • Frequency-domain analysis often required
  • Periodogram
  • Fourier transform of the autocorrelation function of
    the received signal
  • More commonly implemented as the magnitude squared of
    the FFT of the signal (see the sketch below)
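A minimal sketch of both estimators in Python (assuming NumPy; function
names and parameters are illustrative):

```python
import numpy as np

def periodogram(x, fs=1.0):
    """Periodogram: magnitude squared of the FFT of the signal."""
    N = len(x)
    return np.abs(np.fft.fft(x)) ** 2 / (N * fs)

def welch(x, seg_len=256, fs=1.0):
    """Welch's method: average windowed periodograms of 50%-overlapping
    segments; averaging reduces variance at the cost of resolution."""
    step = seg_len // 2
    window = np.hanning(seg_len)
    scale = np.sum(window ** 2) * fs
    segments = [x[i:i + seg_len] for i in range(0, len(x) - seg_len + 1, step)]
    return np.mean(
        [np.abs(np.fft.fft(window * s)) ** 2 / scale for s in segments], axis=0)
```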

15
Comments on Periodogram
  • Spectral leakage can mask weak signals
  • Resolution is a function of the number of data points
  • Significant variance in samples
  • Can be improved by averaging, e.g., Bartlett,
    Welch
  • Less resolution for the complexity
  • Significant bias in estimations (due to finite
    length)
  • Can be improved by windowing the autocorrelation,
    e.g., Blackman-Tukey

Estimator              Quality Factor
Periodogram            1
Bartlett               1.11 N Δf
Welch (50% overlap)    1.39 N Δf
Blackman-Tukey         2.34 N Δf
16
Other Detection Techniques
  • Nonparametric
  • Goertzel: evaluates the Fourier transform for a
    small band of frequencies
  • Parametric approaches
  • Need some general characterization (perhaps as
    general as a sum of sinusoids)
  • Yule-Walker (Autoregressive)
  • Burg (Autoregressive)
  • Eigenanalysis
  • Pisarenko Harmonic Decomposition
  • MUSIC
  • ESPRIT

17
Sub-Noise-Floor Detection
  • Detecting narrowband signals with negative SNRs
    is actually easy and can be performed with the
    preceding techniques
  • The problem arises when the signal PSD is close to or
    below the noise floor
  • Pointers to techniques
  • (White noise) C. L. Nikias and J. M. Mendel,
    "Signal processing with higher-order spectrum,"
    Signal Processing, July 1993
  • (Works with colored noise and time-varying
    frequencies) K. Hock, "Narrowband Weak Signal
    Detection by Higher Order Spectrum," Signal
    Processing, April 1996
  • C. T. Zhou, C. Ting, "Detection of weak signals
    hidden beneath the noise floor with a modified
    principal components analysis," AS-SPCC 2000, pp.
    236-240

18
Signal Classification
  • Detection and frequency identification alone are
    often insufficient, as different policies are
    applied to different signals
  • Radar vs. 802.11 in 802.11h, y
  • TV vs. 802.22
  • However, we would prefer not to have to implement
    processing to recover every possible signal
  • Spectral correlation permits feature extraction
    for classification

19
Cyclic Autocorrelation
  • Cyclic autocorrelation (see the formula below)
  • Quick terminology
  • Purely stationary
  • Purely cyclostationary
  • Exhibiting cyclostationarity
  • Meaning: periods of cyclostationarity correspond to
  • Carrier frequencies, pulse rates, spreading code
    repetition rates, frame rates
  • Classify by the periods exhibited in R
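For reference, the standard (Gardner) definition of the cyclic
autocorrelation at cycle frequency α is:

$$ R_x^{\alpha}(\tau) = \lim_{T \to \infty} \frac{1}{T}
\int_{-T/2}^{T/2} x\!\left(t + \frac{\tau}{2}\right)
x^{*}\!\left(t - \frac{\tau}{2}\right) e^{-j 2 \pi \alpha t}\, dt $$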

20
Spectral Correlation
  • Estimation of the Spectral Correlation Density (SCD);
    see the estimator below
  • For α = 0, the estimator reduces to the periodogram
    and, in the limit, the PSD
  • The SCD is equivalent to the Fourier transform of the
    cyclic autocorrelation
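For reference, the SCD and its cyclic-periodogram estimate over an
observation of length T take the standard forms:

$$ S_x^{\alpha}(f) = \int_{-\infty}^{\infty} R_x^{\alpha}(\tau)\,
e^{-j 2 \pi f \tau}\, d\tau,
\qquad
\hat{S}_x^{\alpha}(f) = \frac{1}{T}\, X_T\!\left(f + \frac{\alpha}{2}\right)
X_T^{*}\!\left(f - \frac{\alpha}{2}\right) $$

where the estimate must be smoothed in time or frequency to converge.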

21
Spectral Coherence Function
  • Spectral Coherence Function (see the formula below)
  • Normalized, i.e., magnitude lies between 0 and 1
  • Terminology
  • α: cycle frequency
  • f: spectrum frequency
  • Utility: peaks of C correspond to the underlying
    periodicities of the signal that may be obscured
    in the PSD
  • Like the periodogram, variance is reduced by averaging
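For reference, the standard normalized form is:

$$ C_x^{\alpha}(f) = \frac{S_x^{\alpha}(f)}
{\left[ S_x^{0}\!\left(f + \frac{\alpha}{2}\right)
S_x^{0}\!\left(f - \frac{\alpha}{2}\right) \right]^{1/2}} $$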

22
Practical Implementation of the Spectral Coherence Function

From Figure 4.1 in I. Akbar, "Statistical Analysis of Wireless Systems
Using Markov Models," PhD Dissertation, Virginia Tech, January 2007
23
Example Magnitude Plots
[SCF magnitude plots for BPSK, DSB-SC AM, FSK, and MSK.]
24
α-Profile
  • α-profile of the SCF: for each cycle frequency α,
    keep the peak SCF magnitude over spectral frequency f
  • Reduces data set size, but captures most
    periodicities (see the sketch after the plots below)

[α-profiles for BPSK, DSB-SC AM, MSK, and FSK.]
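A rough Python sketch of the idea (illustrative only: cycle frequencies
are taken as integer FFT-bin offsets, the smoothing is a simple moving
average, and practical systems use the FFT accumulation method instead):

```python
import numpy as np

def alpha_profile(x, alphas, nfft=256, smooth=8):
    """For each cycle-frequency bin offset a, correlate spectral
    components separated by a, smooth over frequency, and keep the
    peak magnitude (the alpha-profile point for that a)."""
    X = np.fft.fft(x[:nfft])
    kernel = np.ones(smooth) / smooth
    profile = []
    for a in alphas:                         # alpha as an FFT-bin offset
        prod = X * np.conj(np.roll(X, -a))   # ~ X(f) X*(f + a)
        scd = np.convolve(prod, kernel, mode="same")  # frequency smoothing
        profile.append(np.abs(scd).max())    # peak over f
    return np.array(profile)
```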
25
Combination of Signals
MSK
BPSK
BPSK MSK
26
Impact of Signal Strength
27
Resolution
  • High α resolution may be needed to capture the
    feature space
  • High computational burden
  • Lower resolution possible if there are expected
    features
  • Legacy radios should be predictable
  • CR may not be predictable
  • Also implies an LPI strategy

[SCF plots for BPSK at 200x200 and at 100x100 resolution, and for AM.]

Plots from A. Fehske, J. Gaeddert, J. Reed, "A new approach to signal
classification using spectral correlation and neural networks," DySPAN
2005, pp. 144-150
28
Additional comments on Spectral Correlation
  • Even though PSDs may overlap, the spectral
    correlation functions for many signals are quite
    distinct, e.g., BPSK, QPSK, AM, PAM
  • Uncorrelated noise is theoretically zeroed in the
    SCF
  • Technique for sub-noise-floor detection
  • Permits extraction of information in addition to
    classification
  • Phase, frequency, timing
  • Higher-order techniques sometimes required
  • Some signals will not be very distinct, e.g.,
    QPSK, QAM, PSK
  • Some signals do not exhibit the requisite
    second-order periodicity

29
Collaborative Observation
  • Possible to combine estimations
  • Reduces variance; improves P_D vs. P_FA
    (detection vs. false-alarm probability)
  • Should be able to improve resolution
  • Proposed for use in 802.22
  • Partition cell into disjoint regions
  • CPE feeds back what it finds
  • Number of incumbents
  • Occupied bands

Source: IEEE 802.22-06/0048r0
30
More Expansive Collaboration Radio Environment
Map (REM)
  • Integrated database consisting of multi-domain
    information, which supports global cross-layer
    optimization by enabling CR to look through
    various layers
  • Conceptually, all the information a radio might
    need to make its decisions
  • Shared observations, reported actions, learned
    techniques
  • Significant overhead to set up, but simplifies a
    lot of applications
  • Conceptually, not just cognitive radio but
    omniscient radio

From Y. Zhao, J. Gaeddert, K. Bae, J. Reed, "Radio Environment Map
Enabled Situation-Aware Cognitive Radio Learning Algorithms," SDR Forum
Technical Conference 2006
31
Example Application
  • Overlay network of secondary users (SU) free to
    adapt power, transmit time, and channel
  • Without REM
  • Decisions solely based on link SINR
  • With REM
  • Radios effectively know everything

Upshot: a little gain for the secondary users,
a big gain for the primary users

From Y. Zhao, J. Gaeddert, K. Bae, J. Reed, "Radio Environment Map
Enabled Situation-Aware Cognitive Radio Learning Algorithms," SDR Forum
Technical Conference 2006
32
Observation Summary
  • Numerous sources of information available
  • Tradeoff between collection time and spectral
    resolution
  • Finite run-length introduces bias
  • Can be managed with windowing
  • Averaging reduces variance in estimations
  • Several techniques exist for negative-SNR
    detection and classification
  • Cyclostationarity analysis yields hidden
    features related to periodic signal components
    (such as baud rate and frame rate) that vary by
    modulation type
  • Collaboration improves detection and
    classification
  • REM is the logical extreme of collaborative
    observation

33
Pattern Recognition
  • Hidden Markov Models, Neural Networks,
    Ontological Reasoning

34
Hidden Markov Model (HMM)
  • A model of a system which behaves like a Markov
    chain except we cannot directly observe the
    states, transition probabilities, or initial
    state
  • Instead we only observe random variables with
    distributions that vary by the hidden state
  • To build an HMM, must estimate
  • Number of states
  • State transition probabilities
  • Initial state distribution
  • Observations available for each state
  • Probability of each observation for each state
  • A model can be built from observations using the
    Baum-Welch algorithm
  • With a specified model, the likelihood of an output
    sequence can be evaluated using the forward-backward
    algorithm
  • With a specified model, a sequence of states can
    be estimated from observations using the Viterbi
    algorithm (sketched below)
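A minimal Viterbi sketch in Python (assuming NumPy; names are
illustrative):

```python
import numpy as np

def viterbi(obs, pi, A, B):
    """Most likely hidden-state path for an observation sequence.

    obs: sequence of observation indices
    pi:  (N,)   initial state distribution
    A:   (N,N)  transition probabilities, A[i,j] = P(j | i)
    B:   (N,M)  emission probabilities, B[i,k] = P(obs k | state i)
    """
    N, T = len(pi), len(obs)
    logd = np.log(pi) + np.log(B[:, obs[0]])    # log-probs avoid underflow
    back = np.zeros((T, N), dtype=int)
    for t in range(1, T):
        trans = logd[:, None] + np.log(A)       # trans[i,j]: path into j via i
        back[t] = trans.argmax(axis=0)
        logd = trans.max(axis=0) + np.log(B[:, obs[t]])
    path = [int(logd.argmax())]                 # backtrack from best end state
    for t in range(T - 1, 0, -1):
        path.append(int(back[t, path[-1]]))
    return path[::-1], float(logd.max())
```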

35
Example
  • A hidden machine selects balls from an unknown
    number of bins.
  • Bin selection is driven by a Markov chain.
  • You can only observe the sequence of balls
    delivered to you and want to be able to predict
    future deliveries

[Diagram: hidden states (bins) generating the observed sequence of balls.]
36
HMM for Classification
  • Suppose several different HMMs have been
    calculated with Baum-Welch for different
    processes
  • A sequence of observations could then be
    classified as being most like one of the
    different models (see the sketch below)
  • Techniques
  • Apply Viterbi to find the most likely sequence of
    state transitions through each HMM and classify
    as the one with the smallest residual error
  • Build a new HMM based on the observations and
    apply an approximation of the Kullback-Leibler
    divergence to measure the distance between the new
    and existing HMMs. See M. Mohammed, "Cellular
    Diagnostic Systems Using Hidden Markov Models,"
    PhD Dissertation, Virginia Tech, October 2006
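A sketch of likelihood-based classification using the scaled forward
algorithm (model names and structure are illustrative):

```python
import numpy as np

def log_likelihood(obs, pi, A, B):
    """Scaled forward algorithm: log P(obs | HMM(pi, A, B))."""
    alpha = pi * B[:, obs[0]]
    logp = np.log(alpha.sum())
    alpha /= alpha.sum()
    for o in obs[1:]:
        alpha = (alpha @ A) * B[:, o]   # propagate, then weight by emission
        logp += np.log(alpha.sum())
        alpha /= alpha.sum()            # rescale to avoid underflow
    return logp

def classify(obs, models):
    """models: dict name -> (pi, A, B); pick the most likely HMM."""
    return max(models, key=lambda m: log_likelihood(obs, *models[m]))
```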

37
System Model for Signal Classification
38
Signal Classification Results
39
Effect of SNR and Observation Length
  • BPSK signal detection rate at various SNRs and
    observation lengths (BPSK HMM trained at 9 dB)
  • Decreasing SNR increases the observation time needed
    to obtain a good detection rate

[Plot: detection rate (0-100%) vs. observation length (one block is
100 symbols; 0-40 blocks) for SNRs of -6 dB, -9 dB, and -12 dB.]
40
Location Classifier Design
  • Designing a classifier requires two fundamental
    steps
  • Extraction of a set of features that ensures
    highly discriminatory attributes between
    locations
  • Selection of a suitable classification model
  • Features are extracted from the received power
    delay profile, which includes information
    regarding the surrounding environment (NLoS/LoS,
    multipath strength, delay, etc.)
  • The selection of the hidden Markov model (HMM) as a
    classification tool was motivated by its success
    in other applications, e.g., speech recognition

41
Determining Location by Comparing HMM Sequences
  • In the testing phase, the candidate power profile
    is compared against all the HMMs previously
    trained and stored in the database
  • The HMM with the closest match identifies the
    corresponding position

42
Feature Vector Generation
  • Each location of interest was characterized by
    its channel characteristics, i.e., the power delay
    profile
  • Three-dimensional feature vectors were derived
    from the power delay profile: excess time, and the
    magnitude and phase of the Fourier transform (FT)
    of the power delay profile in each direction

43
Measurement Setup Cont.
Measurement locations 1.1-1.4, 4th Floor, Durham Hall, Virginia Tech.
The transmitter is located in Room 475; Receivers 1.1 and 1.2 are
located in Room 471; Receiver 1.3 is in the conference room in the 476
computer lab; and Receiver 1.4 is located in the hallway adjacent to
475.
  • Transmitter location 1 represents NLOS
    propagation from a room to another room, and from
    a room to a hallway. The transmitter and
    receivers were separated by drywall containing
    metal studs.
  • The transmitter was located in a small
    laboratory. Receiver locations 1.1-1.3 were in
    adjacent rooms, whereas receiver location 1.4 was
    in an adjacent hallway. Additionally, for
    locations 1.1-1.3, a standard office dry-erase
    whiteboard was located on the wall separating
    the transmitter and receiver.

44
Vector Quantization (VQ)
  • Since a discrete observation density is
    required to train the HMMs, a quantization step is
    required to map the continuous vectors into a
    discrete observation sequence
  • Vector quantization (VQ) is an efficient way of
    representing multi-dimensional signals. Features
    are represented by a small set of vectors, called a
    codebook, based on a minimum-distance criterion
  • The entire space is partitioned into disjoint
    regions, known as Voronoi regions (a codebook
    sketch follows)
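A minimal k-means-style codebook sketch in Python (assuming NumPy; a
common way to build a VQ codebook, though the slides do not specify the
training method used here):

```python
import numpy as np

def nearest_codeword(x, codebook):
    """Index of the minimum-distance codebook vector for feature x."""
    return int(np.linalg.norm(codebook - x, axis=1).argmin())

def train_codebook(features, size=32, iters=50, seed=0):
    """Iteratively assign features to codewords and recenter them."""
    rng = np.random.default_rng(seed)
    codebook = features[rng.choice(len(features), size, replace=False)]
    for _ in range(iters):
        labels = np.array([nearest_codeword(x, codebook) for x in features])
        for k in range(size):
            if np.any(labels == k):
                codebook[k] = features[labels == k].mean(axis=0)
    return codebook

# Discrete observation sequence for HMM training:
# obs = [nearest_codeword(x, codebook) for x in feature_sequence]
```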

[Example of vector quantization in a two-dimensional space.]

http://www.geocities.com/mohamedqasem/vectorquantization/vq.htm
45
Classification Result
  • A four-state HMM was used to represent each
    location (Rx 1.1-1.4)
  • Codebook size was 32
  • Confusion matrix for Rx locations 1.1-1.4 (rows:
    true location of the candidate received power
    profile; columns: best-matching HMM; entries in
    percent):

Position   Rx 1.1   Rx 1.2   Rx 1.3   Rx 1.4
Rx 1.1       95        5        0        0
Rx 1.2        5       95        0        0
Rx 1.3        0        0      100        0
Rx 1.4        0       10        0       90

Overall accuracy: 95%
46
Some Applications of HMMs to CR from VT
  • Signal Detection and Classification
  • Position Location from a Single Site
  • Traffic Prediction
  • Fault Detection
  • Data Fusion

47
The Neuron and Threshold Logic Unit
  • Several inputs are weighted, summed, and passed
    through a transfer function
  • Output passed on to other layers or forms an
    output itself
  • Common transfer (activation) functions (see the
    sketch below)
  • Step
  • Linear threshold
  • Sigmoid
  • tanh

[Image: a biological neuron, from http://en.wikipedia.org/wiki/Neuron]
[Diagram: threshold logic unit with inputs x1 ... xn weighted by
w1 ... wn, summed into activation a, and passed through transfer
function f(a).]
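A minimal sketch of the unit in Python (assuming NumPy; names are
illustrative):

```python
import numpy as np

def neuron(x, w, bias=0.0, transfer="sigmoid"):
    """Weighted sum of inputs passed through a transfer function."""
    a = np.dot(w, x) + bias              # activation
    if transfer == "step":
        return 1.0 if a > 0 else 0.0
    if transfer == "sigmoid":
        return 1.0 / (1.0 + np.exp(-a))
    if transfer == "tanh":
        return float(np.tanh(a))
    return a                              # linear
```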
48
Neuron as Classifier
  • The threshold of a multilinear neuron defines a
    hyperplane decision boundary
  • Number of inputs defines the dimensionality
    of the hyperplane
  • Sigmoid or tanh activation functions permit soft
    decisions

[Worked example (table): inputs (x1, x2) over {0,1} x {0,1} with
weights w1 = -0.5, w2 = 0.5 (and w3 = 0.5); the activation a is
compared against the threshold 0.25 to produce the output.]
49
Training Algorithm
  • Perceptron (linear transfer function)
  • Basically an LMS training algorithm
  • Steps (see the sketch below)
  • Given a sequence of input vectors v and correct
    outputs t
  • For each (v, t), update the weights as
    w ← w + η(t − y)v, where y is the actual output
    (thus t − y is the error) and η is the learning rate
  • Delta rule (differentiable transfer function)
  • Adjusts based on the slope of the transfer
    function
  • Originally used with the sigmoid, as its derivative
    is easy to implement
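A minimal perceptron-rule sketch in Python (assuming NumPy; the
learning rate and epoch count are illustrative):

```python
import numpy as np

def train_perceptron(samples, eta=0.1, epochs=100):
    """w <- w + eta * (t - y) * v for each labeled input vector."""
    w = np.zeros(len(samples[0][0]))
    b = 0.0
    for _ in range(epochs):
        for v, t in samples:             # v: input vector, t: target in {0, 1}
            y = 1.0 if np.dot(w, v) + b > 0 else 0.0
            w += eta * (t - y) * np.asarray(v, dtype=float)
            b += eta * (t - y)
    return w, b

# e.g., learning AND: train_perceptron(
#     [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)])
```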

50
The Perceptron
  • More sophisticated version of TLU
  • Prior to weighting, inputs are processed with
    Boolean logic blocks
  • Boolean logic is fixed during training

[Diagram: Boolean logic blocks preprocess inputs x1 ... xn before the
weighted threshold logic unit.]
51
More Complex Decision Rules
  • Frequently, it is impossible to correctly
    classify with just a single hyperplane
  • Solution: define several hyperplanes via several
    neurons and combine the results (perhaps in
    another neuron)
  • This combination is called a neural net
  • Size of the hidden layer is the number of hyperplanes
    in the decision rules

52
Backpropagation Algorithm
  • Just using outputs and inputs doesn't tell us how
    to adjust the hidden-layer weights
  • The trick is figuring out how much of the error can
    be ascribed to each hidden neuron (see the sketch
    below)

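A one-hidden-layer sketch of the credit-assignment step (assuming
NumPy, sigmoid units, and squared error; all names are illustrative):

```python
import numpy as np

def sigmoid(a):
    return 1.0 / (1.0 + np.exp(-a))

def backprop_step(x, t, W1, W2, eta=0.5):
    """One gradient step: the hidden layer's share of the error is
    found by propagating the output delta back through W2."""
    h = sigmoid(W1 @ x)               # hidden activations
    y = sigmoid(W2 @ h)               # network outputs
    d2 = (y - t) * y * (1 - y)        # output-layer delta
    d1 = (W2.T @ d2) * h * (1 - h)    # hidden-layer delta (backpropagated)
    W2 -= eta * np.outer(d2, h)
    W1 -= eta * np.outer(d1, x)
    return W1, W2
```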
53
Example Application
  • Each signal class is a multilayer linear
    perceptron network with 4 neurons in the hidden
    layer
  • Trained with a 199-point α-profile via
    backpropagation
  • Activation function: tanh
  • MAXNET chooses the one with the largest value

295 trials: unknown carrier, BW, 15 dB SNR
460 trials: known carrier, BW, -9 dB SNR

Results from A. Fehske, J. Gaeddert, J. Reed, "A new approach to
signal classification using spectral correlation and neural networks,"
DySPAN 2005, pp. 144-150
54
Comments on Orientation
  • By itself, ontological reasoning is likely
    inappropriate for dealing with signals
  • HMMs and neural nets are somewhat limited in how
    much they can scale up arbitrarily
  • Implementations should probably feature both
    classes of techniques, where
  • HMMs and NNs identify the presence of objects,
    locations, or scenarios, and a reasoning engine
    combines them
  • The meaning of the presence of these objects is then
    inferred by ontological reasoning

55
Decision Processes
  • Genetic algorithms, case-based reasoning, and more

56
Decision Processes
  • Goal: choose the actions that maximize the
    radio's goal
  • The very large number of nonlinearly related
    parameters tends to make solving for the optimal
    solution quite time consuming

57
Case Based Reasoning
  • An elaborate switch (or if-then-else) statement
    informed by cases defined by orientation (or
    context); a minimal retrieval sketch follows
  • Case identified by orientation; decision
    specified in the database for the case
  • Database can be built up over time
  • Problem of what to do when a new case is identified
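A minimal retrieval sketch in Python (the case base, features, and
threshold here are hypothetical):

```python
import numpy as np

# Hypothetical case base: observed context -> adaptation that worked
case_base = [
    {"context": np.array([0.9, 0.1]), "action": "reduce_power"},
    {"context": np.array([0.2, 0.8]), "action": "change_channel"},
]

def retrieve(context, threshold=0.5):
    """Nearest stored case; None signals a genuinely new case."""
    best = min(case_base,
               key=lambda c: np.linalg.norm(c["context"] - context))
    if np.linalg.norm(best["context"] - context) > threshold:
        return None   # new case: fall back to a search-based decision
    return best["action"]
```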

A. Aamodt, E. Plaza (1994), "Case-Based Reasoning: Foundational
Issues, Methodological Variations, and System Approaches," AI
Communications, IOS Press, Vol. 7:1, pp. 39-59
58
Local Search
  • Steps (see the sketch below)
  • 1. Search a neighborhood of the solution s_k to find
    the s' that improves performance the most
  • 2. s_{k+1} = s'
  • 3. Repeat 1, 2 until s_{k+1} = s_k
  • Variants: gradient search, fixed number of
    iterations, minimal improvement
  • Issue: gets trapped in local maxima
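A minimal greedy hill-climbing sketch (the neighborhood and fitness
functions are supplied by the caller):

```python
def local_search(s, neighbors, fitness):
    """Move to the best neighbor until no neighbor improves on s."""
    while True:
        best = max(neighbors(s), key=fitness)
        if fitness(best) <= fitness(s):
            return s          # local maximum reached
        s = best
```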

Figure from Fig. 2.6 in I. Akbar, "Statistical Analysis of Wireless
Systems Using Markov Models," PhD Dissertation, Virginia Tech,
January 2007
59
Genetic Algorithms
  • Concept: apply the concept of evolution to searching
    complex spaces
  • Really a random search with some structure
  • Successive populations (or generations) of
    solutions are evaluated for their fitness
  • Least-fit solutions are removed from the
    population
  • Most fit survive to breed replacement members of
    the population
  • Breeding introduces mutations and cross-overs so
    that the new population is not identical to the
    original population
  • Like parents and kids
  • Lots of variants (a generic sketch follows the list)
  • Parents die off
  • Niches
  • Tabu for looping
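A generic sketch of the loop (all operators and parameters here are
illustrative placeholders):

```python
import random

def genetic_algorithm(fitness, random_solution, crossover, mutate,
                      pop_size=20, survivors=10, generations=50):
    """Evaluate fitness, drop the least fit, breed replacements."""
    pop = [random_solution() for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        pop = pop[:survivors]                    # least fit are removed
        while len(pop) < pop_size:               # survivors breed
            a, b = random.sample(pop[:survivors], 2)
            pop.append(mutate(crossover(a, b)))  # cross-over plus mutation
    return max(pop, key=fitness)
```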

60
Genetic Algorithm Example
[Diagram: a population of candidate solutions, each a chromosome of
(PWR, F, MAC, NET) parameters with an evaluated fitness (e.g., 7, 5,
9, 1); breeding produces replacement members via cross-over and
mutation.]
61
Comments on GA
  • Tends to result in a good solution very quickly
  • Long time (perhaps no better than a random
    search) to find the optimum
  • Often paired with a local search
  • Low mutation rates can cause genetic drift
  • High mutation rates can limit convergence
  • Cross-over is like high mutation but without
    damaging convergence; however, it can get stuck on
    local maxima
  • In theory, reaches the global optimum, but requires
    more time to guarantee than an exhaustive search
  • Lots of freedom in the design
  • Mutation rate, cross-over rate, chromosome size,
    number of generations, population size, number of
    survivors, breeding rules, survival rules
  • Even more variation used when the fitness function
    or data sets are changing over time (e.g., set
    mutation rate or population as a function of
    fitness)
  • Theoretically, the best combination of parameters is
    a function of the characteristics of the solution
    space
  • In practice, empirically setting parameters tends
    to be better (a GA to program a GA?)

62
Simulated Annealing
  • Steps (see the sketch below)
  • 1. Generate a random solution s'
  • 2. If s' is better than s_k, then s_{k+1} = s';
    else generate a random variable r. If r is less
    than some function f(s_k, s', T) of the temperature
    T and the difference in value between s_k and s',
    then s_{k+1} = s'
  • 3. From time to time decrease T so that f(s_k, s', T)
    decreases over time
  • 4. Repeat steps 1-3 until a stopping criterion is met
  • Comments
  • Important to store the best result
  • In theory, reaches the global optimum, but requires
    more time to guarantee than an exhaustive search
  • Often finished with a local search applied to the
    best solution
  • Freedom in the algorithm
  • Distributions for generating s', schedules for T,
    change in distributions with T
  • Threshold accepting can be less costly
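A minimal sketch, taking exp(-delta/T) as the acceptance function
f(s_k, s', T) (a common choice; the cooling schedule and parameters
are illustrative):

```python
import math
import random

def simulated_annealing(s, value, random_neighbor,
                        T=1.0, alpha=0.95, steps=1000):
    """Accept worse solutions with probability exp(-delta/T); cool T."""
    best = s
    for i in range(steps):
        cand = random_neighbor(s)
        delta = value(s) - value(cand)     # positive when cand is worse
        if delta <= 0 or random.random() < math.exp(-delta / T):
            s = cand
        if value(s) > value(best):
            best = s                       # store the best result seen
        if (i + 1) % 50 == 0:
            T *= alpha                     # cooling schedule
    return best
```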

63
Comments on Decision Processes
  • Execution time
  • Case-based reasoning < searches
  • A good architectural decision is to combine
    approaches
  • CBR, except when an unknown case arises
  • GA for a quick good solution
  • Refine with local search
  • Can revisit searches later when excess cycles are
    available
  • CBR can provide initial solution(s) to search
    algorithms
  • Sometimes simpler algorithms are all that is
    required, and they will run much faster than any of
    these
  • Adjust power level for a target SINR

64
Representing Information
  • How can a radio store and manipulate knowledge?

65
Types of Knowledge
  • Conceptual knowledge
  • Analytic or axiomatic
  • Analytic: expresses or follows from the
    meaning of objects
  • E.g., a mobile radio is a radio with the property
    of mobility
  • Axiomatic: fundamental conceptual relationships
    not based on meaning alone
  • Rules
  • Relationships or theorems committed to memory
  • Some authors draw a distinction between rules and
    conceptual knowledge, but it could be argued that
    a rule is just an axiom (or property)
  • Can be expressed symbolically (e.g., UML),
    ontologically, or behaviorally (e.g., GA)

66
Why languages to represent information?
  • Negotiation
  • Heterogeneous devices can exchange information
  • Sharing learned information between devices
  • Permits reasoning and learning to be abstracted
    away from specific platforms and algorithms
  • Portability, maintainability
  • Permits the appearance of intelligence by reasoning
    in a manner that appears familiar to a human
  • Note: much of the preceding could also be done
    with behavioral knowledge (e.g., sharing GA
    states), but it is somewhat clumsier

67
Proposed Languages
  • UML
  • Radio Knowledge Representation Language (RKRL)
  • Describes environment and radio capabilities
  • Part of radioOne
  • Resource Description Framework (RDF)
  • Web Ontology Language (OWL)
  • Proposed to facilitate queries between radios
  • DAML+OIL (used by BBN)
  • Issues of language interoperability, testability,
    actual thought processes

68
Language Capabilities and Complexity
  • Increasing capabilities significantly increases
    complexity

Language   Features                                        Reasoning                         Complexity
XTM        Higher-order relationships                      None                              O(N)
RDF        Binary relationships                            None                              O(N)
RDFS       RDF plus subclass, subproperty, domain, range   Subsumption                       O(N^m)
OWL Lite   RDFS plus some class constructors;              Limited form of description       O(e^N)
           no crossing of metalevels                       logic
OWL-DL     All class constructors;                         General description logic         < ∞
           no crossing of metalevels
OWL Full   No restrictions                                 Limited form of first-order       ∞
                                                           predicate logic

Modified from Table 13.1 in M. Kokar, "The Role of Ontologies in
Cognitive Radio," in Cognitive Radio Technology, ed. B. Fette, 2006
69
Comments on Knowledge Representation
  • Ontologies are conceptually very appealing for
    realizing thinking machines
  • Personal concern: the goals of very high-level
    abstraction, platform independence, lack of a
    detailed specification, and automated
    interoperability will lead to JTRS-like
    implementation difficulties (see the theoretically
    unbounded complexity; JTRS is at least bounded)
  • However, these are really the benefits of using
    ontologies
  • Building an ontology is a time-intensive and
    complex task
  • Combining ontologies will frequently lead to
    logical inconsistencies
  • Makes code validation hard
  • Encourage development of domain-standardized
    ontologies
  • Policy, radio, network

70
Virginia Tech Cognitive Radio Testbed - CORTEKS -
  • Researchers
  • Joseph Gaeddert, Kyouwoong Kim, Kyung Bae,
    Lizdabel Morales, and Jeffrey H. Reed

71
Current Setup (CORTEKS)
[Diagram: GPIB-controlled instruments, including an AWG430 arbitrary
waveform generator (multi-mode transmitter) and an AWG710B arbitrary
waveform generator (signal upconverter).]
72
Current Waveform Architecture
73
CoRTekS Screenshot
[Screenshot annotations:
 - Image display: transmitted image, received image
 - Policy (4 ch.): transmit spectral power mask, modulation schemes
 - Packet history display: bit error rate, spectral efficiency
 - Waveform: center freq., symbol rate, mod. type, transmit power
 - Spectrum display: available spectrum, detected interference,
   current CR spectrum usage]
74
CoRTekS Decision Process
75
Demonstration of CORTEKs
76
Implementation Summary
  • Broad differences in architectural approaches to
    implementing cognitive radio
  • Engines vs. algorithms
  • Procedural vs. ontological
  • Numerous different techniques available to
    implement cognitive functionalities
  • Some tradeoffs in efficiencies
  • Likely need a meta-cognitive radio to find
    optimal parameters
  • Process boundaries are sometimes blurred
  • Observation/Orientation
  • Orientation/Learning
  • Learning/Decision
  • Implies need for pooled memory
  • Good internal models will be important for the
    success of many processes
  • Lots of research going on all over the world;
    lots of low-hanging fruit
  • See DySPAN, CrownCom, SDR Forum, Milcom for
    papers; upcoming JSACs
  • No clue as to how to make a radio conscious, or if
    we even should