Consensus Problems in Networks - PowerPoint PPT Presentation

1
Consensus Problems in Networks
  • Aman Agarwal
  • EYES 2007 intern
  • Advisor: Prof. Mostofi
  • ECE, University of New Mexico
  • July 5, 2007

2
Background
  • Cooperative control for multi-agent systems has many applications.
  • Formation control
  • Non-formation cooperative control
  • The key issue is shared information

3
Consensus Protocols
  • Let xᵢ be the information state of agent i
  • Continuous time:
  • ẋᵢ(t) = Σⱼ aᵢⱼ (xⱼ − xᵢ)
  • ⇒ ẋ(t) = −L x(t)
  • ⇒ x(t) = e^(−Lt) x(0), and lim t→∞ e^(−Lt) = 1vᵀ, where vᵀ1 = 1 and vᵀL = 0
  • ⇒ x(t) → 1vᵀx(0), where v is the left eigenvector of L corresponding to eigenvalue 0
  • Discrete time:
  • xᵢ[k+1] = Σⱼ aᵢⱼ[k] xⱼ[k]
  • ⇒ x[k+1] = D[k] x[k]
  • ⇒ x[k] = Dᵏ x[0] (for constant D), and lim k→∞ Dᵏ = 1vᵀ, where vᵀ1 = 1 and vᵀD = vᵀ
  • ⇒ x[k] → 1vᵀx(0), where v is the left eigenvector of D corresponding to eigenvalue 1
  • L has an eigenvalue 0 corresponding to the consensus solution, and D has an eigenvalue 1 corresponding to the consensus solution
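The discrete-time protocol above can be checked numerically. A minimal sketch, assuming a hypothetical symmetric, row-stochastic weight matrix D for three agents (the matrix values are illustrative, not from the slides):

```python
def step(D, x):
    """One consensus update: x[k+1] = D x[k]."""
    n = len(x)
    return [sum(D[i][j] * x[j] for j in range(n)) for i in range(n)]

# Hypothetical 3-agent network; each row sums to 1 (row-stochastic).
D = [[0.50, 0.25, 0.25],
     [0.25, 0.50, 0.25],
     [0.25, 0.25, 0.50]]
x = [0.0, 1.0, 1.0]  # initial information states x(0)

for _ in range(100):
    x = step(D, x)

# D is symmetric here, so v = [1/3, 1/3, 1/3] and all agents converge
# to 1vT x(0), i.e. the average of the initial states (2/3).
print([round(xi, 6) for xi in x])
```

Because this D is doubly stochastic, the left eigenvector for eigenvalue 1 is uniform and the consensus value is the plain average; a non-symmetric D would weight the initial states by v instead.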

4
Convergence of Consensus Protocols
  • The equilibrium value is a function of the initial state.
  • Agents that can pass information to all the other vehicles have a say in the final value.
  • The second smallest eigenvalue of L (equivalently the second largest eigenvalue of −L), the Fiedler eigenvalue λ₂, determines the speed of convergence.
  • Dense graphs ⇒ λ₂ is relatively large.
  • Sparse graphs ⇒ λ₂ is relatively small.
  • The third smallest eigenvalue of L should be far from λ₂ for faster convergence.
  • A repeated λ₂ (multiplicity greater than 1) also slows convergence.
  • Ideally λ₂ should be a simple eigenvalue for fast convergence.
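To illustrate the density argument, a small experiment (not from the slides; the 6-node graphs, step size ε, and tolerance are assumptions) comparing consensus speed under x ← x − εLx on a sparse path graph versus a dense complete graph:

```python
def laplacian(adj):
    """Graph Laplacian L = degree matrix minus adjacency matrix."""
    n = len(adj)
    return [[(sum(adj[i]) if i == j else 0) - adj[i][j] for j in range(n)]
            for i in range(n)]

def iterations_to_consensus(adj, x0, eps=0.05, tol=1e-3, max_iter=100000):
    """Iterate x <- x - eps*L*x until the states agree to within tol."""
    L = laplacian(adj)
    x = list(x0)
    for k in range(max_iter):
        if max(x) - min(x) < tol:
            return k
        x = [x[i] - eps * sum(L[i][j] * x[j] for j in range(len(x)))
             for i in range(len(x))]
    return max_iter

n = 6
path = [[1 if abs(i - j) == 1 else 0 for j in range(n)] for i in range(n)]
complete = [[1 if i != j else 0 for j in range(n)] for i in range(n)]
x0 = [float(i) for i in range(n)]

k_path = iterations_to_consensus(path, x0)
k_complete = iterations_to_consensus(complete, x0)
print(k_path, k_complete)  # the dense graph needs far fewer iterations
```

The path graph has a small λ₂ of 2(1 − cos(π/6)) ≈ 0.27, while the complete graph has λ₂ = 6, which is why the iteration counts differ so sharply.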

5
Binary Consensus Problems
  • In most consensus applications the agents communicate their status wirelessly.
  • At the bit level there is receiver noise.
  • The noise is not bounded ⇒ there is no transition point beyond which consensus is guaranteed ⇒ a probabilistic approach is needed to characterize and understand the behavior of the network.
  • To examine this effect we look at binary consensus problems.
  • Assume that the network is fully connected.
  • A majority poll asserts whether the majority of the nodes are in consensus, and each node updates its own information accordingly.

6
Binary Consensus Problems
  • Model 1: noise → decision
  • bⱼ,ᵢ(k) = bᵢ(k) + nⱼ,ᵢ(k) = the bit of node i as received by node j
  • bⱼ(k+1) = Dec( Σᵢ bⱼ,ᵢ(k) / M ), where Dec(x) = 1 if x ≥ 0.5 and 0 if x < 0.5
  •   = Dec( Σᵢ bᵢ(k)/M + Σᵢ nⱼ,ᵢ(k)/M )
  •   = Dec( S(k)/M + wⱼ(k) )
  • S(k) = state of the system at time k = Σⱼ bⱼ(k)
  • πᵢ(k) = probability that S(k) = i
  • π(k) = [π₀(k) π₁(k) … π_M(k)] = probability vector
  • Pᵢⱼ = probability that S(k+1) = j given S(k) = i
  •   = C(M, j) kᵢʲ (1 − kᵢ)^(M−j), where kᵢ = prob{ i/M + wⱼ(k) > 0.5 | S(k) = i }
  • π(k+1) = Pᵀ π(k), with P = [Pᵢⱼ]
  • π(k) = (Pᵀ)ᵏ π(0) gives the asymptotic behavior of the probabilities

[Block diagram: bⱼ(k) → noise → bⱼ,ᵢ(k) → decision → bⱼ(k+1)]
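The Markov-chain evolution of Model 1 can be sketched as follows, assuming i.i.d. N(0, σ²) noise on each link, so the averaged noise w(k) is N(0, σ²/M); the Q-function gives kᵢ in closed form:

```python
import math

def Q(x):
    """Gaussian tail probability Q(x) = P{N(0,1) > x}."""
    return 0.5 * math.erfc(x / math.sqrt(2))

def transition_matrix(M, sigma):
    """P_ij = C(M,j) k_i^j (1-k_i)^(M-j), with k_i = P{i/M + w > 0.5 | S=i}."""
    P = []
    for i in range(M + 1):
        ki = Q((0.5 - i / M) / (sigma / math.sqrt(M)))
        P.append([math.comb(M, j) * ki**j * (1 - ki)**(M - j)
                  for j in range(M + 1)])
    return P

def evolve(P, pi, steps):
    """Iterate pi(k+1) = P^T pi(k)."""
    n = len(pi)
    for _ in range(steps):
        pi = [sum(P[i][j] * pi[i] for i in range(n)) for j in range(n)]
    return pi

M, sigma = 4, 0.5
P = transition_matrix(M, sigma)
pi0 = [0, 0, 0, 1, 0]   # X(0) = [0 1 1 1]: three ones, so S(0) = 3
pi = evolve(P, pi0, 50)
# probability mass accumulates near the consensus states S = 0 and S = M
print([round(p, 3) for p in pi])
```

With noise the extreme states are not truly absorbing (a node can still flip), which is why the slides characterize the network probabilistically rather than by a guaranteed consensus time.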
7
Model 1: M = 4, X(0) = [0 1 1 1]
[Probability plots for Model 1 with σ = 0.5, 0.75, 1, and 2]
8
Binary Consensus Problems
  • Model 2(a) / 2(b): noise → noise filtering → decision
  • bⱼ,ᵢᴰ(k) = Dec_th( bⱼ,ᵢ(k) ), where Dec_th(x) = 1 if x ≥ th (normally th = 0.5) and 0 if x < th
  • bⱼ(k+1) = Dec( Σᵢ bⱼ,ᵢᴰ(k) / M ), where Dec(x) = 1 if x ≥ 0.5 and 0 if x < 0.5
  •   = Dec( Σᵢ Dec_th( bⱼ,ᵢ(k) ) / M )
  • Pᵢⱼ = probability that S(k+1) = j given S(k) = i
  •   = C(M, j) kᵢʲ (1 − kᵢ)^(M−j), where kᵢ = prob{ Σᵢ Dec_th( bⱼ,ᵢ(k) )/M > 0.5 | S(k) = i }
  •   so kᵢ = Σ_{l = ⌈M/2⌉}^{M} C(M, l) P{bⱼ,ᵢᴰ(k) = 1}ˡ P{bⱼ,ᵢᴰ(k) = 0}^(M−l)
  • P{bⱼ,ᵢᴰ(k) = 1} = P{bⱼ,ᵢᴰ(k) = 1 | bᵢ(k) = 1} P{bᵢ(k) = 1} + P{bⱼ,ᵢᴰ(k) = 1 | bᵢ(k) = 0} P{bᵢ(k) = 0}
  •   = i/M + Q(0.5/σ)(1 − 2i/M)
  • P{bⱼ,ᵢᴰ(k) = 0} = P{bⱼ,ᵢᴰ(k) = 0 | bᵢ(k) = 1} P{bᵢ(k) = 1} + P{bⱼ,ᵢᴰ(k) = 0 | bᵢ(k) = 0} P{bᵢ(k) = 0}
  •   = 1 − i/M − Q(0.5/σ)(1 − 2i/M)

[Block diagram: bⱼ(k) → noise → bⱼ,ᵢ(k) → noise filtering → bⱼ,ᵢᴰ(k) → decision → bⱼ(k+1)]
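The closed-form per-link probability above plugs directly into the binomial sum for kᵢ. A hedged sketch of that computation (M and σ are the slide's example values; everything else follows the formulas above):

```python
import math

def Q(x):
    """Gaussian tail probability Q(x) = P{N(0,1) > x}."""
    return 0.5 * math.erfc(x / math.sqrt(2))

def k_i(i, M, sigma):
    """Model 2(a): probability the majority of the M thresholded links read 1,
    given that i of the M nodes currently hold a 1."""
    # per-link decision: P{b_{j,i}^D = 1 | S = i} = i/M + Q(0.5/sigma)(1 - 2i/M)
    p1 = i / M + Q(0.5 / sigma) * (1 - 2 * i / M)
    # sum over l = ceil(M/2) .. M of C(M,l) p1^l (1-p1)^(M-l)
    return sum(math.comb(M, l) * p1**l * (1 - p1)**(M - l)
               for l in range(math.ceil(M / 2), M + 1))

M, sigma = 4, 0.5
print([round(k_i(i, M, sigma), 3) for i in range(M + 1)])
```

kᵢ increases monotonically with i, as expected: the more nodes already hold a 1, the more likely each node's majority poll reads 1.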
9
Model 2(a): the noise is filtered first by thresholding the received values at a threshold level of 0.5, to ensure that the majority decision is made on correct data only. M = 4, X(0) = [0 1 1 1]
[Probability plots for Model 2(a) with σ = 0.5, 0.75, 1, and 2]
10
Model 2(b): here the threshold for the communication noise is chosen dynamically, by monitoring the values that the nodes are sending and updating the threshold based on the differential probabilities of sending a 1 or a 0.
11
Model 2(b): M = 4, X(0) = [0 1 1 1]
[Probability plots for Model 2(b) with σ = 0.5, 0.75, and 1]
12
Binary Consensus Problems
  • Model 3: noise → noise filtering → soft information → decision
  • bⱼ,ᵢᴰ(k) = Dec_th( bⱼ,ᵢ(k) ), where Dec_th(x) = 1 if x ≥ th (normally th = 0.5) and 0 if x < th
  • bⱼ(k+1) = Dec( Σᵢ E[ bᵢ(k) | bⱼ,ᵢ(k) ] / M ), where Dec(x) = 1 if x ≥ 0.5 and 0 if x < 0.5
  • where E[ bᵢ(k) | bⱼ,ᵢ(k) ] = f( bⱼ,ᵢ(k) − 1 ) P{bᵢ(k) = 1} / [ f( bⱼ,ᵢ(k) − 1 ) P{bᵢ(k) = 1} + f( bⱼ,ᵢ(k) ) P{bᵢ(k) = 0} ]
  • and f(x) = pdf of N(0, σ²)
  • Pᵢⱼ = probability that S(k+1) = j given S(k) = i; finding the transition probabilities analytically becomes very tedious and complex in this case, so we simulate the system and estimate the probabilities statistically from a large number of samples (at least 1000)

[Block diagram: bⱼ(k) → noise → bⱼ,ᵢ(k) → noise filtering → bⱼ,ᵢᴰ(k) → soft information E[bⱼ(k) | bⱼ,ᵢᴰ(k)] → decision → bⱼ(k+1)]
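Since the Model 3 transition probabilities are found by simulation, a minimal Monte Carlo sketch of one such run may help. One assumption is made here that the slides do not pin down: the prior P{b = 1} in the Bayesian ratio is taken to be the current fraction of ones in the network.

```python
import math
import random

def gauss_pdf(x, sigma):
    """pdf of N(0, sigma^2) evaluated at x."""
    return math.exp(-x * x / (2 * sigma * sigma)) / (sigma * math.sqrt(2 * math.pi))

def soft_estimate(r, p1, sigma):
    """E[b | received r] = f(r-1) p1 / (f(r-1) p1 + f(r) (1-p1))."""
    num = gauss_pdf(r - 1, sigma) * p1
    den = num + gauss_pdf(r, sigma) * (1 - p1)
    return num / den if den > 0 else 0.0

def model3_step(b, sigma, rng):
    """One Model 3 update for every node in a fully connected network."""
    M = len(b)
    p1 = sum(b) / M                       # assumed prior: current fraction of ones
    new = []
    for j in range(M):
        received = [b[i] + rng.gauss(0, sigma) for i in range(M)]
        avg = sum(soft_estimate(r, p1, sigma) for r in received) / M
        new.append(1 if avg >= 0.5 else 0)
    return new

rng = random.Random(0)
b = [0, 1, 1, 1]                          # X(0) = [0 1 1 1]
for _ in range(20):
    b = model3_step(b, 1.0, rng)
print(b)
```

Repeating many such runs (the slides suggest at least 1000) and counting the transitions between values of S(k) yields the statistical estimate of Pᵢⱼ.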
13
Model 3
[Probability plots for Model 3 with σ = 0.5, 0.75, 1, and 2]
14
Comparison of models
  • Model 1 ⇒ performance degrades sharply for larger noise (σ > 0.5).
  • Model 2(a) ⇒ better than Model 1, but cannot handle large noise (σ > 1).
  • Model 2(b) ⇒ better than Model 2(a). The dynamic threshold works, but only if the noise level is below 1, because for larger noise a threshold between 0 and 1 will not work.
  • Model 3 ⇒ very robust and can cope with large noise (σ > 1), but we trade off speed of convergence for handling larger noise.

15
Detection / Estimation
  • A group of nodes where each node has limited sensing capabilities ⇒ each node relies on the group to improve its estimation/detection quality.
  • Estimation ⇒ each agent has an estimate of the parameter of interest, which can take values over an infinite set or a known finite set.
  • Detection ⇒ the parameter of interest takes values from a known finite set.

16
Binary Detection
[Block diagram: event S → sensing noise → Sⱼ(k); opinions Oⱼ(k) → comm. noise → Oⱼ,ᵢ(k) → noise filtering → Oⱼ,ᵢᴰ(k) → decision → Oⱼ(k+1)]
  • For k ≥ 1:
  • Sⱼ(k) = event sensed by node j at time k
  • Oⱼ(k) = opinion formed by node j at time k
  • Oⱼ,ᵢ(k) = Oᵢ(k) + snⱼ,ᵢ = the opinion of node i as received by node j
  • Oⱼ,ᵢᴰ(k) = Dec( Oⱼ,ᵢ(k) ), where Dec(x) = 1 if x ≥ 0.5 and 0 if x < 0.5
  • Oⱼ(k+1) = Dec( ( Σᵢ Oⱼ,ᵢᴰ(k) + Oⱼ(k) + Sⱼ(k) ) / (M + 1) )

[Plot: σₙ = 0.5 (comm. noise), σₛ = 1 (sensing noise), true event S = 1]
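The detection update above can be sketched as a simulation. The noise levels σₙ = 0.5 and σₛ = 1 and the true event S = 1 follow the slide; the network size and iteration count are assumptions:

```python
import random

def detection_step(O, S_true, sn, ss, rng):
    """Each node fuses M-1 thresholded received opinions, its own opinion,
    and a fresh noisy sensed sample, averaging over M+1 values."""
    M = len(O)
    sensed = [S_true + rng.gauss(0, ss) for _ in range(M)]   # sensing noise
    new = []
    for j in range(M):
        # hard-threshold each received opinion at 0.5 (comm. noise filtering)
        decisions = [1 if O[i] + rng.gauss(0, sn) >= 0.5 else 0
                     for i in range(M) if i != j]
        avg = (sum(decisions) + O[j] + sensed[j]) / (M + 1)
        new.append(1 if avg >= 0.5 else 0)
    return new

rng = random.Random(1)
O = [0, 0, 0, 0]          # initial opinions, all wrong
for _ in range(30):
    O = detection_step(O, S_true=1, sn=0.5, ss=1.0, rng=rng)
print(O)
```

Because fresh sensed samples of the true event enter every update, the opinions tend toward S = 1 over time even from an all-zero start, unlike pure consensus where the initial states fully determine the outcome.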
17
Binary Detection
  • Every node has M + 1 different values to weigh at every update.
  • Weigh nodes with better communication or better sensing differently.
  • Define a trust factor for each node.
  • Trust factors ⇒ either time-invariant or time-varying.
  • They should update themselves over time.

18
Binary Detection
  • Trust factors: one way of implementing this is as follows

[Plots: average consensus vs. different weights; how nodes with good sensing and good communication affect the consensus. X(0) = [0 0 0 0 0 1 1 1]]
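One hypothetical realization of trust-factor weighting (a sketch under assumed weights, not necessarily the author's exact scheme): replace the plain average with a trust-weighted average, so nodes with better sensing or communication pull the consensus toward their values.

```python
def weighted_consensus_step(x, trust):
    """Fully connected network: every node moves to the trust-weighted average."""
    total = sum(trust)
    avg = sum(t * xi for t, xi in zip(trust, x)) / total
    return [avg] * len(x)

x0 = [0, 0, 0, 0, 0, 1, 1, 1]                 # X(0) from the slide
uniform = weighted_consensus_step(x0, [1] * 8)
# hypothetical trust assignment: the three nodes holding 1 are trusted 3x more
boosted = weighted_consensus_step(x0, [1, 1, 1, 1, 1, 3, 3, 3])
print(uniform[0], round(boosted[0], 3))
```

With uniform trust the consensus is the plain average 3/8; boosting the trusted minority shifts the agreed value toward their opinion, which is the effect the slide's plots illustrate.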