1
Class Discussion: Analyzing the MAC-level Behavior
of Wireless Networks in the Wild
  • Discussion Guided by
  • Jerry Sussman

2
Critique Guidance
  • Critique Instructions
  • Critique the paper, not me!
  • All students should read the paper before class
  • Critique is due prior to the following week's
    class
  • Discussion Leader MUST use slides to guide the
    discussion
  • Critiques should be organized / structured per
    website
  • This is a 300-level course... there should be a lot
    of discussion.

3
Critique Guidance
  • (10) State the problem the paper is trying to
    solve.
  • (20) State the main contribution of the paper:
    solving a new problem, proposing a new algorithm,
    or presenting a new evaluation (analysis). If a
    new problem, why was the problem important? Is
    the problem still important today? Will the
    problem be important tomorrow?  If a new
    algorithm or new evaluation (analysis), what are
    the improvements over previous algorithms or
    evaluations? How do they come up with the new
    algorithm or evaluation? 
  • (15) Summarize the (at most) 3 key main ideas
    (each in 1 sentence.) 
  • (30) Critique the main contribution
  • Rate the significance of the paper on a scale of
    5 (breakthrough), 4 (significant contribution), 3
    (modest contribution), 2 (incremental
    contribution), 1 (no contribution or negative
    contribution). Explain your rating in a sentence
    or two.
  • Rate how convincing the methodology is: how do
    the authors justify the solution approach or
    evaluation? Do the authors use arguments,
    analyses, experiments, simulations, or a
    combination of them? Do the claims and
    conclusions follow from the arguments, analyses
    or experiments? Are the assumptions realistic (at
    the time of the research)? Are the assumptions
    still valid today? Are the experiments well
    designed? Are there different experiments that
    would be more convincing? Are there other
    alternatives the authors should have considered?
    (And, of course, is the paper free of
    methodological errors.)
  • What is the most important limitation of the
    approach?
  • (15) What lessons should researchers and
    builders take away from this work? What (if any)
    questions does this work leave open?
  • (10) Propose your improvement on the same
    problem.
  • Note: the purpose of this template is to serve as
    a starting point, instead of a constraint. Use
    your judgment and creativity. Some advice through
    the resource link of the class can be helpful.

4
Agenda
  • Authors
  • Summary
  • Background
  • Wit
  • Theory Behind Wit
  • Implementation of Wit
  • Wit Evaluation
  • Inference versus Additional Monitors
  • Application in Live Environment
  • Conclusion

5
Authors
  • Ratul Mahajan, Microsoft
  • Maya Rodrig, University of Washington
  • David Wetherall, University of Washington
  • John Zahorjan, University of Washington
  • Funding: NSF
  • Presented at SIGCOMM '06
  • September 11-15, 2006
  • Pisa, Italy

6
Summary First
  • Paper Documents WIT
  • Passive Wireless Analysis Tool
  • Analyzes MAC-Level behavior on Wireless Networks
  • Paper Assesses WIT Performance
  • Based on real and simulated data
  • Authors tested WIT against live Wireless Network

7
Why Is WIT Needed?
  • ???

8
Why Is WIT Needed?
  • Understand how live networks communicate in
    different situations
  • Highly loaded environment
  • Low load environments
  • Interfering wireless LANs, etc.
  • Critical to knowing how to improve performance of
    wireless LANs.

9
Background
  • Measurement-driven analysis of live networks
  • Critical to understanding live performance of
    networks
  • Critical to improving performance
  • Measurement-driven refers to
  • Part Measured / Collected data
  • Part generated data

10
Background
  • Wireless Measurement-Driven Analysis
  • At time of paper publication, Lacking in
  • Software Collection/Analysis Tools
  • Performance data from wireless networks
  • Reasons
  • Based on Simple Network Mgt Protocol (SNMP) logs
    from AP
  • AP logs
  • Low fidelity (i.e., coarse logs) of AP side
  • No data from client view
  • Packet traces from Wired hosts next to AP
  • Traces omit wireless retransmissions

11
Background
  • Unrealistic Solution
  • Instrument entire wireless network
  • Proven successful in controlled environments
  • Unrealistic and not a match for commercial
    application
  • Only Realistic Solution
  • Obtain trace via passive monitoring
  • 1 or more nodes declared monitors
  • Monitors placed in vicinity of wireless network
  • Record attributes of all transmissions
  • Trivial to deploy

12
Background
  • Problems with Passive Monitoring
  • Data / Traces may be incomplete
  • Packets dropped due to weak signal
  • Packets dropped due to collisions
  • Difficult to know what packets are missing from a
    monitor
  • Monitor stations can't determine if destination
    properly received packets
  • Important for determining reception probability

13
Background
  • This paper is trying to
  • Find a way to assemble an accurate trace of
    wireless environment for analysis
  • Use data from multiple monitoring stations
  • Determine missing packets
  • Re-create missing packets
  • Combine into single Trace file
  • Determine Network Performance
  • How often do clients retransmit their packets
  • Determine loss effects between two nodes
  • Effect of increased load on the network

14
Background
  • Authors attempt to solve problem with WIT
  • Paper presents WIT, a tool for Measurement-Driven
    Analysis.
  • WIT has 3 modules which solve key problems
    identified earlier

15
Wit
16
Why Is WIT Needed?
  • Quantify Wireless Network Performance
  • Estimate of competing stations
  • Assist in diagnosing wireless network problems

17
WIT Core Processing Steps
  • Merging procedure
  • Packet Reconstruction
  • Determination of Network Performance

18
Merging Procedure: 1st Core Processing Step
  • Combine incomplete traces from multiple,
    independent monitors
  • Provides a complete trace for follow-on steps
  • Based upon collected data
  • Not inferred or reconstructed

19
Packet Reconstruction: Second Core Processing Step
  • Reconstructs packets not captured by any monitor
  • Strong inference engine
  • Determines if packet received at destination
  • Again, provides more complete trace for follow-on
    step

20
Determination of Network Performance: Third Core Processing Step
  • WIT Calculates Network Performance
  • Input: Constructed trace
  • Output
  • Typical simple network measurements
  • Packet reception probabilities
  • Estimates number of nodes contending for medium
  • Not previously achieved according to authors

21
Passive Monitoring Pipeline
22
WIT Evaluation
  • After Development of WIT, Authors faced with
    Evaluation Task
  • Used mix of real and simulated data
  • Used WIT at SIGCOMM 2004 conference
  • Multi-monitor traces captured
  • Uncovered MAC-layer characteristics of
    environment
  • Network was dominated by periods of low contention
    during which the medium was poorly utilized, even
    though APs were waiting to transmit packets
  • Suggests 802.11 MAC tuned for high traffic levels
    that are uncommon on real networks.
  • Authors claim this can't be obtained by other
    methods

23
Now for the Theory behind WIT Phases. Implementation of Phases will follow.
24
3 Core Phases
  • Merging of Traces
  • Inferring Missing Information
  • Deriving Measurements / Performance

25
3 Core Phases
  • Merging of Traces
  • Inferring Missing Information
  • Deriving Measurements / Performance

26
Merging of Traces
27
Merging of Traces
  • Input
  • Number of Packet traces
  • 1 Trace per monitor
  • Timestamps reflect each monitor's local packet receive time

28
Merging of Traces
  • Output
  • Merge into single, consistent timeline for all
    packets observed
  • Eliminate duplicates
  • Assign coherent timestamps to all packets
    independent of monitor
  • Timestamp accuracy to a few microseconds
    required.
  • Identify and Eliminate Duplicates

29
Merging of Traces
  • Timing, the critical element
  • Only a few packets carry information guaranteed to be
    unique over a few milliseconds
  • Only way to distinguish duplicates is by time
  • Accurate timestamps are vital to creating the
    merged trace
  • Reference packets are the key

30
Merging of Traces
  • Three Step Merging Process
  • Identify the reference packets common to both
    monitors
  • Beacons generated by APs as references
  • Contain unique source MAC address
  • Contain 64-bit value of local, microsecond
    resolution timer

31
Merging of Traces
  • Three Step Merging Process
  • Use reference timestamps to translate the time
    coordinates
  • Pair up two reference timestamps across two
    traces
  • Time interval of secondary is altered to match
    baseline trace
  • Constant added to align the two traces between
    the two individual reference points
  • Resizing / alignment process adjusts for clock
    drift and alignment bias between the two monitors
    (minimal sketch below)
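
To make the time-coordinate translation concrete, here is a minimal Python sketch (an illustration, not the authors' code), assuming hypothetical packet dictionaries with a ts field and a sorted list of reference-beacon timestamp pairs shared by the two monitors.

def translate_timestamps(secondary_packets, ref_pairs):
    """Map a secondary monitor's timestamps onto the baseline trace's clock.

    ref_pairs: [(t_secondary, t_baseline), ...] for reference beacons seen by
    both monitors, sorted by time.  Each secondary interval between two
    consecutive references is linearly rescaled so its endpoints line up with
    the baseline, compensating for both clock offset and drift.  Packets that
    fall outside every reference interval are simply dropped in this sketch.
    """
    translated = []
    for pkt in secondary_packets:
        for (s0, b0), (s1, b1) in zip(ref_pairs, ref_pairs[1:]):
            if s0 <= pkt["ts"] <= s1:
                scale = (b1 - b0) / (s1 - s0)            # drift correction
                new_ts = b0 + (pkt["ts"] - s0) * scale   # offset plus rescale
                translated.append({**pkt, "ts": new_ts})
                break
    return translated

# Example usage: two shared reference beacons, one data packet between them.
refs = [(1_000.0, 5_000.0), (2_000.0, 6_002.0)]
print(translate_timestamps([{"ts": 1_500.0, "type": "DATA"}], refs))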

32
Merging of Traces
  • Three Step Merging Process
  • Identify and Remove duplicates
  • Identify by matching
  • Packet Type
  • Same Source
  • Same Destination
  • Timestamps that differ by less than ½ of the minimum
    time to transmit a packet
  • Note: The code for this would be straightforward
    (a minimal sketch follows); however, I suspect much
    time was spent reviewing the data and proving that
    the code/scheme worked.
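
A minimal sketch of the duplicate test above, assuming hypothetical packet dictionaries (type, src, dst, ts in microseconds) and a placeholder value for the minimum packet transmit time; a real implementation would derive that threshold from the rate and packet size.

MIN_TX_TIME_US = 28.0  # placeholder: minimum time to transmit a packet (us)

def remove_duplicates(merged):
    """Keep one copy of each packet: two entries are duplicates if they have
    the same type, source, and destination, and their timestamps differ by
    less than half the minimum packet transmit time."""
    merged = sorted(merged, key=lambda p: p["ts"])
    unique = []
    for pkt in merged:
        is_dup = any(
            pkt["type"] == prev["type"]
            and pkt["src"] == prev["src"]
            and pkt["dst"] == prev["dst"]
            and abs(pkt["ts"] - prev["ts"]) < MIN_TX_TIME_US / 2
            for prev in unique[-10:]   # only nearby packets can be duplicates
        )
        if not is_dup:
            unique.append(pkt)
    return unique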

33
Merging of Traces
  • Waterfall Merging Process
  • Merge two traces
  • Then merge third trace to baseline trace
  • Approach is not the most time-efficient
  • Approach provides improved precision
  • New reference points continually added
  • Easier to find a set of shared reference points as
    more monitor traces are merged (sketched below)
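
The waterfall idea itself is just repeated pairwise merging, roughly as in this sketch; merge_pair is a stand-in for the reference-matching, timestamp-translation, and deduplication steps described above.

def waterfall_merge(traces, merge_pair):
    """Waterfall merging: merge monitor traces one at a time, always using
    the growing merged trace as the baseline, so each later merge can draw
    on the reference packets contributed by every trace merged so far."""
    baseline = traces[0]
    for secondary in traces[1:]:
        baseline = merge_pair(baseline, secondary)
    return baseline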

34
3 Core Phases
  • Merging of Traces
  • Inferring Missing Information
  • Deriving Measurements / Performance

35
Inferring Missing Information
36
Inferring Missing Information
  • Two Fundamental Purposes
  • Infer missing packets from collected merged
    data
  • Estimate whether packets were received by their
    destination
  • Authors claim this is new

37
Inferring Missing Information
  • Key Technique
  • A transmitted packet implies useful information about
    the packets its sender must have received
  • Example
  • An AP sends an ASSOCIATION RESPONSE only if it recently
    received an ASSOCIATION REQUEST.
  • If the merged trace contains the response but no
    request, then we know the request was successfully
    sent and received
  • Also, the sender and destination of the missing request
    are known from the response packet (see the sketch
    below).
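
As an illustration of this one rule (not the paper's full inference engine), here is a minimal Python sketch over a merged trace of hypothetical packet dictionaries; the frame-type strings and the look-back window are assumptions.

def infer_missing_assoc_requests(merged, window=50):
    """For each ASSOCIATION RESPONSE with no matching ASSOCIATION REQUEST
    shortly before it, synthesize the missing request: its sender and
    destination come from the response, and it is marked as received,
    since the AP could not have responded otherwise."""
    inferred = []
    for i, pkt in enumerate(merged):
        if pkt["type"] != "ASSOC_RESPONSE":
            continue
        recent = merged[max(0, i - window):i]
        seen = any(p["type"] == "ASSOC_REQUEST"
                   and p["src"] == pkt["dst"] and p["dst"] == pkt["src"]
                   for p in recent)
        if not seen:
            inferred.append({
                "type": "ASSOC_REQUEST",
                "src": pkt["dst"],     # client address, known from the response
                "dst": pkt["src"],     # AP address
                "received": True,      # the response proves reception
                "inferred": True,
            })
    return inferred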

38
Inferring Missing Information
  • Processing the merged trace
  • Scan each packet and process
  • Classify each packet type
  • Generate markers
  • Ex: End of an ongoing conversation
  • Formal Language Approach (FSM)
  • Infer Packet Reception
  • Infer Missing Packets
  • Construct Packets as Required (minimal sketch below)
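
The full formal-language/FSM machinery is more than a slide can show, but this tiny sketch captures the flavor of the per-packet scan for a single rule: a unicast DATA frame followed almost immediately by an ACK addressed to its sender must have been received. Field names and the ACK timeout are assumptions.

def annotate_reception(merged, ack_timeout_us=100.0):
    """Scan the merged trace in time order and mark a DATA frame as received
    when an ACK addressed to its sender appears within the ACK timeout."""
    merged = sorted(merged, key=lambda p: p["ts"])
    for i, pkt in enumerate(merged):
        if pkt["type"] != "DATA":
            continue
        for nxt in merged[i + 1:]:
            if nxt["ts"] - pkt["ts"] > ack_timeout_us:
                break                      # the ACK must follow almost at once
            if nxt["type"] == "ACK" and nxt["dst"] == pkt["src"]:
                pkt["received"] = True     # destination ACKed the frame
                break
    return merged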

39
3 Core Phases
  • Merging of Traces
  • Inferring Missing Information
  • Deriving Measurements/Performance

40
Deriving Measurements / Performance
41
Deriving Measurements / Performance
  • Merged Trace Can Be Mined
  • Many ways to study detailed behavior
  • Packet Reception probability
  • Estimate number of stations that are competing
    for medium per snapshot in time
  • Requires access to state
  • randomly selected backoff values
  • DATA / DATA-retry packets (simplified sketch below)
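
The paper's estimator uses backoff and retry state; the sketch below is a much-simplified stand-in that merely counts, per time window, the distinct stations that sent DATA or retry frames, i.e., stations that demonstrably had traffic queued and were competing for the medium. Field names and the window size are assumptions.

from collections import defaultdict

def rough_contender_counts(merged, window_us=100_000):
    """Crude per-window contender count: the number of distinct sources that
    transmitted a DATA frame (fresh or retry) inside each time window."""
    windows = defaultdict(set)
    for pkt in merged:
        if pkt["type"] in ("DATA", "DATA_RETRY"):
            windows[int(pkt["ts"] // window_us)].add(pkt["src"])
    return {w: len(srcs) for w, srcs in sorted(windows.items())}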

42
Now for the Implementation of WIT
43
WIT Implementation
  • WIT Implemented in 3 Components
  • halfWit
  • nitWit
  • dimWit
  • halfWit, nitWit, and dimWit correspond to the three
    pipeline phases discussed earlier

44
WIT Implementation
  • halfWit
  • Merge phase
  • 1st: Insert all traces into a database
  • Database used to merge data as defined earlier
  • Database also used to pass final merged trace to
    nitWit
  • Uses merge-sort methodology
  • Traces handled like queues (minimal sketch below)
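
The merge-sort/queue idea maps naturally onto a k-way merge. This sketch ignores the database layer and assumes each per-monitor trace has already been translated onto the common timeline and sorted by timestamp.

import heapq

def merge_sorted_traces(traces):
    """Treat each (already time-translated, sorted) monitor trace as a queue
    and repeatedly take the packet with the earliest timestamp across all
    queues, exactly like the merge step of merge sort generalized to k inputs."""
    return list(heapq.merge(*traces, key=lambda p: p["ts"]))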

45
WIT Implementation
  • nitWit
  • Inference phase
  • nitWit takes the output of halfWit
  • Determines and recreates missing packets
  • Annotates captured and inferred packets
  • Critical annotation for each packet is whether it
    was received.
  • Retry packet fields are tracked

Note: The original implementation did not merge
captured and inferred packets because of exact timing
uncertainty. This differs from the theory write-up
section.
46
WIT Implementation
  • dimWit
  • Derived Measures Component
  • dimWit takes the output of nitWit
  • Produces summary network information
  • Produces number of contenders in the network
  • Implemented to analyze tens of millions of
    packets in a few minutes.

47
Wit Evaluation
48
Wit Evaluation
  • Purpose of Evaluation
  • Understand how well each phase works
  • Key questions to be evaluated
  • Quality of time synchronization?
  • Quality of merged product?
  • Accuracy of inferences?
  • Fraction of missing packets inferred?
  • Number of Contenders accuracy?
  • Analyze improvement from more monitors or more
    inference?

49
Wit Evaluation
  • Reality of this type of evaluation
  • Comparing against ground truth unrealistic
  • Too much detail
  • Unrealistic to create an absolutely controlled
    environment
  • Reduced to simulation as primary validation method

50
Wit Evaluation
  • Simulated Environment
  • 2 Access Points (APs)
  • 40 clients randomly distributed on a grid
  • Packet Simulator
  • Reception probability based on the following (toy
    sketch below)
  • signal strength
  • Transmission rate
  • Existing packets in environment
  • Random bit errors
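
For intuition only, a toy reception model along these lines might look like the sketch below; the bit-error formula, thresholds, and parameters are invented for illustration and are not the simulator the authors used.

import random

def packet_received(signal_dbm, rate_mbps, overlaps_other_tx, length_bits):
    """Toy model: a packet overlapping another transmission is lost; otherwise
    it survives independent random bit errors whose rate grows as the signal
    weakens and the transmission rate increases (illustrative numbers only)."""
    if overlaps_other_tx:
        return False
    ber = min(0.1, 1e-6 * (rate_mbps / 6.0) * 10 ** ((-70.0 - signal_dbm) / 10.0))
    return random.random() < (1.0 - ber) ** length_bits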

51
Wit Evaluation
  • Simulated Environment (continued)
  • 10 randomly distributed monitors
  • Detailed logs of simulated packet generation and
    simulated packet collection.

52
Wit Evaluation
  • Merging
  • Check correctness; characterize quality of time
    synchronization
  • Basis for waterfall merging
  • Inference
  • Check ability to infer packet reception statuses
    and missing packets
  • Estimating Contenders
  • Run dimWit on merged traces and compare against
    logs

53
Wit Evaluation
  • Results
  • Will limit results here to the high-priority
    end result: the contenders.
  • Worst case: in a simulation with 90% of packets captured,
    dimWit is within ±1 in 87% of cases
  • In a smaller simulation with 98% of packets captured,
    estimates are within ±1 for 95% of the cases.
  • Closer study reveals
  • High error values tend to correspond to cases
    with a high number of contenders

55
Inference Versus Additional Monitors
56
Inference Versus Additional Monitors
  • Both more inference and more monitors increase
    quality of results
  • Can't Increase Both in Real Life
  • Which Has More Bang-for-the-Buck?
  • Tests show diminishing returns as the number of
    monitors increases
  • Expected Result

57
Applying To Live Environment
58
Applying To Live Environment
  • SIGCOMM 2004 Conference wireless environment
  • 4 days
  • 550 attendees
  • Large / busy setting
  • 5 Access Points
  • Channels 1 and 11
  • Internet via DSL access lines
  • Interfering Wireless Networks
  • Number of transient wireless networks
  • Hotel Wireless Network
  • Private Wireless Network on Ch 6
  • Monitoring 24/7 During Conference

59
Applying To Live Environment
  • Results
  • Successful merge trace produced for each channel
  • One monitor didn't have enough references in
    common with the merged trace, so it was excluded
  • Lesson Learned: placement of monitors matters
  • Significant overlap in what each monitor hears
  • Additional monitors increase the number of unique
    packets captured
  • True even when two monitors right next to each
    other
  • Therefore, even a dense array of monitors will miss
    packets

60
Applying To Live Environment
  • Results
  • nitWit inferred that 80% of unicast packets were
    received by their destination
  • nitWit inferred that 90% of total packets were
    captured by the monitors
  • dimWit determined that Uplink to the AP was more
    reliable than the downlink
  • Medium was inefficiently utilized
  • Reception probability did not decrease with
    contention
  • Performance was stable at high contention levels

61
Concluding Remarks
  • Wit implementation provides live wireless data
    not previously available
  • Measurement-driven analysis, implemented by Wit,
    successfully evaluated
  • Further Study warranted
  • Will lead to increased efficiency of Wireless LANs