Background : The Product - PowerPoint PPT Presentation

Provided by: patrickp5 · 87 slides

Transcript and Presenter's Notes



1
(No Transcript)
2
  • The semester was
  • Intense
  • Our hearts were in the work

Yeah
(Andrew Carnegie)
3
  • We also worked smart

4
  • Used status meetings for technical discussions
  • 3/4 4/4 (go figure)

5
Effort Distribution
Operations Meetings, Risk Eval, Process, slides
6
Weekly Team Effort - Spring 07
60 hrs/week average (team)
7
  • Because of all that
  • We came out strong.

8
  • The next step?
  • We can't wait to

9
Have A Life!
We deserve it
10
Our Client
11
His preferences
12
Our Client's Intuition
13
What does our client want?
14
Google's Libjingle
15
Libjingle Example
16
What did we do?
  • Our background (or lack thereof)
  • Took 15-744 Computer Networks
  • Studied (books, papers, RFCs)
  • Talked to experts
  • Dave Andersen, Daniel Plakosh, Paulo Marques,
    Tony, Brian McBarron
  • Performed walkthroughs

17
We Learned! "The eye sees only what the mind is
prepared to comprehend." (Henri Bergson)
18
Is there any problem?
19
Project Context
Network (Router)
Peer A (Sender)
Peer B (Receiver)
20
C&C View
21
For the technically inclined, three slides
22
For the technically inclined
  • Our client thinks that TCP's sawtooth shape is
    bad because it forces packet losses

[Figure: TCP cwnd (congestion window) over time - the sawtooth, with a packet loss at each peak. (Courtesy of Dave Andersen)]
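The AIMD (additive-increase, multiplicative-decrease) behavior behind this sawtooth can be sketched in a few lines of Python. This is an illustrative simulation with a made-up loss threshold, not the team's code:

```python
# Illustrative AIMD sketch: cwnd grows by 1 segment per RTT until a loss,
# then halves. The loss threshold (40 segments) is invented for the demo.
def aimd_trace(rtts, loss_cwnd=40, start=1):
    cwnd, trace = start, []
    for _ in range(rtts):
        trace.append(cwnd)
        if cwnd >= loss_cwnd:          # window overran capacity: packet loss
            cwnd = max(1, cwnd // 2)   # multiplicative decrease
        else:
            cwnd += 1                  # additive increase
    return trace

trace = aimd_trace(100)  # climbs 1..40, drops to 20, climbs back to 40, ...
```

Plotting `trace` against time reproduces the sawtooth above: every forced loss throws away half the window, which is exactly the client's complaint.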
23
For the technically inclined
  • He thinks that a smoother curve will achieve
    higher throughput. (That makes sense)

[Figure: a smoother cwnd (congestion window) curve over time]
24
For the technically inclined
Next slide focuses on this area
[Figure: cwnd over time, with the area of interest highlighted]
25
For the technically inclined
26
Let's see how it behaves in practice
27
PATRICIO
28
  • Feasible
  • Testable
  • Measurable
  • Achievable Project
  • Now for our approach

29
Perspectives on Process
  • Operations
  • Client Management
  • Architectural Reconstruction
  • Handling Uncertainties
  • Planning
  • Development Progress

30
Operations
  • What's our process?
  • Iterate on the Requirements Specification
  • Iterate on the Design Specifications
  • Verify design work against our SRS
  • Strive toward a defined threshold of success
  • Evaluate project risk each cycle

31
Added chop-chop-tiveness to TSP
  • Tailored peer evaluations
  • Embraced satisfaction surveys
  • Tracked process conformance, then reported it at
    status meetings
  • Participated in improvement via post-mortem
  • Collaborated during common work hours
  • Adopted email templates for efficient
    communication
  • Held knowledge sharing meetings (technology and
    process)

32
Takeaways from courses
  • 15-744 Computer Networks
  • 17-668 Computer Networks Mini
  • 17-615 Software Process Definition
  • 17-610 Risk Management for Software Systems
  • Architecture
  • QA scenarios helped us nail down our requirements
  • Architecture reconstruction can help us gain more
    insight into performance issues of the protocol
  • Analysis
  • Coverity Prevent for static analysis

33
Client Management
  • How do we operate with the client?
  • Structured questions for client meetings
  • Maintained a client history log
  • Emailed reminders of our Wiki and project
    documents
  • Each iteration on our Requirements Specification
    shows it is working

34
Libjingle Architectural Reconstruction
  • What we're dealing with
  • Code that evolved from multiple company merges
  • Brittle and complex code paths
  • How we handled it
  • Walkthroughs with Libjingle Chief Architect
  • Analysis of code and team knowledge sharing
  • We have completed iterations of our design
    specifications

35
Handling Uncertainties
  • Identified the top risks during each cycle

36
Handling Technical Uncertainties
  • Experimentation
  • Objective was to uncover the dark areas
  • Used the ACDM experiment templates
  • A lot more ahead, but we've accomplished
  • Understanding some client assumptions
  • Interfacing our architecture with third party
    components

37
Planning
  • How do we plan?
  • TSP development strategy
  • Wideband Delphi estimation
  • MS-Access custom TSPi tool
  • Cycles planned initially, weekly updates
  • Effort, task completion, and EV tracking

38
A Look at Estimation
39
Summer Plan
40
Development Progress
  • Is it working? Why the 15% deviation?
  • Next cycle, we will buffer for unplanned tasks

41
Accomplishments
  • Requirements Specification Iteration
  • Protocol Design Specification
  • Test Suite Design Specification
  • Libjingle architecture recovery
  • Summer plan
  • Test environment setup

42
  • One last accomplishment
  • We understand that we ourselves are our best
    assets.
  • We are a team!

43
Future Work
  • Verify we're following initial proposals
  • Hold common working hours during summer
  • Average a 48-hour work week during summer
  • Continue process improvement via post-mortem
  • Attempt to extend Coverity Prevent license
  • Add entry/exit criteria to our summer tasks

44
Lessons Learned
  • Even if your customer is in software development,
    it still doesn't mean they can describe exactly
    what they want.
  • Communicate as much, and in as many ways, as
    possible
  • n developers have n(n-1)/2 communication
    paths
  • chop-chop has 10 paths, not including our 4
    mentors
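The n(n-1)/2 figure is just the number of edges in a complete graph on n nodes; a quick check of the team's numbers:

```python
# Communication paths among n people = edges of a complete graph on n nodes.
def comm_paths(n):
    return n * (n - 1) // 2

print(comm_paths(5))   # the 5-person chop-chop team -> 10 paths
print(comm_paths(9))   # team plus the 4 mentors -> 36 paths
```

Adding the four mentors more than triples the paths to keep open, which is why the structured communication practices above mattered.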

45
More Lessons Learned
  • We need to account for unplanned tasks in our EV
  • Make design decisions and move on

46
Questions?
47
Questions?
48
Project goals
  • Implement file transfer protocol
  • Implement test suite to check correctness and
    measure performance

49
Gantt Chart Spring 07
50
Deliverables Threshold of Success
  • Implement a protocol for the correct transfer of
    files of at least 10Mb between two computers
    (protocol properties and correctness specified in
    the SRS)
  • Implement a test suite that
  • measures correctness and performance as defined
    in SRS
  • produces a report of those results (report
    structure defined in design document)

51
Team Threshold of Success
  • Team members work on assigned tasks to the level
    of quality and effort (assigned hours per task)
    specified in the project schedule.
  • Team members can stray up to 10% of the estimated
    schedule as long as they update the team when the
    original deadline is reached.

52
Roles
53
(No Transcript)
54
Libjingle Connectors
  • Uses Signal/Slot Library
  • Common interface for components
  • Components do not need to know what they're
    calling
  • Components do not need to know what's calling
    them
  • Emitted signals call all connected slots
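The signal/slot idea can be sketched as a toy in Python (libjingle's real implementation is a C++ template library; this sketch only shows the decoupling the bullets describe):

```python
# Toy signal/slot sketch: emitter and receivers share only the Signal object,
# so neither side needs to know the other's identity.
class Signal:
    def __init__(self):
        self._slots = []

    def connect(self, slot):
        self._slots.append(slot)      # a slot is any callable

    def emit(self, *args):
        for slot in self._slots:      # an emitted signal calls every slot
            slot(*args)

received = []
on_packet = Signal()
on_packet.connect(received.append)    # receiver registers a slot
on_packet.emit("payload")             # sender just emits
print(received)                       # -> ['payload']
```

The design choice is the common interface: components publish and subscribe to signals rather than holding references to each other, which is what made the reconstruction of call paths nontrivial.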

55
Experiment: Extended Sequence Numbers
56
Experiment: pcp Direct Connections
57
Packet Header Comparison
58
RUDP Worst Case
59
Estimation - Summer
  • Estimation for development and QA tasks
  • Minimum: 754 hours
  • Most likely: 1024 hours
  • Maximum: 1352 hours
  • Productivity measure: 67%
  • With 30% buffer
  • Available hours: 5 members × 32 hours × 12 weeks
    = 1920
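One common way to roll a minimum/most-likely/maximum triple into a single figure is the PERT weighted average. The slides don't say which formula the team used, so this is only an illustration with the numbers above:

```python
# PERT-style three-point roll-up (illustrative; not necessarily the
# formula behind the team's Wideband Delphi estimates).
def pert_hours(minimum, likely, maximum):
    return (minimum + 4 * likely + maximum) / 6

expected = pert_hours(754, 1024, 1352)   # weighted toward the likely case
available = 5 * 32 * 12                  # 5 members x 32 h/week x 12 weeks
print(round(expected, 1), available)     # -> 1033.7 1920
```

An expected ~1034 hours against 1920 available leaves room for the 30% buffer and the unplanned tasks called out in the lessons learned.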

60
Summer Plan - Estimates
61
@Risk simulation results
  • 95% probability of completing the project by
    August 2, 2007
  • 100% probability of completing the project by
    August 8, 2007

62
Architectural view - available
63
Architectural views - recovered
64
Architectural view test suite
65
Architectural view test suite
66
(No Transcript)
67
(No Transcript)
68
Protocol
69
Test Suite
70
Protocol - Work Breakdown Structure
1.1 Simple file transfer
1.2 Packet reordering and retransmission
1. Protocol impl.
1.3 RTT calculation
1.4 Window adjustments
1.5 Multiple peer connections
71
Test Suite - Work Breakdown Structure
2.1 libjingle wrapper
2.2 NISTNet wrapper
2. Test suite impl.
2.3 logger
2.4 Sender agent
2.5 Receiver agent
72
Implementation plan
  • Build test suite parallel to protocol
  • Test and release an increment of the protocol in
    every cycle
  • Analyze protocol in different environments

73
Quality goals
  • Software Req. Spec.: < 1 major defect
  • Design: < 1 major defect
  • Coding standards
  • Bugs/KLOC: < 5

74
Severity of Defect
  • Serious
  • SRS: errors or ambiguity in requirements or
    quality attributes
  • Design Docs: errors in architectural views;
    quality attributes not promoted
  • Code: wrong functionality

75
Map Goal - Strategy
76
For Code
  • Static Analysis (cool!)
  • Reviews
  • Peer review (checklist)
  • Unit testing
  • Integration testing.
  • System testing (performance, reliability)

77
Responsibilities
78
Responsibilities
79
Environment Setup for testing
  • NISTNet for network emulation
  • Gnuplot for plotting graphs
  • Bugzilla for bug tracking

80
For the technically inclined
  • Graph represents an ideal network
  • One flow
  • 10 ms RTT
  • No buffering
  • Less than 26 Mbit/s
  • ((32 × 1024 × 8 bits / 10 ms) × 1000) / 1,000,000
    ≈ 26.2 Mbit/s
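The back-of-the-envelope number is just window size divided by RTT. Assuming the 32 KB window implied by the formula above:

```python
# Window-limited throughput for the ideal single-flow case: cwnd / RTT.
window_bits = 32 * 1024 * 8       # 32 KB window, in bits (assumed from the slide)
rtt_seconds = 0.010               # 10 ms round-trip time
throughput_mbps = window_bits / rtt_seconds / 1_000_000
print(round(throughput_mbps, 1))  # -> 26.2
```

With no buffering, the sender can have at most one window in flight per RTT, which caps the ideal network at about 26 Mbit/s regardless of link speed.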

81
What went well
  • Balanced management activities and deliverables
  • Defined threshold of success
  • Tracked earned value
  • Planned for summer

82
What could have been better
  • Communicate with the client about how we plan to
    approach the problem
  • Document the requirements so that both client and
    team have a clear understanding
  • Plan ahead and create prototype to mitigate
    technology risks and satisfy client

83
Process improvements for summer
  • Continuously identify and mitigate risks
  • Find dependencies between tasks
  • Have entry and exit criteria for tasks
  • Conduct knowledge sharing sessions on demand
  • Use quality status reports for code
  • Plan weekly

84
Risk mitigation
  • Used brainstorming sessions for mitigating top
    three risks
  • Tracked and re-prioritized risks

85
Mitigation strategies for top 3 risks