1
P O I N T
April 30, 2004
End-of-Semester Presentation
2
Final Presentation on Studio Project
  • Customer: Samsung SDS
  • Team Members: Jaepyoung Kim, Hyungjin Joo, Sunghwan Choi, Heeran Youn, Hyochin Kim
  • Mentors: Mel Rosso-Llopart, Danhyung Lee, Choonbong Lim
  • Technical Writer: Poornima
3
AGENDA
  • Project Background
  • Architecture
  • Software Process
  • Next Plan
  • Lessons Learned
  • Questions

4
1. PROJECT BACKGROUND
BUSINESS
TSP Support Tool
Facilitate TSP, Minimize Effort, Maximize Data
  • Team Software Process
  • Set up a metric-based process
  • Improve quality

5
1. PROJECT BACKGROUND
TSP(1)
SEPG 2003: Applying TSP to CBD Projects
6
1. PROJECT BACKGROUND
TSP(2)
[TSP overview diagram]
  • Planning: WBS, quality, size estimation, effort estimation
  • Tracking & Control: estimation vs. actual
  • Data Collection: defect, size, time
  • Information: earned value, analysis, reasoning
  • Management: resources, artifacts, job distribution
7
1. PROJECT BACKGROUND
OVERALL PLAN
[Overall plan chart, Fall through Summer, 960 hr per bar]
  • Fall - Requirement: SOW, SRS, SPMP
  • Spring - Design: SOW (2.0), SRS (2.0), SPMP (2.0), Architecture, HLD, DLD, Test Plan
  • Summer - Implement, Test, Deploy: Source Code, Test Cases, Unit Test, Integration, User Document; Acceptance Test, Install, Delivery
(Legend: semester, hours, plan, changed plan)
8
1. SPRING Semester
WHAT WE DID
  • Delivered 22 artifacts to the customer
  • UI Standard, Naming Convention, Message Standard, Requirement List, etc.
  • First iteration of the prototype
  • Gauged our capability and performance
  • Estimated the project
  • Determined which design approach is best
  • Established standards

9
1. PROJECT BACKGROUND
Context Diagram
[Context diagram]
WBTS (Web-Based TSP Support Tool System) interacts with:
  • Administrator: creates projects
  • Team Leader: manages project data
  • Team Member: manages personal process and product data
  • Other Team Members: refer to other projects' data
  • SMTP Server: receives notification data
  • HR System: supplies personnel data
(Legend: database connection, message, interaction; external entities are people or systems.)
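
The "Notification Data" flow to the SMTP Server could look like the minimal Python sketch below. The host, port, addresses, and message text are illustrative assumptions; the slides only name the data flow.

import smtplib
from email.message import EmailMessage

def send_notification(task, assignee):
    # Build the notification message (content is illustrative).
    msg = EmailMessage()
    msg["Subject"] = f"WBTS notification: {task}"
    msg["From"] = "wbts@example.com"        # assumed sender address
    msg["To"] = assignee
    msg.set_content(f"Task '{task}' has been assigned to you.")
    # Relay through the SMTP server shown in the context diagram.
    with smtplib.SMTP("smtp.example.com", 25) as server:  # assumed host
        server.send_message(msg)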
10
2. ARCHITECTURE
ATAM-UTILITY TREE
[ATAM utility tree: key quality attributes are Usability, Modifiability, Performance, and Availability]
11
2. ARCHITECTURE
EVALUATION-ATAM
  • ATAM process evaluation
  • Driven by business goals
  • Addresses the interests of the stakeholders
  • Coordinates the different perspectives of the stakeholders
  • Articulates the stakeholders' requirements
  • Scenarios based on stimulus, source, environment, artifact, response, and response measure
  • Identifies functional, non-functional, and architectural requirements in terms of quality attributes
  • Analyzes architectural decisions
  • Risks, sensitivity points, tradeoffs
  • Provides reasoning on architectural decisions
  • Satisfies the stakeholders with the best choices

12
2. ARCHITECTURE
COMPONENT & CONNECTOR VIEW
13
2. ARCHITECTURE
PHYSICAL VIEW
14
2. ARCHITECTURE
MODULE VIEW
15
3. Software Process
1. SRE(1)
16
3. Software Process
1. SRE(2)
17
3. Software Process
1. SRE-Mitigation Plan(3)
The scope of the system's functionality might change due to the development team's design decisions.
  • Problems
  • The scope is still changeable
  • Knowledge of fault tolerance is still lacking
  • Knowledge of some usability features, such as wireless connection and local application, is poor
  • Plan
  • Freeze the architecture through ATAM with the customer as soon as possible
  • Prioritize the critical design decisions
  • The critical design decisions should be made before fixing the scope

18
3. Software Process
2. Development Process
  • First choice was RUP
  • Reasons for choosing it
  • Well-defined process (templates and examples)
  • Supporting tool (RUP web version)
  • Reflects what we learned in the Methods class
  • What we did with RUP
  • Iteration plan and test plan
  • Class diagrams and sequence diagrams for 3 use cases
  • Second choice is TSP
  • We will define the development process in the TSP Launch
  • The team will use RUP templates as supplementary material
  • What we learned from RUP
  • Too many artifacts
  • OOP concepts are essential to using RUP
  • Using RUP templates is not the same as following RUP
  • An architecture is needed to use RUP

19
3. Software Process
3. Requirement Management
  • What we did for requirements management
  • Managed requirements with use cases
  • 16 prioritized use cases
  • Analyzed quality attribute requirements with ATAM
  • Our weaknesses
  • Not using a requirements trace table (risk)
  • Documented requirements from the customer are not available
  • What we would do differently
  • Use a requirements management tool
  • Actively use the CCB (Configuration Control Board)
  • Use the TSP ANAL process for impact analysis

20
3. Software Process
4. Project Planning
  • What we did for project planning
  • TSP relaunch 3 times (75 hours)
  • All team members participate in and agree on the plan
  • Planning artifacts: the TSP workbook, including size estimation, resource estimation, and schedule
  • Our weaknesses
  • Not focusing on overall project planning
  • High cost of planning (about 10%)
  • What we would do differently
  • Strictly follow the LAU 4 process
  • Extend the launch cycle from one month to one semester

21
3. Software Process
5. Tracking & Oversight
  • What we did for tracking & oversight
  • Conducted weekly status meetings
  • Used earned value tracking (a sketch follows this list)
  • SPI (Schedule Performance Index) = Earned Value / Planned Earned Value
  • CPI (Cost Performance Index) = Planned Cost / Actual Cost
  • Replanning when SPI < 0.7 or SPI > 1.3
  • Spring semester status
  • SPI = 0.88 (12% behind schedule)
  • CPI = 1.02 (2% overestimation)
  • Weaknesses
  • No size tracking
  • No quality tracking
  • What we would do differently
  • Active QA activity for data collection
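
A minimal sketch of the tracking rules above. The function names and the sample values (normalized to reproduce the spring-semester figures) are assumptions, not taken from the team's TSP workbook.

def spi(earned_value, planned_earned_value):
    """Schedule Performance Index = Earned Value / Planned Earned Value."""
    return earned_value / planned_earned_value

def cpi(planned_cost, actual_cost):
    """Cost Performance Index = Planned Cost / Actual Cost."""
    return planned_cost / actual_cost

def needs_replanning(schedule_index):
    """Replan when SPI < 0.7 or SPI > 1.3."""
    return schedule_index < 0.7 or schedule_index > 1.3

s = spi(earned_value=8.8, planned_earned_value=10.0)  # 0.88 -> 12% behind
c = cpi(planned_cost=10.2, actual_cost=10.0)          # 1.02 -> 2% over
print(f"SPI={s:.2f}, CPI={c:.2f}, replan={needs_replanning(s)}")
# -> SPI=0.88, CPI=1.02, replan=False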

22
3. Software Process
6. Quality Assurance
  • Our weaknesses
  • Even though the team has a quality assurance plan, it is not detailed.
  • We don't have a quality assurance process.
  • No one watches the QA activity of the quality manager.
  • What we would do differently
  • Separate the team leader and quality manager roles.
  • Define a quality assurance process.
  • Conduct a QA audit every month, with the team leader reviewing the audit results.

23
3. Software Process
7. Peer Review
  • What we did for peer review
  • We conducted
  • Fagan-style formal inspections
  • Walkthroughs
  • Our weaknesses
  • Checklists for inspection do not exist
  • Due to unclear exit criteria for artifacts, the team sometimes conducts inspections on incomplete products.
  • Insight
  • 3 participants are appropriate for an inspection
  • What we would do differently
  • First define checklists for inspection
  • Make an inspection plan

24
3. Software Process
8. Test Plan
[Test schedule chart, May through August]
  • Test Planning (Apr 19)
  • Test Design (July 21 - July 25)
  • Test Development (July 30)
  • Test Execution (Aug 2)
  • Test Evaluation (Aug 4)
  • Test Features
  • Software Functionality
  • Software Performance
  • Interfaces
  • Availability (Failover / Recovery)
  • Security

25
4. Next Plan
1. Next Plan
  • Launch Next Week
  • Scope the rest of the project
  • Establish goals
  • Define team roles
  • Assess risks
  • Produce a comprehensive team plan

A defined and measured process framework for managing, tracking, and reporting on the team's work
26
5. Lessons learned
5. Lessons Learned
  • Use 60% of unit hours rather than 100% because of meetings and other overhead
  • To make studio dedicated time efficient:
  • Written agenda
  • Clear goals
  • Keep team meetings short
  • TSP Launch
  • 3 launches this semester
  • Cost
  • (ex) Last launch: 1 day of planning for a 3-week plan
  • The ratio of planned time to planning time is too small
  • It is hard to elicit all tasks at launch
  • Specifying problems is better than generalizing them for analysis
  • (ex) Goal tracking form (building a jelled team)
  • What if defects are found?
  • Revise immediately vs. defer to the next iteration

27
6. Questions
28
2. Spring Semester Milestones
II. Plan vs. Actual Milestones

Milestone                              Plan    Actual
TSP Launch                             Jan 31  Feb 2
Revise Requirement List                Feb 2   Feb 16
HLD for Project Prototype              Feb 9   Feb 2, 20; Mar 20
DLD for Project Prototype              Feb 23  Feb 15; Mar 30
Revise SRS                             Mar 1   Mar 1
Revise SPMP                            Mar 15  Mar 15
Test Case for Project Prototype        Mar 1   Mar 26
Implementation for Project Prototype   Mar 23  (pending)
Design Standard                        Mar 27  Mar 26
UI Prototype                           Mar 30  Mar 30
HLD                                    Apr 15  (pending)
Test Plan                              Apr 15  Apr 15
Mini-SRE                               Apr 22  Apr 22
ATAM                                   Apr 26  Apr 25
Architecture                           Apr 26  Apr 29
DLD                                    May 7   (pending)
29
2. ARCHITECTURE
ATAM-UTILITY TREE
30
7. Scenario 1 analysis - Availability
3. Architecture
SCENARIO: When a Servlet Container dies, the system detects the fault within 1 second, provides failover within 1 minute, and recovers the Servlet Container within 2 minutes. (An illustrative sketch follows the list below.)
  • RISK
  • R1. The Global RepMan itself can die.
  • R2. The Local RepMan itself can die.
  • R3. Socket communication can cause security problems.
  • SENSITIVITY
  • S1. The Global RepMan should know which Servlet Container is the primary.
  • S2. The Global RepMan should know where the replicas are located.
  • S3. The backup should save all logs of the primary into its own log file.
  • TRADEOFF
  • T1. The Global RepMan increases the reliability of the system.
  • T2. The Global RepMan increases usability.
  • T3. The Global RepMan can be a bottleneck for transactions.
  • T4. Socket communication can decrease security.
  • T5. There is strong coupling between the Global/Local RepMan and the Logger.
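
A minimal, illustrative sketch of the failover behavior the scenario describes: a replication manager polls the primary's heartbeat once per second and promotes a backup, which first takes over the primary's log (sensitivity S3). All class and method names are assumptions; the slides do not show the RepMan implementation.

import time

class Container:
    """Stub Servlet Container with a heartbeat flag and a log (illustrative)."""
    def __init__(self, name, alive=True):
        self.name, self.alive, self.log = name, alive, []
    def heartbeat(self):
        return self.alive

class ReplicationManager:
    DETECT_INTERVAL = 1.0  # seconds; matches "detects faults in 1 second"

    def __init__(self, primary, backups):
        self.primary, self.backups = primary, list(backups)

    def monitor(self, rounds=3):
        # Ping the primary once per detection interval.
        for _ in range(rounds):
            if not self.primary.heartbeat():
                self.fail_over()
            time.sleep(self.DETECT_INTERVAL)

    def fail_over(self):
        # Promote the first backup; it replays the primary's log (S3).
        dead, self.primary = self.primary, self.backups.pop(0)
        self.primary.log.extend(dead.log)
        print(f"failover: {dead.name} -> {self.primary.name}")

# Usage: the primary dies, the manager detects it and promotes the backup.
primary = Container("servlet-1")
manager = ReplicationManager(primary, [Container("servlet-2")])
primary.alive = False
manager.monitor(rounds=1)  # prints: failover: servlet-1 -> servlet-2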

31
8. Alternatives for Scenario 1 (Availability)
3. Architecture
  • Passive replication based on EventBus: Performance (+), Modifiability (-)
  • Passive replication based on Socket: Performance (-), Modifiability (+)
32
2. ARCHITECTURE
TRADEOFF 1
Load Balancing → Performance
Fault Tolerance → Availability, Reliability
33
Step 2. Present Business Drivers
  • Stakeholders
  • Samsung SDS, Point Team, Mentors
  • Important Functions
  • WBS Building
  • Schedule Planning, Controlling, and Tracking
  • Quality Planning, Controlling, and Tracking
  • Constraints
  • Economic constraints: resources, time
  • Technical constraints: web-based environment, COTS, reflection of what we learned in classes
  • Managerial constraints: usability, modifiability, performance, and availability

34
Step 2. Present Business Drivers (cont.)
  • Business goals and context
  • Reduce the cost of gathering and analyzing project data (size, time, defects) effectively and efficiently
  • Apply TSP to 20 projects in the next business year
  • Architectural drivers
  • Usability
  • The input time on LOGD and LOGT should be shorter than with the Excel-based TSP Support Tool.
  • Modifiability
  • The design should allow for further development and extension
  • Performance
  • The system responds within 3 seconds for 9 out of 10 users (a sketch follows this list)
  • Availability
  • Avoid single points of failure in the middle tier
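
The performance driver above is a 90th-percentile latency requirement. A small sketch of how it could be checked, using the nearest-rank percentile; the sample response times are assumptions.

def percentile(samples, pct):
    """Nearest-rank percentile of a list of response times (seconds)."""
    ordered = sorted(samples)
    rank = max(1, round(pct / 100 * len(ordered)))
    return ordered[rank - 1]

response_times = [0.8, 1.2, 1.9, 2.2, 2.4, 2.6, 2.8, 2.9, 3.0, 7.5]
p90 = percentile(response_times, 90)
print(f"p90 = {p90:.1f}s, meets driver: {p90 <= 3.0}")  # p90 = 3.0s, True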

35
Glossary
  • SUMS: Using this page, a user can establish the main plans for the artifacts to be produced during the project or cycle, including size, time, and defects.
  • TASK: To produce the defined artifacts, detailed tasks must be planned. Through this page, a user can plan team and individual tasks, record finished tasks, and assign tasks to members.
  • SCHEDULE: A user can predict the length of the project or cycle by assigning work hours for each week.
  • LOGT: A user can record what he does during work days.
  • LOGD: A user can record the defects he finds during work.
  • IRTL: A user can manage risks or issues.
  • SUMP: A report showing the progress of the project in program size, hours per phase, and defects, comparing plan and actual data.
  • SUMQ: By tracing defects, a user can report on the quality of the project.
  • IRWEEK: A weekly risk report.
  • PROJECT: A user can establish, update, and delete a project or cycle.
  • CODE: To categorize defect types, phases, roles, and so on, the system uses defined code values.
  • HRDB: Human Resource Database. The data will be imported into the system but will not be managed by it; it is used for reading only.

36
3. Software Process - Inefficient Team Meeting
1. SRE-Mitigation Plan(3)
  • Problems
  • Opinion conflicts among members
  • No clear meeting goals
  • Members arrive late.
  • Plan
  • Conflict resolution process
  • No written agenda, no meeting!!
  • Penalty: $1 per 10 minutes late

37
Estimation vs. Actual - Fall
  • Function points
  • Scope: the Excel-based TSP Support Tool
  • Unadjusted function points
  • Sum: 341
  • 40.7 man-months by COCOMO II
  • Validation for the analysis phase (a sketch follows this list)
  • Function points: 2.85 man-months × 160 hours/man-month = 456 hours
  • Actual time: 409.5 hours
  • Ratio: 409.5 / 456 ≈ 0.89
  • Future estimation
  • Product design: 985.44 hours
  • Detailed design: 1,500.32 hours
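
The validation arithmetic above, spelled out. The 160 hours/man-month conversion comes from the slide; the function name is illustrative.

HOURS_PER_MAN_MONTH = 160  # conversion factor used on the slide

def to_hours(man_months):
    return man_months * HOURS_PER_MAN_MONTH

plan = to_hours(2.85)  # 456.0 hours planned for the analysis phase
actual = 409.5         # actual analysis-phase hours
print(f"plan={plan:.0f}h actual={actual}h ratio={actual / plan:.3f}")
# -> plan=456h actual=409.5h ratio=0.898 (reported as 0.89 on the slide)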

38
Estimation vs. Actual - Spring
Next semester: 12 weeks × 48 hours/week × 5 members = 2,880 hours
60% rule: 2,880 × 0.6 = 1,728 hours (80)
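
The same capacity calculation in code, applying the 60% rule from the lessons learned; variable names are assumed.

weeks, hours_per_week, members = 12, 48, 5
raw_hours = weeks * hours_per_week * members  # 2880 raw hours
plannable = raw_hours * 0.6                   # 60% rule -> 1728 plannable hours
print(raw_hours, plannable)                   # -> 2880 1728.0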
39
GOAL - MOSP
40
Success Criteria 2003 FALL
  • Success Criteria
  • More than 10 teams can use the system by August 2004
  • System test defect density is less than 0.1 defects/KLOC (a sketch follows this list)
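
A one-line check of the defect-density criterion above; the sample counts are assumptions for illustration.

def defect_density(system_test_defects, kloc):
    """Defects found in system test per thousand lines of code."""
    return system_test_defects / kloc

print(defect_density(system_test_defects=2, kloc=25))  # 0.08 -> meets < 0.1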