1
ITOM 6231 Special Topics in ITOM Project
Management (Section 6)
2
Integration and Testing
3
Testing during the project
  • Testing should be incorporated from the very
    beginning and throughout the project
  • Planning/inception phases
  • Measurable test results defined in project
    objectives section of project charter
  • Testing team should be incorporated into your
    project organization
  • Analysis/elaboration phases
  • Use case variations
  • Use case scenario development
  • Design/construction phases
  • Unit testing during code development
  • Extreme Programming (XP), an alternative SDLC
    methodology, actually requires building tests
    prior to building code

4
The role of the quality assurance (QA) team
  • The QA team is normally a separate organizational
    component of your project team
  • Need to be objective
  • Able to focus on testing activities
  • May be shared across project teams
  • The team should start as early as possible
  • Saving testing until the end only allows you to
    put metrics on the results; it doesn't allow you
    to react to the results
  • Need to balance resource costs vs. value, however
  • Should be kept out of any decision making or
    product directional discussions

5
QA process
(Diagram: the artifacts of the QA process - Test Matrix, Test Plan,
QA Assessment, Traceability Matrix, Defect Reports, Test Cases, Test Scripts)
6
Pieces of the QA process
  • Test matrix
  • Defines the QA methodologies for the project
  • Looks for overall holes in what is defined for
    the project so that QA's job can be comprehensive
  • Test plan
  • A QA specific project plan
  • Should be finalized in parallel with the overall
    project plan and elaborated with it as well;
    however, the specific items on the plan cannot
    change or QA will be invalid
  • Traceability matrix (a small sketch in code
    follows this list)
  • Tracks which tests have been run, which are
    scheduled, etc.
  • Traces back to specific requirements and features
    of the system
  • Used as a measuring stick to determine how much of
    the system has been tested and what amount has
    been validated
  • Test cases / test requirements
  • Used for manual testing of the system
  • Describes a generic input/output that is expected
    from the system
  • Anomalies between test cases and actual results
    are the basis of defects
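
As a concrete (and hypothetical) illustration of the traceability matrix
described above, the short Python sketch below maps requirement IDs to the
test cases that cover them and rolls that up into "how much is tested / how
much is validated" numbers. The requirement and test-case IDs and the
coverage_summary helper are invented for illustration; they are not taken
from any particular QA tool.

# Hypothetical traceability matrix: requirement IDs mapped to the test cases
# that cover them and the latest result of each test case.
traceability = {
    "REQ-001": {"TC-01": "passed", "TC-02": "failed"},
    "REQ-002": {"TC-03": "passed"},
    "REQ-003": {},  # no tests defined yet -- a coverage hole
}

def coverage_summary(matrix):
    """Report how much of the system is covered by tests and how much passed."""
    total = len(matrix)
    covered = sum(1 for tests in matrix.values() if tests)
    validated = sum(
        1 for tests in matrix.values()
        if tests and all(result == "passed" for result in tests.values())
    )
    return {
        "requirements": total,
        "covered_pct": round(100.0 * covered / total, 1),
        "validated_pct": round(100.0 * validated / total, 1),
    }

print(coverage_summary(traceability))
# -> {'requirements': 3, 'covered_pct': 66.7, 'validated_pct': 33.3}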

7
Pieces of the QA process (cont)
  • Test scripts / test scenarios
  • A particular process in the system to be tested
    end-to-end
  • Typically used for regression testing or to
    measure major feature completion
  • Defect reports
  • Reporting of bugs / defects / anomalies, etc.
    within the system
  • Important focus of the project manager since this
    is a large component of understanding remaining
    scope
  • QA risk assessment (a minimal metrics sketch
    follows this list)
  • The overall status of the project from a QA
    perspective
  • % of system tested, % of system passed vs. % of
    system failed, total number of high, medium or
    low defects, etc.
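
The QA assessment metrics above can be illustrated with a small, purely
hypothetical Python sketch: given per-test results and a defect list, it
prints the percentage of the system tested, passed vs. failed, and defect
counts by severity. The data and field names are invented for illustration.

# Hypothetical QA-assessment roll-up from raw test results and defect records.
from collections import Counter

test_results = {"TC-01": "passed", "TC-02": "failed",
                "TC-03": "passed", "TC-04": "not run"}
defects = [{"id": "D-1", "severity": "high"},
           {"id": "D-2", "severity": "medium"},
           {"id": "D-3", "severity": "low"}]

executed = [r for r in test_results.values() if r != "not run"]
print("tested:  %d%%" % round(100 * len(executed) / len(test_results)))
print("passed:  %d%%" % round(100 * executed.count("passed") / len(executed)))
print("failed:  %d%%" % round(100 * executed.count("failed") / len(executed)))
print("defects by severity:", dict(Counter(d["severity"] for d in defects)))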

8
White-box vs. black-box testing
White-box / Structural Testing: known inputs and known outputs, with the
internal structure of the code visible to the tester
Black-box / Functional Testing: known inputs and known outputs, with the
internals treated as an unknown ("?") box
9
Types of testing
  • Unit testing (see the example after this list)
  • White-box testing of individual code
    modules/methods, i.e., a unit
  • Written by the developer; should be automated so
    that all new builds can call the unit tests to
    regression-test the code
  • Smoke testing
  • Turn it on and see if it blows up
  • Used to test the basic functionality of the
    system following a new build, etc.
  • Integration testing (also String testing)
  • End-to-end testing of software components to make
    sure that they can successfully talk with each
    other
  • String testing is the idea of testing from the
    GUI all the way back to the database
  • Should also focus on installation/side-by-side
    existence issues
  • Acceptance testing (also known as user acceptance
    testing, UAT)
  • Performed towards the end of the testing process
    to simulate a launched system
  • Run by end-users/business sponsors of the project
    to make sure that the system works as planned
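
To make the unit-testing bullet above concrete, here is a minimal sketch
using Python's built-in unittest module. The apply_discount function and the
test names are invented for illustration; the point is that tests like these
live next to the code and can be rerun automatically (e.g., "python -m
unittest") on every new build, which is what makes regression unit testing
cheap.

# Hypothetical white-box unit test of a single function (the "unit").
import unittest

def apply_discount(price, percent):
    """Unit under test: return the price after applying a percentage discount."""
    if not 0 <= percent <= 100:
        raise ValueError("percent must be between 0 and 100")
    return round(price * (1 - percent / 100), 2)

class ApplyDiscountTests(unittest.TestCase):
    def test_typical_discount(self):
        self.assertEqual(apply_discount(100.0, 15), 85.0)

    def test_rejects_invalid_percent(self):
        with self.assertRaises(ValueError):
            apply_discount(100.0, 150)

if __name__ == "__main__":
    unittest.main()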

10
Types of testing (cont)
  • Beta testing
  • Beta system: functionally complete but with known
    defects
  • Opened up into production for a limited user
    community, normally invited to try the system
    and iron out any major issues that are difficult
    to simulate purely with a QA team
  • Pilot testing: a similar concept, but with a more
    controlled group or tailored to a specific type
    of end user instead of the general user
    population
  • Usability testing
  • Typically in a controlled/monitored environment
    to capture how end-users interact with the system
  • Normally run during the design/elaboration phase
  • Regression testing
  • Regression means to re-run exactly the same set
    of tests on the system as had been run previously
  • Typically run after a new build to ensure that
    fixed defects do not introduce new defects into
    the general code base
  • As opposed to testing just documented changes to
    the system

11
Types of testing (cont)
  • Benchmarking
  • Comparison of the system (especially performance)
    against industry standards or expected results
  • Used to find potential non-functional defects
  • Performance testing (a minimal load-test sketch
    follows this list)
  • Used to test how the system performs in a real
    environment under expected production loads
  • Includes load, stress and stability testing
  • Security testing
  • Used to test the overall system against potential
    security issues
  • Tests all components of the application including
    hardware, network, software (operating system,
    middleware, custom coded components, packages,
    etc.)
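
As a toy illustration of the performance-testing idea above, the sketch
below drives a stubbed "system call" with 20 concurrent workers and reports
latency percentiles. The stub, worker count and request count are all
invented; a real load test would replace call_system() with requests against
a production-like environment and compare the results to agreed targets.

# Hypothetical load-test sketch: concurrent calls against a stubbed system,
# followed by a simple latency summary.
import random
import statistics
import time
from concurrent.futures import ThreadPoolExecutor

def call_system():
    """Stand-in for one request to the system under test."""
    time.sleep(random.uniform(0.01, 0.05))  # simulated service time

def timed_call(_):
    start = time.perf_counter()
    call_system()
    return time.perf_counter() - start

with ThreadPoolExecutor(max_workers=20) as pool:        # 20 concurrent "users"
    latencies = list(pool.map(timed_call, range(200)))  # 200 requests total

print("median latency: %.3f s" % statistics.median(latencies))
print("95th percentile: %.3f s" % statistics.quantiles(latencies, n=20)[-1])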

12
Code complete
  • Important code milestones
  • Code freeze / code complete: the date when no new
    features of the system should be implemented;
    only defect fixes are allowed at this point
  • Alpha: a release/build of the system that is
    functionally incomplete, with known defects
  • Beta: a release/build of the system that is
    functionally complete, with known defects
  • RC (Release Candidate): a release/build of the
    system that is functionally complete with no
    known defects
  • GA (General Availability): a release of the
    system
  • It is extremely important to stick to code freeze
    dates
  • Regression testing will hopefully find bugs in
    any new releases, but there is no guarantee
  • Side effects are the biggest potential
    challenge to a program; with a GA looming, there
    is no contingency left in case of mistakes

13
Deployment / Project Closure
14
Deployment
  • Often the calm before the storm
  • Hopefully, you've tested the system, had users
    accept it, passed all of your performance
    testing, etc.
  • In theory, there is no difference between theory
    and reality; but in reality, there is
  • Do not dramatically staff down just prior to
    deployment unless it is out of scope
  • Make sure users are fully trained (as
    appropriate)
  • Cut-over plan
  • At the time of go-live there is normally a
    cut-over plan that is prepared for the final
    days/hours
  • Includes any last minute data migration
  • Includes any last minute configuration (e.g.,
    making the site available on the Internet)
  • Includes resource plans / communication plans for
    support
  • Tier 1: first-level support for basic issues such
    as computer problems
  • Tier 2: second-level support for
    application-related issues; common problems that
    may require some level of administrative access
    to fix
  • Tier 3: one-off situations that require
    technical knowledge of the system
  • Tier 4: issues caused by defects that need to be
    fixed in the system

15
Project close out
  • Close the books on the project
  • Finalize all change requests, vendor purchase
    orders, etc.
  • Update any contracts that may be milestone based
  • Plan a lessons learned session
  • Hagden's law: "Good judgment comes from
    experience. Experience comes from bad judgment."
  • Can often be heated; use an external
    facilitator
  • Techniques like SWOT analysis are a good tool for
    this, especially since they tend to be balanced
    and start with strengths over improvement areas
  • Taking action on the lessons learned
  • For each major lesson, assign a resource (or
    team) to make progress; oftentimes it may just
    be further investigation, but it might be
    something you can fix
  • Improvements from lessons learned are the best
    way to make the process much smoother next time

16
Project closure report
  • Just as you have a project kick-off, you should
    hold a project close-out meeting
  • Review the goals of the project and emphasize the
    measurable objectives; you are on the verge of
    achieving them (hopefully!)
  • Provide details on how/why the project was closed
    (not all close-outs are good news)
  • Provide a complete listing of all deliverables;
    provide a project binder to key stakeholders
  • Identify outstanding risks that were not fixed
    within the scope/budget of the project
  • Review budget (baseline vs. actual)
  • Review next steps following the project for
    either operations, the next phase, etc.
  • Closing an abandoned project
  • You may or may not go through as formal a
    process, especially since it can be demoralizing
  • Still important to have a lessons learned session
    (even if informal)
  • Be a good leader; don't let team members get
    overly emotional

17
Effective Leadership
18
What makes a good leader?
  • Good decision making skills
  • Ownership/responsibility
  • Good problem solving skills
  • Assertive
  • Positive attitude and eagerness to learn
  • Good listening skills
  • Respectful of others
  • Values diversity
  • Recognizes and resolves conflict
  • Empowers others
  • Effective meeting-management skills
  • Sense of dedication and loyalty to the project,
    team and company
  • Formulates a clear vision and shares it with
    others
  • Effective delegation skills
  • Keeps people enthused and motivated
  • Good writing/presentation skills and excellent
    verbal communication skills
  • Good negotiator
  • Abundance of energy
  • Patience

19
Leading a team
  • Understand what is important to your team members
  • Leaders are good at setting overall direction,
    but good team leaders are able to translate that
    to individuals; an individual sense of ownership
    is key
  • Recognize opportunities to coach/mentor
    individual team members, and don't forget to
    leverage their strengths
  • Working side-by-side is critical to acceptance
    and respect
  • Manage performance and performance reviews
  • Set clear expectations for individuals
  • At least every 3 months, provide feedback on
    progress against their individual goals,
    independent of project success (they may be
    under- or over-achieving)
  • Give individuals achievable goals but also give
    them stretch goals
  • Use consistent ratings across individuals

20
Managing conflict
  • Fight human nature
  • Conflicts are often brushed under the rug, but
    they are ultimately more hurtful to individuals
    and the project
  • Conflicts must be handled face to face; start
    off individually, then have the conflicting
    parties in the same room
  • Don't be afraid to ask for suggestions, etc.; a
    self-effacing attitude can help disarm someone
    and let them feel that they are being listened to
  • E.g., active listening, i.e., repeating back to
    them what you are hearing
  • Speak in non-emotional terms if you are trying to
    defuse a highly charged situation
  • Don't assume that you can fix a problem and it
    will go away; it will usually resurface,
    especially if you are dealing with certain
    individuals or plain personality conflicts

21
Leadership and communication styles
(Diagram: a 2x2 grid of communication styles, with Task Oriented (Analytical)
vs. People Focused (Emotional/Feeling) on one axis and Asking vs. Telling on
the other)
  • Analytical (Professor)
  • Strengths: Logical, Dependable, Organized,
    Precise, Systematic
  • Challenges: Over-analyzes, Avoids confrontation,
    Draws attention away from issues, Retreats to
    other distractions, Delays decisions, Controls
    emotions
  • Driver (Controller)
  • Strengths: Independent, Persistent, Strong-Willed,
    Pragmatic, Efficient
  • Challenges: Wants accomplishment, Confronts
    others, Focuses on results/performance, Looks for
    rationale, Money motivated
  • Amiable (Caregiver)
  • Strengths: Cooperative, Negotiator, Supportive,
    Loyal, Listener
  • Challenges: Wants acceptance, People-focused,
    Yields to other viewpoints, Smooths relationships,
    Avoids conflict
  • Expressive (Visionary)
  • Strengths: Independent, Persistent, Strong-Willed,
    Pragmatic, Efficient
  • Challenges: Wants applause, Confronts others, Very
    opinionated/egotistical, Blames others on a
    personal level, Displays extreme emotion
22
Alternative behaviors
If your style is ..., consider:

Analytical (Professor)
  • Show emotional support for the feelings of others
  • Make decisions on the basis of intuition when
    appropriate

Driver (Controller)
  • Listen
  • Show practical concern for others; express
    respect for their ideas

Amiable (Caregiver)
  • Initiate action; provide some direction and stick
    to goals and objectives
  • Be firm and self-assured about your position

Expressive (Visionary)
  • Check and consider the facts and feelings of
    others
  • Keep emotions open but under control

23
CMM
24
The Capability Maturity Model (CMM)
  • The Software Engineering Institute (SEI) at
    Carnegie Mellon University was awarded the
    contract to create a Capability Maturity Model
    (CMM) for software in 1984, funded by the US
    Federal Government (http://www.sei.cmu.edu)
  • It is a model that describes how software
    engineering practices in an organization evolve
    under certain conditions
  • The work performed is organized and viewed as a
    process
  • The evolution of the process is managed
    systematically
  • According to SEI documents, it is
  • "a framework which delineates paths for
    organizations to follow that want to increase
    their software process capability"
  • "a model that describes key characteristics or
    attributes that would be expected to typically
    characterize an organization at a particular
    level of maturity"
  • "a normative model in the sense that it
    addresses what is considered to be normal
    behavior for an organization doing large-scale
    projects"

25
CMM Defines Five Software Process Achievement
Levels
Level 1: Initial
  • Chaotic, most work is ad hoc
  • People argue against engineering discipline,
    saying it hurts creativity
  • Project success is due entirely to the heroic
    efforts by people
  • Standards and practices are sacrificed for the
    schedule
  • Results are unpredictable and quality is uneven

26
CMM Defines Five Software Process Achievement
Levels (cont)
Level 2: Repeatable
  • A documented process is in use (even in a
    crunch)
  • Similar projects have consistent results in a
    stable environment
  • Basic project management is practiced
  • Reasonable commitments are planned
  • Software engineering techniques are in use

27
CMM Defines Five Software Process Achievement
Levels (cont)
Level 3: Defined
  • A standard software development methodology is
    in place
  • It is used consistently throughout the
    organization
  • Everyone knows their role
  • The methodology is specifically adapted for
    each project
  • Results are predictable and quality is
    consistent

28
CMM Defines Five Software Process Achievement
Levels (cont)
Level 4: Managed
  • Measurements are taken at all points during
    development
  • Statistical process control principles address
    process variations
  • Detailed process and product data are available
  • Quality targets are set and predictable

29
CMM Defines Five Software Process Achievement
Levels (cont)
Level 5: Optimizing
  • Measurements are used to continuously improve
    the process
  • Chronic causes of poor performance are identified
    and eliminated
  • New techniques are prototyped, piloted and
    introduced

30
Cultural Changes Occur at Each Level in the CMM
  • Empowered and Innovative Culture (Optimizing) --
    processes are steadily improved
  • Measured Excellence (Managed) -- processes perform
    within quantitative bounds
  • Common Engineering (Defined) -- common processes
    used across the organization
  • Culture of Commitment (Repeatable) -- process is
    protected within projects
  • Imported Cultures (Initial) -- individuals do
    their own thing
31
CMM is Based on a Process Maturity Framework
Key Process Areas: clusters of related activities
that, when performed collectively, achieve goals
that enhance process capability
32
CMM is Based on a Process Maturity Framework
(cont)
Key Process Areas constitute Process Maturity: an
organization's ability to consistently follow
and improve its process
33
CMM is Based on a Process Maturity Framework
(cont)
Key Process Areas enable, and Process Maturity
indicates, Process Capability: the range of
results expected from following the process
34
CMM is Based on a Process Maturity Framework
(cont)
Process Capability predicts Process Performance:
the actual results achieved from following a
process
35
There Are Key Process Areas for Each Maturity
Level
36
Key Process Areas (KPAs) Level 1 (Initial)
  • None

37
Key Process Areas (KPAs) Level 2 (Repeatable)
  • Requirements Management
  • Requirements/scope can be baselined
  • Plans, activities and deliverables are consistent
  • Software Project Planning
  • Project and activity estimates are documented
    and tracked
  • People agree with their project commitments
  • Software Subcontract Management
  • Contractor selection is effective, contractors
    maintain ongoing communications and meet project
    commitments
  • Software Quality Assurance
  • QA activities are planned, adhered to and
    verified
  • Noncompliance to QA standards, procedures and
    requirements is resolved in the project and
    addressed by senior management
  • Software Configuration Management
  • Software configuration management activities are
    planned and controlled
  • Software products are controlled and status
    maintained

38
Key Process Areas (KPAs) Level 3 (Defined)
  • Organization Process Focus
  • Process development and improvement activities
    are coordinated across the organization
  • Process standards are used to evaluate the
    strengths/weaknesses of the software processes
  • Improvement activities are planned
  • Organization Process Definition
  • A standard software process is developed and
    maintained
  • Information is collected, reviewed and shared
  • Training Program
  • Training activities are planned for the
    skill/knowledge needs of management and
    technical roles
  • Individuals are trained in the skills needed to
    perform their roles
  • Integrated Software Management
  • The organization's standards, software process
    and plan are tailored to the project
  • Software Product Engineering
  • Engineering tasks and products are defined and
    consistently performed across the organization
  • Intergroup Coordination
  • All groups agree to customer requirements
  • Inter-group activities are identified and tracked
  • Issues are resolved
  • Peer Reviews

39
Key Process Areas (KPAs) Level 4 (Managed)
  • Quantitative Process Management
  • Metrics aspects of the software process are
    planned and quantitatively defined
  • Process capability can be expressed in
    quantitative terms
  • Software Quality Management
  • Project quality management activities are planned
    and measured
  • Priorities are defined
  • Actual versus planned quality goals are
    quantified and managed

40
Key Process Areas (KPAs) Level 5 (Optimizing)
  • Defect Prevention
  • Defect prevention activities are planned
  • Causes are identified, prioritized and
    systematically eliminated
  • Technology Change Management
  • Incorporating technology changes into the
    organization is planned
  • New technologies are evaluated to determine the
    effect on quality and productivity
  • New technologies are transferred into the normal
    software process across the organization
  • Process Change Management
  • Continuous process improvement is planned, and
    participation in these efforts is
    organization-wide
  • Standards and project processes are continuously
    improved

41
Few Organizations Have Achieved High Maturity
Levels
42
Criticisms of the CMM
  • Reflects the large, aerospace, contract-software
    development environment
  • Touches on only a small part of good software
    development practices (316 specific recommended
    practices)
  • Documentation is difficult to follow (written in
    "CMM-speak" as opposed to plain English)
  • CMM practitioners have not compiled large
    databases of project information for comparison
    purposes

43
Misconceptions About the CMM
  • Maturity Levels
  • If you are Level 1, you are pond scum
  • Level 2 is about software engineering activities
  • You must perform all activities and practices
    defined at some maturity level to achieve that
    level
  • Software measurement is not required until you
    are approaching Level 4
  • The SEI certifies an organization at a specific
    maturity level
  • Practices
  • The CMM requires specific software development
    practices, tools and methods
  • Software Quality Assurance is mostly about
    testing
  • The CMM requires software inspections to achieve
    Level 3
  • Having a tailorable process means you can do
    whatever you want
  • Requirements management is the same as
    requirements engineering
  • Application
  • You cannot work on improving key process areas
    more than one maturity level higher than your
    current level
  • The CMM mandates bureaucracy and wasteful
    paperwork
  • The CMM is a quick fix for short term problems

44
Review for exam
  • Exam format
  • Multiple choice / true false section (about 30)
  • Short answer (about 20)
  • Long answer (1 of 2 or 3)
  • What it covers
  • All materials presented in class
  • Materials from the text are fair game, but they
    will not be an emphasis of the exam
  • Rules
  • Closed book / closed note
  • You will have up to 2 hours to complete the exam
  • Final scores will be emailed out to you; if you
    would like a graded copy, please provide a
    self-addressed, stamped envelope before the exam
    so that I can send it back