Requirements Engineering: Quality Management (Presentation Transcript)
1
Requirements Engineering
  • Quality Management

2
Software Quality Management
  • What is Quality in Software?
  • The end product should meet the specification
  • Issues
  • Bad or imperfect specification
  • Non-functional requirements
  • What is Quality Management?
  • Defining standards and policies to be followed in development
  • Checking to see that they are followed
  • In addition, developing a quality culture
  • Functions of Quality Management
  • Quality Assurance: establishing the framework
  • Quality Planning: the use of the framework in planning specific projects
  • Quality Control: the process by which compliance with the standards and processes is ensured

3
Software Quality
  • Quality Management is a separate process
  • Needs independence
  • From budget
  • From schedule
  • From project management chain
  • From Product Development Groups
  • ISO 9000: a guide to the quality process
  • ISO 9001: general applicability to product development
  • ISO 9000-3: interprets ISO 9000 for software development
  • Deliverables from the software process are
    submitted to QC for review

4
ISO 9000 and Quality
  • ISO 9000 supplies Quality Models
  • A subset is developed as the organizational quality manual, which documents the organization's quality process
  • The subset is the basis used to develop each project quality plan
  • Project quality management uses the plan to
    enforce the organizational standards
  • See text for references to ISO materials

5
ISO 9001:2000 Standard
  • ISO 9001:2000 is the quality assurance standard that applies to software engineering.
  • The standard contains 20 requirements that must be present for an effective quality assurance system.
  • The requirements delineated by ISO 9001:2000 address topics such as
  • management responsibility, quality system,
    contract review, design control, document and
    data control, product identification and
    traceability, process control, inspection and
    testing, corrective and preventive action,
    control of quality records, internal quality
    audits, training, servicing, and statistical
    techniques.

6
Quality assurance and standards
  • QA defines the framework for achieving quality
    software
  • Defines the standards
  • Product
  • Process
  • Provides
  • Best practices - makes the success of others available
  • A checklist to judge whether standards have been followed
  • Continuity: institutional memory
  • Sources
  • IEEE
  • ANSI
  • US DoD
  • NATO

7
Basic definitions
  • A failure is an unacceptable behaviour exhibited
    by a system
  • The frequency of failures is a measure of reliability (a rough calculation is sketched at the end of this slide)
  • An important design objective is to achieve a
    very low failure rate and hence high reliability.
  • A failure can result from a violation of an
    explicit or implicit requirement
  • A defect is a flaw in any aspect of the system
    that contributes, or may potentially contribute,
    to the occurrence of one or more failures
  • It might take several defects to cause a
    particular failure
  • An error is a slip-up or inappropriate decision
    by a software developer that leads to the
    introduction of a defect
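As a rough illustration of how observed failures translate into a reliability figure, the following Python sketch computes a failure rate and mean time between failures; the failure log and operating time are hypothetical example values, not taken from the slides.

    # Minimal sketch: estimating reliability from observed failures.
    # The failure times and total operating time below are hypothetical example data.
    failure_times = [12.5, 40.0, 71.3, 95.8]   # hours at which failures were observed
    operating_hours = 120.0                     # total observed operating time

    failure_rate = len(failure_times) / operating_hours   # failures per operating hour
    mtbf = operating_hours / len(failure_times)            # mean time between failures

    print(f"Failure rate: {failure_rate:.3f} failures/hour")
    print(f"MTBF: {mtbf:.1f} hours")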

8
Effective and Efficient Testing
  • To test effectively, you must use a strategy that
    uncovers as many defects as possible.
  • To test efficiently, you must find the largest
    possible number of defects using the fewest
    possible tests
  • Testing is like detective work
  • The tester must try to understand how programmers
    and designers think, so as to better find
    defects.
  • The tester must not leave anything uncovered, and
    must be suspicious of everything.
  • It does not pay to take an excessive amount of time; the tester has to be efficient.

9
Standards
  • Documents require standards for
  • Production process, i.e. creation through to final print
  • Documents - structure and presentation
  • Identification
  • Structure
  • Presentation: fonts, styles, logos, etc.
  • Updates: version control
  • Interchange: exchange compatibility
  • Process
  • How do we improve quality in the product?
  • Feedback
  • Standardization

10
Quality Planning and Control
  • Plan
  • Developed early
  • Addresses the most important software quality
    attributes
  • Safety
  • Security
  • Reliability
  • Etc.
  • Control
  • Quality reviews
  • Design or program inspections: errors in requirements, design or code
  • Progress reviews: schedule, budget
  • Quality: the whole package

11
Measurement and Metrics
  • Metrics
  • Not widely used in software industry
  • Lack of standards
  • Lack of standard processes
  • Control: measures associated with the process
  • Time to repair defects
  • Time to modify or enhance
  • Predictor: measures associated with the product
  • Cyclomatic count (a small calculation is sketched at the end of this slide)
  • Fog factor
  • Size
  • Measurement process
  • Choose measurements
  • Select components
  • Measure
  • Identify anomalous measurements
  • Analyze components
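As one example of a predictor metric, the sketch below computes McCabe's cyclomatic complexity from control-flow graph counts using V(G) = E - N + 2P; the example graph figures are hypothetical.

    # Minimal sketch: cyclomatic complexity V(G) = E - N + 2P,
    # where E = edges, N = nodes, P = connected components of the control-flow graph.
    def cyclomatic_complexity(edges, nodes, components=1):
        return edges - nodes + 2 * components

    # Hypothetical example: a function with a single if/else
    # (6 nodes, 6 edges, 1 component) has complexity 2.
    print(cyclomatic_complexity(edges=6, nodes=6))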

12
Product Metrics
  • Product Metrics
  • Concerned with the software itself
  • Dynamic: measurements made during execution
  • Static: measurements made of the representations
  • Design
  • Program
  • Documentation
  • Dynamic measures assess
  • Efficiency
  • Reliability
  • Relatively easy to measure
  • Static
  • Complexity
  • Understandability
  • Maintainability

13
Defect testing
  • Goal: expose latent defects in the system before it is delivered
  • A successful test causes the system to perform incorrectly
  • Demonstrates the presence of program faults
  • Test case
  • Specification of input and output
  • Statement of what is being tested
  • Test data
  • Inputs devised to test the code (a minimal example appears at the end of this slide)
  • Test thoroughness
  • Exhaustive testing is not possible
  • Test policies are set by the organization, not the development team
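To make the input/output specification concrete, here is a minimal example test case using Python's unittest module; the function under test (a hypothetical add()) and the chosen test data are illustrative assumptions, not part of the slides.

    import unittest

    def add(a, b):
        # Hypothetical unit under test.
        return a + b

    class TestAdd(unittest.TestCase):
        """What is being tested: add() returns the arithmetic sum of its arguments."""

        def test_positive_numbers(self):
            # Test data (input): 2 and 3; expected output: 5.
            self.assertEqual(add(2, 3), 5)

        def test_negative_and_positive(self):
            # Test data (input): -4 and 1; expected output: -3.
            self.assertEqual(add(-4, 1), -3)

    if __name__ == "__main__":
        unittest.main()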

14
Documentation defects
  • Defect
  • The software has a defect if the user manual,
    reference manual or on-line help
  • gives incorrect information
  • fails to give information relevant to a problem.
  • Testing strategy
  • Examine all the end-user documentation, making
    sure it is correct.
  • Work through the use cases, making sure that each
    of them is adequately explained to the user.

15
Writing Formal Test Cases and Test Plans
  • A test case is an explicit set of instructions
    designed to detect a particular class of defect
    in a software system.
  • A test case can give rise to many tests.
  • Each test is a particular running of the test
    case on a particular version of the system.

16
Test plans
  • A test plan is a document that contains a
    complete set of test cases for a system
  • Along with other information about the testing
    process.
  • The test plan is one of the standard forms of
    documentation.
  • If a project does not have a test plan
  • Testing will inevitably be done in an ad-hoc
    manner.
  • Leading to poor quality software.
  • The test plan should be written long before the
    testing starts.
  • You can start to develop the test plan once you
    have developed the requirements.

17
Information to include in a formal test case
  • A. Identification and classification
  • Each test case should have a number, and may also
    be given a descriptive title.
  • The system, subsystem or module being tested
    should also be clearly indicated.
  • The importance of the test case should be
    indicated.
  • B. Instructions
  • Tell the tester exactly what to do.
  • The tester should not normally have to refer to
    any documentation in order to execute the
    instructions.
  • C. Expected result
  • Tells the tester what the system should do in
    response to the instructions.
  • The tester reports a failure if the expected
    result is not encountered.
  • D. Cleanup (when needed)
  • Tells the tester how to make the system go back to normal or shut down after the test (a minimal record structure covering items A-D is sketched at the end of this slide).
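The four kinds of information above can be captured in a simple record; the Python sketch below is one hypothetical way to structure it (the field names and example values are assumptions, not prescribed by the slides).

    from dataclasses import dataclass, field
    from typing import List, Optional

    @dataclass
    class FormalTestCase:
        # A. Identification and classification
        number: str
        title: str
        system_under_test: str
        importance: int                          # 1 = critical, 2 = general, 3 = detailed
        # B. Instructions
        instructions: List[str] = field(default_factory=list)
        # C. Expected result
        expected_result: str = ""
        # D. Cleanup (when needed)
        cleanup: Optional[str] = None

    tc = FormalTestCase(
        number="TC-017",
        title="Login rejects an unknown user",
        system_under_test="Authentication subsystem",
        importance=2,
        instructions=["Start the application", "Enter user 'nobody' with any password"],
        expected_result="An 'unknown user' message is shown and access is denied",
        cleanup="Close the application",
    )
    print(tc.number, "-", tc.title)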

18
Levels of importance of test cases
  • Level 1
  • First-pass critical test cases.
  • Designed to verify that the system runs and is safe.
  • If these fail, no further testing is possible.
  • Level 2
  • General test cases.
  • Verify that day-to-day functions work correctly.
  • Failures still permit testing of other aspects of the system.
  • Level 3
  • Detailed test cases.
  • Test requirements that are of lesser importance.
  • If these fail, the system functions most of the time but has not yet met its quality objectives.

19
Determining test cases by enumerating attributes
  • It is important that the test cases test every
    aspect of the requirements.
  • Each detail in the requirements is called an
    attribute.
  • An attribute can be thought of as something that
    is testable.
  • A good first step when creating a set of test
    cases is to enumerate the attributes.
  • A way to enumerate attributes is to circle all
    the important points in the requirements
    document.
  • However, there are often many attributes that are implicit.

20
Software Inspections
  • Software Inspections
  • Program inspections
  • 1970s
  • Line by line code review
  • Defect detection, not enhancement
  • Team of four to six, usually
  • Author
  • Reader
  • Tester
  • Moderator
  • Possibly a scribe and a chief moderator
  • Requires
  • A precise specification
  • Members familiar with the standards
  • An up-to-date set of code
  • About 2 hours

21
Reviews and Inspections
"... there is no particular reason why your friend and colleague cannot also be your sternest critic."
- Jerry Weinberg
22
What Are Reviews?
  • a meeting conducted by technical people for
    technical people
  • a technical assessment of a work product created
    during the software engineering process
  • a software quality assurance mechanism
  • a training ground

23
What Reviews Are Not
  • A project summary or progress assessment
  • A meeting intended solely to impart information
  • A mechanism for political or personal reprisal!

24
The Players
  • Review leader
  • Standards bearer (SQA)
  • Producer
  • Maintenance oracle
  • Reviewer
  • Recorder
  • User representative
25
Conducting the Review
  1. Be prepared - evaluate the product before the review
  2. Review the product, not the producer
  3. Keep your tone mild; ask questions instead of making accusations
  4. Stick to the review agenda
  5. Raise issues, don't resolve them
  6. Avoid discussions of style - stick to technical correctness
  7. Schedule reviews as project tasks
  8. Record and report all review results
26
Sample-Driven Reviews (SDRs)
  • SDRs attempt to quantify those work products that are primary targets for full FTRs (formal technical reviews).
  • To accomplish this (a small calculation is sketched at the end of this slide):
  • Inspect a fraction a_i of each software work product i. Record the number of faults f_i found within a_i.
  • Develop a gross estimate of the number of faults within work product i by multiplying f_i by 1/a_i.
  • Sort the work products in descending order according to the gross estimate of the number of faults in each.
  • Focus available review resources on those work products that have the highest estimated number of faults.
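A minimal sketch of that estimate, assuming hypothetical sampled fractions and fault counts (the work-product names and numbers are invented for illustration):

    # Sample-driven review estimate: estimated_faults_i = f_i * (1 / a_i).
    # The work products, sampled fractions and fault counts are hypothetical.
    samples = [
        ("requirements spec", 0.20, 4),   # (name, fraction inspected a_i, faults found f_i)
        ("design document",   0.25, 3),
        ("module X code",     0.10, 5),
    ]

    estimates = [(name, faults / fraction) for name, fraction, faults in samples]
    # Sort in descending order of estimated faults to prioritise full reviews.
    for name, estimate in sorted(estimates, key=lambda e: e[1], reverse=True):
        print(f"{name}: ~{estimate:.0f} estimated faults")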

27
Metrics Derived from Reviews
  • Inspection time per page of documentation
  • Inspection time per KLOC or FP (or use case)
  • Inspection effort per KLOC or FP (or use case)
  • Errors uncovered per reviewer hour
  • Errors uncovered per preparation hour
  • Errors uncovered per SE task (e.g., requirements)
  • Number of minor errors (e.g., typos)
  • Number of major errors (e.g., nonconformance to the user's wants/needs/vision)
  • Number of errors found during preparation
(A small calculation of the per-hour rates is sketched below.)
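A minimal sketch computing two of these rates; the counts and hours are hypothetical review data:

    # Hypothetical data for one review meeting.
    errors_found = 14
    reviewer_hours = 3.5       # total reviewer time spent in the meeting
    preparation_hours = 5.0    # total individual preparation time

    print(f"Errors per reviewer hour:    {errors_found / reviewer_hours:.1f}")
    print(f"Errors per preparation hour: {errors_found / preparation_hours:.1f}")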
28
Statistical SQA
  • Product and process measurement
  • Collect information on all defects
  • Find the causes of the defects
  • Move to provide fixes for the process
  • The result: an understanding of how to improve quality
29
Six-Sigma for Software Engineering
  • The term "six sigma" is derived from six standard deviations - 3.4 instances (defects) per million occurrences - implying an extremely high quality standard.
  • The Six Sigma methodology defines these core steps (a small defects-per-million calculation is sketched at the end of this slide):
  • Define customer requirements and deliverables and
    project goals via well-defined methods of
    customer communication
  • Measure the existing process and its output to
    determine current quality performance (collect
    defect metrics)
  • Analyze defect metrics and determine the vital
    few causes.
  • Improve the process by eliminating the root
    causes of defects.
  • Control the process to ensure that future work
    does not reintroduce the causes of defects.
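A minimal sketch of the defects-per-million-opportunities (DPMO) figure behind the 3.4-per-million target; the defect count, number of units and opportunities per unit are hypothetical:

    # Defects per million opportunities (DPMO).
    # The counts below are hypothetical example data.
    defects = 27
    units = 5_000
    opportunities_per_unit = 12

    dpmo = defects / (units * opportunities_per_unit) * 1_000_000
    print(f"DPMO: {dpmo:.0f}")   # six-sigma performance corresponds to roughly 3.4 DPMO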

30
Inspecting compared to testing
  • Testing and inspection rely on different aspects of human intelligence.
  • Testing can find defects whose consequences are
    obvious but which are buried in complex code.
  • Inspecting can find defects that relate to
    maintainability or efficiency.
  • The chances of mistakes are reduced if both
    activities are performed.

31
Testing or inspecting, which comes first?
  • It is important to inspect software before
    extensively testing it.
  • The reason for this is that inspecting allows you
    to quickly get rid of many defects.
  • If you test first, and inspectors recommend that
    redesign is needed, the testing work has been
    wasted.
  • There is a growing consensus that it is most
    efficient to inspect software before any testing
    is done.
  • Even before developer testing

32
Quality Assurance in General
  • Root cause analysis
  • Determine whether problems are caused by such
    factors as
  • Lack of training
  • Schedules that are too tight
  • Building on poor designs or reusable technology

33
Measure quality and strive for continual
improvement
  • Things you can measure regarding the quality of a software product, and indirectly the quality of the process (a small calculation follows this list):
  • The number of failures encountered by users.
  • The number of failures found when testing a
    product.
  • The number of defects found when inspecting a
    product.
  • The percentage of code that is reused.
  • More is better, but don't count clones.
  • The number of questions posed by users to the
    help desk.
  • As a measure of usability and the quality of
    documentation.
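A minimal sketch turning some of these counts into simple rates; the counts and size figures are hypothetical examples:

    # Hypothetical counts for one release.
    failures_reported_by_users = 9
    defects_found_in_inspection = 42
    kloc = 38.0                       # thousands of lines of code in the release
    reused_loc, total_loc = 12_000, 38_000

    print(f"User-reported failures per KLOC: {failures_reported_by_users / kloc:.2f}")
    print(f"Inspection defects per KLOC:     {defects_found_in_inspection / kloc:.2f}")
    print(f"Reuse percentage:                {100 * reused_loc / total_loc:.0f}%")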

34
Post-mortem analysis
  • Looking back at a project after it is complete,
    or after a release,
  • You look at the design and the development
    process
  • Identify those aspects which, with the benefit of hindsight, you could have done better
  • You make plans to do better next time

35
Meaning of V&V (Boehm)
  • Verification
  • are we building the thing right?
  • Validation
  • are we building the right thing?

36
Process standards
  • The personal software process (PSP)
  • Defines a disciplined approach that a developer
    can use to improve the quality and efficiency of
    his or her personal work.
  • One of the key tenets is personally inspecting
    your own work.
  • The team software process (TSP)
  • Describes how teams of software engineers can
    work together effectively.
  • The software capability maturity model (CMM)
  • Contains five levels. Organizations start at level 1, and as their processes become better they can move up towards level 5.
  • ISO 9000-2
  • An international standard that lists a large
    number of things an organization should do to
    improve their overall software process.

37
The PSP Evolution (Humphrey)
  • PSP3: Cyclic development (scales to 1000s of lines)
  • PSP2.1: Design templates
  • PSP2: Code reviews, design reviews (100s of lines)
  • PSP1.1: Task planning, schedule planning
  • PSP1: Size estimation, test report
  • PSP0.1: Coding standards, process improvement proposal, size measurement
  • PSP0: Current personal process, basic measurements
  • Each full stage adds skills to the prior stage; the ".1" stages add capability at the same level.
(Adapted from Humphrey)
38
TSP Objectives 1 (Humphrey)
  • Build self-directed teams
  • 3-20 engineers
  • establish own goals
  • establish own process and plans
  • track work
  • Show managers how to manage teams
  • coach
  • motivate
  • sustain peak performance

39
TSP Objectives 2 (Humphrey)
  • Accelerate CMM improvement
  • make CMM 5 normal
  • Provide improvement guidelines to high-maturity
    organizations
  • Facilitate university teaching of
    industrial-grade teams

40
Background - Capability Maturity Model for
Software
  • 1986: The Software Engineering Institute and the MITRE Corporation begin to develop a process maturity framework to improve software processes
  • 1987: Description of the framework
  • Assessment
  • Evaluation
  • 1991: Evolved into the Capability Maturity Model for Software (CMM v1.0)
  • Recommended practices in key process areas (KPAs)
  • Gain control of processes for developing and
    maintaining software
  • Evolve to a culture of software engineering and
    management excellence
  • Current

41
What is the CMM?
  • Concept
  • The application of process management and quality
    improvement concepts to software development and
    maintenance
  • Model
  • A model for organizational improvement
  • Guidelines
  • A guide for evolving toward a culture of
    engineering excellence
  • Basis for Measurement
  • The underlying structure for reliable and
    consistent software process assessments, software
    capability evaluations, and interim profiles

42
Maturity Levels are a Framework for Process
Improvement
  • Based on Continuous Process Improvement
  • based on many small, evolutionary steps rather
    than revolutionary innovations.
  • Plateau
  • A maturity level is a well-defined evolutionary
    plateau toward achieving a mature software
    process.
  • Foundation
  • Each maturity level provides a layer in the
    foundation for continuous process improvement.
  • Priority Order
  • The levels also help an organization prioritize
    its improvement efforts.

43
Symptoms of Process Failure
  • Commitments consistently missed
  • Late delivery
  • Last minute crunches
  • Spiraling costs
  • No management visibility into progress
  • You're always being surprised.
  • Quality problems
  • Too much rework
  • Functions do not work correctly.
  • Customer complaints after delivery
  • Poor morale
  • People frustrated
  • Is anyone in charge?

44
Settling for Less
  • Do these statements sound familiar? If they do, your organization may be settling for less than it is capable of and may be a good candidate for process improvement.
  • "I'd rather have it wrong than have it late. We can always fix it later."
  • - a senior software manager (industry)
  • "The bottom line is schedule. My promotions and raises are based on meeting schedule first and foremost."
  • - a program manager (government)

45
The Process Management Premise
  • The quality of a system is highly influenced by the quality of the process used to acquire, develop, and maintain it.
  • This premise implies a focus on processes as well as on products.
  • This is a long-established premise in manufacturing (and is based on TQM principles as taught by Shewhart, Juran, Deming, and Humphrey).
  • Belief in this premise is visible worldwide in quality movements in manufacturing and service industries (e.g., ISO standards).

47
What is a CMM?
48
CMM (Software) Overview

  • SEI's vision: to bring engineering discipline to the development and maintenance of software products
  • Desired result: higher quality - better products for a better price; predictability - function/quality, on time, within budget
  • Methodology to achieve that desired result:
  • 1. Identify the current state: know your current capability maturity level
  • 2. Identify the desired state: understand the description of the next level
  • 3. Reduce the gap: plan, implement, and institutionalize the key practices of the next level. Repeat until continuous optimization is part of the culture.
49
Assessment vs Evaluation
  • A software process assessment is an appraisal by
    a trained team of software professionals to
    determine
  • the state of an organization's current software
    process,
  • the high-priority software process-related issues
    facing an organization,
  • and to obtain the organizational support for
    software process improvement.
  • A software capability evaluation is an appraisal
    by a trained team of professionals to identify
    contractors who are qualified to perform the
    software work or to monitor the state of the
    software process used on an existing software
    effort.

50
A Foundation, Not a Destination
  • The optimizing level (Level 5) is not the
    destination of process management.
  • The destination is better products for a better price - economic survival
  • The optimizing level is a foundation for building
    an ever-improving capability.

51
Fundamental Concepts Underlying Process Maturity
  • A software process
  • can be defined as a set of activities, methods,
    practices, and transformations that people use to
    develop and maintain software and the associated
    products (e.g., project plans, design documents,
    code, test cases, and user manuals). As an
    organization matures, the software process
    becomes better defined and more consistently
    implemented throughout the organization.
  • Software process capability
  • describes the range of expected results that can
    be achieved by following a software process. The
    software process capability of an organization
    provides one means of predicting the most likely
    outcomes to be expected from the next software
    project the organization undertakes.
  • Software process performance
  • represents the actual results achieved by
    following a software process. Thus, software
    process performance focuses on the results
    achieved, while software process capability
    focuses on results expected.
  • Software process maturity
  • is the extent to which a specific process is
    explicitly defined, managed, measured,
    controlled, and effective. Maturity implies a
    potential for growth in capability and indicates
    both the richness of an organization's software
    process and the consistency with which it is
    applied in projects throughout the organization.

52
Fundamental Concepts Underlying Process Maturity
  • Maturity Level
  • A well-defined evolutionary plateau toward
    achieving a mature software process. Each
    maturity level comprises a set of process goals
    that, when satisfied, stabilize an important
    component of the software process. Achieving each
    level of the maturity framework establishes a
    different component in the software process,
    resulting in an increase in the process
    capability of the organization.
  • Goal
  • As a software organization gains in software process maturity, it institutionalizes its software process via policies, standards, and organizational structures. Institutionalization
    entails building an infrastructure and a
    corporate culture that supports the methods,
    practices, and procedures of the business so that
    they endure after those who originally defined
    them have gone.

53
The Five Levels of Software Process Maturity
  • Level 5 - Optimizing: continuously improving process; focus on process improvement
  • Level 4 - Managed: predictable process; process measured and controlled
  • Level 3 - Defined: standard, consistent process; process characterized and fairly well understood
  • Level 2 - Repeatable: disciplined process; can repeat previously mastered tasks
  • Level 1 - Initial: unpredictable and poorly controlled
  • Moving up requires, in turn: project management (1 to 2), an integrated engineering process (2 to 3), product and process quality (3 to 4), and managing change (4 to 5)
54
Part 2. CMM Level 2 Key Process Areas
  • Software Quality Assurance
  • Requirements Management
  • Software Project Planning
  • Software Configuration Management
  • Software Subcontract Management
  • Software Project Tracking and Oversight
55
CMM Level 3 Key Process Areas
  • Organization Process Focus
  • Organization Process Definition
  • Training Program
  • Intergroup Coordination
  • Peer Reviews
  • Software Product Engineering
  • Integrated Software Management
56
Understanding the Repeatable and Defined Levels (2 & 3)
  • To achieve Level 2, management must focus on its
    own processes to achieve a disciplined software
    process and establish a leadership position.
  • Level 2 provides the foundation for Level 3
    because the focus is on management acting to
    improve its processes before tackling technical
    and organizational issues at Level 3.
  • Processes may differ between projects in a Level 2 organization; the organizational requirement for achieving Level 2 is that there are policies that guide the projects in establishing the appropriate management processes.
  • Documented procedures provide the foundation for
    consistent processes that can be
    institutionalized across the organization, with
    training and assurance.
  • Level 3 builds on this project management
    foundation by defining, integrating, and
    documenting the entire software process.

57
Comparison of Level 2 and Level 3
  • Difference between Level 1 and Level 2
  • Level 1
  • is ad hoc and occasionally chaotic; few processes are defined, and success depends on individual effort.
  • Level 2
  • Basic project management processes are
    established to track cost, schedule, and
    functionality.
  • The necessary process discipline is in place to
    repeat earlier successes on projects with similar
    applications.
  • Difference between Level 2 and Level 3: Level 3 encompasses integrated and standardized management and engineering activities; projects tailor the organization's standard software process to meet their needs.

58
The CMM Structure
  • Maturity levels indicate process capability and contain key process areas.
  • Key process areas achieve goals and are organized by common features.
  • Common features address implementation or institutionalization and contain key practices.
  • Key practices describe infrastructure or activities.
59
Note 5. The process of going back and forth between making changes in the design followed by code generation, and making changes in the code followed by reverse engineering for every change, so that you always work from the best possible perspective, is called round-trip engineering.