A Framework for Behavioral Engineering of Software Systems
- Swaminathan Natarajan
- Dept. of Computer Science
- Rochester Institute of Technology
- sxn_at_cs.rit.edu
Behavioral Engineering
- Typically, much of the design focus for software systems is on functionality
- The behavior of the system depends on its "ilities": all its different non-functional attributes
  - Performance, dependability, usability
  - Tradeoffs with evolvability and business attributes
- Engineering involves a systematic approach to identifying the best tradeoffs among the design goals
Need for Behavioral Engineering
- High architecture and design emphasis on making the software modular, layered, and easy to modify and extend
  - Performance terrible; project scrapped
- Very fancy user interface, latest technology, lots of bells and whistles, online help provided
  - Users still complain about the interface
- Customers very pleased with the product, want to increase capacity and put it on other platforms
  - Product is very hard to port and does not scale well
Customer's View of Quality
- reliability
- usability
- performance
- scalability
- extensibility
- security
- availability
- portability
- cost
- operability
- safety
Developer's View of Quality
- maintainability
- portability
- documentation
- extensibility
- cycletime
- reusability
- testability
- cost
- correctness
- performance
An Attribute Classification
- Evolvability: Extensibility, Maintainability, Scalability, Portability
- Business: Cycletime, Cost, Reusability
- Behavior: Performance, Dependability, Usability
  - Usability: Operability, Learnability, Helpfulness, Interoperability, Control, Affect, Adaptability
  - Dependability: Reliability, Availability, Timeliness, Robustness, Precision, Security, Safety
  - Performance: Response time, Throughput, Capacity, Resource usage (Space, Bandwidth)
- Not an exhaustive list; the attributes are not mutually independent, so tradeoffs arise
ISO 9126 Attribute Classification
- Functionality: Suitability, Accurateness, Interoperability, Compliance, Security
- Reliability: Maturity, Fault-tolerance, Recoverability
- Usability: Understandability, Learnability, Operability
- Maintainability: Analyzability, Changeability, Stability, Testability
- Portability: Adaptability, Installability, Conformance, Replaceability
- Efficiency: Time behavior, Resource behavior
A Framework for Behavioral Engineering
- Specify: attribute goals, criticality of goals, preferred tradeoffs
- Design
- Analyze: quantitative / qualitative; fidelity varies with effort and available information
- Measure: testing, field data, customer feedback (needs work)
Background
- SEI architecture program: the basic approach of attribute-focused design
- Work on attribute analysis:
  - Performance analysis
  - Reliability engineering and fault-tolerance
  - SUMI for usability analysis
  - SAAM for evolvability and other qualitative attributes
  - ATA for the tradeoff approach
- This particular integrative framework was developed at Motorola India
Product Definition Activities
[Diagram of product definition activities: business planning, business feasibility analysis, customer feedback, prototyping, requirements elicitation, requirements specification, requirements analysis, modeling and analysis, and architecture; the resulting product profile covers both functionality and behavior.]
Lifecycle View
Attribute Specification
- Identify and prioritize the attributes of interest
- Create specifications for both quantitative and qualitative attributes
  - Typically 20-50 specifications for a medium-size system (100 KLOC)
- If possible, create tradeoff specifications
  - Customer preferences among alternative profiles
- Attribute specification is part of requirements elicitation and capture
  - Works well with the use case approach
Attribute Specification Examples: Performance
- Response time
  - Call setup < 250 ms
  - System startup < 2 minutes
  - Resume service within 1.5 sec on channel switchover
- Throughput
  - 1000 call requests/sec
  - 800 traps (SNMP)/day
- Capacity
  - 70 simultaneous calls
  - 50 concurrent users
- Resource utilization
  - Max 50% CPU usage at full load
  - Max 16 MB run-time memory
  - Max bandwidth 96 kb/sec
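Quantitative specifications like the ones above can also be recorded as data and checked mechanically against measured values. A minimal sketch of that idea; the `AttributeSpec` class, its field names, and the measured numbers are all hypothetical, not part of the framework itself:

```python
# Hypothetical sketch: quantitative attribute specifications as checkable data.
from dataclasses import dataclass

@dataclass
class AttributeSpec:
    attribute: str            # e.g. "performance/response-time"
    name: str                 # what is being constrained
    limit: float              # threshold value
    unit: str
    upper_bound: bool = True  # True: measured value must stay at or below limit

    def satisfied_by(self, measured: float) -> bool:
        return measured <= self.limit if self.upper_bound else measured >= self.limit

# A few of this slide's example specifications, encoded as data.
specs = [
    AttributeSpec("performance/response-time", "call setup", 250, "ms"),
    AttributeSpec("performance/capacity", "simultaneous calls", 70, "calls",
                  upper_bound=False),
    AttributeSpec("performance/resource-usage", "CPU at full load", 50, "%"),
]

measured = {"call setup": 180, "simultaneous calls": 75, "CPU at full load": 62}
results = {s.name: s.satisfied_by(measured[s.name]) for s in specs}
```

Encoding specifications this way makes the later measurement step (testing, field data) a simple comparison against the same records.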
Attribute Specification Examples: Usability, Dependability
- Usability
  - User types: administrators, operators
  - Look and feel same as Windows packages
  - Server invocation in < 60 ms
  - Invocation command shall have < 5 command-line arguments
  - Expert users should be able to complete the task in < 5 sec
  - New users should be able to start using the system within one hour, without training
- Dependability
  - < 20 min downtime for upgrades
  - No more than 3 failures per 100,000 operations
  - Results accurate up to 3 decimal places
  - Recover automatically if configuration files are corrupted or deleted
  - No more than 6 hours downtime per year
  - The system should be resistant to man-in-the-middle attacks
Attribute Specification Examples: Evolution
- Portability
  - Application should run on Windows NT as well
  - Should be able to use different databases: Oracle/Sybase/...
- Scalability
  - Increase the number of SVs in the space network from 66 to 110
- Extensibility
  - Should be easy to incorporate password protection
  - Medium effort to add a context-sensitive help feature to the GUI
  - Diagnostic monitoring tool should be extensible w.r.t. analysis capabilities for monitored data
- Maintainability
  - The tool should allow easy addition of new message formats
  - The tool should be customizable for new business processes
Design
- Explore the design space of feasible alternatives
  - Goodness of a design is determined by attribute satisfaction
- Identify critical design drivers
  - Attribute requirements that are hard to meet or have the most impact on customer satisfaction
- Make core design choices to optimize the critical drivers
  - Decomposition style and strategy (i.e., the software architecture), technology and platform choices, algorithms and data structures
- Analyze design(s) to determine the attribute profile
- Add mechanisms to improve attributes (tradeoffs)
- Iterate the design to optimize attribute satisfaction
Attribute Modifiers
- Blocks that can be inserted into a design to improve some quality attributes, usually at the expense of others:
  - Lookup tables (+performance, -memory)
  - Compression (+memory, -performance)
  - Hotspot optimization (+performance, -evolvability)
  - Passwords (+security, -memory, -performance, -usability)
  - Certificates (+security, -cost, -interoperability?)
  - Context-sensitive help (+usability, -memory, -cycletime, -cost)
  - Middleware (+evolvability, +interoperability, -performance, -cost)
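The first modifier, a lookup table, has a direct idiomatic form in most languages: cache the results of an expensive pure computation, gaining performance at the cost of memory. A small illustration (`slow_parity` is a hypothetical stand-in for the expensive computation):

```python
# The "lookup table" modifier: trade memory for speed by caching the
# results of an expensive pure computation.
from functools import lru_cache

def slow_parity(n: int) -> int:
    # Stand-in for an expensive pure computation.
    return sum(int(b) for b in bin(n)[2:]) % 2

@lru_cache(maxsize=None)   # the lookup table: +performance, -memory
def fast_parity(n: int) -> int:
    return slow_parity(n)

# Repeated queries hit the cache instead of recomputing.
values = [fast_parity(x) for x in [7, 7, 3, 3]]
cache_info = fast_parity.cache_info()   # 2 hits, 2 misses
```

The `maxsize` parameter is the tradeoff knob: bounding it limits the memory cost at the price of occasional recomputation.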
Analysis
- For both quantitative and qualitative attributes
  - Including subjective attributes
- Iterate between design and analysis
- A basket of analysis technologies
  - Approaches include modeling, estimation, closed-form analysis, expert inputs, reviews, and surveys
- Based on representations
  - Different design views facilitate different analyses
  - Useful to create views that focus on key attributes
- Different levels of fidelity, based on:
  - How complete and detailed the design is
  - The level of effort put into the analysis
- Prototypes, benchmarking, and testing provide inputs that improve analysis fidelity
Performance Analysis: Scheduling Theory
- Execution time estimation for individual modules
  - Inputs: code structure (algorithm) and benchmarking data
  - Views: detailed design, control flow diagrams
- Computation time (work) estimation for operations
  - Views: sequence diagrams / state diagrams / control flow
- Wait times: blocked waiting on resources
  - Inputs: tasking and resource contention patterns
  - Views: process views, enhanced with shared resource usage
- Response times for operations
  - Inputs: scheduling information, operational profiles
  - Views: use cases annotated with request arrival patterns
- Timeliness derived directly from response times
- Fidelity: within a factor of 2-10; useful for worst-case analysis, bottleneck identification, and understanding
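One standard closed-form technique from fixed-priority scheduling theory combines the computation times and wait times above into worst-case response times. A sketch of the classical response-time recurrence, solved by fixed-point iteration (the task set is illustrative, and the model assumes independent periodic tasks, which real systems only approximate):

```python
# Worst-case response-time analysis for fixed-priority periodic tasks.
import math

def worst_case_response(tasks):
    """tasks: list of (computation_time, period), highest priority first.

    Solves R_i = C_i + sum_{j higher priority} ceil(R_i / T_j) * C_j
    by fixed-point iteration.
    """
    responses = []
    for i, (c, _) in enumerate(tasks):
        r = c
        while True:
            # Interference from all higher-priority tasks released during R_i.
            interference = sum(math.ceil(r / t_j) * c_j
                               for c_j, t_j in tasks[:i])
            nxt = c + interference
            if nxt == r:          # fixed point reached
                break
            r = nxt
        responses.append(r)
    return responses

# Three tasks with (C, T) = (1, 4), (2, 6), (3, 10), rate-monotonic priorities.
rts = worst_case_response([(1, 4), (2, 6), (3, 10)])
```

Comparing each response time against the task's deadline gives the timeliness check mentioned above; here the lowest-priority task finishes in exactly 10 time units, just meeting a deadline equal to its period.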
Performance Analysis: Resources
(Memory usage analysis, network scheduling algorithms)
- Code/data/heap/stack analysis for memory usage
  - Inputs: operational profiles, dimensioning information
  - Views: block structure diagrams, class diagrams
- Network analysis
  - Inputs: request patterns and sizes, network bandwidth allocation algorithm, network scheduling algorithm
  - Views: communication view (an enhancement of the process view showing interactions)
- Contention analysis can be performed for any shared resource (process view enhanced with resource dependencies)
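At its simplest, the code/data/heap/stack analysis is a budget check: estimate each segment per module and compare the total against the specified memory ceiling. A hypothetical sketch (module names and segment estimates are invented; the 16 MB budget echoes the earlier specification example):

```python
# Hypothetical memory-budget check: per-module code/data/heap/stack
# estimates summed against a run-time memory ceiling.
BUDGET_KB = 16 * 1024   # 16 MB run-time memory, per the earlier spec example

modules = {
    # module: (code_kb, static_data_kb, heap_kb, stack_kb)
    "client_interface": (900, 200, 2048, 64),
    "app_processing":   (1500, 400, 4096, 128),
    "db_interface":     (700, 300, 3072, 64),
}

per_module = {name: sum(segments) for name, segments in modules.items()}
total_kb = sum(per_module.values())
within_budget = total_kb <= BUDGET_KB
```

Heap and stack estimates are the ones that depend on the operational profile and dimensioning information, so they carry most of the uncertainty in this kind of analysis.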
Response Time Analysis Concept
[Diagram: clients (Client-1 ... Client-n) send requests over the network to a message queue on the server; the server comprises a client interface layer, an application processing layer, and a database interface layer in front of the database.]
Performance Analysis: Modeling
(Discrete-event simulations)
- Build a model of the system and run simulations
- Inputs:
  - Operational profiles (request arrival patterns)
  - System architecture, sequence diagrams
  - Operation computation times
  - Tasking and resource usage patterns
- Value:
  - Fidelity within +/- 20% (factor of 2 with less detail)
  - Identification of bottlenecks
  - What-if analysis, sensitivity analysis
  - Throughput and capacity analysis
- More effort, but can handle high complexity
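The core of such a model can be sketched very compactly for a single-server FIFO queue: given an arrival trace and per-request service times, replay the events and record each request's response time. This toy version is deterministic so its output is checkable; a real discrete-event simulation would draw arrivals and service times from the operational profile's distributions and model multiple resources:

```python
# Toy queue model: one FIFO server replaying an arrival trace.
def simulate_fifo(arrivals, services):
    """Return the response time (wait + service) of each request."""
    responses = []
    server_free_at = 0.0
    for arrive, service in zip(arrivals, services):
        start = max(arrive, server_free_at)   # wait if the server is busy
        server_free_at = start + service
        responses.append(server_free_at - arrive)
    return responses

# Requests arriving every 1 time unit, each needing 2 units of service:
# the queue builds up and response times grow with each arrival.
resp = simulate_fifo(arrivals=[0, 1, 2, 3], services=[2, 2, 2, 2])
avg_response = sum(resp) / len(resp)
```

Even this minimal model exhibits the behavior the analysis is after: when the arrival rate exceeds the service rate, response times grow without bound, which is exactly the bottleneck signal a capacity analysis looks for.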
Reliability Analysis: Software Reliability Engineering
- Design operation test cases that generate random inputs within equivalence classes
- Design test scripts that select test cases randomly based on operational profiles
- Perform system testing and record failures
- Plot failures over time to obtain reliability growth curves
- Release guidance based on reliability objectives
- Estimation of test effort based on reliability objectives
- Software Reliability Engineering is actually a complete end-to-end methodology for developing reliable software
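The second step, selecting test cases randomly according to the operational profile, reduces to weighted random sampling. A sketch with an invented profile (the operation names and probabilities are illustrative, not from the slides):

```python
# Profile-driven test selection: draw test cases with probabilities
# matching the operational profile, so testing effort mirrors field usage.
import random
from collections import Counter

operational_profile = {      # illustrative probabilities, must sum to 1
    "process_call": 0.55,
    "query_status": 0.30,
    "reconfigure": 0.10,
    "delete_account": 0.05,
}

random.seed(7)               # fixed seed so the run is reproducible
ops = list(operational_profile)
weights = list(operational_profile.values())
selected = random.choices(ops, weights=weights, k=1000)
counts = Counter(selected)
```

Over a long run the test mix converges to the profile, which is what makes the failure data collected in the next step representative of field reliability.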
Operational Profile: Example
Robustness and Availability Analysis: Fault Trees, FMEA, Fault Injection
- Identify the set of failure modes
- Identify the set of possible faults
- Draw (AND/OR) fault trees showing the relationships between faults and failure modes
- Use FMEA (failure modes and effects analysis) to determine the relationships
- Add exception handling and fault-tolerance to weaken the linkages between faults and failures (introduce AND nodes)
- Use failure impact and probability data to guide the choice between fault-tolerance approaches
- Use fault injection testing and reliability analysis to gauge the effectiveness of the design
- The approach can also be used for safety and security
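Once a fault tree is drawn, evaluating the failure-mode probability is a small recursion over the AND/OR gates, assuming the basic faults are independent. A sketch (the tree and its probabilities are invented for illustration):

```python
# Evaluating an (AND/OR) fault tree for failure probability, assuming
# independent basic faults. A node is either a leaf probability or a
# (gate, children) pair.
def failure_prob(node):
    if isinstance(node, (int, float)):      # basic fault probability
        return float(node)
    gate, children = node
    probs = [failure_prob(c) for c in children]
    if gate == "and":                       # all inputs must occur
        p = 1.0
        for q in probs:
            p *= q
        return p
    if gate == "or":                        # any single input suffices
        p = 1.0
        for q in probs:
            p *= (1.0 - q)
        return 1.0 - p
    raise ValueError(f"unknown gate: {gate}")

# Failure occurs if fault A (p=0.05) occurs, OR both B (0.1) and C (0.2)
# occur; the AND node models fault-tolerance weakening the B/C linkage.
p_failure = failure_prob(("or", [0.05, ("and", [0.1, 0.2])]))
```

This also shows numerically why introducing AND nodes helps: replacing the B/C subtree with a bare OR of B and C would raise the failure probability substantially.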
Usability Analysis: Software Usability Measurement Inventory
- SUMI is a survey-based approach to usability analysis (HFRG, University College Cork)
- Standard user questionnaire: 50 questions
- Pre-calibrated response analysis tool
- Scores of 0-10 along 5 dimensions: efficiency, learnability, helpfulness, control, affect
- Inputs: the actual interface and software behavior, prototypes
- Can be augmented with quantitative and qualitative internal usability goals, and analyzed against them
Evolvability Analysis: Software Architecture Analysis Method
- Scenarios identified during specification
  - e.g., "should be easy to add context-sensitive help"
- During design review, identify what work needs to be done for each scenario
- Each reviewer rates the scenario easy/moderate/hard
- Ratings can be translated into a design score if desired
- Views: block structure diagrams with dependencies
- Can be used for any qualitative attribute: security, safety, usability, reusability
- Allows for subjective inputs, but well-defined scenarios and goals provide a basis for resolution
- Scalability analysis can be done based on complexity theory
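Translating the reviewers' ratings into a design score can be as simple as a weighted average. A sketch of one plausible scheme; the rating-to-score mapping, the scenario weights, and the scenarios themselves are illustrative choices, not prescribed by SAAM:

```python
# Hypothetical scoring scheme for SAAM scenario reviews: map each
# easy/moderate/hard rating to a number, average per scenario, then
# weight scenarios by importance.
RATING_SCORE = {"easy": 1.0, "moderate": 0.5, "hard": 0.0}

def design_score(scenario_ratings):
    """scenario_ratings: {scenario: (weight, [reviewer ratings])}.
    Returns a score in [0, 1]; higher means easier evolution."""
    total_weight = sum(w for w, _ in scenario_ratings.values())
    score = 0.0
    for weight, ratings in scenario_ratings.values():
        avg = sum(RATING_SCORE[r] for r in ratings) / len(ratings)
        score += weight * avg
    return score / total_weight

ratings = {
    "add context-sensitive help": (2, ["easy", "easy", "moderate"]),
    "swap database vendor": (1, ["hard", "moderate", "moderate"]),
}
score = design_score(ratings)
```

The value of the number is mainly comparative: scoring two candidate designs against the same scenario set makes the evolvability tradeoff between them explicit.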
Business Attribute Analysis: Project Estimation and Planning
- Cycletime and cost analysis based on planning data
  - Views: work-breakdown structure, Wideband Delphi or other estimation data, Gantt chart
- Reusability analysis using SAAM
  - Identify specific targets for reuse
- Effort estimation for the different designs provides critical inputs to design decision making
  - Just-enough design vs. best-possible design
  - Include "just enough" within the concept of "best possible"
Testing
- Attribute testing as part of product testing
  - Performance testing (measurement)
  - Reliability testing (reliability growth testing / reliability certification testing on a stable version)
  - Error and fault injection testing
  - Security testing (attacks)
  - Usability testing (attribute focus during alpha testing, observing users, surveys)
  - Validating evolvability estimates during modifications and the development of subsequent versions
- Provides inputs to subsequent design iterations
Product Quality Data Chart
[Chart tracking: key product-quality attributes (performance, usability), availability goal, usability score from SUMI (if used), and product evolution goals.]
© Motorola India Electronics Ltd, 2000
Metrics
- Simplest metric: customer satisfaction with attributes, from survey data for each product
- Measurements such as the percentage of design goals satisfied are hard to translate into meaningful metrics
- SUMI, reliability, and availability have ready metrics
- More work is needed to identify good metrics
Deployment
- Deployed at Motorola India for the last 3 years: 20 projects
- Specification practiced on most projects
- Analysis and testing focus on some projects
- The Product Quality Data Chart is part of monthly project review presentations
- Product quality satisfaction questions are part of periodic customer satisfaction surveys
Results
- Most attribute ratings improved from around 7.0 to 7.5-8.0 on a 10-point scale
  - Room for further improvement
- Strong positive subjective feedback from customers, and from internal project managers and technical staff on some projects
- Some gaps in product quality continue to exist, including both specification gaps and design gaps
- 2-3 cases where analysis had a huge impact: reworking requirements and preventing disasters
- Success in identifying bottlenecks and critical drivers
- Rated a Motorola emerging best practice in 2000
Summary
- It is possible to take a systematic approach to engineering the product profile of software systems
- A full basket of specification and analysis technologies exists
- Diagramming is critical to analysis
- The individual practices have been around a long time and have been proven extensively in many environments
- The framework has also been deployed and found useful
- However, this is not yet widely accepted practice
  - Partly because universities do not focus on it (my opinion)
  - Partly because of poor tool support (a catch-22)
References
- "Quality Attributes", Barbacci et al., 1995, CMU/SEI-95-TR-021
- "The Architecture Trade-off Analysis Method", Kazman et al., http://www.sei.cmu.edu
- "Scenario-Based Analysis of Software Architecture", Kazman et al., IEEE Software, 13(6):47-56, 1996
- "Using the WinWin Spiral Model: A Case Study", Boehm et al., IEEE Computer, July 1998, pp. 33-44
- Software Reliability Engineered Testing, Musa et al., McGraw-Hill, 1998
- Product Quality Framework, Krishnan et al., ACM Software Engineering Notes, July 2001
- FMEA: http://www.fmeainfocentre.com/
- SUMI: http://www.ucc.ie/hfrg/
- ISO Product Quality: http://www.cse.dcu.ie/essiscope/
Comparison to ISO/IEC 9126
- Characteristics -> Sub-characteristics -> Measurements
- Characteristics: Functionality, Reliability, Usability, Maintainability, Portability, Efficiency
- Example: the characteristic Maintainability
  - Sub-characteristic: Analyzability
  - Metrics: cyclomatic number, comment density, ...
- Intrinsic vs. extrinsic metrics
- Post-measurement certification vs. focus throughout the lifecycle