SE Principles Silver Bullets

Transcript and Presenter's Notes

Title: SE Principles Silver Bullets


1
SE Principles: Silver Bullets
  • CS 620/720
  • Software Engineering
  • January 15, 2004

2
Poor Engineering leads to ad-hoc structure!
The result of continuous building without any
thought toward design.
Result: stairs leading to the ceiling, windows in
the middle of rooms, doors opening onto walls, a
non-intuitive floor plan!
3
Poor Engineering Has Disastrous Consequences!
Aerodynamic phenomena in suspension bridges were
not adequately understood in the profession nor
had they been addressed in this design. New
research was necessary to understand and predict
these forces. The remains, located on the bottom
of the Sound, are a permanent record of man's
capacity to build structures without fully
understanding the implications of the design.
http://www.nwrain.net/newtsuit/recoveries/narrows/narrows.htm
4
Poor Engineering Has Disastrous Consequences!
$7 Billion Fireworks: One Bug, One Crash
On 4 June 1996, the maiden flight of the Ariane 5
launcher ended in a failure. Only about 40
seconds after initiation of the flight sequence,
at an altitude of about 3700 m, the launcher
veered off its flight path, broke up and
exploded. The failure of the Ariane 501 was
caused by the complete loss of guidance and
attitude information 37 seconds after start of
the main engine ignition sequence (30 seconds
after lift-off). This loss of information was
due to specification and design errors in the
software of the inertial reference system.
The launcher started to disintegrate at about H0
+ 39 seconds because of high aerodynamic loads
due to an angle of attack of more than 20 degrees
that led to separation of the boosters from the
main stage, in turn triggering the self-destruct
system of the launcher. This angle of attack was
caused by full nozzle deflections of the solid
boosters and the Vulcain main engine. These
nozzle deflections were commanded by the On-Board
Computer (OBC) software on the basis of data
transmitted by the active Inertial Reference
System (SRI 2). Part of these data at that time
did not contain proper flight data, but showed a
diagnostic bit pattern of the computer of the SRI
2, which was interpreted as flight data. The
reason why the active SRI 2 did not send correct
attitude data was that the unit had declared a
failure due to a software exception. The OBC
could not switch to the back-up SRI 1 because
that unit had already ceased to function during
the previous data cycle (72 milliseconds period)
for the same reason as SRI 2. The internal SRI
software exception was caused during execution of
a data conversion from 64-bit floating point to
16-bit signed integer value. The floating point
number which was converted had a value greater
than what could be represented by a 16-bit signed
integer. This resulted in an Operand Error. The
data conversion instructions (in Ada code) were
not protected from causing an Operand Error,
although other conversions of comparable
variables in the same place in the code were
protected. The error occurred in a part of the
software that only performs alignment of the
strap-down inertial platform. This software
module computes meaningful results only before
lift-off. As soon as the launcher lifts off, this
function serves no purpose. The alignment
function is operative for 50 seconds after
starting of the Flight Mode of the SRIs which
occurs at H0 - 3 seconds for Ariane 5.
Consequently, when lift-off occurs, the function
continues for approx. 40 seconds of flight. This
time sequence is based on a requirement of Ariane
4 and is not required for Ariane 5. The Operand
Error occurred due to an unexpected high value of
an internal alignment function result called BH,
Horizontal Bias, related to the horizontal
velocity sensed by the platform. This value is
calculated as an indicator for alignment
precision over time. The value of BH was much
higher than expected because the early part of
the trajectory of Ariane 5 differs from that of
Ariane 4 and results in considerably higher
horizontal velocity values.
http://java.sun.com/people/jag/Ariane5.html
http://www.around.com/ariane.html
http://archive.eiffel.com/doc/manuals/technology/contract/ariane/page.html
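To make the failure mode concrete, here is a minimal C++ sketch (an illustration only, not the actual Ada flight code); the compute_horizontal_bias function and its value are invented for the example.

#include <cstdint>
#include <iostream>
#include <limits>

// Hypothetical stand-in for the SRI horizontal-bias (BH) computation.
// On the Ariane 5 trajectory this value is far larger than on Ariane 4.
double compute_horizontal_bias() { return 65000.0; }  // assumed value, > INT16_MAX

int main() {
    double bh = compute_horizontal_bias();

    // The flight software did the equivalent of an unprotected
    //     int16_t bh16 = static_cast<int16_t>(bh);
    // For an out-of-range value that conversion is undefined behaviour
    // in C++ (in the Ada code it raised an unhandled Operand Error).
    // A protected conversion checks the range first:
    if (bh >= std::numeric_limits<int16_t>::min() &&
        bh <= std::numeric_limits<int16_t>::max()) {
        std::cout << "BH fits: " << static_cast<int16_t>(bh) << '\n';
    } else {
        std::cout << "BH out of range; handle it rather than crash the SRI\n";
    }
}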
5
Poor Engineering Has Disastrous Consequences!
  • Software Runaways: Lessons Learned from Massive
    Software Failures, Robert Glass
  • Denver Airport
  • Safeware: System Safety and Computers, Nancy
    Leveson
  • Therac-25
  • http://sunnyday.mit.edu/papers/therac.pdf
  • Risks to the Public, Peter Neumann
  • http://www.csl.sri.com/users/risko/risks.txt

6
Poor Engineering Has Disastrous Consequences!
  • Can you think of other examples?

7
1968 Birth of Software Engineering
  • 1968 NATO Conference on the new field of Software
    Engineering
  • http://www.cs.ncl.ac.uk/old/people/brian.randell/home.formal/NATO/index.html
  • A virtual Who's Who:
  • Dijkstra, Naur, Perlis, Gries

8
First Software Reuse paper
  • Doug McIlroy

9
1968 NATO SE
  • The conference must have been tiring.

10
No Silver Bullet: Essence and Accidents of
Software Engineering
  • IEEE Computer, April 1987
  • Brooks was the 1999 Turing Award winner
  • Brooks' Law
  • The Mythical Man-Month
  • UNC

11
Silver Bullets
  • But as we look to the horizon of a decade hence,
    we see no silver bullet. There is no single
    development, either in technology or management
    technique, which by itself promises even one
    order of magnitude improvement in productivity,
    in reliability, in simplicity. Not only are there
    no silver bullets in view, the very nature of
    software makes it unlikely there will be any.
  • Frederick Brooks, The Mythical Man-Month

No magical cure for the software crisis
12
In a nutshell
  • I believe the hard part of building software to
    be the specification, design, and testing of this
    conceptual construct, not the labour of
    representing it and testing the fidelity of the
    representation.

13
SE: A Disciplined Approach (Kill the Snake-Oil)
  • The first step toward the management of disease
    was replacement of demon theories and humours
    theories by the germ theory. That very step, the
    beginning of hope, in itself dashed all hopes of
    magical solutions. It told workers that progress
    would be made stepwise, at great effort, and that
    a persistent, unremitting care would have to be
    paid to a discipline of cleanliness. So it is
    with software engineering today.

14
Essence and accident
  • All software construction involves:
  • essential tasks: the fashioning of the complex
    conceptual structures that compose the abstract
    software entity (i.e., the problem space), and
  • accidental tasks: the representation of the
    abstract entities in programming languages and
    the mapping of these onto machine languages
    within space and time constraints.
  • Distinction originally due to Aristotle.

15
Brooks' thesis
  • Good news!
  • We have made great headway in solving the
    accidental problems! Orders of magnitude
    improvements!
  • Bad news!
  • Progress on essential problems will be much
    slower going.
  • There is no royal road, but there is a road.

16
No Silver Bullet Fred Brooks
  • Like physical hardware limits, there are problems
    w/ SW that won't be solved
  • Inherent difficulties of software production
  • complexity
  • conformity
  • changeability
  • invisibility

17
Complexity
  • Software is complex, even when done right.
  • Many components
  • Many kinds of interactions between components
  • Combinatorially many states, syntax must be just
    so or else.
  • 16-bit word in HW -> 2^16 states
  • Def-use of variables and state changes
  • Rich structure, interesting dependencies add to
    complexity.
  • Good interfaces, design can lessen
    externally-visible complexity
  • Accidental (as practised) issues add to
    complexity.
  • Efficient code is usually complicated code
  • Use of prefab abstractions, reuse, and generic
    components means complicated, stateful glue.
  • Complex code is harder to evolve, results in
    design drift and even more complexity.

18
Complexity
  • Complexity is an inherent property because there
    really are that many moving parts!
  • Hardware engineering is achieved by replication
    of relatively simple parts arranged just so to
    achieve a certain effect.
  • Software is NOT a physical system, it is entirely
    a design.
  • When two pieces of software perform the same
    task, we abstract them into one!
  • Thus the size of a well designed system
    measures not mass but complexity.
  • Complexity depends non-linearly on size
  • Results
  • difficult to understand whole product
  • errors in specification and code
  • hard to manage
  • how can you estimate without understanding?
  • maintenance is a nightmare

19
Conformity
  • Invariably, we have to coerce nice problem-space
    abstractions to fit someone else's prefab
    solution-space technology.
  • New kid on the block! (rules were already made)
  • Perceived as more flexible than hardware or
    humans
  • Result bewildering diversity of requirements
  • Wrapping and unwrapping, data marshalling, etc.
  • Software has to be made to agree to common
    interfaces, protocols, standards, etc.
  • Interface with an existing system
  • e.g., a plant is already built, and we are asked to
    create SW to control it
  • software must conform to the plant
  • Interface with a new system
  • misperception that SW is easier to conform
  • again, software will be forced to conform
  • Some translations/coercions can be automated and
    take care of themselves it's the other ones that
    will cause our systems to break, often in very
    subtle ways.

20
Changeability
  • the software product is embedded in a
    cultural matrix of applications, users, laws, and
    machine vehicles. These all change continually,
    and their changes inexorably force change upon
    the software product.
  • We change software because we can!
  • Easy to update existing systems in the field (in
    theory).
  • E.g., Windows update web site
  • Can undo changes if desired (in theory).
  • Not like a car recall.
  • To be successful is to be changed!
  • Keep system flexible, marketable.
  • Try to anticipate and/or react to future
    unforeseen uses
  • Pressure to change
  • reality changes
  • useful software will encourage new requests
  • long lifetime (15 yrs) vs. hardware (4 yrs).

21
Invisibility
  • The reality of software is not inherently
    embedded in space.
  • Software is invisible and unvisualizable
  • Different from physical laws and math theorems
  • no way to represent a complete product or
    overview
  • complete views (e.g., code)
  • incomprehensible
  • partial views
  • misleading
  • Makes it hard to communicate to
  • other software professionals
  • users and clients
  • Of course, we can use various techniques to help
    us visualize aspects of software (control flow,
    data dependencies, UML, etc.)
  • but that's NOT quite the same thing as, say, the
    blueprints of a building.

22
Attacks on accidental problems
  • HLLs
  • Frees us from byte-level thinking (accidental
    complexity). Takes us up to generic problem
    space.
  • e.g., C, sizeof(int), register allocation vs.
    pure objects and garbage collection
  • What we need beyond this is common abstractions
    in problem space
  • i.e., domain modelling: telephony, avionics,
    flight reservation
  • and in solution space too
  • e.g., frameworks, libraries, components
  • HLLs and OOP: replace Ada by Java
  • Good training/use will make for better systems.
  • When used well, bumps up the abstraction level
  • but still begs the question
  • Won't solve the SE problem, but the abstraction
    aspects of OO live in problem space too.
  • This has been a HUGE win in attacking essential
    complexity.

23
Attacks on accidental problems
  • IDEs (SDEs)
  • This was novel then! Research systems can
    actually do lots more than VC.
  • Even a smartly tweaked vim/emacs is a big step
    forward.
  • In the old days, correct and reasonably efficient
    compiling was a notable feat. It was hard to get
    Unix standalone tools to interoperate.
  • Current: Eclipse

24
What about these silver bullets?
  • AI and expert systems
  • Mostly not.
  • Sometimes they help, e.g., test oracles
  • Certainly, AI techniques are useful in many
    application domains, but not as a silver bullet
    to solve the SE problem.
  • WWW, faster processors, cheap memory have made
    some AI techniques quite useful, much more so now
    than then.

25
What about these silver bullets?
  • Automatic programming
  • i.e., state the parameters, turn the crank, and hey
    presto, a software system!
  • We can do this now in some cases, works quite
    well
  • it didn't work very well at the time of writing
  • Some systems generate part of their source code
  • Generative programming, model-based approaches
  • Will never work completely in the general case,
    but is certainly useful.
  • Really, all this does is bump up the abstraction
    level by one
  • Great potential in domain-specific environments
  • Parsers: yacc, lex
  • MIC/GME

26
What about these silver bullets?
  • Graphical programming: e.g., UML, SDL, et al.
  • Undeniably useful as a tool sometimes
  • Always tempting to try, as certain aspects of
    software systems do seem inherently
    topological.
  • A favourite topic for PhD dissertations in SE.
  • In the general case, software is ethereal,
    unvisualizable (unlike hardware) with truly
    bizarre dependencies.
  • Surprisingly difficult to do well as most visual
    metaphors break down under scale (or bad design).
  • Have you ever looked at a complicated SDL
    state-machine diagram? A class diagram with lots
    of complex interdependencies? A use-case scenario
    diagram with lots of variation?
  • Improvement Domain-specific visual languages

27
What about these silver bullets?
  • Formal program verification
  • (Dijkstra: formal derivation)
  • Does program P implement specification S?
  • Undecidable in the general case, can be done by
    hand until scale overwhelms
  • In practice, has been done with some great
    successes, but only when investment seems to be
    worthwhile.
  • Tremendously expensive; requires expert
    logicians!
  • The hard part, as Brooks rightly points out, is
    making sure you have the right specification S!
  • Validation versus verification

28
What about these silver bullets?
  • Better tools and faster computers
  • Always good. Usually solution-space though.
  • Technological Peter Principle
  • in a hierarchically structured administration,
    people tend to be promoted up to their "level of
    incompetence"
  • Cheap disks, fast modems? Napster!

29
Promising attacks on conceptual essence
  • Buy, don't build.
  • Old days: IBM in the 1960s; specialists developed
    highly customized solutions Just For You.
  • Experience led to generic (or at least
    configurable) products now we have
    shrinkwrapped software (one size fits all).
  • Advice to software developers Learn how to
    build generic components or systems. This is
    hard, though. Modularization (next topic) plays a
    big role.
  • If customization is straightforward, this is a
    huge leap forward and an attack on the conceptual
    essence.

30
Promising attacks on conceptual essence
  • Rapid prototyping
  • Take requirements, build mock up, show to user,
    analyze feedback, repeat.
  • Early feedback means less chance for requirements
    errors (which are the most expensive), fast
    turnaround in problem space to narrow
    misunderstandings and educate customer.
  • Requirements are the essence of a software
    system.
  • Focusing on getting the requirements right is a
    direct attack on essential problems.
  • Relevance to XP?

31
Promising attacks on conceptual essence
  • Staged delivery
  • aka organic/incremental development, daily
    build
  • Get a skeleton system up and running ASAP; flesh
    it out as you go.
  • Helps to get developers feeling like the system is
    real. They add their updates as they finish
    them. They care about them not breaking.
  • Other extreme: big-bang merging. Often lots of
    unpleasant surprises.
  • Microsoft (and many others) uses this approach;
    it works well for them.
  • Need a culture that supports it. Lots of running
    around, not afraid of change, group buy-in,
    product-centred, etc.

32
Promising attacks on conceptual essence
  • Virtuoso designers (a.k.a. Cowboy code-slinger)
  • Find good people and keep them.
  • Pay them well, encourage them.
  • Above all, listen to them.
  • Ours is still a very young field; this reliance
    on magic and gurus is worrisome and
    anti-engineering.
  • However, great software design is mostly pure
    design (not engineering); it's an act of
    creativity and innovation balanced against
    experience and good engineering. It's impossible
    to teach, per se.
  • Note that great artists still require a
    grounding education in classical techniques and
    exposure to best practices.

33
Other Promising Technologies?
  • AOP?
  • MIC or MDA?
  • WWW? 1998
  • Extreme programming?
  • design patterns, software architecture?
  • The emergent evolution of hard interfaces
  • e.g., imap, http, tcp/ip
  • Frameworks?
  • EJB, CORBA, COM, componentware?
  • scripting languages?
  • lawsuits?
  • your suggestions??

34
No Silver Bullet Refired
  • Brooks reflects on the No Silver Bullet paper,
    ten years later
  • Lots of people have argued that their methodology
    is the silver bullet
  • If so, they didn't meet the deadline of 10 years!
  • Other people misunderstood what Brooks calls his
    "obscure writing"
  • For instance, when he said "accidental", he did
    not mean "occurring by chance"

35
Obtaining the Increase
  • Some people interpreted Brooks as saying
    that the essence could never be attacked
  • That's not his point, however; he said that no
    single technique could produce an order-of-
    magnitude increase by itself
  • He argued that several techniques in tandem
    could achieve that goal, but that requires
    industry-wide enforcement and discipline

36
SE Principles - Parnas
37
Definition
  • Parnas starts this paper with a definition of SE
    as
  • multi-person construction of multi-version
    programs.
  • Do you agree with this definition?

38
Excludes solo-programming
  • The argument, in some way, makes sense
  • Dividing the job
  • Specifying exact behavior
  • Team communication
  • all of these things contribute to problems that
    need to be solved by applying SE principles.

39
But
  • Is it not the case that developing a large
    application, even by yourself, necessitates
    good SE?
  • Do we not need proper modularization when
    developing solo-programs?
  • If building a house relies on good construction
    and engineering skills, is it true that building
    a dog house (solo effort) does not?

40
Even building a dog house takes some engineering
From http://www.ttyler.8m.com/Dog%20House.htm: Initially
started as a "basic" dog house but soon
turned into a masterpiece of quality
workmanship. Total time spent was 8 hours at a
cost of $110 US. Start with a piece of paper and
an idea. Design your dog house to the size and
quantity of your dogs. A perfectly built home is
worthless if it's too small to properly accommodate
your dog. Framing: The framing process
should be constructed with 2x4's, or rip them in
half for smaller homes. A removable roof should
be incorporated to assist with future cleaning
and maintenance. Wall Covering: Should be
tongue and groove for a tight fit, no warping, and to
cut down on cross drafts. For large homes,
plywood is an economical material that can be
used. Roof: 30-year home shingles cut down
to the proper size. As for this house, an
oriental piece was constructed, then topped off
with a copper fence-post top. An additional
hour's work and $15 cost was needed. Trim and
Finishing Touches: Trim can add a lot to the
aesthetics of your dog house. Trim can be bought
with many different variations or, with some
craftsmanship, can be made with the use of a
router. Sanding and Paint: Sink all nails
below the surface and cover with wood filler.
Prepare the surface for painting by sanding wood
filler, rough spots, and blemishes.
41
More SE Principles
  • It seems that the problems identified as 4-6 are
    not peculiar to solo-programming
  • 4. How to write programs that are easily
    modifiable. Programs in which a change in one
    part does not require changes in many other
    places.
  • 5. How to write programs with useful subsets.
    Remove unneeded parts
  • E.g., Zen and CORBA footprint
  • 6. How to write programs that are easily extended.

42
Structure
  • The connections between program parts are the
    assumptions that the parts make about each
    other.
  • Precursor to Design by Contract: pre/post
    conditions
  • the properties that it expects other parts to
    satisfy
  • the system properties that it is required to
    guarantee
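As a small illustration of "assumptions as connections" (a hypothetical C++ sketch, not something from Parnas's paper), the properties a part expects and the properties it guarantees can be written down as explicit pre- and postconditions:

#include <cassert>
#include <cmath>

// Contract between caller and callee:
//   precondition  (expected of other parts): x >= 0
//   postcondition (guaranteed to other parts): result squared is close to x
// Each side depends only on these stated assumptions, not on internals.
double checked_sqrt(double x) {
    assert(x >= 0.0);                                    // precondition
    double r = std::sqrt(x);
    assert(std::fabs(r * r - x) <= 1e-9 * (1.0 + x));    // postcondition
    return r;
}

int main() {
    return checked_sqrt(2.0) > 1.0 ? 0 : 1;              // returns 0 on success
}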

43
Changeability
  • Parnas writes that these two concerns help when
    we ask
  • What changes can be made to one part without
    involving change to other parts?
  • Or, as Dijkstra has written elsewhere
  • ... program structure should be such as to
    anticipate its adaptations and modifications. Our
    program should not only reflect (by structure)
    our understanding of it, but it should also be
    clear from its structure what sort of adaptations
    can be catered for smoothly. Thank goodness the
    two requirements go hand in hand.

44
Two Techniques for Controlling Structure
  • Decomposition
  • Technique for dividing systems into modules
  • Well-structured program is one with minimal
    interconnections between its modules
    (low-coupling)
  • More to be said in later lectures
  • Precise Specification
  • precisely describing the assumptions that the
    designers of one module are permitted to make
    about other modules
  • More also to be said on this later
  • Some examples of why it is easier in other
    engineering endeavours

45
Decomposition and Simple Specification
The prong and receptacle parts of a Lego block
have been unchanged since 1932 [Lego, 2002].
46
Simple Interface Specification
Since around 1850, the standard dimensions for an
air cell masonry brick in the United States have
been 2.5 x 3.75 x 8 inches [Chrysler and Escobar,
2000].
47
For next lecture
  • Read chapter 7 of Parnas
  • On the criteria.
  • Perhaps one of the top 5 most cited papers in all
    of software engineering
  • I will also lecture on some things that you do
    not have to read (e.g., cohesion and coupling)
  • A pop quiz some time in the semester is possible
  • i.e., a quiz not on the typical Thursday, but on a
    Tuesday

48
A lot of review: On the Criteria (hopefully)
  • CS 620/720
  • Software Engineering
  • January 20, 2004

49
Basic Definitions of SE
  • Software engineering is a discipline whose aim is
    the production of fault-free software, delivered
    on time and within budget, which satisfies the
    user's needs [Schach]

50
Generic Lifecycle Models
51
Software Lifecycles
Linear Model: system/information engineering ->
analysis -> design -> code -> test
52
Waterfall
[Royce, 1970]
Figure from Object-Oriented and Classical Software
Engineering, Fifth Edition, WCB/McGraw-Hill,
2002, Stephen R. Schach
Verification vs. validation
53
Relative Cost Of Software Development Activities
54
The Cost of Change
55
More Overview
  • Abstraction, Information Hiding, Encapsulation,
    Cohesion/Coupling.
  • all in 15 minutes!!

56
Abstraction
  • A means of achieving stepwise refinement by
    accentuating relevant details (and, by
    implication, suppressing unnecessary details).
  • Ex. Braking in your car, turning on the lights
  • Other examples?
  • How about examples in the medical field, or other
    disciplines?

57
Abstraction some definitions
  • "A view of a problem that extracts the essential
    information relevant to a particular purpose and
    ignores the remainder of the information."
  • -- IEEE, 1983
  • "The essence of abstraction is to extract
    essential properties while omitting inessential
    details. -- Ross et al, 1975
  • "Abstraction is a process whereby we identify
    the important aspects of a phenomenon and ignore
    its details. -- Ghezzi et al, 1991
  • "Abstraction is generally defined as 'the
    process of formulating generalized concepts by
    extracting common qualities from specific
    examples.'"
  • -- Blair et al, 1991
  • "Abstraction is the selective examination of
    certain aspects of a problem. The goal of
    abstraction is to isolate those aspects that are
    important for some purpose and suppress those
    aspects that are unimportant."
  • -- Rumbaugh et al, 1991

58
Abstraction some definitions
  • "The meaning of abstraction given by the
    Oxford English Dictionary (OED) closest to the
    meaning intended here is 'The act of separating
    in thought'. A better definition might be
    'Representing the essential features of something
    without including background or inessential
    detail.
  • -- Graham, 1991
  • "A simplified description, or specification,
    of a system that emphasizes some of the system's
    details or properties while suppressing others. A
    good abstraction is one that emphasizes details
    that are significant to the reader or user and
    suppress details that are, at least for the
    moment, immaterial or diversionary."
  • -- Shaw, 1984
  • "An abstraction denotes the essential
    characteristics of an object that distinguish it
    from all other kinds of object and thus provide
    crisply defined conceptual boundaries, relative
    to the perspective of the viewer."
  • -- Booch, 1991

59
Abstraction my favorite definition
  • "Abstraction is doing just what our small minds
    need: making it possible for us to think about
    important properties of our program (its
    behavior) without having to think about the
    entirety of the machinations."
  • Kiczales, 1992

60
Information Hiding
  • The focus of today's paper
  • Hides the implementation details from other
    modules

"The second decomposition was made using
'information hiding ... as a criterion. The
modules no longer correspond to steps in the
processing. ... Every module in the second
decomposition is characterized by its knowledge
of a design decision which it hides from all
others. Its interface or definition was chosen to
reveal as little as possible about its inner
workings." -- Parnas, 1972b "... the purpose
of hiding is to make inaccessible certain details
that should not affect other parts of a
system." -- Ross et al, 1975
61
I'm sure glad I don't have to eat this stuff!
Looks Yummy!
62
The Restaurant
Messages invoke methods; methods send messages.
Customer
Turnstile (object): Data: tickets = 1;
Methods: isTicketReady, addTicket, removeTicket
Cook (object): Data: name = Arnold,
specialties = HamandEggs, Pancakes, FrenchToast;
Private Methods: makeHamandEggs, makePancakes,
makeFrenchToast; Public Methods:
takeTicketFromTurnstile, putOrderOnCounter
Waiter (object): Data: name = Joe, tables = 1, 2,
tickets = 2; Methods: takeOrder,
putOrderOnTurnstile, pickupOrder, serveOrder
Counter (object): Data: ordersAvailable;
Methods: isOrderReady, addOrder, removeOrder
63
Abstraction vs. Information Hiding
  • However, abstraction ≠ information hiding
  • It is possible to hide implementation details,
    yet provide a very poor interface into the module
    such that its key elements are still not easy to
    comprehend
  • Abstraction is about providing a representation
    of some thing which highlights that thing's
    essential elements

64
Encapsulation
  • The gathering together into one unit of all
    aspects of the real-world entity modeled by the
    abstract data unit.
  • Definitions

"to enclose in or as if in a capsule -- Mish,
1988 "The concept of encapsulation as used in
an object-oriented context is not essentially
different from its dictionary definition. It
still refers to building a capsule, in the case
a conceptual barrier, around some collection of
things." -- Wirfs-Brock et al, 1990
65
Encapsulation
  • Modularity is about separation When we worry
    about a small set of related things, we locate
    them in the same place. This is how thousands of
    programmers can work on the same source code and
    make progress.
  • Gabriel and Goldman, 2000

66
Information Hiding vs. Encapsulation
  • The two are also not equal
  • It is possible to have an encapsulated module
    that has all of its internal structure visible
    from the outside
  • Commonality of module is collected in one place,
    but the inner guts are not hidden (e.g., all
    members of a class are public)

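A tiny hypothetical C++ illustration of this point: the struct below gathers everything about a 2-D point into one unit (encapsulation) yet hides nothing, because every member is public and callers may depend on the representation.

// Encapsulated: all aspects of the "point" concept live in one unit.
// Not information-hidden: the representation (Cartesian x, y) is fully
// visible, so switching to polar coordinates would break every client
// that touches the fields directly. Making x and y private behind
// accessors would add the hiding.
struct Point {
    double x;
    double y;
    void move(double dx, double dy) { x += dx; y += dy; }
};

int main() {
    Point p{0.0, 0.0};
    p.move(1.0, 2.0);
    return (p.x == 1.0 && p.y == 2.0) ? 0 : 1;   // clients reach into the representation
}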
67
Cohesion/Coupling
  • Two factors that help increase reliability,
    understandability, efficiency, and
    maintainability within and between modules.
  • Cohesion - within a module
  • Coupling - between modules
  • Provides some initial objective measure to the
    question What makes a good design?

68
A Brain Teaser: Who's joined to whom, and by
what?
(Diagram: modules A through G and a Global Data
area; the connections among them are the puzzle.)
69
Modular Cohesion
  • The degree of interaction within a module.
  • OR
  • The measure of the strength of functional
    relatedness of elements (an instruction, group of
    instructions, a data definition, or a call to
    another module) within a module.
  • The term was borrowed from sociology by Larry
    Constantine in the mid-1960s, where it means the
    relatedness of humans within groups.

70
Scale of Cohesion
  • Stevens, Myers, Constantine, and Yourdon
    developed the Scale of Cohesion as a measure of
    the black boxness of a module, and as a result,
    the maintainability of a module.

Scale of Cohesion
71
Coincidental Cohesion
  • A module whose elements perform multiple,
    completely unrelated actions.
  • Such modules make systems less understandable and
    less maintainable than systems with no modularity
    at all.
  • GrossPay = PayRate * Hours
  • SalesTax = Cost * SalesTaxRate
  • Close File 1
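A hedged C++ sketch of such a module (the names simply mirror the bullets above): three unrelated actions lumped into one routine for no problem-related reason.

#include <cstdio>

// Coincidental cohesion: the only thing these actions share is that they
// ended up in the same routine. A caller who needs one of them drags in
// the others, and none of the code can be reused cleanly on its own.
void do_misc_stuff(double payRate, double hours,
                   double cost, double salesTaxRate,
                   std::FILE* file1) {
    double grossPay = payRate * hours;        // payroll concern
    double salesTax = cost * salesTaxRate;    // billing concern
    std::fclose(file1);                       // unrelated I/O housekeeping
    std::printf("%.2f %.2f\n", grossPay, salesTax);
}

int main() {
    std::FILE* f = std::fopen("file1.txt", "w");
    if (f) do_misc_stuff(25.0, 40.0, 100.0, 0.07, f);
}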

72
Coincidental Cohesion cont.
  • Disadvantages of Coincidental Cohesion
  • Severe lack of maintainability of product.
  • Lack of reusability.
  • Corrective action
  • break the module into smaller modules.

73
Logical Cohesion
  • Occurs when functions that share overlapping
    pieces of code or the same buffers, but are not
    even executed at the same time, are lumped into
    one module (switch-statement dispatch).
  • function_code = 7
  • New_operation(function_code, d1, d2, d3)
  • // d1, d2, d3 are dummy variables and are not
  • // used when function_code == 7
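A small C++ sketch of the pattern (the operation codes and parameters are invented): one entry point dispatches on a function code, and for most codes several of the arguments are meaningless dummies.

#include <iostream>

// Logical cohesion: several operations are bundled behind one entry
// point and selected by a function code. For code 7 the d1..d3
// arguments are dummies; the interface hides which parameters matter.
void new_operation(int function_code, double d1, double d2, double d3) {
    switch (function_code) {
        case 1:  std::cout << "insert " << d1 << '\n'; break;              // uses d1 only
        case 2:  std::cout << "move " << d1 << ',' << d2 << '\n'; break;   // ignores d3
        case 7:  std::cout << "reset\n"; break;                            // ignores d1..d3
        default: std::cout << "unknown code\n"; break;
    }
}

int main() {
    new_operation(7, 0.0, 0.0, 0.0);   // dummy arguments, as in the slide
}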

74
Logical Cohesion cont.
  • Disadvantages
  • The interface is difficult to understand.
  • The code for more than one action may be
    intertwined, leading to maintainability problems.
  • The intertwining makes reusability of the module
    difficult, if not impossible.
  • Corrective action
  • separate the functions and rewrite.

75
Logical (aka Illogical) Cohesion
Not only does a logically cohesive module have
an ugly exterior with maybe a dozen different
parameters fighting to use four accesses, but
its inside resembles a plate of spaghetti mixed
with noodles and worms.
-- Meilir Page-Jones, 1988
76
Temporal Cohesion
  • A module whose elements are involved in
    activities that are related in time.
  • Elements are usually more closely related to
    activities in other modules than they are to one
    another (leads to tight coupling).
  • Disadvantage
  • Lack of reusability in other products.
  • Corrective Action
  • Take the procedure apart and rewrite code as
    necessary

77
Procedural Cohesion (skip)
  • A module whose elements are involved in different
    and possibly unrelated activities in which
    control flows from each activity to the next.
  • Related to each other by order of execution
    rather than by any single problem-related
    function (Similar to temporal cohesion).

78
Communicational Cohesion
  • A module whose elements contribute to activities
    that use the same input or output data.
  • E.g., Update record in database and write it to
    log file
  • Makes the modules more easily maintainable, but
    still not easily reusable.

79
Informational Cohesion
  • A module whose elements perform a number of
    actions, each with its own entry point, with
    independent code for each action, and all
    performed on the same data structure.
  • Ex. Abstract data types
  • Supports Structured Programming concepts.

80
Functional Cohesion
  • A module whose elements all contribute to the
    execution of one and only one problem-related
    task (but not necessarily one and only one
    output).
  • Systems built chiefly of normally coupled,
    functionally cohesive modules are by far the
    easiest (and thus the cheapest) to maintain.
  • No matter how complicated, the sum of the module
    is one problem-related function.

81
Functional Cohesion cont.
  • Advantages
  • Concept of module is easily understood.
  • Easily maintained.
  • Product is more easily updatable
  • or changeable.
  • Supports fault isolation (easily testable).
  • Supports heavy Reusability.

82
Comparisons
Comparisons Between Levels of Cohesion
83
Modular Coupling
  • The degree of interaction between two modules.

Levels of Coupling, from best (lowest interaction)
to worst (highest interaction): Normal (Data,
Stamp, Control), then Common, then Content.
84
Content (alias Pathological) Coupling
  • Two modules exhibit content coupling if one
    refers to the inside of the other in any way.
  • Value being accessed is not passed through the
    parameter list.
  • Ex. if one module alters a statement in another,
    or updates another module's internal state.

(Diagram: module p directly uses local data a that
belongs to another module.)
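A hypothetical C++ sketch of content coupling (the module names are invented): module P reaches into module Q's internal data instead of going through Q's interface.

#include <iostream>

namespace moduleQ {
    int counter = 0;                   // intended to be an internal detail of Q
    void tick() { ++counter; }         // the interface Q means to expose
}

namespace moduleP {
    // Content coupling: P rewrites Q's internal counter directly, so any
    // change to Q's representation (or its invariants) silently breaks P.
    void meddle() { moduleQ::counter = 42; }
}

int main() {
    moduleQ::tick();
    moduleP::meddle();
    std::cout << moduleQ::counter << '\n';   // 42: Q's bookkeeping was bypassed
}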
85
Common (alias Global) Coupling
  • Two modules that refer to the same global data
    area.
  • Disadvantages
  • Global areas may sometimes be drastically abused,
    as in when different modules use the same area to
    store quite different pieces of information,
    called overloading.
  • Programs using a lot of global data are extremely
    difficult to understand because of the difficulty
    of knowing what data are used by which module
    (very expensive to correct); def-use pairs are
    hard to see
  • Wulf and Shaw, "Global Variable Considered
    Harmful"
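A minimal C++ sketch of common coupling (names invented): two modules communicate through a shared global area, so neither can be understood without knowing every other user of that area.

#include <iostream>

double sharedArea = 0.0;   // global data area used by both modules

void moduleA() { sharedArea = 3.14; }                      // writes the global
void moduleB() { std::cout << sharedArea * 2.0 << '\n'; }  // silently depends on A

int main() {
    moduleA();
    moduleB();   // prints 6.28 only because A happened to run first
}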

86
Control Coupling
  • Two modules where one passes to the other a piece
    of information intended to control the internal
    logic of the other.
  • Typically through the use of control flags.
  • Disadvantages
  • Leads to indirectness and obscurity.
  • Two modules are not independent.
  • Possibility of reuse is reduced.
  • Generally associated with modules that have
    logical cohesion.
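A hedged C++ sketch of a control flag (the report example is invented): the caller's flag steers the callee's internal logic, and splitting the routine in two would reduce this to plain data coupling.

#include <iostream>
#include <string>

// Control coupling: asSummary is a flag that selects which of the
// callee's internal paths runs, so the caller must know about them.
void print_report(const std::string& data, bool asSummary) {
    if (asSummary)
        std::cout << "summary: " << data.size() << " bytes\n";
    else
        std::cout << data << '\n';
}

// Lower-coupling alternative: print_summary(data) and print_full(data)
// as separate routines, letting the caller choose by name instead.
int main() {
    print_report("quarterly figures ...", true);
}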

87
Stamp Coupling
  • Two modules where one passes to the other a
    composite piece of data, that is, a piece of data
    with a meaningful internal structure.
  • Ex. All Employee Personnel Info, instead of just
    the pay rate and SSN.
  • Disadvantages
  • The indirectness can cause a broad interface.
  • The module is handed data it does not necessarily
    need (creating dependencies between otherwise
    unrelated modules).

88
Data Coupling
  • Two modules that communicate by parameters, each
    parameter being an elementary piece of data.
  • Communication of data between modules is
    unavoidable and necessary, as long as it is kept
    to a minimum.

(Diagram: module C calls module D, passing the
elementary data items X and Y.)
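A hypothetical C++ sketch contrasting the two levels (the employee record is invented): the stamp-coupled routine receives the whole composite although it needs only elementary items, while the data-coupled routine receives just those items.

#include <iostream>
#include <string>

struct EmployeeRecord {     // composite data with meaningful internal structure
    std::string name;
    std::string ssn;
    double payRate;
    int vacationDays;       // irrelevant to the payroll computation below
};

// Stamp coupling: depends on the whole record's layout even though it
// only uses payRate, so unrelated changes to the record can affect it.
double grossPayStamp(const EmployeeRecord& e, double hours) {
    return e.payRate * hours;
}

// Data coupling: receives only the elementary items it actually needs.
double grossPayData(double payRate, double hours) {
    return payRate * hours;
}

int main() {
    EmployeeRecord e{"Ada", "000-00-0000", 25.0, 12};
    std::cout << grossPayStamp(e, 40.0) << ' '
              << grossPayData(e.payRate, 40.0) << '\n';
}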
89
Data Coupling cont.
  • Advantages
  • Avoids sending unnecessary data
  • Is direct.
  • Flexible.
  • Highly reusable.
  • Maintainable.

90
Data Coupling - Warnings
  • 1. Small is better. Keep the interface as narrow
    as possible.
  • 2. Avoid using tramp data,
  • Data that passes through modules that do not need
    it in order to reach the recipient module
    (AspectJ wormhole example)
  • pieces of information that shuffle aimlessly
    around a system, unwanted by and meaningless to
    most of the modules through which it passes.
    Usually a symptom of poor organization of
    modules.
  • To varying degrees, tramp data violates all five
    of the principles for good coupling

91
Comparisons
Comparisons Between Levels of Coupling
92
The Goal of Good Modularity?
  • High Cohesion
  • Functional or Informational
  • Low Coupling
  • Data, Stamp, Control

93
When to Use What?
  • Cohesion's Goal:
  • To create a procedure that performs one
    functionally-related task.
  • Coupling's Goal:
  • To protect global data and local data from being
    used within a procedure without declaring it in
    the procedure's header

Both significantly affect maintenance. When
used correctly, maintenance can be reduced; when
used incorrectly, maintenance can be a nightmare!
94
Enough of that.
  • Sample code from last week
  • Parnas paper

95
Simple C++ Example

#include <iostream>
using namespace std;

class B {
public:
    B() {}
    void f()         { cout << "B::f()" << endl; }
    virtual void g() { cout << "B::g()" << endl; }
};

class D : public B {
public:
    D() {}
    void f() { cout << "D::f()" << endl; }
    void g() { cout << "D::g()" << endl; }
};

int main(int, char**) {
    B* bp;
    D* dp;
    bp = new B; bp->f(); bp->g();
    dp = new D; dp->f(); dp->g();
    bp = dp;    bp->f(); bp->g();   // f() statically bound, g() dynamically bound
}

Output: B::f() B::g() D::f() D::g() B::f() D::g()
96
A Sketchy Evolution of Software Design
  • 1960s
  • Structured Programming
  • ("Go To Statement Considered Harmful", E. W. Dijkstra)
  • Emerged from considerations of formally
    specifying the semantics of programming
    languages, and proving programs satisfy a
    predicate.
  • Adopted into programming languages because it's a
    better way to think about programming
  • 1970s
  • Structured Design
  • Methodology/guidelines for dividing programs into
    subroutines.
  • 1980s
  • Modular (object-based) programming
  • Ada, Modula, Euclid,
  • Grouping of sub-routines into modules with data.
  • 1990s
  • Object-Oriented Languages started being commonly
    used (60s origin)
  • Object-Oriented Analysis and Design for guidance.

97
Module Structure
  • David Parnas (birthday anecdote)
  • On the Criteria To Be Used in Decomposing
    Systems into Modules Comm. ACM 15, 12 (Dec.
    1972), 1053-1058
  • Perhaps the most popular paper in SE
  • Initial CACM rejection ("Nobody does it that
    way")
  • Universal acceptance (Parnas wrote about common
    practice)
  • Discusses modularization
  • Module: a collection of subroutines and data
    elements
  • Critique of Procedural Design
  • Pointing the way to object-based and OO design.
  • Describes two ways to modularize a program that
    generates KWIC (Key Word in Context) indices.
  • Modularization 1 - Based on the sequence of steps
    to perform
  • Modularization 2 - Based on the principle of
    information hiding

98
Weiss quote
  • The way to evaluate a modular decomposition,
    particularly one that claims to rest on
    information hiding, is to ask what changes it
    accommodates.
  • Hoffman and Weiss, 2001

99
KWIC
  • Input:
  • Designing Software for Ease of Construction
  • Figs are Good
  • Output:
  • are Good Figs
  • for Ease of Construction Designing Software
  • of Construction Designing Software for Ease
  • Construction Designing Software for Ease of
  • Designing Software for Ease of Construction
  • Ease of Construction Designing Software for
  • Figs are Good
  • Good Figs are
  • Software for Ease of Construction Designing
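A hedged C++ sketch of the core KWIC step (an illustration, not Parnas's design): generate every circular shift of each input line, then alphabetize. The sort here is a plain case-sensitive comparison, so its ordering differs slightly from the listing above.

#include <algorithm>
#include <iostream>
#include <sstream>
#include <string>
#include <vector>

// All circular shifts of a line: each word in turn becomes the first
// word, with the remaining words wrapped around after it.
std::vector<std::string> circularShifts(const std::string& line) {
    std::vector<std::string> words;
    std::istringstream in(line);
    for (std::string w; in >> w; ) words.push_back(w);

    std::vector<std::string> shifts;
    for (std::size_t i = 0; i < words.size(); ++i) {
        std::string s;
        for (std::size_t j = 0; j < words.size(); ++j) {
            if (j) s += ' ';
            s += words[(i + j) % words.size()];
        }
        shifts.push_back(s);
    }
    return shifts;
}

int main() {
    std::vector<std::string> input = {
        "Designing Software for Ease of Construction",
        "Figs are Good"
    };
    std::vector<std::string> index;
    for (const auto& line : input) {
        auto s = circularShifts(line);
        index.insert(index.end(), s.begin(), s.end());
    }
    std::sort(index.begin(), index.end());   // the alphabetizer step
    for (const auto& entry : index) std::cout << entry << '\n';
}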

100
KWIC Modularization 1
(Diagram: module structure showing master control,
input medium, and output medium.)
101
KWIC Modularization 2
(Diagram: module structure showing master control,
input medium, and output medium.)
102
Criteria for decomposition
  • Modularization 1
  • Each major step in the processing was a module
  • Modularization 2
  • Information hiding
  • Each module has one or more "secrets"
  • Each module is characterized by its knowledge of
    design decisions which it hides from all others.
  • Lines
  • how characters/lines are stored
  • Circular Shifter
  • algorithm for shifting, storage for shifts
  • Alphabetizer
  • algorithm for alpha, laziness of alpha

103
General Comparison
  • General
  • Note both systems might share the same data
    structures and the same algorithms
  • Differences are in the way they are divided into
    work assignments
  • Systems are substantially different even if
    identical in the runnable representation
  • Possible because the runnable representation is
    used only for running
  • Other representations are used for
  • Changing
  • Documenting
  • Understanding

104
Changeability Comparison
  • Design decisions that may change (design1,
    design2)
  • Input format
  • (1, 1)
  • All lines stored in memory
  • (all, 1)
  • Pack characters 4 to a word
  • (all, 1)
  • Make an index for circular shifts rather than
    store them
  • (3,1)
  • Alphabetize once, rather than either
  • Search for each item as needed
  • Partially alphabetize, partially search
  • (3,1)

105
Independent Development
  • Modularization 1
  • Must design all data structures before parallel
    work can proceed
  • Complex descriptions needed
  • Modularization 2
  • Must design interfaces before parallel work can
    begin
  • Simple descriptions only
  • Comprehensibility
  • Modularization 2 is better
  • Parnas' subjective judgment

106
Comparing Rationales
Design Criterion -- Modularization 1: Each major processing step is made into a module. Modularization 2: Modules are designed using the principle of information hiding.
Is it task-specific? -- Modularization 1: Yes; e.g., the Add module is responsible for directly adding a contact into the address book. Modularization 2: Yes; e.g., the Add module is responsible for directly adding a contact into the address book.
Inter-dependence -- Modularization 1: HIGH; all modules are heavily dependent on the Data Storage module. Modularization 2: NONE; all modules are independent!
107
Information Hiding
  • Before decomposing a system into modules, a list
    of all possible design changes is made - Hiding
    Assumption List
  • Each module hides the implementation of an
    important design decision so that only the
    constituents of that module know the details
  • All design decisions are independent of each other

108
Information Hiding in Modularization 2
  • Modularization 2 used this principle of
    Information Hiding.
  • All of its modules are independent and have
    well-defined interfaces.
  • There is very low coupling between them.

109
Information Hiding in Modularization 2
  • Each module is very task-specific. All modules
    are highly cohesive.
  • For example, the sorting algorithm is known only
    to the Sort module. Similarly, the format of data
    storage is known only to the Read/Write Interface
    module.

110
Benefits of Good Modular Design
  • Independent Development
  • Since each module is independent, they can be
    developed independently at the same time ->
    Shortened Development Time!

111
Benefits of Good Modular Design
  • Changeability, Product Flexibility Reusability
  • Modules can be easily modified without affecting
    the rest of them. Moreover, modules can be easily
    replaced to add, enhance or change product
    capabilities.

112
Benefits of Good Modular Design
  • Comprehensibility
  • It is easier for programmers to fully understand
    the design of the entire product by individually
    studying the modules.

113
Comprehensibility Quote
  • In many pieces of code the problem of
    disorientation is acute. People have no idea what
    each component of the code is for and they
    experience considerable mental stress as a
    result.
  • Gabriel, 1995

114
Historical Content in Present Context
  • Paper is 30 years old, but only some details
    might make this fact apparent
  • Terminology
  • Previous concerns
  • Past design processes (flowcharts)
  • Changing guidelines
  • Code reuse (not a major point)

115
Terminology
Parnas uses some terms that are not used anymore,
or are used nowadays with different meanings,
such as:
- CORE: Then: main memory, general storage space.
  Now: internal functionality, internals.
- JOB: Then: implied batch processing. Now: ???
- Nowadays, we speak of memory in a more abstract
  way (data structures, etc.). Memory was more
  often understood as referring to physical storage
  (addresses, records).
116
Previous Concerns
  • Parnas mentions as major advancements in the
    area of modular programming
  • The development of ASSEMBLERS
  • Nowadays, we could mention higher level
    languages, mainly object-oriented languages that
    better
  • (1) allow one module to be written with little
    knowledge of the code in another module, and
  • (2) allow modules to be reassembled and replaced
    without reassembly of the whole system
  • Aspect Languages

117
Past Design Processes
  • Use of flowcharts
  • When the paper was written, the use of flowcharts
    by programmers was almost mandatory. With a
    flowchart in hand, the programmer would move
    from there to a detailed implementation. This
    caused modularizations like the first one to be
    created.
  • Parnas could see the problem with this approach
    and condemned it: a flowchart works OK for a
    small system, but not for a larger one.

118
Changing Guidelines
  • The sequencing of instructions necessary to call
    a given routine and the routine itself are part
    of the same module.
  • This pertains to worries of programmers at the
    time because they were programming in assembly
    and other low-level languages. Concerns such as
    code optimization were very important and
    involved creating smaller sets of machine
    instructions for a given task.
  • The sequence in which certain items will be
    processed should be hidden within a single
    module.
  • It has become irrelevant most times.

119
Code Reuse
  • Parnas does not emphasize code reuse so much in
    this paper. The reason might be the nature of
    programs written in assembly or other low-level
    languages (not very portable/reusable).
  • If the paper were to be revised by Parnas,
    reuse would certainly be a point he would
    emphasize more.
  • It is important to notice that these points do
    not disturb the current relevance of Parnas'
    ideas.

120
Effects on Current Programming
  • Fathered key ideas of OOP
  • Information hiding
  • Encapsulation before functional relations
  • Easier understandability/maintainability
  • Design more important than implementation
  • Good design leads to good implementation
  • Proper design allows for different
    implementations (easily modifiable)

121
New forms of Separation
  • Early plug for course next Fall
  • CS 692/792
  • Reflection and Metaprogramming
  • Advanced Separation of Concerns
  • Model-integrated Computing
  • Adaptive Middleware

122
Intermingled Decisions
123
Concern Separation
124
Making Modules Easier to Change
125
Parnas Transparency (skip 9.6 and 9.7)
OO Design Principles: Open-Closed, Liskov
Substitutability, Dependency Inversion
  • CS 620/720
  • Software Engineering
  • January 22, 2004

126
Next Week
  • Tue
  • Reading Demeter
  • Reading Parnas Extension/Contraction
  • Reading Big Ball of Mud
  • Lead into patterns, frameworks, refactoring
  • May not cover all of this!
  • Thu
  • Finish what is left from Tue lecture
  • Patterns intro
  • Quiz 2
  • HW1 assigned

127
Parnas Transparency
128
Top Down Design
  • Also called Outside In Design
  • Describes and creates a system from the highest
    hierarchical level where the full specifications
    of a design must be known
  • Difficult or infeasible to obtain full
    specification
  • Can result in software that is unnecessarily
    inflexible
  • For these reasons, pure
  • Top Down has problems

129
Bottom Up Design
  • Create the system Inside Out from a set of
    lower level components (i.e. start at the
    bottom)
  • Work upwards, solving entire project
  • Reuse components from other projects
  • More practical to implement internal structures
    first, creating separate modules and joining them
    together
  • Bottom Up is more flexible. Hard to design
    general purpose system / library using top-down

130
Bottom Up Design (cont.)
  • As you move up the system hierarchy, you create
    structural levels
  • Base Machine
  • the lower level of a hierarchy, maybe hardware or
    an intermediate software level
  • Virtual Machine
  • a level above the base machine, it hides the
    complexity of the base machine to make
    interaction with the system easier

131
Transparency in Bottom Up Design
  • Transparency
  • describes the implementation completeness of the
    virtual machine with respect to the base
    machine's functionality
  • Complete transparency
  • the virtual machine has ALL of the functionality
    of the base machine
  • Loss of transparency
  • a lack of functionality with respect to the base
    machine exists in the virtual machine
  • There is some sequence that can be specified in
    the base machine that can not be expressed in the
    virtual machine

132
Driving with strings and steering wheel
Base machine
New virtual machine
133
Example positions
What figures suggest a loss of transparency? In
this case, is the loss of transparency ok?
134
Virtual machine for register access
  • Many possible implementations
  • register is an array: indexing; shifting for
    insert/delete
  • register is a one-way linked list: linear search
  • register is an indexed linked list
  • register is a linked list of small arrays

135
Completeness of the abstraction
  • What operation does Parnas show is not possible
    in the virtual machine, which suggests a loss of
    transparency?

136
Other Examples of Transparency
  • Hardware
  • Search Engine

137
Example Graphics Card Transparency
Hierarchical Level - Description:
0 - Graphics Card (silicon)
1 - Driver
2 - API (DirectX, OpenGL)
3 - Application (Game, CAD)
138
Graphics Card Example (cont.)
  • Positive results of transparency
  • Much easier to program with API than directly
    with driver. Using an API lets an application
    run on different hardware
  • Negative results of transparency
  • Depending on the implementation, an application
    might not run as fast on a particular piece of
    hardware, i.e., it won't fully utilize certain