1985 CPSR-MIT Debate

Transcript and Presenter's Notes

1
1985 CPSR-MIT Debate
  • Michael Dertouzos, moderator
  • David Parnas, against SDI
  • (Joseph Weizenbaum, against)
  • Charles Seitz, for SDI
  • (Danny Cohen, for)

2
Charles Seitz, arguing for
3
Pause for Analysis
  • Sketch Seitz's argument in premise-conclusion
    style
  • Since Premise, and
  • Premise,
  • Therefore Conclusion.
  • (Hint: identify the conclusion first.)

4
Seitz Conclusion
  • It is possible to create reliable SDI
    software.

5
Seitz Premises
  • Since
  • A hierarchical architecture seems best,
  • (because more natural, used in nature,
    understood by military, allows abstraction up
    levels)

6
Seitz Premises
  • Since
  • A hierarchical architecture seems best,
  • Physical organization should follow logical
    organization,
  • (simplest choice, natural)

7
Seitz Premises
  • Since
  • A hierarchical architecture seems best,
  • Physical organization also hierarchical,
  • Tradeoffs to make software problem tractable
    are in the choice of system architecture
  • (not in new / radical methods)

8
Seitz Premises
  • Since
  • A hierarchical architecture seems best,
  • Physical organization also hierarchical,
  • This makes software problem tractable,
  • Loose coordination allows us to infer
    system performance
  • (assume statistical independence; see the sketch below)
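A minimal sketch of the inference this premise relies on: if the parts
of the defense are assumed to fail independently, system-level leakage
can be computed from separately measured per-part effectiveness. The
figures below are hypothetical, for illustration only.

    # Illustrative only: infer full-system leakage from per-layer kill
    # probabilities, assuming statistically independent layers (the very
    # assumption under debate, not an established property of the system).
    def leakage(layer_kill_probs):
        """Probability a warhead survives every layer, if layers are independent."""
        survive = 1.0
        for p in layer_kill_probs:
            survive *= (1.0 - p)
        return survive

    # Hypothetical figures: boost, midcourse, and terminal layers each
    # measured separately at 90% effectiveness.
    print(leakage([0.9, 0.9, 0.9]))   # 0.001, i.e. 0.1% leakage
    # If layer failures are correlated (Parnas's objection), this product
    # no longer describes the real system.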

9
Seitz Argument
  • Since
  • A hierarchical architecture seems best,
  • Physical organization also hierarchical,
  • This makes software problem tractable,
  • And allows system reliability estimate,
  • Therefore
  • It is possible to create reliable SDI
  • battle management software.

10
Pause for Analysis
  • Whose argument is better?
  • Why?
  • Do they start with the same problem
    definition?

11
David Parnas, Rebuttal
12
Charles Seitz, Rebuttal
13
Pause for Analysis
  • Relevant analogies to SDI?
  • Why / why not?
  • Space shuttle software
  • Telephone system software
  • Nuclear plant software
  • others?

14
Pause for Analysis
  • Outline the most realistic SDI software
    testing that you can.

15
Pause for Analysis
  • How did you account for
  • real-world sensor inputs
  • variable weather conditions
  • target / decoy appearance
  • variable attack structure
  • attacked components failing

16
Fault Tolerant Software?
  • James Ionson, in Reliability and Risk, a CPSR
    video.

17
Fault Tolerant Software?
  • It is not error-free code, it is fault-tolerant
    code. And if another million lines has to be
    written to ensure fault-tolerance, so be it.
  • - James Ionson

18
Fault Tolerant Software?
  • Diagram in premise-conclusion form the argument
    being made by James Ionson.
  • Does the argument make sense?
  • Why / why not?

19
Star Wars Today
  • Current SDI-like programs are called
    National Missile Defense.
  • There are some potentially important
    differences.

20
Star Wars Today
  • One of the remarkable aspects of the evolution
    of missile defenses is that few policy makers
    question the fundamental ability to be
    effective. Instead they focus on timing, cost,
    ...
  • (Mosher, page 39, IEEE Spectrum, 1997)

21
Star Wars Today
  • This is a sharp change from the Reagan years,
    perhaps because the technology used is closer at
    hand and the threats are smaller.
  • (Mosher, page 39, IEEE Spectrum, 1997)

22
Star Wars Today
  • Smaller anticipated mission
  • protect the U.S. against an attack by a rogue
    state using a handful of warheads outfitted with
    simple countermeasures.
  • (Mosher, page 36, IEEE Spectrum, 1997)

23
Star Wars Today
  • Smaller anticipated mission
  • also provide protection against an accidental
    launch of a few warheads by Russia or China.
  • (Mosher, page 36, IEEE Spectrum, 1997)

24
Star Wars Today
  • One talked-about version does not use
    space-based weapons
  • no more than 100 hit-to-kill interceptors
    based at an old ABM site near Grand Forks, ND.
  • (Mosher, page 37, IEEE Spectrum, 1997)

25
Pause for Analysis
  • How fundamentally does it change Parnas's
    argument if the anticipated attack uses
    fewer and simpler missiles?

26
Parnas Argument
  • How are the premises changed?
  • Specifications not known in advance,
  • Realistic testing is not possible,
  • No chance to fix software during use,
  • No foreseeable technology changes this,
  • None are changed in principle but
  • overall it seems somehow less impossible.

27
Star Wars Testing
  • In the last 15 years, the U.S. has conducted 20
    hit-to-kill intercepts ... Six intercepts were
    successful; 13 of those tests were done in the
    last five years, and among them three succeeded.
  • (Mosher, page 39, IEEE Spectrum, 1997)
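A rough, illustrative calculation (not in the original slides) of how
little this test record pins down: with 6 successes in 20 attempts, a
simple normal-approximation interval for the underlying single-shot
success rate is still very wide.

    # Back-of-the-envelope arithmetic on the Mosher figures quoted above.
    # A sketch only, not a rigorous analysis of the test program.
    import math

    successes, trials = 6, 20
    p_hat = successes / trials                               # 0.30 observed
    half_width = 1.96 * math.sqrt(p_hat * (1 - p_hat) / trials)
    print(p_hat, p_hat - half_width, p_hat + half_width)
    # observed 0.30, with a 95% interval of roughly 0.10 to 0.50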

28
Star Wars Testing
  • No real attempts have been made to intercept
    uncooperative targets: those that make use of
    clutter, decoys, maneuver, anti-simulation, and
    other countermeasures.
  • (Mosher, page 39, IEEE Spectrum, 1997)

29
Star Wars Testing
  • Test of a powerful laser has been blocked by
    bad weather and software problems.
  • "a software problem caused the laser to
    recycle, or unexpectedly lose power ..."
  • (R. Smith, Washington Post, Oct 8, 1997)

30
Schwartz versus TRW
  • In 1996, ex-TRW engineer Nira Schwartz filed a
    False Claims Act suit, alleging that results of
    tests to distinguish warheads and decoys were
    falsified by TRW.
  • (featured on 60 Minutes II in January 2001)

31
Schwartz versus TRW
  • Schwartz claims that TRW
  • knowingly made false test plans, test
    procedures, test reports and presentations to the
    government to remain in the program.

32
Schwartz versus TRW
  • Schwartz claims
  • "I say to my boss, 'It is wrong, what we are
    doing, it is wrong.' And the next day, I was
    fired."

33
Schwartz versus TRW
  • TRW says TRW scientists and engineers devoted
    years to this complex project, while Ms.
    Schwartz, in her six months with the company ...
    Her understanding is insufficient to lend any
    credibility to her allegations.

34
Schwartz versus TRW
  • DOD criminal investigator says absolute,
    irrefutable, scientific proof that TRW's
    discrimination technology does not, cannot, and
    will not work
  • TRW knowingly covering up.

35
Schwartz versus TRW
  • DOD panel then said
  • TRW's software and sensors are well designed
    and work properly provided that the Pentagon
    does not have any wrong information about what
    kind of warheads and decoys an enemy is using.

36
Schwartz versus TRW
  • Lt. General Kadish: "Right now, from what I see,
    there is no reason to believe that we can't make
    this work. But there's a lot more testing to be
    done."

37
Schwartz versus TRW
  • Congressman Curt Weldon, R-PA
  • "If we don't build a new aircraft carrier, we
    have older ones. If we don't build a new fighter
    plane, we have older ones. If we don't build
    missile defense, we have nothing."
  • What is the premise-conclusion summary of this
    argument?

38
Schwartz versus TRW
  • Congressman Curt Weldon, R-PA
  • On 50 Nobelists' anti-BMD letter - "I don't know
    any of them that's come to Congress or me. I
    mean it's easy to get anyone to sign a letter.
    I sign letters all the time."
  • What is the premise-conclusion summary of this
    argument?

39
Schwartz versus TRW
  • Congressman Curt Weldon, R-PA
  • "There were scientists who made the case
    against Kennedy that it was crazy, we'd never
    land on the moon. And I characterize Postol now
    as one of those people."
  • What is the premise-conclusion summary of this
    argument?

40
Ethical Issues
  • What are some of the important ethical questions?
  • And what guidance do the codes of ethics give on
    these questions?

41
Ethical Issues
  • How to interact with colleagues with whom
    you disagree?
  • When to blow the whistle?
  • Should you accept work on an impossible
    project?

42
Dealing with Colleagues
  • AITP Standards of Conduct
  • In recognition of my obligation to fellow
    members and the profession, I shall cooperate with
    others in achieving understanding and in
    identifying problems.

43
Dealing with Colleagues
  • Item 5.12 of ACM / IEEE-CS
  • Software Engineering Code
  • Those managing or leading software engineers
    shall not punish anyone for expressing ethical
    concerns about a project.

44
Accept Impossible Work?
  • Item 3.2 of ACM / IEEE-CS
  • Software Engineering Code
  • Software engineers shall ensure proper and
    achievable goals and objectives for any
    project on which they work or propose.

45
Accept Impossible Work?
  • Item 1.3 of the ACM / IEEE-CS
  • Software Engineering Code
  • Software engineers shall accept software only
    if they have a well-founded belief that it is
    safe, meets specifications, passes appropriate
    tests, ...

46
Blow the Whistle?
  • AITP Standards of Conduct
  • In recognition of my obligation to society, I
    shall never misrepresent or withhold information
    that is germane to a problem or situation of
    public concern nor allow any such known
    information to remain unchallenged.

47
Blow the Whistle?
  • Item 1.4 of ACM / IEEE-CS
  • Software Engineering Code
  • Software engineers shall disclose to
    appropriate persons or authorities any
    actual or potential danger to the user,
    the public ... that they reasonably believe ...

48
Summary
  • Difficult ethical issues arise in creation
    of safety-critical software.
  • Trustworthy SDI software is more clearly
    impossible in retrospect.
  • Modern, smaller SDI-like programs appear
    more tractable.

49
National Science Foundation grant DUE
97-52792
  • Thanks to
  • for partial support of this work.

50
Computer Professionals for Social
Responsibility (www.cpsr.org)
  • Thanks to the
  • for permission to distribute digitized video
    of the debate.

51
David Parnas, Chuck Seitz
  • Thanks to
  • for commenting on a draft of the paper
    describing this module.

52
The Ronald Reagan Presidential Library
(www.reagan.utexas.edu)
  • Thanks to the
  • for help in obtaining the video of
    Reagan's 3/23/83 speech.

53
Christine Kranenburg, Laura Malave, Melissa
Parsons, Joseph Wujek
  • Thanks to
  • for technical assistance.

54
The End.
55
Overheads from Parnas's Presentation
  • The next slides are transcribed versions of (most
    of) the transparencies in Parnas's presentation.

56
Why is it important that the software can never
be trusted?
  • We will make decisions as if it was not there.
  • They will make decisions as if it might work.

57
A necessary condition for trustworthy engineering
products is validation by
  • Mathematical analysis, or
  • Exhaustive case analysis, or
  • Prolonged, realistic testing
  • or a combination of the above

58
Why software is always the unreliable glue in
engineering systems
  • The best mathematical tools require that a system
    be described by continuous functions
  • Exhaustive case analysis can only be used when
    the number of states is small or the design
    exhibits a repetitive structure
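A back-of-the-envelope illustration of the state-count point above,
using an assumed (and deliberately small) amount of program state:

    # Why exhaustive case analysis breaks down: the number of distinct
    # states grows exponentially with the amount of state. Assumed size,
    # for scale only.
    bits_of_state = 1_000            # tiny by the standards of a large system
    states = 2 ** bits_of_state
    print(len(str(states)))          # about 302 decimal digits of states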

59
Why do we have some usable software?
  • Sometimes the requirements allow untrustworthy
    software
  • There has been extensive use under actual
    conditions
  • Operating conditions are controlled or
    predictable
  • Backup manual system available when needed

60
What makes the SDI software much more difficult
than other projects?
  • Lack of reliable information on target and decoy
    characteristics
  • Distributed computing with unreliable nodes and
    unreliable channels
  • Distributed computing with hard real-time
    deadlines
  • Physical distribution of redundant real-time data
  • Hardware failures will not be statistically
    independent

61
What makes the SDI software much more difficult
than other projects?
  • Redundancy is unusually expensive
  • Information essential for real-time scheduling
    will not be reliable
  • Very limited opportunities for realistic testing
  • No opportunities for repairing software during
    use
  • Expected to be the largest real-time system ever
    attempted, frequent changes are anticipated

62
Software Espionage and Nuclear Blackmail
  • Fact: Software systems, because of their rigid
    predetermined behaviors, are easily defeated by
    people who understand the programs
  • Fact: Changes in large software systems must be
    made slowly and carefully with extensive review
    and testing

63
What about new Soft. Eng. techniques?
  • Precise requirement documents
  • Abstraction/information hiding
  • Formal specifications
  • The use of these techniques requires previous
    experience with similar systems
  • Co-operating sequential processes requires
    detailed information for real-time scheduling
  • Structured programming reduces but does not
    eliminate errors

64
What about Artificial Intelligence?
  • AI-1 - Defined as solving hard problems.
  • Study the problem, not the problem solver.
  • No magic techniques, just good solid program
    design.
  • AI-2 - Heuristic or Rule Based Programming/Expert
    Systems
  • Study the problem solver, not the problem
  • Ad hoc, cut and dry programming
  • Little basis for confidence

65
What about new programming languages?
  • No magic
  • They help if they are simple and well understood
  • No breakthroughs
  • The fault lies not in our tools but in ourselves
    and in the nature of our product.

66
What about automatic programming?
  • Since 1948, a euphemism for programming in a new
    language?

67
What about program verification?
  • The right problem but do we have a solution?
  • What's a big program?
  • Wrong kind of program? How do you verify a model
    of the earth's gravitational field?
  • Implicit assumption of perfect arithmetic
  • What about language semantics?
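On the "perfect arithmetic" point: verification typically reasons over
ideal real numbers, while deployed code runs on finite floating point.
A minimal, generic illustration (standard Python, nothing SDI-specific):

    # Floating-point arithmetic is not the ideal arithmetic a proof may assume.
    print(0.1 + 0.2 == 0.3)          # False: the sum is 0.30000000000000004
    print(1e16 + 1.0 - 1e16)         # 0.0: the added 1.0 is lost to rounding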

68
Is there a meaningful concept of tolerance for
software?
  • The engineering notion of tolerance depends on
    an assumption of continuity.
  • Statistical measures of program quality are
    limited in their application to situations where
    individual failures are not important.
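A small sketch of why the engineering tolerance notion does not carry
over: a continuous function moves only slightly under a slight input
error, while a branch in a program can flip completely. The threshold
example below is invented for illustration.

    # Continuity vs. discrete branching under a tiny input perturbation.
    def continuous(x):
        return 2.0 * x                       # small error in -> small error out

    def branching(x):
        # a threshold test of the kind buried throughout large programs
        return "engage" if x >= 1.0 else "hold"

    print(continuous(1.0), continuous(1.0 + 1e-9))   # 2.0 vs 2.000000002
    print(branching(1.0), branching(1.0 - 1e-9))     # 'engage' vs 'hold'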

69
Overheads from Seitz's Presentation
  • The next slides are transcribed versions of (most
    of) the transparencies in Seitz's presentation.

70
From "The Strategic Defense Initiative," White
House pamphlet dated Jan. 1985.
  • SDI's purpose is to identify ways to exploit
    recent advances in ballistic missile defense
    technologies that have potential for
    strengthening our security and that of our
    Allies. The program is designed to answer a
    number of fundamental scientific and engineering
    questions that must be addressed before the
    promise of these new technologies can be fully
    assessed. The SDI program will provide to a
    future president and a future congress the
    technical knowledge necessary to support a
    decision in the early 1990s on whether to
    develop and deploy advanced defensive systems.

71
From 1985 Report to the Congress on the Strategic
Defense Initiative (Section III)
  • The goal of the SDI is to conduct a program of
    rigorous research focused on advanced defensive
    technologies.
  • The SDI seeks, therefore, to exploit emerging
    technologies that may provide options for a
    broader-based deterrence by turning to a greater
    reliance on defensive systems

72
From 1985 Report to the Congress on the Strategic
Defense Initiative (Section III)
  • It should be stressed that the SDI is a research
    program that seeks to provide the technical
    knowledge required to support a decision on
    whether to develop and later deploy these
    systems. All research efforts will be fully
    compliant with U.S. treaty obligations.

73
  • Weapons
  • Incapable of causing damage at Earth's surface
  • Range 1000 km.
  • Partial deployment ineffective in boost phase
  • Sensors
  • Some located in high orbits
  • Can be passive
  • Useful in early deployments
  • Battle Management System
  • Computers and communication

74
Coordination
  • Lowest Level - stereo and sensor fusion
  • Middle Levels - target discrimination, attack and
    coordination
  • High Levels - assignment of priorities of targets
    in midcourse in order to prevent particular areas
    from being overwhelmed in terminal defense, or to
    prevent any single area from accepting too high a
    concentration for terminal defense
  • Top Level - command and control decisions

75
Conclusions of the Panel
  • The feasibility of the battle management
    software and our ability to test, simulate, and
    modify the system are very sensitive to the
    choice of system architecture. In particular, the
    feasibility of the BMS software is much more
    sensitive to the system architecture than it is
    to the choice of software engineering technique

76
Conclusions of the Panel
  • Software technology is developing against what
    appears today to be relatively inflexible limits
    in the complexity of systems. The tradeoffs
    necessary to make the software tractable are in
    the system architecture

77
Conclusions of the Panel
  • We must prefer an unconventional system
    architecture whose programming is within the
    anticipated limits of software engineering over
    reliance on radical software development
    approaches and the risk that we could not develop
    reliable software at any cost

78
Conclusions of the Panel
  • One promising class of system architectures for a
    strategic defense system comprises those that are
    less dependent on tight coordination, because of
    the ability to infer the performance of
    full-scale deployment by evaluating the
    performance of small parts of the system.