February 22 – Intro to Software Engr – Lecture Notes

Finishing up “A Rational Design Process”

 

Role of Documentation in the Process

  • Typically bad – hard to use
    • Poor organization: stream of consciousness, stream of execution
    • Boring prose: words where figures or pictures would serve better
    • Confusing and inconsistent terminology
    • Myopia
  • How do we avoid these problems?
    • View documentation as a primary product
    • Do checks for completeness and consistency
    • Becomes knowledge base, extends use
    • Make documentation a standard
    • Provide dictionaries

Faking the Process

  • Produce the documents that would’ve been produced if we followed the ideal process
  • Complete and accurate
  • Mathematicians polish proofs – polish the document (prose)
  • Understanding, not a report

 

Alternative – the design document

 

“A Guided Tour of Program Design Methodologies” (Bergland)

 

Design/Architecture - system project

Ideal Process is not achievable, design document is

 

Literate programming – combining documentation and code, with tools that would extract and compile code from documents (“How’s your organ?”)
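The "tangle" step described above – a tool pulling compilable code out of a prose document – can be sketched in a few lines. This is a hedged illustration, not any real literate-programming tool: the `<<name>>=` / `@` chunk delimiters are borrowed from noweb, and the `DOCUMENT`, `tangle`, and `greet` names are invented.

```python
import re

# Minimal sketch of literate programming's "tangle" step: code chunks
# live inside prose, and a tool extracts them for compilation.
# The chunk delimiters (<<name>>= ... @) follow noweb's convention;
# the document below is a made-up example.
DOCUMENT = """
The greeting is kept deliberately simple.

<<greet>>=
def greet(name):
    return "Hello, " + name
@

Prose may continue between chunks.
"""

def tangle(doc):
    # Collect every block between a "<<name>>=" line and a lone "@" line.
    chunks = re.findall(r"<<\w+>>=\n(.*?)\n@", doc, re.DOTALL)
    return "\n".join(chunks)

source = tangle(DOCUMENT)
namespace = {}
exec(source, namespace)           # "compile" the extracted code
print(namespace["greet"]("Ada"))  # -> Hello, Ada
```

A real tool (WEB, noweb) also "weaves" the same document into typeset documentation, so code and its explanation never drift apart.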

 

SWE Techniques

  • Program Structure (structural, oo, abstract data types)
  • Developers’ Processes (way of coordinating)
  • Support tools

 

Design Techniques

  • functional decomposition
  • data flow design/analysis
  • data structure design
  • programming calculus (Dijkstra)

 

Code level

  • Abstraction
  • Communication
  • Clarity
  • Control flow

Module level

  • cohesion
  • minimize coupling
  • complexity
  • correctness
  • correspondence

 

System Level

  • consistent interfaces
  • connectivity
  • continuity/change/chaos
  • optimization and packaging

 

Types of Cohesion – from greatest to least

  • Functional (integral)
  • Sequential (data flow)
  • Communicational (common data)
  • Procedural (flow chart)
  • Temporal (same time)
  • Logical (similar functions)
  • Coincidental (random)
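The two ends of the scale are easiest to see in code. A hedged sketch, with both functions invented for illustration:

```python
# Functional cohesion (top of the scale): every statement contributes
# to one integral task -- computing a standard deviation.
def standard_deviation(xs):
    n = len(xs)
    mean = sum(xs) / n
    return (sum((x - mean) ** 2 for x in xs) / n) ** 0.5

# Coincidental cohesion (bottom of the scale): unrelated actions
# grouped by accident, as in a "miscellaneous utilities" routine.
def misc(text, xs):
    banner = text.upper()      # string formatting
    total = sum(xs)            # arithmetic on an unrelated argument
    return banner, total       # two unrelated results bundled together

print(standard_deviation([2, 4, 4, 4, 5, 5, 7, 9]))  # -> 2.0
```

The in-between levels (sequential, communicational, procedural, temporal, logical) are progressively weaker reasons for the statements to live together.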

 

Functional Decomposition Design

  • divide and conquer
    • stepwise decomposition
    • top-down (or bottom up)
    • gradual refinement
    • gradual detail
  • design strategy
    • define intended function
    • subdivide
    • connect
    • check
    • create subcomponent
    • continue until you feel comfortable
  • think of what you’re doing as a language
    • nouns, verbs – programming language allows for sequences of those, to allow for discourse in that language
    • like a file system – open, read/write, close
  • higher-level machines (virtual machines – next paper)
    • bottom-up, building starting with hardware
    • build up layers of higher-level languages around that
  • problems with this approach
    • decompose with respect to what?  What are the criteria?  E.g. time (initialize/process/finalize – temporal cohesion) or access (communicational cohesion) or data flow (sequential cohesion)
  • advantage – general applicability
  • disadvantage – unpredictability and variability
  • McDonald’s example
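The design strategy above – define the intended function, subdivide, connect, check, and refine until each piece is comfortable – can be sketched with a toy report-printing problem. The problem and every function name here are invented for illustration:

```python
# Stepwise refinement sketch: start from the intended top-level
# function and subdivide until each piece is simple enough to code.

def produce_report(lines):            # the intended function
    data = read_input(lines)          # subdivide into subcomponents ...
    results = process(data)
    return format_output(results)

def read_input(lines):                # ... then refine each one in turn
    return [int(s) for s in lines]

def process(values):
    return {"count": len(values), "total": sum(values)}

def format_output(results):
    return f"{results['count']} items, total {results['total']}"

print(produce_report(["3", "4", "5"]))  # -> 3 items, total 12
```

Note the criteria question from the bullet above: this particular decomposition happens to follow the data flow (sequential cohesion); decomposing the same problem by time (initialize/process/finalize) would have given weaker, temporal cohesion.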

 

Data Flow Design

  • functional decomposition with regards to data
  • data flows + transforms
  • design strategy
    • draw data flow graph
    • model of the problem environment
    • transforms -> program structure
    • data "is composed of" data; functional "uses": consumers/producers
  • shortcomings
    • produces a network, not a hierarchy
    • can be solved by picking a node, "shaking it", and letting the network fall into a hierarchy
  • method – sequential cohesion
    • model program as data flow
    • identify afferent, efferent, and transform elements
    • factor into hierarchy
    • refine and optimize
  • problems
    • initialization/termination – boundary
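The afferent/transform/efferent factoring in the method above can be sketched on a toy grading problem. This is a hedged illustration only – the record format and all names (`get_valid_record`, `to_grade`, `put_result`) are invented:

```python
# Data-flow design sketch: afferent (input-gathering), transform, and
# efferent (output-delivering) elements, factored into a hierarchy
# under one coordinating root.

def get_valid_record(line):          # afferent: bring data in, validated
    name, score = line.split(",")
    return name.strip(), int(score)

def to_grade(score):                 # central transform
    return "pass" if score >= 60 else "fail"

def put_result(name, grade):         # efferent: send data out
    return f"{name}: {grade}"

def main(lines):                     # hierarchy root schedules the flow
    out = []
    for line in lines:
        name, score = get_valid_record(line)
        out.append(put_result(name, to_grade(score)))
    return out

print(main(["ann, 72", "bob, 44"]))  # -> ['ann: pass', 'bob: fail']
```

The boundary problem from the last bullet shows up even here: where setup and teardown (opening files, printing headers) belong is not answered by the data flow itself.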

 

 

*** Midterm exam at the beginning of Saturday's class: 8 T/F and 8 multiple-choice questions are less than half of the test; 4 essay questions are more than half. Covers up through Deployment, Maintenance, and Configuration Management (no paper, just notes from class). In chapter 18 (p. 227) Brooks outlines each chapter of The Mythical Man-Month. ***

 

Data Structure Design (Michael Jackson of UK, Warnier of France)

  • closest thing we have to a true method…  can yield the same result
  • basic idea
    • we view the world through data structures
    • a correct model of the data structure can be transformed into a program or system that incorporates a correct model of the world
  • basic relationship: “is composed of”
  • Reasons to use this design:
    • since the data structure spec can easily be viewed as correct or incorrect (i.e. does it reflect the problem), then the program should be as well
    • two people solving the same problem should come up with programs that are essentially the same
  • design strategy
    • form a system network diagram that models the problem environment
    • define and verify the data-stream structures
    • derive and verify the program structures
    • derive and allocate the elementary operations
    • write the structure text and program text
  • utility
    • complex problem becomes a network of hierarchies, each of which represents a simpler problem
    • program inversion – schedule the network (“is called by”)

*** Two research paradigms:

  • tool builder – "I did it"
  • process/metrics – "I had a dream" ***

 

  • problems
    • structure clash – the input and output data structures don't correspond, so you can't form the combinations you want
    • elementary operations – performed on data structures, driven by data structures, so see above
    • basically, we develop bottom-up: clear for small problems, not clear for big ones
  • critique
    • data diagrams and structure text should be kept as permanent
    • tool support used to be lacking
  • example
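A toy version of the classic example: the input "is composed of" groups, each group of a lead card followed by detail cards, and the program structure is derived directly from that data structure. The card format (`#` marks a lead card) and all names are invented for this sketch:

```python
# Data structure design sketch: the program's control structure mirrors
# the input's "is composed of" hierarchy.
#   file  = iteration of groups
#   group = lead card + iteration of detail cards

def process_file(cards):             # file = iteration of groups
    totals = {}
    i = 0
    while i < len(cards):
        i, key, total = process_group(cards, i)
        totals[key] = total
    return totals

def process_group(cards, i):         # group = lead card + details
    key = cards[i][1:]               # lead card, e.g. "#A"
    i += 1
    total = 0
    while i < len(cards) and not cards[i].startswith("#"):
        total += int(cards[i])       # detail card
        i += 1
    return i, key, total

print(process_file(["#A", "1", "2", "#B", "5"]))  # -> {'A': 3, 'B': 5}
```

Because the loops come straight from the data diagram, two designers modeling the same input should arrive at essentially this same structure – the repeatability claimed above.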

 

Programming Calculus

  • program and proof together (good for highly-critical systems)
  • strategy
    • formally specify the result in predicate calculus
    • given the post condition, derive the preconditions, working backwards
    • language statements viewed as “predicate transformers”
      • {A} B {C} – precondition A, statement B, postcondition C
    • from proof structure, derive program
  • problems
    • difficult – loop invariants
    • scaling up is very hard (time consuming)
    • programs are correct, but sometimes proofs are not
  • utility – critical systems
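A worked instance of deriving preconditions backwards, shown on a tiny swap program. The program and its predicates are invented; the substitution rule used is Dijkstra's weakest precondition for assignment, wp(x := e, Q) = Q with e substituted for x:

```python
# Working backwards through predicate transformers:
#
# Program:  t := x;  x := y;  y := t
# Post:     x == B and y == A
# Back through  y := t :  x == B and t == A
# Back through  x := y :  y == B and t == A
# Back through  t := x :  y == B and x == A   <- derived precondition

def swap(x, y):
    t = x
    x = y
    y = t
    return x, y

# Spot-check the triple {x == A and y == B} swap {x == B and y == A}:
A, B = 1, 2
x, y = swap(A, B)
assert x == B and y == A
```

The hand derivation is trivial here; the scaling problem noted above is that loops require invariants, which no substitution rule hands you mechanically.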

 

Comparison

Functional design:

  • process is basic
  • favors additions that can be keyed to the beginning of a group

Data Flow design:

  • difference between the lead card and last card in a group
  • favors changes at end
  • sequential cohesion

Data Structure design:

  • keying of groups is automatic within the control structure
  • everything looks clean
  • further additions at the beginning or the end

Programming Calculus design:

  • looks clean, problems if beginning-of-operation additions are required

 

Critique

FD: 

  • been around a long time, comes naturally
  • if not careful, yields logical cohesion (lowest form of cohesion)
  • telescoping
  • number of possible decompositions is large
  • decompose with respect to what?
  • Difficult to come up with same solution

DF:

  • came later, not used as extensively
  • concepts of coupling and cohesion are important
  • useful hierarchical structure (a little bit artificially)
  • often get a lot of artificial data passing
  • reverts to functional decomposition in the central transform
  • starts by modeling the data flow, and only then worries about program structure
  • still an art

 

DS:

  • used in Europe mostly
  • closest to a true method:  teachable, repeatable, reliable
  • program structure models problem structure
  • network of multiple hierarchies
  • difficult to assess cohesion, sometimes functional, sometimes communicational
  • model before construction
  • deriving the “correct” structures can be difficult

PC:

  • requires mathematical maturity, even for simple programs
  • mathematical proofs are longer than programs sometimes
  • multiple designs possible
  • “correct” program may have the wrong structure
  • still useful for certain cases

 

 

“Designing Software for Ease of Extension and Contraction” (Parnas)

SW as families (product line architectures)

 

Common problems

  • different hardware, OS, platforms; platforms change
  • same function, different data format
  • different algorithms
  • different resources
  • different frequency of data or events
  • need only a subset of the functionality
  • different performance/reliability
  • different standards – telephony

 

Distinguish - commonality and variability

Goals

  • reduce maintenance
  • increase flexibility
  • ease of change

 

Software – not subsetable, not extensible … single-use programs, monolithic programs

Problems

  • excessive information distribution
  • chains of data transforming components
  • components that perform more than one function
  • loops in the “uses” relation

 

Steps Toward a Better Structure

  • Requirements definition – anticipate the change before you design
  • Information hiding:  interface and module definition
    • Identify secret items
    • Localize
    • Create abstract logical interface, insensitive to change
  • Make things that are going to change rapidly/regularly data (logical interface)
  • The virtual machine (VM) concept
  • Designing the "uses" structure
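The information-hiding steps above can be sketched with a small module whose secret is its storage format. Both classes and their names are invented; the point is only that the abstract logical interface stays fixed while the secret changes:

```python
# Information-hiding sketch: identify the secret (how symbols are
# stored), localize it in one module, and expose an abstract logical
# interface (put/get) that is insensitive to the change.

class SymbolTable:
    """Abstract interface: put/get; says nothing about storage."""
    def __init__(self):
        self._items = {}          # the secret: a dict today ...
    def put(self, name, value):
        self._items[name] = value
    def get(self, name):
        return self._items[name]

class SortedSymbolTable(SymbolTable):
    """Same interface; the secret changed to a sorted list of pairs."""
    def __init__(self):
        self._pairs = []
    def put(self, name, value):
        self._pairs = sorted([p for p in self._pairs if p[0] != name]
                             + [(name, value)])
    def get(self, name):
        for n, v in self._pairs:
            if n == name:
                return v
        raise KeyError(name)

# Clients never see the secret, so either representation works:
for table in (SymbolTable(), SortedSymbolTable()):
    table.put("x", 42)
    assert table.get("x") == 42
```

Swapping the representation touches one module and zero callers – exactly the ease of change the goals above ask for.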

** Researchers tend to overstate what they do, and then have it downgraded by their colleagues

Developers tend to understate what they’ve been doing, and then have their colleagues upgrade it.