1
IE 3265 R. Lindeke, Ph. D.
  • Quality Management in POM, Part 2

2
Topics
  • Managing a Quality System
  • Total Quality Management (TQM)
  • Achieving Quality in a System
  • Look early and often
  • 6 Sigma: an approach, a technique
  • Make it a part of the process
  • The Customer's Voice in Total Quality Management
  • QFD and the House of Quality
  • Quality Engineering
  • Loss Function
  • Quality Studies
  • Experimental Approaches
  • T.M., FMEA, Shainin

3
Taguchi's Loss Function
  • Taguchi defines the Quality Level of a product as the
    Total Loss incurred by society due to the failure of
    a product to perform as desired when it deviates
    from the delivered target performance levels.
  • This includes costs associated with poor
    performance, operating costs (which change as a
    product ages), and any added expenses due to
    harmful side effects of the product in use.

4
Exploring the Taguchi Method
  • Considering the Loss Function, it is quantifiable;
    Taguchi distinguishes three cases (written out below)
  • Larger is Better
  • Smaller is Better
  • Nominal is Best
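For reference, the standard Taguchi loss formulations for these three cases, with y the measured response, m the target, and k a cost constant, are:

```latex
L(y) = k\,(y - m)^2   \quad \text{(nominal is best)}
L(y) = k\,y^2         \quad \text{(smaller is better)}
L(y) = k / y^2        \quad \text{(larger is better)}
```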

5
Considering the Cost of Loss
  • k in the L(y) equation is found from the loss A0
    incurred when the product sits exactly at the
    customer tolerance limit Δ0 (see below)
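The standard form of this constant, consistent with the example that follows (A0 = $2 at Δ0 = 0.05 gives k = 800), is:

```latex
k = \frac{A_0}{\Delta_0^{\,2}}
```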

6
Loss Function Example (nominal is best)
  • We can define a process's average (expected) loss
    as written below
  • s is the process (product) Standard Deviation
  • ybar is the process (product) mean
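In the slide's notation (target m), the standard nominal-is-best average loss is:

```latex
E[L(y)] = k\left[s^2 + (\bar{y} - m)^2\right]
```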

7
Example (cont.)
  • A0 is $2 (a very low number for this type of loss!),
    found by estimating that the loss is 10% of the $20
    product cost when a part measures exactly 8.55 or
    8.45 units
  • The process specification is 8.5 ± 0.05 units
  • Historically, ybar = 8.492 and s = 0.016

8
Example (cont.)
  • Average Loss per unit (worked out below)
  • If we make 250,000 units a year,
  • the Annual Loss is $64,000
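Working the numbers through the formulas above:

```latex
k = \frac{A_0}{\Delta_0^{\,2}} = \frac{2}{0.05^2} = 800
E[L(y)] = 800\left[(8.492 - 8.5)^2 + 0.016^2\right] = \$0.256 \text{ per unit}
250{,}000 \times \$0.256 = \$64{,}000 \text{ per year}
```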

9
Fixing It
  • Shift the Mean to nominal
  • Reduce variation (s = 0.01)
  • Fix Both! (compared in the sketch below)
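A minimal sketch in Python comparing the annual loss under each option, using the same nominal-is-best loss model and the example's numbers (the "fixed" values are the ones named above):

```python
# Compare annual loss under each fix using Taguchi's nominal-is-best loss model.
A0, delta0 = 2.0, 0.05            # $ loss at the tolerance limit, tolerance half-width
k = A0 / delta0 ** 2              # loss constant: k = A0 / delta0^2 = 800
m, volume = 8.5, 250_000          # target value, annual production volume

def annual_loss(ybar, s):
    """Expected annual loss = volume * k * [(ybar - m)^2 + s^2]."""
    return volume * k * ((ybar - m) ** 2 + s ** 2)

print(round(annual_loss(8.492, 0.016)))  # current process           -> 64000
print(round(annual_loss(8.500, 0.016)))  # shift the mean to nominal -> 51200
print(round(annual_loss(8.492, 0.010)))  # reduce variation (s=0.01) -> 32800
print(round(annual_loss(8.500, 0.010)))  # fix both                  -> 20000
```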

10
Taguchi Methods
  • Help companies to perform the Quality Fix!
  • Quality problems are due to Noises in the product
    or process system
  • Noise is any undesirable effect that increases
    variability
  • Conduct extensive Problem Analyses
  • Employ Inter-disciplinary Teams
  • Perform Designed Experimental Analyses
  • Evaluate Experiments using ANOVA and
    Signal-to-Noise techniques

11
Defining the Taguchi Approach
  • The Point, Then, Is To Produce Processes Or
    Products That Are ROBUST AGAINST NOISES
  • Don't spend the money to eliminate all noise;
    build designs (product and process) that can
    perform as desired (low variability) in the
    presence of noise!
  • WE SAY: ROBUSTNESS = HIGH QUALITY

12
Defining the Taguchi Approach
  • Noise Factors Cause Functional Variation
  • They Fall Into Three Classes:
  • 1. Outer Noise: Environmental Conditions
  • 2. Inner Noise: Lifetime Deterioration
  • 3. Between-Product Noise: Piece-To-Piece
    Variation

13
Taguchi Method is Step-by-Step
14
Defining the Taguchi Approach
  • TO RELIABLY MEET OUR DESIGN GOALS MEANS
    DESIGNING QUALITY IN!
  • We find that Taguchi considered THREE LEVELS OF
    DESIGN:
  • Level 1: SYSTEM DESIGN
  • Level 2: PARAMETER DESIGN
  • Level 3: TOLERANCE DESIGN

15
Defining the Taguchi Approach: System Design
  • All About Innovation: New Ideas, Techniques,
    Philosophies
  • Application Of Science And Engineering Knowledge
  • Includes Selection Of
  • Materials
  • Processes
  • Tentative Parameter Values

16
Defining the Taguchi Approach: Parameter Design
  • Tests For Levels Of Parameter Values
  • Selects "Best Levels" For Operating Parameters to
    be Least Sensitive to Noises
  • Develops Processes Or Products That Are Robust
  • A Key Step To Increasing Quality Without
    Increased Cost

17
Defining the Taguchi Approach: Tolerance Design
  • A "Last Resort" Improvement Step
  • Identifies Parameters Having the greatest
    Influence On Output Variation
  • Tightens Tolerances On These Parameters
  • Typically Means Increases In Cost

18
Selecting Parameters for Study and Control
  • Select The Quality Characteristic
  • Define The Measurement Technique
  • Enumerate, Consider, And Select The Independent
    Variables And Interactions, via:
  • Brainstorming
  • Shainin's technique, where they are determined by
    looking at the products
  • FMEA: failure mode and effects analysis

19
Preliminary Steps in Improvement Studies
  • To Adequately Address The Problem At Hand We
    Must:
  • 1. Understand Its Relationship With The Goals We
    Are Trying To Achieve
  • 2. Explore/Review Past Performance and compare to
    desired Solutions
  • 3. Prepare An 80/20 Or Pareto Chart Of These Past
    Events
  • 4. Develop A "Process Control" Chart -- This
    Helps To Better See The Relationship between
    Potential Control And Noise Factors
  • A Wise Person Can Say: A Problem Well Defined Is
    Already Nearly Solved!!

20
Going Down the Improvement Road
  • Start By Generating The Problem Candidates List
  • Brainstorm The Product Or Process
  • Develop Cause And Effects (Ishikawa) Diagrams
  • Use Process Flow Charts To Stimulate Ideas
  • Develop Pareto Charts For Quality Problems

21
DEVELOPING A Cause-and-Effect Diagram
  • 1. Construct A Straight Horizontal Line (Right
    Facing)
  • 2. Write Quality Characteristic At Right
  • 3. Draw 45° Lines From the Main Horizontal (4 Or 5)
    For the Major Categories: Manpower, Materials,
    Machines, Methods And Environment
  • 4. Add Possible Causes By Connecting Horizontal
    Lines To the 45° "Main Cause" Rays
  • 5. Add More Detailed Potential Causes Using
    Angled Rays To the Horizontal Possible-Cause Lines

22
Generic Fishbone CE Diagram
23
Building the Experiment: Working From a
Cause-Effect Diagram
24
Designing A Useful Experiment
  • Taguchi methods use a "cookbook" approach:
    building experiments for selected factors on the
    CE Diagram
  • Selection is from a discrete set of Orthogonal
    Arrays
  • Note: an orthogonal array (OA) is a special
    fractional factorial design that allows study of
    main factors and 2-way interactions (an example
    is sketched below)
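For illustration only (this is not the array used in the course example), the smallest standard array, L4(2^3), studies three two-level factors in four runs; every pair of columns contains each level combination exactly once:

```latex
\begin{array}{c|ccc}
\text{Run} & A & B & C \\ \hline
1 & 1 & 1 & 1 \\
2 & 1 & 2 & 2 \\
3 & 2 & 1 & 2 \\
4 & 2 & 2 & 1
\end{array}
```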

25
T.M. Summary
  • Taguchi methods (TM) are product or process
    improvement techniques that use DOE methods for
    improvements
  • A set of cookbook designs are available and
    they can be modified to build a rich set of
    studies (beyond what we have seen in MP labs!)
  • TM requires a commitment to complete studies and
    the discipline to continue in the face of
    setbacks (as do all quality improvement methods!)

26
Simplified DOE
  • Shainin Tools: these are a series of steps to
    logically identify the root causes of variation
  • These tools are simple to implement,
    statistically powerful, and practical
  • The initial step is to sample product (over time) and
    examine the sample lots for variability to
    identify causative factors; this step is called
    the multi-vari chart approach
  • Shainin refers to root-cause factors as the Red
    X, Pink X, and Pale Pink X causes

27
Shainin's Experimental Approaches to Quality
Variability Control
28
Shainin Ideas: exploring further
  • Red X: the primary cause of variation
  • Pink X: the secondary causes of variation
  • Pale Pink X: significant but minor causes of
    variation (a factor that still must be
    controlled!)
  • Any other factors should be substituted by
    lower-cost solutions (wider tolerances, cheaper
    material, etc.)

29
Basis of Shainin's Quality Improvement Approaches
  • As Shainin said: "Don't ask the engineers, they
    don't know; ask the parts"
  • Contrast this with the Brainstorming approach of
    the Taguchi Method
  • Multi-Vari is designed to identify the likely
    home of the Red X factors, not necessarily the
    factors themselves
  • Shainin suggests that we look into three
    source-of-variation regimes:
  • Positional
  • Cyclical
  • Temporal

30
Does the mean shift in time or between products,
or is the product (alone) showing the variability?
31
Positional Variations
  • These are variations within a given unit (of
    production)
  • Like porosity or cracks in castings
  • Or across a unit with many parts, like a
    transmission, turbine, or circuit board
  • Could be variations by location in batch-loading
    processes
  • Cavity-to-cavity variation in plastic injection
    molding, etc.
  • Various tele-marketers at a fund raiser
  • Variation from machine-to-machine,
    person-to-person, or plant-to-plant

32
Cyclical Variation
  • Variation between consecutive units drawn from a
    process (consider calls on a software help line)
  • Variation AMONG groups of units
  • Batch-to-batch variations
  • Lot-to-lot variations

33
Temporal Variations
  • Variations from hour-to-hour
  • Variation shift-to-shift
  • Variations from day-to-day
  • Variation from week-to-week
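A minimal sketch in Python, with made-up measurements (nothing here comes from the slides), of how a multi-vari breakdown separates the three families just described; the largest family is the likely "home" of the Red X:

```python
# measurements[time_slot][unit] = readings taken at different positions on that unit
measurements = {
    "08:00": {"u1": [8.49, 8.50, 8.48], "u2": [8.50, 8.51, 8.49], "u3": [8.48, 8.49, 8.50]},
    "10:00": {"u1": [8.47, 8.49, 8.48], "u2": [8.49, 8.48, 8.50], "u3": [8.50, 8.49, 8.48]},
    "12:00": {"u1": [8.52, 8.53, 8.51], "u2": [8.53, 8.54, 8.52], "u3": [8.52, 8.53, 8.54]},
}

def mean(xs):
    return sum(xs) / len(xs)

# Positional: largest spread seen within any single unit
positional = max(max(r) - min(r) for units in measurements.values() for r in units.values())

# Cyclical: largest spread of unit means within any single time slot
cyclical = max(
    max(mean(r) for r in units.values()) - min(mean(r) for r in units.values())
    for units in measurements.values()
)

# Temporal: spread of the time-slot means
slot_means = [mean([x for r in units.values() for x in r]) for units in measurements.values()]
temporal = max(slot_means) - min(slot_means)

# The largest of the three families points to where the Red X lives
print(f"positional={positional:.3f}  cyclical={cyclical:.3f}  temporal={temporal:.3f}")
```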

34
Components Search: the prerequisites
  • The technique is applicable (primarily) in
    assembly operations where good units and bad units
    are found
  • Performance (output) must be measurable and
    repeatable
  • Units must be capable of disassembly and
    reassembly without significant change in original
    performance
  • There must be at least 2 assemblies or units:
    one good, one bad

35
The Procedure
  • Select the good and the bad unit
  • Determine the quantitative parameter by which to
    measure the units
  • Disassemble the good unit, then reassemble and
    measure it again. Disassemble, reassemble, and then
    measure the bad unit again. If the difference D
    between good and bad exceeds the d difference
    (within units) by at least 5:1, a significant and
    repeatable difference between good and bad units is
    established

36
Procedure (cont.)
  • Based on engineering judgment, rank the likely
    component problems, within a unit, in descending
    order of perceived importance.
  • Switch the top-ranked component from the good
    unit to the bad unit or assembly, with the
    corresponding component in the bad assembly going
    to the good assembly. Measure the 2 (reassembled)
    units.
  • If there is no change (the good unit stays good,
    the bad stays bad), the top guessed component (A)
    is unimportant; go on to component B
  • If there is a partial change in the two
    measurements, A is not the only important
    variable. A could be in a Pink X family. Go on to
    Component B
  • If there is a complete reversal in the outputs of
    the assemblies, A could be in the Red X family.
    There is no further need for components search.
    (A sketch of this decision logic follows.)
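A minimal sketch in Python of the three-outcome rule above; the function and its thresholds are illustrative assumptions, not the published Shainin procedure:

```python
def classify_swap(good_before, bad_before, good_after, bad_after, d_within):
    """Classify a component after swapping it between the good and bad units.

    d_within is the repeatability difference seen on disassembly/reassembly alone;
    shifts smaller than that are treated as noise (in the spirit of D:d >= 5:1).
    """
    shift_good = abs(good_after - good_before)
    shift_bad = abs(bad_after - bad_before)
    gap = abs(good_before - bad_before)

    if shift_good <= d_within and shift_bad <= d_within:
        return "unimportant: go on to the next component"
    if shift_good >= gap - d_within and shift_bad >= gap - d_within:
        return "Red X family: complete reversal, the search can stop"
    return "Pink X family: partial change, keep searching"

# Example with a good unit reading 100, a bad unit reading 60, repeatability d = 3
print(classify_swap(100, 60, 99, 62, 3))   # unimportant
print(classify_swap(100, 60, 85, 75, 3))   # Pink X family
print(classify_swap(100, 60, 61, 99, 3))   # Red X family
```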

37
Procedure (cont.)
  • Regardless of which of the three outcomes above
    is observed, restore component A to the original
    units to assure the original conditions are
    repeated. Then repeat the previous 2 steps for the
    next most important components B, C, D, etc., as
    long as each swap leads to no change or only a
    partial change
  • Ultimately, the Red X family will be ID'd (on a
    complete reversal), or two or more Pink X or Pale
    Pink X families will be identified if only partial
    reversals are observed

38
Procedure (cont.)
  • With the important variables identified, a
    "capping run", with the variables banded together
    as good or bad assemblies, must be used to verify
    their importance
  • Finally, a factorial matrix, using data generated
    during the search, is drawn to determine,
    quantitatively, the main effects and interaction
    effects.

39
Paired Comparisons
  • This is a technique like components search, but
    for products that do not lend themselves to
    disassembly (perhaps a component in a
    components search!)
  • Requires that there be several Good and Bad units
    that can be compared
  • Requires that a suitable parameter can be
    identified to distinguish Good from Bad

40
Steps in Paired Comparison
  • Randomly select one Good and one Bad unit;
    call them pair one
  • Observe the differences between the 2 units;
    these can be visual, dimensional, electrical,
    mechanical, chemical, etc. Observe using
    appropriate means (eye, optical or electron
    microscope, X-ray, spectrography,
    tests-to-failure, etc.)
  • Select a 2nd pair; observe and note as with pair
    1.
  • Repeat with additional pairs until a pattern of
    repeatability is observed between the goods and bads

41
Reviewing
  • The previous three methods are ones that
    followed directly from Shainin's "talk to the
    animals" (products) approach
  • In each, before we began actively specifying the
    DOE parameters, we collected as much information as
    we could from good or bad products
  • As stated by one user: "The product solution was
    sought for over 18 months; we talked to engineers
    and designers, we talked to engineering managers,
    even product suppliers, all without a successful
    solution, but we never talked to the parts. With
    the component search technique we identified the
    problem in just 3 days."

42
Taking the Next Step: Variables Search
  • The objective is to:
  • Pinpoint the Red X, Pink X, and one to three
    (more) critical interacting variables
  • It's possible that the Red X is due to strong
    interactions between two or more variables
  • Finally, we are still trying to separate the
    important variables from the unimportant ones
  • Variables search is a way to get statistically
    significant results without executing a large
    number of experimental runs (achieving knowledge
    at reduced cost)
  • It has been shown that this binary comparison
    technique (on 5 to 15 variables) can be
    successful in 20, 22, 24, or 26 runs vs. 256, 512,
    1024, etc. runs using traditional DOE

43
Variables Search is a 2-Stage Process:
STAGE 1
  • List the important input variables as chosen by
    engineering judgment (in descending order of
    ability to influence the output)
  • Assign 2 levels to each factor: a best and a worst
    level (within reasonable bounds)
  • Run 2 experiments, one with all factors at best
    levels, the second with all factors at worst
    levels. Run two replication sets
  • Apply the D:d ≥ 5:1 rule (as above)
  • If the 5:1 ratio is exceeded, the Red X is
    captured in the factor set tested.

44
Stage 1 (cont)
  • If the ratio is less than 5:1, the right factors
    were not chosen, or 1 or more factors have been
    reversed between their best and worst levels.
    Disappointing, but not fatal!
  • If the wrong factors were chosen, in the opinion
    of the design team, decide on new factors and rerun
    Stage 1
  • If the team believes it has the correct factors
    included, but some have reversed levels, run B
    vs. C tests on each suspicious factor to see if
    the factor levels are in fact reversed
  • One could try the selected factors (4 at a time)
    using full factorial experiments; this could be
    prone to failure too if interacting factors are
    separated during testing!

45
Moving on to Stage 2
  • Run an experiment with A at its worst level (AW)
    and the rest of the factors at their best levels (RB)
  • If there is no change from the best results of
    Stage 1, step 3, factor A is in fact unimportant
  • If there is a partial change from the best results
    toward the Worst results, A is not the only
    important factor. A could be a Pink X
  • If there is a complete reversal from the Best to
    the Worst results of Stage 1, step 3, A is the Red X
  • Run a second test with AB and RW (A at best, the
    rest at worst)
  • If there is no change from the Worst results of
    Stage 1, the top factor A is further confirmed as
    unimportant
  • If there is a partial change from the worst results
    of Stage 1 toward the Best results, A is further
    confirmed as a possible Pink X factor
  • If there is a complete reversal and the Best results
    of Stage 1 are approximated, A is reconfirmed as
    the Red X

46
Continuing Stage 2
  • Perform the same component-search swap of steps 1
    and 2 for the rest of the factors to separate the
    important from the unimportant factors
  • If no single Red X factor, but two or three Pink
    X factors, are found, perform a capping or
    validation experiment with the Pink Xs at their
    best levels (remaining factors at their worst
    levels). The results should approximate the best
    results of Step 3, Stage 1.
  • Run a second capping experiment with the Pink Xs
    at their worst levels and the rest at their best
    levels; this should approximate the worst results
    of Step 3, Stage 1.

47
Variables Search Example: Press Brake Operation
  • A press brake was showing high variability with
    a poor Cpk
  • The press brake was viewed as a "black magic"
    operation that worked sometimes, then went bad
    for no reason
  • Causes of the operational variability were hotly
    debated; issues included:
  • Raw sheet metal
  • Thickness
  • Hardness
  • Press brake factors (some of which are difficult or
    impossible to control)
  • The company investigated new press brakes but
    observed no realistic and reliable improvements
  • Even high-cost automated brakes sometimes
    produced poor results!

48
A Variables Search Was Performed
  • The goal was to consistently achieve a ±0.005
    tolerance (or closer!)
  • 6 factors were chosen (B = Best level, W = Worst level):
  • A. Punch/Die Alignment: B = Aligned, W = Not
    Specially Aligned
  • B. Metal Thickness: B = Thick, W = Thin
  • C. Metal Hardness: B = Hard, W = Soft
  • D. Metal Bow: B = Flat, W = Bowed
  • E. Ram Storage: B = Coin Form, W = Air Form
  • F. Holding Material: B = Level, W = Angled
  • Results are reported in Process Widths (which is
    twice the tolerance), in 0.001 units

49
Results
50
Continuing to Stage 2
51
Factorial Analysis: D and F
52
Factorial Analysis
53
Factorial Analysis
  • Factor F is the Red X: it has a 41.9% main effect
    on the process spread
  • Factor D is a Pink X, with a 10.9% main effect on
    the process spread
  • Their interaction is minor, with a contribution of
    4.9% to the process spread
  • With D and F controlled, using a holding fixture to
    assure levelness and a reduction in bowing (but
    with the hardness and thickness tolerances opened
    up, leading to reduced raw-metal costs), the
    process spread was reduced to 0.004 (±0.002), much
    better than the original target of ±0.005, with an
    observed Cpk of 2.5!

54
Introduction to Failure Mode and Effects Analysis
(FMEA)
  • A tool used to systematically evaluate a product,
    process, or system
  • Developed in the 1950s by the US Navy for use with
    flight control systems
  • Today it's used in several industries, in many
    applications:
  • products
  • processes
  • equipment
  • software
  • service
  • Conducted on new or existing products/processes
  • This presentation focuses on FMEA for an existing
    process

55
Benefits of FMEA
  • Collects all potential issues into one document
  • Can serve as a troubleshooting guide
  • Is a valuable resource for new employees at the
    process
  • Provides an analytical assessment of process risk
  • Prioritizes potential problems at the process
  • Total process risk can be summarized and
    compared to other processes to better allocate
    resources
  • Serves as a baseline for future improvement at the
    process
  • Actions resulting in improvements can be
    documented
  • Personnel responsible for improvements can gain
    recognition
  • Controls can be effectively implemented
  • Example: in a Horizontal Bond Process, failure
    modes were improved by 40% and causes by 37%, and
    overall risk was cut in half, in about 3 months.

56
FMEA Development
  • Assemble a team of people familiar with the process
  • Brainstorm process/product-related defects
    (Failure Modes)
  • List the Effects, Causes, and Current Controls for
    each failure mode
  • Assign ratings (1-10) for Severity, Occurrence,
    and Detection for each failure mode
  • 1 is best, 10 is worst
  • Determine the Risk Priority Number (RPN) for each
    failure mode (a small sketch follows)
  • Calculated as Severity x Occurrence x Detection
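A minimal sketch in Python of the RPN calculation; the failure modes and ratings below are invented purely for illustration:

```python
failure_modes = [
    # (failure mode, severity, occurrence, detection) -- all ratings 1 (best) to 10 (worst)
    ("Solder bridge on connector", 7, 4, 3),
    ("Mislabeled carton",          3, 2, 8),
    ("Cracked housing",            8, 2, 2),
]

# RPN = Severity x Occurrence x Detection; a higher RPN means higher priority
ranked = sorted(
    ((name, sev * occ * det) for name, sev, occ, det in failure_modes),
    key=lambda item: item[1],
    reverse=True,
)

for name, rpn in ranked:
    print(f"RPN {rpn:3d}  {name}")
```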

57
Typical FMEA Evaluation Sheet
58
Capturing the Essence of FMEA
  • The FMEA is a tool to systematically evaluate a
    process or product
  • Use this methodology to:
  • Prioritize which processes/parameters/
    characteristics to work on (Plan)
  • Take action to improve the process (Do)
  • Implement controls to verify/validate the process
    (Check)
  • Update the FMEA scores, and start focusing on the
    next highest failure mode or cause (Act → Plan)
