Transcript and Presenter's Notes

Title: Jeffrey C. Worthington, Chair, Energy and Environmental Division, American Society for Quality; OEI Director of Quality, Office of Environmental Information, U.S. Environmental Protection Agency


1
Jeffrey C. Worthington
Chair, Energy and Environmental Division
American Society for Quality
OEI Director of Quality
Office of Environmental Information
U.S. Environmental Protection Agency
Integrating Performance Measures and Quality
Systems
"straight from the mouse's mouth" - Mickey Mouse
Lorena Romero-Cedeño
Program Analyst
Office of Environmental Information
U.S. Environmental Protection Agency
ASQ 32nd Annual National Energy and Environmental
Conference, San Antonio, Texas, September 19-20,
2005
iaidq
2
Jeffrey Worthington - BIO
  • Director of Quality for the USEPA Office of
    Environmental Information. Jeff has served as
    the Director of Quality Assurance for the USEPA
    Office of Research and Development (ORD) National
    Risk Management Research Laboratory (NRMRL) and
    as the Director of Quality Assurance for TechLaw,
    Inc. He is an American Society for Quality (ASQ)
    Certified Quality Manager and ASQ Certified
    Quality Auditor. Jeff is a Senior ASQ member and a
    founding member of the Education Division; he chairs
    the ASQ Energy and Environment Division and
    participates on the ASQ Division Affairs Council.
    He is a founding member and serves on the Board
    of Directors for the recently established
    International Association for Information and
    Data Quality (IAIDQ). Jeff is a member of the
    Editorial Board of Quality Assurance, Science,
    and the Law and previously served as Editorial
    Board member for the Journal of Environmental
    Forensics, Environmental Laboratory magazine, and
    Environmental Testing and Analysis magazine.
  • He has been with the Federal Government since
    1994. Jeff supported environmental engineering
    quality at NRMRL, joining a team authoring the
    combined quality and management system for EPA's
    Environmental Technology Verification program.
    He co-led the EPA team developing EPA's
    Information Quality Guidelines. Jeff co-authored
    a peer-reviewed journal paper and received the
    USEPA Science and Technological Achievement
    Award, Level III for equating EPA policies and
    procedures to U.S. Supreme Court Sound Science
    Criteria (2002). Jeff has spoken at numerous
    national and regional conferences on the subjects
    of quality management, audit management,
    information quality planning and assessment, data
    authenticity, data quality, and data integrity.

3
Lorena Romero-Cedeño - BIO
  • Lorena R. Romero-Cedeño is the Manager of Quality
    for the Office of Planning, Resources and
    Outreach for the Office of Environmental
    Information. She graduated in 1999 from Colorado
    State University and is currently completing two
    master's degrees, in Community and Regional Planning
    and Latin American Studies, at the University of
    New Mexico. Lorena has a passion for preserving the
    integrity of environmental data when it is translated
    into Spanish for the Spanish-speaking
    public. She is currently conducting research on
    environmental data for her master's thesis and
    has two years of experience teaching Spanish
    language at the University of New Mexico.

4
DISCLAIMER
  • The opinions expressed in this technical
    presentation are those of the authors and do not
    necessarily reflect the views of the US EPA.

5
  • SPONSORED BY MICKEY...
  • WHY MICKEY?
  • Statue theme in 2005
  • He's cool!!!
  • He is a performer, so he must know something about
    performance measurement
  • He is a genius
  • Wears the best clothes
  • Never ages
  • Is very popular
  • Is everywhere
  • Has a lot of friends
  • Certified quality inspector (usually wears white
    gloves)

Mickey Mouse
6
  • MICKEY FACTS
  • Cherry Blossoms
  • Age
  • Oscar
  • First words
  • First cartoon character to ___
  • Quotes

Mickey Mouse
7
Lobsta' Mickey
8
Information Mickey
The Mouse is in The House
9
Quality Mickey
Q
Arithmetic is being able to count up to
twenty.... without taking your shoes off
10
Selected Performance Measurement Systems Used
in EPA and the Federal Government
  • GPRA - Government Performance and Results Act
  • FMFIA - Federal Managers' Financial Integrity Act
    (management controls)
  • EVM - Earned Value Management
  • CPIC - Capital Planning and Investment Control
    (especially information technology)
  • Balanced Scorecard

11
GPRA - Government Performance and Results Act
  • Mandated by OMB
  • Effort to identify and track project/program/process
    outcomes (vs. outputs)
  • Managed at a high level in the organization
  • In EPA, may include
  • QA/QC procedures
  • Data quality reviews
  • Data limitations
  • Error estimates
  • Each office negotiates its own list of measures
    with OCFO and OMB
  • Annual performance plan submittals (www.omb.gov)

12
Outcomes vs. Outputs
  • Outputs - productivity or efficiency metrics, such
    as the number of reports written per month
  • Outcomes - the impact that the output has on the
    success of the process or organization

13
FMFIA - Federal Managers' Financial Integrity Act
(management controls)
  • FMFIA - Act of 1982
  • Agency weaknesses vs.
  • Annual Management Accomplishments and Challenges
    report
  • Offices self-identify the adequacy of management
    controls
  • OIG, GAO, and OMB comments may be part
    of the overview
  • Action plan to make corrections, with a schedule
    and measures for the corrections

14
EVM - Earned Value Management
  • Standard - ANSI/EIA-748-A, Standard for Earned
    Value Management Systems
  • EV - the value of completed work expressed in terms
    of the budget assigned to that work
  • Objective measure of work accomplished
  • Based on budgeted value of the work
  • What you got for what it cost you.
  • Compliance standard criteria grouped in 5 areas
  • Planning, scheduling, and budgeting
  • Organization
  • Analysis and management reports
  • Revisions and data maintenance
  • Accounting considerations
  • Performance measurement
  • Establish a baseline
  • Monitor in time units
  • Review both cost variance and schedule variance
    from baseline (see the sketch below)
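To make the cost and schedule variance review above concrete, here is a minimal sketch of the standard EVM arithmetic in Python. The function name and the figures in the usage line are illustrative assumptions, not values from the OEI report or any EPA project.

```python
def evm_metrics(planned_value, earned_value, actual_cost):
    """Standard Earned Value Management indicators.

    planned_value (PV): budgeted cost of the work scheduled to date
    earned_value  (EV): budgeted cost of the work actually performed
    actual_cost   (AC): what the performed work actually cost
    """
    return {
        "cost_variance": earned_value - actual_cost,                  # CV = EV - AC
        "schedule_variance": earned_value - planned_value,            # SV = EV - PV
        "cost_performance_index": earned_value / actual_cost,         # CPI = EV / AC
        "schedule_performance_index": earned_value / planned_value,   # SPI = EV / PV
    }

# Hypothetical milestone: $70k of work was scheduled by now, $60k worth
# has been earned, and $65k has actually been spent.
print(evm_metrics(planned_value=70_000, earned_value=60_000, actual_cost=65_000))
```

Negative variances (or indices below 1.0) flag cost overrun or schedule slip against the baseline, which is the review step named in the last bullet.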

15
Office of Environmental Information Report -
Earned Value Management, Fiscal Year 2004:
Lessons Learned, December 2004
  • Applied to assist IT project managers consistent
    with the ANSI/EIA-748 standard and in line with
    requirements of the OMB Exhibit 300.
  • 3 categories of LESSONS LEARNED
  • Refinement of EVM methods
  • Increasing consistency of project reporting
  • Facilitating management analysis of EVM data

16
LESSON LEARNED 1 - Refinement of EVM methods
  • Use EVM across all phases of mixed life-cycle
    projects
  • Separate milestones and associated resources for
    system life-cycle phase
  • Separate milestones and associated costs by
    contractor wherever possible
  • Keep milestones from getting too large in
    duration, cost, or scope
  • Attempt to limit milestones to a single fiscal
    year (or less)
  • Establish objective measures for determining
    earned value

17
LESSON LEARNED 2 - Increasing consistency of
project reporting
  • Use standard template for reporting
  • Institute standard reporting cycles

18
LESSON LEARNED 3 - Facilitating management analysis
of EVM data
  • Provide both numerical and graphical
    representations of EVM data
  • Use a color-coded, standardized scoring system
    (see the sketch below)
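One way to implement the color-coded scoring bullet above is to band the EVM performance indices into stoplight colors. This is a sketch; the 0.95 and 0.85 cut points are assumptions for illustration, not thresholds from the OEI report.

```python
def score_color(index, green_at=0.95, yellow_at=0.85):
    """Map a CPI or SPI value to a stoplight color (assumed thresholds)."""
    if index >= green_at:
        return "green"
    if index >= yellow_at:
        return "yellow"
    return "red"

# e.g., a project running at CPI = 0.92 and SPI = 0.86
print(score_color(0.92), score_color(0.86))  # -> yellow yellow
```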

19
Balanced Scorecard
  • A Management System (not only a measurement
    system) that enables organizations to clarify
    their vision and strategy and translate them into
    action.
  • Provides feedback on both internal processes
    and external outcomes in order to continuously
    improve strategic performance and results.
  • Balanced Scorecard Institute -
    www.balancedscorecard.org

20
Balanced Scorecard - view the organization from 4
perspectives
  • Learning and growth perspective
  • Business process perspective
  • Customer perspective
  • Financial perspective
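To make the four perspectives above concrete as a data structure, here is a minimal sketch (an assumption for illustration, not an EPA or Balanced Scorecard Institute artifact) of a scorecard that holds named measures with targets under each perspective:

```python
from dataclasses import dataclass, field

@dataclass
class Measure:
    name: str
    target: float
    actual: float

    def on_target(self) -> bool:
        return self.actual >= self.target

@dataclass
class BalancedScorecard:
    # The four perspectives listed on the slide above.
    perspectives: dict = field(default_factory=lambda: {
        "learning_and_growth": [],
        "business_process": [],
        "customer": [],
        "financial": [],
    })

    def add(self, perspective: str, measure: Measure) -> None:
        self.perspectives[perspective].append(measure)

    def summary(self) -> dict:
        """Fraction of measures on target, per perspective."""
        return {
            p: (sum(m.on_target() for m in ms) / len(ms)) if ms else None
            for p, ms in self.perspectives.items()
        }

# Hypothetical measures, for illustration only.
card = BalancedScorecard()
card.add("customer", Measure("customer satisfaction score", target=4.0, actual=3.6))
card.add("business_process", Measure("data quality reviews completed", target=12, actual=14))
print(card.summary())
```

The point of the structure is the one the slides make: the quality system's measures live alongside, not apart from, the organization's other perspectives.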

21
Balanced Scorecard - Business Process Perspective
  • Products and services conforming to customer
    requirements
  • Mission-oriented processes AND support
    (repetitive) processes
  • (easier to benchmark repetitive processes)

22
Balanced Scorecard - Financial Perspective
  • Timely and accurate funding data
  • Cost-benefits

23
Balanced Scorecard - Customer Perspective
  • Customer focus
  • Customer satisfaction
  • Matching products and services to customer groups

24
Balanced Scorecard - Learning and Growth Perspective
  • Employee training
  • Organizational culture toward personal and
    professional growth
  • Keeping knowledge workers in a continuous
    learning mode

25
(No Transcript)
26
(No Transcript)
27
BRM - OMB's Business Reference Model
  • From OMB's Federal Enterprise Architecture
    Program Management Office (FEAPMO)
  • Identifies and categorizes an organization's
    lines of business and sub-functions to relate
    them to the enterprise architecture

28
PRM - OMB's Performance Reference Model
  • From OMB's Federal Enterprise Architecture
    Program Management Office (FEAPMO)
  • A standardized approach to IT performance
  • Defines measurement indicators
  • Establishes the relationship between inputs and
    outcomes
  • Sets baselines and targets for improvement
  • Closely related to the BRM

29
GENERALIZATION for various performance
measurement systems
  • Relationship to the organization's vision and
    mission may need to be verified
  • Need to verify that measures
  • Are real
  • Are meaningful
  • Are not redundant
  • Include quality (limitations on use)
  • Are we measuring this item because it is easy to
    measure?
  • Is the goal convenient or challenging?
  • Applicability - What does measuring this
    particular thing tell you about the
    organization's performance?
  • Completeness - Are enough measures being tracked
    to characterize a broader sense of organizational
    performance?

30
How to Speak the Language of Senior
Management - Stephen George, Quality Progress, May
2003
  • OBSERVATIONS
  • The language of senior management is very
    different from the language of quality managers
  • To have their suggestions heard and accepted,
    quality professionals need to learn management's
    financial vocabulary
  • There are seven steps you can take to improve
    communication with management

31
Speaking the Language of Senior Management - The
Seven Steps
  • PROVE - quality professionals have to prove the
    need and then prove the value of proposed
    improvement processes
  • COACH - quality managers must help senior
    management move goals, objectives, and strategies
    into action, moving from strategies to
    measures to projects
  • INFLUENCE - involve senior management in quality
    processes, have them participate as trainers, and
    incorporate quality into reviews
  • PARTNER - with the CEO if possible, or with any
    other senior manager; focus on the area of
    greatest need; ask for guidance from senior
    management
  • PILOT - initiate a project in an area of value to
    the organization
  • BENCHMARK - with senior managers in other
    organizations
  • ALIGN - align the quality system language with
    the senior management language; everyone should
    be speaking the same language

32
Balanced scorecard for a quality system - HOW?
  • Look for ways for the quality system to
    demonstrate an improvement in each of the four
    perspectives
  • Try to use a balanced scorecard for the quality
    system itself - can you really do this?
  • Performance measures of the organization vs.
    performance measures for individual processes
    (e.g., quality systems) - does this mean there is
    a new customer to consider in the balanced
    scorecard, the internal customer?
  • For example, how much are you spending internally
    on your own process?
  • When do the management system and the quality
    system come into full alignment?

33
Balanced scorecard for a quality system
  • OUTPUTS
  • OUTCOMES

34
Balanced scorecard for a quality system
  • Learning and growth perspective
  • Business process perspective
  • Customer perspective
  • Financial perspective

35
Mickey says - arithmetic is counting to twenty
without taking off our shoes
  • Quality Mickey says
  • Performance measurement of quality is measuring
    both outputs and outcomes

36
HOW CAN INTERESTED QUALITY MANAGERS TRAIN
THEMSELVES?
  • www.balancedscorecard.org

37
WHAT ELSE IS GOING ON AT EPA FOR QUALITY NOW?
  • OEI is re-developing its 5-year Quality
    Management Plan (QMP)
  • OEI has delegated authority for directives for
    Agency information policy, including quality
    policy

38
  • THE END

See Ya Real Soon
Mickey Mouse
39
CONTACT INFORMATION
  • Jeffrey Worthington
  • 202-566-0997
  • worthington.jeffrey@epa.gov
  • Lorena Romero-Cedeno
  • 202-566-0978
  • romero-cedeno.lorena@epa.gov

40
A Body of Knowledge for Information and Data
Quality
WHY?
FREEDOM from the tyranny of quality chaos
41
(No Transcript)
42
(No Transcript)
43
(No Transcript)
44
(No Transcript)
45
(No Transcript)
46
(No Transcript)
47
(No Transcript)
48
(No Transcript)
49
FREEDOM from the tyranny of quality chaos
50
Statue of Freedom - 1855, 1862, 1863
51
(No Transcript)
52
OVERVIEW
  • Trends in information quality
  • What is a Body of Knowledge?
  • Who owns a BOK?
  • Basic structure
  • Conclusion
  • Resources

A Body of Knowledge for Information and Data
Quality
53
Trends in information quality
  • MORE IS BETTER
  • More information is better?
  • Faster information is better?
  • More privacy is better?
  • More access is better?

54
Trends in information quality
  • WHAT ARE THE TRENDS?
  • Information technology build-out phase is nearing
    completion
  • Increased access
  • Increased transparency
  • Recognition that info is a resource
  • Recognition that IT is not the strategic
    advantage ("IT Doesn't Matter")
  • Increased need for security
  • Increased need for control (?)

55
What is a Body of Knowledge (BOK)?
  • "an aggregate of what is known and understood
    within a field of endeavor due to familiarity
    gained through experience or association" -
    www.findmehere.com

56
What is the purpose of a BOK?
  • To provide a means for people with shared
    interest to better communicate
  • To provide formal organization and recognition to
    a knowledge area
  • To recognize and improve knowledge
  • To serve as the basis for a test or certification

57
iaidq
  • International Association for Information and
    Data Quality www.iaidq.org
  • IAIDQ purpose - to create a worldwide community
    of people who
  • Understand the critical roles data and
    information play
  • Recognize the consequences of poor quality data
    and information
  • Wish to help organizations enjoy the benefits of
    improved data and information
  • IAIDQ mission
  • Increase the awareness of the impact of poor
    quality data and information.
  • Help leaders understand that the high losses from
    poor quality data and information can be
    dramatically reduced.
  • Provide a network for members to exchange tips
    and techniques for quality improvement.
  • Provide opportunities to learn critical skills
    for making quality information and data a
    reality.
  • Membership types
  • Professional members
  • Academic members
  • Lay members
  • Student members

58
Usability Professionals Association -
http://www.upassoc.org/
  • What is usability?
  • The degree to which something (software,
    hardware, or anything else) is easy to use and a
    good fit for the people who use it.

59
Usability Body of Knowledge
  • Methods
  • User interface design principles and guidelines
  • Organizational integration of usability
    (including managing usability teams, integrating
    usability into software development, introducing
    usability to organizations and clients and making
    a business case for usability)
  • Roles, skills, and job categories for usability
    professionals
  • Definitions of usability terms
  • Related fields and disciplines

60
What is usability?
  • a quality or characteristic of a product
  • whether a product is efficient, effective, and
    satisfying for those who use it
  • the name for a group of techniques developed by
    usability professionals to help create usable
    products
  • a shorthand term for a process or approach to
    creating those products, also called
    user-centered design

61
Project Management Body of Knowledge -
  • Project Management Institute
  • http://www.pmi.org/info/pp_pmbok2000welcome.asp

62
(No Transcript)
63
Software Engineering Body of Knowledge
www.swebok.org
  • ASSOCIATED DISCIPLINES
  • Cognitive sciences and human factors
  • Computer engineering
  • Computer science
  • Management and management science
  • Mathematics
  • Project management
  • Systems engineering
  • Software configuration management
  • Software construction
  • Software design
  • Software engineering architecture
  • Software engineering management
  • Software engineering process
  • Software evolution and maintenance
  • Software quality analysis
  • Software requirements analysis
  • Software testing

64
Do we have a BOK for QUALITY in the US EPA??
65
EPA Body of Knowledge Areas for Quality
  • What you need to know to work or support quality
    in the EPA.
  • Quality Management Systems
  • Quality Project Planning
  • Data Quality Objective Planning
  • Quality Training
  • Science measures and associated metadata
  • Environmental methodologies
  • Concepts - hierarchical quality systems, graded
    approach.
  • Did I leave anything out?

66
Your examples.?
  • _________________________
  • _________________________
  • _________________________
  • _________________________
  • _________________________
  • _________________________

67
Who owns a BOK?
  • Certification program
  • ASQ Certified Quality Engineer (CQE) has a CQE
    BOK, also for CQA, CQM, and CSQE
  • All are owned by ASQ
  • Professional organizations
  • Government bodies
  • Academic groups

68
How to structure a BOK for information and data
quality?
  • Focus on
  • The basics - the basic principles
  • The what of information and data - identify and
    describe them
  • The how of information and data - the processes
  • The who - what an information and data quality
    practitioner is and how they do their work

69
Basic structure
  • Overarching principles for information and data
    quality
  • Communication 1 - Basic terminology
  • Communication 2 - Alignment with the organization
  • Information and data quality features, measures,
    and acceptance criteria
  • Preliminary review of an organization's
    information and data quality
  • Managing and planning for information and data
    quality
  • Implementing information and data quality
  • Assessing information and data quality
  • Tracking and reporting information and data
    quality
  • Roles, skills, and job categories for information
    and data quality professionals

70
What are the quality principles for information
and data?
  • general quality management principles apply to
    information, data, and information and data
    quality in the same way they apply to all basic
    management processes
  • information has unique characteristics that bear
    on your ability to manage its quality
  • information is a resource of the enterprise, and
    its quality should be managed as that of any
    other resource
  • information quality includes the quality of
    content, format, and functionality
  • information quality includes
  • information features (including functionalities)
  • freedom from defect
  • customer service
  • effectiveness and efficiency
  • you must be able to measure information
    quality in order to understand it completely,
    manage it, and continually improve it
  • you must be able to track quality costs
    (failures, scrap and rework, etc.) in order to
    effectively manage the resources used to ensure
    information quality

71
What is unique about information?
  • as a resource?
  • It cannot be used up
  • It can be copied
  • It can be shared across large distances
  • It can be moved easily
  • It is difficult to keep secure
  • It may become dated
  • Age may not affect its quality

72
What is unique about information?
  • as a product?
  • The same information can be provided/sold to more
    than one person
  • It can be delivered cheaply
  • Your product can be easily shared by the person
    who purchased it

73
Communication 1 - Basic terminology
  • Terminology needs for information and data
    quality
  • Recognition that terminology is
    discipline-specific
  • Recognition that there may be conflicting views
    on the definitions for certain terms (e.g.,
    information, data)
  • Multiple definitions may be needed for the same
    term
  • More complex definitions may be needed (data
    quality vs. data entry quality vs. data content
    quality vs. data content transfer quality)
  • Need to include senior management terminology and
    align quality terminology with senior management
    terminology

74
Communication 2 - Alignment to the organization
  • Methodology for tracking to performance measures
    for the organization
  • Best practices for communicating information and
    data quality to senior management
  • Models for the relationship of information and
    data to the organization's product

75
Information and data quality features, measures,
and acceptance criteria
  • Standardized list of features and functionalities
  • Standardized measures and measurement units for
    the features and functionalities
  • Suggested acceptance criteria
  • Relationship between concepts and measures
    clearly mapped (access vs. usefulness vs.
    usability vs. integrity vs. security vs.
    transparency vs. objectivity vs. accuracy, etc.)
    - a measurement sketch follows below
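As one way to read "standardized measures with suggested acceptance criteria", here is a minimal sketch of two such measures (completeness and validity) applied to a small record set. The field names, the pH validity rule, and the acceptance thresholds are all assumptions for illustration.

```python
def completeness(records, fields):
    """Fraction of required field values that are populated."""
    total = len(records) * len(fields)
    filled = sum(1 for r in records for f in fields if r.get(f) not in (None, ""))
    return filled / total if total else 1.0

def validity(records, field, is_valid):
    """Fraction of populated values in `field` that pass the validity rule."""
    values = [r.get(field) for r in records if r.get(field) not in (None, "")]
    return sum(1 for v in values if is_valid(v)) / len(values) if values else 1.0

# Hypothetical acceptance criteria and records.
ACCEPTANCE = {"completeness": 0.98, "ph_validity": 0.99}
records = [
    {"site_id": "A-01", "ph": 7.2},
    {"site_id": "A-02", "ph": 14.7},   # out of range
    {"site_id": "", "ph": 6.8},        # missing site id
]

results = {
    "completeness": completeness(records, ["site_id", "ph"]),
    "ph_validity": validity(records, "ph", lambda v: 0 <= v <= 14),
}
for name, value in results.items():
    status = "pass" if value >= ACCEPTANCE[name] else "fail"
    print(f"{name}: {value:.2f} ({status})")
```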

76
Preliminary review of an organization's
information and data quality
  • Techniques to identify priorities for measurement
    in existing systems.
  • Process to use existing available data about data
    to determine quality.
  • Best practices in automated information and data
    quality measurement for existing data systems
    (see the profiling sketch below).
  • Methods to review current quality measures in
    comparison to the organization's mission and goals.
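A simple form of the automated measurement mentioned above is a profile of an existing data file: how full each column is and how many fully duplicated rows it contains. The sketch below uses only the Python standard library; the file name is hypothetical.

```python
import csv
from collections import Counter

def profile(path):
    """Per-column fill rate and duplicate-row count for a CSV file."""
    with open(path, newline="") as fh:
        rows = list(csv.DictReader(fh))
    if not rows:
        return {}
    fill = {
        col: sum(1 for r in rows if (r[col] or "").strip()) / len(rows)
        for col in rows[0]
    }
    duplicates = sum(n - 1 for n in Counter(tuple(r.values()) for r in rows).values())
    return {"rows": len(rows), "fill_rate": fill, "duplicate_rows": duplicates}

# Hypothetical file; any CSV with a header row works.
# print(profile("facility_monitoring_2004.csv"))
```

A profile like this is a starting point for the priority-setting and mission-comparison bullets above, not a substitute for them.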

77
Managing and planning for information and data
quality
  • Identification of standard types of products that
    should be subject to management and planning
  • For each standard type of information product and
    data, a standard set of planning criteria
  • Standard techniques for identifying the level of
    planning needed
  • General planning guidance document.
  • Process to identify goals for general information
    product and data quality projects.
  • Process to identify acceptable measures to
    determine conformance with the goals.
  • Processes to crosswalk management/planning to
    other project management initiatives in an
    organization
  • Processes to crosswalk information product and
    data quality objectives to the organization's
    performance measurement.

78
Implementing information and data quality
  • Identification of standard implementation phases
  • Best Practices for implementing information and
    data quality initiatives.
  • Example standard operating procedures (SOPs).

79
Assessing information and data quality
  • Identification of hierarchy of assessment.
    (system vs. program vs. project vs. data system
    vs. data set vs. data field).
  • Standard procedures for planning, implementing,
    and reporting assessment information.
  • Suggested procedures for resolving corrective
    actions for existing data deficiencies.
  • Types of assessments based on existing BOKs,
    self-assessment, third-party, conformance,

80
Tracking and reporting information and data
quality
  • You can't manage information and data quality
    without having the information needed to manage it.
  • You must be able to track and report your
    information and data quality in terms of the
    organization's performance goals (see
    COMMUNICATION 2 in the BOK).

81
Tracking and reporting information and data
quality
  • Methodology to identify what to report
  • Standard reporting formats for information and
    data quality
  • Routine reporting to middle and senior management
  • Techniques to trend data and information quality
    to track improvement (see the sketch below)
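For the trending bullet above, a minimal sketch that reports whether a quality measure is improving over a reporting period. The monthly scores are purely illustrative.

```python
def trend(scores):
    """Average month-over-month change in a quality score.

    Positive means the measure is improving; negative, declining.
    """
    if len(scores) < 2:
        return 0.0
    deltas = [b - a for a, b in zip(scores, scores[1:])]
    return sum(deltas) / len(deltas)

# Hypothetical monthly completeness scores for one data system.
monthly = [0.91, 0.92, 0.94, 0.93, 0.96]
print(f"average monthly change: {trend(monthly):+.3f}")
```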

82
Roles, skills, and job categories for information
and data quality professionals
  • Information and data quality managers
  • Information and data quality assessors

83
Conclusion
FREEDOM from the tyranny of quality chaos
  • We are already developing a body of knowledge in
    this area.
  • Structure to our BOK will help us communicate
    with each other.
  • It will help us better
  • plan for information and data quality
  • measure information and data quality
  • establish organizational performance measures
  • align with organizational goals.
  • The job of quality professionals will be better
    planned and understood.

84
Resources for an information and data quality
body of knowledge
  • Larry English, www.infoimpact.com
  • Dr. Tom Redman, www.dataqualitysolutions.com
  • Michael Brackett
  • Dr. Wang, MIT Data Quality Group
  • American Society for Quality (ASQ) Information
    Integrity Group
  • International Association for Information and
    Data Quality (IAIDQ.org)
  • Project Management Institute