Basic Training for Compliance Assistance Providers (BCAP) Module 4
1
Basic Training for Compliance Assistance Providers
(BCAP) Module 4: Measurement and Data Collection,
April 2009
2
Purpose
  • Explain
  • Why we measure results
  • What we measure
  • How we design and measure outcomes
  • How we analyze and utilize our measurement data
  • Tips for marketing results

3
Why Do We Measure Results?
  • Demonstrate to stakeholders how compliance
    assistance is improving the environment
  • Critical for program management (e.g.,
    determining what was effective)
  • Helps to justify budget and agency priorities
  • Accountability to the public, Congress, and the
    Office of Management and Budget (OMB)

4
Why We Measure: Accountability Requirements
  • Government Performance and Results Act (GPRA)
    requires EPA to set goals in 5-year Strategic
    Plans and those plans can include specific CA
    sub-objectives
  • Compliance assistance annual commitments under
    National Program Guidance
  • Performance and Accountability Report to the
    President, Congress, OMB, and the public on EPA's
    progress toward achieving its goals and
    objectives
  • (For updates to these documents, see
    www.epa.gov/ocfo)

5
What We Measure: EPA CA Measures - Outputs and
Outcomes
  • Output measures are collected in ICIS
  • Number of activities
  • Number of entities reached
  • Number of products produced
  • Examples of outputs
  • Number of workshops, presentations, facility
    visits
  • Number of people attending workshops
  • Number of tools developed, tools distributed
  • Number of telephone calls returned

6
What We Measure: EPA CA Measures - Outputs and
Outcomes
  • Outcome measures are collected in ICIS
  • Increased understanding
  • Improved environmental management practices
  • Reduced, eliminated or treated pollution
  • Examples of outcomes
  • Change in facility practice such as installing
    labels on chemical drums
  • Employees with increased understanding as a
    result of training
  • Reduction in pounds of pollutants
  • Facility reduction in generation of hazardous
    waste due to change in equipment

7
What We Measure: OECA National Program Guidance
and GPRA
  • Annual Commitment System (ACS) certified measure
    for CA (as of 2009)
  • Conduct outcome measurement for 100% of all
    compliance assistance workshops, training, onsite
    visits and revisits, which support the OECA
    national priorities, and report the results of
    these outcomes in ICIS
  • Report on exceptions to the 100% and provide a
    brief explanation in ACS
  • Strategy-specific ACS measures for CA may apply
  • GPRA measures for CA also exist; check the
    website for current measures:
    http://www.epa.gov/ocfo/planning/gpra.htm

8
What We Measure: Other CA Measures
  • Additional program results are collected and
    reported into the Integrated Compliance
    Information System (ICIS). Examples include
  • Number of entities reached by compliance
    assistance
  • Number of CA tools developed by type (e.g.,
    tools, workshops, etc.)
  • Percentage of reporting entities with increased
    understanding of environmental requirements
  • Number of entities that seek assistance from
    EPA-sponsored Compliance Assistance (CA) Centers,
    and that receive assistance from EPA
  • Information from web-based CA Centers on improved
    practices, pollution reductions, beyond
    compliance actions, etc.
  • OECA National Priority measures
  • Number of entities reached by national priority
  • Specific CA measures tracked by individual
    Strategy Implementation Teams (SITs) are
    manually collected

9

Quiz Question: Outputs or Outcomes?
  • Are the following outputs or outcomes?
  • Percentage of facilities reporting they increased
    their understanding of environmental requirements
  • Number of regulated entities that reduce, treat,
    or eliminate pollution
  • Percentage of facilities receiving training
    guides from the EPA
  • Total decrease in fugitive air emissions from
    paper factories
  • Answers: Outcome, Outcome, Output, Outcome

10
Big Picture of Measurement Building Blocks: How We
Design and Measure Outcomes
  • The building blocks, from planning through
    reporting
  • Planning phase: define project goals, then
    define project measures
  • Choose data collection methods (consider the
    Paperwork Reduction Act and the resources
    needed)
  • Collect data
  • Analyze data
  • Present data
  • Report data
11
How We Design and Measure Outcomes: Define Project
Goals
  • Clearly identify the project goals
  • Identify the project measures
  • Define the purpose and scope of the measurement
    tool
  • Descriptive: collected survey results and other
    anecdotal evidence (not statistically valid)
  • Predictive: statistical analysis that can
    generalize the results to a broader sector

12
How We Design and Measure Outcomes: Define Project
Goals
  • Goals should be SMART
  • S - Specific
  • M - Measurable
  • A - Attainable
  • R - Realistic
  • T - Time dependent

13
How We Design and Measure Outcomes: Define Project
Goals - CA Outcome Measures
  • Changes in Understanding
  • Percentage of regulated entities who report
    better understanding of environmental
    requirements because of compliance assistance
  • Changes in Behavior
  • Facilities changing an environmental management
    practice as a result of EPA assistance
  • Environmental and Human Health Improvements
  • Reduction in pollution as a result of EPA
    assistance

14
How We Design and Measure Outcomes: Define Project
Goals - Goals and Outcomes
15
How We Design and Measure Outcomes: Data
Collection - How Much Effort Will Measurement
Take?
  • The level of effort to measure rises from output
    measures (low) through customer satisfaction,
    changes in understanding, and changes in
    behavior, to environmental and human health
    outcomes (high)
  • Output measures (low effort): number of tools
    developed, number of entities reached, number of
    workshops, number of CA projects with an
    integrated strategy
  • Customer satisfaction: did you meet your
    audience's expectations?
  • Changes in understanding: number of CA providers
    reporting improved ability to provide CA
  • Changes in behavior: number of facilities that
    adopt regulatory changes
  • Environmental and human health outcomes (high
    effort): number of regulated entities reporting
    reduced pollutants/emissions, amount of reduced
    pollutants or emissions, amount of prevented
    pollution
16
How We Design and Measure Outcomes: Data
Collection - Comparing Data Collection Methods
17
How We Design and Measure Outcomes: Matching Data
Collection Methods with CA Activities
18
How We Design and Measure Outcomes: Data
Collection - Do You Need an ICR?
  • Often, if you need to use a survey or
    questionnaire to gather outcome data, you will
    need to secure an Information Collection Request
    (ICR) before you collect information.
  • The Paperwork Reduction Act (PRA) imposes this
    requirement; it applies to both mandatory and
    voluntary data collection efforts.
  • The PRA requires federal agencies to obtain
    Office of Management and Budget (OMB) approval
    prior to collecting similar information from more
    than nine people in a given year.
  • Check with your Regional Compliance Assistance
    Coordinator to find out whether an ICR exists to
    cover your survey or whether one will need to be
    developed (a 9-12 month process).

19
How We Design and Measure Outcomes: Data
Collection - How To Get an ICR
  • Developing a separate ICR
  • Obtain the ICR Handbook, EPA's guide to
    developing an ICR, from the Office of
    Environmental Information, Collection Strategies
    Division
  • or
  • Visit http://intranet.epa.gov/icrintra on the EPA
    intranet
  • Also, see OMB Guidance at www.omb.gov

20
How We Design and Measure Outcomes: Data
Collection - Statistically Valid Surveys
  • Do you need to have a statistically valid data
    collection?
  • Are policy decisions going to be made as a result
    of your project?
  • Do you want to and will you be able to generalize
    your results to the population of interest?
  • Do you want to compare two groups?
  • Will everyone in the population of interest have
    an equal opportunity to be selected for the
    survey?
  • If you answer YES to these questions, a
    statistically valid study may be warranted (see
    the sample-size sketch after this list)
  • If you answer NO, an anecdotal assessment may be
    sufficient
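
A minimal sketch of the sample-size arithmetic behind a statistically valid survey, assuming the standard Cochran formula with a finite-population correction. The population size, margin of error, and confidence level below are illustrative assumptions, not figures from this module or EPA guidance.

```python
# Hypothetical sketch: estimating how many survey responses are needed
# to generalize results to a population of regulated facilities.
# Uses Cochran's formula with a finite-population correction;
# all inputs here are illustrative assumptions.
import math

def sample_size(population, margin_of_error=0.05, z=1.96, p=0.5):
    # z = 1.96 for 95% confidence; p = 0.5 is the most conservative
    # assumed proportion (it maximizes the required sample).
    n0 = (z ** 2) * p * (1 - p) / margin_of_error ** 2
    # The finite-population correction shrinks n0 for small populations.
    return math.ceil(n0 / (1 + (n0 - 1) / population))

# Example: a sector of 500 facilities surveyed at a 5% margin of
# error needs roughly 218 completed responses.
print(sample_size(500))  # -> 218
```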

21
How We Design and Measure Outcomes: Details To
Remember When Developing Surveys
  • What is the purpose of the survey?
  • Who will be surveyed?
  • How will the survey be administered?
  • How will survey information be analyzed and
    reported?
  • Who will use the results?

22
How We Design and Measure Outcomes: Data
Collection - Developing a Survey Tool
  • Introduction
  • Purpose of the survey
  • How the data will be used
  • Anonymity statement
  • Including information from the Privacy Act
    Statement
  • Public Law 93-579, the Privacy Act of 1974
    requires that you be informed of the purposes and
    uses to be made of the survey. Authority to
    collect this information is granted in Title 5 of
    the U.S. Code. Providing this information is
    voluntary. In no case will the information be
    used for making decisions affecting specific
    individuals.
  • Instructions
  • Place question-specific instructions with the
    numbered question itself, not as free-standing
    entries

23
How We Design and Measure Outcomes: Data
Collection - Developing/Modifying Survey Questions
  • Questions should be
  • Salient
  • Clearly written
  • Concise
  • Specific
  • Explicit
  • Relevant
  • Appropriate for the recipient
  • Capable of providing the desired results
  • Avoid questions that are
  • Ambiguous
  • Overlapping
  • Multi-part

24
How We Design and Measure Outcomes: Data
Collection - Question Response Formats
25
How We Design and Measure Outcomes: Data
Collection - Tips for Implementing a Survey
  • Workshop Survey
  • Keep to fewer than 20 questions
  • Block out time during workshop to complete
  • Phone Survey
  • Send notification letter beforehand
  • Train surveyors for consistency
  • Mailed Survey
  • Verify mailing list
  • Track responses
  • Include stamped return envelope
  • Online Survey
  • Use bias reduction methods such as
  • E-mail passwords: only one response per person
  • Only allow use of the Web site after completing
    the survey
  • Randomly select recipients using pop-ups
  • Onsite Visit/Revisit
  • Conducted by EPA or a certified inspector
  • Conduct pre-visit outreach
  • Use a checklist that is consistent across initial
    visits and revisits

26
How We Design and Measure Outcomes: Data
Collection - Pilot Testing the Survey
  • Need an ICR if pilot includes more than 9
    non-federal employees
  • Pilot testing ensures that questions are
  • Easily understood
  • Appropriately asked
  • Appropriately answered
  • Use representative sample of the larger group
  • To gather pilot test feedback
  • Explain the process to participants
  • Have participants evaluate the survey as they
    take it
  • Request written comments on the survey
  • Hold a group discussion
  • Use the feedback to modify the survey

27
How We Design and Measure Outcomes: Discussion
Question - Increasing Understanding of Hazardous
Waste Management on the U.S./Mexico Border
  • Region 6, in partnership with the Texas
    Commission on Environmental Quality (TCEQ), found
    that warehouses along the U.S./Mexico border were
    violating RCRA requirements because of a lack of
    knowledge of RCRA and proper hazardous waste
    management. In response, Region 6 and TCEQ
    developed a compliance assistance seminar
    designed to improve understanding of RCRA.
    Region 6 conducted a survey of the seminar
    participants to learn how to improve it and to
    determine whether it was effective CA and worth
    continuing.
  • What types of questions would you include in an
    on-site survey following the seminar?

28
How We Analyze and Use Measurement Data: How Do
We Use Our Measurement Results?
  • Assess whether program is meeting its goals, and
    if not, why not
  • Look at results achieved and whether the
    resources allocated for those results make sense
  • Assess whether goals need to be modified (+/-),
    or if a change in approach is warranted
  • Analyze trends and determine whether they signal
    a need to change priorities or approaches

29
How We Analyze and Use Measurement Data: How Do
We Use Our Measurement Results?
  • External and internal audiences rely on
    measurement data to make a range of decisions.
  • Big-picture internal analysis looks at these
    questions
  • Did we accomplish what we planned?
  • Did we stay within budget?
  • Did we achieve the desired environmental results?
  • Which CA approach was most effective at achieving
    outcomes?
  • Helps managers make informed decisions about
    program performance, future direction,
    priorities, and budget allocations

30
How We Analyze and Use Measurement Data: How Do
We Use Our Measurement Results?
  • External audiences
  • EPA reports annually to the President, Congress,
    and OMB on GPRA measures, Annual Performance
    Goals and Strategic Plan goals
  • Impacts the Agency's budgets, targets, and
    measures
  • Stakeholders assess results to draw conclusions
    about the Agency's effectiveness and performance

31
How We Analyze and Use Measurement Data: FY2007
Example - Enforcement and Compliance Annual
Results, Entities Reached with EPA Compliance
Assistance

FY2007 data sources: Integrated Compliance
Information System (ICIS), 10/13/07, and on-line
usage reports; data source for previous fiscal
years: annual ICIS data and on-line usage reports
32
How We Analyze and Use Measurement Data: Example -
Outcomes from EPA's Direct Compliance Assistance
Provided to Regulated Entities
[Bar chart of survey-reported outcome percentages:
95%, 94%, 91%, 91%, 74%, 50%, 51%, 28%, 13%;
category labels not captured in this transcript]

FY2007 data source: ICIS. A correction to the
database in FY2007 improved the accuracy of this
year's data. Disclaimer: minor corrections have
been made to previous years' data. Also, these
measures are not calculated from a representative
sample of the regulated entity universe. The
percentages are based, in part, on the number of
regulated entities that answered affirmatively to
these questions on voluntary surveys. The
percentages do not account for respondents who
chose not to answer these questions or the survey.
33
How We Analyze and Use Measurement Data: Example -
Outcomes from EPA's 15 Web-Based Compliance
Assistance Centers

[Bar chart of survey-reported outcome percentages:
88%, 84%, 82%, 83%, 81%, 77%, 55%, 53%, 46%;
category labels not captured in this transcript]

FY2007 data source: on-line surveys completed
during FY2007. Disclaimer: These measures are not
calculated from a representative sample of the
regulated entity universe. The percentages are
based, in part, on the number of regulated
entities that answered affirmatively to these
questions on voluntary surveys. The percentages
do not account for respondents who chose not to
answer these questions or the survey.
34
How We Analyze and Use Measurement Data: Analyzing
Your Data
  • Data Entry and Handling
  • Database
  • Spreadsheet
  • Summary text for qualitative information
  • Ambiguous Data
  • Blank answers in surveys are neither affirmative
    nor negative
  • Blank answers in a pre-test/post-test count as
    incorrect answers (see the sketch after this
    list)
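
A small sketch of the blank-answer rules above, using made-up data: survey blanks are excluded from both the numerator and the denominator, while pre-/post-test blanks are scored as wrong answers. The answer values and key are illustrative assumptions.

```python
# Hypothetical sketch of the blank-answer handling described above.
# None represents a question left blank; all data is illustrative.
survey_answers = ["yes", "no", None, "yes", None, "yes"]
test_answers   = ["b", None, "c", "a"]
answer_key     = ["b", "d", "c", "a"]

# Survey: percent affirmative among those who actually answered;
# blanks are neither affirmative nor negative, so they are dropped.
answered = [a for a in survey_answers if a is not None]
pct_yes = 100 * answered.count("yes") / len(answered)
print(f"{pct_yes:.0f}% affirmative of {len(answered)} respondents")  # 75% of 4

# Pre-test/post-test: a blank counts as an incorrect answer.
correct = sum(given == key for given, key in zip(test_answers, answer_key))
print(f"{correct}/{len(answer_key)} correct")  # 3/4
```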

35
How We Analyze and Use Measurement Data: Different
Ways to Display Data
  • Options include a table, pie charts, or a bar
    graph; the slide illustrates each with fugitive
    air emissions data for 2006 and 2007 (see the
    plotting sketch below)
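
A minimal sketch of the bar-graph option, assuming matplotlib is available. The facility names and emission figures are made up for illustration; the original slide's 2006/2007 charts are not reproduced in this transcript.

```python
# Hypothetical sketch: a grouped bar chart comparing two years of
# fugitive air emissions. All facility names and values are made up.
import matplotlib.pyplot as plt

facilities = ["Facility A", "Facility B", "Facility C"]
emissions_2006 = [120, 95, 60]   # tons, illustrative
emissions_2007 = [100, 90, 40]

# Offset the two series so each facility gets a side-by-side pair.
x = range(len(facilities))
plt.bar([i - 0.2 for i in x], emissions_2006, width=0.4, label="2006")
plt.bar([i + 0.2 for i in x], emissions_2007, width=0.4, label="2007")
plt.xticks(list(x), facilities)
plt.ylabel("Fugitive air emissions (tons)")
plt.title("Fugitive Air Emissions, 2006 vs. 2007")
plt.legend()
plt.show()
```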
36
How We Analyze and Use Measurement Data: Percent
Change
  • A measure used to compare changes between
    measures with different initial values or taken
    in different units
  • Percent change = ((new value - old value) / old
    value) x 100
  • Example: comparing HAP emissions between two
    facilities with different initial emissions (see
    the worked sketch below)
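
A tiny worked example of the percent-change formula above. The two facilities' HAP emission figures are hypothetical; the point is that percent change makes reductions comparable despite different starting values.

```python
# Worked example of the percent-change calculation described above.
def percent_change(old, new):
    return (new - old) / old * 100

# Facility A: 200 -> 150 tons HAP; Facility B: 40 -> 30 tons HAP.
# Different starting points, but the same 25% reduction.
print(percent_change(200, 150))  # -25.0
print(percent_change(40, 30))    # -25.0
```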

37
Tips for Marketing Results
  • Although some projects are never formally
    presented, that doesn't mean your results should
    go unnoticed
  • Ways to market results
  • E-mail your supervisors
  • Report news with a few bullets that they can
    repeat easily whenever there is an opportunity to
    promote your work and the results
  • Example: TRI workshop update
  • 30 people came to the TRI workshop
  • Every participant reported an increased
    understanding of the requirements
  • 25 reported a better ability to fill out the
    forms
  • ALWAYS put results into ICIS (Integrated
    Compliance Information System)

38
Conclusion
  • Having meaningful results to tell your story is
    important because it helps
  • Communicate the actual results of the CA
    activities in terms of environmental benefit
  • Explain how and why CA is an important component
    of the compliance and enforcement program
  • Explain how your resources are being spent

39
Measurement and Data Collection Resources
  • Guide for Measuring CA Outcomes
  • http://www.epa.gov/compliance/resources/publications/assistance/measures/cameasuring.pdf
  • OECA Web site for measurement resources
  • http://epa.gov/compliance/assistance/measures/index.html
  • For EPA employees only:
    http://intranet.epa.gov/oeca/caspd/cacoordinators/measurement/index.html
  • EPA Information Collection Request (ICR) Center
  • http://epa.gov/icr/