Preparing Quality Assurance Project Plans (Presentation Transcript)

1
Preparing Quality Assurance Project Plans
  • Presented By
  • Denise L. Goddard, Chemist
  • Quality Assurance Section
  • Athens, Georgia

2
EPA DISCLAIMER
  • This Presentation is for Training Purposes Only.

3
EPA - QA Documents for Preparing Quality
Assurance Project Plans
  • Guidance on Systematic Planning Using the Data
    Quality Objectives Process, EPA QA/G-4,
    EPA/240/B-06/001 (February 2006)
  • Requirements for Quality Assurance Project Plans,
    EPA QA/R-5, EPA/240/B-01/003 (March 2001)
  • Guidance for Quality Assurance Project Plans, EPA
    QA/G-5, EPA/240/R-02/009 (December 2002)

4
Are QAPPs Really Required??? YES!!
  • Quality System Requirements: Approved Quality
    Assurance Project Plans (QAPPs), or equivalent
    documents defined by your organization's QMP, are
    required for all applicable projects and/or
    studies that involve environmental data
    collection or where environmental decisions will
    be made for a particular site. The QAPP must be
    approved prior to any data-gathering work or
    activities, except under circumstances requiring
    immediate action (emergency response) to protect
    human health and the environment, or operations
    conducted under police powers.

5
Organizational Applicability??
  • EPA Organizations - Covered under EPA Order
    5360.1 A2: The Agency-wide Quality System
    requirements defined by this Order apply to all
    EPA organizations, and components thereof, in
    which the environmental programs conducted
    involve the scope of activities described in
    Section 5.a above. The authority of this Order
    applies only to EPA organizations except as
    addressed by Section 5.d(2) below.

6
External Organizations Requirements
  • Extramural Agreements: Agency-wide Quality
    System requirements may also apply to non-EPA
    organizations. These requirements are defined in
    the applicable regulations governing extramural
    agreements. Agency-wide Quality System
    requirements may also be invoked as part of
    negotiated agreements such as memoranda of
    understanding (MOUs). Non-EPA organizations that
    may be subject to quality system requirements
    include:

7
Extramural Agreements
  • (a) Any organization or individual under direct
    contract to EPA to furnish services or items or
    perform work (i.e., a contractor) under the
    authority of 48 CFR 46 (including applicable
    work assignments, delivery orders, and task
    orders)
  • 40 CFR 31 - Grants and Cooperative Agreements
    with State and Local Governments
  • 40 CFR 35 - State and Local Assistance

8
The Purpose of a Quality Assurance Project Plan
  • As a planning document, the QAPP should contain a
    detailed description of environmental data
    collection activities and operations, the
    problems associated with a site, the sampling and
    analysis requirements, the decisions to be made,
    and the necessary QA/QC activities governing this
    effort.

9
Issues Addressed by a QAPP
  • The QAPP must provide sufficient detail, such as:
  • The project's technical and quality objectives -
    these must be well defined and agreed upon by all
    affected parties and stakeholders.
  • The program-specific and site-specific
    requirements (stipulated in consent decrees,
    records of decision, regulations, statutes,
    etc.).
  • The intended measurements, data generation or
    data acquisition methods that are appropriate for
    achieving project goals/objectives.

10
Issues Addressed by a QAPP (Cont.)
  • A summary of the assessment procedures for
    confirming that data of the type, quantity and
    quality required and expected were obtained, and
  • A description of the process for evaluating the
    limitations on the use of the information or data
    obtained that includes identifying, documenting
    and communicating the limitations to all affected
    parties and stakeholders.

11
Overview of Content Requirements
  • To be effective, the QAPP must clearly state:
  • The purpose of the environmental data operation
    (e.g., enforcement, research and development,
    rulemaking),
  • The type of work to be done (e.g., pollutant
    monitoring, site characterization, risk
    characterization, bench level proof of concept
    experiments), and
  • The intended use of the results (e.g., compliance
    determination, selection of remedial technology,
    site closure, development of environmental
    regulations).

12
Before We Start - Some Preliminaries
Format/Content Requirements
  • Because the QAPP is a formal document, it should
    contain:
  • A Title Page containing the title of the
    document, the identification of the organization
    that prepared the QAPP, the preparation date, and
    the version number. The document should be
    paginated.
  • An Approval Page containing signature and date
    blocks for each of the individuals/organizations
    responsible for approving the document.

13
Some Preliminaries - Format and Content Requirements
  • A Distribution List containing the names,
    mailing addresses, phone numbers, and email
    addresses for each of the individuals and
    organizations requiring copies of the approved
    QAPP.
  • A Table of Contents for text, tables, figures,
    maps and appendices. If there are numerous
    tables, figures and maps, place these items in an
    appendix to reduce breakup of the text.

14
Some Cautionary Tips!!!
  • Some Cautions:
  • Avoid using generic language that does not
    provide the required information or level of
    detail.
  • For projects requiring the generation of chemical
    or biological data, make sure that you produce a
    list of contaminants of concern or identify the
    biological parameters of interest.
  • Make sure the approved QAPP is distributed to
    project personnel and laboratory staff, and if
    you are using CLP, identify the COCs in the
    project log (unless there are numerous
    contaminants).

15
Let's Start - Components of a QAPP
  • A QAPP is composed of approximately 25 elements
    that are grouped into four classes or categories,
    as follows:
  • Class A - Project Management
  • Class B - Measurement/Data Acquisition
  • Class C - Assessment/Oversight
  • Class D - Data Validation/Data Usability

16
Class A Topics - Overview
  • The elements in this group address:
  • Project Management
  • Project History/Site History
  • Goals and Objectives of the Project
  • Project Outputs

17
Class A Topics
  • The following topics must be addressed as part of
    the Class A components/elements:
  • A1 - Title/Approval Page
  • A2 - Table of Contents
  • A3 - Distribution List
  • A4 - Project/Task Organization
  • A5 - Problem Definition/Background Info
  • A6 - Project/Task Description
  • A7 - Quality Objectives and Criteria (DQOs/DQIs)
  • A8 - Special Training/Certifications
  • A9 - Documents and Records

18
A4 Project/Task Organization
  • The following information is required:
  • Identify the individuals/organizations that will
    participate in the project/study, discuss their
    roles/responsibilities, and identify the
    principal data users, decision makers, QA
    Manager, stakeholders and end data users.
  • The QA Manager should be identified in the QAPP.
    This individual should be independent of data
    collection operations, should have direct access
    to senior management, and should have overall
    authority over data collection activities when
    non-conformance with the QAPP is encountered.
  • An organizational chart depicting the lines of
    communication and authority between senior
    management, the QA Manager and project personnel
    should also include principal and end data users,
    decision makers, stakeholders, contractors and
    any subcontractors.

19
Organizational Chart 1
  [Chart boxes: Senior Management; Laboratory Analysis; Field Sampling Staff; Data Validation]
20
Organizational Chart 2
  [Chart boxes: Senior Management; Laboratory Analysis; Data Validation/Data Quality Assessment; Field Sampling Staff; QA Manager]
21
Organizational Chart 3
  [Chart boxes: Senior Management; QA Manager; Data Validation/Data Quality Assessment; Laboratory Analysis; Field Sampling Staff]
22
Organizational Chart 4
  [Chart boxes: Senior Management; QA Manager; Project Management; Laboratory Analysis; Field Sampling Staff; Data Validation/Data Quality Assessment]
23
Organizational Chart 5
  [Chart boxes: QA Manager; Senior Management; Project Manager; Decision Makers; Laboratory Analysis; Field Sampling Staff; Data Validation/Data Quality Assessment; Organic Analysis; Inorganic Analysis; John Wu (DQA); Linda Good (D. Val.); Joe Smo; Jane Doe]
24
A5 Problem Definition/Background
  • Summarize the problem to be solved
  • The decision to be made
  • Or outcome to be achieved
  • Include background/historical information
  • Include scientific and regulatory perspectives

25
A6 Project/Task Description
  • Summarize all work to be done
  • Specify all measurements that must be taken and
    identify which measurements are critical or
    non-critical. Critical measurements will be used
    to make site decisions; non-critical measurements
    won't be used during the decision-making process
  • Provide a list of all of the equipment required
  • Identify any products that will be produced
  • Provide maps, charts, figures and tables

26
A7 Quality Objectives and Criteria
  • Describe the quality goals/objectives for the
    project and provide the performance criteria for
    achieving these goals/objectives.
  • Provide the project-specific data quality
    objectives (both qualitative and quantitative)
    and the specific data quality indicators
    (precision, bias, sensitivity, comparability,
    completeness and representativeness) relevant to
    the project/study.

27
Brief Overview of the Systematic Planning Process
  • Data Quality Objectives Process
  • Step 1 - State the Problem
  • Step 2 - Identify the Goals of the Study
  • Step 3 - Identify the Information Inputs
  • Step 4 - Define the Boundaries of the Study
  • Step 5 - Develop the Analytical Approach
  • Step 6 - Specify Performance or Acceptance
    Criteria
  • Step 7 - Develop the Plan for Obtaining Data

28
Additional Thoughts on the DQO Process
  • Include any and all assumptions concerning site
    contamination, contaminant pathways, remedial
    techniques, clean-up design, monitoring
    strategies, etc., as part of the DQO process.
  • Identify any suspected potential departures from
    assumptions in support of the DQO process.

29
A8 Special Training/Certifications
  • Identify and describe any specialized training
    (including QA training) needed by project
    personnel required to successfully complete the
    project or task.
  • Discuss how such training will be provided and
    who is responsible for obtaining internal
    training for staff.
  • Discuss where training documentation will be
    maintained.
  • Specify whether professional certifications,
    accreditations or licenses are required for staff
    to perform their designated tasks/duties.

30
A9 Documents and Records
  • Describe the process and responsibilities for
    ensuring the appropriate project personnel have
    the most current approved version of the QAPP,
    including version control, updates, distribution
    and disposition.
  • Itemize the information required in project
    documents, records and reports. The type of
    information required for analytical data reports
    must be specified for both hard-copy and
    electronic formats. Data deliverables can and do
    include raw data, data from other sources such as
    computer databases, literature searches, field
    logs, sample preparation logs, analysis logs,
    instrument printouts, model input and output
    files, and the results of calibrations and QA
    checks.

31
A9 Documents and Records
  • Specify whether status/progress reports and final
    reports are required.
  • Specify or reference all applicable requirements
    for the final disposition of records/documents,
    including location and retention time.
  • Identify the individuals who are responsible for
    preparing project documents, records and reports
    also identify who within EPA will receive this
    information.

32
Class B Topics - Overview
  • Discuss all aspects of data collection and
    generation
  • Describe the sampling design and provide the
    rationale for your approach
  • Specify the analytical measurements - both field
    and fixed laboratory
  • Describe sample handling and chain-of-custody
    requirements
  • Specify QA/QC samples with acceptance criteria

33
Class B Topics
  • B1 - Sampling Process Design
  • B2 - Sampling Methods
  • B3 - Sample Handling and Custody
  • B4 - Analytical Methods
  • B5 - Quality Control
  • B6 - Instrument/Equipment Testing, Inspection and
    Maintenance
  • B7 - Instrument/Equipment Calibration and
    Frequency
  • B8 - Inspection/Acceptance of Supplies and
    Consumables
  • B9 - Non-Direct Measurements
  • B10 - Data Management

34
B1 Sampling Process Design (Experimental Design)
  • Describe the experimental data generation or data
    collection design for the project, including as
    appropriate:
  • The types and numbers of samples required
  • The design of the sampling network
  • The sampling locations, frequency of collection
    at each location and sample matrices
  • The measurement parameters of interest, and
  • The rationale for the sampling design chosen.

35
Sampling Designs Should be Consistent with your
Conceptual Models!!
  • Evaluate your underlying assumptions - whether
    they are conscious or unconscious.
  • Use a statistical or sampling tool such as
    Visual Sample Plan to test your sampling design.
  • Use historical data, if available, to determine
    the actual distribution of contaminants.

36
B1 Sampling Designs
  • Directed Sampling Designs
  • Judgmental Sampling
  • Probability Sampling Designs
  • Simple Random
  • Systematic/Grid
  • Stratified
  • Composite
  • Adaptive
  • Collaborative (Double)
  • Hot Spot

37
Judgmental Sampling Design - Pros
  • Judgmental sampling is the subjective selection
    of sampling locations in space and time by an
    individual analyst or expert.
  • Consistent with intuitive feeling
  • Easy to direct, easy to do
  • May be cost effective if the conceptual site
    model for the project is correct
  • Great if you know absolutely everything there is
    to know about the site and your conceptual site
    model is absolutely correct.

38
Judgmental Sampling Design - Cons
  • Inference from sample to population is
    questionable
  • Use of an incorrect conceptual model can lead to
    incorrect decisions and can be a disaster
  • Not suitable for estimating underlying population
    parameters (e.g., mean) with specified confidence.
    Statistics cannot be used to evaluate the
    distribution of the data with any degree of
    confidence with this sampling design; there is no
    underlying assumption that the data are normally
    distributed
  • Not suitable for testing hypotheses about
    underlying populations with specified decision
    error rates

39
Simple Random Sampling - Pros
  • Simple in concept and provides the proper
    theoretical support for statistical data
    analysis; representative sampling locations are
    chosen using the theory of random chance
    probabilities
  • Protects against bias in estimating parameters
    (e.g., means) and testing hypotheses
  • Is the basic building block of more complicated
    (and efficient) sampling designs.

40
Simple Random Sampling - Cons
  • Ignores available information that could be used
    to develop more cost-effective sampling designs
  • Not as effective as other designs for delineating
    patterns of contamination or finding hot spots
  • Difficult to find randomly selected sampling
    locations
  • Tends to demand large numbers of samples

41
Systematic (Grid) Sampling
  • Systematic (grid) sampling consists of collecting
    samples according to a specified pattern at
    regular intervals in space or time within a grid
    pattern
  • Square or rectangular grid patterns over space
  • Equal-interval sampling along a straight line
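
As a rough illustration of how the simple random and systematic (grid) designs generate sampling locations, here is a minimal sketch assuming a hypothetical 100 m by 100 m site; the sample count and 20 m grid spacing are arbitrary values chosen for the example, not values from this presentation.

    import random

    SITE_X, SITE_Y = 100.0, 100.0   # hypothetical site dimensions (metres)

    def simple_random_locations(n, seed=1):
        # Representative locations chosen purely by random chance.
        rng = random.Random(seed)
        return [(rng.uniform(0, SITE_X), rng.uniform(0, SITE_Y)) for _ in range(n)]

    def frange(start, stop, step):
        # Small helper to step through grid coordinates.
        while start < stop:
            yield start
            start += step

    def systematic_grid_locations(spacing, seed=1):
        # Square grid at a fixed spacing, offset by one random starting point.
        rng = random.Random(seed)
        x0, y0 = rng.uniform(0, spacing), rng.uniform(0, spacing)
        return [(x, y)
                for x in frange(x0, SITE_X, spacing)
                for y in frange(y0, SITE_Y, spacing)]

    print(simple_random_locations(5))             # 5 random sampling points
    print(len(systematic_grid_locations(20.0)))   # grid points at 20 m spacing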

42
Systematic Sampling - Pros
  • Easy to explain and implement and provides
    uniform coverage of site or project
  • Good for estimating boundaries, trends, or
    patterns of contamination over space or time.
  • May yield more precise estimates of population
    parameters than other sampling designs
  • Required for statistical data analysis to
    estimate trends and spatial patterns

43
Systematic Sampling - Cons
  • Systematic sampling can cause estimated means to
    be biased if the sampling grid pattern lines up
    with any pattern of contamination.
  • More information is needed (than for simple
    random sampling) about the population to estimate
    the variance of the estimated mean.

44
Stratified Sampling
  • The target population is divided meaningfully
    into contiguous sub-populations called strata
  • Sampling locations are selected independently
    within each stratum using some sampling design

45
Stratified Sampling - Pros
  • Dramatically reduces the variability present in
    the population and hence improves precision
  • Enables estimates of individual areas to be made
  • Assists in providing good coverage of the project
  • Allows for increased samples from policy or
    project sensitive areas

46
Stratified Sampling - Cons
  • Requires advance knowledge of the site in order
    to divide the study area into roughly homogeneous
    strata before sampling
  • The number of samples to be taken in each stratum
    must be determined
  • If strata boundaries are inaccurate, apparent
    outliers can result from samples falling in the
    wrong stratum
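
One common way to settle how many samples go to each stratum is proportional allocation, where a stratum's share of the samples equals its share of the site. The sketch below is illustrative only; the strata names and areas are hypothetical, and the presentation does not prescribe this particular allocation rule.

    def allocate_samples(total_n, strata_areas):
        # Proportional allocation: samples per stratum in proportion to area.
        total_area = sum(strata_areas.values())
        return {name: round(total_n * area / total_area)
                for name, area in strata_areas.items()}

    # Hypothetical strata (areas in hectares). Rounding can shift the total
    # by a sample or two, which would be adjusted by hand in practice.
    print(allocate_samples(30, {"former lagoon": 0.5,
                                "drainage ditch": 0.2,
                                "background soil": 2.3}))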

47
Composite Sampling
  • Many individual (grab) samples are combined and
    thoroughly mixed to make a homogeneous whole.
  • At random, sub-samples (composite samples) are
    made and sent to the laboratory for analysis.
  • The physical size of composite samples is the
    same as that of samples obtained at random.
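
A minimal numerical sketch of that idea, assuming perfect mixing and no loss of analytes during handling: eight hypothetical grab results are combined, and a single analysis of the composite approximates the mean of the grabs for one laboratory analysis instead of eight.

    import random
    import statistics

    rng = random.Random(42)

    # Hypothetical grab-sample concentrations (mg/kg) across one sampling unit.
    grabs = [rng.lognormvariate(1.0, 0.5) for _ in range(8)]

    # With thorough mixing, analyzing one aliquot of the composite
    # approximates the arithmetic mean of the individual grab samples.
    composite_estimate = statistics.fmean(grabs)

    print(f"8 grabs combined, 1 analysis -> about {composite_estimate:.2f} mg/kg")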

48
Composite Sampling - Pros
  • Allows for estimating the mean concentration with
    the same precision at a lower cost
  • Provides better coverage of the study site
    without increasing the number of chemical
    analyses
  • Allows for a more representative sample from a
    basic area of sample support (sampling unit).
  • Can be used in combination with other sampling
    designs.

49
Composite Sampling - Cons
  • Information on individual samples used to form
    composite samples is lost in compositing
  • Potential for loss of contaminants (volatiles)
    during the mixing and handling phase
  • Potential for reactions and interactions among
    analytes during compositing
  • Need to decide how many grab samples to
    composite and how many composite samples to
    send for analysis

50
WHY IS YOUR SAMPLING DESIGN IMPORTANT?
  • UNCERTAINTY!!!
  • UNCERTAINTY!!!
  • UNCERTAINTY!!!
  • Is it Due to the Variability Between Analytical
    Results Within a Given Data Set?
  • OR
  • Due to Sampling Issues?

51
And The Correct Answer Is
  • BOTH!!!
  • ANYTHING ELSE??? You bet. In addition to HOW you
    collected the samples, there is the important
    issue of WHERE you collected your samples. This
    relates back to your sampling design and the
    assumptions you made concerning site conditions,
    which in turn directed the development of your
    conceptual site model. These issues could have
    greatly increased your uncertainty and may lead
    to a wrong decision. A wrong sampling design and
    a flawed conceptual site model will lead to
    DECISION ERROR.

52
Bottom Line!!
  • It is understandable that analytical studies,
    with their sophisticated instrumentation and high
    cost, are often perceived as the dominant element
    in a site characterization project/study. Yet,
    despite that sophistication and high cost,
    analytical data generated under a scientifically
    defective or unsound sampling design will have
    limited utility.

53
The Best Result
  • Data Set Distribution - Normality

54
Normal Distributions and the Central Limit Theorem
  • The normal distribution is one which appears in a
    variety of statistical applications. One reason
    for this is the central limit theorem. This
    theorem tells us that sums of random variables
    are approximately normally distributed if the
    number of observations is large. For example, if
    we toss a coin, the total number of heads
    approaches normality if we toss the coin a lot of
    times. Even when a distribution may not be
    exactly normal, it may still be convenient to
    assume that a normal distribution is a good
    approximation. In this case, many statistical
    procedures, such as the t-test can still be used.
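
The coin-toss example can be checked with a short simulation; the toss and trial counts below are arbitrary and are used only to make the bell-shaped clustering visible.

    import random
    import statistics

    rng = random.Random(0)

    # Total number of heads in 1,000 fair tosses, repeated for 2,000 trials.
    # The totals cluster around 500 with an approximately normal spread.
    totals = [sum(rng.random() < 0.5 for _ in range(1000)) for _ in range(2000)]

    print("mean of totals:", round(statistics.fmean(totals), 1))        # close to 500
    print("std. dev. of totals:", round(statistics.pstdev(totals), 1))  # close to 15.8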

55
Ranked Set Sampling - A Combination of Statistics
and Expert Judgment
  • A sampling design where expert judgment is used
    in combination with simple random sampling
  • Simple random sampling is used to create a large
    number of potential samples. The expert then
    ranks these potential samples and selects which
    to send for analysis.
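
A minimal sketch of that two-stage idea follows, with an inexpensive field-screening value standing in for the expert's ranking; the candidate locations, set size and screening variable are all hypothetical.

    import random

    rng = random.Random(7)

    def ranked_set_sample(candidates, set_size, screen):
        # Draw set_size groups of set_size candidates, rank each group by the
        # screening value, and send the i-th ranked unit from the i-th group
        # to the laboratory.
        chosen = []
        for i in range(set_size):
            group = rng.sample(candidates, set_size)
            group.sort(key=screen)
            chosen.append(group[i])
        return chosen

    # Hypothetical candidate locations with a cheap field PID reading.
    locations = [{"id": k, "field_pid_ppm": rng.uniform(0, 50)} for k in range(60)]
    picked = ranked_set_sample(locations, 3, screen=lambda loc: loc["field_pid_ppm"])
    print([loc["id"] for loc in picked])   # 3 locations chosen for laboratory analysis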

56
Ranked Set Sampling - Pros and Cons
  • Pros
  • Better representativeness through using experts
  • Better precision than random sampling
  • Same simple formulae to use
  • Cons
  • Increased cost of the expert ranking samples
  • Difficult to quantify the exact improvement
  • Need to find the best variable on which to do the
    ranking
  • ...but the pros definitely outweigh the cons

57
B2 Sampling Methods
  • Describe the procedures for collecting samples
    and provide SOPs
  • Specify sampling methods and equipment
  • Provide sample container, volume, preservation,
    and holding time requirements
  • Describe the decontamination procedures
  • Provide a list of sampling equipment
  • Describe performance requirements for sampling
    methods
  • Identify the location of support facilities
  • Identify the individuals who are responsible for
    implementing corrective actions during field
    sampling activities

58
B3 Sample Handling and Custody
  • Describe the requirements for sample handling and
    custody in the field, laboratory, and during
    transport, taking into account your holding time
    requirements.
  • Include sample handling requirements for
    packaging, transporting and storing the collected
    samples.
  • Provide examples of sample labels, custody forms,
    sample custody logs and custody seal.

59
B4 Analytical Methods
  • Identify the analytical methods, instruments and
    equipment required.
  • Discuss how laboratory staff are to sub-sample
    the collected environmental sample.
  • Identify the contaminants of concern and specify
    the extraction, digestion and analytical method
    for each contaminant
  • Specify the laboratory decontamination and waste
    disposal procedures
  • Identify the individuals who are responsible for
    implementing corrective actions when problems are
    encountered during extraction, digestion or
    analysis of the samples.
  • Specify the detection limit requirements for each
    contaminant.
  • Provide the regulatory standard(s) (action
    limits, ARARs, MCLs, water quality standards,
    etc.).

60
B5 Quality Control
  • Identify QC activities needed for each sampling,
    analysis, or measurement technique. For each
    required QC activity, list the associated method
    or procedure, acceptance criteria, and corrective
    action.

61
B5 Quality Control Samples
  • Specify the type and frequency of quality control
    sample collection or QC activity
  • Blanks
  • Spikes (MS/MSDs)
  • Duplicates
  • Standard Reference Materials
  • Rinsates/Equipment Blanks
  • Second Column Confirmation

62
B5 Quality Control Samples
  • Specify the acceptance criteria for spike
    recoveries and the precision requirements.
  • Specify the frequency of QC sample collection and
    analysis.
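
Those criteria are usually written against matrix spike percent recovery and the relative percent difference between duplicate results. The sketch below uses the standard formulas; the 75 to 125 percent recovery window and 20 percent RPD limit are placeholder values for illustration, not limits taken from this presentation.

    def percent_recovery(spiked_result, unspiked_result, spike_added):
        # Matrix spike recovery: (spiked - unspiked) / amount spiked x 100.
        return 100.0 * (spiked_result - unspiked_result) / spike_added

    def relative_percent_difference(x1, x2):
        # Precision between duplicate results (e.g., MS/MSD pairs).
        return 100.0 * abs(x1 - x2) / ((x1 + x2) / 2.0)

    # Placeholder acceptance criteria; the QAPP must state the
    # project-specific limits.
    RECOVERY_LIMITS = (75.0, 125.0)   # percent
    MAX_RPD = 20.0                    # percent

    rec = percent_recovery(spiked_result=48.0, unspiked_result=10.0, spike_added=40.0)
    rpd = relative_percent_difference(48.0, 45.0)
    print(f"recovery {rec:.0f}%, within limits: {RECOVERY_LIMITS[0] <= rec <= RECOVERY_LIMITS[1]}")
    print(f"RPD {rpd:.1f}%, acceptable: {rpd <= MAX_RPD}")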

63
B6 Testing, Inspection and Maintenance
  • Identify the instruments/equipment requiring
    testing, inspection and maintenance during data
    collection operations (both field and fixed
    laboratory).
  • Provide the testing, inspection and maintenance
    procedures and identify the individuals who are
    responsible for these tasks.
  • Specify the frequency of instrument and equipment
    testing, inspection and maintenance.
  • Discuss the corrective actions necessary when
    instruments and equipment no longer function as
    required.
  • Identify the location of spare parts for
    repairing items.

64
B7 Calibration and Frequency
  • Identify all tools, gauges, instruments and other
    sampling, measuring and test equipment used for
    data generation or collection activities
    affecting data quality; these must be controlled
    and, at specified intervals, calibrated to
    maintain performance within specified limits.

65
B7 Calibration and Frequency
  • Identify the instruments/equipment requiring
    calibration.
  • Describe the calibration procedures and identify
    the standards used during calibration.
  • Specify the frequency of calibration and specify
    the acceptance criteria for calibrations (for all
    instruments/equipment).
  • Identify the individuals who are responsible for
    performing the instrument/equipment calibrations.

66
B8 Supplies and Consumables
  • Identify the supplies and consumables that are
    used during field data collection operations.
  • Supplies and consumables would include
    calibration solutions/standards, calibration
    gases, reagents, tubing and hoses, de-ionized
    water, potable water, electronic storage media
    (data loggers), etc.
  • Specify the acceptance and rejection criteria for
    each item.
  • Identify the individuals who will inspect
    supplies and consumables to ensure that they meet
    the relevant acceptance criteria.

67
B9 Non-Direct Measurements
  • Identify any types of data needed for project
    implementation or decision making that are
    obtained from non-measurement sources such as
    computer databases, programs, literature
    searches, surveying data, historical
    data/databases, modeling, etc.
  • Describe the intended use of these data, define
    the acceptance criteria for their use in the
    project, and specify any limitations or
    restrictions on their use.

68
B10 Data Management
  • Describe the project data management process,
    tracing the path of the data from their
    generation to their final use or storage (e.g.,
    the field, the office and/or the laboratory).
  • Describe or reference the standard record-keeping
    procedures, document control system, and the
    approach used for data storage and retrieval on
    electronic media.
  • Discuss the control mechanism for detecting and
    correcting errors and for preventing loss of data
    during data reduction, data reporting, and data
    entry to forms, reports and databases. Provide
    examples of any forms or checklists to be used.
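
One simple control of the kind described is a completeness check on incoming data files. The sketch below is illustrative only; the file layout and field names are hypothetical, not a format defined by this presentation.

    import csv

    REQUIRED_FIELDS = ["sample_id", "analyte", "result", "units"]  # hypothetical layout

    def check_results_file(path):
        # Flag rows with missing required fields or non-numeric results so
        # errors are detected and corrected before the data are loaded.
        problems = []
        with open(path, newline="") as fh:
            for line_no, row in enumerate(csv.DictReader(fh), start=2):
                missing = [f for f in REQUIRED_FIELDS if not row.get(f)]
                if missing:
                    problems.append(f"row {line_no}: missing {missing}")
                    continue
                try:
                    float(row["result"])
                except ValueError:
                    problems.append(f"row {line_no}: non-numeric result {row['result']!r}")
        return problems

    # Example use: problems = check_results_file("lab_results.csv")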

69
B10 Data Management
  • Identify and describe all data handling equipment
    and procedures used to process, compile and
    analyze the data, including:
  • Computer hardware
  • Computer software
  • Software configurations
  • Secondary data sources

70
B10 Data Management
  • Describe the procedures that will be followed to
    demonstrate acceptability of the hardware and
    software configuration required, and describe the
    process for assuring that applicable information
    resource management requirements are satisfied.
  • Discuss how your organization will comply with
    EPA data management requirements as specified in
    EPA Order 2180.1 or newly issued data standards.

71
Class C Topics - Overview
  • The topics in this group address the activities
    for assessing the effectiveness of project
    implementation and associated QA/QC activities.
    The purpose of assessment is to ensure that the
    QAPP is implemented as prescribed.

72
Class C Topics
  • C1 Assessment Response Actions
  • C2 Reports to Management

73
C1 Assessments Response Actions
  • Describe each assessment to be used in the
    project including the frequency and type.
  • Assessments include, but are not limited to,
    surveillance, management systems reviews,
    readiness reviews, technical systems audits,
    performance evaluations, audits of data quality
    and data quality assessments.
  • Discuss the information expected and the success
    criteria (i.e., goals, performance objectives,
    acceptance criteria, specifications, etc.).

74
C1 Assessments Response Actions
  • List the approximate schedule of assessment
    activities.
  • For any planned self assessments (utilizing
    personnel from within the project groups)
    identify potential participants and their exact
    relationship within the project organization.
  • For independent assessments, identify the
    organization and the person(s) that shall perform
    the assessments if this information is available.
  • Describe how and to whom the results of each
    assessment shall be reported.
  • Discuss how corrective actions will be
    implemented, documented, tracked and verified for
    closure.

75
C2 Reports to Management
  • Identify the frequency and distribution of
    reports issued to inform management (EPA or
    otherwise) of the project status, or to inform
    them of the results of performance evaluations
    and systems audits, data quality assessments, and
    significant data quality issues.
  • Identify the preparer and the recipients of the
    reports and any specific actions recipients are
    expected to take as a result of the reports.

76
Class D Topics - Overview
  • The topics in this group address the QA
    activities that occur after the data collection
    phase of the project is completed.
    Implementation of these elements determines
    whether or not the data conform to the specified
    criteria, thus satisfying the project objectives.

77
Class D Topics
  • D1 - Data Review, Verification and Validation
  • D2 - Verification and Validation Methods
  • D3 - Reconciliation with User Requirements

78
D1 Data Review, Verification and Validation
  • Specify the criteria used to review and validate
    the data; that is, provide the acceptance and
    rejection criteria by which the data will be
    assessed to determine the quality of this
    information.
  • Provide a list of the data qualifier flags or
    qualifiers along with their respective
    definitions.

79
D2 Verification and Validation Methods
  • Describe the process to be used for verifying and
    validating the data, including the
    chain-of-custody for data throughout the life of
    the project or task.
  • Discuss how issues shall be resolved and the
    authorities for resolving such issues within the
    organization.
  • Describe how the results of data verification and
    validation are conveyed to end data users,
    decision makers and stakeholders.
  • Precisely define and interpret how validation
    issues differ from verification issues for this
    project.
  • Provide examples of any forms or checklists to be
    used and identify any project-specific
    calculations required.

80
D3 Reconciliation with User Requirements
  • Describe how the results obtained from the
    project or task will be reconciled with the
    requirements defined by the data user or decision
    maker.
  • Outline the proposed methods to evaluate the data
    and determine any possible anomalies or
    departures from the assumptions that were
    established in the planning phase of data
    collection.
  • Describe how reconciliation with user
    requirements will be documented, issues will be
    resolved, and how limitations on the use of the
    data will be documented, communicated and
    reported to decision makers and stakeholders.

81
Reference Page and Appendices
  • Reference Page - Contains a list of the
    references cited in the QAPP.
  • Appendices - Contain any relevant materials and
    documents that will support the QAPP.

82
QAS Contacts
  • Marilyn Maycock, Chief
  • (706) 355-8553
  • maycock.marilyn@epa.gov
  • Denise Goddard, Chemist
  • (706) 355-8568
  • goddard.denise@epa.gov

83
QAS Contacts
  • Charlie Appleby, Chemist
  • (706) 355-8555
  • appleby.charlie@epa.gov
  • Ray Terhune, Chemist
  • (706) 355-8557
  • terhune.ray@epa.gov