1

MONITORING AND EVALUATION FRAMEWORK (Implications for Government)
  • Presented at the Senior Management Service Conference
  • by Kefiloe Masiteng
  • 21 September 2004

2
INTRODUCTION
  • MANDATE
  • Monitoring and evaluation of government performance has been identified as a responsibility of the PCAS.
  • The Presidency may evaluate the performance of government against set goals and targets, the equitableness of resource allocation, and effectiveness and efficiency in service delivery across all levels.
  • OBJECTIVE
  • To inform the steps to be undertaken in creating a conducive environment that enables all levels of government, the private sector, communities and individuals to achieve their respective service delivery goals and improve performance.

Hence the need to assess essential capacity in government.
3
Monitoring versus Evaluation
  • MONITORING
  • Tracking changes in program performance over time
  • EVALUATION
  • Attributing program outcomes to their causes

4
MONITORING
  • WHY MONITOR
  • Monitoring in the government-wide framework refers to a set of activity- and milestone-tracking techniques, all of which measure some aspect of government performance, including the current status of, and change over time (trend analysis) in, any of the initiatives.
  • Monitoring tracks changes in services provided (outputs) and the desired results (outcomes), providing the basis for accountability in the utilization of resources.
  • BENEFITS
  • Monitoring can be put in place as a management tool and sustained over time. It can be used to improve initiatives by identifying aspects that are working according to plan and yielding positive results, while also identifying initiatives that need mid-course corrections.

POA: monitoring of the progress made in attaining the goals set in the SONA and Makgotla. Cluster POAs: bi-monthly reporting to Cabinet.
5
MONITORING COMPONENTS
  • Monitoring processes
  • Development and definition of indicators to measure the progress made towards meeting relevant objectives
  • Data collection mechanisms for, and monitoring systems to collate data on, indicators
  • Data verification, validation and system clean-up
  • Data analysis to determine outputs, outcomes and trends
  • Report writing on the progress made in implementation
  • Distribution and feedback mechanisms across the entire spectrum of relevant stakeholders
  • Capacity needed for government
  • Understanding of the POA on the government website
  • Ability to develop relevant indicators for the initiatives and interventions arising from the cluster POA
  • Information collection strategy on the developed indicators
  • Analysis and verification of collected information
  • Report writing
  • Communication link with GCIS

6
DEVELOPMENT AND DEFINITION OF RELEVANT INDICATORS
  • Indicator development is based on the goals and objectives of government
  • These indicators may be calculated on the basis of descriptions and formulae allocated to measure progress made (monitoring) or to determine causality (impact assessment). Each indicator definition covers (see the sketch after this list):
  • Method for acquiring information on indicators
  • Responsibility for collection
  • Information/data source
  • Frequency of updating
  • Agreement on evaluation methods
  • Role of Presidency
  • Spearheading indicator development based on the POA
  • Setting in place collection, collation and report-back/feedback mechanisms
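As a minimal sketch of what such an indicator definition might look like in practice (the Indicator class and its field names are illustrative assumptions, not part of the framework itself):

```python
from dataclasses import dataclass

@dataclass
class Indicator:
    """Metadata for one POA indicator (illustrative field names)."""
    name: str                 # e.g. "Housing units completed"
    formula: str              # description/formula used to calculate it
    data_source: str          # where the underlying data comes from
    collection_method: str    # how information on the indicator is acquired
    responsible_party: str    # who is responsible for collection
    update_frequency: str     # e.g. "bi-monthly" to match Cabinet reporting

# Example entry, using the housing programme that appears in later slides
units_completed = Indicator(
    name="Housing units completed",
    formula="count of units signed off in the reporting period",
    data_source="departmental project information system",
    collection_method="administrative records",
    responsible_party="national housing department",
    update_frequency="bi-monthly",
)
```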

7
IMPACT ASSESSMENT
  • The purpose of impact assessment is primarily to measure the degree of change attributable to a particular initiative or intervention.
  • Impact assessment addresses the question of causality.
  • What differentiates the two processes are the techniques used: monitoring might involve only trend analysis, whereas impact assessment relies on more formal analytic techniques.
  • It determines how much of the observed change in the outcome (quality of life, access to services, etc.) at the population level can be attributed directly to the implementation of government policies and programmes, and not to other factors.
  • The level of analysis for assessing the impact of government policies and programmes is the population (beneficiaries).
  • Lessons from the Ten Year Review are crucial
  • Planning for future government reviews based on TYR indicators
  • Development of mid-term review indicators based on the MTSF
  • Review and refinement of current TYR indicators

8
IMPACT ASSESSMENT PROCESSES
  • Development of assessment frameworks (modeling)
  • Collection and collation of data from different sources in relation to the developed models
  • Regression analysis (logistic, multivariate, etc.) on dependent and independent variables (a toy sketch follows below)
  • Interpretation of results/findings to determine relationships
  • Report writing on the impact of government interventions on the population
  • Distribution of reports to relevant stakeholders
  • Implications for Presidency
  • Advanced policy analysis skills
  • Advanced data analysis skills
  • Basic data mining
  • Basic statistical modeling skills
  • Econometrics
  • Demographic modeling

(Work with Treasury on economic models.) (Work with departments to compile a compendium of indicators.) (Work with Stats SA on demographic/population dynamics and the NSS.)
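A toy sketch of the kind of regression analysis listed above, using statsmodels. The variable names and synthetic data are assumptions for illustration only; a real impact assessment would also need a credible identification strategy, not just a regression:

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 500

# Synthetic beneficiary-level data: did the household gain access to
# adequate housing (dependent variable), given programme exposure and
# another factor (independent variables)?
exposed = rng.integers(0, 2, n)          # 1 = reached by the programme
income = rng.normal(0, 1, n)             # a confounding factor
latent = 0.8 * exposed + 0.5 * income + rng.logistic(size=n)
access = (latent > 0).astype(int)

X = sm.add_constant(np.column_stack([exposed, income]))
result = sm.Logit(access, X).fit(disp=False)

# The coefficient on `exposed` estimates the association between the
# intervention and the outcome, holding income constant.
print(result.params)
```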
9
DEVELOPMENT OF DATA COLLECTION MECHANISMS FOR INDICATORS
  • Role of Government
  • Data collection
  • Verification
  • Validation
  • Report writing
  • To enable comparisons (demographic, social, economic, financial and corporate governance) across provinces, population groups, gender and age groups around government sectors, over time and space.
  • The data collected on indicators will thus have to accommodate such comparisons and be disaggregated within the developed systems and databases according to the above-mentioned categories, especially the GDC (a minimal sketch of such disaggregation follows below).
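A minimal pandas sketch of disaggregating indicator data along the categories the slide names. The column names and figures are invented for illustration:

```python
import pandas as pd

# Illustrative indicator records; column names and values are assumptions.
records = pd.DataFrame({
    "province": ["Gauteng", "Gauteng", "Limpopo", "Limpopo"],
    "gender":   ["female", "male", "female", "male"],
    "year":     [2003, 2003, 2003, 2003],
    "units_completed": [1200, 950, 400, 380],
})

# Disaggregate the same indicator by province and gender so that
# comparisons over time and space remain possible.
by_group = records.groupby(["province", "gender"])["units_completed"].sum()
print(by_group)
```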

10
INFORMATION MANAGEMENT
  • The use of information systems in monitoring provides a reliable flow of information, allowing management to keep abreast of progress in the implementation of policy thrusts, programmes and activities based on decisions made in different gatherings.
  • Information systems facilitate assessment of the quality, quantity and timeliness of policy and programme inputs, while operational constraints on programme and policy effectiveness are identified so that gaps may be addressed.
  • They may further provide contextual information for evaluation processes.
  • Role of Presidency
  • Reporting formats from FOSAD (the project card)
  • EIMS: roll-out, training, commitment
  • Integration with the NSS (urgent to review)

11
Three models may be applied in monitoring and evaluation activities in Government
  • High-level tertiary model: this model can be informed by the State of the Nation Address, Cabinet decisions and cluster priorities
  • Government-level monitoring and evaluation (PCAS): this model measures the progress made by government as a whole in addressing the objectives and implementing priority programmes
  • Departmental monitoring and evaluation initiatives: this level addresses the progress made by individual departments in implementing their programmes in line with government priorities. These include indicators to measure programme-level objectives (outputs), developed within each department and informed by their strategic frameworks

12
Government model for systems integration at national level
[Diagram linking: Govt. POA; Executive Information (CabEnet); Planning; M&E; Government Statistics (departmental routine systems, National Statistical System); programme-level statistics/information (departmental information systems); Provincial and Local Government M&E]
13
OPERATIONALISING GOVERNMENT-WIDE MONITORING AND EVALUATION (PHASE 1)
  • Five results, each of which will be delivered as a report:
  • A review of existing public service monitoring and evaluation systems. An early step in the process of creating a national monitoring and evaluation system for government will involve reviewing existing departmental M&E systems so that existing capacity and capability are properly drawn upon.
  • A review of government reporting requirements, procedures and needs.
  • A review of progress in the development and implementation of government-wide M&E systems by central or coordinating departments.
  • Results of consultations with all provincial administrations and FOSAD clusters on their performance indicators.
  • A logic model and framework architecture for a national M&E framework (including a dashboard-style presentation of a national scorecard).

14
Monitoring and Evaluation
15
Monitoring and Evaluation
16
Monitoring and Evaluation
17
ROLE OF COORDINATING DEPARTMENTS
  • Institutions at the centre of government (i.e. PCAS, OPSC, National Treasury, DPSA, DPLG) need to take the initiative in designing performance assessment systems for the whole of government.
  • These should link clearly into the Medium Term Expenditure and Strategic Frameworks, and should show how assessments and evaluations should deliver useful information with practical recommendations.
  • Such transversal systems could include:
  • Good governance (OPSC/Presidency)
  • Value for money (National Treasury)
  • Service delivery (DPLG/OPSC)
  • Human resource utilization (DPSA)
  • An early warning system (DPSA/Presidency)

18
ROLE OF SECTOR DEPARTMENTS
  • The government-wide M&E system will be operationalised on the understanding that each individual department will take responsibility for its own monitoring and evaluation processes according to the guidelines and standards mentioned above.
  • Monitoring is meant to take place at three different levels:
  • Implementation monitoring, evaluation, early warning and data collection at all three spheres of government, using input, output and outcome indicators
  • Monitoring of national departmental inputs, outputs and outcomes by the coordinating departments (PCAS, OPSC, National Treasury, DPSA, DPLG)
  • Monitoring of process inputs, outputs and outcomes by the departments themselves
  • Evaluation will also take place at these three levels but will be restricted to process and impact analysis.

19
ROLE OF PROVINCIAL AND LOCAL GOVERNMENT
  • Operationalisation of the framework will comprise provincial and departmental systems and the government-wide supplementary systems listed above, some of which still need to be developed. Work on such development should be considered a priority.
  • The government-wide M&E system will be operationalised on the understanding that individual provinces will take responsibility for their own monitoring and evaluation processes according to the guidelines and standards mentioned above.
  • The role of Premiers' offices in driving provincial M&E will also need a special focus. This highlights the need for the offices of the Premiers in all provinces to establish monitoring and evaluation processes and apply them to local government.

20
Components of Programme Monitoring and Impact Assessment
[Diagram: a logic-model chain, illustrated with a housing programme. INPUTS, PROCESSES and OUTPUTS sit at the program level and are the domain of monitoring; OUTCOMES sit at the population level and are the domain of impact assessment. A plain-code sketch of this chain follows below.]
  • INPUTS (resources): personnel, equipment, finance
  • PROCESSES (project cycle phases): 1. housing development process (access to land, land availability agreement); 2. planning process (layout, civil engineering design); 3. township establishment process (install civil engineering services, construct units); 4. hand-over process (keys to beneficiaries)
  • OUTPUTS (deliverables): serviced sites, subsidies approved, units completed, units under construction, projects approved, female-headed households, budget expenditure
  • OUTCOMES (impact): housing access, better lives for beneficiaries, objectives met
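The chain above can be held in a simple data structure. A minimal Python sketch using the housing example; the key names are assumptions chosen to mirror the four columns:

```python
# Plain-dict sketch of the housing logic model above; the keys mirror
# the four stages of the chain and are the only structure assumed here.
housing_logic_model = {
    "inputs": ["resources", "personnel", "equipment", "finance"],
    "processes": [
        "housing development (access to land, land availability agreement)",
        "planning (layout, civil engineering design)",
        "township establishment (install services, construct units)",
        "hand-over (keys to beneficiaries)",
    ],
    "outputs": ["serviced sites", "subsidies approved", "units completed"],
    "outcomes": ["housing access", "better lives", "objectives met"],
}

# Monitoring covers inputs through outputs; impact assessment asks
# whether the outcomes can be attributed to the programme.
monitoring_scope = {
    k: housing_logic_model[k] for k in ("inputs", "processes", "outputs")
}
print(monitoring_scope)
```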
21
RELEVANT SKILLS AND RESOURCES REQUIRED
  • Research
  • Statistical/data analysis
  • Specialized software to perform modeling and other evaluation techniques
  • Research design for evaluation, which may include population surveys, community surveys or forums, focus groups, as well as randomized experiments
  • Policy analysis and report writing

22
Key Questions for Program M&E
  • Did the program achieve its objectives?
  • Were the results attributable to program efforts?
  • Which program activities were more or less important/effective?
  • Did the intended beneficiaries benefit from the program?
  • At what cost?
  • What is a program?
  • A nationally organized, often publicly sponsored, effort to deliver socio-economic services to target populations in need
  • Organizational systems activated for service delivery
  • Indefinite lifetime
  • Has an institutional host that is organic, of known size, adaptive, and operates in a changing environment

23
Scope of Program M&E
  • What level of program evaluation? National, subnational, specific site?
  • Implications for M&E design
  • Inference of results
  • Relevant time frame?
  • Relevant units of action?

25
Illustration of Program Monitoring
[Chart: a program outcome indicator plotted over time (TIME →), from program start to program end, showing the expected trajectory.]
26
Illustration of Program Monitoring
[Chart: the same outcome indicator over time, now asking where the actual trajectory lies relative to the expected one ("Actual?").]
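A minimal sketch of what this monitoring comparison amounts to in practice; the planned and actual values are invented for illustration:

```python
# Illustrative only: planned vs. actual values of a programme outcome
# indicator at successive reporting periods.
planned = [100, 200, 300, 400, 500]   # target trajectory
actual  = [100, 180, 240, 310, 360]   # observed values

for period, (p, a) in enumerate(zip(planned, actual), start=1):
    gap = a - p
    print(f"period {period}: planned={p} actual={a} gap={gap:+d}")
# A widening negative gap flags an initiative needing mid-course correction.
```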
27
Illustration of Program Impact
[Chart: change in a program outcome over time (TIME →), comparing the "with program" trajectory against the "without program" counterfactual, from program start to program end.]
28
Illustration of Program Impact
[Chart: the same comparison, with the gap between the "with program" and "without program" trajectories at program end labelled as the program impact.]
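In arithmetic terms, the chart says that impact is the outcome observed with the program minus the estimated counterfactual outcome without it. A toy sketch with invented numbers:

```python
# Illustrative only: at programme end, the observed outcome versus the
# counterfactual estimate (e.g. from a comparison group).
outcome_with_program = 360
outcome_without_program = 240

program_impact = outcome_with_program - outcome_without_program
print(program_impact)  # 120 units of change attributable to the programme
```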
29
The Role of the Logical/Strategic/Conceptual Framework
  • Logical vs. strategic vs. conceptual
  • Clarify the program objective/strategic outcome/dependent variable
  • Interrelate units, levels and directions of action
  • Allow for consensus-building around a common paradigm

30
Example of a Strategic Framework
[Diagram: a strategic objective/priority branching into Objective 1 and Objective 2, each with its own set of indicators (two indicators under one objective, three under the other); sketched as a nested structure below.]
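One way to hold such a framework in code is a nested structure. The assignment of indicators to objectives below is an assumption, since the diagram itself is ambiguous:

```python
# Nested-dict sketch of the strategic framework diagram; names and the
# indicator-to-objective assignment are assumed for illustration.
strategic_framework = {
    "strategic_objective": "Priority X",
    "objectives": {
        "Objective 1": ["Indicator 1", "Indicator 2"],
        "Objective 2": ["Indicator 1", "Indicator 2", "Indicator 3"],
    },
}

for objective, indicators in strategic_framework["objectives"].items():
    print(objective, "->", ", ".join(indicators))
```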
31
Example of a Conceptual Framework for a Structural Model
[Diagram: a structural model linking individual demand and program supply, through adequacy of delivery and output of delivery, to service utilization.]
32
Example of a Conceptual Framework: a Structural Model
[Diagram: the same structural model applied to housing, linking individual demand and program supply (supported by institutional capacity and technical inputs) through housing delivery to adequate housing, service utilization and self-sufficiency.]
33
Monitoring versus Evaluation
  • Can good monitoring lead to good evaluation?
  • Indicators: significant and influential factors
  • Framework: a theoretically sound model
  • Directionality: temporally correct causal flow
  • Levels: an appropriate hierarchy of units
  • Coupling of quantitative and qualitative assessment methods