Transcript and Presenter's Notes

Title: The Need for a Common Language for Foresight


1
The Need for a Common Language for Foresight
  • Luke Georghiou
  • PREST, Manchester Business School, University of
    Manchester

2
Outline
  • Foresight as a policy instrument subject to
    transfer
  • Purposes of benchmarking and evaluation
  • Policy comparison
  • Policy learning
  • Benchmarking and indicators
  • Conclusions

3
Selective Chronology of National Foresight
4
Policy transfer
  • The process by which knowledge of policies,
    administrative arrangements, institutions and
    ideas in one political system (past or present)
    is used in the development of policies,
    administrative arrangements, institutions and
    ideas in another political system. (Dolowitz,
    2000)

5
Reasons for failure in policy transfer
  • Uninformed transfer
  • The borrowing country has insufficient
    information about the policy being transferred,
    with the result that the policy is imperfectly
    implemented.
  • Incomplete transfer
  • Crucial elements that made the policy or
    programme a success are not transferred.
  • Inappropriate transfer
  • Insufficient consideration is given to social,
    economic, political and ideological differences
    between the borrowing and the transferring
    country, leading to programme failure.

6
Approximate definitions of evaluation and
benchmarking
  • Evaluate - to ascertain value, to judge the worth
    of ...
  • Benchmarking - to find and implement good
    practice through comparing the performance of an
    organisation with that of others, especially with
    best practice
  • Common elements of comparison and measurement,
    but with different interpretations

7
Purposes of evaluation and benchmarking
  • Evaluation
  • Accountability
  • Legitimation/justification
  • Learning
  • Benchmarking
  • Control
  • Improving competitive position relative to
    benchmarks
  • Cooperative learning from others

8
Aims in this context
  • Why coordinate and compare foresight activity in
    Europe?
  • policy learning
  • policy synergies
  • critical mass, scale and scope
  • harmonised outputs
  • resolving subsidiarity issues
  • What are the clients (EU and national
    governments) expecting?

9
Benefits of policy comparison
  • Contextualisation of case studies (single
    programmes) to reduce bias from national cultural
    context
  • Test hypotheses about relations between key
    variables
  • e.g. that wider participation reduces creativity
  • Allow classification schemes to simplify
    discussion and identify common distinctive
    features
  • Make predictions about future activities or
    designs based on cross-cutting analysis

10
Pitfalls of policy comparison
  • Diverse and multiple variables
  • Most effects of foresight draw on multiple
    influences
  • Risks in drawing inference from small number of
    cases
  • Risk of assuming that the same terminology
    carries the same meaning in a different social or
    political context
  • e.g. the 'interested public' in Futur
  • There is also a language issue: translation into
    and out of English can create false convergence
  • e.g. 'appreciation' in the UK, 'training' in
    Hungary
  • Drifts in levels of analysis
  • Comparing a national initiative rooted at the
    centre of government with one organised by an
    agency without full national legitimacy

11
Bases for comparison
  • Similar programmes, different outcomes
  • Different programmes, similar outcomes
12
Policy learning
  • Imitation
  • Do almost exactly the same
  • Adaptation
  • Follow model but adapt to context
  • Combination
  • Combining elements of two or more programmes
  • Inspiration
  • Taking experience of others as starting point for
    new design
  • Elimination
  • Realisation that approach not valid for own
    circumstances

13
Conditions favouring learning
  • Instrument
  • Low specificity
  • Clarity of instrument and availability of
    information on detail of operation
  • Sufficiently innovative not to clash with
    existing similar activity
  • Environment
  • Similar institutional setting
  • Possibly deriving from EU membership obligations
  • Ideological/philosophical appeal to political
    sponsors
  • A time of change in which new policy ideas are
    being sought
  • Similar available resource levels
  • Similar available expertise levels
  • Methodological/ operational
  • Domain expertise

14
Specific pitfalls in benchmarking
  • Inappropriate choice of comparator
    countries/programmes
  • Is EU-wide comparison valid given diversity? If
    not, how should countries be clustered?
  • Wrong performance measures
  • See next slide
  • Data not available
  • Data may not exist, or cooperative channels to
    obtain it may not be in place or not deep enough
  • Data not reliable
  • Source may be biased
  • Wrong causality
  • Apparent attribution of effects may be wrong
  • Unusable outcome
  • Identified best practice not implementable for
    reasons given earlier
  • Costs of benchmarking exceed benefits

15
Limitations of performance indicators
  • Main problem is that they often measure what is
    measurable rather than what is needed
  • A crudely constructed regime may distort
    performance (Goodhart's Law) or be subject to
    manipulation (Gibbons' Law)
  • Current vogue for performance indicators is both
    a threat and an opportunity
  • Evaluations must be located in systemic context
  • Basic requirement for a performance indicator
    regime is clear understanding of context, goals
    and relationships between goals and effects
  • Logic model approach in evaluation is a useful
    tool in this context
  • Emphasis on holistic approach to programme and
    its interdependencies
  • Important to compare across full cycle from
    rationale to implementation

16
Conclusions
  • Benchmarking foresight is already complicated by
    previous policy transfers
  • Nevertheless substantial rewards from transparent
    evaluations and effective comparison and learning
  • Clear contextual and process understanding
    depends on ...
  • ... the need for a common language