Transcript and Presenter's Notes

Title: Scaling Up Agricultural R&D in Africa: Using Prizes to Reward Adoption of Successful Innovations


1
Scaling Up Agricultural R&D in Africa:
Using Prizes to Reward Adoption of Successful Innovations
William A. Masters, Purdue University
www.agecon.purdue.edu/staff/masters
2
Motivation: Why we need prizes
  • Africa needs farm productivity growth
  • worsening malnutrition and rising rural
    population
  • delayed adoption of improved varieties
  • Successful R&D aims for three Cs
  • Concordance (investing proportionally to size of
    target)
  • Complementarity (investing in things ignored by
    others)
  • Catch-up (investing in imitation of successes
    elsewhere)
  • To hit these targets, we need a mix of funding
    instruments
  • Grants and contracts to build capacity
  • IPRs to provide incentives for marketable
    innovations
  • Prizes to provide incentives for non-marketable
    innovations

3
Africa's food crisis is worsening
Figure 2. Data and projections for malnutrition
by region, 1995-2015
4
Undernutrition remains the world's leading health
burden, causing vulnerability to many diseases
Attribution of disease burden to major risk
factors (estimates for high-mortality developing
countries, 2000)
5
The rural poor are particularly undernourished
Stunting by residence and wealth
Source: FAO (2004), The State of Food Insecurity
in the World 2004. Rome: FAO.
6
Food production in Africa involves a wide
variety of crops
Food production in Africa by crop, 1961 and 2002
7
Food-crop output makes the difference
8
More and more Africans have no choice but to be
farmers
9
Africa faces an unusually severe and prolonged
demographic burden
10
Africa has a lot of catch-up to do
11
Existing techniques are not very profitable
12
The difference is not due to governance
13
The difference is linked to tech. adoption
14
and linked to low R&D investment
15
R&D levels vary across countries but have not
grown over time
16
R&D payoffs vary but are consistently high
Estimated return to agricultural research and
extension (%/year)
Source: Alston, J.M., M.C. Marra, P.G. Pardey,
and T.J. Wyatt. 2000. "Research returns redux: A
meta-analysis of the returns to agricultural
R&D." Australian Journal of Agricultural and
Resource Economics 44(2): 185-215.
17
but sustaining sufficient public investment has
been difficult!
18
How prizes would work
  • We propose to help donors
  • pay innovators proportionally to the economic
    gains from new technologies
  • to accelerate and extend innovations beyond what
    is now being achieved,
  • using verifiable data submitted by innovators
    after initial adoption,
  • subject to adjudication by expert panels.

19
Schematic Overview of the Proposal
Donors specify lines of credit for target domains
Innovators submit data on new techniques after adoption
The secretariat makes the market
  -- technical assistance
  -- dispute resolution
  -- payouts per measured gains
Other innovators make counter-claims and imitate successes
Other donors see successes, expand prizes and contracts
20
Step 1. Donors specify lines of credit
  • Target domains and institutional eligibility
  • Prizes are to be paid in proportion to net
    benefits,
  • after any value capture through input sales
  • to reward spillovers from private activity
  • Initially, use a fixed-period, variable-share
    system
  • accept all applications received by deadline
  • pay out proportionally
  • rate depends on funds available / measured
    benefits
  • e.g., with a $1 m. fund and $10 m. in
    benefits, the rate is 10% (see the sketch below)

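The pro-rata rule in that example can be written out in a few lines of Python. This is only an illustrative sketch of the slide's arithmetic; the function name and the cap at 100% are assumptions, not part of the proposal.

    # Illustrative sketch of the fixed-period, variable-share payout rule:
    # every eligible application is paid the same fraction of its measured
    # net benefits, so the fund is never overspent.

    def payout_rate(fund_available, total_measured_benefits):
        """Fraction of measured benefits paid out, capped at 100% (assumed)."""
        if total_measured_benefits <= 0:
            return 0.0
        return min(1.0, fund_available / total_measured_benefits)

    # Slide example: a $1 m. fund against $10 m. in measured benefits -> 10%.
    rate = payout_rate(1_000_000, 10_000_000)
    print(f"payout rate = {rate:.0%}")                                   # 10%
    print(f"prize on a $2 m. benefit claim = ${2_000_000 * rate:,.0f}")  # $200,000
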
21
Step 2. Innovators submit data
  • After initial adoption, innovators could submit
  • technology data, following guidelines
  • experimental data on input/output change
  • survey data on extent of adoption
  • market data on prices and quantities
  • assumptions and elasticities
  • agreement on attribution of effort, e.g.
  • Bt cotton -- 45% Monsanto, 45% NARS, 10% NGO
  • New forage -- 33% IARC, 33% NARS, 34% Extension

22
Data needed for prize application
to compute annual economic gains from an
innovation
23
Data needed (cont'd.)
to estimate adoption rates over time
[Chart: fraction of surveyed domain by year, from first release to the
application date -- first and later survey points joined by linear
interpolations, with a projection of at most 3 years (sketched below)]
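The chart's adoption path can be sketched as a simple interpolation. The function below is an assumed illustration of that construction, not the proposal's method: zero adoption in the release year, linear interpolation between survey points, and the last survey value held flat as a stand-in for the capped projection.

    import numpy as np

    def adoption_series(first_release, surveys, last_year):
        """Adoption fraction for each year from first release through last_year.

        surveys: {year: adoption fraction} from the applicant's survey data.
        Adoption is taken as zero in the release year, linearly interpolated
        between survey points, and held at the last surveyed level afterwards
        (a stand-in for the slide's capped projection).
        """
        survey_years = sorted(surveys)
        xs = [first_release] + survey_years
        ys = [0.0] + [surveys[y] for y in survey_years]
        years = np.arange(first_release, last_year + 1)
        return dict(zip(years.tolist(), np.interp(years, xs, ys).tolist()))

    # Example: released in 2000, surveyed at 10% adoption in 2002 and 25% in
    # 2004, with a projection allowed through 2007 (max. 3 years).
    print(adoption_series(2000, {2002: 0.10, 2004: 0.25}, 2007))
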
24
Data needed (cont'd.)
to cumulate gains over time
[Chart: discounted value (US$) by year from first release -- annual gains
within a statute of limitations (max. 5 yrs.?) and a projection period
(max. 3 yrs.?) are summed to an NPV at the application date, given a fixed
discount rate (sketched below)]
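A minimal sketch of that cumulation, assuming the 5% discount rate from the later guidelines and treating the 5-year statute of limitations and 3-year projection as simple cutoffs; the names and window handling are illustrative only.

    def npv_at_application(annual_gains, application_year, discount_rate=0.05,
                           lookback=5, projection=3):
        """Discount annual gains to the application date.

        annual_gains: {year: economic gain in US$}.  Only years within the
        assumed statute-of-limitations window (lookback) and projection
        period are counted; earlier gains are compounded forward, later
        (projected) gains are discounted back.
        """
        total = 0.0
        for year, gain in annual_gains.items():
            if application_year - lookback <= year <= application_year + projection:
                total += gain / (1 + discount_rate) ** (year - application_year)
        return total

    # Hypothetical gain stream for an application filed in 2004.
    gains = {2001: 200_000, 2002: 500_000, 2003: 800_000, 2004: 1_000_000}
    print(f"NPV at application date: ${npv_at_application(gains, 2004):,.0f}")
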
25
Guidelines for experimental data
  • Goal is to measure difference in outputs and
    inputs between the new technology and its
    alternative
  • The applicant must provide evidence of a standard
    similar to that used in international scientific
    journals of the relevant discipline
  • The applicant must maintain appropriate
    experimental records, and make them available to
    the prize secretariat on request.

26
Guidelines for survey data
  • Goal is to measure the extent of adoption in each
    year, as a fraction of some appropriate domain.
  • The applicant must provide evidence that the
    adoption domain has agroeconomic conditions
    similar to those at the experiment sites, and
    that the alternative techniques are similar.
  • The applicant must maintain appropriate survey
    records, and make them available to the
    secretariat on request.

27
Guidelines for market data
  • Goal is to measure the size of the market to
    which the adoption and productivity gain
    applies.
  • The applicant must provide evidence that this
    market is represented by the survey and
    experiment.
  • The applicant must use official data where
    available, and provide survey evidence otherwise.

28
Guidelines for assumptions and elasticities (see the sketch below)
  • Cost reduction is constant
  • experiments refer to the average supplier
  • Demand elasticity is zero
  • quantity produced is fixed
  • Supply elasticity is unitary
  • pct. changes in output and cost have equal value
  • Discount rate is 5%
  • moderate incentive for earlier results

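Under these assumptions the annual gain reduces to the cost saving on the adopted share of a fixed quantity of output. The sketch below is an assumed illustration of that arithmetic with hypothetical numbers, not the proposal's official formula.

    def annual_gain(price, quantity, cost_reduction_pct, adoption_rate):
        """Economic gain in one year under the slide's assumptions.

        With demand elasticity of zero, the quantity produced is fixed, so
        the gain is the per-unit cost saving on the adopted share of output.
        With unitary supply elasticity, a 1% yield gain is valued like a 1%
        cost reduction, so yield evidence can be entered as cost_reduction_pct.
        """
        return price * quantity * cost_reduction_pct * adoption_rate

    # Hypothetical example: a crop worth $200/t in a 500,000 t market, an 8%
    # unit-cost saving, adopted on 25% of the domain -> $2 m. gain that year.
    print(f"${annual_gain(200, 500_000, 0.08, 0.25):,.0f}")
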
29
Guidelines for attribution and prize-sharing
  • Prize shares submitted with initial application
  • Challengers' data may be accepted by applicants
    as an amendment to the initial application, or
    submitted separately for adjudication
  • Adjudication panel may accept part or all of the
    data submitted, and use data from other sources.

30
Step 3. Verification and adjudication
  • Secretariat response -- within 60 days (?)
  • Site visits, documentation requests (if any)
  • Others' challenges -- within 120 days (?)
  • Acceptance of amended proposals (if any)
  • Filing of alternative submissions (if any)
  • Adjudication panels -- within 180 days (?)
  • Appointment of panel from expert pool
  • Decisions from adjudication panels (if any)
  • Donor payouts -- within 210 days (?)

31
Implementation and Governance
  • Two governance models
  • Initially, use an existing institution
  • e.g. AATF, AERC, FARA or SROs
  • Eventually, build a fully independent
    secretariat
  • With an elected board of 12 trustees,
  • 4 elected by donors, 4 by researchers, and 4 by
    both,
  • each serving 4-year terms, so each group elects
    one per year,
  • and a ban on receiving funds while on the board,

  • all participants would have continuous incentive
    to maintain trustworthy prize-giving systems over
    time.

32
Implementing prizes: what's done
  • Road-testing and refinement
  • at scientific meetings
  • (US, Canada, Italy, Switzerland)
  • in scientific journals
  • (AgBioForum, Intl. J. of Biotechnology)
  • for popular and policy audiences
  • (ABC News, J. of International Affairs)
  • Financial support
  • Adelson Family Foundation of New York
  • Advisory board
  • Launched October 11th, 2004 in New York

33
Members of the Advisory Board
  • Simeon Ehui (World Bank)
  • Robert Evenson (Yale)
  • Richard Nelson (Columbia)
  • Phil Pardey (Minnesota)
  • Carl Pray (Rutgers)
  • Jeffrey Sachs (Columbia)
  • Pedro Sanchez (Columbia)
  • Brian Wright (Berkeley)
  • David Zilberman (Berkeley)

34
Other endorsements
  • Walter Alhassan (Ghana, former DG of CSIR)
  • Julian Alston (UC Davis)
  • Jock Anderson (World Bank)
  • Alain de Janvry (UC Berkeley)
  • Bruce Gardner (U of Maryland)
  • Anil K. Gupta (Natl. Innovation Foundation,
    India)
  • Michael Kremer (Harvard)
  • Oumar Niangado (Syngenta Foundation)
  • George Norton (Virginia Tech)
  • Rob Paarlberg (Wellesley)
  • Prabhu Pingali (FAO)
  • Per Pinstrup-Andersen (Cornell)
  • Jim Ryan (Australia, former DG of ICRISAT)
  • Eugene Terry (AATF, Nairobi, former DG of WARDA)