A Strategy for Prioritising Non-response Follow-up to Reduce Costs Without Reducing Output Quality
Transcript and Presenter's Notes


1
A Strategy for Prioritising Non-response
Follow-up to Reduce Costs Without Reducing Output
Quality
  • Gareth James
  • Methodology Directorate
  • UK Office for National Statistics

2
Outline of presentation
  • Introduction
  • response-chasing in ONS business surveys
  • Understanding non-response
  • effects, patterns and reasons
  • Strategy for response-chasing
  • scoring methods, current investigations and
    future strategies

3
Introduction
  • Non-response: the failure of a business to
    respond, in part or in full, to a survey.
    Effects on
  • bias and standard error,
  • perception of output quality,
  • business behaviour
  • Improve response rates by
  • better questionnaire design, sample rotation
    rates,
  • response-chasing - necessary, but expensive
  • Quality improvements and efficiency targets
  • effective targeting needed

4
Current practice at ONS
  • Use of targets (mainly counts, occasionally
    other variables)
  • Written reminders to all.
  • Then targeted phone calls, which could lead to
    enforcement
  • Businesses identified as key (by survey area)
    chased intensively first
  • After keys, the principle is to chase
    large-employment businesses next
  • Methods differ between surveys

5
Current practice at ONS
  • Areas for improvement
  • Methods for key businesses
  • make more consistent, transparent, scientific
  • Effective use of response-chasing tools
  • Team structure and knowledge
  • (Area undergoing restructure)
  • Efficiency initiatives
  • save resources; some changes already implemented
  • effects being monitored; evaluation needed

6
Efficiency initiatives: removal of second reminders
7
UNDERSTANDING NON-RESPONSE
8
Patterns of non-response
  • Industrial sector - identified those with lower
    response rates (e.g. catering, hotels)
  • High correlation between industry response rates
    at early and final results
  • Size of business - larger businesses take longer
    to respond; the chasing strategy ensures their
    responses are received, though later

9
Intensive Follow-Up (IFU) exercise
  • Dual aims
  • to estimate non-response bias (work in progress
    see final paper)
  • to establish reasons for non-response and (later)
    to cost response-chasing
  • Used the Monthly Inquiry into the Distribution
    and Services Sector (MIDSS)
  • dedicated team for the IFU
  • contacted c.600 non-responders per month in
    chosen industries
  • businesses to receive up to 5 phone calls
  • recorded reason for initial non-response, nature
    of call and length of call

10
IFU results: returned data
  • c.80% of all businesses selected for IFU returned
    the questionnaire, but
  • many businesses returned the questionnaire just
    after the deadline - no call needed!
  • Only c.60% of those contacted returned the
    questionnaire

11
IFU results: reasons for non-response
Reason for initial non-response   Number who gave a reason   Returned data after IFU calls (%)   Still didn't return data after IFU calls (%)
Forgot, missed date               667                        77                                  23
Too busy, too low priority        361                        67                                  33
Actively decided not to           67                         33                                  67
12
BUILDING A RESPONSE-CHASING STRATEGY
13
Dealing with businesses that don't respond
  • Aim to make response-chasing more efficient
  • Create a scoring system to prioritise/categorise
    non-responders
  • Focus on reducing non-response bias

14
Estimation in ONS business surveys
  • We impute/construct where there is non-response.
  • Then estimate totals as a weighted sum over the
    sample, using returned values for responders and
    imputed/constructed values for non-responders
    (see the sketch below)

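A minimal sketch of this kind of estimator, in assumed notation (the transcript does not carry the original formula): let s be the sample, w_i the design/calibration weight of business i, and y_i^* its returned value if it responded or its imputed/constructed value otherwise. Then

    \hat{Y} = \sum_{i \in s} w_i \, y_i^{*}

is the estimated total of the survey variable y.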
15
Bias in ONS business surveys
  • Total potential non-response bias (i.e. total
    imputation error) is given by summing the
    imputation errors over the non-responders
    (a sketch follows)
  • We will concentrate on the absolute error of
    imputation for each business

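In the same assumed notation (again not taken from the original slides), write s_nr for the set of non-responders, y_i for the true but unobserved value and \tilde{y}_i for its imputed value. The total potential non-response bias (total imputation error) is then

    \sum_{i \in s_{nr}} w_i \, (\tilde{y}_i - y_i)

and the per-business quantity concentrated on below is the absolute imputation error |\tilde{y}_i - y_i|.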
16
Scoring - principles
  • Reduce imputation error by attempting to predict
    the absolute imputation error of each business
  • (A large predicted value means increased risk if
    the business is imputed, therefore target these)
  • May also wish to score to encourage good response
    behaviour from businesses, e.g. new-to-sample
  • Need a system that is easy to use and justify.

17
Scoring methods
  • (McKenzie) Calculate imputation error from
    previous returns then rank into deciles 0, 1,
    , 9.
  • (Smallest Largest)
  • New-to-sample or long-term non-responders 10
  • Tested on MIDSS in 2001-2 implementation issues
  • (Daoust) Calculate weighted contribution to
    estimates categorise into 3 groups for
    follow-up
  • New investigations with adapted methods

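A minimal Python sketch of this style of decile scoring, assuming each business carries a history of absolute imputation errors; the function names, the use of the mean historical error and the percentile cut points are illustrative assumptions rather than details of McKenzie's method.

def decile_cutoffs(all_mean_errors):
    """Cut points at the 10th, 20th, ..., 90th percentiles of the businesses'
    mean absolute imputation errors, computed once across the sample."""
    xs = sorted(all_mean_errors)
    n = len(xs)
    return [xs[min(n - 1, int(n * k / 10))] for k in range(1, 10)]

def priority_score(prev_errors, new_or_longterm_nonresponder, cutoffs):
    """Score a non-responding business for follow-up priority: 0 (lowest) to 9
    (highest) by decile of its mean previous imputation error, or 10 if it has
    no usable history (new-to-sample or long-term non-responder)."""
    if new_or_longterm_nonresponder or not prev_errors:
        return 10                                 # no history, so chase by default
    mean_error = sum(prev_errors) / len(prev_errors)
    # the number of cut points exceeded gives the decile, 0..9
    return sum(mean_error > c for c in cutoffs)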
18
Current investigations in MIDSS
  • Predict imputation error in monthly turnover
    (the survey variable y)
  • Various predictors available
  • Rank businesses, then group
  • No imputation score?
  • Use stratum average.
  • Assess actual error against predicted (see the
    sketch below).

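A minimal sketch of the rank-group-assess step, assuming one predicted and one actual absolute imputation error per business; the equal-sized five-group split and the names are illustrative assumptions.

def assess_priority_groups(predicted, actual, n_groups=5):
    """Rank businesses by predicted imputation error (largest first), split the
    ranking into n_groups priority groups, and return the share (%) of total
    actual imputation error falling in each group, highest score first."""
    order = sorted(range(len(predicted)), key=lambda i: predicted[i], reverse=True)
    group_size = -(-len(order) // n_groups)       # ceiling division
    total = sum(actual) or 1.0                    # guard against an all-zero total
    shares = []
    for g in range(n_groups):
        members = order[g * group_size:(g + 1) * group_size]
        shares.append(100.0 * sum(actual[i] for i in members) / total)
    return shares

A good predictor concentrates most of the actual error in the highest-score group, which is how the columns of the results tables on the following slides can be read.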
19
Results (5 groups)
  • Percentage of total imputation error within
    each priority score group

        Actual
Score   Imputation error
4       88
3       8
2       3
1       1
0       << 1
20
Results
  • Percentage of total imputation error within
    each priority score group

        Actual              Weighted prediction
Score   Imputation error    Previous imp. error
4       88                  73
3       8                   12
2       3                   10
1       1                   3
0       << 1                2
21
Results
  • Percentage of total imputation error within
    each priority score group

        Actual              Weighted prediction    Weighted prediction
Score   Imputation error    Previous imp. error    Register turnover
4       88                  73                     68
3       8                   12                     15
2       3                   10                     8
1       1                   3                      5
0       << 1                2                      4
22
Results
  • Percentage of total imputation error within
    each priority score group

        Actual              Weighted prediction    Weighted prediction   Weighted prediction   Unweighted prediction
Score   Imputation error    Previous imp. error    Register turnover     Register employment   Register employment
4       88                  73                     68                    42                    40
3       8                   12                     15                    20                    15
2       3                   10                     8                     11                    12
1       1                   3                      5                     9                     18
0       << 1                2                      4                     18                    15
23
Conclusions
  • Significant gains available in response-chasing
  • Future plans
  • Refinements to scores
  • optimum predictor
  • individual adjustments (e.g. long-term
    non-responders)
  • overall or by separate industry groups?
  • multivariate surveys
  • Dynamic updating of scores
  • Live testing

24
References
  • Daoust, P. (2006), 'Prioritizing Follow-Up of
    Non-respondents Using Scores for the Canadian
    Quarterly Survey of Financial Statistics for
    Enterprises', Conference of European
    Statisticians.
  • McKenzie, R. (2000), 'A Framework for Priority
    Contact of Non Respondents', Proceedings of the
    Second International Conference on Establishment
    Surveys.
