Designing and Using a Behaviour code frame to assess multiple styles of survey items

1
Designing and Using a Behaviour code frame to
assess multiple styles of survey items
Alice McGee and Michelle Gray
2
Presentation outline
  • Background to study
  • Aims of research
  • Methodology
  • Designing a behaviour code frame
  • Using the behaviour code frame
  • Analysing the data
  • Lessons learned

3
Background to study
  • English Longitudinal Study of Ageing (ELSA)
  • Dependent Interviewing (DI)
  • Two types of data item
  • Feed Forward (DI)
  • Non-Feed Forward (non DI)
  • Little evaluation of the impact of DI on data
    quality conducted to date

4
Aims of research
  • Research aims
  • To assess how DI affects data quality
  • To explore how Rs react to feed-forward phrases
  • To find whether this varies by nature and
    sensitivity of topic
  • Methodological aim
  • To explore the combination of CARI and Behaviour
    Coding as methodological tools

5
Methodology
  • Computer Assisted Recorded Interviewing (CARI)
  • Computer acts as a sophisticated tape recorder
  • Unobtrusively records interaction
  • Behaviour Coding
  • Codes systematically applied to
    interviewer-respondent behaviours
  • Uncover and assess problems with questions
  • Two methods combined for this study

6
Designing the code frame
7
Principles for good design
  • Code frame adapted from Cannell et al. (1989)
  • Short and straightforward
  • Few, easy to apply codes
  • Discrete
  • Broad rather than specific

8
Behaviours coded
  • Question asking behaviour for interviewers
  • Immediate response behaviour for respondents
  • Whether partner intervened (concurrent
    interviews)
  • Final outcome of the entire exchange
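
As an illustration, each coded exchange can be thought of as one record holding these four behaviours. A minimal Python sketch (the field names are assumptions for illustration, not the study's actual variable names):

```python
from dataclasses import dataclass

@dataclass
class CodedExchange:
    """One behaviour-coded question exchange (illustrative field names)."""
    serial: str              # case identifier
    item: str                # questionnaire item name
    interviewer_code: str    # question-asking behaviour, e.g. "01"
    respondent_code: str     # immediate response behaviour
    partner_intervened: str  # "01" yes, "02" no, "99" no partner present
    final_outcome: str       # outcome of the entire exchange
    notes: str = ""          # coder notes on non-standard behaviour
```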

9
Two behaviour code frames
  • Two code frames designed
  • DI (feed-forward) items
  • non DI (non feed-forward) items
  • First level exchange (initial utterance)
  • Code what occurred before the other person speaks

10
Code frame
11
Behaviours coded
  • Interviewer/Interviewer feed-forward
  • Respondent/Respondent feed-forward
  • Whether partner intervened
  • Final outcome
  • One code per behaviour

12
Interviewer codes
  • Exact Wording/Slight Change 01
  • Major change 02
  • Omission 03
  • Question became a statement 04
  • Inaudible Interviewer/Other 05
  • Not applicable 99
  • denotes where notes must be made

13
Interviewer feed-forward codes
  • FF item read as worded/slight change 01
  • FF statement became a question 02
  • FF question became a statement 03
  • Other major change 04
  • Omission 05
  • Inaudible Interviewer/Other 06
  • Not applicable 99
  • denotes where notes must be made

14
Respondent codes
  • Adequate Answer 01
  • Inadequate Answer/Elaboration 02
  • Clarification 03
  • Question Re-Read 04
  • Don't Know 05
  • Refusal 06
  • Inaudible Respondent/Other 07
  • Not applicable 99
  • denotes where notes must be made

15
Respondent feed-forward codes
  • Affirmed FF item - adequate 01
  • Disputed FF item - adequate 02
  • Inadequate Answer/Elaboration 03
  • Clarification 04
  • Question Re-Read 05
  • Don't Know 06
  • Refusal 07
  • Inaudible Respondent/Other 08
  • Not applicable 99
  • denotes where notes must be made

16
Partner intervention codes
  • Yes 01
  • No 02
  • Not applicable (no partner present) 99
  • denotes where notes must be made
  • Code used where the respondent's partner
    intervened and subsequently answered for the
    respondent

17
Final outcome codes
  • Adequate Answer 01
  • Inadequate Answer 02
  • Don't Know 03
  • Refusal 04
  • Inaudible/Other 99
  • denotes where notes must be made
  • Coding whether the final answer meets the
    objective of the question
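
Taken together, slides 12 to 17 define lookup tables from two-digit codes to behaviour labels. A sketch of the non-feed-forward frames as Python dictionaries (the feed-forward frames on slides 13 and 15 would be analogous; this is illustrative, not the study's coding program):

```python
# Code frames transcribed from the preceding slides; labels abbreviated slightly.
INTERVIEWER_CODES = {
    "01": "Exact wording / slight change",
    "02": "Major change",
    "03": "Omission",
    "04": "Question became a statement",
    "05": "Inaudible interviewer / other",
    "99": "Not applicable",
}

RESPONDENT_CODES = {
    "01": "Adequate answer",
    "02": "Inadequate answer / elaboration",
    "03": "Clarification",
    "04": "Question re-read",
    "05": "Don't know",
    "06": "Refusal",
    "07": "Inaudible respondent / other",
    "99": "Not applicable",
}

PARTNER_CODES = {"01": "Yes", "02": "No", "99": "Not applicable (no partner present)"}

FINAL_OUTCOME_CODES = {
    "01": "Adequate answer",
    "02": "Inadequate answer",
    "03": "Don't know",
    "04": "Refusal",
    "99": "Inaudible / other",
}

def label(code_frame: dict, code: str) -> str:
    """Look up the text label for a recorded behaviour code."""
    return code_frame.get(code, "Unknown code")
```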

18
Technical details
19
CARI equipment
  • Equipment testing
  • External microphones
  • CARI built into Blaise program
  • Recording switched on and off at relevant items
  • Sound files automatically generated and saved
  • Sound files removed from interviewer laptops
  • Macro run
  • Data sticks (USB)
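
The sound-file transfer itself was done with a macro and USB data sticks; a hypothetical Python equivalent of that step, with directory paths that are assumptions, might look like this:

```python
import shutil
from pathlib import Path

# Hypothetical paths: the real locations depend on the Blaise/CARI installation.
LAPTOP_AUDIO_DIR = Path(r"C:\CARI\recordings")
USB_STICK_DIR = Path(r"E:\cari_transfer")

def transfer_sound_files() -> int:
    """Move the automatically generated .wav files from the interviewer
    laptop onto the USB data stick, leaving no copies on the laptop."""
    USB_STICK_DIR.mkdir(parents=True, exist_ok=True)
    moved = 0
    for wav in LAPTOP_AUDIO_DIR.glob("*.wav"):
        shutil.move(str(wav), USB_STICK_DIR / wav.name)
        moved += 1
    return moved
```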

20
Behaviour coding system
  • Conducted within Blaise
  • Coding program designed for this purpose
  • Westat testnote software
  • Three windows displayed simultaneously
  • Blaise interviewing screen
  • Coding entry screen
  • Sound file (.wav)
  • Automatically routed through interview
  • Tags to skip to relevant data items

21
Using the code frame
22
Sound file
23
Blaise interviewing screen
24
Coding program
25
(No Transcript)
26
Data preparation and analysis
27
Organising the data
  • Two types of data
  • Behaviour codes (quantitative)
  • Coder notes on non-standard behaviours
    (qualitative)
  • All data automatically stored in Excel tab
    delimited file
  • One Excel file produced for each coder
  • Excel files amalgamated
  • Exported into SPSS
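
A sketch of the amalgamation step using pandas, assuming one tab-delimited file per coder with a consistent layout (file and column names are illustrative, not the study's own):

```python
from pathlib import Path
import pandas as pd

# Assumed layout: one tab-delimited file per coder, e.g. coder_01.txt, coder_02.txt ...
coder_files = sorted(Path("coded_data").glob("coder_*.txt"))

frames = []
for f in coder_files:
    df = pd.read_csv(f, sep="\t", dtype=str)   # keep behaviour codes as strings ("01", "99", ...)
    df["coder"] = f.stem                        # retain which coder produced each row
    frames.append(df)

combined = pd.concat(frames, ignore_index=True)

# Write one combined file for import into SPSS (CSV import, or .sav via pyreadstat if available).
combined.to_csv("behaviour_codes_combined.csv", index=False)
```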

28
Data preparation
  • More cleaning than expected
  • Two main problems
  • Duplicate files (limitations of system used)
  • Incorrect code frame applied to interviewer and
    respondent behaviours (DI vs non-DI items)
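
A sketch of how the two cleaning problems could be handled in pandas, assuming columns such as serial, item, item_type and interviewer_code (all names are assumptions, not the study's variables):

```python
import pandas as pd

# Combined coder data produced in the previous step.
combined = pd.read_csv("behaviour_codes_combined.csv", dtype=str)

VALID_NONDI = {"01", "02", "03", "04", "05", "99"}      # interviewer codes, standard frame
VALID_DI = {"01", "02", "03", "04", "05", "06", "99"}   # interviewer codes, feed-forward frame

# 1. Duplicate files: the same exchange coded more than once -- keep the first coding.
cleaned = combined.drop_duplicates(subset=["serial", "item"], keep="first")

# 2. Wrong code frame applied: flag rows whose interviewer code is not valid for the
#    item type (DI vs non-DI) so they can be re-checked against the sound file.
is_di = cleaned["item_type"] == "DI"
valid = cleaned["interviewer_code"].isin(VALID_DI).where(
    is_di, cleaned["interviewer_code"].isin(VALID_NONDI))
suspect = cleaned[~valid]
print(f"{len(suspect)} rows need re-checking against the recordings")

cleaned.to_csv("behaviour_codes_cleaned.csv", index=False)
```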

29
Analysing the data
  • SPSS
  • Frequencies and crosstabulations
  • Coder notes provided additional context
  • Very small base sizes at some items due to
    routing
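
An equivalent of the SPSS frequencies and crosstabulations, sketched in pandas with assumed column names:

```python
import pandas as pd

# Cleaned behaviour-code data from the previous step (column names are assumptions).
codes = pd.read_csv("behaviour_codes_cleaned.csv", dtype=str)

# Frequencies of interviewer question-asking behaviour.
print(codes["interviewer_code"].value_counts(dropna=False))

# Crosstabulation of respondent behaviour by item type (DI vs non-DI),
# shown as column percentages; some cells will have very small bases.
tab = pd.crosstab(codes["respondent_code"], codes["item_type"], normalize="columns") * 100
print(tab.round(1))
```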

30
Advantages and disadvantages of our approach and
lessons learned
31
What worked
  • CARI
  • Unobtrusive in nature
  • Minimal impact on interviewers and respondents
  • Behaviour coding
  • Able to run statistical analyses
  • Able to draw conclusions
  • Method of coding easier than paper (routing)

32
What didn't work
  • CARI
  • High number of inaudible or hard to hear cases
    (1/3 of respondents)
  • Purchased speakers to help
  • Next time
  • Fully re-test microphones
  • Probe respondents' reasons for not giving consent
    to being recorded

33
What didn't work
  • Behaviour coding
  • Lengthy and costly process
  • Coding (approximately 45 mins per interview)
  • Data cleaning
  • Over complex code frame
  • Coding method found cumbersome, limited and error
    prone
  • Coder judgement not measured

34
Next time...
  • One code frame only
  • Build in sufficient time for each stage
  • Clear rationale for behaviour coding
  • Inter-coder reliability test (Kappa score; see the sketch after this list)
  • Adequate sample for uncommon questions
  • Create more sophisticated, less error prone
    coding system
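
For the inter-coder reliability point above, a minimal sketch of a Cohen's kappa check using scikit-learn, with made-up codes from two hypothetical coders:

```python
from sklearn.metrics import cohen_kappa_score

# Assumed: two coders independently coded the same set of exchanges.
coder_a = ["01", "01", "02", "01", "99", "03", "01"]
coder_b = ["01", "02", "02", "01", "99", "03", "01"]

kappa = cohen_kappa_score(coder_a, coder_b)
print(f"Cohen's kappa: {kappa:.2f}")   # 1.0 = perfect agreement, 0 = chance-level agreement
```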

35
Discussion Questions...
36
(No Transcript)