Title: Are you spending too much time in the lab running studies?

1
Are you spending too much time in the lab running
studies?
Are you looking for a way to spend more time with
your computer each day?
2
We have the solution for you!
3
Internet Surveys for Intellectual Dummies
  • Conducting Psychology Studies Online
  • Theory & Application
  • Sara Konrath, Social Psych PhD Candidate

4
Theory: 10 am - noon
  • comparison of internet with traditional methods
  • advantages & disadvantages
  • dos & don'ts of internet research
  • how to reduce dropout rates
  • some recruiting ideas and places to post
  • ethics, security & control

5
Application: 1 - 4 pm
  • basic HTML
  • UM Lessons
  • Random assignment

6
  • All files (including these PowerPoint ones) can
    be found at my website
  • sitemaker.umich.edu/skonrath/internet_research

7
Internet vs. traditional survey methods
  • most studies comparing paper and pencil to
    computer-based surveys produce basically the same
    results (e.g. Stanton, 1998; O'Neill & Penrod,
    2001; Joinson, 1999; Davis, 1999; Buchanan &
    Smith, 1999; etc.)
  • this is true for a wide variety of research
    topics
  • this also generalizes to Internet surveys (see Ch
    2 of Birnbaum for a summary)

8
Demographics
  • "Some say that psychological science is based on
    research with rats, the mentally disturbed, and
    college students. We study rats because they can
    be controlled, the disturbed because they need
    help, and college students because they are
    available."
  • Birnbaum, 1999, p. 399

9
Demographics
  • web experiments have more diverse participants
    than typical lab studies (e.g. Reips, 1996)
  • demographic characteristics of internet
    participants are expected to become more similar
    to general population demographics

10
Ethnicity & Gender
  • overwhelming majority of web participants so far
    have been white
  • usually between 80-90% in most studies
  • studies report very wide range in proportion of
    females
  • anywhere from 8% (Swoboda et al, 1997) to 71%
    female (Pasveer & Ellard, 1998)
  • depends on what you're studying (e.g. online
    gamers: almost no women)

11
Country of origin
  • Internet provides opportunity to do
    cross-cultural research (provided you can
    translate) from here
  • however, most web studies so far report majority
    American (or Canadian) participants
  • between 80-90%
  • rest usually European
  • language of surveys (English) is obviously a
    factor

12
Age
  • average age in most studies is between 26-35
    years
  • wider distributions than in lab studies

13
Advantages
  • as mentioned, potential access to demographically
    & culturally diverse participant population
  • Includes previously inaccessible unique groups
  • e.g. drug dealers (Coomber, 1997)
  • e.g. people with specific head injuries
    (Browndyke et al, 1997)
  • Internet can be a research tool or an object of
    research (e.g. online role-playing games, web
    communities)

14
Advantages
  • bring experiments to participants instead of the
    opposite
  • participants can choose convenient time to
    participate
  • can participate in comfortable, familiar
    surroundings
  • participants find computer surveys more
    interesting & perceive them as shorter than paper
    & pencil ones (Rosenfeld et al, 1993)

15
Advantages
  • high statistical power because of large samples!
  • participants usually motivated
  • in typical psychology experiments students
    participate to get class credit
  • boring alternative like writing an essay
  • students may be less-than-eager subjects
  • on the web, no one is making them participate

16
Advantages
  • data collection takes less time
  • can get a few hundred participants in a few days
  • simultaneous access by participants (no
    scheduling problems)
  • automated data entry
  • saves time
  • saves money
  • reduces data entry errors

17
Advantages
  • easy to make changes
  • if you notice something weird is happening, can
    easily change & update the webpage
  • direct lesson of motivational confounding:
  • if there is a higher dropout in one condition
    compared to another, this may mean that it is
    more difficult or boring than the other one
  • this may explain your effects (rather than your
    intended manipulation)

18
Advantages
  • no experimenter effects
  • As long as experimenters are present there is the
    possibility that they might give subtle clues or
    make errors that systematically bias data
  • increased anonymity
  • participants may self-disclose more freely online

19
Advantages
  • dropout
  • usually considered bad but there are two reasons
    that it may be considered an advantage
  • lose unmotivated participants (hopefully early)
  • ethics: participants don't feel any pressure to
    finish study (in most lab situations there is at
    least some social pressure to stay)

20
Disadvantages
  • dropout!
  • can be high online (average 34%, range 1-87%)
  • should compare demographics of ps who drop out
    with those who finish
  • think of exclusion criteria before you analyze
    your data (e.g. exclude if they fail to answer
    main DV)
  • we will discuss ways to reduce dropout later
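A pre-set exclusion rule like this can be applied as a simple filter over the data file before analysis. The sketch below is illustrative only; the field name `main_dv` is a hypothetical placeholder for your main dependent variable:

```python
# Sketch: apply a pre-set exclusion rule before analysis.
# "main_dv" is a hypothetical field name for the main dependent
# variable; adapt it to your own survey.
responses = [
    {"id": 1, "main_dv": 4},
    {"id": 2, "main_dv": None},  # skipped the key question -> excluded
    {"id": 3, "main_dv": 7},
]

usable = [r for r in responses if r["main_dv"] is not None]
excluded = len(responses) - len(usable)
print(f"kept {len(usable)} of {len(responses)} ({excluded} excluded)")
```

Deciding this rule before looking at the data (as the slide advises) keeps the exclusion from being influenced by the results.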

21
Disadvantages
  • often need technical background to create and
    manage surveys
  • not always! Today we will learn an easy
    alternative.
  • less control of experimental setting
  • surveys may look different on different browsers,
    operating systems, screen sizes, etc.
  • participants have varying download times, which
    may cause difficulty in more complicated studies
    (e.g. using images, reaction time studies)
  • Participants may be completing survey under very
    different circumstances (e.g. in a café, in their
    living room, etc.)

22
Disadvantages
  • self-selection
  • big problem in Internet surveys
  • can ask ps where they saw your ad and compare
    groups
  • still not a representative sample
  • doesn't include people who are uncomfortable
    using the Internet
  • doesn't include people who don't own a computer
  • biased toward white, mid-to-high income, North
    American (English-speaking) males

23
Disadvantages
  • some participants may not be serious
  • - at end of study give them a "do not use my
    data" checkbox
  • - check data for strange answers (e.g. age 99,
    sex: "often", etc.)
  • multiple submissions
  • - not likely as most web studies are not
    thrilling
  • - easiest way is to ask them not to submit twice
    or at beginning ask if they have already done
    study
  • - can also check IP addresses / email addresses
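One way to implement the IP-address check is to count how often each address appears in the data file. This is a hypothetical sketch with made-up addresses; note that people sharing a lab or household computer also share an IP, so treat a flag as a hint to inspect, not proof of a duplicate:

```python
from collections import Counter

# Sketch: flag possible repeat submissions by IP address.
# Addresses are made up for illustration; shared computers can
# make different people share one IP, so flags are hints only.
submissions = [
    {"id": 1, "ip": "141.211.4.20"},
    {"id": 2, "ip": "141.211.4.21"},
    {"id": 3, "ip": "141.211.4.20"},  # same address as id 1
]

counts = Counter(s["ip"] for s in submissions)
flagged = [s["id"] for s in submissions if counts[s["ip"]] > 1]
print(flagged)  # ids that share an IP with another submission
```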

24
Disadvantages
  • no experimenter to ask questions / clarify (low
    comprehension of survey)
  • instructions must be especially clear
  • always pretest
  • ask for comments at end of survey
  • provide email address for feedback
  • participants will always be at computers
  • computer-independent behaviors or attitudes
    cannot be measured
  • many studies cannot be done online

25
Some Dos
  • Edit or Pretest before you Publish
  • once a survey is online it can be costly to make
    changes
  • you can lose or frustrate potential participants
    who try to access a web page that is down, throw
    your dataset out of whack by moving or changing
    questions, etc.
  • therefore, be certain of all questions, response
    formats and directions before you advertise your
    survey

26
Dos
  • KISS: Keep It Simple, Sweetheart
  • if you want your online research to work the
    right way for all of your participants (who are
    using a variety of computers and internet
    browsers), keep your project simple and
    straightforward
  • avoid flashy presentations & complex layouts
  • may increase dropout & bias sample toward more
    sophisticated computer users

27
Dos
  • Be Empathetic in Your Design
  • when you are designing your survey, think about
    it from your participant's point of view
  • would you be willing to answer that question
    online?
  • could it be phrased differently?
  • since they can't ask for clarification, is the
    question clear?
  • it is always a good idea to have someone who is
    unfamiliar with your research proof-read your
    survey for errors, clarity, and flow

28
Dos
  • Multi-Site Data Collection
  • online research allows you to collect data from
    many different participant pools (e.g. discussion
    boards, psychology survey websites)
  • ask subjects where they saw your ad (to compare
    groups)
  • we'll talk more about where to recruit later

29
Some Don'ts
  • Length of surveys
  • people are turned off by long surveys and are
    less likely to start and finish if the survey is
    too long
  • ask only questions that you actually need answers
    to
  • each question you ask means more time for your
    participants
  • if you are not giving anything in return, the
    shorter, the better
  • longer surveys are generally best suited for an
    audience who is receiving something in return
    (i.e. class credit, money, etc.)

30
Don'ts
  • In experimental surveys, don't use obvious URLs
    to distinguish your conditions
  • - e.g. http://www.lessons.umich.edu/cond1
  • vs. http://www.lessons.umich.edu/cond2
  • instead, embed your condition within a bunch of
    random numbers & letters
  • e.g. http://www.lessons.umich.edu/6x7rz1o
  • vs. http://www.lessons.umich.edu/6x7rz2o
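One way to manage such opaque condition URLs is a private lookup table mapping each condition to its random-looking code, so only the researcher can tell the URLs apart. A minimal sketch, reusing the example codes from the slide:

```python
# Sketch: keep a private lookup table from condition to opaque
# URL code (codes reused from the slide's example). Only this
# table, kept off the web, links codes back to conditions.
condition_codes = {
    "control":   "6x7rz1o",
    "treatment": "6x7rz2o",
}

base = "http://www.lessons.umich.edu/"
urls = {cond: base + code for cond, code in condition_codes.items()}
print(urls["control"])    # http://www.lessons.umich.edu/6x7rz1o
print(urls["treatment"])  # http://www.lessons.umich.edu/6x7rz2o
```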

31
Don'ts
  • Don't improperly use form elements
  • - radio buttons & drop-down menus should not
    have default settings
  • - if they did, it would be hard to tell when ps
    answered vs when they skipped the question
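The reason this works: in standard HTML form submission, a radio-button group with no selection is simply absent from the submitted data, so a missing key unambiguously means "skipped". A hypothetical server-side sketch (question names q1-q3 are assumed):

```python
# Sketch: with no default selected, a skipped radio-button group
# never appears in the submitted form data, so a missing key means
# "skipped" rather than "chose the default".
# Question names q1-q3 are hypothetical.
expected_questions = ["q1", "q2", "q3"]
submitted = {"q1": "agree", "q3": "disagree"}  # q2 left blank

skipped = [q for q in expected_questions if q not in submitted]
print(skipped)  # ['q2']
```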

32
How to reduce drop out rates
  • Reips, Ch 4 of Birnbaum
  • create an attractive website by
  • using a nice looking web design
  • putting up signs that this could be an
    interesting site (e.g. awards, comments)
  • not using commercial banners
  • having multilingual versions

33
Reducing drop out rates
  • emphasize your site's high trustworthiness by
  • providing the name of your institution
  • emphasizing the scientific purpose
  • ensuring (and keeping) confidentiality
  • providing contact information
  • use web design with short loading times (limit
    unnecessary pictures, etc.)

34
Reducing drop out rates
  • provide ps with information about their current
    position in the time structure of the experiment
    (e.g. page 1 of 10)
  • offer feedback (e.g. post preliminary results
    after two weeks)
  • offer gratification (e.g. chance of winning a
    prize)
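The "page 1 of 10" indicator is just a label computed from the current page and the total page count; a trivial sketch of such a helper:

```python
# Sketch: a "page X of Y" progress label shown on each survey page.
def progress_label(page: int, total: int) -> str:
    return f"page {page} of {total}"

print(progress_label(1, 10))  # page 1 of 10
print(progress_label(7, 10))  # page 7 of 10
```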

35
Reducing drop out rates
  • High-entrance barrier technique (Reips, 2002,
    p. 249): bundle demotivating factors at very
    beginning of study so ps drop out before
    manipulation
  • e.g. tell ps that participation is serious &
    science needs good data
  • e.g. personalize: ask for email address or phone
    number (must have IRB approval)
  • e.g. tell them how long the study will take

36
Reducing drop out rates
  • introduce a warm-up phase (Reips et al, 2001) to
    keep dropout low during experimental phase
  • can include practice trials, pilot studies, etc.

37
Reducing drop out rates
  • ask for participant seriousness or the likelihood
    that p will complete the study (e.g. Musch &
    Klauer, 2002)
  • pre-determine to only analyze data from serious
    participants
  • Musch & Reips (2000):
  • completion of survey: 55% without a reward
  • up to 86% with reward (individual payment or
    lottery prizes)

38
Reducing drop out rates
  • Frick, Bächtiger & Reips (2001)
  • Varied placement of personal information (PI)
    questions
  • Varied information about financial incentives
    (FI)
  • Found the following dropout rates:
  • 21.9% with PI at end and no FI info
  • 14.9% with PI at beginning and no FI info
  • 13.2% with PI at end and FI at beginning
  • 5.7% with PI and FI at beginning

39
Reducing drop out rates
  • Frick et al (2001) also found better compliance
    (more complete responses) when ps were asked
    personal info questions at the beginning of the
    experiment
  • so, at the very least, always put some
    demographic questions at the beginning of your
    survey!

40
Some recruiting ideas
  • UM psychology subject pool - online study
  • random email sample from UM using student
    directory
  • psychology experiment lists online
  • participants looking for surveys
  • message boards online
  • you go to participants
  • can select specific (e.g. online gamers, people
    interested in politics) or general group of
    participants

41
Where to post 1
  • Psychology experiment lists online
  • http://www.socialpsychology.org/expts.htm
  • http://psych.hanover.edu/research/exponnet.html
  • www.lab-united.com
  • http://www.studyresponse.com/
  • http://genpsylab-wexlist.unizh.ch/

42
Where to post 2
  • Newspapers, magazines, or online communities
  • Specialized: Lawn & Landscape
  • http://www.lawnandlandscape.com/messageboard/
  • Or more general: The Guardian -
    http://www.guardian.co.uk/talk/
  • Craig's List - www.craigslist.org
  • Netscape, MSN, Excite, Google, Yahoo community
    message boards

43
Recruitment Script Example
44
Netiquette
  • When posting to discussion groups
  • don't want to appear as spam
  • consider reputation of psychology researchers
  • ask permission from site moderator
  • try to stay on topic
  • never double-post (a second reminder posting a
    few days later is fine)
  • give people in the group a link where they can
    view basic results

45
Promotion techniques
  • make survey attractive (e.g. nice layout)
  • academic & non-profit websites are better
    accepted than commercial ones
  • have a site that is interesting enough that users
    are likely to forward the address to friends
  • avoid changing URLs

46
Ethics
  • obviously, follow usual IRB standards, but
    additional things to consider when running
    studies online
  • absence of researcher
  • uncertainty regarding adequate informed consent
    & debriefing
  • potential loss of anonymity & confidentiality
    (e.g. IP address)

47
Ethics
  • Informed Consent
  • use first page of survey as an information &
    consent form
  • can't get an actual signature so give ps a button
    that says
  • "I Agree"
  • by clicking link participant accepts the terms of
    the consent form and starts the survey
  • another option:
  • ask users NOT to submit their results unless they
    have read and agree to the information letter
  • Format
  • length is your enemy: keep the consent form as
    concise as possible, and free of jargon and
    acronyms.

48
(No Transcript)
49
Ethics
  • Voluntary Participation
  • assured because participants are always free to
    leave if they don't want to finish the survey
  • online participants probably feel very little
    pressure to continue
  • participants in labs may continue due to
    face-to-face social pressures, or the fact they
    have already paid to park their car, etc.
  • participants also have the right to leave
    questions blank

50
Ethics
  • Age of subject
  • usually must be at least 18 yrs old (unless
    special IRB permission)
  • can ask the question "Are you over 18?" and if
    the answer is no, direct to feedback page
  • only advertise in adult-dominated webspaces
  • design websites so they are not appealing to
    children
  • - e.g. no cartoons, etc.
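The over-18 branching described above amounts to a single routing decision; a minimal sketch, where the page names are hypothetical placeholders:

```python
# Sketch: route respondents based on the "Are you over 18?" answer.
# Page names are hypothetical placeholders.
def next_page(over_18: bool) -> str:
    return "survey.html" if over_18 else "feedback.html"

print(next_page(True))   # survey.html
print(next_page(False))  # feedback.html
```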

51
Ethics
  • Anonymous Participation
  • don't ask for any more personal information than
    is required
  • web page software usually logs IP address that
    participant accessed your page from
    (e.g.,102.403.506.807) but otherwise only the
    demographic information the respondent explicitly
    answers will be stored
  • best to not ask for email addresses that will be
    stored in the main data file (some email
    addresses are personally identifying)
  • participants may want to be informed of the
    general results later, so you may want to have a
    website listed where you will post the results by
    a certain date

52
Ethics
  • Data Confidentiality
  • in online surveys data confidentiality is
    achieved by storing the data file on a computer
    in a personal account (userid and password
    accessible)
  • researcher (or research team) should be the only
    one with access to data

53
Ethics
  • Feedback
  • transfer to a closing page that acknowledges the
    submission, thanks the subject, etc.
  • if applicable, this page should quickly inform
    them that no individual scores are to be
    provided
  • so that they don't go back and resubmit
  • should also provide contact information (email
    address of researchers)

54
(No Transcript)
55
Ethics
  • in this case I put the debriefing form on another
    page so it would be harder for participants to go
    back and change their answers after they read it
  • follow usual IRB guidelines for feedback forms

56
(No Transcript)
57
Security
  • nothing prevents survey ps from downloading and
    examining html source
  • depending on survey topic, you may get angry
    email
  • try to pleasantly address any concerns (e.g.
    being accused of spamming)
  • report any adverse events to IRB

58
Security
  • nothing prevents ps from trying to foil your
    survey results
  • try something like this at the end
  • "When we conduct research, it is important that
    participants pay attention to the questions, and
    want to do the survey. Otherwise we cannot use
    their data. If for any reason you were unable to
    pay attention to the questions and answer them
    carefully, it would be helpful for us to know
    that. You will not be penalized in any way. We
    would simply appreciate your honesty.
  • Check the following box if you think we should
    not use your data for the above reasons."
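At analysis time, honoring that checkbox is one more filter over the data file; a hypothetical sketch where the field name `exclude_me` is an assumed placeholder:

```python
# Sketch: honor the "do not use my data" checkbox at analysis time.
# The field name "exclude_me" is an assumed placeholder.
responses = [
    {"id": 1, "exclude_me": False},
    {"id": 2, "exclude_me": True},  # asked us not to use their data
    {"id": 3, "exclude_me": False},
]

usable = [r for r in responses if not r["exclude_me"]]
print([r["id"] for r in usable])  # [1, 3]
```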

59
Control
  • to minimize group (rather than individual)
    participation
  • give instruction that this is for an individual,
    not a group
  • ask participants how many people helped to
    complete the survey (or were in the room when
    survey was completed)

60
Control
  • ways to make sure the survey-taking environment
    is as similar as possible
  • present a list of requirements for completing the
    study (e.g. be in quiet place, allow 15 min,
    close other programs, etc.)
  • warm up ps to get them in the same mindset
  • have a list of questions at the end for ps to
    report various distractions (e.g. kids around,
    talking on phone, etc.)

61
Recommended Resources
  • http://psych.fullerton.edu/mbirnbaum/web/examples.htm
  • Birnbaum, M. (Ed.). (2000). Psychological
    Experiments on the Internet. San Diego: Academic
    Press.
  • Birnbaum, M. (2000). Introduction to Behavioral
    Research on the Internet. Upper Saddle River,
    N.J.: Prentice-Hall.
  • Reips, U. & Bosnjak, M. (2001). Dimensions of
    Internet Science. Lengerich, Germany: Pabst
    Science Publishers.

62
TESS
  • Time-sharing Experiments for the Social
    Sciences
  • http://www.experimentcentral.org/
  • easiest way to do a representative online survey
  • they run the survey for you
  • best of all:
  • it's free!

63
Other programs
  • Survey Monkey
  • http://www.surveymonkey.com/Pricing.asp
  • I've never used this but many researchers do. The
    basic rates are $20 a month; they list other
    programs and prices at the above link.
  • Any comments from those of you who have tried
    this?
  • Survey Wiz & Survey Assistant, both designed by
    psychologists
  • http://psych.fullerton.edu/mbirnbaum/programs/surveyWiz3.htm
  • http://www.mohsho.com/s_ware/how.html

64
Lunch break 12-1
  • Pizza & pop, 3rd floor atrium

65
Application: 1 - 4 pm
  • basic HTML
  • UM Lessons
  • Random assignment