The Need for Psychological Science

Transcript and Presenter's Notes

1
The Need for Psychological Science
2
Scientific Method
  • To produce knowledge (pure research) or solve
    practical problems (applied research)
  • Theory: explains and organizes observations and
    enables one to predict behavior or events
  • Hypothesis: a testable, measurable prediction about
    the relationship between variables (e.g., beer
    parties and GPA), used to test and revise theories
  • Replication: repeating studies independently to
    extend the validity and reliability of research

3
The Need for Psychological Science: Examples of
Faulty Reasoning
  • Hindsight bias: the I-knew-it-all-along phenomenon
  • The mind builds its current wisdom around what we
    have already been told. We are biased in favor
    of old information.
  • For example, we may stay in a bad relationship
    because it has lasted this far and thus was
    meant to be.

4
Hindsight Bias Examples
  • A letter comes in the mail informing an
    individual that he was accepted into a college.
    When he tells his mother, she says, "I really had
    a feeling that you were going to get in" (even
    though she had expressed doubts to his father
    earlier that week).
  • An individual notices that outside, it's
    beginning to look a little bit gray. He says to
    himself, "I bet that it's going to rain this
    afternoon." When it actually does rain, the
    individual tells himself that he was certain it
    would when he saw the clouds rolling in earlier.

5
Overconfidence: The tendency to think we know
more than we do
  • We are much too certain in our judgments.
  • We overestimate our performance, our rate of
    work, our skills, and our degree of self-control.
  • We overestimate the accuracy of our knowledge.
    People are much more certain than they are
    accurate.
  • Overconfidence is a problem in eyewitness
    testimony.
  • Overconfidence is also a problem on tests. If
    you feel confident that you know a concept, try
    explaining it to someone else.
  • Overconfidence Error 1: Performance
  • Overconfidence Error 2: Accuracy

6
Overconfidence continued
  • The "God Complex": we tend to believe that we
    know the answer to complicated problems. This can
    be seen in daily gossip, where individuals
    confidently propose solutions to world issues.
  • Ex. Teachers sometimes decide that some
    individuals and groups are more intelligent than
    others.

7
  • Scientific Attitude: driven by curiosity and
    critical thinking; examining assumptions,
    uncovering hidden values, evaluating evidence, and
    assessing conclusions (TWA and 60 Minutes).
  • http://www.cbsnews.com/8301-505263_162-57590192/twa-flight-800-crash-inside-the-missile-theory/

8
  • Confirmation bias: a tendency to search for
    information that confirms one's preconceptions.
  • Decision makers have been shown to actively seek
    out and assign more weight to evidence that
    confirms their hypothesis, and to ignore or
    underweight evidence that could disconfirm their
    hypothesis.
  • False consensus effect: the tendency to
    overestimate the extent to which others share our
    beliefs, behaviors, and attitudes.
  • Ex: Romantic relationships between people often
    start off with a glow as hormones and false
    consensus overshadow real differences. However,
    the cloud-nine effect eventually wears off as the
    loving couple discovers that they are not, after
    all, that similar (and in fact are often amazingly
    incompatible!).

9
  • Belief bias: the tendency for one's preexisting
    beliefs to distort logical reasoning.
  • Ex: I will accept that some good ice skaters are
    not professional hockey players, but will reject
    the assertion that some professional hockey
    players are not good ice skaters (which, although
    it seems unlikely, is possible).
  • Belief perseverance: the tendency to cling to
    one's conceptions after the basis on which they
    were formed has been discredited.
  • Zombie Apocalypse??

10
Scientific Attitude Part 1: Curiosity
  • Always asking new questions
  • "That behavior I'm noticing in that guy: is that
    common to all people? Or is it more common under
    stress? Or only common for males?"
  • Hypothesis: Curiosity, if not guided by caution,
    can lead to the death of felines and perhaps
    humans.

11
Scientific Attitude Part 2: Skepticism
  • Not accepting a "fact" as true without
    challenging it; seeing if facts can withstand
    attempts to disprove them.
  • Skepticism, like curiosity, generates questions:
    Is there another explanation for the behavior I
    am seeing? Is there a problem with how I
    measured it, or how I set up my experiment? Do I
    need to change my theory to fit the evidence?

12
Scientific Attitude Part 3: Humility
  • Humility refers to seeking the truth rather than
    trying to be right; a scientist needs to be able
    to accept being wrong.
  • "What matters is not my opinion or yours, but
    the truth nature reveals in response to our
    questioning." (David Myers)

13
Research Methods in Psychology
  • Descriptive Methods
  • Naturalistic Observation
  • gathering data about behavior; watching but not
    intervening
  • Case Studies
  • observing and gathering information to compile an
    in-depth study of one individual
  • Surveys and Interviews
  • having other people report on their own attitudes
    and behavior

14
Case Studies
  • Examining one individual in depth
  • Benefit: can be a source of ideas about human
    nature in general
  • Example: cases of brain damage have suggested
    the functions of different parts of the
    brain (e.g., Phineas Gage)
  • Danger: overgeneralization from one example ("he
    got better after tapping his head, so tapping must
    be the key to health!")
  • http://www.youtube.com/watch?v=9QXI_BxlY7M

15
Naturalistic Observation
  • Observing natural behavior means just watching
    (and taking notes), and not trying to change
    anything.
  • This method can be used to study more than one
    individual, and to find truths that apply to a
    broader population.

16
Survey
  • A method of gathering information about many
    people's thoughts or behaviors through
    self-report rather than observation.
  • Keys to getting useful information:
  • Be careful about the wording of questions
  • Only question randomly sampled people

17
  • Correlational research (may include surveys,
    interviews, tests, naturalistic observation,
    longitudinal, and cross-sectional studies)
  • Experimental
  • Quasi-Experimental (no random assignment to
    condition)

18
Research Methods in Psychology
19
Descriptive Research Methods
  • Naturalistic observation
  • Advantages
  • Avoids observer effect/reactivity (of subject)
  • Provides ideas for further research
  • Disadvantages
  • Potentially time consuming and expensive
  • No control over variables or extraneous factors
  • Not replicable
  • Examples: Piaget, naturalistic examples, quasi-experiments

20
Descriptive Research Methods
  • Surveys, interviews, questionnaires and tests
  • Advantages
  • Relatively inexpensive, easy way of collecting
    large amounts of data (attitudes, interests,
    aptitudes)
  • Assuming a true random sample, results are generalizable
  • Disadvantages
  • Poor construction or administration of questions
  • Poor sampling: unrepresentative (not generalizable)
  • Measures beliefs, not behaviors
  • Issues of self-report, memory and honesty

21
Descriptive Research Methods
  • Case studies: of individuals, groups, or
    phenomena
  • Advantages
  • Potentially deeply revealing about individuals
  • Disadvantages
  • No experimental control
  • Sample size extremely small (generalizability?)
  • Potential bias, both subject and experimenter
  • Examples: Phineas Gage; Freud and Little Hans

22
Descriptive Research Methods
  • Archival Research
  • Advantages
  • Enormous amounts of data can be used to see
    trends, relationships, and outcomes
  • Disadvantages
  • No control over how the data were collected or
    whether they are reliable
  • Examples: analysis of studies conducted by other
    researchers, or examination of historical data
    (e.g., the Wild Child)

23
Research Methods
  • Longitudinal method: Examples? Advantages/Disadvantages?
  • Cross-sectional method: Advantages?
  • Cross-cultural method: Purposes?

24
Correlational Studies
  • Correlational studies look at the degree of
    relationship between variables and not the effect
    of one variable on another variable
  • Correlation DOES NOT equal causation. A
    relationship may be suggested, but it does not
    prove that one variable causes the other to
    change. For example, a correlational study may
    suggest a relationship between academic success
    and self-esteem, but it does not mean that
    academic success causes increased self-esteem.

25
Finding Correlations Scatterplots
  • Place a dot on the graph for each person,
    corresponding to the numbers for their height and
    shoe size.
  • In this imaginary example, height correlates with
    shoe size: as height goes up, shoe size goes up.

[Scatterplot: shoe size plotted against height; a minimal plotting sketch follows]
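
As a rough illustration (not from the original slides), the sketch below assumes Python with matplotlib and made-up height and shoe-size numbers; it draws one dot per person, just as the slide describes.

    import matplotlib.pyplot as plt

    # Hypothetical data: one (height, shoe size) pair per person
    heights_cm = [152, 160, 165, 170, 175, 180, 185, 190]
    shoe_sizes = [36, 38, 39, 41, 42, 44, 45, 47]

    plt.scatter(heights_cm, shoe_sizes)   # one dot per person
    plt.xlabel("Height (cm)")
    plt.ylabel("Shoe size (EU)")
    plt.title("Height vs. shoe size (hypothetical data)")
    plt.show()
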
26
Correlational Studies
Participant   GPA   TV hours/week
 1            3.1   14
 2            2.4   10
 3            2.0   20
 4            3.8    7
 5            2.2   25
 6            3.4    9
 7            2.9   15
 8            3.2   13
 9            3.7    4
10            3.5   21
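
To make the idea of "degree of relationship" concrete, here is a minimal sketch (assuming Python; the helper name pearson_r is hypothetical) that computes the Pearson correlation coefficient for the GPA and TV-hours data in the table above.

    import math

    def pearson_r(xs, ys):
        # Pearson product-moment correlation coefficient
        n = len(xs)
        mean_x = sum(xs) / n
        mean_y = sum(ys) / n
        cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
        sd_x = math.sqrt(sum((x - mean_x) ** 2 for x in xs))
        sd_y = math.sqrt(sum((y - mean_y) ** 2 for y in ys))
        return cov / (sd_x * sd_y)

    gpa      = [3.1, 2.4, 2.0, 3.8, 2.2, 3.4, 2.9, 3.2, 3.7, 3.5]
    tv_hours = [14, 10, 20, 7, 25, 9, 15, 13, 4, 21]

    print(round(pearson_r(gpa, tv_hours), 2))  # negative for these numbers

With these numbers the coefficient comes out negative (more TV hours tends to go with lower GPA), the kind of relationship the next slide describes as a negative correlation.
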
27
Correlational Studies
  • Examples of Positive Correlation
  • 1. SAT scores and college grades: those with
    higher SAT scores also tend to have higher grades
    in college
  • 2. Happiness and helpfulness: as people's
    happiness level increases, so does their
    helpfulness
  • Examples of Negative Correlation
  • 1. Education and years in jail: people who have
    more years of education tend to have fewer years
    in jail
  • 2. Crying and being held: babies who are held
    less tend to cry more

28
Scatterplots and Correlation
  • Correlation coefficient (Pearson product-moment
    correlation coefficient) can indicate three kinds
    of relationship:
  • +1.00: perfect positive (direct) correlation
  • -1.00: perfect negative (inverse) correlation
  • 0: no correlation

29
Correlational Studies - Problems
  • Illusory correlation: detecting relationships
    where none exist (e.g., cold weather and catching
    colds). Other examples?
  • Third-variable problem
  • Research showed a strong correlation between
    contraceptive use and the number of electrical
    appliances in the home (Li, 1975). Why?
  • CorrelationMethodsReviewWS

30
Experimentation
  • Important Terms/Concepts. Most you know; all you must know.
  • hypothesis
  • independent/dependent variables
  • operational definitions (quantifiable)
  • population and random/stratified sample
  • representative sample
  • generalizability
  • experimental and control group (or condition)
  • random assignment
  • placebo use and effect
  • confounding variables
  • single and double blind procedures
  • statistical method/significance
  • replication

31
Define your Population
  • Population: the group researchers wish to study
  • All humans?
  • People with depression?
  • Adolescents?

32
Sampling
  • Sample: a subgroup of your population
  • In order for results to be generalizable to the
    population, a sample must be representative (size
    is key)
  • Random sample: everyone in the population has an
    equal chance of being in your sample (see the
    sketch below)
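
A minimal sketch (assuming Python; the "population" here is just a list of made-up ID numbers) of what "equal chance of being in the sample" looks like in practice:

    import random

    # Hypothetical population: 10,000 IDs standing in for people
    population = list(range(10_000))

    # random.sample gives every member the same chance of selection
    sample = random.sample(population, k=100)

    print(len(sample), sample[:5])
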

33
Independent and Dependent Variables
34
Operational Definitions (for Variables)
  • Variables should be clearly defined and
    quantifiable
  • Operational definitions reduce subjectivity and
    expectancy effects and allow for replication (see
    the sketch below)
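
As an added illustration (not from the slides), an operational definition can be thought of as a concrete measurement rule. The sketch below, in Python with a hypothetical "helpfulness" example, replaces a fuzzy concept with a quantifiable score that another researcher could compute in exactly the same way:

    def helpfulness_score(seconds_spent_helping: float, items_donated: int) -> float:
        # Hypothetical operational definition of "helpfulness":
        # minutes spent helping plus number of items donated
        return seconds_spent_helping / 60 + items_donated

    print(helpfulness_score(300, 2))  # 7.0 for this participant
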

35
Confounding (Hidden) Variables
  • Confounding variables - Variables in a study that
    are not controlled for (outside factors, e.g.?)
  • Ways to control for confounding variables:
  • Large sample size (more apt to be representative)
  • Random assignment to groups (control and
    experimental); sketched below
  • Blinding: single vs. double
  • Single blind controls for reactivity (observer
    effects)
  • Double blind controls for expectancy effects
    (researcher bias)
  • Placebos or sham treatment
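
A minimal sketch (assuming Python and hypothetical participant IDs) of the random-assignment step listed above: shuffle the participants, then split them into control and experimental groups so each person is equally likely to land in either one.

    import random

    participants = [f"P{i:02d}" for i in range(1, 21)]  # hypothetical IDs

    random.shuffle(participants)           # randomize the order
    control_group      = participants[:10]
    experimental_group = participants[10:]

    print("Control:", control_group)
    print("Experimental:", experimental_group)
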

36
Experimentation Other Terminology
  • Quasi-Experimental design "experiments that have
    treatments, outcome measures, and experimental
    units, but do not use random assignment to create
    the comparisons from which treatment-caused
    change is inferred." (Cook & Campbell)
  • A between-subjects design: different subjects in
    each condition; this enables random assignment of
    subjects to conditions
  • A within-subjects design: the same subjects, each
    exposed to all of the conditions (uses repeated
    measures)

37
Research pitfalls
  • Experimenter Bias
  • Self-fulfilling prophecy: the experimenter arrives
    at conclusions that support his/her hypotheses
    based on the need to do so, not on the data
  • Halo effect: the tendency for people to transfer
    a positive opinion based on irrelevant
    information; e.g., people tend to think that more
    attractive people are also smarter

38
Research pitfalls
  • Observer effect (aka reactivity): the effect the
    experimenter's presence has on subjects
  • The Hawthorne effect is the tendency for change
    to occur simply because subjects are aware an
    experiment is being conducted
  • Social desirability bias is the tendency for
    subjects to respond in an experiment in a way
    that they believe is most socially desirable

39
Ethics in Experimentation
  • APA Requirements/Guidelines - Ethical Principles
    of Psychologists and Code of Conduct (2002)
  • Human experimentation must cause no harm
  • Informed consent
  • Confidentiality
  • Debriefing
  • Research institutions must have an Institutional
    Review Board (IRB)
  • Role of deception? (Baumrind)

40
Animal experimentation
  • Controversies
  • Institutional Animal Care and Use Committees
  • Appropriate Beneficial and Caring (ABC)
    Guidelines
  • Related issues: anthropomorphism, generalization,
    and anthropocentrism