Transcript and Presenter's Notes

Title: Operant Conditioning Schedules


1
Chapter 6
  • Operant Conditioning Schedules

2
Schedule of Reinforcement
  • Appetitive outcome --> reinforcement
  • As a shorthand we call the appetitive outcome
    the reinforcer
  • Assume that we've got something appetitive and
    motivating for each individual subject
  • Fairly consistent patterns of behaviour
  • Cumulative recorder

3
Cumulative Record
  • Cumulative recorder
  • Flat line = no responding
  • Slope = response rate (steeper line = faster responding; see
    the sketch below)

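To make the flat-line/slope idea concrete, here is a minimal sketch (my own
illustration, not code from the chapter; the function name and parameters are
hypothetical) that turns a list of response times into cumulative-record points:

    def cumulative_record(response_times, session_length, step=1.0):
        # Each response moves the "pen" up one unit while time moves the paper
        # along, so no responding shows up as a flat line and fast responding
        # as a steep slope.
        responses = sorted(response_times)
        points, count, i, t = [], 0, 0, 0.0
        while t <= session_length:
            while i < len(responses) and responses[i] <= t:
                count += 1
                i += 1
            points.append((t, count))
            t += step
        return points

    # Example: a burst of early responding, then a pause (flat line)
    print(cumulative_record([0.5, 1.0, 1.5, 2.0, 9.0], session_length=10))
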
4
Cumulative Recorder
[Diagram: cumulative recorder; the pen steps up with each response while the paper strip moves at a constant speed]
5
Recording Responses
6
The Accumulation of the Cumulative Record
[Figure: cumulative record accumulating under a VI 25 schedule]
7
Fixed Ratio (FR)
  • N responses required; e.g., FR 25
  • CRF = FR 1 (continuous reinforcement)
  • Rise-and-run
  • Postreinforcement pause
  • Ratio strain

8
Variable Ratio (VR)
  • Varies around a mean number of responses; e.g., VR 25
  • Short, if any, postreinforcement pause
  • Never know which response will be reinforced

9
Fixed Interval (FI)
  • Depends on time; e.g., FI 25
  • Postreinforcement pause; scalloping
  • Clock doesn't start until the reinforcer is given

10
Variable Interval (VI)
  • Varies around a mean time; e.g., VI 25
  • Don't know when the interval has elapsed
  • Clock doesn't start until the reinforcer is given (the four
    basic rules are sketched below)

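The four basic schedules differ only in the rule used to decide, at the moment
a response occurs, whether that response produces the reinforcer. A rough
sketch of those rules (my own illustration; the parameter names and the
simplified random draw are assumptions, not the chapter's procedures):

    import random

    def fixed_ratio_met(responses_since_rft, ratio=25):
        # FR 25: every 25th response is reinforced
        return responses_since_rft >= ratio

    def variable_ratio_met(responses_since_rft, required_this_time):
        # VR 25: the required count varies around a mean of 25, so the
        # subject never knows which response will be reinforced
        return responses_since_rft >= required_this_time

    def fixed_interval_met(seconds_since_rft, interval=25):
        # FI 25: the first response after 25 s since the last reinforcer
        # pays off; the clock starts when the previous reinforcer is given
        return seconds_since_rft >= interval

    def variable_interval_met(seconds_since_rft, scheduled_interval):
        # VI 25: the interval varies around a mean of 25 s, so the subject
        # cannot tell when it has elapsed
        return seconds_since_rft >= scheduled_interval

    # e.g., one simple way to draw the next VR requirement around its mean:
    next_vr_requirement = random.randint(1, 50)
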
11
Response Rates
12
Duration Schedules
  • Continuous responding for some time period to
    receive reinforcement
  • Fixed duration (FD)
  • Set time period
  • Variable duration (VD)
  • Varies around a mean

13
Differential Rate Schedules
  • Differential reinforcement of low rates (DRL)
  • Reinforcement only if X amount of time has passed
    since last response
  • Can produce superstitious behaviours
  • Differential reinforcement of high rates (DRH)
  • Reinforcement only if more than X responses occur in a
    set time (both decision rules are sketched below)

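A small sketch of the two differential-rate decision rules (illustrative names
and threshold values only):

    def drl_met(seconds_since_last_response, required_pause=10.0):
        # DRL: reinforce a response only if at least required_pause seconds
        # have passed since the previous response
        return seconds_since_last_response >= required_pause

    def drh_met(responses_in_window, required_count=10):
        # DRH: reinforce only if more than required_count responses
        # occurred within the current time window
        return responses_in_window > required_count
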
14
Noncontingent Schedules
  • Reinforcement delivery not contingent on responding;
    depends only on the passage of time
  • Fixed time (FT)
  • Reinforcer delivered after a set time elapses, whether or
    not a response occurs
  • Variable time (VT)
  • Reinforcer delivered after a variable time elapses
    (contrasted with FI in the sketch below)

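To keep the noncontingent schedules distinct from FI/VI, a two-function sketch
(illustrative names and values): the noncontingent rule ignores responding
entirely.

    def ft_delivers(seconds_since_last_rft, interval=30.0):
        # FT 30: the reinforcer is delivered as soon as 30 s have elapsed,
        # whether or not any response has occurred
        return seconds_since_last_rft >= interval

    def fi_delivers(seconds_since_last_rft, response_occurring, interval=30.0):
        # FI 30: the interval must elapse AND a response must then occur
        return seconds_since_last_rft >= interval and response_occurring
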
15
Choice Behaviour
16
Choice
  • Two-key procedure
  • Concurrent schedules of reinforcement
  • Each key associated with separate schedule
  • Distribution of time and behaviour

17
Concurrent Ratio Schedules
  • Two ratio schedules
  • Schedule that gives most rapid reinforcement
    chosen exclusively

18
Concurrent Interval Schedules
  • Maximize reinforcement
  • Must shift between alternatives
  • Allows for study of choice behaviour

19
Interval Schedules
  • FI-FI
  • Steady-state responding
  • Less useful/interesting
  • VI-VI
  • Not steady-state responding
  • Respond to both alternatives
  • Sensitive to rate of reinforcement
  • Most commonly used to study choice

20
Alternation and the Changeover Response
  • Maximize reinforcers from both alternatives
  • Frequent shifting becomes reinforcing
  • Simple alternation
  • Concurrent superstition

21
Changeover Delay
  • COD
  • Prevents rapid switching
  • Time delay after a changeover before reinforcement is
    possible (see the sketch below)

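A minimal sketch of the COD rule (illustrative names; the 2-second delay is
just an example value):

    def reinforcement_available(seconds_since_changeover, cod=2.0):
        # After switching alternatives, no response can be reinforced until
        # the changeover delay elapses, making rapid switching unprofitable
        return seconds_since_changeover >= cod
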
22
Herrnstein's (1961) Experiment
  • Concurrent VI-VI schedules
  • Overall rates of reinforcement held constant
  • 40 reinforcers/hour between two alternatives

23
[Data table (values not preserved in transcript): Key | Schedule | Rft/hr | Rsp/hr | Rft rate | Rsp rate]
24
The Matching Law
  • The proportion of responses directed toward one
    alternative should equal the proportion of
    reinforcers delivered by that alternative (see the
    equation below).

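In symbols, with B1 and B2 the response rates on the two alternatives and R1
and R2 the reinforcement rates they deliver, the matching law is usually
written:

    \frac{B_1}{B_1 + B_2} = \frac{R_1}{R_1 + R_2}
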
25
Bias
  • Spend more time on one alternative than strict matching
    predicts (a bias parameter can capture this; see below)
  • Side preferences
  • Biological predispositions
  • Quality and amount

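One common textbook way to accommodate bias (not spelled out on the slide) is
to add a bias parameter b to the ratio form of the matching law, where b > 1
indicates a preference for alternative 1 beyond what its reinforcement rate
predicts:

    \frac{B_1}{B_2} = b \cdot \frac{R_1}{R_2}
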
26
Varying Quality of Reinforcers
  • Q1 = quality of the first reinforcer
  • Q2 = quality of the second reinforcer

27
Varying Amount of Reinforcers
  • A1 = amount of the first reinforcer
  • A2 = amount of the second reinforcer

28
Combining Qualities and Amounts
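The slide's equation image is not in the transcript; a common way to combine
the reinforcement rates with the qualities (Q) and amounts (A) defined on the
previous two slides is:

    \frac{B_1}{B_2} = \frac{R_1}{R_2} \cdot \frac{Q_1}{Q_2} \cdot \frac{A_1}{A_2}
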
29
Extinction
30
Extinction
  • Disrupt the three-term contingency
  • Response rate decreases

31
Stretching the Ratio/Interval
  • Increasing the number of required responses or the length
    of the interval
  • e.g., FR 5 --> FR 50, VI 4 sec. --> VI 30 sec.
  • Risk of extinction if stretched too quickly
  • Shaping: gradual increments
  • Low or high schedules

32
Extinction
  • Continuous reinforcement (CRF) = FR 1
  • Intermittent schedule = everything else
  • CRF is easier to extinguish than any intermittent
    schedule
  • Partial reinforcement effect (PRE)
  • Generally:
  • High (lean) schedules produce more resistance to
    extinction than low (rich) schedules
  • Variable schedules produce more resistance than fixed
    schedules

33
Discrimination Hypothesis
  • Difficult to discriminate between extinction and
    intermittent schedule
  • High schedules more like extinction than low
    schedules
  • e.g., CRF vs. FR 50

34
Frustration Hypothesis
  • Non-reinforcement of a response is frustrating
  • On CRF every response is reinforced, so no frustration
  • Intermittent schedules always have some
    non-reinforced responses
  • Responding leads to the reinforcer (pos. reinf.)
  • Frustration becomes an S^D for reinforcement
  • Frustration grows continually during extinction
  • Stop responding --> frustration stops (neg.
    reinf.)

35
Sequential Hypothesis
  • Response followed by reinf. or nonreinf.
  • On intermittent schedules, nonreinforced responses
    become an S^D for eventual delivery of the reinforcer
  • High schedules increase resistance to extinction
    because many nonreinforced responses in a row
    lead to a reinforced one
  • Extinction is similar to a high schedule

36
Response Unit Hypothesis
  • Think in terms of behavioural units
  • FR 1: 1 response = 1 unit --> reinforcement
  • FR 2: 2 responses = 1 unit --> reinforcement
  • Not response-fail, response --> reinforcer, but
    (response-response) --> reinforcer
  • Says the PRE is an artifact of counting responses
    rather than behavioural units (worked example below)

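As a worked example with purely hypothetical numbers (not Mowrer & Jones's
data): counting units instead of responses divides the extinction count by the
ratio requirement,

    \text{units} = \frac{\text{responses in extinction}}{\text{ratio requirement}},
    \qquad \text{e.g. } \frac{200 \text{ responses on FR 4}}{4} = 50 \text{ units}
    \;<\; \frac{80 \text{ responses on FR 1}}{1} = 80 \text{ units,}

so the apparent advantage of the higher ratio can shrink or vanish once
behaviour is counted in units.
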
37
Mowrer & Jones (1945)
  • Response unit hypothesis
  • The larger number of responses in extinction on higher
    schedules disappears when responding is counted as
    behavioural units

[Figure: number of responses/units during extinction for FR 1 through FR 4, comparing the absolute number of responses with the number of behavioural units]
38
Economic Concepts and Operant Behaviour
  • Similarities
  • Application of economic theories to behavioural
    conditions

39
The Economic Analogy
  • Responses or time = money
  • Total possible responses or time = income
  • Schedule = price

40
Consumer Demand
  • Demand curve
  • Relates the price of something to how much of it is
    purchased
  • Elasticity of demand: how much consumption changes as
    price changes (defined below)

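Elasticity of demand is standardly defined as the percentage change in
consumption produced by a percentage change in price; magnitudes above 1 are
called elastic and below 1 inelastic:

    \text{elasticity} = \frac{\%\ \Delta\ \text{consumption}}{\%\ \Delta\ \text{price}}
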
41
Three Factors in Elasticity of Demand
  • 1. Availability of substitutes
  • Can't substitute complementary reinforcers
  • e.g., food and water
  • Can substitute non-complementary reinforcers
  • e.g., Coke and Pepsi
  • 2. Price range
  • e.g., FR 3 to FR 5 vs. FR 30 to FR 50

42
  • 3. Income level
  • The higher the total responses/time (income), the less
    effect cost increases have
  • Increased income --> purchase luxury items
  • Shurtleff et al. (1987)
  • Two VI schedules: food vs. saccharin water
  • At high schedule requirements (high price), rats spend
    most of their time on the food lever
  • At low schedule requirements (low price), rats increase
    time on the saccharin lever

43
Behavioural Economics and Drug Abuse
  • Addictive drugs
  • Nonhuman animal models
  • Elasticity
  • Work for drug reinforcer on FR schedule
  • Inelastic...up to a point

44
  • Elsmore et al. (1980)
  • Baboons
  • Food and heroin
  • Availability of substitutes