Unit 6-3 - Operant Conditioning
Conditioning
Mental Note:
Do not eat porcupines.
The Origins of Operant Conditioning
Law of Effect
Edward Thorndike's principle that behaviors followed by favorable consequences become more likely, and behaviors followed by unfavorable consequences become less likely
Rewarded behaviors happen more often, and punished behaviors happen less often.
B. F. Skinner: Father of
operant conditioning and
pigeon enthusiast
The Origins of Operant Conditioning
Skinner’s primary technological contribution was the
operant chamber, also known as the Skinner Box.
a chamber with a bar or key that an animal can manipulate to obtain reinforcers (like food or water) or receive punishers (like electric shocks)
contains attached devices that record the animal's responses
In regard to the following terminology, think of “positive” and “negative” as addition and subtraction, not good and bad.
Reinforcement
any event that strengthens the behavior it follows
Schedules of Partial Reinforcement
Fixed Ratio
reinforces a response after a specified number of responses
the faster you respond, the more rewards you get
Example – being paid for every ten burgers you make
Schedules of Partial Reinforcement
Variable Ratio
reinforces a response after an unpredictable number of responses
very hard to extinguish because of unpredictability
Example – slot machines
Schedules of Partial Reinforcement
Fixed Interval
reinforces a response after a specified time has elapsed
response occurs more frequently as the anticipated time for reward draws near
Example – the morphine drip button that patients use in a hospital (the pump delivers another dose only after a set amount of time has passed)
Schedules of Partial Reinforcement
Variable Interval
reinforces a response at unpredictable time intervals
produces slow, steady responding
Example – pop quizzes
Schedules of Partial Reinforcement
Generally, the rate of acquisition parallels the rate of extinction. Behaviors learned quickly (as with continuous reinforcement) fade quickly; behaviors acquired more slowly (as with partial reinforcement) resist extinction longer.