Learning - Part 2
Skinner Box Experiment: Skinner created an experimental apparatus called the Skinner
Box (or operant conditioning chamber) to study animal behavior, especially in rats and
pigeons. In the box, animals could press a lever (for rats) or peck a disk (for pigeons) to
receive a reward (typically food). Skinner found that animals would repeat behaviors
that produced favorable outcomes, demonstrating the role of reinforcement in
learning.
Process of Operant Conditioning
Operant conditioning relies on the association between a behavior and its consequence. Through repeated interactions, the subject learns to associate specific behaviors with outcomes, leading to an increase or decrease in the frequency of those behaviors. Two key processes are:
Extinction: The gradual weakening of a behavior when it is no longer reinforced. For example, a
rat may stop pressing a lever if pressing no longer produces food.
Shaping: Gradually guiding behavior toward a desired goal by reinforcing successive
approximations (small steps) of the behavior.
Reinforcement vs. Punishment
Positive Reinforcement: Adding a pleasant stimulus to increase a behavior (e.g., giving a reward for good
behavior).
Negative Reinforcement: Removing an aversive stimulus to increase a behavior (e.g., a loud alarm
turning off once the desired action is completed).
Positive Punishment: Adding an aversive stimulus to decrease a behavior (e.g., scolding after
misbehavior).
Negative Punishment: Removing a pleasant stimulus to decrease a behavior (e.g., taking away a
privilege after misbehavior).
Schedules of Reinforcement
Fixed-Ratio Schedule: Reinforcement is provided after a specific number of responses (e.g., after every
5th response).
Effect: High rate of response with a short pause after each reinforcement.
Fixed-Interval Schedule: Reinforcement is given for the first response after a fixed period (e.g., every
5 minutes).
Effect: Response rate increases as the reinforcement time approaches, then drops after
reinforcement.
Variable-Interval Schedule: Reinforcement is provided for the first response after varying, unpredictable
time intervals (e.g., checking email, where new messages arrive at random times).
Effect: Steady and moderate rate of response with strong resistance to extinction.