ABA Exam Study Guide

This study guide covers key concepts in behavior analysis including: 1. The three levels of scientific understanding - description, prediction, and control. 2. The core assumptions of behaviorism - determinism, empiricism, experimentation, replication, parsimony, and philosophic doubt. 3. The seven dimensions of behavior analysis - applied, conceptually systematic, technological, analytic, generality, effective, and behavioral. 4. The differences between the three branches of behavior analysis - applied behavior analysis, behaviorism, and experimental analysis of behavior.

Uploaded by

Taylor

EXAM #5 STUDY GUIDE

1. List and define each level of scientific understanding.


Science is a systematic approach for seeking & organizing knowledge about the natural world to achieve a
thorough and concrete understanding of the phenomena under study (e.g., socially significant bx’s)
Three different types of investigations provide different levels of scientific understanding, and each level
contributes to the overall knowledge base in a given field. These levels of scientific understanding include:
• Description → Objective observation to collect facts that can be quantified, classified, and examined for
possible relations with other known facts, often suggestive of hypotheses or questions for additional
research #objectiveobservation #nocause
• Prediction → Relative probability that when one event occurs, another event will, or will not, occur based
on repeated observations to identify correlations between events #correlations #notcausation (e.g.,
college ≠ freshman 15)
• Control → highest level of scientific understanding from which functional relations can be derived

2. List and define each assumption of the science of behavior.


• Determinism → Assumption upon which science is predicated, presuming that the universe is a lawful &
orderly place in which all phenomena occur as a result of other events #causeandeffect #lawfulness #if/then
#orderlyandpredictable
– Events do not just occur at will
– Events are related in systematic ways
• Empiricism → Objective observation of phenomena of interest #FACTS #experimental #observation
– The practice that all scientific knowledge is built upon
• Experimentation → Controlled comparison of some measure of the phenomenon of interest (DV) under
2+ different conditions in which only 1 factor at a time (IV) differs from 1 condition to another #manipulate
#functionalrelation #experimentalanalysis
• Replication → The repetition of experiments, including repetition of IV conditions within experiments, to
determine the reliability and usefulness of findings #reliability #repeatability #replication
• Parsimony → Simple, logical explanations must be ruled out, experimentally or conceptually, before more
complex or abstract explanations are considered #KISS #simplefirst
– Helps to fit findings within the field’s existing knowledge base
• Philosophic Doubt → The continuous questioning of the truthfulness & validity of all scientific theory &
knowledge that involves the use of scientific evidence before implementing a new practice, then
constantly monitoring the effectiveness of the practice after its implementation. #healthyskeptic
3. List and define the 7 dimensions of behavior analysis.
• Applied → bx’s selected for change are socially significant & of immediate importance to the individual
• Conceptually Systematic → bx ∆ interventions are derived from basic principles of bx
– Assist in integrating discipline into a system instead of a “bag/collection of tricks”
• Technological → procedures are identified & described with sufficient detail and clarity, enabling
replication of the protocols
• Analytic → reasonable demonstration of a functional relation by way of experimental control
• Generality → bx change lasts over time, across settings, & spreads to other bx’s
• Effective → treatment sufficiently improves the target bx to produce practical results reaching clinical or
social significance for the individual
• Behavioral → bx’s chosen must be observable & measurable

4. Describe & explain the differences between the (3) branches of behavior analysis.

• ABA → A scientific approach to improving socially significant bx in which procedures derived from the
principles of bx are systematically applied to improve socially significant bx AND to demonstrate
experimentally that the procedures employed were responsible for the improvement in bx.
• Behaviorism → Focus = theoretical & philosophical issues; the conceptual basis of bx principles as it
relates across many spectrums. Underlying principles are based in radical behaviorism, which attempts to
explain all behavior, including private bx (e.g., thinking, feeling).
– (3) Assumptions RE: private events (e.g., thoughts & feelings):
– Private events = bx
– Bx that takes place within the skin is distinguished from other ("public") bx only by its inaccessibility
– Private bx has no special properties & is influenced by (i.e., is a function of) the same kinds of
variables as publicly accessible bx
• EAB → Basic research conducted in contrived settings with the primary goal of discovering & clarifying
fundamental principles of bx.
5. What factors should you consider when identifying target behaviors?
Identifying POTENTIAL Target Bx’s:
• What is the topography?
• When does the bx occur? How often?
• How does the bx manifest? / What are the antecedents? consequences?
• What duration? magnitude?

Prioritizing Target Bx’s


1. Threat to Health/Safety
2. Frequency
– Opportunities to Use New Bx
– Occurrence of Problem
3. Longevity
– How long has the problem bx been going on?
4. Potential for ↑ Rates of Reinforcement
5. Importance/Relevance
– Skill Development
– Independence
6. Reinforcement for Socially Significant Others
– Social Validity
7. ↓ Negative Attention
8. Likelihood of Success
– Research
– Practitioner’s experience
– Environmental Variables
– Available Resources
9. Cost–Benefit (e.g., Client’s Time & Effort)

6. Describe stimulus-response relations and provide an example of a reflex.


A reflex is a stimulus-response relation consisting of an antecedent stimulus & the respondent bx it elicits
(e.g., knee jerk to tap just below patella)
Respondent bx’s are elicited (“brought out”) by stimuli that immediately precede them, and the antecedent
stimulus and response it elicits form a functional unit called a reflex. In other words, these bx’s are primarily
involuntary responses that occur whenever the eliciting stimulus is present.
Alternatively, operant bx’s are shaped through the stimulus changes, or consequences, that have
immediately followed the bx in the past. This is the S-R-S model for three-term contingencies.
7. Define functional relations.
Functional Relations → Exists when a well-controlled experiment reveals that a specific change in one
event (DV) can reliably be produced by specific manipulations of another event (IV), and the change in the DV
was unlikely to be the result of other extraneous factors (confounding variables)
• Events can only be "co-related" → nearly impossible to factor out all other possible "causes"

8. Define, describe, & provide an example for each of the following data collection strategies:
• ABC Recording = preferred method of bx assessment to determine which bx’s to target for ∆

• Anecdotal Observation → Direct, continuous observation, aka ABC recording, in the client's natural
environment, in which an observer documents a descriptive, temporally sequenced account of all bx's of
interest and the antecedent conditions & consequences for those bx's as they occur.
• Interviews → A form of indirect assessment from which information on the problem bx, antecedents, &
consequences is derived not from direct observation, but from the retrospective report of others.
• Direct Observation → Also referred to as direct assessment; this preferred strategy for data collection
is conducted in the client's natural environment by way of direct & repeated methods, to identify
potential target bx's.
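The ABC (antecedent–behavior–consequence) structure maps naturally onto a simple data record, and tallying behavior/consequence pairs is one way a descriptive log can suggest hypotheses. A minimal sketch in Python; the entries, times, and field names are hypothetical, not from this guide:

```python
from dataclasses import dataclass
from collections import Counter

@dataclass
class ABCEntry:
    """One anecdotal-observation record: the antecedent conditions,
    the bx of interest, and the consequence that followed."""
    time: str
    antecedent: str
    behavior: str
    consequence: str

# Hypothetical session log (illustrative values only)
log = [
    ABCEntry("10:02", "asked to clean up", "screams", "task removed"),
    ABCEntry("10:15", "asked to clean up", "screams", "task removed"),
    ABCEntry("10:40", "peer takes toy", "hits", "adult attention"),
]

# Tallying behavior/consequence pairs can suggest (not prove) a
# maintaining contingency worth testing experimentally
tally = Counter((e.behavior, e.consequence) for e in log)
print(tally.most_common(1))  # [(('screams', 'task removed'), 2)]
```

Note the hedge in the comment: description and correlation only suggest hypotheses; control (an experimental analysis) is what demonstrates a functional relation.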

9. Describe each of the following measurement strategies:


a. Interval Schedules → Reinforcement is provided for the first response following a set or variable
amount of time
b. Ratio Schedules → Reinforcement is provided for emitting a set or variable number of responses
c. Frequency → Number of occurrences of a bx within a given period of time (i.e., rate)
• EXAMPLE: Sarah says hello to her mother 15 times within a period of 30 minutes.
d. Magnitude → the intensity or degree to which a bx occurs, often measured with a recording
instrument or on a rating scale
e. Latency → Time from the occurrence of some stimulus event to the onset of the target bx
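Several of these measures reduce to simple arithmetic on timestamped observations. A sketch, assuming hypothetical event times in seconds from the start of the session (all numbers are illustrative):

```python
# Hypothetical timestamped data: onsets of the target bx, in seconds
# from the start of a 30-minute observation session
response_onsets = [12.0, 45.5, 70.0, 130.0, 240.0]
session_minutes = 30.0

# Frequency/rate: number of occurrences per unit of time
count = len(response_onsets)
rate_per_min = count / session_minutes           # 5 / 30 per minute

# Latency: time from a stimulus event (e.g., an instruction delivered
# at t = 10 s) to the onset of the target bx
instruction_time = 10.0
latency = response_onsets[0] - instruction_time  # 2.0 seconds

# IRT (inter-response time): gaps between consecutive onsets
irts = [b - a for a, b in zip(response_onsets, response_onsets[1:])]
print(count, round(rate_per_min, 2), latency, irts)
```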
10. Define reinforcement. Discuss whether all preferred items always serve as reinforcers or not.
Discuss why it is incorrect to say that reinforcement “doesn’t work.”
Reinforcement → stimulus ∆ in the environment contingent on a response, which ↑ the future frequency
of the response bx
• Important Attributes:
– Time between bx & consequence — Immediacy of reinforcement is critical to create the
relationship between bx and consequence (aka functional relationship)
– Conditions present when bx occurs
– Motivation present for the desire of the consequences
Preferred Items = Reinforcers??? → SOMETIMES, BUT NOT ALWAYS
Sometimes preferred items may not be strong reinforcers. It depends on the individual’s access to and
motivation for the item. If a child loves Minecraft videos and has unrestricted access to these videos at
home, they may not work as a reinforcer at Grandma’s house. Just because Minecraft videos are a highly
preferred item does not mean the child will work for that item.
Preferred items hold the potential to function as reinforcers, but that does not necessarily indicate that
they are also effective. Preferences for stimuli shift over time in relation to several different environmental
variables. These environmental variables can impact the effectiveness of a specified stimulus as a
reinforcer at a given time by either ↑/↓ the value of the stimulus. For example, Sally’s BCBA decided to
remove Skittles as a reinforcer because she noticed Sally would just play with the candy rather than eat it.
One month later, Sally’s BCBA brings back/reintroduces the Skittles to increase the likelihood of Sally
wanting the Skittles. The likelihood for the Skittle functioning as a reinforcer is greater after a month of
removal because Sally has Skittle deprivation.
“Reinforcement doesn’t work.”
Reinforcement refers to any stimulus condition following a bx that ↑ the future frequency of the bx and
similar bx’s. If an ↑ in the future frequency for that bx is not observed, then the stimulus presented did
not function as a reinforcer at all.
Keeping in mind that reinforcement is to be understood from a functional perspective, individual
preferences and aversions, behavioral tendencies and responses, as well as environmental factors, can
dictate any potential for a stimulus to function as a reinforcer in one set of circumstances, but not in
another. Therefore, it is not that reinforcement “didn’t work,” but the stimulus chosen as the reinforcer
may have been a preferred item, but it did not, in fact, function as a reinforcer at all.
11. Define, describe, and provide an example for:

Positive (+) Reinforcement → Response bx immediately followed by ADDITION of some appetitive stimulus in
the environment that ↑ future frequency of response bx & related bx
• EXAMPLE: Billy completes his math homework after school and immediately his mother gives him ice
cream as a reward. In the future, Billy is more likely to complete his math homework right after school so
that his mother will give him ice cream.
Positive (+) Punishment → Response bx immediately followed by ADDITION of some aversive stimulus in the
environment that ↓ future frequency of response bx & related bx
• EXAMPLE: Billy does not take off his shoes after playing outside and tracks mud through the kitchen.
Billy's mother makes him mop up the floor. In the future, Billy is less likely to wear muddy shoes inside to
avoid cleaning the floor.
Negative (–) Reinforcement → Response bx immediately followed by REMOVAL of some aversive stimulus
already present in the environment that ↑ future frequency of response bx & related bx
• EXAMPLE: Billy is given a plate of vegetables to eat with his dinner. Billy screams and his mother
immediately takes the plate of vegetables away. In the future, when Billy is given a plate of vegetables he
is more likely to scream so he does not have to eat his vegetables.
Negative (–) Punishment → Response bx immediately followed by REMOVAL of some appetitive stimulus in
the environment that ↓ future frequency of response bx & related bx
• EXAMPLE: Billy's brother built a tower with his new Lego set. Billy uses the Lego tower as his Nerf gun
target, aims, & knocks down his brother's tower. Billy's mom takes his Nerf toys away from him until he
helps his brother rebuild the entire tower ("until…" = contingency). In the future, Billy is less likely to
knock down his brother's Legos to avoid losing Nerf gun privileges.

Extinction → Withholding/discontinuing reinforcement for a previously reinforced response bx, for which
the primary effect is a ↓ in the frequency of the bx until it reaches pre-reinforcement levels or ultimately
ceases to occur. Aside from the primary effect of extinction (a gradual ↓ in the frequency & amplitude of the
response), the other effects include:
– Extinction Burst → initial, but temporary, ↑ in the frequency & intensity of the bx as the individual
“tries harder” to get the result that was previously reinforced
– Spontaneous Recovery → an ↑ in the magnitude or rate of a previously extinguished bx after
operant extinction has occurred, which appears and then quickly goes away

★ Should simultaneously be teaching (& reinforcing) an alternate behavior!!!!

• EXAMPLE: Johnny has received attention from his mother in the past each time he engaged in
screaming bx. His mother no longer provides attention contingent on screaming. Johnny’s screaming
bx eventually stops because his bx of screaming is no longer being reinforced.
Contingent Observation → A non-exclusionary time-out procedure in which the individual is allowed to
remain within the reinforcing environment but is not permitted to engage in any reinforcing activities for a
predetermined period of time and has to sit and watch others engage in reinforcing activities
EXAMPLE: Everyone in Logan’s class has free-time and they are playing with toys of their choice. During
this free-time, Logan punches one of his classmates. Practicing non-exclusionary time-out, specifically
contingent observation, the teacher guides Logan to the time-out chair within the classroom where
Logan has to sit and watch his classmates enjoy free-time. Logan must sit there for 2 minutes.

Overcorrection/Restitutional Overcorrection → Individual corrects the consequences of their bx by


restoring the environment to an improved state from before the event
EXAMPLE: Removing gum under a desk where the student placed theirs = restitution. HOWEVER,
removing gum from under ALL desks = restitutional overcorrection.

12. Define generalization and maintenance.


• Generalization → The effects of contingencies (reinforcement/extinction/punishment) & learning spread
to other settings, bx’s, or stimuli
• Maintenance → The ability of the individual to continue to demonstrate acquired responses over time,
after reinforcement has been thinned

13. Describe variable and continuous reinforcement contingencies. How are they different?

Continuous Reinforcement Contingencies (CRF) → Presentation of a reinforcer following each


demonstration of the desired behavioral response
• EXAMPLE: If every time you hear the doorbell ring & there is someone on the other side of the door
with a package for you, that would be continuous reinforcement
• Measurement Systems:
• Timing → duration, latency, & IRT
• Event Recording → frequency, count, & DTT (trial by trial or exact count)
Variable Reinforcement Contingencies = INT (Intermittent Schedules of Reinforcement) →
Reinforcement is presented for only some, but not all, instances of a desired response behavior

• Fixed Ratio (FR) → Reinforcement provided for emitting a set # of responses
– Effects on behavior: high response rate; post-reinforcement pause
– EXAMPLE: You get a free latte after you purchase 10 = FR10
• Fixed Interval (FI) → Reinforcement provided for the 1st response after a set amount of time
– Effects on behavior: slow to moderate response rate; scalloped responding
– EXAMPLE: You get a dollar for every 10 minutes you run = FI10
• Variable Ratio (VR) → Reinforcement provided for emitting a variable/changing # of responses on the avg
– Effects on behavior: high, steady response rate; typically no post-reinforcement pause; very resistant
to extinction (MAX # of responses before extinction)
– EXAMPLE: A gambler hits the jackpot after an average of 10 spins on the slot machine = VR10
• Variable Interval (VI) → Reinforcement provided for the 1st response after a variable/changing amount of
time on the avg
– Effects on behavior: stable, consistent response rate; MAX time to extinction
– EXAMPLE: Your crush texts you about every 6 hours = VI6

• Post-Reinforcement Pause → Present on fixed schedules of reinforcement; due to the predictability of
the schedule, the individual may stop responding immediately following reinforcement, resuming based on
the time or response requirement of the schedule
• Limited Hold (LH) → Following the end of an interval, the individual has a set amount of time to
engage in the target bx before the next interval begins
• Schedule Thinning → involves ↑ the ratio/interval schedule for the target bx
• Schedule Thickening → involves ↓ the ratio/interval to provide a denser schedule of reinforcement
Difference(s) → Some vs. ALL behaviors receive reinforcement
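The FR/VR distinction can be made concrete with a small simulation: on FR the reinforcer lands at perfectly predictable points, while on VR the requirement changes after every delivery. A sketch under stated assumptions (the function names are illustrative, and VR targets are drawn uniformly around the mean, one of several reasonable choices):

```python
import random

def fixed_ratio(n):
    """FR-n: deliver reinforcement on every n-th response."""
    count = 0
    def on_response():
        nonlocal count
        count += 1
        if count == n:
            count = 0
            return True   # reinforcer delivered
        return False
    return on_response

def variable_ratio(mean_n, rng):
    """VR-mean_n: deliver reinforcement after a changing number of
    responses averaging mean_n (drawn uniformly from 1..2*mean_n-1)."""
    count, target = 0, rng.randint(1, 2 * mean_n - 1)
    def on_response():
        nonlocal count, target
        count += 1
        if count >= target:
            count, target = 0, rng.randint(1, 2 * mean_n - 1)
            return True
        return False
    return on_response

# FR10: reinforcement lands on exactly the 10th, 20th, and 30th responses,
# which is what makes the post-reinforcement pause possible
fr10 = fixed_ratio(10)
fr_hits = [i + 1 for i in range(30) if fr10()]

# VR10: reinforcement is unpredictable response-to-response, which is
# why VR schedules resist extinction (no reliable "pause point")
vr10 = variable_ratio(10, random.Random(0))
vr_hits = [i + 1 for i in range(30) if vr10()]
print(fr_hits, vr_hits)
```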
14. Describe & provide examples of extinction of behaviors maintained by positive reinforcement
and extinction of behaviors maintained by negative reinforcement. How are they different?
• Extinction of Bx maintained by (+) reinforcement → Bx’s maintained by (+) reinforcement are
placed on extinction when those bx’s do not produce the reinforcer
– Individual gets something they did not have prior to engaging in the bx
EXAMPLE: Susie screams and cries when her mother will not buy her M&M’s at the grocery store. Susie’s
mother always ends up buying the M&M’s so Susie will stop screaming. Susie engages in this bx to get
the M&M’s. Therefore, Susie’s screaming/crying bx is being (+) reinforced. HOWEVER, if Susie’s mother,
decided to no longer buy the M&M’s when she screamed and cried, Susie’s mother would apply an
extinction procedure to the bx because she would no longer be providing (+) reinforcement, M&M’s.
EXAMPLE 2: When Lola wanted gummy bears, she generally screamed at her mother until she gave her
some gummy bears. To reduce Lola’s screaming bx to get gummy bears, Lola’s mother would apply a
procedure for the extinction of bx maintained by (+) reinforcement by no longer providing Lola with
gummy bears when she screamed.

• Extinction of Bx maintained by (–) reinforcement → Bx’s do not produce a removal of the aversive
stimuli, which means that the individual cannot escape the aversive situation
– AKA escape extinction (extinction of escape-maintained bx)
EXAMPLE: Betsie cries so the teacher will take her to the other room where she does not have to do
Jumpstart with the rest of her classmates. Betsie’s teacher can extinguish this bx by making Betsie stay in
the classroom, even when she is crying/tantruming.

15. Define avoidance and escape contingencies. How are these two contingencies different?
Avoidance → response bx prevents or postpones the presentation of an aversive stimulus
• EXAMPLE 1:
– EO: Raining outside…you are still inside –– nice & dry.
– SD: Friend Says, “Do you have an umbrella?”
– R: Put up umbrella (prior to going outside)
– SR(–): AVOID rain falling on your head
★ Reinforcer = avoidance of the EO (aka aversive stimulus)

• Avoidance Contingencies: #negativereinforcement


• Discriminated → Your BFF texts you & tells you that your ex is at the gym, so you go deadlift at
another gym. #SD #headsup
• Free-Operant → You know your ex goes to a particular gym, so you buy a membership to a new gym.
#noSD #nosignal
• EXAMPLE 2: I don’t even want to get pregnant so I’m going to avoid it by taking birth control
Escape → any response designed to move away from or eliminate an already present aversive stimulus
• EXAMPLE 1:
– EO: Rain falling on your head as you walk down the street
– SD: Friend Says, “Do you have an umbrella?”
– R: Put up umbrella
– SR(–): ESCAPE rain falling on your head
★ Reinforcer = termination of the EO (aka aversive stimulus)

• EXAMPLE: The condom broke and now I need to escape what has just happened & take Plan B
• Why?
– These bx’s remove an undesired situation or person
• When?
– Escape bx’s occur at a time in which something is viewed as being too hard, too boring, or too loud
• How to respond?
– Teach the learner to request a break when needed. Divide tasks into small parts or give within-
activity choices (e.g., If child flops out of their chair during a work session, provide them with
access to a break card)

16. What is shaping? How can you use it to teach someone a new behavior?
Shaping → differential reinforcement of successive approximations to the final performance of a desired bx
• When we shape a bx, we provide praise & reinforcement for each “baby step” of a larger NEW skill
• When just beginning to shape a new bx, a continuous reinforcement schedule should be used for
the initial responses, and again each time you increase the criterion for reinforcement
• Shaping a bx within a response topography means that the form of the bx remains constant, but
differential reinforcement is applied to a dimension of the bx
– EXAMPLE 1: Shaping to teach an individual to sign “please” to get access to a toy → First reinforce
lifting the hand, then reinforce lifting the hand to the chest, then reinforce lifting the hand to the chest
& making a circular motion.
– EXAMPLE 2: When a baby learns to walk, we reinforce each part of the process: crawling — standing —
single steps — walking
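The "raise the criterion" logic of shaping can be sketched as a loop: reinforce responses at or above the current approximation, withhold reinforcement for earlier forms, and advance the criterion once responding is reliable. A minimal sketch, assuming each response is scored by the highest step it approximates (the mastery rule of 3 consecutive reinforced responses is illustrative, not a clinical protocol):

```python
def shaping_session(responses, steps, mastery=3):
    """Differentially reinforce responses at/above the current
    approximation step; raise the criterion after `mastery`
    consecutive reinforced responses."""
    step, streak, delivered = 0, 0, []
    for level in responses:
        if level >= step:                    # meets current criterion
            delivered.append(True)           # reinforce
            streak += 1
            if streak >= mastery and step < len(steps) - 1:
                step, streak = step + 1, 0   # raise the bar
        else:
            delivered.append(False)          # withhold (earlier forms on extinction)
            streak = 0
    return delivered, steps[step]

steps = ["lift hand", "hand to chest", "hand to chest + circular motion"]
# Learner's responses, scored as the highest step they approximate
responses = [0, 0, 0, 1, 1, 1, 2, 2, 2]
delivered, final = shaping_session(responses, steps)
print(delivered.count(True), final)  # 9 hand to chest + circular motion
```

Note the design choice encoded in the `else` branch: once the criterion rises, earlier approximations no longer produce reinforcement, which is what makes the reinforcement differential.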
17. Define the following terms:
a. MO → any environmental variables that (a) alters the effectiveness of some stimulus, object, or event as
a reinforcer; and (b) alters the current frequency of all bx that has been reinforced by that stimulus,
object or event
• EXAMPLE 1: Removing Skittles as a reinforcer because you notice the child just plays with the candy
and doesn’t eat it. Then bringing the Skittle back the next month to increase the likelihood the child
will want the candy.
• EXAMPLE 2: When you have mowed the lawn on a hot day, a cold glass of lemonade is much more
motivating & rewarding than it is on a cold, snowy day sitting in your jams by the fire.

MO vs. SD
MO SD
Pertains to the value of the reinforcer for the Functions as a signal for the availability of a
individual in the environment particular reinforcer
Do I care about/Am I interested in the Can I have it?
consequences being offered? Is it available?
Does it make me want more? Does it mean I can have it?

b. EO → an MO that ↑ the reinforcing effectiveness or value of some stimulus, object, or event (e.g., food
deprivation establishes food as an effective reinforcer; Skittle deprivation establishes Skittles as an
effective reinforcer)
• You have been driving for 5 hours & you look down at the gas gauge of your car only to see the tank
is practically on empty. This is your EO, as the gas station is becoming more & more
valuable as you drive. Several minutes later, you see a gas station on the left over the highway.
This is the SD. You now take the exit & drive over the highway (something you would not have
done if you didn’t see the gas station).
c. AO → an MO that ↓ the reinforcing effectiveness or value of a given stimulus, object or event.
d. Abative Effect → a ↓ in the current frequency of bx that has been reinforced by some stimulus, object
or event (e.g., food ingestion abates bx that has been reinforced by food)
e. Evocative Effect → an ↑ in the current frequency of bx that has been reinforced by the stimulus, object
or event (e.g., food deprivation evokes, or ↑ the frequency of, behavior that has been reinforced by food)
18. What is operant stimulus control? How do you know when operant stimulus control has been
achieved?
Operant stimulus control has been achieved when a response occurs more frequently in the presence of
a specific stimulus, but rarely occurs in the absence of that stimulus.
Stimulus Control → Bx that occurs more often in the presence of an SD than S∆
• Occurs when rate, latency, duration, or intensity/amplitude of a response is altered in the presence of
an antecedent stimulus
• SD → signals availability of reinforcement
• S∆ → signals that reinforcement is NOT available

19. What are the functional properties of language?


Pertain to the causes of the verbal response & are functionally under antecedent control:
• MO → Mand
• Non-Verbal SD → Tact
• Verbal SD → Intraverbal
• Vocal SD → Echoic & Transcription
• Written SD → Textual

20. Define and provide an example of point-to-point correspondence and formal similarity.
• Point-to-Point Correspondence → 2+ stimulus components control 2+ response components; the
beginning, middle, & end of the verbal stimulus match the beginning, middle, & end of the response
– EXAMPLE: an echoic (hearing "ball" → saying "ball") and a textual (seeing the printed word "ball" →
saying "ball") both have point-to-point correspondence
• Formal Similarity → the SD & the response product are in the same sense mode & they physically
resemble each other
– EXAMPLE: an echoic has formal similarity (both stimulus & response product are auditory); a textual
does not (printed stimulus, vocal response product)
