studies behavior that's an outcome of the environment
two areas of behaviorism:
research (the experimental study of basic behavioral principles) and applied (taking what we learn in research and using it in everyday life; behavior modification--> changing behavior)
everything people say or do; actions, reflexes, habits, verbal behavior; involves actions, not labels; describe the actions that define or identify the behavior (ex: label = anger; description = kicking, screaming, etc.); involves physical dimensions: frequency, duration, and intensity; labels are often mistaken as a cause; the description points out the individual behaviors that we can actually change
how often a behavior occurs
how long a behavior lasts
magnitude or strength of the behavior
is on the other end of the spectrum from behavioral psychology
what are two ways to describe a type of behavior?
overt (observable; can be observed, described, and recorded) or covert (private events such as thinking); it has an impact on the environment (changes something in the environment around you or something social)
what is all behavior considered?
lawful; it always affects the environment; there is always a relationship between behavior and the environment
field concerned with analyzing (identifying behavior, describing it and understanding why it's happening) and modifying behavior (developing ways to change a behavior); it relies on direct assessment
what are the characteristics of behavior modification?
focus on behavior (meant to change the behavior not the person's traits), based on basic behavioral principles (came from research and can be applied to people), focused on current environmental events (controlling variables -what controls behavior), procedures are clearly described, treatment used by people in everyday life, measurement of behavior change, de-emphasize past events as cause of behavior, rejection of hypothetical "underlying" causes (a.k.a. "Explanatory fictions")
measurement of behavior change does what?
a measurement/observation is made before and after treatment and the behavior is compared; changes can be immediate or long term
de-emphasizing past events as cause of behavior means what?
focusing on how what's happening now influences behavior, rather than on events in the past
what type of behavior analysis, or what focus of behavior does this class fall under?
applied behavior analysis (we apply principles to change behavior); another type of behavioral analysis is experimental analysis of behavior or simply Behavior Analysis (this is the scientific study of behavior)
physiologist; discovered basic processes of respondent conditioning; found a reflex could be conditioned to a neutral stimulus
Law of Effect (behavior that produces a favorable outcome is more likely to be repeated); beginning of operant conditioning
father of behaviorism; all behaviors are conditioned; conducted the Little Albert experiment
main figure now in behaviorism; respondent vs. operant conditioning; laid foundation for behavior modification; "Skinner Box" or conditioning chamber (led to better experimental control; he would look at rats' baseline behavior before training them, then provide food when the rats pushed a lever; lever pressing increased and the behavior was strengthened; then he would break the contingency and stop providing food when the lever was pressed); represents operant behavior
What are some examples of areas of application in the study of Behavior?
developmental disabilities; mental illness; rehabilitation; clinical psychology; self-management; health psychology; etc.
observing and recording changes in behavior; measurement of target behavior(s); assess before, during and after treatment
you'll know if treatment is even worth it, or can choose the best one
during treatment assessment:
does treatment need to be changed?
after treatment assessment:
did the treatment work? need to go back and try something new?
What types of behavior are there?
target (one that can be modified); behavioral excess (target behavior you want to decrease; ex: smoking, eating fast food); behavioral deficit (target behavior you want to increase; ex: studying, exercising)
What types of Behavioral Assessment are there?
indirect (gain information of behavior from the person telling of it or others who saw it; ex: interviews, questionnaires, rating scales, etc.); and direct (observing and recording behavior as it happens; ex: direct observation/recording)
What are the advantages and disadvantages of indirect behavioral assessment?
it's easy, cheap, and efficient; however, it can lead to reports that are incomplete, biased, or inaccurate
What are the advantages and disadvantages of direct behavioral assessment?
it's more accurate; however, it's costly and time consuming
use action verbs; be objective, unambiguous, no inferences used/made about internal states; sometimes use interobserver reliability (two different people should agree the behavior observed matches the definition)
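Interobserver reliability is typically reported as a percent agreement. As a hypothetical sketch (the function name and the interval-by-interval records are illustrative, not from the notes), agreement between two observers' records could be computed like this:

```python
# Hypothetical sketch: interval-by-interval interobserver agreement (IOA).
# Assumes two observers each recorded, for the same observation intervals,
# whether the target behavior occurred (True) or not (False).

def interval_ioa(observer_a, observer_b):
    """Percent of intervals in which both observers agree."""
    if len(observer_a) != len(observer_b):
        raise ValueError("records must cover the same intervals")
    agreements = sum(a == b for a, b in zip(observer_a, observer_b))
    return 100 * agreements / len(observer_a)

a = [True, True, False, True, False, False]
b = [True, False, False, True, False, False]
print(interval_ioa(a, b))  # agreement on 5 of 6 intervals
```

High agreement suggests the behavioral definition is objective and unambiguous enough that two people observe the same thing.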
what are the logistics of recording behavior?
the who, when, and where; who's the observer? (someone other than the subject, or are they self-monitoring?); when? (the observation period); where? (a natural setting where the behavior occurs on its own, or a contrived setting such as a lab or simulated environment, where there is more influence but also more control)
What types of recording methods are there?
continuous (constantly observe/record each instance of behavior; identify onset and offset, frequency, duration, intensity, latency or time until onset); real-time (exact time of onset and offset; frequency and duration); product (indirect; used when behavior results in a tangible outcome of interest; advantage: observer isn't needed; disadvantage: who engaged in the behavior?); interval (observe behavior during certain times; partial vs. whole interval)
how many times behavior occurred in a time period
examples of recording instruments:
ex: data sheet with paper and pencil; stopwatch; notepad to tally; a sticky note each time; pedometer; recording must be immediate and practical
person behaves differently because they know they're being observed; to limit this: wait until the person being observed is used to having you there; record discreetly; use self-monitoring
primary tool used to document behavior; represents occurrence of behavior over time; compare level of behavior before and after treatment; used to evaluate behavior change and make decisions
what are the components of a graph?
variables; axis labels (dimensions of behavior and unit of time); numbers on axes (x-axis: unit of time; y-axis: unit of behavior measurement); data points (each indicates one period of observation and the level of behavior; data is plotted on the graph; could be one instance or an average of instances; connected by a line); phase lines; phase labels
vertical line separating different conditions/phases; change in treatment; phase is a period of time in which same treatment is in effect; data points aren't connected across phases
each phase must be labeled to indicate treatment phase; baseline is when there is no treatment phase; treatment phase is the manipulation or intervention
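Since the graph is used to compare the level of behavior across phases, a minimal sketch of that comparison (the phase labels and data values below are made-up examples, not from the notes) might look like:

```python
# Hypothetical sketch: comparing the level of a target behavior across the
# phases of a simple A-B graph. Each phase holds the data points (one per
# observation period) recorded while that condition was in effect.
phases = {
    "baseline (A)": [9, 8, 10, 9],   # e.g., tantrums per day, no treatment
    "treatment (B)": [6, 4, 3, 2],   # same behavior once treatment begins
}

for label, points in phases.items():
    level = sum(points) / len(points)
    print(f"{label}: mean level = {level:.2f}")
```

A drop in the mean level from baseline to treatment is the kind of change the graph makes visible; data points would not be connected across the phase line.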
used to determine if treatment was/wasn't effective/responsible for behavior change; rule out the possibility of a confound (another variable, one the researcher didn't plan for, that could be responsible for the behavior change)
the manipulation by the researcher
the level of the target behavior
what is the research method in behavior modification?
measure the dependent variable; manipulate the independent variable; demonstrate a change in the target behavior; replicate
relationship between behavior and the environment; behavior modification procedure caused the change in behavior; behavior changes as a function of procedure; cause/effect; establish if: behavior changed after the independent variable manipulation and all other variables were held constant; the process is replicated with the same results
what are the types of designs?
AB; reversal (ABAB); multiple baseline; alternating treatments; and changing criterion
baseline (A), treatment (B); advantage: easy to compare the 2 phases (how behavior has changed or is different); disadvantage: not a true research design because it hasn't ruled out any confounds and doesn't demonstrate a functional relationship; this type of design is mostly used in applied settings, not research
reversal (ABAB) design:
procedure returns to baseline and then treatment is implemented again; baseline, treatment, baseline, treatment; extension of the AB design with baseline and treatment implemented twice; demonstrates a functional relationship; with more replication (if results are the same) it is less likely a confound was present; concerns with this design: is it ethical to take away a treatment that changed the behavior? and can the treatment even be removed?
can't always do ABAB; instead, establish baselines and implement treatment at different times or settings; there are 3 types of this particular design: across subjects, across behaviors, and across settings; all 3 of these demonstrate a functional relationship because treatment is replicated across two or more baselines (rules out confounds or coincidences)
treatment implemented across several subjects at different times; treatment is staggered across time for each subject; this rules out confounds
treatment implemented for two or more behaviors of the same subject; treatment staggered across behaviors for one subject
treatment implemented across two or more settings in which the same behavior of the same subject is measured; treatment staggered across settings for the same behavior in the same subject
baseline and treatment rapidly alternate
within treatment phase, performance criteria (goal) changes as behavior successively meets short-term goals
which design doesn't demonstrate a functional relationship?
process by which a behavior is strengthened by the consequences that follow it; strengthened if frequency, duration, and intensity increases; it is a functional relationship; defined by the overall effect on the future occurrences of behavior
Law of Effect (Thorndike):
first to study effect of consequences on behavior; if the behavior had favorable outcomes, it was likely to be repeated
behavior controlled by the environment; behavior acts on the environment to produce a consequence; behavior is strengthened through reinforcement; its consequences control/modify behavior; operates on the environment or in response to the environment
strengthens the behavior (can be ANYTHING); defined by the effect it has on behavior not the nature of the stimulus; there are 3 different steps: behavior happens, consequence follows, behavior is strengthened
What are two types of reinforcement?
positive and negative
produces a stimulus to strengthen behavior (such as rewards); something is added to the environment; followed by presentation of a stimulus and behavior is strengthened (behavior, or operant, the addition of a stimulus, or reinforcer, and the behavior is strengthened)
takes away something from the environment to strengthen behavior; removal of a stimulus (an aversive stimulus: one whose presentation weakens behavior) and the behavior is strengthened; defined by the effect it has on behavior, not the nature of the stimulus; the stimulus acts as a reinforcer when it's removed from the environment (behavior, removal of aversive stimulus, behavior is strengthened)
how do you identify whether a reinforcer is positive or negative?
determine what the operant is; decide what happened after the behavior (stimulus added or removed?); then see what happened to the behavior (was it increased?)
what types of negative reinforcement are there?
escape (behavior results in the termination of, or escape from, the aversive stimulus and the behavior is strengthened; contact with the aversive stimulus is removed) and avoidance (behavior results in the prevention of, or avoidance of, the aversive stimulus and behavior is strengthened; never actually come into contact with the aversive stimulus; have to have some kind of warning stimulus or signal with this)
What are the different types of reinforcers?
unconditioned (primary; biologically relevant; don't need to learn that it has reinforcing value; ex of positive: food, water, sex; ex of negative: removal of pain, cold, heat), and conditioned (secondary; once neutral but established as a reinforcer; value has to be learned; ex: parents' attention, gold star, clicker training, money); some are called generalized conditioned reinforcers (represent a variety of primary or secondary reinforcers; ex: money)
opportunity to engage in a behavior can act as a reinforcer for another behavior; high-probability behaviors can reinforce low-probability behaviors (ex: have to finish homework before allowed to go outside)
What factors influence Reinforcement?
immediacy; contingency; establishing operations; characteristics of the reinforcer; individual differences
contiguity; temporal relationship between a response and reinforcer (time between behavior and delivery of reinforcer/consequence); consequence is most effective as a reinforcer when it occurs immediately after the behavior (short delay is most effective; longer delay weakens the relationship, with long delay you could unintentionally reinforce a different behavior); this explains why some behaviors are more "addictive" than others (immediate vs. delayed gratification)
consistency; degree of correlation between behavior and consequence; relationship between behavior and consequence (perfect correlation = the behavior produces the consequence and the consequence never occurs without the behavior first); a consequence is most effective as a reinforcer when it always follows the behavior; rate of learning is directly related to how consistently the consequence follows the behavior
establish the effectiveness of a reinforcer at a particular time; any event that changes the value of a stimulus as a reinforcer; satiation level (less likely we'll participate in a behavior for the reinforcer), deprivation level (more likely we'll be engaging in a behavior because of a reinforcer); reinforcer effectiveness varies directly with time since last consumption of that reinforcer (especially for primary reinforcers)
characteristics of the reinforcer:
size matters (larger reinforcers are more effective than small ones); intensity, magnitude, and amount
individual preferences; varying satiation points; varying ability (strength, "intelligence", fatigue); competing contingencies (there is always a choice of something else; each behavior could have different consequences)
schedules of reinforcement:
rules specifying when a consequence is delivered as a result of a behavior (which responses will be followed by the reinforcer); different ones result in reliable changes in behavior; reinforcers can be delivered based on a response or based on the passage of time (time = interval schedules; responses = ratio schedules); reinforcers can be delivered on a predictable (fixed) or unpredictable (variable) basis; schedules can be any combination of these
(CRF); each response is followed by the reinforcer; used for acquisition (learning) of new behaviors; this isn't the best for maintaining behavior; better for new behaviors (reinforcing every instance of a behavior long-term is almost impossible)
reinforcer is delivered on some but not all occasions of behavior; used for maintenance of acquired behaviors (there are many possible ways to arrange intermittent schedules of reinforcement)
fixed interval schedule:
reinforcer is delivered for the first response after a set amount of time (the time is the same/predictable); not directly related to responding (responding faster doesn't make the reinforcer come sooner); results in a pattern of behavior in which the rate of responding increases as the time of reinforcer delivery approaches
variable interval schedule:
reinforcer is delivered for the first response after a variable amount of time; time is different for each reinforcer; it's unpredictable; it's not directly related to responding; results in moderate to high steady rates of responding
fixed ratio schedule:
reinforcer is delivered after a set number of responses; a number of responses required to receive a reinforcer is the same for each reinforcer (it's predictable); directly related to responding; results in high rates of behavior in bouts, followed by a break in behavior upon reinforcement
variable ratio schedule:
reinforcer delivered after a variable number of responses; the number of responses varies between each reinforcer (it's unpredictable); results in the highest, steadiest rates of behavior
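The difference between fixed and variable ratio schedules can be sketched in code. This is a hypothetical illustration (the function names are made up, and the variable-ratio version uses a random-ratio approximation, where each response is reinforced with probability 1/n so the average ratio is about n):

```python
# Hypothetical sketch: which responses produce the reinforcer under
# ratio schedules. FR-n reinforces every nth response (predictable);
# VR-n reinforces after an unpredictable number of responses averaging n,
# approximated here as a random-ratio schedule.
import random

def fixed_ratio(n, responses):
    """Return the response indices (1-based) that produce the reinforcer."""
    return [i for i in range(1, responses + 1) if i % n == 0]

def variable_ratio(n, responses, rng):
    """Each response reinforced with probability 1/n (mean ratio ~ n)."""
    return [i for i in range(1, responses + 1) if rng.random() < 1 / n]

print(fixed_ratio(5, 20))                       # [5, 10, 15, 20]
print(variable_ratio(5, 20, random.Random(0)))  # unpredictable indices
```

The predictability of FR is what produces post-reinforcement pauses, while the unpredictability of VR sustains high, steady responding.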
ratio schedules are more likely to be used in training new behaviors because they produce higher, steadier rates than interval schedules
reinforcer delivered for continuous performance of a behavior for some period of time
when two or more schedules of reinforcement are in effect at one time for different behaviors
procedure in which the reinforcement of a previously reinforced behavior is discontinued; this leads to the decrease of the behavior; zero probability of reinforcement (the behavior is never followed by the reinforcer); doesn't require the application of aversive stimuli (it isn't the same thing as punishment although they do both weaken behavior)
The Extinction Burst:
when extinction is first applied, behavior increases; there can be an increase in frequency, duration, or intensity of the behavior; there can be an increase in "emotional" and/or aggressive behaviors; there can be an increase in novel/new behaviors (can be useful when trying to elicit new behaviors)
What are the behavioral effects of extinction?
spontaneous recovery (behavior occurs again later after it was eliminated with extinction); context (natural tendency of behavior to occur again in situations similar to those before extinction); and resurgence (the re-emergence of a previously extinguished behavior when another, more recently reinforced behavior is put on extinction)
extinction and positive reinforcement:
the positive reinforcer is no longer delivered following the behavior
extinction and negative reinforcement:
the aversive stimulus is no longer removed following the behavior (escape/avoidance is no longer provided)
what are the misconceptions about extinction?
it isn't just ignoring (although ignoring can serve as extinction if attention was the reinforcer for the behavior); it's not punishment (punishment presents an aversive stimulus to decrease behavior; extinction simply withholds the reinforcer that was maintaining the behavior)
resistance to extinction:
some behaviors take longer to extinguish; resistance is the continued responding during the extinction process; behavior that continues to occur during extinction shows greater resistance to extinction; behavior that diminishes more quickly shows less resistance
what factors influence extinction?
schedule of reinforcement; reinforcement during extinction; history of reinforcement; response effort
schedule of reinforcement influencing extinction:
ex: continuous reinforcement = faster extinction; intermittent = more gradual, more resistance to extinction because it's harder to distinguish between extinction and intermittent reinforcement (the ability to tell the difference between the two becomes difficult)
reinforcement during extinction influencing extinction:
behavior will take longer to extinguish; extinction becomes intermittent reinforcement
history of reinforcement influencing extinction:
behavior with a long history of reinforcement may have more resistance to extinction than a behavior with a shorter history of reinforcement
response effort influencing extinction:
a response requiring greater effort may diminish more quickly during extinction than a response requiring less effort