A complex reinforcement procedure in which the participant is permitted to choose which of several simple reinforcement schedules will be in effect. Once a choice has been made, the rejected alternatives become unavailable for some time.
A complex reinforcement procedure in which the participant can choose any one of two or more simple reinforcement schedules that are available simultaneously. Concurrent schedules allow for the measurement of choice between simple schedule alternatives.
A schedule of reinforcement in which every occurrence of the instrumental response produces the reinforcer. Abbreviated CRF.
A graphical representation of how a response is repeated over time, with the passage of time represented by the horizontal distance (or x axis), and the total or cumulative number of responses that have occurred up to a particular point in time represented by the vertical distance (y axis).
A reinforcement schedule in which a response is reinforced only if it occurs before a specified amount of time has elapsed following the preceding response.
differential reinforcement of high rate. Abbreviated DRH.
A reinforcement schedule in which a response is reinforced only if it occurs after a specified amount of time has elapsed following the preceding response.
differential reinforcement of low rate. Abbreviated DRL.
The gradually increasing rate of responding that occurs between successive reinforcements on a fixed interval schedule.
fixed interval scallop
A reinforcement schedule in which the reinforcer is delivered for the first response that occurs after a fixed amount of time following the last reinforcer.
fixed interval schedule. Abbreviated FI
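The FI rule above can be sketched in a few lines of code. This is a minimal illustration, not from the text; the function name and the example times are assumptions.

```python
# Minimal sketch of a fixed interval (FI) schedule: only the first
# response made after `interval` time units have elapsed since the
# last reinforcer is reinforced.
def fi_reinforced(response_times, interval):
    """Return the response times at which reinforcement is delivered."""
    reinforced, last = [], 0.0
    for t in response_times:
        if t - last >= interval:
            reinforced.append(t)
            last = t
    return reinforced

# On an FI 60 schedule, responses at 10 and 30 go unreinforced; the
# first response after 60 time units (here at 65) produces the reinforcer.
print(fi_reinforced([10, 30, 65, 90, 130], 60))  # [65, 130]
```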
A reinforcement schedule in which a fixed number of responses must occur in order for the next response to be reinforced.
fixed ratio schedule. Abbreviated FR
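The FR rule can likewise be sketched as code. The function name and values below are illustrative assumptions, not from the text.

```python
# Minimal sketch of a fixed ratio (FR) schedule: reinforcement is
# delivered after every `ratio`-th response, regardless of timing.
def fr_schedule(ratio, n_responses):
    """Return the (1-based) response indices that produce reinforcement."""
    return [i for i in range(1, n_responses + 1) if i % ratio == 0]

# On an FR 5 schedule, every fifth response is reinforced.
print(fr_schedule(5, 12))  # [5, 10]
```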
A schedule of reinforcement in which only some of the occurrences of the instrumental response are reinforced. The instrumental response is reinforced occasionally, or intermittently. Also called partial reinforcement.
The interval between one response and the next. _______ can be differentially reinforced in the same fashion as other aspects of behavior, such as response force or variability.
interresponse time or IRT
A reinforcement schedule in which a response is reinforced only if it occurs after a set amount of time following the last reinforcement.
A restriction on how long reinforcement remains available. In order for a response to be reinforced, it must occur during the _______ period.
A rule for instrumental behavior, proposed by R. J. Herrnstein, which states that the relative rate of responding on a particular response alternative equals the relative rate of reinforcement for that response alternative.
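The matching law can be written as B1/(B1 + B2) = r1/(r1 + r2), where B is response rate and r is reinforcement rate. A small sketch (function name and example rates are illustrative assumptions):

```python
# Sketch of Herrnstein's matching law: the relative rate of responding
# on alternative 1 is predicted to equal the relative rate of
# reinforcement earned on that alternative.
def predicted_relative_response_rate(r1, r2):
    """B1 / (B1 + B2) = r1 / (r1 + r2)."""
    return r1 / (r1 + r2)

# With 30 reinforcers/hour on key 1 and 10 on key 2, matching predicts
# 75% of responses on key 1.
print(predicted_relative_response_rate(30, 10))  # 0.75
```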
A mechanism for achieving matching in which the participant responds so as to improve the local rates of reinforcement on the available response alternatives.
Greater sensitivity to the relative rate of reinforcement than predicted by perfect matching.
A pause in responding that typically occurs after the delivery of the reinforcement on fixed ratio and fixed interval schedules of reinforcement.
The high and invariant rate of responding observed after the post-reinforcement pause on fixed ratio reinforcement schedules. The ratio run ends when the necessary number of responses have been performed, and the participant is reinforced.
A reinforcement schedule in which reinforcement depends only on the number of responses the participant performs, irrespective of when those responses occur.
Disruption of responding that occurs when a fixed ratio response requirement is increased too rapidly.
A reinforcement schedule in which a response is reinforced depending on how soon that response is made after the previous occurrence of the behavior.
A program, or rule, that determines how and when the occurrence of a response will be followed by the delivery of the reinforcer.
schedule of reinforcement
Less sensitivity to the relative rate of reinforcement than predicted by perfect matching.
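Over- and undermatching are commonly described with the generalized matching law, a power-function extension of matching: B1/B2 = b(r1/r2)^s, where s is sensitivity and b is bias. Perfect matching is s = 1; s > 1 is overmatching, s < 1 undermatching. The function and example values below are an illustrative sketch under that formulation:

```python
# Sketch of the generalized matching law (power-function form):
# the response ratio B1/B2 equals b * (r1/r2) ** s.
def response_ratio(r1, r2, s=1.0, b=1.0):
    """Predicted B1/B2 for reinforcement rates r1, r2 with sensitivity s and bias b."""
    return b * (r1 / r2) ** s

print(response_ratio(4, 1, s=1.0))  # 4.0 (perfect matching)
print(response_ratio(4, 1, s=0.5))  # 2.0 (undermatching: blunted sensitivity)
```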
The mathematical function that describes how reinforcer value decreases as a function of how long a participant has to wait for delivery of the reinforcer.
value discounting function
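A common hyperbolic form of the value discounting function is V = A/(1 + kD), where A is the reinforcer amount, D the delay, and k a discounting-rate parameter. The function name and the values of A, D, and k below are illustrative assumptions:

```python
# Sketch of a hyperbolic value discounting function:
# V = A / (1 + k * D). Value falls off steeply at short delays and
# more gradually at long ones.
def discounted_value(amount, delay, k=0.1):
    """Subjective value of `amount` delivered after `delay` time units."""
    return amount / (1 + k * delay)

print(discounted_value(100, 0))   # 100.0 (no delay, full value)
print(discounted_value(100, 30))  # 25.0  (30-unit delay with k = 0.1)
```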
A reinforcement schedule in which reinforcement is provided for the first response that occurs after a variable amount of time from the last reinforcement.
variable interval schedule. Abbreviated VI
A reinforcement schedule in which the number of responses necessary to produce reinforcement varies from trial to trial. The value of the schedule refers to the average number of responses needed for reinforcement.
Variable ratio schedule. Abbreviated VR.