Former name: Motivation and Reinforcement.
Christopher Newland, Ph.D.
(Fall, 1998)

Course: PG 636 Experimental Analysis of Behavior: Behavior and its Consequences.

Prerequisites: PG 681 Conditioning and Learning.

Meeting Times: Monday, Wednesday 14:00 - 16:00

Text: Readings from the primary and secondary literature. Available at Speedy Print.

Instructor: Christopher Newland, Ph.D.

Office Hours: M W 8:00-10:00

Phone: 844-6479


Overview and Course Objectives. We will examine the roles that consequences, and their scheduling, play in the acquisition, maintenance, and structure of behavior. As befits a graduate seminar, we will devote attention not only to the mechanisms by which consequences select and shape behavior, and to theories about those mechanisms, but also to issues surrounding methods, measurement, and quantification. Topics will include the scheduling of reinforcers, aversive control, conditioned reinforcement, the generation of complex response units, choice, molecular determinants of behavior, mathematical models of behavior, and the dynamics of behavior during transition states.

Course Structure. The course will be structured as a seminar based upon the primary literature and focused literature reviews. Through the course of the term we will review the topics and papers listed on the syllabus. For most topics, a review has been identified that covers the literature and current thinking on a topic. These reviews are thematic, generally have a theoretical position to advance, and are grounded in data. Indeed, many of them present experiments in some detail. Therefore, we will let the review bring us up to date on a topic. Some papers from the primary literature will also be covered during a section. These will be papers representing the experimental approach to a topic, or they will be recent papers on a topic.

Come to class prepared to discuss the literature reviews and prepared to present the primary papers or the experiments covered in detail in a review paper. By this I mean that you should be able to present the authors' rationale, the methods, results, and conclusions with skeletal notes. This does not mean reading highlighted sections directly from the paper. By 8:00 A.M. on the day of class, submit to me at least one question or interpretation of the topic. If something is confusing or needs clarification then I will spend some class time on it. Other questions/observations will be read to the class (anonymously) for discussion. Good questions will be those that generate discussion.

In your discussion of a topic be thorough, concise, and clear. Try the "tell 'em" strategy: "tell 'em what you're going to tell 'em, tell 'em, and tell 'em what you told 'em." Set up each paper by summarizing what question is being addressed, why it is an important question, and what methods might be used to address it. It is the presenter's responsibility to provide the background material required to understand the paper, and this may mean going to the library to read papers referred to prominently. Keep all discussion focused on what was done, what happened, and how this was interpreted. Describe all procedures carefully: say what was done and what happened. Always place behavior in the context of the environmental contingencies, such as the three-term contingency of reinforcement or the contingencies of respondent conditioning. This may be difficult at first, but it is worth the effort. We will not tolerate folksy descriptions of behavior but, instead, will respect the spirit of Lloyd Morgan's canon:

In no case may we interpret an action as the outcome of the exercise of a higher psychical faculty, if it can be interpreted as the outcome of the exercise of one which stands lower in the psychological scale. (From Morgan, C.L. (1894), An Introduction to Comparative Psychology. London: Walter Scott.)

Evaluation. Evaluation will be based upon your presentations, class participation (quality and quantity), and two take-home examinations. The distribution will be as follows: class participation will count as 1/3 of your grade, and each take-home exam will count as 1/3 of your grade.

Following are some of the criteria used to evaluate the presentations:

-- Clear and succinct description of the research question and coverage of the points listed above.

-- Clear description of the methods (include important details, not all details).

-- Graphical presentation of the results (let's use the board and dispense with transparencies).

-- Presentation of the author's conclusions.

-- The extent to which you go beyond the paper and incorporate what you know, or what you are learning in this course. This can be in the form of critical comment on weaknesses, unanswered questions raised, further research prompted by this experiment, or extensions to understanding human behavior.

Following are some of the criteria for evaluating participation of those not presenting:

-- Clear evidence that you have read the paper.

-- Questions asked and points of discussion raised.

-- Insights about how two or more of the papers tie together (especially relevant for those with no responsibilities to present during a class).

-- Participation in discussion.

Students with Disabilities. Students with a disability documented by Auburn's Program for Students with Disabilities should schedule a meeting with me early in the term. I will work with the student to meet the accommodations recommended by the Program for Students with Disabilities.

Schedule (Fall, 1998)

Week of    Week   Topic                                                    Readings

Reinforcement and Aversive Control
15 June    1      Scheduling positive reinforcers                          1-3
22 June    2      Aversive control                                         4-10

Conditioned Reinforcement and the Assemblage of Complex Response Units
29 June    3      Conditioned reinforcement                                11-15
6 July     4      Complex response units: second-order schedules
                  (variability as the operant, Machado; the IRT as the
                  operant, Galbicka and Platt)

Choice, Matching, and Molar Determinants of Behavior
13 July    5      Choice and concurrent schedules: molar analyses          20 (Chapters 1-3), 21-23
20 July    6      Choice and concurrent schedules: generalized matching    20 (Chapter 4), 24-25
27 July    7      Matching and psychophysics                               20 (Chapter 11), 26-27
3 Aug      8      Behavioral economics                                     28-36

Acquisition and Change
10 Aug     9      Dynamics: acquisition                                    37-42
17 Aug     10     Dynamics: behavioral momentum

Molecular Structure in Behavior

24 Aug            Selectionism                                             48-49


[1.] Zeiler, M. (1977). Schedules of reinforcement: the controlling variables. In W. K. Honig & J. E. R. Staddon (Eds.), Handbook of Operant Behavior (pp. 201-232). Englewood Cliffs, NJ: Prentice-Hall.

[2.] Gentry, G. D., & Marr, M. J. (1982). Intractable properties of responding under a fixed-interval schedule. Journal of the Experimental Analysis of Behavior, 37, 233-241.

[3.] Zeiler, M. D. (1979). Output dynamics. In M. D. Zeiler & P. Harzem (Eds.), Reinforcement and the Organization of Behaviour (Vol. 1, pp. 79-115). New York: John Wiley and Sons.

[4.] Baron, A. (1991). Avoidance and punishment. In I. H. Iversen & K. A. Lattal (Eds.), Experimental Analysis of Behavior. Part 1 (pp. 173-417). Amsterdam: Elsevier.

[5.] Dinsmoor, J. A. (1977). Escape, avoidance and punishment: Where do we stand? Journal of the Experimental Analysis of Behavior, 28, 83-95.

[6.] Abbott, B., & Badia, P. (1979). Choice for signaled over unsignaled shock as a function of signal length. Journal of the Experimental Analysis of Behavior, 32, 409-417.

[7.] Perone, M., & Galizio, M. (1987). Variable-interval schedules of timeout from avoidance. Journal of the Experimental Analysis of Behavior, 47, 97-113.

[8.] Herrnstein, R. J., & Hineline, P. N. (1966). Negative reinforcement as shock-frequency reduction. Journal of the Experimental Analysis of Behavior, 9, 421-430.

[9.] Rescorla, R. A. (1968). Pavlovian conditioned fear in Sidman avoidance learning. Journal of Comparative and Physiological Psychology, 65, 55-60.

[10.] Mellitz, M., Hineline, P. N., Whitehouse, W. G., & Laurence, M. T. (1983). Duration-reduction of avoidance sessions as negative reinforcement. Journal of the Experimental Analysis of Behavior, 40, 57-67.

[11.] Gollub, L. (1977). Conditioned reinforcement: schedule effects. In W. K. Honig & J. E. R. Staddon (Eds.), Handbook of Operant Behavior (pp. 288-312). Englewood Cliffs, NJ: Prentice Hall.

[12.] Blackman, D. (1977). Conditioned suppression and the effects of classical conditioning on operant behavior. In W. K. Honig & J. E. R. Staddon (Eds.), Handbook of Operant Behavior (pp. 340-363). Englewood Cliffs, NJ: Prentice Hall.

[13.] Fantino, E. (1977). Conditioned reinforcement: Choice and information. In W. K. Honig & J. E. R. Staddon (Eds.), Handbook of Operant Behavior (pp. 313-339). Englewood Cliffs, NJ: Prentice Hall.

[14.] Williams, B. A., & Dunn, R. (1991). Preference for conditioned reinforcement. Journal of the Experimental Analysis of Behavior, 55, 37-46.

[15.] Mazur, J. E. (1995). Conditioned reinforcement and choice with delayed and uncertain primary reinforcers. Journal of the Experimental Analysis of Behavior, 63, 139-150.

[16.] Marr, M. J. (1979). Second-order schedules and the generation of unitary response sequences. In M. D. Zeiler & P. Harzem (Eds.), Reinforcement and the Organization of Behaviour (Vol. 1, pp. 223-260). New York: John Wiley and Sons.

[17.] Galbicka, G., & Platt, J. R. (1984). Interresponse-time punishment: A basis of shock-maintained behavior. Journal of the Experimental Analysis of Behavior, 41, 291-308.

[18.] Newland, M. C., & Marr, M. J. (1985). The effects of chlorpromazine and imipramine on rate and stimulus control of matching to sample. Journal of the Experimental Analysis of Behavior, 44, 49-68.

[19.] Machado, A. (1997). Increasing the variability of response sequences in pigeons by adjusting the frequency of switching between two keys. Journal of the Experimental Analysis of Behavior, 68, 1-25.

[20.] Davison, M., & McCarthy, D. (1988). The Matching Law: A Research Review. Hillsdale, NJ: Erlbaum.

[21.] Heyman, G. M., & Monaghan, M. M. (1987). Effects of changes in response requirement and deprivation on the parameters of the matching law equation: New data and review. Journal of Experimental Psychology: Animal Behavior Processes, 13, 384-394.

[22.] Petry, N. M., & Heyman, G. M. (1994). Effects of qualitatively different reinforcers on the parameters of the response-strength equation. Journal of the Experimental Analysis of Behavior, 61, 97-106.

[23.] McDowell, J. J., & Wood, H. M. (1985). Confirmation of linear system theory prediction: Rate of change of Herrnstein's k as a function of response-force requirement. Journal of the Experimental Analysis of Behavior, 43, 61-71.

[24.] Miller, H. L. (1976). Match-based hedonic scaling in the pigeon. Journal of the Experimental Analysis of Behavior, 26, 335-347.

[25.] Todorov, J. C. (1971). Concurrent performances: Effect of punishment contingent on the switching response. Journal of the Experimental Analysis of Behavior, 16, 51-62.

[26.] McCarthy, D. C., & Davison, M. D. (1991). The interaction between stimulus and reinforcer control on remembering. Journal of the Experimental Analysis of Behavior, 86, 51-66.

[27.] Alsop, B., & Davison, M. (1991). Effects of varying stimulus disparity and the reinforcer ratio in concurrent schedule and signal-detection procedures. Journal of the Experimental Analysis of Behavior, 56, 67-80.

[28.] Hursh, J. B., Clarkson, T. W., Miles, E. F., & Goldsmith, L. A. (1989). Percutaneous Absorption of Mercury Vapor by Man. Archives of Environmental Health, 44, 120-127.

[29.] Hursh, S. R. (1980). Economic concepts for the analysis of behavior. Journal of the Experimental Analysis of Behavior, 34, 219-238.

[30.] Hursh, S. R. (1999). Behavioral economics: Introduction and current status. In R. Vuchinich & W. Bickel (Eds.), Behavioral Economics and Addictive Behavior.

[31.] Fantino, E. (1998). Behavior analysis and decision making. Journal of the Experimental Analysis of Behavior, 69, 355-364.

[32.] Mathis, C. E., Johnson, D. F., & Collier, G. (1996). Food and water intake as functions of resource consumption costs in a closed economy. Journal of the Experimental Analysis of Behavior, 65, 527-547.

[33.] Collier, G., Johnson, D. F., Borin, G., & Mathis, C. E. (1994). Drinking in a patchy environment: The effect of the price of water. Journal of the Experimental Analysis of Behavior, 62, 169-184.

[34.] Collier, G., Johnson, D. F., & Morgan, C. (1992). The magnitude-of-reinforcement function in closed and open economies. Journal of the Experimental Analysis of Behavior, 57, 81-89.

[35.] Baum, W. M., & Kraft, J. R. (1998). Group choice: competition, travel, and the ideal free distribution. Journal of the Experimental Analysis of Behavior, 69, 227-245.

[36.] McCarthy, D., Voss, P., & Davison, M. (1994). Leaving patches: Effects of travel requirements. Journal of the Experimental Analysis of Behavior, 62, 185-200.

[37.] Newland, M. C., & Reile, P. A. (1999). Learning and behavior change as neurotoxic endpoints. In H. A. Tilson & J. Harry (Eds.), Target Organ Series: Neurotoxicology . New York: Raven Press.

[38.] Mazur, J. E. (1992). Choice behavior in transition: development of preference with ratio and interval schedules. Journal of Experimental Psychology: Animal Behavior Processes, 18, 364-378.

[39.] Hanna, E. S., Blackman, D. E., & Todorov, J. C. (1992). Stimulus effects on concurrent performance in transition. Journal of the Experimental Analysis of Behavior, 58, 335-347.

[40.] Hunter, I., & Davison, M. (1985). Determination of a behavioral transfer function: White-noise analysis of session-to-session response-ratio dynamics on concurrent VI VI schedules. Journal of the Experimental Analysis of Behavior, 43, 43-59.

[41.] Hinson, J. M., & Staddon, J. E. R. (1983). Hill-climbing by pigeons. Journal of the Experimental Analysis of Behavior, 39, 25-47.

[42.] Galbicka, G., Kautz, M. A., & Jagers, T. (1993). Response acquisition under targeted percentile schedules: a continuing quandary for molar models of operant behavior. Journal of the Experimental Analysis of Behavior, 60, 171-184.

[43.] Nevin, J. A., Mandell, C., & Atak, J. R. (1983). The analysis of behavioral momentum. Journal of the Experimental Analysis of Behavior, 39, 49-59.

[44.] Nevin, J. A. (1992). An integrative model for the study of behavioral momentum. Journal of the Experimental Analysis of Behavior, 57, 301-316.

[45.] Newland, M. C. (1997). Quantifying the molecular structure of behavior: separate effects of caffeine, cocaine, and adenosine agonists on interresponse times and lever-press durations. Behavioural Pharmacology, 8, 1-16.

[46.] Iversen, I. H. (1991). Methods of analyzing behavior patterns. In Techniques in the Behavioral and Neural Sciences: Vol. 6. Experimental Analysis of Behavior. Part 2 (pp. 193-242). Amsterdam: Elsevier.

[47.] Palya, W. L. (1992). Dynamics in the fine structure of schedule-controlled behavior. Journal of the Experimental Analysis of Behavior, 57, 267-287.

[48.] Donahoe, J. W., Burgos, J. E., & Palmer, D. C. (1993). A selectionist approach to reinforcement. Journal of the Experimental Analysis of Behavior, 60, 17-40.

[49.] Zeiler, M. D. (1984). The sleeping giant: reinforcement schedules. Journal of the Experimental Analysis of Behavior, 42, 485-493.

Take Home Examinations.

All work on this examination should be done by yourself with no outside help. A passing answer will draw from the materials that we cover during the course. An "A" answer will also draw from materials covered outside of the course. Do not refer to a paper that you have not read; by this I mean the whole paper, especially the methods and results. Feel free to illustrate your answers. Photocopying a figure is acceptable, as long as it is referenced. All references should be in APA format. Each item could easily be a term paper, but I do not want that much detail. No answer should be more than about 4 pages (1½-spaced, 12 characters/inch), plus figures and references, and some could be less. This means that you will have to plan your presentation of the material very carefully. All references may be listed at the end of the exam, if that is easier for you.

I am giving this examination on the first day of class, in order that you may plan your time for this term. However, I would like to reserve the right to modify a question, or to add an alternative question, if an interesting topic comes up in our class discussion.

Examination 1.
Due 10 July 1998

Answer any 6 of the following 9 questions:

  1. One distinction that has been made between time- and response- based schedules has been the role that the reinforcement of interresponse times plays in these schedules.
    1. Describe clearly and succinctly this distinction.
    2. Defend, from the literature, the proposition that consequences can act on interresponse times.
    3. Describe at least one of the theoretical positions on fixed-interval schedule performance. Include the proposed mechanism and the supporting evidence.
  2. Describe and explain Sidman avoidance.
    1. Provide a complete description of the schedule (stimuli, responses, and consequences).
    2. Using the data from Sidman's original figure, plot the relationship between the R-S interval and the average interresponse time for an S-S interval of 2.5 s. Use log-log coordinates. Describe what you see.
    3. Describe the three-term contingency as it applies to behavior maintained by Sidman avoidance and related procedures. Briefly characterize at least two theories of performance under such schedules and describe the empirical support, and lack thereof, for these theories. You may organize your answer by considering the discriminative stimulus occasioning the lever press and its consequence.
  3. Describe the conditions under which respondent mechanisms act in aversive control. In your discussion, include some procedures by which higher-order respondent conditioning might be identified.
Additional question. It has been said that punishment suppresses more behavior than just the punished response; some have even said that it suppresses all behavior.

-- Review evidence to support or refute this comment.

-- Is the inverse true for reinforced behavior?

  4. Describe at least two quantitative theories of how punishment acts on positively reinforced behavior. In describing these theories, be sure to identify the units assumed by each term, the units and interpretation of any free parameters, what happens at extreme values, the conditions under which the equations do, and do not, apply, and whether they view punishment as symmetric or asymmetric.
  5. Characterize the role that discriminative stimuli and conditioned reinforcers play in shaping complex response units (second-order schedules, response sequences, or other complex units). Be sure to include some discussion of the distinctions (empirical and theoretical) among neutral, discriminative, and secondarily reinforcing stimuli. Discuss whether there might be a role for respondent mechanisms to act here, and how you would determine that role.
  6. Discuss the concept that "variability is an operant." Include in your discussion a characterization of what the response unit is in these experiments or, lacking that, the conditions under which such a unit might be identified.
  7. Describe the conditions under which electric shock functions as a reinforcer. Has this puzzle been resolved?
  8. Discuss the role of timing in schedules of reinforcement. Include a description of what "scalar" timing is.
  9. Why is there a pause in fixed-ratio schedules? In fixed-interval schedules?
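Question 2 above asks for a log-log plot of the average interresponse time against the R-S interval. As a minimal sketch of the mechanics only (the numbers below are hypothetical placeholders, not values read from Sidman's figure), the slope in log-log coordinates can be estimated by ordinary least squares on the logarithms:

```python
import math

# Hypothetical (R-S interval, mean IRT) pairs, in seconds. These are
# placeholders; substitute the values read from Sidman's original figure.
data = [(5, 3.0), (10, 5.5), (20, 9.0), (40, 16.0)]

# Transform to log-log coordinates.
xs = [math.log10(rs) for rs, _ in data]
ys = [math.log10(irt) for _, irt in data]

# Ordinary least-squares slope and intercept on the logged values.
n = len(xs)
mx, my = sum(xs) / n, sum(ys) / n
slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
intercept = my - slope * mx

print(f"log10(IRT) = {slope:.2f} * log10(R-S) + {intercept:.2f}")
```

A straight line in these coordinates indicates a power-function relation between the two intervals; the fitted slope is the exponent of that power function.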

Take-Home Examination 2. Due 14 August 1998

The first question and the 14th question (about Palya's analysis) are mandatory. Answer any 5 of the remaining questions.

  1. Beginning with Findley's (1958) original experiment, or earlier if you like, review the idea that the visit (or interchangeover time) is the important operant in concurrent-schedule performance.
  2. Under what conditions does "matching" obtain with concurrent ratio schedules?
  3. Describe the application of the generalized matching relationship to any three of the following. In your discussion be sure to describe the relevant units, free parameters, important values of the parameters, and the conditions under which the quantitative relationship does not apply. (The point here is to extend the relationship to other schedule relationships. Let me know if I've overlooked one that you would like to review.)
    1. Concurrent DRL schedules
    2. Delay-of-reinforcement
    3. Avoidance
    4. Concurrent VR schedules
    5. Reinforcer durations.
  4. Pick any two of the following accounts of behavior under concurrent schedules and describe the theory, its quantification, supporting evidence, and difficulties with the theory.
    1. Matching is the cause, not an effect.
    2. Melioration
    3. Molecular maximization
    4. Molar maximization
  5. Distinguish between a closed and an open economy. Include definitions of the two, whether this represents a continuum or a dichotomy, and whether this distinction is important behaviorally. Include discussion of at least two applications where such a distinction might be important, and how it is important.
  6. Describe the "self-control" procedure and the behavior it supports.
  7. Describe the relationship between theories of signal detection and the matching relationship. Extend this thinking to applications of TSD in public health considerations or diagnosis.
  8. Describe the "concatenated" matching law and its empirical support.
  9. Describe the quantification that has resulted from Premack's proposal that reinforcement is relative.
  10. Review whether and how the principles of conditioning apply across different classes of vertebrates (mammals, birds, fish) and phyla (e.g., insects). One approach to this question might be to generate a table showing some of the important characteristics (control by consequences, temporal patterns, aversive control, conditioned reinforcement, schedule patterns, etc.) where known. A table alone will not be enough, however; some narrative is necessary.
  11. Describe and critically evaluate laboratory "analogs" of foraging.
  12. Contrast selectionistic accounts of reinforcement with any alternative.
  13. Distinguish among the different ways that Weiss describes distributions of interresponse times.
  14. Take any two schedules, and the relevant figures, characterized by Palya and write a narrative understandable by a first-year graduate student, in their first week of class, explaining that description. In your discussion, consider how variability, or the lack thereof, might make that behavior more or less sensitive to change.
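Several of the questions above (generalized matching in question 3, the concatenated form in question 8) turn on the matching relationships. As a quick reference only, and not a substitute for the treatment in Davison and McCarthy (reading 20), they are commonly written as follows:

```latex
% Strict matching (Herrnstein, 1961): relative response rate equals
% relative reinforcement rate, for responses B1, B2 and reinforcers R1, R2.
\frac{B_1}{B_1 + B_2} = \frac{R_1}{R_1 + R_2}

% Generalized matching (Baum, 1974), in logarithmic ratio form:
% a is sensitivity (a = 1 is strict matching; a < 1 is undermatching)
% and b is bias toward one alternative.
\log\frac{B_1}{B_2} = a \log\frac{R_1}{R_2} + \log b

% Concatenated matching: the reinforcement-rate ratio is extended to a
% product of reinforcer dimensions, e.g., amount (A) and delay (D); the
% delay ratio is inverted because shorter delays are preferred.
\frac{B_1}{B_2} = b \left(\frac{R_1}{R_2}\right)^{a_R}
                    \left(\frac{A_1}{A_2}\right)^{a_A}
                    \left(\frac{D_2}{D_1}\right)^{a_D}
```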


Last modified: July 08, 2010