Contingency Values of Varying Strength and Complexity

Permanent Link: http://ufdc.ufl.edu/UFE0022588/00001

Material Information

Title: Contingency Values of Varying Strength and Complexity
Physical Description: 1 online resource (75 p.)
Language: english
Creator: Samaha, Andrew
Publisher: University of Florida
Place of Publication: Gainesville, Fla.
Publication Date: 2008

Subjects

Subjects / Keywords: contingency, dependent, independent, reinforcement, response
Psychology -- Dissertations, Academic -- UF
Genre: Psychology thesis, Ph.D.
bibliography   ( marcgt )
theses   ( marcgt )
government publication (state, provincial, territorial, dependent)   ( marcgt )
born-digital   ( sobekcm )
Electronic Thesis or Dissertation

Notes

Abstract: Precise control over the reinforcers that follow behavior and the reinforcers presented in the absence of behavior may help clarify the roles of response-dependent and response-independent reinforcers. Four experiments examined lever pressing in rats as a function of the contingency arranged for the delivery of sucrose pellets. Contingencies were arranged by manipulating the probability of a reinforcer given a response and the probability of a reinforcer given no response. Experiment 1 examined acquisition and maintenance of lever pressing under positive contingencies (where the probability of a reinforcer given a response was higher than the probability of a reinforcer given no response) and complex positive contingencies (positive contingencies in which the probability of a reinforcer given no response was greater than zero). Results indicated that, for all three subjects, lever pressing was not acquired under the complex positive contingency, was acquired under the positive contingency, and then persisted during a return to the complex positive contingency. In Experiment 2, subjects were exposed to the same sequence of conditions as subjects in Experiment 1, but only after first experiencing negative (.00/.10) and complex negative (.05/.10) contingencies, where the first value denotes the probability of a reinforcer given a response and the second the probability of a reinforcer given no response. In general, results of Experiment 2 were similar to those of Experiment 1, except that responding did not persist during the second exposure to .10/.05 for two subjects and, for one subject, acquisition under the positive contingency was more difficult to obtain than for any subject in Experiment 1. In Experiment 3, a two-component multiple schedule was arranged in which one component was associated with early exposure to a negative contingency and the other component was associated with positive contingencies only. Results indicated that, overall, the multiple-schedule method did not detect differences in subsequent responding. In Experiment 4, the effects of a gradual shift from a positive to a negative contingency were examined. Results indicated that lever pressing decreased as contingencies became more negative. In addition, maintenance under negative contingencies was more likely when smaller contingency changes were made from one condition to another. The results are discussed in terms of understanding the role of naturally occurring schedules of reinforcement in the acquisition and maintenance of appropriate and problematic human behavior.
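
Illustration (not part of the bibliographic record, and not the author's apparatus or code): a minimal sketch of how the probabilistic contingencies described in the abstract could be simulated cycle by cycle, reading the a/b notation as the probability of a reinforcer given a response and given no response, respectively. The cycle structure, function name, cycle count, and random seed are assumptions made for the example.

import random

def pellet_delivered(responded, p_given_response, p_given_no_response, rng=random):
    """Return True if a pellet is delivered at the end of one cycle.

    In the abstract's a/b notation (e.g., .10/.05), the first value is the
    probability of a reinforcer given a response and the second is the
    probability of a reinforcer given no response.
    """
    p = p_given_response if responded else p_given_no_response
    return rng.random() < p

random.seed(0)  # assumed seed, for a reproducible illustration
# Complex positive contingency (.10/.05): reinforcement is more likely after a
# response, but reinforcers can also occur in the absence of responding.
with_press = sum(pellet_delivered(True, 0.10, 0.05) for _ in range(10000))
without_press = sum(pellet_delivered(False, 0.10, 0.05) for _ in range(10000))
print(with_press, without_press)  # roughly 1000 vs. 500 deliveries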
General Note: In the series University of Florida Digital Collections.
General Note: Includes vita.
Bibliography: Includes bibliographical references.
Source of Description: Description based on online resource; title from PDF title page.
Source of Description: This bibliographic record is available under the Creative Commons CC0 public domain dedication. The University of Florida Libraries, as creator of this bibliographic record, has waived all rights to it worldwide under copyright law, including all related and neighboring rights, to the extent allowed by law.
Statement of Responsibility: by Andrew Samaha.
Thesis: Thesis (Ph.D.)--University of Florida, 2008.
Local: Adviser: Vollmer, Timothy R.
Electronic Access: RESTRICTED TO UF STUDENTS, STAFF, FACULTY, AND ON-CAMPUS USE UNTIL 2010-08-31

Record Information

Source Institution: UFRGP
Rights Management: Applicable rights reserved.
Classification: lcc - LD1780 2008
System ID: UFE0022588:00001

3f6b3ca4081a1499af11abfe8566a603
1a491062579039408be8661d80718d76a885f727
54746 F20101222_AABGJI samaha_a_Page_63.pro
5fd68acdb892887b757944b3530bed3d
dfe40afbe1e6fea928147ee4e43126979ff8578b
50401 F20101222_AABGTC samaha_a_Page_66.pro
de777c17da4bbe471363ef1966281e78
23aaf9182ae6d0989a3a75dd966781bc4e4b7b36
F20101222_AABGOF samaha_a_Page_21.jp2
557a948d67c3959e23675b76a9ed8167
b35dd72a7745cd4ae39e4ddf2609eb8f90f0c888
7065 F20101222_AABGYA samaha_a_Page_52.QC.jpg
033e6095014eac4356b4a62e2284c7e7
49f4cba1261c0c5cbb0bd1ccb390acd0c18f5ac9
53178 F20101222_AABGTD samaha_a_Page_67.pro
4b1a2f41762ada15de2800f612a8b8b2
764a20e6f8ba5080afa66c7ba9b8f80830274029
1051908 F20101222_AABGOG samaha_a_Page_22.jp2
53285d7d23b6bfd6b3d8ae57aded3abe
e53bd96de3d57f6276dc370e9616c1c47cd7f5dd
12324 F20101222_AABGYB samaha_a_Page_53.QC.jpg
1774c426255ec7bb145c576bdb279055
53cd020378e53c20a61a82bcc9f46030a10c7533
20609 F20101222_AABGJJ samaha_a_Page_52.jpg
01ee6cd2fbd02c7fb8cbb25fd2cec493
4f77893c35b1ba3a16dd9b184a47916b0c6a3717
56528 F20101222_AABGTE samaha_a_Page_68.pro
e77a20f142c9adbde887e399c23ca3bd
6604b85e25826ace584d8c9776d59db0bde5f4d5
F20101222_AABGOH samaha_a_Page_23.jp2
174f0ea2f393825c0dd9ddf080334c33
e297a2af8c9003f623d24e04adec6e88ecd92ef4
3632 F20101222_AABGYC samaha_a_Page_53thm.jpg
5671100206792055ef9aab381da14dc5
f946540750081cd690d1254b5f4afe087980fc9c
41139 F20101222_AABGJK samaha_a_Page_29.pro
7b8bde978314fcb54744e6f4ed17c490
57e64e6fe18d0057212dbbcd49d6635e59304773
52496 F20101222_AABGTF samaha_a_Page_70.pro
2180b65889c46e1e667bcd4665143e6f
1e6589408f7080f3e9e4c5efd4b6fc347e0e8bb2
1051896 F20101222_AABGOI samaha_a_Page_24.jp2
f75f699d9de66057ff8c60c0dd01170d
0e6576338b975934d6b455d01858f32016d57c02
8739 F20101222_AABGYD samaha_a_Page_54thm.jpg
8a62768f178f2825786ebe409e17255f
85bfcb512d4e513d9a621addcd0753249d75424d
58658 F20101222_AABGTG samaha_a_Page_72.pro
02792659c0a4ba2d682bde5cdb87a065
177dd1902af39ac82b03136ce8169a2ee73542d9
1051960 F20101222_AABGOJ samaha_a_Page_25.jp2
d3fdd1d72265848c1de82d0e38af5b79
cd47e1d1a4f7659c54bc18407122025e696c0da2
91611 F20101222_AABGJL samaha_a_Page_05.jpg
68d853cdb99ff66343190af8828a873d
c353d9505d7b18db72f36b38402fe703e664e751
5343 F20101222_AABGYE samaha_a_Page_55.QC.jpg
17bf0a583557ccfe412b51a5ffefffd8
a90466092293f27ae18115f531db06ccf4684096
62758 F20101222_AABGTH samaha_a_Page_73.pro
535fa67d905e0c99aa8fbf6eef4bffdb
8b4eff811909f3f18d659ab48cf627bdedcccadc
1051929 F20101222_AABGOK samaha_a_Page_27.jp2
54c8b32712d14195b075de00cfb9ebbc
d1a6df7d47314b1fa6dcbddd460b2a3309b21175
25336 F20101222_AABGJM samaha_a_Page_37.QC.jpg
17efcb1b3c04acdbb4a3d0e34f04996a
ce9929233a3f52451035732bc8a46f0123eb97cd
1360 F20101222_AABGYF samaha_a_Page_55thm.jpg
8d13ec33476e6556ee87187bcaf3abfb
29aabb2e9939f6d265d24bad095d0a299c1745b5
17078 F20101222_AABGTI samaha_a_Page_74.pro
3f1dd5fc484d9adec5122af06125f203
ca756ac046940f513c9c4ed1a3d6169823ffd7fd
936966 F20101222_AABGOL samaha_a_Page_29.jp2
0e029699c8b5e49f9315fed5f836cabb
2e761f0f3a2deff1bd0b69351eb15c87024bdfea
8519 F20101222_AABGJN samaha_a_Page_44thm.jpg
de266829028ef758d4e3a7fb13092931
a2e36a828a2da41e16b1906b3c69318a04125829
14899 F20101222_AABGYG samaha_a_Page_56.QC.jpg
a208e5fa3e68dc78c1b239861d1092ad
b0a2b93b6d983f104ae300d0d0f45802f8520f67
17650 F20101222_AABGTJ samaha_a_Page_75.pro
651e28e8b564525f0e03702d1f225cd4
ba00e8312ada2de12f45e6c025eb371e1c61ebd5
1051984 F20101222_AABGOM samaha_a_Page_31.jp2
952471fb5e132b1630f4fa4118a17a3b
f451bb5c19fa2bed02f93904e2de283ec2a2b5b8
3735 F20101222_AABGJO samaha_a_Page_06thm.jpg
0fdf5993e6489f5ffa06198019566533
6242a95701ccdcb9ee64b1ebb826d02cbc2d1648
4230 F20101222_AABGYH samaha_a_Page_56thm.jpg
632dbbf460e914f6524c4346738a3b4c
138e9b48291e4b49deba560ea730ecde9743827b
106 F20101222_AABGTK samaha_a_Page_02.txt
89619e7c17b0d36d271c256b6b03b209
6737a05f9c5231f5a0f0888340641114d7e85659
345227 F20101222_AABGON samaha_a_Page_32.jp2
affbe4a7a8b56125c097da5f470226db
cc9a7ee09c51f8d46460445954c0fb6de1b92a0a
52098 F20101222_AABGJP samaha_a_Page_44.pro
cdf26011c183a4c6550d5443150000b5
97ae387bd54089bfcfe2b013db29a1989c666b7f
35349 F20101222_AABGYI samaha_a_Page_57.QC.jpg
cd848c849c5637d502e912e94405d716
874d42beee713a41eb20643ece3c411a380f93c2
91 F20101222_AABGTL samaha_a_Page_03.txt
c5abc0ae086ae9f3776a72723bec2c00
59aef98adedddd842aa06fd250e04306fc302e74
2152 F20101222_AABGJQ samaha_a_Page_12.txt
47204e524270595d010fe0334cd19be6
c2d48783b85026866d454e44870e51c64358e229
8457 F20101222_AABGYJ samaha_a_Page_57thm.jpg
94a76bc7c1527e1d6894c1962e0ba9c7
aeb2b4f2a26b7efca3f5da8b0d7d879c2eed543c
1103 F20101222_AABGTM samaha_a_Page_04.txt
da215ce62cdb4f01641b667d1779aafc
78688df4d93dfca2318e7cfc37801a65a6b07064
486202 F20101222_AABGOO samaha_a_Page_33.jp2
f83c61b355244efd57d7d937a1a7ce5e
3ed1df80b1a29b399bac62dd7f51fcdfcf29ff14
257 F20101222_AABGJR samaha_a_Page_07.txt
f25233ec66540612fd9bdeb2eb5a93cc
cfac73f18eb96647f05e45db97d1fd10885763de
13733 F20101222_AABGYK samaha_a_Page_58.QC.jpg
ae821c4ef13288b74969d3e371c7d8ef
970410692248ae4148f2801aedb5d5a839668956
2536 F20101222_AABGTN samaha_a_Page_05.txt
a1c9d23eb91184aa61a04e8cf4e04582
558a0f4bc1486e4c542e90443b259e1fe7bd1490
F20101222_AABGOP samaha_a_Page_34.jp2
e8593b40d533187b894fd9552f51d5bb
be6a9eb6944af8467a0c16df1b0742c07a3a5c48
F20101222_AABGJS samaha_a_Page_24.tif
c48a099d8d92b3416090c1fc24893609
b8eff6ca53f2db377082b240c8befffa2e31ce16
3942 F20101222_AABGYL samaha_a_Page_58thm.jpg
f21efbb392e4ac4e7b61a104c6a2ecd9
5188173d9f0468690d15836445ba5835acf39f26
903 F20101222_AABGTO samaha_a_Page_08.txt
6d33cb2cf7f474f3bffed67e80223ce5
c5e9238c46ac66671f250b0d2ce915691796b84c
F20101222_AABGOQ samaha_a_Page_35.jp2
680cc4e09841cc74a54b611cdc069673
08e282a7b17b22588804428067f4b6fdbe3b02c4
1546 F20101222_AABGJT samaha_a_Page_07thm.jpg
47442186974d0204f7b1df4297075920
424ca528bb11f15fa9229af6697e8b82d821cc28
18475 F20101222_AABGYM samaha_a_Page_59.QC.jpg
68a0f310ad1fc37bb8004a065a4a0e76
5bc82ddd853068d53f3768113b8d09737cd648b4
1590 F20101222_AABGTP samaha_a_Page_09.txt
8dcea448a03e70ee73beb69a28ff52fb
3e1310787a9b1f92d6996a1f740187e3590747cf
875897 F20101222_AABGOR samaha_a_Page_37.jp2
65138c8430d8534df0fa7d0024c5e29e
c6619ac3b454655c5b764e56ff7052328f79db47
108992 F20101222_AABGJU samaha_a_Page_27.jpg
98c4e326ac009257e1c8a3cec1f2e4b2
14fdeb5106f968b1261f7b2315d33b00deed8cfc
4536 F20101222_AABGYN samaha_a_Page_59thm.jpg
9f0ccd03ddd838aa036fc638e9c59c62
1f24c2b70f3188d0db0f4e75791e1fbbe2d282bd
1957 F20101222_AABGTQ samaha_a_Page_10.txt
ab7352cec258d3c0eab669fff7a787a3
1d6cde235d4585e1dff9f564139dcd7c84894289
1051961 F20101222_AABGOS samaha_a_Page_38.jp2
0ef42d0a7a0dcfe436688eecac3316a4
5032af7d15f8af1ff583ba27413a81b962c81b34
36105 F20101222_AABGJV samaha_a_Page_25.QC.jpg
cbaa6188b40b01f912f25f7e20c62a36
ad6062f57182fe718a9b031523bd9d6b5fcf5bb0
10880 F20101222_AABGYO samaha_a_Page_60.QC.jpg
d17a7baec9dd55339e88ae8fc2d0b097
33f2442f66ef8ece9e612cee17215abdb56c16a8
1165 F20101222_AABGTR samaha_a_Page_11.txt
d93d17a962d0e24198a59ac017ff1430
b153f62269c946de3fd7a688653dd1ab0dda3105
96115 F20101222_AABGOT samaha_a_Page_39.jp2
1cd7cc1df9dfd6b700770de5bc8dff3e
8b2db61308eed157c2cb4a06224b6d1b9ca0c008
34490 F20101222_AABGJW samaha_a_Page_17.QC.jpg
020230e0e096448954ea0d21e4187281
97d0205212b58ebae55b87d07e40ca88c8f7f6eb
3495 F20101222_AABGYP samaha_a_Page_60thm.jpg
7add92ca39cc433894fd734de6765159
fa170312f9bb73056a7274e873baba7f2e171c1d
2090 F20101222_AABGTS samaha_a_Page_13.txt
7c0748d794bc4ce9ea30c2ae80e97657
b3e1a3c2a4dc0e3534a2cfd14d24c2dffac2749d
373480 F20101222_AABGOU samaha_a_Page_40.jp2
b2c59c97f4b471aece0202d6a70d1eef
863c3884d77acba71f5823402c2e8ec1674f0f0f
8963 F20101222_AABGJX samaha_a_Page_72thm.jpg
e8d52c211aa3d399c8b089882abdc490
5c57eda52bccfe9b3f6faac38f8c6f7edb4fec19
14517 F20101222_AABGYQ samaha_a_Page_62.QC.jpg
105d9a0573492b91d9f225e36bdffb6e
fb8c55d62fa5914fbd2213ad34b241ac14267e04
1024384 F20101222_AABGOV samaha_a_Page_41.jp2
ed28dcd671ef265fd3f78b382e66516b
35eb08e0d502e80dcb6aae30f38fbd6df4389709
344 F20101222_AABGHA samaha_a_Page_52.txt
0f99134f2a9b7adf887ef867c5b9c14d
b95d35f37073600893e96d92bc4a56448c8b1980
114106 F20101222_AABGJY samaha_a_Page_20.jpg
e920d31532fe36561d19acdba1a52207
52a6345383eb403621dd0a636c36f88614d9e932
3998 F20101222_AABGYR samaha_a_Page_62thm.jpg
ffb04fa80293022f603a2dd5c493042b
6d76bb05860bb76d85a5b8b7e6643cefc9c2b997
2104 F20101222_AABGTT samaha_a_Page_18.txt
2a8ed0369bfafe940ccaeda6880d66c5
5a976beafa59019ed0a69a576ae2d751a540bed6
960139 F20101222_AABGOW samaha_a_Page_42.jp2
a1488334753a353ea249d24db4d52405
d64af12ce11634c2d3a7928b1d051a05c293fb8b
53594 F20101222_AABGHB samaha_a_Page_27.pro
35a63c65b95814731a97d35c34060788
71bbc9bd8874933c29dded0dc5b3930e2ed18246
54178 F20101222_AABGJZ samaha_a_Page_31.pro
a359ce0076eaf13f5a4526a421b77ae2
7023df8c12705deee05108d80cac83cb8a5836d2
35088 F20101222_AABGYS samaha_a_Page_63.QC.jpg
8eb75515ccab75d081ec783857e26455
d3e7e59e09c904701b92649d861f2c2097d2d231
946712 F20101222_AABGOX samaha_a_Page_43.jp2
9ae17a4e6ddb7f16c84ca868b2d071fb
fa2a5cd5929266463efade2bcfb0d1b79d79e25f
F20101222_AABGHC samaha_a_Page_20.tif
b1c783270ab264b5c3c47d648443b5f7
321128a59ff8742c7021c4878987497d4dee13b6
2255 F20101222_AABGTU samaha_a_Page_19.txt
fcf49560e34d16802f76468150f07862
3fa95c15a5216a983f26814a355db3e817d09fbf
25644 F20101222_AABGYT samaha_a_Page_64.QC.jpg
de717c64f8b9fb3e83979a7d7819bc89
230057a4967cea8cccc11b1c480e401a797ae54e
111207 F20101222_AABGMA samaha_a_Page_15.jpg
192c6332dd1715bd392d4af220d51890
7526551701c862baaa29bfe196c79b39320ef909
1051974 F20101222_AABGOY samaha_a_Page_44.jp2
49694df77232427d1c33e39d7e3ccd87
59f65c1e310b01e7e66a1b4ec2007ca71c7fd5a4
2135 F20101222_AABGHD samaha_a_Page_34.txt
7cae921572f33a582220ae3e818dde53
6635611002da8571946d5d89772cbb01ab67a82f
2061 F20101222_AABGTV samaha_a_Page_21.txt
550bedc0c846bc6a0540a86bfe5f7c0e
9c007aa52111b483760ea52ee0483d8b33272aa7
6302 F20101222_AABGYU samaha_a_Page_64thm.jpg
9acc95d80985b530ad69c5f1faaf280d
b60e40e15295dccf2d977db7b3b9cdd5a9ed9d25
107153 F20101222_AABGMB samaha_a_Page_18.jpg
7c216308de40d200bfea85c1de0027c5
e38c3c47aed8ac2cbf89cc1928b152d7687b8b58
1051903 F20101222_AABGOZ samaha_a_Page_47.jp2
cab74e607659af7cf0f8a62042b040ff
d9466d71e15bfc8a1951f485a22682565c73c1ea
7913 F20101222_AABGHE samaha_a_Page_49thm.jpg
c23e521fd31e2192a6d4cc67c706c8df
6661b82c11d60dee5d7aa076d4ef6976072eafb8
2154 F20101222_AABGTW samaha_a_Page_22.txt
29b3a84c34498294b2d1603a673d37ca
eabb55492fbf5520f381f20a5ec2be2e5519dc93







CONTINGENCY VALUES OF VARYING STRENGTH AND COMPLEXITY


By

ANDREW LAWRENCE SAMAHA














A DISSERTATION PRESENTED TO THE GRADUATE SCHOOL
OF THE UNIVERSITY OF FLORIDA IN PARTIAL FULFILLMENT
OF THE REQUIREMENTS FOR THE DEGREE OF
DOCTOR OF PHILOSOPHY

UNIVERSITY OF FLORIDA

2008







































© 2008 Andrew Lawrence Samaha






































To my dog, Enzo.









ACKNOWLEDGMENTS

I would like to thank the faculty, staff, and students of the University of Florida

Psychology Department for contributing toward my education in psychology and behavior

analysis and challenging me to become a better student and scientist. Above all others, I would

like to express my sincerest gratitude to Dr. Timothy Vollmer for his support, guidance, and

patience as my faculty advisor. I would also like to thank the members of my dissertation

committee for their attention and feedback for which I am extremely grateful and honored: Dr.

Timothy Hackenberg, Dr. Brian Iwata, Dr. David Smith, and Dr. Colette St. Mary.

I would also like to acknowledge Stephen Haworth and Dr. Frans van Haaren for helping

to establish the lab in which this research was conducted, Dr. Jonathan Pinkston and Dr. Jin

Yoon for their innumerable contributions during my early development as a student, and Dr.

Gregory Hanley and Dr. Rachael Thompson for encouraging me to pursue a career in Behavior

Analysis.












TABLE OF CONTENTS


ACKNOWLEDGMENTS

LIST OF TABLES

LIST OF FIGURES

ABSTRACT

CHAPTER

1 INTRODUCTION
    Brief History of Reinforcement
    Considering the Occurrence and Nonoccurrence of Behavior
    An Analogy in Respondent Conditioning
    Previous Research on Complex Contingencies of (Operant) Reinforcement
    Translational Research
    Goals of the Current Research

2 EXPERIMENT 1
    Purpose
    Method
        Subjects
        Apparatus
        Procedures
        Conditions
    Results and Discussion

3 EXPERIMENT 2
    Purpose
    Method
        Subjects and Apparatus
        Procedures
        Conditions
    Results and Discussion

4 EXPERIMENT 3
    Purpose
    Method
        Subjects and Apparatus
        Procedures
        Conditions
    Results and Discussion

5 EXPERIMENT 4
    Purpose
    Methods
        Subjects and Apparatus
        Procedures
        Conditions
    Results and Discussion
        Subject 2003
        Subject 2004
        Subject 1903
        Subject 2001
        Subject 2005

6 GENERAL DISCUSSION

LIST OF REFERENCES

BIOGRAPHICAL SKETCH










LIST OF TABLES


1-1 Contingencies for each condition in Hammond (1980)

4-1 Contingencies for each condition of Experiment 3











LIST OF FIGURES


2-1 Experiment 1: All Sessions

3-1 Experiment 2: All Sessions

4-1 Experiment 3: All Sessions

5-1 Experiment 4: Sequence of Conditions

5-2 Experiment 4: Subject 2003

5-3 Experiment 4: Subject 2004

5-4 Experiment 4: Subject 1903

5-5 Experiment 4: Subject 2001

5-6 Experiment 4: Subject 2005









LIST OF ABBREVIATIONS


DRO Differential reinforcement of other behavior. This is a common treatment for problem
behavior whereby reinforcers are arranged to follow some period of time in which
problem behavior does not occur.

FI Fixed-interval schedule. This is a schedule of reinforcement whereby a reinforcer is
delivered following the first instance of behavior after a fixed amount of time has
elapsed. For example, FI-30 would mean that the first response after 30 s would be
reinforced.

FR Fixed-ratio schedule. This is a schedule of reinforcement whereby a reinforcer is
delivered following the nth instance of behavior. For example, FR-30 would mean that
the 30th response would produce a reinforcer.

NCR Noncontingent reinforcement. This is a common treatment for problem behavior
whereby reinforcers are arranged independent of behavior, usually according to the
passage of time (e.g., every 30 s).

VI Variable-interval schedule. This is a schedule of reinforcement whereby a reinforcer is
delivered following the first response after some variable interval of time has elapsed.
That amount of time centers around an average determined by an experimenter-set
distribution.

VR Variable-ratio schedule. This is a schedule of reinforcement whereby a reinforcer is
delivered following, on average, the nth instance of behavior. The exact response
requirement changes from trial to trial according to some experimenter-set distribution.









Abstract of Dissertation Presented to the Graduate School
of the University of Florida in Partial Fulfillment of the
Requirements for the Degree of Doctor of Philosophy

CONTINGENCY VALUES OF VARYING STRENGTH AND COMPLEXITY

By

Andrew Lawrence Samaha

August 2008

Chair: Timothy R. Vollmer
Major: Psychology

Precise control over the reinforcers that follow behavior and the reinforcers that are

presented in the absence of behavior may help to provide a clearer understanding of the role of

response-dependent and response-independent reinforcers. Four experiments examined lever

pressing in rats as a function of a contingency for the delivery of sucrose pellets. Contingencies

were arranged by manipulating the probability of a reinforcer given a response and the

probability of a reinforcer given no response.

Experiment 1 examined acquisition and maintenance of lever pressing during positive

contingencies (where the probability of a reinforcer given a response was higher than the

probability of a reinforcer given no response) and complex positive contingencies (a positive

contingency where the probability of a reinforcer given no response is greater than zero).

Results indicated lever pressing was not acquired under the complex positive contingency, was

acquired under the positive contingency, but persisted during a return to the complex positive

contingency for all three subjects.

In Experiment 2, subjects were exposed to the same sequence of conditions as subjects in

Experiment 1 but after first experiencing negative (.00/.10) and complex negative contingencies

(.05/.10). In general, results of Experiment 2 were similar to the results of Experiment 1 except









that responding did not persist during the second exposure to .10/.05 for two subjects and, for

one subject, acquisition during the positive contingency was more difficult to obtain than for any

of the subjects in Experiment 1.

In Experiment 3, a two-component multiple schedule was arranged where one component

was associated with early exposure to a negative contingency while the other component was

associated with only positive contingencies. Results indicated that, overall, the multiple

schedule method did not detect differences in subsequent responding.

In Experiment 4, the effects of a gradual shift from a positive to a negative contingency

were examined. Results indicated that lever pressing decreased accordingly as contingencies

became more negative. In addition, maintenance under negative contingencies was more likely

when smaller contingency changes were made from one condition to another. All of the results

are discussed in terms of understanding naturally occurring schedules of reinforcement in the

acquisition and maintenance of appropriate and problematic human behavior.









CHAPTER 1
INTRODUCTION

The following experiments examined acquisition and maintenance of lever pressing in rats.

The purpose of the research was to investigate contingencies of reinforcement in which

reinforcers are presented both following behavior and following periods of time in which

behavior did not occur. Although reinforcement contingencies are commonly arranged

experimentally such that a response must occur to produce a reinforcer, based on prior research

in applied behavior analysis, it is likely that in nature a blend of events follows both the

occurrence and nonoccurrence of behavior. A better understanding of such contingencies

has important implications for understanding acquisition of both problem and appropriate

behavior in the development of human behavioral repertoires.

Brief History of Reinforcement

In Schedules of Reinforcement (1957), Ferster and Skinner categorized hundreds of

variations on relations between behavior and environment, known as reinforcement schedules.

Investigating such relations involved arranging contingencies between behavior and features of

the apparatus that could exert control over behavior. The contingencies took the form of if-then

relations with some specification of behavior or time and behavior affecting some feature of the

environment. For example, if a response key in a pigeon chamber was pecked 25 times, a

solenoid was then activated to raise a hopper filled with grain. Or, the first response after 15 s

resulted in hopper access. These procedures have come to be known as fixed-ratio (FR) and

fixed-interval (FI) schedules of reinforcement, respectively. Fixed-ratio schedules specify that

reinforcers are to be delivered following some fixed number of responses. Examples of FR

schedules include piece-work reimbursement systems in which workers are paid for completing a

set amount of work. Fixed-interval schedules specify that reinforcers are to be delivered









following the first response after some fixed period of time. For example, an FI-10 min schedule

specifies that the first response after 10 min will produce a reinforcer. In addition to FR and FI

schedules, Ferster and Skinner also examined the effects of varying the response requirement

around some average following every reinforcer delivery. These were referred to as variable-

ratio (VR) and variable-interval (VI) schedules. Variable-ratio schedules specify that reinforcers

are delivered following, on average, the nth response but the exact number of responses

necessary to produce each individual reinforcer is unpredictable. The exact distribution of

response requirements is controlled by the experimenter. Variable-interval schedules specify

that reinforcers are delivered following the first response after the passage of some variable

length of time. Similar to VR schedules, the length of time is centered on some average value

but varies unpredictably from reinforcer to reinforcer.
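
To make the four basic schedule rules concrete, the following minimal Python sketch (added here for illustration; the function names and the random-ratio approximation of VR are assumptions, not part of Ferster and Skinner's text) shows how each rule decides whether a given response is reinforced.

    import random

    def fixed_ratio(response_count, n):
        # FR-n: every nth response produces a reinforcer.
        return response_count % n == 0

    def variable_ratio(mean_requirement, rng=random):
        # VR-n approximated as a random ratio: each response is reinforced
        # with probability 1/n, so the requirement averages n responses.
        return rng.random() < 1.0 / mean_requirement

    def fixed_interval(seconds_since_last_reinforcer, interval):
        # FI-t: the first response at least t seconds after the last
        # reinforcer is reinforced.
        return seconds_since_last_reinforcer >= interval

    def variable_interval(seconds_since_last_reinforcer, current_interval):
        # VI-t: like FI, except current_interval is resampled around a
        # mean of t after every reinforcer delivery.
        return seconds_since_last_reinforcer >= current_interval

Note that each of these rules only specifies when a response is followed by a reinforcer; none of them delivers anything in the absence of responding, which is the issue taken up in the sections that follow.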

Reynolds (1968) made an important distinction that further extends the notion of

reinforcement schedules. In his text, A Primer of Operant Conditioning, Reynolds wrote about

the difference between dependencies and contingencies. According to Reynolds, dependencies

describe relations in which some consequence occurs if and only if behavior occurs. All of the

schedules described in Schedules of Reinforcement arranged dependencies. For example, the

mechanical delivery of grain in an operant chamber may be dependent on a key press. Turning

on the light in one's office is dependent upon hitting the light-switch. And, according to

Reynolds, contingencies describe the obtained relations found in the environment, including

those that occur as a result of dependencies and those that occur for other reasons. For example,

a reinforcer may be programmed to occur every 60 s whether or not behavior happens. Suppose

that, by accident, a response occurs at second 59. This accidental contingency may produce a









reinforcement effect and the relation may be expressed as a reinforcement contingency despite

the fact that there is no dependency between behavior and the delivery of reinforcers.

In our day to day lives, behavior can enter into relations that likely consist of a blend of

dependencies, accidental pairing, and events that follow periods with no behavior. In order to

understand these kinds of relations, a method or framework must be established to integrate

them. Consider the behavior one person (Albert) might engage in to get another person's

(Jane's) attention (for the purpose of the example, assume that attention is a reinforcer). For

example, Albert might say "Hello" or attempt to make eye-contact with Jane. Through

observation and experimentation, it might be possible to show that making eye contact is

reinforced on about every other occasion. This approximates something like a random-ratio

schedule where each response is associated with a .5 probability of being followed by a

reinforcer. But what if Jane initiates a conversation with Albert before Albert has a chance to

do anything? How should this "extra" attention be conceptualized? There are a few possibilities.

One is that a reinforcement effect will occur merely as a result of the contiguity, or brief delay,

between behavior and the subsequent attention. The other is that a reinforcement effect for eye-

contact would result if eye-contact was correlated with an increase in the probability of receiving

attention over the background probability of attention.

Skinner and others certainly recognized that there was value in examining the effects of

reinforcers that were delivered for free or independent of behavior. For example, Zeiler (1968)

examined the effects of what he termed "response-independent schedules of reinforcement."

Zeiler exposed pigeons to fixed-time (FT) and variable-time (VT) schedules where reinforcers

were delivered according to either a fixed duration of time that did not change from reinforcer to

reinforcer or a quasi-random duration that changed from reinforcer to reinforcer but whose









average remained constant across sessions. Responding in the context of FT and VT schedules

was evaluated after pigeons first experienced FI and VI schedules. The effect of both schedules

was to produce a decrease in the rate of responding; however, the FT schedule produced

accelerated patterns of responding just prior to reinforcer delivery. This increase was attributed

to adventitious reinforcement or, "the strengthening of behavior because it happens to occur

contiguously with or in close temporal proximity to reinforcement." (p. 412) That is, the pattern

of responding established during FI was maintained during the subsequent FT condition despite

the lack of a dependency between responding and reinforcer delivery. The absence of systematic

patterns observed during exposure to VT was interpreted to have been caused by the

strengthening of behavior other than key-pecking as a result of unpredictable intervals between

reinforcers.

Additional experiments followed Zeiler's (1968) examination, including Lattal and Maxey

(1971). Lattal and Maxey evaluated responding during VT schedules using a multiple schedule.

Multiple schedules involve the alternation between two conditions (or, components) within the

same session. Each component is associated with a unique stimulus or set of stimuli. In Lattal

and Maxey's first experiment, both components were initially set to VI schedules (Mult VI VI).

In later conditions, both components changed to VT schedules (but, at different points in the

experiment). Responding during the VT component persisted longer when the other component

was VI. In addition, responding was higher in the component that was most recently a VI

schedule, suggesting that responding during the VT schedule was partly a function of the

response strength in the previous condition. In the second experiment, responding was examined

following a transition from Mult VI VI to Mult VI Ext (extinction) and then Mult Ext Ext with

occasional 1-session probe evaluations of Mult VT VT. Although extinction typically produces










complete suppression of behavior, responding was maintained at approximately 10 responses per

minute, indicating that responding during the VT condition would likely have produced responses

contiguous with reinforcer presentation. Hence, at least some of the response persistence during

VT might be attributed to adventitious reinforcement.

Other researchers noted that the pattern produced by the previous response-dependent

schedule could influence the likelihood of adventitious reinforcement in subsequent response-

independent conditions. For example, Rescorla and Skucy (1969) suggested that relatively high

rates could be obtained in FT following FI because exposure to FI schedules typically produces

rates of behavior that increase prior to reinforcer delivery. Therefore, response-independent

reinforcers delivered at the same frequency would likely follow similar local increases in

responding. Similarly, Lattal (1972) concluded that relative to FT, VT does not produce rates of

responding as high as its response-dependent counterpart (VI) because the VT presentation of

reinforcers is more likely to occur during some behavior other than lever pressing.

In an attempt to understand the relative contributions of dependency and contingency to

responding, several investigators examined schedules that combined features of both. Edwards,

Peek, and Wolfe (1970) compared rates of responding in FR, FT, conjoint FR FT (where

reinforcers were delivered following a fixed number of responses and fixed periods of time), and

extinction (where reinforcers were not delivered during the session). Edwards et al. found that

the effects of adding response-independent schedules on top of existing response-dependent

schedules produced relatively small decreases in behavior compared to either extinction or FT.

In addition, as the rate of response-independent reinforcement was increased (or, as the intervals

of the FT schedule were decreased) and the response-requirement for response-dependent

reinforcers remained fixed during the conjoint FR FT condition, response rate decreased.









Lattal (1974) examined schedules in which the percentage of reinforcers delivered

according to a variable schedule were response-dependent (while the remainder were response-

independent). This was accomplished by either making every 3rd, 10th, or all reinforcer

deliveries dependent on a response. When response-dependent reinforcers were available,

response-independent deliveries were suspended until after the first response occurred. In

addition, the proportions of response-dependent reinforcers were examined in both ascending

and descending series. Results suggested that response rates decreased as the percentage of

response-dependent reinforcers decreased.

Lattal and Bryan (1976, Experiment 1) examined effects of delivering response-

independent reinforcers according to a VT schedule on top of existing FI performance using a

conjoint FI VT schedule. The experimenters manipulated the rate of reinforcer presentation on

the VT schedule while keeping the FI schedule constant. In general, the results suggested that

VT reinforcer delivery disrupted both the pattern and rate of responding established by the FI

schedule. That is, the positively accelerated rates observed prior to reinforcement on the FI

schedule became more linear when VT reinforcers were introduced. In addition, the overall rate

of responding decreased during the session. However, the authors noted that in some cases, the

addition of response-independent reinforcement had either no clear effect or increased rates of

responding. The authors suggested that the uncontrolled temporal contiguity of responses and

reinforcers delivered according to the VT schedule may have contributed to the lack of consistent

effects.

Additionally, more recent applied studies have shown that responding may persist when

response-independent reinforcers are delivered on top of an existing response-dependent

schedule. For example, Marcus and Vollmer (1996) evaluated whether appropriate









communication behavior would persist following training if the reinforcers maintaining

appropriate communication (and problem behavior) were delivered according to a fixed-time

schedule. Once appropriate behavior was established and problem behavior remained low, the

rate of fixed-time presentation was decreased across sessions. The results showed that

appropriate communication persisted despite the fixed-time delivery of reinforcers.

Additionally, this effect was replicated by Goh, Iwata, and DeLeon (2000).

Considering the Occurrence and Nonoccurrence of Behavior

One feature common to schedules in which reinforcers are delivered following either

responses or following the passage of time is that reinforcers delivered according to the latter

might still follow responses closely in time. This becomes a problem because reinforcers can

have different effects depending on whether or not they follow behavior. In addition, these

different effects can occur independent of whether or not the behavior actually triggered the

delivery (i.e., there does not need to be a dependency between behavior and a subsequent event

for the behavior to be affected by it). So, a conceptualization of reinforcement that includes

those reinforcers that happen after behavior and those reinforcers that happen after some period

of time (regularly or irregularly) is inadequate because some proportion of those latter

reinforcers will inevitably follow behavior. Furthermore, that proportion (of reinforcers

delivered according to a time-based schedule that accidentally follow behavior) is not controlled

by the experimenter but instead, by the organism's behavior. Therefore, to study contingencies

similar to those found in the natural environment, there must be control over the delivery of

reinforcers following the occurrence of behavior and the delivery of reinforcers following the

nonoccurrence of behavior. Fortunately, nomenclature and conceptualizations that support such

a framework already exist.









In his text Learning, Catania (1988) described the fundamental process and procedures

known as reinforcement. He noted that a prototypical study on reinforcement might compare the

effects of exposing the animal to two conditions: a baseline, where the animal receives no food

and a reinforcement condition, where the animal receives food after each instance of behavior.

The conditions might alternate back and forth a few times so that the experimenter is convinced

it is the reinforcement causing the increase in behavior and not some other, uncontrolled

variable. Following such an experiment, the data might reveal that responding remained low

during the initial baseline condition, increased during the reinforcement condition, then

decreased back down to previous levels during the subsequent baseline condition, and so on. To

some, it may seem like a clear demonstration that reinforcement was responsible for the increase

in behavior, but Catania noted two changes occurring during the transition back to baseline: 1)

the relationship between behavior and food and 2) the mere presence of food in the session. In

light of that limitation, an alternative explanation for the obtained increase in behavior might be

that the food had a general tendency to increase the activity of the animal, which produced not

only an increase in the measured behavior but in other, unmeasured behavior as well. To address

this, Catania described an alternative control condition where, instead of not delivering

reinforcers at all, food is to be delivered for both the occurrence and nonoccurrence of behavior.

He expressed these terms probabilistically such that, for the reinforcement condition, the

probability of a reinforcer given a response was 1.0 and the probability of a reinforcer given no

response was 0 and in the extinction condition, both probability terms would be equal.

In addition, Catania's (1988) conceptualization provides a heuristic for anticipating the

effects of complex contingencies (for lack of a better term, complex is used here to describe

contingencies where both the probability of a reinforcer given a response and the probability of a









reinforcer given no response are greater than zero). Referring back to the above example using

eye-contact, the probability of receiving attention given eye-contact was .5 but sometimes

attention was delivered in the absence of eye-contact. Catania's conceptualization allows us to

evaluate the contingency if we also express the attention that is delivered in the absence of eye-

contact as a probability. If the probability of attention given eye-contact is greater than the

probability of attention given no eye-contact, Catania's framework would predict that eye-

contact would be strengthened as a result of reinforcement. Conversely, if the probability of

attention given eye-contact is less than or equal to the probability of attention given no eye-

contact, Catania's framework would predict that eye-contact would not be strengthened. The

conceptualization might be helpful for improving our understanding of contingencies similar to

those found outside the laboratory.
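
As a rough illustration of this heuristic (a sketch added here for clarity, not part of Catania's account; the function name and the specific probability of attention in the absence of eye contact are hypothetical), the prediction reduces to a comparison of the two probabilities:

    def predicted_effect(p_sr_given_response, p_sr_given_no_response):
        # Catania-style heuristic: behavior is expected to be strengthened
        # only when responding raises the probability of the reinforcer
        # above its background level.
        if p_sr_given_response > p_sr_given_no_response:
            return "positive contingency: strengthening expected"
        if p_sr_given_response < p_sr_given_no_response:
            return "negative contingency: no strengthening expected"
        return "neutral (zero) contingency: no strengthening expected"

    # Eye-contact example: attention follows eye contact with probability .5;
    # suppose attention also occurs without eye contact with probability .2
    # (an assumed value for illustration).
    print(predicted_effect(0.5, 0.2))   # strengthening expected
    print(predicted_effect(0.5, 0.5))   # no strengthening expected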

An Analogy in Respondent Conditioning

Perhaps not coincidently, a similar conceptualization of contingencies has been useful for

understanding respondent conditioning. Rescorla (1967) wrote about confounds present in

common control conditions during tests of respondent conditioning. Respondent conditioning

(sometimes called Pavlovian conditioning) describes conditioning in which a neutral stimulus

comes to produce effects similar to those of an unconditioned stimulus (US) as a result of

operations often simply (and inadequately) described as pairing. Effects of respondent

conditioning are demonstrated by comparing a subject's responses to the CS (conditioned

stimulus) following a test condition (in which the CS and US are paired) and a control condition.

Popular control procedures pre-dating Rescorla's publication involved some variation of

presenting both the CS and the US but, in a manner that was directly contrary to the test

condition. That is, US were often presented before CS such that presentation of the CS was

never predictive of an upcoming presentation of the US. Rescorla made two arguments: 1) the










only difference between the effects of the test and control conditions should be the contingency

necessary to produce conditioning and 2) many of the commonly used control conditions

included two changes: the removal of one contingency and the addition of another. For

Rescorla, the constraints placed on the relation between the CS and the US in typical control

conditions constituted a procedural difference beyond the mere absence of the contingency

responsible for conditioning. Therefore, the ideal control condition was one in which

presentation of the CS and the US was unconstrained.

The test and three of the control conditions (explicitly unpaired control, backward

conditioning, and discriminative conditioning) described by Rescorla can be expressed

probabilistically (for the sake of completeness, the remaining control conditions were

presentation of the CS alone, presentation of a novel CS, and presentation of the US alone). In

the test condition, in which CS are always presented and removed prior to the US, the probability

of a US given a CS is 1.0 and the probability of a CS given a US is 0. The explicitly unpaired,

backward conditioning, and discriminative conditioning procedures effectively arranged the same

contingency: US always precede CS and CS never precede US. Hence, in these control

conditions, the probability of a US given a CS is 0 and the probability of a CS given a US is 1.0.

And in the ideal control condition, in which presentation of the CS and the US are unconstrained

(random), the probability of a US given a CS would be equal to the probability of a CS given a

US.

Lane (1960) investigated the potential effectiveness of control conditions for operant

control of vocalizations in chickens. The control conditions included no reinforcement

(extinction), fixed-time reinforcer delivery, fixed-ratio food tray presentation (a stimulus that

was correlated with reinforcer delivery) without accompanying reinforcers, and DRO (where










reinforcers were delivered given the absence of responding). Lane found decreases in each of the

control conditions relative to either fixed-ratio or fixed-interval test conditions. Similar results

were obtained by Thompson, Iwata, Hanley, Dozier, & Samaha (2003), who examined fixed-

time, extinction, and DRO. Both studies reported relatively higher rates of responding during the

fixed-time condition, which was attributed to accidental contiguity between responses and

reinforcers.

Thompson and Iwata (2005) noted the analogy between Rescorla's (1967) description of

ideal control procedures for respondent conditioning and those used for operant conditioning.

Their analysis led them to conclude that, although imperfect for reasons described below,

noncontingent reinforcement (NCR) met Rescorla's definition of a "truly random control."

(Thompson and Iwata, 2005, p. 261) However, the fixed-time delivery of reinforcers does not

ensure that the obtained relationship between behavior and reinforcers is random. Reinforcers,

by definition, have the effect of strengthening whatever preceded them. The strengthening effect

does not depend on the nature of the relationship between behavior and reinforcement (i.e.,

whether the behavior produced the reinforcer or if the reinforcer accidentally followed

behavior). As a result of being strengthened, the rate and/or pattern of behavior may change

such that the obtained contingency is no longer random. In the case of fixed-time delivery of

reinforcers, responses that occur in the interval just before food delivery may be more likely to

occur in the future. Such a case was reported by Vollmer, Ringdahl, Roane, and Marcus (1997)

in which a child's aggression persisted during NCR. An examination of the within-session

pattern of responding revealed that as the individual gained more experience with the treatment,

instances of aggression became more likely just prior to reinforcer-delivery. In other words, the

probability of a reinforcer given aggression was likely higher than the probability of a reinforcer










given the nonoccurrence of aggression. Such a condition is more descriptive of a fixed- or

variable-ratio schedule as opposed to a "truly random control." It is possible that such a problem

only occurs if one uses fixed-time schedules and that NCR implemented using variable-time

schedules would retain its status as the truly random control. However, VT schedules also do

not ensure that the obtained relation between behavior and reinforcers remains random.

Reinforcers that are delivered closely following responses may increase the overall rate of

responding such that, compared to the initial rate of responding that produced a negative

contingency, higher rates of responding may produce positive contingencies. In addition, many

of the studies on reinforcement contingencies already discussed emphasize the role in which

response-independent reinforcers exert their influence on responding in systematic (i.e., non-

random) ways (cf. Zeiler, 1968; Rescorla & Skucy, 1969; Edwards, Peek, & Wolfe, 1970;

Lattal & Maxey, 1971; Lattal, 1972; Lattal, 1974; Lattal & Bryan, 1976).

Previous Research on Complex Contingencies of (Operant) Reinforcement

To date, two studies have experimentally manipulated contingencies of reinforcement

viewed as the probability of a reinforcer given a response and the probability of a reinforcer

given no response. In the first one, which was a two-experiment study, Hammond (1980)

investigated effects of positive and negative contingencies in rats using water as the reinforcer

and lever pressing as the response. Contingencies were arranged by dividing the session into a

series of unsignaled 1-s cycles. At the end of each cycle, .03 ml of water was delivered (or not)

according to two experimenter-programmed probabilities: the probability of a reinforcer given

that at least one response occurred during the previous cycle and the probability of a reinforcer

given that no responses occurred during the previous cycle.
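
Hammond's cycle-based procedure can be summarized in a short simulation. The sketch below is an illustrative assumption rather than a reproduction of the original apparatus control code: responding is stand-in random behavior (in the experiment it is, of course, produced by the rat), and the reinforcer is simply counted rather than delivered as .03 ml of water.

    import random

    def run_session(p_sr_given_r, p_sr_given_no_r, n_cycles=3600,
                    p_response_per_cycle=0.3, seed=None):
        # Divide the session into unsignaled 1-s cycles. At the end of each
        # cycle, deliver a reinforcer with probability p_sr_given_r if at
        # least one response occurred during that cycle, and with probability
        # p_sr_given_no_r if no response occurred.
        rng = random.Random(seed)
        reinforcers = 0
        for _ in range(n_cycles):
            responded = rng.random() < p_response_per_cycle
            p = p_sr_given_r if responded else p_sr_given_no_r
            if rng.random() < p:
                reinforcers += 1
        return reinforcers

    # A moderately high positive contingency (.05/.00) versus a zero
    # contingency (.05/.05), as in conditions c and d of Table 1-1:
    print(run_session(0.05, 0.00, seed=1))
    print(run_session(0.05, 0.05, seed=1))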

In the first experiment, rats were given a history of a positive contingency before they were

exposed to a zero contingency. Hammond used the term positive contingency to refer to










conditions where the probability of a reinforcer given a response was higher than the probability

of a reinforcer given no response. The term zero contingency was used to refer to conditions

where the probabilities of a reinforcer given a response and given no response were equal. The

specific sequence of conditions and the terms used to describe them are listed in Table 1-1.

Responding decreased rapidly after the introduction of the zero contingency as compared to the

moderately high positive contingency.


Table 1-1. Contingencies for each condition in Hammond (1980).
Condition    P(Sr|R)    P(Sr|~R)    Term
a            1.0        0           Very High Positive
b            .2         0           High Positive
c            .05        0           Moderately High Positive
d            .05        .05         Zero
e            .05        0           Moderately High Positive
f            .05        .05         Zero
The conditions in Experiment 1 of Hammond (1980) and the terms used to describe them. The
abbreviation P(Sr|R) stands for the probability of a reinforcer given a response and
the abbreviation P(Sr|~R) stands for the probability of a reinforcer given no response.


In the second experiment, 47 rats were given a history of a positive contingency and then

were exposed to either one of two positive contingencies (.12/.00 or .12/.08), one of two zero

contingencies (.12/.12 or .05/.05), or a negative contingency (.00/.05). The results showed that

responding decreased as the contingencies were progressively weakened. In the discussion, the

correspondingly decreased response rates were interpreted as evidence against accounts

of reinforcement that are based on contiguity. Contiguity, when used with respect to operant

behavior, refers to the amount of time that elapses between responses and reinforcers. According

to the author, the contiguity was the same in all conditions of the experiment. Therefore, the

relationship between the probability of a reinforcer given a response and the probability of a

reinforcer given no response must play an important role in determining reinforcement effects.









Borrero, Vollmer, and Wright (2002) translated the findings and procedures used by

Hammond (1980) in the treatment of aggression. A functional analysis (Iwata et al., 1982/1994)

was conducted in order to identify the reinforcers maintaining aggression for two participants.

For both participants, aggression was maintained by social reinforcement, which meant that it

occurred because of the reactions of other individuals in the environment. Specifically, one

participant's aggression was maintained by escape from activities and the other's was maintained

by access to preferred food items. Following the functional analyses, the participants were

exposed to positive and then neutral (zero) contingencies. Cycle durations were adjusted to be

approximately equal to the average duration of the responses made by the participants. For one

participant, the cycle duration was 1 s and, for the other, the cycle duration was 5 s. The effect

of the contingencies was the same for both participants: positive contingencies produced

maintenance and neutral contingencies produced decreases in aggression. One implication of

Borrero, Vollmer, and Wright is that the procedures used to arrange complex contingencies of

reinforcement may represent a useful method for simulating reinforcement contingencies like

those maintaining problem (or appropriate) behavior in the natural environment. Furthermore,

the effects on socially-relevant behavior seem to be in the direction anticipated by Catania

(1988).

The neutral contingencies described by Hammond (1980) and Borrero, Vollmer, and

Wright (2002) might better fit an operant analog of Rescorla's (1967) "truly random control."

Neutral contingencies specify that the probability of a reinforcer given a response is equal to the

probability of a reinforcer given no response. If those probabilities are set to values greater than

zero, then responding does not have the effect of increasing the probability of a reinforcer above

that obtained if no response occurs. Therefore, the alternation between positive and neutral









contingencies by Borrero, Vollmer, and Wright (2002) constitutes the demonstration of a control

condition where the only change between baseline and reinforcement is the contingency for not

responding. However, this kind of control condition has not been described or examined in

relevant discussions of operant control procedures (Lane, 1960; Thompson et al., 2003;

Thompson & Iwata, 2005).

Translational Research

Traditional views of science often place a division between two groups of scientists: basic

and applied. Basic scientists are those that do science for the sake of understanding and applied

scientists are those that do it to meet some more immediate need of society (Baer, Wolf, &

Risley, 1968). The extension of the findings of Hammond (1980) and the contingency concept

of reinforcement to the treatment of problem behavior represents an example of how research in

basic science may be applied to address issues that are important to society (i.e., reducing

aggressive behavior displayed by children). This model of the relationship between basic and

applied science is often unidirectional, where information flows from basic to applied. However,

less obvious is the reciprocal role in which application can (or should) guide basic science.

Positive reinforcement is a concept that is clearly basic and fundamental to behavior

analysis. Basic research on positive reinforcement has focused largely on if-then response-

reinforcer dependencies. However, applied research has shown that events known to reinforce

problem (and appropriate) behavior sometimes occur following behavior and sometimes occur

when behavior has not occurred (e.g., Vollmer, Borrero, Borrero, Van Camp, & Lalli, 2001;

Samaha et al., in press). Intuitively, such contingencies are frequent in human environments.

Therefore, examining the necessary and sufficient conditions for reinforcement in the context of

complex contingencies would seem important.









In addition, previous translational research has shown that some effects of reinforcement

seem dependent on not just current contingencies, but also previous experience. For example,

Borrero, Vollmer, Van Haaren, Haworth, and Samaha (in prep) used rats to examine lever

pressing during fixed-time (FT) schedules where reinforcers are delivered according to a clock

(independent of lever pressing). Fixed-time schedules might sometimes produce complex

contingencies because, even though reinforcers are delivered according to a clock, they may

accidentally occur just after a response or after a period of time without responding. Results

indicated that maintenance during the FT condition was more likely when rats had a previous

history of responding on an FI schedule with the same interval value as that used in the

subsequent FT condition. For example, rats with a previous history of FI 30 s (where the first

response after 30 s produced a reinforcer) continued to respond at higher rates in a subsequent

FT 30 s condition (where reinforcers were presented every 30 s independent of lever pressing) as

compared to an FT 15 s condition. While the results of this study do not lend themselves to an

evaluation of the effects of complex contingencies (because the relationship between responding

and reinforcer delivery in the FT condition was not directly arranged by the experimenter), the

results clearly suggested that reinforcement effects in complex contingencies may be influenced

by previous experience. Therefore, a complete description of the necessary and sufficient

conditions for reinforcement in complex contingencies might need to include conditional

statements based on an organism's previous experience.

Goals of the Current Research

The general aim of this dissertation is to present a method to study complex contingencies

of reinforcement. The series of studies seeks to investigate some conditions for observing

acquisition and maintenance under complex schedules of reinforcement. An improved









understanding of complex schedules of reinforcement has implications for how behavior might

be reinforced and maintained in the natural environment.

The following four experiments examined acquisition and maintenance of lever pressing in

rats. In the first experiment, acquisition was examined during two positive contingencies

(.10/.05 and .10/.00) and effects of exposure to .10/.00 on responding in a subsequent .10/.05 condition. In

Experiment 2, a systematic replication of Experiment 1 was conducted by providing

experience with negative contingencies (.00/.10 and .05/.10) prior to the evaluation of

responding in positive contingencies. The results of Experiments 1 and 2 were somewhat

different, such that acquisition and maintenance may have been weakened by the early exposure

to a negative contingency. Therefore, Experiment 3 was designed to evaluate effects of the difference

between Experiments 1 and 2 (the early exposure to negative contingencies) within subjects.

Finally, in Experiment 4, a method was used to systematically identify the contingency values at

which responding would break down by gradually manipulating the contingency from positive to

negative (.10/.00 to .00/.10).









CHAPTER 2
EXPERIMENT 1

Purpose

The purpose of this experiment was to evaluate whether lever pressing could be acquired,

maintained, or both under a complex positive contingency of reinforcement, in which there was

some probability of a reinforcer given behavior (.10) and some probability of a reinforcer given

no behavior (.05).

Method

Subjects

Three experimentally naive male Wistar (albino) rats purchased at 8 weeks of age were

housed individually in home cages. Experimentally naive rats were selected as subjects in order

to control for a history of behavior reinforced by access to food. Conclusions based on the

acquisition of behavior by organisms that were not experimentally naive would need to be tempered due

to both known and unknown experiences prior to the experiment. Likewise, the conditions under

which food-reinforced behavior could be acquired and maintained in experimentally naive

organisms could be tested. Prior to the experiment, rats were given ad-lib food and water for 7

consecutive days. After 7 days, access to food was restricted to 16 g per day. Food was made

available in the home cages immediately following sessions. Water was freely available in the

home cages throughout the experiment. Sessions began after the 7th day of food restriction. All

procedures were approved by the University of Florida Animal Care and Use Committee.

The colony room was illuminated on a 12-hour light-dark cycle with lights programmed

to turn on at 8 am. Temperature and humidity were monitored and maintained at consistent

levels.










Apparatus

Six Coulbourn Instruments operant chambers were enclosed in sound-attenuating boxes

with exhaust fans. An intelligence panel was mounted on one wall of each chamber, which measured 29

cm long X 30 cm wide X 25 cm high. Mounted on the panel were two levers and a pellet

hopper. The pellet hopper was mounted in the center of the intelligence panel (7.0 cm above the

floor) and the levers were located on either side of the hopper (centered 7.0 cm above the floor

and 5.5 cm from the center of the hopper). Also mounted on the intelligence panel were three

color LEDs (light-emitting diodes) mounted horizontally 4 cm above each lever, an incandescent

house-light (2.0 cm from the top-center of the panel), and an incandescent hopper-light. From

left to right, the colors of the LEDs were red, green, and yellow. The side-panels of the chamber

were made of clear acrylic plastic while the ceiling, rear, and intelligence panel were constructed

of aluminum. The bottom of the chamber consisted of a shock floor (although no shock was ever

delivered during the experiment) raised above a white plastic drop pan. A pellet feeder was

attached to the back of the intelligence panel and delivered pellets into the hopper. Lever presses

were defined as any force on the lever sufficient to produce a switch closure (about 0.20 N).

Responses to both levers were recorded but only responses on the left lever produced changes in

the probability of reinforcer delivery. A PC running Coulbourn Instruments' Graphic

State Notation software recorded lever presses and controlled the apparatus. The computer also emitted

white noise through a pair of attached speakers at approximately 70 dB (as measured from the

center of the room).

Procedures

Three 10-min sessions were conducted each day. Each session was preceded by a 1-min

blackout and the third session was followed by a 1-min blackout before the animal was returned

to its home cage. During sessions, the house light and the lever lights above both levers were









illuminated. Throughout the experiment, the session was divided into unsignaled 1-s cycles

(similar to that described by Hammond, 1980). The computer was programmed to deliver a

single 45-mg sucrose pellet (Formula 5TUL, Research Diets Inc., New Brunswick, NJ) at the end

of each cycle according to a pair of probabilities specific to each phase: the probability of a pellet

delivery given at least one lever press in the current cycle (P(Sr|R)) and the probability of a

pellet delivery given no lever presses in the current cycle (P(Sr|~R)). During a pellet delivery,

the house and lever lights were turned off for 1 s. At the same time, the hopper-light flashed

briefly for 250 ms. The next cycle began when the house and lever-lights were re-illuminated.

Lever presses that occurred during the 1-s blackout did not have any programmed effect and

were not included in the overall rate of responding.
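
To make the cycle-based arrangement concrete, a minimal simulation of one 10-min session is sketched below. The sketch is in Python and is purely illustrative: the actual chambers were controlled by Graphic State Notation, and the respond function used here is a hypothetical stand-in for the rat's behavior.

    import random

    def run_session(p_sr_given_r, p_sr_given_not_r, respond, cycles=600):
        """Simulate one 10-min session of unsignaled 1-s cycles.

        At the end of each cycle a pellet is delivered with probability
        p_sr_given_r if at least one lever press occurred during that cycle,
        and with probability p_sr_given_not_r otherwise.
        """
        presses = pellets = 0
        for _ in range(cycles):                 # one iteration per 1-s cycle
            pressed = respond()                 # did at least one press occur this cycle?
            if pressed:
                presses += 1
            p = p_sr_given_r if pressed else p_sr_given_not_r
            if random.random() < p:
                pellets += 1                    # pellet delivery (lights off for 1 s in the chamber)
        return presses, pellets

    # Example: the complex positive contingency .10/.05 from Experiment 1,
    # with a hypothetical animal that presses in about 30% of cycles.
    print(run_session(0.10, 0.05, respond=lambda: random.random() < 0.3))

Under this sketch, .10/.00 and .10/.05 differ only in whether cycles without a press can end in a pellet delivery, which is the distinction between a positive and a complex positive contingency described above.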

Other than the contingencies implemented during each phase, no lever shaping or hopper-

training was performed prior to or during the experiment. Contingency values (the probabilities

of pellet delivery) for each condition were initially based on the values reported by Hammond

(1980). Pilot work revealed that animals gained excessive weight when exposed to similar

contingency values in combination with session durations of 50 min. Therefore, an attempt was

made to reduce food intake by limiting the total time spent in session to 30 min per day. In

addition, the session time was divided into three 10-min blocks after an examination of within-

session patterns revealed reasonably consistent rates of responding.

Conditions

Condition changes were made following stability as judged by visual inspection. From

this point forward, each condition has been specified using two parameters: the probability of a

pellet delivery given a response and the probability of a pellet delivery given no response

(P(Sr|R) / P(Sr|~R)). Conditions were conducted in the following order: .00/.00 (No Pellet),

.10/.05, .10/.00, and .10/.05.









Results and Discussion

Figure 2-1 shows responses per min of lever pressing for each session, subject, and

condition. The following pattern of responding was observed for all three subjects. Little to no

responding was obtained in the initial No Pellet condition (as expected). No subject showed

acquisition during the subsequent .10/.05 condition. Responding increased for all three subjects

following exposure to .10/.00. Responding then persisted at somewhat reduced levels (as

compared to the previous .10/.00 condition) following the reversal back to .10/.05.













Figure 2-1. Experiment 1 All Sessions. This figure shows responses per min of lever pressing
for each session. Each panel shows data from a different subject.











Three conclusions can be drawn from the data. First, .10/.05 was not sufficient to produce

acquisition in these subjects during the time period in which they were exposed to the condition.

Second, the lack of acquisition in .10/.05 may be, in part, explained by the reinforcers that were

delivered following cycles without responses given that acquisition was obtained in .10/.00.

Third, responding was maintained during the second exposure to .10/.05, a condition which

previously did not produce responding. It is this third finding that is perhaps most critical. If the

.10/.00 condition is viewed as an independent variable, then exposure to that variable produced a

differential effect in the subsequent .10/.05 condition relative to the .10/.05 condition that

preceded .10/.00.

Although only a small range of parameter values was examined in this experiment, the

results may have implications for the acquisition and maintenance of problem and appropriate

behavior in humans. With respect to the first effect, it may be that occasional reinforcers

presented in the absence of behavior are sufficient to prevent the acquisition of problem (or

appropriate) behavior. Given the current data, this could be the case even if the probability of

reinforcement given problem (or appropriate) behavior was twice as likely as the probability of

reinforcement given no behavior. Such reinforcers could be arranged using fixed-time schedules

(e.g., noncontingent reinforcement, NCR), differential reinforcement of other behavior (DRO),

or following the occurrence of appropriate behavior as a sort of "inoculation" against the

emergence of problem behavior. On the other hand, too many "free" reinforcers may impede the

development of important appropriate skills.

Koegel and Rincover (1977) showed similar results when, following experience with

intermittent reinforcement, students' correct responses persisted (but eventually decreased) in

another setting when reinforcers were presented following successive incorrect responses or










independent of behavior. When reinforcers were presented following incorrect responses, an

examination of the pattern of responses revealed that the reinforcer appeared to serve as a

discriminative stimulus. That is, correct responses increased after the delivery of a reinforcer

and then decreased across successive trials. Indeed, other authors have observed response

persistence during DRO schedules and have posited a discriminative effect of the reinforcer

(cf. Thompson, Iwata, Hanley, Dozier, & Samaha, 2003). When reinforcers were presented

independent of correct responses, behavior persisted for much longer. The authors attributed the

enhanced persistence of the response-independent reinforcement to adventitious pairing of

responses and reinforcers. In the current study, response rates persisted (for several hundred

sessions in two cases) under a complex positive contingency. One interpretation of the results of

this study was that the occasional response-dependent reinforcer may have enhanced the

discriminative properties of all the reinforcers such that responding persisted for much longer

than that observed by Koegel and Rincover (1977).

The acquisition versus maintenance effect with .10/.05 has implications for the

maintenance of appropriate behavior and the treatment of problem behavior. Once acquired,

both appropriate and problem behavior may be relatively robust despite intermittent

reinforcement and occasional reinforcers delivered following the absence of behavior. For

problem behavior, the result suggests that those selecting treatments for eventual implementation

by caregivers should do so while considering the possible effects that treatment integrity failures

will have on the contingency. For example, DRO (a common treatment) specifies that

reinforcers are to be delivered following periods of time in which behavior has not occurred.

Despite even the most ideal training, it is very likely that other factors may result in reinforcers

occasionally following problem behavior (e.g., as a result of intermittent care by untrained or









unmotivated individuals or if the problem behavior is extremely dangerous and necessitates

immediate reactions from caregivers). Such mistakes may appear small but might serve to drive

an initially strong negative contingency toward conditions that would produce maintenance.

Conceptually, the results have implications for understanding the basic principle of

reinforcement. A given contingency may not produce a reinforcement effect in the sense of

"strengthening" behavior, but may produce a reinforcement effect in the sense of "maintaining"

previously acquired behavior.

Similar results were also obtained by Marcus and Vollmer (1996) and Goh, Iwata, and

DeLeon (2000) in which appropriate behavior persisted following exposure to an FT schedule of

reinforcer delivery. However, one important difference between those studies and the current

procedures was that previous authors used specific training procedures (FR-1 in Marcus &

Vollmer and a prompt-delay procedure in Goh, Iwata, & DeLeon) in order to teach the initial

behavior. The current study did not involve explicit magazine training, shaping, or any other

analogous training procedure other than the contingencies of reinforcer presentation.

The next experiment was designed to systematically replicate the procedures of the first

experiment but by providing an initial history with negative contingencies.










CHAPTER 3
EXPERIMENT 2

Purpose

One purpose of this experiment was the same as Experiment 1: to evaluate whether lever

pressing could be acquired and maintained under a complex positive contingency of

reinforcement. However, rats were first exposed to two negative contingencies (.00/.10 and

.05/.10).

Method

Subjects and Apparatus

Four experimentally naive male Wistar (albino) rats were included in Experiment 2. The

animals were acquired, housed, and fed in a manner identical to Experiment 1. In addition, the

same chambers used in Experiment 1 were also used in Experiment 2.

Procedures

Procedures in Experiment 2 were identical to those described in Experiment 1 for all but

one animal. For rat 1852, one 50-min session was conducted per day. A 1-min blackout was

presented prior to and following each session.

Conditions

After an initial No Pellet baseline, four rats were exposed to a sequence of conditions

starting from strong negative to strong positive: .00/.10, .05/.10, .10/.05, and .10/.00. It was

thought that this sequence was ordered from least to most likely to produce acquisition of lever

pressing. When shifts in the contingency toward strong positive were associated with increases

in lever pressing, maintenance was evaluated by returning the contingency to the level of the

previous condition. Therefore, the exact sequence of conditions was different for each animal

because it depended, in part, on the animal's performance.









Results and Discussion

Figure 3-1 shows responses per min of lever pressing for each session, subject, and

condition. For subject 1852, low rates of responding were obtained in the first four conditions:

No Pellet, .00/.10, .05/.10, and .10/.05. Similar to the results of Experiment 1, acquisition was

obtained following the change from .10/.05 to .10/.00. Also similar to the results of Experiment

1, responding persisted following the change from .10/.00 to .10/.05.

For subject 1901, low rates of responding were obtained in the first four conditions: No

Pellet, .00/.10, .05/.10, and .10/.05. Unlike the results of Experiment 1, little to no increase in

responding was obtained following the change from .10/.05 to .10/.00 (up to this point, every

subject had acquired lever pressing under .10/.00). Acquisition was obtained following the

change to 1.0/.00 and responding persisted following the subsequent return to .10/.00. Also

unlike the results of Experiment 1, responding did not persist following the change from .10/.00

to .10/.05 (up to this point, every subject maintained lever pressing under .10/.05 following

acquisition).

The results for subject 1902 were similar to those obtained for subject 1852. Low rates of

responding were obtained in the first four conditions (No Pellet, .00/.10, .05/.10, and .10/.05).

Acquisition was obtained following the change from .10/.05 to .10/.00 and responding persisted

following the return to .10/.05.

For subject 1903, low rates of responding were obtained in the first three conditions: No

Pellet, .00/.10, and .05/.10. The ensuing exposure to .10/.05 was carried out for an extended

number of sessions because a modest increasing trend was obtained until session 260. A change

to .05/.10 resulted in suppression of responding and a return to .10/.05 produced a modest

increase in responding similar to that obtained in the first exposure to .10/.05. Acquisition was










obtained after a change to .10/.00. However, similar to the results obtained for subject 1901,

responding did not persist after the return to .10/.05.
















Figure 3-1. Experiment 2 All Sessions. This figure shows rate of lever pressing for subjects
1852, 1901, 1902, and 1903 during Experiment 2.









Some similarities and differences between the results of Experiments 1 and

2 deserve comment. First, for subjects 1901 and 1903, responding did not persist during .10/.05

(following acquisition). In Experiment 1, all three subjects maintained responding following the

second exposure to .10/.05 (following acquisition). Second, for subject 1901, exposure to .10/.00

did not produce acquisition. In Experiment 1, all three subjects acquired lever pressing under

.10/.00. Third, overall rates of lever pressing were somewhat lower in Experiment 2. With the

exception of subject 1852, rates rarely exceeded 40 responses per min during .10/.00, which is

only about 40-60% of the levels obtained in that same condition of Experiment 1.

Although the methods used in Experiment 1 and 2 do not allow for a proper comparison of

the effects of previous exposure to negative contingencies, the results of the two experiments

suggest that exposure to negative contingencies might produce a tendency for suppressed

acquisition and suppressed maintenance in subsequent conditions. If true, the finding may have

implications for treatments designed to "inoculate" individuals against the acquisition of problem

behavior. For example, prolonged exposure to DRO-like contingencies might make severe

problem behavior less sensitive to acquisition and maintenance contingencies. In addition, the

finding would also support early intervention programs designed to teach skills and other

appropriate behavior at an early age using strict (strong positive) reinforcement contingencies.

Conversely, too many reinforcers given "for free" in early development might hamper later

acquisition and maintenance of appropriate behavior.









CHAPTER 4
EXPERIMENT 3

Purpose

The purpose of this experiment was to examine effects of early exposure to negative

contingencies within individual subjects using a multiple schedule. A multiple schedule is

defined as the alternation between two (or more) components in a single session with each

component associated with a unique stimulus or set of stimuli (Ferster & Skinner, 1957).

Method

Subjects and Apparatus

Three experimentally naive male Wistar (albino) rats were included in Experiment 3. The

animals were acquired, housed, and fed in a manner identical to that used in Experiments 1 and

2. In addition, the same experimental chambers were used.

Procedures

Procedures were similar to those used in Experiment 1 with the following exceptions.

Instead of conducting three sessions per day, two components were conducted each day in a

multiple-schedule format. Components were presented in pseudo-random order determined by

the computer at the beginning of each session. Each component was 10 min in duration

and was associated with the illumination of LEDs in different colors and positions. Component

1 was associated with the illumination of an LED located left-of-center and above each lever in

the chamber. Component 2 was associated with the illumination of an LED located at the center

position above each lever. Previous work in other experiments showed that LED location could

exert stimulus control over responding in the context of multiple schedules; hence, it was

deemed adequate for this preparation. Like Experiments 1 and 2, each session was preceded and










terminated by a 1-min blackout. In addition, a 1-min blackout delineated the presentation of

each component.
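
The structure of a daily multiple-schedule session can be sketched in the same illustrative style as the single-session simulation above. The planner below (Python; the component labels, tuple format, and printing are assumptions for illustration, not part of the Graphic State Notation programs) shows the pseudo-random ordering of the two 10-min components and the 1-min blackouts that preceded, separated, and followed them.

    import random

    # Per-component contingencies for condition 3 of Table 4-1: (P(Sr|R), P(Sr|~R)).
    CONDITION_3 = {"component 1": (1.0, 0.00),
                   "component 2": (0.00, 0.10)}

    def plan_daily_session(contingencies):
        """Return the day's schedule as (event, duration_s, contingency) tuples."""
        order = list(contingencies)
        random.shuffle(order)                      # order determined at the start of each session
        schedule = []
        for label in order:
            schedule.append(("blackout", 60, None))        # 1-min blackout before each component
            schedule.append((label, 600, contingencies[label]))
        schedule.append(("blackout", 60, None))            # terminal 1-min blackout
        return schedule

    for event in plan_daily_session(CONDITION_3):
        print(event)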

Conditions

The contingency values associated with each component and condition are listed in Table

4-1. The preparation was designed to provide the subject with a history of a positive

contingency in component 1 and a history of a negative contingency in component 2. After

providing those histories, a baseline of responding in a positive contingency (.10/.00) was

established in both components prior to the test condition (.10/.05).

Table 4-1. Contingencies for each condition of Experiment 3.
Condition Component 1 Component 2
1 1.0/.00 (not used)
2 (not used) .00/.10
3 1.0/.00 .00/.10
4 .10/.00 .00/.10
5 .10/.00 .10/.00
6 .10/.05 .10/.05

Results and Discussion

Session-by-session data are presented in Figure 4-1. Results for subject 2302 are shown in

the top panel. In the first condition, the animal was only exposed to component 1 (1.0/.00).

Subject 2302 acquired lever pressing during the 8th session. In the next condition, the animal

was only exposed to component 2 (.00/.10). By the end of the condition, rates of lever pressing

had fallen to an average of 0.29 responses per min and ranged between 0 and 0.8 responses per

min. During condition 3, the subject experienced both component 1 (1.0/.00) and component 2

(.00/.10) in random order, with the schedule-correlated stimuli, once per day. By the end of

condition 3, responding in component 1 (1.0/.00) was higher (averaging 11.9 responses per

min) compared to component 2 (.00/.10) (averaging 1.6 responses per min). In the next









condition, component 1 was changed from 1.0/.00 to .10/.00 while component 2 remained

.00/.10. Responding during component 1 became somewhat variable while responding in

component 2 remained low. Next, component 2 was changed from .00/.10 to .10/.00 to match

component 1. Responding increased during both components. In the next condition, both

components were changed to .10/.05. During the first four sessions, responding in component 2

(the component associated with previous exposure to a negative contingency) was notably lower

as compared to component 1.

Results for subject 2406 are presented in the center panel of Figure 4-1. In the first

condition, the animal was only exposed to component 1 (1.0/.00). Subj ect 2406 acquired lever

pressing during the 10th session. In the next condition, the animal was only exposed to

component 2 (.00/.10). By the end of the condition, rates of lever pressing had fallen to 1.2

responses per min. During condition 3, the subject experienced both component 1 (1.0/.00) and

component 2 (.00/.10) in random order once per day. On average, responding in component 1

(1.0/.00) was higher (averaging 17.4 responses per min) compared to component 2 (.00/.10)

(averaging 3.2 responses per min). In the next condition, component 1 changed from 1.0/.00 to

.10/.00 while component 2 remained .00/.10. Responding during component 1 became

somewhat variable while responding in component 2 remained low. Next, component 2 was

changed from .00/.10 to .10/.00 to match component 1. Responding increased during both

components, averaging 45.41 responses per min in component 1 and 45.21 responses per min in

component 2. In the next condition, both components were changed to .10/.05. Following the

transition to .10/.05, responding in both components 1 and 2 decreased to an average of 28.15

and 31.96 responses per min, respectively. No noteworthy differences in responding were

observed between components 1 and 2.









Results for subject 2401 are presented in the bottom panel of Figure 4-1. In the first

condition, the animal was only exposed to component 1 (1.0/.00). Subject 2401 acquired lever

pressing during the 5th session. In the next condition, the animal was only exposed to component

2 (.00/.10). By the end of the condition, rates of lever pressing had fallen to 0.09 responses per

min. During condition 3, the subject experienced both component 1 (1.0/.00) and component 2

(.00/.10) in random order once per day. On average, responding in component 1 (1.0/.00) was

higher (averaging 11.9 responses per min) compared to component 2 (.00/.10) (averaging 2.5

responses per min). In the next condition, component 1 changed from 1.0/.00 to .10/.00 while

component 2 remained .00/.10. Responding during component 1 increased while responding in

component 2 remained low. Next, component 2 was changed from .00/.10 to .10/.00 to match

component 1. Responding increased during both components, averaging 44.0 responses per min

in component 1 and 42.1 responses per min in component 2. In the next condition, both

components were changed to .10/.05. Responding decreased slightly in both components,

averaging 40.8 responses per min in component 1 and 40.1 responses per min in component 2.

No noteworthy differences in responding were observed between components 1 and 2.













Figure 4-1. Experiment 3 All Sessions. This figure shows responses per min of lever pressing
    for each subject and each condition of Experiment 3.









In Experiment 3, in an effort to tease out differences between Experiment 1 and

Experiment 2 results, one component of a multiple schedule was associated with early exposure

to positive contingencies while the other component was associated with early exposure to

negative contingencies. Later, following acquisition of lever pressing, both components were

changed to .10/.05. Data for one subject, 2302, showed that responding in the component that

was previously associated with the negative contingency was initially lower following the

transition to .10/.05 compared to the component that was not previously associated with a

negative contingency. Data for subjects 2406 and 2401 revealed no noteworthy differences in

responding between components 1 and 2 following the transition to .10/.05.

Results for all three subjects failed to replicate the differences in acquisition obtained

during .10/.00 in Experiments 1 and 2. In Experiment 1, three out of three subjects acquired

responding during .10/.00. In Experiment 2, subjects were given an early history with two

negative contingencies (.00/.10 and .05/.10) and one out of four subjects failed to acquire

responding during .10/.00. In Experiment 3, one component of a multiple schedule was

associated with early exposure to positive contingencies while the other component was

associated with early exposure to negative contingencies. Following the experience with

negative contingencies, both components were changed to .10/.00. For all three subjects,

responding during the component previously associated with the negative contingency followed

the same pattern as responding during the component that was not associated with a negative

contingency. Introduction of the complex positive contingency (.10/.05) in both components

produced temporary reductions in the component previously associated with the negative

contingency for only one out of three rats. Therefore, these results only modestly support the

notion that early exposure to negative contingencies decreases acquisition. It is possible that the










early exposure to a positive contingency, independent of histories associated with particular

components, weakened any within-subject effect of early exposure to negative contingencies. A

future experiment might begin without the positive contingency exposure.









CHAPTER 5
EXPERIMENT 4

Purpose

In the previous experiments, a very small set of contingency values was examined. In

addition, the results of the previous experiments have been described in terms of one of two

possible outcomes: persistence and suppression. It is not clear whether gradual reductions in the

strength of the reinforcement contingency would produce concomitant reductions in responding.

Lattal (1974) investigated the degree to which response rates would change in relation to

different combinations of reinforcers delivered according to either VI or VT schedules.

Specifically, the procedures arranged either 0%, 10%, 33%, 66%, or 100% of the reinforcers

according to the VI schedule and the remainder according to VT. The results showed gradual

decreases in responding as the percentage of reinforcers that were response-dependent

decreased. However, the VT schedule permitted contiguity between the responses

and response-independent reinforcers. In fact, results showed that the rates of behavior obtained

using combinations of VT and VI schedules would have likely produced contiguity between

responses and response-independent reinforcers. Therefore, it is not clear whether responding

would be immediately suppressed after contacting weak contingencies in which the proportion of

reinforcers that followed responses was more controlled or whether response rates would

decrease gradually. The purpose of this final experiment was to evaluate a procedure to

gradually weaken the contingency of reinforcement in terms of its effects on responding.

Methods

Subjects and Apparatus

Two rats from Experiment 1 (2003 and 2004) and three rats from Experiment 2 (1903,

2001, and 2005) also participated in Experiment 4. The animals were acquired, housed, and fed










in a manner identical to that used in the previous experiments. In addition, the same

experimental chambers were used.

Procedures

Procedures were identical to those described in Experiment 2.

Conditions

After the last condition from Experiments 1 and 2 (.10/.05), responding was reestablished

in .10/.00 and from there, the contingency was gradually weakened until the rate of responding

was judged to have been "suppressed." The suppression criterion was defined as at least one session

in which the rate was at or below ten times the highest rate obtained during the last six sessions

of the No Pellet baseline from the previous experiment. Once suppression was obtained, an

attempt was made to re-establish responding using the contingency from the previous condition.

If responding was re-established, then the contingency was weakened again. If responding was

not re-established, .10/.00 was re-implemented until rates similar to a prior .10/.00 exposure

were obtained and contingency weakening began again.
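
The suppression criterion described above can be expressed as a simple check, sketched here in Python with hypothetical numbers. The actual condition changes also depended on visual inspection, and (as noted below for subject 1903) the criterion was later amended to additionally require an overall downward trend.

    def meets_suppression_criterion(condition_rates, no_pellet_last_six):
        """True if at least one session rate in the current condition is at or below
        ten times the highest rate from the last six No Pellet baseline sessions."""
        threshold = 10 * max(no_pellet_last_six)
        return any(rate <= threshold for rate in condition_rates)

    # Hypothetical rates (responses per min): a near-zero baseline and one low session.
    print(meets_suppression_criterion([12.0, 3.1, 0.4],
                                      [0.0, 0.1, 0.0, 0.2, 0.0, 0.1]))   # True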

Each contingency value was maintained for three consecutive days (for a total of 6

sessions). Contingency values were selected according to one of two possible sequences

depicted on Figure 5-1. Each rat was exposed to the first sequence at least once. Rats 2004,

1903, and 2005 were exposed to the first sequence and then the second.














Figure 5-1. Experiment 4 Sequence of Conditions. Sequences 1 and 2 (depicted on the left
and right panels, respectively) in which the contingencies were weakened across
conditions. The area above the diagonal line represents a positive contingency where
the probability of a reinforcer given a response is greater than the probability of a
reinforcer given no response. The area below the diagonal line represents a negative
contingency where the probability of a reinforcer given a response is less than the
probability of a reinforcer given no response. The diagonal line represents a neutral
contingency where both parameters would be equal.

Results and Discussion

Data from the last six sessions of the last two conditions of Experiments 1 and 2 (.10/.00

and .10/.05) and each session for each condition (for a total of six sessions per condition) from

Experiment 4 are presented below.

Subject 2003

Data for subject 2003 are presented in Figure 5-2. For 2003, three contingency

manipulations were performed following sequence 1. Those were .10/.00 to .09/.07, .10/.00 to

.05/.10, and .10/.00 to .05/.10. The figure shows that responding decreased as the contingency

was progressively weakened. During the first manipulation, the criteria for suppression were

met during .09/.07. A return to .10/.05 did not result in an increase in responding; therefore, the

contingency was returned to .10/.00. This produced an increase in responding. At that point, the










second contingency manipulation began. During the second manipulation, responding remained

above suppression criteria until the contingency was weakened to .05/.10. This sequence was

replicated during the third manipulation. Following the third manipulation, responding increased

during a brief reversal to .10/.00.














Figure 5-2. Experiment 4 Subject 2003. This figure shows responses per min of lever pressing for the last six sessions of the last
two conditions from Experiments 1 and 2 and every session from every condition of the current experiment. Three
contingency manipulations are shown: from .10/.00 to .09/.07, .10/.00 to .05/.10, and .10/.00 to .05/.10.












Subject 2004

Results for 2004 are presented in Figure 5-3. For 2004, seven contingency manipulations

were made, consisting of five exposures to sequence 1 and two exposures to sequence 2. Those

were from .10/.00 to .09/.06, .10/.00 to .10/.05, .10/.00 to .10/.05, .10/.00 to .09/.06, .10/.00 to

.09/.06, .10/.00 to .08/.03, and .10/.00 to .05/.09. The figure shows that responding decreased as

the contingency was progressively weakened. During the first manipulation, the criterion for

suppression was met during .09/.06. A return to .10/.00 resulted in an increase in responding. At

that point, the second contingency manipulation began. During the second manipulation,

responding reached suppression criteria during the next condition: .10/.05. This sequence and

effects were replicated during the third contingency manipulation. Following the third

manipulation, responding increased during a return to .10/.00. At that point, the fourth

contingency manipulation began. Responding persisted during the subsequent condition: .10/.05.

Responding decreased following a change to .09/.06. At that point, the fifth contingency

manipulation began. Responding increased during .10/.00 but reached suppression criteria

during the subsequent condition: .10/.05.

Three out of the five previous attempts to transition from .10/.00 to .10/.05 resulted in

suppression of responding. In the remaining two attempts, suppression occurred in the next

condition (.09/.06). The effects of weakening the contingency according to sequence 2 were

evaluated during the sixth and seventh contingency manipulations. That is, instead of making a

transition from .10/.00 to .10/.05, the sequence was .10/.00, .10/.01, .09/.01, .09/.02, and so on.

During the sixth contingency manipulation, responding increased during .10/.00 and

persisted during the subsequent four conditions: .10/.01, .09/.01, .09/.02, .08/.02, and .08/.03.

During the seventh contingency manipulation, responding increased during .10/.00 and persisted

during the subsequent twelve conditions: .10/.01, .09/.01, .09/.02, .08/.02, .08/.03, .07/.03,










.07/.04, .06/.04, .05/.05, .05/.06, .05/.07, and .05/.08. Suppression was obtained during the

subsequent conditions (.05/.09 and .05/.10) and response rates increased following a return to

.10/.00.


















Figure 5-3. Experiment 4 Subject 2004. This figure shows responses per min of lever pressing for the last six sessions of the last
two conditions from Experiments 1 and 2 and every session from every condition of the current experiment. Seven
contingency manipulations are shown. Suppression was obtained at the following contingency values: .09/.06, .10/.05,
.10/.05, .09/.06, .10/.05, .08/.03, and .05/.08.












Subject 1903

Results for 1903 are presented in Figure 5-4. For 1903, four contingency manipulations

were performed: the first following sequence 1 and the remainder following sequence 2. The figure shows that

responding decreased as the contingency was driven more in favor of not responding. During the

first manipulation (following sequence 1), the criterion for suppression was met during .10/.05.

Subsequent contingencies were examined using sequence 2.

During the second contingency manipulation, a return to .10/.00 resulted in an increase in

responding. Responding persisted during the subsequent condition (.10/.01) and then decreased

in the next condition (.09/.01). A return to .10/.01 produced increased responding and a

subsequent return to .09/.01 failed to replicate the suppression effects obtained during the

previous exposure to .09/.01 (i.e., responding persisted). Responding still persisted during the

subsequent conditions: .09/.02, .08/.02, .08/.03, and .07/.03. The suppression criterion was

obtained in the following condition (.07/.04) and responding increased during a return to the

previous condition (.07/.03). A return to .07/.04 produced suppression during the third session. At

that point, the suppression criterion was changed such that an overall downward trend must be

obtained in the condition (in addition to the previous rule that responding must be below 10

times the highest rates obtained during the last six sessions of the No Pellet baseline). Therefore,

the contingency was weakened again to .06/.04, .05/.05, and .04/.06. The suppression criteria

were obtained during .04/.06 and a return to .05/.05 did not produce increased responding.

Therefore, the contingency was returned to .10/.00 and the third contingency manipulation

began.

Responding increased during the return to .10/.00 and persisted during the following four

conditions: .10/.01, .09/.01, .09/.02, and .08/.02. Responding decreased during the subsequent

condition (.08/.03) and then increased following a return to .10/.00.















Figure 5-4. Experiment 4 Subject 1903. This figure shows responses per min of lever pressing for the last six sessions of the last
    two conditions from Experiments 1 and 2 and every session from every condition of the current experiment.










Subject 2001

Results for 2001 are presented in Figure 5-5. For 2001, three contingency manipulations

were performed following sequence 1: from .10/.00 to .09/.07, .10/.00 to .08/.07, and .10/.00 to

.08/.07. The figure shows that responding decreased as the contingency was progressively

weakened. During the first manipulation, responding decreased but remained above suppression

criteria during .10/.05. Suppression criteria were met during the next condition (.09/.06). A

return to .10/.00 resulted in increased responding. At that point, the second contingency

manipulation began. Responding decreased but remained above suppression criteria during the

subsequent two conditions: .10/.05 and .09/.06. Suppression criteria were met during the next

condition (.08/.07) and a return to .09/.06 failed to increase rates. At that point, the third

contingency manipulation began. Responding increased during .10/.00 and persisted for the

ensuing two conditions (.10/.05 and .09/.06) until suppression was obtained during .08/.07.















Figure 5-5. Experiment 4 Subject 2001. This figure shows responses per min of lever pressing for the last six sessions of the last
    two conditions from Experiments 1 and 2 and every session from every condition of the current experiment. Three
    contingency manipulations are shown. Suppression was obtained at the following contingency values: .09/.06, .08/.07, and
    .08/.07.










Subject 2005

Results for 2005 are presented in Figure 5-6. For 2005, three contingency manipulations

were made: one following sequence 1 and two following sequence 2. The figure shows

that responding decreased as the contingency was progressively weakened. During the first

manipulation, responding reached suppression criteria during .10/.05. Like the previous subjects

where responding was suppressed during .10/.05 (1903 and 2004), the effect of weakening

contingencies according to sequence 2 was evaluated. Responding increased following a return

to .10/.00 and persisted during the subsequent eight conditions: .10/.01, .09/.01, .09/.02, .08/.02,

.08/.03, .07/.03, .07/.04, and .06/.04. Suppression criteria were met during the subsequent

condition (.05/.05) and increased following a return to the previous condition (.06/.04).

Responding then decreased but remained above suppression criteria during the subsequent return

to .05/.05. Suppression criteria were met during the subsequent condition (.05/.06) and a return

to .05/.05 did not produce an increase in behavior. At that point the third contingency

manipulation was performed.

Responding increased during a return to .10/.00 and remained above suppression criteria

for the following ten conditions: .10/.01, .09/.01, .09/.02, .08/.02, .08/.03, .07/.03, .07/.04,

.06/.04, .05/.04 and .05/.06. Suppression criteria were met during the subsequent condition

(.05/.07) and increased following a return to .10/.00.

















Figure 5-6. Experiment 4 Subject 2005. This figure shows responses per min of lever pressing for the last six sessions of the last
two conditions from Experiments 1 and 2 and every session from every condition of the current experiment. Three
contingency manipulations are shown. Suppression was obtained at the following contingency values: .10/.05, .05/.05, and
.05/.07.










In general, lever pressing decreased as the contingency was weakened. These results

replicate and extend the findings of Lattal (1974) that showed gradual decreases in responding as

the proportion of reinforcers produced by behavior decreased. In this case, procedures were used

that allowed the probability of a reinforcer delivery given a response and given the

nonoccurrence of a response to be tightly controlled.

In some cases, it appeared as though returns to .10/.00 followed by additional contingency

weakening produced greater persistence over successive attempts to meet suppression criteria.

That is, suppression criteria were not met until a much weaker contingency occurred during later

manipulations as compared to earlier manipulations. For example, subject 2003 initially showed

suppression under .09/.07. During the next contingency manipulation, suppression was not

obtained until the contingency was weakened to .07/.08. For subject 2001, suppression was

obtained in the initial contingency manipulation until .09/.06. But during subsequent

manipulations, suppression was not obtained until .08/.07. For subject 1903, suppression was

obtained during the second contingency manipulation at .09/.01 but during subsequent

manipulations, suppression was not obtained until weaker contingencies were reached (.07/.04

and .08/.03, during the third and fourth manipulations, respectively). For subject 2004,

suppression was obtained during the sixth contingency manipulation at .08/.03 but suppression

was not obtained during the seventh manipulation until the contingency was weakened to .05/.08.

Similarly, for subject 2005, suppression was obtained during the second contingency

manipulation at .05/.05 but was not obtained until .05/.07 during the third manipulation.

In addition, there were some cases in which contingencies that initially produced

suppression resulted in maintenance after responding was increased during a return to the next

stronger contingency. These results suggest that similar procedures could be used to increase









maintenance of treatment effects in environments that might have weakened contingencies. For

example, following acquisition of appropriate communication, an individual could be exposed to

a series of gradually weaker contingencies. Such exposure might increase the likelihood of

maintenance in situations in which reinforcers are less likely following appropriate behavior

and more likely after the nonoccurrence of appropriate behavior (e.g., after the occurrence of

problem behavior).

Contingency weakening using sequence 2 (which, in general, involved smaller steps and

changed only one parameter at a time) seemed more likely to result in maintenance at weaker

contingencies for subjects that showed suppression during sequence 1 (1903, 2004, and 2005).

Although both sequences were also evaluated for subjects 1903 and 2005, the data representing

the last six sessions of Experiments 1 and 2 were obtained after much longer experience in those

conditions. A comparison of the effects of sequences 1 and 2 for subjects 1903 and 2005 based

on those data would seem inappropriate. However, subject 2004 experienced four additional

exposures to sequence 1 following Experiment 1, followed by two exposures to sequence 2.

Therefore, it seems more reasonable to attribute differences in the persistence of behavior during

weakened contingencies to the particular sequence used, at least for subject 2004.









CHAPTER 6
GENERAL DISCUSSION

Four experiments examined the effects of contingencies on lever pressing using rats as

subjects. In Experiment 1, subjects were exposed to four conditions: No Pellet, .10/.05, .10/.00,

and .10/.05. For all three subjects, responding remained low during the No Pellet and .10/.05

conditions, increased during .10/.00, and persisted during .10/.05. The results of Experiment 1

showed that a) acquisition was more likely under contingencies that were purely positive as

opposed to complex positive contingencies (where reinforcers were also sometimes presented

following the nonoccurrence of behavior) and b) responding persisted under complex positive

contingencies despite the fact that the same contingencies did not previously produce acquisition.

Subjects in Experiment 2 were exposed to the same conditions as those in Experiment 1

except that following the No Pellet condition, subjects were also exposed to two negative

contingencies: .00/.10 and .05/.10. The results of Experiment 2 were the same as those for

Experiment 1 except that, for two subjects, responding was suppressed during the second

exposure to .10/.05 and, for one subject, acquisition did not occur in .10/.00 (although a

subsequent exposure to 1.0/.00 produced acquisition). Differences in responding between

Experiments 1 and 2 were tentatively attributed to the early exposure to negative contingencies

provided in Experiment 2. Therefore, Experiment 3 was designed to examine effects of early

exposure to negative contingencies on maintenance during .10/.05 using a within-subject

preparation.

Experiment 3 arranged a two-component multiple schedule in which each component was

associated with different stimuli that signaled the presence of different contingencies. Early

exposure to negative contingencies was provided in component 2 while only positive

contingencies were implemented in component 1. Later conditions arranged positive









contingencies in both components to test for differences in acquisition during .10/.00 and

maintenance during .10/.05. For one subject, 2302, a clear but very temporary difference was

obtained in maintenance during .10/.05 but only minor differences were obtained during

acquisition under .10/.00 (and those minor differences were in the direction opposite from that

expected given the results of Experiments 1 and 2). For the other two subjects, 2406 and 2401,

no clear differences were obtained during either maintenance or acquisition. The temporary

suppression observed in component 2 during .10/.05 for subject 2302 was different from the

more persistent suppression observed for subjects 1901 and 1903 in Experiment 2. However,

temporary history effects are not entirely uncommon when evaluated within subject (e.g.,

Freeman & Lattal, 1992). One interpretation for the lack of a robust effect was that the first

condition in this experiment exposed the subjects to a positive contingency. In addition,

alternation between the two components of the multiple schedule during the test condition (when

both components were set to .10/.05) may have produced carryover effects. Thus, rapid

alternation between the component previously associated with a positive contingency and the

component previously associated with the negative contingency may have made it difficult to

detect differences that would have otherwise been apparent had the components been presented

in isolation.

Experiment 4 showed how responding decreased as a result of gradually weakening

contingencies. For every subject, responding decreased gradually as contingencies were

weakened. For some of the subjects (1903, 2004, and 2005), responding seemed to persist

longer during weak contingencies when smaller changes in contingencies were used and

following repeated exposure to previously experienced contingency values.









Results of these experiments have potential implications for the acquisition and

maintenance of appropriate and problem behavior. The results of Experiment 1 suggested that

acquisition might be suppressed by reinforcers that are presented following the nonoccurrence of

behavior. Environments in which reinforcers are freely available or presented noncontingently

may be harmful in the sense that they might prevent the acquisition of appropriate behavior. The

notion that "free" reinforcers may be harmful is not new (cf. Ayllon & Michael, 1959; Burgio et

al., 1986). However, the current conceptualization of contingencies refines that notion to the

degree that predicting an environment's effects on acquisition requires knowledge of both a) the

probability of a reinforcer given the occurrence of a response and b) the probability of a

reinforcer given the nonoccurrence of a response. Likewise, a key to the success of early

childhood intervention programs may lie in the combination of few freely available reinforcers

and many response-produced reinforcers.

Inversely, "free" reinforcers might help prevent the acquisition of problem behavior. That

is, reinforcers delivered in the absence of problem behavior might suppress acquisition even

though reinforcers would be more likely following the occurrence of behavior. In the initial

complex positive contingency evaluated in Experiment 1, .10/.05, the delivery of a reinforcer

following behavior was twice as likely compared to the delivery of a reinforcer following the

absence of behavior. This may be important, especially considering that certain kinds of

problem behavior (e.g., aggression) have a high likelihood of producing potential reinforcement

(such as attention) from others (Thompson & Iwata, 2001). If caregivers or other members of an

individual's social environment are unable or unlikely to implement extinction (withholding the

reinforcer maintaining behavior following the occurrence of behavior), the current results










suggest that even a modest amount of reinforcement presented in the absence of behavior may

serve to "inoculate" against the development of some forms of severe problem behavior.

Results of Experiment 2 suggested that behavior previously exposed to a negative

contingency was less likely to be acquired under subsequent positive contingencies and less

likely to persist at high rates under subsequent complex positive contingencies. The purpose of

Experiment 3 was to evaluate these effects within subject, although the results were not

compelling. Numerous possible interpretations of the results make applied implications tenuous.

However, results for subject 2302 provide some support for the notion that, perhaps for certain

individuals, extended early experience with negative contingencies may reduce the persistence of

behavior following acquisition. One implication with respect to the maintenance of appropriate

behavior is that the requirements for treatment integrity may need to be that much more stringent. For

example, appropriate responses should be selected on the basis of contacting a high likelihood of

reinforcement from members of the community. In addition, perhaps caregivers could be trained

not to provide reinforcers in the absence of appropriate behavior in the hopes of further

strengthening the reinforcement contingency when appropriate behavior does in fact occur.

Results of Experiment 4 suggest that procedures could be developed to improve the

likelihood that appropriate behavior will persist in environments unlikely to support such

behavior. Experiment 4 showed that as the probability of a reinforcer given a response decreased

and the probability of a reinforcer given no response increased (i.e., as the contingency became

more negative), behavior tended to decrease. In addition, there was some evidence to suggest

that smaller changes in the contingency values maintaining behavior were more likely to result in

persistence as compared to larger changes. Therefore, once individuals have acquired some

appropriate behavior, therapists might use a similar approach to a) identify the point where










behavior becomes suppressed, b) use such information as a baseline to compare the effects of

treatments designed to improve maintenance, and c) include making small changes in

contingency values as one component of a maintenance program.

Some limitations of the present experiments deserve comment. First, the use of 1-s cycle

durations was arbitrary but ultimately was derived as a compromise between two competing

problems. Shorter cycle durations might be ideal because they reduce the time between responses

and subsequent programmed reinforcer deliveries. Using a 1-s cycle duration meant that even if

the probability of a reinforcer given a response was 1.0, there may have been up to nearly a 1-s

delay between a response and a subsequent reinforcer delivery. The possibility of uncontrolled

delays between responses and programmed reinforcers introduces an additional source of

variation. However, previous researchers have shown that delays between 0.5 s and 1.0 s can

actually produce increases in responding relative to a no-delay baseline (Sizemore & Lattal,

1978; Lattal & Ziegler, 1982). On the other hand, shorter cycle durations suffer from additional

problems. One problem with short cycle durations is that behavior occurring in one cycle may

be more likely affected by reinforcers delivered in a subsequent cycle for the "nonoccurrence" of

behavior. Given that the 1-s cycle had been used by Hammond (1980), it was selected as a

reasonable beginning point and future contingency research could test larger and smaller

intervals.

A parallel problem lies in the arbitrary definition of the nonoccurrence of behavior as any

cycle in which behavior does not occur. Why should the nonoccurrence of behavior be defined

as a 1-s period in which behavior does not occur? If the nonoccurrence of behavior were defined

as some period of time greater than 1 s in which behavior did not occur, then the interval unit is

considerably different from the amount of time it takes the organism to make a response.









Conversely, if the nonoccurrence of behavior were defined as some period of time less than 1-s

in which behavior did not occur, then the aforementioned problem of short delays to

reinforcement for previously emitted responses is pertinent. This discussion suggests that future

work should investigate how different interval sizes and different definitions of the nonoccurrence

of behavior interact to influence the effects of contingencies. A cycle duration of 1 s

produced relatively orderly data in the current experiment; however, larger cycle durations may

be more appropriate in cases where the duration of behavior exceeds 1 s. Alternatively, similar

procedures may be evaluated in a trial-based format with explicit stimuli that signal the

beginning and end of each trial and reinforcers that are presented immediately following the

occurrence of behavior.
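
To make the arbitrariness of the window concrete, the brief sketch below (using hypothetical response timestamps and illustrative window sizes, not data from these experiments) counts how many intervals of a given length would be classified as containing no responses for the same record of behavior.

```python
def count_window_types(response_times, session_length, window):
    """Classify consecutive windows of a given length as containing at least
    one response or no responses; return (response_windows, no_response_windows)."""
    n_windows = int(session_length // window)
    with_response = 0
    for i in range(n_windows):
        start, end = i * window, (i + 1) * window
        if any(start <= t < end for t in response_times):
            with_response += 1
    return with_response, n_windows - with_response

# Hypothetical 60-s record with responses clustered early in the session.
times = [1.2, 1.8, 2.5, 10.0, 30.3, 30.9]
for w in (0.5, 1.0, 5.0):
    print(w, count_window_types(times, 60, w))
```

Shrinking the window multiplies the number of nonoccurrence intervals (and thus the opportunities for reinforcers to be delivered shortly after responses), whereas enlarging it makes the unit much longer than the response itself.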

In addition to the implications for the acquisition and maintenance of behavior, the current

conceptualization of reinforcement contingencies may also provide a means for applied behavior

analysts to increase the overall rate of reinforcement for a given procedure without necessarily

sacrificing effectiveness. Once appropriate behavior has been acquired, it might be possible to

increase the probability of a reinforcer given no response by a small amount while maintaining

similar rates of behavior. Increasing the rate of reinforcers provided following the

nonoccurrence of a response, and thus the overall rate of reinforcement, might be beneficial for

reasons other than its effects on the target response. Environments with few response-

independent reinforcers may appear sparse and barren to untrained individuals. In contrast,

procedures with a higher likelihood of reinforcers for the nonoccurrence of behavior (and higher overall

rates of reinforcement) might be judged as more acceptable by parents, caregivers, and other

members of the community. Also, procedures with higher rates of reinforcement might be

preferred by the individual receiving the treatment; thus, the individual might be less likely to

avoid the treatment and caregivers associated with its implementation.

Viewing contingencies as the relationship between the probability of a reinforcer given a

response and the probability of a reinforcer given the nonoccurrence of a response may provide a

bridge for understanding reinforcement in both experimental and non-experimental contexts.

Descriptive analyses are procedures that describe the relationship between behavior and the

environment in circumstances in which the environment is not under the

control of the experimenter. Most research and understanding of reinforcement is based on

reinforcement as implemented using traditional schedules (e.g., FR, VI). However, if reinforcers

are presented both following behavior and following the nonoccurrence of behavior in some

uncontrolled context (as has been shown by Samaha et al., in press), then interpreting the overall

schedule of reinforcement as something like VR or VI can be complicated and problematic.

Reliance on traditional schedule nomenclature has therefore restricted the understanding of

reinforcement to contexts in which schedules can be controlled. A conceptualization of

reinforcement based on the probability of a reinforcer given a response and the probability of a

reinforcer given no response can therefore bridge the concept of reinforcement across both

experimental and non-experimental (descriptive) contexts.
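
As a rough illustration of how this conceptualization could be applied to descriptive data, the sketch below estimates the two conditional probabilities from an observation record that has already been coded into cycles. The coding format (one pair of response and reinforcer indicators per cycle) is an assumption made for the example, not a description of the procedures used in the studies cited above.

```python
def contingency_from_record(cycles):
    """Estimate P(Sr|R) and P(Sr|~R) from a cycle-coded observation record.

    `cycles` is a sequence of (responded, reinforcer_delivered) boolean pairs,
    one per observation cycle.
    """
    with_response = [sr for r, sr in cycles if r]
    without_response = [sr for r, sr in cycles if not r]
    p_sr_given_r = (sum(with_response) / len(with_response)
                    if with_response else float("nan"))
    p_sr_given_no_r = (sum(without_response) / len(without_response)
                       if without_response else float("nan"))
    return p_sr_given_r, p_sr_given_no_r

# Hypothetical coded record: (response occurred, reinforcer delivered) per cycle.
record = [(True, True), (True, False), (False, False),
          (False, True), (True, True), (False, False)]
print(contingency_from_record(record))  # here P(Sr|R) > P(Sr|~R): a positive contingency
```

Comparing the two estimates then classifies the observed relation as positive, neutral, or negative in the same terms used throughout these experiments.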

LIST OF REFERENCES


Ayllon, T., & Michael, J. (1959). The psychiatric nurse as a behavioral engineer. Journal
of the Experimental Analysis of Behavior, 2, 323-334.

Baer, D. M., Wolf, M. M., & Risley, T. R. (1968). Some current dimensions of applied
behavior analysis. Journal of Applied Behavior Analysis, 1, 91-97.

Borrero, J. C., Vollmer, T. R., van Harren, F., Haworth, S., & Samaha, A. L. (in prep). A
multiple schedule evaluation of behavior on fixed-time and fixed-interval schedules.

Borrero, J. C., Vollmer, T. R., & Wright, C. S. (2002). An evaluation of contingency
strength and response suppression. Journal of Applied Behavior Analysis, 35, 337-347.

Burgio, L. D., Burgio, K. L., Engel, B. T., & Tice, L. M. (1986). Increasing distance and
independence of ambulation in elderly nursing home residents. Journal of Applied Behavior
Analysis, 19, 357-366.

Campbell, D. T., & Stanley, J. C. (1963). Experimental and quasi-experimental designs for
research. In N. L. Gage (Ed.), Handbook of research on teaching. Chicago: Rand McNally.

Catania, A. C. (1983). Learning. Second Edition. Englewood Cliffs, New Jersey:
Prentice-Hall, Inc.

Catania, A. C. (1988). Learning. Third Edition. Englewood Cliffs, New Jersey: Prentice-
Hall, Inc.

Edwards, D. D., Peek, V., & Wolf, F. (1970). Independently delivered food decelerates
fixed-ratio rates. Journal of the Experimental Analysis of Behavior, 14, 301-307.

Ferster, C. B., & Skinner, B. F. (1957). Schedules of Reinforcement. New York: Appleton-
Century-Crofts.

Freeman, T. J., & Lattal, K. A. (1992). Stimulus control of behavior history. Journal of
the Experimental Analysis of Behavior, 57, 5-15.

Goh, H. L., Iwata, B. A., & DeLeon, I. G. (2000). Competition between noncontingent
and contingent reinforcement schedules during response acquisition. Journal of Applied
Behavior Analysis, 33, 195-205.

Hammond, L. J. (1980). The effect of contingency upon the appetitive conditioning of
free-operant behavior. Journal of the Experimental Analysis of Behavior, 34, 297-304.

Iwata, B. A., Dorsey, M. F., Slifer, K. J., Bauman, K. E., & Richman, G. S. (1982/1994).
Toward a functional analysis of self-injury. Journal of Applied Behavior Analysis, 27, 197-209.

Koegel, R. L., & Rincover, A. (1977). Research on the difference between generalization
and maintenance in extra-therapy responding. Journal of Applied Behavior Analysis, 10, 1-12.

Lattal, K. A. (1972). Response-reinforcer independence and conventional extinction after
fixed-interval and variable-interval schedules. Journal of the Experimental Analysis of Behavior,
18, 133-140.

Lattal, K. A. (1974). Combinations of response-reinforcer dependence and independence.
Journal of the Experimental Analysis of Behavior, 22, 357-362.

Lattal, K. A., & Bryan, A. J. (1976). Effects of concurrent response-independent
reinforcement on fixed-interval schedule performance. Journal of the Experimental Analysis of
Behavior, 26, 495-504.

Lattal, K. A., & Maxey, G. C. (1971). Some effects of response-independent reinforcers in
multiple schedules. Journal of the Experimental Analysis of Behavior, 16, 225-231.

Lattal, K. A., & Ziegler, D. R. (1982). Briefly delayed reinforcement: An interresponse
time analysis. Journal of the Experimental Analysis of Behavior, 37, 407-416.

Marcus, B. A., & Vollmer, T. R. (1996). Combining noncontingent reinforcement and
differential reinforcement schedules as treatment for aberrant behavior. Journal of Applied
Behavior Analysis, 29, 43-51.

McGonigle, J. J., Rojahn, J., Dixon, J., & Strain, P. S. (1987). Multiple treatment
interference in the alternating treatments design as a function of the intercomponent interval
length. Journal of Applied Behavior Analysis, 20, 171-178.

Rescorla, R. A. (1967). Pavlovian conditioning and its proper control procedures.
Psychological Review, 74, 71-80.

Rescorla, R. A., & Skucy, J. C. (1969). Effects of response-independent reinforcers during
extinction. Journal of Comparative and Physiological Psychology, 67, 381-389.

Reynolds, G. S. (1968). A Primer of Operant Conditioning. Glenview, Illinois: Scott,
Foresman.

Samaha, A. L., Vollmer, T. R., Borrero, J. C., Sloman, K. N., St. Peter Pipken, C., &
Bourret, J. (in press). Journal of Applied Behavior Analysis.

Sizemore, O. J., & Lattal, K. A. (1978). Unsignaled delay of reinforcement in variable-
interval schedules. Journal of the Experimental Analysis of Behavior, 30, 169-175.

Thompson, R. H., & Iwata, B. A. (2001). A descriptive analysis of social consequences
following problem behavior. Journal of Applied Behavior Analysis, 34, 169-178.

Thompson, R. H., Iwata, B. A., Hanley, G. P., Dozier, C. L., & Samaha, A. L. (2003). The
effects of extinction, noncontingent reinforcement, and differential reinforcement of other
behavior as control procedures. Journal of Applied Behavior Analysis, 36, 221-238.

Vollmer, T. R., Borrero, J. C., Borrero, C. S., Van Camp, C., & Lalli, J. S. (2001).
Identifying possible contingencies during descriptive analyses of severe behavior disorders.
Journal of Applied Behavior Analysis, 34, 269-287.

Zeiler, M. D. (1968). Fixed and variable schedules of response-independent
reinforcement. Journal of the Experimental Analysis of Behavior, 11, 404-414.

BIOGRAPHICAL SKETCH

I completed my undergraduate degree in 2001 at the University of Florida, where I majored

in psychology. I began the doctoral program in behavior analysis in the same department that

summer. My master's thesis examined a method for describing interactions between children

and their caregivers. My primary interests are in the areas of assessment and treatment of severe

problem behavior and the use of animal and human operant preparations to address issues related

to the assessment and treatment of severe problem behavior. I also hope to expand the

range of problems to which behavior analysis can be suitably applied during my career.





PAGE 1

1 CONTINGENCY VALUES OF VARYI NG STRENGTH AND COMPLEXITY By ANDREW LAWRENCE SAMAHA A DISSERTATION PRESENTED TO THE GRADUATE SCHOOL OF THE UNIVERSITY OF FLOR IDA IN PARTIAL FULFILLMENT OF THE REQUIREMENTS FOR THE DEGREE OF DOCTOR OF PHILOSOPHY UNIVERSITY OF FLORIDA 2008

PAGE 2

2 2008 Andrew Lawrence Samaha

PAGE 3

3 To my dog, Enzo.

PAGE 4

4 ACKNOWLEDGMENTS I would like to thank the faculty, staff, and students of the University of Florida Psychology Departm ent for contributing toward my education in psychology and behavior analysis and challenging me to become a better st udent and scientist. Above all others, I would like to express my sincerest gratitude to Dr. Timothy Vollmer for his support, guidance, and patience as my faculty advisor. I would also like to thank the members of my dissertation committee for their attention and feedback for which I am extremely grateful and honored: Dr. Timothy Hackenberg, Dr. Brian Iwata, Dr David Smith, and Dr. Colette St. Mary. I would also like to acknowledge Stephen Haworth and Dr. Frans van Haaren for helping to establish the lab in which this research was conducted, Dr. Jonathan Pinkston and Dr. Jin Yoon for their innumerable contributions during my early development as a student, and Dr. Gregory Hanley and Dr. Rachael Thompson for en couraging me to pursue a career in Behavior Analysis.

PAGE 5

5 TABLE OF CONTENTS page ACKNOWLEDGMENTS...............................................................................................................4 LIST OF TABLES................................................................................................................. ..........7 LIST OF FIGURES.........................................................................................................................8 ABSTRACT...................................................................................................................................10 CHAP TER 1 INTRODUCTION..................................................................................................................12 Brief History of Reinforcement..............................................................................................12 Considering the Occurrence and Nonoccurrence of Behavior............................................... 18 An Analogy in Respondent Conditioning............................................................................... 20 Previous Research on Complex Continge ncies of (Operant) Reinforcement ........................ 23 Translational Research......................................................................................................... ...26 Goals of the Current Research................................................................................................ 27 2 EXPERIMENT 1....................................................................................................................29 Purpose...................................................................................................................................29 Method....................................................................................................................................29 Subjects............................................................................................................................29 Apparatus.........................................................................................................................30 Procedures..................................................................................................................... ..30 Conditions........................................................................................................................31 Results and Discussion......................................................................................................... ..32 3 EXPERIMENT 2....................................................................................................................37 Purpose...................................................................................................................................37 Method....................................................................................................................................37 Subjects and Apparatus................................................................................................... 37 Procedures..................................................................................................................... ..37 Conditions........................................................................................................................37 Results and Discussion......................................................................................................... ..38

PAGE 6

6 4 EXPERIMENT 3....................................................................................................................42 Purpose...................................................................................................................................42 Method....................................................................................................................................42 Subjects and Apparatus................................................................................................... 42 Procedures..................................................................................................................... ..42 Conditions........................................................................................................................43 Results and Discussion......................................................................................................... ..43 5 EXPERIMENT 4....................................................................................................................49 Purpose...................................................................................................................................49 Methods..................................................................................................................................49 Subjects and Apparatus................................................................................................... 49 Procedures..................................................................................................................... ..50 Conditions........................................................................................................................50 Results and Discussion......................................................................................................... ..51 Subject 2003....................................................................................................................51 Subject 2004....................................................................................................................54 Subject 1903....................................................................................................................57 Subject 2001....................................................................................................................59 Subject 2005....................................................................................................................61 6 GENERAL DISCUSSION..................................................................................................... 65 LIST OF REFERENCES...............................................................................................................72 BIOGRAPHICAL SKETCH.........................................................................................................75

PAGE 7

7 LIST OF TABLES Table page 1-1 Contingencies for each condition in Hammond (1980)..................................................... 24 4-1 Contingencies for each c ondition of Experim ent 3............................................................ 43

PAGE 8

8 LIST OF FIGURES Figure page 2-1 Experiment 1: All Sessions................................................................................................ 33 3-1 Experiment 2: All Sessions................................................................................................ 40 4-1 Experiment 3: All Sessions................................................................................................ 46 5-1 Experiment 4: Sequence of Conditions.............................................................................. 51 5-2 Experiment 4: Subject 2003............................................................................................... 53 5-3 Experiment 4: Subject 2004............................................................................................... 56 5-4 Experiment 4: Subject 1903............................................................................................... 58 5-5 Experiment 4: Subject 2001............................................................................................... 60 5-6 Experiment 4: Subject 2005............................................................................................... 62

PAGE 9

9 LIST OF ABBREVIATIONS DRO Differential reinforcement of other behavi or. This is a common treatment for problem behavior whereby reinforcer s are arranged to follow some period of time in which problem behavior does not occur. FI Fixed-interval schedule. This is a schedule of reinforcement whereby a reinforcer is delivered following the first instance of behavior after a fixed-amount of time has elapsed. For example, FI-30 would mean th at the first response after 30 s would be reinforced. FR Fixed-ratio schedule. This is a schedule of reinforcement whereby a reinforcer is delivered following the nth instance of behavi or. For example, FR-30 would mean that the 30th response would produce a reinforcer. NCR Noncontingent reinforcement. This is a common treatment for problem behavior whereby reinforcer are arranged independent of behavior, usually according to the passage of time (e.g., every 30 s). VI Variable-interval schedule. This is a schedule of reinfor cement whereby a reinforcer is delivered following the first response after so me variable interval of time has elapsed. That amount of time centers around an aver age determined by an experimenter-set distribution. VR Variable-ratio schedule. This is a schedule of reinforcement whereby a reinforcer is delivered following, on average, the nth instance of behavior. The exact response requirement changes from trial to trial accord ing to some experimenter-set distribution.

PAGE 10

10 Abstract of Dissertation Pres ented to the Graduate School of the University of Florida in Partial Fulfillment of the Requirements for the Degree of Doctor of Philosophy CONTINGENCY VALUES OF VARYI NG STRENGTH AND COMPLEXITY By Andrew Lawrence Samaha August 2008 Chair: Timothy R. Vollmer Major: Psychology Precise control over the reinforcers that fo llow behavior and the reinforcers that are presented in the absence of behavior may help to provide a clearer unders tanding of the role of response-dependent and response-in dependent reinforcers. Four experiments examined lever pressing in rats as a function of a contingency for the delivery of sucrose pellets. Contingencies were arranged by manipulating the probability of a reinforcer given a response and the probability of a reinforcer given no response. Experiment 1 examined acquisition and main tenance of lever pr essing during positive contingencies (where the probability of a rein forcer given a response was higher than the probability of a reinforcer given no response) and complex positive contingencies (a positive contingency where the probability of a reinforcer given no response is greater than zero). Results indicated lever pressing was not acqui red under the complex positive contingency, was acquired under the positive contingency, but persisted during a return to the complex positive contingency for all three subjects. In Experiment 2, subjects were exposed to the same sequence of conditions as subjects in Experiment 1 but after first experiencing nega tive (.00/.10) and complex negative contingencies (.05/.10). In general, results of Experiment 2 were similar to the results of Experiment 1 except

PAGE 11

11 that responding did not persist during the s econd exposure to .10/.05 fo r two subjects and, for one subject, acquisition during the positive continge ncy was more difficult to obtain than for any of the subjects in Experiment 1. In Experiment 3, a two-component multiple schedule was arranged where one component was associated with early exposure to a negati ve contingency while the other component was associated with only positive contingencies. Results indicated that, overall, the multiple schedule method did not detect diffe rences in subsequent responding. In Experiment 4, the effects of a gradual shift from a positive to a negative contingency were examined. Results indicated that lever pressing decreased accord ingly as contingencies became more negative. In addition, maintenance under negative contingencies was more likely when smaller contingency changes were made from one condition to another. All of the results are discussed in terms of unders tanding naturally occu rring schedules of reinforcement in the acquisition and maintenance of appropriate and problematic human behavior.

PAGE 12

12 CHAPTER 1 INTRODUCTION The f ollowing experiments examined acquisition and maintenance of lever pressing in rats. The purpose of the research was to investigat e contingencies of reinforcement in which reinforcers are presented both following behavior and following periods of time in which behavior did not occur. Although reinfor cement contingencies are commonly arranged experimentally such that a response must occur to produce a reinforcer, based on prior research in applied behavior analysis, it is likely that in nature a blend of events occur following both the occurrence and nonoccurrence of behavior occur. A better understanding of such contingencies has important implications for understanding acquisition of both problem and appropriate behavior in the development of human behavioral repertoires. Brief History of Reinforcement In Schedules of Reinforcement (1957), Ferster and Skinner categorized hundreds of variations on relations between behavior and environm ent, known as reinforcement schedules. Investigating such relations involved arranging contingencies betw een behavior and features of the apparatus that could exert c ontrol over behavior. The continge ncies took the form of if-then relations with some specification of behavior or time and behavior affecting some feature of the environment. For example, if a response key in a pigeon chamber was pecked 25 times, a solenoid was then activated to ra ise a hopper filled with grain. Or, the first response after 15 s resulted in hopper access. These procedures ha ve come to be known as a fixed-ratio (FR) and fixed-interval (FI) schedules of reinforcement, respectively. Fixed-ratio schedules specify that reinforcers are to be delivered following some fixed number of responses. Examples of FR schedules include piece-work reimbursement system s in which workers are paid for completing a set amount of work. Fixed-interval schedules specify that reinforcer s are to be delivered

PAGE 13

13 following the first response after some fixed peri od of time. For example, an FI-10 min schedule specifies that the first response after 10 min will produce a reinforcer. In addition to FR and FI schedules, Ferster and Skinner also examined th e effects of varying th e response requirement around some average following every reinforcer de livery. These were referred to as variableratio (VR) and variable-interval (V I) schedules. Variable-ratio sc hedules specify that reinforcers are delivered following, on average, the nth response but the exact number of responses necessary to produce each individual reinforcer is unpredictable. Th e exact distribution of response requirements is controlled by the expe rimenter. Variable-interval schedules specify that reinforcers are delivered following the firs t response after the pass age of some variable length of time. Similar to VR schedules, the length of time is centere d on some average value but varies unpredictably from reinforcer to reinforcer. Reynolds (1968) made an important distinct ion that further extends the notion of reinforcement schedules. In his text, A Primer of Operant Conditioning, Reynolds wrote about the difference between dependencies and continge ncies. According to Reynolds, dependencies describe relations in which some consequence occurs if and only if behavior occurs. All of the schedules described in Schedules of Reinforcement arranged dependencies. For example, the mechanical delivery of grain in an operant cham ber may be dependent on a key press. Turning on the light in ones office is dependent upon hitting the light-switch. And, according to Reynolds, contingencies describe the obtained relations found in the environment, including those that occur as a re sult of dependencies and those that oc cur for other reasons. For example, a reinforcer may be programmed to occur every 60 s whether or not be havior happens. Suppose that, by accident, a response occurs at sec ond 59. This accidental contingency may produce a

PAGE 14

14 reinforcement effect and the relation may be e xpressed as a reinforcement contingency despite the fact that there is no dependency between behavior and the delivery of reinforcers. In our day to day lives, behavior can enter into relations that likely consist of a blend of dependencies, accidental pairing, and events that follow periods with no behavior. In order to understand these kinds of relations, a method or framework must be esta blished to integrate them. Consider the behavior one person (Alber t) might engage in to get another persons (Janes) attention (for the purpose of the example, assume that attention is a reinforcer). For example, Albert might say Hello or attempt to make eye-contact with Jane. Through observation and experimentation, it might be possible to show that making eye contact is reinforced on about every other occasion. Th is approximates something like a random-ratio schedule where each response is associated with a .5 probability of being followed by a reinforcer. But, what if Jane initiates a convers ation with Albert before Albert had a chance to do anything? How should this extra attention be conceptualized? There are a few possibilities. One is that a reinforcement effect will occur mere ly as a result of the c ontiguity, or brief delay, between behavior and the subseque nt attention. The other is that a reinforcement effect for eyecontact would result if ey e-contact was correlated with an incr ease in the probabili ty of receiving attention over the background probability of attention. Skinner and others certainly r ecognized that there was value in examining the effects of reinforcers that were delivered for free or independent of behavior. For example, Zeiler (1968) examined the effects of what he termed response-independent schedules of reinforcement. Zeiler exposed pigeons to fixed-time (FT) and va riable-time (VT) schedules where reinforcers were delivered according to either a fixed duration of time that did not change from reinforcer to reinforcer or a quasi-random dur ation that changed from reinfo rcer to reinforcer but whose

PAGE 15

15 average remained constant across sessions. Re sponding in the context of FT and VT schedules was evaluated after pigeons first experienced FI and VI schedules. The effect of both schedules was to produce a decrease in the rate of responding howev er the FT schedule produced accelerated patterns of responding just prior to reinfo rcer delivery. This increase was attributed to adventitious reinforcement or, the strength ening of behavior because it happens to occur contiguously with or in close te mporal proximity to reinforcement. (p. 412) That is, the pattern of responding established during FI was maintained during the s ubsequent FT condition despite the lack of a dependency between responding and re inforcer delivery. The absence of systematic patterns observed during exposure to VT was interpreted to have been caused by the strengthening of behavior other than key-pecking as a result of unpredictable intervals between reinforcers. Additional experiments followed Zeilers (1968) examination including Lattal and Maxey (1971). Lattal and Maxey evaluated responding duri ng VT schedules using a multiple schedule. Multiple schedules involve the a lternation between two conditions (or, components) within the same session. Each component is associated with a unique stimulus or set of stimuli. In Lattal and Maxeys first experiment, both components were initially set to VI sche dules (Mult VI VI). In later conditions, both components changed to VT schedules (but, at different points in the experiment). Responding during the VT component persisted l onger when the other component was VI. In addition, responding was higher in th e component that was most recently a VI schedule, suggesting that responding during th e VT schedule was par tly a function of the response strength in the previous condition. In the second experiment, responding was examined following a transition from Mult VI VI to Mult VI Ext (extinction) and then Mult Ext Ext with occasional 1-session probe evaluations of Mult VT VT. Although extinction typically produces

PAGE 16

16 complete suppression of behavior, responses ma intained at approximately 10 responses per minute, indicating that responding during the VT condition would have likely produce responses contiguous with reinforcer presentation. Hence, at least some of the re sponse persistence during VT might be attributed to adventitious reinforcement. Other researchers noted that the pattern produced by the previous response-dependent schedule could influence the like lihood of adventitious reinfor cement in subsequent responseindependent conditions. For exam ple, Rescorla and Skucy (1969) suggested that relatively high rates could be obtained in FT following FI because exposure to FI sche dules typically produces rates of behavior that increase prior to reinforcer delivery. Therefore, response-independent reinforcers delivered at the same frequency w ould likely follow similar local increases in responding. Similarly, Lattal (1972) concluded that relative to FT VT does not produce rates of responding as high as its respons e-dependent counterpart (VI) be cause the VT presentation of reinforcers is more likely to occur during some behavior other than lever pressing. In an attempt to understand the relative cont ributions of dependency and contingency to responding, several investigators examined schedules that combined features of both. Edwards, Peek, and Wolfe (1970) compared rates of res ponding in FR, FT, conjoint FR FT (where reinforcers were delivered followi ng a fixed number of responses and fixed-periods of time), and extinction (where reinforcers we re not delivered during the sessi on). Edwards et al. found that the effects of adding response-independent sc hedules on top of existing response-dependent schedules produced relatively sma ll decreases in behavior compared to either extinction or FT. In addition, as the rate of res ponse-independent reinforcement was increased (or, as the intervals of the FT schedule were decreased) and th e response-requirement for response-dependent reinforcers remained fixed during the conjoi nt FR FT condition, response rate decreased.

PAGE 17

17 Lattal (1974) examined schedules in which the percentage of reinforcers delivered according to a variable schedule were responsedependent (while the remainder were responseindependent). This was accomplished by either making every 3rd, 10th, or all reinforcer deliveries dependent on a response. When res ponse-dependent reinforcers were available, response-independent deliveries were suspended until after the first response occurred. In addition, the proportions of res ponse-dependent reinforcers were examined in both ascending and descending series. Results suggested that response rates d ecreased as the percentage of response-dependent reinforcers decreased. Lattal and Bryan (1976, Experiment 1) ex amined effects of delivering responseindependent reinforcers according to a VT sche dule on top of existing FI performance using a conjoint FI VT schedule. The experimenters manipulated the rate of reinforcer presentation on the VT schedule while keeping the FI schedule constant. In genera l, the results suggested that VT reinforcer delivery disrupted both the patter n and rate of responding established by the FI schedule. That is, the positively accelerated rates observed prior to reinforcement on the FI schedule became more linear when VT reinforcer we re introduces. In addition, the overall rate of responding decreased during the session. However, the authors not ed that in some cases, the addition of response-independent re inforcement had either no clear effect or increased rates of responding. The authors suggested that the uncontrolled temporal contiguity of responses and reinforcers delivered according to the VT schedule may have contributed to the lack of consistent effects. Additionally, more recent applied studies ha ve shown that responding may persist when response-independent reinforcers are delivered on top of an existing response-dependent schedule. For example, Marcus and Vollm er (1996) evaluated whether appropriate

PAGE 18

18 communication behavior would persist following training if the reinforcers maintaining appropriate communication (and pr oblem behavior) were delivered according to a fixed-time schedule. Once appropriate behavior was establ ished and problem behavior remained low, the rate of fixed-time presentation was decreased across sessions. The results showed that appropriate communication pers isted despite the fixed-time delivery of reinforcers. Additionally, this effect was replicated by Goh, Iwata, and DeLeon (2000). Considering the Occurrence and Nonoccurrence of Behavior One feature common to schedules in which reinf orcers are delivered following either responses or following the passage of time is that reinforcers delivered ac cording to the latter might still follow responses closely in time. This becomes a problem because reinforcers can have different effects depending on whether or not they follow behavior. In addition, these different effects can occur independent of whet her or not the behavior actually triggered the delivery (i.e., there does not n eed to be a dependency between behavior and a subsequent event for the behavior to be affected by it). So, a conceptualization of rein forcement that includes those reinforcers that happen af ter behavior and those reinforcer s that happen after some period of time (regularly or irregular ly) is inadequate because some proportion of those latter reinforcers will inevitably follow behavior. Furthermore, that proportion (of reinforcers delivered according to a time-based schedule that accidentally follow behavior) is not controlled by the experimenter but instead, by the organisms behavior. Therefore, to study contingencies similar to those found in the natu ral environment, there must be control over the delivery of reinforcers following the occurrence of behavior and the delivery of reinforcers following the nonoccurrence of behavior. Fortunately, nomenclature and conceptualizations that supports such a framework already exist.

PAGE 19

19 Catania (1988) described in his text Learning the fundamental process and procedures known as reinforcement. He noted that a protot ypical study on reinforcement might compare the effects of exposing the animal to two conditions : a baseline, where the animal receives no food and a reinforcement condition, where the animal r eceives food after each instance of behavior. The conditions might alternate back and forth a few times so that the experimenter is convinced it is the reinforcement causing the increase in behavior and not some other, uncontrolled variable. Following such an experiment, the da ta might reveal that responding remained low during the initial baseline condition, increased during the reinforcement condition, then decreased back down to previous levels during the subsequent baseline condition, and so on. To some, it may seem like a clear demonstration that reinforcement was responsible for the increase in behavior, but Catania noted two changes occu rring during the transition back to baseline: 1) the relationship between behavior and food and 2) the mere presence of food in the session. In light of that limitation, an alternative explanation for the obtained increase in behavior might be that the food had a general tendency to increase the activity of the animal, which produced not only an increase in the measured be havior but in other, unmeasured behavior as well. To address this, Catania described an alternative contro l condition where, instead of not delivering reinforcers at all, food is to be delivered for both the occurrence and nonoccurrence of behavior. He expressed these terms probabilistically su ch that, for the reinforcement condition, the probability of a reinforcer give n a response was 1.0 and the probabi lity of a reinforcer given no response was 0 and in the extinction condition, both probability terms would be equal. In addition, Catanias (1988) c onceptualization provides a he uristic for anticipating the effects of complex contingencies (for l ack of a better term, complex is used here to describe contingencies where both the probability of a rein forcer given a response a nd the probability of a

PAGE 20

20 reinforcer given no response are greater than zer o). Referring back to the above example using eye-contact, the probability of receiving atten tion given eye-contact was .5 but sometimes attention was delivered in the absence of eye-cont act. Catanias conceptua lization allows us to evaluate the contingency if we al so express the attention that is delivered in the absence of eyecontact as a probability. If the probability of attention given ey e-contact is greater than the probability of attention given no eye-contact, Catanias framework would predict that eyecontact would be strengthened as a result of reinforcement. C onversely, if the probability of attention given eye-contact is less than or equa l to the probability of attention given no eyecontact, Catanias framework would predict that eye-contact would not be strengthened. The conceptualization might be helpful for improvi ng our understanding of contingencies similar to those found outside the laboratory. An Analogy in Respondent Conditioning Perhaps not coincid ently, a similar conceptualiz ation of contingencies has been useful for understanding respondent conditioning. Rescorla (1967) wrote about confounds present in common control conditions during tests of re spondent conditioning. Respondent conditioning (sometimes called Pavlovian conditioning) descri bes conditioning in which a neutral stimulus comes to produce effects similar to those of an unconditioned stimulus (US) as a result of operations often simply (and inadequately) de scribed as pairing. Effects of respondent conditioning are demonstrated by comparing a subjects responses to the CS (conditioned stimulus) following a test condition (in which the CS and US are paired) and a control condition. Popular control procedures pre-dating Rescor las publication involve d some variation of presenting both the CS and the US but, in a ma nner that was directly contrary to the test condition. That is, US were ofte n presented before CS such that presentation of the CS was never predictive of an upcoming presentation of the US. Rescorla made two arguments: 1) the

PAGE 21

21 only difference between the effect s of the test and c ontrol conditions should be the contingency necessary to produce conditioning and 2) many of the commonly used control conditions included two changes: the removal of one cont ingency and the addition of another. For Rescorla, the constraints placed on the relation be tween the CS and the US in typical control conditions constituted a procedural difference beyond the mere absence of the contingency responsible for conditioning. Therefore, th e ideal control condition was one in which presentation of the CS and the US was unconstrained. The test and three of the c ontrol conditions (exp licitly unpaired control, backward, conditioning, and discriminative conditioning) described by Rescorla can be expressed probabilistically (for the sake of completeness, the remain ing control conditions were presentation of the CS alone, presentation of a no vel CS, and presentation of the US alone). In the test condition, in which CS are always presen ted and removed prior to the US, the probability of a US given a CS is 1.0 and the probability of a CS given a US is 0. The explicitly unpaired, backward conditioning, and discriminative c onditioning effectively arranged the same contingency: US always proceed CS and CS never proceed US. Hence, in these control conditions, the probability of a US given a CS is 0 and the probability of a CS given a US is 1.0. And in the ideal control condition, in which pres entation of the CS and the US are unconstrained (random), the probability of a US given a CS woul d be equal to the probability of a CS given a US. Lane (1960) investigated the potential effectiveness of control conditions for operant control of vocalizations in chickens. The control conditi ons included no reinforcement (extinction), fixed-time reinforcer delivery, fixed-ratio food tray presentation (a stimulus that was correlated with reinforcer delivery) without accompanying reinforcers, and DRO (where

PAGE 22

22 reinforcers were delivered given the absence of food). Lane found decr eases in each of the control conditions relative to either fixed-ratio and fixed-interv al test conditions. Similar results were obtained by Thompson, Iwat a, Hanley, Dozier, & Samaha (2003) who examined fixedtime, extinction, and DRO. Both studies reported relatively higher rates of responding during the fixed-time condition which was attributed to a ccidental contiguity between responses and reinforcers. Thompson and Iwata (2005) noted the analogy between Rescorlas ( 1967) description of ideal control procedures for respondent conditioning and those used for operant conditioning. Their analysis led them to conclude that, al though imperfect for reasons described below, noncontingent reinforcement (NCR) met Rescorlas definition of a truly random control. (Thompson and Iwata, 2005, p. 261) However, the fixed-time delivery of reinforcers does not ensure that the obtained relations hip between behavior and reinfo rcers is random. Reinforcers, by definition, have the effect of strengthening whatever preceded them. The strengthening effect does not depend on the nature of the relationshi p between behavior and reinforcement (i.e., whether the behavior produced the reinforcer or if the reinforcer accidentally followed behavior). As a result of being strengthened, the rate and/or pattern of behavior may change such that the obtained contingenc y is no longer random. In the case of fixed-time delivery of reinforcers, responses that occur in the interval just before f ood delivery may be more likely to occur in the future. Such a case was reporte d by Vollmer, Ringdahl, Roane, and Marcus (1997) in which a childs aggression persisted during NCR. An examination of the within-session pattern of responding revealed that as the individu al gained more experien ce with the treatment, instances of aggression became more likely just prio r to reinforcer-delivery. In other words, the probability of a reinforcer given aggression was lik ely higher than the probability of a reinforcer

PAGE 23

23 given the nonoccurrence of aggre ssion. Such a condition is more descriptive of a fixedor variable-ratio schedule as opposed to a truly random control. It is possible that such a problem only occurs if one uses fixed-time schedules and that NCR implemented using variable-time schedules would retain the status as the truly random control. However, VT schedules also do not ensure that the obtained relation between behavior and reinforcers remains random. Reinforcers that are delivered closely followi ng responses may increase the overall rate of responding such that, compared to the initial rate of responding that produced a negative contingency, higher rates of responding may produ ce positive contingencies. In addition, many of the studies on reinforcement contingencies already discussed emphasize the role in which response-independent reinforcers exert their influence on responding in systematic (i.e., nonrandom) ways (c.f., Zeiler, 1968; Rescorla & Skucy, 1969; Edwards, Peek, & Wolfe, 1970, Lattal & Maxey, 1971; Lattal, 1972; Lattal, 1974; Lattal & Bryan, 1976). Previous Research on Complex Contin gencie s of (Operant) Reinforcement To date, two studies have experimentally manipulated contingencies of reinforcement viewed as the probability of a reinforcer given a response and the probability of a reinforcer given no response. In the first one, whic h was a two experiment study, Hammond (1980) investigated effects of positive and negative conti ngencies in rats using water as the reinforcer and lever pressing as the response. Contingenc ies were arranged by dividing the session into a series of unsignaled 1-s cycles. At the end of each cycle, .03 ml of water was delivered (or not) according to two experimenter-programmed probabil ities: the probability of a reinforcer given that at least one response occurred during the prev ious cycle and the probability of a reinforcer given that no responses occurred during the previous cycle. In the first experiment, rats were given a history of a positive contingency before they were exposed to a zero contingency. Hammond used the term positive contingency to refer to

PAGE 24

24 conditions where the probability of a reinforcer given a response was highe r than the probability of a reinforcer given no response. The term zero contingency was used to refer to conditions where the probabilities of a reinforcer given a response and given no response were equal. The specific sequence of conditions and the terms used to describe them are listed in Table 1-1. Responding decreased rapidly after the introduction of the zero contingency as compared to the moderately high positive contingency. Table 1-1. Contingencies for each condition in Hammond (1980). Condition P(Sr|R) P(Sr|~R) Term a 1.0 0 Very High Positive b .2 0 High Positive c .05 0 Moderately High Positive d .05 .05 Zero e .05 0 Moderately High Positive f .05 .05 Zero The conditions in Experiment 1 of Hammond (1980) and the terms used to describe them. The abbreviation P(Sr|R) stands for the probabil ity of a reinforcer given a response and the abbreviation P(Sr|~R) stands for the proba bility of a reinforcer given no response. In the second experiment, 47 rats were given a history of a positive contingency and then were exposed to either one of two positive c ontingencies (.12/.00 or .12/.08), one of two zero contingencies (.12/.12 or .05/.05), or a negative contingency (.00/.05) The results showed that responding decreased as the contin gencies were progressively weak ened. In the discussion, the correspondingly decreased response rates were interpreted to have implications against accounts of reinforcement that are based on contiguity. Contiguity, when used with respect to operant behavior, refers to the amount of time that elapses between respons es and reinforcers. According to the author, the contiguity was the same in a ll conditions of the experiment. Therefore, the relationship between the probability of a reinforcer given a res ponse and the probability of a reinforcer given no response must play an important role in determining reinforcement effects.

PAGE 25

25 Borrero, Vollmer, and Wright (2002) translated the findi ngs and procedures used by Hammond (1980) in the treatment of aggression. A functional analysis (Iwata et al., 1982/1994) was conducted in order to identify the reinforcer s maintaining aggression for two participants. For both participants, aggression was maintained by social reinforcement, which meant that it occurred because of the reactions of other individuals in the environment. Specifically, one participants aggression was maintained by escape from activities and the others was maintained by access to preferred food items. Following the functional analyses, the participants were exposed to positive and then neutral (zero) contingencies. Cycle durations were adjusted to be approximately equal to the average duration of th e responses made by the participants. For one participant, the cycle duration wa s 1 s and, for the other, the cycl e duration was 5 s. The effect of the contingencies was the same for both participants: positive contingencies produced maintenance and neutral contingencies produced decreases in aggression. One implication of Borrero, Vollmer, and Wright is that the proced ures used to arrange complex contingencies of reinforcement may represent a useful method fo r simulating reinforcement contingencies like those maintaining problem (or appropriate) behavi or in the natural enviro nment. Furthermore, the effects on socially-relevant behavior seem to be in the direction anticipated by Catania (1988). The neutral contingencies described by Hammond (1980) and Borrero, Vollmer, and Wright (2002) might better fit an operant analog of Rescorlas (1967) truly random control. Neutral contingencies specify that the probability of a re inforcer given a response is equal to the probability of a reinforcer given no response. If those probabilities are set to values greater than zero then, responding does not have the effect of increasing the proba bility of a reinforcer above that obtained if no response occurs. Therefore, the alternation between positive and neutral

PAGE 26

26 contingencies by Borrero, Vollmer, and Wright (2002) constitutes the demons tration of a control condition where the only change between baseline and reinforcement is the contingency for not responding. However, this kind of control condi tion has not been described or examined in relevant discussions of opera nt control procedures (Lane, 1960; Thompson et al., 2003, Thompson & Iwata, 2005). Translational Research Traditional views of science often p lace a divi sion between two groups of scientists: basic and applied. Basic scientists are those that do science for the sake of understanding and applied scientists are those that do it to meet some more immediate need of society (Baer, Wolf, & Risley, 1968). The extension of the findings of Hammond (1980) and the contingency concept of reinforcement to the treatment of problem beha vior represents an example of how research in basic science may be applied to address issues that are impor tant to society (i.e., reducing aggressive behavior displayed by children). This m odel of the relationship between basic and applied science is often unidirectional, where information flows from basic to applied. However, less obvious is the reciprocal role in which application can (or s hould) guide basic science. Positive reinforcement is a concept that is clearly basic and fundamental to behavior analysis. Basic research on positive reinforcement has focused largely on if-then responsereinforcer dependencies. However, applied rese arch has shown that events known to reinforce problem (and appropriate behavior) sometimes occur following behavior and sometimes occur when behavior has not occurred (e.g., Vollmer, Borrero, Borrero, Van Camp, & Lalli, 2001; Samaha et al., in press). Intuitively, such c ontingencies are frequent in human environments. Therefore, examining the necessary and sufficient conditions for reinforcement in the context of complex contingencies would seem important.

PAGE 27

27 In addition, previous translati onal research has shown that so me effects of reinforcement seem dependent on not just curren t contingencies, but also previ ous experience. For example, Borrero, Vollmer, Van Harren, Haworth, and Sama ha (in prep) used rats to examine lever pressing during fixed-time (FT) schedules where reinforcers are delivered according to a clock (independent of lever pressing). Fixed-time schedules might sometimes produce complex contingencies because, even though reinforcers ar e delivered according to a clock, they may accidentally occur just after a response or af ter a period of time without responding. Results indicated that maintenance during the FT condition was more likely when rats had a previous history of responding on an FI schedule with th e same interval value as that used in the subsequent FT condition. For example, rats with a previous history of FI 30 s (where the first response after 30 s produced a reinforcer) conti nued to respond at higher rates in a subsequent FT 30 s condition (where reinforcers were presented every 30 s independent of lever pressing) as compared to an FT 15 s condition. While the resu lts of this study do not lend themselves to an evaluation of the effects of complex continge ncies (because the relationship between responding and reinforcer delivery in the FT condition was not directly arranged by the experimenter), the results clearly suggested that reinforcement eff ects in complex contingencies may be influenced by previous experience. Therefore, a complete description of the necessary and sufficient conditions for reinforcement in complex contin gencies might need to include conditional statements based on an organisms previous experience. Goals of the Current Research The general aim of this dissert ation is to present a method to study complex contingencies of reinforcement. The series of studies seeks to investigate some conditions for observing acquisition and maintenance unde r complex schedules of reinforcement. An improved

PAGE 28

28 understanding of complex schedules of reinforcement has implications for how behavior might be reinforced and maintained in the natural environment. The following five experiments examined acqui sition and maintenance of lever pressing in rats. In the first experiment, acquisition wa s examined during two positive contingencies (.10/.05 and .10/.00) and effects of exposure to .10/.00 on responding in a subsequent .10/.05. In Experiment 2, a systematic replication of experiment one was conducted by providing experience with negative contingencies (.00 /.10 and .05/.10) prior to the evaluation of responding in positive contingencies. The resu lts of Experiment 1 and 2 were somewhat different, such that acquisition and maintenance may have been weakened by the early exposure to a negative contingency. So, Experiment 3 was designed to evaluate effects of the differences between Experiment 1 and 2 (the previous exposure to positive contingencies) within subjects. Finally, in Experiment 4, a method was used to syst ematically identify the contingency values at which responding would break down by gradually manipulating the contingency from positive to negative (.10/.00 to .00/.10).

PAGE 29

CHAPTER 2
EXPERIMENT 1

Purpose

The purpose of this experiment was to evaluate whether lever pressing could be acquired, maintained, or both under a complex positive contingency of reinforcement, in which there was some probability of a reinforcer given behavior (.10) and some probability of a reinforcer given no behavior (.05).

Method

Subjects

Three experimentally naïve male Wistar (albino) rats purchased at 8 weeks of age were housed individually in home cages. Experimentally naïve rats were selected as subjects in order to control for a history of behavior reinforced by access to food. Conclusions based on the acquisition of behavior by non-experimentally-naïve organisms would need to be tempered due to both known and unknown experiences prior to the experiment. Likewise, the conditions under which food-reinforced behavior could be acquired and maintained in experimentally naïve organisms could be tested. Prior to the experiment, rats were given ad libitum access to food and water for 7 consecutive days. After 7 days, access to food was restricted to 16 g per day. Food was made available in the home cages immediately following sessions. Water was freely available in the home cages throughout the experiment. Sessions began after the 7th day of food restriction. All procedures were approved by the University of Florida Animal Care and Use Committee. The colony room was illuminated on a 12-hour light-dark cycle with lights programmed to turn on at 8 a.m. Temperature and humidity were monitored and maintained at consistent levels.

Apparatus

Six Coulbourn Instruments operant chambers, each measuring 29 cm long X 30 cm wide X 25 cm high, were enclosed in sound-attenuated boxes with exhaust fans. An intelligence panel was mounted on one wall of each chamber. Mounted on the panel were two levers and a pellet hopper. The pellet hopper was mounted in the center of the intelligence panel (7.0 cm above the floor) and the levers were located on either side of the hopper (centered 7.0 cm above the floor and 5.5 cm from the center of the hopper). Also mounted on the intelligence panel were three color LEDs (light-emitting diodes) mounted horizontally 4 cm above each lever, an incandescent house light (2.0 cm from the top-center of the panel), and an incandescent hopper light. From left to right, the colors of the LEDs were red, green, and yellow. The side panels of the chamber were made of clear acrylic plastic while the ceiling, rear, and intelligence panel were constructed of aluminum. The bottom of the chamber consisted of a shock floor (although no shock was ever delivered during the experiment) raised above a white plastic drop pan. A pellet feeder was attached to the back of the intelligence panel and delivered pellets into the hopper. Lever presses were defined as any force on the lever sufficient to produce a switch closure (about 0.20 N). Responses to both levers were recorded, but only responses on the left lever produced changes in the probability of reinforcer delivery. A PC computer running Coulbourn Instruments Graphic State Notation was used to record lever presses and control the apparatus. The computer also emitted white noise through a pair of attached speakers at approximately 70 dB (as measured from the center of the room).

Procedures

Three 10-min sessions were conducted each day. Each session was preceded by a 1-min blackout, and the third session was followed by a 1-min blackout before the animal was returned to its home cage. During sessions, the house light and the lever lights above both levers were illuminated.

Throughout the experiment, the session was divided into unsignaled 1-s cycles (similar to that described by Hammond, 1980). The computer was programmed to deliver a single 45-mg sucrose pellet (Formula 5TUL, Research Diets Inc., New Brunswick, NJ) at the end of each cycle according to a pair of probabilities specific to each phase: the probability of a pellet delivery given at least one lever press in the current cycle (P(Sr|R)) and the probability of a pellet delivery given no lever presses in the current cycle (P(Sr|~R)). During a pellet delivery, the house and lever lights were turned off for 1 s. At the same time, the hopper light flashed briefly for 250 ms. The next cycle began when the house and lever lights were re-illuminated. Lever presses that occurred during the 1-s blackout did not have any programmed effect and were not included in the overall rate of responding.

Other than the contingencies implemented during each phase, no lever shaping or hopper training was performed prior to or during the experiment. Contingency values (the probabilities of pellet delivery) for each condition were initially based on the values reported by Hammond (1980). Pilot work revealed that animals gained excessive weight when exposed to similar contingency values in combination with session durations of 50 min. Therefore, an attempt was made to reduce food intake by limiting the total time spent in session to 30 min per day. In addition, the session time was divided into three 10-min blocks after an examination of within-session patterns revealed reasonably consistent rates of responding.

Conditions

Condition changes were made following stability as judged by visual inspection. From this point forward, each condition has been specified using two parameters: the probability of a pellet delivery given a response and the probability of a pellet delivery given no response (P(Sr|R)/P(Sr|~R)). Conditions were conducted in the following order: .00/.00 (No Pellet), .10/.05, .10/.00, and .10/.05.
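As an illustration of how the cycle-based arrangement can be expressed in software, the sketch below simulates a single 10-min session of 1-s cycles in Python. It is a minimal sketch, not the program used in the experiment: responding is modeled as a fixed per-cycle probability (the parameter p_respond is a hypothetical illustration value), and the 1-s blackout that accompanied each pellet delivery is omitted.

    import random

    def simulate_session(p_sr_given_r, p_sr_given_not_r,
                         p_respond=0.3, cycles=600, seed=None):
        """Simulate one 10-min session of unsignaled 1-s cycles (600 cycles).

        p_sr_given_r     -- P(Sr|R): probability of a pellet at the end of a
                            cycle containing at least one lever press
        p_sr_given_not_r -- P(Sr|~R): probability of a pellet at the end of a
                            cycle containing no lever press
        p_respond        -- hypothetical per-cycle probability of a press
        """
        rng = random.Random(seed)
        presses = pellets = 0
        for _ in range(cycles):
            responded = rng.random() < p_respond        # did a press occur this cycle?
            presses += responded
            p_pellet = p_sr_given_r if responded else p_sr_given_not_r
            pellets += rng.random() < p_pellet          # pellet scheduled at cycle end
        return presses, pellets

    # Compare the complex positive contingency (.10/.05) with the purely
    # positive contingency (.10/.00), holding simulated responding constant.
    for label, pair in [(".10/.05", (0.10, 0.05)), (".10/.00", (0.10, 0.00))]:
        presses, pellets = simulate_session(*pair, seed=1)
        print(label, "presses:", presses, "pellets:", pellets)

With responding held constant, the two conditions differ only in the number of pellets delivered after cycles without a press, which is the feature that the P(Sr|R)/P(Sr|~R) notation is meant to capture.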


Results and Discussion

Figure 2-1 shows responses per min of lever pressing for each session, subject, and condition. The following pattern of responding was observed for all three subjects. Little to no responding was obtained in the initial No Pellet condition (as expected). No subject showed acquisition during the subsequent .10/.05 condition. Responding increased for all three subjects following exposure to .10/.00. Responding then persisted at somewhat reduced levels (as compared to the previous .10/.00 condition) following the reversal back to .10/.05.

Figure 2-1. Experiment 1 All Sessions. This figure shows responses per min of lever pressing for each session. Each panel shows data from a different subject.

Three conclusions can be drawn from the data. First, .10/.05 was not sufficient to produce acquisition in these subjects during the time period in which they were exposed to the condition. Second, the lack of acquisition in .10/.05 may be, in part, explained by the reinforcers that were delivered following cycles without responses, given that acquisition was obtained in .10/.00. Third, responding was maintained during the second exposure to .10/.05, a condition which previously did not produce responding. It is this third finding that is perhaps most critical. If the .10/.00 condition is viewed as an independent variable, then exposure to that variable produced a differential effect in a subsequent condition: .10/.05 in comparison to the .10/.05 condition that preceded .10/.00.

Although only a small range of parameter values was examined in this experiment, the results may have implications for the acquisition and maintenance of problem and appropriate behavior in humans. With respect to the first effect, it may be that occasional reinforcers presented in the absence of behavior are sufficient to prevent the acquisition of problem (or appropriate) behavior. Given the current data, this could be the case even if reinforcement were twice as likely given problem (or appropriate) behavior as given no behavior. Such reinforcers could be arranged using fixed-time schedules (e.g., noncontingent reinforcement, NCR), differential reinforcement of other behavior (DRO), or reinforcer deliveries following the occurrence of appropriate behavior, as a sort of inoculation against the emergence of problem behavior. On the other hand, too many free reinforcers may impede the development of important appropriate skills.

Koegel and Rincover (1977) showed similar results when, following experience with intermittent reinforcement, students' correct responses persisted (but eventually decreased) in another setting when reinforcers were presented following successive incorrect responses or independent of behavior.

When reinforcers were presented following incorrect responses, an examination of the pattern of responses revealed that the reinforcer appeared to serve as a discriminative stimulus. That is, correct responses increased after the delivery of a reinforcer and then decreased across successive trials. Indeed, other authors have observed response persistence during DRO schedules and have posited a discriminative effect of the reinforcer (cf. Thompson, Iwata, Hanley, Dozier, & Samaha, 2003). When reinforcers were presented independent of correct responses, behavior persisted for much longer. The authors attributed the enhanced persistence under response-independent reinforcement to adventitious pairing of responses and reinforcers. In the current study, response rates persisted (for several hundred sessions in two cases) under a complex positive contingency. One interpretation of these results is that the occasional response-dependent reinforcer may have enhanced the discriminative properties of all the reinforcers, such that responding persisted for much longer than that observed by Koegel and Rincover (1977).

The acquisition versus maintenance effect with .10/.05 has implications for the maintenance of appropriate behavior and the treatment of problem behavior. Once acquired, both appropriate and problem behavior may be relatively robust despite intermittent reinforcement and occasional reinforcers delivered following the absence of behavior. For problem behavior, the results suggest that those selecting treatments for eventual implementation by caregivers should do so while considering the possible effects that treatment integrity failures will have on the contingency. For example, DRO (a common treatment) specifies that reinforcers are to be delivered following periods of time in which behavior has not occurred. Despite even the most ideal training, it is very likely that other factors may result in reinforcers occasionally following problem behavior (e.g., as a result of intermittent care by untrained or unmotivated individuals, or if the problem behavior is extremely dangerous and necessitates immediate reactions from caregivers).

Such mistakes may appear small but might serve to drive an initially strong negative contingency toward conditions that would produce maintenance.

Conceptually, the results have implications for understanding the basic principle of reinforcement. A given contingency may not produce a reinforcement effect in the sense of strengthening behavior, but may produce a reinforcement effect in the sense of maintaining previously acquired behavior. Similar results were also obtained by Marcus and Vollmer (1996) and Goh, Iwata, and DeLeon (2000), in which appropriate behavior persisted following exposure to an FT schedule of reinforcer delivery. However, one important difference between those studies and the current procedures was that the previous authors used specific training procedures (FR-1 in Marcus & Vollmer and a prompt-delay procedure in Goh, Iwata, & DeLeon) in order to teach the initial behavior. The current study did not involve explicit magazine training, shaping, or any other analogous training procedure other than the contingencies of reinforcer presentation.

The next experiment was designed to systematically replicate the procedures of the first experiment by providing an initial history with negative contingencies.

CHAPTER 3
EXPERIMENT 2

Purpose

One purpose of this experiment was the same as that of Experiment 1: to evaluate whether lever pressing could be acquired and maintained under a complex positive contingency of reinforcement. However, rats were first exposed to two negative contingencies (.00/.10 and .05/.10).

Method

Subjects and Apparatus

Four experimentally naïve male Wistar (albino) rats were included in Experiment 2. The animals were acquired, housed, and fed in a manner identical to Experiment 1. In addition, the same chambers used in Experiment 1 were also used in Experiment 2.

Procedures

Procedures in Experiment 2 were identical to those described in Experiment 1 for all but one animal. For rat 1852, one 50-min session was conducted per day. A 1-min blackout was presented prior to and following each session.

Conditions

After an initial No Pellet baseline, four rats were exposed to a sequence of conditions ordered from strong negative to strong positive: .00/.10, .05/.10, .10/.05, and .10/.00. It was thought that this sequence was ordered from least to most likely to produce acquisition of lever pressing. When shifts in the contingency toward strong positive were associated with increases in lever pressing, maintenance was evaluated by returning the contingency to the level of the previous condition. Therefore, the exact sequence of conditions was different for each animal because it depended, in part, on the animal's performance.

Results and Discussion

Figure 3-1 shows responses per min of lever pressing for each session, subject, and condition. For subject 1852, low rates of responding were obtained in the first four conditions: No Pellet, .00/.10, .05/.10, and .10/.05. Similar to the results of Experiment 1, acquisition was obtained following the change from .10/.05 to .10/.00. Also similar to the results of Experiment 1, responding persisted following the change from .10/.00 to .10/.05.

For subject 1901, low rates of responding were obtained in the first four conditions: No Pellet, .00/.10, .05/.10, and .10/.05. Unlike the results of Experiment 1, little to no increase in responding was obtained following the change from .10/.05 to .10/.00 (up to this point, every subject had acquired lever pressing under .10/.00). Acquisition was obtained following the change to 1.00/.00, and responding persisted following the subsequent return to .10/.00. Also unlike the results of Experiment 1, responding did not persist following the change from .10/.00 to .10/.05 (up to this point, every subject had maintained lever pressing under .10/.05 following acquisition).

The results for subject 1902 were similar to those obtained for subject 1852. Low rates of responding were obtained in the first four conditions (No Pellet, .00/.10, .05/.10, and .10/.05). Acquisition was obtained following the change from .10/.05 to .10/.00, and responding persisted following the return to .10/.05.

For subject 1903, low rates of responding were obtained in the first three conditions: No Pellet, .00/.10, and .05/.10. The ensuing exposure to .10/.05 was carried out for an extended number of sessions because a modest increasing trend was obtained until session 260. A change to .05/.10 resulted in suppression of responding, and a return to .10/.05 produced a modest increase in responding similar to that obtained in the first exposure to .10/.05. Acquisition was obtained after a change to .10/.00.

However, similar to the results obtained for subject 1901, responding did not persist after the return to .10/.05.

Figure 3-1. Experiment 2 All Sessions. This figure shows the rate of lever pressing for subjects 1852, 1901, 1902, and 1903 during Experiment 2.

Some similarities and differences between the results of Experiment 1 and Experiment 2 deserve comment. First, for subjects 1901 and 1903, responding did not persist during .10/.05 (following acquisition). In Experiment 1, all three subjects maintained responding following the second exposure to .10/.05 (following acquisition). Second, for subject 1901, exposure to .10/.00 did not produce acquisition. In Experiment 1, all three subjects acquired lever pressing under .10/.00. Third, overall rates of lever pressing were somewhat lower in Experiment 2. With the exception of subject 1852, rates rarely exceeded 40 responses per min during .10/.00, which is only about 40-60% of the levels obtained in that same condition of Experiment 1.

Although the methods used in Experiments 1 and 2 do not allow for a proper comparison of the effects of previous exposure to negative contingencies, the results of the two experiments suggest that exposure to negative contingencies might produce a tendency toward suppressed acquisition and suppressed maintenance in subsequent conditions. If true, the finding may have implications for treatments designed to inoculate individuals against the acquisition of problem behavior. For example, prolonged exposure to DRO-like contingencies might make severe problem behavior less sensitive to acquisition and maintenance contingencies. In addition, the finding would also support early intervention programs designed to teach skills and other appropriate behavior at an early age using strict (strong positive) reinforcement contingencies. Conversely, too many reinforcers given for free in early development might hamper later acquisition and maintenance of appropriate behavior.

CHAPTER 4
EXPERIMENT 3

Purpose

The purpose of this experiment was to examine effects of early exposure to negative contingencies within individual subjects using a multiple schedule. A multiple schedule is defined as the alternation between two (or more) components in a single session, with each component associated with a unique stimulus or set of stimuli (Ferster & Skinner, 1957).

Method

Subjects and Apparatus

Three experimentally naïve male Wistar (albino) rats were included in Experiment 3. The animals were acquired, housed, and fed in a manner identical to that used in Experiments 1 and 2. In addition, the same experimental chambers were used.

Procedures

Procedures were similar to those used in Experiment 1 with the following exceptions. Instead of conducting three sessions per day, two components were conducted each day in a multiple-schedule format. Components were presented in pseudo-random order determined by the computer at the beginning of each session. Each component was 10 min in duration and was associated with the illumination of LEDs in different colors and positions. Component 1 was associated with the illumination of an LED located left of center and above each lever in the chamber. Component 2 was associated with the illumination of an LED located at the center line and above each lever. Previous work in other experiments showed that LED location could exert stimulus control over responding in the context of multiple schedules; hence, it was deemed adequate for this preparation. As in Experiments 1 and 2, each session was preceded and followed by a 1-min blackout. In addition, a 1-min blackout separated the presentation of each component.
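A minimal Python sketch of the multiple-schedule session structure described above follows. The component labels and LED descriptions are stand-ins for the programmed stimuli, and the per-component probability pairs are simply carried along; the cycle-by-cycle pellet scheduling within a component would follow the same logic as the session sketch in the Experiment 1 Method section.

    import random

    def plan_daily_session(component_contingencies, seed=None):
        """Arrange one daily multiple-schedule session.

        component_contingencies maps a component label (standing in for its
        correlated LED stimulus) to a (P(Sr|R), P(Sr|~R)) pair. Each component
        runs once per day, 10 min long, in pseudo-random order, with 1-min
        blackouts before the session, between components, and after the session.
        """
        rng = random.Random(seed)
        order = list(component_contingencies)
        rng.shuffle(order)                      # pseudo-random component order
        plan = [("blackout", 60)]               # 1-min pre-session blackout
        for label in order:
            plan.append((label, 600, component_contingencies[label]))  # 10-min component
            plan.append(("blackout", 60))       # blackout after each component
        return plan

    # Example: condition 4 of Experiment 3 (.10/.00 in component 1, .00/.10 in
    # component 2); see Table 4-1 in the Conditions section below.
    for entry in plan_daily_session(
            {"component 1 (left LED)": (0.10, 0.00),
             "component 2 (center LED)": (0.00, 0.10)}, seed=7):
        print(entry)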


Conditions

The contingency values associated with each component and condition are listed in Table 4-1. The preparation was designed to provide the subject with a history of a positive contingency in component 1 and a history of a negative contingency in component 2. After providing those histories, a baseline of responding in a positive contingency (.10/.00) was established in both components prior to the test condition (.10/.05).

Table 4-1. Contingencies for each condition of Experiment 3.
Condition    Component 1    Component 2
1            1.0/.00        (not used)
2            (not used)     .00/.10
3            1.0/.00        .00/.10
4            .10/.00        .00/.10
5            .10/.00        .10/.00
6            .10/.05        .10/.05

Results and Discussion

Session-by-session data are presented in Figure 4-1. Results for subject 2302 are shown in the top panel. In the first condition, the animal was exposed only to component 1 (1.0/.00). Subject 2302 acquired lever pressing during the 8th session. In the next condition, the animal was exposed only to component 2 (.00/.10). By the end of the condition, rates of lever pressing had fallen to an average of 0.29 responses per min and ranged between 0 and 0.8 responses per min. During condition 3, the subject experienced both component 1 (1.0/.00) and component 2 (.00/.10), with their schedule-correlated stimuli, in random order once per day. By the end of condition 3, responding in component 1 (1.00/.00) was higher (averaging 11.9 responses per min) compared to component 2 (.00/.10) (averaging 1.6 responses per min).

In the next condition, component 1 was changed from 1.00/.00 to .10/.00 while component 2 remained .00/.10. Responding during component 1 became somewhat variable while responding in component 2 remained low. Next, component 2 was changed from .00/.10 to .10/.00 to match component 1. Responding increased during both components. In the next condition, both components were changed to .10/.05. During the first four sessions, responding in component 2 (the component associated with previous exposure to a negative contingency) was notably lower as compared to component 1.

Results for subject 2406 are presented in the center panel of Figure 4-1. In the first condition, the animal was exposed only to component 1 (1.0/.00). Subject 2406 acquired lever pressing during the 10th session. In the next condition, the animal was exposed only to component 2 (.00/.10). By the end of the condition, rates of lever pressing had fallen to 1.2 responses per min. During condition 3, the subject experienced both component 1 (1.0/.00) and component 2 (.00/.10) in random order once per day. On average, responding in component 1 (1.00/.00) was higher (averaging 17.4 responses per min) compared to component 2 (.00/.10) (averaging 3.2 responses per min). In the next condition, component 1 changed from 1.00/.00 to .10/.00 while component 2 remained .00/.10. Responding during component 1 became somewhat variable while responding in component 2 remained low. Next, component 2 was changed from .00/.10 to .10/.00 to match component 1. Responding increased during both components, averaging 45.41 responses per min in component 1 and 45.21 responses per min in component 2. In the next condition, both components were changed to .10/.05. Following the transition to .10/.05, responding in components 1 and 2 decreased to an average of 28.15 and 31.96 responses per min, respectively. No noteworthy differences in responding were observed between components 1 and 2.

Results for subject 2401 are presented in the bottom panel of Figure 4-1. In the first condition, the animal was exposed only to component 1 (1.0/.00). Subject 2401 acquired lever pressing during the 5th session. In the next condition, the animal was exposed only to component 2 (.00/.10). By the end of the condition, rates of lever pressing had fallen to 0.09 responses per min. During condition 3, the subject experienced both component 1 (1.0/.00) and component 2 (.00/.10) in random order once per day. On average, responding in component 1 (1.00/.00) was higher (averaging 11.9 responses per min) compared to component 2 (.00/.10) (averaging 2.5 responses per min). In the next condition, component 1 changed from 1.00/.00 to .10/.00 while component 2 remained .00/.10. Responding during component 1 increased while responding in component 2 remained low. Next, component 2 was changed from .00/.10 to .10/.00 to match component 1. Responding increased during both components, averaging 44.0 responses per min in component 1 and 42.1 responses per min in component 2. In the next condition, both components were changed to .10/.05. Responding decreased slightly in both components, averaging 40.8 responses per min in component 1 and 40.1 responses per min in component 2. No noteworthy differences in responding were observed between components 1 and 2.

Figure 4-1. Experiment 3 All Sessions. This figure shows responses per min of lever pressing for each subject and each condition of Experiment 3.

In Experiment 3, in an effort to tease out differences between the Experiment 1 and Experiment 2 results, one component of a multiple schedule was associated with early exposure to positive contingencies while the other component was associated with early exposure to negative contingencies. Later, following acquisition of lever pressing, both components were changed to .10/.05. Data for one subject, 2302, showed that responding in the component that was previously associated with the negative contingency was initially lower following the transition to .10/.05 compared to the component that was not previously associated with a negative contingency. Data for subjects 2406 and 2401 revealed no noteworthy differences in responding between components 1 and 2 following the transition to .10/.05.

Results for all three subjects failed to replicate the differences in acquisition obtained during .10/.00 in Experiments 1 and 2. In Experiment 1, three out of three subjects acquired responding during .10/.00. In Experiment 2, subjects were given an early history with two negative contingencies (.00/.10 and .05/.10), and one out of four subjects failed to acquire responding during .10/.00. In Experiment 3, one component of a multiple schedule was associated with early exposure to positive contingencies while the other component was associated with early exposure to negative contingencies. Following the experience with negative contingencies, both components were changed to .10/.00. For all three subjects, responding during the component previously associated with the negative contingency followed the same pattern as responding during the component that was not associated with a negative contingency. Introduction of the complex positive contingency (.10/.05) in both components produced temporary reductions in the component previously associated with the negative contingency for only one out of three rats. Therefore, these results only modestly support the notion that early exposure to negative contingencies decreases acquisition.

It is possible that the early exposure to a positive contingency, independent of histories associated with particular components, weakened any within-subject effect of early exposure to negative contingencies. A future experiment might begin without the positive contingency exposure.

CHAPTER 5
EXPERIMENT 4

Purpose

In the previous experiments, a very small set of contingency values was examined. In addition, the results of the previous experiments have been described in terms of one of two possible outcomes: persistence or suppression. It is not clear whether gradual reductions in the strength of the reinforcement contingency would produce concomitant reductions in responding.

Lattal (1974) investigated the degree to which response rates would change in relation to different combinations of reinforcers delivered according to either VI or VT schedules. Specifically, the procedures arranged either 0%, 10%, 33%, 66%, or 100% of the reinforcers according to the VI schedule and the remainder according to VT. The results showed gradual decreases in responding as the percentage of response-dependent reinforcers decreased. However, the VT schedule permitted contiguity between responses and response-independent reinforcers. In fact, results showed that the rates of behavior obtained using combinations of VT and VI schedules would likely have produced contiguity between responses and response-independent reinforcers. Therefore, it is not clear whether responding would be immediately suppressed after contacting weak contingencies in which the proportion of reinforcers that followed responses was more tightly controlled, or whether response rates would decrease gradually. The purpose of this final experiment was to evaluate the effects on responding of a procedure that gradually weakened the contingency of reinforcement.

Methods

Subjects and Apparatus

Two rats from Experiment 1 (2003 and 2004) and three rats from Experiment 2 (1903, 2001, and 2005) also participated in Experiment 4.

The animals were acquired, housed, and fed in a manner identical to that used in the previous experiments. In addition, the same experimental chambers were used.

Procedures

Procedures were identical to those described in Experiment 2.

Conditions

After the last condition from Experiments 1 and 2 (.10/.05), responding was re-established in .10/.00, and from there the contingency was gradually weakened until the rate of responding was judged to have been suppressed. The suppression criterion was defined as at least one session in which the rate was at or below ten times the highest rate obtained during the last six sessions of the No Pellet baseline from the previous experiment. Once suppression was obtained, an attempt was made to re-establish responding using the contingency from the previous condition. If responding was re-established, then the contingency was weakened again. If responding was not re-established, .10/.00 was re-implemented until rates similar to a prior .10/.00 exposure were obtained, and contingency weakening began again. Each contingency value was maintained for three consecutive days (for a total of six sessions). Contingency values were selected according to one of two possible sequences depicted in Figure 5-1. Each rat was exposed to the first sequence at least once. Rats 2004, 1903, and 2005 were exposed to the first sequence and then the second.
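The condition-change logic just described can be summarized in a short Python sketch. The code below is an illustration only (the get_session_rates argument is a hypothetical stand-in for real session data, and the listed sequence-2 values are assembled from the conditions reported later in this chapter, not an official parameter file): it expresses the suppression criterion and steps through a weakening sequence until that criterion is met; the reversal and re-establishment steps used in the experiment are omitted.

    import random

    def is_suppressed(condition_rates, no_pellet_baseline_rates):
        """Suppression criterion: at least one session in the condition at or
        below ten times the highest rate from the last six No Pellet baseline
        sessions."""
        threshold = 10 * max(no_pellet_baseline_rates[-6:])
        return any(rate <= threshold for rate in condition_rates)

    # Sequence 2 changed only one parameter at a time in small steps
    # (illustrative values drawn from the conditions reported below).
    SEQUENCE_2 = [(0.10, 0.00), (0.10, 0.01), (0.09, 0.01), (0.09, 0.02),
                  (0.08, 0.02), (0.08, 0.03), (0.07, 0.03), (0.07, 0.04),
                  (0.06, 0.04), (0.05, 0.05), (0.05, 0.06), (0.05, 0.07),
                  (0.05, 0.08), (0.05, 0.09), (0.05, 0.10)]

    def weaken_until_suppressed(sequence, get_session_rates, baseline_rates):
        """Run six sessions at each contingency value in turn and stop at the
        first value whose rates meet the suppression criterion."""
        for p_r, p_not_r in sequence:
            rates = get_session_rates(p_r, p_not_r, n_sessions=6)
            if is_suppressed(rates, baseline_rates):
                return (p_r, p_not_r)
        return None

    # Toy usage with a made-up data source in place of real sessions.
    random.seed(0)
    def fake_rates(p_r, p_not_r, n_sessions):
        # crude stand-in: rate scales with contingency strength P(Sr|R) - P(Sr|~R)
        return [max(0.0, 400 * (p_r - p_not_r)) + random.uniform(0, 2)
                for _ in range(n_sessions)]

    print(weaken_until_suppressed(SEQUENCE_2, fake_rates,
                                  baseline_rates=[0.0, 0.2, 0.1, 0.0, 0.3, 0.1]))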


Figure 5-1. Experiment 4 Sequence of Conditions. Sequences 1 and 2 (depicted in the left and right panels, respectively) in which the contingencies were weakened across conditions. The area above the diagonal line represents a positive contingency, where the probability of a reinforcer given a response is greater than the probability of a reinforcer given no response. The area below the diagonal line represents a negative contingency, where the probability of a reinforcer given a response is less than the probability of a reinforcer given no response. The diagonal line represents a neutral contingency, where both parameters are equal.

Results and Discussion

Data from the last six sessions of the last two conditions of Experiments 1 and 2 (.10/.00 and .10/.05) and each session of each condition (for a total of six sessions per condition) from Experiment 4 are presented below.

Subject 2003

Data for subject 2003 are presented in Figure 5-2. For 2003, three contingency manipulations were performed following sequence 1: from .10/.00 to .09/.07, .10/.00 to .05/.10, and .10/.00 to .05/.10. The figure shows that responding decreased as the contingency was progressively weakened. During the first manipulation, the criteria for suppression were met during .09/.07. A return to .10/.05 did not result in an increase in responding; therefore, the contingency was returned to .10/.00. This produced an increase in responding.

At that point, the second contingency manipulation began. During the second manipulation, responding remained above the suppression criteria until the contingency was weakened to .05/.10. This sequence was replicated during the third manipulation. Following the third manipulation, responding increased during a brief reversal to .10/.00.

Figure 5-2. Experiment 4 Subject 2003. This figure shows responses per min of lever pressing for the last six sessions of the last two conditions from Experiments 1 and 2 and every session from every condition of the current experiment. Three contingency manipulations are shown: from .10/.00 to .09/.07, .10/.00 to .05/.10, and .10/.00 to .05/.10.

Subject 2004

Results for 2004 are presented in Figure 5-3. For 2004, seven contingency manipulations were made, consisting of five exposures to sequence 1 and two exposures to sequence 2. Those were from .10/.00 to .09/.06, .10/.00 to .10/.05, .10/.00 to .10/.05, .10/.00 to .09/.06, .10/.00 to .09/.06, .10/.00 to .08/.03, and .10/.00 to .05/.09. The figure shows that responding decreased as the contingency was progressively weakened. During the first manipulation, the criterion for suppression was met during .09/.06. A return to .10/.00 resulted in an increase in responding. At that point, the second contingency manipulation began. During the second manipulation, responding reached the suppression criteria during the next condition: .10/.05. This sequence and its effects were replicated during the third contingency manipulation. Following the third manipulation, responding increased during a return to .10/.00. At that point, the fourth contingency manipulation began. Responding persisted during the subsequent condition: .10/.05. Responding decreased following a change to .09/.06. At that point, the fifth contingency manipulation began. Responding increased during .10/.00 but reached the suppression criteria during the subsequent condition: .10/.05.

Three out of the five previous attempts to transition from .10/.00 to .10/.05 resulted in suppression of responding. In the remaining two attempts, suppression occurred in the next condition (.09/.06). The effects of weakening the contingency according to sequence 2 were evaluated during the sixth and seventh contingency manipulations. That is, instead of making a transition from .10/.00 to .10/.05, the sequence was .10/.00, .10/.01, .09/.01, .09/.02, and so on. During the sixth contingency manipulation, responding increased during .10/.00 and persisted during the subsequent four conditions: .10/.01, .09/.01, .09/.02, .08/.02, and .08/.03. During the seventh contingency manipulation, responding increased during .10/.00 and persisted during the subsequent twelve conditions: .10/.01, .09/.01, .09/.02, .08/.02, .08/.03, .07/.03, .07/.04, .06/.04, .05/.05, .05/.06, .05/.07, and .05/.08.

Suppression was obtained during the subsequent conditions (.05/.09 and .05/.10), and response rates increased following a return to .10/.00.

Figure 5-3. Experiment 4 Subject 2004. This figure shows responses per min of lever pressing for the last six sessions of the last two conditions from Experiments 1 and 2 and every session from every condition of the current experiment. Seven contingency manipulations are shown. Suppression was obtained at the following contingency values: .09/.06, .10/.05, .10/.05, .09/.06, .10/.05, .08/.03, and .05/.08.

Subject 1903

Results for 1903 are presented in Figure 5-4. For 1903, four contingency manipulations were performed: one from sequence 1 and two from sequence 2. The figure shows that responding decreased as the contingency was driven more in favor of not responding. During the first manipulation (following sequence 1), the criterion for suppression was met during .10/.05. Subsequent contingencies were examined using sequence 2.

During the second contingency manipulation, a return to .10/.00 resulted in an increase in responding. Responding persisted during the subsequent condition (.10/.01) and then decreased in the next condition (.09/.01). A return to .10/.01 produced increased responding, and a subsequent return to .09/.01 failed to replicate the suppression effects obtained during the previous exposure to .09/.01 (i.e., responding persisted). Responding still persisted during the subsequent conditions: .09/.02, .08/.02, .08/.03, and .07/.03. The suppression criterion was met in the following condition (.07/.04), and responding increased during a return to the previous condition (.07/.03). A return to .07/.04 produced suppression during the 3rd session. At that point, the suppression criterion was changed such that an overall downward trend had to be obtained in the condition (in addition to the previous rule that responding must be below 10 times the highest rate obtained during the last six sessions of the No Pellet baseline). Therefore, the contingency was weakened again to .06/.04, .05/.05, and .04/.06. The suppression criteria were met during .04/.06, and a return to .05/.05 did not produce increased responding. Therefore, the contingency was returned to .10/.00 and the third contingency manipulation began. Responding increased during the return to .10/.00 and persisted during the following four conditions: .10/.01, .09/.01, .09/.02, and .08/.02. Responding decreased during the subsequent condition (.08/.03) and then increased following a return to .10/.00.

Figure 5-4. Experiment 4 Subject 1903. This figure shows responses per min of lever pressing for the last six sessions of the last two conditions from Experiments 1 and 2 and every session from every condition of the current experiment. Four contingency manipulations are shown. Suppression was obtained at the following contingency values: .10/.05, .09/.01, .05/.05, and .08/.03.

Subject 2001

Results for 2001 are presented in Figure 5-5. For 2001, three contingency manipulations were performed following sequence 1: from .10/.00 to .09/.07, .10/.00 to .08/.07, and .10/.00 to .08/.07. The figure shows that responding decreased as the contingency was progressively weakened. During the first manipulation, responding decreased but remained above the suppression criteria during .10/.05. The suppression criteria were met during the next condition (.09/.06). A return to .10/.00 resulted in increased responding. At that point, the second contingency manipulation began. Responding decreased but remained above the suppression criteria during the subsequent two conditions: .10/.05 and .09/.06. The suppression criteria were met during the next condition (.08/.07), and a return to .09/.06 failed to increase rates. At that point, the third contingency manipulation began. Responding increased during .10/.00 and persisted for the ensuing two conditions (.10/.05 and .09/.06) until suppression was obtained during .08/.07.

Figure 5-5. Experiment 4 Subject 2001. This figure shows responses per min of lever pressing for the last six sessions of the last two conditions from Experiments 1 and 2 and every session from every condition of the current experiment. Three contingency manipulations are shown. Suppression was obtained at the following contingency values: .09/.06, .08/.07, and .08/.07.

Subject 2005

Results for 2005 are presented in Figure 5-6. For 2005, three contingency manipulations were made: one following sequence 1 and two following sequence 2. The figure shows that responding decreased as the contingency was progressively weakened. During the first manipulation, responding reached the suppression criteria during .10/.05. As with the previous subjects whose responding was suppressed during .10/.05 (1903 and 2004), the effect of weakening contingencies according to sequence 2 was evaluated. Responding increased following a return to .10/.00 and persisted during the subsequent eight conditions: .10/.01, .09/.01, .09/.02, .08/.02, .08/.03, .07/.03, .07/.04, and .06/.04. The suppression criteria were met during the subsequent condition (.05/.05), and responding increased following a return to the previous condition (.06/.04). Responding then decreased but remained above the suppression criteria during the subsequent return to .05/.05. The suppression criteria were met during the subsequent condition (.05/.06), and a return to .05/.05 did not produce an increase in behavior. At that point the third contingency manipulation was performed. Responding increased during a return to .10/.00 and remained above the suppression criteria for the following ten conditions: .10/.01, .09/.01, .09/.02, .08/.02, .08/.03, .07/.03, .07/.04, .06/.04, .05/.04, and .05/.06. The suppression criteria were met during the subsequent condition (.05/.07), and responding increased following a return to .10/.00.

Figure 5-6. Experiment 4 Subject 2005. This figure shows responses per min of lever pressing for the last six sessions of the last two conditions from Experiments 1 and 2 and every session from every condition of the current experiment. Three contingency manipulations are shown. Suppression was obtained at the following contingency values: .10/.05, .05/.05, and .05/.07.

In general, lever pressing decreased as the contingency was weakened. These results replicate and extend the findings of Lattal (1974), which showed gradual decreases in responding as the proportion of reinforcers produced by behavior decreased. In this case, procedures were used that allowed the probability of a reinforcer delivery given a response and given the nonoccurrence of a response to be tightly controlled.

In some cases, it appeared as though returns to .10/.00 followed by additional contingency weakening produced greater persistence over successive attempts to meet the suppression criteria. That is, suppression criteria were not met until a much weaker contingency occurred during later manipulations as compared to earlier manipulations. For example, subject 2003 initially showed suppression under .09/.07. During the next contingency manipulation, suppression was not obtained until the contingency was weakened to .07/.08. For subject 2001, suppression was obtained at .09/.06 in the initial contingency manipulation, but during subsequent manipulations suppression was not obtained until .08/.07. For subject 1903, suppression was obtained during the second contingency manipulation at .09/.01, but during subsequent manipulations suppression was not obtained until weaker contingencies were reached (.07/.04 and .08/.03 during the third and fourth manipulations, respectively). For subject 2004, suppression was obtained during the sixth contingency manipulation at .08/.03, but suppression was not obtained during the seventh manipulation until the contingency was weakened to .05/.08. Similarly, for subject 2005, suppression was obtained during the second contingency manipulation at .05/.05 but was not obtained until .05/.07 during the third manipulation.

In addition, there were some cases in which contingencies that initially produced suppression resulted in maintenance after responding was increased during a return to the next stronger contingency. These results suggest that similar procedures could be used to increase maintenance of treatment effects in environments that might have weakened contingencies.

For example, following acquisition of appropriate communication, an individual could be exposed to a series of gradually weaker contingencies. Such exposure might increase the likelihood of maintenance in situations in which reinforcers are less likely following appropriate behavior and more likely after the nonoccurrence of appropriate behavior (e.g., after the occurrence of problem behavior).

Contingency weakening using sequence 2 (which, in general, involved smaller steps and changed only one parameter at a time) seemed more likely to result in maintenance at weaker contingencies for subjects that showed suppression during sequence 1 (1903, 2004, and 2005). Although both sequences were also evaluated for subjects 1903 and 2005, the data representing the last six sessions of Experiments 1 and 2 were obtained after much longer experience in those conditions. A comparison of the effects of sequences 1 and 2 for subjects 1903 and 2005 based on those data would seem inappropriate. However, subject 2004 experienced four additional exposures to sequence 1 following Experiment 1, followed by two exposures to sequence 2. Therefore, it seems more reasonable to attribute differences in the persistence of behavior during weakened contingencies to the particular sequence used (for subject 2004).

CHAPTER 6
GENERAL DISCUSSION

Four experiments examined the effects of contingencies on lever pressing using rats as subjects. In Experiment 1, subjects were exposed to four conditions: No Pellet, .10/.05, .10/.00, and .10/.05. For all three subjects, responding remained low during the No Pellet and .10/.05 conditions, increased during .10/.00, and persisted during .10/.05. The results of Experiment 1 showed that a) acquisition was more likely under contingencies that were purely positive as opposed to complex positive contingencies (where reinforcers were also sometimes presented following the nonoccurrence of behavior) and b) responding persisted under complex positive contingencies despite the fact that the same contingencies did not previously produce acquisition.

Subjects in Experiment 2 were exposed to the same conditions as those in Experiment 1 except that, following the No Pellet condition, subjects were also exposed to two negative contingencies: .00/.10 and .05/.10. The results of Experiment 2 were the same as those of Experiment 1 except that, for two subjects, responding was suppressed during the second exposure to .10/.05 and, for one subject, acquisition did not occur in .10/.00 (although a subsequent exposure to 1.0/.00 produced acquisition). Differences in responding between Experiments 1 and 2 were tentatively attributed to the early exposure to negative contingencies provided in Experiment 2. Therefore, Experiment 3 was designed to examine effects of early exposure to negative contingencies on maintenance during .10/.05 using a within-subject preparation.

Experiment 3 arranged a two-component multiple schedule in which each component was associated with different stimuli that signaled the presence of different contingencies. Early exposure to negative contingencies was provided in component 2 while only positive contingencies were implemented in component 1. Later conditions arranged positive contingencies in both components to test for differences in acquisition during .10/.00 and maintenance during .10/.05.

For one subject, 2302, a clear but very temporary difference was obtained in maintenance during .10/.05, but only minor differences were obtained during acquisition under .10/.00 (and those minor differences were in the direction opposite from that expected given the results of Experiments 1 and 2). For the other two subjects, 2406 and 2401, no clear differences were obtained during either maintenance or acquisition. The temporary suppression observed in component 2 during .10/.05 for subject 2302 was different from the more persistent suppression observed for subjects 1901 and 1903 in Experiment 2. However, temporary history effects are not entirely uncommon when evaluated within subject (e.g., Freeman & Lattal, 1992). One interpretation of the lack of a robust effect was that the first condition in this experiment exposed the subjects to a positive contingency. In addition, alternation between the two components of the multiple schedule during the test condition (when both components were set to .10/.05) may have produced carryover effects. Thus, rapid alternation between the component previously associated with a positive contingency and the component previously associated with the negative contingency may have made it difficult to detect differences that would otherwise have been apparent had the components been presented in isolation.

Results of these experiments have potential implications for the acquisition and maintenance of appropriate and problem behavior. The results of Experiment 1 suggested that acquisition might be suppressed by reinforcers that are presented following the nonoccurrence of behavior. Environments in which reinforcers are freely available or presented noncontingently may be harmful in the sense that they might prevent the acquisition of appropriate behavior. The notion that free reinforcers may be harmful is not new (cf. Ayllon & Michael, 1959; Burgio et al., 1986). However, the current conceptualization of contingencies refines that notion to the degree that predicting an environment's effects on acquisition requires knowledge of both a) the probability of a reinforcer given the occurrence of a response and b) the probability of a reinforcer given the nonoccurrence of a response. Likewise, a key to the success of early childhood intervention programs may lie in the combination of few freely available reinforcers and many response-produced reinforcers.

Conversely, free reinforcers might help prevent the acquisition of problem behavior. That is, reinforcers delivered in the absence of problem behavior might suppress acquisition even though reinforcers would be more likely following the occurrence of behavior. In the initial complex positive contingency evaluated in Experiment 1, .10/.05, the delivery of a reinforcer following behavior was twice as likely as the delivery of a reinforcer following the absence of behavior. This may be important, especially considering that certain kinds of problem behavior (e.g., aggression) have a high likelihood of producing potential reinforcement (such as attention) from others (Thompson & Iwata, 2001). If caregivers or other members of an individual's social environment are unable or unlikely to implement extinction (withholding the reinforcer maintaining behavior following its occurrence), the current results suggest that even a modest amount of reinforcement presented in the absence of behavior may serve to inoculate against the development of some forms of severe problem behavior.

Results of Experiment 2 suggested that behavior previously exposed to a negative contingency was less likely to be acquired under subsequent positive contingencies and less likely to persist at high rates under subsequent complex positive contingencies. The purpose of Experiment 3 was to evaluate these effects within subject, although the results were not compelling. Numerous possible interpretations of the results make applied implications tenuous. However, results for subject 2302 provide some support for the notion that, perhaps for certain individuals, extended early experience with negative contingencies may reduce the persistence of behavior following acquisition. One implication with respect to the maintenance of appropriate behavior is that the requirements for treatment integrity may be that much more stringent. For example, appropriate responses should be selected on the basis of contacting a high likelihood of reinforcement from members of the community. In addition, perhaps caregivers could be trained not to provide reinforcers in the absence of appropriate behavior, in the hopes of further strengthening the reinforcement contingency when appropriate behavior does in fact occur.

Results of Experiment 4 suggest that procedures could be developed to improve the likelihood that appropriate behavior will persist in environments unlikely to support such behavior.

69 behavior becomes suppressed, b) use such inform ation as a baseline to compare the effects of treatments designed to improve maintenance, and c) include making small changes in contingency values as one component of a maintenance program. Some limitations of the present experiments dese rve comment. First, the use of 1-s cycle durations was arbitrary but ultimately was de rived as a compromise between two competing problems. Shorter cycle durati on might be ideal because they reduce the time between responses and subsequent programmed reinforcer deliveries. Using a 1-s cycle duration meant that even if the probability of a reinforcer given a response was 1.0, there may have been up to nearly a 1-s delay between a response and a subsequent reinforcer delivery. The possibility of uncontrolled delays between responses and programmed reinfo rcers introduces an additional source of variation. However, previous researchers have shown that delays between 0.5 s and 1.0 s can actually produce increases in re sponding relative to a no-delay baseline (Sizemore & Lattal, 1978; Lattal & Ziegler, 1982). On the other hand shorter cycle durations suffer from additional problems. One problem with short cycle durations is that behavior occurring in one cycle may be more likely affected by reinforcers delivered in a subsequent cycle for the nonoccurrence of behavior. Given that the 1-s cycle had been used by Hammond (1980) it was selected as a reasonable beginning point and future contingenc y research could test larger and smaller intervals. A parallel problem lies in the arbitrary defini tion of the nonoccurrence of behavior as any cycle in which behavior does not occur. Why should the nonoccu rrence of behavior be defined as a 1-s period in which behavior does not occur? If the nonoccurrence of be havior were defined as some period of time greater than 1-s in which behavior did not occur th en the interval unit is considerably different than the amount of time it takes the organism to make a response.

PAGE 70

Conversely, if the nonoccurrence of behavior were defined as some period of time less than 1 s in which behavior did not occur, then the aforementioned problem of short delays to reinforcement for previously emitted responses is pertinent. This discussion suggests that future work should investigate the interaction between different interval sizes and different definitions of the nonoccurrence of behavior on the effects of contingencies. A cycle duration of 1 s produced relatively orderly data in the current experiment; however, larger cycle durations may be more appropriate in cases where the duration of behavior exceeds 1 s. Alternatively, similar procedures may be evaluated in a trial-based format with explicit stimuli that signal the beginning and end of each trial and reinforcers that are presented immediately following the occurrence of behavior.

In addition to the implications for the acquisition and maintenance of behavior, the current conceptualization of reinforcement contingencies may also provide a means for applied behavior analysts to increase the overall rate of reinforcement for a given procedure without necessarily sacrificing effectiveness. Once appropriate behavior has been acquired, it might be possible to increase the probability of a reinforcer given no response by a small amount while maintaining similar rates of behavior. Increasing the rate of reinforcers provided following the nonoccurrence of a response, and thus the overall rate of reinforcement, might be beneficial for reasons other than its effects on the target response. Environments with few response-independent reinforcers may appear sparse and barren to untrained individuals. Alternatively, procedures with a higher likelihood of reinforcers for the nonoccurrence of behavior (and higher overall rates of reinforcement) might be judged as more acceptable by parents, caregivers, and other members of the community. Also, procedures with higher rates of reinforcement might be preferred by the individual receiving the treatment, and thus the individual might be less likely to avoid the treatment and the caregivers associated with its implementation.

Viewing contingencies as the relationship between the probability of a reinforcer given a response and the probability of a reinforcer given the nonoccurrence of a response may provide a bridge for understanding reinforcement in both experimental and non-experimental contexts. Descriptive analyses are procedures that attempt to produce descriptions of the relationship between behavior and environment in circumstances in which the environment is not under the control of the experimenter. Most research and understanding of reinforcement is based on reinforcement as implemented using traditional schedules (e.g., FR, VI). However, if reinforcers are presented both following behavior and following the nonoccurrence of behavior in some uncontrolled context (as has been shown by Samaha et al., in press), then interpreting the overall schedule of reinforcement as something like VR or VI can be complicated and problematic. Reliance on traditional schedule nomenclature has therefore restricted the understanding of reinforcement to contexts in which schedules can be controlled. A conceptualization of reinforcement based on the probability of a reinforcer given a response and the probability of a reinforcer given no response can therefore bridge the concept of reinforcement across both experimental and non-experimental (descriptive) contexts.
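One concrete way to express this bridge is to estimate the same two probabilities from any cycle-by-cycle record, whether the record comes from a controlled session or from a descriptive observation. The Python sketch below assumes the observation has already been divided into discrete cycles and coded as two parallel true/false sequences; the coding scheme and function name are illustrative, not the method reported by Samaha et al.

    def contingency_from_record(responded, reinforced):
        """Estimate P(Sr|R) and P(Sr|~R) from parallel per-cycle records.

        responded  -- booleans, True if the target response occurred at least
                      once in that cycle
        reinforced -- booleans, True if a reinforcer was delivered at the end
                      of that cycle
        Returns (P(Sr|R), P(Sr|~R)); a value is None when no cycles of that
        kind were observed.
        """
        r_cycles = sum(1 for r in responded if r)
        not_r_cycles = len(responded) - r_cycles
        sr_after_r = sum(1 for r, s in zip(responded, reinforced) if r and s)
        sr_after_not_r = sum(1 for r, s in zip(responded, reinforced)
                             if (not r) and s)
        p_given_r = sr_after_r / r_cycles if r_cycles else None
        p_given_not_r = sr_after_not_r / not_r_cycles if not_r_cycles else None
        return p_given_r, p_given_not_r

    # Tiny worked example: ten observation cycles.
    responded  = [True, False, True, True, False, False, True, False, True, False]
    reinforced = [True, False, False, True, False, True, False, False, False, False]
    print(contingency_from_record(responded, reinforced))   # -> (0.4, 0.2)

The resulting pair, here .40/.20, is a positive contingency in exactly the sense used throughout these experiments, so the quantity that was manipulated in the operant chamber can also be estimated from uncontrolled observational data.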


72 LIST OF REFERENCES Ayllon, T. & Michael, J. (1959). T he psychiat ric nurse as a beha vioral engineer. Journal of the Experimental An alysis of Behavior, 2, 323-334. Baer, D. M., Wolf, M. M., & Risley, T. R. (1968). Some current dimensions of applied behavior analysis. Journal of Applied Behavior Analysis, 1 91-97. Borrero, J. C. Vollmer, T. R., van Harren, F., Haworth, S. & Samaha, A. L. (in prep). A multiple schedule evaluation of behavior on fixed-time and fixed-interval schedules. Borrero, J. C., Vollmer, T. R., & Wright C. S. (2002). An evaluation of contingency strength and response suppression. Journal of Applied Behavior Analysis, 35 337-347. Burgio, L. D., Burgio, K. L., Engel, B. T., & Tice, L. M. (1986). Increasing distance and independence of ambulation in el derly nursing home residents. Journal of Applied Behavior Analysis, 19 357-366. Campbell, D. T. & Stanley, J. C. Experimental and quasi-experimental designs for research. In N. L. Gage (Ed.), Handbook of research on teaching. Chicago: Rand McNally, 1963. Catania, A. C. (1983). Learning, Second Edition. Englewood Cliffs, New Jersey: Prentice-Hall, Inc. Catania, A. C. (1988). Learning. Third Edition. Englewood Cliffs, New Jersey: PrenticeHall, Inc. Edwards, D. D., Peek, V., & Wolf, F. (1970) Independently delivered food decelerates fixed-ratio rates. Journal of the Experimental Analysis of Behavior, 14, 301-307. Ferster, C. B. & Skinner, B. F. (1957). Schedules of Reinforcement. New York: AppletonCentury-Crofts. Freeman, T. J. & Lattal, K. A. (1992). Stimulus Control of Behavior History. Journal of the Experimental Analysis of Behavior, 57, 5-15. Goh, H. L., Iwata, B. A., & DeLeon, I. G. (2000). Competition between noncontingent and contingent reinforcement schedules during response acquisition. Journal of Applied Behavior Analysis, 33, 195-205. Hammond, L. J. (1980). The effect of c ontingency upon the appetitive conditioning of free-operant behavior. Journal of the Experimental Analysis of Behavior, 34 297-304. Iwata, B. A., Dorsey, M. F., Slifer, K. J., Bauman, K. E., & Richman, G. S. (1982/1994). Toward a functional analysis of self-injury. Journal of Applied Behavior Analysis, 27 197-209.


Koegel, R. L., & Rincover, A. (1977). Research on the difference between generalization and maintenance in extra-therapy responding. Journal of Applied Behavior Analysis, 10, 1-12.
Lattal, K. A. (1972). Response-reinforcer independence and conventional extinction after fixed-interval and variable-interval schedules. Journal of the Experimental Analysis of Behavior, 18, 133-140.
Lattal, K. A. (1974). Combinations of response-reinforcer dependence and independence. Journal of the Experimental Analysis of Behavior, 22, 357-362.
Lattal, K. A., & Bryan, A. J. (1977). Effects of concurrent response-independent reinforcement on fixed-interval schedule performance. Journal of the Experimental Analysis of Behavior, 26, 495-504.
Lattal, K. A., & Maxey, G. C. (1971). Some effects of response-independent reinforcers in multiple schedules. Journal of the Experimental Analysis of Behavior, 16, 225-231.
Lattal, K. A., & Ziegler, D. R. (1982). Briefly delayed reinforcement: An interresponse time analysis. Journal of the Experimental Analysis of Behavior, 37, 407-416.
Marcus, B. A., & Vollmer, T. R. (1996). Combining noncontingent reinforcement and differential reinforcement schedules as treatment for aberrant behavior. Journal of Applied Behavior Analysis, 29, 43-51.
McGonigle, J. J., Rojahn, J., Dixon, J., & Strain, P. S. (1987). Multiple treatment interference in the alternating treatments design as a function of the intercomponent interval length. Journal of Applied Behavior Analysis, 20, 171-178.
Rescorla, R. A. (1967). Pavlovian conditioning and its proper control procedures. Psychological Review, 74, 71-80.
Rescorla, R. A., & Skucy, J. C. (1969). Effects of response-independent reinforcers during extinction. Journal of Comparative and Physiological Psychology, 67, 381-389.
Reynolds, G. S. (1968). A primer of operant conditioning. Glenview, Illinois: Scott, Foresman.
Samaha, A. L., Vollmer, T. R., Borrero, J. C., Sloman, K. N., St. Peter Pipken, C., & Bourret, J. (in press). Journal of Applied Behavior Analysis.
Sizemore, O. J., & Lattal, K. A. (1978). Unsignaled delay of reinforcement in variable-interval schedules. Journal of the Experimental Analysis of Behavior, 30, 169-175.
Thompson, R. H., & Iwata, B. A. (2001). A descriptive analysis of social consequences following problem behavior. Journal of Applied Behavior Analysis, 34, 169-178.


Thompson, R. H., Iwata, B. A., Hanley, G. P., Dozier, C. L., & Samaha, A. L. (2003). The effects of extinction, noncontingent reinforcement, and differential reinforcement of other behavior as control procedures. Journal of Applied Behavior Analysis, 36, 221-238.
Vollmer, T. R., Borrero, J. C., Borrero, C. S., Van Camp, C., & Lalli, J. S. (2001). Identifying possible contingencies during descriptive analyses of severe behavior disorders. Journal of Applied Behavior Analysis, 34, 269-287.
Zeiler, M. D. (1968). Fixed and variable schedules of response-independent reinforcement. Journal of the Experimental Analysis of Behavior, 11, 404-414.


BIOGRAPHICAL SKETCH

I completed my undergraduate degree in 2001 at the University of Florida, where I majored in psychology. I began the doctoral program in behavior analysis in the same department that summer. My master's thesis examined a method for describing interactions between children and their caregivers. My primary interests are the assessment and treatment of severe problem behavior and the use of animal and human operant preparations to address issues related to that assessment and treatment. I also hope to expand the range of problems to which behavior analysis can be suitably applied over the course of my career.