William S. Verplanck
This glossary was made possible by a grant from the American Philosophical Society to the author for foreign travel and by the extended hospitality of N. Tinbergen and of A. C. Hardy of the Department of Zoology, Oxford University. The author wishes to express his gratitude to the society for its support and to these men for their assistance. He also wishes to thank the many ethologists whom he had the opportunity of talking to, of hearing, and of watching at work. Thankful acknowledgment is also made for the critical reading of early drafts by Frank A. Beach, Robert A. Hinde, Martin Moynihan, and W. H. Thorpe.
In talking with students of behavior on both sides of the Atlantic, I often found myself faced with certain vocabulary problems, even when using English. For a common term, there was not always a common referent; for a common referent, there was not always a common term. Even when both term and referent seemed identical, there were often misleading shades of meaning. Much of the work published by ethologists, in fact, is not readily intelligible to “rat” behaviorists, and many publications by the latter that could be of interest (not to say importance) to ethologists might as well have been written in an obscure Sanskrit dialect. Perhaps it would be better if they had been, since then a need for translation would be apparent. Not only are some writings unreadable, but still others that can be “read” are read erroneously, and both psychological and ethological concepts are sometimes referred to inappropriate observational and theoretical contexts.
Even when the referents of terms agree, the words employed to label them sometimes are misinterpreted because of their connotations in the vernacular or in other scientific contexts. Some scientists react with negativism, if not hostility, to a concept because a particular term is used to designate it; but they may accept without misgivings concepts that are essentially repugnant to them, solely because someone made a happy choice of words to name them. By designating a member of one class of stimuli as a Gestalt, for example, one may alienate the very people who need and use such a concept (under other labels) and may achieve its acceptance by those to whom the objective approach to behavior is a source of moral indignation. Even observations may be questioned or rejected, not on empirical grounds, but simply because a particular word was chosen to label the observation. A term such as conditioning is not used by some because it implies to them that the Pavlovian model of brain physiology is valid. Along with the term, they ignore or deprecate a whole area of research. Imprinting is rejected by others because “there is no perceptual learning” or because someone wonders what gets imprinted where; obtaining an unpalatable answer, they assume that there is no class of events to which the term can be applied. All this is perhaps unfortunate when one recalls that these problems arise in a scientific context. But scientists are human and behave according to laws that do not always work toward the most rapid scientific progress.
Glossaries that have been prepared are not always as useful as they might be. Most American behaviorists, perhaps oversensitive to the demands of the logical analysts, are not only vocabulary-conscious (they choose words carefully) but also definition-conscious (they like to keep track of what they are talking about). Some of the terms they use are defined ostensively: they point. Others are defined by more elaborate operations. Still others are theoretically defined. Data-language terms (i.e., terms used in describing what happens) are distinguished from theoretical terms and from names for processes (i.e., behavior changes that are functions of time). For this reason, these men become unhappy when they encounter sets of terms including members of all these groups placed together and endowed with literary definitions.
By “literary” definitions, I mean definitions stated in words occurring in common speech for which no clear and unequivocal referent in behavior, and for which no coordinating definition or operation that might provide such a referent, is or can be stated. What, for example, does “fuller organization of sense data” mean, and what is “a complex of states”? When should one talk about “sensory integration” or the “apprehension of relations”? Presumably somebody “knows”; I don’t.
Most sets of definitions, whether carefully assembled or implicitly available to anyone willing to search the indexes of systematic works, give such difficulties. There are “empirical” terms that have no referent or whose referents are so stated that one does not quite know whether, in any one case, to stick an electrode or a qualification into them (center). There are terms that represent laboratory slang apotheosized into scientific terminology (e.g., superstition) and terms that represent the reification of processes (e.g., mimesis). Some are defined adequately, even elegantly in the literary sense; but they turn out to be meaningless, self-contradictory, or contradictory to other definitions in the same system when the definitions of the words used in them are examined. Other terms have been borrowed from physiology to label a phenomenon solely by rough analogy. Still others must be classified as intuitive; if one does not know what the term “means” in the first place, he will never learn. These lack empirical objective referents no matter how careful the attempts to define, to redefine, to determine usage.1 What they mean depends on their connotations for each person who uses them. Yet others are theoretical and incorporate empirically defined terms. Some terms are used in two ways: empirically, as a label for a class of behavior, and theoretically, as the name of a concept relating that behavior to other classes of events. Displacement activity is a theoretical concept, for which Tinbergen (29) has given a definition that is one of the clearest and most precise theoretical definitions that can be found. But in use, it is also an empirical term, and its application as such is independent of Tinbergen’s definition. That one term has two such uses may be misleading, but not seriously so if the user recognizes the two. Otherwise, mere labels become explanatory concepts. It is no wonder that difficulties arise.
Yet words defined in so many and such arbitrary ways are often used consistently and intelligibly. This suggests that there is something wrong with the definitions that have appeared. Without specifying exactly what is wrong, beyond their sheer incapacity to communicate information about behavior or to tell one how and when to use a word, one can propose alternative definitions based on the fact that the terms are used with some success. If usage is watched, that is, if one watches and listens to people using terms and observes the events to which the terms are applied, it becomes clear that many such terms have clear-cut referents and are hence capable of objective (i.e., empirical) definition: they are labels. This would be surprising if it were not true that the empirical observations with which the vocabulary has to do, and which it purports to describe and explain, are made on the same subject matter: the behavior of living animals, scientifically observed.
The philosophical analysts, or logical positivists, and other methodologists have proposed methods for the elaboration and clarification of vocabularies that have helped make communication more precise. Three classes of terms, for example, can be isolated. The first class falls into what we shall call the data-language, the terms and syntax used in describing observations. If words are to be used in the data-language, they must be defined so that anyone, after a minimum of training, can use them consistently. The terms must refer to objective events so that statements embodying them can be verified directly by other people. Terms that do not lead to such agreed-upon statements fall outside of this class. Theoretical statements cannot appear in this language, but only statements such as: “The rat turned left and ran into the goal-box,” “The man in the brown coat is eating a hamburger,” or “Three chaffinches are mobbing an owl.” The words that appear in the data-language are typically defined by simple (and perhaps repeated) empirical designation, that is, by pointing.
A second class of terms is what I shall here call “empirically” or “symptomatically” defined terms. They are labels applied to classes of statements in the data-language; their application depends upon the verification of one or more statements in the data-language. The term extinction, for example, can be applied if and only if certain observations are made. We must state the operations which lead to the observation, and we must state an outcome of the operations as well. Agreement among observers that such and such an observation has been made is implicitly required before the word can be applied. Theoretical statements are not necessarily made when members of this class of terms are employed in a statement, although, as we shall see, theoretical statements necessarily employ them.
“Operational” definitions include all that fall into these first two classes. They demand agreement, and they make it possible for anyone who is able to read to reconstruct the observations to which the terms apply. It does not follow that all terms that are operationally defined will be equally useful to the behavior scientist; many such empirically defined concepts may prove to be of limited value and interest. The scientist can take them or leave them. The operational definition of a term may be modified at a later date. Two operationally defined concepts may prove to be equivalent, and one term will then cover both. Such definitions state the observational conditions under which an observer may apply a particular term. They are, as it were, a description of the discriminative stimuli, releasers, or sign stimuli for a piece of verbal behavior.
The third class of terms includes all those defined by the exclusive use of members of the first two. Such terms are theoretical in that they may propose relationships among members of the earlier classes and in that they may introduce new concepts that hypothesize classes of events, states, structures, or mechanisms that have not yet been directly observed in connection with the behavioral events to which they are theoretically related. Statements incorporating theoretical terms are subject to verification, either through experimental test of their consequences or by direct observation. Thus, theoretical terms may graduate, as it were, into the class of empirical terms.
Heuristically, a set of definitions stated with these principles in mind may serve some useful purposes. It should permit the behaviorist to learn what ethology is about in terms of behavior and at the same time make the behaviorist literature intelligible to ethologists. It should make it possible to translate statements using terms familiar to one group into statements using terms familiar to the other.
It may do more than provide for direct translation. When each concept is examined in its experimental and theoretical context, when the causal statements summarizing and defining it are analyzed following the criteria which we have stated, other possibilities arise. Analysis enables one to distinguish between observational and theoretical levels of discourse and so may indicate areas in which empirical data are not yet adequate to provide the basis for theoretical advance; hence, it may pinpoint areas where research is badly needed. Certain concepts, often treated as purely speculative, may prove to have sound empirical bases. Analysis may show that some “empirical” terms have no referents; the definition of a term in the data-language may include a statement about events that cannot be observed: there are no unicorns, and perhaps there is no sensory preconditioning either. Other terms, used as though they were empirical and frequently referred to as “things” that belong to the second set of terms, may turn out to refer, not to empirical goats, but to theoretical sheep. The status of various other theoretical concepts may also be clarified. Some may prove to be “intuitive,” that is to say, undefinable without the introduction of non-empirical, nonobjective sets of referents, and hence to have no place in scientific statement.2
Finally, two apparently different terms may prove to have the same, or almost the same, empirical content so that the number of theoretical entities (and hence theoretical problems) can be reduced, the body of experimental data related to the survivor can be amplified, and new experimental hypotheses can be generated. One can state with some precision the theoretical relations between Lorenz’ action-specific energy and Skinner’s reflex reserve, and between Tinbergen’s motivation and Hull’s sEr. What is more important, one can state the relationship of both to experimental observations of behavior.
Attempting to follow the principles and the objectives that have been stated, I have prepared this glossary for the use of those who work with behavior as an objective datum. In defining terms, I have tried to follow usage in stating definitions, in preference to following or restating literary definitions however well put. It follows that the definitions will not always correspond to those which have appeared in other publications and, hence, that not everyone will agree with them. I have tried to avoid arbitrary redefinition and to restate concepts, in so far as possible, on the basis of usage in the literature, in the laboratory, and in field observations. Occasionally, and especially with theoretical terms, a usage definition had to be arrived at inductively on the basis of numerous instances of the appearance of the word.
The major rules followed in preparing the glossary are given at the end of the Preface.
The glossary is not an exhaustive one— it cannot pretend to be. I have tried to include as many useful terms as possible, but some important ones have doubtless been omitted, just as some trivial ones have crept in. It arrives at a thoroughly Skinnerian and ethological point of view toward behavior. (The Hullian terms, fitted into place, will mislead no one; their usefulness rests upon the redefinition of the response concept.) But this, I submit, is where any such attempt at clarifying psychological terms will land us. We always have to go back to pieces of behavior that we can identify, and we have to build from them.
I am the first to recognize that writing this glossary is a rash enterprise and that I have been presumptuous indeed. Someone, I suspect, has to be. Like all glossaries (back to Dr. Johnson’s dictionary), it is a personal one and admittedly exhibits certain biases, not all of which I am probably aware of. Certainly, not everyone will be happy with many or all of the definitions given. This is so if only because not all behavior theorists and not all ethologists agree on the exact usage of several terms. I have tried to follow what seemed to be the most common usage, even when it did not conform with the usage intended by the men who first introduced the term into the vocabulary of behavior. No doubt, some of the definitions are based on misunderstandings; but, if this is so, others too will have misunderstood. I hope that those who read and use these definitions will offer emendations or alternative correct definitions so that my errors will not (or cannot) be repeated by me or by others who have been led into the same misunderstanding.
The glossary, then, is intended to provide an empirical vocabulary that can be used by anyone in the science of “human” or “animal” behavior. It may serve to familiarize its readers with some of the very great recent developments in the study of animal behavior and to suggest their relevance to human behavior. Perhaps it will clarify the status of some of the concepts that have been used in both areas in the past. To some it may suggest, as it has to me, new lines of research on humans. To others it may recommend itself as a supplementary text.
But if this glossary provokes thought, discussion, and argument, if it increases mutual intelligibility between ethologists and other behavior students even a little, it will have served its purpose.
William S. Verplanck
Major Rules Followed in Preparing the Glossary
1. Terms having empirical, symptomatic status are defined in the data-language or in other empirical terms and are designated emp. Such definitions are given to some terms that seem to warrant it but that have not previously been endowed with empirical status.
2. Many terms having sound theoretical status are defined in terms of the empirical concepts on which they are based. For others, such definitions would be overlong, and they are explicated elsewhere (e.g., in Hull’s Principles of Behavior); their content is therefore stated in general terms, and their full definitions are omitted here. In both cases, their definitions are designated th. Terms that pretend to theoretical status, but that are not reducible to empirical terms, share with their intuitive, literary, and conversational fellows the label con.
3. Where terms have several usages, each is defined separately.
4. This glossary omits all but a few data-language terms, including the names of specific responses that have been studied extensively. Omitted from the psychological vocabulary are such self-explanatory terms as “turning right,” “pressing a bar,” “eating a pellet,” “jumping to the left,” and “pecking”; and from the ethological vocabulary, such terms as “mobbing,” “preening,” “egg rolling,” “brooding,” “forward threat posture,” “greeting ceremony,” and “long call,” to name but a few. These are all ostensively defined. The terms retained define apparatus that may be unfamiliar to ethologists (e.g., Lashley jumping-stand).
5. Laboratory slang is designated lab slang.
6. The meaning of some terms has evolved considerably; in some instances, words that have been widely used are dropping out now. I have tried to indicate such terms and to state the current usage. (See, for example, innate and its cross references.)
7. To save space, and to obviate the coining of new terms, the definitions are stated in the broader, more widely used vocabulary of behavior theory.
8. When the usages of ethologists and behavior theorists (see glossary) differ, each is labeled appropriately (eth and bt). Similarly, terms used by only one of the two groups are designated appropriately.
9. Although many of the words have been borrowed from physiology, only a few of their definitions in physiology are given.3 Physiological definitions (designated physiol) appear only when it seemed interesting to compare the two usages and when the comparison might suggest the experimental and theoretical steps to be taken before the behavioral and physiological definitions of the same word can state the same concept.
10. Whenever it seemed desirable, brief discussion of the concept and its definition, and examples of usage, are appended.
11. Insofar as possible, all behavioral terms appearing in a definition are themselves defined in the glossary. The reader, I hope, will find occasion to check this statement.
12. No attempt has been made to incorporate the whole system of terms that various theorists have developed. For these, the reader is referred to the works themselves, which appear in the list of references.
A GLOSSARY OF SOME TERMS USED IN THE
OBJECTIVE SCIENCE OF BEHAVIOR
William S. Verplanck
acquisition = 1. (emp, bt)4 progressive increments in response-strength observed over a series of occasions on which the response is measured. = 2. (emp) any modification of behavior in which a response changes in strength or topography, or occurs in new environments.
action-specific energy, specific action energy = (th, eth) a hypothetical construct inferred from changes in stimulus threshold, intensity, and rate of occurrence of an unlearned response with time and with frequency of occurrence of the response. Its quantity becomes very low when the response is made; it then recovers with time in the absence of response (26). [This concept is almost identical with Skinner’s respondent reflex reserve (22). See reflex reserve and exhaustion.]
adaptation = (emp) habituation. [The use of this synonym for habituation should be avoided because of the inevitable confusion with sensory adaptation and with the many other usages it enjoys.]
adaptation/sensory = (emp) change, incremental or decremental, in response or response-strength that has been experimentally demonstrated to depend solely upon changes in the state of a receptor organ produced by protracted or repetitive stimulation of, or by recovery from, such stimulation of that organ.
after-discharge = 1. (emp, bt) that part of a response that occurs after the termination of the stimulus that elicited it or set the occasion for its occurrence. [Uncommon usage.] = 2. (emp, physiol) the term applied to discharges of nerve impulses in efferent neurons that persist in time after the stimulus that set up the discharge of which they are a part is terminated. [This observation forms part of the empirical basis of the concept of the synapse.]
aggressive, see behavior/aggressive.
agonistic, see behavior/agonistic.
allochthonous, see behavior/allochthonous.
anxiety = (th, bt) a secondary drive. Its establishing operation is the development of a discriminated avoidance conditioned response. (See conditioning/avoidance and discriminated.) The dependent symptom may be either that the stimulus of this CR now acts as a negative reinforcing stimulus for other responses or that the presence of this stimulus in the environment of an animal depresses the rate of occurrence of behavior usually shown in that environment, thus producing behavior unusual in that environment (such as, for the rat, defecation, urination, huddling, vocalizing, flight, excessive general activity, etc., or some combination of these). [This concept has been the subject of a very great amount of experimental work in experimental psychology in recent years, and it is playing an increasingly large role in the field of motivation. Some theorists now tend to consider all drives as instances of anxiety. Its relationship to the concept of anxiety as it appears in clinical circles is the subject of no little theoretical interest, and controversy. For example, when the onset of a buzzer is paired with electrical shock a number of times, a rat will learn very readily to press a bar if the bar-press shuts off the buzzer. This learning is said by drive-reduction theorists to depend upon reduction of the anxiety drive.]
apparatus, see Lashley jumping-stand, runway, Skinner-box, T-maze, and T-maze/multiple.
appeasement, see behavior/appeasement.
appetitive, see behavior/appetitive.
approach-approach, see conflict/approach-approach.
approach-avoidance, see conflict/approach-avoidance.
approach/gradient of = (emp, bt) the goal-gradient (1.), usually as measured in the runway.
attack, see behavior/attack.
attend = (emp, bt) to give any response whatsoever to a stimulus. [This term defines what is probably the broadest possible class of behavior.]
attention = (con) a reification, as faculty or process, of attending.
autochthonous, see behavior/autochthonous.
aversive, see stimulus/aversive.
avoidance-avoidance, see conflict/avoidance-avoidance.
avoidance conditioning, see conditioning/avoidance.
avoidance/gradient of = (emp, bt) when an animal has been repeatedly presented with a strongly aversive stimulus (as an intense electric shock) in the goal box of a runway, and measures of the strength of running from or pulling away from the goal box are made by introducing the animal into a series of positions along the runway, it is found that the strength of the behavior is inversely proportional to the distance from the goal box. This function is called the gradient of avoidance. [The runway is often used in the experimental analysis of behavior in approach-avoidance conflicts. The animal, after learning to run to a goal box for food, is shocked there. His behavior on subsequent occasions can then be demonstrated as a function of the sum of a gradient of approach and a gradient of avoidance.]
behavior, behaviour = 1. (emp) the whole complex of observable, recordable, or measurable activities of a living animal, such as movements of the skeletal, cardiac, or smooth muscles, production of sound, discharge of electric organs, movements of cilia, contraction or expansion of chromatophores, nettling, glandular secretion, and changes in body chemistry (including those producing luminescence) as they are concerned in the animal’s commerce with its environment. [Loosely, behavior is anything an animal does. The intuitive concepts of mental activity and consciousness are not behavior, although they may be defined in terms of it. Behavior is analyzed into responses.] = 2. (emp) any parts of behavior (1.) that are recurrently identifiable and classifiable by the observer. [Cf. response. A class of behavior need not show the types of orderly quantitative variations that enable us to analyze it still further into responses.]
behavior/acquired = (emp, bt) behavior that has been experimentally demonstrated, in either its topography or stimulus control or both, to be dependent in part upon the operation of variables encountered in conditioning and learning, such as the occurrence of reinforcing stimuli. [If species-specific behavior, as a result of conditioning procedures, occurs systematically as a function of the presence of previously ineffective stimuli, it must then be considered acquired with respect to those stimuli. The terms, then, are not mutually exclusive in application. Ant. unlearned or inborn behavior.]
behavior/aggressive = (emp) a broad class of behavior that includes both threat and attack behavior. [Once identified as such, responses occurring in aggressive behavior can be identified in other contexts, so that it can be stated that aggressive behavior may occur toward inanimate objects. Aggressive behavior is often exhibited during the extinction of conditioned responses: rats bite bars during extinction; pigeons threaten keys.]
behavior/agonistic = (emp, eth) a broad class of behaviors that includes all attack, threat, appeasement, and flight behavior.
behavior/allochthonous = 1. (th, eth) behavior that is not activated by its own drive. = 2. (th, eth) behavior that is activated as a consequence of the frustration of behavior activated by some drive other than that which most often controls it. [Syn. displacement activity; Ant. autochthonous behavior.]
behavior/appeasement = (emp, eth) those behaviors of an animal (exclusive of flight) that, when they occur, terminate attack on the animal by another animal of the same species. [E.g., when wolves fight and one gains, as it were, the upper fang, the losing animal will throw up its head and turn the ventral surface of its neck toward the jaws of the winner. The winner will then cease the attack, and the loser will retreat from the field of battle.]
behavior/appetitive = (th, eth) a term applied to characterize, in terms of an inferred or anticipated (by the ethologist) consummatory act, the behavior of an animal that is not at rest or “doing nothing.” [It is described as plastic, etc. Appetitive behavior, by definition, cannot be identified as such until observations not involving it have been made. It is further presumed to be characterized by a motor pattern, an orientation (taxis) component, and sets of stimuli to which that animal is particularly responsive. Many behaviorists would identify almost all the behavior that is classified as appetitive by ethologists as “operant behavior occurring in an environment devoid of experimental control.” Further, they would lean on theory to note that, after given drive operations had occurred experimentally or naturally, all the operants conditioned under that drive should increase greatly in strength. This interpretation would not make the activities any the less interesting to them. In fact, they spend much of their time studying them.]
behavior/attack = (emp, eth) a broad response class including those behaviors of an animal that, when carried to completion, bring to bear one or more of the animal’s effectors on the body surface of a second animal in such a way that injury and possibly death of the second animal will occur if the behavior continues. [In a given species, more precise specification is possible in terms of biting, clawing, hitting with the wings, and the like.]
behavior/autochthonous = (th, eth) behavior that is activated by its own drive. [See behavior/allochthonous, drive (3.), and instinct (2.).]
behavior chain = (emp) a sequence of stimuli and responses that can be observed repeatedly in an animal, with only minor variations in the ordinal position of each stimulus and response. [In many cases of chaining, it is not possible to identify all the stimuli of the chain, but only the responses as they occur in order. In others, a response may move the animal in such a way that he is confronted with a new set of stimuli that release or elicit the next response, and so on. A theoretical distinction can be drawn between “heterogeneous” chains, some of the stimuli of which are environmental and others movement-produced (see stimuli/movement-produced), and “homogeneous” chains, all stimulus members of which are either environmental or movement-produced, e.g. (classical), the hunting and egg-laying behavior of the female solitary wasp, the running of a rat through a complex maze which he has run through many times before.]
behavior/emotional = (emp, bt) an arbitrary class of responses that is defined differently for different species and that is based on the covariation in response-strength of the behaviors as a function of certain often ill-defined independent variables. [Despite the rather fuzzy origins of this class, once a particular response has been placed in the class, the term emotional can be consistently applied to it; and a concept that is initially theoretical seems to acquire some empirical status. Emotional behaviors in the rat include urination, defecation, freezing, vocalizing, and trembling, when two or more of these occur together. In the pigeon they include cooing and wing-beating.]
behavior/escape = 1. (emp, eth, bt) rapid locomotion of an animal from any given location that occurs after specific stimuli are presented to it in that place. [The stimuli effective in producing escape behavior are almost identical with those identified as negative reinforcing stimuli or aversive stimuli in conditioning situations.] = 2. (emp, eth) behavior/flight (2.).
behavior/flight = 1. (emp, eth) flying of birds and bats. =2. (emp, eth, bt) rapid locomotion of an animal from proximity to another animal that is exhibiting threat or attack behavior.
behavior/imitative = (emp) an animal is placed in the environment of another animal. If the first animal then makes some response that is not species-specific, and thereafter the second animal makes the same response under conditions when it can be demonstrated that the response of the first animal is the stimulus for the response of the second, then the second animal is said to imitate the first and the behavior is called imitative. [Imitative behavior is readily producible in the primate after discrimination and differentiation training has been given. Cf. behavior/mimetic.]
behavior/inborn, see behavior/unlearned and behavior/species-specific.
behavior/innate, see innate, behavior, unlearned, and behavior/species-specific.
behavior/instinctive, see instinct (1., 2.), behavior/unlearned, and behavior/species-specific.
behavior/mimetic = (emp, eth) an animal is placed in the environment of another animal of the same species: if the second animal then makes some species-specific response, and thereafter the first animal makes the same response under conditions where it can be demonstrated that the response of the second animal is the stimulus for that of the first and that no opportunity for discrimination and differentiation training has been given to either animal, then the behavior is termed mimetic. [This term is to be distinguished from imitative behavior, which does not refer to species-specific behavior. (Cf. behavior/imitative.) Reified, we have “mimesis.” This is a good example of a not very useful empirical concept. Hinde (11) has shown that, although a few examples of mimetic behavior can be found, little is gained by classifying them together.]
behavior/operant = (emp, bt) the totality of operant responses in the behavior repertoire of the animal.
behavior pattern = (emp) a set of responses statistically organized in time, that is, associated together and manifesting some degree of stereotypy in the temporal sequence in which they occur. [The term behavior pattern is applied whenever one has analyzed behavior into relatively large units, that is, units composed of a number of responses. Behavior chains, instincts (1., 2.), and the activities that are the empirical bases of particular drives are all labeled behavior patterns, which renders the term of very limited usefulness for theory.]
behavior/purposive = (con) behavior when considered with respect to its goals. [The usage of goal varies sufficiently that this term finds very limited use except in conversation or in literary statement. To those who wish to investigate variations in the stimulus control of verbal behavior, one thing is clear: few people respond to the same behavioral events consistently with one another by calling them purposive behavior. For the behavior scientist, purposive behavior is a null class or, alternatively, a class that includes all behavior. In either case, purposive behavior is not a very useful concept, for it is neither empirical fish nor theoretical fowl.]
behavior repertoire = (emp, bt) the set of behaviors characteristic either of an entire species or of a single member of a species.
behavior/respondent = (emp, bt) the totality of respondents in the behavior repertoire of an animal.
behavior/ritualized, see response/ritualized.
behavior/species-specific = 1. (emp) those behaviors shown by a great majority of members of a species in the same or highly similar environments and under the same or highly similar conditions. [Such species-characteristic behaviors can sometimes be employed in taxonomy to assist in the classification of animals.] =2. (emp, eth) those behaviors, complex and relatively stereotyped, that appear in most members of a species under set and statable conditions without evident prior opportunity to learn. [These behaviors are often presumed to be unmodifiable by training.]
behavior/spontaneous = (emp) behavior that occurs in the ostensible absence of any stimuli that can be shown to elicit or release it, or to set the occasion for its occurrence. [To call behavior spontaneous is to indicate current ignorance of the events controlling it. It does not imply capriciousness. This term would be synonymous with operant behavior, except that operants that have come under the control of discriminative stimuli are no longer spontaneous.]
behavior/symbolic = 1. (emp, bt) verbal behavior. =2. (th, bt) a hypothesized class of behavior that cannot necessarily be directly observed. =3. (th, bt) the observed behavior from which symbolic behavior (2.) is inferred. [Loosely, all behavior that “must” be accounted for on some such grounds is also termed symbolic. If some behavior is observed for which the controlling stimuli are not present in the environment at the time of the response (although they may have been present on previous occasions on which the behavior occurred), then it is presumed to be under the control of symbolic behavior that acts as surrogate for the absent stimuli. Stimuli present in the environment are presumed to elicit some mediating behavioral (e.g., postural) or sensory event, a pure stimulus act (15) or fractional anticipatory goal response, which in turn serves as the stimulus for the behavior that is actually observed.] =4. (con) loosely and colloquially, thinking.
behavior/threat = (emp, eth) those sets of behaviors of an animal that have been shown to produce flight behavior at some strength in another animal (usually of the same species) when they occur in its presence. [Threat behavior most often elicits other agonistic behavior, e.g., threat, attack, flight, or appeasement, in the other animal.]
behavior/unlearned = 1. (emp) species-specific behavior, the necessary and sufficient antecedents of which are unknown to, and often of little interest to, the investigator. = 2. (emp, eth) behavior that has been experimentally demonstrated, in both its stimulus control and topography, to be independent of and unmodified by the operation of variables encountered in conditioning and learning, such as the occurrence of reinforcing stimuli. [Syn. behavior/instinctive, inborn, innate (obsolete); Ant. behavior/acquired. The empirical referents of this definition are in dispute.]
behavior/verbal = 1. (emp, bt) behavior involving the vocalization or writing of words, or response to spoken or written words. =2. (emp, bt) behavior whose reinforcement is contingent upon stimulation of and response by another individual. [By this definition, verbal behavior is not limited to behavior involving words: gesture and other forms of communication are included.]
behaviorist = 1. (emp) a scientist who investigates the behavior of animals objectively and who attempts to relate his observations together in a theoretical system that does not include concepts borrowed from introspection and mental philosophy. =2. (emp) more specifically, a psychologist who studies learning and related phenomena as a behaviorist (1.). [His theories are often built on the behavior of R. r. norvegicus albinus, the domesticated rat. See ethologist.]
block = (th, eth) a hypothetical state of the pathways between two centers of an instinct. [This state may be terminated or reduced by the action of an innate releasing mechanism that has been activated by a sign stimulus. After such nullification of the state, “motivational impulses” can flow from the higher center to the lower, activating the latter and hence yielding a response. There is no direct physiological evidence for such a state.]
center = 1. (emp, physiol) a locus in the nervous system characterized by the presence of a number of cell bodies and synapses, the excitation of which by appropriately specified electrical stimulation may yield discrete motor or autonomic behavior patterns and experimental destruction of which is followed by the disappearance or gross modification of the same, or similar, discrete motor or autonomic behavior patterns. =2. (th, physiol) a functionally coordinated, but not necessarily localized, group of neural structures having the properties of center (1.). =3. (th, eth) a hypothetical neural structure, or place or set of places in the central nervous system, of unspecified anatomical properties, presumed to act as a unit upon excitation by another such place or other such places by sending nerve impulses that govern the occurrence of some innate response. [A center is the theoretical neural correlate of a species-specific response. Theory endows this with all the properties of the physiologist’s centers. It is not impossible that these centers may become, eventually, empirical concepts, with specified anatomical loci and properties.] Ger. Erbkoördinationen.
chaining, see behavior chain.
choice point = (emp, bt) that position in a T-maze, or other maze, or on a discrimination apparatus, from which it is possible for the animal to give only one of two or more alternative responses. [E.g., to move down only one of two or more runways, or to jump to one of two or more doors. In the T-maze, this is of course the point at which the base of the T touches the cross-arm.]
cognition = (con, bt) a hypothetical stimulus-stimulus association or perceptual organization postulated to account for expectancies. [See expect. It is not as yet possible to define this term in other than intuitive terminology, except for trivial cases. Cognitive maps are elaborations of such cognitions.]
concept = (emp, bt) any response, verbal or motor, that is under the discriminative control of a broad class of environmental objects or events; the members of the class may differ from one another in all respects other than a single quantifiable property. [Most concepts are statements that refer to the common property: “blue,” “square,” “velocity,” “beauty,” “length.” Pseudo-concepts may depend on a number of partially overlapping classes of events that do not share an objective common property: “honesty,” “virtue,” “rigidity.”]
conditioned, see response/conditioned, stimulus/conditioned, stimulus/reinforcing, stimulus/unconditioned, and suppression/conditioned.
conditioning = 1. (emp, bt) the generic name for the empirical concepts defined procedurally below. [In laboratory usage, the response brought under experimental control through the operations of conditioning is termed a conditioned response. Lab slang, unfortunately, extends the usage, so that an animal is called conditioned when one of its responses has been conditioned, and a stimulus may be spoken of as “becoming conditioned” when a response is conditioned to it. These usages are, strictly speaking, incorrect and may lead to misunderstandings on the part of the student. How many kinds of conditioning? In the strictly empirical, operational sense, there are as many kinds of conditioning as there are sets of conditioning operations. The present glossary lists the more important. In terms of the equivalences that can be empirically and quantitatively established, the list becomes shortened, perhaps to two, which seem to exhibit persistent differences: “classical” and “operant.” Drive-reduction theorists argue that both kinds of conditioning inevitably occur whichever procedure one follows and that the only distinction that can be made is in terms of which one of several responses is observed. Their statement rests on theoretical grounds. Others prefer to admit that conditioned responses of both types can be observed in the same animal at the same time as a function of the same reinforcing stimulus, and let it go at that.] =2. (th, eth) the hypothetical production of a change in the nervous system presumed to occur with and to underlie conditioning (1.).
conditioning/approximation = (emp, bt) a special case of operant conditioning. If it is desired to condition a highly improbable (infrequent) operant response (one that will not occur and, hence, cannot be reinforced, except rarely), the experimenter may shorten the time before the subject emits the response by reinforcing other responses that are successively more like the response to be conditioned. [The timing of these reinforcements is critical: they must follow the response reinforced within a second’s time, or the procedure will be ineffective. Thus, if it is desired to train a dog to jump up against a wall, the experimenter may first reinforce a head turn toward the wall, then a body turn, then one or more steps toward the wall, and so on. This technique makes it possible to train animals (and people) to give unusual performances in short order. Many parlor games (e.g., Twenty Questions) are based on it. Syn. shaping.]
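The successive-approximation procedure described above can be restated as a toy simulation. Everything in this sketch is an illustrative assumption rather than part of the definition: each “response” is reduced to a number in [0, 1], the target is the response to be conditioned, and any response within the current criterion of the target is counted as reinforced, after which the criterion is tightened.

```python
import random

# A toy sketch of approximation conditioning (shaping), under assumed
# simplifications: responses are numbers in [0, 1]; a response closer to
# the target than the current criterion is "reinforced"; after each
# reinforcement the criterion tightens, demanding a closer approximation.
# All names and numeric values here are illustrative, not from the glossary.

def shape(target: float, trials: int = 1000, seed: int = 0) -> float:
    rng = random.Random(seed)
    criterion = 1.0          # initially, almost any response earns reinforcement
    last_reinforced = None
    for _ in range(trials):
        response = rng.uniform(0.0, 1.0)   # a freely emitted response
        if abs(response - target) <= criterion:
            last_reinforced = response     # reinforce this approximation
            criterion *= 0.8               # then require a closer one
    return last_reinforced

final = shape(0.5)
```

Over many trials the reinforced responses close in on the target, in the spirit of the dog example above: first a head turn, then a body turn, then steps toward the wall.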
conditioning/avoidance = (emp, bt) the experimental procedure in which the occurrence of a response prevents the administration of a negatively reinforcing stimulus which would otherwise occur shortly after the onset of the conditioned stimulus. Conditioning is said to occur if, and only if, the response then increases in rate of occurrence, in magnitude, or in relative frequency, or decreases in latency with successive presentations of the CS as a function of this operation. [On theoretical grounds, it is often argued that avoidance conditioning is a special case of operant conditioning under intermittent reinforcement.]
conditioning/classical = (emp, bt) the experimental procedure of repeatedly presenting the animal with a stimulus (unconditioned stimulus or US) of some reflex contiguously or almost contiguously in time with a neutral stimulus (conditioned stimulus or CS). Conditioning is said to have occurred if, and only if, a response (conditioned response or CR) similar to, but not necessarily identical with, that of the reflex can later be elicited by the initially indifferent stimulus (CS). [The differences are primarily quantitative: the CR has a longer latency, is usually of greater duration, and, in some cases, of much smaller amplitude. In the case of certain responses, some components of a topographically complex response disappear completely (assume a value of zero). The CR produced by conditioning a given R must therefore always be determined empirically. E.g., the Pavlovian CR, conditioned salivation, is only one part of the dog’s initial response (UR) to food. Many of the remaining parts of the UR do not occur after the response is conditioned. Syn., or approximate Syn., Pavlovian conditioning, respondent conditioning, type-S conditioning, and type-1 conditioning.]
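The procedure and its if-and-only-if criterion can also be put as a minimal sketch. The associative-strength variable and its linear growth rule below are modern illustrative assumptions, not part of the procedural definition; they merely stand in for whatever change underlies the observation that the CS alone comes to evoke a CR.

```python
# A minimal sketch of the classical conditioning procedure: a neutral CS is
# repeatedly paired with the US, and conditioning is said to have occurred
# if the CS alone later evokes a (conditioned) response. The strength
# variable and linear update rule are illustrative assumptions only.

def paired_trial(strength: float, rate: float = 0.3) -> float:
    """One contiguous CS-US pairing: the CS gains associative strength."""
    return strength + rate * (1.0 - strength)

strength = 0.0                 # the CS starts out as an indifferent stimulus
for trial in range(10):        # repeated CS-US pairings
    strength = paired_trial(strength)

cr_observed = strength > 0.5   # test trial: does the CS alone now evoke a CR?
```

After ten pairings the hypothetical strength approaches its ceiling, so the test trial counts as a CR; with zero pairings it would not, matching the definition’s requirement that the CR appear only as a result of the pairing operation.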
conditioning/counter- = (emp, bt) the experimental procedure of conditioning a second and conflicting response to the conditioned or discriminative stimulus of a response that is not being reinforced. [This procedure is employed theoretically by contiguity theory to account for extinction. Cf. interference.]
conditioning/escape = (emp, bt) a special case of operant conditioning. The conditioning procedure in which successive occurrences of a response repeatedly terminate a negative reinforcing stimulus. Conditioning is said to occur if, and only if, the response then increases in rate of occurrence, in magnitude, or in relative frequency, or decreases in latency as a function of this procedure. [Cf. escape behavior.]
conditioning/higher-order (second-order, third-order, etc.) = (emp, bt) if a classical conditioned response is established, the stimulus of this S-R correlation may serve as the unconditioned stimulus for a new second-order conditioned response, and so on.
conditioning/instrumental = 1. (emp, bt) operant conditioning. =2. (con, bt) a term applied to operant conditioning [based on the view that the conditioned response is instrumental in producing the reinforcing stimulus for the animal] when the animal is not free to give the response except when the experimenter chooses [as by opening a door so that the animal can run out].
conditioning/operant = (emp, bt) the experimental procedure of presenting the animal with a reinforcing stimulus immediately following the occurrence of a given response. Conditioning is said to occur if, and only if, the response then increases in rate of occurrence, magnitude, or relative frequency, or decreases in latency, as a consequence of this operation. (The measure chosen depends on the response with which one is dealing.) [Sensu stricto, only operants may be operantly conditioned. This distinction accounts for the apparent failure of members of the list of synonyms to be strictly synonymous. The name seems to have stuck, however, to all cases where reinforcement is contingent on response, when the response occurs freely (the operant case), or when the response can occur only as the experimenter chooses, as in a trial on the runway (the instrumental case). This is an example of different operations yielding equivalent results, with the result that one term is used. For clarity, “free” is often used with the term operant conditioning to describe the first case. Syn., or approximate Syn., Skinnerian conditioning, Thorndikean conditioning, type-R conditioning, trial-and-error conditioning, instrumental conditioning, and type-2 conditioning.]
conditioning/pre-, see conditioning/ sensory pre-.
conditioning/pseudo- = (emp, bt) “The US [unconditioned stimulus] is presented alone in a series of massed trials, and then, after a short interval of time, the CS [conditioned stimulus] is presented in a series of massed trials” (25). If the response to the US is then given to the CS, pseudoconditioning is said to have occurred. [Possibly related to reflex sensitization and to stimulus generalization.]
conditioning/sensory pre- = (emp, bt) the experimental procedure of repeatedly and consecutively presenting the animal with two stimuli to both of which it is indifferent and then conditioning the animal to respond to the second of the two. The two stimuli must be chosen so that stimulus generalization between them cannot be demonstrated. Sensory preconditioning is said to occur if the animal is then observed to respond to the first, when the first is presented without the second, with the response conditioned to the second.
conditioning/strength of = 1. (emp, bt) when it is desired to make statements about the magnitude of a response with reference to the operations of reinforcement, but with drive-inducing operations and stimulus properties taken into account, the term strength of conditioning is used. [The statement, “If an animal is not hungry, he will fail to give a conditioned response, but the strength of conditioning is not altered,” implies that, if this same animal is now 22 hours food-deprived and is placed in the same situation, he will then give conditioned responses of large magnitude. Only the operations of reinforcement and extinction alter the strength of conditioning; changes in deprivation leave it unchanged.] =2. (emp, bt) resistance to extinction.
conflict =1. (emp, bt) a term applied when the stimuli for two incompatible responses are presented simultaneously under conditions (e.g., following drive-inducing operations) in which either, presented alone, would yield a response. [The most striking effects of conflict appear under conditions of such nature that both responses would be of large magnitude.] = 2. (th) the state of the animal when two drives associated with incompatible behaviors are equally or nearly equally strong.
conflict/approach-approach = (emp, bt) when two stimuli, towards either one of which the animal moves when it is presented alone, are presented simultaneously but in different locations so that approach to one takes it away from the other, the conflict is termed approach-approach conflict.
conflict/approach-avoidance = (emp, bt) when two stimuli, towards one of which the animal moves when it is presented alone and from the other of which it actively runs, are presented together at the same or approximately the same location, the conflict is termed approach-avoidance. [Also used if one stimulus has come to control, through separate training procedures, both approach and avoidance. The animal may have been repeatedly shocked in the same goal box to which it has run with reinforcement by food.]
conflict/avoidance-avoidance = (emp, bt) if two stimuli, from either of which the animal moves if it is presented alone, are presented simultaneously but in different locations so that escape from one places the animal in the presence of the other, the conflict is classified as avoidance-avoidance conflict.
consummatory act = (emp) a response that most often terminates a given frequently occurring sequence of behaviors. [E.g., eating, copulation. It is not always possible to draw a rigid distinction between a consummatory act and the responses that usually precede it. Many consummatory acts have been empirically identified with responses to reinforcing stimuli; no clear-cut exception to this is known. The converse often, but not invariably, holds. In Hullian theory, this effect is associated with the concept of drive-reduction; whereas in early ethological theory, consummatory acts are associated with the consumption of action-specific energy. These seem to be analogous notions. Many consummatory acts are highly stereotyped.]
consummatory stimulus, see stimulus/ consummatory.
contiguity = (emp, bt) the occurrence together in time, or within no more than two seconds of one another, of two stimuli, two responses, or a stimulus and a response. [One of the conditions necessary for learning or conditioning to occur? See theory/contiguity.]
differentiation = (emp, bt) if a reinforcing stimulus is withheld except when a restrictively specified response is given, the frequency of occurrence of the response will increase, and that of alternative responses, even those that differ only slightly, will decrease. This procedure is called response-differentiation, and its outcome a differentiated response. [Differentiated responses may become, of course, highly stereotyped and “mechanical.” Response differentiation is almost synonymous with approximation conditioning.]
discharge = 1. (th, eth) with reference to a drive, the utilization, consumption, annulment, or destruction of a drive (or of an accumulation of action-specific energy or of “impulses”) that occurs when its consummatory act is given. [The responses that have been activated by the discharged drive will not then recur until the drive is built up in magnitude again by the summated action of internal (e.g., hormonal) changes and environmental stimulation.] =2. (emp, eth) with reference to a behavior pattern, emission.
discriminated = (emp, bt) when applied to a response, it refers to one brought under the control of a stimulus by differential reinforcement. When applied to a stimulus, it refers to one differentially responded to by an animal.
discrimination = 1. (emp, bt) differential response to two or more stimuli. [When two stimuli are simultaneously or successively presented to an organism, and the quantitative topographic properties of a specified response to the two differ, a discrimination has been demonstrated. By this definition, discrimination is shown in any two S-R correlations.] =2. (th, bt) differential response-strength to two or more stimuli. [This is the more common, but more specialized, definition. By this usage, one cannot speak of discrimination without being able to relate it to response-strength, so those cases of differential response on the first occasion a second stimulus is introduced cannot be referred to as discrimination. Typically, discriminations are the outcome of discrimination training, although some can be exhibited without such training. Discriminations between similar stimuli are used to explore the sensory and perceptual characteristics of an organism. N.B.: by this definition an animal that jumps 100% to a cross and never to a triangle is discriminating between them. The same animal, presented with the same two stimulus objects, may walk to them equally often and is not discriminating between them. Thus, an animal can discriminate between a pair of stimuli when one response is under investigation and cannot discriminate between the same pair with another response.] =3. (th, eth) a hypothetical sensory or neurophysiological event which makes it possible for an animal to form discriminations as defined above. A capacity. [The discriminations of behavior theory are formed, those of ethology are revealed, by discrimination training.]
discrimination training = (emp, bt) the experimental procedure of reinforcing a response in the presence of a discriminative stimulus and not reinforcing the response in the presence of other stimuli. When quantitative measures of the response taken in the presence of the stimulus differ from those in its absence, a discrimination has been formed, and the animal is said to discriminate.
discriminative, see stimulus/discriminative.
disinhibition = (emp, bt) the term applied to the observation, in the course of extinction of a classical conditioned response, that, when an extraneous stimulus (one not previously present in the situation and which does not evoke the same response) is presented together with the conditioned stimulus, the magnitude of CR may be larger than predicted from its magnitude on previous trials. [A like observation can be made, although less predictably, in the extinction of operant CRs.]
displacement activity = 1. (emp, bt) a response that has been identified as a member of one instinct (1.) often occurs in others. On these occasions it is termed a displacement activity. [A very common and incorrect usage applies the term to any behavior unexpected by the observer.] =2. (th, eth) “A displacement activity is an activity belonging to the executive motor pattern of an instinct other than the instinct(s) activated” (29). [It “…seems to appear when an activated drive is denied discharge through its own consummatory activity” (29).] =3. (emp) a term applied to a response that has been conditioned under one set of deprivations and reinforcement conditions and that then appears under others. [Some observations are needed here. See also appetitive behavior. Psychologists are, by and large, not familiar with the empirical content of this term and do not seem to employ an analogous concept. In view of their hours of watching rats, they should be interested to learn that ethologists would term the face-washings and bar-bitings that appear during extinction displacement activities. See also redirection activity.]
drive = 1. (th) a hypothetical state of the animal which is identified by: (a) gross changes in the relative frequency of broad classes of behavior that are not attributable to disease, learning, or growth; (b) changes in running-wheel activity; or (c) changes in cage-motility. Drives may be manipulated by: operations of deprivation (as of food and water), alterations of the hormone or other biochemical balance of the blood (e.g., of ACTH, testosterone, sodium chloride), temperature changes, or intense stimulation. Some can be specified only by stating the time of the year. Particular states so defined sensitize particular S-R correlations: those acquired after the particular drive operations have been performed and those species-specific responses that have been empirically established as covarying. [Thus, food elicits eating behavior in hungry animals only, and lordosis can be elicited in the guinea pig only when particular combinations of hormones are present in specific concentrations.] =2. (th, eth) motivation. =3. (th, eth) a concept equivalent to Hull’s sEr (16). [That is to say, a construct expressing the combined action of all causal variables controlling a piece of behavior. Ordinarily, a particular drive is associated with the activation of a corresponding instinct (2.). The third usage is not universal among ethologists but seems embodied in Thorpe’s definition (27, 28). It is very difficult to determine just how the word “drive” is being used by ethologists. Sometimes, it seems to be defined as in drive (2.). Sometimes, it seems to refer to a hypothetical construct embodying the combined effect of all the excitational causal variables controlling a response. Sometimes, it seems to include the concept of a hypothetical magnitude of “neural excitation” or “nerve impulses.” I hope that someone will straighten out this concept. Nor is this the only definition of which one can complain. The theoretical definition of drive (1.) also leaves something to be desired.
One suspects that the term drive may be a bit too broad and that we should deal rather with specific complexes of operations and behavior, such as those defining “hunger,” “maternal behavior,” etc., and not try at all to class them together prior to rigorous experimental analysis.] =4. (th, bt) a state of the animal established by deprivation of food or water, or by the presentation of electrical shock, characterized by a change in the relative rates of occurrence of a specified set of operant responses (22). [Further operations for producing such states may be experimentally determined.] =5. (th, bt) Hull’s D (16). [A construct having the properties of a general state but associated with, or producing, specific stimuli. Its controlling variables are only suggested.] =6. (con) one of many hypothesized state variables derived from the effectiveness of particular environmental events as reinforcing stimuli. [E.g., the manipulatory drive (9). This is invoked as a consequence of observations that monkeys and apes solve mechanical puzzles without reinforcement administered by the experimenter and that they acquire conditioned responses when the reinforcing stimulus is the opening of a window through which the animal can look into a space not available to it just before. The absence of any defining operations or manipulations other than the identification of specific events as reinforcers renders this usage unfortunate and misleading. If such behaviors as exploration and manipulation occur in animals whose behavior can also be conditioned with presentations of novel events as reinforcing stimuli, then the term instinct (2.), which lacks any implication with respect to the classes of variables controlling the behavior and which suggests no particular underlying theory, would seem to be the preferable term to attach to the behavior.]
drive/alien, see drive/irrelevant.
drive-inducing operation = (emp, bt) a generic term for one class of procedures followed before beginning experimental investigations. These procedures are described in sufficient detail in publications of research results so that others can duplicate them. [They are the procedures that set up “drives.” They include: for hunger and thirst, deprivation according to specified rules of food and water, respectively; for sex, injection of hormones, observation during the reproductive season, manipulation of the light-dark cycle, or presentation of an oestrous female rat to a male; for escape or fear, intense electrical stimulation; and so on.]
drive/irrelevant = (th, bt) when the inducing operations of two drives (1.) are carried out, and reinforcement of some response is effected by reducing one of them, then the other drive is termed irrelevant. [E.g., if an animal is deprived of food and is then shocked electrically until it jumps over a small barrier, shock-avoidance is the drive, and hunger is the irrelevant drive. Syn. alien drive.]
drive/primary = (th, bt) any of the drives enumerated under drive (1.). [When a drive is termed primary, stress is being laid on the postulate that (a) drive-inducing operations produce states of physiological disequilibrium (needs) and (b) both drive reduction (the opposite operation) and the behavioral changes that occur when a drive is established tend to restore the animal to a state of homeostasis.]
drive/secondary (or acquired) = (th, bt) a drive (1.) hypothesized so that the drive variable may be introduced theoretically in order to predict the occurrence of learned behavior in the absence of one or another of the drive manipulations listed under drive (1.). [Defining operations of one such drive are given in terms of the establishment of a discriminated avoidance conditioned response. See anxiety.]
drive stimulus, see stimulus/drive.
emit = (emp, bt) to give an operant response. [A vacuum activity may be spoken of as emitted. Many psychologists are most unhappy with this word, but still accept the distinction between a response that is elicited by a stimulus presented by the experimenter and one that “just occurs” as a useful one for communication. See also spontaneous behavior.]
emotion = 1. (th, bt) a generic name for states of the animal in which a wide variety of responses exhibit lowered response-strengths not attributable to satiation or extinction. [The depressed rate of response observed during the course of extinction following regular reinforcement and during conditioned suppression, for example, is classified as defining emotion.] =2. (th, bt) an intervening variable having some of the properties of D (drive) in some versions of Hullian theory (2). =3. (con) a broad, ill-defined or indefinable class of behavior based upon the colloquial use of the word emotion. [The class seems as meaningless for human beings as for subhuman animals. A pretheoretical notion, it has played some role in the identification of emotional behavior. The concepts of emotion bear disappointingly little reference to the empirical concept of emotional behavior. Many who use the latter dispense with the former. Cf. emotional behavior.]
engram = (th) a hypothetical neural locus, structure, or persistent activity, presumably anatomically or physiologically identifiable, that plays the same role for the explanation of learned behavior that center does for species-specific behavior. [What Lashley couldn’t find (26), and perhaps Penfield did (21).]
Erbkoördination, see fixed action pattern.
error = (emp, bt) any response, or set of responses collectively taken, whose occurrence delays the appearance of the response chosen to be reinforced. [Errors may not be practicably subject to measurement, as in the case of trial-and-error learning; or they may, in some situations, be more fully specified and hence become enumerable, as in the case of some superstitions and of entry into blind alleys in maze learning.]
error/anticipatory = (emp, bt) a class of errors in the acquisition of behavior chains, in which the error is the occurrence of a response earlier in the chain than the position in which it would lead to reinforcement.
error/perseverative = (emp, bt) a class of errors in the acquisition of behavior chains, in which the error is the occurrence of a response in a serial position following that in which it would lead to reinforcement.
escape, see behavior/escape and conditioning/escape.
ethologist = 1. (emp) a behaviorist (1.) who, typically, has been trained in zoology, usually studies the behavior of insects, fishes, and birds more often than that of mammals and other groups, and makes use of much of the system of terms labeled eth in this glossary. =2. (emp) a student of comparative behavior. =3. (emp, eth) a behaviorist (1.) who likes his animals.
excitation = (th) a hypothetical state of the animal used in various ways to account for the occurrence of a response and presumed to be great when response magnitude is large. [Sometimes dressed up by the addition of suggestions about nerve impulses, etc.]
excitatory potential = (th, bt) Hull’s sEr (16). A hypothetical state variable, defined mathematically, that incorporates the combined effects on response magnitude of such excitatory variables as stimulation and drive.
excitatory potential/effective = (th, bt) Hull’s sĒr (16). A hypothetical state variable quantitatively embodying the joint effects of excitatory potential and the reactive inhibitions on response magnitude. [Drive (3.) is almost synonymous. Momentary effective excitatory potential (sĖr) incorporates a variability concept.]
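[The relations among these Hullian variables may be sketched as follows; the exact functional forms differ among versions of the theory (16), so the equations below are illustrative rather than a statement of Hull’s postulates.]

```latex
% Sketch of Hull's quantitative scheme (16); functional forms vary by version.
% Excitatory potential: habit strength energized by drive.
{}_{S}E_{R} = {}_{S}H_{R} \times D
% Effective excitatory potential: excitation net of the inhibitions.
{}_{S}\bar{E}_{R} = {}_{S}E_{R} - \left( I_{R} + {}_{S}I_{R} \right)
% Momentary effective excitatory potential: the variability concept,
% with the oscillation term {}_{S}O_{R} fluctuating from moment to moment.
{}_{S}\dot{E}_{R} = {}_{S}\bar{E}_{R} - {}_{S}O_{R}
```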
exhaustion (of action-specific energy, or of specific action potential) = (th, eth) a hypothetical process postulated to account for the data of adaptation or habituation of a species-specific response. [The occurrence of a response is presumed to drain off or consume a certain quantity of this “energy,” thus raising the threshold and hence reducing the frequency and magnitude of the response thereafter.]
expect = 1. (?, bt) [If one does not “intuitively know” what expect means, one is lost.] The writer has been unable to find a definition of any type that has enabled him to use the term in a systematic fashion, except the following: “If x is deprived of food and x has been trained on path P and path P is now blocked and there are other paths which lead away from path P, one of which points directly to location L, then x runs down the path that points directly to location L = x expects food at location L” (33). [Expectancy is the noun; presumably it is what one has when one expects.] = 2. (con) to behave as if making a probability judgment. [The difficulty of arriving at an empirical or theoretical definition of “expect” and of “expectancy” that is other than trivial may explain why many experimental behavior theorists find the concept of limited value, even though some consider it equivalent in “explanatory power” to habit-strength sHr.]
extinction/experimental = (emp, bt) the progressive decrements in the magnitude or relative frequency of a previously conditioned response resulting from the procedure of omitting reinforcement following or accompanying the occurrence of the response, when other variables are held constant.
extinction/resistance to = (emp, bt) the number of instances of a conditioned response that occur during experimental extinction before the conditioned response reaches some predetermined criterion of low response-strength. [Such criteria are usually chosen on the basis of previous experimental results so that little further extinction would be expected if the procedure were continued, i.e., so that extinction has approached an asymptotic value. Criteria that have been employed in free operant conditioning include: return of the rate to the operant level, failure to respond within five minutes of the preceding response, and failure to respond within five minutes of onset of the discriminative stimulus. In classical conditioning, failure to respond to the conditional stimulus in a specified time is a typical criterion. Resistance to extinction is an important measure of strength of conditioning and forms the basis of Skinner’s concept of the reflex reserve.]
extinguish = 1. (emp, bt) to omit reinforcement of a response sufficiently often that a decrement in response-strength is observed. [Usage often restricts this word to cases where the response returns to the magnitude or relative frequency observed before conditioning began.] = 2. (emp, bt) with reference to a response, to decrease in magnitude as a function of the omission of reinforcement. [Extinguish is both a transitive and an intransitive verb. By definition 1., the experimenter extinguishes a response; by definition 2., the response extinguishes.]
fatigue = (lab slang) a pejorative synonym for habituation, used preferentially when a lot of work (physical) is involved in the response considered. [Fatigue is a homely word, full of connotations and implications from ordinary conversation as well as from physiological referents. It would be very nice if it turned out that fatigue reduced to the fatigue observable in a muscle or nerve-muscle preparation. If the evidence points to anything, it points to the desirability of limiting fatigue to the effect observed in muscle and nerve-muscle preparations and of doing some experimental investigations of work-decrement. Hull’s concept of reactive inhibition (16), incidentally, is developed so that it covers most of the experimental observations that tempt one to talk about fatigue.]
fear = 1. (emp, bt) the behavior produced either by sudden and intense stimulation or by specific classes of stimuli that must be identified empirically for each species studied. Responses include alterations of sphincter control, flight behavior, respiratory changes, and the suppression of behavior occurring at the onset of stimulation. [These responses may be readily conditioned by the Pavlovian procedure, and they are then the symptoms of anxiety.] =2. (th, bt) a drive or emotion postulated as underlying fear (1.). = 3. (th, bt) anxiety. [This usage is incorrect.]
fixed action pattern = (emp, eth) a highly stereotyped and precise response observable in most members of a species when there has been no experimental manipulation calculated to produce learning (especially response-differentiation). [See stereotyping and response/species-specific.]— Ger. Erbkoördination.
flight, see behavior/flight.
frustration = (emp) the operation of preventing an animal from making some response. This may be done in any of three ways: (a) by withholding the stimulus for the response, when the stimulus ordinarily appears as the consequence of a previous response [as when one fails to reinforce a response already conditioned, or when a restraining test tube prevents a female from following a male stickleback that is courting her, hence frustrating the male]; or (b) by mechanically preventing the response from occurring [as when one places a glass barrier between an animal and the object toward which it is moving, or when one rigidly fixes the bar of a Skinner-box, or when one ties down the wings of a bird; in the example of (a), it is the female stickleback which is frustrated in this sense]; or (c) by placing the animal in a conflict situation. [These three operations do not always have the same consequences, and a new term seems to be urgently needed for at least two of them.]
generalization, see stimulus generalization and response equivalence.
Gestalt = (emp, eth) when it is shown that a particular response or set of responses is under the control of a complex set of stimuli that has not yet been experimentally defined, then the complex set of stimuli is often termed a Gestalt or configuration (26). [The term is contrasted with sign stimulus, where it is relatively easy to specify a narrow stimulus class controlling the behavior. Unfortunately, others use the term in a precisely opposite sense and refer to a sign stimulus as a Gestalt (29). Both usages are misleading. Both might equally properly (and improperly) be termed Gestalten, for such is the vagueness of referent of this word. Neither usage corresponds to that followed by the Gestalt psychologists, for whom the Gestalt is a theoretical concept that, as it has been defined and used, cannot be introduced into the language of objective studies of behavior.]
goal = (con) a lay term often applied to reinforcing stimuli, to the responses given to reinforcing stimuli, to consummatory acts, or to the stimuli for or releasers of consummatory acts. [When one uses the word goal, one is speaking loosely, as the fact that it has at least four referents suggests.]
goal-gradient = 1. (emp, bt) the functional relationship between the strength of a response and its distance in space or time from the reinforcing stimulus which has followed instances of it in the past. [In most cases, the response is stronger closer to the reinforcing stimulus.] = 2. (th, bt) a theoretically postulated relationship between the reinforcing effect on a particular response of a particular reinforcing stimulus, and the distance in time between the occurrence of the response and the occurrence of the stimulus.
goal response, see response/goal.
gradient of approach, see approach/gradient of.
gradient of avoidance, see avoidance/gradient of.
habit-strength = (th, bt) Hull’s sHr. A hypothetical or inferred “historical” variable considered to determine the extent to which a conditioned stimulus or set of stimuli tends to evoke a given response. [A function of the number of reinforcements, of the quantity of the reinforcing stimulus, and of time variables. For a full development, see Hull (16).]
habituation = 1. (emp) the decrement in response-strength which occurs with the repeated elicitation of that response in massed practice. [In reflex habituation, recovery after a time interval may be complete, although on subsequent occasions habituation may occur more rapidly. Operationally, there is no distinction between habituation and extinction, except in terms of the history of the response prior to the procedure of repeated elicitation. Responses that are extinguished have previously undergone a systematic experimental procedure of reinforcement before repeated elicitation or emission without reinforcement begins. The antecedent history of responses that are habituated out shows no such experimental reinforcement. As data accumulate, it may become advisable to distinguish between several kinds of response decrements that are all functions of repeated elicitation, but differ with respect to stimulus-control and rate of recovery. See, for example, definition 2.] =2. (emp, eth) response decrement that is a function of the number of elicitations, that is specific to the response (and independent of the stimuli eliciting it), and from which recovery is slow or absent. [Hinde (12, 13, 14).]
hierarchy = (th, eth, bt) a series divided or classified in ranks or orders. [This term is used by psychologists and ethologists to describe certain conspicuous properties of behavior. “Habit-family hierarchy” and “hierarchical organization of centers” both refer to such organizations of hypothetical entities or constructs postulated to account for certain temporal sequences and dependent probabilities evident in behavior. The term is also applied empirically to behavior, where it has reference to certain statistical sequential dependencies that can be found in the order in which a number of responses occur, and theoretically to sets of causal variables inferred from the statistical organization of behavior.]
hypothesis = 1. (emp, bt) a term applied to repeated occurrences of the same response or response-pattern during the acquisition of a discrimination in the two-choice discrimination experiment (see Lashley jumping-stand) when a large proportion of the responses are errors. [Alternation and position habits are hypotheses.] = 2. (?, bt) a class of expectancies postulated by cognitive theories to account for hypothesis (1.). [According to continuity theory, hypotheses are the effect of the summed differential reinforcements of response to various aspects of the discriminative stimuli.] = 3. (emp, bt) one class of statements emitted by humans when they are undergoing discrimination or concept formation training. An hypothesis may be distinguished from other such statements in that it purports to describe the rules or principles followed by the experimenter in reinforcing the subject. [If it indeed does describe these rules, it will be regularly reinforced on each occasion that the behavior it controls occurs, without regard to the particular discriminative stimuli present on that occasion. If, however, the experimenter is interested in human hypothesizing, he may independently reinforce the hypotheses and the behaviors they direct. A discriminated hypothesis is often termed “conditional.”] =4. (th, bt) an hypothesis (3.) that is inferred to occur covertly, when subjects are behaving in discrimination situations as they do when such hypotheses do in fact occur. [A very dangerous inference; cf. hypothesis (1.).]
imitation, see behavior/imitative and behavior/mimetic.
imprinting = 1. (emp, eth) the operation of visually presenting to an individual large (and usually noisy) moving objects (exclusive of members of its own species) during the first hours of its life. Imprinting is said to have occurred if, and only if, the individual subsequently exhibits toward the large moving object (and objects like it) the behavior ordinarily exhibited only toward members of its own species. [By extension, theoretically, the term is then applied to such behavior shown with respect to its own species. The experimental data at hand do not yield a more adequate empirical definition.] = 2. (th, eth) a process hypothesized to account for imprinting (1.) in terms of perceptual theory. = 3. (con) any learning that strikes someone as ‘like’ imprinting (1.).
inborn, see behavior/innate.
inertia = (emp, eth) continuation of a response after the stimuli for it are withdrawn. [Cf. after-discharge.]
inhibition = 1. (th, bt) a hypothetical state of the animal sometimes used to account for decrements in response magnitude, or habit-strength, or both. [It is a negative variable presumed to operate by canceling out an excitation of some sort.] =2. (th, physiol) a cortical process opposite to excitation and postulated as having the property of suppressing excitation and, hence, of suppressing the behavior dependent on that excitation (20). [Its behavioral correlate in the extreme case is inactivity and eventually sleep.] =3. (th, eth) a block.
inhibition/external = (emp, eth) the term applied to the fact that, in the course of the acquisition of a conditioned response, if an extraneous stimulus is presented with the conditioned stimulus, a CR of smaller magnitude than that predicted from previous trials may be produced.
inhibition/reactive = (th, bt) Hull’s Ir (16). A hypothetical state variable which, together with conditioned reactive inhibition, accounts for response decrement in both learned and unlearned behavior (i.e., for both extinction and habituation). [It is a function of the work involved in a response, of the number of times the response is elicited, and of time since the occurrence of the last response. It applies particularly to the case of short-term reversible decrements.]
inhibition/conditioned reactive = (th, bt) Hull’s sIr (16). A hypothetical state variable which, together with reactive inhibition, accounts for response decrement. [It is a function of Ir and of some of the variables that control habit-strength, and is used to account for long-term, relatively irreversible decrements. When a response is extinguished, Ir increases and becomes conditioned to the stimuli present. On subsequent occasions, when the stimuli are again presented to the animal, the stimuli elicit sIr at the same time that further Ir is produced by the repeated elicitation of the response. Ir and sIr summate to render sHr or sUr ineffective for the elicitation of the response.]
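[The summation of the two inhibitions may be sketched in Hull’s notation (16); the reaction threshold sLr named below is Hull’s term for the value beneath which the response fails to occur, and the inequality is illustrative, not one of Hull’s postulates verbatim.]

```latex
% Aggregate inhibitory potential: reactive plus conditioned reactive inhibition.
\bar{I}_{R} = I_{R} + {}_{S}I_{R}
% Extinction or habituation: inhibition reduces effective excitatory
% potential below the reaction threshold {}_{S}L_{R}, so the response
% is no longer elicited.
{}_{S}\bar{E}_{R} = {}_{S}E_{R} - \bar{I}_{R} < {}_{S}L_{R}
```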
innate = (emp, genetics) a term applied to differences in genetic character between two members of the same species that have been raised in the same environment. [It is now generally acknowledged that the term innate, a technical one in genetics, cannot properly be applied to behavior as synonymous with unlearned or inborn, and its use in this sense may be expected to become less frequent. Where it has appeared in the past, innate behavior should now be read as unlearned behavior or as species-specific behavior. The restricted use of the term innate has recently been adopted as necessary by prominent ethologists. Its new place in the technical vocabulary of ethology and of behavior theory, and the use of species-specific and unlearned instead, will probably have fruitful consequences, since the connotations of innate for ethologists and psychologists have been different indeed and have led to many confusions that may now be avoided. (E.g., one of two human beings has blue eyes, the other has brown eyes. Blue eyes are not innate. Brown eyes are not innate. But the difference in eye color between the two persons is innate. Clearly, by this definition, which is that of the geneticists, it is not permissible to speak of a response, or set of responses, as innate.)]
innate releasing mechanism (IRM), see releasing mechanism/innate.
insight = 1. (emp, lab slang) a gross difference in behavior between two successive occasions on which behavior can occur, when the behavior shown on the second occasion is close to what the experimenter has previously decided upon as a “good” or “efficient,” solution. =2. (con) an event hypothesized to account for insightful learning. [It may be further defined as “reorganization of the perceptual field” or the “apprehension of relations,” for which the writer cannot find any elucidating elucidations.]
insightful learning, see learning/insightful.
instinct = 1. (emp) a class of sets of responses shown by most members of a species. Many of the responses can be demonstrated as dependent on highly specific stimuli in the environment. Such a class is empirically demonstrated by showing that certain responses are statistically organized (i.e., associated together in time) under specified environmental conditions and following a single set of drive operations, when there have been experimental manipulations calculated to prevent learning. =2. (emp, eth) same as instinct (1.) but without any qualification with respect to experimental manipulations calculated to prevent learning. [Such a definition has recently been proposed by Tinbergen (30). It is based on the increasing emphasis on the logical (and experimental) impossibility of distinguishing in a meaningful way between “learned” and “unlearned” behavior, and recognizes that this distinction, even if it could be made easily, would not necessarily be a useful one. Theoretically, this concept of instinct may readily be related to the biological concepts of function and adaptation as they appear in general evolutionary theory.] = 3. (th, eth [obsolete]) a hypothetical system of hierarchically organized centers postulated to account for observable instances of instinct (1.). “A hierarchically organized nervous mechanism which is susceptible to certain priming, releasing and directing impulses of internal as well as of external origin, and which responds to these impulses by coordinated movements that contribute to the maintenance of the individual and the species” (29).
instinctive movement, see fixed action pattern.
intention movement = (emp, eth) if an instance of a response that is a member of a behavior chain occurs usually, but not necessarily, in association with instances of other members of the same chain, it is termed an intention movement with respect to later members of the chain. [Used most often when the chain is not necessarily carried through to completion immediately. Intention movements are usually of low intensity. See response/intensity of. The experienced observer can predict accurately from the occurrence of intention movements that the chain will be carried to completion after a delay or that it will begin anew. For example, members of a species of bird will give an ordered series of responses before becoming airborne. From these responses, termed intention movements of flight, the observer can predict, other things being equal, that the bird will take off in the very near future. The responses can be identified on occasions when they are not followed by flight in young birds that have never flown or in birds that are presented with weakly aversive stimulation while they are feeding. Cf. vicarious trial and error and conflict.]
interference = (th, bt) a process often assumed by contiguity theorists (see theory/contiguity) to underlie experimental extinction, involving a decrement in response-strength as a result of the occurrence and conditioning of competing responses. [This view implies that habituation and extinction are produced by the same variables that produced acquisition of the response that is undergoing decrement. This process is a basic postulate of contiguity theory. See also conditioning/counter-.]
Lashley jumping-stand = a small raised platform from which the animal can remove itself only by falling to the floor, or into a safety net, or by jumping to an easily movable door a few inches away. One, two, or more such doors may be available. The door or doors may have affixed to them discriminative stimuli. Jumping at the “correct” door is reinforced by food or water, to which the animal gains access only by jumping at and through the door.
learn = (emp, bt) to exhibit a change in behavior between two successive exposures to the same environment that cannot be attributed to manipulation of drive operations, alterations in the environment, sensory adaptation, disease, surgical interference, physical trauma, or growth, although the propriety of these exclusions may be questioned. [When we say that an animal learns, we are stating at the least that, other things being equal, some behavior now occurs in a situation in which it had not occurred previously, or that the behavior now occurring in a given situation is different from the behavior that occurred on the last occasion the animal was in that situation. The behavior need not change, nor the situation, but the relation between them has changed. For an extremely stimulating, logical treatment of the possibilities, see Haldane (8). See also learning.]
learning = 1. (con, bt) a process (2.) or family of processes inferred from the observation that animals learn. [The term is a very broad one indeed, so broad that many ethologists as well as behavior theorists question its usefulness, except as a label for a broad class of problems. Learning is not and cannot be an explanatory concept. Learning never explains anything, except in that, by definition, it suggests that certain variables are not operative in a given situation in which behavior is observed to change.] =2. (con, bt) a generic term for conditioning. =3. (th) those behavioral processes determining how the genotype is expressed in the phenotype. [“The characters of the phenotype are in part determined by the environmental conditions it meets during its life, not only by accidents that happen to it but also in many animals by the nature of the environment in which it is living” (3).]
learning/insightful = (th, bt) a process hypothesized specifically to account for the results of experiments in which relatively great and not easily reversible changes in behavior appear between two successive occasions on which the behavior can occur. [See insight.]
learning/latent = 1. (emp, bt) under some conditions, without either ostensible presentation of a reinforcing stimulus or the occurrence of the response whose strength is altered, changes in the magnitude or relative frequency of a response may be observed. If such changes can be observed, they are termed latent learning. =2. (th, bt) acquisition in the absence of ostensible reinforcement. = 3. (th) a process hypothesized to account for the empirical observation above.
learning/perceptual = (con) those cases of acquisition that are interpreted in terms of changes in the perception (2.) or discriminations of the subject.
learning/place = (emp, bt) an animal is repeatedly introduced in the same manner and at the same place into an environment in which it can move towards, and so present to itself, a reinforcing stimulus that is in a fixed geographical position with respect to the whole environment. If on a critical trial the animal is introduced into the environment at a different place, and it then moves towards the reinforcing stimulus directly (and hence shows a series of responses topographically different from those that have been reinforced on the preceding trials), place learning is said to have occurred. [Note that the fixed geographical position of the reinforcing stimulus determines a set of discriminative stimuli, movement towards which (however done) is reinforced. Hence, place learning seems to be a special case of response equivalence. According to cognitive theory, the animal has learned “where.”]
learning/response = 1. (emp, bt) if, in the situation described under place learning, an animal is shown on the critical trial to make responses of the same topography (i.e., the same movements) that he had made during training, irrespective of his location with reference to that of the reinforcing stimulus, it is said that response learning has occurred. [Ant. place learning.] =2. (th, bt) the doctrine that an animal acquires in learning, not dispositions to move to or towards particular stimulus complexes, but dispositions to make particular movements in the presence of particular stimuli.
learning/serial = (emp, bt) when reinforcement is made contingent upon the occurrence in ordered series of a number of different responses, then, as the number of reinforcements increases, a progressively stereotyped behavior chain may be observed. The acquisition of such behavior chains, measured with reference to the number of errors and the time required for the chain to run off, is termed serial learning. [E.g., maze-running, much nonsense-syllable learning, and such demonstration performances as a rat pulling a string that delivers a marble which the rat then picks up and carries to a hole into which it drops it so that water is automatically delivered.]
learning/trial-and-error = (emp, bt) operant conditioning as it is observed to occur under conditions where relatively precise experimental controls have not been established. [Hence when much irrelevant behavior (“errors”) can and does occur and recur. In conditioning by approximation, where the delivery of reinforcing stimuli is under precise experimental control, very few errors have an opportunity to occur, and the “selection out” of the correct response is achieved by the experimenter. It is interesting to contrast trial-and-error learning of bar-pressing by the rat in the Skinner-box with approximation conditioning. In both cases, food is the reinforcing stimulus. In the former, there are long and variable delays between the first occurrence of bar-pressing and the eating of the food. During these delays, much other behavior occurs, a great deal of which will be reinforced and therefore will become conditioned. These responses have been termed superstitions. They must extinguish before the probability of bar-pressing becomes very high since they compete with it. Hence, the course of learning is slow and characterized by the appearance of many irrelevant responses (errors). In approximation conditioning, the animal is first trained to respond to the sound of the food magazine by diving towards it and eating, irrespective of his position at the time the sound occurs. He will then repeat almost immediately whatever response is followed by the sound. In the first case, the process may require several hours in the experimental situation. In the second, it takes only a few (5-10) minutes. In trial-and-error learning, much appetitive behavior is observed, and operants occur freely. The correct response—that is, the response upon which reinforcement (reward) is contingent—decreases in latency and increases in relative frequency only slowly; if it increased rapidly, the older terminology would apply the word insight.]
Leerlaufreaktion, see vacuum activity.
mimesis, see behavior/mimetic.
mood = (th, eth) a specific internal state of readiness to discharge a certain complex of behavior patterns. [Cf. the old psychological concept of set. This concept is related to Tinbergen’s drive or motivation (29). It seems to be dropping out of use.]
motivation =1. (con, bt) a rough generic term for the broadest possible class of nonstimulus variables controlling behavior. [The most important of these relate to drive (1.), but the term motivation also refers to preferences, values, appetites, set, Aufgabe, and so on and on. The term usually indicates that the controlling variables of a set of behaviors are unknown. Its wide use as an explanatory concept suggests that, for some, ignorance is a virtue—it is admission of ignorance that is the virtue.] =2. (th, eth) a concept very similar to Hull’s momentary effective excitatory potential, stripped of its precise quantitative statement. [That is, motivation is a hypothetical conceptualization of the joint action of all the determiners of behavior, including external and hypothetical internal stimuli, as they converge to determine the magnitude or intensity of a response. Sometimes this concept incorporates statements about nerve impulses.]
movement/instinctive, see fixed action pattern.
multiple schedule, see reinforcement/schedules of.
multiple T-maze, see T-maze/multiple.
need = (con) a state of affairs of the animal, considered as an individual or as one member of its species, such that its continuance in time will lead to the animal’s death or to the disappearance of the species (as by failure to reproduce). A state of physiological disequilibrium and of departure from homeostatic balance.
nervous system/conceptual = (con, bt) a hypothetical set of logical (?) structures that is often confused with the nervous system. [The “anatomy” and “physiology” of the conceptual nervous system are whatever theorists may choose to postulate to account for observed behavior. To be sharply distinguished from the nervous system, whose anatomy and physiology are what neuroanatomists and neurophysiologists observe when they study it directly.]
operant = 1. (emp, bt) an adjective specifying a response or behavior which is identified by its consequences (as, for example, by its producing a specific reinforcing stimulus under a given set of conditions) and for which eliciting stimuli have not necessarily been determined (and which is therefore unpredictable with respect to its appearance in the presence of a set of stimuli until it has been brought under the control of discriminative stimuli by reinforcement in their presence). [Cf. vacuum activity. Some operant responses are defined with respect not only to their effect on the environment but also to their topography. See operant conditioning. The “spontaneity” implicit in this definition has led many psychologists to reject the term. Others, also uneasy, satisfy themselves by saying something like this: “Well, there are stimuli, but we just don’t know what they are.” Most appetitive behavior is operant behavior. Vacuum activities provide another clear family of probable examples of operant behavior, although these responses also occur upon elicitation. Whether necessary for sound theory or not, the distinction between operant and respondent behavior is operationally sound and convenient for referring to behavior observed when there usually is no explicit operation of presenting a stimulus, as in the Skinner-box.] =2. (emp, bt) an operant response.
operant behavior, see behavior/operant.
operant conditioning, see conditioning/ operant.
operant level = 1. (emp, bt) the rate of occurrence of an operant response before the response has been experimentally reinforced. [If a rat is placed in a Skinner-box, it will press the bar some 10 to 12 times in an hour, although it has not been, and is not being, reinforced for this behavior. This rate may remain stable hour after hour. After conditioning, the rate may reach 1,000 responses an hour. One interesting quantitative property of a response is its operant level. For the rat, that of bar-pressing is quite high; that, say, of pulling a string is quite low. But both are equally easy to condition.] = 2. (emp, bt) the terminal rate of occurrence of an operant response that has first been conditioned and then been extinguished. [Some behaviorists identify these two concepts as equivalent, that is, they accept the view that operant level (1.) will be quantitatively equal to operant level (2.). The evidence is not, however, complete.]
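The rates described in this entry amount to a simple computation. The following sketch is illustrative only; the figures follow the bracketed example above, and the function name is ours, not standard terminology.

```python
# Illustrative sketch: operant level as a rate of occurrence.
# The figures (10-12 bar-presses per hour before any reinforcement,
# roughly 1,000 per hour after conditioning) follow the entry above.

def rate_per_hour(n_responses, hours_observed):
    """Rate of occurrence of an operant response, in responses per hour."""
    return n_responses / hours_observed

operant_level = rate_per_hour(11, 1.0)       # before any reinforcement
conditioned_rate = rate_per_hour(1000, 1.0)  # after operant conditioning
```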
operant response, see operant.
operationism = the general point of view toward the data and concepts of natural science which holds that the concepts of a science are defined by the experimental operations involved in investigation and measurement. [Analysis of the experimental operations involved in the elucidation of one or more concepts (e.g., space, time, response, conditioning) may reveal either that the concept first held must be rejected since it involves several operations which do not yield equivalent results, or that of two or more concepts all but one may be unnecessary since the same set of operations defines them all.]
perception =1. (emp, bt) a generic term for the complex sensory control of behavior as it is inferred from that behavior. =2. (con) a hypothetical internal event of unspecified nature controlled largely by external stimulation (but sometimes also by state variables such as habit and drive). [Such events are often treated as though they were the true controllers of behavior.]
perceptual learning, see learning/perceptual.
performance = (emp, bt) measures of observed behavior.
practice = (emp, bt) the observed occurrence of a specified part of behavior one or more times. The greater the number of occurrences, the greater the amount of practice. [Use of the term practice often implies that response differentiation is taking place with successive occurrences of the response. The only responses that can be called unpracticed are those of which it can be asserted that they have not previously occurred.]
practice/massed = (emp, bt) a term applied to the experimental procedures followed when the time interval between successive trials is small with respect to the duration of the trial.
practice/spaced = (emp, bt) a term applied to the experimental procedures followed when the time interval between successive trials is large with respect to the duration of the trial.
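The two preceding entries differ only in the relation between the intertrial interval and the trial duration. A toy classifier makes the rule explicit; note that "small with respect to" is not quantified in the definitions, so the cutoff ratio here is an arbitrary choice of ours.

```python
# Illustrative only: massed vs. spaced practice as a comparison between
# the intertrial interval and the trial duration. The cutoff ratio is an
# arbitrary assumption; the glossary definitions do not quantify it.

def classify_practice(intertrial_interval, trial_duration, ratio=1.0):
    """Label a procedure 'massed' or 'spaced' by comparing the time
    between successive trials with the duration of a trial."""
    if intertrial_interval < ratio * trial_duration:
        return "massed"
    return "spaced"
```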
potential/specific-action =1. (th, eth) a synonym for action-specific energy apparently chosen to eliminate the connotations of “energy.” =2. (th, eth) potentiality or readiness of the whole animal for response to a given stimulus (12, 27). =3. (th, eth) sEr, not quantitatively defined.
preconditioning, see conditioning/sensory pre-.
process = 1. (emp) changes in response-strength that are a function of experimental manipulation. [Usually qualified by the name of the operation, e.g., extinction process.] =2. (th) a term applied to hypothetical and unspecified sets of events occurring with the passage of time that are inferred from data showing differences in the magnitude of one or more dependent variables that are functions of time or functions of some variable that is in turn a function of time (e.g., the number of trials). =3. a progressive series of changes.
pseudoconditioning, see conditioning/pseudo-.
punishment = (emp, bt) the procedure of following the occurrence of a conditioned response with an aversive stimulus. [Punishment is not necessarily effective, but when it is, measures of response-magnitude decrease, as do rates of response.]
purposive, see behavior/purposive.
reaction, Syn. response.
reaction/delayed = (emp, bt) one of several experiments whose results have led to the concept of symbolic behavior (2.).
recovery/spontaneous, see spontaneous recovery.
redirection activity = (th, eth) any response elicited by a stimulus and, ordinarily, topographically directed toward that stimulus, which, in the presence of a drive-conflict, is topographically directed at a part of the environment that otherwise would show no control over it. [Cf. “displacement” in Freudian theory. E.g., under the combined action of attack, escape, and sex drives, the male black-headed gull will threaten his mate when she arrives in the territory and then attack other (previously ignored) birds in the vicinity. See Bastock et al. (1); also Miller (17).]
reflex = 1. (emp, bt [obsolete]) any S-R correlation, whatever its history or probability of occurrence. [By no means limited to the “reflexes” of physiology, although it includes them as it does acquired correlations. Syn. S-R correlation.] =2. (emp, bt) an S-R correlation which is observable in all members of a species under a given set of conditions in which there have been experimental manipulations calculated to prevent learning. [The term is not limited to the reflexes of physiology, although, again, it includes them. This is often confused with the reflex-arc, which is a physiological theory accounting very elegantly for certain simple reflexes.] =3. (th, eth) same as reflex (2.), with the further qualifications that (a) changes in strength are negligible with respect to drive-inducing operations and (b) a specific physiological theory of their occurrence is well-established. [It is this kind of reflex that ethologists insist is not important in unlearned behavior.]
reflex reserve = (th, bt) an obsolete construct, found inadequate experimentally in the early 40’s by Skinner, who introduced it (22). [A reservoir notion designed to handle the relationship between input of reinforced responses and output of unreinforced responses. This relationship was later found to be highly dependent upon a number of experimental variables and therefore not suitable for a relationship to a postulated reserve. This concept is remarkably equivalent to Lorenz’ hydraulic model for innate behavior (26), although it was designed primarily to account for acquired rather than unlearned behavior.]
reflex response, see response/reflex.
reflex sensitization = (emp, bt) if, after repeated elicitation by its stimulus, a reflex response is then given in response to a previously neutral or much less effective stimulus with which the reflex stimulus has not been paired, reflex sensitization is said to have occurred.
regression = 1. (emp, bt) a term applied to the observation that, during experimental extinction, the subject frequently makes responses which had been conditioned and extinguished prior to the conditioning of the response being extinguished. =2. (emp, bt) a term applied to the observation that, after punishment, the subject frequently makes responses which had been conditioned prior to the conditioning of the response that was punished.
reinforcement = 1. (emp, bt) the operation of presenting to the animal in operant conditioning, after it has made a response (and therefore contingent on its occurrence), a reinforcing stimulus or of withdrawing a negative reinforcing stimulus. =2. (emp, bt) in classical conditioning, the operation of presenting, contiguously in time, a conditioned stimulus and an unconditioned stimulus. [Reinforcement (1.) and (2.) are parallel and might be considered the specification of equivalent operations. They are not parallel to (3.) and (4.). Reinforcement (1.) can only be applied to occasions when reinforcing stimuli are presented to the animal, and stimuli can only be classified as reinforcing after it has been demonstrated that their use produces modifications in behavior under the stated conditions. Food, for example, is not a reinforcing stimulus for a satiated animal. When a stimulus has been found to be reinforcing in one situation to several individuals, it is usually assumed that it will be reinforcing in other situations and to other animals. As a result, the term is used without rigorous experimental justification. In the same way, the use of the term reinforcement is commonly extended to cases when food is given but a change in behavior has not been observed. “After 30 reinforcements, the relative frequency of turning to the right had not changed, nor was any other change in behavior apparent,” may be a careless (but common) way of saying: “Although through 30 trials I gave food to the animal each time it turned to the right (and food is usually a reinforcing stimulus), no change in behavior appeared. Hence, food is not a reinforcing stimulus under these conditions to this animal.”] =3. (th, bt) the reduction of a specific drive which, in Hull’s theory, is a condition necessary for learning. [Thus, if learning occurs, reinforcement has occurred and a drive has been reduced. 
In much drive-reduction theory, this holds even in the absence of drive-defining operations or other symptomatic behavior changes.] =4. (th, bt) a term applied to any unspecified hypothetical change in the state of an animal inferred from change in the animal’s behavior. [“The animal started nodding his head when the click came. Therefore, reinforcement must have occurred.” Reinforcement should not be used in this sense, although, unfortunately, it often is. By this usage, any condition that is known to increase response-strength is a reinforcing stimulus.]
reinforcement/aperiodic, see reinforcement/schedules of.
reinforcement/continuous, see reinforcement/schedules of.
reinforcement/differential = 1. (emp, bt) with reference to a stimulus, the operation of reinforcing a response when it has occurred in the presence of one stimulus or set of stimuli, and of withholding reinforcement when it has occurred in the presence of another. The procedure followed in discrimination training. =2. (emp, bt) of a response, the operation of making reinforcement contingent on the occurrence of a response of predetermined topography. [The experimental procedure that produces highly stereotyped responses.] =3. (emp, bt) of a rate of response, the operation of making reinforcement contingent on the occurrence of a response at a predetermined rate.
reinforcement/fixed-interval, see reinforcement/schedules of.
reinforcement/fixed-ratio, see reinforcement/schedules of.
reinforcement/heterogeneous = (emp, bt) if the reinforcing stimulus evokes a response different from the response that is being reinforced, the operation of presenting it is termed heterogeneous reinforcement. [This is the case in operant conditioning.]
reinforcement/homogeneous = (emp, bt) if the reinforcing stimulus evokes a response identical with or similar to the response that is reinforced, the operation of presenting it is termed homogeneous reinforcement. [This is the case in classical conditioning.]
reinforcement/irrelevant = (emp, bt) if a reinforcing stimulus is not directly related to or involved in the drive-reducing operations antecedent to reinforcement, the operation of presenting it is termed irrelevant reinforcement. [E.g., if food is taken away from the animal and the response being conditioned is followed by the presentation of water, irrelevant reinforcement is being employed. Irrelevant reinforcement is generally not so effective as relevant reinforcement.]
reinforcement/intermittent, see reinforcement/schedules of.
reinforcement/partial, see reinforcement/schedules of.
reinforcement/periodic, see reinforcement/schedules of.
reinforcement/primary = 1. (emp, bt) the use in reinforcement of a stimulus that is a reinforcing stimulus to any animal of a given species without special training. [The complement of the empirically defined secondary reinforcement. If the history of the animal with respect to a reinforcing stimulus is unknown, this irrelevant and theoretically “loaded” term is applied both to the stimulus and to the operation.] =2. (th, bt) reinforcement by reduction of a primary drive.
reinforcement/random, see reinforcement/schedules of.
reinforcement/regular, see reinforcement/schedules of.
reinforcement/relevant = (emp, bt) if a reinforcing stimulus is directly related to or involved in the drive-reducing operations antecedent to conditioning, the operation of presenting it is termed relevant reinforcement. [If food is presented to a hungry animal after the occurrence of the response being conditioned, the procedure is termed relevant reinforcement.]
reinforcement/schedules of = (emp, bt) plans or procedures whereby the experimenter determines which ones of a series of responses will be reinforced. There are many possible schedules, which may yield quite different results (24). The most common ones are: continuous (or regular) reinforcement = the experimental procedure of reinforcing a response each time it occurs. [Regular reinforcement should have almost no analogue in the behavior of animals in a free environment.] intermittent (or partial) reinforcement = any one of several procedures in accordance with which a response is reinforced only on some of its occurrences. fixed-interval (periodic) reinforcement (FI) = a schedule of intermittent reinforcement in which reinforcements are delivered according to a fixed and regular time schedule. [E.g., the first response occurring in each 30-sec. interval may be reinforced. In some forms of fixed-interval reinforcement, the first response occurring at the end of a predetermined interval of invariant size after the last reinforced response may be reinforced. When this kind of schedule is maintained over a long period, animals of many species form temporal discriminations: they will emit groups of responses at the appropriate intervals and fail to respond at other times.] variable-interval (aperiodic or random) reinforcement (VI) = a schedule of intermittent reinforcement in which reinforcements are delivered according to a predetermined but irregular time schedule. [Under this schedule, on the average, reinforcement occurs at predetermined temporal intervals: the interval between each successive pair of reinforcements is randomly selected from a population of intervals of varying sizes whose mean value is predetermined. This schedule produces a low, but very stable, rate of response and very slow extinction.] fixed-ratio reinforcement (FR) = a schedule of intermittent reinforcement in which every nth instance of the operant response is reinforced. 
[Produces very high rates of response provided the ratio is not too large. In the case of respondents, and of operants whose discriminative stimuli are experimentally controlled (i.e., instrumental-conditioned responses), these schedules are equivalent to fixed-interval schedules.] variable-ratio reinforcement (VR) = a schedule of intermittent reinforcement in which, on the average, every nth instance of an operant response is reinforced. [This schedule yields a very high and stable rate of response if the ratio is not too large. For respondents and discriminated operants, it is equivalent to the variable-interval schedule.] multiple schedule = a schedule of intermittent reinforcement in which reinforcements by the same reinforcing stimulus are programmed according to two or more schedules in alternation. In multiple schedules, one of the component schedules is followed through an interval during which a discriminative stimulus is presented to the animal. At the end of this interval, the first schedule is followed directly by a different schedule, presented in association with a different discriminative stimulus; this second stimulus is then used for a period of time; and so on. [In experiments where discriminative stimuli are omitted, this schedule is termed mixed.] tandem schedule = a schedule of intermittent reinforcement in which single reinforcements are delivered according to two schedules acting in succession. [E.g., a single reinforcement may be delivered when five responses have been given (FR-5:1) beginning at the end of a 10-min. interval after the last reinforced response (FI-10 min.).]
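The schedule rules defined above are mechanical enough to be stated as short procedures. The following sketch is illustrative only; the function names and the particular arrangements chosen (e.g., reinforcing each response with probability 1/n as one way of realizing a variable ratio) are ours, not part of the definitions. Each function answers, for one response, the question: is this response reinforced?

```python
# Illustrative sketches of the four basic intermittent schedules.
# Each factory returns a respond() function that reports whether the
# current response is reinforced.
import random

def fixed_ratio(n):
    """FR-n: every nth instance of the response is reinforced."""
    count = 0
    def respond():
        nonlocal count
        count += 1
        if count == n:
            count = 0
            return True
        return False
    return respond

def variable_ratio(mean_n, rng=random.random):
    """VR: on the average, every nth response is reinforced; here each
    response is reinforced with probability 1/mean_n (one arrangement)."""
    def respond():
        return rng() < 1.0 / mean_n
    return respond

def fixed_interval(interval, clock):
    """FI: the first response after `interval` has elapsed since the last
    reinforcement is reinforced. `clock` is any callable returning time."""
    last = clock()
    def respond():
        nonlocal last
        if clock() - last >= interval:
            last = clock()
            return True
        return False
    return respond

def variable_interval(intervals, clock):
    """VI: like FI, but each required interval is drawn from a
    predetermined population of intervals with a fixed mean."""
    last = clock()
    required = random.choice(intervals)
    def respond():
        nonlocal last, required
        if clock() - last >= required:
            last = clock()
            required = random.choice(intervals)
            return True
        return False
    return respond
```

A tandem schedule can then be expressed as two such procedures acting in succession, and a multiple schedule as an alternation between procedures, each in force while its discriminative stimulus is presented.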
reinforcement/secondary (or conditioned) =1. (emp, bt) when a stimulus, initially ineffective, becomes a reinforcing stimulus after being presented to an animal contiguously in time with a reinforcing stimulus, it is termed a secondary or conditioned reinforcing stimulus, and the operation of presenting it is termed secondary reinforcement. =2. (th, bt) according to some versions of Hullian behavior theory, the reduction of anxiety that reinforces behavior.
reinforcement/variable-interval, see reinforcement/schedules of.
reinforcement/variable-ratio, see reinforcement/schedules of.
reinforcer = (lab slang, bt) a reinforcing stimulus.
reinforcing stimulus, see stimulus/reinforcing.
releaser = 1. (emp, eth) a synonym for sign stimulus. [According to many, this usage is incorrect, and only (2.) should be used.] =2. (emp, eth) a sign stimulus produced by the physical structure or behavior of an animal that releases some particular species-specific response or responses of another animal. [Releaser might be used by a behavior theorist for a stimulus presented by one animal that controls a response or responses of another. Often used with the qualifying adjective “social.” This concept is important for evolutionary theories of behavior.]
releasing mechanism/innate (IRM) = (th, eth) a hypothetical physiological “neurosensory” mechanism invoked to explain the action of sign stimuli. [“The strict dependence of an innate reaction on a certain set of sign stimuli leads to the conclusion that there must (italics added) be a special neurosensory mechanism that releases the reaction and is responsible for its selective susceptibility” (29). Activation of an IRM removes a block preventing passage of impulses from a center at one level of an instinctive hierarchy to a center at the next lower level.]
respondent = 1. (emp, bt) an adjective specifying parts of behavior for which eliciting stimuli are identified. [A respondent response may be elicited readily by presenting the appropriate stimuli. The responses of the physiologist’s “reflexes” are all respondents, as are the responses often classified as “autonomic” and the responses occurring in consummatory acts.] =2. (emp, bt) a respondent response.
respondent behavior, see behavior/respondent.
respondent response, see response/respondent.
response = 1. (emp) a class of parts or changes in parts of behavior such that its members can be observed to vary together systematically as a function of time or of other environmental variables. [Before a particular movement or other part of behavior can be placed in a particular class (that is, identified as an instance of a given response), procedures for the objective identification of instances of the response must be established. In some cases, objective identification is ensured by recording devices (bt), and in others by demonstrating that appropriately trained observers can recognize them without disagreement when they occur (eth). Similarly, some responses are specifiable in terms of specific movements (often termed respondent), and others in terms of their effect on the environment (operant). There is no logical restriction on the duration or complexity of the parts of behavior termed responses; this is determined by the animal’s behavior. Similarly, alternative analyses may be made, and one response may be part of another, or two response-instances may be members of the same response-class in some contexts but not in others. E.g., writing a particular word may be equivalent to speaking it in some experimental procedures. It is the animal’s whole behavior that enables the experimenter to identify responses. It is this concept of response, shared by ethologists and behavior theorists (but seldom stated), that makes this glossary possible. Basic to both, the concept establishes the subject matters of the two fields of investigation as one and sharply distinguishes both from other approaches to the investigation of behavior at both the observational and theoretical levels.] =2. (emp) loosely, but very commonly, a response-instance. =3. (emp, th) the muscular contraction, glandular secretion, or any other activity of an organism that results from stimulation (34). 
[“Any other activity” makes it difficult to determine how to use the word and whether it is an empirical or theoretical term. Some such definition as this leads to the difficulties that some theories of learning encounter in using the concept (5).]
response/amplitude of = 1. (emp, bt) a quantitative measure of one dimension of a response. [This measure may be made of a single response-instance or averaged over a number of them. The dimension selected is, of course, determined by the topography of the response and is usually the dimension most closely related to the name of the response. For salivary response, the number of drops of saliva defines the amplitude; and, for the eye-blink, it is millimeters excursion of the eyelid. Several measures of amplitude are possible for any response. Which is employed will depend upon experimental convenience. Some tentatively employed measures prove to be less sensitive to experimental variables than others.] = 2. (emp, bt) response magnitude. [A confusing usage.]
response/backward conditioned, see response/conditioned.
response/conditioned (CR) = (emp, bt) a response which appears or is modified as a consequence of conditioning. Subclasses of the Pavlovian CR include: backward conditioned response = (emp, bt) the conditioned response that is set up by classical conditioning when the unconditioned stimulus (US) precedes the conditioned stimulus (CS). [There is some controversy over the experimental referent for this term.] delayed conditioned response = (emp, bt) the conditioned response that is set up by classical conditioning when the interval between the onset of a continuing CS and the onset of the US is greater than several seconds. simultaneous conditioned response = (emp, bt) the conditioned response that is set up by classical conditioning when the interval between the onset (or termination) of the CS and the onset of the US ranges from zero to several seconds, and when CS and US overlap in time. trace conditioned response = (emp, bt) the conditioned response that is set up by classical conditioning when an interval of several seconds intervenes between the termination of CS and the onset of US.
response/consummatory, see consummatory act.
response/delayed conditioned, see response/conditioned.
response equivalence, response generalization =1. (emp, bt) a term applied to the observation that two or more instances of the same response may not be alike topographically. [Following the reinforcement of a response-instance of one topography, other response-instances of different topography, often with equivalent effects on the environment, may be given. By the definition of response, it follows that these parts of behavior are instances of the same response and also that a response cannot be identified on a purely a priori basis. It is the kind of data one collects that determines whether pressing the bar with the nose and pressing the bar with the feet are both subclasses of the response bar-pressing. Where two such parts of behavior do not prove to be equivalent (i.e., subclasses of the same response), they may often be made equivalent by the experimenter. See definition (2.).] =2. (emp, bt) a term applied to the observation that when instances of two or more responses have been reinforced in the same situation, and after the same drive operations, they will then appear interchangeably. [This phenomenon suggests that instances of either one may be considered as instances of a broader class of responses whose members are not easily identifiable a priori or that is formed by the operation of reinforcement. Thus, in a problem box, a cat may unlatch the door by pushing with his rump or with his front paw. So far as the quantitative properties of door unlatching are concerned, it does not matter much which he does; both will be reinforced, and both will occur. Similarly, a rat may run or swim through a maze. After extensive reinforcement, response equivalence tends to disappear, and the response becomes more stereotyped. It should be emphasized that the experimenter will find that he is not able to make any response equivalent to any other. He is limited not only by physical law but also by the animal’s behavior capacity. 
Response equivalence offers a variety of challenges both to experimenters and theorists.]
response/fractional anticipatory (antedating) goal (rg) = (th, bt) a hypothetical response, inferred from the laws of classical conditioning, that carries the burden of explaining the findings of experiments such as those on latent learning. [The rg occurs progressively earlier in a response chain during acquisition and provides hypothetical stimuli that become conditioned to ensuing responses in the chain. Their definition is such that they might be directly observed. See Spence (25). F.a.g.r. is an abbreviation that also appears; in speech, one hears “little rg” as well.]
response generalization, Syn. response equivalence.
response/goal (RG) = (emp, bt) the response given to a reinforcing stimulus [in Pavlovian as well as in operant or instrumental conditioning.]
response incompatibility = (emp, bt) when the occurrence of one response makes impossible or highly improbable the simultaneous or nearly simultaneous occurrence of a second response, then the two responses are termed mutually incompatible. [Responses may be topographically incompatible (an animal cannot turn its head to the right and to the left at the same time) or they may simply never occur together; thus a rat is not observed to exhibit emotional behavior and to eat at the same time. In general, the responses to aversive and negative reinforcing stimuli are incompatible with the responses to such reinforcing stimuli as food and water. The phenomenon of response incompatibility plays an important role in some contiguity theories of reinforcement.]
response-instance = (emp, bt) a single occurrence of a part of behavior that is a member of a response. [Thus, if a hammer strikes the patellar tendon five times, and each time the lower leg kicks forward, then each kick is an instance of the response “knee-jerk”; or, if a relay operated by the downward movement of a bar operates five times when a rat is in the vicinity of the bar, each one of the five operations is an instance of the response bar-press—even though on one occasion the animal pushed the bar with the right paw, on another with the left, and on another with his nose. Any part of behavior that is observed once and that can be specified may prove to be a response-instance, but it does not follow that the observer can state what response or responses it is a member of. In fact, he cannot do so by the definition of either response or response-instance until many observations are made.]
response/intensity of = (emp, eth) with a constant stimulus but under increasing values of variables involved in drive-establishing operations, or vice versa, a response may not only increase in magnitude but may alter in topography as well. Positions along such a graded continuum are referred to as different intensities of response. [For example, the individual may blink for a short time when a moderately intense light is flashed into his eyes; but when a very intense light is flashed, he will give a more intense response: he will blink, squint, secrete tears, show accommodatory spasm, and avert his head. With intermediate stimuli, intermediately intense responses will appear. Other analyses of such graded behavior (in terms of the successive passing of the thresholds of several responses, and of the magnitude of these responses) may be made in cases where the term intensity of response has been applied, although the latter will prove more convenient. Thus, “low intensity response” may be used; one might also speak of “incomplete” response or of a response that is a subclass of a more broadly specified response.]
response/latency of = (emp, bt) the measure of time elapsing between the onset of a stimulus and the beginning of the response to it.
response learning, see learning/response.
response/magnitude of = (emp, bt) a term applied to any one of several descriptive measures of a response-instance (amplitude or duration, reciprocal latency or velocity), or of a temporally restricted set of response-instances (such as frequency, relative frequency, percentage of occurrence), or rate of response, which state quantitatively the likelihood that a response-instance will occur under stated conditions during that brief span of time. [This term may be compared with response-strength. The latter, a synonym, is used preferentially when one is referring to response magnitude as a function of repeated elicitation or occurrence, with or without reinforcement.]
response/measurement of = (emp, bt) the assignment of numbers to responses or to response-instances according to the rules of measurement. Measures employed include: extinction/resistance to, response/amplitude of, response/intensity of, response/latency of, response/magnitude of, response/probability of, response/rate of. [See also response-strength and conditioning/ strength of.]
response/operant, see operant.
response/probability of = 1. (emp, bt) relative frequency of response, determined over a number of trials, in situations where any one of several alternative responses may be given. [The alternative response may be not-R, i.e., simply not giving the response.] =2. (th, bt) a synonym for strength of conditioning, preferentially employed by those who use rates of response as an experimental dependent variable (24). =3. (th, bt) the dependent variable in modern probability theories of behavior. [It is rigorously related via mathematical theory and coordinating definitions to such experimental response measures as latency, rate, and probability of response (2.) (4).]
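Sense (1.) of the preceding entry is a relative frequency and can be computed directly. The sketch below is illustrative only; the trial record in it is hypothetical, and the function name is ours.

```python
# Illustrative sketch of probability of response, sense (1.): relative
# frequency of a specified response over a number of trials. A 1 marks a
# trial on which the response occurred; a 0 marks a trial on which some
# alternative (including not giving the response) occurred.

def probability_of_response(trial_record):
    """Relative frequency of the response over the recorded trials."""
    return sum(trial_record) / len(trial_record)

trials = [1, 0, 1, 1, 0, 1, 1, 0, 1, 1]  # hypothetical record of 10 trials
p = probability_of_response(trials)       # 7 occurrences in 10 trials = 0.7
```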
response/rate of = (emp, bt) the number of response-instances of a response occurring in unit time.
response/reflex = (emp, bt) the response of a reflex (3.). A respondent. [Often termed an unconditioned response when referred to in the context of conditioning.]
response/respondent, see respondent.
response/ritualized = 1. (emp, eth) a response that appears in most members of a species, that is relatively invariant in topography, and that is typically a social releaser. =2. (th, eth) a response that appears in the repertoire of the members of several different but related species and that usually varies in stereotypy from species to species. [Ritualized responses occur in sharply restricted situations. Most ritualized responses are topographically very like some less stereotyped response that occurs in rather different situations and following other drive operations. The herring gull pulls grass with its bill in the course of nest building. Ritualized grass pulling occurs as a component of threat behavior. Theoretically, a ritualized response is one that has become specialized in the course of the evolution of the species as, most often, a social releaser. These responses are often associated with a special marking of the fur or feathers. E.g., ritualized preening occurs in the courtship of some species of the Anatidae; when it takes place, it reveals a light-colored spot not visible otherwise.]
response/simultaneous conditioned, see response/conditioned.
response/species-specific = (emp) a response that is a part of species-specific behavior.
response-strength = (emp, bt) measures of response when taken with reference to the measure of response at other times or under other conditions.
response/superstitious, see superstition.
response (or reaction) threshold = (th, bt) in theories that conceptualize as a state variable all the independent variables governing response (as Hull’s sEr), the minimal value of the state variable that will evoke a response.
response/topography of, see topography.
response/trace conditioned, see response/conditioned.
response/unconditioned (UR) = (emp, bt) a regular and measurable response elicited by the unconditioned stimulus in the classical conditioning experiment. [Usually a reflex response, except in higher order conditioning.]
reward = (Con) a colloquial term for a reinforcing stimulus.
reward/token = (emp, bt) experimentally, it is possible to reinforce the behavior of manipulating some object (as, for example, dropping a poker chip or coin in a slot, or rolling a marble into a hole). If it can then be demonstrated that conditioning occurs when this object is used as a reinforcing stimulus, the object is termed a token reward.
ritualization = (th, eth) a concept appearing in evolutionary theories of ritualized responses.
ritualized response, see response/ritualized.
runway = a straight pathway, usually without interruption, either along an elevated rail (Graham-Gagné apparatus) or through an enclosure, leading from a starting box to a goal box in which food or water is placed. The animal is restrained in the starting box. At the beginning of a trial, the door of the starting box is opened, and the animal is free to move down the pathway into the goal box.
schedule/multiple, see reinforcement/schedules of.
schedule/tandem, see reinforcement/schedules of.
schedules of reinforcement, see reinforcement/schedules of.
sensory, see adaptation/sensory and conditioning/sensory pre-.
shaping (of behavior) = (lab slang, bt) approximation conditioning.
Skinner-box = a space enclosed by a floor and walls of one or another material from which the animal studied cannot escape. It is provided with one or more objects (manipulanda), the movement of which will automatically deliver a reinforcing stimulus. The manipulandum may be a key (to be pecked by a pigeon), a bar (to be pressed by a rat), a string (to be pulled by a rat), or a panel (to be pushed by a dog or monkey). Used in studies of operant conditioning.
species-specific, see behavior/species-specific and response/species-specific.
spontaneous behavior, see behavior/spontaneous.
spontaneous recovery = (emp, bt) the term applied to the observation that a response which has been extinguished, and then remains neither elicited nor emitted over a period of absence from the experimental situation, will show, when next it appears, a strength greater than that observed at the termination of the extinction procedure. [A concept often used theoretically to account for regression. A similar phenomenon appears after habituation.]
statement = (emp, bt) a stretch of verbal behavior bounded by a change of speaker; hence one that is the discriminative stimulus for some behavior of the second speaker. [Statements are “sentences that are regularly directed to eliciting attention to continuous discourse” (6). Verbal behavior has the property of presenting stimuli both to the behaver himself and to those with respect to whom he is acting.]
stereotyping = (emp, bt) a term applied when members of a set of successive instances of a response do not vary in their quantitative topographic characteristics. [A response that has been performed with reinforcement many times will be stereotyped. A response that has been carefully differentially reinforced will also show stereotyping, but more quickly. Cf. ritualization.]
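Stereotyping can be given a rough quantitative sense: across successive instances, the measured topographic dimensions of a stereotyped response show little dispersion. The following is a minimal sketch with invented force measurements; the coefficient of variation as an index is my assumption, not the glossary’s.

```python
# Hypothetical sketch: stereotyping as low variability in one quantitative
# topographic dimension (here, imagined peak force of successive bar-presses).
from statistics import stdev, mean

def coefficient_of_variation(measures):
    """Relative variability of a topographic dimension across instances."""
    return stdev(measures) / mean(measures)

early_presses = [12.0, 18.5, 9.2, 21.0, 14.3]   # grams of peak force, early
late_presses  = [15.1, 15.4, 14.9, 15.2, 15.0]  # after many reinforcements

# the later, stereotyped series varies far less
print(coefficient_of_variation(early_presses) >
      coefficient_of_variation(late_presses))   # True
```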
stimulus = [Five usages must be distinguished among the writings of various students of behavior. Fortunately for the intellectual comfort of the reader (but for nothing else), in most cases the ambiguity of this term does not reveal itself, since most students of behavior have not shown any great interest in treating the problem of stimulation in great experimental detail.] =1. (emp) a physical event impinging on the receptors of an animal. =2. (emp) a physical event impinging on the receptors of an animal and capable of exciting those receptors. =3. (emp) a specified part, or change in a part, of the environment correlated in an orderly manner with the occurrence of a specified response. [See sign stimulus.] =4. (th) an event within the animal hypothesized to account for certain complex behavior. [See movement-produced stimulus, private stimulus, and drive stimulus.] =5. (lab slang) loosely used as synonymous with stimulus object (an object which produces stimuli) and with stimulus event (an event which produces stimuli). [Stimulus is a difficult term indeed. In many psychophysical and psychophysiological laboratories, the first usage is often heard (although it is perhaps always incorrect). The second appears as an empirical term in physiological studies of sensory discrimination and as a theoretical term, as in Hullian theory. The third is that explicitly stated by Skinner (22) and corresponds closely to the ethologist’s releaser or sign stimulus. The distinction between this usage and the preceding one is related to the distinction that Skinner draws between stimuli that “elicit” and stimuli that “set the occasion for” response, and that ethologists draw between reflex stimulation and the “releasing” action of a sign stimulus or releaser. It is implicitly employed empirically by other American behaviorists.
The stimulus for a response, by this usage, is not necessarily descriptively simple, or easily quantifiable, and can only be determined by experimental manipulation of the environment designed to isolate those parts of it on which a particular response is contingent. A response may or may not vary in magnitude as a function of the magnitude of the stimulus (where it can be measured or controlled). Sometimes, the stimulus proves to be complex but invariant, as “a (preferably red) patch at the tip of the lower mandible” for the food begging response of the gull chick (31), or “a (preferably black) angle” for certain cases of discriminated pecking in the pigeon. At other times, the stimulus turns out to be both complex and variable, as those stimuli controlling maze-running in the rat. For an interesting discussion of the concept of stimulus, see Skinner (22). See also stimulus/discriminative and stimulus/sign. In the following series of terms, definition (3.) applies to all the empirical ones, and definition (4.) to the theoretical ones. It should also be noted that the terms are not mutually exclusive in their application to parts of the environment; thus, a reinforcing stimulus may also be a discriminative stimulus. E.g., the click of the food magazine is a reinforcing stimulus for bar-pressing, and a discriminative stimulus for diving to the food-tray, for eating, and, indeed, for the next bar-press in the series. Note, too, that although a given environmental event is a stimulus of a given class at one time for an animal, it is not necessarily a stimulus of any class whatever on other occasions.]
stimulus/aversive = (emp, bt) a stimulus which, if it is applied following the occurrence of a response, decreases the strength of that response on later occurrences. [Most aversive stimuli are also negatively reinforcing stimuli. It is a live experimental problem to determine whether these are identical classes and hence whether only one term need be employed conceptually. Incidentally, both classes of stimuli also usually elicit the behavioral “symptoms” of fear as well as of avoidance. The decrease in response-strength that is produced by administering aversive stimuli has been experimentally demonstrated to be transitory; the strength of conditioning seems not to be affected.]
stimulus/conditioned (CS) = (emp, bt) in classical conditioning, a stimulus which originally does not evoke any response similar to the unconditioned response, but which during conditioning acquires the property of eliciting this response or a similar one. The originally neutral stimulus. [Properly, this should perhaps be “conditional” stimulus, but usage dictates this form.]
stimulus/conditioned reinforcing, see stimulus/reinforcing.
stimulus/consummatory =1. (emp) the stimulus for a consummatory act. =2. (emp, eth) a member of a set of stimuli the occurrence of which most often terminates a given sequence of behaviors, but which does not elicit an observable consummatory act (18).
stimulus/discriminative (SD) = (emp, bt) used with reference to operant behavior. A stimulus which sets the occasion on which a response will be reinforced. If a response is reinforced only when a discriminative stimulus is present, the animal will eventually make the response at a higher rate or in greater magnitude in the presence of that stimulus than in its absence. [The usage “to set the occasion for” parallels the ethologist’s “to release” and is based on the same empirical differences from the “elicitation” by a stimulus of the response of a physiologist’s reflex. Discriminative stimuli have most of the properties of sign stimuli. In neither of these cases is the stimulus physically quantifiable in any simple manner. It is, of course, possible for an experimenter to produce an easily quantifiable discriminative stimulus by differential reinforcement, but this is rarely done outside of experiments on sensory mechanisms. Since quantification is usually not readily effected, simple R = f(S) laws are often not statable, and consequently nonstimulus variables (e.g., deprivations or other drive operations) tend to be emphasized as controllers of behavior. This should not be taken to mean that discriminative stimuli or sign stimuli are quite unmanipulatable or that quantitative dimensions cannot be defined at all. “A (preferably red) patch at the tip of the lower mandible” defines the “normal” stimulus for food begging in the gull chick. Black or gray patches at slightly different locations also control the response, its strength being dependent on the degree of similarity to the specification of the normal sign stimulus. Here lies a problem for psychological scaling. (Negative discriminative stimuli are customarily termed SΔ.)]
stimulus/drive (S_D) = (th, bt) a stimulus, usually internal, which is hypothesized to occur in and to be uniquely determined by a given drive state. [This concept allows drives to have the empirical properties of stimuli. Theories often identify them with specific events within the organism, and so it is possible that they may in the future be empirically defined.]
stimulus/eliciting = (emp, bt) the stimulus of a reflex, or the conditioned stimulus of a classical conditioned response.
stimulus generalization = (emp, bt) the behavioral fact that a response conditioned to one stimulus (or set of stimuli) will be elicited by or will occur in the presence of another stimulus (or set of stimuli) which is similar to the conditioned stimulus or discriminative stimulus although there has been no specific training to it. Changes in strength of response to one will covary with changes in strength of response to the other. [Observed both in conditioning and in extinction.]
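A generalization gradient can be illustrated numerically. The exponential form, the parameters, and the numbers below are assumptions for illustration only; the glossary commits to no particular function, only to the fact that response strength falls off with dissimilarity from the conditioned stimulus.

```python
# Toy sketch of a generalization gradient (form and values are invented).

def generalized_strength(trained_value, test_value, peak=100.0, decay=0.5):
    """Response strength declines with the distance between a test
    stimulus and the stimulus trained to (a hypothetical exponential
    gradient, not a law stated by the glossary)."""
    return peak * decay ** abs(test_value - trained_value)

# trained at stimulus value 5; tested at neighboring values
print([round(generalized_strength(5, v), 1) for v in (3, 4, 5, 6, 7)])
# [25.0, 50.0, 100.0, 50.0, 25.0]
```

The symmetry about the trained value is an idealization; empirical gradients need not be symmetric.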
stimulus/movement-produced (mps) = (th, bt) hypothetical proprioceptive stimulus set up by a particular response of the animal, postulated in association theory. [Most responses may be postulated to produce such stimuli on the basis of sensory physiology so that statements about mps’s often reduce to statements about responses serving as stimuli for further behavior. Cf. stimulus/drive. Probable physiological correlates of these include nerve impulses in the proprioceptive fibers of the nervous system, so that some might think this an empirical concept. However, the operations used in experimentally manipulating them are restricted to increasing the physical work involved in a response and introducing responses of varying topography. As with drive stimuli, empirical definition may eventually become possible.]
stimulus/negative reinforcing, see stimulus/reinforcing.
stimulus/positive reinforcing, see stimulus/reinforcing.
stimulus/primary reinforcing, see stimulus/reinforcing.
stimulus/private = (emp, th, bt) a part, or a change in a part, of the animal occurring within the animal’s body surface, and hence one that is not observable (cannot be responded to) by others except through special instrumentation. [E.g., increased pressure within an infected appendix or around an abscessed tooth root. This concept is employed in the theoretical derivation of the verbal behavior termed “introspective.” It is the lack of control over the conditions of social reinforcement that renders the observer’s responses to private stimuli notoriously unreliable and, hence, that limits the usefulness of the introspective method to special cases.]
stimulus/proximal = (emp) a stimulus (2.) more completely specified in terms of physical events occurring at the receptor organ. [One may specify a stimulus (2.) as a circular patch of light, so many millimeters in diameter, of such and such a wave length, eight inches in front of the organism’s nose. The proximal stimulus here is the retinal image (physically specified) produced in the subject’s eye by this stimulus.]
stimulus/reinforcing =1. (emp, bt) in operant conditioning, if it can be shown that the occurrence or termination under specified conditions of an environmental event that is contingent upon some response of the animal alters some measure of that response as determined on later occasions when instances of it appear, and if these changes conform with those defining the empirical concepts of conditioning, then that event is a reinforcing stimulus. [Reinforcing stimuli can all be shown to elicit stable specific responses in those situations in which they reinforce a conditioned response. Thus, those stimuli for consummatory responses that have been used in conditioning have proven almost without exception to be positive reinforcing stimuli, and those that elicit escape behavior and fear are usually negative reinforcing stimuli. Some stimuli are reinforcing only when the animal to which they are presented has been previously treated according to certain drive operations: thus, food is not a reinforcing stimulus for the behavior of a satiated animal. Such qualification of the status of a particular part of the environment as a reinforcing stimulus should always be inferred when the term is used. Reinforcing stimuli are identified empirically. General theories of reinforcement attempt to account for their action in terms of properties common to them all. Thus, Hull’s drive-reduction theory stresses the view that such reinforcing stimuli act to reduce drives (either primary or secondary), and Guthrian contiguity theory asserts that they act by altering considerably the stimulation impinging on the animal. Some take the view that it is the response to reinforcing stimuli that is effective. It should be noted that many reinforcing stimuli can be observed to be positive reinforcers for some behaviors and negative reinforcers for others.] =2. (emp, bt) in classical conditioning, the unconditioned stimulus. =3. (emp, bt) loosely, but almost universally, a positive reinforcing stimulus.
stimulus/negative reinforcing = (emp, bt) a reinforcing stimulus that, if it is terminated following a response, increases the strength of instances of response occurring later in time. If such stimuli are administered following the occurrence of a response, its rate or magnitude usually diminishes. [See punishment and stimulus/aversive. Negative reinforcing stimuli usually control escape behavior. Here are a few stimuli that have been found to be negative reinforcing stimuli. The list is short, suggesting the relatively small number of experiments that have been done on this topic: for rats—electrical shocks to the paws, bright lights; for dogs—electrical shocks to the paws, sudden loud noises, hammer blows to the body; and for human adults—electrical shocks, “that’s wrong.”]
stimulus/positive reinforcing = (emp, bt) a reinforcing stimulus that, if applied following a response, increases the strength of instances of that response occurring later in time. [Here are a few stimuli that have been found experimentally to be positive reinforcing stimuli for most responses under appropriate conditions: for rats—laboratory chow, water, bread and milk, removal from the experimental situation, saccharin solutions, warmth, dark spaces, and (to the male) females in heat; for fish—meal worms, daphnia, roe; for dogs—meat, meat powder, water, head-pattings by humans; for human infants—lights, gongs; for chimpanzees—peanuts, bananas, and (after appropriate training) poker chips; for human adults—lights, “points,” “knowledge of results,” smiles, agreement, peanuts, silver, and gold.]
stimulus/primary reinforcing =1. (emp, bt) any stimulus that is effective as a reinforcing stimulus for all the known members of a strain or of a species at the beginning of an experiment. [Hence, any reinforcing stimulus which has not been shown experimentally to be a secondary reinforcing stimulus.] =2. (th, bt) any reinforcing stimulus that reduces a primary drive. [If all rats of a colony were raised on laboratory chow and had had nothing else to eat, an experimenter would find that laboratory chow is a primary reinforcer when they are hungry; their behavior could not be reinforced by cheese, nor would they eat it until they had been trained to eat it. If all rats were raised on laboratory chow and cheese, then both would prove to be primary reinforcers when they were used in an experiment.]
stimulus/secondary (or conditioned) reinforcing = (emp, bt) after a stimulus has been presented to an animal in spatial and temporal contiguity with a reinforcing stimulus one or more times, if, and only if, it then acts as a reinforcing stimulus itself, it is termed a secondary reinforcing stimulus. [Presentation of a secondary reinforcing stimulus is termed secondary reinforcement. The distinction between primary and secondary reinforcing stimuli is based upon the experimental history of the animal and nothing else. The connotations of the modifiers primary and secondary are unfortunate since they imply for many a distinction based on one theory of reinforcement, the drive-reduction theory. For this reason, the term conditioned reinforcing stimulus is to be preferred to the more commonly (and misleadingly) used secondary reinforcing stimulus, since the experimental operations that render a previously neutral stimulus a reinforcing stimulus are the same as those which produce classical conditioned responses.]
stimulus-response (S-R or SR) correlation = (emp, bt) an observed relationship between a stimulus and a response, such that a particular response can be shown to be dependent for its occurrence upon the just previous or concomitant occurrence of a specific stimulus or class of stimuli and to vary with variations in the conditions of presentation. [Cf. reflex.]
stimulus/secondary reinforcing, see stimulus/reinforcing.
stimulus/sign = (emp, eth) a specified part, or change in a part, of the environment correlated in an orderly manner with the occurrence of a species-specific response that is not a reflex response. [This term corresponds closely with stimulus (3.) and almost exactly with the term discriminative stimulus. Stimulus (3.) is the definition used empirically by behaviorists. The difference lies in the class of response controlled. A sign stimulus can be identified only on the basis of experimental work. It usually turns out to be specifiable in rather complex but sometimes exact terms, and it is often not conveniently describable in the language of physics and physiology. This leads to the use of literary terms such as “configuration” and then on to the use of Gestalt. Sign stimuli “release” behavior, just as discriminative stimuli “set the occasion for” responses. See also stimulus/super-normal sign.]
stimulus/super-normal sign = (emp, eth) a term applied to certain sign stimuli that have proven amenable to quantification along some scale. The sign stimulus, as it occurs in the field, falls at some point on this scale. To stimuli below this value, strength or intensity of response is less. If responses are given at greater strength or intensity to stimuli above this value, the stimuli of these magnitudes are referred to as “super-normal stimuli” (that is, they are more effective than “normal” stimuli). [A good example is the oversized dummy egg to which the oyster-catcher responds with more vigorous brooding activity than it does to its own egg. The egg is too large for sitting, but the oyster-catcher climbs upon it nonetheless, topples off it, climbs on again, and so on, all the while ignoring its own much smaller egg that lies nearby (29).]
stimulus threshold = (emp) the class of those values of quantified stimuli that will elicit some defined constant response at a fixed strength of less than maximal value. [E.g., the absolute terminal threshold of vision is defined in terms of the photometric brightness of a stimulus patch of specified characteristics that will elicit “Yes, I see it” from a subject on 50% of all the occasions on which it is presented. Note that not only the stimulus characteristics, but the response characteristics and the magnitude as well, must be specified in defining a threshold.]
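The 50% criterion in the example lends itself to a small computation: given proportions of “Yes, I see it” reports at several stimulus values, the threshold is the interpolated stimulus value at which the proportion crosses 0.5. All names and data below are invented for illustration; the linear interpolation between measured points is an assumption, not a method the glossary prescribes.

```python
# Illustrative sketch (hypothetical data): estimating a stimulus threshold
# as the stimulus value eliciting the defined response on 50% of
# presentations, by linear interpolation between measured points.

def threshold(intensities, proportions, criterion=0.5):
    """Interpolate the stimulus value at which the response proportion
    crosses the criterion (assumes proportions are non-decreasing)."""
    for (x0, p0), (x1, p1) in zip(zip(intensities, proportions),
                                  zip(intensities[1:], proportions[1:])):
        if p0 <= criterion <= p1:
            return x0 + (criterion - p0) * (x1 - x0) / (p1 - p0)
    raise ValueError("criterion not bracketed by the data")

# proportion of "Yes, I see it" reports at each brightness (arbitrary units)
brightness = [1, 2, 3, 4, 5]
p_yes = [0.0, 0.2, 0.4, 0.8, 1.0]
print(round(threshold(brightness, p_yes), 3))  # 3.25
```

As the entry notes, the estimate is meaningless unless the stimulus dimension, the response, and the criterion strength are all specified.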
stimulus-trace = (th, bt) a hypothetical after-effect in the conceptual nervous system that persists for a short time after the termination of a stimulus and that has the properties of a stimulus in controlling response. [Not to be confounded with the physiologically observable nerve impulses that may be recorded from afferent fibers after the withdrawal of a stimulus from a receptor. Stimulus-traces are theoretical, and their properties are what they must be to satisfy the needs of theory and not what the physiologist observes. Theorists tend to overlook discrepancies and sanguinely look to the day when the discrepancies will disappear so that stimulus-trace conceptions can take on an empirical status. See Hull (16).]
stimulus/unconditioned (US) = (emp, bt) in classical conditioning, a stimulus which evokes or elicits a regular and measurable response (the unconditioned response). [Usually the stimulus of a reflex.]
superstition =1. (emp, bt, lab slang) unless an experimenter is very careful, during approximation conditioning of a rat, or a pigeon, or a human subject, he may reinforce a response in which he is not interested or reinforce too often one of the responses that is in the approximation sequence of responses. These responses, occurring henceforth at a relatively high rate or in great strength, are referred to as superstitions or as superstitious responses. [They tend to recur through the animal’s experimental history and, hence, render data on the response in which the experimenter is interested relatively disorderly.] =2. (emp, bt, lab slang) if stimuli that are usually reinforcing (e.g., food) are randomly delivered to a pigeon over a long period of time irrespective of his behavior, at the end of the period the subject can be observed to repeat over and over some response. Such a response is termed a superstition (23). [It is probably one that occurred just before food was presented, that then increased in rate, was reinforced again, and so on. It has been conditioned despite the fact that reinforcement was not experimentally contingent on it. This procedure has not been tried on other species.]
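The second procedure can be caricatured in a toy simulation. This is an assumption-laden sketch, not a rendering of Skinner’s experiment: the response names, the multiplicative strengthening rule, and the fixed delivery schedule are all invented. It shows only the positive-feedback logic the commentary describes: whatever response happens to precede delivery is strengthened, becomes more probable, and so is more likely to precede the next delivery.

```python
# Toy simulation (hypothetical throughout): "reinforcement" arrives on a
# fixed schedule, contingent on nothing the subject does, yet one response
# tends to come to dominate the repertoire.
import random

random.seed(1)
responses = ["turn", "peck-corner", "head-bob"]
weights = {r: 1.0 for r in responses}  # relative emission strengths

for tick in range(1, 1001):
    # emit a response with probability proportional to its strength
    emitted = random.choices(responses, [weights[r] for r in responses])[0]
    if tick % 10 == 0:           # delivery every 10 ticks, response-independent
        weights[emitted] *= 1.5  # the response just emitted is strengthened
superstition = max(weights, key=weights.get)
print(superstition, sorted(round(w, 2) for w in weights.values()))
```

Because each delivery multiplies exactly one strength by 1.5, the product of all three strengths is 1.5 to the power of the number of deliveries, whichever response wins.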
superstitious, see superstition.
suppression/conditioned = (emp, bt) the experimental procedure of presenting, on a number of occasions for a short period of time (e.g., 1-5 min.) during the performance of a given pattern of behavior, a neutral stimulus and of presenting, at its termination, a strongly aversive stimulus, neither being contingent on the animal’s behavior. Conditioned suppression is said to occur if, and only if, the response-strength is observed to decrease during the presentation of the initially neutral stimulus.
tandem schedule, see reinforcement/ schedules of.
taxis = (emp, eth) a term applied to a broad class of behaviors specifiable in terms of the responses (locomotor and orientative) and of the stimuli (most often visual) controlling them. [The broad class of taxes is divided into subclasses (e.g., klinotaxis, menotaxis) on the basis of (a) the physiological mechanisms that have been experimentally demonstrated or theoretically inferred, as involved in the behavior, (b) the classes of stimuli controlling them, or (c) the ontogenetic basis of the correlation. This heterogeneous system of classification is of limited descriptive usefulness; it includes both empirical and theoretical concepts. Perhaps its greatest utility is to emphasize certain theoretical problems that, when approached experimentally, may lead to the elimination of the whole set.]
theory = within the usage of this glossary, a set of statements about empirical concepts, relationships among them, and hypotheses postulating further relationships among them. Theories often include statements involving the empirical concepts of universes of discourse other than that in which the theory is derived. [For an extensive discussion of some theories of learning, see Spence (25).]
theory/behavior = a generic term for the empirical and theoretical study of behavior in experimental psychology. [The label, despite its wide use, is doubly misleading. Most behavior theorists study only behavior that is clearly learned, and the major part of their effort is experimental not theoretical. A pejorative synonym is “rat” psychology, although the rat is by no means the only animal studied. Two general classes of theory are encountered. According to one class (cognitive theory), animals learn that stimulus A follows stimulus B or learn where stimulus A is in an environment. That is to say, the animal’s changed behavior depends upon its acquiring something that corresponds in some (inexplicit) way to the physical situation in which it is behaving. According to the other class (S-R theory), the animal learns to give response A in the presence of stimulus A. That is to say, the animal’s changed behavior depends upon its acquiring a new S-R correlation. Syn. learning theory.]
theory/cognitive = a generic term for theories of learning that stress latent learning and place learning experimentally, and perception and cognition theoretically. In terms of the conditions asserted as necessary and sufficient for learning to occur, it may be characterized briefly as sensory-sensory contiguity, nonreinforcement theory. [Tolmanian theory, S-S theory, and expectancy theory are almost synonymous.]
theory/contiguity = a generic term for theories of learning that consider the occurrence of a response in the presence of a stimulus as the necessary and sufficient condition for learning. One-trial conditioning, stimulus populations, and extinction via interference and habituation are key theoretical tools. Reinforcement serves only to protect a response from “unlearning” and is derived from more elementary principles. This theory has recently been stated rigorously in terms of mathematical probability theory with good results (4). In terms of the conditions asserted as necessary and sufficient for learning to occur, it may be characterized briefly as stimulus-response contiguity, nonreinforcement theory. [Association theory and Guthrian theory are almost synonymous.]
theory/continuity = the theory of discrimination learning which states that the animal’s response is determined by the total number of responses reinforced in the presence of, or shortly after the presence of, the positive discriminative stimulus. Implicitly a part of most stimulus-response theories.
theory/drive-reduction = a generic term for those theories of learning which assert that both the occurrence of a reinforcing stimulus and contiguity in time of response with stimulus, taken together, are necessary and sufficient conditions for learning. The reinforcing stimulus must further reduce some drive or need if learning is to occur. In terms of the conditions asserted as necessary and sufficient for learning to occur, it may be characterized briefly as stimulus-response contiguity, drive-reduction theory. [It is interesting to compare drive-reduction with Lorenz’ “consumption of specific-action energy.” Hullian theory is often used synonymously.]
theory/Hullian = a generic term for either (a) drive-reduction theories of learning or (b) the whole complex of empirical and theoretical statements put forward by Hull and his school, which incorporate many concepts other than drive-reduction.
theory/learning, see theory/behavior.
theory/noncontinuity = a theory of discrimination learning according to which the animal’s behavior is not dependent upon the total number of responses reinforced in the presence of, or shortly after the presence of, the positive discriminative stimulus, but upon successively adopted “expectancies.” Occurs in cognitive theory.
theory/probability = a generic name for theories of learning that use as their model for behavior the mathematics of probability. They are typically S-R theories.
theory/reinforcement = a generic name for the work of (largely) S-R theorists who stress, either experimentally or theoretically, the operation of reinforcement. Besides the Hullians, the Guthrians are also included in this group, as well as the anti-theoretical followers of Skinner. All stress reinforcement as an experimental tool but make widely differing theoretical use of it.
theory/stimulus-response (S-R) = a generic name for theories of behavior that phrase all descriptions of behavior in terms of stimulus and response, that assume the necessity for a response to occur if learning occurs, and that attempt to predict specific behavior. [Almost synonymous with reinforcement theory.]
threat, see behavior/threat.
threshold/reaction, see response threshold.
threshold/response, see response threshold.
threshold/stimulus, see stimulus threshold.
thwarting, see frustration.
T-maze = a T-shaped pathway basically similar to a runway. The starting box is at the base of the T, and goal boxes are at each end of the cross-piece. On a trial, food or water may be placed in either or both goal boxes. Discriminative stimuli are sometimes placed in the arms of the T.
T-maze/multiple = a series of T-shaped runways, with a starting box at the base of the first T, and a goal box at the end of one of the arms of the last of the series. The base of each successive T opens into the top side of one of the arms of the preceding T.
topography (of a response) = (emp, bt) the full quantitative specification of all the relevant, physically measurable dimensions of a response. [If a response is stereotyped, the topographies of instances of it are very similar quantitatively to one another. This word, borrowed it seems from geography, is preferred by many to “pattern of response,” which is sometimes used synonymously.]
train = (emp) to subject an animal to experimental procedures such that one or more of its responses become conditioned.
training/discrimination, see discrimination training.
transfer = (emp, bt) a name for observations (over one or more trials) made on the strength of a set of responses initially given to one set of stimuli, when similar measurements are made of the same set of responses in the presence of other sets of stimuli. [Transfer may be positive or negative; the term insight is often applied to instances of positive transfer. The term transfer overlaps in its referents with stimulus generalization; all cases of stimulus generalization are cases of transfer.]
trial = (emp, bt) a single, experimentally manipulated occasion on which an instance of a specified response is elicited or may occur. [This general definition is subject to qualification in particular cases. Thus, each experimenter typically reports his precise usage in applying the term. Some experimenters define a trial as the occasion when a response could occur, without respect to whether it in fact did; others use the term trial only if a response occurs when the occasion is set; and so on. The former usage is preferred.]
trial-and-error learning, see learning/trial-and-error.
trial and error/vicarious = (emp, bt) movement of the head of an animal back and forth at the choice point between one and another of the alternative pathways of a maze, or between one or the other door of a Lashley jumping-stand. [This is one of those empirical terms that is “loaded”; that is to say, the words used in the term have rather direct theoretical implications. See intention movement.]
unlearned, see behavior/unlearned.
vacuum activity = (emp, eth) the occurrence of a fixed action pattern in the apparent absence of its usual releaser or sign stimulus. [Restated, this is the appearance of a respondent as an operant. According to theory these events occur when an animal is in a state of high drive.] Ger. Leerlaufreaktion.
verbal, see behavior/verbal.
warm-up = (emp) the relatively rapid increase in response-strength that may occur, independently of reinforcement contingencies, over the first few occasions within a short period of time on which a specified response occurs. [Occurs in both unlearned and learned behavior.]
work-decrement = (emp, bt) decrement in response magnitude that is an increasing function of the number of occurrences of the response and of the parameters of physical work involved in it. [Work decrement is the empirical basis of such concepts as Hull’s reactive inhibition and is similarly associated with such intuitive notions as fatigue.]
1 To risk a pun, the writer remains ignorant of what a cognition is. So far as he knows, he has never had one, and no one has ever been able to correct him on this, or tell him how to have one, or how to recognize it if he did.
2 Some will question this cavalier treatment of “intuitive” concepts; some may even suggest that these are more important than others. There is certainly a place for intuitive concepts, which are often a major tool in an experimenter’s pre-experimental and pre-theoretical speculations, as are his casual and quite uncontrolled observations. Indeed, I suspect that a major part of the behaviorist’s problem is to provide a scientific account of how and why intuitive concepts are developed and used, that is, to “explain” them in terms of human verbal behavior and its discriminative control. But intuitive concepts do not belong in statements of fact or in statements of theory set up to deal with fact, just as casual observations have no scientific status as fact. In passing, ethologists may be interested in the treatment accorded to the (often intuitive) concept “emotion” in a recent 800-page advanced text of experimental psychology (19). The word simply does not appear in the book; two derivatives appear once only.
3 This absence of physiological definitions should be interpreted as showing, not an “anti-physiological” bias, but rather a distaste for “physiologizing,” i.e., for explaining behavior in terms of the properties of a mythical nervous system with handy (but unverifiable) characteristics. A greater volume of experimental work on the physiology of the behavior that behaviorists deal with will presumably make the inclusion of physiological definitions necessary in future glossaries such as the present one.
4 For explanation of abbreviations and procedures used, see discussion in the Preface of the rules followed in preparing the glossary.