Behaviorism
Behaviorism (also spelled behaviourism)[1] is a systematic approach to understanding the behavior of humans and other animals.[2] It assumes that behavior is either a reflex evoked by the pairing of certain antecedent stimuli in the environment, or a consequence of that individual's history, especially reinforcement and punishment contingencies, together with the individual's current motivational state and controlling stimuli. Although behaviorists generally accept the important role of heredity in determining behavior, they focus primarily on environmental events. The cognitive revolution of the late 20th century largely replaced behaviorism as an explanatory theory with cognitive psychology, which, unlike behaviorism, examines internal mental states.
Not to be confused with Behavioralism.
Behaviorism emerged in the early 1900s as a reaction to depth psychology and other traditional forms of psychology, which often had difficulty making predictions that could be tested experimentally. It nevertheless derived from earlier research in the late nineteenth century, such as Edward Thorndike's pioneering work on the law of effect, the principle that consequences strengthen or weaken the behavior that produces them.
With a 1924 publication, John B. Watson devised methodological behaviorism, which rejected introspective methods and sought to understand behavior by measuring only observable behaviors and events. It was not until the 1930s that B. F. Skinner suggested that covert behavior—including cognition and emotions—is subject to the same controlling variables as observable behavior, which became the basis for his philosophy called radical behaviorism.[3][4] While Watson and Ivan Pavlov investigated how (conditioned) neutral stimuli come to elicit reflexes in respondent conditioning, Skinner assessed the reinforcement histories associated with the discriminative (antecedent) stimuli in whose presence behavior is emitted; this technique became known as operant conditioning.
The application of radical behaviorism—known as applied behavior analysis—is used in a variety of contexts, ranging from applied animal behavior and organizational behavior management to the treatment of mental disorders such as autism and substance abuse.[5][6] In addition, while the behaviorist and cognitive schools of psychological thought disagree theoretically, they have complemented each other in the cognitive-behavioral therapies, which have demonstrated utility in treating certain pathologies, including simple phobias, PTSD, and mood disorders.
Experimental and conceptual innovations[edit]
Because experimental behavioral psychology is closely related to behavioral neuroscience, the first research in the area can be dated to the beginning of the 19th century.[15]
Later, this essentially philosophical position gained strength from the success of Skinner's early experimental work with rats and pigeons, summarized in his books The Behavior of Organisms[16] and Schedules of Reinforcement.[17] Of particular importance was his concept of the operant response, of which the canonical example was the rat's lever-press. In contrast with the idea of a physiological or reflex response, an operant is a class of structurally distinct but functionally equivalent responses. For example, while a rat might press a lever with its left paw or its right paw or its tail, all of these responses operate on the world in the same way and have a common consequence. Operants are often thought of as species of responses: the individual responses differ, but the class coheres in its function (shared consequences in the case of operants, reproductive success in the case of species). This is a clear distinction between Skinner's theory and S–R theory.
Skinner's empirical work expanded on earlier research on trial-and-error learning by researchers such as Thorndike and Guthrie with both conceptual reformulations (Thorndike's notion of a stimulus–response "association" or "connection" was abandoned) and methodological ones (the use of the "free operant", so called because the animal was now permitted to respond at its own rate rather than in a series of trials determined by the experimenter). With this method, Skinner carried out substantial experimental work on the effects of different schedules and rates of reinforcement on the rates of operant responses made by rats and pigeons. He achieved remarkable success in training animals to perform unexpected responses, to emit large numbers of responses, and to demonstrate many empirical regularities at the purely behavioral level. This lent some credibility to his conceptual analysis. It is largely his conceptual analysis that made his work much more rigorous than that of his peers, a point which can be seen clearly in his seminal work Are Theories of Learning Necessary?, in which he criticized what he viewed to be theoretical weaknesses then common in the study of psychology. An important descendant of the experimental analysis of behavior is the Society for Quantitative Analysis of Behavior.[18][19]
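The reinforcement schedules Skinner studied can be given a minimal computational sketch. The example below is illustrative only (the function and names are not from the source); it implements a fixed-ratio schedule, under which every n-th response is reinforced:

```python
# Illustrative sketch of a fixed-ratio (FR) reinforcement schedule, as used
# in free-operant experiments: the subject responds at its own rate, and the
# apparatus delivers a reinforcer after every n-th response.

def fixed_ratio(n: int):
    """Return a function that reinforces every n-th response."""
    count = 0
    def respond() -> bool:
        nonlocal count
        count += 1
        if count == n:
            count = 0
            return True   # reinforcer delivered
        return False      # no reinforcer
    return respond

fr5 = fixed_ratio(5)
outcomes = [fr5() for _ in range(10)]  # the 5th and 10th responses are reinforced
```

Variable-ratio, fixed-interval, and variable-interval schedules differ only in the rule that decides when `respond()` returns `True`; the high, steady response rates Skinner observed under ratio schedules fall out of how often that rule pays off.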
Relation to language[edit]
As Skinner turned from experimental work to concentrate on the philosophical underpinnings of a science of behavior, his attention shifted to human language with his 1957 book Verbal Behavior[20] and other language-related publications;[21] Verbal Behavior laid out a vocabulary and theory for functional analysis of verbal behavior, and was strongly criticized in a review by Noam Chomsky.[22][23]
Skinner did not respond in detail but claimed that Chomsky failed to understand his ideas,[24] and the disagreements between the two, and the theories involved, have been further discussed.[25][26][27][28][29][30] Innateness theory, which has itself been heavily critiqued,[31][32] is opposed to the behaviorist theory that language is a set of habits acquired by means of conditioning.[33][34][35] According to some critics, such a conditioning process would be too slow to explain a phenomenon as complicated as language learning. What was important for a behaviorist analysis of human behavior, however, was not language acquisition so much as the interaction between language and overt behavior. In an essay republished in his 1969 book Contingencies of Reinforcement,[21] Skinner took the view that humans could construct linguistic stimuli that would then acquire control over their behavior in the same way that external stimuli could. The possibility of such "instructional control" over behavior meant that contingencies of reinforcement would not always produce the same effects on human behavior as they reliably do in other animals. The focus of a radical behaviorist analysis of human behavior therefore shifted to an attempt to understand the interaction between instructional control and contingency control, and also to understand the behavioral processes that determine which instructions are constructed and what control they acquire over behavior. More recently, a new line of behavioral research on language has been pursued under the name of relational frame theory.[36][37][38][39]
Operant conditioning was developed by B. F. Skinner in 1938 and is a form of learning in which the frequency of a behavior is controlled by its consequences.[42][16][43][44] In other words, behavior is controlled by historical consequential contingencies, particularly reinforcement (a consequence that increases the probability of a behavior) and punishment (a consequence that decreases that probability).[42] Consequences may be either positive (a stimulus is presented following a response) or negative (a stimulus is withdrawn following a response).[45]
The following are the four common types of consequences in operant conditioning:[46]
Positive reinforcement: a stimulus is presented following a response, and the response becomes more frequent.
Negative reinforcement: a stimulus is removed following a response, and the response becomes more frequent.
Positive punishment: a stimulus is presented following a response, and the response becomes less frequent.
Negative punishment: a stimulus is removed following a response, and the response becomes less frequent.
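As a rough illustration (the function and its names are hypothetical, not from the literature), the two distinctions involved (whether a stimulus is presented or withdrawn, and whether the behavior becomes more or less frequent) can be sketched as a small classifier:

```python
# Illustrative sketch: the four consequence types in operant conditioning,
# classified by whether a stimulus is presented or withdrawn ("positive" vs
# "negative") and whether the behavior's future frequency rises or falls
# ("reinforcement" vs "punishment").

def classify_consequence(stimulus_presented: bool, behavior_increases: bool) -> str:
    """Return the operant-conditioning label for a consequence."""
    if behavior_increases:
        # Any consequence that raises response frequency is reinforcement.
        return "positive reinforcement" if stimulus_presented else "negative reinforcement"
    # Any consequence that lowers response frequency is punishment.
    return "positive punishment" if stimulus_presented else "negative punishment"
```

Note that "positive" and "negative" here describe the operation (adding or removing a stimulus), not whether the consequence is pleasant: negative reinforcement, for example, strengthens behavior by removing an aversive stimulus.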
A classic experiment in operant conditioning is the Skinner box, or operant conditioning chamber (a descendant of Thorndike's "puzzle box"), used to test the effects of operant conditioning principles on rats, cats, and other species. From these experiments, Skinner discovered that the rats learned very effectively if they were rewarded frequently with food. He also found that he could shape the rats' behavior (that is, create new behavior) through the use of rewards, which could, in turn, be applied to human learning as well.
Skinner's model was based on the premise that reinforcement is used to strengthen desired actions or responses while punishment is used to suppress undesired ones. The theory showed that humans and other animals tend to repeat any action that leads to a positive outcome and avoid any action that leads to a negative outcome. His experiments with pigeons showed that a positive outcome leads to learned behavior, since the pigeon learned to peck the disc in return for the reward of food.
These historical consequential contingencies subsequently give rise to (antecedent) stimulus control; but in contrast to respondent conditioning, in which antecedent stimuli elicit reflexive behavior, operant behavior is merely emitted, so the antecedent stimulus does not force its occurrence. The controlling stimuli include the discriminative stimulus, in whose presence a response has previously been reinforced, and the stimulus delta, in whose presence it has not.[46]
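As a rough sketch (all names are hypothetical), the three-term contingency (antecedent, behavior, consequence) and the resulting stimulus control can be modeled by recording a reinforcement history per antecedent stimulus:

```python
# Illustrative model of stimulus control in operant conditioning: a stimulus
# becomes discriminative when responding in its presence has been reinforced.
# The class and method names are invented for this sketch.

from collections import defaultdict

class Operant:
    def __init__(self):
        # Reinforcement history, keyed by the antecedent stimulus present
        # when the response was emitted.
        self.reinforcements = defaultdict(int)

    def record(self, antecedent: str, reinforced: bool) -> None:
        """Record the consequence of one emitted response."""
        if reinforced:
            self.reinforcements[antecedent] += 1

    def is_discriminative(self, antecedent: str) -> bool:
        """A discriminative stimulus is one in whose presence responding was reinforced."""
        return self.reinforcements[antecedent] > 0

lever_press = Operant()
lever_press.record("light on", reinforced=True)    # food delivered
lever_press.record("light off", reinforced=False)  # no food
```

After this history, "light on" functions as a discriminative stimulus and "light off" as a stimulus delta; the antecedent does not elicit the lever-press but makes its emission more likely.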
Behavior analysis and culture[edit]
From its inception, behavior analysis has centered its examination on cultural phenomena (Skinner, 1953,[70] 1961,[71] 1971,[72] 1974[73]). Nevertheless, the methods used to tackle these phenomena have evolved. Initially, culture was perceived as a factor influencing behavior; later it became a subject of study in itself.[74] This shift prompted research into group practices and the potential for significant behavioral transformations on a larger scale. Following Glenn's (1986) influential work, "Metacontingencies in Walden Two,"[75] numerous research endeavors exploring behavior analysis in cultural contexts have centered around the concept of the metacontingency. Glenn (2003) posited that understanding the origins and development of cultures requires analysis at a further level, beyond the evolutionary and behavioral principles governing species characteristics and individual learned behaviors.[76]
Behavior informatics and behavior computing[edit]
With the rapid growth of big behavioral data and applications, behavior analysis has become ubiquitous. Understanding behavior from the informatics and computing perspective is increasingly critical to an in-depth understanding of what behaviors are formed, why and how they interact, evolve, and change, and how they affect business and decision-making. Behavior informatics and behavior computing explore behavior intelligence and behavior insights in depth from the informatics and computing perspectives.
Pavel et al. (2015) found that in the realm of healthcare and health psychology, substantial evidence supports the notion that personalized health interventions are more effective than standardized approaches. They also noted that recent progress in sensor and communication technology, coupled with advances in data science and computational modeling, has made it possible to comprehensively measure behaviors occurring in real-life settings, and holds significant potential for revolutionizing interventions aimed at changing health behavior. Together, these elements have laid the groundwork for the emerging discipline known as behavioral informatics: a scientific and engineering domain encompassing behavior tracking, evaluation, computational modeling, deduction, and intervention.[77]
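A minimal sketch of the tracking-evaluation-intervention cycle described here might look as follows (the measure, goal, and messages are illustrative assumptions, not part of any real system):

```python
# Illustrative sketch of a behavioral-informatics cycle: track behavior
# events, evaluate a summary measure against a personalized goal, and decide
# whether an intervention should be triggered. All names and thresholds are
# invented for this example.

from statistics import mean

def evaluate_and_intervene(daily_step_counts: list, goal: int = 8000) -> str:
    """Evaluate tracked behavior against a personalized goal."""
    avg = mean(daily_step_counts)  # evaluation: computational summary of tracked data
    if avg >= goal:
        return "no intervention needed"
    # Deduction + intervention: tailor the prompt to the measured gap.
    deficit = goal - avg
    return f"suggest adding ~{deficit:.0f} steps per day"
```

In a real system, the tracking stage would draw on the sensor and communication technologies the authors describe, and the goal itself would be personalized rather than fixed.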