Operant conditioning, also known as instrumental conditioning, is a method of learning normally attributed to B.F. Skinner, where the consequences of a response determine the probability of it being repeated. Through operant conditioning, behavior which is reinforced (rewarded) will likely be repeated, and behavior which is punished will occur less frequently.

By the 1920s, John B. Watson had left academic psychology, and other behaviorists were becoming influential, proposing new forms of learning other than classical conditioning. Perhaps the most important of these was Burrhus Frederic Skinner, although, for obvious reasons, he is more commonly known as B.F. Skinner.

Skinner's views were slightly less extreme than those of Watson (1913). Skinner believed that we do have such a thing as a mind, but that it is simply more productive to study observable behavior rather than internal mental events.

The work of Skinner was rooted in a view that classical conditioning was far too simplistic to be a complete explanation of complex human behavior. He believed that the best way to understand behavior is to look at the causes of an action and its consequences. He called this approach operant conditioning.

Skinner is regarded as the father of operant conditioning, but his work was based on Thorndike's (1898) Law of Effect. According to this principle, behavior that is followed by pleasant consequences is likely to be repeated, and behavior followed by unpleasant consequences is less likely to be repeated.

Skinner introduced a new term into the Law of Effect: reinforcement. Behavior which is reinforced tends to be repeated (i.e., strengthened); behavior which is not reinforced tends to die out or be extinguished (i.e., weakened).

Skinner (1948) studied operant conditioning by conducting experiments using animals, which he placed in a "Skinner box," similar to Thorndike's puzzle box. A Skinner box, also known as an operant conditioning chamber, is a device used to objectively record an animal's behavior in a compressed time frame. An animal can be rewarded or punished for engaging in certain behaviors, such as lever pressing (for rats) or key pecking (for pigeons).

Skinner identified three types of responses, or operants, that can follow behavior.