What is the theory that is based upon the notion that learning is a result of change in overt behavior?

Skinner's theory is based upon the idea that learning is a function of change in overt behavior. Changes in behavior are the result of an individual's response to events [stimuli] that occur in the environment. When a particular Stimulus-Response [S-R] pattern is reinforced [rewarded], the individual is conditioned to respond.

What are the key concepts of behavioral theory?

Key concepts of behaviorism include the stimulus-response [S-R] relationship, classical and operant conditioning, and the notions of reinforcement and punishment.

What is the difference between skills and behaviours?

To recap, behaviours are the way you act. Skills are the abilities you have learned. Many of the behaviours or skills that you learn in life can help you at work, and many that you learn at work will help you in life.

How does imitation help learning?

Imitation is a crucial aspect of skill development, because it allows us to learn new things quickly and efficiently by watching those around us. Most children learn everything from gross motor movements, to speech, to interactive play skills by watching parents, caregivers, siblings, and peers perform these behaviors.

What is meant by imitation in social learning theory?

Imitation is a term used by social learning theorists to describe the way in which an individual copies the behaviour of a role model.

What is Behaviourism learning theory?

Behaviorism or the behavioral learning theory is a popular concept that focuses on how students learn. This learning theory states that behaviors are learned from the environment, and says that innate or inherited factors have very little influence on behavior. A common example of behaviorism is positive reinforcement.

What is an example of imitative learning?

For example, humans are able to imitate a sequence of responses [e.g., how to change the batteries in a flashlight]. Can animals show such an advanced form of imitation? Suggestive evidence has been obtained from pigeons [see Nguyen et al.].


B. F. Skinner was one of the most influential of American psychologists. A behaviorist, he developed the theory of operant conditioning -- the idea that behavior is determined by its consequences, be they reinforcements or punishments, which make it more or less likely that the behavior will occur again. Skinner believed that the only scientific approach to psychology was one that studied behaviors, not internal [subjective] mental processes.

B.F. Skinner

The following has been adapted from the Webspace website.

Skinner was heavily influenced by the work of John B. Watson as well as early behaviorist pioneers Ivan Pavlov and Edward Thorndike. He spent most of his professional life teaching at Harvard University [after several years at the University of Minnesota and then Indiana University]. He died in 1990 of leukemia, leaving behind his wife, Yvonne Blue, and two daughters.

Burrhus Frederic Skinner was born March 20, 1904, in the small Pennsylvania town of Susquehanna. His father was a lawyer, and his mother a strong and intelligent housewife. His upbringing was old-fashioned, with an emphasis on hard work.

Skinner was an active, outgoing boy who loved the outdoors and building things, and actually enjoyed school. His life was not without its tragedies, however. In particular, his brother died at the age of 16 of a cerebral aneurysm.

B. F. Skinner received his BA in English from Hamilton College in upstate New York. He wanted to be a writer of fiction, but his early attempts went nowhere; ultimately, he resigned himself to writing newspaper articles on labor problems, and lived for a while in Greenwich Village in New York City as a “bohemian.” After some traveling, he decided to go back to school, this time at Harvard. He received his master's in psychology in 1930 and his doctorate in 1931, and stayed there to do research until 1936.

Also in that year, he moved to Minneapolis to teach at the University of Minnesota. There he met and soon married Yvonne Blue. They had two daughters, the second of whom became famous as the first infant to be raised in one of Skinner’s inventions, the air crib, which was nothing more than a combination crib and playpen with glass sides and air conditioning.

In 1945, he became the chairman of the psychology department at Indiana University. In 1948, he was invited to come to Harvard, where he remained for the rest of his life. He was a very active man, doing research and guiding hundreds of doctoral candidates as well as writing many books. While not successful as a writer of fiction and poetry, he became one of our best psychology writers; his books include Walden Two, a fictional account of a community run on his behaviorist principles.

B. F. Skinner’s theory is based on operant conditioning. The organism is in the process of “operating” on the environment, which in ordinary terms means it is bouncing around its world, doing what it does. During this “operating,” the organism encounters a special kind of stimulus, called a reinforcing stimulus, or simply a reinforcer. This special stimulus has the effect of increasing the operant -- that is, the behavior occurring just before the reinforcer. This is operant conditioning: “the behavior is followed by a consequence, and the nature of the consequence modifies the organism's tendency to repeat the behavior in the future.”

The following has been adapted from the Wikipedia and Webspace websites.

Skinner conducted research on shaping behavior through positive and negative reinforcement and demonstrated operant conditioning, a behavior modification technique which he developed in contrast with classical conditioning. His behavior modification technique put the subject through a stepwise program: setting goals that specify how the subject's behavior is to change, designing a program that will move the subject toward that desired state, and then implementing the program and evaluating its effectiveness.

B.F. Skinner's Theory of Operant Conditioning

Place a rat in a special cage [called a “Skinner box”] that has a bar or pedal on one wall that, when pressed, causes a little mechanism to release a food pellet into the cage. The rat is moving around the cage when it accidentally presses the bar and, as a result of pressing the bar, a food pellet falls into the cage.  The operant is the behavior just prior to the reinforcer, which is the food pellet.  In a relatively short period of time the rat "learns" to press the bar whenever it wants food. This leads to one of the principles of operant conditioning--A behavior followed by a reinforcing stimulus results in an increased probability of that behavior occurring in the future.

If the rat presses the bar and continually does not get food, the behavior becomes extinguished.  This leads to another of the principles of operant conditioning--A behavior no longer followed by the reinforcing stimulus results in a decreased probability of that behavior occurring in the future.

Now, if you were to turn the pellet machine back on, so that pressing the bar again provides the rat with pellets, the behavior of bar-pushing will  come right back into existence, much more quickly than it took for the rat to learn the behavior the first time. This is because the return of the reinforcer takes place in the context of a reinforcement history that goes all the way back to the very first time the rat was reinforced for pushing on the bar.  This leads to what are called the Schedules of Reinforcement.

Schedules of Reinforcement

Continuous reinforcement is the original scenario: Every time that the rat does the behavior [such as pedal-pushing], he gets a food pellet.

The fixed ratio schedule was the first one Skinner discovered: If the rat presses the pedal three times, say, he gets a goodie. Or five times. Or twenty times. Or “x” times. There is a fixed ratio between behaviors and reinforcers: 3 to 1, 5 to 1, 20 to 1, etc.

The fixed interval schedule uses a timing device of some sort. If the rat presses the bar at least once during a particular stretch of time [say 20 seconds], then he gets a goodie. If he fails to do so, he doesn’t get a goodie. But even if he hits that bar a hundred times during that 20 seconds, he still only gets one goodie! One strange thing that happens is that the rats tend to “pace” themselves: They slow down the rate of their behavior right after the reinforcer, and speed up when the time for it gets close.

Skinner also looked at variable schedules. Variable ratio means you change the “x” each time -- first it takes 3 presses to get a goodie, then 10, then 1, then 7 and so on. Variable interval means you keep changing the time period -- first 20 seconds, then 5, then 35, then 10 and so on. With the variable interval schedule, the rats no longer “pace” themselves, because they can no longer establish a “rhythm” between behavior and reward. Most importantly, these schedules are very resistant to extinction. It makes sense, if you think about it: if you haven’t gotten a reinforcer for a while, it could just be that you are at a particularly “bad” ratio or interval -- just one more bar press and maybe this will be the time you get reinforced.
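
The four schedules can be summed up as simple rules for when a bar press earns a pellet. The Python sketch below is purely illustrative [the function names, the discrete time steps, and the 70% "press rate" are inventions of this example, not anything from Skinner's laboratory]:

```python
import random

def fixed_ratio(n):
    """Pellet after every n-th bar press [e.g. a 5-to-1 schedule]."""
    presses = 0
    def check(pressed, t):
        nonlocal presses
        if pressed:
            presses += 1
            if presses == n:
                presses = 0
                return True
        return False
    return check

def variable_ratio(mean_n):
    """Pellet after a varying number of presses, roughly averaging mean_n."""
    presses, target = 0, random.randint(1, 2 * mean_n)
    def check(pressed, t):
        nonlocal presses, target
        if pressed:
            presses += 1
            if presses >= target:
                presses, target = 0, random.randint(1, 2 * mean_n)
                return True
        return False
    return check

def fixed_interval(period):
    """Pellet for the first press after `period` time steps have elapsed."""
    last = 0
    def check(pressed, t):
        nonlocal last
        if pressed and t - last >= period:
            last = t
            return True
        return False
    return check

def variable_interval(mean_period):
    """Pellet for the first press after a varying wait, roughly averaging mean_period."""
    last, wait = 0, random.randint(1, 2 * mean_period)
    def check(pressed, t):
        nonlocal last, wait
        if pressed and t - last >= wait:
            last, wait = t, random.randint(1, 2 * mean_period)
            return True
        return False
    return check

# A simulated "rat" that presses the bar on 70% of time steps, run for 1000 steps:
schedule = variable_ratio(5)
pellets = sum(schedule(random.random() < 0.7, t) for t in range(1000))
print(f"Pellets earned on a variable-ratio-5 schedule: {pellets}")
```

Only the rule for delivering the pellet differs between the four functions; the simulated "rat" behaves the same way in every case, which is what makes the schedules comparable.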

Shaping

A question Skinner had to deal with was how we get to more complex sorts of behaviors. He responded with the idea of shaping, or “the method of successive approximations.” Basically, it involves first reinforcing a behavior only vaguely similar to the one desired. Once that is established, you look out for variations that come a little closer to what you want, and so on, until you have the animal performing a behavior that would never show up in ordinary life. Skinner and his students were quite successful in teaching simple animals to do some extraordinary things.

Beyond fairly simple examples, shaping also accounts for the most complex of behaviors. We are gently shaped by our environment to enjoy certain things.
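
Shaping can also be pictured procedurally: reinforce anything close enough to the target, then keep demanding a closer approximation. The toy simulation below uses made-up numbers and is not an experimental protocol; it only illustrates how a gradually tightening criterion can pull a variable, initially random behavior toward a target it would never produce on its own.

```python
import random

def shape(target, start_tolerance, steps=2000):
    """Toy model of successive approximation: reinforce any response that falls
    within `tolerance` of the target, and tighten the tolerance a little each
    time reinforcement is delivered."""
    behavior = 0.0                                           # the animal's current typical response
    tolerance = start_tolerance
    for _ in range(steps):
        response = behavior + random.gauss(0, 1.0)           # natural variation in responding
        if abs(response - target) <= tolerance:              # close enough: reinforce
            behavior = 0.8 * behavior + 0.2 * response       # reinforced responses recur
            tolerance = max(0.5, tolerance * 0.98)           # now demand a closer match
    return behavior

# Start far from the target; the tightening criterion pulls behavior toward it.
print(round(shape(target=10.0, start_tolerance=12.0), 1))   # ends near 10.0
```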

Aversive stimuli

An aversive stimulus is the opposite of a reinforcing stimulus, something we might find unpleasant or painful.  This leads to another principle of operant conditioning--A behavior followed by an aversive stimulus results in a decreased probability of the behavior occurring in the future.

This both defines an aversive stimulus and describes the form of conditioning known as punishment. If you shock a rat for doing x, it’ll do a lot less of x. If you spank Johnny for throwing his toys he will probably throw his toys less and less.

On the other hand, if you remove an already active aversive stimulus after a rat or Johnny performs a certain behavior, you are doing negative reinforcement. If you turn off the electricity when the rat stands on his hind legs, he’ll do a lot more standing. If you stop your perpetual nagging when I finally take out the garbage, I’ll be more likely to take out the garbage.  You could say it “feels so good” when the aversive stimulus stops, that this serves as a reinforcer.  Another operant conditioning principle--Behavior followed by the removal of an aversive stimulus results in an increased probability of that behavior occurring in the future.

Skinner did not advocate the use of punishment. His main focus was on how behavior is shaped by its consequences, not on suppressing behavior through punishment. From his research came "shaping" [described above], the creation of new behaviors through reinforcement. He also offered the example of a child's refusal to go to school, arguing that the focus should be on what is causing the refusal, not necessarily on the refusal itself. His research suggested that punishment was an ineffective way of controlling behavior, leading generally to short-term behavior change, but resulting mostly in the subject attempting to avoid the punishing stimulus rather than stopping the behavior that was being punished. A simple example of this, he believed, was the failure of prison to eliminate criminal behavior. If prison [as a punishing stimulus] were effective at altering behavior, there would be no criminality, since the risk of imprisonment for criminal conduct is well established, Skinner deduced. However, he noted that individuals still commit offences but attempt to avoid discovery and therefore punishment. The punishing stimulus does not stop criminal behavior; the criminal simply becomes more sophisticated at avoiding the punishment. Reinforcement, both positive and negative [the latter of which is often confused with punishment], he believed, proved to be more effective in bringing about lasting changes in behavior.

Behavior modification

Behavior modification -- often referred to as b-mod -- is the therapy technique based on Skinner’s work. It is very straightforward: extinguish an undesirable behavior [by removing the reinforcer] and replace it with a desirable behavior through reinforcement. It has been used on all sorts of psychological problems -- addictions, neuroses, shyness, autism, even schizophrenia -- and works particularly well with children. There are examples of long-institutionalized psychotics on hospital back wards who hadn’t communicated with others for years and who have been conditioned to behave in fairly normal ways, such as eating with a knife and fork, taking care of their own hygiene needs, dressing themselves, and so on.

There is an offshoot of b-mod called the token economy. This is used primarily in institutions such as psychiatric hospitals, juvenile halls, and prisons. Certain rules are made explicit in the institution, and behaving appropriately is rewarded with tokens -- poker chips, tickets, funny money, recorded notes, etc. Certain poor behavior is also often followed by a withdrawal of these tokens. The tokens can be traded in for desirable things such as candy, cigarettes, games, movies, time out of the institution, and so on. This has been found to be very effective in maintaining order in these often difficult institutions.
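
In bookkeeping terms, a token economy is just a ledger with earning rules and prices. The sketch below is a hypothetical illustration [the behaviors, token values, and privileges are invented for this example]: desirable behavior earns tokens, rule-breaking forfeits them, and tokens are traded for privileges.

```python
class TokenEconomy:
    """Toy ledger for a token economy: desirable behavior earns tokens,
    rule-breaking forfeits them, and tokens buy privileges."""

    def __init__(self, earn_rules, prices):
        self.earn_rules = earn_rules      # behavior -> tokens gained (negative = forfeited)
        self.prices = prices              # privilege -> token cost
        self.balance = 0

    def record(self, behavior):
        """Apply the earning (or forfeiture) rule for an observed behavior."""
        self.balance = max(0, self.balance + self.earn_rules.get(behavior, 0))

    def redeem(self, privilege):
        """Trade tokens for a privilege if the balance covers its price."""
        price = self.prices[privilege]
        if self.balance >= price:
            self.balance -= price
            return True
        return False

# Invented rules for illustration: tidying earns 2, an outburst forfeits 3.
ward = TokenEconomy({"tidy room": 2, "attends group": 1, "outburst": -3},
                    {"movie night": 5, "day pass": 10})
for event in ["tidy room", "attends group", "tidy room"]:
    ward.record(event)
print(ward.redeem("movie night"), ward.balance)   # True 0
```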

Walden II

In 1948, Skinner published his ideas on child-rearing in Walden Two, a fictional account of a behaviorist-created utopia in which carefree young parents stroll off to work or school while their little ones enjoy all the comforts of community-run, behaviorist-approved daycare.  The book presents a vision of a decentralized, localized society that applies a practical, scientific approach and advanced behavioral expertise to deal peacefully with social problems. Skinner's utopia, like every other utopia or dystopia, is both a thought experiment and a rhetorical work.

In 1971 he wrote Beyond Freedom and Dignity, which suggests that the concept of individual freedom is an illusion. Skinner later sought to unite the reinforcement of individual behaviors, the natural selection of species, and the development of cultures under the heading of "Selection by Consequences" [1981], the first of a series of articles in the journal Science.




What is the Skinner theory?

B. F. Skinner's theory of learning says that a person is first exposed to a stimulus, which elicits a response, and the response is then reinforced [stimulus, response, reinforcement]. This, ultimately, is what conditions our behaviors.

What are the 3 main theories of learning?

Although there are many different approaches to learning, there are three basic types of learning theory: behaviorist, cognitive constructivist, and social constructivist.

What is operant conditioning theory of learning?

Operant conditioning, sometimes referred to as instrumental conditioning, is a method of learning that uses rewards and punishment to modify behavior. Through operant conditioning, behavior that is rewarded is likely to be repeated, and behavior that is punished will rarely occur.

What is the behavioral theory?

Behaviorism is a theory of learning based on the idea that all behaviors are acquired through conditioning, and conditioning occurs through interaction with the environment. Behaviorists believe that our actions are shaped by environmental stimuli.
