
Variable Ratio Schedules: Examples, Definition & Quiz

Taught by Chris Clause

Learn the definition of variable ratio schedules of reinforcement and see everyday examples to increase your understanding of how they work. You will have the opportunity to test your knowledge with a short quiz at the end of the lesson.

What are variable ratio schedules of reinforcement?

Like all schedules of reinforcement, variable ratio schedules are an important aspect of operant conditioning. A variety of reinforcement schedules can be used to shape behavior, each with its own unique properties. In addition to variable ratio schedules, there are variable interval, fixed ratio, and fixed interval schedules of reinforcement.

The easiest way to understand the meaning of variable ratio schedules of reinforcement is to understand the individual words that make up the concept: variable, ratio, schedule, and reinforcement. These four words individually can have many meanings, but when used together they have a very specific meaning within the context of operant conditioning.

Variable refers to reinforcement being delivered following an average number of responses. In the world of operant conditioning, schedule refers to how often the reinforcement is provided. Reinforcement refers to a reward for engaging in some specific behavior. So, when a certain behavior is exhibited, a reinforcer is presented. The concept of reinforcement says that the reinforcer should provide motivation for the behavior to be repeated. A pretty basic example of a schedule of reinforcement would be giving a child a piece of candy every time he cleans his room. This is referred to as a continuous schedule of reinforcement because the reward is provided every time the behavior occurs.

A variable ratio schedule of reinforcement is a little bit more complex. In a variable ratio schedule of reinforcement, the reinforcer is delivered on a schedule based on an average number of responses. For example, a variable ratio schedule that is set up to deliver a reinforcer based on an average of 5 responses might deliver reinforcement after the 2nd, 3rd, and 10th responses (5 is the average of 2, 3, and 10). Don't worry, after a couple of examples this will make sense.

Ratio refers to the fact that the reinforcement is delivered following a certain number of behavioral responses. It does not matter how much time has passed (that is what interval schedules of reinforcement are for). When it comes to ratio schedules of reinforcement, the only thing that matters is that the behavior occurs a specific number of times.

So, a variable ratio schedule of reinforcement is a schedule of reinforcement wherein a reinforcer is provided following a pre-determined average number of responses. To the responder, the actual reinforcement schedule can seem mysterious, but as we will see from two examples, not knowing when the reinforcer is coming can still be quite reinforcing.
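If you like to see ideas made concrete, here is a minimal Python sketch of the idea above. The function name and the coin-flip approach are illustrative assumptions, not part of the lesson: one simple way to generate a variable ratio schedule is to reinforce each response with a 1-in-5 chance, so the number of responses between reinforcers varies unpredictably but averages out to about 5 over the long run.

    import random

    def simulate_variable_ratio(average_ratio, total_responses, seed=1):
        """Illustrative sketch: reinforce each response with probability
        1/average_ratio, so reinforcement follows an unpredictable number
        of responses that averages out to the chosen ratio."""
        rng = random.Random(seed)
        reinforced = []
        responses_since_reward = 0
        for response in range(1, total_responses + 1):
            responses_since_reward += 1
            if rng.random() < 1 / average_ratio:
                reinforced.append((response, responses_since_reward))
                responses_since_reward = 0
        return reinforced

    # A hypothetical VR-5 schedule over 30 responses: which responses get
    # rewarded is unpredictable, but the gaps average out to roughly 5.
    for response_number, gap in simulate_variable_ratio(5, 30):
        print(f"Reinforced on response {response_number} (after {gap} responses)")

Run it a few times with different seeds and you will see the same pattern the lesson describes: individual rewards land unpredictably, yet the long-run average stays near one reward per 5 responses.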

Everyday Examples

Let's look at a couple of examples of variable ratio schedules of reinforcement in everyday life.

Slot Machines

It's pretty safe to say that slot machines can be used to successfully alter human behavior. Go into any casino across the US and you will see people pulling the handle or pushing the button over and over, believing that the next pull or push could result in a big payout. Thanks to variable ratio schedules of reinforcement, people will continue to put money in the machine even if they don't initially get rewarded.

Slot machines are pre-set to pay out after an average number of responses (handle pulls or button pushes) has occurred. For instance, if a machine were set up to pay out after an average of 10 responses, you might win some money on the 5th pull, the 12th pull, and the 13th pull (the average of 5, 12, and 13 is 10). This variable ratio schedule of reinforcement results in an exciting experience, since you never really know when the reinforcer is coming. For many people, not knowing is what keeps them playing.

Animal Training

Variable ratio schedules of reinforcement are not just for humans. Animals learn to associate behavioral responses and rewards too, and variable ratio schedules can be used to train animals to perform desired tasks. For example, a horse trainer might give his horse a peppermint as a reward for a successful jump. If the trainer chose to employ a variable ratio schedule of reinforcement, then, like the slot machine, the reward would come based on an average number of responses. So a schedule based on an average of one reward every 5 jumps might yield a peppermint after jumps 1, 4, and 10 (the average of 1, 4, and 10 is 5). In this example it would not take long for the horse to figure out that the jumping behavior is tied to the yummy peppermints in some way. As with the slot machine, the horse doesn't need to know what the schedule is to continue to jump; he just knows that the more he jumps, the more likely he is to get a peppermint.

Summary

Variable ratio schedules of reinforcement can result in rapid behavior change. Not knowing exactly when the reinforcer is coming does not matter as much as knowing that it is coming, and knowing that the next response could be rewarded. Whether it belongs to a human or a horse, the brain quickly learns that the more lever pulls or successful jumps it produces, the greater the odds of receiving reinforcement.

Variable ratio schedules of reinforcement are one of four classic schedules of reinforcement employed in operant conditioning. They rely on providing reinforcement following a pre-determined average number of responses. This pairing of behavior and reinforcement results in a relatively quick association being established between the two. Having an understanding of variable ratio schedules of reinforcement enables us to better understand the world in which we live by making us more informed about our own behavior and the behavior of those around us.
