Research Review: More TV = more munching
Research shows that people eat more while watching TV. But does this “TV munchie effect” persist even hours afterwards? The results may surprise you…
Ever work in an office where food — usually junk food — was nearly always around? Maybe you work there now. Do you beat yourself up because you end up succumbing to the temptation and having a cookie or twelve?
Well you’re not alone. Researchers have found that a lot of environmental factors have a huge influence on how much and what you eat.
For example: how close are the chocolate candies to your desk? Can you see them? If the chocolates are 2.5 metres away or on your desk but in an opaque container, then you will eat less than if they are closer to you and you can see them. (1)
Hmm, so all it takes to eat less is to move the chocolates or cover them? Maybe.
How environment shapes our food choices
We might assume that our food choices are under our conscious control. But many studies like the “office candy jar” study have found that several environmental factors also change how much people eat.
For instance, people eat more when eating with others than when alone.(2) People eat more with friends and relatives than with strangers.(3) But people eat less when dining with someone of the opposite sex whom they find attractive.(4) (I have an idea for a business: Dater Dinner Dieting! Hook up, stay lean — kill two birds with one stone!)
And if TV wasn’t bad enough for you, it has also been found to increase how much people eat. If you watch TV while you eat, you will probably eat more than if you had the TV off.(5)
So what happens if you watch TV and then have a snack over 2 hours later? Does it matter? How could it matter?
Those are the questions this week’s research article tries to answer. We can now all pretend that the article title doesn’t give the whole thing away.
Higgs S, Woodward M. Television watching during lunch increases afternoon snack intake of young women. Appetite. 2009 Feb;52(1):39-43. Epub 2008 Jul 23.
16 young women (19 years old on average) participated in this study. I’d say 16 is pretty low for this type of study, though it was a “within-subject design”.
What the heck is a within-subject design?
Each participant (aka subject) takes part in the experiment twice – once under one condition and once under the other. Then researchers can compare each participant’s responses in the two situations.
The advantage of this type of experiment design is that you avoid results that occur because people are different from one another. You can compare person A to person A, not person A to person B.
For example, if you had 32 people with 16 people in each experimental group, then any differences between the groups could be because of:
- Experimental differences: Say one group got water and the other group got alcohol and lo and behold the group that got alcohol had slower reaction times and lower motor control scores. You would agree that these differences are because of the experimental differences – water versus alcohol.
- Differences in participant makeup: This is usually accidental and cannot be completely avoided if you are comparing two different groups of people. It could be, for instance, that one group has a few people that have a genetic variation that changes the way they respond. Or there may be differences between, for example, older and younger people that the researchers don’t realize are relevant — in this case, if groups are not age-balanced, it might throw things off.
- Group formation differences: With some studies, such as market research, researchers examine how groups act as a whole. But sometimes, when people get together, the group itself becomes an entity with its own character. How you act or respond to questions might depend on who else is in the room with you.
In short, when an experiment compares two different groups of people, you can never be completely sure that the differences are because of the experiment and not the people.
What to do? Use the same people for both experimental conditions! Just as they did in this experiment.
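To see why comparing “person A to person A” helps, here’s a toy sketch in Python. All the numbers are made up for illustration – they are not the study’s data. The point is that a within-subject design works with one difference score per person, so how much people differ from one another overall drops out of the comparison:

```python
# Toy illustration of a within-subject comparison (made-up numbers,
# not data from the Higgs & Woodward study).
from statistics import mean, stdev

# Hypothetical cookie intake (grams) for 5 participants, each measured
# under both conditions.
no_tv = [38, 52, 45, 60, 41]
tv = [55, 60, 58, 72, 50]

# Within-subject: one difference score per person (TV minus no-TV).
diffs = [t - n for t, n in zip(tv, no_tv)]

print("Per-person differences:", diffs)
print("Mean difference: %.1f g" % mean(diffs))
# The spread of the difference scores reflects only how much each person
# changed between conditions, not how much people differ from one another.
print("SD of differences: %.1f g" % stdev(diffs))
```

Notice that participant-to-participant variation (some people just eat more than others) never enters the difference scores, which is exactly why 16 participants can be enough in a design like this.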
Each participant ate lunch twice, on different days, with 2 days between study lunches. On one day they watched a 10 minute video clip, and on the other there was no TV.
Half of the participants had the TV lunch first and the other half had the TV-free lunch first to minimize any order effects (in other words, the potential effect of whether TV came first or second).
The video clip was from the DVD Live at the Top of the Tower, featuring the comedian Peter Kay. The article described the clip as “a light-hearted and popular comedy clip” that would sufficiently distract the participants from their lunch.
After lunch, the participants were asked a few questions about how they felt at that moment. Researchers asked them to rate their mood and appetite on a scale from 1 (not at all) to 100 (extremely).
Once the questions were done, researchers asked the participants to return in 2½ hours. Participants were also told not to eat or drink anything during the 2½ hours.
After the break, researchers asked participants to taste a variety of cookies and rate each cookie. They were also asked to describe how vividly they remembered the lunch they ate 2½ hours before.
Lunch wasn’t particularly PN compliant (tsk), nor what I would call healthy: ham sandwiches made with 2½ slices of granary bread (not sure how you make a sandwich with 2½ slices of bread) with margarine and 1¼ slices of honey baked ham. Also, maybe because this study was done in England or the experimenters are just quirky people, the crusts were removed and the sandwiches were cut into 10 portions. Lunch also included 15 grams of salted crisps (aka potato chips for the non-UK folk).
Total calories were 400, and unless you consider chips a vegetable, there were no fruits or vegetables. (Tsk tsk.)
The cookie tasting offered 40 gram cookies: Cadbury’s milk chocolate fingers (520 cal/100g), Maryland chocolate chip cookies (511 cal/100g) and, for the health conscious (just kidding), McVitie’s digestives (495 cal/100g).
What were the scientists looking for?
There was a lot of eating, questioning, and tasting, probably to distract the participants from the real purpose of the experiment. What the scientists were really interested in was how many cookies participants ate after watching TV (or not) during lunch.
Remember, we know that watching TV while eating makes you eat more. But the researchers wanted to know whether this “TV munchie effect” would persist even hours after watching TV.
Would watching TV while eating lunch 2.5 hours before tasting cookies change how many cookies the participants would eat? Could the “TV munchie effect” be even stronger than we initially imagined?
The answer is a bit of a surprise and the reason is even more interesting.
When the participants watched TV during their lunch they ate more cookies – about 15 grams more on average (or about 55 calories more). In a year of daily cookie munching, that would be over 20,000 calories! Or just under 6 pounds of fat (just under 3 kg).
This is quite a bit considering the only difference between experiences involved watching TV (for only 10 minutes!) while they ate the exact same lunch.
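Here’s the back-of-the-envelope arithmetic behind those year-long numbers, using the common rule-of-thumb conversion of roughly 3,500 calories per pound of body fat (an approximation, not a figure from the study):

```python
# Rough arithmetic behind the "20,000 calories a year" figure.
extra_per_day = 55  # extra calories eaten after the TV lunch
days = 365

extra_per_year = extra_per_day * days
# 3,500 cal per pound of fat is a rule-of-thumb conversion, not exact.
pounds_of_fat = extra_per_year / 3500
kg_of_fat = pounds_of_fat * 0.4536  # pounds to kilograms

print(f"{extra_per_year} extra calories per year")  # 20075
print(f"≈ {pounds_of_fat:.1f} lb (≈ {kg_of_fat:.1f} kg) of fat")
```

That works out to just under 6 pounds a year, matching the estimate above.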
Why did watching TV matter?
Watching TV didn’t affect how people perceived their moods or appetites. Participants said they didn’t feel any more hungry, full or bloated after the TV lunch – which you’d think would matter most. There was also no difference in happiness, alertness, irritability, relaxation or sadness.
What did matter was that when the participants watched TV they couldn’t recall the lunch as vividly as when they didn’t watch TV.
When they were asked how vividly they remembered lunch, using a 100-point scale (1 being “not at all” and 100 being “extremely”), their scores were lower when they had watched TV during lunch. Their attention had been elsewhere, so they weren’t focused on lunch.
In other words, it seems that consciously remembering what they ate beforehand was the important factor. TV made people forget what they were eating. If they didn’t remember what they ate, they were inclined to eat more later!
So not only does watching TV while eating increase how much you eat, it looks like watching TV also increases how much you eat later on. Whoa!
The study suggests that the reason you eat more later (the snack in this case) is because you forget eating earlier (lunch). Well, you don’t forget the lunch completely, but you don’t recall any details. So the idea of savouring your meals to lose weight, while it may seem counterintuitive, makes sense if memory plays a role in future eating.
It does make you wonder how fast food and eating on the run impact how much you eat later in the day.
Don’t eat while watching TV! Minimize distractions when you eat and savour your meals.
That’s not too bad, probably the easiest and most pleasurable weight loss advice you could ever get – take your time and enjoy what you’re eating.