Study: Can We Change People’s Behavior with “Nudging?”

This post contains a video, which you can also view here. To support more videos like this, head to!


Let’s talk about nudges. Not, like, “wink wink nudge nudge,” as in I just made a double entendre and would like to make it painfully obvious to you, but “nudges” as they are defined by behavioral psychologists.

Nudging is an old psychological idea, but it gained prominence and a catchy name in 2008 with the publication of the book “Nudge: Improving Decisions About Health, Wealth, and Happiness,” by economist Richard Thaler and legal expert Cass Sunstein. They defined a nudge as “any aspect of the choice architecture that alters people’s behavior in a predictable way without forbidding any options or significantly changing their economic incentives. To count as a mere nudge, the intervention must be easy and cheap to avoid. Nudges are not mandates. Putting fruit at eye level counts as a nudge. Banning junk food does not.”

And those are real examples: in 2011 researchers at UPenn found that at a self-serve salad bar, “Making a food slightly more difficult to reach (by varying its proximity by about 10 inches) or changing the serving utensil (spoon or tongs) modestly but reliably reduces intake, in the range of 8-16%.”

Another commonly cited example of nudging is when bars and restaurants put a realistic image of a fly near the drain on a urinal, which subtly encourages men to aim their pee at it, reducing splashing and lowering cleaning costs.

So far so good, right? Seems believable: people’s brains are constantly using shortcuts and offloading thinking to automatic processes, which makes it relatively easy to subtly influence their actions in some scenarios. I mean, I have a degree in advertising, so trust me when I say this isn’t terribly controversial: marketing goblins have known for years, thanks to A/B testing, how small changes can impact behavior. In A/B testing, let’s say you work on a political campaign and you have two different subject lines for an email asking people to donate. To figure out which one works better, you first send each of them to a separate test group. Whichever one gets more engagement gets sent out to the rest of the list.

Here’s a fun fact: a friend of mine worked for a political campaign and through careful testing found that people donated more money when the email contained hideously unprofessional graphic design. Seriously: like 10 different fonts, garish colors, low-res images. They were never able to find an upper limit, because at some point the graphic designers simply refused to make things any uglier. Fascinating!
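To make the A/B test above concrete, here’s a minimal sketch of the arithmetic involved (all the numbers here are made up for illustration): send each subject line to a test group, then use a two-proportion z-test to check whether the difference in donation rates is bigger than you’d expect from pure chance.

```python
import math

# Hypothetical results from sending each subject line to a test group
# (these numbers are invented for illustration, not from any real campaign).
a_sent, a_donated = 5000, 260   # subject line A: 5.2% donated
b_sent, b_donated = 5000, 315   # subject line B: 6.3% donated

def two_proportion_z(x1, n1, x2, n2):
    """Two-proportion z-test: is the difference in rates signal or noise?"""
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    return (p2 - p1) / se

z = two_proportion_z(a_donated, a_sent, b_donated, b_sent)
# |z| > 1.96 corresponds to the conventional p < 0.05 threshold.
winner = "B" if z > 1.96 else ("A" if z < -1.96 else "no clear winner yet")
print(f"z = {z:.2f}, send to the rest of the list: {winner}")
```

With these toy numbers the difference clears the conventional significance threshold, so line B would go out to everyone else; with a smaller gap or smaller test groups, you’d keep testing.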

So as you might imagine, there have been a lot of studies over the years that look at “nudging” to see how effective it is in various settings, and a lot of that research has been positive. Governments added behavioral psychologists to their teams to come up with clever ways to improve society with minimal cost. And then COVID-19 happened, and everything exploded: hey, experts, how do we get people to wash their hands? How do we get them to stay 6 feet away from each other? How do we get them to wear masks? How do we get them to stay home? How do we get them to get vaccinated? Sure, you can institute mandates, but we don’t want people to be angry with us!

That’s how we ended up with things like vaccine lotteries, which I talked about a few times back in 2021, with early reports suggesting lotteries encouraged vaccination and later data suggesting they didn’t really do much after all. And that was a problem throughout the past few years: by the time the data is in, the situation has already changed and we’re on to some other problem to solve. Nudges are, by definition, small efforts that yield positive results, and maybe they’re just not what we need for a giant problem like a rapidly changing worldwide pandemic. In a recent article on this topic in Undark, Bryony Lau cites a psychologist who points out that “Nudging works if people are already inclined to do the thing they are being reminded to do,” according to a study that determined people’s relative “nudgability,” which is an adorable term. But that, maybe, is why “tactics that worked earlier in the vaccination campaign no longer did. Governments and businesses were increasingly dealing with vaccine holdouts who couldn’t be nudged or offered incentives. Instead, mandates caught on, with major companies like United Airlines requiring employees to get vaccinated to come to work.”

What happened over the past decade, in essence, is that researchers found a delightful psychological tool that could be used to subtly create positive change, but then the world fell so in love with this idea that we decided we could apply that tool to every problem we have. Yes, a hammer is a very helpful tool to have, but sometimes you actually need a wrecking ball.

In December of 2021, Swiss researchers published a meta-analysis in PNAS that evaluated 200 different studies on nudging, finding that nudging promotes “behavior change with a small to medium effect size,” with some types of nudging, like influencing food choices, having a greater effect than others.

So it seems that despite the lack of success nudging has shown during the pandemic, it DOES work, right? WELL. You know what I say about every meta-analysis and systematic review I talk about on this channel: they can provide excellent overviews of one particular topic or field, but they can also cherry-pick studies that show only what the authors want to show.

And so, waves were made this month when ANOTHER group of psychologists published a response, also in PNAS, that called into question the statistical analysis performed by the authors of the meta-analysis. In the original paper, the authors did admit “Our analysis further reveals a moderate publication bias toward positive results in the literature,” which is a thing known as the file drawer effect: scientists are more likely to publish research that shows a positive result than one that shows no result at all. It’s more exciting and more interesting and more likely to get you tenure, so we have to be careful when we see a bunch of papers that say “X had an effect” and very few papers that say “X had no effect.” It doesn’t necessarily mean X has an effect – to determine that, you have to use the proper statistical analyses to control for the negative results you DON’T see published.
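You can actually watch the file drawer effect inflate an effect size with a quick simulation (a toy sketch, not anyone’s real data): give an intervention a true effect of exactly zero, run a pile of small, noisy studies, and then “publish” only the ones that happen to cross the significance threshold.

```python
import random
import statistics

random.seed(42)

# Simulate 500 small studies of an intervention whose TRUE effect is zero.
# Each study's estimate is the true effect plus sampling noise (SE = 0.2).
true_effect, se = 0.0, 0.2
estimates = [random.gauss(true_effect, se) for _ in range(500)]

# File drawer effect: only studies with a "significant" positive result
# (z = estimate / se > 1.96) make it out of the file drawer.
published = [e for e in estimates if e / se > 1.96]

print(f"mean effect, ALL studies:       {statistics.mean(estimates):+.3f}")
print(f"mean effect, PUBLISHED studies: {statistics.mean(published):+.3f}")
```

The average across all the studies hovers right around zero, as it should, while the average of the “published” subset looks like a respectable positive effect. That gap is exactly what a meta-analysis has to statistically correct for, and it is what the response paper argued the original authors failed to do.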

The original paper said they did find a “moderate” publication bias but the new paper asserts that those researchers failed to actually take that bias into account when determining the effect size of the nudging. The new paper asserts that when you DO take the bias into account, the effect size disappears for pretty much every category except “food,” which didn’t seem to have a very strong publication bias.

I have to admit that this was a giant nerdy statistical mic drop of a paper. It’s literally two pages long. And it does have a lot of experts asking things like “The death knell for nudging?”

But I’ve seen a number of other experts taking a more moderate standpoint, which I hope to summarize fairly as such: “nudging” is a catch-all term for a wide swath of actions with an equally wide variety of hoped-for results. Can you really do a meta-analysis that compares, say, “getting a handful of kids to pick a fruit cup instead of a cookie for dessert at an after-school program” with “convincing 100 million Americans to get vaccinated”? Those are purposely ridiculous things to compare, but what about similar nudging experiments, like “getting kids to pick a fruit cup by telling them they also get a prize if they take it over a cookie” and “getting people to pick healthier options at a salad bar by making the unhealthy things harder to reach”? Personally, in my unprofessional opinion, you still can’t compare them. Not every idea you have for encouraging healthy eating is going to work for every population. Maybe the kids don’t pick the fruit cup because they don’t like your prize… that wouldn’t tell us anything about whether or not adults at a salad bar will make healthier choices if you put the salad dressing out of reach. They’re both “nudging” about healthy food choices, but they’re very different things.

So no, I don’t think this is the death knell of nudging, but maybe we can ditch the overly dreamy idea that nudging is the world’s cure-all, which, to be fair, happens to every psychological concept that ends up as the star of a popular science book Oprah probably likes. Microexpressions, the wisdom of crowds, anything Malcolm Gladwell writes about: it’s okay for a small thing to have a small effect in specific circumstances. It just doesn’t necessarily sell books, or get researchers tenure.

Rebecca Watson

Rebecca is a writer, speaker, YouTube personality, and unrepentant science nerd. In addition to founding and continuing to run Skepchick, she hosts Quiz-o-Tron, a monthly science-themed quiz show and podcast that pits comedians against nerds. There is an asteroid named in her honor. She’s on Twitter and YouTube as @rebeccawatson, on Instagram and TikTok as @actuallyrebeccawatson, and on Mastodon and BlueSky.
