You’ve probably run into it dozens of times: you click on an interesting article headline only to be taken to content that doesn’t exactly fulfill the headline’s promise. Sites that have been labeled as click-bait farms, such as BuzzFeed and Upworthy, are often accused of this. However, even long-standing and reputable publishers have started using misleading headlines to get traffic.
Now, a new study shows that these headlines can have more of an effect on a reader’s interpretation of an article than the text in the article itself – even if the whole article is read.
Examples of Misleading Headlines
Everybody’s familiar with tabloid-style headlines that are clearly exaggerations or fabrications. Misleading headlines in supermarket tabloids and gossip magazines are to be expected. But what happens when the line between tabloid and hard news starts to blur? For example, anybody who isn’t a news junkie would likely assume a source called the Washington Times is a reputable publication. However, take a look at its article from March of 2013 titled “Take it to the bank: Sen. Elizabeth Warren wants to raise minimum wage to $22 per hour.” That headline might grab you, but a quick read of the article reveals that Warren never actually proposed raising the minimum wage to $22.
A little research reveals that the Washington Times has long had a conservative bias. Stuff like this can be found everywhere – there’s no shortage of biased news sources for both liberals and conservatives. But what happens when one of the most well-known, supposedly unbiased news outlets is just as misleading? Check out this CNN article from earlier this year titled “Ebola in the air? A nightmare that could happen.” Again, this headline is definitely going to get some clicks (“Ebola” was a top search term this year), but the experts interviewed for the story said the chances of Ebola mutating to spread through the air are actually very small. The headline could just as easily have been “Ebola in the air? Experts say it’s unlikely.”
Study Shows Headlines Skew Readers’ Thoughts About Content
The Australian study, published in the Journal of Experimental Psychology: Applied, gave participants four articles to read – two factual pieces and two opinion pieces, all 400 words or less. Each article appeared under headlines with different slants. For example, one of the factual pieces concerned burglary rates, which had decreased by 10 percent over the past decade but showed a 0.2 percent rise in the last year. Readers saw two articles on this topic, one titled “Number of Burglaries Going Up” and one called “Downward Trend in Burglary Rate.” When the participants faced a surprise quiz after reading, they were better at recalling information that was congruent with the headline. In other words, readers remembered more details about the declining trend in the article titled “Downward Trend…” and had better retention of the 0.2 percent increase in the article titled “Number of Burglaries Going Up.” The headlines told readers what to focus on, and those are exactly the details they retained. Even so, most readers were able to infer that the burglary rate would decrease next year regardless of which headline they saw.
In the opinion pieces, however, misleading headlines skewed both inference and retention. Readers were presented with a piece about genetically modified food. The article contained contrasting views from an expert and a layperson: the food expert stated that GM foods are safe, while the layperson expressed concerns. Some participants read the article under the title “GM foods are safe,” while others saw the headline “GM foods may pose long-term health risks.” Despite reading the exact same article, participants tended to side with whatever slant the title took. Readers of “GM foods may pose…” were also more willing to pay extra for organic food in the future.
What Does This Mean For Writers?
The main problem here is that publishers are posting articles with attention-grabbing headlines that generate clicks but leave readers with skewed versions of the truth. This happens even if the whole article is read. Thus, the study suggests that content creators are doing a serious disservice to their readers by using headlines like these. The question is this: if publishers and article writers know that readers retain information from the headline more than anything else in the article, don’t they have a responsibility to avoid headlines that bend the truth? And can readers be blamed for not examining content more closely and getting to the true crux of a story?