How to make health news interesting - without overselling the claims

Health stories are everywhere in the news. Each year, thousands of articles are published claiming to offer the latest compelling evidence on how we should eat, drink, exercise and sleep, and which medications we should or shouldn't be taking, among a host of other things.
Roman Samborskyi/Shutterstock

Not only is there a deluge of information, it is also often conflicting. Reports on statin use, for example, have variously linked the drugs with living longer, ageing faster, a reduced risk of stroke and an increased risk of diabetes.

Every day, these reports are read and shared by millions, potentially influencing our decisions and behaviour – but how do we know that the evidence we're relying on is strong enough? Writers need easy ways to communicate the strength of evidence without reducing interest or readability, but striking that balance can mean the public is oversold or undersold a finding's relevance.

For our latest research, we wanted to find a way to help writers accurately communicate research evidence, without diminishing reader interest in the claims. To do this, we teamed up with nine UK press offices, from journals, universities and funders, to run a randomised trial with health-related press releases.


Read more: Don't believe the Daily Express, it takes a lot more than carrots to beat cancer

We focused on press releases because they play a crucial role in science news. When the latest research is published, a press release is used to summarise the study’s most “newsworthy” results. The press release is then sent to journalists who use the material to write the news.

Previous research has shown that there is a close correspondence between the content of the press release and the news articles that follow – journalists have little time and tight word limits, so aren’t always able to build a more in-depth piece.

Mays and mights

The first aim of our work was to improve the alignment of news claims with the underlying evidence by focusing on the wording of press releases. In the intervention arm of the study, we reserved strong language, such as “causes”, “affects” and “boosts”, for strong causal evidence from trials and experiments.

In observational research, cause and effect is difficult to determine because of uncontrolled variables. For example, an association may exist between ice cream sales and water consumption – not because one causes the other, but because both increase in sunny weather. So for this type of research we opted for weaker language, such as "may cause" and "could affect", in the releases. This distinction is easily understood by those who know the convention, but crucially it is also meaningful to readers who have never heard of correlations or clinical trials.

We found that the strength with which claims are made in press releases generally carries through into the news. Importantly, there was no detrimental effect on the likelihood of a story making it into the news if the language was softer. Whether or not a press release was picked up did not depend on the strength of the causal claims. When headlines and claims were softer in press releases, they were generally softer in news – despite the received wisdom that news is not interested in “mays” and “mights”.


Read more: From medical treatment to diet and lifestyle choice: how to spot unreliable health research

The second aim of our research was to make sure that stories included explicit caveats – such as “this research was observational and cannot show cause and effect” – when needed. Our results showed that these caveats were more likely to appear in the news when they were present in the press release.

A story on liver health and smoking published on MailOnline, for example, used a quote from the press release to state, “Dr Brown stressed this was an observational study and cannot say whether giving up smoking led to a reduction in drinking or vice versa”. The caveats did not appear to reduce news uptake, and were even associated with more news coverage – a result that matches parallel research showing that caveats do not reduce reader interest.

Most of these findings are based on observational analyses, and although we cannot show the direct effect of press release content on the news, we do know that journalists read press releases before writing the news. We also cannot show how such news content would affect public health. But our findings suggest that there could be a simple way to communicate the strength of evidence to the public without affecting uptake.

Causal inference is just one element of evidence strength, but there are many others that could, and should, be communicated to readers. For example, findings from larger studies repeated over a long period of time are more robust than those from small, single studies. Although the reporting of evidence strength in the media is only one factor in how people make health-related decisions, we believe that providing more easily decoded news is a step in the right direction.

This article is republished from The Conversation under a Creative Commons license. Read the original article.

Source: The Conversation Africa

The Conversation Africa is an independent source of news and views from the academic and research community. Its aim is to promote better understanding of current affairs and complex issues, and allow for a better quality of public discourse and conversation.

Go to: https://theconversation.com/africa

About Petroc Sumner & Rachel Adams

Petroc Sumner, professor and head of school, School of Psychology, Cardiff University; and Rachel Adams, research associate in cognition and neurostimulation, Cardiff University.