Tuesday, 7 March 2023

The Transport for London food advertising con

 

There's a nice article by Duane Mellor and Dan Green in the Journal of Human Nutrition and Dietetics (no paywall) looking at how weak scientific findings are exaggerated in press releases and misrepresented by the media.

The age-old question is who is to blame when the media report false claims - journalists, academics or the press office? Increasingly, it seems to me that it is the academics themselves. When press releases are misleading, it's usually because politically motivated academics have provided quotes that go beyond the findings of their research, and the research is often worthless anyway.

Mellor and Green look at a number of case studies, one of which is the execrable study that used an insane model to claim that the Transport for London advertising ban had led to a patently implausible reduction in calorie consumption. I wrote about it at the time. Mellor and Green pull their punches more than I would, but they get their point across.
 

The first example of this type of report and accompanying news story relates to the advertising ban on foods high in fat, salt and sugar across the Transport for London estate, which came into effect in 2019. The latest evaluation exploring the effects of this work was a piece of modelling from researchers at the University of Sheffield and the London School of Hygiene and Tropical Medicine [62]. This work was based heavily on data which explored household food consumption in London (intervention area) against households in urban areas in the north of England [63].

As with other examples discussed in this review, the premise of the work is not being questioned, as when designing public health intervention, evaluation is complex and modelling potential effects is important when making policy decisions. However, when reporting this type of research, inherent differences in food intake and choices between the north of England and London need to be considered as these residual confounders may explain the difference in energy (calorie) intake beyond any influence of advertising restrictions on public transport. 

The assumptions and the low quality of evidence supporting the development of the model used to predict changing prevalence of higher weight are not well explained, especially when translated into press releases [64-66].

What is perhaps of greater concern is that this was then used to predict the number of people living with obesity [62]. Then, through the institutional press releases [46, 48], these have been translated to actual fewer numbers of cases of people living with obesity and having developed type 2 diabetes and cardiovascular disease. This is concerning, as presenting modelled data as actual cases is not only overstating the value of the work, but when explained to the public [67], it can undermine public confidence in health messages. It is therefore recommended that when modelled data are presented, it needs to be clear that the data are modelled and not actually measured cases, therefore not ‘shown’ as one of the institutions involved in this work claimed [68].

 
Since that study was published, we've had 'public health' academics trying to pretend the sugar tax worked, despite child obesity rising for three years in a row after it came into effect. Having failed to find any impact on anyone except Year 6 girls, they ignored everything else and generated headlines that any normie would assume meant that child obesity had fallen (and note the use of the word 'know' in the tweet at the top of this post).
 
  
 
Mellor and Green made a reasonable recommendation, but it will fall on deaf ears. Junk modelling will continue to be employed - because it is the only way to pretend that a failed policy has worked - and numbers on a spreadsheet will continue to be portrayed as if they were real people.

