Wednesday 2 November 2011

A good news story

This piece of junk research came out a couple of months ago...

Meat eaters are selfish and less social

"Meat brings out the worst in people. This is what psychologists at Radboud University Nijmegen and Tilburg University concluded from various studies on the psychological significance of meat.

Thinking of meat makes people less socially [sic] and in many respects more "loutish". It also appears that people are more likely to choose meat when they feel insecure, perhaps because it conveys a feeling of superiority or displays status, the researchers suggest.

Tilburg professors Marcel Zeelenberg (economic psychology) and Diederik Stapel (consumer sciences, and dean of the Tilburg School of Social and Behavioral Sciences) and Nijmegen professor Roos Vonk (social psychology) examined the psychological significance of meat.

The conclusion was that eating meat is symptomatic of some sort of psychological disorder. This, of course, was just what militant vegetarians wanted to hear and it was eye-catching enough to make it into the newspapers.

Roos Vonk, known for her columns and books about how our ego gets in our way, doesn’t feel shocked. "Previous research had already shown that meat eaters think more in terms of dominance and hierarchy (who is the boss?) than vegetarians. Eating meat is also traditionally associated with status; meat used to be much more expensive and scarcer than it is now. Eating meat is a way to elevate yourself above others. But by elevating yourself, you lose connection with others. That explains why insecure people in particular feel the need for it. Thinking about meat also makes people loutish and lonely."

Diederik Stapel adds: "It seems that vegetarians and flexitarians are happier and feel better, and they are also more sociable and less lonely."

Diederik Stapel is a social psychologist with a string of peer-reviewed studies to his name, including this one - just another junk scientist forcing his beliefs onto others with the veneer of social science. Nothing special about that, except that this story has a happy ending.

Dutch 'Lord of the Data' Forged Dozens of Studies

One of the Netherlands' leading social psychologists made up or manipulated data in dozens of papers over nearly a decade, an investigating committee has concluded.

Diederik Stapel was suspended from his position at Tilburg University in the Netherlands in September after three junior researchers reported that they suspected scientific misconduct in a study that claimed eating meat made people more aggressive.

Stapel's work encompassed a broad range of attention-catching topics, including the influence of power on moral thinking and the reaction of psychologists to a plagiarism scandal. The committee, which interviewed dozens of Stapel's former students, postdoctoral researchers, co-authors, and colleagues, found that Stapel alone was responsible for the fraud. The panel reported that he would discuss in detail experimental designs, including drafting questionnaires, and would then claim to conduct the experiments at high schools and universities with which he had special arrangements.

The experiments, however, never took place, the universities concluded. Stapel made up the data sets, which he then gave the student or collaborator for analysis, investigators allege. In other instances, the report says, he told colleagues that he had an old data set lying around that he hadn't yet had a chance to analyze. When Stapel did conduct actual experiments, the committee found evidence that he manipulated the results.

This is the kind of thing that the public expects peer-review to be able to weed out. In practice, alas, peer-reviewers do not verify raw data nor do they obtain proof that experiments have been carried out. Most of the time, they wouldn't be able to perform these checks even if they wanted to.

Not that I'm suggesting that peer-review is massively over-rated - sometimes reviewers will correct spelling mistakes.

The data were also suspicious, the report says: effects were large; missing data and outliers were rare; and hypotheses were rarely refuted. Journals publishing Stapel's papers did not question the omission of details about where the data came from. "We see that the scientific checks and balances process has failed at several levels," says Willem Levelt, chairman of the investigating committee.

The case of Mr Stapel is highly unusual. He got caught.

One down, hundreds to go.


Wiel said...

So you're saying we're at only 1% in our fight to prove that junk science is all over the place, Chris?!

Well, it's at least one step forward. Although maybe it doubles each time as the ball starts rolling: 1, 2, 4, 8, 16, 21, 64.......128!

Greetings from the Netherlands (where obviously a lot can be achieved that doesn't happen in the rest of the world ;) )

Wiel said...

Oops, 32....

Ben said...

How did he (they) become professors in the first place?
And why would a journal be interested in publishing such an outlandish claim and such a useless study, even if the data were somehow real?

Anonymous said...

"Ben said...
How did he (they) become professors in the first place?"

Simply because academia has assumed a position of superiority, providing paper qualifications 'superior' to real-life achievement. The best route to the top is then to dispose of all integrity and simply suck up to those higher up the academic food chain.
This will continue for as long as we have a university-sourced political elite with no connection to real life and a willingness to fund those who 'educated' them.

Michael J. McFadden said...

Chris, you noted that, "Not that I'm suggesting that peer-review is massively over-rated - sometimes reviewers will correct spelling mistakes."

You can't always count on them for that, either. Case in point: the massive and highly publicized Glantz/Lightwood study, "Declines in Acute Myocardial Infarction After Smoke-Free Laws and Individual Risk Attributable to Secondhand Smoke", which began its cogent, Circulation-published analysis with this first sentence:

The estimated effects of recent *pubic* and workplace smoking restriction laws suggest that they produce significant declines in community rates of heart attack. {Emphasis mine}

Of course, as I believe you noted in one of your blog entries here, the study just went downhill from there.


Simon Cooke said...

You ask why social science peer review - especially in sociology and social psychology - fails to spot fundamental data errors up to and including making stuff up.

This little tale might help:

While studying for an MSc in Urban Regeneration, we were required to undertake a module on research methods. No problems there - after all we were undertaking research.

At no point in this module - conducted by a senior academic - did we touch on quantitative data gathering, statistical analysis or data interpretation. Not a hint of significance tests, chi-squared or standard deviation.

I am pretty sure that, of those taking the course with me, I was the only student who gathered and analysed new statistical data (a task that involved spending the summer in street markets) and applied it to a model.

There's your problem - loads of academics in social sciences simply don't have a good enough grasp of statistics to know when the wool is being pulled over their eyes.

Angry Exile said...

Roos Vonk's remarks are hilarious. Yeah, sure, meat eaters think in terms of dominance, right. That'll be why they're constantly nagging tolerant, live-and-let-live vegetarians to switch to an all-meat diet, rocking up to dinner at a veggie's house making remarks such as "Vegetables are what food eats", and demanding that a plateful of something bloody be made especially for them. Except that in most people's experience it's the other way round, isn't it? For a group supposedly less concerned with dominance, there are many veggies who seem awfully determined to dominate other people's lifestyle choices.

Angry Exile said...

PS And what the hell's a flexitarian? You eat power cables?

Jean said...

Roos Vonk, known for her columns and books about how our ego gets in our way, doesn’t feel shocked. "Previous research had already shown that meat eaters think more in terms of dominance and hierarchy (who is the boss?) than vegetarians. (..)"

I don't feel shocked either. Since vegetarians are a small minority, and since they often decline to eat meat for intellectual reasons, it's absolutely unshocking to find out that some of them don't behave or think exactly the same way as non-vegetarians.
Said differently, the study's "finding" is that vegetarians are more or less hippies.
If that's Mrs Vonk's story, it's a short one.

Michael J. McFadden said...

Simon, you wrote, "There's your problem - loads of academics in social sciences simply don't have a good enough grasp of statistics to know when the wool is being pulled over their eyes."

There's an excellent example of this sort of thing going on right here in the states over the last month or so. A "Big New Report!" came out about how wonderful a state smoking ban was. They reported that tax revenue from bars had gone UP after their ban.

There was just one little problem with that. In another segment of the report they'd polled Ohioans about their bar-going habits and found that 40% of smokers were going to bars less, AND 20% of NONsmokers were going to bars less.

So with 100% of the population going to bars roughly 25% LESS after a ban... how can they claim the total bar income went up?

Steve Mace was planning to ask the lead researcher about that. I believe he's still waiting for an answer.


Anonymous said...

Speaking about the flaws of peer review, what about meta-analyses?

Especially meta-analyses that come from world organisations... who gets to check on them?

Epidemiology seems to be a junk science...

Michael J. McFadden said...

Karragianas, meta-analysis has a lot of weaknesses, but if you think about it, it's one of the FEW areas of antismoking research where the researchers can NOT simply "make up the data." The studies they use are, for the most part, published. Whatever weighting and selection procedures they use should also be clear. With that information in front of them, a researcher should be able to determine whether the study was at least done with honest numbers and methodology.

The BASE studies that meta-analyses are built on are an entirely different kettle of fish. The Helena "Heart Miracle" study would have been no miracle at all if just two or three heart attack occurrences had been treated differently in the analysis. Were the results of that study rigged? There's no way of knowing unless you got clearance to go into the same patient records at the hospital that Glantz et al used, and could also determine the exact criteria they used to include and exclude patients.


nisakiman said...

It would seem that I'm a lonely, insecure lout, according to Ms Roos Vonk's theories.

"Roos Vonk, known for her columns and books about how our ego gets in our way, doesn’t feel shocked. "Previous research had already shown that meat eaters think more in terms of dominance and hierarchy (who is the boss?) than vegetarians."

Or to put it another way, vegetarians think like sheep...

kwik said...

Evolution also shows that meat eaters are smarter than grass-eaters. Easy to understand. It doesn't take much IQ to sneak up on a blade of grass....

Michael J. McFadden said...

I dunno, Kwik. I saw it take a hippie almost three hours to stalk down a wily pot plant in his garden one time, years ago.

Of course he WAS kinda stoned at the time....

P.S. And no, it wasn't ME! Nyahh!