The Millennium Village PR Department

The Guardian newspaper reports "Child mortality down by a third in Jeffrey Sachs's Millennium Villages." Which is possibly true (I'm not even going to go into the validity of the non-random controls). But if you take a casual glance at the paper's results table, you'll also find no statistically significant impact of the project on poverty, nutrition, education, or child health.
Of all 18 indicators, 10 are totally statistically insignificant (no difference between intervention and comparison) and only 1 of the 18 indicators is significant at the 1% level.
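For a rough sense of scale (a back-of-envelope illustration of my own, not anything from the paper, and it assumes the 18 tests are independent, which they won't quite be): if the project had zero effect on everything, chance alone would still hand you roughly one "significant" indicator at the 5% level, and about a one-in-six shot at one at the 1% level.

# Back-of-envelope only: how many "significant" results would pure chance
# produce across 18 tests of a true zero effect? (Illustrative assumption:
# the 18 outcomes are treated as independent, which in reality they are not.)
n_tests = 18

expected_false_positives_5pct = n_tests * 0.05           # ~0.9 of 18
prob_at_least_one_at_1pct = 1 - (1 - 0.01) ** n_tests    # ~0.17

print(f"Expected chance 'hits' at the 5% level: {expected_false_positives_5pct:.1f}")
print(f"Chance of at least one 'hit' at the 1% level: {prob_at_least_one_at_1pct:.2f}")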
The text of the Lancet paper mentions 3 times that poverty has fallen in the village sites. And just once that this reduction is actually no different to that in comparison villages.
And check out this sentence:

"For 14 of 18 outcomes, changes occurred in the predicted direction. No significant differences were recorded when comparing poverty ..."

So, mention the direction of the effect when it is the direction you want (but statistically indistinguishable from zero), and neglect to mention the direction of the effect when it is the direct opposite of what you want (but also insignificant).
Now THAT, folks, is science. (Here's the Lancet link, HT: Maham).
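And to see why a "change in the predicted direction" that isn't statistically significant tells you next to nothing, here is a quick simulation sketch (my own illustration, with made-up sample sizes and a made-up true effect of exactly zero, nothing to do with the paper's actual data): when an intervention truly does nothing, about half of all studies will still show an insignificant "improvement", purely through noise.

# Simulate many two-arm studies where the true effect is exactly zero,
# then ask how often the estimate goes in the "favourable" direction
# while being statistically insignificant. (Illustrative numbers only.)
import numpy as np

rng = np.random.default_rng(0)
n_sims, n_per_arm = 50_000, 200
true_effect = 0.0  # pretend the project changes nothing at all

treat = rng.normal(true_effect, 1.0, size=(n_sims, n_per_arm))
control = rng.normal(0.0, 1.0, size=(n_sims, n_per_arm))

diff = treat.mean(axis=1) - control.mean(axis=1)
se = np.sqrt(treat.var(axis=1, ddof=1) / n_per_arm +
             control.var(axis=1, ddof=1) / n_per_arm)
t = diff / se

right_direction = diff > 0
not_significant = np.abs(t) < 1.96

print("Share of null studies with a 'favourable' direction:",
      round(right_direction.mean(), 3))                      # ~0.50
print("Share favourable AND not significant:",
      round((right_direction & not_significant).mean(), 3))  # ~0.475

So roughly half the time a do-nothing project still gets to report that an outcome "moved in the predicted direction".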
6 comments:
If you think the project had "no statistically significant impact of the project on poverty, nutrition, education, or child health", why would it get published in the best journal in international public health - the one that started the now seminal pieces in global malnutrition?
That is a very good question, but it's not "what I think", it's what the data in the Lancet paper actually say.
Right, and the title of the paper refers to child mortality, not the other things, so it isn't the fact that it got published that doesn't make sense. Lee's just highlighting that it doesn't make sense to emphasize one nonsignificant 'effect' while not putting equal emphasis on other 'effects' that were also not significant but that happen to go in the opposite direction from your narrative.
Yes, you're right, rovingbandit. A project that was designed as a proof-of-concept study, where they are doing whatever statistical analysis they can in the meantime while ensuring the majority of the funds go where they are needed, does not demonstrate at the 95% confidence level effects on outcomes it was not designed to measure.
This means that you're smarter than everyone else, and also, since obviously all attempts to cure poverty are not proven by arbitrary standards to work, you can keep spending your money on useless fluff and feel good about yourself, because there's nothing we can do to make this world a better place. Except we can. And you could be helping. But instead you're (not impressively) attempting to poke fun at those who are dedicating their lives to creating the sort of stable, equitable world where we don't have the moral hazard of hundreds of millions of starving children while some amongst us live large, and where stable economic and growth profiles lead to a world largely devoid of security threats to the more prosperous nations such as ours.
But, yes, you're right, that number does look funny!! Hahahahaha!!
Karl, I'm upset about the deliberate misrepresentation of statistical results by Sachs in the world's leading medical journal, which I think is a legitimate concern. I agree with Sachs about a lot, but I think he would be a better advocate if he didn't keep trying to do this.
Not that it matters at all to my point, but for the record I bought 100 bednets for Malawians this year and sent cash to poor Kenyans (though I would like to give more), I work full time on development, and I spend much of my free time advocating for better aid and development policies on this website.
I do think that there is plenty that we can all do to make the world a better place, but I also think it is important that our decisions are driven by evidence so that we don't waste money that could be saving lives. Which is why the deliberate misrepresentation of evidence upsets me.
Cheers,
Lee
Karlito
Obviously, it is frustrating when things don't work and we fail to find methods to improve the lives of people who need some sort of help. But that doesn't mean that there is anything wrong with pointing out (or working out, or better, measuring) that something doesn't work before rolling it out to other unsuspecting populations. Europeans and Americans have a long history of implementing well-meant ideas without really knowing before the fact whether or not the implementation will do any good. We owe it to the people that we are hoping to help to put our zeal and ideology to one side for a moment and hold our efforts abroad to the same standards that we expect for our own families.
Most of the people that we are trying to help do not have access to means of influence, and they cannot make decisions or undertake studies to determine the course of the development programs that are designed to help them. It is not a trivial matter that someone needs to step in (preferably someone more impartial than the authors of this paper) and ask, does it work? Because if it doesn't, then these people have every right to say, 'no more of that please, send us something that does work' rather than simply accept our western offerings with grateful eyes.