
Medialens’s embarrassing archive (part 3) October 29, 2008

Posted by dissident93 in Iraq mortality, Medialens.

Presumably anxious to present their campaign (against IBC) in a positive light, the editors of Medialens (in late March 2006) took credit for a claimed amendment to a BBC web page. And they generously acknowledged the help of their supporters in this supposed achievement by announcing:

“Well done everyone” (Medialens editors, Medialens message board, March 2006)

But the amendment that Medialens took credit for (a note about limitations of IBC’s approach) was entirely imaginary. As was pointed out to me at the time, the BBC page hadn’t changed since it was first published on 14 December 2005, more than a month before Medialens began their campaign. (The only changes were updated numbers and a recently added note about morgue figures.) Oops.

Medialens were rather less vocal over a clear example of the failure of their campaign. They’d hoped to inform the media and others that IBC’s figure was an incomplete count and not a “total” estimate (as IBC themselves have always made clear). Disappointingly, not even Les Roberts or Gilbert Burnham (co-authors of the Lancet 2006 study) got this right. In a prominent article for Slate magazine, Roberts and Burnham made the “shameful” error (Medialens had been condemning journalists in strong terms for making precisely this error):

“President Bush stated that 30,000 ‘more or less’ had died. The president’s estimate roughly matched the estimates of Iraq Body Count, which derives its total by monitoring newspaper reports of violent deaths. Today, IBC estimates there have been 45,000 to 50,000 violent deaths.” (Slate, 20 November 2006)

Medialens have always been quiet about this.

Misrepresenting Iraq Body Count October 22, 2008

Posted by dissident93 in Iraq, Iraq mortality, Media watchdogs.

Note: An extended, updated version of this post was published as a featured article by ZNet.

David Edwards and David Cromwell (editors of Media Lens) have published several articles attacking Iraq Body Count (IBC). Their claims have been widely circulated as part of a sustained, vigorous (and at times aggressive) campaign against IBC. But Media Lens’s case against IBC is riddled with errors, and is discredited by recent research (WHO/IFHS, CRED, etc).

Basic errors by Media Lens

• One of the main premises of Media Lens’s campaign against IBC is that “IBC is not primarily an Iraq Body Count, it is not even an Iraq Media Body count, it is an Iraq Western Media Body Count” (Media Lens 14/3/06, my emphasis).

This is entirely false. IBC use non-Western media sources and non-media sources (eg hospital, morgue and NGO data). They monitor 72 major “non-Western” media on a daily basis, along with 120 “Western” sources. (IBC)

Many incidents/deaths in IBC’s database are from the major wire agencies. This merely reflects the fact that, for example, Reuters covers by far the highest percentage – approximately 50% of documented incidents, compared to 35% from Al Sharqiyah TV (another IBC source), and much lower coverage by other media sources, “Western” or “non-Western” (IBC). Note also that at the level of reporting utilized by IBC, the dichotomy of “Western” vs “non-Western” is false, as agencies such as Reuters employ (for example) Iraqi journalists in covering Iraqi incidents (“We mainly use local reporters, Arab reporters can go out and talk to people” – Reuters’ Baghdad bureau chief).

• In their first article targeting IBC, Media Lens wrote:

“Whereas the Lancet report estimated around 100,000 civilian deaths in October 2004, IBC reported 17,000 at that time.”

This is incorrect in two ways, and is typical of the lack of thoroughness in Media Lens’s research. First, the Lancet study didn’t estimate “civilian” deaths as Media Lens claim (its estimate includes “combatants” as well as civilians). Second, IBC record only violent deaths, so the comparison should be between 57,600 and 17,687 (57,600 being the Lancet study’s estimate of violent deaths, according to Lancet co-author Richard Garfield). But even that isn’t comparing like with like, since IBC do not include combatant deaths, whereas the Lancet study does.

• One of Media Lens’s main claims is that IBC captures only “5-10% of the true death toll”. One can see immediately that this isn’t supported by their comparison of Lancet 2004 and IBC. (IBC’s count of violent civilian deaths is 30% of Lancet 2004’s estimate of total violent deaths. In other words, IBC is capturing much more than 30% of the “true death toll” of violent civilian deaths, given Lancet 2004 as a measure.)

(It’s interesting to note that later estimates, eg from IFHS and CRED, show that IBC is capturing at least around a third of violent civilian deaths – contrary to the claims of Media Lens. An earlier estimate, from ILCS, shows IBC capturing well over a half of violent civilian deaths.)
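The arithmetic behind these comparisons is simple enough to check for yourself. A minimal sketch, using only figures already quoted above (IBC’s 17,687 violent civilian deaths against Garfield’s 57,600 violent-death figure for Lancet 2004):

```python
# Capture-ratio arithmetic, using the figures quoted in the text above.
ibc_violent_civilian = 17_687   # IBC count at the time of Lancet 2004
lancet_2004_violent = 57_600    # Lancet 2004 violent deaths (per Richard Garfield)

ratio = ibc_violent_civilian / lancet_2004_violent
print(f"IBC / Lancet 2004 violent deaths: {ratio:.1%}")  # roughly 30%

# Even this understates IBC's capture rate of violent *civilian* deaths,
# since the Lancet denominator includes combatants - yet it is already
# far above Media Lens's claimed "5-10%".
assert ratio > 0.10
```

Nothing here requires statistical expertise; it is a single division, which is rather the point.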

Latest errors by Media Lens

Media Lens continue their attack on IBC in a more recent article, Iraq Body Count: “A Very Misleading Exercise”. This contains several misrepresentations and errors, which I list below:

1. Media Lens write: “IBC’s response to the suggestion that violence prevents journalists from capturing many deaths has been, in effect, ‘Prove it!’”

This is plainly false. IBC have always stated that “many if not most civilian casualties will go unreported by the media. That is the sad nature of war.” Media Lens are aware of this (they’ve quoted IBC’s statement) and cannot claim ignorance.

2. Media Lens: “It is striking that IBC link to a high-profile media report that so badly misrepresents its figures”.

This is misleading. The purpose of IBC’s link (titled “Lists of victims or victim categories to signal the pervasive impact on every sector of Iraqi society”) is to provide an example of how media have used IBC’s data on individual victims (see the lower section of the cited article, which is clearly titled “Victims’ Stories”).
Whether Media Lens’s assertion that the article “misrepresents” IBC figures has any merit or not is irrelevant to the point of the link. IBC doesn’t endorse misrepresentations of its figures.

(Given Media Lens’s advocacy for the Lancet studies on Iraq mortality, it’s “striking” that they fail to mention a similar misrepresentation of IBC’s figures by the Lancet study’s authors, in an article for Slate magazine: “Today, IBC estimates there have been 45,000 to 50,000 violent deaths”).

3. Media Lens: “Whereas IBC have responded vigorously, indeed tirelessly, in responding [sic] to the 2004 and 2006 Lancet studies…”

In fact IBC released only two documents commenting on Lancet 2006 (both mildly critical) and one on Lancet 2004 (uncritical):

http://www.iraqbodycount.org/analysis/beyond/lancet100000/
http://www.iraqbodycount.org/analysis/beyond/reality-checks/
http://www.iraqbodycount.org/analysis/beyond/state-of-knowledge/ (only part of this document deals with Lancet 2006).

That Media Lens is now condemning IBC for “responding” to the Lancet studies is itself an ironic turn of events. One of the main complaints of Media Lens’s earlier articles targeting IBC was that IBC were “Refusing to Respond”.

[Image: Medialens alert, 14 March 2006]

4. Media Lens: “It was [Marc] Herold’s Afghan Victim Memorial Project that inspired John Sloboda to set up IBC. Herold’s ‘most conservative estimate’ of Afghan civilian deaths resulting from American/NATO operations is between 5,700 and 6,500. But, he cautions, this is ‘probably a vast underestimate’ [...] There is no reason to believe that the application of the same methodology in Iraq is generating very different results.”

Again this is mistaken and misleading. IBC use the same general approach as Marc Herold has used for Afghanistan, but they don’t use the same methodology. One of three reasons listed by Herold in support of his comment in the same article is that his count includes civilian victims directly killed by US/NATO bombings and military action, while excluding victims of the Taliban or other perpetrators. IBC, of course, includes killings by any perpetrator in Iraq. There are several other differences between the methodologies, and there are also reasons to believe the approach in Iraq is generating somewhat different results than in Afghanistan. But it is unlikely that Media Lens have looked into the matter in enough depth to know these reasons. They have not looked closely enough even to know that there are differences in the methodologies, or that it was not Herold’s “Afghan Victim Memorial Project” (begun in 2004) that inspired IBC, but rather his “Daily Casualty Count of Afghan Civilians Killed by U.S. Bombing” (begun in 2001): two wholly different projects.

In any case, Herold has now written to ZNet stating that the paragraph written by Edwards and Cromwell has inaccuracies which need to be corrected, and that the inference drawn from it regarding IBC is unwarranted.

5. Media Lens: “…what IBC is doing to promote or reduce the confusion”.

This is an unworthy insinuation, suggesting IBC are “promoting” confusion, but providing no examples of this.

6. Media Lens: “Well, the bureau chief of one of three Western media agencies providing a third of IBC’s data from Iraq sent this email to a colleague last year (the latter asked us to preserve the sender’s anonymity)”.

Media Lens also cited an “anonymous epidemiologist” in their earlier pieces targeting IBC. It was noteworthy then, as it is now with this anonymous “bureau chief” and “colleague”, that these unnamed sources weren’t able to send their comments directly to IBC (who would, of course, have treated them in confidence), or to stand behind them publicly. In effect it amounts to third-hand rumour-mongering.

7. Media Lens: “…a new ORB poll revealing that 1.2 million Iraqis had been murdered since the 2003 invasion”.

This is inaccurate. ORB estimated 1.2 million murders. They did not “reveal” any number of actual murders. Note also that the ORB poll wasn’t peer-reviewed science. According to ORB’s publicity literature, the person conducting ORB’s poll, Munqith Daghir, began his polling career in 2003, with little in the way of formal training or field experience. The ORB poll doesn’t have the scientific standing of major studies such as ILCS, which Media Lens failed to mention.

8. Media Lens: “Why is it important for IBC [...] to challenge the methodology and conclusions of epidemiological studies published in the Lancet…”.

IBC didn’t “challenge” Lancet 2004 (see IBC’s uncritical press release on Lancet 2004), so Media Lens are incorrect to write “studies” (plural). And other leading researchers besides IBC have expressed scepticism over the Lancet 2006 estimates: Jon Pedersen of the UNDP Iraq study, demographer Beth Osborne Daponte, Fritz Scheuren, a past president of the American Statistical Association, Professor Hans Rosling and Dr Johan Von Schreeb at the Karolinska Institute in Stockholm, Oxford physicists Neil Johnson and Sean Gourley, Debarati Guha-Sapir, Director of the WHO Collaborating Centre for Research on the Epidemiology of Disasters (CRED), among many others.

9. Media Lens: “Secondly, while IBC’s self-described task does indeed require only “care and literacy”, does not the task of challenging peer-reviewed science published by some of the world’s leading epidemiologists require very much more? Does it not, in fact ‘require statistical analysis or extrapolations,’…”.

In fact, it does not require “statistical analysis” to observe that the Lancet 2006 figure implies that half a million death certificates are missing. It does not require “extrapolations” to observe contradictions in the accounts of the Lancet 2006 team’s description of sampling, or to note that the sampling methodology as published wouldn’t give you “random” street selection. You don’t need “world’s leading epidemiologists” to appreciate how important random sampling is.
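The importance of random sampling can be illustrated with a toy simulation (all numbers hypothetical; this is not a model of the actual surveys): if households near main streets see more violence, and the sampling scheme over-selects them, the resulting mortality estimate is biased upward.

```python
import random

random.seed(0)

# Hypothetical population: 10,000 households; 20% lie on main streets,
# where the (invented) violent-death rate is three times the back-street rate.
households = (
    [("main", 0.06)] * 2_000 +    # main-street households, 6% death rate
    [("back", 0.02)] * 8_000      # back-street households, 2% death rate
)
true_rate = sum(p for _, p in households) / len(households)  # 2.8%

def survey(main_streets_only: bool, n: int = 1_000) -> float:
    """Estimate the death rate from a sample of n households."""
    pool = [h for h in households if h[0] == "main"] if main_streets_only else households
    sample = random.sample(pool, n)
    deaths = sum(random.random() < p for _, p in sample)
    return deaths / n

print(f"true rate:          {true_rate:.3f}")
print(f"random sample:      {survey(False):.3f}")  # close to the true rate
print(f"main-street sample: {survey(True):.3f}")   # biased upward, toward 6%
```

A truly random sample lands near the true rate; a sample confined to main streets does not, no matter how large it is. That is the whole of the “main street bias” concern, and no epidemiological credentials are needed to follow it.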

Conclusion

The rhetorical basis of Media Lens’s campaign against IBC is: “how dare these data collectors tirelessly and vigorously criticise an epidemiological study”. It’s a weak and misleading argument, and an appeal to crass credentialism. It’s noteworthy that Media Lens don’t apply the same credentialist standards to the ORB poll which they endorse (and which, as noted above, is not peer-reviewed science). It’s noteworthy also that (to date) Media Lens have ignored a large body of science (from leaders in the fields of demography and epidemiology) which tends to support and confirm the data collected by IBC.

Bear in mind that Media Lens went as far as writing (in a letter to New Statesman magazine, 16/10/06) that, “to our knowledge, IBC has not been able to demonstrate support for its methods from a single professional epidemiologist”. Presumably they weren’t paying attention to the views of leading epidemiologists such as Debarati Guha-Sapir, Olivier Degomme, Mohamed M. Ali, Colin Mathers, J. Ties Boerma, etc.

Project Censored as censors? October 20, 2008

Posted by dissident93 in Iraq mortality, Project Censored.

How many Iraqi deaths due to the US occupation? Project Censored (Top censored story for 2009) focuses on the calamity in Iraq, but excludes several crucial scientific studies from its account. As a result it presents a headline deaths figure which isn’t supported by scientific consensus.

The misleading headline

Iraq has become a bloodbath, but Project Censored’s headline claim, “Over One Million Iraqi Deaths”, isn’t endorsed by the majority of experts in the field – many leading authorities dispute this level of deaths: Jon Pedersen, Beth Osborne Daponte, Debarati Guha-Sapir, Mark van der Laan, etc.

There are more peer-reviewed scientific studies casting doubt on Project Censored’s headline figure than there are corroborating it.* But Project Censored doesn’t tell readers about this body of scientific research. Why?

“Good” censorship?

Two studies (ORB and Lancet 2006) are cited by Project Censored, but other, larger, scientific surveys (eg WHO/IFHS*) have been excluded, as have important critical studies and overviews of existing research (eg from the Centre for Research on the Epidemiology of Disasters, which estimated 125,000 deaths over the same period as Lancet 2006*).

Questionable scientific standing

ORB (Opinion Research Business) is the market research company which publicised the “over a million” estimate cited by Project Censored. ORB’s Iraq poll wasn’t peer-reviewed science. The person conducting ORB’s poll, Munqith Daghir, began his polling career in 2003, with little in the way of formal training or field experience (according to ORB’s publicity literature).

The ORB poll doesn’t have the scientific standing of the major studies (eg IFHS, ILCS*) which Project Censored excludes.

Leading researchers disagree with Project Censored

Many researchers either flatly reject the mortality level claimed by Project Censored, or are highly critical of the Lancet 2006 estimates which it cites. On the other hand, studies which Project Censored overlooks (or intentionally excludes) have been well-received by professional epidemiologists and demographers as important contributions to the field…

• Jon Pedersen (Fafo) is one of the leading experts on Middle East demography. He conducted the Iraq Living Conditions Survey (ILCS, a cluster-sample survey of over 21,000 Iraqi households – much larger than Lancet 2006, which surveyed approximately 1,800). Pedersen has commented that the Lancet 2006 mortality estimates were “high, and probably way too high. I would accept something in the vicinity of 100,000 but 600,000 is too much.” (Source: Washington Post, 19 Oct 2006)

• Research by Debarati Guha-Sapir and Olivier Degomme, from the Centre for Research on the Epidemiology of Disasters (CRED) estimates the total war-related death toll (for the period covered by Lancet 2006) at around 125,000. They reach this figure by correcting errors in the Lancet 2006 survey, and triangulating with IBC and ILCS data. Source: CRED paper.

• Beth Osborne Daponte (the renowned demographer who produced authoritative death figures for the first Gulf War) argues in a recent paper that the most reliable information available (to date) is provided by a combination of IFHS, ILCS and Iraq Body Count. This puts a working estimate well below the “million” figure claimed by Project Censored. Daponte is critical of the Lancet 2006 study – like several other researchers, she finds its pre-war crude death rate too low (which would inflate the excess deaths estimate). She writes that the Lancet authors “have not adequately addressed these issues”. http://tinyurl.com/48mq63

• Paul Spiegel, an epidemiologist at the UN, commented on IFHS (which estimated 151,000 violent deaths over the same period as Lancet 2006): “Overall, this [IFHS] is a very good study [...] What they have done that other studies have not is try to compensate for the inaccuracies and difficulties of these surveys.” He adds that “this does seem more believable to me [than Lancet 2006]”. http://tinyurl.com/53s82b

• Mark van der Laan, an authority in the field of biostatistics (and recipient of the Presidential Award of the Committee of Presidents of Statistical Societies) has written, with Leon de Winter, on the Lancet 2006 study:

“We conclude that it is virtually impossible to judge the value of the original data collected in the 47 clusters [of the Lancet study]. We also conclude that the estimates based upon these data are extremely unreliable and cannot stand a decent scientific evaluation.” http://tinyurl.com/4txbpw

• Mohamed M. Ali, Colin Mathers and J. Ties Boerma, from the World Health Organization at Geneva (authors of IFHS), write that it “is unlikely that a small survey with only 47 clusters [Lancet 2006] has provided a more accurate estimate of violence-related mortality than a much larger survey sampling of 971 clusters [IFHS].” http://content.nejm.org/cgi/content/full/359/4/431

• Survey methodologist Seppo Laaksonen has expressed many doubts over the Lancet 2006 estimates due to problems with the data (an attempt was made by Laaksonen to reconstruct country-level estimates using data received from the Lancet team). See Retrospective two-stage cluster sampling for mortality in Iraq by Seppo Laaksonen, International Journal of Market Research, Vol. 50, No. 3, 2008.

• Neil F. Johnson, et al, in Bias in Epidemiological Studies of Conflict Mortality, argue that there may be a “substantial overestimate of mortality” in the Lancet 2006 study due to a bias introduced in the street sampling procedure. The Lancet authors responded by asserting that such a “main street bias” was intentionally avoided, but (to date) have not been able to explain how this was achieved (without fundamentally changing the published sampling scheme). It remains a serious, unresolved issue.

• Many other researchers have criticised the estimates produced by Lancet 2006 and ORB. These criticisms include a comprehensive paper by Professor Michael Spagat on “ethical and data-integrity problems” in the Lancet study. Fritz Scheuren, a past president of the American Statistical Association, has said the response rate in the Lancet 2006 study was “not credible”. Professor Stephen Fienberg, the well-known statistician, is on record as stating that he doesn’t believe the Lancet 2006 estimate. Two of the world’s prestigious scientific journals, Nature and Science, ran articles critical of the Lancet 2006 study.

Project Censored doesn’t mention this substantial body of opinion among leading researchers which, inconveniently, contradicts its message.

*References & further reading

Research disputing or indirectly contradicting the mortality estimates cited by Project Censored:

1. Estimating mortality in civil conflicts: lessons from Iraq, by Debarati Guha-Sapir, Olivier Degomme. Centre for Research on the Epidemiology of Disasters, University of Louvain, School of Public Health, Brussels. Paper (PDF format)

2. Wartime estimates of Iraqi civilian casualties, by Beth Osborne Daponte. International Review of the Red Cross, No. 868. http://tinyurl.com/48mq63

3. Violence-Related Mortality in Iraq from 2002 to 2006, Iraq Family Health Survey (IFHS) Study Group. The New England Journal of Medicine, Volume 358:484-493. http://tinyurl.com/yoysuf

4. Bias in Epidemiological Studies of Conflict Mortality, by Neil F. Johnson, Michael Spagat, Sean Gourley, Jukka-Pekka Onnela, Gesine Reinert. Journal of Peace Research, Vol. 45, No. 5. http://jpr.sagepub.com/cgi/content/abstract/45/5/653

5. Sampling bias due to structural heterogeneity and limited internal diffusion, by Jukka-Pekka Onnela, Neil F. Johnson, Sean Gourley, Gesine Reinert, Michael Spagat. http://arxiv.org/abs/0807.4420

6. Ethical and Data-Integrity Problems in the Second Lancet Survey of Mortality in Iraq, by Michael Spagat, Department of Economics, Royal Holloway College. http://tinyurl.com/4xsjtl

7. Confidence Intervals for the Population Mean Tailored to Small Sample Sizes, with Applications to Survey Sampling, by Michael Rosenblum, Mark J. van der Laan. University of California, Berkeley Division of Biostatistics Working Paper Series. http://www.bepress.com/ucbbiostat/paper237/

8. Reality checks: some responses to the latest Lancet estimates, by Hamit Dardagan, John Sloboda, and Josh Dougherty. Iraq Body Count Press Release 14, Oct 2006. http://tinyurl.com/ysfpbj

9. Retrospective two-stage cluster sampling for mortality in Iraq, by Seppo Laaksonen, International Journal of Market Research, Vol. 50, No. 3, 2008. http://tinyurl.com/4yawmx

10. Mainstreaming an Outlier: The Quest to Corroborate the Second Lancet Survey of Mortality in Iraq, by Michael Spagat, Department of Economics, Royal Holloway College. http://tinyurl.com/46v8jy

11. Iraq Living Conditions Survey (ILCS) 2004, United Nations Development Programme. http://tinyurl.com/5yfyye

12. Mortality after the 2003 invasion of Iraq: Were valid and ethical field methods used in this survey?, by Madelyn Hsiao-Rei Hicks. Households in Conflict Network, The Institute of Development Studies, University of Sussex. 1 December 2006. http://www.hicn.org/research_design/rdn3.pdf

13. “Mortality after the 2003 invasion of Iraq: A cross-sectional cluster sample survey”, by Burnham et al: An Approximate Confidence Interval for Total Number of Violent Deaths in the Post Invasion Period, by Mark J. van der Laan, Division of Biostatistics, University of California, Berkeley, October 26, 2006. http://socrates.berkeley.edu/~jewell/lancet061.pdf

14. Lancet 2006 study criticised – letters from researchers published in the Lancet journal, 2007; 369.

Leading researchers disagree with Project Censored October 20, 2008

Posted by dissident93 in Demography, Iraq mortality, Project Censored.

Project Censored’s “Top censored story for 2009” headline states: “Over One Million Iraqi Deaths Caused by US Occupation”. Many researchers either flatly reject this level of deaths, or are critical of the two studies that Project Censored cites (ORB and Lancet 2006). On the other hand, studies which Project Censored overlooks (or intentionally excludes) – eg IFHS, ILCS, CRED, etc – have been well-received by professional epidemiologists and demographers as important contributions to the field…

• Jon Pedersen is one of the leading experts on Middle East demography. He conducted the Iraq Living Conditions Survey (ILCS, a cluster-sample survey of over 21,000 Iraqi households – much larger than Lancet 2006, which surveyed approximately 1,800). Pedersen has commented that the Lancet 2006 mortality estimates were “high, and probably way too high. I would accept something in the vicinity of 100,000 but 600,000 is too much.” (Source: Washington Post, 19 Oct 2006)

• Research by Debarati Guha-Sapir and Olivier Degomme, from the Centre for Research on the Epidemiology of Disasters (CRED) estimates the total war-related death toll (for the period covered by Lancet 2006) at around 125,000. They reach this figure by correcting errors in the Lancet 2006 survey, and triangulating with IBC and ILCS data. Source: CRED paper (PDF).

• Beth Osborne Daponte (the renowned demographer who produced authoritative death figures for the first Gulf War) argues in a recent paper that the most reliable information available (to date) is provided by a combination of IFHS, ILCS and Iraq Body Count. This puts a working estimate well below the “million” figure claimed by Project Censored. Daponte is critical of the Lancet 2006 study – like several other researchers, she finds its pre-war crude death rate too low (which would inflate the excess deaths estimate). She writes that the Lancet authors “have not adequately addressed these issues”. http://tinyurl.com/48mq63

• Paul Spiegel, an epidemiologist at the UN, commented on IFHS (which estimated 151,000 violent deaths over the same period as Lancet 2006): “Overall, this [IFHS] is a very good study [...] What they have done that other studies have not is try to compensate for the inaccuracies and difficulties of these surveys.” He adds that “this does seem more believable to me [than Lancet 2006]”. http://tinyurl.com/53s82b

• Mark van der Laan, an authority in the field of biostatistics (and recipient of the Presidential Award of the Committee of Presidents of Statistical Societies) has written, with Leon de Winter, on the Lancet 2006 study:

“We conclude that it is virtually impossible to judge the value of the original data collected in the 47 clusters [of the Lancet study]. We also conclude that the estimates based upon these data are extremely unreliable and cannot stand a decent scientific evaluation.” http://tinyurl.com/4txbpw

• Mohamed M. Ali, Colin Mathers and J. Ties Boerma, from the World Health Organization at Geneva (authors of IFHS), write that it “is unlikely that a small survey with only 47 clusters [Lancet 2006] has provided a more accurate estimate of violence-related mortality than a much larger survey sampling of 971 clusters [IFHS].” http://content.nejm.org/cgi/content/full/359/4/431

• Survey methodologist Seppo Laaksonen has expressed many doubts over the Lancet 2006 estimates due to problems with the data (an attempt was made by Laaksonen to reconstruct country-level estimates using data received from the Lancet team). See Retrospective two-stage cluster sampling for mortality in Iraq by Seppo Laaksonen, International Journal of Market Research, Vol. 50, No. 3, 2008.

• Neil F. Johnson, et al, in Bias in Epidemiological Studies of Conflict Mortality, argue that there may be a “substantial overestimate of mortality” in the Lancet 2006 study due to a bias introduced in the street sampling procedure. The Lancet authors responded by asserting that such a “main street bias” was intentionally avoided, but (to date) have not been able to explain how this was achieved (without fundamentally changing the published sampling scheme). It remains a serious, unresolved issue.

• Many other researchers have criticised the estimates produced by Lancet 2006 and ORB. These criticisms include a comprehensive paper by Professor Michael Spagat on “ethical and data-integrity problems” in the Lancet study. Fritz Scheuren, a past president of the American Statistical Association, has said the response rate in the Lancet 2006 study was “not credible”. Professor Stephen Fienberg, the well-known statistician, is on record as stating that he doesn’t believe the Lancet 2006 estimate. Two of the world’s prestigious scientific journals, Nature and Science, ran articles critical of the Lancet 2006 study.

• Donald Berry is Chairman of the Department of Biostatistics and Applied Mathematics at the University of Texas MD Anderson Cancer Center. Berry is reported as writing that the Lancet 2006 estimates are “unreliable”:

“…The last thing I want to do is agree with Bush, especially on something dealing with Iraq. But I think ‘unreliable’ is apt. (I just heard Bush say ‘not credible.’ ‘Unreliable’ is better. There is a certain amount of credibility in the study, but they exaggerate the reliability of their estimate.)

“Selecting clusters and households that are representative and random is enormously difficult. Moreover, any bias on the part of the interviewers in the selection process would occur in every cluster and would therefore be magnified. The authors point out the possibility of bias, but they do not account for it in their report.

“It is true that the range reported (392,979–942,636) is huge. Its width represents only one source of variability, the statistical error present under the assumption that their sample is representative and random. I believe their analysis to be correct under these assumptions. However, it does not incorporate the possibility of biases such as the one I mentioned above. Incorporating the possibility of such biases would lead to a substantially wider range, the potential for bias being huge. Although there is no formal way to address bias short of having an ‘independent body assess the excess mortality,’ which the authors recommend, the lower end of this range could easily drop to the 100,000 level.”
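Berry’s point about bias and interval width can be illustrated with a toy simulation (all numbers hypothetical; this is not a model of the actual survey). A selection bias that repeats in every cluster shifts the estimate systematically, so a nominal 95% interval, which reflects only sampling error, covers the true rate far less often than advertised:

```python
import random
import statistics

random.seed(1)

TRUE_RATE = 0.02   # hypothetical true violent-death rate
BIAS = 0.01        # hypothetical selection bias, present in EVERY cluster
CLUSTERS, HH_PER_CLUSTER = 47, 40   # cluster count echoing the text above

def one_survey():
    """Simulate one cluster survey; return (estimate, ci_low, ci_high)."""
    cluster_rates = []
    for _ in range(CLUSTERS):
        p = TRUE_RATE + BIAS  # the bias recurs in every cluster, so it never averages out
        deaths = sum(random.random() < p for _ in range(HH_PER_CLUSTER))
        cluster_rates.append(deaths / HH_PER_CLUSTER)
    mean = statistics.fmean(cluster_rates)
    se = statistics.stdev(cluster_rates) / len(cluster_rates) ** 0.5
    return mean, mean - 1.96 * se, mean + 1.96 * se

covered = sum(lo <= TRUE_RATE <= hi
              for _, lo, hi in (one_survey() for _ in range(500)))
print(f"nominal-95% intervals containing the true rate: {covered}/500")
# With the systematic bias present, coverage falls far below the nominal 95%.
```

This is exactly Berry’s observation: the published interval is “correct under these assumptions”, but incorporating the possibility of a repeated per-cluster bias would widen it substantially.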

References & further reading

Research disputing or indirectly contradicting the mortality estimates cited by Project Censored:

1. Estimating mortality in civil conflicts: lessons from Iraq, by Debarati Guha-Sapir, Olivier Degomme. Centre for Research on the Epidemiology of Disasters, University of Louvain, School of Public Health, Brussels. http://www.cedat.be/sites/default/files/WP%20Iraq_0.pdf

2. Wartime estimates of Iraqi civilian casualties, by Beth Osborne Daponte. International Review of the Red Cross, No. 868. http://tinyurl.com/48mq63

3. Violence-Related Mortality in Iraq from 2002 to 2006, Iraq Family Health Survey (IFHS) Study Group. The New England Journal of Medicine, Volume 358:484-493. http://tinyurl.com/yoysuf

4. Bias in Epidemiological Studies of Conflict Mortality, by Neil F. Johnson, Michael Spagat, Sean Gourley, Jukka-Pekka Onnela, Gesine Reinert. Journal of Peace Research, Vol. 45, No. 5. http://jpr.sagepub.com/cgi/content/abstract/45/5/653

5. Sampling bias due to structural heterogeneity and limited internal diffusion, by Jukka-Pekka Onnela, Neil F. Johnson, Sean Gourley, Gesine Reinert, Michael Spagat. http://arxiv.org/abs/0807.4420

6. Ethical and Data-Integrity Problems in the Second Lancet Survey of Mortality in Iraq, by Michael Spagat, Department of Economics, Royal Holloway College. http://tinyurl.com/4xsjtl

7. Confidence Intervals for the Population Mean Tailored to Small Sample Sizes, with Applications to Survey Sampling, by Michael Rosenblum, Mark J. van der Laan. University of California, Berkeley Division of Biostatistics Working Paper Series. http://www.bepress.com/ucbbiostat/paper237/

8. Reality checks: some responses to the latest Lancet estimates, by Hamit Dardagan, John Sloboda, and Josh Dougherty. Iraq Body Count Press Release 14, Oct 2006. http://tinyurl.com/ysfpbj

9. Retrospective two-stage cluster sampling for mortality in Iraq, by Seppo Laaksonen, International Journal of Market Research, Vol. 50, No. 3, 2008. http://tinyurl.com/4yawmx

10. Mainstreaming an Outlier: The Quest to Corroborate the Second Lancet Survey of Mortality in Iraq, by Michael Spagat, Department of Economics, Royal Holloway College. http://tinyurl.com/46v8jy

11. Iraq Living Conditions Survey (ILCS) 2004, United Nations Development Programme. http://tinyurl.com/5yfyye

12. Mortality after the 2003 invasion of Iraq: Were valid and ethical field methods used in this survey?, by Madelyn Hsiao-Rei Hicks. Households in Conflict Network, The Institute of Development Studies, University of Sussex. 1 December 2006. http://www.hicn.org/research_design/rdn3.pdf

13. “Mortality after the 2003 invasion of Iraq: A cross-sectional cluster sample survey”, by Burnham et al: An Approximate Confidence Interval for Total Number of Violent Deaths in the Post Invasion Period, by Mark J. van der Laan, Division of Biostatistics, University of California, Berkeley, October 26, 2006. http://socrates.berkeley.edu/~jewell/lancet061.pdf

14. Lancet 2006 study criticised – letters from researchers published in the Lancet journal, 2007; 369.

Scientists ignored over Iraq October 19, 2008

Posted by dissident93 in Iraq, Iraq mortality, Media Criticism.
comments closed

Two competing mythologies on Iraqi deaths:

  1. Mass media (particularly in US): “A few thousand American soldiers died, plus several thousand Iraqis, but we don’t talk about them”.
  2. Alternative media: “Over a million Iraqi deaths, and anyone who questions this figure is probably a supporter of the war”.

The problem with “mainstream” coverage is obvious; the problem with “alternative” media only becomes apparent when you research the science. And by research, I don’t mean read the alternative media – I mean check out the source material, the scientific studies, the views expressed by the leading researchers themselves (as opposed to “alternative” journalists working for Alternet or whatever).

Here’s a start. You’ve probably read the views of the Lancet study authors (since they are often quoted on alternative media sites). Here’s some material by other leading epidemiologists/demographers, which you probably haven’t read, and which presents a different view from the “alternative media” consensus:

The ignored perspectives

Debarati Guha-Sapir and Olivier Degomme, from the Centre for Research on the Epidemiology of Disasters, Brussels, have written:

The Burnham [Lancet 2006] estimates of deaths in the post invasion period are much higher than any other estimate. Even the lower limit of its 95% CI is higher than the highest estimate from any other source (Table 1). Further, weaknesses cited earlier as well as several inconsistencies in their published work undermine the reliability of their estimates. [...]

While IBC is undoubtedly missing some deaths in Baghdad, it is unlikely that they would miss an average of over 100 violent deaths a day, given the level of media coverage in the city. We therefore conclude that their Baghdad mortality estimate is close to complete, further corroborated by the ILCS estimates [...]

Our re-estimation of total war-related death toll for Iraq from the invasion until June 2006 is therefore around 125,000.

Leading epidemiologists Mohamed M. Ali, Colin Mathers and J. Ties Boerma from the World Health Organisation / Iraq Family Health Survey have written:

Both sources [IFHS & IBC] indicate that the 2006 study by Burnham et al [Lancet] considerably overestimated the number of violent deaths. To reach the 925 violent deaths per day reported by Burnham et al [Lancet] for June 2005 through June 2006, as many as 87% of violent deaths would have been missed in the IFHS and more than 90% in the Iraq Body Count. This level of underreporting is highly improbable, given the internal and external consistency of the data and the much larger sample size and quality-control measures taken in the implementation of the IFHS.

Beth Osborne Daponte (the renowned demographer who produced authoritative death figures for the first Gulf War) has recently written:

Perhaps the best that the public can be given is exactly what IBC provides – a running tally of deaths derived from knowledge about incidents. While imperfect, that knowledge, supplemented by the wealth of data of the Iraq Living Conditions Survey and Iraq Family Health Survey (which have their own limitations), provides enough information in the light of the circumstances. At a later date, additional surveys can be conducted to determine the impact and/or do demographic analysis. But for now, the Iraq Body Count’s imperfect figures combined with the data of the ILCS and IFHS may suffice. [...]

The estimates from the [Lancet] studies have been lauded but also questioned, partially because the researchers have misinterpreted their own figures but also because of fundamental questions about the representativeness of the achieved survey sample.

Mark van der Laan, an authority in the field of biostatistics (and recipient of the Presidential Award of the Committee of Presidents of Statistical Societies) has written, with Leon de Winter, on the Lancet 2006 study:

“We conclude that it is virtually impossible to judge the value of the original data collected in the 47 clusters [of the Lancet study]. We also conclude that the estimates based upon these data are extremely unreliable and cannot stand a decent scientific evaluation.” http://tinyurl.com/4txbpw

Medialens’s embarrassing archive (part 2) October 19, 2008

Posted by dissident93 in Iraq mortality, Medialens.
comments closed

Medialens’s campaign against Iraq Body Count (IBC) started in January 2006. It continued into March 2006 with their “Iraq Body Count refuses to respond” article. IBC published a substantial response in April 2006.

Then Medialens rewrote history. Conscious of the fact that they were accused of waging a smear campaign against IBC (Peter Beaumont of the Observer later described it as “deeply vicious”), Medialens attempted to (falsely) portray IBC as the aggressors. Less than five months after starting the anti-IBC campaign, the Medialens editors wrote, incorrectly, that IBC had been campaigning against them “over the last five months” (Medialens message board, June 2, 2006).

When questioned about this apparently dishonest remark, the Medialens editors replied:

Quibbling about whether it’s exactly four, five or six months is neither here nor there. (Medialens editors, Medialens message board, June 2, 2006).

But it was no mere quibble. Medialens apparently hoped their own readers had poor memories. Only two-and-a-half months earlier (14 March 2006) they’d complained of IBC’s “unwillingness to respond”.

Here’s a brief history of Medialens’s campaign – posted for those with poor memories (originally posted by me to the PoV website, November 1 2006):

• January 2006. MediaLens [ML] release their first anti-IBC alert, politely worded, but with a stench of insinuation from the start (that perhaps IBC aren’t as anti-war as they seem, etc). And full of distortions, errors and unsupported inferences. IBC don’t respond, beyond a few brief, polite emails, basically hinting that they don’t buy the premises of ML.

• MediaLens and their supporters get royally pissed at being snubbed by “silent”, “non-responsive” IBC. Edwards & Cromwell [ML's editors] issue more anti-IBC alerts, increasingly authoritarian in tone (basically dictating to IBC what the “honourable” thing to do is). Still, IBC sensibly don’t take the bait.

• Three months go by, in which we witness increasing fury on the ML board at IBC’s “failure to respond”. During this time the smears and attacks appear daily on the ML board. Edwards and Cromwell post the “aiding and abetting in war crimes” smear, and others. IBC are publicly accused of being “complicit in mass slaughter”, “assisting the US government”, being “cosy” with military and intelligence, “not caring” about the suffering of Iraqis, being “apologists” and “propagandists” for war criminals, etc, etc. John Sloboda is subjected to frequent personal attacks and character assassinations, both by email and via the ML board. On the ML board, these often involve people digging up old pieces by Sloboda, which are then quoted out of context.

• During all this time, IBC hasn’t criticised or rubbished the 2004 Lancet study. [In fact IBC had previously issued a positive press release commenting on it - which describes the difference between IBC's and Lancet 2004's methodologies/figures].

• A few supporters and one member of IBC start posting to the ML board, mainly to deal with the factual distortions appearing there. Some of the anti-IBC crowd learn, to their surprise, that IBC have always stated that many or most deaths will go unreported. And that IBC use a long list of non-western media, etc.

• The anti-IBC campaign moves to a new phase. Les Roberts supplies the anonymous “IBC amateurs” quote, which Edwards and Cromwell milk, on the ML board, in emails to BBC journalists, in their next alert, etc. Roberts meanwhile (we assume) fails to point out to Edwards and Cromwell that their alerts have been based on claims from Roberts which are known [and admitted], by Roberts, to be erroneous. Odd one, that.

• After the smears have gone on for months, IBC finally publishes a defence, Speculation is No Substitute, which deals with, among other things, Roberts’s error – but which isn’t a criticism of the Lancet study (Roberts’s error was made outside of that study).

• After having been criticised, for months, for “failing to respond”, IBC are now attacked for responding (when they should, of course, be using their time to email Jon Snow, as demanded by the ML editors). Meanwhile, Les Roberts is running for Congress [or was, in 2006], and clearly doesn’t spend much time writing to media outlets to complain about their misrepresentations/omissions of the Lancet study. But this doesn’t seem to bother the ML crowd. After all, Roberts is an expert and a saint. IBC supporters are banned from the debate at ML. After all, they are supporters of apologists for war criminals.

• It all gradually dies down a bit (thank God), after Edwards and Cromwell are reduced to comparing a member of IBC to one of their ex-girlfriends. Typical of the quality of comment from Edwards and Cromwell at this time: “Where’s your message board, Josh?”

Medialens later resurrected their campaign, and continue to attack IBC.

Medialens’s embarrassing archive (part 1) October 19, 2008

Posted by dissident93 in Iraq mortality, Medialens.
comments closed

I’ve listed (most recently in a ZNet article) errors in Medialens’s articles (from their campaign against Iraq Body Count). But the most embarrassing Medialens material was posted to their message board, and has long since disappeared (no doubt to their relief).

I saved some of it to disk, however. It’s worth presenting (in installments – there’s a lot of it) as it shows how poorly-researched – and often hypocritical – Medialens’s campaign was.

On 24 March 2006, David Edwards (Medialens co-editor) signalled the latest “phase” of the campaign by posting a message titled: ‘The IBC “amateurs”’. It claimed that a “world’s leading epidemiologist” had told them that IBC “is run by amateurs”. When asked for the identity of this epidemiologist, the Medialens editors replied:

“Our source has chosen to remain anonymous – that’s his decision. Our concern is that this important information be made public.” (Medialens editors, 24/3/06)

Well, they certainly milked this “important information”. Their “amateurs” theme ran and ran, and the Medialens editors seemed anxious to tell journalists about it:

Dear Steve

It baffles me that you would take IBC more seriously than you would the peer-reviewed Lancet report, which after all appeared in a major science journal. One of the world’s leading epidemiologists told me the IBC is run by “amateurs”… (email from David Edwards to BBC’s Steve Herrmann, 24/3/06)

Fast forward to the present to get a sense of Medialens’s hypocrisy. When it suits their position, it seems that Medialens favour “amateur” studies over peer-reviewed science. For example, in their recent article, Propping up Propaganda, they write that “the probable death toll exceeds one million”. Their sources for this? Just Foreign Policy and Opinion Research Business – both “amateur” studies (they’re not recognised in the scientific literature – see below for more details on the lack of scientific credentials for these studies*). And the sources that Medialens ignore (and which contradict their “one million” figure)? At least five peer-reviewed studies conducted by “world’s leading” experts such as Beth Osborne Daponte, Mohamed M. Ali, Mark van der Laan, etc. (More detail on these studies here).

Anonymous epidemiologists

Medialens didn’t rely on just one anonymous expert. They claimed that several “leading epidemiologists” supported their criticisms of IBC. But when asked for a list of names, they went silent. When asked a further time (on 30/4/06), they replied: “The experts have made an appearance – we have a list of names. We hope to have more on this before too long.” It’s now over two years later, and they still haven’t produced the list (they’ve been asked often enough that it’s turned into a running joke among Medialens’s critics).

(Iraq Body Count published a thorough response to Medialens’s campaign, which comments on the “amateurs” slur, among other things – eg see p47).

* “Amateurs”

The “amateur” operations that Medialens apparently now favours over peer-reviewed science:

Just Foreign Policy (JFP), ironically, uses IBC figures to “extrapolate” from the Lancet 2006 estimate to the present day. JFP personnel appear to have no qualifications in the relevant fields (eg biostatistics, epidemiology, demography). JFP’s Iraqi Death Estimate is very much an “amateur” operation in the sense that Medialens have used the term in this context.

Opinion Research Business (ORB) is a market research company which publicised the “over a million” estimate cited by Medialens. ORB’s Iraq poll wasn’t peer-reviewed science. It doesn’t appear to be recognised in the scientific literature. The person conducting ORB’s poll, Munqith Daghir, began his polling career in 2003, with little in the way of formal training or field experience (according to ORB’s publicity literature).
