Lancet publishes IBC study (September 4, 2011). Posted by dissident93 in Iraq mortality.
The Lancet journal has published an IBC-based paper co-authored by IBC’s founders: Casualties in civilians and coalition soldiers from suicide bombings in Iraq, 2003–10…
(Reading the full text requires free registration at the Lancet website. If you don’t want the hassle, but would like more info on the study, see Iraq Body Count’s summary).
Sloppy work from Les Roberts? (March 9, 2011). Posted by dissident93 in Iraq mortality.
In a new article, Les Roberts, the epidemiologist (and former candidate for Congress), claims that “Only 19% of the WikiLeaks reports of civilian deaths had been previously recorded by IBC [Iraq Body Count].” Roberts’s conclusion is summed up by his article’s title: ‘WikiLeaks Analysis Suggests Hundreds of Thousands of Unrecorded Iraqi Deaths’.
Two things immediately stand out:
1) The “Columbia University” study which Les Roberts cites, and on which he bases his conclusions, appears to have been conducted by Roberts himself (with the help of students), although the paper (misleadingly titled ‘Do WikiLeaks and Iraq Body Count tell the same story? No!’) is unclear about its own authorship. Roberts’s article nowhere makes clear that this is his own study.
2) Roberts fails to mention that IBC worked with Wikileaks on an analysis which shows a much greater overlap between the Wikileaks and IBC data (an estimated 81% of the deaths in the Wikileaks logs matching those in IBC’s database).
What explains the difference (19% overlap claimed by Roberts*, 81% deaths overlap claimed by IBC/Wikileaks)? There are a few clues here and here which point to serious problems (if not incompetence, or worse) in Les Roberts’s study.** This episode puts me in mind of an earlier study from Roberts and his students (again designed to discredit IBC), about which Roberts made some misleading pre-publication statements (see second section at link).
* The wording of Les Roberts’s article will likely make people think he’s claiming that IBC recorded only 19% of the deaths in the Wikileaks logs. In fact his study claims a 31.6% overlap of deaths (“If this simple analysis of estimating the civilian death toll based on our estimate that the Iraq War Logs and IBC have a 31.6% overlap, the death estimate would be 324,000” – page 7). This 31.6% isn’t mentioned in Roberts’s article. The 19% figure mentioned in his article refers to the percentage of “Wikileaks reports of civilian deaths” (ie logs) “previously recorded” by IBC. The deaths overlap comparison should be: 31.6% (Roberts’s claim) vs 81.2% (IBC/Wikileaks claim).
** Since I wrote the blog entry, more has come to light regarding problems with Roberts’s study. These problems include systematic failure to match Wikileaks records to IBC data, where such matches do in fact exist, and failure to account properly for morgue and other aggregate entries in IBC’s data. Please see the discussion here for details.
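The logs-versus-deaths distinction in the first footnote can be made concrete with a toy calculation (every number below is invented purely for illustration; none comes from either study). If the large incidents in a set of logs are matched to a database but the small ones aren’t, the share of logs matched can be far lower than the share of deaths matched:

```python
# Invented toy data: each log entry is (deaths in incident, matched in database?).
# Two large matched incidents plus six small unmatched ones.
logs = [(100, True), (50, True)] + [(1, False)] * 6

matched_logs = sum(1 for deaths, matched in logs if matched)
share_of_logs = matched_logs / len(logs)

total_deaths = sum(deaths for deaths, _ in logs)
matched_deaths = sum(deaths for deaths, matched in logs if matched)
share_of_deaths = matched_deaths / total_deaths

print(f"logs matched:   {share_of_logs:.0%}")    # 25% of log entries
print(f"deaths matched: {share_of_deaths:.0%}")  # ~96% of deaths
```

The same data thus yields both a low “overlap of logs” and a high “overlap of deaths”, which is why the 19% and 81% figures quoted above are not directly comparable.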
BBC News publishes my response to Les Roberts (October 28, 2010). Posted by dissident93 in Iraq mortality.
After a prompt from an advocacy group, BBC’s Paul Reynolds added comments from epidemiologist Les Roberts to a BBC News piece on the WikiLeaks Iraq war logs. I complained to Paul that Roberts was using the space to reheat his old attacks on Iraq Body Count (IBC), and that his comments (with one exception) had little relevance to the subject matter of the article.
I also pointed out (with examples) that most of Roberts’s claims were either misinformed, unsubstantiated or simply false. Paul, to his credit, immediately published my comments on the BBC News page. But what he published was an edited version of comments which I’d already whittled down (to get the basic points across to someone who wasn’t already familiar with the studies/background). Here is the original version of my comments:
[Les Roberts] “A) It is likely that the IBC and Wikileaks reports tend to have the same lens (many reports coming from the Government of Iraq, oversampling of Baghdad, oversampling of the largest events and missing single killings).”
Robert Shone: On the previously unknown 15,000 deaths, IBC point out that: “The majority of these new deaths come from small incidents of one to three deaths”. If the WikiLeaks report had the same “lens” as IBC on size of incidents, as claimed by Roberts, this wouldn’t be the case.
[Les Roberts] “B) We have shown that most violent deaths in the press over the first 4 years of the war are not in IBC because they (cautiously) required multiple press reports and (unavoidably) used a few search terms that did not capture all events. (see attached)”
RS: Roberts hasn’t “shown” this anywhere (unless the “attached” which Roberts mentions contains something new and previously unpublished – what was this “attached”?). I suspect he’s making similar unfounded claims to the ones he makes in C), which I’ve already shown is false.
[Les Roberts] “C) In Baghdad where we believe the press coverage was by far the best, we showed that most violent deaths reported in phone and Skype interviews were not in IBC.”
RS: That’s false. Roberts’s study found that “38%” of deaths “were absolutely not in the [IBC] database”. The majority (62%) were included by IBC or can’t be ruled out as not included by Roberts’s Mickey Mouse study (which used a truly massive, comprehensive sample of only “18 primary interviewees”! – see my blog post for more details). The only way Roberts can make the above (false) claim is by ignoring the records in IBC’s database which came from morgues, etc (and which therefore Roberts couldn’t match, due to lack of detail). It’s very dishonest. Les Roberts’s study: http://pdm.medicine.wisc.edu/Volume_23/issue_4/siegler.pdf (page 3)
[Les Roberts] “D) There are just so many things that are not consistent with 120,000 deaths! The ORB 11/07 and BBC (see: http://news.bbc.co.uk/1/shared/bsp/hi/pdfs/19_03_07_iraqpollnew.pdf ) polls that are completely at odds with the IBC implication that 1 in 20 or 1 in 25 Iraqi households have lost someone to violence. The ORB implication that 1 in 4 households have lost someone matches all the ground reports I hear. You cannot have the Iraqi Ambassador reporting half a million new war widows or UNICEF speculating that there are a million orphans if there are 120,000 war deaths.”
RS: The comment about widows is completely bogus, as around 490,000 Iraqi women would be widowed over a seven-year period regardless of war (making a pro-rata population comparison with the rate of widowhood in the US, for example – an over-simplistic comparison, to be sure, but it underlines the basic point that things other than war create widows). This issue has been debated at length, and the conclusion among the informed seems to be that there’s currently no way of knowing how many of the widows are due to the recent conflict, how many to previous conflicts (the 1980s and 90s), and how many to other factors. As for the rest of Roberts’s comment, this is where some real “balance” would come in useful. For example, the ORB study has recently been convincingly demolished in a peer-reviewed study published in Survey Research Methods (Vol. 4, No. 1). Even Les Roberts’s colleague, Francesco Checchi, thinks there are serious problems with it (he was quoted as saying so in a recent BBC report*). So it might be better to use a more credible study, such as IFHS, as a comparison. So much for “balance”.
*Checchi said the ORB figure was “implausible”, that it had a “major weakness” (echoing the Survey Research Methods study findings). He added that the Iraq death count was “likely to be between 200,000 and 500,000”. (BBC World Service, 27 Aug 2010)
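The pro-rata widowhood comparison above can be sketched in a few lines (every input below is an assumption chosen for illustration, not a sourced figure; only the order of magnitude matters):

```python
# All inputs are rough assumptions, for illustration only.
new_widows_per_year_us = 800_000      # assumed annual new widows in the US
us_population = 300_000_000           # assumed US population
iraq_population = 29_000_000          # assumed Iraq population
years = 7

# Apply the assumed peacetime per-capita widowhood rate to Iraq, pro rata.
rate = new_widows_per_year_us / us_population
baseline_widows = rate * iraq_population * years
print(f"{baseline_widows:,.0f}")  # on the order of half a million
```

On these assumptions the peacetime baseline alone is in the region of half a million, which is why a large widow count cannot, by itself, be attributed to the war.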
Published 8.43am October 28 2010
Media relies on WikiLeaks (October 24, 2010). Posted by dissident93 in Iraq mortality.
“…the deaths of some of 109,000 people are documented
including 66,000 civilians… Working with Iraq Body Count,
we have seen there are approximately 15,000 never previously
documented cases of civilians who have been killed…”
– Julian Assange (WikiLeaks)
It’s been widely reported (particularly by The Guardian), so it would be redundant to repeat the details here. Apparently the media (in the UK at least) is willing to report this kind of thing, even if it doesn’t do the investigative or analytical work (in this case it relied on WikiLeaks and Iraq Body Count).
Here’s some BBC coverage of the WikiLeaks Iraq War Log press conference, with Julian Assange and IBC’s John Sloboda:
“We have summarised and released over 35,000 pages of records. We currently have one more set of documents we are working to summarise.” — Nasrina Bargzie (attorney, for ACLU)
Perhaps this is how Full Employment can be achieved: declare everything of international importance secret. The amount of work required to request its release (via the Freedom of Information Act), to demand it legally (when denied), and then to unravel, interpret and publish it (in an understandable form) would surely keep the world’s “unemployed” in work for decades.
So, over 35,000 pages of internal US government documents on civilian casualties in Iraq and Afghanistan – released following ACLU-initiated Freedom of Information Act requests, lawsuit and lengthy negotiations. Records which should always have been in the public domain. The latest batch of records was released by ACLU earlier this year. IBC has been working to integrate the records on Iraqi civilian deaths into its database since the first release in 2007.
A separate FOIA request by Professor Michael Spagat (of Royal Holloway University) led to the release of another set of data on civilian casualties (Basra police records held by the UK Ministry of Defence). This has also been integrated into the IBC database. (There’s a misconception in some circles that IBC excludes casualties which aren’t reported in “Western” media. In fact, IBC has used data from NGOs, Iraqi hospitals and morgues, records obtained from UK and US governments using FOIA requests – and non-“Western” media).
Unrelated: I’ve written a new piece for The Comment Factory: Counterproductive antiwar arguments.
Patrick Ball rubbishes Lancet study (September 19, 2010). Posted by dissident93 in Iraq mortality.
Patrick Ball is one of the “experts” whose work (on Guatemala) has often been cited in support of the Lancet 2006 study on Iraqi deaths (even though it has little relevance to the situation in Iraq). He was even quoted in some PR for the Lancet study.
First, I want to be clear that I have no interest in defending the Burnham et al. [Lancet 2006] estimates. The flaws in that study are now well known. (Patrick Ball, 28/4/10)
(Incidentally, I recommend the comments thread from which this is taken. Patrick Ball is critical of Iraq Body Count as well as the Lancet 2006 study, and his research on Guatemala is often cited by critics of IBC. But that research is very limited: it’s based on small samples of press reporting specific to Guatemala – a handful of newspapers only, with no news wires or anything comparable to IBC’s coverage, or to the situation in Iraq. I think the comment from Josh Dougherty completely demolishes Patrick Ball’s over-generalisations about media coverage in Iraq, and media bias generally).
Note: the title of this blog entry uses the term “rubbishes” in a special sense. I’ve adopted it from my friends at Medialens, who use it to characterise any criticism, substantial or trivial, real, implied or imagined, of the Lancet/ORB Iraq studies. (Recent example: a Medialens follower wrote that a piece I’d written “rubbished” the Lancet/ORB studies, when I’d merely listed peer-reviewed studies critical of them.)
IBC vs Chilcot (August 28, 2010). Posted by dissident93 in Iraq mortality.
Note: a longer and slightly differently worded version of this piece has been published by The Comment Factory.
Iraq Body Count (IBC) has successfully drawn media attention to the failure of the Chilcot Iraq Inquiry to take account of Iraqi casualties: “most of the attention has remained firmly fixed, fixated even, upon the interplay between political and military actors here and in the USA…”. I suspect that IBC’s strongly made point will cause discomfort in certain media and political circles, to which it applies just as much (as anyone who has witnessed the typical level of discussion of this topic among politicians in the respectable UK “news” will appreciate).
Putting on my Chomsky hat, I find it remarkable that so much media coverage has been given to this particular criticism of Chilcot. It creates a new story in its own right (whereas one expects such criticism to be added perhaps only as a sidebar to an existing story, if it’s mentioned at all). A major Inquiry into the war fails to address the central issue of the war (the vast amount of bloodshed and suffering it led to) – that’s surely not a topic fit for the “news” organs of Establishment Power. And yet there was a lot of coverage (due, I think, to the praiseworthy efforts of IBC in presenting its case so effectively). Here’s a partial list of media coverage:
So, for once, we have a prominent news story about “official” failure to take account of the blood spilt in Iraq. Is it important that there’s media coverage focusing on this very point? As a relativist in such matters, I can only compare its importance to other things. For example, my favourite amateur-Chomskyite website once declared it “very important” for George Monbiot to namecheck their site in his Guardian column – as if the mere mention of it would automatically have beneficial effects on humanity. If namechecking a website in the Guardian counts as “very important” on some media-activism scale, then, by comparison, I would classify IBC’s accomplishment in creating the above news story as of almost cosmic importance. (IBC will get the joke – they have a sense of humour and perspective. The chaps who run the Chomskyite website probably won’t get it – they have very little humour/perspective).
The “Passive Surveillance” myth (May 18, 2010). Posted by dissident93 in Iraq mortality.
Note: an extended version of this post has been published by the Comment Factory
Les Roberts, the epidemiologist (and former candidate for Congress), uses the term “passive surveillance” to describe media-based counts of war dead. The term has entered the Iraq war lexicon – commentators often compare survey estimates (eg Lancet 2006) to figures from so-called “passive surveillance” (eg Iraq Body Count).
But Roberts misuses the term, and the lexicon is poorer for it. Consider the Iraqi journalists hired by Reuters to get the facts by going out and talking to people. Are they more “passive” than survey teams or pollsters? (Also, as a reader of this blog points out, Iraq was the “world’s deadliest nation for journalists” for six years, 2003-2008. That’s a measure of actual fatalities and abductions, etc, not of “passive” sitting in an office – RS, 1/6/10).
What about the processing stage – is it more “passive” to process media-based data than it is to process survey results? Does it help if you type faster or do press-ups at regular intervals? In fact, the active/passive metaphor has little relevance here. A media-based count of war dead may be incomplete, an “undercount”; a survey estimate may be way off due to a bias in sampling, etc – such things have nothing to do with relative passivity/activity.
So what are the origins of the phrase “passive surveillance”, and why is it used in this context?
The term ‘passive surveillance’ seems to have originated in the medical literature to refer to data on medical ailments compiled by recording the number of people who present themselves to medical facilities for treatment. This is contrasted to ‘active surveillance’ methods by which data collectors proactively search the community and find ailing people. Applying the ‘passive surveillance’ term to conflict journalism is misleading since journalists actively seek out violent events, witnesses and informed sources in the field. (Note 44, Ethical and Data Integrity Problems in the second Lancet survey…, Defence and Peace Economics, Volume 21, Issue 1)
So, the term is misleading. Why use it? Well, if you’re trying to discredit media-based counts, it helps to use a word with derogatory connotations. Labelling something as “passive” is like saying “not good enough”, “should try harder”, etc. As an epidemiologist, Les Roberts can get away with using a phrase from the medical domain. The key to establishing a term is repetition, and that’s what Roberts has done.
Les Roberts’s latest attack on so-called “passive surveillance” makes some sweeping (and misinformed) statements. As a correspondent of mine points out:
1. Roberts writes: “Aside from the Human Security Report, whose conclusions are largely based on news media reports, a variety of other publications have been produced based on press reports, or worse, passive surveillance by governments involved in a war [5,6]”
This is an odd statement, as the Human Security Report’s main conclusions are not “based” on media reports. Roberts doesn’t even specify which of its conclusions he thinks are “based” on news reports. He’s equally vague about the “variety of other publications”, from which he mentions just two (in his footnotes) without specifying which of their findings, if any, he has a problem with, or why.
2. Roberts then writes: “This Journal has shown that news reports are in part a cultural construct. For example, the ratio of civilian to Coalition military deaths in Iraq reversed when comparing 11 US newspapers with three from the middle east.”
His wording here is somewhat misleading. It’s not “the ratio of civilian to Coalition military deaths” which “reversed”. If a newspaper reports an Iraqi death once, and a US death 10 times, the ratio of deaths reported is still 1-1, although the ratio of reports is 1-10. The latter is what reversed – but that’s not a great illustration of the kind of “cultural construct” Roberts apparently has in mind – ie one which would justify his next statement: “The dangers of drawing conclusions from passive surveillance processes are profound: they allow one to conclude mortality goes down in times of war making war more acceptable and they allow armies, like those invading Iraq, to manipulate the press to portray resistance fighters as the primary killers when population-wide data conclude the opposite [8,9]”
His latter claim is not only unsupported by anything in his article – it’s clearly refuted by, for example, a comparison between IBC and Roberts’s own 2004 Lancet Iraq survey. The Lancet 2004 estimate shows that 43% of violent deaths (for the whole country outside Falluja) were directly caused by US-led forces, compared to IBC’s 47% over the same period. (IBC analysis, p23-26)
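The reports-versus-deaths distinction in point 2 can be illustrated with a trivial example (invented figures): the same coverage pattern yields a 10:1 ratio of reports but a 1:1 ratio of deaths, depending on whether you count mentions or unique incidents.

```python
# Invented example: one US death covered by 10 stories,
# one Iraqi death covered by 1 story.
# Each report is (nationality, incident_id).
reports = [("US", "incident-A")] * 10 + [("Iraqi", "incident-B")]

# Ratio of *reports*: count every mention.
us_reports = sum(1 for nat, _ in reports if nat == "US")
iraqi_reports = sum(1 for nat, _ in reports if nat == "Iraqi")

# Ratio of *deaths reported*: count unique incidents, not mentions.
us_deaths = len({inc for nat, inc in reports if nat == "US"})
iraqi_deaths = len({inc for nat, inc in reports if nat == "Iraqi"})

print(f"reports: {us_reports}:{iraqi_reports}")  # 10:1
print(f"deaths:  {us_deaths}:{iraqi_deaths}")    # 1:1
```

A death-counting project deduplicates in the second way, which is why a skewed ratio of reports need not produce a skewed count of deaths.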
3. Roberts makes the absurd, sweeping statement that “We should not tolerate publications of surveillance data where the sensitivity of that data cannot be shown.” As my correspondent points out, this is like saying that we shouldn’t “tolerate” police-recorded crime figures, or any kind of simple count, without some statistical interpretation (a “sensitivity analysis”). This is complete nonsense, of course. Roberts should be asking himself whether we should “tolerate” statistical estimates from surveys which don’t provide the nitty-gritty facts on how the claimed random sampling was achieved.
Surely Action Heroes would not tolerate statistical constructs insufficiently supported by hard facts?
ORB Iraq poll criticised (May 3, 2010). Posted by dissident93 in Iraq mortality.
Opinion Research Business (ORB) is a “corporate and issues-led market research” firm which received much publicity in 2007 when it estimated that over a million Iraqis had been murdered as a result of the Iraq war.
A poll on the complex issue of war-related deaths was untypical of ORB’s work (they’re opinion pollsters, not epidemiologists). More typical of their output is the following kind of “finding”:
Latest poll by ORB on behalf of BBC Newsnight reveals more people believe David Cameron and the Conservative Party will “make the right cuts in public spending” than Gordon Brown and Labour. (ORB press release)
ORB’s Iraq poll wasn’t peer-reviewed science (it was published nowhere but on the ORB website), and the person conducting it, Munqith Daghir, had little formal training or field experience (according to ORB’s publicity literature). Nevertheless, their figure of over 1,000,000 deaths was widely quoted as a serious estimate.
A new peer-reviewed paper (published by Survey Research Methods Vol. 4, No. 1) details systematic errors (eg non-coverage and measurement errors) in the ORB Iraq poll. It shows that in four governorates in central Iraq (which account for more than 80% of ORB’s estimated one million deaths) a higher percentage of respondents report deaths of household members than, in an earlier ORB poll, reported deaths of extended family members. This cannot be seen as credible, since extended families are much larger than households.
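The consistency check behind this criticism can be sketched with a simple probability model (the group sizes and death rate below are assumptions, chosen only to illustrate the logic): since an extended family contains the household, the chance that it includes at least one death can only be higher.

```python
# Assumed figures, purely illustrative.
p = 0.01          # assumed per-person probability of a conflict death
household = 7     # assumed household size
extended = 25     # assumed extended-family size (a superset of the household)

# Probability that a group of size n contains at least one death,
# treating deaths as independent across members.
def at_least_one(n):
    return 1 - (1 - p) ** n

print(f"household: {at_least_one(household):.1%}")       # ~6.8%
print(f"extended family: {at_least_one(extended):.1%}")  # ~22.2%
```

A survey in which households report deaths more often than extended families do is therefore internally inconsistent, whatever the true death toll.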
There’s a response to this paper from ORB’s Johnny Heald (followed by a response [p2] to Heald by the paper’s authors). Heald doesn’t address the paper’s substantial criticisms. Instead, he adopts a rather defensive position with regard to ORB’s aims (Heald’s own emphasis):
The survey was only an estimate and the fundamental point of this and every other investigation into this subject remains the same i.e. there has been a very significant human cost associated with the conflict. […] Our findings were an estimate based on a survey – or opinion poll if it makes it clearer. We have repeatedly stressed that our work does only offer an estimate; again the key point is not whether we (or others) are 100% accurate…
Of course, nobody claims that it was anything but an “estimate” (or that “100% accuracy” was required). And few people disputed the “very significant human cost associated with the conflict” (in fact most of my friends and colleagues were calling it a “bloodbath” way back in 2003, based on Iraq Body Count’s figures). The point – which Heald doesn’t address – was whether ORB’s specific estimate of 1,033,000 dead can be considered remotely reliable, given the credibility problems with their data.
Heald then attacks the paper’s authors, asserting that they “had little intention of taking an objective view but are merely pursuing an agenda”. He then claims a “conflict of interest”, because one of the paper’s authors (Professor Michael Spagat) is, he says, “very closely linked to Iraq Body Count (IBC)”. Heald apparently overlooks the fact that the paper’s other author, Josh Dougherty, is a member of IBC, and that this is clearly indicated at the top of the paper – presumably the paper’s referees didn’t regard this as a “conflict of interest”!*
Perhaps if Heald spent more time studying how good scientific method works in practice, rather than reading conspiriological smear campaigns issued by “media criticism” websites (I speculate here, of course), he might have a better understanding of what “conflict of interest” means in the context of published research. I direct him to a paper criticising IBC, co-authored by Les Roberts (who was, you know, “closely linked” to the Lancet Iraq studies). How’s that for a “conflict of interest”?
It’s a pity that Heald resorts to ad hominem rather than addressing the points of substance raised by the paper. Quantifying war dead is a serious business. If you publish an estimate of 1,000,000 war-related deaths, based on a claim of a “nationally representative sample” (a difficult challenge in Iraq) then you should provide sufficient detail about the sampling methodology for your claim to be assessed. As the above paper states, “ORB’s parsimony with information about its methodology is an indicator of low survey quality and weakens confidence in its estimate.”
Low survey quality might not matter so much if you’re making trivial claims (for frivolous news reports) – eg “more people believe David Cameron than believe Gordon Brown”. But it surely matters if you’re publishing a figure of a million deaths resulting from a war, and this figure contradicts, by a large amount, the estimates of several other studies.
* Heald was the Conservative Party’s private pollster in the 2005 General Election (according to his ORB profile). If I had a “link” like that, I wouldn’t go around making vague, silly accusations of “conflict of interest”.
Related news: Prof Spagat’s long paper on ethical and data-integrity problems in the Lancet Iraq study (2006) has been published in Defence and Peace Economics. (Another of his papers, with Neil Johnson et al, made the cover of Nature journal recently. Perhaps Nature is part of the conspiracy? After all, it too was critical of the Lancet study). The editor of this journal notes that “The authors of the Lancet II Study were given the opportunity to reply to this article. No reply has been forthcoming”. That’s a shame, given the seriousness of Spagat’s criticisms.
• The 2006 Lancet Iraq study has been awarded the “STONEWALLING & COVERUP” Award in the 2010 Top Ten “Dubious Polling” Awards from StinkyJournalism.org (a Media Ethics Project of the Art Science Research Laboratory). I checked the background of the two people who decided the awards (George F. Bishop and David W. Moore), to make sure they’re not “closely linked” to anything suspiciously like a “conflict of interest”. They passed the test – they’re not rightwing warmongers or anything.
Nature cover story (December 23, 2009). Posted by dissident93 in Iraq mortality.
A research paper by Neil Johnson, Michael Spagat, et al, provides the cover story for the latest issue of Nature. The blurb from Nature which accompanies the cover says: “Many seemingly random or chaotic human activities have been found to exhibit universal statistical patterns. Neil Johnson and colleagues use detailed data sets from conflicts, including those in Afghanistan, Iraq and Colombia, to show that insurgent wars fit into this category, sharing common patterns with each other and also with global terrorism.”
The paper’s authors note that their model’s similarity to financial market models “provides a surprising link between violent and non-violent forms of human behaviour”.
The datasets they used for the Iraq conflict were from Iraq Body Count, icasualties.org and ITERATE. As readers of this blog will be aware, Johnson and Spagat (with colleagues Gourley, Onnela and Reinert) produced the “main street bias” research, which was critical of the 2006 Lancet survey of Iraq deaths (and which won the Journal of Peace Research Article of the Year Award. See my earlier blog entry on main street bias).