Sick Capitalism
May 25, 2010. Posted by dissident93 in economics.
I watched Michael Moore’s brilliant film, Capitalism: A Love Story, last night. My impression: rich in information, with some startling facts. For example, a segment on US airline pilots informed me that:
- Many pilots are poorly paid (eg $17,600 gross).
- Some have second jobs to make ends meet (as well as being overworked as pilots).
- Some are on food stamps, or donate blood to earn extra cash.
The next segment showed US companies (Walmart, AT&T, Citibank, etc) benefiting from deaths of their employees by secretly taking out life insurance on them. This is called “Dead Peasants” insurance by the brokers. An attorney (Michael D. Myers) said the companies “want the employees to die in accordance with the policy projections”. Moore showed a document (from a broker) about a shortfall in deaths (“78% of expected mortality”) which stated that their clients “are acutely aware of this problem”. Who needs satire?
Missing link to long working hours
Moore didn’t link worker mortality with factors such as long working hours, work-related stress, perceived low status, etc. I guess some things have to be left to the viewer to piece together. There’s a useful, sourced list of studies linking worker mortality to long hours and stress at Media Hell.
In a sub-theme, Moore presented (some) Christians as opposing capitalism: a priest says it’s “radically evil”, a bishop blesses protesting workers, etc. Footage of hypnotism was shown to imply that “good Americans” have been hypnotised to believe that capitalism is compatible with Christian beliefs.
But Moore didn’t mention the historical links between Protestantism and capitalism, as documented by Max Weber and histories of the industrial revolution. He also didn’t specifically mention the masses of conservative Christians – eg Republican voters, supporters of the “free market” – not all of them well-off or supportive of Major Financial Criminals.
So, on one level, the Us-vs-Them theme didn’t work for me, although Moore was probably right to go with it (dry intellectual reasoning makes tedious films). Also, for me, it’s enough to witness the effects of the Financial Criminal System (and Moore shows these effects close-up – people losing their homes, etc) without having a Christian Authority assert that it’s evil. I mean, you can probably find just as many American Christians who will assert that communism, blow-jobs, abortions and welfare are evil (or at least immoral).
Bank Bailouts, USA & UK
The excellent section of the film on the $700bn bailout of the US Financial Criminal System implied a conspiracy – Congresswoman Marcy Kaptur described the US Congress’s about-face over the bailout as being arranged “almost like an intelligence operation”; Moore used the term “coup d’état”, etc.
We could do with someone like Michael Moore to give the UK bank bailout the satirical treatment. There are some dubious and confusing messages (or propaganda) in the UK media. Some media reports estimate that the bank bailout adds around a trillion pounds to the national debt, whereas others (notably from the BBC) claim that the bailout had “very little” effect on the debt.
An example of the latter is from BBC’s resident “number cruncher” (also FT’s “Undercover Economist”), Tim Harford. When asked by BBC PM presenter Eddie Mair, “how much of this deficit is due to the bailout of the banks”, he replied:
Well, the answer’s very little, really. The reason we’re in trouble is not because of the bank bailout being expensive but because the recession’s made us poorer. […] The cost of the bank bailouts is a sort of shadow debt, and the Office for National Statistics is still trying to do all the sums. We as taxpayers have bought lots of shares in banks, and we’ll make or lose money depending on what happens to their share price. In its recent budget, the government reckons that all the interventions in the financial system would cost just £6bn, based on share prices at the time. That’s a bit more than two hundred pounds per household. (PM, BBC Radio 4, 29/4/10 – my emphasis)
If the estimates of around a trillion pounds are correct, that’s equivalent to approximately £40,000 per household (there are around 25.5 million UK households). Presumably this is what Harford “sort of” refers to when he says “We as taxpayers have bought lots of shares in banks”. What I’d like to know is when I can cash in these shares I’ve apparently bought – £40K’s worth – preferably with interest.
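The per-household arithmetic is easy to sanity-check. A rough sketch, using only the figures quoted above (the ~£1 trillion media estimate, Harford’s £6bn, and ~25.5 million households):

```python
# Per-household arithmetic, using the figures quoted in the text above.
bailout_high_estimate = 1_000_000_000_000  # ~£1 trillion (higher media estimates)
harford_estimate = 6_000_000_000           # Harford's quoted £6bn figure
households = 25_500_000                    # approx. number of UK households

print(round(bailout_high_estimate / households))  # ~£39,000 – the "£40,000" cited
print(round(harford_estimate / households))       # ~£235 – "a bit more than two hundred pounds"
```

The two figures quoted in the media thus differ by a factor of well over a hundred per household, which is the gap at issue here.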
Harford repeated his claims in a later BBC Radio 4 programme (More or Less, 21/5/10). He puts the national debt at well over £2 trillion (around £90,000* per household) by the “end of this parliament”. He adds:
The bank bailouts are mostly not included and they also have an uncertain cost depending on future share prices. It might be just a couple of hundred pounds per household.
What’s a few hundred quid compared to £90,000? Harford’s message seems clear: the bank bailouts amount to virtually nothing; nothing to worry about.
* National debt figures (per household) from BBC’s Tim Harford:
£30,000 – Official National Debt (not including bank bailout)
£5,000 – PFIs (off-balance-sheet)
£30,000 – Public Sector Pensions (off-balance-sheet)
£25,000 – Further borrowing until end of parliament
£90,000 – Total
£200 approx – bank bailout
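Harford’s per-household components do at least add up as stated, and the bailout line is a vanishingly small share of the total – a trivial check:

```python
# Per-household national debt components quoted from Harford (in £)
components = {
    "official national debt (excl. bailout)": 30_000,
    "PFIs (off-balance-sheet)": 5_000,
    "public sector pensions (off-balance-sheet)": 30_000,
    "further borrowing until end of parliament": 25_000,
}
total = sum(components.values())
print(total)  # 90000

bailout = 200  # Harford's "couple of hundred pounds" per household
print(f"bailout as share of total: {bailout / total:.2%}")  # about 0.22%
```

On Harford’s own numbers, then, the bailout is presented as roughly a five-hundredth of the per-household debt.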
The “Passive Surveillance” myth
May 18, 2010. Posted by dissident93 in Iraq mortality.
Note: an extended version of this post has been published by the Comment Factory
Les Roberts, the epidemiologist (and former candidate for Congress), uses the term “passive surveillance” to describe media-based counts of war dead. The term has entered the Iraq war lexicon – commentators often compare survey estimates (eg Lancet 2006) to figures from so-called “passive surveillance” (eg Iraq Body Count).
But Roberts misuses the term, and the lexicon is poorer for it. Consider the Iraqi journalists hired by Reuters to get the facts by going out and talking to people. Are they more “passive” than survey teams or pollsters? (Also, as a reader of this blog points out, Iraq was the “world’s deadliest nation for journalists” for six years, 2003-2008. That’s a measure of actual fatalities and abductions, etc, not of “passive” sitting in an office – RS, 1/6/10).
What about the processing stage – is it more “passive” to process media-based data than it is to process survey results? Does it help if you type faster or do press-ups at regular intervals? In fact, the active/passive metaphor has little relevance here. A media-based count of war dead may be incomplete, an “undercount”; a survey estimate may be way off due to a bias in sampling, etc – such things have nothing to do with relative passivity/activity.
So what are the origins of the phrase “passive surveillance”, and why is it used in this context?
The term ‘passive surveillance’ seems to have originated in the medical literature to refer to data on medical ailments compiled by recording the number of people who present themselves to medical facilities for treatment. This is contrasted to ‘active surveillance’ methods by which data collectors proactively search the community and find ailing people. Applying the ‘passive surveillance’ term to conflict journalism is misleading since journalists actively seek out violent events, witnesses and informed sources in the field. (Note 44, Ethical and Data Integrity Problems in the second Lancet survey…, Defence and Peace Economics, Volume 21, Issue 1)
So, the term is misleading. Why use it? Well, if you’re trying to discredit media-based counts, it helps to use a word with derogatory connotations. Labelling something as “passive” is like saying “not good enough”, “should try harder”, etc. As an epidemiologist, Les Roberts can get away with using a phrase from the medical domain. The key to establishing a term is repetition, and that’s what Roberts has done.
Les Roberts’s latest attack on so-called “passive surveillance” makes some sweeping (and misinformed) statements. As a correspondent of mine points out:
1. Roberts writes: “Aside from the Human Security Report, whose conclusions are largely based on news media reports, a variety of other publications have been produced based on press reports, or worse, passive surveillance by governments involved in a war [5,6]”
This is an odd statement, as the Human Security Report’s main conclusions are not “based” on media reports. Roberts doesn’t even specify which of its conclusions he thinks are “based” on news reports. He’s equally vague about the “variety of other publications”, from which he mentions just two (in his footnotes) without specifying which of their findings, if any, he has a problem with, or why.
2. Roberts then writes: “This Journal has shown that news reports are in part a cultural construct. For example, the ratio of civilian to Coalition military deaths in Iraq reversed when comparing 11 US newspapers with three from the middle east.”
His wording here is somewhat misleading. It’s not the “ratio of civilian to Coalition military deaths” which “reversed”. If a newspaper reports an Iraqi death once, and a US death 10 times, the ratio of deaths reported is still 1-1, although the ratio of reports is 1-10. The latter is reversed – but that’s not a great illustration of the type of “cultural construct” that Roberts apparently has in mind – ie one which would justify his next statement:

“The dangers of drawing conclusions from passive surveillance processes are profound: they allow one to conclude mortality goes down in times of war making war more acceptable and they allow armies, like those invading Iraq, to manipulate the press to portray resistance fighters as the primary killers when population-wide data conclude the opposite [8,9]”
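The reports-vs-deaths distinction can be made concrete with a toy example (the numbers and victim labels are hypothetical, purely to illustrate the point):

```python
# Hypothetical coverage pattern: one Iraqi death reported once,
# one US death reported ten times by the same newspaper.
# Each tuple is (victim_id, nationality); a single death may generate many reports.
reports = [("iraqi_1", "iraqi")] + [("us_1", "us")] * 10

# Ratio of *reports*: 1 Iraqi to 10 US
iraqi_reports = sum(1 for _, nat in reports if nat == "iraqi")
us_reports = sum(1 for _, nat in reports if nat == "us")
print(iraqi_reports, us_reports)  # 1 10

# Ratio of *deaths reported*: each unique victim counted once -> 1 to 1
unique_victims = set(reports)
iraqi_deaths = sum(1 for _, nat in unique_victims if nat == "iraqi")
us_deaths = sum(1 for _, nat in unique_victims if nat == "us")
print(iraqi_deaths, us_deaths)  # 1 1
```

A count of deaths (which is what IBC-style databases compile) is unaffected by how many times each death is reported; only a count of reports would show the “reversal”.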
His latter claim is not only unsupported by anything in his article – it’s clearly refuted by, for example, a comparison between IBC and Roberts’s own 2004 Lancet Iraq survey. The Lancet 2004 estimate shows that 43% of violent deaths (for the whole country outside Falluja) were directly caused by US-led forces, compared to IBC’s 47% over the same period. (IBC analysis, p23-26)
3. Roberts makes the absurd, sweeping statement that “We should not tolerate publications of surveillance data where the sensitivity of that data cannot be shown.” As my correspondent points out, this is like saying that we shouldn’t “tolerate” police-recorded crime figures, or any kind of simple count, without some statistical interpretation (a “sensitivity analysis”). This is complete nonsense, of course. Roberts should be asking himself whether we should “tolerate” statistical estimates from surveys which don’t provide the nitty-gritty facts on how the claimed random sampling was achieved.
Surely Action Heroes would not tolerate statistical constructs insufficiently supported by hard facts?
ORB Iraq poll criticised
May 3, 2010. Posted by dissident93 in Iraq mortality.
Opinion Research Business (ORB) is a “corporate and issues-led market research” firm which received much publicity in 2007 when it estimated that over a million Iraqis had been murdered as a result of the Iraq war.
A poll on the complex issue of war-related deaths was untypical of ORB’s work (they’re opinion pollsters, not, say, epidemiologists). More typical for them is the following type of “finding”:
Latest poll by ORB on behalf of BBC Newsnight reveals more people believe David Cameron and the Conservative Party will “make the right cuts in public spending” than Gordon Brown and Labour. (ORB press release)
ORB’s Iraq poll wasn’t peer-reviewed science (it was published nowhere but on the ORB website), and the person conducting it, Munqith Daghir, had little formal training or field experience (according to ORB’s publicity literature). Nevertheless, their figure of over 1,000,000 deaths was widely quoted as a serious estimate.
A new peer-reviewed paper (published in Survey Research Methods, Vol. 4, No. 1) details systematic errors (eg non-coverage and measurement errors) in the ORB Iraq poll. It shows that in four governorates in central Iraq (which account for more than 80% of ORB’s estimated one million deaths) a higher percentage of respondents report deaths of household members than, in an earlier ORB poll, reported deaths of extended family members. This cannot be seen as credible, since extended families are much larger than households.
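Why this pattern isn’t credible is easy to formalise. A sketch with made-up numbers (the per-person risk and group sizes below are illustrative assumptions, not figures from the paper): if every individual faces the same risk of having died, a larger group is strictly more likely to contain at least one death.

```python
def prob_at_least_one_death(p, group_size):
    """P(a group reports >= 1 death), assuming an independent risk p per member."""
    return 1 - (1 - p) ** group_size

p = 0.02  # hypothetical per-person probability of having died
household_size = 6        # hypothetical average household
extended_family_size = 25 # hypothetical average extended family

print(prob_at_least_one_death(p, household_size))        # ~0.11
print(prob_at_least_one_death(p, extended_family_size))  # ~0.40

# The reporting rate should therefore be *higher* for extended families.
# ORB's data show the reverse, which is the paper's point.
```

However the per-person risk is set, the function is monotonically increasing in group size, so a higher reporting rate for (smaller) households than for (larger) extended families is the wrong way round.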
There’s a response to this paper from ORB’s Johnny Heald (followed by a response [p2] to Heald by the paper’s authors). Heald doesn’t address the paper’s substantial criticisms. Instead, he adopts a rather defensive position with regard to ORB’s aims (Heald’s own emphasis):
The survey was only an estimate and the fundamental point of this and every other investigation into this subject remains the same i.e. there has been a very significant human cost associated with the conflict. […] Our findings were an estimate based on a survey – or opinion poll if it makes it clearer. We have repeatedly stressed that our work does only offer an estimate; again the key point is not whether we (or others) are 100% accurate…
Of course, nobody claims that it was anything but an “estimate” (or that “100% accuracy” was required). And few people disputed the “very significant human cost associated with the conflict” (in fact most of my friends and colleagues were calling it a “bloodbath” way back in 2003, based on Iraq Body Count’s figures). The point – which Heald doesn’t address – was whether ORB’s specific estimate of 1,033,000 dead can be considered remotely reliable, given the credibility problems with their data.
Heald then attacks the paper’s authors, asserting that they “had little intention of taking an objective view but are merely pursuing an agenda”. He then claims a “conflict of interest”, because one of the paper’s authors (Professor Michael Spagat) is, he says, “very closely linked to Iraq Body Count (IBC)”. Heald apparently overlooks the fact that the paper’s other author, Josh Dougherty, is a member of IBC, and that this is clearly indicated at the top of the paper – presumably the paper’s referees didn’t regard this as a “conflict of interest”!*
Perhaps if Heald spent more time studying how good scientific method works in practice, rather than reading conspiriological smear campaigns issued by “media criticism” websites (I speculate here, of course), he might have a better understanding of what “conflict of interest” means in the context of published research. I direct him to a paper criticising IBC, co-authored by Les Roberts (who was, you know, “closely linked” to the Lancet Iraq studies). How’s that for a “conflict of interest”?
It’s a pity that Heald resorts to ad hominem rather than addressing the points of substance raised by the paper. Quantifying war dead is a serious business. If you publish an estimate of 1,000,000 war-related deaths, based on a claim of a “nationally representative sample” (a difficult challenge in Iraq) then you should provide sufficient detail about the sampling methodology for your claim to be assessed. As the above paper states, “ORB’s parsimony with information about its methodology is an indicator of low survey quality and weakens confidence in its estimate.”
Low survey quality might not matter so much if you’re making trivial claims (for frivolous news reports) – eg “more people believe David Cameron than believe Gordon Brown”. But it surely matters if you’re publishing a figure of a million deaths resulting from a war, and this figure contradicts, by a large amount, the estimates of several other studies.
* Heald was the Conservative Party’s private pollster in the 2005 General Election (according to his ORB profile). If I had a “link” like that, I wouldn’t go around making vague, silly accusations of “conflict of interest”.
Related news: Prof Spagat’s long paper on ethical and data-integrity problems in the Lancet Iraq study (2006) has been published in Defence and Peace Economics. (Another of his papers, with Neil Johnson et al, made the cover of Nature journal recently. Perhaps Nature is part of the conspiracy? After all, it too was critical of the Lancet study). The editor of this journal notes that “The authors of the Lancet II Study were given the opportunity to reply to this article. No reply has been forthcoming”. That’s a shame, given the seriousness of Spagat’s criticisms.
• The 2006 Lancet Iraq study has been awarded the “STONEWALLING & COVERUP” Award in the 2010 Top Ten “Dubious Polling” Awards from StinkyJournalism.org (a Media Ethics Project of the Art Science Research Laboratory). I checked the background of the two people who decided the awards (George F. Bishop and David W. Moore), to make sure they’re not “closely linked” to anything suspiciously like a “conflict of interest”. They passed the test – they’re not rightwing warmongers or anything.