Peter Ould writes: The recent BBC-commissioned poll on belief in the Resurrection of Jesus has attracted a lot of media attention. Gavin Ashenden, the former Queen’s Chaplain, pointed out in a letter to the Times (and a subsequent news piece) that belief in the resurrection is a core Christian doctrine. If you don’t believe in the resurrection, how can you call yourself a Christian?
On the back of this I thought it would be interesting to look at the ComRes poll and dissect what the “Christians” in the survey actually look like. The poll includes, very helpfully, a measure of how often someone attends church (once a week, up to 3 times a month, etc.), and this allows them to categorise some of the Christians as “active” on the basis of their attendance. This is in sharp contrast to YouGov, who do not use this stratification and consequently report as “Christian” people who never engage in any public form of Christian worship.
I took the ComRes poll and produced a new column for “inactive Christians” by subtracting the active Christians from the total Christian sub-sample. I then put the non-Christians and the no-religion group together. This gives us three groups to look at: active Christians, non-active Christians, and non-Christians. I then calculated the margin of error on each of these sub-samples and looked at the confidence intervals for my three groups on each of the key questions.
We look at confidence intervals because when we use samples in opinion polls, the percentage figures we report are only estimates of the true population position. Let me use a political example to explain. Say ComRes did an election opinion poll of 1,000 people, asked them who they would vote for, and 40% said “the Peter Party”. Because there are just over 64 million people in the UK, even if we make sure that our 1,000 respondents are a fair representation of the population, we can’t guarantee getting the true result for all 64 million people. We use margins of error to create a confidence interval, which shows us the most likely range of the true figure of support in the total population. So although our sample of 1,000 people says that 40% would vote for yours truly, the margin of error on a sample of 1,000 from a population of 64 million is around 3%, so our 95% confidence interval for the true answer is 37% to 43%. To put it another way, based on asking 1,000 people how they would vote, we think the level of support for the Peter Party is 95% likely to be between 37% and 43%.
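For the curious, that "around 3%" figure is not magic: it falls out of the standard normal-approximation formula for the margin of error on a sample proportion. Here is a minimal sketch of the arithmetic (the function name is mine, not anything from the poll):

```python
import math

def margin_of_error(p, n, z=1.96):
    """Half-width of a 95% confidence interval for a sample proportion,
    using the normal approximation: z * sqrt(p * (1 - p) / n).
    z = 1.96 corresponds to 95% confidence."""
    return z * math.sqrt(p * (1 - p) / n)

p, n = 0.40, 1000  # 40% support in a sample of 1,000
moe = margin_of_error(p, n)
print(f"margin of error: {moe:.1%}")              # ~3%
print(f"95% CI: {p - moe:.1%} to {p + moe:.1%}")  # ~37% to ~43%
```

Note that for a fixed sample size the margin is largest at p = 0.5 (about 3.1% for n = 1,000), which is why pollsters often quote a single worst-case figure of "plus or minus 3%" for the whole poll.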
Now all we have to do is add one more piece of statistical insight and we can look at the ComRes/BBC poll. The final piece of the puzzle is to recognise that when two different observations have confidence intervals that overlap, we say there is no statistically significant difference between them. Let’s use an example to explain. In the same pretend poll that gave a 40% figure for the Peter Party, we found a 35% level of support for the Ian Party. That result comes from the same 1,000 people, so the 35% figure also has a margin of error of around 3%, leaving the 95% confidence interval for the Ian Party as 32% to 38%. Even though the Ian Party polled 5 percentage points lower than the Peter Party, because the top of the Ian Party’s confidence interval (38%) is higher than the bottom of the Peter Party’s (37%), we cannot actually say that the Peter Party has a statistically significant lead. If, on the other hand, the Ian Party had polled 33%, its confidence interval (30% to 36%) would fall entirely below the Peter Party’s (37% to 43%), and so we would be able to say that the Peter Party has a statistically significant lead.
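The overlap rule above is easy to mechanise. A small sketch, assuming the same normal-approximation intervals as before (the exact p-dependent margins below differ by a fraction of a point from the rounded 3% used in the text, but they give the same verdicts):

```python
import math

def ci(p, n, z=1.96):
    """95% confidence interval for a sample proportion (normal approximation)."""
    moe = z * math.sqrt(p * (1 - p) / n)
    return p - moe, p + moe

def overlaps(ci_a, ci_b):
    """True if the two intervals overlap, i.e. no statistically
    significant difference can be claimed between the observations."""
    return ci_a[0] <= ci_b[1] and ci_b[0] <= ci_a[1]

peter = ci(0.40, 1000)  # Peter Party at 40%
ian_a = ci(0.35, 1000)  # Ian Party at 35%: intervals overlap
ian_b = ci(0.33, 1000)  # Ian Party at 33%: intervals separate
print(overlaps(peter, ian_a))  # True  -> no significant lead
print(overlaps(peter, ian_b))  # False -> significant lead
```

(As the next paragraph notes, a proper two-sample test is a little more subtle than checking overlap: non-overlapping 95% intervals guarantee a significant difference, but overlapping ones do not always rule one out. The overlap check is the conservative shortcut used throughout this analysis.)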
Make sense? In reality this stuff is a little more complicated, as we have to take into account things like covariances and some statistical niceties, but for the purposes of what I want to present, the above is enough for us to do some analysis. Everything I’m writing below is laid out in the spreadsheet, which you can download here: comres bbc poll
OK, to business. Let’s look at the first question: do you believe in the resurrection of Jesus? 301 active Christians said they did (93.2%), 433 non-active Christians did (62.3%), and 496 of those who said they weren’t Christian did (14.0%). The non-Christians comprise all those who gave another religious identity or none at all.
Now, when we look at the confidence intervals (CI), we see an interesting thing. The active Christians have a CI on this question of 87.7% to 98.6%, the non-active of 58.6% to 66.0%, and the non-Christians of 10.8% to 17.2%. I hope you’ve spotted the key observation here – the CIs of the non-active and active Christians don’t cross over, so we say that there is a clear statistical difference between the two groups. To put it another way, non-active Christians don’t believe the same thing as active Christians on the Resurrection.
I’ve put that observation in bold because it’s incredibly important. When the BBC report that “Christians” do or don’t believe a certain thing, they are throwing together a whole group of people, ranging from those who are active members of a worshipping community (taking their faith seriously) to those who might call themselves Christian but never do anything “Christian” together with other Christians. The difference becomes very apparent when you separate out active and inactive Christians: the inactive ones look very different from those who are actively engaged in a church. Indeed, when it comes to the core belief of Easter, the inactive Christians are very different from active Christians.
And guess what – this observation continues question after question. In the table below I show the confidence interval for each key question for our three groups. In almost every single case there is a clear statistical difference between the inactive Christians and the active Christians. On many questions the inactive Christians look far more like the non-Christians than the active Christians (for example, on belief in the Christian understanding of life after death, or in a literal physical resurrection).
| Question | Active Christians | Inactive Christians | Non-Christians | Conclusion |
|---|---|---|---|---|
| Believe in Resurrection | 87.7 – 98.6 | 58.6 – 66.0 | 10.8 – 17.2 | Clear statistical difference between inactive and active Christians |
| Word for Word Resurrection | 52.1 – 63.0 | 14.8 – 22.3 | 0.3 – 6.7 | Clear statistical difference between inactive and active Christians |
| Some of Easter Story not literal | 30.5 – 41.4 | 39.9 – 47.3 | 7.2 – 13.7 | No statistical difference between inactive and active Christians |
| Don’t believe Easter Story | 0 – 10.4 | 27.2 – 34.7 | 78.4 – 84.8 | Clear statistical difference between inactive and active Christians |
| Believe in some form of Life after Death | 79.7 – 90.6 | 46.6 – 54.1 | 27.0 – 33.5 | Clear statistical difference between inactive and active Christians |
| Believe in Christian Life after Death | 67.3 – 78.2 | 28.5 – 36.0 | 11.2 – 17.6 | Clear statistical difference between inactive and active Christians |
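As a sanity check, the quoted intervals appear to use the conservative worst-case (p = 0.5) margin of error on each sub-sample. A sketch that reproduces the first row of the table, assuming sub-sample sizes of roughly 323 active and 695 non-active Christians (back-calculated from the counts and percentages quoted above, not stated directly in the poll tables):

```python
import math

def worst_case_ci(p, n, z=1.96):
    """95% CI using the conservative p = 0.5 margin of error,
    moe = z * sqrt(0.25 / n), as the quoted intervals appear to do."""
    moe = z * math.sqrt(0.25 / n)
    return p - moe, p + moe

# Assumed sub-sample sizes: 301 / 0.932 and 433 / 0.623.
active_n, inactive_n = 323, 695

lo, hi = worst_case_ci(0.932, active_n)
print(f"active:   {lo:.1%} to {hi:.1%}")  # close to the table's 87.7 – 98.6
lo, hi = worst_case_ci(0.623, inactive_n)
print(f"inactive: {lo:.1%} to {hi:.1%}")  # close to the table's 58.6 – 66.0
```

The key point survives the choice of method: the active interval bottoms out around 87.7%, well above the inactive interval's top of 66.0%, so the overlap test from earlier declares a clear statistical difference.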
Now, I’ll leave it for others to explore why some active Christians don’t believe core Christian doctrines (the physical resurrection of Jesus, life after death, etc.). All I want to point out here is that when you dissect the “Christians” into active and inactive, it’s not just that the inactive don’t look like the active; it’s that the inactive often don’t look much different from those who don’t claim to be Christian.
Moving forward, it seems clear that those who commission opinion polls of this sort need to distinguish clearly between active and inactive “Christians”. The way ComRes do this (stratifying by church attendance) is probably the most practical way to get accurate results, and it does demonstrate a clear difference between those who are active members of their churches and those who claim the name “Christian” but aren’t part of any meaningful Christian community.
Do active Christians believe the Easter story? Well, probably over 90% do believe in the resurrection. Should inactive Christians, who can barely scrape 60% belief in the resurrection, be put into the same bucket as active Christians when reporting what “Christians” believe? Given that these inactive Christians are clearly statistically different from active Christians, might I humbly suggest the answer to my rhetorical question is “No”.
Ian Paul adds: Peter’s analysis above raises some very interesting issues in a number of directions.
First, we can observe that actually attending a church makes a difference to belief. It might be argued that going to church teaches you Christian doctrine; or it might be argued that if you believe Christian doctrine then you will attend church. But either way, church attendance and conformity to Christian belief do indeed go together. Attendance (‘bums on pews’) really does matter.
Secondly, this means that Gavin Ashenden’s theological point—that belief in the resurrection is such a core Christian belief that it doesn’t make much sense to call yourself a Christian without believing in it—is also true sociologically. Belief in the resurrection does indeed correlate with active Christian involvement.
Thirdly, this raises a question about the BBC’s reporting of the poll. The BBC is a big corporation, and they must have within the organisation people who are trained in statistical analysis. And the poll included all the information needed about active and non-active Christians. So why did they themselves not undertake this analysis? And why did they decide to publicise the results in such a misleading way—at the moment of a major Christian festival? Can you imagine a misleading story, undermining a sense of belief, being published at a major festival of another world religion? (If you would like to complain about this, you can do so here.)
Lastly, this has an impact on other surveys which purport to show ‘what Christians believe’. In January last year, Jayne Ozanne commissioned a poll which ‘demonstrated’ that the majority of Anglicans supported her view on sexual ethics. But the poll did no such thing, precisely because it did not ask a stratification question about church attendance in the way the BBC ComRes poll did. Given that such a question is so important, it would be hard to justify not including such a question in your survey—unless, of course, you wanted to fix the results and get the answer you were looking for.
Much of my work is done on a freelance basis. If you have valued this post, would you consider donating £1.20 a month to support the production of this blog?