The Times Guide to the House of Commons
How the polls really got it right
Andrew Cooper
Founder of Populus
The 2010 general election saw more opinion polls published than ever before – more than 90 during the course of the campaign, a rate of about three per day. Nearly half of these came from one organisation, YouGov, which produced a daily poll for The Sun, but in all 11 different research companies produced voting polls during the campaign.
The polling organisations between them used every conceivable mode of interviewing voters and deployed a wide range of ways to weight and adjust their data. These differing approaches, however, produced a fairly consistent picture as the election campaign kicked off, with the Conservatives 7 to 10 per cent ahead of Labour and the Liberal Democrats about a further 10 per cent behind. The polling story of the campaign was the subsequent abrupt surge in Liberal Democrat support, and its failure to materialise on election day.
Several polls picked up a growing frustration among many voters during the first week of the campaign. Even before the first TV debate, the Populus poll for The Times published on April 14 found that more voters were hoping that the election would result in a hung Parliament than in a Conservative or Labour majority. The same poll found 75 per cent thinking that it was “time for a change from Labour”, but only 34 per cent that it was also “time for a change to the Conservatives”: two fifths of the electorate wanted change, but were unsure which party, if any, they trusted to deliver the kind of change they wanted. Furthermore, only 6 per cent of voters felt that the main parties were being completely honest about their plans for dealing with the deficit, and only 4 per cent that they were being honest about their tax plans. These findings to a great extent defined the mood of the voters.
The first debate resulted in one of the most dramatic swings in party support ever seen, with the Liberal Democrats jumping by about 10 per cent almost overnight, the gain coming slightly more from the Conservatives than from Labour. Nearly 40 polls were published between the end of the first debate and the end of the third, and the Lib Dems were in the lead in five of them and in second place, ahead of Labour, in all but four. When, on the stroke of 10pm on election night, the exit poll predicted that the Liberal Democrats would end up with fewer MPs than at the previous election, it was met with widespread incredulity because it seemed irreconcilable with the consensus of pre-election polls. The exit poll turned out, of course, to be right.
Close analysis suggests that Lib Dem poll support was always frothy: it relied heavily on strong support from younger voters and people who had not voted at the previous election, groups that in past elections have been disproportionately likely to end up not voting at all. Most polls are weighted to take account of how likely respondents say they are to vote, but there is a tendency for people to overstate their own probability of voting and there is little or nothing pollsters can do systematically to compensate for those who insist that they are certain to vote and then do not.
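The likelihood-to-vote weighting described above can be sketched in a few lines. This is a minimal, hypothetical illustration of the general technique, not any particular pollster's model: each respondent rates their likelihood of voting on a 0–10 scale, and their response is weighted by that score, so parties whose supporters say they are less certain to vote see their weighted share fall below their raw share. All the names and figures are invented for the example.

```python
# Minimal sketch of likelihood-to-vote weighting (hypothetical figures).
# Each respondent is a (party, likelihood) pair, likelihood on a 0-10
# scale where 10 means "certain to vote".

def weighted_share(responses):
    """Vote shares (per cent) weighted by stated likelihood of voting."""
    weights = {}
    for party, likelihood in responses:
        weights[party] = weights.get(party, 0.0) + likelihood / 10.0
    total = sum(weights.values())
    return {party: round(100 * w / total, 1) for party, w in weights.items()}

# Four invented respondents: the two Lib Dem leaners say they are less
# certain to vote, so the weighted Lib Dem share drops below its raw 50%.
sample = [("Con", 10), ("Lab", 9), ("LD", 5), ("LD", 6)]
print(weighted_share(sample))
# {'Con': 33.3, 'Lab': 30.0, 'LD': 36.7}
```

The limitation noted in the text remains: the weighting can only use what respondents *say* about their likelihood of voting, so it cannot correct for people who insist they are certain to vote and then stay at home.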
Polls during the campaign also consistently suggested that Lib Dem support was softer than that of the other parties: those saying that they were going to vote Lib Dem were consistently more likely than Labour or Conservative voters to say that they had not definitely decided and might end up voting differently. The implication of these findings was that the election result was always likely to be worse for the Lib Dems than the mid-campaign polls implied. But voting polls are heavily modelled these days, applying adjustments intended to project what the result will look like, not just present a snapshot of responses. This means that by the end of the campaign the polls ought to have reflected the underlying softness of Lib Dem support in a lower vote share, and that did not happen. Furthermore, all the opinion polls overstated support for the Lib Dems: if the polls overall were performing properly they should have scattered either side of the result, with some understating Lib Dem support, and that did not happen either.
There is some evidence that the swing away from the Lib Dems mainly occurred in the final 24 hours, too late to be properly reflected in the final pre-election polls. The Times poll published on election day, for example, put support for the Conservatives on 37 per cent (which is what they got), Labour on 28 per cent (they got 30 per cent) and Lib Dems on 27 per cent (they got 23.5 per cent). Fieldwork for this poll was done on the Tuesday and Wednesday before the election and the two halves of the sample produced revealingly different results. The 1,500 interviews conducted on Tuesday, May 4, would, if presented separately, have shown the Conservatives on 35 per cent, Labour on 26 per cent and Lib Dems on 29 per cent. But among the 1,000 people interviewed on Wednesday, May 5, the Conservatives were on 38 per cent, Labour on 30 per cent and the Lib Dems on 24 per cent. Conducting fieldwork over a longer timeframe – two or three days, rather than one – generally improves the chances of a poll sample being properly representative, capturing the views of busy and harder-to-reach voters. In this case it may have helped to obscure a very late swing away from the Lib Dems, principally to Labour.
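The arithmetic of combining the two fieldwork days can be checked with a simple sample-size-weighted average of the day-by-day shares quoted above (1,500 interviews on the Tuesday, 1,000 on the Wednesday). This is an illustrative reconstruction only; the published figures will also reflect the pollster's other weighting adjustments, which is why the blended Conservative share here comes out a point below the published 37 per cent.

```python
# Blend two days of fieldwork into one figure, weighting each day's
# vote shares by its number of interviews (figures as quoted in the text).

def blend(day_shares, sample_sizes):
    """Sample-size-weighted average of daily vote shares (per cent)."""
    total = sum(sample_sizes)
    return {
        party: round(sum(day[party] * n for day, n in zip(day_shares, sample_sizes)) / total)
        for party in day_shares[0]
    }

tuesday = {"Con": 35, "Lab": 26, "LD": 29}    # 1,500 interviews, May 4
wednesday = {"Con": 38, "Lab": 30, "LD": 24}  # 1,000 interviews, May 5
print(blend([tuesday, wednesday], [1500, 1000]))
# {'Con': 36, 'Lab': 28, 'LD': 27}
```

Because the larger Tuesday sample dominates the blend, the combined Lib Dem figure sits three points above the Wednesday-only reading, which is the mechanism by which multi-day fieldwork can obscure a very late swing.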
It was not all bad news for the pollsters. All but one of the nine organisations that produced a poll on election eve came within 2 per cent of the Conservative share, five were within 1 per cent and two got it exactly right. All but two of the final polls came within 2 per cent of the Labour share and two were within 1 per cent. Overall it was not as good a performance as 2005, when the polls as a whole were more accurate than ever before, but it was better than at many other elections.