Friday, April 26, 2024

Has Resolve Strategic changed its polling methodology?

The aggregation models I use make many simplifying assumptions. One assumption is that the polling methodology adopted by each polling firm remains unchanged over time. If a pollster indicates that it has changed its polling methodology, I treat the subsequent polls as a separate series from the prior polls. I did this with Essential when it published a change to its polling methodology.
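As a rough illustration of what that series split looks like in practice, here is a minimal sketch; the column names, the cut-over date and the use of pandas are my assumptions for illustration, not the actual aggregation code:

```python
import pandas as pd

# Illustrative poll table: one row per published poll (all values made up).
polls = pd.DataFrame({
    "firm": ["Essential", "Essential", "Essential", "Newspoll"],
    "mid_date": pd.to_datetime(["2021-10-12", "2022-01-25", "2022-03-15", "2022-03-16"]),
    "labor_2pp": [48.0, 49.0, 50.0, 53.0],
})

# Hypothetical date on which a firm announced a methodology change.
METHOD_CHANGE = {"Essential": pd.Timestamp("2022-01-01")}

def series_label(row) -> str:
    """Label polls taken after a firm's announced methodology change as a
    separate series, so a separate house effect can be estimated for them."""
    cutover = METHOD_CHANGE.get(row["firm"])
    if cutover is not None and row["mid_date"] >= cutover:
        return f"{row['firm']} (post-change)"
    return row["firm"]

polls["series"] = polls.apply(series_label, axis=1)
print(polls[["firm", "mid_date", "series"]])
```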

When I look at the recent polls from Resolve Strategic, it looks like the polls from 2024 are less favourable to Labor than the polls from the prior two years. In the first chart below, we can see that the last poll result was almost four percentage points more favourable to the Coalition than the long-term average for Resolve Strategic. In the second chart below, we can see that historically Resolve Strategic has been the most favourable pollster for Labor (on average). And in the third chart below, we can see that the individual Resolve Strategic polls in 2022 and 2023 (indicated with the letter h) were often (but not always) well above the aggregation, while two of the three polls in 2024 are not.
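One crude way to check for this kind of drift, short of the full aggregation model, is to compare each firm's polls with a common reference series and average the deviations by firm. The snippet below is a minimal sketch of that idea; the data, the column names and the rolling-median reference are assumptions for illustration, not the Bayesian aggregation behind the charts described above:

```python
import pandas as pd

# Illustrative data: Labor 2PP estimates by firm and date (all values made up).
polls = pd.DataFrame({
    "firm": ["Resolve Strategic", "Newspoll", "Resolve Strategic",
             "Essential", "Resolve Strategic", "Newspoll"],
    "mid_date": pd.to_datetime(["2023-10-01", "2023-10-05", "2023-11-02",
                                "2023-11-10", "2024-02-01", "2024-02-05"]),
    "labor_2pp": [55.0, 52.0, 54.5, 52.5, 51.0, 52.0],
}).sort_values("mid_date")

# Crude stand-in for the aggregate: a centred rolling median across all polls.
polls["reference"] = polls["labor_2pp"].rolling(3, center=True, min_periods=1).median()

# Rough house-lean proxy: each firm's average deviation from the reference.
polls["deviation"] = polls["labor_2pp"] - polls["reference"]
print(polls.groupby("firm")["deviation"].mean().round(2))
```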



Now, this could be just the usual random noise and chance that comes with statistics. If it is, then in coming months we should see a return of the prior pattern of systematic house bias at Resolve Strategic. But this apparent change could also be the result of a change in polling methodology at Resolve Strategic. At the moment I don't know which explanation is more plausible. While I have looked, I have not seen a statement on any methodology change from Resolve Strategic (please provide a link in the comments below if I have simply missed it).

If this apparent change in house bias is sustained in the next couple of Resolve Strategic polls, I will assume that there has been something of a change in polling methodology, and I will separate the 2024 and subsequent polls into their own series. 

Please note: this is not a criticism of Resolve Strategic. I have enormous respect for all pollsters, and I appreciate that opinion polling is much harder today than it was (say) 35 years ago (when almost every house had a landline and no-one had mobile phones). At the moment, I am just observing that the last three polls from Resolve Strategic perhaps look a little different from the earlier polls, and I am musing on why this might be the case. Nonetheless, I would be somewhat disappointed if it turns out that Resolve Strategic has changed its polling methodology but has not communicated how and why it has changed.

3 comments:

  1. Dear Mark the Ballot.

    Thanks for the enormous amount of work you do on this site. I have a question about your data for Essential's 2PP.

    Essential publishes a "2PP-plus" measure which sums to less than 100 because they do not exclude undecided respondents. Many poll aggregators calculate conventional 2PP measures for Essential by excluding undecideds and applying last election preference flows to their primary voting intention results.

    Eyeballing your Vote Share (GRW Normal fixed) chart suggests your Essential Labor 2PP readings since the start of 2024 are: 51, 52, 49.5, 50.5, 47, 51 and 49.

    However when a conventional 2PP is calculated (excluding undecideds and applying 2022 election preference flows) the Labor 2PP results for the first seven Essential polls of 2024 are: 52.9, 52.9, 51.3, 51.5, 49.4, 51.9 and 51.

    Interested to understand the reasons for this discrepancy.

    Regards Mark D.
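
    In rough terms, the conventional calculation described in the comment above works as follows; this is a minimal sketch in which the primary votes and the preference-flow shares are illustrative assumptions, not Essential's published figures or the exact 2022 flows:

```python
# Sketch of a "conventional" 2PP: drop undecideds, then split minor-party
# primaries between the majors using assumed last-election preference flows.

# Illustrative primary votes (per cent), including an undecided share.
primaries = {"Labor": 31.0, "Coalition": 35.0, "Greens": 13.0, "Other": 14.0, "Undecided": 7.0}

# Assumed share of each minor grouping flowing to Labor (illustrative only,
# not the actual 2022 election preference flows).
flow_to_labor = {"Greens": 0.85, "Other": 0.50}

# Exclude undecideds and rescale over the remaining primaries.
decided = {k: v for k, v in primaries.items() if k != "Undecided"}
total_decided = sum(decided.values())

labor = decided["Labor"] + sum(decided[p] * share for p, share in flow_to_labor.items())
labor_2pp = 100 * labor / total_decided  # Labor and Coalition now sum to 100
print(f"Conventional Labor 2PP: {labor_2pp:.1f}")
```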


    Replies
    1. Mark - apologies for the late response - I have been simply upweighting the Essential 2PP to 100% (see the sketch at the end of this thread).

    2. Thanks! This goes to the issue, for poll aggregators, of how to treat the different 2PP measures being compared: classic 2PP (Newspoll, Freshwater, YouGov), where the pollster applies last-election preference flows to primary voting intention results; respondent-allocated 2PP (like Morgan, where the results are often 1-2 ppts different from the classic method); and pollsters not publishing 2PP estimates (Essential, Resolve), where the aggregator has to decide how to arrive at a 2PP result. Not a lot may turn on it in terms of the aggregate results, but in theory I would think the preferable approach would be to use a consistent 2PP methodology across all the polls being aggregated. Thanks again for all the work on this site.

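
The upweighting mentioned in the first reply is simply a rescaling of the published 2PP-plus figures so that Labor and the Coalition sum to 100; a minimal sketch with illustrative numbers:

```python
# Rescale a 2PP-plus result (which retains undecideds) so that Labor plus the
# Coalition sum to 100 per cent.  Figures are illustrative, not a real poll.
labor_2pp_plus, coalition_2pp_plus = 48.0, 47.0   # implies 5% undecided
labor_2pp = 100 * labor_2pp_plus / (labor_2pp_plus + coalition_2pp_plus)
print(f"Upweighted Labor 2PP: {labor_2pp:.1f}")   # 50.5
```

Because this keeps Essential's own 2PP-plus split rather than re-deriving preferences from the primaries, it can sit a point or two away from a conventional calculation, which may account for some of the differences noted in the comment above.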