63% estimate of Covid vaccine by May - Maby Forecast Live
On September 10, we were joined by UnHerd science writer Saloni Dattani and 41 forecasters on a call to investigate and explore the likely timeline of a coronavirus vaccine.
Before the call, Tom and I narrowed down the scenario we care about. A vaccine which isn't widely accepted, approved or available for distribution in large numbers won't count for much, so in selecting an impactful event we settled on an FDA-approved vaccine available in large numbers of doses in the United States as a scenario that likely indicates vaccine success:
When will 25 million doses of an FDA-approved vaccine for Covid-19 be available in the US?
Then we decided on the scope. Other public forecasts have asked whether there will be a vaccine before May 2021 as a yes/no binary, but we wanted to have an idea about when that might be if it comes sooner than May. So we decided to ask across five bins:
- A: Before Nov 4
- B: Nov 4 to Dec
- C: Jan to Feb
- D: Mar to Apr
- E: May or later
The call began with us introducing the concepts and taking the group through a quick calibration quiz. This gave us some data about the group's level of calibration - the better calibrated, the more reliable their forecasts.
Perfect calibration would mean that the outcome frequency and forecast probability were equal, which wasn't quite the case for our group, but they did better than a lot of groups we have seen recently.
Over time it’s typical to see new forecasters become better calibrated, as they become better acquainted with their own internal sense of probability.
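For concreteness, here's a minimal sketch of what a calibration check like this computes. The quiz data below is made up for illustration, not the session's actual results; the idea is that for each stated probability, a perfectly calibrated forecaster's observed hit rate matches that probability.

```python
# Minimal calibration check: group (stated probability, outcome) pairs
# by the stated probability and compare against the observed frequency.
# Perfect calibration means the two numbers match for every group.
from collections import defaultdict

def calibration_table(forecasts):
    """Return {stated probability: observed outcome frequency}."""
    by_p = defaultdict(list)
    for p, outcome in forecasts:
        by_p[p].append(outcome)
    return {p: sum(v) / len(v) for p, v in sorted(by_p.items())}

# Toy quiz answers: (stated probability, 1 if the answer was right else 0).
quiz = [(0.9, 1), (0.9, 1), (0.9, 0), (0.6, 1), (0.6, 0), (0.3, 0)]
for p, freq in calibration_table(quiz).items():
    print(f"stated {p:.0%} -> observed {freq:.0%}")
```

In this toy data the forecaster is overconfident at 90% (only two of three came true) and slightly underconfident at 30%, which is the typical pattern for new forecasters.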
Forecasting - round 1
We forecast over two rounds. The first round goal was to get the lie of the land among the group, and surface questions for our guest speaker Saloni. The median response for each bin at the end of the first round was as follows:
At the end of the initial round we had a 60% probability the vaccine would arrive before May, with finer grained forecasts on the period in advance of that date.
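As a sketch of how per-bin medians can be combined into a headline number (the individual forecasts below are invented for illustration, not the group's actual submissions): because the five bins are mutually exclusive, the per-bin medians are renormalized to sum to 100% before reading off the probability of a vaccine before May.

```python
# Aggregate individual forecasts by taking the median per bin, then
# renormalize, since per-bin medians generally don't sum to exactly 1.
from statistics import median

def aggregate(forecasts_by_bin):
    medians = {b: median(ps) for b, ps in forecasts_by_bin.items()}
    total = sum(medians.values())
    return {b: m / total for b, m in medians.items()}

# Illustrative submissions from three hypothetical forecasters.
round1 = {
    "A: Before Nov 4": [0.05, 0.02, 0.10],
    "B: Nov 4 to Dec": [0.10, 0.15, 0.05],
    "C: Jan to Feb":   [0.20, 0.15, 0.25],
    "D: Mar to Apr":   [0.30, 0.25, 0.20],
    "E: May or later": [0.40, 0.45, 0.35],
}
probs = aggregate(round1)
before_may = sum(p for b, p in probs.items() if not b.startswith("E"))
print(f"P(before May) = {before_may:.0%}")
```

Summing bins A through D gives the "before May" figure directly; the same calculation on the real submissions is where the group's 60% came from.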
Each forecaster could submit a comment and/or question anonymously, which were voted for by the group as a whole.
We then opened the discussion, focusing first on the points most voted for by participants, making sure to cover the standard forecast checklist:
- Base rate / outside view / reference class (How do vaccines normally arrive? How do big medical projects normally get organised?)
- Inside view, breaking the problem down - what are the steps to this happening, how does the situation in front of us correspond to the base rate / reference class (what's the specific way that a vaccine of this type might progress?)
- Scope and scale (is there a difference between 1m, 10m, 25m, 100m doses? What if the time horizon was over a decade?)
- Relevant sources - who has been writing on this? Are they reliable? Have we overreacted or underreacted to recent news? (What about the vaccine trial that was halted - is that a big deal?)
- Biases - could optimism/pessimism bias play a role? What about political biases? (does Trump talking about vaccines emotionally affect people's forecasts?)
At the end of this discussion we opened forecasting on the update round.
We asked all forecasters to consider the viewpoints and information they’d heard and submit an estimate again, altering their response if the discussion or some other new information and analysis compelled them to.
This time the group was slightly more optimistic about the prospects of a vaccine, giving 63% overall to a vaccine before May, most likely in March or April.
Looking at the full distribution of forecasts, we saw that the first round forecasts were more widely dispersed, and it’s easy to identify where the updated forecast became slightly more optimistic.
This was our first public test of some of the new app features, as well as our first time forecasting with a team mostly new to forecasting, and new to each other - our typical client team already knows each other and has worked together before.
With 40 or more people forecasting simultaneously, we're glad everything worked according to plan, and we learned a tremendous amount about how to make this work even better in the future - it's all being fed into the design for the next session.
We also collected some benchmark forecasts from other platforms last Thursday (including from our former employers!) so it will be especially interesting to see how things resolve.
Help us improve Maby:
We founded Maby to help any organisation build an efficient forecast capability, and I hope we showed a little of that on Thursday. If your organisation or one that you work with would like to be able to produce fast and accurate forecasts, then we'd love an introduction, even if they’re not looking to buy anything right now - we can only make our forecasting knowledge useful if we understand the problems you're facing, so feedback and information is immensely valuable to us as we build our app and systems.