PLACE YOUR BETS

The ultimate statistical model for predicting the 2018 Academy Awards Best Picture winner

By Youyou Zhou

Things reporter

Trying to predict the Academy Awards Best Picture winner is a fun challenge. There’s no way for one person to get into the minds of the more than 7,000 Academy members who vote on the award, but there are plenty of indicators in the run-up to Oscars night. From enthusiastic movie-goers to serious data scientists, people have built models to predict the winner. Quartz collected quantifiable indicators used by various prediction models, sorted them into four major categories, and ranked the nine Best Picture nominees by each approach. Which film seems most likely to win depends on how much weight you give each of the four categories in any prediction model you might build, but just five films appear to have a realistic chance under any weighting.

Category 1: Buzz and fanfare

What’s the most talked-about movie on social media? Which one received the best reviews in your circle of friends? Many models are built on the belief that the public’s attention and opinion influence the votes. We tend to agree.

For this category, we ranked the nine nominees by searches on Google, tweets received, and their ratings on IMDb. We then aggregated the three rankings to get a final buzz score from nine (highest) to one (lowest); one way to do that aggregation is sketched after the table.

Nominee                 Buzz score   Search (score)   Tweets (score)   IMDb (score)
Call Me by Your Name    9            7                9                8
Three Billboards        8            8                6                9
The Shape of Water      7            9                8                3
Lady Bird               6            4                7                3
Dunkirk                 5            5                1                7
Phantom Thread          4            1                5                6
Get Out                 3            6                1                3
Darkest Hour            2            3                1                1
The Post                1            1                1                1
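
The article doesn’t spell out the aggregation formula, but a simple sum-then-rerank reading reproduces the buzz column above. Here is a minimal sketch in Python, with that aggregation as our assumption:

```python
# One plausible buzz aggregation (an assumption, not Quartz's stated
# method): sum each film's three indicator scores, then re-rank the
# sums so the highest total gets a buzz score of 9.
indicator_scores = {  # (search, tweets, IMDb), 9 = best
    "Call Me by Your Name": (7, 9, 8),
    "Three Billboards":     (8, 6, 9),
    "The Shape of Water":   (9, 8, 3),
    "Lady Bird":            (4, 7, 3),
    "Dunkirk":              (5, 1, 7),
    "Phantom Thread":       (1, 5, 6),
    "Get Out":              (6, 1, 3),
    "Darkest Hour":         (3, 1, 1),
    "The Post":             (1, 1, 1),
}

totals = {film: sum(scores) for film, scores in indicator_scores.items()}
# Films sorted by ascending total; position in the sort is the buzz score.
for buzz_score, film in enumerate(sorted(totals, key=totals.get), start=1):
    print(f"{film}: buzz score {buzz_score}")
```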

Category 2: Prior awards

The awards that precede the Oscars, like the Golden Globes and the Screen Actors Guild Awards, are good indicators because they are decided by people who also cast votes for the Oscars. This is the measure FiveThirtyEight has used for the past five years to build its Oscars prediction model. Because that model is transparent and reliable, we simply took its predicted points for this year’s nine nominees and scored them one to nine, with nine reflecting the greatest likelihood of winning.

Nominee                 Awards score
The Shape of Water      9
Three Billboards        8
Dunkirk                 7
Lady Bird               6
Get Out                 5
Call Me by Your Name    4
The Post                3
Darkest Hour            2
Phantom Thread          1

Category 3: Money talks

Dunkirk is a Warner Bros. movie with a budget of $100 million. Call Me by Your Name, on the other hand, is an indie film made for $3.5 million. All the major studios are in the business of lobbying the Academy for Oscars, knowing that the Best Picture winner can make more money at the box office than the other nominees.

We believe that money, to a certain extent, also influences the votes. (Though there’s some evidence that in recent years big box office returns haven’t predicted Oscar wins—at least not the way they used to.)

Aggregating the box office numbers (as of Feb. 28) with budget sizes, here is where the nine movies stand money-wise:

Nominee                 Money score   Box office (score)   Budget (score)
Dunkirk                 9             9                    9
The Post                8             7                    8
Get Out                 6             8                    2
Darkest Hour            6             6                    6
Three Billboards        5             5                    4
The Shape of Water      4             4                    5
Phantom Thread          3             2                    7
Lady Bird               2             3                    3
Call Me by Your Name    1             1                    1

Category 4: Critic reviews

Movie rankings aren’t complete without critics’ reviews. Critics and publications don’t have voting rights, but they have considerable influence on both the public and industry insiders. We aggregated the Metacritic and Rotten Tomatoes scores for the nominees to get the ranked critic scores.

Nominee                 Critic score   Rotten Tomatoes (score)   Metacritic (score)
Lady Bird               9              9                         9
Dunkirk                 8              6                         9
Call Me by Your Name    7              7                         7
Get Out                 6              9                         3
Three Billboards        5              6                         5
Phantom Thread          4              3                         6
The Shape of Water      3              4                         4
The Post                2              2                         2
Darkest Hour            1              1                         1

Now it’s your turn to decide how much influence each of the four categories should have in picking the winner. Varying the weights on the four categories yields different Best Picture winners. Try different combinations and see which film comes out on top.
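
To make that concrete, here is a minimal sketch of the weighted-sum step in Python. The category scores come from the four tables above; the example weights are our own, not anything prescribed by the article:

```python
# A minimal weighted-sum model: each film's overall score is a weighted
# sum of its four category scores. The weights below are one example.
category_scores = {  # (buzz, awards, money, critics), 9 = best
    "Call Me by Your Name": (9, 4, 1, 7),
    "Three Billboards":     (8, 8, 5, 5),
    "The Shape of Water":   (7, 9, 4, 3),
    "Lady Bird":            (6, 6, 2, 9),
    "Dunkirk":              (5, 7, 9, 8),
    "Phantom Thread":       (4, 1, 3, 4),
    "Get Out":              (3, 5, 6, 6),
    "Darkest Hour":         (2, 2, 6, 1),
    "The Post":             (1, 3, 8, 2),
}

weights = (0.4, 0.3, 0.1, 0.2)  # buzz, awards, money, critics; sums to 1

def overall(scores):
    return sum(w * s for w, s in zip(weights, scores))

ranking = sorted(category_scores, key=lambda f: overall(category_scores[f]),
                 reverse=True)
print("Predicted winner:", ranking[0])
```

With these buzz- and awards-heavy weights, Three Billboards comes out on top; set the weights to 0.25 each and Dunkirk wins instead.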

In this simplified model, you might notice that no matter how you tweak the weights, some nominees simply stand no chance of winning, while others win again and again.

That’s because the structure of the underlying data builds bias into any model. In our case, the bias comes from the choices we made: selecting these four categories, and using rankings rather than other numeric measures. To get a full picture of how our model favors one movie over another, we mapped out the winner for every possible combination of weights assigned to the four categories (a sketch of such a sweep follows the list below). Only five of the nine nominees stand a chance of winning in our model:

Dunkirk, the predicted winner in 51% of all combinations, stands strong because of its top score for money (9) and relatively high positions for prior awards (7) and critic reviews (8).

Put a little more weight on buzz and prior awards and you might get Three Billboards Outside Ebbing, Missouri, which strikes a balance between buzz (8) and awards (8), or The Shape of Water, with its top score for prior awards (9) and high buzz rating (7). About 27% of all possible combinations predict one of these two as the winner.

Lady Bird, the critics’ favorite, is the winner in 11% of possible scenarios. Call Me by Your Name, with its top score for buzz (9) and a strong showing with critics (7), takes about 12% of combinations.
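
Here is a sketch of the kind of exhaustive sweep described above: walk a grid of weight combinations summing to one and tally each film’s share of wins. The grid resolution (steps of 0.05) and the decision to skip exact ties are our assumptions, so the printed shares will only approximate the article’s figures:

```python
from collections import Counter
from itertools import product

# Category scores as above: (buzz, awards, money, critics), 9 = best.
category_scores = {
    "Call Me by Your Name": (9, 4, 1, 7), "Three Billboards": (8, 8, 5, 5),
    "The Shape of Water": (7, 9, 4, 3), "Lady Bird": (6, 6, 2, 9),
    "Dunkirk": (5, 7, 9, 8), "Phantom Thread": (4, 1, 3, 4),
    "Get Out": (3, 5, 6, 6), "Darkest Hour": (2, 2, 6, 1),
    "The Post": (1, 3, 8, 2),
}

STEPS = 20  # weights move in increments of 1/20 = 0.05
wins = Counter()
for a, b, c in product(range(STEPS + 1), repeat=3):
    d = STEPS - a - b - c
    if d < 0:
        continue  # the four weights must sum to exactly 1
    weights = (a / STEPS, b / STEPS, c / STEPS, d / STEPS)
    scored = {film: sum(w * s for w, s in zip(weights, scores))
              for film, scores in category_scores.items()}
    best = max(scored.values())
    leaders = [film for film, score in scored.items() if score == best]
    if len(leaders) == 1:  # ignore combinations where films tie exactly
        wins[leaders[0]] += 1

total = sum(wins.values())
for film, count in wins.most_common():
    print(f"{film}: {count / total:.0%} of combinations")
```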

One note of warning before you announce your prediction to family and friends: The Academy uses a preferential voting system to pick the Best Picture winner, with voters ranking the nominees. That approach rewards broad collective preference, meaning the least-disliked film can beat more niche movies with ardent fan followings. That effect, while not accounted for in the model, could influence the outcome.

Correction: The article has been revised since publishing to correct a mistake in calculating the Buzz score.