Prologue: the reason for the dearth of blog posts is that I am stuck in Tanzania, and paying attention to In Camera has not been easy. However, I wrote this post on the plane and have finally found the time to put it up. Forgive the shitty graphic - I did not find the time to make a nice one. Okay, done...
Bloggers, "real" journalists and ordinary folk have been trying to predict Oscar winners for ever and a day now. They have come up with all kinds of doctrines and theories - from award accolades (the Globes, BAFTAs, SAGs, DGAs, PGAs, Critics' Choice Awards and more) to hard numbers like Box Office stats to unquantifiable factors like who's running the campaign, the Academy-friendliness of the subject matter and whether or not the nominee is "due" for a win. It's a difficult business, and reminds me to some extent of a friend of mine at university who kept trying to predict the Lotto numbers using basic stats and an Excel sheet. He is still poor. I have written a little on Oscar influences before, but have discovered a new, almost infallible approach. Okay no, it's complete nonsense, but I am totally convinced that it will work this year. Hit the jump to see where my conviction comes from and who is going to win Best Picture (in case you haven't noticed the title).
What makes a picture win that most coveted of Oscars? As already stated, many factors. I think people have been focusing too much on things like award accolades, and too little on the obvious: general audience appeal and critical acclaim. To some extent, the average Academy member falls somewhere between your Joe Public filmgoer and a film critic. Of course, extraneous factors (like the campaign itself) will result in the odd outlier, but my feeling in general is that general appeal and critical acclaim should be a good basis for prediction.
So, how do you measure these things? Easy. IMDB is probably the biggest movie website on the planet. It's been going since 1990, is now owned by Amazon, and sees over 100 million unique users each month. Essentially, users rate the movies out of 10, and the scores are aggregated. A perfect measure of general appeal. Yes, results get inaccurate when vote counts are low, and there are other issues too, but we are talking Best Picture nominees, so most of those issues disappear.
For the critics, it's not that easy. For these purposes, I have decided to use two of the biggest critic review aggregator websites: Rotten Tomatoes and Metacritic. For those who don't know, they work as follows (and each has a specific purpose):
- on Rotten Tomatoes (from its Wikipedia page): the "staff first collect online reviews from authors that are certified members of various writing guilds or film critic associations. To become a critic at the site, a critic's original reviews must garner a specific amount of "likes". Top Critics are generally ones that write for a notable newspaper. The staff then determine for each review whether it is positive ("fresh", marked by a small icon of a red tomato) or negative ("rotten", marked by a small icon of a green splattered tomato)"..."The website keeps track of all of the reviews counted (which can approach 270 for major, recently released films) and the percentage of positive reviews is tabulated." In other words, Rotten Tomatoes is an ideal measure of general critical appeal. A 100% Fresh film may not be as good as an 80% Fresh one; the latter may simply have been more divisive. Think of the system what you will, but it is undeniable that general critical appeal is quite an important factor when it comes to Academy votes.
- Metacritic (via its Wikipedia page) "is somewhat similar to Rotten Tomatoes, but the scoring results sometimes differ very drastically, due to Metacritic's method of scoring that converts each review into a percentage before taking a weighted average and listing different numbers of reviews." In other words, Metacritic tries to quantify critical acclaim rather than just measure general appeal. The result is important, because it lets you know which films were really worshipped by the critics, but I don't trust it too much, and don't ever rely on it for personal use. Basically, it's difficult to reduce every critic's review to a single score: not all critics score on the same scale, some don't score at all, and some score at different levels of granularity. Either way, it's part of the mix.
It's very simple really. I took the IMDB, Rotten Tomatoes and Metacritic scores for each Best Picture nominee. I also calculated an average. This average is what I see as an indicator of which film will win Best Picture. Awfully crude, I know. There are numerous potential flaws in the system, like giving each source an equal weighting, for example. The high Rotten Tomatoes scores tend to make quite a difference to average scores, Metacritic scores can be odd sometimes, and are IMDB ratings really worth as much as we think? However, this is a basic indication and not an exact science. Instead of spending time I don't have on figuring out a weighting system, I leave it as is. It works fine. Another issue is extraneous factors. Instead of trying to cater for these factors in the "model", I will try to attribute perceived outliers to these factors.
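For the curious, the whole "model" fits in a few lines of Python. The scores below are illustrative placeholders I've typed in for the example, not the exact figures from my table:

```python
# A minimal sketch of the "model": put each source on a 0-100 scale,
# take the plain (equal-weight) mean of the three, and rank.
# The scores here are illustrative placeholders, not my actual table.

nominees = {
    # (IMDB /10, Rotten Tomatoes %, Metacritic /100)
    "The Artist":      (8.4, 97, 89),
    "Hugo":            (8.1, 94, 83),
    "The Descendants": (7.8, 89, 84),
}

def average_score(imdb, rt, mc):
    # Convert the IMDB rating to a percentage so all three
    # sources carry equal weight, then take the simple mean.
    return (imdb * 10 + rt + mc) / 3

# Sort nominees from highest average to lowest.
ranked = sorted(nominees.items(),
                key=lambda kv: average_score(*kv[1]),
                reverse=True)

for title, scores in ranked:
    print(f"{title}: {average_score(*scores):.1f}")
```

That's it - no regression, no machine learning, just a mean of three numbers.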
Another important factor is, of course, Box Office performance. I have listed the US Gross Box Office takings for each film (via Box Office Mojo, of course) for information purposes. Trying to build it in would take too much time. For now, the Box Office figures can be a tool to explain outliers as well.
Here is how this year's Best Picture nominees stack up against each other (pardon the formatting):
According to my highly complex mathematical model, The Artist is hands down the favourite for 2012. It wins in each category except box office. Add to that its major award wins as well as the Weinstein Company, and we have a winner. I have been wrong often, but this year I am sure. Hugo and The Descendants are widely seen as its chief rivals, but they don't stand a chance.
What else does the above list tell us? Well, for one, that Hugo, Moneyball, Midnight in Paris and The Descendants round out the top 6. Sounds about right, except that The Descendants ranks a bit too low. The reason? Probably its lower IMDB score. I think that the average Academy member is a lot closer to a critic than to Joe Public, so if the model needed fine tuning, the first place I would look would be the weighting of the IMDB score.
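If you did want to down-weight the IMDB score, the tweak is a one-liner. The weights below are made up purely for illustration - I haven't calibrated anything:

```python
# Weighted variant of the average: down-weight IMDB relative to the two
# critic aggregators. The weights (0.2 / 0.4 / 0.4) and the scores are
# made-up illustrative values, not anything actually calibrated.

def weighted_score(imdb, rt, mc, w_imdb=0.2, w_rt=0.4, w_mc=0.4):
    # IMDB is converted to a percentage first, as before.
    return imdb * 10 * w_imdb + rt * w_rt + mc * w_mc

# Example with placeholder scores for The Descendants: it gains ground
# once its lower IMDB figure counts for less than the critic scores.
print(weighted_score(7.8, 89, 84))
```

A film the critics loved but the public merely liked would climb the table under this scheme, which is exactly the adjustment I'd want for The Descendants.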
It also tells us that Extremely Loud is very much the bottom of the pile. That is 100% correct, as its nomination was one of the major Oscar upsets this year. But does The Help deserve to be at number 8? Probably not. Its lower positioning is likely due to the fact that (1) the critics didn't love it that much, and (2) its massive box office takings (it's at number one by a margin of almost $100 million) aren't taken into account in my model.
The ranking above is absolutely meaningless on its own. So yes, my little predictor says that The Artist will win this year. Why should anyone believe that? Fortunately, it is easy to test the system by looking at previous years. Let's do it! Here are the rankings (winners in yellow) for 2009, 2010 and 2011.
According to both critics and commoners alike, 2009 was a year that belonged to Slumdog Millionaire. Indeed, the Academy thought so too. Much like this year, 2009 was a no contest. It was also the last time there were 5 Best Picture Nominees. Looking at the quality of that list, it really wasn't a spectacular film year.
In 2010 we saw the return of the 10 Best Picture nominee system. It's also a little trickier because the winner isn't streets ahead like in 2009 (and 2012). According to the Oscar Oracle, the winner should have been Up, with The Hurt Locker a close second. Before I shoot down the system, I have to say that one area where an exception to the rule is required is the Pixar movie. First, Pixar movies seem to attract unprecedented critical and mainstream praise; year in and year out, they end up being the best reviewed of all. Second, there is a Best Animated Feature category, effectively meaning that it is near impossible for an animated film to win Best Picture as well. Third, I don't know if the old-fashioned Academy members are ready to vote in an animated Best Picture winner yet. So, for now, Pixar is out.
With Pixar movies out, the model predicts a win for The Hurt Locker. Quite correct. The rest of the list is by no means perfect. An Education ranks too high in my view, and Inglourious Basterds too low. The former seems to be due to its broad appeal, and the latter due to its low Metacritic score. How can that be? Critics loved the Basterds. I just don't trust that Metacritic system.
The other 2010 controversy was, of course, The Blind Side being nominated. Yes, it had a decent IMDB score but was not loved by the critics. The result? The lowest score of the year by quite a margin. It was likely nominated (1) because it's a heart-warming true story, (2) because of the huge awards attention being bestowed upon Sandra Bullock, and (3) because it took over $250 million at the US Box Office. It's those factors that let the odd movie out crack the nod.
Lastly, we have 2011. Once again, a Pixar movie comes out tops. Forgetting about that one, we have a close battle between The Social Network and The King's Speech. Interesting, because that is exactly what went down in last year's Oscar race. Many pundits are still bitter that The King's Speech edged out The Social Network. So am I. The reason? Well, the critics loved The Social Network (just look at that Metacritic score - it's the highest of the last 4 years). However, extraneous factors, including a masterful Oscar campaign by the Weinsteins, the old fashioned subject matter and the general appeal factor gave The King's Speech the win. So, here the model is wrong, but the race was so close you can't exactly expect it to split hairs for you.
The rest of the list? Winter's Bone is way too high. It was critically acclaimed, but the film itself was just too small to be a serious player. I also don't think that Black Swan, The Fighter and Inception should be the bottom three. Then again, all the nominees last year had exceptional ratings. 2010 really was one helluva year for movies. See below.
As with many a science experiment, the results often bear unexpected fruit. In this case, calculating the averages for each year can tell us a little about how the last 4 years stack up against each other...
According to the above table, 2010 (leading to Oscars 2011) was the best movie year of the last 4 by quite a margin. It comes out tops in every category except the Box Office, where it ends up second. To be fair though, the Box Office champion (2009) had Avatar, the highest grossing film of all time. Those figures will screw up any stats. I have to say, I agree wholeheartedly that 2010 was the best year in a while. 8 out of the 10 nominees scored in excess of 9 out of 10 in my book.
The best Best Picture nominees of the last four years? According to people and critics, it's (in descending order) Toy Story 3, The Social Network, The Artist, Up and The Hurt Locker.
The worst year is, unsurprisingly, this year. I can't speak for all the Best Picture nominees as I still need to see The Artist, Extremely Loud and War Horse, but so far I haven't been too impressed. Please don't let me be misunderstood, I loved all those films, they just couldn't come close to what I experienced this time last year. The Artist really is my last bastion of hope. Second worst: 2009. Slumdog was awesome, and I absolutely loved Benjamin Button, but the rest of them? Once again, decent films, but none have inspired me to give them a second watch.
The Academy obviously LOVES Stephen Daldry. In 2009 his film The Reader was nominated, even though its average score is a mere 65%. That is over 20% below Slumdog! And this year he did it again: Extremely Loud did even worse with 52%, almost 40% below The Artist! How he does it, I don't know. I'm not complaining though, since I really loved his other films Billy Elliot and The Hours.
Sadly, 2012 is the worst Box Office year as well, with an average of just under $65 million per nominee. That is over $100 million per film less than 2010! Almost triple. Bitch.
As this terrific statistically mathematical science experiment draws to a close, I would like to make a few concluding observations:
- The Artist is going to win Best Picture this year. If it doesn't, I will… be wrong. Seriously though, 2012 doesn't have the strange intricacies we saw in other years.
- The Academy has a puzzling obsession with Stephen Daldry
- Box Office and Ratings go hand in hand. Yes, good movies make more money. They may not always have the biggest opening weekend, but word of mouth ensures that they keep raking in the cash week after week. So Hollywood, stop with the Twilight and make some good stuff for us to watch!