In this final review of the 2020 Oscar season, I thought I’d try to compare some of the different ranking systems for these movies and try to explain their similarities and differences. There are multiple ways to look at how good a movie might be. There is the general audience reaction, critical response, Oscar recognition, and – in my case – my own personal evaluation. The question I am trying to address is where those scales produce different rankings of the movies and how we might explain those differences.
The measures of these scales vary a bit. In the case of general audience reaction, I use the IMDb Audience rating, which is the average rating, on a scale from 1 to 10, of sometimes hundreds of thousands of viewers who have seen the movie and go to the IMDb website to register their opinion. Professional critic response is measured by the Critic Metascore, a composite measure based on content analysis of the reviews from professional movie critics. In an earlier essay, I compared the audience rating and the Critic Metascore and commented on where there were discrepancies. For purposes of this analysis, I combined these two measures in an average rating that represents the Public ranking of a movie, giving equal weight to what the viewing public says and what the critics say.
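For readers who like to see the arithmetic, here is a minimal sketch of that combined Public rating. The essay says the two measures get equal weight; rescaling the 1-to-10 IMDb score onto the Metascore's 0-to-100 range before averaging is my assumption, since the essay does not say how the two scales are reconciled. The sample numbers are hypothetical.

```python
def public_rating(imdb_score, metascore):
    """Equal-weight average of the two measures on a common 0-100 scale.

    imdb_score: IMDb audience rating, 1-10.
    metascore:  Critic Metascore, 0-100.
    Rescaling IMDb by 10x is an assumption, not stated in the essay.
    """
    return (imdb_score * 10 + metascore) / 2

# Hypothetical movie: IMDb 8.5 and Metascore 78 average out to 81.5.
print(public_rating(8.5, 78))  # 81.5
```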
The Oscar recognition scale uses an index I created that assigns scores to movies based on what categories they were nominated in and whether they won in those categories. In a previous essay, I distinguished between major and minor categories with the directing, acting, and writing categories being ‘major’, and the more technical categories considered ‘minor’. A minor nomination receives one point, a major nomination gets 2 points, and a nomination for Best Picture, Animated Feature, Documentary Feature, or International (Foreign) Feature gets 3 points. If the movie wins a category, the points are doubled. This scale is imperfect, of course, but it does tend to give an accurate picture of how the Academy evaluates these movies. (Obviously, they have to get at least 1 point to even make the list!). Finally, my evaluation from 1-5 stars is used as the third scale in this comparison.
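The index described above is simple enough to sketch in a few lines of code. The point values (1 for minor, 2 for major, 3 for the feature-level categories, doubled for a win) come straight from the essay; the exact category names in the sets below are my own shorthand, not a definitive list.

```python
# Major categories per the essay: directing, acting, and writing.
MAJOR = {"Directing", "Actor", "Actress", "Supporting Actor",
         "Supporting Actress", "Original Screenplay", "Adapted Screenplay"}

# Categories worth 3 points per the essay.
TOP = {"Best Picture", "Animated Feature", "Documentary Feature",
       "International Feature"}

def nomination_points(category, won):
    """1 point for a minor (technical) category, 2 for a major one,
    3 for a top-tier feature category; points double if the movie won."""
    if category in TOP:
        points = 3
    elif category in MAJOR:
        points = 2
    else:
        points = 1
    return points * 2 if won else points

def oscar_index(nominations):
    """Total recognition score from a list of (category, won) pairs."""
    return sum(nomination_points(cat, won) for cat, won in nominations)

# Hypothetical example: a Best Picture nomination without a win (3 pts)
# plus a winning acting nomination (2 doubled to 4 pts) totals 7 points.
print(oscar_index([("Best Picture", False), ("Actor", True)]))  # 7
```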
The methodology for this analysis, then, is simply to rank the movies on each of these three scales and identify where a movie lands in noticeably different positions across them. Minor variations are ignored and only major differences are worthy of comment.
So the first question is how well Oscar recognition reflects the general public’s reaction to a movie. This comparison sets the Public ranking against the Oscar Recognition scale, and there are some interesting variances. One ‘specialized’ movie (the documentary For Sama) was under-recognized by the Academy. The public rating placed this documentary second on the list of 38 movies while it received only a single nomination, for Documentary Feature, which it did not win. By this methodology, the Academy definitely under-rewarded this movie compared to what the viewing public and movie critics think.
Eight movies received more recognition from the Academy than the public ratings suggest they should have. Once Upon a Time…in Hollywood received the second highest level of Oscar recognition but was 16th (out of 38) in the Public rating. The explanation here is probably that because it was about the industry itself, the Academy liked it more than the general public. The Two Popes, Bombshell, Judy, and Harriet were also rewarded more by the Academy than the public thought appropriate. Mostly, I think these were rewards for specific actors (Anthony Hopkins, Charlize Theron, Margot Robbie, Renee Zellweger, and Cynthia Erivo all did terrific jobs), but these movies weren’t otherwise popular successes. JoJo Rabbit and Joker were two movies that received significantly more Oscar recognition than the Public may have thought appropriate. In JoJo Rabbit, the movie’s structure and convoluted humor may not have scored popular support. And Joker may have simply been a bit too violent and polarizing. Star Wars: The Rise of Skywalker was also overrated by the Academy, probably because, as the last of the series, they wanted to finally recognize the franchise, while the public has pretty much gotten tired of it.
On the other hand, there were six movies (in addition to For Sama) that the Academy slighted compared to public response. Pain & Glory received two nominations, but zero Oscars, while the public ranked it ninth. Knives Out, A Beautiful Day in the Neighborhood, The Lighthouse, and Ad Astra received only single nominations while the public loved them. Knives Out, for example, ranked 11th with the Public while it received only a single Writing nomination. Then, of course, there is Avengers: Endgame, the highest-grossing movie of all time, and yet the Academy gave it only a single nomination, for its visual effects.
When 15 of the 38 nominated movies are evaluated differently by the Academy than by the Public, clearly a different set of values is at work. I’m not sure that should be particularly surprising, but it does suggest that you need to distinguish Oscar buzz from popular perception of quality – they are not the same thing.
So, how did my evaluations compare to either of these other scales? Is my opinion more aligned with the public, or the Academy? As might be expected, the results are mixed.
In the Documentary category, I really thought For Sama should have won the Oscar, so I disagreed with the Academy there and aligned more with the public. But I also thought The Edge of Democracy was better than the public thought it was (although not as good as For Sama). I was really disappointed with Les Miserables (in the International/Foreign Feature category) and disagreed with both the public and the Academy on that one. Call me a curmudgeon, if you will, but I also didn’t find the How To Train Your Dragon installment, nor Missing Link (both in the Animated Feature category), really worth the time to watch – unless, of course, you have small children, in which case, go for it!
I have to admit that I liked Joker better than the public did, and I’m not quite sure what that says about me. I’m not particularly fond of violence, but Phoenix’s acting was terrific and the story really, in my opinion, is a sort of metaphor for the rise of Trumpers! I also liked The Two Popes and the Star Wars film better than the public did, maybe because of my age. On the other hand, I didn’t like Pain & Glory as much as the public did – again, maybe because of my age and how close it hits to home.
There were four movies that I thought the Academy underrated while the public was right on. The Lighthouse, Knives Out, A Beautiful Day in the Neighborhood, and Ad Astra all should have received more nominations or even an Oscar.
Once Upon a Time…in Hollywood is a film where I agreed with the public more than the Academy, but then, because it is about Hollywood, we understand they were being self-indulgent. Avengers: Endgame is a case where I liked the movie less than the public did, but thought it deserved more Academy recognition. Bombshell and Judy were the exact opposite: I liked them better than the public did, but thought the Academy over-rewarded them. Finally, while I thought 1917 was a technical masterpiece, I thought it was underwhelming as a story, so I rated it both below the public and below the Academy’s ranking.
Over fourteen months ago, just after the Oscar nominees had been announced and before I saw any of the 38 movies, I ranked them based on the data available to me (audience reaction, critical Metascore, and Oscar nominations). (That analysis was presented back then and is available in the archives.) I compared that ranking with my final ranking, based on my reviews. Of the thirty-eight movies, only ten of them ended up really very different from their initial place in the ranking (five were surprisingly higher, and five were disappointingly lower). The biggest surprises were how much I liked Toy Story 4, The Edge of Democracy, and Joker. The biggest disappointments were 1917 and Once Upon a Time…in Hollywood. Most of the rest of the movies ended up within a few notches of their original ranking. Not sure what the takeaway is from that, but it might just mean that the consolidated opinions of others will very likely match one’s own in most cases. Interesting.
In short, there is no shortage of diversity of opinion. All of which means one should definitely consider multiple sources when evaluating a movie. Or, just watch it yourself and form your own opinion…