This post was originally published on 2/7/2011.
I'm a nerd, an economist, and a movie snob. Sometimes this makes me hard to deal with.
My brother has a natural fear of picking movies with me. During our holiday visits home we invariably try to watch a movie, and it ends with eyes being rolled in my direction. My family has taken to calling any movie I pick a "depressing indie drama."
I don't think of this as being difficult; I think of it as getting the most out of my time. Having seen thousands of great movies, I have trouble committing two hours to a movie of dubious quality. In an effort to avoid wasting time on bad and mediocre flicks, I am on a quest to better predict how much I will enjoy a given movie. I've rated more than 700 movies on Netflix, I visit IMDB about 25 times a week, and I've tried Flixster, Rotten Tomatoes, and the blogs of well-known critics. The goal is to accurately correlate my movie-watching happiness with the ratings provided by these sources. So far the results are disappointing. No single source accurately predicts my preferences, and even comparing the sources against one another and building composite indexes frequently leads to contradictory predictions. To date, the best predictor I've found is a film's IMDB rating, but this number is far from perfect.
IMDB ratings are worst when movies are newly released. For a film like Citizen Kane, the IMDB score is accurate, and no wonder: enough people have seen it to decide how good it is. In fact, Orson Welles' masterpiece has 145,319 ratings on IMDB, a score of 8.6/10, and is listed by the American Film Institute as the best movie ever made. [1] Citizen Kane is pretty similar to other critically acclaimed films on IMDB. Among the top ten films, the average number of IMDB votes is 152,073 and the median score is 8.45. With so many ratings, my guess is that these movies are more accurately rated than a movie with 1% as many reviews that was released last week.
Take Inception, for example. When it was released, it had a rating of 9.3 on IMDB and thousands of reviews. But how could this be? Was Inception actually a better movie than Citizen Kane, Casablanca, The Godfather, Gone with the Wind, Lawrence of Arabia, The Wizard of Oz, The Graduate, On the Waterfront, Schindler's List, and Singin' in the Rain? Having seen all of these films, I had a hard time believing it.
So, I hypothesized that IMDB ratings were biased upwards for young movies. When new movies come out, the first people to see them are early adopters and critics. Someone uninterested in a new film may see it eventually [2], but they are unlikely to see it the first day it comes to their local theater. My contention was that this self-selection among opening-week audiences, combined with marketing and release hype, would select for and reinforce overly positive early ratings.
To test this theory, I spent three months sampling a randomly selected group of 21 new releases. I started sampling on November 9th by finding IMDB's list of upcoming movies and recording the first data point for all of them [3]. I then checked the ratings once a week to see if my prediction about pre-release hype held up to a little empirical rigor. My sample was surprisingly diverse: among the movies I tracked were big-budget Hollywood films like Tron: Legacy as well as indie films like Rare Exports [4]. Because some of the films weren't slated for release until later in December, and some had early screenings whose audiences rated the movies before the wide release, I didn't have an equal number of data points for each film. Almost every film did reach a score equilibrium, meaning its rating remained stable for at least three sampling periods (three weeks). Here is a time series for the films; I've omitted the titles since they would clutter the graph too much:
The graph is a bit confusing, and there isn't a clear trend, so I turned to the numbers. With a little statistical crunching I found that the average movement in rating was -0.2125, significant at the 95% confidence level. In other words, new movies do have inflated IMDB ratings: on average, those ratings sit about 0.2 points above where they will eventually settle.
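For anyone who wants to check my arithmetic, the calculation is simple enough to sketch in a few lines of Python. The films and ratings below are placeholders rather than my actual samples; the idea is just to take each film's first recorded rating and its settled rating, compute the movement, and run a one-sample t-test against zero.

    # Rough sketch with placeholder data, not my actual samples.
    from scipy import stats

    # first recorded IMDB rating -> settled ("equilibrium") rating for each film
    samples = {
        "Film A": (8.6, 7.0),
        "Film B": (7.9, 6.3),
        "Film C": (6.8, 6.8),
        "Film D": (7.4, 7.2),
        "Film E": (8.1, 7.9),
    }

    # movement = settled rating minus first rating (negative means the score fell)
    movements = [settled - first for first, settled in samples.values()]
    mean_movement = sum(movements) / len(movements)

    # one-sample t-test of the movements against zero
    t_stat, p_value = stats.ttest_1samp(movements, 0.0)

    print(f"average movement: {mean_movement:.4f}")
    print(f"t = {t_stat:.2f}, p = {p_value:.3f}")  # p < 0.05 -> significant at 95% confidence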
The greatest volatility in ratings came in the first two sample periods, which is to be expected. The Tempest and Casino Jack were the biggest losers, shedding 1.6 points over the three-month period. Several films appear to have been correctly assessed from the get-go, with no rating change after 12 weeks: I Love You Phillip Morris, The Tourist, The Fighter, Little Fockers, and a French film by the name of The Illusionist. The rest suffered small declines in score, consistent with my theory.
The takeaway here is that if you are asked to watch a new release, assume the IMDB rating is overly optimistic by about a fifth of a point, then go anyway and have a good time.
[1] Even a film snob like me must admit that it is ridiculous to make such a claim, but it sure sounds definitive.
[2] I suspect the biggest reason that uninterested people see films is social pressure.
[3] The equivalent page for this week would be here.
[4] I tracked all of the following films: Black Swan, I Love You Phillip Morris, Rare Exports, The Warrior's Way, The Tourist, The Tempest, The Chronicles of Narnia: The Voyage of the Dawn Treader, The Company Men, The Fighter, Tron: Legacy, Yogi Bear, How Do You Know, All Good Things, Rabbit Hole, Casino Jack, Little Fockers, True Grit, Somewhere, The Illusionist, Gulliver's Travels, and Country Strong.