Editor’s Note: We discussed the mechanics of building a rating system, and aggregating ratings, in yesterday’s class. This post is in reference to that.
IMDB ratings: Mission critical to life
When I hear of a movie, the first thing I invariably do is pull up its IMDB page and check the rating, as many of us do. These ratings have helped me discover many wonderful movies. Just a couple of weeks back, I heard of a movie called Birdman; the reflex kicked in, I pulled up the IMDB website and saw a respectable rating of 8.1. Now interested, I promptly went to the theater and caught the movie, and today I hear that it won the Oscar for Best Picture. IMDB for the win!
The IMDB dual rating system with a separate formula for top 250
Spurred by today’s class on review systems, let’s take a peek beneath the hood of the IMDB ratings. The formula for calculating the Top Rated 250 Titles is as below:
Weighted rating (WR) = (v ÷ (v+m)) × R + (m ÷ (v+m)) × C where:
R = average for the movie (mean) = (Rating)
v = number of votes for the movie = (votes)
m = minimum votes required to be listed in the Top 250 (currently 25000)
C = the mean vote across the whole report (currently 7.0)
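The formula above is a shrinkage (Bayesian-average) scheme, and it is easy to compute directly. Here is a minimal sketch in Python using the values of m and C quoted above (IMDB updates these over time, so treat them as illustrative):

```python
def weighted_rating(R, v, m=25000, C=7.0):
    """IMDB-style weighted rating for the Top 250.

    R: mean rating for the movie
    v: number of votes for the movie
    m: minimum votes required to be listed in the Top 250
    C: mean vote across the whole report
    """
    return (v / (v + m)) * R + (m / (v + m)) * C

# A movie with few votes is pulled strongly towards C = 7.0:
print(round(weighted_rating(R=9.0, v=5000), 2))       # → 7.33
# A movie with a huge vote count keeps a score close to its own mean:
print(round(weighted_rating(R=9.0, v=2_000_000), 2))  # → 8.98
```

Notice how the second term does the pulling: when v is small relative to m, the weight m ÷ (v+m) dominates and the score sits near 7.0; as v grows, the movie's own mean R takes over.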
The second term above is interesting: it pulls the movie's rating towards the mean vote of 7.0, so a movie with only a handful of enthusiastic votes cannot climb the chart. Further, for rating the Top 250, only votes from “regular voters” are considered.
The formula for any movie (non-250) is just a simple weighted average of all votes, after application of “various filters”. You and I can contribute to this “regular” rating of movies. IMDB does not disclose its exact methods of weighting.
A consequence is that every movie has two ratings: a regular rating and a Top 250 rating. For example, The Shawshank Redemption (#1) has a rating of 9.2 on the Top 250 list and 9.3 on its individual movie page (see below). Some movies have high ratings on their individual pages but do not make it to the Top 250. The implication is that, for choosing the best of the crop, reviews from the accredited folk matter the most.
The “Class”ic issues crop up
IMDB uses mean ratings above, which could suffer from a bias stemming from both the issues discussed in class:
- User behavior: My 7 is different from your 7; for instance, an American's 7 may differ from a German's 7, leading to higher ratings for American movies.
- Cuisine behavior (genre behavior, in movie terms): A drama movie rater (the serious critic) might be much more conservative in his scoring than, say, an animation movie rater (the childlike adult).
The solution is weighted means, as discussed in class. We need to give some “power users” more weightage than others when calculating scores. While IMDB claims “In order to avoid leaving the scheme open to abuse, we do not disclose the exact methods used”, it has provided some sketchy details: the weighting scheme has apparently been developed over 10 years and is tweaked regularly. What they are saying, in essence, is: “Trust us, we know what we’re doing”.
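The weighted-mean idea from class can be sketched in a few lines. The weights below are purely illustrative; IMDB's actual weights are secret:

```python
def weighted_mean(ratings, weights):
    """Weighted mean of ratings, where trusted 'power users'
    carry larger weights. Weights here are hypothetical."""
    assert len(ratings) == len(weights)
    total = sum(r * w for r, w in zip(ratings, weights))
    return total / sum(weights)

# Two casual voters (weight 1) give a 10; one power user (weight 5) gives a 6.
# The result is pulled towards the power user's score, not the plain mean of 8.67:
print(round(weighted_mean([10, 10, 6], [1, 1, 5]), 2))  # → 7.14
```

Under such a scheme, a brigade of casual accounts voting 10 cannot swamp the judgment of a few trusted voters, which is precisely the abuse IMDB says it is guarding against.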
In IMDB we have to blindly trust; Rotten Tomatoes is more transparent, but with a different methodology altogether
IMDB keeps its rating system a closely guarded secret, much as Coca-Cola does its formula, so we movie buffs have no option but to blindly trust IMDB's word, and number. Rotten Tomatoes is another popular movie rating website, and IMDB's closest rival. RT differs from IMDB in two ways: first, it provides two ratings, and second, it uses a different methodology for calculating them.
Rotten Tomatoes provides two movie ratings: a Critic rating and an Audience rating. The Critic rating is called the “Tomatometer” and is the official, trademarked rating of the website. The methodology here was not discussed in class: the Tomatometer is simply the “percentage of professional critic reviews that are positive for a given film or show”. The Audience rating, meanwhile, is the percentage of regular RT users who have rated a movie 3.5 or above on a 5-point scale. This is akin to saying a positive rating counts as one upvote and a negative rating as zero, with the total divided by the number of votes: in effect, a binary like/dislike scale.
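Both RT scores reduce to counting the fraction of “positive” votes. A minimal sketch of the two calculations, assuming the 3.5-of-5 threshold stated above:

```python
def tomatometer(critic_is_positive):
    """Percentage of professional critic reviews that are positive.
    Input: a list of booleans, one per critic review."""
    return 100 * sum(critic_is_positive) / len(critic_is_positive)

def audience_score(user_ratings, threshold=3.5):
    """Percentage of user ratings at or above the threshold
    on a 5-point scale (the binary like/dislike reduction)."""
    positive = sum(1 for r in user_ratings if r >= threshold)
    return 100 * positive / len(user_ratings)

print(tomatometer([True, True, False, True]))   # → 75.0
print(audience_score([5.0, 4.0, 3.0, 2.5]))     # → 50.0
```

Note what the binary reduction throws away: a lukewarm 3.5 and a rapturous 5.0 count identically, which is why RT also publishes the average ratings alongside the percentages.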
Differing philosophies; RT rated better overall for the choice it offers, but IMDB better on comparability
For their flagship scores, RT and IMDB follow different philosophies: the Tomatometer is based wholly on critic scores, while IMDB takes into account the scores of all users. IMDB is therefore a crowdsourced score while the Tomatometer is not. IMDB appears to lose some information through its insistence on a single-score system. On RT, there are movies the average audience clearly loves while the critics hate them, and vice versa (see Lucy below); as a viewer on RT, you can judge whose rating you want to use. On IMDB, you don't know how the weighted average is calculated (it's a secret formula) and so cannot tell whether a movie is a hit with the average cinema-goer or with the “power user” critics. On the other hand, RT offers four numbers per movie (the Tomatometer, the average Tomatometer rating, the Audience score, and the average Audience rating), and comparing two movies across four numbers each is difficult. So if I want to decide whether to watch a movie, I'd prefer RT's scores (Audience and Tomatometer); but if I want to compare the ratings of two movies, I'd prefer IMDB's single number.
Next time you land on the IMDB page of a movie and this post comes to mind, please mentally doff your hat to me. For more such dope, visit https://ashwinwins.blogspot.com. And, enjoy that movie!
IMDB Rating formula: http://www.imdb.com/chart/top
IMDB Rating details: http://www.imdb.com/help/show_leaf?votes
Rotten Tomatoes rating formula: http://www.rottentomatoes.com/about/
Ashwin Ravikumar is a second year PGP student at IIMB, and part of the Spreadsheet Modelling for Business Decision Problems course.