We deal with criticism

Some of my gamer pals monitor Metacritic scores like investors tracking a stock ticker. Last October, after writing a positive piece about Demon's Souls, I received a Twitter DM from a reader concerned that the game had "dipped below 90" on Metacritic, soon followed by an alert that the score had risen again to 90 and would likely "remain stable" there.

He was relieved because a 90+ score from Metacritic means "universal acclaim," whereas an 89 translates as "generally favorable reviews." As a player who admired Demon's Souls, he clearly felt invested in its score.

I've long dismissed Metacritic as a distorted metric, and I actively discourage my students from allowing it to influence their choices of games to play. If you're looking for guidance, I tell them, identify a few reviewers or critics whose sensibilities seem to align with your own and carefully read their responses to the games they play. Don't get handcuffed by a number.

Consider, too, that none of us is a jack of all trades. If you're looking for a thoughtful review of a Splinter Cell game, maybe Simon Parkin at Eurogamer is your man. If it's an indie game, maybe Rock, Paper, Shotgun has it covered. For an RPG like Dragon Age, why not let a staff of writers cover it from multiple angles, like they did at The Border House? If it's a sports game, the Operation Sports staff is probably all over it, and the readers there will happily chime in with their opinions too.

It's easy for me to casually discredit Metacritic as a flawed or overrated system, but there's no denying its impact on both the consumer and production sides of the industry. Despite my admonitions, most of my cash-poor students rely on it religiously. When new games cost $60 a pop, Metacritic functions as a vital resource allocation tool for them. On the other side of the ledger, I've been told by several developers that publishers keep a close eye on Metacritic scores, with some incentivizing scores of 80+ with bonuses. A disappointingly low Metacritic score, as one developer put it, "means people lose their jobs."

So what exactly is wrong with Metacritic? Other than harboring a general sense that it's flawed, I've never taken the time to examine why I believe that's so. Would a closer look at Metacritic's system dispel that impression? I decided to examine how Metacritic functions as a review aggregator, focusing on the aspects of its system that seem problematic to me. In a nutshell, here's what I found.

  • Metacritic's method for meta-scoring video games differs substantially from the one it employs for movies, books, and music. Games must score 90 or above to receive a "universal acclaim" rating, but all other media Metacritic aggregates receive that designation at 81 or above. Metacritic claims this is meant to account for the perception that a 3-out-of-5-star movie is considered worth seeing, while a comparable 60 score for a game suggests that game is barely playable.

    Clearly, Metacritic is attempting to account for perceptions, as well as raw numbers, and this leads to some shaky methodology. If we're guided by perceptions of what numbers mean, should we also account for the fact that an Edge Magazine 9 bears almost no relationship to a Play.tm 9?
  • Metacritic's conversion of all scores to a 100-point system is problematic at best. Here's how they explain it:

When you tell a computer to compute the average of B+, 45, 5, and *****, it just looks at you funny and gives an error message. When you tell a computer to compute the average of 83, 45, 50, and 10, it is much, much happier. Thus, in order to make our computers happy (and calculate the METASCORES), we must convert all critics' scores to a 0-100 scale.

The odd thing here is that sometimes Metacritic cares about perceptions and other times it doesn't. Converting a B+ to an 83 simply doesn't capture what that grade means. I assign grades to students for a living, and I can tell you that not a single one of them would trade a B+ for an 83. Metacritic's scale may keep its computers happy, but it does a poor job of translating the intended value of a letter-grade review from a site like the Onion A/V Club (the sketch after this list shows just how lossy that kind of conversion is).

  • Speaking of perceptions, Metacritic believes it can accurately read the minds of critics who don't assign review scores. Here's how they explain extracting a number from a film review by New York Times critic Manohla Dargis:

...our staff must assign a numeric score, from 0-100, to each review that is not already scored by the critic. Naturally, there is some discretion involved here, and there will be times when you disagree with the score we assigned. However, our staffers have read a lot of reviews--and we mean a lot--and thus through experience are able to maintain consistency both from film to film and from reviewer to reviewer. When you read over 200 reviews from Manohla Dargis, you begin to develop a decent idea about when she's indicating a 90 and when she's indicating an 80.

Really? You can honestly do this? What methodology are you applying here? Do you count the number of positive adjectives she uses? I must say this one leaves me speechless.

  • Finally, Metacritic casts its net both too widely and not widely enough when it aggregates game review sources. Its calculations draw on 149 game review venues, three times the number it includes for movies. Many of these sites seem questionable to me, with far too many poorly written or even ill-conceived reviews included in the mix. On the other side of the coin, many fine critics and reviewers from distinguished and widely-read blogs are left out, presumably because they don't work for 'game review sites' or assign numbers.

Maybe that's for the best if such writers are to be given the Manohla Dargis treatment. If the function of Metacritic is to dispatch bots, gather data, and crunch the numbers for public consumption, then so be it. But if Metacritic aims to accurately reflect the critical response to games (which is what it claims), it needs a system for doing so that doesn't overvalue me-too enthusiast sites and undervalue the thoughtful evaluative writing about games that many of us read and produce on a regular basis.
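To make the arithmetic concrete, here's a rough Python sketch of a normalize-and-average pipeline like the one described in the list above. It is not Metacritic's actual system: the letter-grade and star mappings are assumptions I've made for illustration (only the B+-to-83 figure comes from Metacritic's own example), the average is unweighted even though Metacritic reportedly weights some outlets more heavily, and the labels use only the 90/81 "universal acclaim" cutoffs discussed earlier.

    # Illustrative sketch only -- not Metacritic's actual algorithm.
    # The letter-grade and star mappings below are assumptions for the
    # sake of the example; only the B+ -> 83 figure and the 90/81
    # "universal acclaim" thresholds come from the discussion above.

    LETTER_TO_100 = {
        "A": 91, "A-": 87, "B+": 83, "B": 79, "B-": 75,
        "C+": 67, "C": 58, "C-": 50, "D": 33, "F": 10,
    }

    def to_100(raw):
        """Convert a review score in some outlet's format to a 0-100 scale."""
        if isinstance(raw, str) and raw in LETTER_TO_100:    # letter grade, e.g. "B+"
            return LETTER_TO_100[raw]
        if isinstance(raw, str) and raw.endswith("stars"):   # star rating, e.g. "3/5 stars"
            num, denom = raw.split()[0].split("/")
            return round(100 * int(num) / int(denom))
        if isinstance(raw, tuple):                            # fraction, e.g. (9, 10) for 9/10
            score, scale = raw
            return round(100 * score / scale)
        return raw                                            # already on a 0-100 scale

    def metascore(raw_scores):
        """Unweighted average of converted scores (real aggregators may weight outlets)."""
        converted = [to_100(s) for s in raw_scores]
        return round(sum(converted) / len(converted))

    def label(score, medium="game"):
        """'Universal acclaim' starts at 90 for games, 81 for other media."""
        threshold = 90 if medium == "game" else 81
        return "universal acclaim" if score >= threshold else "generally favorable"

    if __name__ == "__main__":
        reviews = ["B+", (9, 10), "3/5 stars", 88]   # mixed formats from different outlets
        score = metascore(reviews)
        print(score, label(score))                   # 80 -> "generally favorable"

Even in this toy version, a B+ review and a 3-out-of-5-star review, both of which their authors probably meant as recommendations, pull the aggregate down to "generally favorable" territory. That flattening of critical intent is exactly the problem described above.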

I'm not suggesting Metacritic has no value or useful role to play. I'm simply suggesting that if we rely on it as a tool to guide our game-playing choices, we should know how that tool is built, and we should understand its limitations.
