Ouch, that’s what publisher THQ’s probably saying after “middling” Metacritic review scores for Homefront appeared to drag share prices down a staggering 21 percent in trading late yesterday.
Homefront takes place in what THQ calls an “alternate history” universe where, in the near future (circa 2027), North Koreans invade a diminished United States. Cue besieged patriotic bells and uber-arsenal-of-shootin’ whistles.
Metacritic, which aggregates book, music, film, and game reviews, currently shows Homefront’s average score at a “yellow” 72 percent for the Xbox 360 version, while the PS3 version sits slightly higher (and in the “green”) at 76 percent.
According to the L.A. Times, investors saw those scores and panicked, driving THQ’s price down $1.25, from $5.94 to $4.69 (there’s your 21 percent).
The trouble with Metacritic is that it’s never been able to normalize those scores to compensate for editorial disparity: some sites consider a median score of 50 percent perfectly average (average meaning “not good or bad, just somewhere in the middle”), while others, beholden to the academic A-to-F scale, consider a score of 50 flatly failing.
For the former, an average score in the 70s is easily “better than average,” even “reasonably good.” For the latter, it’s like handing the game a C-minus, barely a step above a D.
See the problem here?
If you follow me, you know where I stand on Metacritic (as does site co-founder Marc Doyle, who’s always graciously agreed to disagree with me). To sum up, I think having a place to access review links quickly, without digging through dozens of sites, is a wonderful idea marred by problematic math. I’m talking about the trouble you get into when you have a number system that purports to be representative, but where the people feeding the system all have different ideas about what the numbers mean. If you’re averaging scores on a fixed scale of 1 to 100, but importing from sites that each interpret that scale their own way, you’ve got a problem (it shouldn’t take a mathematician to explain why).
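To make that concrete, here’s a minimal sketch, with hypothetical outlets and made-up numbers (not Metacritic’s actual data or method), of how two reviewers with the same opinion can produce a naive average that misleads both of their readerships:

```python
# A toy illustration, not Metacritic's actual method: two hypothetical
# reviewers hold the same opinion of a game ("a bit better than average")
# but express it on differently calibrated 1-to-100 scales.

reviews = [
    # (outlet, score, what "average" means on that outlet's scale)
    ("SiteA", 58, 50),  # SiteA: 50 is dead average, so 58 is above average
    ("SiteB", 78, 75),  # SiteB: academic-style scale, where ~75 (a C) is average
]

naive_mean = sum(score for _, score, _ in reviews) / len(reviews)
print(f"Naive average: {naive_mean:.1f}")  # 68.0
```

Both outlets said roughly the same thing, yet the naive average of 68 reads as comfortably above average to SiteA’s readers and as borderline failing to SiteB’s. Same opinions, one misleading number.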
Put another way, the ruler may look the same, but each site measures with radically different tools.
I should know. I’ve written for most outlets in this biz, both online and print. I’ve had to grapple with dozens of editorial takes on scales ranging from stars to letters, simple numbers to catchy phrases. Some of those outlets even publish guides that explain what their scores mean, and each guide confirms the differences, some slight, some significant, from the others.
Blame the games industry if you like, but if Metacritic really wanted to play fair, it would display links to reviews with unaltered scores and leave off the not-really-average top number. I realize some of you, perhaps even a majority of you, live and die by that number. But I’d also like to think you can see how mad it is: a flat scoring system that review outlets have never rallied around, producing dodgy “averages” capable of toppling a publisher’s stock price. Maybe Homefront’s true average is higher. Maybe it’s lower. Since the scores aren’t normalized, we’ll never know.
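For what it’s worth, normalizing isn’t exotic math. Here’s one way it could work, purely my assumption, not anything Metacritic has proposed: remap every outlet’s score onto a common scale where 50 always means average, using each outlet’s own calibration.

```python
# A sketch of one possible normalization (an assumption on my part, not
# anything Metacritic actually does): linearly remap each outlet's score
# onto a common scale where 50 always means "average" and 100 "perfect."

def normalize(score: float, outlet_avg: float,
              lo: float = 0.0, hi: float = 100.0) -> float:
    """Map a score to a common scale: outlet_avg -> 50, hi -> 100, lo -> 0."""
    if score >= outlet_avg:
        return 50 + 50 * (score - outlet_avg) / (hi - outlet_avg)
    return 50 * (score - lo) / (outlet_avg - lo)

# The same hypothetical reviews as above: identical opinions,
# differently calibrated scales.
print(normalize(58, outlet_avg=50))  # 58.0 -- SiteA's "a bit above average"
print(normalize(78, outlet_avg=75))  # 56.0 -- SiteB's "a bit above average"
```

After remapping, the two scores land within a couple of points of each other, which is exactly what you’d expect from two reviewers who felt the same way about the game.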
Incidentally, investors don’t care if Metacritic’s top score is representative or not. All they’re thinking about, when they’re reacting as they did yesterday, is how the public’s going to respond when it sees that top-level, squared-off, color-coded number.