Halo 3, a 9.5? No way
April 6, 2008 6:14 AM

Are video game reviews legitimate? How does the "official" video game review system really work?

When I'm in the mood for a new video game, I usually start at a game review aggregator like gamerankings.com. Then I sort the games released in the last several months by average rating and work my way down the list until something interesting stands out. It's not a very inspiring approach, but it seems to have worked well enough for the amount of time I've put toward it.

However, I've often noticed that the individual review sites' scores are very close to one another. And when I say close, I mean exactly the same, or within 1 or 2 points on a 100-point scale. Further, the score similarities seem to be present on the very best games as well as the worst ones. This is much different from my experience with movie reviews, where individual professional reviews may vary wildly from one another.

So, my educated guess is that one of two things is happening: 1) the bigger game review sites are collaborating on the reviews, either amongst themselves or with the game production companies; or 2) the game review sites are all using similar quantitative, objective criteria to come up with their ratings. I'm sure a little of both happens, but I'm probably missing something obvious as well.

So, for those of you who work in the game industry (or know how it works), how are games actually reviewed? What is the process for a review, from the time a game is announced to the time it's published in one of the more major sites or magazines?
posted by brandnew to Media & Arts (20 answers total) 9 users marked this as a favorite
I like Metacritic. It, too, is a review aggregator, but the difference is that they grade on a curve, so you don't have the issue of everything being assigned the same number of points. I also remember reading a story about a major game company (Electronic Arts, I believe) going into a bit of a panic because the average Metacritic score of their games was slipping. THAT'S HOW AWESOME IT IS.
posted by 1 at 6:34 AM on April 6, 2008


I'm pretty confident neither of your guesses is right. No collaboration, and certainly no "objective criteria" (aside, perhaps, from bug reports). There's no official system; each site has its own.

You see so many similarities in reviews because of the famous 6-to-9 scale of video game reviewing. That is, on a 10-point scale, terrible games get a 6 and great games get a 9. Combine that limited scale with a general tendency among game reviewers to be susceptible to prerelease hype, and you get a lot of similar scores.

Metacritic/gamerankings are great and all, but I'd suggest taking some time and finding a few critics/sites you generally agree with and reading them on new games. England's EDGE is pretty much the only game review publication (or web site, but most of Edge's material isn't online) I could honestly recommend.
posted by malphigian at 7:07 AM on April 6, 2008


Another cause of the tight grouping of game scores is that reviewers don't actually use the full scoring scale. It's been observed often that while sites may grade games on a putative scale of (say) 1 to 10, in reality scores only go as low as 6, for example. Couple that with the tendency for hyperbole in game reviewing and the short time frame to play and review a game, and you end up with terrible games getting a 70% score and good (but not great) games getting 95%.

Additionally, it's clear that a number of review sites are willing to be swayed by game company pressure, perhaps some gaming swag and the threat of not getting any more review copies. See the pulled reviews for Neverwinter Nights 2 and Kane & Lynch.
posted by outlier at 7:11 AM on April 6, 2008


This post may interest you. It's apparently a widespread problem: video game publishers are the ones buying ad space in the mags that review their games.

Some other thoughts-

- VG reviewers are often fanboys rather than trained journalists. Their starting point for a review tends to be positive rather than neutral.

- There are more "objective" criteria to look at when reviewing a VG as opposed to a movie: movies don't crash, they don't (usually) vary in video quality, and they don't show up in theaters missing key scenes.

- Very few VG's actually suck, because they can keep changing/fixing them. If a director shoots a crappy film, or if your lead actor delivers a bad performance, there's not much you can do after the fact.
posted by mkultra at 7:15 AM on April 6, 2008


There's definitely some game company meddling in review scores in major magazines and, presumably, websites.

I've heard a case where a junior writer was told to score a notoriously bad sequel to a big franchise game high because the publishers had paid them off somehow.

I would do as you do and make sure the scores are at least somewhat consistent. Check smaller websites for untainted reviews too. Or buy whatever strikes a chord with you, and sell it if you don't like it.
posted by iamcrispy at 7:30 AM on April 6, 2008


I was going to say that they're basically about as trustworthy as movie reviews. But as others have pointed out, game reviews may be more susceptible to hype and advertising than movie reviews. So there's that angle. But still, my point is this: games (and movies, clothes, books, products, everything) are reviewed by individuals or small groups, and the taste of one individual or small group can differ drastically from that of another. Which is why I also like sites like Metacritic and Rotten Tomatoes: you can get a feel for what the general consensus is. Also, you can read some of the good reviews and some of the bad ones. Chances are, the good ones will focus on the same few points, which in the minds of the individuals who wrote them are very important, while the bad reviews will focus on a few different points, which are important to those individuals. Then you can decide if you're in the camp that doesn't really care too much, say, how strong the storyline is as long as the gameplay is good, or if your whole experience is ruined if you can't be immersed in the story and world of the game.
posted by gauchodaspampas at 7:44 AM on April 6, 2008


Mkultra is correct. Advertising revenue affects reviews greatly. Look to non-commercial blogs (blogs with no advertising) for opinions that matter.
posted by Murray M at 7:46 AM on April 6, 2008


It's been like this a long time. The magazine I reviewed Populous for (what, 18 years ago?) had to "re-review" it when my first published review gave it an 8. The editor's re-review gave it a 10. I was told that the marketing department got a call from EA threatening to pull all of their ads from the publisher. Good times.
posted by scruss at 8:21 AM on April 6, 2008


There's definite corruption in the professional game review business; see, for example, Gerstmann's firing at Gamespot. It's not just advertising; the big game review outlets also get art packages, early access to gameplay, interviews, etc. In return they're expected to deliver positive coverage. For well-marketed games you'll often see 3 to 5 articles on a site like IGN before the review even runs. Once IGN has hyped a game that much before release, it's hard for them to turn in a negative review.

There's also a self-reinforcing cycle around which games are going to be great. Halo 3 was guaranteed a 9+ before anyone had even played it. At least that was well deserved; other games like Assassin's Creed or Mass Effect that also got lots of pre-release excitement received high reviews, yet in actuality weren't that well loved.

When I'm looking for review guidance I usually use Metacritic. I also trust Destructoid as a source for less biased reviews. Finally, I've found that the best games are usually the ones that have high scores but got no pre-release hype. Good surprises like Portal, Puzzle Quest, Homeworld.
posted by Nelson at 8:22 AM on April 6, 2008


GameSpot lost a lot of its writers after advertising pressure went too far and got a reviewer fired (fine, allegedly). Kotaku's starting to do reviews without scores, forcing you to actually read the thing.

There was a lot of hubbub when someone scored Twilight Princess an 89 (I believe that was the game) and was just eviscerated for being "so far" off of the other reviews. I think there's a huge incentive not to stray too far from what others are saying, and I think self-preservation and not wanting to rock the boat are much more influential forces on review scores than any overt advertising pressure.

That said, I do hope that newspapers, not reliant on advertising, will start to review games and do so honestly. I'd be happy if they kept their four star rating scale, where we've seen them do 1.5 star ratings pretty regularly on mediocre movies. I mean, they review plays, and who the fuck goes to plays anymore?
posted by sachinag at 8:23 AM on April 6, 2008


I'm in the game industry, although as a creator, not as a reviewer. One thing that makes game reviews seem more tightly clustered than music and movie reviews is that the curve is different. A movie review of 3 stars out of 4 means that it is a good movie. A game review of 7.5 out of 10 means that it is a disappointment. So good games are going to mostly have review scores between 8 and 9.5.

The commenter "1" is correct about how seriously publishers take metacritic scores; there is, unsurprisingly, a strong correlation between metacritic score and sales.
posted by dfan at 9:54 AM on April 6, 2008


One of the reasons that video-game critics come closer to consensus than critics of other kinds of media, I think, is that they're torn between the way that one might criticize a functional object (e.g., 'there's some slowdown during graphically-busy moments,' or 'the online scoreboards are broken') and the way that one might criticize a piece of art (e.g., 'the story is implausible even by the standards of JRPGs' or 'that damn talking bug never shuts up'). With the first kind of criticism, it's not surprising that scores are usually very close, because this kind of stuff is more or less objective.
posted by box at 12:33 PM on April 6, 2008


One place I like to check when I want to know about a game is GameFAQs. For any game in the database, they have user-submitted reviews, and while they're not especially well-written, if you read a bunch of them, you can get a sense of what different people liked or didn't like about a game.
posted by phaded at 12:44 PM on April 6, 2008


When you keep in mind just how small the grading scale actually is (6.5-9.5 for all practical purposes), and combine that with a (deliberately) generic model ("game play", "visualizations", "storyline", etc.), it should be no surprise why game ratings would come so close to one another.

You could probably rate a game based on hype alone and get within a half a point without ever having played the game. Thought experiment: there's a new game coming out, and the company that made it is touting their latest killer physics engine in all the press releases. A few months down the line, some trailers get "leaked" that show the graphics look pretty good. The game is a first-person shoot-em-up. That's all you get to go on, now--what's the score?

Well, automatically it can't be a 10 because it's a first-person shooter. That's at least a half-point--maybe a full point off for the lack-of-novelty factor, alone. A new physics engine that they're confident-enough in to "leak" trailers of? That means a solid game-play score, no matter how dumb the A.I. or hackneyed the storyline. Give it a 9.2 to be on the safe side.

See how easy that was? But if you saw how many points you really had to choose from, you'd quickly realize there's not a lot of opportunity for large-scale discrepancies.

Kind of like a Soviet election.
posted by Civil_Disobedient at 1:11 PM on April 6, 2008


This has been going on since Amiga Power magazine, which came under immense pressure to stop giving games accurate scores. Someone upthread mentioned Edge -- they started from the same stable as Amiga Power, and take a huge pride in unbiased reviews.
posted by bonaldi at 3:01 PM on April 6, 2008


I get the free Game Informer magazine from EBGames, which you'd think would give every game the thumbs-up, but they frequently give scores of less than 4 to games with multi-page ads right in the same issue.
posted by Stylus Happenstance at 5:09 PM on April 6, 2008


There's another way in which scores can be skewed which may not have been mentioned yet: frequently the review for a particular title is handed off to a reviewer who's already known to have some amount of interest in it. For instance, an editor generally won't assign the review of the latest edition of Madden to someone who hasn't played last year's. So what you often get is racing game fans reviewing racing games, RPG fans reviewing RPGs, etc., whereas Ebert reviews pretty much anything. I don't know what effect this has on the scores, but I'd suspect it has some.
posted by juv3nal at 5:43 PM on April 6, 2008


I agree with juv3nal. You can actually see the same thing happening in niche movie magazines. A movie site/magazine specializing in, say, horror movies is more likely to give a good review to a mediocre horror flick than a mainstream outlet is.
posted by SageLeVoid at 9:24 PM on April 6, 2008


I don't work in the industry, but this sounds like a weekly topic I read at the video game culture magazine The Escapist on so-called video games journalism; specifically, the one that sticks out is Russ Pitts' To Hear Ourselves Review.

Pitts quotes Warren Spector, known as the father of System Shock and Deus Ex, who sums it up perfectly: "A review serves one purpose and one purpose only: to give readers data they need to make a buy/no buy decision. End of story. To do that, the reviewer has to have a consistent editorial stance. It doesn't matter if you agree with a reviewer on a particular game or movie or book or record as long as they're consistent enough that you can determine from reading the review whether you would like the game, movie, book or record yourself. Reviewers and readers have to develop an ongoing relationship of sorts. I don't see that happening much in the world of game reviews."

This is why I visit IGN for all reviews: not because the writing is great (it usually isn't), but because I can tell, within their quite consistent metric, whether a game is worth my time or not.
posted by BenzeneChile at 8:01 AM on April 7, 2008


I consider Netjak a more trustworthy reviewer than any major games site. They won't always have a review out for the game you're interested in, 'cause it's like 5 guys with no special connection to the industry. But that makes them more honest.
posted by breath at 1:17 PM on April 7, 2008

