
Jason Schreier: "Starfield's 83 Metacritic score was only possible because of Xbox outlets"

I mean, would you say it could tilt the overall score by 5 points? I'd say there is a big difference in terms of perception between 83 and 78
I'd say that anyone who cares enough about 5 points is more interested in trolling than anything else. It feels like people can't stand the fact that an Xbox game didn't score less than 80. If it had, they'd have more material to whack off to on this forum.
 

ckaneo

Member
Yeah, I'm not sure this line of thinking is fair. While it is true Xbox fan sites inflate the scores, I've never seen anyone say this about Sony or Nintendo sites, which do the exact same thing. Remember the Pokémon reviewer from IGN who then went on to literally work for Nintendo?

It's okay to just say the reception was mixed
 

bender

What time is it?
I'd say that anyone who cares enough about 5 points is more interested in trolling than anything else. It feels like people can't stand the fact that an Xbox game didn't score less than 80. If it had, they'd have more material to whack off to on this forum.

Depends on where those five points fall. The perception of an 8 and a 7 is very different, and we've seen bonuses tied to Metacritic scores (Fallout: New Vegas), so it's more than just forum fodder.

We're at the point where forums consider 83 a failure. Metacritic is meaningless.

When review sites largely use a 7-10 scale, it is practically meaningless.
 

MarkMe2525

Gold Member
It's easy enough to show your "flip side" argument if it exists. We aren't talking about user reviews here; the number of professional reviews (those factored into a Metacritic score) is finite. Go read the bottom end of the spectrum and point out the obvious "flip side". If you want to get nerdy about it, do the same for the reviews that Jason questioned, throw out all the obvious examples of pro-bias and anti-bias, and recalculate the score.

For what it is worth (see: absolutely nothing), my gut tells me it would be much easier to maintain professional status (again, the kind factored into Metacritic) with a pro-bias than it would with an anti-bias. Access is important, and publishers are quick to pull that access over the smallest of slights (see: Sony and The Escapist/Zero Punctuation).
Again, I disagree with the premise. I don't see a feasible way to reliably and accurately categorize a review as objectively positively or negatively biased. As soon as one attempts to do this, we are applying our own bias to the categorizing itself.

The only way to "prove" anything about bias (overall) is to be inside the individual's head, and I think we can both agree that is unreasonable. Maybe one could assess an individual's bias by meticulously reading every review ever submitted by each person who reviewed the game, but that doesn't seem particularly productive and is also open to the reader injecting their own bias into the process.

Minus objective proof, we are left with assumptions and speculation. You are free to believe what you like, as I'm not attempting to make any definitive statements about the overall bias of reviewers other than that bias exists in both directions.

My point is that a certain number of reviewers inflated the score and a certain number deflated it, and I don't think anyone can make a legitimate claim to know the ratio between the two. In the absence of any definitive answer, "I believe" it would be prudent to just accept the score for what it is: an 83.

I also stand by my claim that, in my subjective opinion, Starfield is easily an 83.
 

diffusionx

Gold Member
We're at the point where forums consider 83 a failure. Metacritic is meaningless.

I really don't care about a game's Metacritic. I would give Ghost of Tsushima a 10/10 personally. But the companies definitely DO care about MC, a LOT, and they care because they have found correlations between the scoring and the sales. So companies do try to get out in front of it and manage it. Bethesda is actually one of the more notorious companies in this regard and has been for a long time. For them, the difference between 78 and 83 is quite large.
 

ReBurn

Gold Member
83 for a video game is a success.

83 for a game backed by the platform owner who purchased the studio for 7 billion dollars, marketed to the max, virtually given away for free via gamepass, and literally the last hope of the platform after a string of weak games is a failure.
I don't think the additional context is really all that important when it comes to the actual quality of the game. All of the extraneous stuff is what the warriors use to load their console war cannons. Your "last hope" and "failure" quips are literally hyperbole.
 

Three

Member
Again, I disagree with the premise. I don't see a feasible way to reliably and accurately categorize a review as objectively positively or negatively biased. As soon as one attempts to do this, we are applying our own bias to the categorizing itself.

The only way to "prove" anything about bias (overall) is to be inside the individual's head, and I think we can both agree that is unreasonable. Maybe one could assess an individual's bias by meticulously reading every review ever submitted by each person who reviewed the game, but that doesn't seem particularly productive and is also open to the reader injecting their own bias into the process.

Minus objective proof, we are left with assumptions and speculation. You are free to believe what you like, as I'm not attempting to make any definitive statements about the overall bias of reviewers other than that bias exists in both directions.

My point is that a certain number of reviewers inflated the score and a certain number deflated it, and I don't think anyone can make a legitimate claim to know the ratio between the two. In the absence of any definitive answer, "I believe" it would be prudent to just accept the score for what it is: an 83.

I also stand by my claim that, in my subjective opinion, Starfield is easily an 83.
You can easily show bias. In fact, it's a statistics term, not something subjective. It means the sample surveyed doesn't provide an accurate representation of the population. There are ways you can do this, as bender has pointed out. For one, you can show how much some outlets deviate from the general population; with enough data you can even show a correlation explaining why that bias exists.
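Concretely, that kind of check is easy to run. A minimal sketch in Python (the outlet names and scores are made-up placeholders, not actual Starfield review data):

Code:
# Minimal sketch of the outlet-deviation check described above.
# Outlet names and scores are placeholders, not real review data.
scores = {
    "Outlet A": 95,
    "Outlet B": 90,
    "Outlet C": 85,
    "Outlet D": 80,
    "Outlet E": 70,
}

overall_mean = sum(scores.values()) / len(scores)

# An outlet sitting far above this mean, game after game, is the kind of
# sampling bias being described.
for outlet, score in scores.items():
    print(f"{outlet}: {score - overall_mean:+.1f} points vs. mean of {overall_mean:.1f}")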
 

Topher

Identifies as young
Again, I disagree with the premise. I don't see a feasible way to reliably and accurately categorize a review as objectively positively or negatively biased. As soon as one attempts to do this, we are applying our own bias to the categorizing itself.

The only way to "prove" anything about bias (overall) is to be inside the individual's head, and I think we can both agree that is unreasonable. Maybe one could assess an individual's bias by meticulously reading every review ever submitted by each person who reviewed the game, but that doesn't seem particularly productive and is also open to the reader injecting their own bias into the process.

Minus objective proof, we are left with assumptions and speculation. You are free to believe what you like, as I'm not attempting to make any definitive statements about the overall bias of reviewers other than that bias exists in both directions.

My point is that a certain number of reviewers inflated the score and a certain number deflated it, and I don't think anyone can make a legitimate claim to know the ratio between the two. In the absence of any definitive answer, "I believe" it would be prudent to just accept the score for what it is: an 83.

I also stand by my claim that, in my subjective opinion, Starfield is easily an 83.

The only "proof" is the fact that these sites in question have affiliated themselves with the Xbox brand. This is hardly the first time people have looked sideways at the opinion expressed by such as a site with suspicion. It has been pointed out many times that the lofty review scores of certain PlayStation games have been helped along by review sites which are named directly after the PlayStation brand. I see nothing wrong with pointing that out with the acknowledgement that this is typical when it comes to first party games. I made that point several times in the Starfield review thread.

I don't disagree with you that a score of 83 for Starfield is certainly legit. But would I personally score it in the 9s/10s? Probably not. The fact that so many of those scores are from Xbox affiliated sites is the reason so many are looking sideways at them. But, in their defense, they are not alone. There are several non-Xbox gaming sites who scored the game that high as well. In fact, I see quite a bit more from non-Xbox sites in the high 9 and 10s than Xbox sites. So while I still think you can point to self-affiliation with a brand as suspect, in the case of Starfield I do not think it was the defining factor in the reviews. As poppabk poppabk pointed out, the PC reviews of the game have no Xbox site reviews and it actually scored higher.

For my own personal perspective, I usually dismiss review scores that are extreme outliers. The Jimquisition's review of 4/10, for example. That's absurd, imo, and far off the other. If a metacritic is 8 then I think 2 point swing either way is perfectly feasible.
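That filtering is simple to formalize if anyone wants to apply it to real data. A rough sketch in Python (placeholder scores on a 10-point scale; the 2-point window mirrors the post above):

Code:
# Sketch of the "dismiss extreme outliers" approach described above.
# Scores are placeholders; the 2-point window mirrors the post.
scores = [9.0, 8.5, 8.0, 8.0, 7.5, 4.0]   # 4.0 plays the role of the absurd outlier

mean = sum(scores) / len(scores)
kept = [s for s in scores if abs(s - mean) <= 2.0]

print(f"Mean with outliers:    {mean:.2f}")                   # 7.50
print(f"Mean without outliers: {sum(kept) / len(kept):.2f}")  # 8.20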
 

StueyDuck

Member
Game reviews are pointless...

Starfield being a great example. Glowing reviews all around at launch meanwhile it was barely playable on PC without mod support (PC being the place to play a Bethesda game)

There are so, so, so many more examples of why games media are redundant and pointless that I'm surprised anyone takes things like a Metacritic score seriously.
 

jm89

Member
Jason Schreier says that Starfield only got a high score on Metacritic thanks to Xbox fan sites. Not to mention that the game wasn't released for PlayStation.
it-is-known-yes.gif
 

NikuNashi

Member
I'm so torn on this.

On the one hand, I can't agree with him because I hate the cretin.
On the other hand, he's right, and I hate that Starfield's scores were pumped up by shills.


It's a genuine catch-22.
 

LordNerevar76

Neo Member
I don't think we want to start the game of stripping fan site reviews from "their" games or you'd see similar drops in Nintendo and Sony titles...
 

bender

What time is it?
My point is that a certain number of reviewers inflated the score and a certain number deflated it, and I don't think anyone can make a legitimate claim to know the ratio between the two. In the absence of any definitive answer, "I believe" it would be prudent to just accept the score for what it is: an 83.

I also stand by my claim that, in my subjective opinion, Starfield is easily an 83.

As my math teachers would say, show your work. Jason is working under the assumption that Xbox-branded sites are skewing the average. If your counter is that there are just as many bad actors working to lower the score, show some examples.
If it were me, I'd take my personal score and ask myself how far away from that score, in either direction, would make me raise my eyebrow. I'd then read those reviews and look for exaggeration, omissions, or scores that don't live in harmony with the review text.

I don't really have a dog in this hunt. From the little I played, I'd call Starfield a 6/10, but I have some built-in biases: I don't really care for space settings in games, and Bethesda has been given a pass for outdated and broken mechanics in their games for a really long time. On that second point, I see the "backlash" Starfield has received as the rest of the gaming world catching up to the fact that Bethesda has been skating by since Morrowind. I can look past my biases, though, and you personally thinking it is an 8.3 doesn't really give me pause. Giving it a 10, on the other hand, would at the very least make me distrust your objectivity and/or taste in games.
 

MarkMe2525

Gold Member
You can easily show bias. In fact, it's a statistics term, not something subjective. It means the sample surveyed doesn't provide an accurate representation of the population. There are ways you can do this, as bender has pointed out. For one, you can show how much some outlets deviate from the general population; with enough data you can even show a correlation explaining why that bias exists.
The field of statistics does not have a monopoly on the term "bias". It is used in statistics and science and, as usual, has a technical definition in those fields. I am discussing psychological bias (prejudice, preference, inclination), which is how the term is typically used colloquially.

In the case of using deviation from the norm to assess an outlet's bias, I feel it would be less fruitful than you suggest. A review outlet is not a monolith, and in today's age of frequent professional churn and gig culture, judging an outlet's "bias" is complicated by the fact that no one reviewer can typically represent the output of the outlet as a whole. This was extensively explained by IGN after the Starfield review blowback. A review is not some average of a team's opinion, but the opinion of the one person commissioned to write it (who in many cases is a freelancer and not even employed by the outlet). As I stated in my previous post, one may look at the individual reviewers and assess their past work, but as I mentioned, there are problems with that approach as well. If you want to give it a shot, then be my guest. :)

As my math teachers would say, show your work. Jason is working under the assumption that Xbox-branded sites are skewing the average. If your counter is that there are just as many bad actors working to lower the score, show some examples.
If it were me, I'd take my personal score and ask myself how far away from that score, in either direction, would make me raise my eyebrow. I'd then read those reviews and look for exaggeration, omissions, or scores that don't live in harmony with the review text.

I don't really have a dog in this hunt. From the little I played, I'd call Starfield a 6/10, but I have some built-in biases: I don't really care for space settings in games, and Bethesda has been given a pass for outdated and broken mechanics in their games for a really long time. On that second point, I see the "backlash" Starfield has received as the rest of the gaming world catching up to the fact that Bethesda has been skating by since Morrowind. I can look past my biases, though, and you personally thinking it is an 8.3 doesn't really give me pause. Giving it a 10, on the other hand, would at the very least make me distrust your objectivity and/or taste in games.
Lucky for you, I just did my homework. Let's look at the 80 reviews on Metacritic that list an actual score. Let's ignore the fact that some reviews are on a scale of 1-10, some 1-100, and others use something else like a 5- or 10-star system; I am just going with the number listed. Barring some mathematical error (I double-checked), the average across all the reviews is 85.1, and the average minus any outlets that are outwardly Xbox-centric is 83.9. I found 10 outlets that alluded to Xbox or Windows in some form; those are the scores I removed.

So even though I disagreed with the premise to start with, I didn't find any evidence that would grant credibility to Jason's assertion. Granted, I only looked at Metacritic. (No chance I'm spending any more time on this.)
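The recalculation itself is just two plain averages, so anyone can redo it. A minimal sketch in Python (placeholder scores and flags, not the actual spreadsheet data):

Code:
# Rough sketch of the recalculation described above.
# Outlets, scores, and Xbox/Windows flags are illustrative only.
reviews = [
    ("XboxSite",    95, True),   # True = outwardly Xbox/Windows-centric
    ("WindowsSite", 92, True),
    ("OutletA",     90, False),
    ("OutletB",     85, False),
    ("OutletC",     80, False),
    ("OutletD",     70, False),
]

all_scores = [score for _, score, _ in reviews]
filtered   = [score for _, score, flagged in reviews if not flagged]

print(f"All outlets:           {sum(all_scores) / len(all_scores):.1f}")
print(f"Minus flagged outlets: {sum(filtered) / len(filtered):.1f}")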
 

bender

What time is it?
Lucky for you, I just did my homework. Let's look at the 80 reviews on Metacritic that list an actual score. Let's ignore the fact that some reviews are on a scale of 1-10, some 1-100, and others use something else like a 5- or 10-star system; I am just going with the number listed. Barring some mathematical error (I double-checked), the average across all the reviews is 85.1, and the average minus any outlets that are outwardly Xbox-centric is 83.9. I found 10 outlets that alluded to Xbox or Windows in some form; those are the scores I removed.

So even though I disagreed with the premise to start with, I didn't find any evidence that would grant credibility to Jason's assertion. Granted, I only looked at Metacritic. (No chance I'm spending any more time on this.)

Your average is probably different because of weighting, as not all reviews are treated the same.

And you still haven't shown your work. Your premise, countering Jason's, is that there are just as many bad actors deflating the score as there are Xbox-branded sites inflating it, and that they cancel each other out. So show some examples.
 

James Sawyer Ford

Gold Member
Lucky for you, I just did my homework. Let's look at the 80 reviews on Metacritic that list an actual score. Let's ignore the fact that some reviews are on a scale of 1-10, some 1-100, and others use something else like a 5- or 10-star system; I am just going with the number listed. Barring some mathematical error (I double-checked), the average across all the reviews is 85.1, and the average minus any outlets that are outwardly Xbox-centric is 83.9. I found 10 outlets that alluded to Xbox or Windows in some form; those are the scores I removed.

So even though I disagreed with the premise to start with, I didn't find any evidence that would grant credibility to Jason's assertion. Granted, I only looked at Metacritic. (No chance I'm spending any more time on this.)

You yourself just proved what Jason said: the score would not be possible if it weren't for the extreme shill-site deviation in scores. Yes, that also applies to Sony/Nintendo to some degree, but this is a much larger effect.
 

MarkMe2525

Gold Member
You yourself just proved what Jason said: the score would not be possible if it weren't for the extreme shill-site deviation in scores. Yes, that also applies to Sony/Nintendo to some degree, but this is a much larger effect.
Oh geez, you must have missed the fact that the Metacritic score is listed as 83. In both cases, the averages (85.1 and 83.9) are higher. I guess DualShockers got a check from MS too with their 90. What a goober.
Your average is probably different because of weighting as not all reviews are treated the same.

And you still haven't shown your work. Your premise is that there are just as many bad actors to counter Jason's premise that Xbox branded sites and cancel each other out. So show some examples.
Be nice regarding the format; I made this assuming only I was going to see it. The X next to a score indicates Xbox- or Windows-centric. https://docs.google.com/spreadsheets/d/1l84kjQzrGEUFJZbcjBtfT6Hw9xoRR9uGdWqHPv3i1pE/edit?usp=sharing
 

Three

Member
The field of statistics does not have a monopoly on the term "bias". It is used in statistics and science and, as usual, has a technical definition in those fields. I am discussing psychological bias (prejudice, preference, inclination), which is how the term is typically used colloquially.
That's exactly where the term bias comes from, though, and it carries the same meaning. In this case, it would be those specific reviewers' scores being sampled, resulting in a deviation of the score from that of the general population. That's exactly what Jason is referring to if you read the tweet, not what you're referring to as "psychological bias", which, if you think about it, doesn't really exist.
In the case of using deviation from the norm to assess an outlet's bias, I feel it would be less fruitful than you suggest. A review outlet is not a monolith, and in today's age of frequent professional churn and gig culture, judging an outlet's "bias" is complicated by the fact that no one reviewer can typically represent the output of the outlet as a whole. This was extensively explained by IGN after the Starfield review blowback. A review is not some average of a team's opinion, but the opinion of the one person commissioned to write it (who in many cases is a freelancer and not even employed by the outlet). As I stated in my previous post, one may look at the individual reviewers and assess their past work, but as I mentioned, there are problems with that approach as well. If you want to give it a shot, then be my guest. :)
It would be pretty fruitful; why wouldn't it be? Especially when the hypothesis is that outlets with Xbox in their name are what's biased. You can show this to be true or not statistically, and even quantify it.
 

James Sawyer Ford

Gold Member
Oh geez, you must have missed the fact that the Metacritic score is listed as 83. In both cases, the averages (85.1 and 83.9) are higher. I guess DualShockers got a check from MS too with their 90. What a goober.

Metacritic is not based on "average". That is your flaw. They give higher weighting to sites that have larger influence.

Remove the Xbox shill sites from the MetaCritic and you get something lower than 83.
 

bender

What time is it?
Be nice; I made this assuming only I was going to see it. The X next to a score indicates Xbox- or Windows-centric. https://docs.google.com/spreadsheets/d/1l84kjQzrGEUFJZbcjBtfT6Hw9xoRR9uGdWqHPv3i1pE/edit?usp=sharing

Let me rephrase. Numbers only tell part of the story. You'd need to take examples of low scores and show why the reasoning is out of bounds. And to be fair, Jason should have done the same thing with all 15 reviews he found to be outside the "critical consensus", not just focused on the Xbox/MS-branded websites.

Examples of out-of-bounds reviews: IGN's 2009 Football Manager review, IGN's God Hand review.
 

Crayon

Member
Another example: Ghost of Tsushima, same MC as Starfield:


Has ZERO PlayStation-specific sites that gave it 95%+. The highest score is 90, from PS Universe and Push Square.

This is what I'm saying about the "Sony too" defense. It's horse shit. Thieves think all men steal.
 

MarkMe2525

Gold Member
Metacritic is not based on "average". That is your flaw. They give higher weighting to sites that have larger influence.

Remove the Xbox shill sites from the MetaCritic and you get something lower than 83.
You'd better put that shovel down before you dig yourself too deep. I clearly explained that I removed the Xbox- and Windows-centric scores (there were 10 scores that came from sites alluding to Xbox, Windows, or achievements). You can look at the data yourself, as I posted the spreadsheet above.

Again, THE AVERAGE WAS 83.9. Not too hard to understand. If you disagree with the methods, do it yourself.
 

James Sawyer Ford

Gold Member
You'd better put that shovel down before you dig yourself too deep. I clearly explained that I removed the Xbox- and Windows-centric scores (there were 10 scores that came from sites alluding to Xbox, Windows, or achievements). You can look at the data yourself, as I posted the spreadsheet above.

Again, THE AVERAGE WAS 83.9. Not too hard to understand. If you disagree with the methods, do it yourself.

How is it that you can remove the highest scores and not get a lower score? That was the whole point of Jason's argument

NOT TO MENTION the influence that these early scores had on "setting the tone" for later scores that may (also) be inflated.

You'd be amazed at what peer pressure can do when you see a bunch of 10s right out of the gate.
 

MarkMe2525

Gold Member
How is it that you can remove the highest scores and not get a lower score? That was the whole point of Jason's argument

NOT TO MENTION the influence that these early scores had on "setting the tone" for later scores that may (also) be inflated.

You'd be amazed at what peer pressure can do when you see a bunch of 10s right out of the gate.
How about you take the 32 seconds to read my post in full before making an ass out of yourself. Metacritic doesn't show the pure average; they display a weighted average. The pure average of all the scores they list is 85.1, which is how you get 83.9 when removing the 10 scores I alluded to earlier.
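For anyone unclear on the distinction, this is the difference between the pure average and a weighted one. A quick sketch in Python (the weights are hypothetical, since Metacritic does not publish its actual per-outlet weighting):

Code:
# Sketch of why a weighted average can differ from the pure average.
# Weights are hypothetical, not Metacritic's actual weighting.
scores  = [95, 90, 85, 80, 70]
weights = [3,  2,  1,  1,  1]   # e.g. bigger outlets counted more heavily

pure_avg     = sum(scores) / len(scores)
weighted_avg = sum(s * w for s, w in zip(scores, weights)) / sum(weights)

print(f"Pure average:     {pure_avg:.1f}")      # 84.0
print(f"Weighted average: {weighted_avg:.1f}")  # 87.5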
 

Kssio_Aug

Member
How is it that you can remove the highest scores and not get a lower score? That was the whole point of Jason's argument

NOT TO MENTION the influence that these early scores had on "setting the tone" for later scores that may (also) be inflated.


You'd be amazed at what peer pressure can do when you see a bunch of 10s right out of the gate.
Just saw the edit. Now you're reaching!
 

James Sawyer Ford

Gold Member
How about you take the 32 seconds to read my post in full before making an ass out of yourself. Metacritic doesn't show the pure average; they display a weighted average. The pure average of all the scores they list is 85.1, which is how you get 83.9 when removing the 10 scores I alluded to earlier.

So it's lower. Gotcha.
 

MarkMe2525

Gold Member
Let me rephrase. Numbers only tell part of the story. You'd need to take examples of low scores and show why the reasoning is out of bounds. And to be fair, Jason should have done the same thing with all 15 reviews he found to be outside the "critical consensus", not just focused on the Xbox/MS-branded websites.

Examples of out-of-bounds reviews: IGN's 2009 Football Manager review, IGN's God Hand review.
I have gone on too long. I never intended to make any definitive statements other than that bias works both ways. This is why it would be prudent to refer to the weighted averages given by Metacritic.
 

James Sawyer Ford

Gold Member
I wonder how deep you can keep digging. Jason implied that the scores would be even lower than what is displayed, which is 83. Keep going, though; I'm sure you are looking just fine to all the people reading.

It's 83 point something

Removing the Xbox sites would make it lower than that. Not to mention the argument about setting the tone, which can have a big impact early on.
 

bender

What time is it?
I have gone on too long. I never intended to make any definitive statements other than that bias works both ways. This is why it would be prudent to refer to the weighted averages given by Metacritic.

I think it's a fascinating conversation, actually, but I've been in a data driven mood of late. I don't think there are easy answers either way.
 