Gears of War Ultimate Edition PC Specs Leaked

What do you propose we all do if someone can't afford 16GB of RAM? Downgrade back to 4GB for everyone so these guys will be able to run modern games on their PCs or what?

His point is valid. 16GB of RAM is a moderate amount on PC these days. I've had 32GB since the beginning of 2013, and I plan to go to 64GB for my next platform upgrade. It's cheap, and it's a relatively easy upgrade if you want to go from 8 to 16 to 32.

There is no reason to complain before even seeing what the game offers for such requirements, and how it actually runs with less than 16GB.

I think the complaint is "I DON'T HAVE THAT. FUCK THAT GAME" on a smaller scale. I just looked up the prices for 16GB and I'm like, it's under $100?! I need to upgrade my comp and I'm definitely gonna get that at least. Shit, last time I upgraded (probably around 2011) to a medium-spec comp, it was way more expensive. If I didn't need to replace my MB/proc I would be on it. Let me get this wedding stuff paid for (bruh, I don't understand how people do divorces; after I pay all this money, FUCK THAT) and we'll see what I can do. I'm ready for these games (especially if they are cross-buy. I'll just buy them for XBO, and when I upgrade my PC I'll have a better-looking version).
 
I think the minimum CPUs listed are much more powerful than the Jaguar cores. So those will probably be overkill if 30fps at Xbox One settings (assuming it can even be calibrated that closely) is your target. The minimum AMD GPU is also stronger than the Xbox One's, but it does not have access to as much memory, so texture quality will have to give.

Textures aside, I expect the minimum specs to match the console quite easily if DX12 is as efficient as claimed.

PCs have more overhead and are less optimized (multiple hardware configurations vs just one), so it'll always take a slightly stronger PC to match a well optimized console game, even on DX12.
 
PCs have more overhead and are less optimized (multiple hardware configurations vs just one), so it'll always take a slightly stronger PC to match a well optimized console game, even on DX12.
Yes, but I suspect the gap between the Jaguar cores and the minimum CPUs listed is very substantial. Even bearing in mind the lesser degree of efficiency, I would be very surprised if an FX 6300 CPU is "needed" to match the console. I think we can go even lower than that with low-overhead APIs.
 
As they've said, they didn't start again because they wanted to keep the core gameplay elements. That would have changed with a rewrite.

Come on, PC guys. Every thread around requirements is either "shitty console port and they've done nothing for PC" or "unoptimised crap, look at those specs".

A stick of 8GB DDR4 is 30 quid, for Christ's sake.

It's not about extra RAM or anything; it's that I feel like the PC gaming community as a whole has a certain visual quality level in their heads that they equate to a certain strength of hardware.

No one is complaining about The Witcher 3 or ROTTR requiring powerful hardware to run smoothly, because of how those games look.

GOW's visual quality does not correlate in a logical way with the hardware the devs are saying is required to run it smoothly.

The remastered version is just now beginning to approach the visual quality we were shown was possible for the game TWELVE years ago.

 
If you can't afford a stick of RAM that is cheaper than the game you're purchasing, then maybe you shouldn't be gaming at all. Especially PC gaming.

Are you saying you can get 16GB of RAM for under the price of the game?
Good RAM, too?

Link please?

:)
 
The remastered version is just now beginning to approach the visual quality we were shown was possible for the game TWELVE years ago.


I made a thread maybe... 2 years ago? about how UE3 and Gears was one of the first large-scale downgrades that ever occurred... but it's somehow collectively forgotten about in gaming circles. For some reason the Watch Dogs thing is bigger, although the technical differences between the UE3 engine preview and the Gears of War 1 retail release are much larger.
 
I made a thread maybe... 2 years ago? about how UE3 and Gears was one of the first large-scale downgrades that ever occurred... but it's somehow collectively forgotten about in gaming circles. For some reason the Watch Dogs thing is bigger, although the technical differences between the UE3 engine preview and the Gears of War 1 retail release are much larger.

I think it's still because GOW ended up looking a good bit better than pretty much anything on consoles at that time, and even PC, despite the downgrade from the tech demo.

Whereas Watch Dogs post-downgrade seemed like just another good-looking open-world game, not much better looking than something like the PC version of Sleeping Dogs.
 
I made a thread maybe... 2 years ago? about how UE3 and Gears was one of the first large-scale downgrades that ever occurred... but it's somehow collectively forgotten about in gaming circles. For some reason the Watch Dogs thing is bigger, although the technical differences between the UE3 engine preview and the Gears of War 1 retail release are much larger.

I vividly remember the massive Unreal Tournament 3 downgrade. It was disgusting. It looked like a completely different game.
 
I made a thread maybe... 2 years ago? about how UE3 and Gears was one of the first large-scale downgrades that ever occurred... but it's somehow collectively forgotten about in gaming circles. For some reason the Watch Dogs thing is bigger, although the technical differences between the UE3 engine preview and the Gears of War 1 retail release are much larger.

What was the reason for the downgrade? Poor PC and Xbox hardware specs? UE3 didn't get real-time lighting until much, much later (something that was shown in the burning car demo).
 
Strange, I thought Gears UE, or any Microsoft exclusive coming to Windows 10, would not have a retail version.
It might just be a case with a Win10 Store key in it, just like physical versions of games that are on Steam, Uplay, etc.
 
I made a thread maybe... 2 years ago? about how UE3 and Gears was one of the first large-scale downgrades that ever occurred... but it's somehow collectively forgotten about in gaming circles. For some reason the Watch Dogs thing is bigger, although the technical differences between the UE3 engine preview and the Gears of War 1 retail release are much larger.
The promo screenshots they used were undoubtedly offline renders, and it was hilariously false advertising... I think the term "bullshot" was coined because Gears screenshots didn't look like the game.

If we are talking about the Gears 1 reveal, then it never really was that big of a downgrade, honestly, outside of some dynamic lights with shadows that Gears 1 lacked compared to the very first showcase, which ran at 15 frames per second and had worse model complexity and textures (probably because it was made in a few weeks). At the same time, I think it had gains in some areas at least.


It's not about extra RAM or anything; it's that I feel like the PC gaming community as a whole has a certain visual quality level in their heads that they equate to a certain strength of hardware.

No one is complaining about The Witcher 3 or ROTTR requiring powerful hardware to run smoothly, because of how those games look.

GOW's visual quality does not correlate in a logical way with the hardware the devs are saying is required to run it smoothly.

The remastered version is just now beginning to approach the visual quality we were shown was possible for the game TWELVE years ago.

If you are using model quality as an example, then it should be noted that the models in the Gears remaster are only about 20% more detailed, but they have improved texture quality (which is what makes it look better). To me the models themselves somehow don't look as complex as the ones in Gears 3. That Berserker model is probably very close to what the original Gears had in terms of complexity, minus the skin textures and the studio lighting on that model (lol).

As for the underlined part: yes, it does correlate if you think about it.
It needs a 980 Ti for 4K/60fps, and I think that is very reasonable for a 1080p/30fps Xbox One game, considering it's double the framerate and four times the resolution.
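The arithmetic in that claim checks out: 4K has exactly four times the pixels of 1080p, and doubling the framerate on top of that gives roughly an eightfold increase in pixel throughput. A quick back-of-the-envelope sketch (the resolutions are the standard figures, not numbers from this thread):

```python
# Back-of-the-envelope pixel-throughput comparison between the
# Xbox One target (1080p at 30fps) and the PC target (4K UHD at 60fps).

xbox_pixels_per_frame = 1920 * 1080   # 1080p
pc_pixels_per_frame = 3840 * 2160     # 4K UHD: exactly 4x the pixels of 1080p

xbox_throughput = xbox_pixels_per_frame * 30  # pixels per second at 30fps
pc_throughput = pc_pixels_per_frame * 60      # pixels per second at 60fps

print(pc_pixels_per_frame / xbox_pixels_per_frame)  # 4.0  (resolution factor)
print(pc_throughput / xbox_throughput)              # 8.0  (combined factor)
```

So the 4K/60 target asks the GPU to shade about eight times as many pixels per second as the console does, which is why a far stronger card being listed for that mode is not, by itself, evidence of poor optimization.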
 
Yes, but I suspect the gap between the Jaguar cores and the minimum CPUs listed is very substantial. Even bearing in mind the lesser degree of efficiency, I would be very surprised if an FX 6300 CPU is "needed" to match the console. I think we can go even lower than that with low-overhead APIs.
I'm not denying that, but in real-world PC gaming situations you always need a setup beefier than a console's paper specs to beat it. Also bear in mind that the Xbox One is an APU with ESRAM; there's probably very little latency between those Jaguar and GCN cores, especially if they take advantage of HSA.
 
No one is complaining about The Witcher 3 or ROTTR requiring powerful hardware to run smoothly, because of how those games look.

Hmm, yes, people were complaining about how a GTX 970 still had drops at 1080p for an Xbox One game.

Not sure about The Witcher because I didn't follow that.
 
People complaining about a 650 Ti, 2GB of VRAM, and 8GB of RAM is silly in 2016. Especially for a rebuilt game. This doesn't look like a slapped-together HD cash grab.
 
Why are games that aren't particularly complex or groundbreaking requiring so much RAM? Are devs simply not taking the time to optimize anymore, just loading as much of the data as possible into RAM? I can understand large VRAM requirements for all the huge textures, but the original game ran on a machine with 512MB of RAM (522MB if you include the eDRAM).
 
Why are games that aren't particularly complex or groundbreaking requiring so much RAM? Are devs simply not taking the time to optimize anymore, just loading as much of the data as possible into RAM? I can understand large VRAM requirements for all the huge textures, but the original game ran on a machine with 512MB of RAM (522MB if you include the eDRAM).

Well, this is the remake, and it runs on a console with 8GB.

It shouldn't come as a surprise that it might need 16GB if it can scale to 4K.
 