What will next gen graphics look like?

Why are you so offended by it? Why can't we compare new PCs to current gen consoles? Why does it need to be a "fair" contest? This isn't about fair...it's about reality. He's showing that the differences in image quality are very noticeable and demonstrable.

If you can't see that the guy is a massive dick about it, then holy shit, wow.
 
Where problems with resolution and frame rate will probably come in is this: some new engine or tech is deployed that 'must' be used in order to hype the game up. The developer comments that the current consoles can't handle it with all the bells and whistles turned on, not without severe compromises. The publisher instructs the developer to ensure the game can produce nice bullshots, and that a nice fake gameplay vertical slice can be assembled for E3.

Developer says "Yes Sir", drops the res below 720p, and allows the frame rate to hit 20fps half the time.
 
I think you'd be surprised. 30 fps is only the norm because, at the moment, everyone is pushing a bit 'too far'.
The industry knows that COD is popular for a reason; I wouldn't be surprised at a number of launch titles being 60 fps.

It's too much effort for some developers to run at 60 on current consoles; next gen will hopefully make this easier (especially since I think it will be too expensive for most to optimise their games, at which point 60 fps becomes the better path to take).


PS: Thanks for those Kameo screens MMaRsu; I haven't played my 360 in a long time, but I might have to check that game out now - it looks gorgeous! Wish there was more like that this gen! It's on a short list!

That's why Battlefield sold 14 million at 30FPS? CoD is not popular because it's 60FPS. Also, I don't know why console gamers are getting mad at PC gamers for suggesting that next gen games will look like high end PC games do now. The IQ on console games next gen will never match up to high end PC games today.
 
lol pd0

Come on now

[image: 03kameo.jpg]


Next gen launch games will look TONS better than anything out on PC at that time probably. Just as Kameo did back in 05.
Wouldn't you say that's an art style victory rather than a tech victory, though? Very few 360 or PC games look anything like Kameo, even to this day.
 
I'm guessing you haven't actually played Witcher 2 on ultra - and don't say you saw a pic or a YouTube video, because they don't do the game justice. Considering many devs are struggling to push current consoles to the limit, don't expect many games outside of the really big franchises that sell 5+ million to look anything remotely close to Samaritan.
Who's struggling? All I see is systems getting maxed out. If devs are struggling, they are struggling to deal with these underpowered consoles. Devs can do more and they are eager to do more. This is proven by games like Watch Dogs, that Final Fantasy realtime E3 demo, the Unreal 4 demo.

Even with Witcher 2 they had to scale back in order to get it to fit on consoles. How are they struggling to keep up with the 360 and PS3?
 
Next gen launch games will look TONS better than anything out on PC at that time probably. Just as Kameo did back in 05.

No, they won't. The best looking games are made by multiplatform developers now, so what you will see on consoles, you will see on PC too, just with better quality.
 
People think that just because you stick parts inside of a proprietary box you get some kind of free performance increase because of "optimization". Generally in the past the delta between consoles at launch vs PCs of the day has been because of the highly customized parts designed specifically for the console boxes and the high level of specialization. Consoles played games and didn't need to devote resources towards anything else.

In addition there was a fairly level playing field in terms of thermals and power draw. PCs weren't sucking down wattage and so consoles could roughly match them in terms of transistor count and design complexity.

All of this is different now.

1. Consoles next gen are going to get parts either directly off the shelf or only lightly customized for their use. The cost of R&D in the semiconductor space has skyrocketed, and the parts made for PC and mobile are good enough that there's no real benefit to coming up with your own silicon from scratch. This sea change actually came during the design process for the current gen, but Nintendo threw a monkey wrench into the works by going with overclocked Gamecube chips and Sony obviously cost themselves a boatload of money by betting badly with Cell and tossing the RSX in at the last minute.

The 360 got a prototype unified shader GPU, but you won't be seeing that happen next time. Both MS and Sony are going with AMD-based GPU designs, and there's nothing far enough along in development that could conceivably go into production for the new consoles. They're getting GCN architecture; the only question is transistor count and clock speed.

2. Consoles aren't as specialized anymore. The current offerings all do a lot more than just play games and everyone in the hardware space is doubling down on this for the future. We're going to get boxes that try to do everything and thus have to devote CPU clocks and RAM space to OS level functionality. This will ultimately reduce the resources available to games.

In addition there won't be as much low-level coding as there has been in the past. Everyone's concentrating on multiplatform releases which means higher levels of coding abstraction and less platform-specific optimization. More middleware also means less programming to the metal. Ballooning budgets will take their toll here too, but I'm just concentrating on the technical aspects here.

3. As I mentioned, thermal draw is possibly the biggest factor. High end PCs have power supplies that can draw over 1000W and GPUs that take up a substantial fraction of that. Consoles will simply not be able to measure up.

It's not my intention to really disparage the consoles here. For the amount of money you'll spend I'm sure both Sony and MS will be providing a better experience than buying a $300-400 PC (for the first year or two anyway), and developers will (stupidly IMO) still be putting out console-only games that you just can't get on PC.

But when you're talking purely in technical terms, I think it's pretty safe to say that Durango and Orbis will definitely not be able to measure up.
That's really not how "no such thing as a free lunch" works. The phrase isn't used to tell people that they need to do something more; it's used to alert people that something they are being presented with has a hidden cost.

Back to the actual discussion, the thing that you're underestimating is the developers. Speaking in purely technical terms, PCs have and will always be superior to consoles. Optimization or not, a top end PC will always be capable of more than a game console.

Still, if you go back to 2005 and show me a PC game running with 512MB RAM it's not going to look like Uncharted 3. As a matter of fact, no game running on any PC in 2005 is going to look like Uncharted 3. The hidden 'cost' that you're not seeing is that developers flock to consoles because of the standardization and frankly because that's where business is. That's why you have a console with 2005 tech doing things that no PC could have dreamed of. No one was there squeezing everything they could out of 512MB in 2004, but they did it on consoles. There's a lot more to graphics than hardware. The software that actually pushes that hardware to its limits is a huge variable.


The best example that I can probably give for this is actually through a music composer:

1-bit audio. Beep or no beep.

Tim Follin did this with that 1-bit audio.
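
Just to make the 1-bit idea concrete: a beeper output is literally speaker-on or speaker-off, so a tone is nothing more than toggling that one bit at the note's frequency, and "chords" get faked by flipping between two pulse trains fast enough that the ear blends them. Here's a rough, hypothetical Python sketch of that trick (nothing to do with Follin's actual routines; the note choices and file name are made up for illustration):

```python
import wave, struct

SAMPLE_RATE = 44100

def one_bit_square(freq, n_samples, sample_rate=SAMPLE_RATE):
    """Return a list of 0/1 samples: speaker off/on, toggled at `freq` Hz."""
    period = sample_rate / freq          # samples per full square-wave cycle
    return [1 if (i % period) < (period / 2) else 0 for i in range(n_samples)]

def interleave(a, b, chunk=64):
    """Crude beeper 'chord' trick: alternate short bursts of two 1-bit voices
    so the ear blends them into something resembling two notes at once."""
    out = []
    for i in range(0, min(len(a), len(b)), chunk):
        out += a[i:i + chunk] if (i // chunk) % 2 == 0 else b[i:i + chunk]
    return out

if __name__ == "__main__":
    n = SAMPLE_RATE                      # one second of audio
    voice1 = one_bit_square(440.0, n)    # A4
    voice2 = one_bit_square(554.37, n)   # C#5
    mix = interleave(voice1, voice2)
    # Map the on/off stream onto 16-bit PCM purely so a normal player can play it.
    with wave.open("beeper_sketch.wav", "w") as w:
        w.setnchannels(1)
        w.setsampwidth(2)
        w.setframerate(SAMPLE_RATE)
        w.writeframes(b"".join(struct.pack("<h", 20000 if s else -20000) for s in mix))
```

Run that and you get a harsh square-wave "chord" out of a single on/off channel, which is the whole point: the hardware gives you almost nothing, and the software technique is what makes it sound like music.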


I guess we'll just have to wait and see though. Go ahead and post the absolute best high-end PC graphics that are on the market right now. It'll be interesting to compare when I bump the thread in a couple of years.
 
No, they won't. The best looking games are made by multiplatform developers now, so what you will see on consoles, you will see on PC too, just with better quality.

I feel like this is another point people are overlooking. High end PCs right now will most likely be able to run any next gen console launch game with ease, unless it's terribly unoptimized.
 
Didn't Unreal Engine 4 give us a pretty good idea of what to expect? I mean, I know a lot of top developers won't be using it, but it does give us a good idea of roughly where the median will be.
 
Didn't Unreal Engine 4 give us a pretty good idea of what to expect? I mean, I know a lot of top developers won't be using it, but it does give us a good idea of roughly where the median will be.

The main thing with Unreal Engine 4 is that it's supposed to be incredibly easy to use while also getting results at a much faster rate than other engines. Obviously, after Unreal Engine 3 was used for a metric shitload of games this gen, I would assume Sony and MS are getting poked by a very sharp stick on a regular basis to make damn sure the new consoles run the engine in as much glory as they can bleed out of them.
 
That's really not how "no such thing as a free lunch" works. The phrase isn't used to tell people that they need to do something more; it's used to alert people that something they are being presented with has a hidden cost.
It's speaking to the design process. People think they get a "free lunch" on performance just because it goes in a console box, and they tend not to be aware of the tradeoffs that were involved in the past and won't be as much of a factor (or a factor at all) in the future.

The hidden 'cost' that you're not seeing is that developers flock to consoles because of the standardization and frankly because that's where business is.
The business is currently in putting your game out everywhere.
 
Wouldn't you say that's an art style victory rather than a tech victory, though? Very few 360 or PC games look anything like Kameo, even to this day.

I'd say its tech was fairly impressive at the time. A small fraction of games in 2005 had depth of field blur and self-shadowing, or that many characters onscreen.
 
Wouldn't you say that's an art style victory rather than a tech victory, though? Very few 360 or PC games look anything like Kameo, even to this day.

The art style I suppose is fine but the art itself is fucking abysmal, it's seriously one of the ugliest games from a visual design perspective that has ever made it to a full release. It's winning no victories anywhere.
 
I remember when GT1 came out on PS1. When I saw those graphics my mind was blown. It was so realistic that I didn't think anything would come close to it.

Can't imagine what the future holds.
 
Looking at this thread, if people consider current gen graphics at a good IQ to be next gen, then I feel they might get blown away by a good number of games next gen, mostly from first parties.
 
This post still gets me every time. I can't believe there actually was a point where people's high end expectations for next-gen were The freaking Witcher 2. Thank God we finally have some information on next-gen engines to reassure people of the leap that will happen.
Don't consoles usually introduce new tech? At least it felt like every game suddenly started to use HDR only after the 360 was released.
 
No, they wont. The best looking games are made by multiplatform developers now, so what You will see on consoles, You will see on PC too, just with better quality.

What we're discussing here is "will there be a leap for next gen launch games?".
Of course those games will also run on current PCs. That's the point!


Tech and graphics in games wait for the console to come out to show off.


It's like a kick off. PCs can do Watch Dogs right now, or Agni's Philosophy, BUT the industry will wait for mainstream consoles to be able to output that before really using those techs. That's why we're only just seeing those techs now, and Watch Dogs or Star Wars are 2013 games.

PC players always ignore that. Consoles are just setting the tempo; it's as simple as that. And I guess it's because an AAA title wouldn't bring in enough money on PC alone.
 
I'd say its tech was fairly impressive at the time. A small fraction of games in 2005 had depth of field blur and self-shadowing, or that many characters onscreen.
Also extreme parallax mapping all over the place and dense clouds of particles drifting around.
 
What we're discussing here is "will there be a leap for next gen launch games?".
Of course those games will also run on current PCs. That's the point!


Tech and graphics in games wait for the console to come out to show off.


It's like a kick off. PCs can do Watch Dogs right now, or Agni's Philosophy, BUT the industry will wait for mainstream consoles to be able to output that before really using those techs. That's why we're only just seeing those techs now, and Watch Dogs or Star Wars are 2013 games.

PC players always ignore that. Consoles are just setting the tempo; it's as simple as that. And I guess it's because an AAA title wouldn't bring in enough money on PC alone.

Next gen:

[image: dog-watching-television-by-laertes.jpg]
 
I'd say its tech was fairly impressive at the time. A small fraction of games in 2005 had depth of field blur and self-shadowing, or that many characters onscreen.
Well yeah, it was doing some neat stuff. But the prominent feature in those shots is dat grass, and it's just textures.

The HDR is also really well done, but it quickly became the most abused feature this gen.
 
Probably a bit better because they'd be working from a more powerful base level
Probably a lot better, and this is basically the point. BF3 on max settings is not the best thing that a high-end PC can produce. A console with mid-range PC tech will still be pushing graphics that high-end PCs haven't seen yet.
 
Perhaps one step closer to This

To me this is game assets all over the place with shiny tricks on top, which is basically what next gen will be. The texture quality is not even that good.

Of course I'm not talking about the shot with the whole town. But even that...
 
Who's struggling? All I see is systems getting maxed out. If devs are struggling, they are struggling to deal with these underpowered consoles. Devs can do more and they are eager to do more. This is proven by games like Watch Dogs, that Final Fantasy realtime E3 demo, the Unreal 4 demo.

Even with Witcher 2 they had to scale back in order to get it to fit on consoles. How are they struggling to keep up with the 360 and PS3?

How about THQ? Plenty of their games don't break even, yet they sell more than 1 million units, because the cost of making games with modern looking graphics is getting unreasonably high. Have you also noticed the large number of Japanese games that are now on handhelds because they can be made with a much smaller budget? Not all companies can pull a Naughty Dog or a DICE and push the consoles to their knees. And if we saw much smaller risks this generation because of games costing too much to make, imagine next gen when every game has to compare to the latest big budget next gen Battlefield.
 
aeolist said:
People think that just because you stick parts inside of a proprietary box you get some kind of free performance increase because of "optimization". Generally in the past the delta between consoles at launch vs PCs of the day has been because of the highly customized parts designed specifically for the console boxes and the high level of specialization. Consoles played games and didn't need to devote resources towards anything else.

In addition there was a fairly level playing field in terms of thermals and power draw. PCs weren't sucking down wattage and so consoles could roughly match them in terms of transistor count and design complexity.

All of this is different now.

1. Consoles next gen are going to get parts either directly off the shelf or only lightly customized for their use. The cost of R&D in the semiconductor space has skyrocketed, and the parts made for PC and mobile are good enough that there's no real benefit to coming up with your own silicon from scratch. This sea change actually came during the design process for the current gen, but Nintendo threw a monkey wrench into the works by going with overclocked Gamecube chips and Sony obviously cost themselves a boatload of money by betting badly with Cell and tossing the RSX in at the last minute.

The 360 got a prototype unified shader GPU, but you won't be seeing that happen next time. Both MS and Sony are going with AMD-based GPU designs, and there's nothing far enough along in development that could conceivably go into production for the new consoles. They're getting GCN architecture; the only question is transistor count and clock speed.

2. Consoles aren't as specialized anymore. The current offerings all do a lot more than just play games and everyone in the hardware space is doubling down on this for the future. We're going to get boxes that try to do everything and thus have to devote CPU clocks and RAM space to OS level functionality. This will ultimately reduce the resources available to games.

In addition there won't be as much low-level coding as there has been in the past. Everyone's concentrating on multiplatform releases which means higher levels of coding abstraction and less platform-specific optimization. More middleware also means less programming to the metal. Ballooning budgets will take their toll here too, but I'm just concentrating on the technical aspects here.

3. As I mentioned, thermal draw is possibly the biggest factor. High end PCs have power supplies that can draw over 1000W and GPUs that take up a substantial fraction of that. Consoles will simply not be able to measure up.

It's not my intention to really disparage the consoles here. For the amount of money you'll spend I'm sure both Sony and MS will be providing a better experience than buying a $300-400 PC (for the first year or two anyway), and developers will (stupidly IMO) still be putting out console-only games that you just can't get on PC.

But when you're talking purely in technical terms, I think it's pretty safe to say that Durango and Orbis will definitely not be able to measure up.

Thank you for this post. So much more eloquent than I could've ever put it.


Fancy Corndog said:
Probably a lot better, and this is basically the point. BF3 on max settings is not the best thing that a high-end PC can produce. A console with mid-range PC tech will still be pushing graphics that high-end PCs haven't seen yet.

Some shitty textures aside, it takes a lot more hardware grunt than any of the next gen consoles have to run it at Ultra at 1080p (and a lot more than that to keep it at 60fps). I think that's what some of us are trying to point out.
 
Thank you for this post. So much more eloquent than I could've ever put it.

So if gaming isn't the main concern of consoles anymore, then don't they lose some of their relevance and reason for existence? And besides, why even go for the upgrades in the first place? Why don't MS and Sony just upgrade to the point where more RAM is added for non-gaming functionality and call it a day? I am sure people won't mind, especially if they can start to sell it at nearly half the price of the Wii U and have the entire library of the PS3 or 360 to boot. Kind of reminds me of what Jeff Rigby was talking about: just rebrand a PS3.5 or Xbox 360.5 into the PS4 and Xbox 8.

All three of the consoles would be neck and neck in terms of graphical prowess, and only one would be a gaming console, while the remaining two would be media boxes at nearly half the price of the former.

Makes me wonder why MS and Sony are even trying...
 
People think that just because you stick parts inside of a proprietary box you get some kind of free performance increase because of "optimization".

You will get more performance if you optimize your code (amongst other things). But that's not "free", it costs a lot of money, and I don't think anyone assumes that.

Generally in the past the delta between consoles at launch vs PCs of the day has been because of the highly customized parts designed specifically for the console boxes and the high level of specialization. Consoles played games and didn't need to devote resources towards anything else.

Whether a console dedicates its performance to games or to other uses has nothing to do with using stock parts.

In addition there was a fairly level playing field in terms of thermals and power draw. PCs weren't sucking down wattage and so consoles could roughly match them in terms of transistor count and design complexity.

I don't see how thermals and power draw have to limit a console (in this price class!). There is no law which says that consoles have to be incredibly small or that consoles cannot have a high power draw.

All of this is different now.
1. Consoles next gen are going to get parts either directly off the shelf or only lightly customized for their use. The cost of R&D in the semiconductor space has skyrocketed, and the parts made for PC and mobile are good enough that there's no real benefit to coming up with your own silicon from scratch. This sea change actually came during the design process for the current gen, but Nintendo threw a monkey wrench into the works by going with overclocked Gamecube chips and Sony obviously cost themselves a boatload of money by betting badly with Cell and tossing the RSX in at the last minute.

The 360 got a prototype unified shader GPU, but you won't be seeing that happen next time. Both MS and Sony are going with AMD-based GPU designs, and there's nothing far enough along in development that could conceivably go into production for the new consoles. They're getting GCN architecture; the only question is transistor count and clock speed.

I don't think you can make any statements about that yet. Microsoft is the creator of DirectX and they work closely with hardware vendors. I think it is absolutely possible that they will use some kind of custom design which is based on an existing design.

2. Consoles aren't as specialized anymore. The current offerings all do a lot more than just play games and everyone in the hardware space is doubling down on this for the future. We're going to get boxes that try to do everything and thus have to devote CPU clocks and RAM space to OS level functionality. This will ultimately reduce the resources available to games.

But how much will this reduce resources? It seems that Microsoft will make heavy use of such features, but it is entirely possible that Sony will be conservative with this and will only make relatively small improvements (see PS Vita). That would not require many resources.

In addition there won't be as much low-level coding as there has been in the past. Everyone's concentrating on multiplatform releases which means higher levels of coding abstraction and less platform-specific optimization. More middleware also means less programming to the metal. Ballooning budgets will take their toll here too, but I'm just concentrating on the technical aspects here.

I don't think that's true. Maybe average dev teams won't do much low-level coding, but the developers of the middleware will still do this (and so will bigger dev teams). And budgets won't grow that much.

3. As I mentioned, thermal draw is possibly the biggest factor. High end PCs have power supplies that can draw over 1000W and GPUs that take up a substantial fraction of that. Consoles will simply not be able to measure up.

I don't think this is an issue here - or can you build a $400 PC which draws so much power that you could not build a comparable console? Thermal draw is irrelevant in this price category.

It's not my intention to really disparage the consoles here. For the amount of money you'll spend I'm sure both Sony and MS will be providing a better experience than buying a $300-400 PC (for the first year or two anyway), and developers will (stupidly IMO) still be putting out console-only games that you just can't get on PC.

But when you're talking purely in technical terms, I think it's pretty safe to say that Durango and Orbis will definitely not be able to measure up.

It makes no sense to compare PCs to consoles without looking at the price. PCs are highly scalable, of course you can build a more powerful machine if you want.


[...]Some shitty textures aside, it takes a lot more hardware grunt than any of the next gen consoles have to run it at Ultra at 1080p (and a lot more than that to keep it at 60fps). I think that's what some of us are trying to point out.

I am sure next gen consoles will be able to display a game like BF3 with ultra settings at 1080p/30fps. But they won't get this game, they will probably get BF4 with super ultra settings at 720p/30fps.
 
I think everyone expecting launch titles to be massively better than what's available on high-end PCs today will be severely disappointed. It didn't happen last gen (IMO - which seems to be shared by some others here - Kameo doesn't look better than Far Cry, and it has objectively lower IQ... same goes for Doom 3 and Perfect Dark), so I don't see it happening this gen.

However, one or two years later? Of course console games are going to look heaps better than anything that's currently available! (Although IQ won't be as good as what is possible today - that's just something that comes with playing on consoles... And I doubt it'd be as noticeable as it is on current gen consoles if next gen games start running in native 1080p; 4K TVs won't hit the mass market for another 4-5 years anyway.)
 
The look of next gen gaming is decided as much by the production workflows and budgets for AAA games as by the game engines/technology afforded by next-gen hardware.

What this means is that the large-scale production of extra-high-res textures, the generation of million-polygon+ levels and models, and the increased use of next-gen cloth, hair/fur, higher-end animation techniques and advanced AI hasn't happened yet (or if it has, it hasn't been perfected yet). The effects and shaders possibly costing a billion+ floating point operations per second haven't been optimized for console-style games yet.

So while today's high-end PCs may technologically be capable of the same level of processing (or more) as next gen consoles, the actual 'sweet-spot' target for PC-exclusive games is a much lower end PC, because only a very small number of these high end PCs exist today compared to the total gaming market.

The only real PC exclusive AAA game that's come out recently is Diablo III, and that clearly targets a lower-end PC - it runs pretty well on Intel's integrated HD4000 graphics, and that should tell you what the true target is. Up-ressing textures and assets designed mainly for console games (BF3, COD MW3, etc.) is just putting lipstick on a pig, IMHO. The proof of the pudding is that these games run just fine on relatively low-mid range cards today. Nothing is designed for a truly high-end PC anymore - the last of those was probably the original Crysis. Id, Epic, Bioware, Bethesda etc. have abandoned high end PC development, and Valve - well who knows what Valve is up to. PC gaming has its niche, but it's not bleeding edge (IMHO). It does mean that high end PCs get the nicest looking and running current gen multi-platform games (even if they arrive late), plus some nice niche stuff (indie gaming, MMOs, strategy, etc.). But I digress.

Adding higher resolutions, more FPS, better shadowing and lighting, AA and AF etc. to a current gen game only makes it look a little better - what makes a game look next-gen is developing levels, models and fundamental rendering techniques that can only run on what is considered today to be truly high end, and that hasn't happened yet.
 
Nothing is designed for a truly high-end PC anymore - the last of those was probably the original Crysis. Id, Epic, Bioware, Bethesda etc. have abandoned high end PC development, and Valve - well who knows what Valve is up to. PC gaming has its niche, but it's not bleeding edge (IMHO). It does mean that high end PCs get the nicest looking and running current gen multi-platform games (even if they arrive late), plus some nice niche stuff (indie gaming, MMOs, strategy, etc.). But I digress.

And the platform is better for it. If there was a Crysis every year, only the hardcore crazies would be on the platform. The consoles have, I think, played a huge part in the "resurgence" of the PC as a gaming platform over the past few years. They have made the bar of entry on the PC very low. The ball will be kicked down the field when the new consoles come out, though I don't know how far. It will only be temporary, either way.

Quite frankly, I think BF3 as a launch title for the next systems would gain quite a lot of attention. I know there are a lot of people meh-ing that game, but it blew this console gamer's mind when I started it up a couple weeks ago.
 
What we're discussing here is "will there be a leap for next gen launch games?".
Of course those games will also run on current PCs. That's the point!


Tech and graphics in games wait for the console to come out to show off.


It's like a kick off. PCs can do Watch Dogs right now, or Agni's Philosophy, BUT the industry will wait for mainstream consoles to be able to output that before really using those techs. That's why we're only just seeing those techs now, and Watch Dogs or Star Wars are 2013 games.

PC players always ignore that. Consoles are just setting the tempo; it's as simple as that. And I guess it's because an AAA title wouldn't bring in enough money on PC alone.
Definitely.

We're in that weird transitional period, the same period that happened around 2004.
 
I call bullshit on that dude at the beginning of the thread claiming BF3 and Metro 2033 maxed out ain't no thing.
Ridiculous, man. Get a better monitor. Get better eyes.
Next gen consoles won't launch with cards that compete with the top end PC cards of the day, but developers always do better on closed systems. I'd love to see what Naughty Dog could do with a GTX 560.
 
How about THQ? Plenty of their games don't break even, yet they sell more than 1 million units, because the cost of making games with modern looking graphics is getting unreasonably high. Have you also noticed the large number of Japanese games that are now on handhelds because they can be made with a much smaller budget? Not all companies can pull a Naughty Dog or a DICE and push the consoles to their knees. And if we saw much smaller risks this generation because of games costing too much to make, imagine next gen when every game has to compare to the latest big budget next gen Battlefield.
THQ is obviously doing something wrong then, because its main competition, EA and Ubisoft, are making money. Square Enix is obviously ready for next gen. And in case you haven't noticed, very few games this gen even come close to Uncharted 3 when it comes to visuals, which is arguably the best on consoles. Nobody is saying that every game has to be top notch in that department. We are just ready for them to look and perform better than they do now.

And budgets, we've been crying about budgets since last gen. There was a thread a week or two ago about the average Japanese dev's salary being $250,000.
 

Next gen will be interesting, especially if the consoles are packed with enough RAM. Thing is, PC gaming is limited by minimum spec requirements, and a new console cycle usually raises that minimum spec bar even higher over time.

When that bar is raised to DX11 and 4 GB of RAM (instead of DX9 and 1 GB of RAM), we should see even better looking games... and maybe more complex, less linear games, since developers will have enough RAM to create bigger and more dynamic worlds (on consoles).
 