It's as though you want to be right and wrong at the same time.
Ridiculous. It's not a bad thing when it lets them achieve clocks they couldn't have hit by opting for a fixed clock.
Yeah in a perfect world Sony could just go balls to the wall with clocks without having to worry about thermals or noise. Same with Microsoft.
If we were in a perfect world.
More unknown variables to contend with on the PC. Everything is known about the console by the manufacturer of that console.
If it's that much better, why don't any PC GPUs use it?
You seem to be suggesting the Xbox Series X is weaker than the PS4 Pro?
The point has been driven into the ground. Xbox One still runs those games at 720p/900p. The 6TF Xbox One X still had to downgrade its resolution for RE2 Remake to match the same FPS output as a 4TF PS4 Pro.
We can just chalk that up as "Optimized for the Xbox One X".
Do you run the games on a dev kit or the retail console you have at home (with variable clocks)?
Developers can optimize their games using a fixed profile. They don't even need to pay attention to variable clocks.
Nah I think most people get it.
Another day, another Microsoft talk. Their talk is really all over the place.
The way they talk, they are making it only more confusing for everyone.
You should know that both companies must have each other's dev kits by now. MS even knew that they had a more powerful console before Sony's announcement; that's why they stated "the most powerful console" before any official announcement from Sony.
MS is so full of crap. I have NO IDEA why they seem to be off their game since the PS5 showing. They need more confidence. There's no way they could know if the PS5 is harder to optimize for or not.
This analogy is so flawed I don't know where to start.
A goalkeeper.
One moving, one static.
Which is harder to score against.
A target.
A goalkeeper.
One moving, one static.
Which is harder to score against.
What unknown variables would stop, say, Asus or MSI from releasing a card that always runs at its maximum advertised clock speed? The reason they're not doing that is because there simply isn't a need for it. Oh, and the tech press and customers would rip them a new one.
More unknown variables to contend with on the PC.
I don't really get what's unpredictable about only having the GPU clock up when the workload demands it. That's still predictable behavior. What's the point of having the thing run at the same speeds whether you're playing Halo Infinite or Spelunky?
Everything is known about the console by the manufacturer of that console.
Why not work within the limits of the design and power envelope in a consistent, predictable way?
It's as though you want to be right and wrong at the same time.
You're making his point for him and want to say he's ridiculous.
Say it with me: "variable clocks are a compromise." It literally is a give (power to the CPU) and take (power from the GPU) on the PS5.
Do you run the games on a dev kit or the retail console you have at home (with variable clocks)?
I would say the contrary. They seem to be pretty consistent in their messaging since the PS5 reveal. They keep saying they aren't worried and still have the most powerful console. Also, I would imagine their decades as a software developer might give them some insight into the challenges of developing.
MS is so full of crap. I have NO IDEA why they seem to be off their game since the PS5 showing. They need more confidence. There's no way they could know if the PS5 is harder to optimize for or not.
PlayStation 5's boost clocks and how they work
One of the areas I was particularly interested to talk about was the boost clock of the PlayStation 5 - an innovation that essentially gives the system on chip a set power budget based on the thermal dissipation of the cooling assembly. Interestingly, in his presentation, Mark Cerny acknowledged the difficulties of cooling PlayStation 4 and suggested that having a maximum power budget actually made the job easier. "Because there are no more unknowns, there's no need to guess what power consumption the worst case game might have," Cerny said in his talk. "As for the details of the cooling solution, we're saving them for our teardown, I think you'll be quite happy with what the engineering team came up with."
Regardless, the fact is that there is a set power level for the SoC. Whether we're talking about mobile phones, tablets, or even PC CPUs and GPUs, boost clocks have historically led to variable performance from one example to the next - something that just can't happen on a console. Your PS5 can't run slower or faster than your neighbour's. The developmental challenges alone would be onerous to say the least.
"We don't use the actual temperature of the die, as that would cause two types of variance between PS5s," explains Mark Cerny. "One is variance caused by differences in ambient temperature; the console could be in a hotter or cooler location in the room. The other is variance caused by the individual custom chip in the console, some chips run hotter and some chips run cooler. So instead of using the temperature of the die, we use an algorithm in which the frequency depends on CPU and GPU activity information. That keeps behaviour between PS5s consistent."
Inside the processor is a power control unit, constantly measuring the activity of the CPU, the GPU and the memory interface, assessing the nature of the tasks they are undertaking. Rather than judging power draw based on the nature of your specific PS5 processor, a more general 'model SoC' is used instead. Think of it as a simulation of how the processor is likely to behave, and that same simulation is used at the heart of the power monitor within every PlayStation 5, ensuring consistency in every unit.
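To make that idea concrete, here is a minimal sketch of what an activity-driven, deterministic governor of this kind could look like. Everything in it is assumed for illustration - the names (MODEL_SOC_POWER, pick_frequencies), the wattages and the frequency steps are invented, not Sony's actual algorithm or figures. The only property it tries to capture is the one described above: clocks are a pure function of reported activity and a fixed model-SoC power table, never of the measured temperature or silicon quality of the individual unit.

```python
# Illustrative sketch only - hypothetical numbers and names, not Sony's algorithm.
# The key property it mimics: frequency is a pure function of reported CPU/GPU
# activity and a fixed "model SoC" power table, so every unit behaves identically.

MODEL_SOC_POWER = {
    # (domain, frequency_ghz): watts per unit of activity for the idealised reference chip
    ("cpu", 3.0): 10.0, ("cpu", 3.5): 15.0,
    ("gpu", 2.0): 40.0, ("gpu", 2.23): 55.0,
}
POWER_BUDGET_W = 180.0   # fixed SoC budget set by the cooling assembly (hypothetical)

def estimated_power(domain: str, freq: float, activity: float) -> float:
    """Power the *model* chip would draw at this frequency and activity level (0..1)."""
    return MODEL_SOC_POWER[(domain, freq)] * activity

def pick_frequencies(cpu_activity: float, gpu_activity: float) -> tuple:
    """Deterministically choose the highest clock pair whose modelled power fits the budget."""
    candidates = [(3.5, 2.23), (3.5, 2.0), (3.0, 2.23), (3.0, 2.0)]  # highest first
    for cpu_f, gpu_f in candidates:
        total = estimated_power("cpu", cpu_f, cpu_activity) + \
                estimated_power("gpu", gpu_f, gpu_activity)
        if total <= POWER_BUDGET_W:
            return cpu_f, gpu_f
    return candidates[-1]   # fall back to the lowest pair

# Same inputs always give the same clocks, regardless of die quality or room temperature:
print(pick_frequencies(cpu_activity=0.6, gpu_activity=1.0))
```

Because the table describes the model chip rather than the real die, two consoles that won or lost the silicon lottery still resolve to identical clocks for the same workload.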
"The behaviour of all PS5s is the same," says Cerny. "If you play the same game and go to the same location in the game, it doesn't matter which custom chip you have and what its transistors are like. It doesn't matter if you put it in your stereo cabinet or your refrigerator, your PS5 will get the same frequencies for CPU and GPU as any other PS5."
[Embedded video: PlayStation 5 New Details From Mark Cerny: Boost Mode, Tempest Engine, Back Compat + More - a new video report from Digital Foundry on what we've learned about the system since the reveal.]
Developer feedback highlighted two areas of concern. The first was the idea that not all PS5s would run in the same way - something the model SoC concept addresses. The second was the nature of the boost itself: would frequencies hit a peak for a set amount of time before throttling back? This is how smartphone boost tends to operate.
"The time constant, which is to say the amount of time that the CPU and GPU take to achieve a frequency that matches their activity, is critical to developers," adds Cerny. "It's quite short, if the game is doing power-intensive processing for a few frames, then it gets throttled. There isn't a lag where extra performance is available for several seconds or several minutes and then the system gets throttled; that isn't the world that developers want to live in - we make sure that the PS5 is very responsive to power consumed. In addition to that the developers have feedback on exactly how much power is being used by the CPU and GPU."
Mark Cerny sees a time where developers will begin to optimise their game engines in a different way - to achieve optimal performance for the given power level. "Power plays a role when optimising. If you optimise and keep the power the same you see all of the benefit of the optimisation. If you optimise and increase the power then you're giving a bit of the performance back. What's most interesting here is optimisation for power consumption, if you can modify your code so that it has the same absolute performance but reduced power then that is a win. "
In short, the idea is that developers may learn to optimise in a different way, by achieving identical results from the GPU but doing it faster via increased clocks delivered by optimising for power consumption. "The CPU and GPU each have a power budget, of course the GPU power budget is the larger of the two," adds Cerny. "If the CPU doesn't use its power budget - for example, if it is capped at 3.5GHz - then the unused portion of the budget goes to the GPU. That's what AMD calls SmartShift. There's enough power that both CPU and GPU can potentially run at their limits of 3.5GHz and 2.23GHz, it isn't the case that the developer has to choose to run one of them slower."
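As a rough illustration of the SmartShift behaviour described here - with entirely made-up wattages, since the real budget and its split are not public - the accounting could be sketched like this:

```python
# Hypothetical illustration of the SmartShift idea: the CPU and GPU each have a nominal
# share of a fixed SoC power budget, and whatever the CPU does not use can be handed to
# the GPU. All wattages are invented for the example.

TOTAL_BUDGET_W = 180.0
CPU_SHARE_W = 60.0
GPU_SHARE_W = TOTAL_BUDGET_W - CPU_SHARE_W   # 120 W nominal GPU share

def gpu_power_allowance(cpu_power_used_w: float) -> float:
    """GPU may spend its own share plus any wattage the CPU leaves on the table."""
    unused_cpu_budget = max(0.0, CPU_SHARE_W - cpu_power_used_w)
    return GPU_SHARE_W + unused_cpu_budget

# A lightly loaded CPU frees headroom for the GPU...
print(gpu_power_allowance(cpu_power_used_w=40.0))   # 140.0
# ...while a fully loaded CPU still leaves the GPU its full nominal share.
print(gpu_power_allowance(cpu_power_used_w=60.0))   # 120.0
```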
"There's another phenomenon here, which is called 'race to idle'. Let's imagine we are running at 30Hz, and we're using 28 milliseconds out of our 33 millisecond budget, so the GPU is idle for five milliseconds. The power control logic will detect that low power is being consumed - after all, the GPU is not doing much for that five milliseconds - and conclude that the frequency should be increased. But that's a pointless bump in frequency," explains Mark Cerny.
PS5 caps its CPU and GPU clocks at 3.5GHz and 2.23GHz respectively, but how stable are the frequencies?
At this point, the clocks may be faster, but the GPU has no work to do. Any frequency bump is totally pointless. "The net result is that the GPU doesn't do any more work, instead it processes its assigned work more quickly and then is idle for longer, just waiting for v-sync or the like. We use 'race to idle' to describe this pointless increase in a GPU's frequency," explains Cerny. "If you construct a variable frequency system, what you're going to see based on this phenomenon (and there's an equivalent on the CPU side) is that the frequencies are usually just pegged at the maximum! That's not meaningful, though; in order to make a meaningful statement about the GPU frequency, we need to find a location in the game where the GPU is fully utilised for 33.3 milliseconds out of a 33.3 millisecond frame.
"So, when I made the statement that the GPU will spend most of its time at or near its top frequency, that is with 'race to idle' taken out of the equation - we were looking at PlayStation 5 games in situations where the whole frame was being used productively. The same is true for the CPU, based on examination of situations where it has high utilisation throughout the frame, we have concluded that the CPU will spend most of its time at its peak frequency."
Put simply, with race to idle out of the equation and both CPU and GPU fully used, the boost clock system should still see both components running near to or at peak frequency most of the time. Cerny also stresses that power consumption and clock speeds don't have a linear relationship. Dropping frequency by 10 per cent reduces power consumption by around 27 per cent. "In general, a 10 per cent power reduction is just a few per cent reduction in frequency," Cerny emphasises.
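Those figures are consistent with the common rule of thumb that dynamic power scales roughly with the cube of frequency once voltage is lowered along with clock (P ∝ f·V², with V tracking f). The cube law is an assumption here rather than something Cerny states, but it reproduces both quoted numbers:

```python
# Rule-of-thumb check, assuming dynamic power ~ frequency^3 when voltage scales with clock.
power_at = lambda rel_freq: rel_freq ** 3

print(1 - power_at(0.90))       # ~0.27: a 10% frequency drop cuts power by roughly 27%
print(1 - 0.90 ** (1 / 3))      # ~0.035: a 10% power cut costs only ~3.5% of frequency
```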
It's an innovative approach, and while the engineering effort that went into it is likely significant, Mark Cerny sums it up succinctly: "One of our breakthroughs was finding a set of frequencies where the hotspot - meaning the thermal density of the CPU and the GPU - is the same. And that's what we've done. They're equivalently easy to cool or difficult to cool - whatever you want to call it."
There's likely more to discover about how boost will influence game design. Several developers speaking to Digital Foundry have stated that their current PS5 work sees them throttling back the CPU in order to ensure a sustained 2.23GHz clock on the graphics core. It makes perfect sense, as most game engines right now are architected with the low-performance Jaguar in mind - even a doubling of throughput (ie 60fps vs 30fps) would hardly tax PS5's Zen 2 cores. However, this doesn't sound like a boost solution, but rather performance profiles similar to what we've seen on Nintendo Switch. "Regarding locked profiles, we support those on our dev kits, it can be helpful not to have variable clocks when optimising. Released PS5 games always get boosted frequencies so that they can take advantage of the additional power," explains Cerny.
What computer case do you have?
What unknown variables would stop, say, Asus or MSI from releasing a card that always runs at its maximum advertised clock speed? The reason they're not doing that is because there simply isn't a need for it. Oh, and the tech press and customers would rip them a new one.
All modern systems do downclock, even the consoles; the PS5's implementation of AMD's SmartShift is different from a "simple" downclocking of a CPU or GPU.
I don't really get what's unpredictable about only having the GPU clock up when the workload demands it. That's still predictable behavior. What's the point of having the thing run at the same speeds whether you're playing Halo Infinite or Spelunky?
And maybe Cerny is just making it up? Timestamped.
Also consider the Sony patent on cooling - we have not seen it yet. Is 22% extra GPU speed worth it? We shall see.
The "when triangles are small" discussion from Cerny's speech - what do you think he is referring to?
What else has lots of small triangles...
You should know that both companies must have each other's dev kits by now. MS even knew that they had a more powerful console before Sony's announcement; that's why they stated "the most powerful console" before any official announcement from Sony.
These are multi-billion-dollar companies, not casual people digging for info.
I would say the contrary. They seem to be pretty consistent in their messaging since the PS5 reveal. They keep saying they aren't worried and still have the most powerful console. Also, I would imagine their decades as a software developer might give them some insight into the challenges of developing.
I have no doubts that there will be some fantastic games on the PS5, especially from 1st party devs. However, I think it's pretty reasonable to think that a moving target (variable rates) would be harder to hit. Personally, I'm a bit concerned about the upclock on the PS5 GPU. I know if I overclocked my 2080 in my PC to 2.3GHz it would be a crashfest, even with a 750 watt power supply. I think MS's decision to keep everything moderately clocked will pay dividends in heat and noise. I'm glad this is the approach they are staying with.
The example was more to show what a compromise is.
What? No, that's SmartShift, and it only works one way: unused power from the CPU goes to the GPU.
Why even say this? You're just restating unrelated facts and correlating them in an attempt to make a false equivalence.
Yeah, both consoles have compromises. XSX had to compromise with a lower GPU clock speed by opting for a fixed clock, but has sustained performance. PS5 had to compromise with fluctuating clocks, but can push peak clocks higher than normal thanks to it.
As I said before, claiming variable > fixed clocks when you actually factor in reality is just wrong.
Which is it?
That still doesn't change the fact that there would have been benefits to them implementing a variable clock over the fixed clock they opted for.
Which is my point. In the real world, fixed clocks > variable clocks is a false narrative.
Where do the clients/gamers/players play games?
Do developers optimise on the retail console or the dev kit?
Well that's what I heard when people said games would be held back by current gen...
Yeah, and you have that: an octa-core Ryzen with SMT over 3GHz, vs a shitty Jaguar at 2.3GHz maximum (Xbox One X). Even at 3GHz it would be > 4x more powerful than Jaguar.
Anyway, seriously, you think better AI next gen? lol. We have the same AI levels as the PS2 gen, with much better CPUs.
MS is so full of crap. I have NO IDEA why they seem to be off their game since the PS5 showing. They need more confidence. There's no way they could know if the PS5 is harder to optimize for or not.
Perhaps MS knows people that have both dev kits, or they tried it on their own.
There's no way they could know if the PS5 is harder to optimize for or not.
Perhaps MS knows people that have both dev kits or they tried it on their own.
MS is so full of crap. I have NO IDEA why they seem to be off their game since the PS5 showing. They need more confidence. There's no way they could know if the PS5 is harder to optimize for or not.
I doubt it. lol
MS definitely knows the lowdown; they have been confidently dropping mini-bombs if you are paying attention.
Unlike internet fans, MS does not need to rely on tech/PR 'deep dives'; there are no grey areas in which to imagine wonderful scenarios.
Something can't be easier to develop for if it is varied rather than stable.
Something can't be easier to develop for if the hardware delivers less power with varied clocks.
Devs are also whores who will say anything to hype things up. Just read developer interviews about any other previous generation jump.
But they know if their console is easier to develop for with fixed clocks or variable clocks. They don't need the PS5 for that.
And they say the former is true.
Quite possibly. Anything Cerny says means no more to me than what Phil says means to the Sony fans that frequently like to imply or outright state Phil is lying.
No, it's just a normal gaming forum thread. You have to understand that when you step out of your next-gen safe space and don't have all your Sony loyalists to back you up, opinions other than the hive-mind narrative that permeates that thread are permitted.
But it's OK, I'm sure you won't notice, as your eyes can't look away from the car-crash PS5 internet-router stylings.
It's as though you want to be right and wrong at the same time.
You're making his point for him and want to say he's ridiculous.
Say it with me: "variable clocks are a compromise." It literally is a give (power to the CPU) and take (power from the GPU) on the PS5.
More unknown variables to contend with on the PC. Everything is known about the console by the manufacturer of that console.
Why not work within the limits of the design and power envelope in a consistent, predictable way?
The real question is why this technology has not taken over the laptop world. Should Sony have put this into their laptops to kill the competition?
If it's free performance with no compromise.
You seem to be suggesting the Xbox Series X is weaker than the PS4 Pro?
Do you run the games on a dev kit or the retail console you have at home (with variable clocks)?
Nah I think most people get it.
It might just be you.
Also, you should have read the article before posting, not reading it is a sure way to be confused about its contents.
Yes, that is exactly what I'm saying. The PS4 Pro is weaker than the Xbox Series X. /s
Microsoft is so transparently jealous. If you're that confident in your engineering decisions, then let your exclusive games do the talking.
Wait, never mind... I understand. Carry on then, Microsoft.
Stop trying to be rational, it's only going to make you informed.
He really didn't imply they aren't interested in TFLOPs; he implied they aren't interested in increasing their max TFLOPs by way of variable clock rates.
"Variable clocks aren't worth the extra TFLOPs on paper" would be a better paraphrase.
Now obviously one could argue he's wrong, but that's more aligned to what he said.
Because it is harder to predict the behaviour of your game. E.g. if a scene runs just fine at the higher clock rate, and the system happens to deliver that clock most of the time you test the scene, but performance struggles at a lower clock that also sometimes occurs in the same scene, it is very difficult to replicate the issue in order to fix it. Of course, variable clock rate is only one of many factors that can lead to variable performance. In fact, a Microsoft game recently ran into issues with another variable aspect: Ori and the Will of the Wisps had issues with the memory management of Unity and had inconsistent loading times and framerates as a consequence.
He says it would have made it harder to develop for if they used variable clocks? How so? I really wish he would expand on that.
What's the difference? Instead of having the chip run at locked frequencies and letting the power vary, they chose to set limits (peaks) for the frequencies and give the chip a fixed power budget. These peaks are not sustained and are the top end of the spectrum. It's likely that neither the CPU nor the GPU will both be running at peak frequencies simultaneously at any given time, given the fixed power budget, with one of the units needing to reduce its frequency (downclocking and reducing power draw) in order to increase the frequency (upclocking and increasing power draw) of the other unit. Cerny states this decision made the device easier to cool. I said downclocking because the unit downclocks from those peak frequencies.
Do you actually understand what you posted? The frequencies at which the GPU/CPU operate are based on a fixed power draw (which is how they're able to predictably cool the console efficiently). What you suggested in your original post is that this is based on thermal throttling.
"Shortcomings, as in downclocking the CPU and/or GPU in order to keep your device cool (which in turn keeps the fans quiet)" -GODBody![]()
Not true; it will run both at max frequency as long as the workload wouldn't cause a higher power draw.
It's likely that neither the CPU nor the GPU will both be running at peak frequencies simultaneously at any given time, given the fixed power budget.
Sony's variable clock rate method is more predictable than usual, though.
Because it is harder to predict the behaviour of your game. E.g. if a scene runs just fine at the higher clock rate, and the system happens to deliver that clock most of the time you test the scene, but performance struggles at a lower clock that also sometimes occurs in the same scene, it is very difficult to replicate the issue in order to fix it. Of course, variable clock rate is only one of many factors that can lead to variable performance. In fact, a Microsoft game recently ran into issues with another variable aspect: Ori and the Will of the Wisps had issues with the memory management of Unity and had inconsistent loading times and framerates as a consequence.
Not true; it will run both at max frequency as long as the workload wouldn't cause a higher power draw.
Frequencies aren't what cause power draw on their own.
According to Cerny they can both be max most of the time (likely because most of the time games aren't fully stressing one or the other from a power draw standpoint). We'll really need to wait to hear from devs about the downclocking and how much of an issue it is.
Not true; it will run both at max frequency as long as the workload wouldn't cause a higher power draw.
Frequencies aren't what cause power draw on their own.
According to Cerny they can both be max most of the time (likely because most of the time games aren't fully stressing one or the other from a power draw standpoint). We'll really need to wait to hear from devs about the downclocking and how much of an issue it is.
If both units are running at their peak frequencies constantly, then there is no need for a variable frequency.
Frequency and power draw are explicitly tied together. When you overclock a CPU on a PC, you increase the amount of power given to it, thereby increasing the frequency at which it runs.
When the GPU is at its peak, it's drawing power away from the CPU to hit that peak frequency, and if the power budget is fixed, the CPU is not getting extra power to hit its peak frequency at the same time.
I'm not comparing variable clocking to my 2080. I was saying if I overclocked my 2080 to the GPU clock speed of the PS5 it would cause stability and heat problems, even with a large power supply.
You sure about this?
I'm sorry, but you don't seem to understand "how" the variable clocks work in the PS5. You can't compare it to your 2080 in your PC. This is totally different.
He might be referring to multithreaded processes versus single-threaded processes. You might want a higher clock to process a single thread faster than usual, because it's not possible to run the task in parallel. So while you bump the clock, you might not be using the full number of logic units available, so your current draw might not be that much higher over a given interval of time. Thus you wouldn't be drawing more power.
If both units are running at their peak frequencies constantly, then there is no need for a variable frequency.
Frequency and power draw are explicitly tied together. When you overclock a CPU on a PC, you increase the amount of power given to it, thereby increasing the frequency at which it runs.
When the GPU is at its peak, it's drawing power away from the CPU to hit that peak frequency, and if the power budget is fixed, the CPU is not getting extra power to hit its peak frequency at the same time.
Why would they both be running at max frequencies if the workload is not intensive enough to require it?
Again... you are missing one huge factor: what is running on the CPU/GPU (the workload).
They can both be running at max as long as the workload isn't too high.
"So, when I made the statement that the GPU will spend most of its time at or near its top frequency, that is with 'race to idle' taken out of the equation - we were looking at PlayStation 5 games in situations where the whole frame was being used productively. The same is true for the CPU, based on examination of situations where it has high utilisation throughout the frame, we have concluded that the CPU will spend most of its time at its peak frequency."
Put simply, with race to idle out of the equation and both CPU and GPU fully used, the boost clock system should still see both components running near to or at peak frequency most of the time.
There’s enough power that both CPU and GPU can potentially run at their limits of 3.5GHz and 2.23GHz
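For what it's worth, the point being argued in this exchange can be shown with a crude model in which power tracks how much of the chip is actually switching as well as the clock it runs at. The wattages and activity figures below are invented purely for illustration:

```python
# Crude illustration: power ~ activity_fraction * watts_at_max_clock. Numbers are made up.
GPU_MAX_W, CPU_MAX_W, BUDGET_W = 130.0, 50.0, 160.0

def fits_budget(cpu_activity: float, gpu_activity: float) -> bool:
    """Can both chips hold their peak clocks with this fraction of their logic switching?"""
    return cpu_activity * CPU_MAX_W + gpu_activity * GPU_MAX_W <= BUDGET_W

print(fits_budget(0.6, 0.9))   # True  - a typical frame: both stay at max clocks
print(fits_budget(1.0, 1.0))   # False - a worst-case power virus: something has to downclock
```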
I'm not comparing variable clocking to my 2080. I was saying if I overclocked my 2080 to the GPU clock speed of the PS5 it would cause stability and heat problems even with a large power supply.
I get the power shift from CPU to GPU Sony is promoting, but less power equals less performance, so there is going to have to be a compromise if the shift needs to occur. For example, if a developer needs more GPU power and it throttles the CPU to provide it, will CPU functions like AI suffer? These are the questions that haven't really been answered.
Not true; it will run both at max frequency as long as the workload wouldn't cause a higher power draw.
Frequencies aren't what cause power draw on their own.
According to Cerny they can both be max most of the time (likely because most of the time games aren't fully stressing one or the other from a power draw standpoint). We'll really need to wait to hear from devs about the downclocking and how much of an issue it is.
Why would they both be running at max frequencies if the workload is not intensive enough to require it?
That is 100% NOT how it works. The dev does not decide where the power goes. All they have control over is the code they write, so they can write less CPU-intensive code if they want to ensure max GPU power, but they can't explicitly do that by just setting power levels or anything.
That's not exactly how this works. The CPU and GPU can run at full power at the same time. But if the dev wanted to put some CPU power to the side, they can. They can develop their game with this lesser CPU power in mind (say 3.2GHz instead of 3.5GHz) so that the GPU is always at 10.3TFs of power.
Yeah, see my next post on the topic where I revisited the actual quotes; he actually says "at or near" when talking about max frequencies, so if you read between the lines... it actually means most of the time one or the other probably gives.
Cerny made so many conflicting or vague statements in that Eurogamer article. The phrase "most of the time" is mentioned throughout the boost clock talk. Then there is "CPU and GPU can potentially run at their limits of 3.5GHz and 2.23GHz". There are more statements I could point out, including Digital Foundry's about developers throttling back on the CPU in order to sustain the GPU's 2.23GHz clock speed. Now why would they need to do that if it can run both CPU and GPU at peak frequency?
Now here is the heart of the matter Jason Ronald is talking about...
"Mark Cerny sees a time where developers will begin to optimise their game engines in a different way - to achieve optimal performance for the given power level. "Power plays a role when optimising. If you optimise and keep the power the same you see all of the benefit of the optimisation. If you optimise and increase the power then you're giving a bit of the performance back. What's most interesting here is optimisation for power consumption, if you can modify your code so that it has the same absolute performance but reduced power then that is a win. "
In short, the idea is that developers may learn to optimise in a different way, by achieving identical results from the GPU but doing it faster via increased clocks delivered by optimising for power consumption. "
Six times the word "optimise" is used when talking about programming for variable clocks. It's going to be more developer work to get the most out of. That's all he was saying.
There are more statements I could point out, including Digital Foundry's about developers throttling back on the CPU in order to sustain the GPU's 2.23GHz clock speed.
You literally just said the same thing I said. They would move power to the GPU and take power from the CPU, which would throttle the CPU and reduce its performance. The question is what the impacts are.
That's not exactly how this works. The CPU and GPU can run at full power at the same time. But if the dev wanted to put some CPU power to the side, they can. They can develop their game with this lesser CPU power in mind (say 3.2GHz instead of 3.5GHz) so that the GPU is always at 10.3TFs of power.