I love the energy here, but there is an untold dynamic at play.
The power draw on the CPU and GPU is very different.
Let's say the PS5's total power usage is 300 watts (just for the sake of argument).
How much of that constant power draw do you think the CPU gets, and how much does the GPU get?
My 3900X (bawlin) has 12 cores at a 3.8 GHz base clock and 70 MB of cache, with a max power draw of 105 watts.
The XSX and PS5 8-core CPUs are not drawing 100 watts, but let's just say they do, which leaves ~200 watts for the GPU.
PS5 - default mode: 200 W GPU + 100 W CPU = 300 W total.
PS5 - max GPU mode: (200 + 50) W GPU + (100 - 50) W CPU = 300 W total.
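A rough way to picture that trade-off is a fixed budget that just shifts watts between the two parts. This is only a toy sketch using the made-up numbers from above, not real PS5 figures or how Sony's power management actually works:

```python
# Toy model of a fixed power budget shared between CPU and GPU.
# All numbers are the illustrative ones from the post, not real PS5 figures.
TOTAL_BUDGET_W = 300

def split_budget(gpu_boost_w=0, cpu_default_w=100, gpu_default_w=200):
    """Shift watts from the CPU to the GPU while keeping the total fixed."""
    cpu_w = cpu_default_w - gpu_boost_w
    gpu_w = gpu_default_w + gpu_boost_w
    assert cpu_w + gpu_w == TOTAL_BUDGET_W  # the total never changes
    return cpu_w, gpu_w

print(split_budget(0))   # default mode -> (100, 200)
print(split_budget(50))  # max GPU mode -> (50, 250)
```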
This seems to be the real conundrum with the PS5: how much of a CPU hit are you willing to take to get max GPU performance?
And how much more frequency do you actually get from the additional wattage given to the CPU or GPU?
Since the power curve gets much steeper the higher you push the frequency of the CPU or GPU, trading power from one component to the other does not look all that efficient; you get less performance per watt the higher the frequency you push.
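Here is a toy sketch of why that happens, using the usual rule of thumb that dynamic power scales roughly with frequency times voltage squared, and that voltage has to climb as you chase higher clocks. The voltage numbers are invented purely to show the shape of the curve, not measured from any real chip:

```python
# Rough illustration of falling perf-per-watt at higher frequency.
# Assumes dynamic power ~ f * V^2, with voltage rising near the top of
# the frequency curve (voltage values are made up for illustration).
points = [
    # (frequency in GHz, core voltage in V) -- illustrative only
    (2.0, 0.90),
    (3.0, 1.00),
    (3.5, 1.10),
    (4.0, 1.25),
]

for f, v in points:
    power = f * v * v          # relative dynamic power, arbitrary units
    perf_per_watt = f / power  # simplifies to 1 / V^2, so it drops as V rises
    print(f"{f:.1f} GHz: power ~{power:.2f}, perf/W ~{perf_per_watt:.2f}")
```

Run it and the perf/W column shrinks as the clock climbs, which is the whole argument: the last few hundred MHz cost disproportionately more of the shared budget.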