It does not help with stuttering brought about by massive frame-time spikes. It does, however, help with judders brought about by persisting or late frames, as John duly highlights here:
It helps only in the sense that without VRR and v-sync, with uneven frame pacing, the monitor would show constant screen tearing, as the GPU delivers frames at random intervals.
But the game is still delivering frames at uneven intervals, and gamers can notice this.
So VRR is not fixing the issue, it's just hiding the additional screen tearing. And screen tearing is often perceived as judder.
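To make the tearing point concrete, here is a rough sketch of where a tear line lands when frames arrive at uneven intervals on a fixed 60 Hz display with no v-sync and no VRR. The arrival times are made up purely for illustration:

```python
# Rough sketch: where on screen a tear appears when frames arrive
# at uneven intervals on a fixed 60 Hz display (no v-sync, no VRR).
# All timings are illustrative, not measurements.

REFRESH_PERIOD_MS = 1000 / 60  # ~16.67 ms per scanout at 60 Hz

# Uneven frame delivery times (ms since start), e.g. from spiky GPU load.
frame_arrivals_ms = [0.0, 14.0, 36.0, 45.0, 70.0]

for t in frame_arrivals_ms:
    # Fraction of the current scanout already drawn when the new frame lands.
    # Swapping buffers at that moment produces a tear line at that height.
    tear_position = (t % REFRESH_PERIOD_MS) / REFRESH_PERIOD_MS
    print(f"frame at {t:6.1f} ms -> tear at {tear_position:.0%} down the screen")
```

Because the arrivals aren't locked to the scanout, each tear shows up at a different height, which is why it reads as judder rather than a stable artifact.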
It's much less obvious when the fps drops from 120 to 80 than from 80 to 40, or from 60 to 40. Shader compilation stutters have nothing to do with this discussion, so not sure why you're bringing them up. Point is, you're completely incorrect that it's preferable to play at a locked 60 than to unlock your frame rate and let VRR do its thing. Almost no one on PC locks their monitor or games to 60 or 120. Your counter-argument only applies when there are huge shifts in frame times/rates, such as instantly going from 120 to 70fps, but how often does that even happen? Most of the time when gaming, your frame rate will remain within a certain window and gradually increase or decrease as you enter heavier scenes or the action ramps up.
Huh, maybe you should have read the argument prior to butting in and saying a bunch of incorrect information? The discussion started because a poster claimed that the PS5 Pro could potentially "run" games better than a $2500 PC equipped with a 4090. I said that this wasn't going to happen. Another poster then said that it might very well be the case with Rift Apart, and I answered that it wasn't, because it's already much better on PC and runs at a higher frame rate. The poster then replied that going above 60 in Rift Apart isn't good because it increases frame-pacing issues, so it would be tied with 60fps in most cases, and that's blatantly wrong. Rift Apart doesn't have frame-pacing issues above 60fps at all, and you don't get massive fluctuations that push your fps to 120 one second and then down to 65 the next.
The preferred way on PC is to set an fps cap a few frames below the monitor's max refresh rate so your frame rate never goes beyond the VRR window. After that, it's perfectly fine to let your game go above 60 without hitting 120 consistently. Everyone with a high refresh-rate monitor does that. Almost no one who cannot hit 120 most of the time will cap themselves to 60. This is nonsense.
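That capping rule can be sketched in a couple of lines. The 3 fps margin below max refresh is a common community rule of thumb, not a hard spec, so treat the numbers as an assumption:

```python
# Sketch of the "cap a few fps below max refresh" rule of thumb,
# so the frame rate always stays inside the VRR window.
# The default 3 fps margin is a common community recommendation,
# not an official figure.

def vrr_fps_cap(max_refresh_hz: int, margin: int = 3) -> int:
    """Return a frame-rate cap just under the monitor's max refresh."""
    return max_refresh_hz - margin

print(vrr_fps_cap(144))  # 141
print(vrr_fps_cap(165))  # 162
```

With the cap in place, the game can float anywhere between the bottom of the VRR window and the cap without ever falling back to fixed-refresh behavior at the top end.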
Even with VRR, I can notice a drop from 80 to 60 fps. VRR helps by eliminating the screen tearing and by maintaining reasonably low latency. But the drop is still noticeable.
VRR will not fix performance optimization issues. And it does not fix stutters from asset streaming or shader compilation.
But it's a lot better than the previous solutions.
Without VRR, devs on consoles had two solutions. One is to keep enforcing v-sync. And that means a drop from 30 or 60 fps will result in v-sync snapping the frame rate to an even divisor of the TV's refresh rate.
So if a game is running at 60 fps, gets a drop to 59 fps, and enforces v-sync, it will drop to 30 fps to stay in sync with the TV at half its refresh rate. And this is very noticeable to the player.
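The 59-to-30 snap falls out of how double-buffered v-sync works: a frame that misses its refresh slot waits for the next one. A minimal sketch, assuming simple double buffering on a fixed 60 Hz display (render times are illustrative):

```python
import math

# Sketch: with double-buffered v-sync on a fixed-refresh display, a frame
# that misses its refresh slot waits for the next one, so the effective
# frame rate snaps to an even divisor of the refresh rate (60 -> 30 -> 20...).
# Render times below are illustrative.

def vsync_effective_fps(render_time_ms: float, refresh_hz: int = 60) -> float:
    refresh_period_ms = 1000 / refresh_hz
    # Number of whole refresh intervals each frame ends up occupying.
    slots = math.ceil(render_time_ms / refresh_period_ms)
    return refresh_hz / slots

print(vsync_effective_fps(16.0))   # 60.0 fps: fits inside one 16.67 ms slot
print(vsync_effective_fps(16.95))  # 30.0 fps: ~59 fps internally, halved by v-sync
```

So even a 1 fps shortfall below the 60 fps target costs you half the frame rate, which is exactly why it's so noticeable.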
The other option is to drop v-sync when the frame rate drops. And this means screen tearing, which is also noticeable, but less jarring than enforced v-sync.
VRR is by far the best solution. But the frame rate is still dropping. With small drops in frame rate, the player might not even notice it. Especially if the frame pacing is good.
But if it's a drop of something like 20 fps, the player will still notice it. And if it's a stutter from asset streaming or shader compilation, the player will still notice it.
So let me reiterate: VRR only fixes the problem of syncing frame delivery between the GPU and the monitor.
It does not solve performance issues.