Every time I upgrade, I hate the idea of bending the knee to Nvidia. I hate that they have a soft monopoly, and I want more consumer choice. Buuuuuut, DLSS (quality), DLAA, and now frame generation all work extremely well. Also, I've been using Nvidia cards since I got back into PC gaming with the GTX 680, and I've never once had a driver issue. That whole "it just works" thing has been 100% true for me. Everyone I've talked to (more like interrogated) with an AMD card has eventually admitted to some issues, though they typically downplay them.
So as much as I hate to say it, if you're going to go with one of those three options, I'd vote for the 4070 Ti. I don't think there's any chance the 5000 series cards will come down meaningfully in price. We'd need the AI market to crash for that to happen, and if it's going to, I think it's still more than a year out.
That said, if you're getting 80fps in Cyberpunk at acceptable settings with your current hardware, what are you actually going to accomplish with more GPU horsepower? 120fps? Does that increase in frame rate do a lot for your brain? I have to admit it really doesn't for mine. I get 97% of the value from a solid 60fps; going up to 90, 120, or 165 (my monitor's max refresh rate) barely adds anything for me. I use Riva Tuner to cap my frame rate at 120 at the system level, since that's the highest point where I can sense any difference at all. Anything above that is just wasting electricity.
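(If anyone wants to replicate the cap: the easy way is the RTSS GUI, but the setting also lives in RTSS's profile files. I'm going from memory here, so double-check the path and key names on your own install, but the global profile looks roughly like this:)

```
; RivaTuner Statistics Server global profile -- path from memory, may differ:
; C:\Program Files (x86)\RivaTuner Statistics Server\Profiles\Global
[Framerate]
; 0 means uncapped; 120 caps every hooked game at 120fps
Limit=120
```

Putting it in the Global profile is what makes it apply system-wide instead of to a single game.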
Maybe I'm just ignorant, but I doubt there are going to be any games out this year that are more technically demanding than Cyberpunk. So if you're already happy with how that runs, you might take a cold shower and try to wait a while longer.