Ulysses 31
That's wild... where's the original forum post. How is this possible?
Same launch price for a 15-20% faster card? Doesn't seem so bad to me.
And that's the Super version; the OG 4080 launched at $1,200.
The waiting game is kinda iffy in my view with tech innovation moving at lightning speed these days: you don't want to buy a GPU too close to a new generation anymore, because you fall behind faster. And that's what happens when you wait for the Ti versions, which makes them less of a value in my view.
Sitting on a GPU for multiple generations is also no longer a good idea these days. Imagine buying a 3080 Ti, and a year later a 4080 absolutely levels it with frame gen. With path tracing pushed forward in that generation, you need frame gen to get stable framerates, so between a 4080 and a 3080 you see a massive performance difference, and path tracing is off the menu for the 3080. Whereas if you bought a 3080 at launch, you got a full two years of games launching around that card's tech before things moved on. Imagine having a 3080 when path tracing releases and you can't use it because you lack a feature; imagine sitting on a 3080 when the 5080 launches and the next title makes full use of 4x frame gen to push visuals forward. That 3080 will sit at single digits trying to run it.
This is also why I'm convinced that sitting on higher-end GPUs for multiple generations is a dumb thing to do. A 5080 with 4x frame gen absolutely levels a 4080 in newer titles, and games that take advantage of it at higher settings will come out over the next two years as devs adopt it. So buying a 5080 at launch gets you two full years with a GPU that games are being made for; waiting for a 5080 Ti means you lose a year of that.
I think the best thing to do, before Nvidia announces their next GPUs, is to sell your current high-end GPU. Sell that 4080 for 900 bucks (they go for 900-950 second-hand in my country), add ~300 bucks, and you have a 5080, ready to go for this gen. Do the same when the next generation arrives. 4090s lose their value harder here, it seems, and you can pick them up second-hand now, so your return is far worse; but even there you can do it, just sell earlier than the announcement and you should be good.
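To put numbers on that flipping strategy, here is a minimal sketch of the arithmetic (the 900 resale figure and the ~1,200 local 5080 price are the post's ballpark numbers, not official prices):

```python
# Rough sketch of the sell-before-announcement math (illustrative numbers
# only; resale values and launch prices vary by market).
def net_upgrade_cost(resale_price: float, new_card_price: float) -> float:
    """Out-of-pocket cost of stepping up to the new generation."""
    return new_card_price - resale_price

# Post's example: sell a 4080 for ~900, pay ~1200 for a 5080 locally.
cost = net_upgrade_cost(resale_price=900, new_card_price=1200)
per_year = cost / 2  # spread over the ~2 years until the next generation
print(f"Net upgrade cost: ~{cost:.0f}, or ~{per_year:.0f} per year")
```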
Not hate or troll bait, just a unique observation from a 4090 owner. People who have a 4090 rarely say this.
At 21:30, Jeff says the FG on the 4090 is unacceptable and feels like playing on GeForce Now.
I like how he says, "I am here to fucking play video games and not add GeForce Now latency."
It's kinda rare to see a 4090 owner shitting on FG.
I personally don't see any reason to use it. With a 4090 or 5090 you already have amazing framerates with DLSS 2 (which I always use)... so why turn 70fps into 240 while adding more lag and potential artifacts? Are you playing the game, or watching the fps counter RTSS puts in the corner?
Dude's wrong; I enable frame gen in every game I can besides twitch shooters.
What do you mean? Jeff is great; he knows his stuff.
Besides, it's his opinion, but is he wrong? It does add latency. That's just what it is.
He is not wrong; he is sharing his opinion lol. What's wrong with that?
Besides, he knows his shit. He was always crazy about input latency.
If he stated what you stated, he's full of shit. Frame gen is the best thing that happened after DLSS. It helps massively with CPU bottlenecks and GPU bottlenecks.
Whatever. I don't agree. I think it's shit. But also, I only posted his opinion because I found it interesting; I am not here to defend the guy lol. Nothing he says is factually wrong either. Just listen to 10 minutes of it from 21:30.
"It feels like GeForce Now". WTF is he playing at? 20fps base that go to 40fps with frame gen. Duh, it feels like ass.What do you mean? jeff is great. he knows his stuff.
You think it's shit based on what? Don't you have a 3080? When have you tried DLSS frame generation?
I tried DLSS frame generation on a 4080. It looked smoother but undeniably added some input latency.
What was the context? You went from what fps to what fps? What was the game?
And obviously I tried some FSR 3 on my PC; it's a nightmare, even worse than DLSS 3.
Besides, I know how it works. You know how it works... it must add some latency. Why would I add any latency for more fake fps?
I feel a 4090 or 5090 is powerful enough to enjoy high fps without the added latency.
And no, Reflex is not the offset for that. Using Reflex lowers your fps a bit too. It's not free.
Edit: C'mon guys, we are not enemies. Are we really happy that Nvidia is marketing this AI crap and faking everything? C'mon. They are not even disclosing real performance numbers, probably because they're unimpressive compared to the 40xx cards.
Edit: He is not even outright shitting on it. He does say later that sometimes the difference is small enough that people might not notice. It's personal. He doesn't like it.
From 70 to 140 fps in Hogwarts Legacy felt OK.
Frame generation isn't magic. It has its caveats, but claiming it feels like GeForce Now is utter bullshit. Once again, I will refer to Cyberpunk 2077 and the utter hypocrisy of the discourse surrounding frame generation.
In Cyberpunk 2077, at native with Reflex off at 42fps, you had over 100ms of latency.
You would have around 80ms at 60fps. Why wasn't anyone bitching back then?
Or how about God of War on PS5 at 60fps with over 110ms of latency? On PC, over 120ms. Enable LLM (Nvidia's Low Latency Mode), and you still have 73ms.
If you were to add frame gen with Reflex in God of War, you still wouldn't come anywhere near the 73.4ms of Reflex Off + LLM, let alone the 112ms on PS5. Is everyone crying about the input latency on PS5? Because I played it at 60fps on my Pro and it's fine. It's much better on PC, but it's still perfectly playable on consoles.
The reality is, the fact that frame generation comes bundled with Reflex makes the added latency negligible compared to running without Reflex. There is nothing to bitch about unless you had been complaining about latency before the days of Reflex. Is Reflex without frame generation better for latency? Absolutely, but acting like frame generation is unplayable when you're getting 100fps+ (a base of 50-60fps) is nonsense. Just don't toggle it on when you have 20-30fps, and don't bother with it for fast-paced multiplayer games (which run on potatoes anyway, so why would you need it?). It's especially funny when games like GOW 2018 ran at a freakin' 30fps and 150ms of input latency at release, and nobody had a thing to say about it.
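Laying the post's latency figures side by side makes the argument easier to follow. A minimal sketch: the 50ms Reflex baseline for the frame-gen row is an assumption for illustration, and the ~10ms frame-gen overhead is the Digital Foundry average cited later in the thread:

```python
# Approximate end-to-end latency figures quoted in the post above (ms).
latency_ms = {
    "Cyberpunk 2077, 42fps native, Reflex off": 100,
    "Cyberpunk 2077, 60fps native, Reflex off": 80,
    "God of War, PS5, 60fps": 112,
    "God of War, PC, Reflex off + LLM": 73.4,
}

# Hypothetical frame-gen case: an assumed ~50ms with Reflex at a 50-60fps
# base, plus the ~10ms frame-gen overhead cited later in the thread.
fg_with_reflex_ms = 50 + 10

for scenario, ms in sorted(latency_ms.items(), key=lambda kv: kv[1]):
    note = "  <- higher than the FG+Reflex estimate" if ms > fg_with_reflex_ms else ""
    print(f"{ms:6.1f} ms  {scenario}{note}")
print(f"{fg_with_reflex_ms:6.1f} ms  frame gen + Reflex (assumed)")
```

Under these assumptions, every scenario in the post lands above the frame-gen-plus-Reflex figure, which is the hypocrisy the poster is pointing at.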
Do you guys think I could get away with running a 5090 on an 850-watt PSU? Or do you think I need to upgrade?
I'd upgrade; that's 150W below what's recommended.
That would be one of the worst GPU generational increases we've ever seen.
He’s full of shit.
Nope, he's 100% right, and Kenpachii, that is one of the worst takes I've ever read. Frame gen does nothing to help with CPU bottlenecks because it has absolutely zero impact on game logic and a negative impact on input latency. I'd say you get the feeling of more frames, but you don't really, because latency is higher. In terms of technologies from Nvidia, the stack is as follows:
Reflex: S-tier technology.
DLSS: B-tier technology.
FrameGen: Trash-tier technology.
Because 60fps stops looking good enough once you've experienced higher-framerate smoothness.
Well, it depends on the system configuration. Ryzen is not that power hungry, and tests with a beefy CPU and some overclocking on a 4090 show a total system draw at peak of about 600-650W.
Unless you plan to do something extreme like capping the power limit at 50% and gimping the performance.
Empirical evidence proves that he's wrong.
Per Digital Foundry:
Frame gen adds about 10ms of latency on average. GeForce Now adds about 60ms of latency on average.
Imagine comparing consoles to PC when >90% of console users are playing with a controller. On a mouse, you can actually feel the input latency very obviously. On a controller, you need ridiculous levels of latency to even start to notice it. Not the same thing at all.
I played on a 240Hz TN G-Sync monitor for 6 months, and indeed, 60fps started looking like shit.
But I wanted to go 4K IPS, and at the time those panels were 60Hz. I was so impressed with the picture quality that I got the monitor anyway, and I got used to 60Hz again in about a day.
Souls games and many console ports are 60fps-limited anyway, so at least there's no pain going back to 60.
Now I've had a 120Hz LG C1 OLED for 3 years and have no problem switching between 120, 60, and even some 30fps gaming if it's done well.
So yeah, 60fps definitely starts to look worse if you get used to high refresh rates, but it's not more than a mild annoyance for 15 minutes.
Unless you are weak-brained... but are you playing the game, or is the game playing you?
You really took what he said literally?
Jesus Christ wept when he saw your intellect.
He was obviously exaggerating, but it still does add input lag. 10ms or more, it does add it.
WTF? Frame gen with Reflex is very usable, and the input lag is not big at all. My personal issue with FG is that many developers have shit implementations, with many issues with the UI, etc.
There's nothing wrong with liking frame gen; however, I do not. Frame gen is not usable for me due to its artifacts and latency. It's very easy to tell when it's on, and I just cannot ignore its flaws.
Frame gen also VERY MUCH helps with CPU bottlenecks: when your game is limited to 50fps by the CPU, you will get 100fps thanks to frame gen. In CPU-limited scenarios you actually get the full 2x fps, versus less than 2x when GPU-limited (the cost of FG).
It does not help at all. The frames have absolutely nothing to do with the game and the game engine. I don't want it at all.
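A minimal sketch of what the two posters are actually disagreeing about, assuming 2x frame generation with negligible overhead when CPU-bound (hypothetical numbers following the 50fps example):

```python
# Frame generation doubles *presented* frames, but the simulation still
# ticks at the CPU-limited rate, so input is only sampled at the base fps.
def with_frame_gen(cpu_limited_fps: float) -> dict:
    base = cpu_limited_fps  # FG cost assumed ~0 here, since the GPU is idle
    return {
        "presented_fps": base * 2,            # smoothness shown on screen
        "simulation_fps": base,               # rate the game reacts to input
        "input_interval_ms": 1000.0 / base,   # unchanged by frame gen
    }

print(with_frame_gen(50))
# {'presented_fps': 100, 'simulation_fps': 50, 'input_interval_ms': 20.0}
```

Both sides are describing the same behaviour: the 50fps CPU limit does present as 100fps (the first poster's point), while game logic and input sampling stay at 50fps (the second poster's point).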
Reflex is S tier.
Agreed.
DLSS is S or A tier.
Not with its very visible artifacts, it's not. If DLSS had no artifacts, it would be S-tier. However, it's riddled with artifacts, so it cannot be S or A tier.
FG is B or C tier in its current version.
I've already shared my feelings on frame gen, so it's obvious I'd never agree with this rating.
Wait, what? That can't be right...
Who knows, but the specs for the cards would suggest that this can very much be right.
His post didn't mention what CPU he's running; it could be a 14900KS for all I know. You also want some breathing room for transient spikes.
I'd say 850 is enough, even for a 500W GPU. 1000 is an ultra-safe, no-matter-what-you-put-in-your-config recommendation.
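A rough steady-state power budget for that question. The 575W figure matches the 5090's advertised board power; the CPU gaming-load numbers and the 75W for the rest of the system are assumptions for illustration, and transient spikes argue for extra headroom on top:

```python
# Steady-state system draw estimate (watts); transients can spike higher.
def system_draw(gpu_watts: int, cpu_watts: int, rest_watts: int = 75) -> int:
    return gpu_watts + cpu_watts + rest_watts

for cpu_name, cpu_w in [("Ryzen, gaming load (assumed)", 90),
                        ("14900KS-class, gaming load (assumed)", 180)]:
    steady = system_draw(gpu_watts=575, cpu_watts=cpu_w)
    print(f"{cpu_name}: ~{steady}W steady, {850 - steady}W headroom on 850W")
```

Which is roughly why the thread splits: with a modest CPU an 850W unit has real headroom, while with an unknown or power-hungry CPU the margin shrinks enough that the 1000W recommendation is the safe answer.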
I would put Reflex in C tier; I've never been able to use it without it causing noticeable visual stutter. I have an older card, so I'm not sure if it has gotten any better with the 4000-series cards.
Uh, yeah? If someone is going to make claims about why they "hate" something or think "it's shit," they'd better damn well provide actual truth as to why they don't like it. Common fucking sense.
And as Gaiff shared, with Reflex, latency oftentimes doesn't increase at all compared to native, and is oftentimes still better than the console latency you find acceptable when you play.
Your argument is asinine. And what happened to you deciding to have a cooler head after your ban/return? Maybe don't insult people's intelligence just because they point out why you're wrong about your own biases.
I have no argument. If you listened to Jeff for 10-15 minutes, you would get what he said.
and that's where frame gen comes in. I don't have to get used to it. I'm playing Indiana Jones with a controller, not Counter-Strike 2 with a mouse. Discussions of input latency are moot.
Fair enough. You can use frame gen to play with worse input lag and artifacts rather than taking 10 minutes of unconscious effort to get used to 60Hz again... Doesn't make sense to me, but you play however you like.
People weren't complaining about Cyberpunk's 80ms latency at 60fps before Reflex was introduced. Now, 60ms is too much? Right.
Which people are you referring to? I certainly complained.
I'm sure you were. Likely the type of player who also thinks 60fps sucks and it's 120fps or bust.
How are we feeling about the 5070 Ti compared to the 5070? Is 4GB of extra VRAM really worth $200? I'm just wondering if it will even be noticeable. I'm playing at 1440p...
I'm waiting for the 5070 Super.
I don't follow the guy at all, but I clicked his channel, went and found his latest gameplay, and he's grinding some Warframe with a PlayStation controller at potato velocity. Doesn't exactly give me the impression of a high-octane, twitchy, latency-sensitive gamer.
Why no use strong brain for overcome 10ms lag? If only weak brain no like 120fps -> 60/30fps, despite that also adding input latency alongside dropping all those frames?
Yeah, we're even lucky Nvidia did Reflex well before frame gen. Maybe they even regret it: they practically created a new standard for people, and now people want them to hit that standard with frame gen. I will say Reflex is still underrated. I remember playing Cyberpunk before I even knew how to measure latency, and I could see the game being super laggy despite getting 60-70fps on my GTX 1080 at 1080p native with most settings on medium and high. I had to cap the framerate to 45 just to get back to responsive gameplay.
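That capped-45fps trick has a plausible mechanical explanation: when the GPU is the bottleneck, the CPU buffers frames ahead, and each queued frame adds a full frame-time of delay; capping below the GPU limit keeps the queue empty. A toy model (the queue depths are assumptions, not measurements):

```python
# Simplified render-queue latency model: total delay grows with both the
# frame time and the number of frames sitting in the queue.
def est_latency_ms(fps: float, queued_frames: int) -> float:
    return (1 + queued_frames) * 1000.0 / fps

print(f"uncapped ~65fps, 2 frames queued: ~{est_latency_ms(65, 2):.0f} ms")
print(f"capped 45fps, queue empty:        ~{est_latency_ms(45, 0):.0f} ms")
```

Which is the counterintuitive result the poster describes: the lower, capped framerate can feel more responsive than the higher, GPU-bound one.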
There's also a worrying trend with Alan Wake 2 and Indiana Jones where Reflex is not available for non-frame-generation users. No idea what went wrong in these games, but I hope it's unintentional.
I think they did the same bullshit in Wukong, where they paired Reflex with frame generation in the menu. There's no reason not to offer Reflex on cards that don't have frame generation, besides artificially inflating the value of Lovelace cards. Reflex on its own is fantastic.