
GeForce RTX 5090 is $1,999, 5080 $999, 5070 Ti $749, 5070 $549 (Availability Starting Jan 30 for RTX 5090 and 5080)

The waiting game is kinda iffy in my view with all the tech innovations going at lightning speed these days. You don't want to buy a GPU too close to a new generation anymore, because you fall behind faster. And that's what happens when you wait for the Ti versions, which makes them less of a value in my view.

Sitting on a GPU for multiple generations is also not a good idea anymore these days. Imagine buying a 3080 Ti: a 4080 absolutely levels it with framegen a year later. With path tracing pushed forward in that generation, you need framegen to get stable framerates. Compare a 4080 to a 3080 and you see a massive performance difference because of it, and sadly path tracing is off the menu. Whereas if you bought a 3080 at launch, you got a full 2 years of games launching around that tech before the shift towards newer tech. Imagine having a 3080 when path tracing releases and you can't use it because you lack a feature. Imagine sitting on a 3080 when the 5080 launches and the next title makes full use of 4x framegen and pushes visuals forward; that 3080 will sit at single digits trying to run it.

This is also why I'm convinced that sitting on higher-end GPUs for multiple generations is a dumb thing to do. A 5080 with 4x framegen absolutely levels a 4080 in newer titles, and games that take advantage of that with higher settings will come out in the next 2 years as devs start to use it. So buying a 5080 at launch gets you 2 full years of owning the GPU that games are being made for. Waiting for a 5080 Ti, that's a year you lose out of it.

I think the best thing to do, before Nvidia announces their next GPUs, is to sell your current high-end GPU. Sell that 4080 for 900 bucks (they go for 900-950 secondhand in my country), add ~300 bucks, and you have a 5080 and are ready to go for this gen. Do the same when the next generation arrives. 4090s lose their value harder here; it seems you can pick them up secondhand now, so your return is far worse, but even there you can do it, just sell it earlier than the announcement and you should be good.
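A quick sketch of that sell-and-upgrade math, for what it's worth. The 4080 resale value is the poster's figure; the ~1200 local 5080 price is an assumption implied by "add ~300 bucks", and prices obviously vary by region:

```python
# Sell-before-announcement math from the post, as an illustration.
resale_4080 = 900            # secondhand 4080, poster's local market
local_price_5080 = 1200      # assumed local price implied by "add ~300 bucks"

net_upgrade_cost = local_price_5080 - resale_4080
cost_per_year = net_upgrade_cost / 2   # one upgrade per ~2-year generation

print(f"net cost this gen: {net_upgrade_cost}")   # 300
print(f"per year of ownership: {cost_per_year}")  # 150.0
```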

Multi Frame Gen is dumb.
 

rofif

Can’t Git Gud
Not hate troll bait, just a unique observation from a 4090 owner. People who have a 4090 rarely say that.


At 21:30 Jeff says the FG on the 4090 is unacceptable and feels like playing on GeForce Now.
I like how he says "I am here to fucking play video games and not add GeForce Now latency".
It's kinda rare seeing a 4090 owner shitting on FG.
I personally don't see any reason to use it. With a 4090/5090 you already have amazing framerates with DLSS 2 (which I always use)... so why turn 70fps into 240 but add more lag and potential artifacts? Are you playing a game or watching the fps number from RTSS in the corner?
 

Kenpachii

Member
Not hate troll bait, just a unique observation from a 4090 owner. People who have a 4090 rarely say that.


At 21:30 Jeff says the FG on the 4090 is unacceptable and feels like playing on GeForce Now.
I like how he says "I am here to fucking play video games and not add GeForce Now latency".
It's kinda rare seeing a 4090 owner shitting on FG.
I personally don't see any reason to use it. With a 4090/5090 you already have amazing framerates with DLSS 2 (which I always use)... so why turn 70fps into 240 but add more lag and potential artifacts? Are you playing a game or watching the fps number from RTSS in the corner?


Dude's wrong. I enable framegen in every game I can besides twitch shooters.
 

Gaiff

SBI’s Resident Gaslighter
Not hate troll bait, just a unique observation from a 4090 owner. People who have a 4090 rarely say that.


At 21:30 Jeff says the FG on the 4090 is unacceptable and feels like playing on GeForce Now.
I like how he says "I am here to fucking play video games and not add GeForce Now latency".
It's kinda rare seeing a 4090 owner shitting on FG.
I personally don't see any reason to use it. With a 4090/5090 you already have amazing framerates with DLSS 2 (which I always use)... so why turn 70fps into 240 but add more lag and potential artifacts? Are you playing a game or watching the fps number from RTSS in the corner?

He’s full of shit.
 

Kenpachii

Member
He is not wrong; he is sharing his opinion lol. What's wrong with that?
Besides, he knows his shit. He was always crazy about input latency.

If he stated what you stated, he's full of shit. Framegen is the best thing that happened after DLSS. It helps massively with CPU bottlenecks and GPU bottlenecks.
 

rofif

Can’t Git Gud
If he stated what you stated, he's full of shit. Framegen is the best thing that happened after DLSS. It helps massively with CPU bottlenecks and GPU bottlenecks.
Whatever. I don't agree. I think it's shit.
But also, I only posted his opinion because I found it interesting. I am not here to defend the guy lol.
Nothing he says is factually wrong either. Just listen to 10 minutes of it from 21:30.
 
Last edited:

Gaiff

SBI’s Resident Gaslighter
What do you mean? Jeff is great. He knows his stuff.
Besides, it's his opinion, but is he wrong? It does add latency. That's just what it is.
"It feels like GeForce Now." WTF is he playing at? 20fps base that goes to 40fps with frame gen? Duh, it feels like ass.
Whatever. I don't agree. I think it's shit.
But also, I only posted his opinion because I found it interesting. I am not here to defend the guy lol.
Nothing he says is factually wrong either. Just listen to 10 minutes of it from 21:30.
You think it's shit based on what? Don't you have a 3080? When have you tried DLSS frame generation?
 

rofif

Can’t Git Gud
"It feels like GeForce Now". WTF is he playing at? 20fps base that go to 40fps with frame gen. Duh, it feels like ass.

You think it's shit based on what? Don't you have a 3080? When have you tried DLSS frame generation?
I tried DLSS frame gen on a 4080. It looked smoother but undeniably added some input latency.
And I've obviously tried some FSR 3 on my PC; it's a nightmare, and it's worse than DLSS 3.
Besides, I know how it works. You know how it works... it must add some latency. Why would I add any latency for more fake fps?
I feel the 4090/5090 is powerful enough to enjoy high fps without added latency.
And no, Reflex is not the offset for that. Using Reflex lowers your fps a bit too. It's not free.

Edit: C'mon guys, we are not enemies. Are we really happy that Nvidia is marketing this AI crap and faking everything? C'mon. They are not even disclosing real performance numbers, because it's probably unimpressive compared to the 40xx cards.

Edit: He is not even downright shitting on it. He does say later that sometimes the difference is small enough that people might not notice. It's personal. He doesn't like it.
 
Last edited:

Gaiff

SBI’s Resident Gaslighter
I tried DLSS frame gen on a 4080. It looked smoother but undeniably added some input latency.
What was the context? You went from what fps to what fps? What was the game?
And I've obviously tried some FSR 3 on my PC; it's a nightmare, and it's worse than DLSS 3.
Besides, I know how it works. You know how it works... it must add some latency. Why would I add any latency for more fake fps?
I feel the 4090/5090 is powerful enough to enjoy high fps without added latency.
And no, Reflex is not the offset for that. Using Reflex lowers your fps a bit too. It's not free.

Edit: C'mon guys, we are not enemies. Are we really happy that Nvidia is marketing this AI crap and faking everything? C'mon. They are not even disclosing real performance numbers, because it's probably unimpressive compared to the 40xx cards.

Edit: He is not even downright shitting on it. He does say later that sometimes the difference is small enough that people might not notice. It's personal. He doesn't like it.
Frame generation isn't magic. It has its caveats, but claiming it feels like GeForce Now is utter bullshit. Once again, I will refer to Cyberpunk 2077 and the utter hypocrisy of the discourse surrounding frame generation.

Here, at native with Reflex off at 42fps, you had over 100ms of latency.

[image: Cyberpunk 2077 latency chart]


You would have around 80ms at 60fps. Why wasn't anyone bitching back then?

Or how about God of War on PS5 at 60fps with over 110ms of latency? On PC, over 120ms. Enable LLM, you still have 73ms.

[image: God of War latency chart]


If you were to add frame gen with Reflex in God of War, you still wouldn't come anywhere near the 73.4ms of Reflex Off+LLM, let alone the 112ms on PS5. Is everyone crying about the input latency on PS5? Because I played it at 60fps on my Pro and it's fine. It's much better on PC, but it's still perfectly playable on consoles.

The reality is, the fact that frame generation comes bundled with Reflex makes the added latency compared to no Reflex negligible. There is nothing to bitch about unless you had been complaining about latency before the days of Reflex. Is Reflex without frame generation better for latency? Absolutely, but acting like frame generation is unplayable when you're getting 100fps+ (a base of 50-60fps) is nonsense. Just don't toggle it on when you have 20-30fps, and don't bother with it for fast-paced multiplayer games (they run on potatoes anyway, so why would you need it?). It's especially funny when games like GOW 2018 ran at a freakin' 30fps and 150ms of input latency when they came out, but nobody had a thing to say about it.
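For what it's worth, the figures above fit a crude pipeline model of latency. This is an illustrative sketch, not measured data: the 35ms overhead and the queue depths are assumptions tuned to roughly match the screenshots cited in this post.

```python
# Crude latency model: total ~= fixed overhead (input sampling, game
# logic, display) + N frame times of render queue at the BASE framerate.
# Generated frames never sample input, so base fps is what matters.
# All constants here are assumptions for illustration, not measurements.

def frame_time_ms(fps: float) -> float:
    return 1000.0 / fps

def est_latency_ms(base_fps: float, queued_frames: float,
                   overhead_ms: float = 35.0) -> float:
    return overhead_ms + queued_frames * frame_time_ms(base_fps)

# Reflex off: driver buffers ~3 frames -> ~106ms at 42fps, near the
# ~100ms Cyberpunk figure above.
print(est_latency_ms(42, 3.0))
# Reflex on, no frame gen: queue trimmed to ~1 frame -> ~52ms at 60fps.
print(est_latency_ms(60, 1.0))
# Reflex + frame gen: one extra held-back frame -> ~68ms at a 60fps base,
# still below the no-Reflex numbers people accepted for years.
print(est_latency_ms(60, 2.0))
```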
 
Last edited:

rofif

Can’t Git Gud
What was the context? You went from what fps to what fps? What was the game?

Frame generation isn't magic. It has its caveats, but claiming it feels like GeForce Now is utter bullshit. Once again, I will refer to Cyberpunk 2077 and the utter hypocrisy of the discourse surrounding frame generation.

Here, at native with Reflex off at 42fps, you had over 100ms of latency.

[image: Cyberpunk 2077 latency chart]


You would have around 80ms at 60fps. Why wasn't anyone bitching back then?

Or how about God of War on PS5 at 60fps with over 110ms of latency? On PC, over 120ms. Enable LLM, you still have 73ms.

[image: God of War latency chart]


If you were to add frame gen with Reflex in God of War, you still wouldn't come anywhere near the 73.4ms of Reflex Off+LLM, let alone the 112ms on PS5. Is everyone crying about the input latency on PS5? Because I played it at 60fps on my Pro and it's fine. It's much better on PC, but it's still perfectly playable on consoles.

The reality is, the fact that frame generation comes bundled with Reflex makes the added latency compared to no Reflex negligible. There is nothing to bitch about unless you had been complaining about latency before the days of Reflex. Is Reflex without frame generation better for latency? Absolutely, but acting like frame generation is unplayable when you're getting 100fps+ (a base of 50-60fps) is nonsense. Just don't toggle it on when you have 20-30fps, and don't bother with it for fast-paced multiplayer games (they run on potatoes anyway, so why would you need it?). It's especially funny when games like GOW 2018 ran at a freakin' 30fps and 150ms of input latency when they came out, but nobody had a thing to say about it.
From 70 to 140 in Hogwarts felt OK.
From like 40 to 70-80 in Portal RTX felt worse.
Probably because 70 is a good base to start from.
 

dbztrk

Member
Do you guys think that I could get away with running a 5090 on an 850-watt PSU? Or do you think I need to upgrade?
 
Last edited:

Celcius

°Temp. member
Do you guys think that I could get away with running a 5090 on an 850-watt PSU? Or do you think I need to upgrade?
I'd upgrade; that's 150W below what's recommended.
Unless you plan to do something extreme like cap the power limit at 50% and gimp the performance.
 
Last edited:

Mister Wolf

Member
Not hate troll bait, just a unique observation from a 4090 owner. People who have a 4090 rarely say that.


At 21:30 Jeff says the FG on the 4090 is unacceptable and feels like playing on GeForce Now.
I like how he says "I am here to fucking play video games and not add GeForce Now latency".
It's kinda rare seeing a 4090 owner shitting on FG.
I personally don't see any reason to use it. With a 4090/5090 you already have amazing framerates with DLSS 2 (which I always use)... so why turn 70fps into 240 but add more lag and potential artifacts? Are you playing a game or watching the fps number from RTSS in the corner?


Because 60fps stops looking good enough once you've experienced higher framerate smoothness.
 
He’s full of shit.
If he stated what you stated, he's full of shit. Framegen is the best thing that happened after DLSS. It helps massively with CPU bottlenecks and GPU bottlenecks.
Nope, he's 100% right, and Kenpachii, that is one of the worst takes I've ever read. Framegen does nothing to help with CPU bottlenecks because it has absolutely zero impact on game logic and a negative impact on input latency. I'd say that you get the feeling of more frames, but you don't, because latency is higher. In terms of technologies from Nvidia, the stack is as follows:

Reflex: S tier technology.
DLSS: B tier technology.
FrameGen: Trash tier technology.
 

rofif

Can’t Git Gud
Because 60fps stops looking good enough once you've experienced higher framerate smoothness.
I played on a 240Hz TN G-Sync monitor for 6 months, and indeed, 60fps started looking like shit.
But I wanted to go 4K IPS, and at the time those panels were 60Hz. I was so impressed with the picture quality that I got the monitor back then, and I got used to 60Hz again in like 1 day.
Souls games and many console games are 60fps-limited anyway, so at least there is no pain going back to 60.
Now I've had a 120Hz LG C1 OLED for 3 years and have no problem switching between 120, 60, and even some 30fps gaming if it's done well.

So yeah, 60fps definitely starts to look worse if you get used to high Hz, but it's not more than a mild annoyance for 15 minutes.
Unless you are weak-brained... but are you playing the game, or is the game playing you?
 

analog_future

Resident Crybaby
What do you mean? Jeff is great. He knows his stuff.
Besides, it's his opinion, but is he wrong? It does add latency. That's just what it is.

Empirical evidence proves that he's wrong.

Per Digital Foundry:

[embedded Digital Foundry video]

Frame Gen adds about 10ms of latency on average. GeForce Now adds about 60ms of latency on average. They're not at all comparable, and the vast, vast majority of players would never even be able to perceive 10ms of latency.
 
Last edited:
I'd upgrade; that's 150W below what's recommended.
Unless you plan to do something extreme like cap the power limit at 50% and gimp the performance.
Well, it depends on the system configuration; Ryzen is not that power hungry, and tests with a beefy CPU and some overclocking with a 4090 show a total system draw at peak of about 600-650W.

I'd say 850W is enough, even for a 500W GPU. 1000W is the ultra-safe recommendation no matter what you put in your config.
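A rough headroom check in the same spirit. The wattages below are assumptions pulled from this exchange (500W GPU, Ryzen-class CPU), and the 1.5x transient factor is a ballpark, not a measured spec:

```python
# Rough PSU headroom check. All wattages are assumptions from this
# thread, not measured numbers.
GPU_W = 500       # assumed GPU board power
CPU_W = 150       # assumed gaming draw for a Ryzen-class CPU
REST_W = 75       # fans, drives, RAM, motherboard allowance
PSU_W = 850

steady = GPU_W + CPU_W + REST_W                # ~725W worst-case steady state
transient = int(GPU_W * 1.5) + CPU_W + REST_W  # GPUs spike above TDP for ms

print(f"steady: {steady}W ({steady / PSU_W:.0%} of PSU)")      # ~85%
print(f"transient: {transient}W ({transient / PSU_W:.0%})")    # ~115%
# This is why opinions split: 850W covers steady state, and a good
# ATX 3.x unit is rated for short excursions; 1000W covers everything.
```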
 

rofif

Can’t Git Gud
Empirical evidence proves that he's wrong.

Per Digital Foundry:

[embedded Digital Foundry video]

Frame Gen adds about 10ms of latency on average. GeForce Now adds about 60ms of latency on average.

You really took what he said literally?
Jesus Christ wept when he saw your intellect.
He was obviously exaggerating, but it still does add input lag. 10ms or more, it does add it.
 
If you were to add frame gen with Reflex in God of War, you still wouldn't come anywhere near the 73.4ms of Reflex Off+LLM, let alone the 112ms on PS5. Is everyone crying about the input latency on PS5? Because I played it at 60fps on my Pro and it's fine. It's much better on PC, but it's still perfectly playable on consoles.

The reality is, the fact that frame generation comes bundled with Reflex makes the added latency compared to no Reflex negligible. There is nothing to bitch about unless you had been complaining about latency before the days of Reflex. Is Reflex without frame generation better for latency? Absolutely, but acting like frame generation is unplayable when you're getting 100fps+ (a base of 50-60fps) is nonsense. Just don't toggle it on when you have 20-30fps, and don't bother with it for fast-paced multiplayer games (they run on potatoes anyway, so why would you need it?). It's especially funny when games like GOW 2018 ran at a freakin' 30fps and 150ms of input latency when they came out, but nobody had a thing to say about it.
Imagine comparing consoles to PC when >90% of console users are playing with a controller. On a mouse, you can actually feel the input latency very obviously. On a controller, you need ridiculous levels of latency to even start to notice it. Not the same thing at all.
 

Bojji

Member
Nope, he's 100% right, and Kenpachii, that is one of the worst takes I've ever read. Framegen does nothing to help with CPU bottlenecks because it has absolutely zero impact on game logic and a negative impact on input latency. I'd say that you get the feeling of more frames, but you don't, because latency is higher. In terms of technologies from Nvidia, the stack is as follows:

Reflex: S tier technology.
DLSS: B tier technology.
FrameGen: Trash tier technology.

WTF? Frame gen with Reflex is very usable and the input lag is not big at all. My personal issue with FG is that many developers have shit implementations and there are many issues with UI, etc.

Frame gen also VERY MUCH helps with CPU bottlenecks: when your game is limited to 50fps by the CPU, you will get 100fps thanks to frame gen. In CPU-limited scenarios you actually get the full 2x fps, versus less than 2x when GPU-limited (the cost of FG); see the sketch after the list below.

Reflex is S tier
DLSS is S or A tier
FG is B or C in its current version.
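A quick sketch of that 2x-versus-less-than-2x point, under a simple assumption: 2x frame gen inserts one generated frame per rendered frame, and generating a frame costs the GPU a fixed slice of time (the 3ms cost is illustrative, not a measured figure):

```python
# Illustrative model of frame gen output under CPU vs GPU bottlenecks.
def fg_output_fps(base_fps: float, gpu_limited: bool,
                  fg_cost_ms: float = 3.0) -> float:
    if gpu_limited:
        # The GPU must pay for generation out of its render budget,
        # so the base frame rate drops before doubling.
        effective_base = 1000.0 / (1000.0 / base_fps + fg_cost_ms)
        return 2 * effective_base
    # CPU-limited: the GPU has idle headroom, so generation is ~free
    # and output is a clean 2x of the CPU-bound frame rate.
    return 2 * base_fps

print(fg_output_fps(50, gpu_limited=False))  # 100 fps: full doubling
print(fg_output_fps(50, gpu_limited=True))   # ~87 fps: less than 2x
```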
 
Last edited:

Mister Wolf

Member
I played on a 240Hz TN G-Sync monitor for 6 months, and indeed, 60fps started looking like shit.
But I wanted to go 4K IPS, and at the time those panels were 60Hz. I was so impressed with the picture quality that I got the monitor back then, and I got used to 60Hz again in like 1 day.
Souls games and many console games are 60fps-limited anyway, so at least there is no pain going back to 60.
Now I've had a 120Hz LG C1 OLED for 3 years and have no problem switching between 120, 60, and even some 30fps gaming if it's done well.

So yeah, 60fps definitely starts to look worse if you get used to high Hz, but it's not more than a mild annoyance for 15 minutes.
Unless you are weak-brained... but are you playing the game, or is the game playing you?

And that's where Frame Gen comes in. I don't have to get used to it. I'm playing Indiana Jones with a controller, not Counter-Strike 2 with a mouse. Discussions of input latency are moot.
 
Last edited:

analog_future

Resident Crybaby
You really took what he said literally?
Jesus Christ wept when he saw your intellect.
He was obviously exaggerating, but it still does add input lag. 10ms or more, it does add it.

Uh, yeah? If someone is going to make claims about why they "hate" something or think "it's shit", they better damn well provide actual truth as to why they don't like it. Common fucking sense.

And as Gaiff shared, with Reflex, latency oftentimes doesn't increase at all compared to native, and is oftentimes still even better than the console latency that you find acceptable when you play.



Your argument is asinine. And what happened to you deciding to have a cooler head after your ban/return? Maybe don't insult people's intelligence just because they point out why you're wrong about your own biases.
 
Last edited:
WTF? Frame gen with Reflex is very usable and the input lag is not big at all. My personal issue with FG is that many developers have shit implementations and there are many issues with UI, etc.
There's nothing wrong with liking frame gen; however, I do not. Frame Gen is not usable for me due to its artifacts and latency. It's very easy to tell when it's on, and I just cannot ignore its flaws.
Frame gen also VERY MUCH helps with CPU bottlenecks: when your game is limited to 50fps by the CPU, you will get 100fps thanks to frame gen. In CPU-limited scenarios you actually get the full 2x fps, versus less than 2x when GPU-limited (the cost of FG).
It does not help at all. The frames have absolutely nothing to do with the game and game engine. I don't want it at all.
Reflex is S tier
Agreed
DLSS is S or A tier
Not with its very visible artifacts, it's not. If DLSS had no artifacts, it would be S tier. However, it's riddled with artifacts, so it cannot be S tier or A tier.
FG is B or C in its current version.
I've already shared my feelings on Frame Gen so it's obvious I'd never agree with this rating.
 

Celcius

°Temp. member
Well, it depends on the system configuration; Ryzen is not that power hungry, and tests with a beefy CPU and some overclocking with a 4090 show a total system draw at peak of about 600-650W.

I'd say 850W is enough, even for a 500W GPU. 1000W is the ultra-safe recommendation no matter what you put in your config.
His post didn't mention what CPU he's running; it could be a 14900KS for all I know. You also want some breathing room for transient spikes.
I wouldn't want to cut it that close regardless, but I guess we'll see how power hungry they are in most gaming scenarios once they launch.
 
Last edited:

proandrad

Member
WTF? Frame gen with Reflex is very usable and the input lag is not big at all. My personal issue with FG is that many developers have shit implementations and there are many issues with UI, etc.

Frame gen also VERY MUCH helps with CPU bottlenecks: when your game is limited to 50fps by the CPU, you will get 100fps thanks to frame gen. In CPU-limited scenarios you actually get the full 2x fps, versus less than 2x when GPU-limited (the cost of FG).

Reflex is S tier
DLSS is S or A tier
FG is B or C in its current version.
I would put Reflex in C tier; I've never been able to use it without it causing noticeable visual stutter. I have an older card, so I'm not sure if it has gotten any better with 4000 series cards.
 

rofif

Can’t Git Gud
Uh, yeah? If someone is going to make claims about why they "hate" something or think "it's shit", they better damn well provide actual truth as to why they don't like it. Common fucking sense.

And as Gaiff shared, with Reflex, latency oftentimes doesn't increase at all compared to native, and is oftentimes still even better than the console latency that you find acceptable when you play.



Your argument is asinine. And what happened to you deciding to have a cooler head after your ban/return? Maybe don't insult people's intelligence just because they point out why you're wrong about your own biases.
I have no argument. If you listened to Jeff for 10-15 minutes you would get what he said.
I just posted a vid, and from my experience, Reflex usually ain't worth it.
It has its issues too. It costs performance and can increase frame-pacing issues.
Imo just use VRR capped 2-3fps below max Hz and it's great. Triple buffering (borderless window) is way faster than double buffering too.
Reflex is just the old frame queue setting set to 1... more or less.
Reflex is not a tool to offset FG delay. It was there way before under a different name, like I mentioned.

Well, I am cooler-headed, aren't I? But I am still not resistant to the PCMR fanboy BS around these parts. It's gotten really bad.
You guys should be more willing to critique Nvidia by now.
And that's where Frame Gen comes in. I don't have to get used to it. I'm playing Indiana Jones with a controller, not Counter-Strike 2 with a mouse. Discussions of input latency are moot.
Fair enough. You can use framegen to play with worse input lag and artifacts rather than taking 10 minutes of unconscious effort to get used to 60Hz again... Doesn't make sense to me, but you play however you like.
I am not as sensitive to input lag with a controller either.
The graph with GOW being 120ms on PS5 makes no sense to me. It doesn't feel like it, but the 30fps mode is terrible and worse than the PS4 version was on PS4.
 

Gaiff

SBI’s Resident Gaslighter
Imagine comparing consoles to PC when >90% of console users are playing with a controller. On a mouse, you can actually feel the input latency very obviously. On a controller, you need ridiculous levels of latency to even start to notice it. Not the same thing at all.
People weren’t complaining about Cyberpunk’s 80ms latency at 60fps before Reflex was introduced. Now, 60ms is too much? Right.
 
Wait, what? That can't be right...

My current thinking is that either the effective clock speeds are a lot lower than Ada and/or NV gutted the FP32 to make room for the AI/RT changes.

I don't think NV would sandbag either, because if AMD fears that's what's going on, they will stick with $499 and have a product that could easily be faster in raster and maybe even RT than NV's $999 one.

Reviews are gonna be glorious.
 

Buggy Loop

Gold Member
Wait, what? That can't be right...

I've seen so much extrapolation, frame counting (from a video? LOL), and not knowing rigs or game versions or driver versions in the past that it's not even worth stopping to think about it.

Benchmarks in 2 weeks will tell everything.
 

bender

What time is it?
Not hate troll bait, just a unique observation from a 4090 owner. People who have a 4090 rarely say that.


At 21:30 Jeff says the FG on the 4090 is unacceptable and feels like playing on GeForce Now.
I like how he says "I am here to fucking play video games and not add GeForce Now latency".
It's kinda rare seeing a 4090 owner shitting on FG.
I personally don't see any reason to use it. With a 4090/5090 you already have amazing framerates with DLSS 2 (which I always use)... so why turn 70fps into 240 but add more lag and potential artifacts? Are you playing a game or watching the fps number from RTSS in the corner?


I love Jeff, but bro needs to do something about his eyebrows.
 

//DEVIL//

Member
The more I read about the 5080 and 5090, the more I am convinced that I should grab a 4090 for cheap, with all these people trying to sell their cards on Facebook.

From the look of it, the 4090 is still more powerful than the 5080 by a good margin, with more VRAM.

A 30 to 40% performance boost for the 5090 is nice, but when your base framerate goes from like 22 frames in Wukong to 27... like, who gives a fuck -_-.

Someone in my area is selling an MSI Gaming Trio 4090 for $990 US. I think I should buy that and be done with it.
 

Pegasus Actual

Gold Member
What do you mean? Jeff is great. He knows his stuff.
Besides, it's his opinion, but is he wrong? It does add latency. That's just what it is.
I don't follow the guy at all, but I clicked his channel, went to find his latest gameplay, and he's grinding some Warframe with a PlayStation controller at potato velocity. Doesn't exactly give me the impression of a high-octane, twitchy, latency-sensitive gamer.
So yeah, 60fps definitely starts to look worse if you get used to high Hz, but it's not more than a mild annoyance for 15 minutes.
Unless you are weak-brained... but are you playing the game, or is the game playing you?

You really took what he said literally?
Jesus Christ wept when he saw your intellect.
He was obviously exaggerating, but it still does add input lag. 10ms or more, it does add it.
Why no use strong brain for overcome 10ms lag? If only weak brain no like 120fps->60/30fps despite that also adding input latency alongside dropping all those frames?

FrameGen is great for just about any single-player game. My big annoyance with it is that it requires HAGS (hardware-accelerated GPU scheduling) enabled, which trashes performance for my most common use case of having some YouTube up on one of my extra monitors while gaming.

Weird to me that anyone would complain about FrameGen latency when Nvidia ships it along with constant latency-improving updates to their products.
 

yamaci17

Member
People weren’t complaining about Cyberpunk’s 80ms latency at 60fps before Reflex was introduced. Now, 60ms is too much? Right.
Yeah, we're even lucky Nvidia did Reflex well before frame gen. Maybe they even regret it; they practically created a new standard for people, and now people want them to achieve that standard with frame gen. I will say Reflex is still underrated. I remember playing Cyberpunk, and I didn't even know how to measure latency and stuff, but I could see the game being super laggy despite getting 60-70 FPS with my GTX 1080 at 1080p native with most settings set to medium and high. I had to cap the frames to 45 just so I could get back to responsive gameplay.

Reflex is great: no need for frame limiting, no need to watch GPU usage or guess what your GPU usage is going to be, just enable it and you get low latency all the time.

I even remember in 2021 how I'd complain about a single-player game not getting Reflex, and most people would say "come on mate, what are you going to do with low latency, this is not a competitive game".

But for the sake of argument, I don't personally think 60 FPS sucks without Reflex. Up until Cyberpunk, I was able to get a locked 60 or 90 FPS with GPU headroom in most FPS games, so I never knew how latency-heavy an FPS game can be with GPU-bound input lag. GPU-bound input lag is definitely playable in TPS games like Alan Wake 2 etc., but of course, I still like Reflex a lot in those titles too.

There's also a worrying trend with Alan Wake 2 and Indiana Jones where Reflex is not available for non-frame-generation users. No idea what went wrong in these games, but I hope it is unintentional.
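The 45fps cap anecdote above is the classic GPU-bound queue effect. Here's a sketch under an illustrative model (the queue depth and numbers are assumptions, not measurements): when the GPU is pegged, finished input samples sit in a full render queue, and capping below the GPU limit (or enabling Reflex) keeps that queue empty.

```python
# Illustrative model of GPU-bound input lag: the age of the input sample
# by the time its frame reaches the display.

def input_age_ms(fps: float, frames_waiting: float) -> float:
    # 1 frame to render + however many finished frames sit queued ahead of it
    return (1 + frames_waiting) * 1000.0 / fps

print(input_age_ms(60, 2))   # ~50ms: 60fps with the GPU pegged, queue full
print(input_age_ms(45, 0))   # ~22ms: capped at 45fps, queue empty
# Lower fps, yet each displayed frame reflects much fresher input,
# which is what Reflex automates without needing a manual cap.
```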
 
Last edited:

Katatonic

Member
Didn't know looking at menu screens, troubleshooting .exe's, review bombing games, watching Linus Tech Tips, and belittling consoles also required a $2,000 GPU.
 
Last edited:

Gaiff

SBI’s Resident Gaslighter
There's also a worrying trend with Alan Wake 2 and Indiana Jones where Reflex is not available for non-frame-generation users. No idea what went wrong in these games, but I hope it is unintentional.
I think they did the same bullshit in Wukong, where they paired Reflex with frame generation in the menu. There's no reason not to add Reflex for cards that don't have frame generation besides artificially inflating the value of Lovelace cards. Reflex on its own is fantastic.
 
Last edited:

Bojji

Member
I think they did the same bullshit in Wukong, where they paired Reflex with frame generation in the menu. There's no reason not to add Reflex for cards that don't have frame generation besides artificially inflating the value of Lovelace cards. Reflex on its own is fantastic.

There are more games where devs did shit like this (Still Wakes the Deep, for example).
 

HeisenbergFX4

Gold Member

Summary?

I don't really have the patience to sit through a 20-minute video if it's just guesses and assumptions, which I assume it is, since no one has tested these cards yet.

If anything is a scam, it's the whole 5070 = 4090 marketing, as that was repeated by a few of my PC gaming buddies, and I personally think that's straight BS.
 