
"I Need a New PC!" 2022. The GPU drought continues...

Will Raptor Lake support PCIe 5.0 SSDs, or only PCIe 5.0 video cards? Looks like there will only be 16 PCIe 5.0 lanes, if I'm not mistaken?
Hmm, it definitely should support 5.0 SSDs.

Whether it does so at full speed or not is up in the air AFAIK... But it will support the full speed of 4.0 drives, i.e. around 7 GB/s, which should be God's plenty for this generation.

I got a 4.0 drive rated at 5.1 GB/s and the thing is sickeningly fast!

Edit: okay, I read up on it because it's a curious setup; Raptor Lake has 16 lanes of PCIe 5.0 for the graphics card, and the CPU-attached SSD link is limited to PCIe 4.0 x4. So basically, if you upgrade to Raptor Lake from your... 10700K, IIRC? Just get a 7 GB/s PCIe 4.0 x4 drive for maximum future-proofing.
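For a rough sense of why those numbers line up, here's a quick Python sketch (using idealized spec link rates and only the 128b/130b line-encoding overhead, so real drives land a bit lower):

Code:
# Approximate one-direction bandwidth of a PCIe link.
LINK_RATE_GT = {3: 8.0, 4: 16.0, 5: 32.0}  # GT/s per lane by generation
ENCODING = 128 / 130                       # 128b/130b encoding for Gen3 and newer

def pcie_bandwidth_gb_s(gen: int, lanes: int) -> float:
    """Idealized GB/s for a PCIe link of the given generation and width."""
    return LINK_RATE_GT[gen] * ENCODING * lanes / 8

print(f"Gen4 x4: {pcie_bandwidth_gb_s(4, 4):.1f} GB/s")  # ~7.9 GB/s, why top Gen4 drives sit near 7 GB/s
print(f"Gen5 x4: {pcie_bandwidth_gb_s(5, 4):.1f} GB/s")  # ~15.8 GB/s, what a Gen5 x4 slot would offer

So a 7 GB/s Gen4 drive is already close to the ceiling of that x4 link.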
 
Also, I broke down and bought a 3080 12GB, I just couldn't resist. It's $800, but it doesn't ship until August 17, and then I have 30 days after that to get a price match if it drops further, which I think it will.

https://www.evga.com/products/product.aspx?pn=12G-P5-4877-KL

This 3060 was really bugging me, man, and we don't even know if the 4070 will even have 12GB of VRAM. This 3080 should be better than a 4060, I would imagine.

A 4080/4090 is out of the question because of power usage.

I'm going to RMA this 3060 because it has horrible coil whine, then keep the new one sealed for another crypto boom, hello eBay lol.
 

Celcius

°Temp. member
If you buy a bigger SSD and want to do a bit-perfect copy of your data onto the new one so that you don't have to reinstall your OS, apps, and everything, then what program do you guys use?
 

JohnnyFootball

GerAlt-Right. Ciriously.
Also, I broke down and bought a 3080 12GB, I just couldn't resist. It's $800, but it doesn't ship until August 17, and then I have 30 days after that to get a price match if it drops further, which I think it will.

https://www.evga.com/products/product.aspx?pn=12G-P5-4877-KL

This 3060 was really bugging me, man, and we don't even know if the 4070 will even have 12GB of VRAM. This 3080 should be better than a 4060, I would imagine.

A 4080/4090 is out of the question because of power usage.

I'm going to RMA this 3060 because it has horrible coil whine, then keep the new one sealed for another crypto boom, hello eBay lol.
Good call. The 3080 12GB is going to last a while and it's a damn good video card.

I've honestly not liked what I've heard so far about the 4000 series. Going balls to the wall without any concern for power draw is just nuts. I also think Nvidia is going to price these so ridiculously sky-high that it won't be worth it.

Given that a 1080 Ti is still a good GPU, I suspect my 3080 12GB will be good for a while.
 

JohnnyFootball

GerAlt-Right. Ciriously.
If you buy a bigger SSD and want to do a bit-perfect copy of your data onto the new one so that you don't have to reinstall your OS, apps, and everything, then what program do you guys use?
Most of the time the drive companies have their own software to do it, but Acronis is usually considered the best.
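For the curious, a "bit-perfect" clone is conceptually just a raw block-for-block copy of the old drive onto the new one; tools like Acronis add the safety checks, partition resizing, and boot fixes on top. A bare-bones Python sketch of the idea (the device paths are made-up examples; anything like this should be run from a live USB with the target drive triple-checked, since it overwrites it):

Code:
# Conceptual sketch of a raw device-to-device copy on Linux (run as root).
SOURCE = "/dev/nvme0n1"   # assumed path of the old, smaller SSD
TARGET = "/dev/nvme1n1"   # assumed path of the new, bigger SSD (gets overwritten!)
CHUNK = 4 * 1024 * 1024   # copy 4 MiB at a time

with open(SOURCE, "rb") as src, open(TARGET, "wb") as dst:
    while True:
        data = src.read(CHUNK)
        if not data:
            break
        dst.write(data)

After a copy like that, the new drive carries over the old partition table, so the extra capacity shows up as unallocated space that you grow the last partition into; that's the part the migration tools automate for you.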
 

Yerd

Member
If you buy a bigger SSD and want to do a bit-perfect copy of your data onto the new one so that you don't have to reinstall your OS, apps, and everything
Usually they come with a free version of migration software. I think the typical procedure is to create a partition that matches your current drive, then set up the remaining space as a new drive.
 
Good call. The 3080 12GB is going to last a while and it's a damn good video card.

I've honestly not liked what I've heard so far about the 4000 series. Going balls to the wall without any concern for power draw is just nuts. I also think Nvidia is going to price these so ridiculously sky-high that it won't be worth it.

Given that a 1080 Ti is still a good GPU, I suspect my 3080 12GB will be good for a while.
Yeah, they are just getting out of control with power draw. A 4090 would probably use my entire current PSU by itself lol. EVGA recommends a 750-watt PSU for this card, so I just barely made that cut, and with an undervolt it should be more than fine.

Supposedly Nvidia is going with a chiplet design for the 5xxx series, so hopefully they get power draw under control then.
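A quick back-of-envelope check of that headroom (every wattage below is an illustrative assumption, not a measured figure):

Code:
# Rough PSU headroom estimate; all numbers are assumptions for illustration.
gpu_w = 350   # assumed board power for a 3080 12GB class card
cpu_w = 200   # assumed peak package power for a modern 8-core under load
rest_w = 75   # assumed motherboard, RAM, fans, drives
psu_w = 750

load_w = gpu_w + cpu_w + rest_w
print(f"Estimated peak load: {load_w} W ({load_w / psu_w:.0%} of a {psu_w} W PSU)")
# An undervolt mostly pulls gpu_w down, which is where the extra margin comes from.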
 

JohnnyFootball

GerAlt-Right. Ciriously.
Usually they come with a free version of migration software. I think the typical procedure is to create a partition that matches your current drive, then set up the remaining space as a new drive.
You can get Acronis keys on eBay for dirt cheap. I'd get that if you want something legit. Most freeware comes with so many strings attached that it ends up being a huge hassle.
 

JohnnyFootball

GerAlt-Right. Ciriously.
Yeah, they are just getting out of control with power draw. A 4090 would probably use my entire current PSU by itself lol. EVGA recommends a 750-watt PSU for this card, so I just barely made that cut, and with an undervolt it should be more than fine.

Supposedly Nvidia is going with a chiplet design for the 5xxx series, so hopefully they get power draw under control then.
I'm more curious to see what AMD is able to do with ray tracing. Even though I root for AMD, I still buy Nvidia GPUs because I love EVGA.

The 4000 series has far too many unknowns at this point, starting with the release date. I am starting to believe that Nvidia will actually delay the release of the 4000 series until next year to let 3000 stock sell down.
 

JohnnyFootball

GerAlt-Right. Ciriously.
I still believe a day will come when EVGA will make AMD cards. And I pray it happens, lol (well, that's if AMD takes the performance crown and/or perf-per-watt).
Sapphire is the EVGA equivalent for AMD. I'm not so sure that EVGA will ever make AMD GPUs. EVGA has a very good relationship with Nvidia. To my knowledge, EVGA gets the best-binned GPUs in exchange for exclusivity. I don't know for sure, but it's pretty obvious that EVGA has a good deal going with Nvidia.
 

GreatnessRD

Member
Sapphire is the EVGA equivalent for AMD. I'm not so sure that EVGA will ever make AMD GPUs. EVGA has a very good relationship with Nvidia. To my knowledge, EVGA gets the best-binned GPUs in exchange for exclusivity. I don't know for sure, but it's pretty obvious that EVGA has a good deal going with Nvidia.
Yeah, I know. Just a pipe dream of mine. I loved the EVGA 1060 I had, and their customer support is top tier. The only reason I miss Nvidia cards is because of EVGA, haha.
 

twilo99

Member
Can confirm, if you are after an Nvidia GPU, go with EVGA.

I've had a Sapphire and an XFX AMD card and both have been stellar. I haven't had to contact customer service, and I hope it stays that way, but Sapphire is probably better.
 

Mercador

Member
Can you turn the lights off on the EVGA? I'm not into that rainbow style; I'm more looking for a standard case without windows.
 

JohnnyFootball

GerAlt-Right. Ciriously.
Yeah, they are just getting out of control with power draw. A 4090 would probably use my entire current PSU by itself lol. EVGA recommends a 750-watt PSU for this card, so I just barely made that cut, and with an undervolt it should be more than fine.

Supposedly Nvidia is going with a chiplet design for the 5xxx series, so hopefully they get power draw under control then.
Plus, people act like the 3000 series GPUs aren't still damn good cards. Performance for those cards is GREAT and it's not like they're going to become shitty with current games. Right now, I think we are reaching such diminishing returns in graphical prowess that turning down certain settings in games isn't much of a big deal. I always keep shadow detail on low. Texture detail is also a big one that can take extra resources when the benefits are often visually very weak, and non-existent when in motion.
 
Plus, people act like the 3000 series GPUs aren't still damn good cards. Performance for those cards is GREAT and it's not like they're going to become shitty with current games. Right now, I think we are reaching such diminishing returns in graphical prowess that turning down certain settings in games isn't much of a big deal. I always keep shadow detail on low. Texture detail is also a big one that can take extra resources when the benefits are often visually very weak, and non-existent when in motion.
The 3060 Ti and 3070 are bad cards just because they're so VRAM-gimped. Really, it's just unacceptable.

The 3080 12GB is by far the best card: it has all the bandwidth, plenty of VRAM, and is the biggest GPU die, even though the 3090 has more CUDA cores enabled.

I think my card will last me until the 50 series. Always keep textures highest if you have the VRAM, hence why the 3070 is such crap.
 

JohnnyFootball

GerAlt-Right. Ciriously.
The 3060 Ti and 3070 are bad cards just because they're so VRAM-gimped. Really, it's just unacceptable.

The 3080 12GB is by far the best card: it has all the bandwidth, plenty of VRAM, and is the biggest GPU die, even though the 3090 has more CUDA cores enabled.

I think my card will last me until the 50 series. Always keep textures highest if you have the VRAM, hence why the 3070 is such crap.
I wouldn't agree with that. The 8GB will have limitations, but it won't matter AS much unless you are using ultra-high-res textures. Granted, the 12GB is much better in terms of longevity, but the 3070 can last a good while. We still have people rocking 980 Tis and 1060s. They can be used.
 
I wouldn't agree with that. The 8GB will have limitations, but it won't matter AS much unless you are using ultra-high-res textures. Granted, the 12GB is much better in terms of longevity, but the 3070 can last a good while. We still have people rocking 980 Tis and 1060s. They can be used.
They can be used, but the point is they won't be as good as they should have been because of the lack of VRAM. And the reason for that was poor market conditions where Nvidia could get away with it.

The 980 Ti had ample VRAM for the time it released; the 3070 doesn't, and you already have to lower texture and streaming settings.

Heck, even the 970 wasn't this gimped, and that card was great.

I intentionally avoided all three higher-tier cards in favor of my 3060 just because I knew they were BS...
 

Dream-Knife

Banned
I wouldn't agree with that. The 8GB will have limitations, but it won't matter AS much unless you are using ultra-high-res textures. Granted, the 12GB is much better in terms of longevity, but the 3070 can last a good while. We still have people rocking 980 Tis and 1060s. They can be used.
This guy will literally block you for arguing about VRAM.

Anything he owns is the best whatever. He was arguing last year that the 3060 was better than a 3070 because it had more VRAM. He also allegedly used it at 4K 120.
 

Kenpachii

Member
In what way? When I think of the 970, I think of a card that had 500MB of its memory gimped, and Nvidia had to settle.

In every way; it's kind of the equivalent of what the 970 was relative to the PS4.

The 3080 will last a long, long time. If you plan on sitting at high resolutions such as 4K, it's probably going to age badly, but for 1080p/1440p it's going to last the entire gen and the start of next gen easily.
 
In every way; it's kind of the equivalent of what the 970 was relative to the PS4.

The 3080 will last a long, long time. If you plan on sitting at high resolutions such as 4K, it's probably going to age badly, but for 1080p/1440p it's going to last the entire gen and the start of next gen easily.
I would say the 10GB 3080 would be like the 970 if it were priced accordingly, but they wanted 1080 Ti money for it, so nah...

In terms of performance, and the slightly cut VRAM, yeah, in that way it's like the 970.

But it should have been sold as the 3070, for $500 tops, and the actual 3080 should have had 12GB from the start. The 3080 12GB was extortion as well for a long time; only now is it good value.

----

The thing with the 3060: I said it was better value than the other cards because they weren't gouging you on VRAM, and you will never have to turn down textures, unlike on the step-up models. I just didn't want to be cheated, but yeah, the 3060 is too slow for my needs now. I am surprised how hard it is to hit 120 in some games with it; you literally can't get a stable 120 in Metro Exodus Enhanced even at 1080p with DLSS.
 

Celcius

°Temp. member
Just to chime in a little, I agree with him about the VRAM. I had a 3070 Ti for about a month, and running COD Cold War at 4K max settings (except without ray tracing), performance was good, but there were many cases where it looked like the textures hadn't fully loaded in, or they would take a bit to. I never had that issue on my 1080 Ti, 3080 Ti, or 3090. I was disappointed that the 16GB 3070 Ti got cancelled, because that card would have been a great 4K card imho (without ray tracing).
 

GreatnessRD

Member
To be fair, you shouldn't be buying 3060s and 3070s if you're playing at 4K. That just doesn't make any sense to me, unless you understand you can't have high/ultra settings and max everything. That class of card is a 1080p and 1440p monster, in my opinion.
 
Just to chime in a little, I agree with him about the VRAM. I had a 3070 Ti for about a month, and running COD Cold War at 4K max settings (except without ray tracing), performance was good, but there were many cases where it looked like the textures hadn't fully loaded in, or they would take a bit to. I never had that issue on my 1080 Ti, 3080 Ti, or 3090. I was disappointed that the 16GB 3070 Ti got cancelled, because that card would have been a great 4K card imho (without ray tracing).
Some guys don't get that it's not always about pure FPS. Yeah, you can lower the settings on the 3060 Ti so you don't get these issues, and it will run significantly faster than a 3060 at like-for-like settings, but the fact that a cheaper card doesn't have to do that, even with higher textures, is mind-boggling.

This was a very weird generation for Nvidia...
 

GreatnessRD

Member
Some guys don't get that it's not always about pure FPS. Yeah, you can lower the settings on the 3060 Ti so you don't get these issues, and it will run significantly faster than a 3060 at like-for-like settings, but the fact that a cheaper card doesn't have to do that, even with higher textures, is mind-boggling.

This was a very weird generation for Nvidia...
Nvidia was just chillin' on the fact that they have the mindshare and DLSS. Now their seat is a little warmer with AMD coming with the 6000 series, FSR 2.0, and now Intel soon. They'd better get it together fast.
I would say no 30 series card is a 4K card.
Nah, that's a little extreme; the 3080/Ti and 3090/Ti are more than capable.
 

Dream-Knife

Banned
Nah, that's a little extreme; the 3080/Ti and 3090/Ti are more than capable.
[benchmark chart: WED3UiW.jpg]

For older games.
 
Nvidia was just chillin' on the fact that they have the mindshare and DLSS. Now their seat is a little warmer with AMD coming with the 6000 series, FSR 2.0, and now Intel soon. They'd better get it together fast.

Nah, that's a little extreme; the 3080/Ti and 3090/Ti are more than capable.
That and the whole shortage/crypto round 2 thing 😛

The 30 series is definitely better than RDNA 2, especially because of RT performance. DLSS is whatever; it's a nice option, but I avoid using it because of the ghosting problems I found.

RDNA 3 will be interesting because it's a chiplet design. Nvidia is still monolithic, so that's probably why they're pushing the 40 series so hard, to stay ahead... I'm curious to see how RDNA 3 performs with RT. If Nvidia gets stingy again with VRAM, or milks it with too many different models, they might (hopefully) lose some market share.

And yeah, I'm really rooting for Intel as well; they had a rough start, but we need more competition.
 

Rickyiez

Member
I would say no 30 series card is a 4K card.
The only game that couldn't run 4K60 on my 3080 Ti is Cyberpunk with RT. Other than that, every other GPU-intensive game I played, like

RDR2
Control
FF7R
God of War
FH5
Metro Exodus
RE Village

easily runs above 70-80 FPS at 4K. So how is it not a 4K card? You're looking for 4K120?

If you said there's no 30 series card that can run 4K with RT on and DLSS off, then yeah. But next gen won't be able to run the newest RT titles at 4K without DLSS either, I can guarantee you that.
 
I recently upgraded to a 3080 and got myself a G-Sync monitor. I can't tell the difference in picture quality with G-Sync on? I just notice a little more input delay.
 

GreatnessRD

Member
[benchmark chart: WED3UiW.jpg]

For older games.
So based on the chart you showed, everything aside from two games is over 60 FPS. And if you tuned the settings correctly, everything would be over 60 FPS. Just turning down to high would show a huge difference, but at $1,500 for a GPU, I could understand not wanting to compromise at all, so if that's the point you're making, fair point.
That and the whole shortage/crypto round 2 thing 😛

The 30 series is definitely better than RDNA 2, especially because of RT performance. DLSS is whatever; it's a nice option, but I avoid using it because of the ghosting problems I found.

RDNA 3 will be interesting because it's a chiplet design. Nvidia is still monolithic, so that's probably why they're pushing the 40 series so hard, to stay ahead... I'm curious to see how RDNA 3 performs with RT. If Nvidia gets stingy again with VRAM, or milks it with too many different models, they might (hopefully) lose some market share.

And yeah, I'm really rooting for Intel as well; they had a rough start, but we need more competition.
Q4 is gonna be very interesting. Can't wait to see the unveilings of everything. I was rooting for Intel, but they've really dropped the ball thus far, lol.
 
GreatnessRD Lol. I could say the GTX 980 wasn't a 1080p card because you couldn't run The Witcher 2 from 2011 with ubersampling at a stable 60 xD

People who go on and on about ultra settings, it just tells me they don't know how to tweak.

Even on my 3060, I can do 4K 60 in many games if I tweak settings. Example: I booted up Spyro Reignited at 4K max settings and it was dipping into the high 30s at worst, but I keep textures maxed, then tweak each individual setting to just before the point where it significantly makes the game look worse, i.e. shadows on high look the same as ultra, but going to medium changes the lighting, so use high. Use basic AO instead of ultra, and bam, you've got your 60 FPS. Is it not 4K because I did those things? 🤔

So to say a 3080 12GB, Ti, or 3090 aren't 4K cards?!
 

Yerd

Member
I recently upgraded to a 3080 and got myself a G-Sync monitor. I can't tell the difference in picture quality with G-Sync on? I just notice a little more input delay.
Do you understand what G-Sync does? It doesn't change anything to do with image quality. It synchronizes the monitor's refresh with the GPU's output. When you spin your view really fast in a game (usually a surefire way to see it), the top half and bottom half of the screen will show slightly different images for a split second because of how the image is delivered to the screen. There is a visible break between the two images, called screen tearing. G-Sync makes that stop.
 

Dream-Knife

Banned
easily runs above 70-80 FPS at 4K. So how is it not a 4K card? You're looking for 4K120?

So based on the chart you showed, everything aside from two games is over 60 FPS. And if you tuned the settings correctly, everything would be over 60 FPS. Just turning down to high would show a huge difference, but at $1,500 for a GPU, I could understand not wanting to compromise at all, so if that's the point you're making, fair point.

By what definition? I can get 4K and at least 60 fps in numerous recent games. If Cyberpunk with ray tracing is your standard, then yeah, sure.

Unless a game is horribly optimized, I'd want at least 100 fps. None of these cards can do that consistently at 4K. They can do 1440p very well though. 60 FPS is console stuff.
 

Jayjayhd34

Member
Hi guys, I'm about to buy a Fractal Design Torrent. I really like RGB lights, so I wanted the RGB version; however, I don't know if the RGB lights are on the front fans. I'm going to have to remove the front fans to fit my AIO cooler. Cheers.

Going for an i9-12900K build with DDR4, as I didn't see the point in DDR5.
 

JohnnyFootball

GerAlt-Right. Ciriously.
Unless a game is horribly optimized, I'd want at least 100 fps. None of these cards can do that consistently at 4K. They can do 1440p very well though. 60 FPS is console stuff.

Hi guys, I'm about to buy a Fractal Design Torrent. I really like RGB lights, so I wanted the RGB version; however, I don't know if the RGB lights are on the front fans. I'm going to have to remove the front fans to fit my AIO cooler. Cheers.

Going for an i9-12900K build with DDR4, as I didn't see the point in DDR5.
Save a bunch of money and go with a 12700K with DDR5. You will make your future upgrade path a lot easier. Unless you've got DDR4 memory that you plan to take out of an old system and put in a new one, I'd say start the DDR5 investment. I kinda wish I had done that.
 

V1LÆM

Gold Member
I wouldn't be building any system today with only DDR4 lol. DDR5 might be more expensive but it's worth it; your system will be set for many years. I built a DDR4 system when Skylake came out and I got 7 years out of it. It still works of course, but I'd be kicking myself if I had stuck with DDR3 in 2015 just because it was cheaper lol.
 

Jayjayhd34

Member
I wouldn't be building any system today with only DDR4 lol. DDR5 might be more expensive but it's worth it; your system will be set for many years. I built a DDR4 system when Skylake came out and I got 7 years out of it. It still works of course, but I'd be kicking myself if I had stuck with DDR3 in 2015 just because it was cheaper lol.
I can afford it, I just don't see any reason to go DDR5, the main reason being I saw benchmarks of games performing worse with DDR5...
 

GreatnessRD

Member
I can afford it, I just don't see any reason to go DDR5, the main reason being I saw benchmarks of games performing worse with DDR5...
The only reason I'd say go for DDR5 in this situation is because you're already going for the top-of-the-line stuff with Alder Lake's i9. And DDR5 is dropping in price, and as BIOS updates come out, improvements will be made. I don't know how often you upgrade, but you may even decide to move on to Raptor Lake, and there should be improvements with that and DDR5 as well. Just my two cents. Good luck with whatever you decide.
 

Dream-Knife

Banned
I wouldn't be building any system today with only DDR4 lol. DDR5 might be more expensive but it's worth it; your system will be set for many years. I built a DDR4 system when Skylake came out and I got 7 years out of it. It still works of course, but I'd be kicking myself if I had stuck with DDR3 in 2015 just because it was cheaper lol.
DDR5 still isn't worth it performance-wise. Speed vs. latency isn't there yet.

If having a lower number bothers you, then by all means.
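The speed-vs-latency point can be put in numbers: first-word latency in nanoseconds is roughly 2000 * CL / data rate (MT/s). A quick sketch comparing a common DDR4 kit with a DDR5-6400 CL38 kit (the kit specs here are just examples):

Code:
# First-word CAS latency: CL cycles * cycle time, where the memory clock is half the data rate.
def cas_latency_ns(cl: int, data_rate_mts: int) -> float:
    return 2000 * cl / data_rate_mts

print(f"DDR4-3600 CL16: {cas_latency_ns(16, 3600):.1f} ns")  # ~8.9 ns
print(f"DDR5-6400 CL38: {cas_latency_ns(38, 6400):.1f} ns")  # ~11.9 ns

The DDR5 kit moves more data per second but takes longer to return the first word, which is why some game benchmarks still favor a tuned DDR4 kit.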
 
Damn, I'm going to be on this ordering page for another week; I really can't make my mind up.
My two cents.

If you get DDR5, even after you move on to the next socket you can carry that RAM over with you; that way you don't have to buy DDR4 now and then DDR5 again when you upgrade.

Go ahead and get 32GB of DDR5, 6400MHz at CAS latency 38, and just go the whole way.

What CPU do you currently use? I agree with JohnnyFootball to choose the 12700K over the 12900K, BUT depending on what you are upgrading from, it may make sense to save even more money and get the i5-12400 if you are planning on upgrading to Raptor Lake. Like, if you're using a quad-core right now.

Then you can get the 13700K, which has as many cores as the 12900K but a faster architecture. You could wait a year or two and then grab it on a fire sale.

Alternatively, maybe just wait for Raptor Lake to come out, because it's coming soon? I would just wait, if possible. If you don't need the PC for work, just wait and get the 13700K.

But DDR5, definitely. You won't have to buy more RAM for the whole generation.
 

Jayjayhd34

Member
My two cents.

If you get DDR5, even after you move on to the next socket you can carry that RAM over with you; that way you don't have to buy DDR4 now and then DDR5 again when you upgrade.

Go ahead and get 32GB of DDR5, 6400MHz at CAS latency 38, and just go the whole way.

What CPU do you currently use? I agree with JohnnyFootball to choose the 12700K over the 12900K, BUT depending on what you are upgrading from, it may make sense to save even more money and get the i5-12400 if you are planning on upgrading to Raptor Lake. Like, if you're using a quad-core right now.

Then you can get the 13700K, which has as many cores as the 12900K but a faster architecture. You could wait a year or two and then grab it on a fire sale.

Alternatively, maybe just wait for Raptor Lake to come out, because it's coming soon? I would just wait, if possible. If you don't need the PC for work, just wait and get the 13700K.

But DDR5, definitely. You won't have to buy more RAM for the whole generation.

Currently using an i7-9700K.
 