
Intel announces Arc B580 at $249 and Arc B570 at $219

adamsapple

Or is it just one of Phil's balls in my throat?
Is there any reason why you don't see these cheaper cards in pre-built PCs? You only see AMD/Nvidia.

These price points are pretty good and if they lead to cheaper pre-builds, they'll probably entice more people.
 

mansoor1980

Member
 

poppabk

Cheeks Spread for Digital Only Future
I said that earlier. Nvidia runs 90% of the GPU market simply by name alone. It's why the 1650Ti isn't $80 right now. People would buy that old card before they'd touch an AMD card. Most people don't even realize there is a difference.

For Intel to break in, Nvidia will have to royally shit the bed.
That's why I think it's a mistake to skip the high end, even if you think it isn't going to sell that well, if only as an advertisement for the brand. Pretty much every game on PC is shown running on a 4090, and people see that and go buy a 4060 without fully understanding the gulf between the 9 and the 6.
 

tkscz

Member
That's why I think it's a mistake to skip the high end, even if you think it isn't going to sell that well, if only as an advertisement for the brand. Pretty much every game on PC is shown running on a 4090, and people see that and go buy a 4060 without fully understanding the gulf between the 9 and the 6.
Because the majority of PC gamers nowadays don't care about maxing out the graphics; they just want to play the game at an affordable price.

And let's be honest, people barely have the $500-$600 for a 4070 (Ti), let alone the $2,000 or more for a 4090. They go to get their graphics card, see that price, and go directly to what's more affordable but still has Nvidia's name on it.

Again, the top seven most used cards may all be Nvidia, but none of them are high tier cards. The only high tier card in the top 10 is the 3070. More people still use the GTX 1060 than use an RTX 4070. More people use the iGPU on their AMD or Intel CPU than own a 4090.

So I can't agree with you on that point. The type of people who frequent places like GAF may want the highest end card to get those high end graphics, but the average consumer just wants the game to run; even if they have to run it at 900p low settings, they're happy that it's running at all.

Breaking into the GPU market at this point isn't about showing off what you have at the highest end, but literally just trying to show that you exist outside of Nvidia. The price is a good start, but people could see it and go "I can buy this $250 Intel card or I can buy a card with Nvidia's name on it for $150" and end up with a GTX 1650, the fourth most used graphics card five years after its release, and one Nvidia has discontinued.
 
Last edited:

poppabk

Cheeks Spread for Digital Only Future
Because the majority of PC gamers nowadays don't care about maxing out the graphics; they just want to play the game at an affordable price.

And let's be honest, people barely have the $500-$600 for a 4070 (Ti), let alone the $2,000 or more for a 4090. They go to get their graphics card, see that price, and go directly to what's more affordable but still has Nvidia's name on it.

Again, the top seven most used cards may all be Nvidia, but none of them are high tier cards. The only high tier card in the top 10 is the 3070. More people still use the GTX 1060 than use an RTX 4070. More people use the iGPU on their AMD or Intel CPU than own a 4090.

So I can't agree with you on that point. The type of people who frequent places like GAF may want the highest end card to get those high end graphics, but the average consumer just wants the game to run; even if they have to run it at 900p low settings, they're happy that it's running at all.

Breaking into the GPU market at this point isn't about showing off what you have at the highest end, but literally just trying to show that you exist outside of Nvidia. The price is a good start, but people could see it and go "I can buy this $250 Intel card or I can buy a card with Nvidia's name on it for $150" and end up with a GTX 1650, the fourth most used graphics card five years after its release, and one Nvidia has discontinued.
But I think they buy low end Nvidia because all they see is games being run on Nvidia 40xx.
 
There should be no 8GB cards in 2024, at least not in the mid range; perhaps on the lowest possible card, like an RTX 5050, and even then it should be at least 10GB. Even at 1080p, 8GB of VRAM isn't enough anymore with more and more current gen games being released. Even an older current gen game like Ratchet and Clank: Rift Apart needs more than 8GB of VRAM; just look at how the older RTX 3060 12GB easily beats the newer RTX 4060 8GB version. And that's just at 1080p; now imagine the folks who use higher resolutions and play games more demanding than Rift Apart.

Mid range cards should all have at least 16GB of VRAM, while the lowest 5050 should be 12GB or 10GB.
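To make the VRAM arithmetic concrete, here is a purely illustrative sketch; every per-asset size in it is an assumption picked for the example, not a measurement from Rift Apart or any other game:

```python
# Purely illustrative VRAM budget; every figure below is an assumption
# chosen for the arithmetic, not a measurement from any specific game.

def render_targets_gb(width, height, bytes_per_pixel=8, num_targets=6):
    """G-buffer, HDR colour, depth, etc. -- the part that scales with resolution."""
    return width * height * bytes_per_pixel * num_targets / 1024**3

def working_set_gb(width, height,
                   texture_pool_gb=6.5,  # assumed streamed texture pool
                   geometry_gb=1.2,      # assumed meshes / BVH
                   overhead_gb=0.8):     # assumed driver / OS overhead
    return render_targets_gb(width, height) + texture_pool_gb + geometry_gb + overhead_gb

for name, (w, h) in {"1080p": (1920, 1080), "1440p": (2560, 1440)}.items():
    total = working_set_gb(w, h)
    print(f"{name}: ~{total:.1f} GB working set -> "
          f"{'spills past 8 GB' if total > 8 else 'fits in 8 GB'}")
```

The toy numbers only illustrate the mechanism: render targets are a small slice of the budget, the texture pool dominates, and once the working set passes the card's physical VRAM the excess spills over PCIe, which is the kind of cliff the 3060 12GB vs 4060 8GB comparison shows.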
 
Last edited:

grvg

Member
I've got a question: is the primary purpose of upscaling so that you can run ray tracing at higher resolutions at a high frame rate? For example, why use DLSS if you have hardware that can run 120fps with ray tracing off, other than to be able to turn it on? Sorry if this is a dumb question; I've never been a high end PC gamer.
 

hinch7

Member
Some comparison

Might have been a good alternative as they were going for sub $300 several months ago, but stock of the 6700XT has largely dried up now.

XeSS 1.3 on native hardware is also way better than FSR 3.1 in terms of quality. And Arc will have better RT performance.

The 8600XT may have a chance of competing, though, assuming AMD prices it right, gives it enough VRAM, and FSR 4 turns out to be any good.
 
Last edited:

LordOfChaos

Member
I've got a question: is the primary purpose of upscaling so that you can run ray tracing at higher resolutions at a high frame rate? For example, why use DLSS if you have hardware that can run 120fps with ray tracing off, other than to be able to turn it on? Sorry if this is a dumb question; I've never been a high end PC gamer.

It can certainly be one reason for it. Ray tracing is the current big effect that takes a heavy toll, and upscaling helps the most when a frame is very expensive to render: instead of shading that complexity at every output pixel, the GPU renders fewer pixels and guesstimates the rest.

This goes hand in hand with ray tracing because it's so heavy on current GPUs, but it can help with anything that's heavy. Say we're a few years out and even a base game is struggling to hit 120fps at native resolution.
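To put rough numbers on where the savings come from, here's a small sketch of the pixel counts involved; the scale factors are typical of upscaler quality modes in general, not values quoted for XeSS or DLSS specifically.

```python
# Illustrative pixel-count savings when upscaling to 4K.
# Scale factors are typical of "Quality"/"Balanced"/"Performance" modes in
# DLSS/XeSS-style upscalers; exact values vary by vendor and version.

TARGET_W, TARGET_H = 3840, 2160  # output resolution (4K)

MODES = {
    "Native":      1.00,
    "Quality":     0.67,   # ~2/3 of the output resolution per axis
    "Balanced":    0.58,
    "Performance": 0.50,
}

target_pixels = TARGET_W * TARGET_H

for mode, scale in MODES.items():
    w, h = int(TARGET_W * scale), int(TARGET_H * scale)
    share = (w * h) / target_pixels
    print(f"{mode:>11}: renders {w}x{h} "
          f"({share:.0%} of the pixels to shade / ray trace)")
```

Shading or tracing only a quarter to a half of the pixels is why the heavier the per-pixel cost (ray tracing being the extreme case), the bigger the win from upscaling.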
 

twilo99

Member
Might have been a good alternative as they were going for sub $300 several months ago, but stock of the 6700XT has largely dried up now.

XeSS 1.3 on native hardware is also way better than FSR 3.1 in terms of quality. And Arc will have better RT performance.

Right, I really think that the B580 will be impossible to beat at that price point, unless AMD also doesn't care about making money.
 

hinch7

Member
Right, I really think that the B580 will be impossible to beat at that price point, unless AMD also doesn't care about making money.
True, Intel has to take drastic measures to get people to buy their GPUs and build market and mind share, and is willing to cut its margins to do so. AMD doesn't really have to, but time will tell if they want to compete or just go with Nvidia's pricing minus several dollars. They'll definitely try to keep their flagship the most competitive part of the lineup against Nvidia, which should land between the 5070 and 5070 Ti.
12GB N48 is way faster than the B580


My friend got a 6750 last week; there's no 6700XT anymore.
Doubt they will sell N48 (8700XT?) for under $400. If the rumors are true and full Navi 48 performs like a 4080, then they'll milk that for all it's worth. I'd say they'd do the 8800 XT for $649, and perhaps $499-549 for the 8700 XT.
 
Last edited:

SolidQ

Member
Doubt they will sell N48 (8700XT?) for under $400.
There are 3 versions of N48 (12GB/16GB/16GB) and one N44.

AMD has done everything to make them cheap to produce, but it's about margins and how much AMD wants it.


AMD doesn't really have to, but time will tell if they want to compete or just go with Nvidia's pricing minus several dollars
AMD will compete with NV; Intel is on its own for now.
 
Last edited:

winjer

Member

Intel is progressing with the development of its Battlemage-generation graphics card, known internally as the BMG-G31. On December 3, the company announced two new models in the Battlemage series: the Intel Arc B580 with 12GB of memory and the Intel Arc B570 with 10GB of memory, both built on the BMG-G21 GPU. The Intel Arc B570 10GB model is set to be available for purchase on January 16. Import and export data have highlighted the importance of the BMG-G31 chip in Intel's Battlemage generation. Records show that Intel exported a shipment of graphics cards featuring the BMG-G31 chip from Malaysia to India.

This suggests that Intel is focusing on developing higher-end GPUs within the Battlemage series. The BMG-G31 is expected to provide better processing power and improved energy efficiency, meeting the needs of current graphics-intensive applications and gaming.
This decision will likely consider the upcoming product launches from competitors like AMD and NVIDIA during the same period. By timing its release strategy in this way, Intel aims to position its BMG-G31-based graphics cards competitively in terms of performance and features within the changing and challenging GPU market.
 

00_Zer0

Member
I need to pick up a card early next year, but I need it to punch above the power of Intel's offerings: something in the 4070 Super to 4070 Ti Super range with okay-ish ray tracing ability. With the recent leaks of the RX 8800 and its improved ray tracing, I may pick one of those up if it lands in the $500-600 range. Guess we'll find out soon enough whether AMD's last hurrah for the RDNA line of cards will be worth it in the midrange category. Wish Intel had competed more in the mid range instead of this lower tier.
 

simpatico

Member
I'm probably going in for the B580, bros. Upgrading from a GTX 1080; I've been stubborn about the new pricing and I want to vote for this with my wallet. Hardware XeSS is pretty freaking sweet from the vids I've seen. What better way to show my hard-headedness than to buy a barely supported memePU. But it sounds like a fun winter project, and in the end it's only a $250 gamble for a good cause.
 
Last edited:

DenchDeckard

Moderated wildly
Is there any reason why you don't see these cheaper cards in pre-built PCs? You only see AMD/Nvidia.

These price points are pretty good and if they lead to cheaper pre-builds, they'll probably entice more people.

I am hoping Intel can claw back some market share here.

From my understanding, due to the poor launch of Arc and the fact that Intel doesn't even hold a 1% share of the desktop market, no SIs will risk putting it into prebuilds.

If they can get some positive reviews on this launch and build some word of mouth, I think we could hopefully see some more competition.

The entry/mid range needs fierce competition.
 

Gaiff

SBI’s Resident Gaslighter
No worries. Gamers have been clueless enough to buy the green card no matter what for a while now. The 3050 vs 6600 story is particularly notable: a slower, more power hungry, and more expensive card from The Filthy Green outsold AMD's amazing 6600 4 to 1.
Do you do anything other than bitch and moan about NVIDIA? Jesus Christ, you're insufferable.
 

Type_Raver

Member
Looking at the comparison graphs, the 580 seems pretty impressive, though I'm finding it difficult to gauge how it compares to its peers.

While it does mention a 4060, it would appear to possibly go pound for pound against a 3070.
Or is that too much of a reach?
 

kevboard

Member


It is impressive that at the same clock speed it can beat the A750 by this amount in Fortnite.

One GPU class down and with fewer Xe cores, so their architectural improvements seem to be pretty impressive.
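For a rough sense of scale: Intel's published core counts are 28 Xe cores on the A750 and 20 Xe cores on the B580, so even a modest iso-clock win implies a large per-core jump. The uplift figure in the sketch below is only a placeholder assumption, not the number from the Fortnite chart.

```python
# Rough per-Xe-core uplift implied by an iso-clock win with fewer cores.
# Core counts are Intel's published specs; the overall uplift figure is a
# placeholder assumption -- substitute whatever the chart actually shows.

A750_XE_CORES = 28   # Alchemist A750
B580_XE_CORES = 20   # Battlemage B580

overall_uplift = 0.10  # assumed +10% overall at the same clock (placeholder)

per_core_gain = (1 + overall_uplift) * A750_XE_CORES / B580_XE_CORES - 1
print(f"Implied per-Xe-core improvement at iso-clock: ~{per_core_gain:.0%}")
# With the placeholder +10%, that works out to roughly +54% per core.
```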

I really hope they will release a B7 class card too
 