
NVIDIA stock to lose $400 billion (US tech $1 trillion) after DeepSeek release

Buggy Loop

Member
What we see from OpenAI and the other big AI companies is what they want the public to see. There are models internally that we're not prepared for, so the "USA in a panic over world domination of AI" act is a cute facade; they'll use it to put fear into the government, twist arms, and force a massive injection of money into AI.
 

Drake

Member
I always knew Nvidia was inflated. Good to see.

Everyone going all in on AI is equally moronic. AI is just a tool; it's the products that matter.

The real reason behind it, I think, is that tech companies want to use it to replace highly compensated coders, especially in niche languages where some of these people are pulling in $300k+ a year.
 
Last edited:

Dane

Member
From everything I've heard and read about the IT sector in the USA, it was becoming bloated and inefficient; margins and future promises to investors were maintaining that bubble. There's a possible reckoning coming where the pressure will demand culling the fat, because they know it won't hurt the company or its productivity. It was almost akin to a government body.
 

diffusionx

Gold Member
What we see from OpenAI and the other big AI companies is what they want the public to see. There are models internally that we're not prepared for, so the "USA in a panic over world domination of AI" act is a cute facade; they'll use it to put fear into the government, twist arms, and force a massive injection of money into AI.
It's quite convenient that Trump gets into office with all this tech bro support and announces an initiative for AI development, which is fine, and then IMMEDIATELY after, this Chinese model arises out of nowhere that is somehow vastly superior to everything the tech bros have been doing.

This is space-race, "missile gap", Cold War type shit all over again. People need to smarten up!
 

Bernoulli

M2 slut
What we see from OpenAI and the other big AI companies is what they want the public to see. There are models internally that we're not prepared for, so the "USA in a panic over world domination of AI" act is a cute facade; they'll use it to put fear into the government, twist arms, and force a massive injection of money into AI.
Isn't this exactly what they've done?
Sell the government the idea that AI is magic and that they can only do it with the $500 billion.
Then China releases a superior model for free, open source, and it doesn't require that much computing.
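
For what it's worth, the weights really are public. Here's a minimal sketch of loading one of the distilled R1 checkpoints locally with Hugging Face transformers; the model ID, dtype and generation settings are illustrative assumptions (the full R1 model is far bigger than anything you'd run on a single card):

```python
# Minimal sketch: running one of DeepSeek's open-weight distilled R1 models locally.
# The model ID and settings are illustrative; distilled variants fit on a single GPU.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "deepseek-ai/DeepSeek-R1-Distill-Qwen-7B"  # example distilled checkpoint

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,   # half precision to fit in consumer VRAM
    device_map="auto",            # needs `accelerate`; spreads layers over GPU/CPU
)

messages = [{"role": "user", "content": "Why might cheaper training not reduce GPU demand?"}]
inputs = tokenizer.apply_chat_template(
    messages, return_tensors="pt", add_generation_prompt=True
).to(model.device)

output = model.generate(inputs, max_new_tokens=256)
print(tokenizer.decode(output[0][inputs.shape[-1]:], skip_special_tokens=True))
```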
 

Bernoulli

M2 slut
It's quite convenient that Trump gets into office with all this tech bro support and announces an initiative for AI development, which is fine, and then IMMEDIATELY after, this Chinese model arises out of nowhere that is somehow vastly superior to everything the tech bros have been doing.

This is space-race, "missile gap", Cold War type shit all over again. People need to smarten up!
It's not from nowhere; they have been working on it for two years, and there were previews and older models from DeepSeek before R1.

No matter how far ahead you are, designs end up converging.
That's why warplanes, for example, look similar between the US and China; over a long time, engineers on each side reach the same conclusions. It's not about copying.
 

Makoto-Yuki

Gold Member
Wouldn't you still need the best hardware? Maybe the problem is in education/skills?

If this is what China can do with old GPUs, then what can they do with the new ones? What could the USA do if they were more efficient?
 
Last edited:

Nankatsu

Member
[Donald Trump GIF by PBS NewsHour]
 

kruis

Exposing the sinister cartel of retailers who allow companies to pay for advertising space.
Not understanding the connection. Doesn't DeepSeek use Nvidia GPUs as well behind the scenes?

It's about 20x more compute-efficient than other commercial AI models. That's bad news for Nvidia's and OpenAI's business. Why invest $500 billion in new Nvidia supercomputers to run OpenAI models when you can switch to DeepSeek and get 20 times the performance for free?
 
lol 😂. This is like the SuperZip of AIs. An AI program that's probably actually using other AI sources and just scraping. Of course it uses fewer resources.

“Unlike many Chinese AI firms that rely heavily on access to advanced hardware, DeepSeek has focused on maximizing software-driven resource optimization,” explains Marina Zhang, an associate professor at the University of Technology Sydney, who studies Chinese innovations. “DeepSeek has embraced open source methods, pooling collective expertise and fostering collaborative innovation. This approach not only mitigates resource constraints but also accelerates the development of cutting-edge technologies, setting DeepSeek apart from more insular competitors.”

 

Buggy Loop

Member
Isn't this exactly what they've done?
Sell the government the idea that AI is magic and that they can only do it with the $500 billion.
Then China releases a superior model for free, open source, and it doesn't require that much computing.

AI is like a gold rush

Nvidia sells the shovels

China, USA, they both use the same shovel

Models are much like miners trying to find a gold vein. All this stock speculation should have its biggest impact on the people making models, like OpenAI, even though I still think their public models are not what's on the other side of the curtain, where the billions have been injected.

DeepSeek uses Nvidia GPUs. Just because a model doesn't require as much computing doesn't mean an AI farm will cut its GPU count; they just got more computational power out of the same hardware with a new model. They'll still want every GPU they can get, especially since newer generations typically have better power efficiency and rack density; power and rack space are costly for AI farms, regardless of the model.

In no way does this mean Nvidia will sell fewer GPUs; that's insane. On the contrary, the "panic" mode the tech bros are putting on for TV is about injecting even more cash into it. This is an arms race. That said, DeepSeek is open source, so it's only a matter of time before everyone figures out how they optimized it.
 
Last edited:

E-Cat

Member
I don't think this will hurt nvidia. It just allows more companies to buy nvidia chips.
Yep

Like, we just had a revolution in the form of the test-time compute paradigm: you can keep throwing exponentially more compute at any reasoning problem for roughly linear gains in performance. It's perpetually insatiable, and now you can solve even harder problems for the same cost.
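
A rough sketch of what that paradigm means in practice, using self-consistency / best-of-N sampling as the simplest example. `generate_answer` here is a hypothetical stand-in for a real model call, not any vendor's API:

```python
# Minimal sketch of spending more test-time compute: sample many independent
# reasoning chains and majority-vote the final answers. More samples = more
# inference compute for (diminishing but real) accuracy gains.
import random
from collections import Counter

def generate_answer(question: str) -> str:
    """Hypothetical model call returning one sampled final answer.
    Replace with a real LLM call; randomness stands in for sampling noise."""
    return random.choice(["42", "42", "41", "43"])  # toy answer distribution

def best_of_n(question: str, n: int) -> str:
    """Self-consistency: spend n model calls, keep the most common answer."""
    votes = Counter(generate_answer(question) for _ in range(n))
    return votes.most_common(1)[0][0]

for n in (1, 8, 64):  # scaling the test-time budget
    print(n, best_of_n("What is 6 * 7?", n))
```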
 
Last edited:

Bernoulli

M2 slut
AI is like a gold rush

Nvidia sells the shovels

China, USA, they both use the same shovel

Models are much like miners trying to find a gold vein. All this stock speculation should have its biggest impact on the people making models, like OpenAI, even though I still think their public models are not what's on the other side of the curtain, where the billions have been injected.

DeepSeek uses Nvidia GPUs. Just because a model doesn't require as much computing doesn't mean an AI farm will cut its GPU count; they just got more computational power out of the same hardware with a new model.

In no way does this mean Nvidia will sell fewer GPUs; that's insane. On the contrary, the "panic" mode the tech bros are putting on for TV is about injecting even more cash into it. This is an arms race. That said, DeepSeek is open source, so it's only a matter of time before everyone figures out how they optimized it.

Google, Apple, and Meta are all rushing to make their own AI chips so they don't have to rely on the expensive Nvidia ones.
With this new model showing that you need much less compute, it's easier for them to switch away from Nvidia and use their own chips or alternatives like AMD.
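
At the software layer that switch is at least plausible, since frameworks like PyTorch abstract the accelerator away. The sketch below is only the trivial part of such a migration (a device-agnostic model that runs on Nvidia, AMD's ROCm build of PyTorch, which also reports through torch.cuda, Apple silicon, or CPU); real production stacks involve far more than this:

```python
# Minimal sketch: device-agnostic PyTorch code. The same model runs on Nvidia (CUDA),
# AMD (the ROCm build of PyTorch also reports itself through torch.cuda), Apple
# silicon (MPS), or plain CPU. Treat this as the easy part of switching vendors only.
import torch
import torch.nn as nn

def pick_device() -> torch.device:
    if torch.cuda.is_available():          # Nvidia CUDA, or AMD via the ROCm build
        return torch.device("cuda")
    if torch.backends.mps.is_available():  # Apple silicon
        return torch.device("mps")
    return torch.device("cpu")

device = pick_device()
model = nn.Sequential(nn.Linear(512, 512), nn.ReLU(), nn.Linear(512, 10)).to(device)
x = torch.randn(8, 512, device=device)
print(device, model(x).shape)
```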
 

Buggy Loop

Member
Google, Apple, and Meta are all rushing to make their own AI chips so they don't have to rely on the expensive Nvidia ones.
With this new model showing that you need much less compute, it's easier for them to switch away from Nvidia and use their own chips or alternatives like AMD.

[Fran Healy "What?" GIF by Travis]


Oh yeah, because they have a new model they'll suddenly want less computational power and let a competitor who keeps buying Nvidia get the upper hand. Nonsense.

No, it doesn't work like that. At Google, Apple, Meta, and Microsoft, the non-Nvidia chips are always used for older models; they put Nvidia on the most cutting-edge research.
 

Shodai

Member
The real reason behind it, I think, is that tech companies want to use it to replace highly compensated coders, especially in niche languages where some of these people are pulling in $300k+ a year.
It's more about extracting more value out of your high-performing resources. Salary is inconsequential in many regards; it's more about how many dollars each employee generates.

Lower level/performing folks are the ones who should be worried.
 

BennyBlanco

aka IMurRIVAL69
The real reason behind it, I think, is that tech companies want to use it to replace highly compensated coders, especially in niche languages where some of these people are pulling in $300k+ a year.

Forget the coders. That Nvidia CES presentation was nightmare fuel: AI cameras watching warehouse workers, AI robot chef arms.

If robots can efficiently do skilled labor jobs like being a chef, we are so fucked as a species. It's gonna eliminate a ton of middle-class jobs and make billionaires even richer.
 

GHG

Gold Member
GHG should I buy bro?

I posted about this potentially coming a few weeks ago:

Genuinely might consider sitting this one out, it's getting a bit ridiculous now, even for me. Neither my 4090 nor my 4080 is at a point in anything I play where I'm itching for an upgrade, so if need be I'll just hold on to what I have and see where things land with the 6000 series.

If anyone is interested, word is that enterprise customers are starting to scoff at the price increases and at some of Nvidia's business practices, such as their reluctance to sell anything other than fancy server racks.

We are embarking upon a tipping point on the enterprise side of things, because what's keeping this gravy train going is FOMO - all the AI competitors don't want to be left behind on the hardware side and fall behind those who have purchased the latest and greatest. The biggest issue, though, is that most of them are struggling to monetise (and come up with solutions for monetising) their AI offerings in a way that justifies the relentless continued cost of staying "on cycle" from a hardware perspective.

So in summary, enterprise customers are being pushed to their limits as far as cost is concerned, as are regular consumers. The biggest issue everyone faces, though, is that there's no meaningful alternative. So until that changes, Nvidia will continue to milk the situation for all it's worth.

There's going to be a bust before an eventual tangible boom.

It's always the way with revolutionary things that require a massive step change to fully realise their potential, and a huge up-front cost just to begin taking advantage of them.

We went through it at the beginning of the industrial revolution, we went through it when the Internet first started to become mainstream and we will go through it again with AI.

All I will say is to combine the above with the information regarding the efficiency gains DeepSeek has managed, and come to your own conclusions.

The days of businesses specifically needing a high volume of specialised Nvidia chips to run AI solutions may well be gone. If everything in the research paper rings true, it's a complete paradigm shift and the first of many to come. This is akin to the first general-purpose chip coming along.

Not understanding the connection. Doesn't DeepSeek use Nvidia GPUs as well behind the scenes?

Using their solution, you no longer need Nvidia chips to get it running well. And even if you do decide to use Nvidia chips, you certainly don't need as many.
 
Last edited: