
NVIDIA stock to lose $400 billion (US tech: $1 trillion) after DeepSeek release

Buggy Loop

Member
You don't "need less compute"; you need as much compute as you can muster. All this means is that there's now a double exponential instead of an exponential.

And voila

This is just an accelerator to strike it rich faster on robots / autonomous cars / etc. They won't simply stop buying or slow down because a model is more efficient :messenger_tears_of_joy:

It's an arms race. There's no slowing down.
 
Pff, in a few days Nvidia will announce their latest numbers and they'll still be printing money like there's no tomorrow.
Panic reactions to literally nothing are just a stock thing. We're way past the point where stocks in general have more than a loose correlation to what is actually going on.
 

E-Cat

Member
And voila

This is just an accelerator to strike it rich faster on robots / autonomous cars / etc. They won't simply stop buying or slow down because a model is more efficient :messenger_tears_of_joy:

It's an arms race. There's no slowing down.
Seriously, did people's brains fall out of their ears or something??

"Hey, did you hear there's a new, more efficient model out of China?"

- Yikes, I guess that means the scaling laws for foundation models and inference no longer apply, stop buying GPUs!
 

GHG

Gold Member
And voila

This is just an accelerator to strike it rich faster on robots / autonomous cars / etc. They won't simply stop buying or slow down because a model is more efficient :messenger_tears_of_joy:

It's an arms race. There's no slowing down.

The arms race looks a lot different now though.

Due to the open-source nature of DeepSeek, companies can adapt and integrate it at will. You no longer need to use Nvidia hardware, you no longer need to go to one of the big players for an AI solution, and you certainly don't need huge nuclear power plants running in the background feeding it energy for it to be viable.

Yes you can sit there with millions of Nvidia chips and say you have the most capacity - but who exactly are you selling to and at what price?
 

Buggy Loop

Member
Seriously, did people's brains fall out of their ears or something??

"Hey, did you hear there's a new, more efficient model out of China?"

- Yikes, I guess that means the scaling laws for foundation models and inference no longer apply, stop buying GPUs!

"Yeah! Let's slow down on spending," said nobody ever when it comes to disruptive tech and worldwide domination, when they're on the cusp of having robotics, self-driving cars, and AGI models aiming towards the singularity to change the world (for better or worse is anyone's guess).

Someone found a gold vein that's easier to dig. He ain't gonna stop buying the best shovel and slow down. Total nonsense. But I guess that's what Wall Street wants to portray in the media right now: speculate a crash, then buy low. Oldest trick in the book when it comes to media reporting on stocks.
 
Google, Apple and Meta are all rushing to make their own AI chips so they don't have to rely on the expensive Nvidia ones.
With this new model showing that you need much less compute, it's easier for them to switch away from Nvidia and use their own chips or alternatives like AMD.
Not really. DeepSeek's approach implies that you can get much more out of your hardware resources for a fraction of the cost - a massive boost to efficiency.

Nvidia still holds the bleeding edge when it comes to the hardware required for building AI, and this news doesn't change anything. If anything, AI development will accelerate even more rapidly than it had before now that compute limitations have been mitigated.

This is still an arms race, and no side of the competition wants to be left behind. They'll still go for the best resources available, especially as the value proposition is even more enticing now that they can squeeze more out of the high-end GPUs. Nvidia is actually very well positioned considering how much they're banking on massively increasing tensor cores as opposed to compute units...
 

GHG

Gold Member
"Yeah! Let's slow down on spending," said nobody ever when it comes to disruptive tech and worldwide domination, when they're on the cusp of having robotics, self-driving cars, and AGI models aiming towards the singularity to change the world (for better or worse is anyone's guess).

Someone found a gold vein that's easier to dig. He ain't gonna stop buying the best shovel and slow down. Total nonsense. But I guess that's what Wall Street wants to portray in the media right now: speculate a crash, then buy low. Oldest trick in the book when it comes to media reporting on stocks.


And 15% down in a single day with this kind of volume being traded is not "speculate a crash". It's a crash no matter how you want to spin it.
 

Buggy Loop

Member
The arms race looks a lot different now though.

Due to the open-source nature of DeepSeek, companies can adapt and integrate it at will. You no longer need to use Nvidia hardware, you no longer need to go to one of the big players for an AI solution, and you certainly don't need huge nuclear power plants running in the background feeding it energy for it to be viable.

Yes you can sit there with millions of Nvidia chips and say you have the most capacity - but who exactly are you selling to and at what price?

You're thinking like someone doing AI in his basement at home. That's neat and all: yeah, he has a free model; yeah, he doesn't need to buy Blackwell to generate hentai porn.

That's not how the big tech firms play.

The new model scales with inference speed and GPU count. They didn't invent an ASIC solution à la Bitcoin here, GHG; the model scales fully the more inference compute you throw at it. They found a more efficient model BECAUSE they are restricted from using bigger GPUs; they didn't find a way to do it without GPUs.

There's as much chance as a snowflake in hell that these companies will lift their foot off the pedal. In fact, the panic now is over who will combine this model with the biggest AI farm to beat everyone to robotics, autonomous cars, or even AGI. The panic will push the USA to go even more ALL IN so as not to let China get the advantage.


And 15% down in a single day with this kind of volume being traded is not "speculate a crash". It's a crash no matter how you want to spin it.

I mean, people are dumb; what can you do about it? If they invested in AI without understanding the principles, no wonder they're reacting like this.

Like I said, the ones that should really be hit with a trading blow are the tech firms behind the models.
 

E-Cat

Member

And 15% down in a single day with this kind of volume being traded is not "speculate a crash". It's a crash no matter how you want to spin it.
It's impossible to calculate an ROI on the Singularity, except to say there will probably be only one or two winners; make your pick carefully...
 

viveks86

Member
Pff, in a few days Nvidia will announce their latest numbers and they'll still be printing money like there's no tomorrow.
Panic reactions to literally nothing are just a stock thing. We're way past the point where stocks in general have more than a loose correlation to what is actually going on.
This is what I'm thinking as well. Seems too speculative. Stock market gonna stock market, I guess.
 

viveks86

Member
Using their solution you no longer need to use Nvidia chips to get it running well. And even if you decide to use Nvidia chips, you certainly don't need as many.
Except this is historically not how an arms race works, right? If they're more efficient, they'll just max out performance and sell even more throughput and better response times so they can leapfrog the competition. Slowing down only makes sense once the technology reaches some saturation point (singularity?) and there's no point in improving further.
 

Buggy Loop

Member
It's impossible to calculate an ROI on the Singularity, except to say there will probably be only one or two winners; make your pick carefully...

I see you are a man of culture

That's what the USA and China governments are chasing.

If they fear China even sniffing the idea of beating them to the punch, well let's just say ...

You'll never have seen funding as big as what's about to come. Forget the Space Shuttle, the Apollo program, the Manhattan Project; they'll look cute compared to the funding AI will get.

The singularity is the single most disruptive thing that will ever happen to mankind. The country that reaches it first will leapfrog everyone, and it's exponential once you have it.

Finding a more efficient way to reach your goals is no reason to lift your foot off the pedal. AI transformers for YouTube videos are not the endgame, far from it.
 

GHG

Gold Member
You're thinking like someone doing AI in his basement at home. That's neat and all: yeah, he has a free model; yeah, he doesn't need to buy Blackwell to generate hentai porn.

That's not how the big tech firms play.

The new model scales with inference speed and GPU count. They didn't invent an ASIC solution à la Bitcoin here, GHG; the model scales fully the more inference compute you throw at it. They found a more efficient model BECAUSE they are restricted from using bigger GPUs; they didn't find a way to do it without GPUs.

There's as much chance as a snowflake in hell that these companies will lift their foot off the pedal. In fact, the panic now is over who will combine this model with the biggest AI farm to beat everyone to robotics, autonomous cars, or even AGI. The panic will push the USA to go even more ALL IN so as not to let China get the advantage.

You are missing the forest for the trees.

  • There is no "China advantage": it's open source, and every single company can learn from and adopt DeepSeek's methodologies.
  • Yes, of course this new model scales, but if you're an AI solutions provider, are you really going to run 1,000 GPUs at full tilt when your current customer base only dictates that you need to run 10 at full tilt?
Nobody said anything about anyone taking their foot off the pedal. This is the biggest accelerant we've had since this so-called "AI revolution" was first mooted. It's just that the proverbial pedal now looks very different from what people originally thought it would. If the US goes ahead with Stargate as originally planned, it will end up looking like the Maginot Line once all is said and done.

Bottom line: if you give a company an opportunity to cut costs with no decrease in output, they're going to take it. Remember, that was the primary selling point of this whole AI revolution in the first place.

Except this is historically not how an arms race works, right? If they're more efficient, they'll just max out performance and sell even more throughput and better response times so they can leapfrog the competition. Slowing down only makes sense once the technology reaches some saturation point (singularity?) and there's no point in improving further.

Ask yourself this: who is buying, and at what price are they willing to buy?

When the mooted costs are higher than just employing people for certain disciplines, would a business rather opt for a person or an AI solution?

They were already struggling to monetise things as is. Scaling up doesn't solve that problem. First they need to figure out what customers will want from all this; then they're free to scale up accordingly.
 

MiguelItUp

Member
I'm sure things will work out for them regardless, I mean, they're sitting pretty on quite a lot at the moment.

That being said, I wish they would get pushed around some more, they need it.
 

MikeM

Member
Man I wish I had more free cash. I’d be jumping back into Nvidia. Next earnings it’ll moon again.
 

Bernoulli

M2 slut
I see you are a man of culture

That's what the USA and China governments are chasing.

If they fear China even sniffing the idea of beating them to the punch, well let's just say ...

You'll never have seen funding as big as what's about to come. Forget the Space Shuttle, the Apollo program, the Manhattan Project; they'll look cute compared to the funding AI will get.

The singularity is the single most disruptive thing that will ever happen to mankind. The country that reaches it first will leapfrog everyone, and it's exponential once you have it.

Finding a more efficient way to reach your goals is no reason to lift your foot off the pedal. AI transformers for YouTube videos are not the endgame, far from it.
But what is the singularity? We're already seeing exponential advancements in AI.
 

GHG

Gold Member
It's impossible to calculate an ROI on the Singularity, except to say there will probably be only one or two winners; make your pick carefully...

And would you say the eventual winner will be a single hardware solutions provider?

Because that is precisely how Nvidia has been trading over the last 18 months or so.
 

Buggy Loop

Member
You are missing the forest for the trees.

  • There is no "China advantage": it's open source, and every single company can learn from and adopt DeepSeek's methodologies.
  • Yes, of course this new model scales, but if you're an AI solutions provider, are you really going to run 1,000 GPUs at full tilt when your current customer base only dictates that you need to run 10 at full tilt?
Nobody said anything about anyone taking their foot off the pedal. This is the biggest accelerant we've had since this so-called "AI revolution" was first mooted. It's just that the proverbial pedal now looks very different from what people originally thought it would. If the US goes ahead with Stargate as originally planned, it will end up looking like the Maginot Line once all is said and done.

Bottom line: if you give a company an opportunity to cut costs with no decrease in output, they're going to take it. Remember, that was the primary selling point of this whole AI revolution in the first place.



I know it's open source.

A new model means Stargate can reach AGI faster, and go beyond it. Why would they cut costs?

I feel like I've explained well enough why scalability and the arms race will not slow things down. If speculators are that dumb with the stock market, then so be it; they created a dip.
 

nemiroff

Gold Member
Sheesh, that didn't take long.

I'm glad I didn't push their stock on friends and family...
 

E-Cat

Member
And would you say the eventual winner will be a single hardware solutions provider?

Because that is precisely how Nvidia has been trading over the last 18 months or so.
There doesn't have to be an eventual single winner for NVDA to still have plenty of runway before that.
 

E-Cat

Member
You are missing the forest for the trees.

  • There is no "China advantage": it's open source, and every single company can learn from and adopt DeepSeek's methodologies.
  • Yes, of course this new model scales, but if you're an AI solutions provider, are you really going to run 1,000 GPUs at full tilt when your current customer base only dictates that you need to run 10 at full tilt?
Nobody said anything about anyone taking their foot off the pedal. This is the biggest accelerant we've had since this so-called "AI revolution" was first mooted. It's just that the proverbial pedal now looks very different from what people originally thought it would. If the US goes ahead with Stargate as originally planned, it will end up looking like the Maginot Line once all is said and done.

Bottom line: if you give a company an opportunity to cut costs with no decrease in output, they're going to take it. Remember, that was the primary selling point of this whole AI revolution in the first place.



Ask yourself this: who is buying, and at what price are they willing to buy?

When the mooted costs are higher than just employing people for certain disciplines, would a business rather opt for a person or an AI solution?

They were already struggling to monetise things as is. Scaling up doesn't solve that problem. First they need to figure out what customers will want from all this; then they're free to scale up accordingly.
Are you aware of the o3 model? It cost something like $350 grand and ~16 hours for a single SOTA run on ARC-AGI... and if they had put 10x more money in, the results would have been even better. How much money and inference time would you be willing to spend to solve the Riemann hypothesis? Or devise a better way of building microchips? Or a more efficient AI learning algorithm? There's literally no limit; the fact that this is now available to everyone will only highlight the importance of compute supremacy. A lead of one month could make or break the winner. The US gov is gonna go absolutely apeshit, pedal to the metal, on this one...
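A quick back-of-envelope in Python makes the "there's literally no limit" point concrete: if benchmark score improves roughly with the log of inference spend, then every 10x of budget buys a fixed additional chunk of score, so spending more always pays something. The curve shape, the 0.76 baseline, and the 0.05-per-decade slope below are all made-up assumptions for illustration, not real benchmark data.

```python
import math

def score(spend_usd, base_spend=350_000, base_score=0.76, gain_per_10x=0.05):
    """Hypothetical log-linear score curve anchored at the reported ~$350k
    ARC-AGI run; baseline and slope are assumptions, not measurements."""
    return base_score + gain_per_10x * math.log10(spend_usd / base_spend)

# Each 10x of spend adds the same fixed increment to the score.
for spend in (350_000, 3_500_000, 35_000_000):
    print(f"${spend:,} -> ~{score(spend):.0%}")
```

Under those toy numbers, 10x the money buys another ~5 points of score every time, which is exactly why nobody with a frontier to win stops buying compute.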
 

Greggy

Member
It's quite convenient that Trump gets into office with this tech bro support, he announces an initiative for AI development, which is fine, and then IMMEDIATELY after, this Chinese model arises out of nowhere that is somehow vastly superior to what the tech bros have been doing.

This is space race, "missile gap", Cold War type shit all over again. People need to smarten up!
It's open source, bro. Go read the GitHub files instead of making up conspiracies on a game forum. This is what our youth has become, and we wonder why China has outpaced us technologically.
 

E-Cat

Member
But what is the singularity? We're already seeing exponential advancements in AI.
We don't even have autonomous research organizations comprising human genius-level AI agents; that's still 2-3 years away. Talk of the singularity before that is premature.
 

GHG

Gold Member


I know it's open source.

A new model means Stargate can reach AGI faster, and go beyond it. Why would they cut costs?

I feel like I've explained well enough why scalability and the arms race will not slow things down. If speculators are that dumb with the stock market, then so be it; they created a dip.

It's not about things "slowing down". Meta is up, Microsoft is attempting to rally back, and Apple (whose chips will benefit the most from this due to their architecture) is up. These (and all other competing) companies will benefit greatly from DeepSeek and can now use what they already have (along with looking at alternatives to Nvidia; yes, this may well bring AMD back into the picture if the methodologies can be adapted to get around AMD's chip-to-chip issues) to scale up at no extra cost.

For Nvidia, though, the days of hard-selling dozens of fully equipped AI server racks at a time may be well and truly gone.

While an unintended consequence, this directly attacks Nvidia's moat. You no longer need to use CUDA, and you no longer need access to the massive bandwidth Nvidia's "virtual GPUs" offered. It's ironic, because the only reason this came about is the restrictions placed on China when it comes to Nvidia chips. Funny what can be achieved in the world of software when you're forced to be efficient :)
 

EN250

Member
I always knew Nvidia was inflated. Good to see.

Everyone going all-in on AI is equally moronic. AI is just a tool; it's the products that matter.
Everyone in tech wants AI so far up their asses that it can physically wipe their cheeks; it doesn't make any sense 🤦‍♂️
 

GHG

Gold Member
Are you aware of the o3 model? It cost something like $350 grand and ~16 hours for a single SOTA run on ARC-AGI... and if they had put 10x more money in, the results would have been even better. How much money and inference time would you be willing to spend to solve the Riemann hypothesis? Or devise a better way of building microchips? Or a more efficient AI learning algorithm? There's literally no limit; the fact that this is now available to everyone will only highlight the importance of compute supremacy. A lead of one month could make or break the winner. The US gov is gonna go absolutely apeshit, pedal to the metal, on this one...

The bold is a horrible assumption.

What this has proven is that if you want more gains in a short amount of time, you should focus on improving the efficiency of your software instead of waiting on and relying on more (or better) hardware.

On some tasks we have a 25x improvement. How long would we have to wait (or how much more would you need to spend) to get the same result via hardware alone?
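To put the "how long would we have to wait" question in rough numbers, here's a sketch assuming ~3x performance per hardware generation and one generation every ~2 years. Both figures are loose assumptions for illustration, not quoted specs:

```python
import math

per_gen_speedup = 3.0   # assumed generational performance leap
years_per_gen = 2.0     # assumed cadence of hardware generations
software_gain = 25.0    # the claimed efficiency improvement on some tasks

# Generations needed so that per_gen_speedup ** n == software_gain
generations = math.log(software_gain) / math.log(per_gen_speedup)
years = generations * years_per_gen

print(f"~{generations:.1f} generations, roughly {years:.0f} years of waiting")
```

Under those assumptions, a 25x software gain is worth about three hardware generations, i.e. roughly six years of waiting on silicon alone.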
 

E-Cat

Member
The bold is a horrible assumption.

What this has proven is that if you want more gains in a short amount of time, you should focus on improving the efficiency of your software instead of waiting on and relying on more (or better) hardware.

On some tasks we have a 25x improvement. How long would we have to wait (or how much more would you need to spend) to get the same result via hardware alone?
No one is waiting on better hardware; the megacaps are simply spending exponentially more each round (yes, obviously this will hit a wall sooner or later). And yes, there are also the 3-4x generational leaps in hardware every couple of years.

It's cool that we got a one-time improvement like this, but the vast majority of compute in the future is going towards test-time inference. We don't even know the inference cost/performance curve of DeepSeek-R1... more data is needed.

OpenAI are now iterating every ~3 months on the o-series; they can incorporate whatever RL lessons there are from the DeepSeek paper into o4. o3 got 25% on the research-level math benchmark FrontierMath. Do you realize how much compute is going to be required to reach ~90% on this test, with or without this new efficiency gain?
 

Buggy Loop

Member
It's not about things "slowing down". Meta is up, Microsoft is attempting to rally back, and Apple (whose chips will benefit the most from this due to their architecture) is up. These (and all other competing) companies will benefit greatly from DeepSeek and can now use what they already have (along with looking at alternatives to Nvidia; yes, this may well bring AMD back into the picture if the methodologies can be adapted to get around AMD's chip-to-chip issues) to scale up at no extra cost.

For Nvidia, though, the days of hard-selling dozens of fully equipped AI server racks at a time may be well and truly gone.


Look, you're hoping for cheap GPUs, and that's fine.

But you're way off with "the time of fully equipped AI server racks is gone". Like, completely delusional.

It has nothing to do with Nvidia or AMD. Tech firms will pick the best inference available and scale it as high as they can with whatever power they have available, nuclear power and all.

Can the shovel seller change hands in the future? Sure. But as of now, even AMD for free is still more expensive than Nvidia for these firms once they consider the whole total-cost-of-ownership picture.


While an unintended consequence, this directly attacks Nvidia's moat. You no longer need to use CUDA, and you no longer need access to the massive bandwidth Nvidia's "virtual GPUs" offered. It's ironic, because the only reason this came about is the restrictions placed on China when it comes to Nvidia chips. Funny what can be achieved in the world of software when you're forced to be efficient :)

You always need an interconnect for any massive GPU scaling. What are you even on about? You're talking about things you don't understand. DeepSeek doesn't change GPU scalability. The misinformation is incredible.

Oh look


AMD Instinct™ GPU accelerators are transforming the landscape of multimodal AI models, such as DeepSeek-V3, which require immense computational resources and memory bandwidth to process text and visual data. AMD Instinct™ accelerators deliver outstanding performance in these areas.

Requirements for bandwidth and computational resources are not going away.

Tech firms have always managed to do without CUDA; how else is AMD selling anything?
That's not what the big tech firms buy Nvidia for, at least. Even OpenAI, which uses Nvidia GPUs, has its in-house solution, Triton.
When you're that big, you have the software in-house. With CUDA, again, you're thinking of advantages for someone making hentai porn in his basement.
 
Got to love seeing Jensen's nose being rubbed in the shit.

Also, Trump's trade war and Biden's continuation of it will prove to be the biggest geopolitical bungle of this century. It forced a sleeping giant to awaken.
 

ProtoByte

Weeb Underling
Someone correct me if I'm wrong, but I've read here and there that the responses DeepSeek generates aren't as accurate or high-quality as at least the paid ChatGPT stuff?
 