
Switch 2 battery to last 4 hours and why AMD lost out

mckmas8808

Mckmaster uses MasterCard to buy Slave drives
This really might not be great sign for game performance in handheld mode though....

Which is really surprising. I could see it if they got 6 hours of battery life out of it. But 5 watts, just for it to last 4 hours? Not great.
 

mckmas8808

Mckmaster uses MasterCard to buy Slave drives

So the 2017 console managed 4-6.5 hours (or 4-9 hours after the first SoC revision) of battery life, and its successor, launching roughly 8 years later at a likely higher MSRP, is touting up to 4 hours of battery life as something to shout about? Aaaaalright…

Would $400 make this a reasonable device?
 

jm89

Member
As much as I would hate for this to be true, it's a totally plausible Nintendo move to make.
They are deffo gonna skimp somewhere, and probably not just the LCD screen. Assuming they are aiming for $400, they need to keep the hardware costs within that but also leave space for profit.
 
Last edited:

gokurho

Member
Seems AMD was in the running for the Switch 2 but lost the bid due to the efficiency of their chips: they wanted to push Nintendo to a 15-watt target, but Nintendo wanted 5 watts. They also state that battery life on the Switch 2 is slightly better than on the Switch 1 because the new chip is more efficient. So we could be looking at 4 hours of runtime now.

Might explain why OLED is missing also. 5 watts sounds very low for switch 2 unless that chip is magical!



Battery life is likely to be one of the highlights of the Nintendo Switch 2. Judging from last week's leaks, the battery should provide around 20 Wh, a slightly higher capacity than the original Switch's battery. However, the system will consume far less power in handheld mode than its predecessor, granting much better battery life than the current-generation console.
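The runtime figures being thrown around in this thread follow from simple arithmetic. A minimal sketch, assuming the rumored ~20 Wh capacity and ~5 W handheld power budget (neither is a confirmed spec):

```python
# Back-of-the-envelope battery life: runtime = capacity / average draw.
# The 20 Wh capacity and 5 W handheld draw are this thread's rumored
# figures, not confirmed specs.

def runtime_hours(capacity_wh: float, draw_w: float) -> float:
    """Hours of runtime for a given battery capacity and average draw."""
    return capacity_wh / draw_w

print(runtime_hours(20, 5))  # 4.0 hours at the rumored 5 W budget
print(runtime_hours(20, 7))  # under 3 hours if a demanding game pushes 7 W
```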


UPDATED POST: SEE #141 for the new USB-C ADD-ON ATTACHMENT
(reported in a new video shared by Moore's Law is Dead)
 

marquimvfs

Member
The 5W mode could be the power target they were pursuing for backwards compatibility, I guess. Given that AMD couldn't achieve that power level because of the emulation layers, they were discarded. Also, there's nothing impressive about those marks alone; it could just be Nintendo going cheap on battery and cooling. It will only be a remarkable thing if that power level translates to high performance. On the other hand, if the 5W is only about backwards compatibility, and they manage to emulate the Switch at that power level while improving asset quality, that would be sort of amazing.
 

FireFly

Member
This rumor, if true, effectively confirms the Samsung 8nm process. T239 is 1536 Ampere cores, with Nvidia pegging peak efficiency around 540 MHz based on the 8nm T234 it's derived from. That's ~1.6 "fake" double-counted dual-issue teraflops at its sweet spot. 5W portable mode definitely ain't running at 540 MHz; they'll probably underclock to 300 MHz or lower like Switch 1, while docked sees a similar 700-800 MHz clock. That's 900 gigaflops portable and 2.4 TF docked. Again, those are both "fake" dual-issue/double-counted teraflops, so real-world raw docked compute will be around 70% lower, between a base Xbox One and a base PS4. The Z1 Extreme is 8.6 "fake" teraflops, while the base Z1 is closer at 2.8 TF, but still beyond the likely clocks of an 8nm T239. Nintendo saw ballooning silicon costs and literally straight up rebadged their mid-gen Switch Pro design as a Switch 2.
Yes, so if peak efficiency is at 540 MHz, efficiency will be reduced at 300 MHz! Therefore Nintendo could have designed a smaller and cheaper chip for the same performance. That's why 8nm doesn't make sense for a chip of T239's size. See the below analysis by Thraktor.


Also, TFs on Ampere are not "fake". You can achieve them in compute applications, but since the extra FP32 units can only be used at the expense of INT32 ones, the real-world performance gains in games are smaller. You should multiply the Ampere TF figures by ~0.7 to get the equivalent RDNA 2 part, or ~0.875 to get the equivalent GCN part. For example, the 5-6 TF 3050 Ti is about as fast as the 5 TF (GCN-based) RX 470 in the TPU benchmarks.
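The figures in these two posts can be reproduced with the standard peak-FLOPS formula (cores × 2 FLOPs per FMA × clock). The 1536-core count is from the Nvidia leak; the clocks are this thread's guesses, not specs:

```python
# Reproducing the thread's T239 TFLOPS estimates.
# Peak FP32 TFLOPS = cores * 2 FLOPs per FMA * clock (GHz) / 1000.
CORES = 1536  # FP32 lanes per the leaked T239 config

def tflops(clock_ghz: float, cores: int = CORES) -> float:
    return cores * 2 * clock_ghz / 1000

print(tflops(0.54))  # ~1.66 TF at the claimed 540 MHz "sweet spot"
print(tflops(0.30))  # ~0.92 TF, the ~900 GF portable guess
print(tflops(0.78))  # ~2.40 TF, the docked guess

# FireFly's rough cross-vendor conversion factors applied to docked:
print(tflops(0.78) * 0.7)    # ~1.7 "RDNA 2-equivalent" TF
print(tflops(0.78) * 0.875)  # ~2.1 "GCN-equivalent" TF
```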
 
Last edited:

Ozriel

M$FT
Regardless, Nintendo will ALWAYS be a kid friendly company. You're delusional if you think they even look at Steam Deck or Rog Ally when creating Switch 2. Their priorities will be to keep it light, to keep it cool, to keep it small and to keep it cheap.

kid friendly or not, you’re delusional if you think they don’t look at the marketplace and potential competition when they make their hardware.

And one key thing they’d have focused on for the Switch 2 would be giving it capable enough specs to guarantee many more third party ports in the years to come. There’s money being left on the table if that’s not a focus. Significant amount of money.

‘Cheap’ is debatable too. I strongly doubt a Switch 2 would come in at a penny less than the $349 launch price of the Switch OLED. Especially now that they'll have to ship it with decent-sized, relatively high-speed storage.
 

CrustyBritches

Gold Member
Does MLID actually know anything? I feel like that guy is a top tier bullshit YouTube bro.
MLID is the perfect example of 'Fake it till you make it'. He was all bullshit and repackaging existing info for years until the PS5 Pro info he accurately leaked.

As far as the Switch 2 goes, the best sources have been the Nvidia leak and Kopite7Kimi. In this case, the power limits could just be assumed based on the Switch 1.
 

El Muerto

Member
I'm curious if Nintendo was going to go with AMD's Soundwave ARM chip that's being developed, or a newer AMD x86 apu.
 

Three

Member
BS.

Nintendo were already going for an Nvidia chip, they had a partnership for 10 or 20 years, something like that, AMD was a non-factor. Maybe it's old news and it actually happened for Switch 1, but most probably it's tales from their asses.
Wasn’t Wii an AMD console?
 

kevboard

Member
Is 4 hours supposed to be a flex? That’s awful.

the original Switch had about 3h on average. lower or higher depending on how much GPU power a game needed.
only with the Mariko revision did the battery life go above 4h on the Switch 1.

Breath of the Wild, the big launch title, was 3h or less depending on your screen brightness, dropping to around 2 hours at max brightness.
I personally always got around 2h 30min with my screen settings (around 40%) on my launch system.

also, 4h of portable use is totally fine. I can't think of a situation where I would be 4h away from a power source while using a handheld system.
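Those runtimes imply the average draws below, assuming the launch Switch's roughly 16 Wh battery (4310 mAh at a nominal 3.7 V, a commonly cited figure rather than an official spec):

```python
# Implied average power draw from the runtimes above, assuming the
# launch Switch's ~16 Wh battery (4310 mAh at a nominal 3.7 V).
CAPACITY_WH = 4.310 * 3.7  # ~15.9 Wh

for label, hours in [("BotW, max brightness", 2.0),
                     ("BotW, ~40% brightness", 2.5),
                     ("typical game", 3.0)]:
    print(f"{label}: ~{CAPACITY_WH / hours:.1f} W average draw")
```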
 
Last edited:

rodrigolfp

Haptic Gamepads 4 Life
The actual limit I wouldn't know. Typical total system power consumption in a 'AAA' game is around 20-25W. GOW and HZD can hit 27-28W. TW3 can hit 25W in certain areas.
When I was looking for a power bank for mine, I read on sites that it can go up to 45W. And that makes sense, as the OG charger is 45W, and the battery sometimes doesn't charge, or even discharges, when I'm plugged in and playing something very power intensive...
 

Haint

Member
Yes, so if peak efficiency is at 540 MHz, efficiency will be reduced at 300 MHz! Therefore Nintendo could have designed a smaller and cheaper chip for the same performance. That's why 8nm doesn't make sense for a chip of T239's size. See the below analysis by Thraktor.


Also, TFs on Ampere are not "fake". You can achieve them in compute applications, but since the extra FP32 units can only be used at the expense of INT32 ones, the real-world performance gains in games are smaller. You should multiply the Ampere TF figures by ~0.7 to get the equivalent RDNA 2 part, or ~0.875 to get the equivalent GCN part. For example, the 5-6 TF 3050 Ti is about as fast as the 5 TF (GCN-based) RX 470 in the TPU benchmarks.

Yes, that's why I said actual performance is around 70% less than the "fake" TF figure would imply. The only thing Nintendo cares about is price, and a Samsung 8nm wafer is heavily rumored to be around 1/3rd the price of a 5nm TSMC wafer: the $5K-6K range vs. $15K-$18K. I don't even think a 5nm wafer will produce 2x more chips per wafer, never mind 3x or better. Nintendo's miser ass is most definitely NOT going to pay a 125%+ premium per chip for better clocks or power efficiency, not a chance in hell.
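The wafer math here can be sketched with rough numbers. Everything below is a rumor or an assumption: the wafer prices are midpoints of the ranges quoted above, and the dies-per-wafer gain from the shrink is a guess:

```python
# Rough per-chip premium under the rumored wafer prices.
# Midpoints of the quoted ranges: Samsung 8nm ~$5.5K, TSMC 5nm ~$16.5K.
wafer_price_ratio = 16500 / 5500  # 3.0x price per wafer

# Assumed dies-per-wafer gain from the node shrink (a guess; the post
# argues it's under 2x, nowhere near the 3x needed to cancel the gap).
dies_ratio = 1.5

premium = wafer_price_ratio / dies_ratio
print(f"~{premium:.1f}x cost per chip on 5nm")  # 2.0x under these guesses
```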
 
Last edited:

Kataploom

Gold Member
Wasn’t Wii an AMD console?
The Nvidia long-term partnership started with the Switch, and the Switch 2 was always gonna have BC, for which Nvidia was necessary. The sole reason Nintendo contracted DeNA for the current account system was to avoid having to start over with the installed base next gen, as per Iwata's words iirc, so BC and a long-term deal with Nvidia were basically a given from the Switch's mere conception.
 

Rambone

Member
They are deffo gonna skimp somewhere, and probably not just the LCD screen. Assuming they are aiming for $400, they need to keep the hardware costs within that but also leave space for profit.
Yea, I suppose I won't try and dwell on whether or not we will get yesteryear's sloppy seconds and just be happy to be getting (much needed) new hardware.
 

kevboard

Member
Wasn’t Wii an AMD console?

ATI, technically. that was before AMD bought them.
but the CPU was an IBM PowerPC design.

making the 7th gen the IBM PowerPC generation lol, as all 3 consoles had IBM CPU cores. the PS3 and 360 even used nearly identical core designs, the only difference being that the PS3 had just 1 main CPU core while the 360 had 3 of them, with the PS3 of course relying on the Cell's SPEs to help both the main core and the GPU.
 
Last edited:

Exede

Member
Isn't the base run time of the Switch 2 kinda not suuuuper important? Everybody in here has a few power banks, I guess.
You go on a trip, just add a power bank.
 

CrustyBritches

Gold Member
When I was looking for a power bank for mine I read on sites that can go up to 45W. And that makes sense as the og charger is 45W and battery sometimes doesn't charge or even discharge when I am plugged and playing something very power intensive...
For sure. I think that would be considered power consumption "from the wall" while charging, as opposed to total system power consumption. The Deck will show you it in a few of the overlay modes next to the battery percentage and estimated battery time remaining.

I have a kill-a-watt/power meter and when you are playing unplugged the Deck says 25W for TW3, then when you plug in the charger the power meter shows ~45W. If you turn off the Deck and charge it the power meter says ~25W.
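Those three readings are consistent if wall power is roughly system draw plus battery-charging power (PSU losses ignored here). A quick sanity check on the numbers:

```python
# Sanity check on the Deck readings: wall draw while playing should be
# roughly system draw plus whatever power is going into the battery.
playing_unplugged = 25  # W, system draw from the Deck's overlay
charging_off = 25       # W at the wall with the Deck turned off
playing_plugged = 45    # W at the wall while playing and charging

charge_while_playing = playing_plugged - playing_unplugged
print(charge_while_playing)  # ~20 W left for charging, which is why a
                             # weaker charger can still see the battery
                             # drain under heavy in-game load
```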
 
Last edited:

lachesis

Member
This whole wattage count just reminded me of this.. ;)

mFgGmeG.gif


Not just the CPU - the whole-system engineers must be going through something like this to keep wattage under the given allowance... much respect to them.

I'm honestly not expecting groundbreaking - heck, PS4 Pro level in docked mode is plenty good enough for me, as I don't really care much for RT or whatnot. Hope it will just keep a consistent frame rate, hopefully 60fps for most games, at least at 1080p. That's all I'm asking for.

The two-SKU rumors actually got me thinking... what if they are gunning for a "Lite" version similar in size to the current Lite model, and an "XL" version as the 2 SKUs? The Lite being the cheaper one, and the XL being the expensive model...

I highly doubt they are gunning for a simultaneous 2-SKU launch with anything beyond minor differences in included storage, like the Wii U models... but one can imagine, I guess. :)
 

SweetTooth

Gold Member
As big of a blasphemy as low end Zen2 CPU in a high end gaming console in 2025?

I think not.
Tell me one instance of a console manufacturer changing the CPU architecture mid-gen. Increasing the clocks of the same architecture doesn't count.

Once you've answered, you'll realize how clueless your comment sounds.

You know that making drastic changes to console CPUs entails changing the whole development suite, libraries, etc., making developers' lives hell on earth, like developing for a whole new console!

Your second mistake is treating the CPUs in consoles like the ones used in PCs. Console games are coded at a lower level, and there is also more specialized HW not found on PC (a decompression block equivalent to 11 Zen cores on the PS5) on top of the eight Zen 2 cores. Of course you haven't seen the DF article with the GoW Ragnarok porting devs!

Believe me, hardware designers at these companies are way smarter than you and me; if it were so easy and hassle-free to "upgrade" the CPU, Microsoft would have done it for the One X and Sony would have done it for the PS5 Pro.

Now back to my question: which mid-gen console refresh changed the CPU?
 
Last edited:

James Sawyer Ford

Gold Member
Makes sense for Nintendo to go with NVIDIA since DLSS is going to be super important for them and as far as we know AMD doesn't have a good competitor for that yet.

But yeah, I don't think AMD was ever in the equation to begin with, due to various reasons like BC. It could just be another "we had to get a competitive bid" move to lower NVIDIA's asking price a tad, though in reality NVIDIA doesn't need to be competitive with anyone in this market.
 

Ozzie666

Member
As things like the Alloy emerge, along with rumors of an MS handheld, Nintendo will school them all with its profitable, strategically underpowered hardware, and gamers will enjoy it.

How do people still doubt and not understand this already? Did you really expect cutting edge tech?
 
Yes, with handheld PCs, with the rumored handhelds from Microsoft and Sony.
Mmm yeah, no. Not unless these handhelds cost $400 max and include new Zelda, Mario and Mario Kart games. There's a reason the Switch has sold 143 million units while the most successful handheld PC has sold around 3 million.
 

SweetTooth

Gold Member
Mmm yeah, no. Not unless these handhelds cost max $400 and include new Zelda, Mario and Mario Kart games. There's a reason why the Switch has sold 143 million units and the most successful handheld PC has sold around 3 million.
The bolded didn't save Nintendo when the big boys arrived in their space.
 

rnlval

Member
Yes, so if peak efficiency is at 540 MHz, efficiency will be reduced at 300 MHz! Therefore Nintendo could have designed a smaller and cheaper chip for the same performance. That's why 8nm doesn't make sense for a chip of T239's size. See the below analysis by Thraktor.


Also, TFs on Ampere are not "fake". You can achieve them in compute applications, but since the extra FP32 units can only be used at the expense of INT32 ones, the real-world performance gains in games are smaller. You should multiply the Ampere TF figures by ~0.7 to get the equivalent RDNA 2 part, or ~0.875 to get the equivalent GCN part. For example, the 5-6 TF 3050 Ti is about as fast as the 5 TF (GCN-based) RX 470 in the TPU benchmarks.
On a per-SM basis, Ampere's 2X FLOPS increase is "fake" in the sense that NVIDIA didn't scale the TMUs by 2X. Modern gaming graphics workloads mostly operate on floating-point data formats.

TFLOPS are useless by themselves without the read and write units, such as TMUs and ROPs.
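That point can be put in numbers. The per-SM figures below are the commonly cited ones for Turing vs. GA10x Ampere (treat them as an assumption, not a spec sheet):

```python
# Ampere doubled FP32 lanes per SM without doubling TMUs, so the
# FLOP-per-texel ratio doubled. Commonly cited per-SM figures:
turing = {"fp32_lanes": 64, "tmus": 4}
ampere = {"fp32_lanes": 128, "tmus": 4}  # 64 FP32 + 64 shared FP32/INT32

for name, sm in [("Turing", turing), ("Ampere", ampere)]:
    print(name, sm["fp32_lanes"] / sm["tmus"], "FP32 lanes per TMU")
# Ampere: 32 lanes/TMU vs Turing's 16 -> peak TFLOPS grows faster than
# texture throughput, so TFLOPS alone overstates the real gain.
```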
 
Last edited: