
NVIDIA RTX 50 Series silently removed 32-bit PhysX support

rodrigolfp

Haptic Gamepads 4 Life
My man, entirely new versions of software do not mean that old versions are still maintained and updated. Windows 7 is dead. Unsupported and discontinued. Windows 8.1, Windows 10, and Windows 11 do not mean that Windows 7 is 'continually maintained and updated'.

By that logic, hardware-based PhysX is still supported; it's just only the x64 version, and it has no BC with x32.
APIs, OSs and engines are normally not rebuilt from the ground up for new versions. They are literally updated/patched. That is why stuff like programs from Windows 95 still work on Win 11.

32-bit GPU hardware-accelerated PhysX is not supported anymore on RTX 5000. CPU PhysX still is. 🤷‍♂️
 

Nex240

Neo Member
I mean, sure, it sucks to lose access to the same (or better) quality of some older games with PhysX enabled, but at the same time you can't expect software that old to work the same on newer hardware. For reference, from the DSOGaming article: "you’ll have to rely on the CPU PhysX solution, which is similar to what AMD GPUs have been offering all these years."

So yeah, it's not a big deal to me -- the games will still run, just not as nicely.
HW-accelerated PhysX doesn't run well on the CPU... like at all. Nvidia made it run in x87 code to prevent AMD users from ever running it. This is why Nvidia shouldn't remove 32-bit support for it; it needs to run on their GPUs.
 
HW-accelerated PhysX doesn't run well on the CPU... like at all. Nvidia made it run in x87 code to prevent AMD users from ever running it. This is why Nvidia shouldn't remove 32-bit support for it; it needs to run on their GPUs.
Exactly.

Either make it multi-threaded with AVX2 support, or keep supporting CUDA 32-bit.

Or make CUDA open-source...

You cannot have it both ways.

Let's not forget it was nVidia that bought Ageia, nobody asked them to do it.

When you become the custodian of a certain tech, you also have the responsibility to maintain it, otherwise make all of it open-source (like id Software does) and let the community/modders maintain it at zero cost for you.
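
For what it's worth, "multi-threaded with AVX2 support" wouldn't be exotic: most of these effects are particle/debris/cloth updates, which are embarrassingly parallel. A minimal sketch of the general idea (plain C++, hypothetical names, nothing from the actual PhysX SDK): split the particle arrays across threads and keep the inner loop in a shape the compiler can auto-vectorize with AVX2 (-mavx2 or /arch:AVX2).

```cpp
// Minimal sketch (hypothetical names, not PhysX SDK code): a structure-of-arrays
// particle store, split across threads, with an inner loop simple enough for the
// compiler to auto-vectorize with AVX2.
#include <algorithm>
#include <cstddef>
#include <functional>
#include <thread>
#include <vector>

struct Particles {
    std::vector<float> x, y, z;     // SoA layout vectorizes far better than AoS
    std::vector<float> vx, vy, vz;
};

void integrateRange(Particles& p, std::size_t begin, std::size_t end, float dt) {
    for (std::size_t i = begin; i < end; ++i) {  // contiguous, branch-free loop
        p.vy[i] -= 9.81f * dt;                   // gravity
        p.x[i]  += p.vx[i] * dt;
        p.y[i]  += p.vy[i] * dt;
        p.z[i]  += p.vz[i] * dt;
    }
}

void integrate(Particles& p, float dt, unsigned threadCount) {
    const std::size_t n     = p.x.size();
    const std::size_t chunk = (n + threadCount - 1) / threadCount;
    std::vector<std::thread> pool;
    for (unsigned t = 0; t < threadCount; ++t) {
        const std::size_t b = t * chunk;
        const std::size_t e = std::min(n, b + chunk);
        if (b < e) pool.emplace_back(integrateRange, std::ref(p), b, e, dt);
    }
    for (auto& th : pool) th.join();             // one frame's worth of physics work
}
```

The SoA layout is what lets the auto-vectorizer actually kick in, while the legacy CPU fallback (as discussed in this thread) stays scalar x87 on one thread.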
 

StereoVsn

Gold Member
That's true. Oh great, now there's going to be shortages of 1650 cards. (Yeah, I don't know how plentiful those cards actually are.)
Umm… will be right back (furiously searches eBay). This is a great idea actually. Would a 1650 suffice or would we need something beefier from that gen?

I mean just for 32-bit PhysX, with another main card doing the heavy lifting.
 

StereoVsn

Gold Member
Something tiny like a GTX 960 would be fine.
I feel like we will need to build a Win7 PC for games of that era in the not so distant future.

Edit: I will grab an older card for now. Would different drivers work alright if both cards are Nvidia but from different generations, I wonder? Never tried this before with completely different card generations like this.
 

Hoddi

Member
What makes you think that modern GPUs with 5000-10000 ALUs cannot run graphics + physics concurrently?

nVidia implemented Async Compute a while ago, so there is no performance penalty (hardware context switching).
I'm not sure what the root cause is but you can see how Batman performs in this video below. The frametimes are dreadful even though the framerate seems relatively fine.

Running PhysX on a dedicated GPU (even a very low end one) was enough to solve this last time I checked.

 
Umm… will be right back (furiously searches eBay). This is a great idea actually. Would a 1650 suffice or would we need something beefier from that gen?

I mean just for 32-bit PhysX, with another main card doing the heavy lifting.
How would you feel if nVidia removed PhysX support for Turing GPUs in the future? Is there any guarantee they won't do it?

Or even worse they could stop supporting Turing in general, even for 3D graphics. What's next? Buying RTX 3050 just for PhysX?

Not to mention an extra GPU means higher power consumption, even while being idle...

Also, I don't agree with having multiple PCs. That's like keeping multiple consoles, instead of having BC.

I'm not sure what the root cause is but you can see how Batman performs in this video below. The frametimes are dreadful even though the framerate seems relatively fine.

Running PhysX on a dedicated GPU (even a very low end one) was enough to solve this last time I checked.


Even if modern nVidia cards cannot handle concurrent execution (highly unlikely, but let's assume that's the case), what makes you think the driver cannot partition the GPU resources for graphics + physics?

PhysX requires 256MB VRAM + 32 CUDA cores.

RTX 4070 has 5888 CUDA cores + 12288 MB VRAM.

You really think Batman Arkham Knight couldn't work just fine with fewer CUDA cores/VRAM for graphics, just to make physics run flawlessly?
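
Putting those numbers side by side (back-of-the-envelope only, assuming the 256MB / 32-core minimum quoted above is accurate):

```cpp
// Share of an RTX 4070 that the stated PhysX minimum would occupy (rough estimate).
constexpr int    physxCores  = 32;        // stated PhysX minimum
constexpr int    gpuCores    = 5888;      // RTX 4070 CUDA cores
constexpr double physxVramMB = 256.0;     // stated PhysX minimum
constexpr double gpuVramMB   = 12288.0;   // RTX 4070 VRAM

constexpr double coreSharePct = 100.0 * physxCores / gpuCores;     // ~0.5 %
constexpr double vramSharePct = 100.0 * physxVramMB / gpuVramMB;   // ~2.1 %
```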
 

Hoddi

Member
Even if modern nVidia cards cannot handle concurrent execution (highly unlikely, but let's assume that's the case), what makes you think the driver cannot partition the GPU resources for graphics + physics?

PhysX requires 256MB VRAM + 32 CUDA cores.

RTX 4070 has 5888 CUDA cores + 12288 MB VRAM.

You really think Batman Arkham Knight couldn't work just fine with fewer CUDA cores/VRAM for graphics, just to make physics run flawlessly?
There's no question that the driver could handle it much better. It just doesn't help since nvidia has never bothered fixing it.

Without better driver support, the best way to get stable framerates is with a secondary GPU.
 
There's no question that the driver could handle it much better. It just doesn't help since nvidia has never bothered fixing it.

Without better driver support, the best way to get stable framerates is with a secondary GPU.
TBH, with a 256MB VRAM requirement for PhysX there's no reason to require CUDA 64-bit.

32-bit is just fine for older games (<=4GB).

Jensen should give us a couple of technical explanations...
 
Don't want to support it anymore. That's likely it. You're not going to see a new game use 32-bit.
I never said new games should utilize 32-bit or DX9, but I expect BC for old games.

PC is the ultimate platform for game preservation. No? Do we want IBM PCs to become like Apple Mac or consoles?

Allocating an extra 200-300MB for 32-bit *.dlls is no big deal; a 2TB SSD costs $100.
 

BlackTron

Member
Would different drivers work alright if both cards are Nvidia but from different generations, I wonder? Never tried this before with completely different card generations like this.

PTSD from experimenting with multi-monitor setups when I was new at this. The craziest thing was an Nvidia and AMD card at the same time. No, the drivers didn't like it, LOL. Even though I hate this idea, it actually ended up being AMD and Matrox. The AMD was a brand new 9800XT and the Matrox was this ancient corroded thing expressly made to deliver two VGA ports.

VERY long time ago, but I think the only way to know for sure with your system/your cards is to try it. I also seem to remember the order of installing the drivers making a difference.
 

dgrdsv

Member
You don't need a 32-bit OS to run 32-bit apps.

Long mode supports switching to 32-bit mode and even 16-bit on Linux via WineVDM.
32-bit mode is a legacy mode which doesn't have as many resources available or access to the security features that new h/w and OSes provide.
It is somewhat similar to DX11 vs DX12, where you can run DX11 on the newest GPUs but you won't be able to use their new features.
And the reasons to drop support for the older APIs are very similar here.
Win64 can't run 16-bit apps at all.

Nobody will axe it and no, PCs aren't switching to ARM either.
Oh my sweet summer child...

Why do I have the suspicion that some of you are salty AMD GPU users that never experienced the PhysX goodness and now you're having the time of your life due to schadenfreude? :)
I played a game with GPU PhysX last month (AC4BF). It was literally unusable on my 4090 due to performance issues (essentially <20 fps with the highest PhysX option).
So yeah, I do have a feeling that the majority of those who are running around screaming about how unacceptable this is haven't really tried any games with it in the last 10 or so years.
About half of them do not work well even on GPUs which supposedly support h/w acceleration. Cryostasis, for example, would just crash for me with GPU PhysX all the time.

We can't expect that, but we are not happy when they drop support, even less so when it is not technically necessary, as is the case here.
On the contrary, it is entirely necessary to drop 32-bit CUDA runtime support. It brings nothing but issues for Nvidia now that MS has stopped supporting 32-bit Windows.

Windows 11 still runs DX8 and older and 32-bit games natively. Emulation is only needed for 16-bit programs.
Nope. Everything pre-DX9 is running through an emulation layer starting with Vista, IIRC. And it's not good, which is why you generally get better results with wrappers like dgVoodoo2 and d3d8to9.
The problem with 32-bit PhysX is that it will require some fuckery to even make a 32-bit wrapper which would translate the calls to the 64-bit runtime. I'm not sure that it's even possible.
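
To illustrate the fuckery: a 32-bit game process cannot load a 64-bit DLL, so any wrapper would have to marshal every call out of process to a 64-bit helper and block on the reply. A purely conceptual sketch (all names hypothetical, none of this is a real PhysX API):

```cpp
// Purely conceptual sketch (all names hypothetical, not a real PhysX API).
// A 32-bit game cannot load a 64-bit DLL, so a stub "PhysX32" would have to
// marshal every SDK call to a separate 64-bit helper process and wait on it.
#include <cstddef>
#include <cstdint>

struct SimulateMsg {        // wire format shared by the 32-bit stub and the 64-bit helper
    std::uint32_t opcode;   // e.g. 1 = "advance the scene one step"
    float         dt;       // timestep the 32-bit game passed in
};

// Placeholders: a real build would back these with a named pipe or shared memory.
void sendToHelper(const void* /*data*/, std::size_t /*len*/) {}
void waitForReply() {}

// Inside the hypothetical 32-bit stub, replacing what used to be an in-process call:
void scene_simulate_stub(float dt) {
    SimulateMsg msg{1u, dt};
    sendToHelper(&msg, sizeof msg);  // cross-process round trip...
    waitForReply();                  // ...on every single physics call, every frame
}
```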
 

rodrigolfp

Haptic Gamepads 4 Life
32-bit mode is a legacy mode which doesn't have as many resources available or access to the security features that new h/w and OSes provide.
It is somewhat similar to DX11 vs DX12, where you can run DX11 on the newest GPUs but you won't be able to use their new features.
And the reasons to drop support for the older APIs are very similar here.
Win64 can't run 16-bit apps at all.


Oh my sweet summer child...


I played a game with GPU PhysX last month (AC4BF). It was literally unusable on my 4090 due to performance issues (essentially <20 fps with the highest PhysX option).
So yeah, I do have a feeling that the majority of those who are running around screaming about how unacceptable this is haven't really tried any games with it in the last 10 or so years.
About half of them do not work well even on GPUs which supposedly support h/w acceleration. Cryostasis, for example, would just crash for me with GPU PhysX all the time.


On the contrary, it is entirely necessary to drop 32-bit CUDA runtime support. It brings nothing but issues for Nvidia now that MS has stopped supporting 32-bit Windows.


Nope. Everything pre-DX9 is running through an emulation layer starting with Vista, IIRC. And it's not good, which is why you generally get better results with wrappers like dgVoodoo2 and d3d8to9.
The problem with 32-bit PhysX is that it will require some fuckery to even make a 32-bit wrapper which would translate the calls to the 64-bit runtime. I'm not sure that it's even possible.
You are right. MS calls it WOW64 emulation.

But Win64 can run 16-bit apps with stuff like "otvdm".
 

peish

Member
Microsoft should standardize hardware-accelerated physics, I'm not sure why they haven't done it yet:


I've been saying the same thing about hardware-accelerated audio (EAX/A3D). It's a shame PC gaming keeps getting worse in a way, despite 3D graphics progressing.

Physics and audio are equally important for an immersive gaming experience.

TBH I don't even know what hardware accelerated means now.

I think the CUDA, RT and AI cores are there for developers to incorporate good physics and positional audio. But they are lazy and untalented. We see old games with more interesting object physics than new games, which doesn't make sense with how far UE5 has advanced development.
 
TBH I don't even know what hardware accelerated means now.
Running physics via GPGPU instead of the CPU.

Have you ever tried software rasterization (3D rendering via the CPU)?

There's a reason we want hardware acceleration wherever possible (graphics, physics, audio).

Hardware acceleration has been a staple since the 3Dfx/EAX days.

I think the CUDA, RT and AI cores are there for developers to incorporate good physics and positional audio. But they are lazy and untalented. We see old games with more interesting object physics than new games, which doesn't make sense with how far UE5 has advanced development.
Agreed. Gaming keeps getting worse for some reason.

NV doesn't Open Source anything. Even in Linux I think you still need the binary driver to do anything other than 2D.
Then they should keep maintaining it.

If we normalize this behavior, one day even DX11/DX12 *.dll files will cease to exist... who would be okay with that?

Just keep the damn CUDA32 *.dll where they are, no need to test them (even though I'm pretty sure AI can test them automatically, no need for human intervention these days), no need to delete them. Let them be.

nVidia earns 90% of their income via AI, so it's very easy for them to ditch PC gamers altogether. Will people justify that too? ("AI is where it's at, nVidia stockholders couldn't care less about PCMR")

Well, maybe China's fierce AI competition could teach Jensen a lesson or two, especially when they start manufacturing TSMC-level microchips. Then nVidia will remember their PC gaming roots, but it will be too late.
 
I'm not sure what the root cause is but you can see how Batman performs in this video below. The frametimes are dreadful even though the framerate seems relatively fine.

Running PhysX on a dedicated GPU (even a very low end one) was enough to solve this last time I checked.


What resolution and CPU for that video?

I have a 2080 Ti and recently played through the game. I also ran that same benchmark in stereoscopic 3D (3D Vision) at 1080p with graphics maxed, got a solid framerate, and I don't remember any stutters.
 
Well, it is indeed possible to offer an open-source CUDA alternative (I'm not sure how ZLUDA works exactly; does it translate CUDA API calls to OpenCL ones?).


It's a shame legal reasons prevent him from releasing the source code (it's not stolen, just reverse engineering)...

Microsoft on the other hand doesn't mind WINE (Win32 API to POSIX translator) or ReactOS (reverse-engineered open-source Windows).
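
On how a translation layer like ZLUDA can work in principle (I'm not claiming this is how ZLUDA is actually implemented, nor that OpenCL is its backend): a drop-in replacement for the CUDA library exports the same entry points the application already links against and maps each one onto another runtime. A toy sketch with made-up types:

```cpp
// Toy sketch of what "translating the CUDA API" means in general. This is NOT
// necessarily how ZLUDA is implemented, and its real backend is not necessarily
// OpenCL; it only shows the shape of the idea.
#include <cstddef>
#include <cstdint>
#include <new>
#include <unordered_map>

using FakeDevicePtr = std::uintptr_t;                 // stands in for CUdeviceptr
enum FakeResult { FAKE_OK = 0, FAKE_OUT_OF_MEMORY = 2 };

// Hypothetical backend allocation; a real shim would call the other vendor's API here.
void* backendAlloc(std::size_t bytes) { return ::operator new(bytes, std::nothrow); }

std::unordered_map<FakeDevicePtr, void*> g_handles;   // app handle -> backend handle

// Same call shape the application expects from cuMemAlloc, different guts:
FakeResult shim_cuMemAlloc(FakeDevicePtr* dptr, std::size_t bytes) {
    void* buf = backendAlloc(bytes);
    if (!buf) return FAKE_OUT_OF_MEMORY;
    *dptr = reinterpret_cast<FakeDevicePtr>(buf);     // opaque handle for the app
    g_handles[*dptr] = buf;                           // remembered for later free/copy calls
    return FAKE_OK;
}
```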
 

kevboard

Member
this is extremely fucked.

some games will not even allow you to enable the full PhysX settings if they don't detect a compatible GPU. so games like Batman Arkham Origins will not allow you to have PhysX Smoke and Steam, only the cloth physics.
 

Famipan

Member
LOL! I felt bad for getting the disregarded 4060, and whenever I get my PC working again I'm looking forward to trying Mirror's Edge in 4K 120fps on my LG TV.

Playing old PC games in 4K + high framerate is why I like PC gaming and GPUs to begin with!

Glad I didn’t sell it for the 5xxx-series!
 

Brakum

Member
If you have a 5000 series card, return it and get your money back. These cards have been nothing but pain and suffering.
Not fun, not fast, not cheap, dangerous.
Stick with the 4's. 4000 series or the upcoming FSR 4 tech from AMD. FSR4 is going to be amazeballs, right? Please say it's going to be amazeballs.
The 4000 series are like twice the price here in Switzerland. Got a 5080 for 1299 CHF. Was looking at the 4080s and most of them were well over 2000 CHF.
 

That's the CPU fallback, right?

And it's only 1 thread running ancient x87 code, because PS360 consoles didn't support SSE (bullshit excuse, they support vectorized code with VMX + Cell SPUs).

It's the epitome of lazy devs and nVidia nerfing the CPU codepath to promote their GPUs (and now they're throwing Blackwell under the bus).

Jensen should have warned everyone that Blackwell will no longer support PhysX acceleration, but he didn't. Too busy filling his wardrobe with new leather jackets, eh?

If someone could rewrite the old PhysX SDK/*.dlls (v2.7.x, v2.8.x) in modern, multi-threaded code (SSE4, AVX2, AVX-512), we wouldn't need a GPU. I'd be willing to donate some money if someone made a Patreon specifically for this purpose. There are thousands of people out there wanting game preservation.

Intel 12400F yields 768 GigaFlops @ 4 GHz all-core multi-threading. AMD Ryzen 5950X is almost 2 TeraFlops.
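
For reference, the peak-throughput arithmetic behind those figures (theoretical FP32 peaks with FMA, not sustained numbers; the 5950X all-core clock is my assumption):

```cpp
// Rough peak-FP32 arithmetic (theoretical, with FMA; real sustained numbers are lower).
// Per core per cycle: 2 AVX2 FMA ports x 8 floats x 2 ops (mul + add) = 32 FLOP.
constexpr double flopPerCoreCycle = 2.0 * 8.0 * 2.0;                 // 32

constexpr double gflops12400f = 6  * 4.0 * flopPerCoreCycle;         // 6 cores @ 4.0 GHz  = 768 GFLOPS
constexpr double gflops5950x  = 16 * 3.8 * flopPerCoreCycle;         // 16 cores @ ~3.8 GHz ≈ 1946 GFLOPS
```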
 

Garibaldi

Member
Woah woah woah there...stop providing context to the conversation!

Some of the Rage_Tyrants are gonna get pissed if you keep this up...

I also agree with you though, old game is old. Are people really playing Oblivion (just an example of an old game... not specific to this topic) on their 5090s (outside of creating clickbait for YouTube videos & social media posts)?
The problem isn't them removing it. The problem is that the alternative is not fit for purpose. From now on, assuming they don't implement a hardware-based workaround for this in the future, we are stuck with a software (CPU) based solution for 32-bit PhysX titles.

As shown in the benchmarks posted, even on modern high-end CPUs this is poorly implemented and poorly performing. If they had implemented it to actually use modern CPU extensions and decent core/thread usage, we likely wouldn't care. But they haven't.
 

Wolzard

Member
Affected games:

Monster Madness: Battle for Suburbia
Tom Clancy’s Ghost Recon Advanced Warfighter 2
Crazy Machines 2
Unreal Tournament 3
Warmonger: Operation Downtown Destruction
Hot Dance Party
QQ Dance
Hot Dance Party II
Sacred 2: Fallen Angel
Cryostasis: Sleep of Reason
Mirror’s Edge
Armageddon Riders
Darkest of Days
Batman: Arkham Asylum
Sacred 2: Ice & Blood
Shattered Horizon
Star Trek DAC
Metro 2033
Dark Void
Blur
Mafia II
Hydrophobia: Prophecy
Jianxia 3
Alice: Madness Returns
MStar
Batman: Arkham City
7554
Depth Hunter
Deep Black
Gas Guzzlers: Combat Carnage
The Secret World
Continent of the Ninth (C9)
Borderlands 2
Passion Leads Army
QQ Dance 2
Star Trek
Mars: War Logs
Metro: Last Light
Rise of the Triad
The Bureau: XCOM Declassified
Batman: Arkham Origins
Assassin’s Creed IV: Black Flag

Using only the CPU has a giant impact:



NV doesn't Open Source anything. Even in Linux I think you still need the binary driver to do anything other than 2D.

It has been about 2 years since Nvidia open-sourced the kernel module, and this will be the future model, so much so that the RTX 50 on Linux supports only the open module.


Part of what was proprietary was migrated to the GPU firmware. Userspace also remains proprietary, but with the kernel module open, Red Hat is developing an open source userspace called NVK.

 

S0ULZB0URNE

Member
Affected games:

Monster Madness: Battle for Suburbia
Tom Clancy’s Ghost Recon Advanced Warfighter 2
Crazy Machines 2
Unreal Tournament 3
Warmonger: Operation Downtown Destruction
Hot Dance Party
QQ Dance
Hot Dance Party II
Sacred 2: Fallen Angel
Cryostasis: Sleep of Reason
Mirror’s Edge
Armageddon Riders
Darkest of Days
Batman: Arkham Asylum
Sacred 2: Ice & Blood
Shattered Horizon
Star Trek DAC
Metro 2033
Dark Void
Blur
Mafia II
Hydrophobia: Prophecy
Jianxia 3
Alice: Madness Returns
MStar
Batman: Arkham City
7554
Depth Hunter
Deep Black
Gas Guzzlers: Combat Carnage
The Secret World
Continent of the Ninth (C9)
Borderlands 2
Passion Leads Army
QQ Dance 2
Star Trek
Mars: War Logs
Metro: Last Light
Rise of the Triad
The Bureau: XCOM Declassified
Batman: Arkham Origins
Assassin’s Creed IV: Black Flag

Using only the CPU has a giant impact:





It has been about 2 years since Nvidia open-sourced the kernel module, and this will be the future model, so much so that the RTX 50 on Linux supports only the open module.


Part of what was proprietary was migrated to the GPU firmware. Userspace also remains proprietary, but with the kernel module open, Red Hat is developing an open source userspace called NVK.


Dang.
Crazy thing is I don't know if nvidia will give a solution in future silicon.
 

Guilty_AI

Member
No offense to the ones bothered by this, but I remember PhysX back then always being quite disliked for killing fps, with the effects often not looking all that good to begin with (those Borderlands 2 liquids looked awful). Whenever I saw people online at the time talking about it, the vast majority turned the option off and recommended others to turn it off. Honestly, I don't feel the loss all that much.
 

JRW

Member
Dang.
Crazy thing is I don't know if nvidia will give a solution in future silicon.

It's possible to install a 2nd older Nvidia GPU along with a 50 series and select the 2nd GPU as the PhysX processor (you can view these options in Nvidia control panel).

I just don't know how far back you can go, like would the GTX 1060 sitting in my closet be good enough, etc.
 