
NVIDIA RTX 50 Series silently removed 32-bit PhysX support

hinch7

Member
No offense to the ones bothered by this, but I remember PhysX back then as being quite disliked for killing FPS, with effects that often didn't look all that good to begin with (those Borderlands 2 liquids looked awful). Whenever I saw people talking about it online at the time, the vast majority turned the option off and recommended others do the same. Honestly, I don't feel the loss all that much.
Looking at some of the videos, I'm inclined to agree. Not worth kicking up a fuss about.

It sucks that the 5000 series doesn't support it, but PhysX is largely outdated anyway.
 

No worries Jensen, I got you. You just keep working at pumping up that stock price.
 
The concept of GPGPU-accelerated physics was never outdated; it's just that the execution (optimization) was subpar in PhysX, and Microsoft never pushed for standardization (a DirectPhysics API). The core idea is simple enough; see the sketch below.
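Something like this is all GPU physics is at heart: one thread per particle, thousands of particles stepped at once. (A toy sketch in CUDA, to be clear; none of this is PhysX code, and every name in it is made up.)

#include <cuda_runtime.h>

struct Particle { float3 pos, vel; };

// One thread integrates one particle: gravity, an explicit Euler step,
// and a crude bounce off the ground plane.
__global__ void step(Particle* p, int n, float dt) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i >= n) return;
    p[i].vel.y -= 9.81f * dt;
    p[i].pos.x += p[i].vel.x * dt;
    p[i].pos.y += p[i].vel.y * dt;
    p[i].pos.z += p[i].vel.z * dt;
    if (p[i].pos.y < 0.0f) {
        p[i].pos.y = 0.0f;
        p[i].vel.y *= -0.5f;
    }
}

// Launch with one thread per particle:
// step<<<(n + 255) / 256, 256>>>(d_particles, n, 1.0f / 60.0f);

A standardized API could have exposed exactly this kind of dispatch on any vendor's GPU.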

15+ years ago gaming was better than today...

I still remember Uncharted 2 using Havok to offer Cell SPU-accelerated physics (destruction, exploding cars, particle effects, RPG/grenade/propane tank explosions, etc.).

It was a good era for both PC and console gamers.

I don't like static worlds with no physics interaction. I like experimenting with game objects (sometimes you see cool bugs).

What's the point of having 100 Teraflop GPUs (RTX 5090)? Just for graphics (and a little bit of AI)?

Imagine if Nintendo had 100 Teraflops on Switch to develop Zelda BoTW... they would put them to good use gameplay-wise.
 

hinch7

Member

Havok and its iterations exist and are pretty much standardized, all without completely tanking performance or needing an external GPU to handle the calculations.

We had all of this with Half-Life 2 and Battlefield: Bad Company 2 well over a decade ago, but sadly not enough developers care about more advanced physics beyond superfluous effects like hair. PhysX is just legacy tech, along the lines of HairWorks.
 

dgrdsv

Member
nVidia can earn a bit of goodwill if they open-source older PhysX SDK versions (CPU/Ageia PPU).
It could be a lot more complex than one would think if the code was never meant to be open-sourced from the start.
But if that doesn't happen, let's hope people will just reverse-engineer the calls and figure something out.

Or you could just buy a GTX 1630 to handle PhysX in the five or so games where it even matters.
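(For what it's worth, device visibility is decided per process. A quick sketch using the standard CUDA runtime API, below, lists the GPUs a given build can enumerate; presumably a 32-bit build on a 50-series-only system is exactly the situation these games are in, with no usable device, so PhysX falls back to the CPU.)

#include <cuda_runtime.h>
#include <cstdio>

int main() {
    int n = 0;
    cudaError_t err = cudaGetDeviceCount(&n);
    if (err != cudaSuccess || n == 0) {
        // Presumably what a 32-bit PhysX title sees on a Blackwell-only system.
        std::printf("No usable CUDA device: %s\n", cudaGetErrorString(err));
        return 1;
    }
    for (int i = 0; i < n; ++i) {
        cudaDeviceProp prop;
        cudaGetDeviceProperties(&prop, i);
        std::printf("Device %d: %s (compute %d.%d)\n",
                    i, prop.name, prop.major, prop.minor);
    }
    return 0;
}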
 
Sure, we had CPU physics, just like we had software rasterization back in the '90s.

But the GPU is simply a better fit for parallel processing... whether it's nVidia, AMD, or even Intel.
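The same toy integration step from earlier in the thread, done the CPU way, makes the contrast obvious (again, just an illustrative sketch):

#include <cuda_runtime.h>   // only for float3; the loop itself is plain C++

struct Particle { float3 pos, vel; };

// Serial version: one particle per loop iteration (or a handful of CPU
// threads at best), versus one GPU thread per particle in the kernel
// sketched earlier in the thread.
void step_cpu(Particle* p, int n, float dt) {
    for (int i = 0; i < n; ++i) {
        p[i].vel.y -= 9.81f * dt;
        p[i].pos.x += p[i].vel.x * dt;
        p[i].pos.y += p[i].vel.y * dt;
        p[i].pos.z += p[i].vel.z * dt;
    }
}

At tens of thousands of particles or debris chunks, that loop is exactly where the FPS goes.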

Or you could just buy a GTX 1630 to handle PhysX in the five or so games where it even matters.
Some people have Micro-ATX motherboards with only one PCIe slot (personally I hate them, but they're a trend in the PC industry these days, along with small, console-like cases). There's also the increased idle power consumption (maybe not much, but every watt counts in some countries). And last but not least, nVidia can easily remove PhysX altogether for all GPUs and/or stop supporting Maxwell/Turing cards (just as they did with Kepler).

Our best bet is reverse-engineering the PhysX *.dll files.
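The usual starting point for that is a proxy DLL: rename the original, drop a same-named stub in its place, forward every export to the real thing, and log the traffic until the interface is mapped out. A bare-bones sketch below. To be clear, "SomePhysxExport" is a placeholder, not a real PhysX export; real ones would need their exact names and calling conventions pulled from the original DLL's export table.

#include <windows.h>
#include <cstdio>

static HMODULE g_real = nullptr;

BOOL WINAPI DllMain(HINSTANCE, DWORD reason, LPVOID) {
    if (reason == DLL_PROCESS_ATTACH)
        g_real = LoadLibraryA("PhysXCore_real.dll");  // the renamed original
    else if (reason == DLL_PROCESS_DETACH && g_real)
        FreeLibrary(g_real);
    return TRUE;
}

// Hypothetical forwarded export: log the call, then pass it through.
extern "C" __declspec(dllexport) int SomePhysxExport(int arg) {
    using Fn = int (*)(int);
    static Fn real_fn =
        reinterpret_cast<Fn>(GetProcAddress(g_real, "SomePhysxExport"));
    std::printf("SomePhysxExport(%d)\n", arg);
    return real_fn ? real_fn(arg) : 0;
}

(The stub would have to be built as 32-bit to load into these games, of course.)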
 

dorkimoe

Member
So what is the best card out right now? I was going to get a 5080 to upgrade from a 3080... but with all these issues, is it worth it?
 

IbizaPocholo

NeoGAFs Kent Brockman

GTX 580 vs RTX 5080 Performance

- The GTX 580, despite being released in 2010, is presented as being up to 81 percent better than the more recent RTX 5080 in certain performance metrics.
- In gaming tests, the GTX 980 outperformed the RTX 5080 in several scenarios, indicating a better gaming experience from older hardware.
- The video features gameplay from Mirror's Edge, showcasing the RTX 5080's struggles against older graphics cards during performance evaluations.
- The GTX 980 was utilized alongside the RTX 5080 as an accelerator to enhance overall performance, highlighting the dependence on older technology for certain games.

Nvidia's PhysX Technology and Recent Changes

- Nvidia's slogan, "the way it's meant to be played," emphasized the transformative impact of PhysX on game graphics and gameplay.
- Recent user feedback indicated that PhysX was not functioning correctly on the new 50 Series cards, with Nvidia confirming that 32-bit CUDA applications are now deprecated.
- The removal of support for 32-bit CUDA applications has raised concerns about the future of vendor-specific graphical improvements and their reliability.
- The discussion highlights the skepticism surrounding proprietary technologies and their potential for abandonment, as seen with PhysX.

Impact of PhysX on Gaming Experience

- PhysX was originally developed to facilitate the integration of high-quality physical effects in games, primarily optimized for Nvidia hardware.
- The technology has faced criticism for its performance issues on non-Nvidia hardware, limiting the gaming experience for users of competing graphics solutions.
- The historical context of PhysX's development reveals its evolution from a discrete physics processing unit to its current CUDA core integration.
- Games utilizing PhysX have been shown to deliver significantly different visual experiences, especially in titles like Mafia 2, where disabling PhysX affects the game's graphical fidelity.

Testing Methodology and Game Selection

- The testing focused on five 32-bit games known for their use of PhysX: Batman: Arkham City, Borderlands 2, Mafia 2, Metro: Last Light, and Mirror's Edge.
- The selection criteria excluded titles like Assassin's Creed 4 Black Flag due to performance caps and compatibility issues with AMD hardware.
- The benchmarks aimed to determine the performance differences between GPU-accelerated PhysX and CPU-based processing across various graphics cards.
- Each game was tested under controlled conditions, with specific settings adjusted to analyze the impact of PhysX on performance metrics.

Performance Results of Selected Games

- In Mafia 2, the GTX 580 significantly outperformed the RTX 5080 when PhysX was enabled, achieving better frame rates and stability.
- Metro Last Light's performance deteriorated on the RTX 5080 during PhysX-heavy scenes, highlighting the limitations of newer GPUs in handling older game technologies.
- Mirror's Edge displayed substantial performance issues on the RTX 5080 when PhysX was enabled, reinforcing how dependent the technology's effects are on GPU acceleration.
- Borderlands 2 demonstrated that the absence of PhysX effects resulted in a visually diminished experience, with significant graphical elements missing when the feature was turned off.

Conclusion on Nvidia's Technological Direction

- The discussion concludes with concerns about Nvidia's trend of developing exclusive technologies that may be abandoned, leaving users reliant on outdated features.
- The impact of removing support for 32-bit applications serves as a warning for future proprietary technologies and their long-term viability in gaming.
- The potential for future graphics technologies to become obsolete raises questions about their reliability and the commitment of Nvidia to support them.
- Overall, the video emphasizes the importance of understanding the implications of vendor-specific technologies in the gaming landscape, particularly as hardware evolves.
 