Hey guys,
I dunno if this has been asked, but since there's such a thing as Hybrid PhysX for ATI+NVIDIA cards, does anyone know of a solution that uses both the NVIDIA card and the IGP for this? Case in point: I have a GT 650M and a 3610QM (with the Intel HD 4000). Now, the NVIDIA software can switch PhysX processing to the CPU instead of the GPU. I set it as such, but the game I want to play with PhysX on at least Medium, Borderlands 2, seems to still need the GPU for the PhysX effects, because GPU core usage drops to as low as 30% whenever there's a lot happening on screen (and even off-screen; stones, limbs, liquids, etc.).
Have you guys come across anything of the sort already?
-
First off, the PhysX software stack was designed with NVIDIA GPUs in mind. It is highly optimized for their GPU architecture and not optimized at all on CPUs (I might even argue they intentionally crippled the CPU path). That is why you noticed a massive drop in framerates when toggling that option.
Second off, hybrid AMD/NVIDIA PhysX is a hack/workaround so that one can render a game with the AMD GPU and run PhysX on the NVIDIA GPU. If such a workaround existed for IGP+NVIDIA setups, you would have the same restriction ... games would have to render using the IGP (clearly worse than rendering with the NVIDIA card).
What I'm getting at is, for your particular system, you don't really have a choice -- both the game and PhysX need to run on the NVIDIA GPU for an acceptable gaming experience.
-
Yeah, I kinda get that. I just find it weird that when I set PhysX to run on the CPU with effects on Medium, GPU usage still drops. But yeah, I know CPU PhysX is only meant for low PhysX settings.
-
Meaker@Sager Company Representative
GPU usage is falling because the CPU is being loaded so heavily. The IGP can't be used to render or to run the PhysX.
-
The only way to improve PhysX performance is to get a second GPU dedicated to PhysX, so you'd need a new laptop, and your options are quite limited.
-
get a laptop that has SLI, disable it, and use one card for PhysX -- although it's practically overkill.
-
..well.. since you generally have too little grunt for rendering the graphics context on a laptop (you're always using all the SIMD units), and PhysX is designed to offload the CPU (by generously spending SIMD units/SMX cores), reserving the NVIDIA GPU for graphics rendering and offloading PhysX to the CPU will usually actually improve your performance (as long as the CPU has cycles to spare, which a dual/quad core typically will). There are exceptions, probably having to do with specific PhysX implementations that are intended to update the graphics context directly in per-frame operations. Some old games with custom PhysX routines end up there, and the same goes if the game is extremely CPU-intensive all of the time. You might also run into examples where PhysX on the CPU generates slower effects, somewhat mismatched animation, and so on.
But in a reasonably optimized and threaded game, you're normally better off using the graphics card only for rendering, even if the core load isn't constantly at 100%. I think the "auto" setting has something to do with the NVIDIA profiles as well, so certain games with known issues will actually run PhysX on the NVIDIA GPU. And I don't know if it's actually done, but it's at least technically possible to distribute the load between the IGP/CPU and the NVIDIA GPU. That won't happen if the PhysX routines are forced to the GPU.
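To illustrate the idea of using spare CPU cycles, here's a minimal toy sketch (not PhysX, and not a real game loop -- all names are illustrative) of running a physics step on a CPU worker thread while the main thread issues the frame's draw calls, which is roughly how a threaded game can overlap CPU physics with GPU rendering:

```python
# Toy sketch: overlap a CPU physics step with "rendering" on the main
# thread. Illustrative only -- not the PhysX API or a real engine loop.
import threading

def physics_step(state):
    # Stand-in for the per-frame physics math, run on a spare CPU core.
    state["y"] += state["vy"] * (1 / 60)

def render(state):
    # Stand-in for issuing draw calls for the current frame.
    return f"drawing object at y={state['y']:.4f}"

state = {"y": 0.0, "vy": -9.81}
frames = []
for _ in range(3):
    worker = threading.Thread(target=physics_step, args=(state,))
    worker.start()                 # physics runs on a worker thread...
    frames.append(render(state))   # ...while the main thread "renders"
    worker.join()                  # sync before starting the next frame
```

The `join()` before the next frame is the synchronization point the post alludes to: if the renderer needed the *updated* state before its draw calls, this overlap wouldn't be free and you'd get the mismatched-animation effects mentioned above.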
-
Greg has the answer.
PhysX is an Nvidia feature. It's coded in CUDA. Using Intel's EU cores to emulate PhysX would probably be a good idea, but I don't think Intel is willing to invest much money in a feature that they would most likely have to pay Nvidia to support on the IGP.
If Nvidia gave away the library and the right to use it for free to a competitor with a graphics processor, there goes one of Nvidia's marketing tools for getting people to buy their GPUs.
It would be a bad choice for Nvidia to allow it.
-
That's true, of course. But it's more about how the PhysX routines are actually implemented, as far as I know. At the abstract level, what we're really talking about is running very uncomplicated maths repeatedly (or in parallel, when several SIMD units on the graphics card are set up to run the same instruction) across several memory areas. And when you actually implement this, it's usually just several threads with repeated instructions -- which actually runs reasonably well on an Intel CPU anyway.
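A minimal sketch of that kind of data-parallel math -- the same simple instruction applied across large arrays of state, here a toy Euler integration step for a batch of particles (names and numbers are illustrative, not the PhysX API):

```python
# Toy sketch: the "uncomplicated maths repeated across memory areas"
# pattern -- one simple update applied to every element of large arrays.
# Illustrative only; this is not PhysX.
import numpy as np

def integrate(pos, vel, dt, gravity=-9.81):
    """One Euler step for a batch of particles: same math, many elements."""
    vel = vel.copy()
    vel[:, 1] += gravity * dt   # apply gravity to every particle's y velocity
    pos = pos + vel * dt        # advance every position in one vectorized op
    return pos, vel

# 10,000 particles at rest; one 60 Hz step updates them all at once.
pos = np.zeros((10000, 3))
vel = np.zeros((10000, 3))
pos, vel = integrate(pos, vel, dt=1 / 60)
```

Whether those array-wide operations run on GPU SIMD units or on a CPU's vector units, the shape of the work is the same, which is why this kind of loop maps reasonably well to a CPU too.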
The exceptions are when we're talking about resubmits to the graphics context, which introduce extra latency that the routines typically don't account for. That can run into all kinds of issues depending entirely on whether those PhysX-optimized threads are, for example, meant to complete before draw calls, that sort of thing. I don't know this for certain, but my impression is that unless PhysX is supported directly via middleware, developers tend to avoid relying on it -- like with everything else that might make graphics programming more interesting, but also less platform-independent and therefore harder to deploy across different hardware. Instead they opt for a more streamlined approach.
So for laptop gaming: most games, even ones with PhysX optimisations, tend to run reasonably well in software mode. And "reasonably well" is typically what you'll settle for on a laptop, since you don't have much graphics card grunt to go on. You're likely to push the resolution and effects up as far as you can anyway, and even if you disabled all the effects, there'd be few newly freed SIMD units/cores.
-
The plan is to use a DIY eGPU HD 7770 2GB GDDR5 as the primary card, and since my system doesn't have integrated graphics but an Nvidia GT 240M instead, I'd like to use that as a PhysX card and play games like Batman, GRID, DiRT, Mafia etc. -- basically games from 2009-2013 -- on high settings at 1080p. Yes, I am very cheap. I will probably post a thread detailing my experience in two months.
Hybrid PhysX with NVIDIA+IGP?
Discussion in 'Gaming (Software and Graphics Cards)' started by kisetsu17, Oct 10, 2012.