The Notebook Review forums were hosted by TechTarget, which shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information posted on the forums was preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.
Problems? See this thread at archive.org.

    Hybrid PhysX with NVIDIA+IGP?

    Discussion in 'Gaming (Software and Graphics Cards)' started by kisetsu17, Oct 10, 2012.

  1. kisetsu17

    kisetsu17 Took me long enough

    Reputations:
    289
    Messages:
    780
    Likes Received:
    2
    Trophy Points:
    31
    Hey guys,

    I dunno if this has been asked before, but since there's such a thing as hybrid PhysX for ATI+NVIDIA cards, do you know whether there's any solution out there to utilize both the NVIDIA card and the IGP for this? Case in point: I have a GT 650M and a 3610QM (with the Intel HD 4000). The NVIDIA control panel can switch PhysX processing to the CPU instead of the GPU. I set it that way, but the game I want to play with PhysX on at least Medium, Borderlands 2, seems to still need the GPU for the PhysX effects, because GPU usage drops to as low as 30% whenever there's a lot happening on screen (and even off screen; stones, limbs, liquids, etc.).

    Have you guys come across anything of the sort already?
     
  2. Greg

    Greg Notebook Nobel Laureate

    Reputations:
    7,857
    Messages:
    16,212
    Likes Received:
    58
    Trophy Points:
    466
    First off, the PhysX software stack was designed with NVIDIA GPUs in mind. Their software is highly optimized for their GPU architecture, and is not optimized at all (I might even argue they intentionally crippled the software stack) on CPUs. That is why you have noticed a massive drop in framerates when toggling that option.
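
    To illustrate the gap (a toy sketch, not NVIDIA's code): the PhysX 2.x CPU path reportedly ran scalar, single-threaded x87-style code, while the same per-particle math written with SSE intrinsics handles four particles per instruction. The function names here are made up for the example.

        #include <xmmintrin.h>  // SSE intrinsics

        // Scalar: one particle per loop iteration, like the legacy CPU path.
        void integrate_scalar(float* p, float* v, float a, float dt, int n) {
            for (int i = 0; i < n; ++i) {
                v[i] += a * dt;      // v = v + a*dt
                p[i] += v[i] * dt;   // p = p + v*dt
            }
        }

        // SSE: four particles per iteration. Assumes n is a multiple of 4
        // and the arrays are 16-byte aligned (illustration only).
        void integrate_sse(float* p, float* v, float a, float dt, int n) {
            __m128 adt = _mm_set1_ps(a * dt);
            __m128 dt4 = _mm_set1_ps(dt);
            for (int i = 0; i < n; i += 4) {
                __m128 vv = _mm_add_ps(_mm_load_ps(v + i), adt);
                _mm_store_ps(v + i, vv);
                __m128 pp = _mm_add_ps(_mm_load_ps(p + i), _mm_mul_ps(vv, dt4));
                _mm_store_ps(p + i, pp);
            }
        }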

    Second off, hybrid AMD/NVIDIA PhysX is a hack/workaround so that one can render a game with the AMD GPU and run PhysX on the NVIDIA GPU. If such a workaround existed for IGPs/NVIDIA you would have the same restriction ... games would have to render using the IGP (clearly, that's worse than rendering with the NVIDIA card).

    What I'm getting at is, for your particular system, you don't really have a choice -- both game and PhysX need to run on the NVIDIA GPU for an acceptable gaming experience.
     
    Cloudfire likes this.
  3. kisetsu17

    kisetsu17 Took me long enough

    Reputations:
    289
    Messages:
    780
    Likes Received:
    2
    Trophy Points:
    31
    Yeah, I kinda get that. I just find it weird that when I use the CPU to render the PhysX effects on Medium, GPU usage still drops. But yeah, I know CPU PhysX is really just meant for low PhysX settings.
     
  4. LakeShow89

    LakeShow89 Notebook Evangelist

    Reputations:
    0
    Messages:
    310
    Likes Received:
    3
    Trophy Points:
    31
    Were you able to get this to work? I am trying to do the same.
     
  5. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,431
    Messages:
    58,194
    Likes Received:
    17,901
    Trophy Points:
    931
    GPU usage is falling because the CPU is being loaded so heavily. The IGP can't be used to render or to run the PhysX.
     
  6. HTWingNut

    HTWingNut Potato

    Reputations:
    21,580
    Messages:
    35,370
    Likes Received:
    9,877
    Trophy Points:
    931
    The only way to improve on PhysX is to get a second GPU dedicated to PhysX, so you'd need a new laptop and your options are quite limited.
     
  7. thegh0sts

    thegh0sts Notebook Nobel Laureate

    Reputations:
    949
    Messages:
    7,700
    Likes Received:
    2,819
    Trophy Points:
    331
    Get a laptop that has SLI, disable it, and use one card for PhysX, although that's practically overkill.
     
  8. nipsen

    nipsen Notebook Ditty

    Reputations:
    694
    Messages:
    1,686
    Likes Received:
    131
    Trophy Points:
    81
    ..well.. since you generally have too little grunt for rendering the graphics context on a laptop (you always use all the SIMD units), and PhysX is designed to offload the CPU (by generously wasting SIMD units/SMX cores), reserving the NVIDIA GPU for graphics rendering and offloading PhysX to the CPU will usually actually improve your performance (as long as the CPU has cycles to spare, which a dual/quad core typically will). There are exceptions, probably having to do with specific implementations of PhysX that are intended to update the graphics context directly in per-frame operations. Some old games with custom PhysX routines end up there. Same if the game is extremely CPU-intensive all of the time. And you might run into examples where PhysX run on the CPU generates slower effects, somewhat mismatched animation, and so on.

    But in a reasonably optimized and threaded game, you're normally better off using the graphics card only for rendering, even if the core load is not constantly at 100%. I think the "auto" setting has something to do with the NVIDIA profiles as well, so certain games with known issues will actually run PhysX on the NVIDIA GPU. And I don't know if it's actually done anywhere, but it's at least technically possible to distribute the load between the IGP/CPU and the NVIDIA GPU, as in the sketch below. That won't happen if the PhysX routines are forced to the GPU.
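
    A rough sketch of that load-splitting idea (purely hypothetical, not an existing driver feature; all names invented): CUDA kernel launches are asynchronous, so a CPU thread can integrate its share of a particle array while the GPU works on the rest.

        #include <cuda_runtime.h>
        #include <thread>

        __global__ void integrate_gpu(float* p, float* v, float a, float dt, int n) {
            int i = blockIdx.x * blockDim.x + threadIdx.x;
            if (i < n) {
                v[i] += a * dt;
                p[i] += v[i] * dt;
            }
        }

        void integrate_cpu(float* p, float* v, float a, float dt, int n) {
            for (int i = 0; i < n; ++i) {
                v[i] += a * dt;
                p[i] += v[i] * dt;
            }
        }

        // d_* point at the GPU's share (device memory), h_* at the CPU's
        // share (host memory); the two shares are disjoint particle ranges.
        void step(float* d_p, float* d_v, float* h_p, float* h_v,
                  float a, float dt, int n_gpu, int n_cpu) {
            integrate_gpu<<<(n_gpu + 255) / 256, 256>>>(d_p, d_v, a, dt, n_gpu);
            std::thread cpu(integrate_cpu, h_p, h_v, a, dt, n_cpu);  // runs concurrently
            cpu.join();
            cudaDeviceSynchronize();  // wait for the GPU share too
        }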
     
  9. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    If the game supports SLI, this will give you much lower performance than enabling SLI and setting PhysX to 'auto-select'. Having a dedicated PhysX card has been shown to give little to no performance improvement if you already have a decent Nvidia GPU. The only scenario where it helps is when it's paired with an AMD card to enable GPU PhysX where it otherwise wouldn't be possible, although I'm not sure you can still do that with recent Nvidia drivers.
     
  10. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    Just for what it's worth, Borderlands 2 has a tendency to "dislike" some people's PCs. Sometimes I get drops below 30fps in that game regardless of my settings, with a resounding 20% GPU load. Forcing it to a single card doesn't do anything either. But it runs great for other people. I have no idea. It apparently carried over to Borderlands: The Pre-Sequel (and it happened at launch, so a ton of people noticed it). So if you're getting super low utilization and lots of frame drops and it's a result of PhysX, well... you know what to do XD. Sorry you have to experience that though. Though in my case PhysX on or off didn't really help >_<.
     
  11. Cloudfire

    Cloudfire (Really odd person)

    Reputations:
    7,279
    Messages:
    10,304
    Likes Received:
    2,878
    Trophy Points:
    581
    Greg has the answer.
    PhysX is an Nvidia feature. It's coded with CUDA. Using Intel's EU cores to run PhysX would probably be a good idea, but I don't think Intel is willing to invest much money in a feature that they would most likely have to pay Nvidia to support on the IGP.

    If Nvidia gave the library, and the right to use it, away for free to a competitor with a graphics processor, there goes one of Nvidia's marketing tools for getting people to buy their GPUs.
    It would be a bad choice for Nvidia to allow it.
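
    For a sense of what "coded with CUDA" means in practice, here is a toy kernel in the style of a GPU PhysX effects update (a sketch, not PhysX source; all names invented): debris particles fall under gravity and bounce off a ground plane, one CUDA thread per particle.

        #include <cuda_runtime.h>

        __global__ void debris_step(float* y, float* vy, float dt, int n) {
            int i = blockIdx.x * blockDim.x + threadIdx.x;
            if (i >= n) return;
            vy[i] -= 9.8f * dt;          // gravity
            y[i]  += vy[i] * dt;         // integrate position
            if (y[i] < 0.0f) {           // hit the ground plane
                y[i]  = 0.0f;
                vy[i] = -vy[i] * 0.5f;   // bounce, losing some energy
            }
        }

        // Launched with one thread per debris particle, e.g.:
        // debris_step<<<(n + 255) / 256, 256>>>(d_y, d_vy, 0.016f, n);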
     
  12. nipsen

    nipsen Notebook Ditty

    Reputations:
    694
    Messages:
    1,686
    Likes Received:
    131
    Trophy Points:
    81
    That's true, of course. But it's more about the way the PhysX routines are actually implemented, as far as I know. At the abstract level, what we're really talking about is running very uncomplicated maths repeatedly (or in parallel, when several SIMD units on the graphics card are prepared to run the same instruction) across several memory areas. And when you actually implement this, it's usually just several threads with repeated instructions, which actually runs reasonably well on an Intel CPU anyway (see the sketch below).
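
    A minimal sketch of "several threads with repeated instructions" (illustration only, not PhysX's actual CPU path; names invented): the same simple per-particle math split across CPU cores.

        #include <algorithm>
        #include <thread>
        #include <vector>

        void update_range(float* p, float* v, float dt, int begin, int end) {
            for (int i = begin; i < end; ++i) {  // the repeated, simple math
                v[i] -= 9.8f * dt;
                p[i] += v[i] * dt;
            }
        }

        void update_threaded(float* p, float* v, float dt, int n) {
            unsigned cores = std::max(1u, std::thread::hardware_concurrency());
            std::vector<std::thread> pool;
            int chunk = (n + static_cast<int>(cores) - 1) / static_cast<int>(cores);
            for (unsigned t = 0; t < cores; ++t) {
                int b = static_cast<int>(t) * chunk;
                int e = std::min(n, b + chunk);
                if (b < e) pool.emplace_back(update_range, p, v, dt, b, e);
            }
            for (auto& th : pool) th.join();     // wait before the next frame
        }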

    The exceptions are when we're talking about resubmits to the graphics context, which introduce extra latency that the routines typically would not account for. That can run into all kinds of variable issues, depending entirely on whether those PhysX-optimized threads are, for example, meant to complete before draw calls, that sort of thing. I don't know this for certain, but my impression is that unless PhysX is supported directly via middleware, developers tend to avoid relying on it, like everything else that might make graphics programming more interesting but also less platform-independent and therefore more difficult to deploy across different hardware. Instead, they opt for a more streamlined approach.

    So for laptop gaming: most games, even if they have PhysX optimisations, tend to run reasonably well in software mode. "Reasonably well" is typically what you will settle for on a laptop anyway, since you don't have much graphics card grunt to go on: you're likely to push the resolution and the effects up as far as you can, and even if you disabled all the effects, there would be few free SIMD units/cores left over anyway.
     
  13. LakeShow89

    LakeShow89 Notebook Evangelist

    Reputations:
    0
    Messages:
    310
    Likes Received:
    3
    Trophy Points:
    31
    The plan is to use a DIY eGPU (HD 7770 2GB GDDR5) as the primary card, and since my system doesn't have integrated graphics but instead an Nvidia GT 240M, I would like to use that as a PhysX card and play games like Batman, GRID, DiRT, Mafia, etc., basically games from 2009-2013, on high settings at 1080p. Yes, I am very cheap. I will probably post a thread detailing my experience in 2 months.