The Notebook Review forums were hosted by TechTarget, which shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information posted on the forums was preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.
Problems? See this thread at archive.org.

    PhysX: ATi Time Bomb, hilarious

    Discussion in 'Gaming (Software and Graphics Cards)' started by ziddy123, Apr 24, 2010.

  1. ziddy123

    ziddy123 Notebook Virtuoso

    Reputations:
    954
    Messages:
    2,805
    Likes Received:
    1
    Trophy Points:
    0
  2. HTWingNut

    HTWingNut Potato

    Reputations:
    21,580
    Messages:
    35,370
    Likes Received:
    9,878
    Trophy Points:
    931
    LOL.

    Didn't nVidia get sued for this already? That's kinda childish IMHO.
     
  3. ziddy123

    ziddy123 Notebook Virtuoso

    Reputations:
    954
    Messages:
    2,805
    Likes Received:
    1
    Trophy Points:
    0
    No idea, but this is with their latest PhysX release. It's funny because even if you buy an Nvidia card to run hybrid PhysX alongside an ATI card, they won't let you: the driver disables GPU PhysX as soon as it detects a non-Nvidia GPU in the system.

    But there is a patched PhysX driver that gets hybrid setups working.
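    For context, the lockout people are patching around is essentially a vendor check: when the driver sees a non-Nvidia display adapter in the system, it refuses to enable GPU PhysX. Below is a minimal sketch of that idea in C++ using standard DXGI adapter enumeration. The PCI vendor IDs are real; the function name GpuPhysXAllowed and the exact detection logic are illustrative assumptions, not Nvidia's actual code.

    ```cpp
    // Illustrative sketch only: the kind of vendor-ID check the PhysX driver
    // is believed to perform. Nvidia's real detection logic is not public.
    #include <dxgi.h>
    #include <cstdio>
    #pragma comment(lib, "dxgi.lib")

    // Hypothetical helper: returns true only if an Nvidia GPU is present
    // AND no competitor GPU is installed alongside it.
    bool GpuPhysXAllowed() {
        IDXGIFactory* factory = nullptr;
        if (FAILED(CreateDXGIFactory(__uuidof(IDXGIFactory), (void**)&factory)))
            return false;

        bool sawNvidia = false, sawAti = false;
        IDXGIAdapter* adapter = nullptr;
        for (UINT i = 0; factory->EnumAdapters(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i) {
            DXGI_ADAPTER_DESC desc;
            adapter->GetDesc(&desc);
            if (desc.VendorId == 0x10DE) sawNvidia = true;  // Nvidia's PCI vendor ID
            if (desc.VendorId == 0x1002) sawAti = true;     // ATI/AMD's PCI vendor ID
            adapter->Release();
        }
        factory->Release();

        // The complained-about behavior: GPU PhysX is refused not only when no
        // Nvidia card exists, but whenever a competitor's GPU is also present.
        return sawNvidia && !sawAti;
    }

    int main() {
        std::printf("GPU PhysX %s\n", GpuPhysXAllowed() ? "enabled" : "disabled");
    }
    ```

    The hybrid-PhysX patches effectively make that final check pass even when an ATI card is doing the rendering.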
     
  4. usapatriot

    usapatriot Notebook Nobel Laureate

    Reputations:
    3,266
    Messages:
    7,360
    Likes Received:
    14
    Trophy Points:
    206
    Nvidia are a bunch of arsehats. I'm glad I bought a 5870 instead of a GTX 470.
     
  5. ViciousXUSMC

    ViciousXUSMC Master Viking NBR Reviewer

    Reputations:
    11,461
    Messages:
    16,824
    Likes Received:
    76
    Trophy Points:
    466
    It's a better product anyway: almost the same performance in every respect, with lower heat, noise, and power draw.
     
  6. ziddy123

    ziddy123 Notebook Virtuoso

    Reputations:
    954
    Messages:
    2,805
    Likes Received:
    1
    Trophy Points:
    0
    More hilarity with Nvidia PhysX ensues.

    EVGA has now banned discussions of PhysX mods to enable hybrid PhysX setups.

    ROFL ROFL.

    Nvidia PhysX no longer supports AGEIA PhysX Processors

    ROFL ROFL ROFL

    Any questions about how dumb Nvidia PhysX is should now be answered. This physics engine is such a closed system that it won't even support dedicated PhysX hardware unless it's completely branded by Nvidia.
     
  7. lowlymarine

    lowlymarine Notebook Deity

    Reputations:
    401
    Messages:
    1,422
    Likes Received:
    1
    Trophy Points:
    56
    Let's see:
    -No chipset business left.
    -Tegra is DOA.
    -Fermi is a massive joke: The power requirements are obscene. The heat output could fry eggs. They're practically selling cards at a loss and the 5870 is still $100 less. Sub-20% yields make it impossible to buy a GTX 480 even if you suffered some sort of massive head trauma and decided it was a good idea. DirectCompute performance is abysmal, making Tesla a joke, too.
    -Now this, which just screams "anti-trust lawsuit."

    nVidia has dug themselves a grave so deep they're about to hit magma. It's funny, really: nVidia bought out 3dfx when it failed. Who's going to buy out nVidia now that they're going down the exact same way?
     
  8. Vaath

    Vaath Notebook Deity

    Reputations:
    77
    Messages:
    826
    Likes Received:
    0
    Trophy Points:
    30
    Gee, looks like the next desktop I build is gonna have to be sporting ATI, a first for me. I've been kind of an Nvidia fanboy for a while, but I'm losing faith fast. Funny thing is the same thing happened with me and 3dfx back then; my luck sucks for picking video card companies. LOL
     
  9. mobius1aic

    mobius1aic Notebook Deity NBR Reviewer

    Reputations:
    240
    Messages:
    957
    Likes Received:
    0
    Trophy Points:
    30
    Watch it be something as insidious as Intel making an offer for Nvidia lol.

    Honestly, I don't think Nvidia will go under. Fermi's issues are a) yields, b) power needs, and c) thermal output. Much of that comes from the extra transistors added for GPGPU-related purposes, not just pure DX11 and OpenGL compatibility. For lower-end cards, the feature set of the series looks pretty good to me as far as capabilities go, since you don't need such extreme amounts of power to see the benefits of CUDA technology in mainstream computing.

    Also, Nvidia is in bed with Apple, supplying both the chipset/IGP and the dedicated GPUs in iMacs and MacBooks. There are also royalties from the PS3's RSX, it's probable the PS4 will use an Nvidia GPU, and the Nintendo 3DS is rumored to be using Nvidia's Tegra mobile system. Fermi low- and mid-range parts haven't even come out yet, and I think Nvidia will make a decent amount with them. They won't suffer the massive power and heat issues that the GTX 400s do, though they will still most likely have worse power consumption per unit of capability than their ATi counterparts. In the lower-end sectors, however, this won't matter much except in slimline computers and laptops.
     
  10. Abula

    Abula Puro Chapin

    Reputations:
    1,115
    Messages:
    3,252
    Likes Received:
    13
    Trophy Points:
    106
    I think Intel has tried to buy Nvidia in the past but nothing has progressed, though who knows with the current situation... It would be interesting to see Intel+Nvidia vs AMD+ATI, endless wars.
     
  11. Phinagle

    Phinagle Notebook Prophet

    Reputations:
    2,521
    Messages:
    4,392
    Likes Received:
    1
    Trophy Points:
    106
    Intel already paid $1.25 billion for licenses to AMD patents for the next ~4.5 years...they have no need to buy Nvidia anymore.
     
  12. Shadowfate

    Shadowfate Wala pa rin ako maisip e.

    Reputations:
    424
    Messages:
    1,329
    Likes Received:
    1
    Trophy Points:
    56
    Well, Intel and AMD did not disclose which patents were licensed, assuming that's what the $1.25 billion was really paid for.

    I remember the payment was also for settling the lawsuit and several other things.

    The patents mentioned may be several OLD patents from ATI and not the NEW patents in their current and upcoming GPUs.

    If AMD just agreed to share ALL their patents with their main rival, which has BIG pockets, they just dug their own grave.