The Notebook Review forums were hosted by TechTarget, which shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, to preserve the valuable technical information that had been posted on the forums. For current discussions, many NBR forum users moved to NotebookTalk.net after the shutdown.

    HYDRA Engine - Perfect GPU Scaling

    Discussion in 'Gaming (Software and Graphics Cards)' started by StormEffect, Aug 19, 2008.

  1. StormEffect

    StormEffect Lazer. *pew pew*

    Reputations:
    613
    Messages:
    2,278
    Likes Received:
    0
    Trophy Points:
    55

    It seems a company called LucidLogix may beat Nvidia and ATI at their own game. PC Perspective has written an article detailing how Lucid's hardware and software may allow for perfect GPU scaling using an unlimited number of GPUs of any model within a respective brand.

    Here is a small clip of the simple, two-page article:

    ( Link to full article)

     
  2. ViciousXUSMC

    ViciousXUSMC Master Viking NBR Reviewer

    Reputations:
    11,461
    Messages:
    16,824
    Likes Received:
    76
    Trophy Points:
    466
    Sounds great on paper, but will we ever see it? If anything, the big companies will steal the idea and rename it.

    But how would you have a GeForce 6600 and a GTX 280 working together when they use different drivers these days? That would have to be gotten around first; I think you would need a way to run different drivers for each card if you really did mix and match models that are generations apart.

    Even more fun: how about we toss ATI and Nvidia cards together? :p
     
  3. StormEffect

    StormEffect Lazer. *pew pew*

    Reputations:
    613
    Messages:
    2,278
    Likes Received:
    0
    Trophy Points:
    55
    The article does a decent job of explaining it. Even now, Nvidia drivers still use a unified model. So while one driver is not the best for all GPUs, it will work for almost all of them (going back to the 6 series).

    Allegedly we'll see actual hardware on the market by early 2009.

    Also, the system won't work with different brands of GPU in the same machine, so no Nvidia + ATI setups. Interestingly enough, though, any system with the Hydra chip will be able to run multiple GPUs from either ATI or Nvidia. In other words, nobody needs CrossFire or SLI certification and hardware anymore, if this works out.
     
  4. royk50

    royk50 times being what they are

    Reputations:
    258
    Messages:
    1,975
    Likes Received:
    0
    Trophy Points:
    55
    Interestingly enough, one of their investors is Intel.
     
  5. Harleyquin07

    Harleyquin07 エミヤ

    Reputations:
    603
    Messages:
    3,376
    Likes Received:
    78
    Trophy Points:
    116
    Interesting concept, but how does this translate into improved gaming performance on multiple GPUs?
     
  6. Ennea

    Ennea wwwwww

    Reputations:
    62
    Messages:
    1,291
    Likes Received:
    2
    Trophy Points:
    56
    Wow. This'll get serious if they actually pull it off.
     
  7. Nocturnal310

    Nocturnal310 Notebook Virtuoso

    Reputations:
    792
    Messages:
    2,708
    Likes Received:
    0
    Trophy Points:
    0
    Some of the best technologies remain in the pipeline forever... hope to see this one come out soon.

    The graphics industry is all messed up right now: too many models, too many games.

    If there were a graphics card that could stay future-proof for even two years, it would be **** good.
     
  8. StormEffect

    StormEffect Lazer. *pew pew*

    Reputations:
    613
    Messages:
    2,278
    Likes Received:
    0
    Trophy Points:
    55
    They claim almost perfectly linear performance.

    (all examples use made-up starting FPS values)

    Simple example: 3 x ATI 4850

    1 x 4850 = 15 FPS in Crysis at Max Settings.
    2 x 4850 = 30 FPS in Crysis at Max Settings.
    3 x 4850 = 45 FPS in Crysis at Max Settings.

    More complicated example: 1 x Nvidia GTX 280, 1 x Nvidia 9800GTX, 1 x Nvidia 8800GT

    GTX 280 = 25 FPS in Crysis at Max Settings.
    9800 GTX = 20 FPS in Crysis at Max Settings.
    8800 GT = 10 FPS in Crysis at Max Settings.

    GTX 280 + 9800 GTX = 45 FPS in Crysis at Max Settings.
    GTX 280 + 9800 GTX + 8800 GT = 55 FPS in Crysis at Max Settings.

    So essentially, Hydra takes the place of CrossFireX and SLI, and according to their claims it makes perfect use of whatever GPUs it is given, with no wasted GPU power.

    They seriously claim we'll see this integrated into motherboards and certain GPU boards by 2009. I am pretty excited!
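
    To make the claimed arithmetic concrete, here is a minimal sketch in Python (the FPS figures are made up, as above, and it assumes Lucid's perfectly linear scaling claim actually holds):

        # Minimal sketch of Lucid's claimed perfectly linear scaling.
        # All FPS figures are hypothetical, matching the examples above.
        def hydra_fps(per_gpu_fps):
            # Under perfect scaling, combined throughput is simply the
            # sum of what each card achieves on its own.
            return sum(per_gpu_fps)

        print(hydra_fps([15, 15, 15]))  # 3 x ATI 4850 -> 45 FPS
        print(hydra_fps([25, 20, 10]))  # GTX 280 + 9800 GTX + 8800 GT -> 55 FPS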
     
  9. stirfriedsushi

    stirfriedsushi Confuse a Cat LTD

    Reputations:
    60
    Messages:
    744
    Likes Received:
    0
    Trophy Points:
    30
    Does look cool, but I am wary.
     
  10. liquidfir3

    liquidfir3 Notebook Guru

    Reputations:
    0
    Messages:
    53
    Likes Received:
    0
    Trophy Points:
    15
    Does this mean that graphics cards will never really be outdated? If I have a basic graphics card today, and upgrade to a new card every 2 years, will I be able to link the 6 graphics cards I bought over 10 years together? Then the card I bought 10 years ago wouldn't be sitting in a box but giving me 2 fps in any next-gen game!
     
  11. StormEffect

    StormEffect Lazer. *pew pew*

    Reputations:
    613
    Messages:
    2,278
    Likes Received:
    0
    Trophy Points:
    55
    I'm going out on a limb here, but provided all of the graphics cards you own support the DirectX version and Shader Model the game in question uses, you could do exactly what you describe. This assumes the HYDRA Engine chip is integrated directly into the motherboard.
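
    As a rough illustration of that compatibility condition, here is a hypothetical sketch (the card list, feature levels, and FPS values are invented for the example):

        # Hypothetical sketch: only cards that meet the game's DirectX and
        # Shader Model requirements could join the rendering pool.
        gpus = [
            {"name": "8800 GT",  "directx": 10, "shader_model": 4.0, "fps": 10},
            {"name": "9800 GTX", "directx": 10, "shader_model": 4.0, "fps": 20},
            {"name": "GTX 280",  "directx": 10, "shader_model": 4.0, "fps": 25},
        ]

        def usable_pool(cards, game_dx, game_sm):
            # Drop any card that lacks the feature level the game needs.
            return [c for c in cards
                    if c["directx"] >= game_dx and c["shader_model"] >= game_sm]

        pool = usable_pool(gpus, game_dx=10, game_sm=4.0)
        print(sum(c["fps"] for c in pool))  # combined FPS if perfect scaling holds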
     
  12. maestro manders

    maestro manders Notebook Enthusiast

    Reputations:
    0
    Messages:
    12
    Likes Received:
    0
    Trophy Points:
    5
    This is an awesome article. Now all we need is motherboards with over 10 PCIe slots, lol.
     
  13. Harleyquin07

    Harleyquin07 エミヤ

    Reputations:
    603
    Messages:
    3,376
    Likes Received:
    78
    Trophy Points:
    116
    Sounds fantastic in theory, but I don't know if Nvidia and ATI will let their proprietary multi-GPU platforms go to waste because of this upstart technology, which is as yet unproven in the market.
     
  14. Phil17

    Phil17 Notebook Consultant

    Reputations:
    13
    Messages:
    292
    Likes Received:
    0
    Trophy Points:
    30
    This is an amazing idea! It's about time the outdated SLI/Crossfire got replaced. It's not like they double the performance of a system anyway, and if I can use older cards, all the better. I just hope ATI and Nvidia don't kill the project...
     
  15. unnamed01

    unnamed01 Notebook Deity

    Reputations:
    194
    Messages:
    982
    Likes Received:
    0
    Trophy Points:
    30
    Is this one of those "concept car" ideas, where they talk about how great it is but it never actually gets made?
     
  16. Robgunn

    Robgunn Notebook Evangelist

    Reputations:
    163
    Messages:
    383
    Likes Received:
    0
    Trophy Points:
    30
    I bet Intel is very interested in this for Larrabee. They did claim near-linear performance scaling for Larrabee, and I bet this is why.

    This is pretty neat tech. It's basically a scene analyzer that splits up the rendering based on scene complexity. This would be great for ray tracing and photon mapping as well.
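
    To picture what such a scene analyzer might do, here is a purely hypothetical sketch (Lucid has not published its actual algorithm) that splits a frame's draw calls across GPUs in proportion to each card's throughput:

        # Hypothetical sketch: divide a frame's rendering tasks among GPUs
        # in proportion to each card's relative throughput, so none sits idle.
        def split_workload(tasks, throughputs):
            total = sum(throughputs)
            shares, start = [], 0
            for i, t in enumerate(throughputs):
                if i == len(throughputs) - 1:
                    end = len(tasks)  # last GPU takes the remainder
                else:
                    end = start + round(len(tasks) * t / total)
                shares.append(tasks[start:end])
                start = end
            return shares

        # 100 draw calls split across a GTX 280, 9800 GTX, and 8800 GT.
        shares = split_workload(list(range(100)), [25, 20, 10])
        print([len(s) for s in shares])  # -> [45, 36, 19], roughly 25:20:10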