The Notebook Review forums were hosted by TechTarget, which shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information that had been posted on the forums is preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.
Problems? See this thread at archive.org.

    It's Here! Perfect scaling when using multiple GPUs may soon be possible

    Discussion in 'Gaming (Software and Graphics Cards)' started by 1337haxorz, Aug 22, 2009.

  1. 1337haxorz

    1337haxorz Notebook Guru

    Reputations:
    13
    Messages:
    63
    Likes Received:
    0
    Trophy Points:
    15
    UPDATE: http://www.engadget.com/2009/09/23/l...d-bring-peace/

    A year ago, a company called Lucid Logix presented a technology called Hydra. In short, Hydra is a multi-GPU solution that balances the workload before it's sent to the GPUs. No CrossFire or SLI is needed, and they even claim to be able to run ATI and nVidia cards in the same machine. Up to 4 cards can be combined, regardless of brand, type, etc., and they claim near-100% scaling.
    This technology has recently been spotted on MSI's P55 Big Bang Motherboard.

    ( link)

    ( Source)
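    Lucid hasn't published how Hydra's balancer actually works, but the basic idea it describes (splitting a frame's work across dissimilar GPUs in proportion to each card's performance, rather than SLI/CrossFire-style alternate-frame rendering) can be sketched in a few lines. This is a purely illustrative Python sketch; the GPU names and speed weights are made up:

    ```python
    # Illustrative sketch of proportional multi-GPU load balancing.
    # The "speed" weights are hypothetical; a real balancer would
    # measure card performance and track per-frame dependencies.

    def balance_workload(draw_calls, gpus):
        """Assign draw calls to GPUs proportionally to their speed."""
        total = sum(g["speed"] for g in gpus)
        assignments = {g["name"]: [] for g in gpus}
        credits = {g["name"]: 0.0 for g in gpus}
        for call in draw_calls:
            # Weighted round-robin: each GPU earns "credit" per pass...
            for g in gpus:
                credits[g["name"]] += g["speed"] / total
            # ...and the call goes to whichever GPU has the most credit.
            target = max(gpus, key=lambda g: credits[g["name"]])
            credits[target["name"]] -= 1.0
            assignments[target["name"]].append(call)
        return assignments

    # Hypothetical pairing: a fast card plus an older, slower one.
    gpus = [{"name": "GTX 285", "speed": 3.0}, {"name": "7900GT", "speed": 1.0}]
    work = balance_workload(list(range(100)), gpus)
    print(len(work["GTX 285"]), len(work["7900GT"]))  # prints: 75 25
    ```

    With a 3:1 speed ratio, the faster card ends up with three quarters of the work, which is the kind of split that would let mismatched cards both stay busy instead of the slower one dragging the pair down.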
     
  2. houstoned

    houstoned Yoga Pants Connoisseur.

    Reputations:
    2,852
    Messages:
    2,224
    Likes Received:
    388
    Trophy Points:
    101
    interesting stuff. hopefully everything will work as designed.
     
  3. Rahul

    Rahul Notebook Prophet

    Reputations:
    1,741
    Messages:
    6,252
    Likes Received:
    61
    Trophy Points:
    216
    One thing I am excited about is being able to use older video cards lying around to boost performance instead of letting them collect dust, such as pairing a 7900GT with a GTX 285. Or is that not a good concept? Even if it isn't, it's exciting to be able to use a modern ATI and Nvidia card together, aside from PhysX.

    But this will definitely hit desktops first and take a while to hit notebooks if ever.

    For notebooks, they first need to become more upgradable and modular. I'm excited about external GPUs; we just need a higher-bandwidth connection for them, a new external port that lets cards run at their full potential, especially desktop video cards used externally with a notebook, rather than bottlenecking them at x1 speed like the current ExpressCard standard.
     
  4. Harleyquin07

    Harleyquin07 エミヤ

    Reputations:
    603
    Messages:
    3,376
    Likes Received:
    78
    Trophy Points:
    116
    Sounds familiar, hasn't the company in question been working on this for a while? I seem to recall a similar article mentioning the technology last year when it was still under development.
     
  5. 1337haxorz

    1337haxorz Notebook Guru

    Reputations:
    13
    Messages:
    63
    Likes Received:
    0
    Trophy Points:
    15
    That is one of its major selling points: it can take both cards and extract as much power as it possibly can from both of them.

    I disagree. If this technology takes off, and with its massive backing from Intel, I see no reason that manufacturers would not put this into every new notebook. In my opinion, the first notebook that uses this technology will have huge sales if it is marketed well, and others will follow.

    Yes, there was a thread about it a year ago by storm effect [URL="http://forum.notebookreview.com/showthread.php?p=3799580"]here[/URL].
     
  6. imaspin17

    imaspin17 Newbie

    Reputations:
    0
    Messages:
    7
    Likes Received:
    0
    Trophy Points:
    5
    It is actually not possible to have both ATI and Nvidia cards in the same system... They said the problem isn't so much with the hardware, but that operating systems only allow one type of graphics driver to be installed at a time.
     
  7. Rahul

    Rahul Notebook Prophet

    Reputations:
    1,741
    Messages:
    6,252
    Likes Received:
    61
    Trophy Points:
    216
    I believe XP and 7 can have more than one graphics driver installed but Vista cannot, though I have heard there is now even a hack for Vista that allows it.

    So for instance, you can have an ATI card as your main GPU and use an Nvidia GPU for Physx like some do.
     
  8. mechrock

    mechrock Notebook Evangelist

    Reputations:
    85
    Messages:
    594
    Likes Received:
    0
    Trophy Points:
    30
    I did not know you could do that. So you're saying I have an ATI card now, and if I add an 8800 GT or something else that supports PhysX with the hack, then I would be able to use CUDA?
     
  9. Tony_A

    Tony_A Notebook Evangelist

    Reputations:
    67
    Messages:
    487
    Likes Received:
    0
    Trophy Points:
    30
    I'll believe it when I see it.

    No one's been allowed to benchmark this product in action, even though Hydra was unveiled a year ago. Wonder why?

    Call me skeptical, but this looks like another idea/patent company looking for either a buyout or some more VC money.
     
  10. 1337haxorz

    1337haxorz Notebook Guru

    Reputations:
    13
    Messages:
    63
    Likes Received:
    0
    Trophy Points:
    15
    A year ago it was just a tech demo, and most tech demos never get past that stage. This is different because they have more than enough money from Intel to properly develop their product. Also, I doubt MSI would put it on one of their products, or that Intel would keep funding them, if it didn't work at least somewhat as promised.

    Like you, though, I would like to see some benchmarks and stats for this.
     
  11. Intensity

    Intensity Notebook Geek

    Reputations:
    5
    Messages:
    79
    Likes Received:
    0
    Trophy Points:
    15
    An online seller in Malaysia has it for sale for about $400, the MSI P55 Big Bang.

    Is it me, or is the board really expensive?
     
  12. Kevin

    Kevin Egregious

    Reputations:
    3,289
    Messages:
    10,780
    Likes Received:
    1,782
    Trophy Points:
    581
    Here's the #1 reason. Because manufacturers won't stay afloat if notebooks become modular. The last thing they want is for you to be able to keep up with the times by upgrading a notebook over a course of years.
     
  13. mew1838

    mew1838 Team Teal

    Reputations:
    294
    Messages:
    1,231
    Likes Received:
    5
    Trophy Points:
    56
    Great post, totally agree
     
  14. Rahul

    Rahul Notebook Prophet

    Reputations:
    1,741
    Messages:
    6,252
    Likes Received:
    61
    Trophy Points:
    216
    If desktops can be modular and upgradable over time, notebooks should be as well. Perhaps not models from Toshiba, HP, Dell, etc. but at least ones from Sager, Clevo, etc please?
     
  15. magma_saber

    magma_saber Notebook Consultant

    Reputations:
    66
    Messages:
    234
    Likes Received:
    0
    Trophy Points:
    30
    i doubt this will ever really take off. nvidia and ati will stop it from succeeding. neither of them will want to have their cards working together with their competition.
     
  16. maozdawgg

    maozdawgg Notebook Geek

    Reputations:
    51
    Messages:
    81
    Likes Received:
    0
    Trophy Points:
    15
    One very good reason would be opposition from ATI and NVIDIA:

    Now of course I'm still not sure how you can use both ATI and NVIDIA in the same system, because it wasn't until very recently that Intel came out with the x58 chipset that allowed for NVIDIA or ATI SLI/CF capability (whereas before it was either the nvidia chipset or the AMD one), so I'm not sure how you can SLFire an nvidia and an ATI card... not to mention you would need to somehow run two drivers at the same time for the two GPUs... :confused:
     
  17. ichime

    ichime Notebook Elder

    Reputations:
    2,420
    Messages:
    2,676
    Likes Received:
    3
    Trophy Points:
    56
    If this tech does what it says it does, I could see ATi and nVidia opposing this (more so nVidia, because they make more money with the SLi license) and Intel possibly buying Lucid Logix (Intel already gave Lucid Logix grant money, and lots of it). That way, Intel won't have to worry about having to get licenses for CrossFire or SLi to work with their chipsets.
     
  18. Starfox

    Starfox Notebook Evangelist

    Reputations:
    81
    Messages:
    302
    Likes Received:
    0
    Trophy Points:
    30
    "Perfect scaling"? No, never gonna happen with this method, and I'm willing to bet you're using the word perfect in a non-orthodox way..
     
  19. 1337haxorz

    1337haxorz Notebook Guru

    Reputations:
    13
    Messages:
    63
    Likes Received:
    0
    Trophy Points:
    15
    They claim it can scale to 85-95% of both cards' potential. It isn't "perfect", but if it works it will be the closest thing to it.

    I don't think it's so much using an ATI card with a nVidia card as it is using an older card with a newer card. If this takes off then people wouldn't be forced to buy two newer cards, instead they will use their old card + a new card. Although on the other side of the coin, people will know that if they buy two newer cards, they will get the full bang for their buck.
     
  20. Starfox

    Starfox Notebook Evangelist

    Reputations:
    81
    Messages:
    302
    Likes Received:
    0
    Trophy Points:
    30
    BULL, unless you're rendering identically sized screen aligned quads with pixel shaders that perform no texture accesses.

    Seriously, there're a few problems: Z-buffer resolve, texture duplication (if 2 objects drawn on different cards share textures), and the incompatibilities introduced by rounding errors and anisotropic filtering algorithms, and lots more. There's a thread on Beyond3D (Which has lots of bright minds when it comes to computer graphics on its forums) discussing it in detail, and the general consensus seems to be that even near-perfect scaling or anything better than SLI's AFR sounds like a pipe dream. Here's the thread: http://forum.beyond3d.com/showthread.php?t=49028