The Notebook Review forums were hosted by TechTarget, which shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information posted on the forums is preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.
Problems? See this thread at archive.org.

    Would a Single Card dual GPU work as an egpu?

    Discussion in 'e-GPU (External Graphics) Discussion' started by EpicBlob, Jul 16, 2012.

  1. EpicBlob

    EpicBlob Notebook Evangelist

    Reputations:
    49
    Messages:
    410
    Likes Received:
    16
    Trophy Points:
    31
  2. __-_-_-__

    __-_-_-__ God

    Reputations:
    337
    Messages:
    1,864
    Likes Received:
    10
    Trophy Points:
    56
    I don't recommend it, for three reasons: 1) I think there are better-priced and better-performing cards. 2) It consumes much more power for the performance it delivers compared to other solutions. 3) Dual-GPU cards need much higher bandwidth, and with an eGPU less is more: you want more performance while using less bandwidth.
     
  3. borealiss

    borealiss Notebook Guru

    Reputations:
    12
    Messages:
    55
    Likes Received:
    0
    Trophy Points:
    15
    stick with single gpu cards. you'll need 2x the MMIO resources to run that card than you would a single gpu.
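
    For context, "MMIO resources" here means the memory-mapped I/O windows each PCI device claims; a dual-GPU board claims roughly two of everything, which is what exhausts the limited address space on many laptops. A minimal way to see those windows on Linux is to parse /proc/iomem (a sketch, not part of the original thread; modern kernels report zeroed addresses to non-root readers, and the Windows equivalent is Device Manager's "Resources" tab):

        # List PCI memory-mapped (MMIO) ranges from /proc/iomem (Linux).
        # Illustrates how much address space each device window claims;
        # run as root, since unprivileged reads show zeroed addresses.
        import re

        def pci_mmio_ranges(path="/proc/iomem"):
            ranges = []
            with open(path) as f:
                for line in f:
                    m = re.match(r"\s*([0-9a-f]+)-([0-9a-f]+) : (.+)", line)
                    if m and "PCI" in m.group(3):
                        start, end = int(m.group(1), 16), int(m.group(2), 16)
                        ranges.append((end - start + 1, m.group(3).strip()))
            return ranges

        for size, name in sorted(pci_mmio_ranges(), reverse=True):
            print(f"{size / 2**20:10.1f} MiB  {name}")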
     
  4. EpicBlob

    EpicBlob Notebook Evangelist

    Reputations:
    49
    Messages:
    410
    Likes Received:
    16
    Trophy Points:
    31
    Yeah, I guessed that would be the case. I'm not looking to buy it, as I'm content with my card, but it would be interesting if it were possible. If this were connected using a TB 2.0 x4 connection and memory resources somehow became a non-issue... an external dual-GPU card would be awesome :D
     
  5. __-_-_-__

    __-_-_-__ God

    Reputations:
    337
    Messages:
    1,864
    Likes Received:
    10
    Trophy Points:
    56
    even with TB 2.0 x4 it would be an issue. TB is not the holy grail; it still bottlenecks the gpu, and the bandwidth is not enough. TB 2.0 x8 wouldn't bottleneck most cards, but it would bottleneck some, like dual-gpu ones.
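
    To put rough numbers on this (a back-of-envelope sketch, not from the thread): the link rates below are the standard theoretical per-direction figures, and the "what one GPU can usefully consume" threshold is a hypothetical ~4 GB/s assumption for illustration only:

        # Theoretical per-direction link bandwidth, in MB/s; real-world
        # throughput is lower once encoding/protocol overhead is paid.
        links = {
            "PCIe 2.0 x1":                500,
            "Thunderbolt (10 Gbit/s)":   1250,
            "PCIe 2.0 x4":               2000,
            "Thunderbolt 2 (20 Gbit/s)": 2500,
            "PCIe 2.0 x8":               4000,
            "PCIe 2.0 x16":              8000,
        }

        # Hypothetical assumption: a single desktop GPU stops gaining much
        # above ~4000 MB/s, and a dual-GPU board feeds two GPUs through
        # the same upstream link, so it wants roughly double that.
        single_need, dual_need = 4000, 8000

        for name, mb_s in links.items():
            verdict = ("ok even for dual-GPU" if mb_s >= dual_need else
                       "ok for a single GPU" if mb_s >= single_need else
                       "bottlenecks a single GPU")
            print(f"{name:26s} {mb_s:5d} MB/s  {verdict}")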
     
  6. borealiss

    borealiss Notebook Guru

    Reputations:
    12
    Messages:
    55
    Likes Received:
    0
    Trophy Points:
    15
    yeah, you would need more bandwidth. you'd probably be OK if you used the 2 separate thunderbolt ports on a macbook pro, provided you have enough MMIO ranges free. i think there are 2 separate channels.

    overall SLI is sort of a no-win situation in my opinion. there are loads of compatibility issues, and it's barely perceptibly faster in most real-world settings, unless you're running 4 monitors at 1080p or higher with everything turned on, etc...

    If you're doing that, you might as well build a dedicated desktop system... or go buy an Alienware system if you must have SLI. their packaging would be so much cleaner than eGPUs.
     
  7. jamalabad

    jamalabad Notebook Enthusiast

    Reputations:
    0
    Messages:
    14
    Likes Received:
    0
    Trophy Points:
    5
    I just started visiting forums in general now that I have an egpu... but I'm pretty sure SLI would make a HUGE difference in an egpu setup. e.g., in the link below, 560 Ti SLI gets ~75% higher framerates than a single 560 Ti. In terms of 'perceptibility', maybe ~75% doesn't matter much if you're already at 60fps, but with an egpu, 75% is a night-and-day difference. e.g. I'm running a 560 Ti on an x220 @ x1.2opt and getting 35fps in GTA IV at ~60-70% of max settings on 720p. bump any setting from high to very high (or from medium to high) and it is TOAST (drops to ~15fps).

    Plus I'm looking to do some serious CUDA bidnez w/ Matlab, so a performance jump like that is huge. I have/need a laptop, so this is the answer... except that nvidia disables SLI at x1.x, which makes it actually not the answer.

    Charts, benchmarks 2011 Gaming Graphics Charts, Enthusiast Index
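
    For what it's worth, the arithmetic behind that claim (simply restating the poster's numbers; whether desktop SLI scaling would survive a bandwidth-starved x1 link is exactly what the next post questions):

        base_fps = 35           # reported single 560 Ti on the x1.2opt link
        claimed_scaling = 1.75  # ~75% SLI uplift, from the desktop charts
        print(f"{base_fps * claimed_scaling:.0f} fps")  # ~61 fps, if it held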
     
  8. borealiss

    borealiss Notebook Guru

    Reputations:
    12
    Messages:
    55
    Likes Received:
    0
    Trophy Points:
    15
    you're on a single gpu and already running into bottlenecks because you're bandwidth limited on the link, not compute limited or memory-bandwidth limited on the card.

    adding a second card won't do much, if anything, for you because you can't even max out the full potential of the first card. it might even make things slower because of some of the overhead that comes with SLI.

    this is all theorizing though, i could be wrong. go buy 2 cards for an egpu setup and run the benchmarks.

    for matlab, you probably aren't as bandwidth limited as games are, so for a gpgpu application you'll probably benefit. but then you're not really using the "SLI" aspect of it unless an application is written specifically for it: you're just farming out computations to 2 separate gpu entities, not having them act in lockstep to solve a single problem. that is, unless matlab specifically supports advanced scheduling to actually use an SLI setup, but i doubt it.
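
    A minimal sketch of that "farming out" pattern, with no SLI involved (not from the thread): each worker process pins itself to one GPU via CUDA_VISIBLE_DEVICES, which is a real CUDA environment variable, while compute_chunk() is a hypothetical stand-in for the actual CUDA or Matlab workload:

        import os
        from concurrent.futures import ProcessPoolExecutor

        def compute_chunk(chunk):
            # Hypothetical placeholder: real work would launch CUDA kernels
            # (or a Matlab batch job) against the one visible device.
            return sum(chunk)

        def worker(args):
            gpu_id, chunk = args
            # Pin this process to a single physical GPU before any CUDA
            # context exists; "device 0" then means this GPU only.
            os.environ["CUDA_VISIBLE_DEVICES"] = str(gpu_id)
            return compute_chunk(chunk)

        if __name__ == "__main__":
            data = list(range(1000))
            halves = [data[:500], data[500:]]
            # Two independent GPU entities, each solving its own half;
            # nothing here runs in lockstep the way SLI rendering does.
            with ProcessPoolExecutor(max_workers=2) as pool:
                results = list(pool.map(worker, enumerate(halves)))
            print(sum(results))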
     
  9. __-_-_-__

    __-_-_-__ God

    Reputations:
    337
    Messages:
    1,864
    Likes Received:
    10
    Trophy Points:
    56
    depending on the application you are using (not all programs use the same gpu resources), it will make it slower for sure. I've seen some demonstrations, don't remember where now, but it's easy to run those tests on a desktop by duct-taping some lanes of the pci-e connector.