The Notebook Review forums were hosted by TechTarget, which shut them down on January 31, 2022. This static, read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to preserve the valuable technical information that had been posted on the forums. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.
Problems? See this thread at archive.org.

    What is the best way to future-proof a laptop's GPU? (well, not forever, but for as long as possible)

    Discussion in 'Gaming (Software and Graphics Cards)' started by leladax, Jun 20, 2013.

  1. leladax

    leladax Notebook Guru

    Reputations:
    0
    Messages:
    63
    Likes Received:
    0
    Trophy Points:
    15
    I noticed there are three ways:

    1. Get an extremely fast GPU now: This does not suit me, since I never go for the very high end, not only because of cost but because the very high end is also overpriced at its prime.

    2. Get a replaceable GPU: This sounds nice, though I often find better offers locally on other laptops. I might be able to get a laptop with a better make, say a Sager, from a neighboring country, though I worry about warranty and so on.

    3. Have a very good eGPU setup: I actually LOVE this idea, because I use my laptop as a mobile desktop that I set up in two different places with external monitors and all. So for my lifestyle, an extra external device is not that bad, especially if it's light (a small ATX case, contrary to what one might assume, is NOT light; they are huge compared to laptops).

    So... IMO I'd probably go with a laptop with Thunderbolt or another fast eGPU connection, though I hate Apple and its overpriced stuff, so it would have to be one of those rare other machines that support it. I also worry about the supporting equipment being overpriced; a similar competing technology would be better.

    Of course, it would be amazing if manufacturers wised up and offered an easy way to expose x16 PCI-E directly; the motherboards are perfectly capable.

    Of course there's still the option of swappable GPUs, though they might be more costly than that, and my lifestyle doesn't demand much mobility, mainly low weight. And an eGPU is light enough.
     
  2. HTWingNut

    HTWingNut Potato

    Reputations:
    21,580
    Messages:
    35,370
    Likes Received:
    9,878
    Trophy Points:
    931
    1 & 2 go hand in hand. Top-end GPUs are usually replaceable. And to be honest, your best bet is to get the fastest GPU you can afford, ideally the top-end card. The only reason being that laptop GPUs are quite expensive, and it's not really cost effective to keep upgrading your GPU.

    eGPUs are a mixed bag. They work on only a limited number of laptops, and are quirky to get working.
     
  3. saturnotaku

    saturnotaku Notebook Nobel Laureate

    Reputations:
    4,879
    Messages:
    8,926
    Likes Received:
    4,707
    Trophy Points:
    431
    And also not viable if you want to game on the go, which is essentially the point of having a gaming laptop in the first place.
     
  4. leladax

    leladax Notebook Guru

    Reputations:
    0
    Messages:
    63
    Likes Received:
    0
    Trophy Points:
    15
    They are viable for certain lifestyles. E.g., I move the thing a lot, but NOT as a laptop, strictly as a mobile desktop; i.e., I set it up on a desk with external peripherals for hours in two different places.

    In that framework, an eGPU is spectacularly beneficial: admirable GPU performance in a very light package.

    The main problem, in my opinion, is how good Intel is at CPUs compared to where everyone else is on GPUs. A modern i7 lasts around 5 years, while a laptop GPU is often only good for a year if one doesn't pay very close attention to it initially.

    It's not strictly anyone's fault, but Intel has its own fabs, while the others have to contract out their manufacturing, so logically their process technology lags behind.
     
  5. bigtonyman

    bigtonyman Desktop Powa!!!

    Reputations:
    2,377
    Messages:
    5,040
    Likes Received:
    277
    Trophy Points:
    251

    You're comparing apples to oranges there, buddy. There is a reason we still have discrete graphics and don't rely on the Intel iGPU. ;)
     
  6. HTWingNut

    HTWingNut Potato

    Reputations:
    21,580
    Messages:
    35,370
    Likes Received:
    9,878
    Trophy Points:
    931
    That's because games are primarily GPU bound. Developers push the graphics envelope all the time, while core gameplay rarely changes, so a much faster CPU isn't required.
     
  7. maverick1989

    maverick1989 Notebook Deity

    Reputations:
    332
    Messages:
    1,562
    Likes Received:
    22
    Trophy Points:
    56
    I am not sure what you mean by a modern i7 "lasting" five years. Firstly, it is not possible to get that number, because the Core "i" series hasn't been out for five years yet. Secondly, the very first chips released under the "i" nomenclature, codenamed Bloomfield, were numbered 9xx, and you hardly see any of those anymore; most people have upgraded.

    That being said, the reason GPUs "seem" not to last as long is that developers keep releasing games that are very hard on the GPU. On the other hand, on a forum like this you won't see many people who run software that is very hard on the CPU. In our lab, we need to upgrade our CPUs every two years at the very MOST. It all depends on what you do.

    For games, yes, GPUs "seem" to last less, although the GTX 580M from three years ago is still a pretty powerful beast. For software like highly optimized parallel code, CPUs will be obsolete in two years. For working in MS Word, CPUs may not become obsolete for several years. I still have an HP Brio from 11 years ago with a 550 MHz P3 that runs the Office suite just fine; I doubt I could play Counter-Strike on it. That doesn't mean that P3 "lasted" longer than the GPU next to it (a 64 MB NVIDIA GeForce4 MX 420).
     
  8. HTWingNut

    HTWingNut Potato

    Reputations:
    21,580
    Messages:
    35,370
    Likes Received:
    9,878
    Trophy Points:
    931
    Yep. CPUs will usually bottleneck games less quickly than GPUs, but 5 years is quite a stretch. As maverick1989 noted, 5-year-old CPUs will manage basic tasks fine, but they won't keep up with modern GPUs. Look at the Core 2 Quad mobile CPUs, which weren't very plentiful: the Q9000, mainly, is 4 years old and couldn't manage 90% of the games released today; it would just limit performance way too much. Prior to that there were only dual-core mobile CPUs, and those would be like putting a lawn mower engine in a Ferrari. It just isn't going to work. There have been users with a gen 1 i7 XM CPU who upgraded to a 680M and found the performance less than stellar.
     
  9. Bob

    Bob Notebook Consultant

    Reputations:
    20
    Messages:
    223
    Likes Received:
    0
    Trophy Points:
    0
    I got the next best GPU 3 years ago (5850M GDDR5) and it still holds up in modern games today at high details and 1080p, except in Crysis 3 and a few other heavy titles.
     
  10. Cloudfire

    Cloudfire (Really odd person)

    Reputations:
    7,279
    Messages:
    10,304
    Likes Received:
    2,878
    Trophy Points:
    581
    Notebook with:

    - High-end video card with 4GB VRAM: enough memory for future games with big textures. Also make sure it's a notebook with an MXM module, because that means you can swap out the GPU and replace it with a better one. Preferably MXM Type 3.0B, because it allows 100W GPUs.

    - 16GB RAM: a bit overkill, but it's cheap, so why not?! At least you are settled for the future.

    - i7 from the newest CPU architecture: this is one component you can't swap out, because Intel uses a different chipset and/or a different socket for each new generation of CPUs. It really doesn't matter which CPU you buy as long as it is a quad core; they are more than powerful enough to drive the fastest GPUs out there.

    - 240W PSU: notebooks today are mostly delivered with 180W PSUs. That is too little, because it won't allow you to overclock much with the newest and fastest GPUs. And who knows what power requirements the upcoming GPUs will have.

    - SSD: forget about HDDs.
     
  11. GoodToGo

    GoodToGo Notebook Consultant

    Reputations:
    58
    Messages:
    294
    Likes Received:
    0
    Trophy Points:
    30
    Which laptops have (easily) swappable GPUs and 240W PSUs?
     
  12. HTWingNut

    HTWingNut Potato

    Reputations:
    21,580
    Messages:
    35,370
    Likes Received:
    9,878
    Trophy Points:
    931
    You can mod your own PSU to get 240W if needed; that's what many Clevo owners are doing. But if you want one from the start, then you'll have to get a 17" Clevo or Alienware 17. Asus, I think, ships a 230W PSU with the 780M, but they tend to use proprietary GPUs. MSI so far only offers 180W but uses the battery to get an extra boost of power when needed.
     
  13. littlecx

    littlecx Notebook Deity

    Reputations:
    24
    Messages:
    783
    Likes Received:
    20
    Trophy Points:
    31
    For MSI, how long can the battery last that way? And once the battery is drained, is only 180W available?
     
  14. hfm

    hfm Notebook Prophet

    Reputations:
    2,264
    Messages:
    5,296
    Likes Received:
    3,049
    Trophy Points:
    431
    The best you can do to future-proof is:

    1. Get the fastest GPU you can possibly get for your budget.
    2. Make sure it's true MXM, so you can stretch its life without buying a whole new laptop.
    3. Be prepared to live with turning down some details at some point ;)
     
  15. Voodoofreak

    Voodoofreak Notebook Deity

    Reputations:
    64
    Messages:
    943
    Likes Received:
    2
    Trophy Points:
    31
    Option 1 from your list. Recycle after 3-4 years.
     
  16. GoodToGo

    GoodToGo Notebook Consultant

    Reputations:
    58
    Messages:
    294
    Likes Received:
    0
    Trophy Points:
    30
    What laptops come with true MXM cards? Is there any list?
     
  17. HTWingNut

    HTWingNut Potato

    Reputations:
    21,580
    Messages:
    35,370
    Likes Received:
    9,878
    Trophy Points:
    931
    MSI Global GT70 Dragon Edition 2



    If you overclock, it normally won't be drawing more than 180W consistently, but when it does, it will pull the extra power from the battery, likely no more than 20W. And whenever draw drops below 180W, the battery will be recharging, so you should be able to get several hours, if not more, even from a highly demanding game.
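    A rough back-of-the-envelope sketch of that battery-boost budget. Every number here is a hypothetical assumption for illustration, not an MSI spec:

```python
# Hypothetical battery-boost budget; none of these figures come from MSI specs.
battery_wh = 87.0        # assumed battery capacity, in watt-hours
psu_limit_w = 180.0      # the 180W PSU ceiling discussed above
peak_draw_w = 200.0      # assumed worst-case system draw while overclocked

extra_from_battery_w = peak_draw_w - psu_limit_w    # shortfall the battery covers
hours_of_boost = battery_wh / extra_from_battery_w  # if the peak were sustained

print(f"{extra_from_battery_w:.0f} W shortfall -> ~{hours_of_boost:.1f} h of boost")
```

    Since real games rarely sit at peak draw, and the battery recharges whenever draw dips below the PSU limit, the practical runtime would be longer still, consistent with the "several hours" estimate.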
     
  18. leladax

    leladax Notebook Guru

    Reputations:
    0
    Messages:
    63
    Likes Received:
    0
    Trophy Points:
    15
    I don't understand the obsession some people have with promoting whatever they bought themselves as "necessary", e.g. the guy with the SSD.

    What? Have you got any idea how useless a disk is to GPU performance? In fact, for most applications the disk is not used AT ALL once they are loaded, provided there is enough RAM.

    Kids have to learn that a computer is made of the memory, the CPU, the GPU, and their interconnections. Stop ridiculing yourself by pretending other stuff is as important.

    It's especially ridiculous since disks can be upgraded later anyway, so it's completely off topic; a disk is de facto future-proof.
     
  19. leladax

    leladax Notebook Guru

    Reputations:
    0
    Messages:
    63
    Likes Received:
    0
    Trophy Points:
    15
    That might actually be a counter-argument that drives me back to using a desktop and adding bulk to my life.

    I got, at "random", a GPU that came with the first Sandy Bridge i7s, because I didn't know better and didn't have time to look or wait for another. It's around 15-20% slower than what you got, but that gap isn't enough for yours to escape being considered aging today. Sure, you got an admirable GPU that lasted for 3 years, but after that you almost hope the whole thing dies so you can move on, if it's unswappable.

    I keep thinking I'd prefer robust eGPU solutions. I don't use the laptop's battery; it's mostly a mobile desktop. But even mini-ITX builds are too heavy and bulky to actually carry, while an eGPU could be carried.

    It mainly bugs me enormously that the i7 is a beast that doesn't bottleneck the system in the slightest, while GPUs are far behind.

    And I keep thinking NVIDIA/AMD not having access to the smallest transistors is THE factor.

    NVIDIA didn't tell their fans for nothing that they couldn't move to 22nm, even if they wanted to, for cost reasons.
     
  20. hockeymass

    hockeymass that one guy

    Reputations:
    1,450
    Messages:
    3,669
    Likes Received:
    85
    Trophy Points:
    116
    I don't think he ever said it was necessary or that it would improve GPU performance. But you're absolutely wrong that it doesn't affect application performance at all.
     
  21. maverick1989

    maverick1989 Notebook Deity

    Reputations:
    332
    Messages:
    1,562
    Likes Received:
    22
    Trophy Points:
    56
    This is not true; you may be thinking of how simple programs behave. Games do not do this. Only relevant information about the surroundings is loaded into memory. That is why most games, especially sandbox-style games, have areas that you enter and leave. Each area is loaded into memory (it actually says exactly that on the screen: "Loading..."), which is why it takes a while. That is also why those loading times decrease significantly when the game is installed on an SSD rather than a hard disk. A large amount of system RAM is needed for the CPU to actually send data to the GPU: the CPU writes to memory mapped for GPU use, and the GPU reads data and instructions from there. That is part of the reason video games need so much RAM.
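    The area-by-area loading described above can be sketched roughly like this. The area names, asset sizes, and memory budget are all made up for illustration:

```python
# Toy sketch of area-based asset loading: only the current area's assets
# are resident in memory. All names and sizes here are hypothetical.
AREA_ASSETS = {
    "town":    {"town_textures": 900, "npc_models": 300},        # sizes in MB
    "dungeon": {"dungeon_textures": 1200, "monster_models": 500},
}

MEMORY_BUDGET_MB = 2048   # pretend combined RAM/VRAM budget

loaded = {}               # asset name -> size currently resident

def enter_area(area):
    """Simulate the 'Loading...' screen: evict the previous area's
    assets, then pull in only what the new area needs from disk."""
    loaded.clear()
    for name, size in AREA_ASSETS[area].items():
        assert sum(loaded.values()) + size <= MEMORY_BUDGET_MB
        loaded[name] = size

enter_area("town")
town_resident = sum(loaded.values())     # only the town's 1200 MB resident

enter_area("dungeon")
dungeon_resident = sum(loaded.values())  # town assets evicted, 1700 MB resident
```

    The point of the sketch: at no time does the whole game's asset set need to fit in memory, only the current area's, which is why the disk is revisited at every area transition.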
     
  22. djembe

    djembe drum while you work

    Reputations:
    1,064
    Messages:
    1,455
    Likes Received:
    203
    Trophy Points:
    81
    Personally, I just buy the best system for my needs and then use it until I see a compelling reason to upgrade. Buying a system with an MXM-standard graphics card or using an e-GPU setup may put off the inevitable a bit, but there's nothing really future-proof.
     
  23. leladax

    leladax Notebook Guru

    Reputations:
    0
    Messages:
    63
    Likes Received:
    0
    Trophy Points:
    15
    You have no idea how modern games work; that's all I have to say. If they had to wait for a disk during live rendering, not only would they be slow, they would be hanging. Anything they do with loading must be pre-loading, never for the actual live scene. The live scene must always be pre-loaded for any frame rate above about 1 FPS.
     
  24. hockeymass

    hockeymass that one guy

    Reputations:
    1,450
    Messages:
    3,669
    Likes Received:
    85
    Trophy Points:
    116
    Actually, you seem to be the one who has no idea how things work. Assets are frequently being loaded during play.
     
  25. HTWingNut

    HTWingNut Potato

    Reputations:
    21,580
    Messages:
    35,370
    Likes Received:
    9,878
    Trophy Points:
    931
    Right. During the "Loading..." screen, assets are being prioritized, organized, and loaded into vRAM and system RAM for rapid fetching, but in no way are 100% of a level's assets loaded. Otherwise you'd be limited to 1GB, 2GB, or whatever the vRAM limit is, and even then you can't use 100% of available vRAM, because a large portion of it is used for processing the scenes.
     
  26. maverick1989

    maverick1989 Notebook Deity

    Reputations:
    332
    Messages:
    1,562
    Likes Received:
    22
    Trophy Points:
    56
    Actually, considering there is a very, very high probability that you have played video games on a system designed by a team I was part of, I can assure you that I have a very good idea of how software in general works. It is your loss if you do not wish to understand the workings and simply come here with a colorful show of vanity. What hockeymass and HTWingNut have said is correct: system RAM is used to push data to the GPU, and the GPU uses its dedicated memory to cache textures and other data it is not working on yet but will require at a later stage. A modern video game, as you term it, needs several GB of memory for all its instructions and textures; you don't NEED all of them at once. Please look up information on this topic before coming back with an argument.
     
  27. leladax

    leladax Notebook Guru

    Reputations:
    0
    Messages:
    63
    Likes Received:
    0
    Trophy Points:
    15
    I reiterate: you have no idea about 3D graphics programming. Don't let your selfishness stop you from learning about it.

    In the overwhelming majority of cases, VRAM gets ABSOLUTELY no bottleneck from petty disks during live rendering (as opposed to initial loading). If your GPU has VRAM-related issues, I assure you, NO DISK will improve them. Well, unless we're talking about the extremity of a disk from the 90s that can't even load the rare 1MB here and there a common game might need once in a while, but that's rarely the case and often just bad programming. fread()s must be avoided as much as possible at all times, and should always run on at least a second thread, never limiting the main thread.

    Unless you're strictly talking about loading half of World of Warcraft only when flying on a gryphon at super speed to only new (and non-raid) locations, in which case, KEK.

    In fact, EVEN THEN, the overwhelming majority of good games have done the right thing, and fread()s are strictly confined to a second thread, so they don't affect rendering speed even in the rare case something is loading slowly.

    You will not be limited by petty disks 95% of the time, and about 100% of the time in competitive games (FPS, e-sports, MMO raiding, etc.), and that's a fact.

    It's especially funny that the more competitive and "important" a game or part of a game is, the more likely it is to be designed to avoid loading from disk.

    I.e., spare me the lectures about it being "important to have SSDs for GPU performance". I wasn't born yesterday.
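    The pattern being described, file reads confined to a worker thread so the render loop never blocks on disk, can be sketched like this. The asset names and the stand-in for fread() are illustrative only:

```python
# Sketch of background asset streaming: blocking "disk" reads happen only
# on a worker thread; the render loop consumes assets already in memory.
import queue
import threading

pending = queue.Queue()   # asset names the game anticipates needing soon
ready = {}                # asset name -> loaded data (in-memory cache)

def loader():
    """Worker thread: the only place where blocking reads occur."""
    while True:
        name = pending.get()
        if name is None:                  # shutdown sentinel
            break
        ready[name] = f"data:{name}"      # stand-in for a real fread()

t = threading.Thread(target=loader)
t.start()

# The game requests assets ahead of time (pre-loading)...
for asset in ("rock_texture", "tree_mesh"):
    pending.put(asset)

pending.put(None)
t.join()

# ...and a frame only uses what is already resident, never waiting on disk.
frame_uses = [n for n in ("rock_texture", "tree_mesh") if n in ready]
```

    In a real engine a frame that finds an asset missing would draw a placeholder or lower-detail version rather than stall, which is why disk speed affects load and streaming latency but not steady-state frame rate.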
     
  28. hockeymass

    hockeymass that one guy

    Reputations:
    1,450
    Messages:
    3,669
    Likes Received:
    85
    Trophy Points:
    116
    NOBODY SAID THAT SSDs HAD ANYTHING TO DO WITH GPU PERFORMANCE.

    In addition, you've already said that you don't want to bother with basically the only way to future proof your GPU, soooo this thread seems pretty pointless.