The Notebook Review forums were hosted by TechTarget, which shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information that had been posted on the forums is preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.
Problems? See this thread at archive.org.

    VRAM in Applications

    Discussion in 'Gaming (Software and Graphics Cards)' started by Phase, Mar 17, 2015.

  1. Phase

    Phase Notebook Evangelist

    Reputations:
    7
    Messages:
    483
    Likes Received:
    100
    Trophy Points:
    56
    I know only a few games benefit from having 8GB of VRAM that the 880m and 980m have, but what about other applications?

    I know that extreme 3D super detailed modeling applications can eat it up, but what about more common software like Adobe Premiere, After Effects or Photoshop?

    Is having 8GB of VRAM overkill in the real world?

    I mean obviously getting a photoshop or premiere project with 100 layers full of 8k images would use it up, but that's not realistic.

    I'm asking any other content creators if there's a benefit in editing photos or videos with 8GB of VRAM and NOT using 3D modeling. I'd like to find a way to use my 8GB of VRAM to show to people that it can be useful.

    Thanks for your input.
     
  2. Zymphad

    Zymphad Zymphad

    Reputations:
    2,321
    Messages:
    4,165
    Likes Received:
    355
    Trophy Points:
    151
    8GB is overkill. Doubtful any game will use more than 3GB at 1080p with my 980M.
     
  3. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    I've already established that games do indeed cross 3GB at 1080p easily. There's even a couple that can cross 4GB too. But not a majority.

    That being said, 8GB is indeed overkill for gaming, but since it's the next step up from 4GB (they can't use 6GB at full speed without making the card a 192-bit memory bus) there isn't any other choice, and any game that uses even slightly over 4GB all the way to 8GB will just benefit from caching and the extra memory, etc etc.

    There are certain programs used by professionals that will benefit from such a huge vRAM buffer. There have been 24GB cards ever since Kepler came out. But home-user applications? I doubt it'll get used. If you were going to use something like that, you'd likely be hunting those Quadro or Tesla cards with huge vRAM amounts.
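    Rough sketch of the arithmetic behind those capacity steps, for anyone curious. It assumes one 32-bit GDDR5 chip per 32 bits of bus width; the 4Gbit/8Gbit chip densities are just an assumption of what's typical, not anything measured here:

        # Why VRAM capacities come in the steps they do:
        # one 32-bit GDDR5 chip per 32 bits of bus width, times chip density.
        def vram_options_gb(bus_width_bits, chip_densities_gbit=(4, 8)):
            chips = bus_width_bits // 32
            return [chips * density / 8 for density in chip_densities_gbit]

        for bus in (192, 256, 384):
            print(f"{bus}-bit bus -> {vram_options_gb(bus)} GB")

        # 192-bit -> [3.0, 6.0], 256-bit -> [4.0, 8.0], 384-bit -> [6.0, 12.0]
        # i.e. on a 256-bit card the next step up from 4GB really is 8GB.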
     
  4. n=1

    n=1 YEAH SCIENCE!

    Reputations:
    2,544
    Messages:
    4,346
    Likes Received:
    2,600
    Trophy Points:
    231
    Overkill? OVERKILL???

    THIS. IS. oh wait nvm I'm not on a desktop forum lol

    On a more serious note though:
    [Images: VRAM usage screenshots for Watch Dogs, FC4, and COD AW at 4K]

    Watch Dogs, FC4, and COD AW can all suck up 7+GB of vram at 4K. Incidentally all three have worse than dog poo optimization. Now compare those 3 games to Crysis 3.
     
  5. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    Whether the games are playable at those settings is the real question.
     
  6. n=1

    n=1 YEAH SCIENCE!

    Reputations:
    2,544
    Messages:
    4,346
    Likes Received:
    2,600
    Trophy Points:
    231
    Dying Light is sort of playable:
    [Image: Dying Light benchmark screenshot]

    Same with COD AW:
    [Image: COD AW benchmark screenshot]

    Although to be fair, COD AW is one of those games that simply goes "you haz 12GB vram? OM NOM NOM" in terms of how it treats vram, so 7.3+GB is simply a reflection of terribad optimization.
     
    Last edited: Mar 18, 2015
  7. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    Dying Light on 980 has too many dips below 30 FPS for me to consider that playable. Looking at the line vs. Titan X, performance clearly isn't limited by VRAM (I don't see any absurd dips in min).

    And about AW, exactly as you said. Look at 690 hanging in there with a measly 2GB.

    My prediction is 8GB VRAM will be safe for the duration of this console gen.
     
  8. Phase

    Phase Notebook Evangelist

    Reputations:
    7
    Messages:
    483
    Likes Received:
    100
    Trophy Points:
    56
    I wasn't asking about games at all. I know the 8GB will come in handy in the future. I was asking about any applications that can use the GPU for acceleration. Video editors or photo apps or Illustrator or flash or something that isn't using 3d models.

    BTW some Pascal GPUs will have 32GB of VRAM.... so overkill
     
  9. n=1

    n=1 YEAH SCIENCE!

    Reputations:
    2,544
    Messages:
    4,346
    Likes Received:
    2,600
    Trophy Points:
    231
    ^ooops :oops:

    @octiceps: I was only referring to Titan X in terms of Dying Light being somewhat playable, but yes it doesn't seem to be vram limited.

    I think 8GB is more than enough if you're using a single panel. However if you're doing 4K surround you probably will need that 12GB (as well as 3 Titan X's, so glhf with tri-SLI lol)
     
  10. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    GPU ray tracing. If you wanna make the next Pixar movie go knock yourself out LOL.
     
  11. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    Even then, I still think 12GB is overkill. The thing about increasing pixel count through resolution/supersampling/downsampling is that even if you double or quadruple it, VRAM usage doesn't come anywhere close to doubling or quadrupling. It's not linear to the pixel count.
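    To put rough numbers on that: the render targets do scale with pixel count, but they're a small slice of the total next to textures, which don't grow with resolution. Toy example (the buffer count and texture pool size here are made-up assumptions, just to show the shape of it):

        # Toy estimate: framebuffers/render targets scale with pixels,
        # the texture pool (the bulk of VRAM use) does not.
        def vram_estimate_gb(width, height, render_targets=6,
                             bytes_per_pixel=4, texture_pool_gb=3.0):
            targets_gb = width * height * bytes_per_pixel * render_targets / 1024**3
            return targets_gb + texture_pool_gb

        print(f"1080p: {vram_estimate_gb(1920, 1080):.2f} GB")
        print(f"4K:    {vram_estimate_gb(3840, 2160):.2f} GB")
        # 4x the pixels, but the total only goes up by roughly 5%, not 4x.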
     
  12. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    The problem is that while 8GB is about right for the "I'm set for the next few years regardless of settings/game/resolution", 6GB is not (especially at higher resolutions with multisample type AA) and the next linear bump from 6GB is 12GB due to the 384-bit memory bus.

    Of course they could have been hard about it and used a 512-bit memory bus and shoved an 8GB buffer on it, and we'd have gotten better memory speeds to boot. Might have made it slightly more worth that huge price tag.

    BUUUUUUT nVidia.
     
  13. n=1

    n=1 YEAH SCIENCE!

    Reputations:
    2,544
    Messages:
    4,346
    Likes Received:
    2,600
    Trophy Points:
    231
    Exactly.

    I don't know whether to laugh or SMH whenever I see nVidiots nVidia fans cheer at AMD falling behind the curve and hoping their next release is a huge flop. Then again these are typically (and I do mean typically) people who can only afford mid-range stuff, but still buy nVidia anyway because they think it's somehow more "premium", not realizing that mid-range is where AMD offers MUCH better bang for buck.

    /rant
     
    TomJGX, D2 Ultima and octiceps like this.
  14. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    "Nvidiots" ROFL!
     
  15. HTWingNut

    HTWingNut Potato

    Reputations:
    21,580
    Messages:
    35,370
    Likes Received:
    9,877
    Trophy Points:
    931
    Exactly. And again, all this posting of vRAM utilization is not really utilization. It's vRAM used. How much is really "utilized" for the game to run fluidly? I have been playing CoD Advanced Warfare and it sucks up the full 3GB vRAM I have but no signs of stutter or frame drops. Same with Middle Earth Shadow of Mordor. It's just a delicate balance between GPU performance vs vRAM utilization. You could throw 16GB on there and fill it up, but it doesn't mean squat.
     
    TomJGX likes this.
  16. TomJGX

    TomJGX I HATE BGA!

    Reputations:
    1,456
    Messages:
    8,707
    Likes Received:
    3,315
    Trophy Points:
    431
    I can second this.. COD AW has been sucking up vRAM like nothing.. Also uses more RAM/pagefile than any of my other games but with what I have, runs very smooth and maxed out..
     
  17. n=1

    n=1 YEAH SCIENCE!

    Reputations:
    2,544
    Messages:
    4,346
    Likes Received:
    2,600
    Trophy Points:
    231
    I agree, but then the question becomes how do you distinguish between a game eating up all that vram because it truly, genuinely needs that much, vs a game just going "YOU HAZ 12GB? OM NOM NOM".

    I think the only true apples to apples comparison would be to take cards that offer multiple vram configs (780 3GB vs 6GB, 290X 4GB vs 8GB, 7970 3GB vs 6GB), test them under identical situations and see if the min framerates tank with the lesser vram card. Similar cards based on the same chip could also be used of course eg 780 Ti vs Titan Black.
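    Something like this is what I mean, once you have frame time logs from both cards for identical runs (the file names and the one-frame-time-per-line format are just assumptions about how the logs get dumped):

        # Compare average FPS and 1% lows between two frame-time logs.
        # If the smaller-vRAM card's 1% lows collapse while the averages stay
        # close, it genuinely needed the extra vRAM rather than just caching.
        def load_frametimes_ms(path):
            with open(path) as f:
                return [float(line) for line in f if line.strip()]

        def one_percent_low_fps(frametimes_ms):
            worst = sorted(frametimes_ms, reverse=True)
            worst = worst[:max(1, len(worst) // 100)]
            return 1000.0 / (sum(worst) / len(worst))

        for log in ("gtx780_3gb.txt", "gtx780_6gb.txt"):  # hypothetical log files
            ft = load_frametimes_ms(log)
            avg_fps = 1000.0 * len(ft) / sum(ft)
            print(f"{log}: avg {avg_fps:.1f} FPS, 1% low {one_percent_low_fps(ft):.1f} FPS")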
     
  18. HTWingNut

    HTWingNut Potato

    Reputations:
    21,580
    Messages:
    35,370
    Likes Received:
    9,877
    Trophy Points:
    931
    Unfortunately that would be about the only/best way. I think you're safe with 980m performance and 4GB vRAM, and 970m with 3GB vRAM might be borderline but I doubt you'll have many if any issues.
     
  19. Wolfpup

    Wolfpup Notebook Prophet

    Reputations:
    128
    Messages:
    4,082
    Likes Received:
    13
    Trophy Points:
    106
    8GB is probably overkill, but my 2GB GTX 680m has over 800MB used up running AC3 plus a bunch of other Windows programs open on two 1080p displays soooooo that's a little uncomfortably close to 2GB.
     
  20. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    How is 800MB "uncomfortably close to 2GB"?
     
  21. Wolfpup

    Wolfpup Notebook Prophet

    Reputations:
    128
    Messages:
    4,082
    Likes Received:
    13
    Trophy Points:
    106
    It's already using nearly half my RAM with last gen games. System requirements are going to skyrocket as more and more games are designed around the Playstation 4. I think you can already surpass 2GB pretty easily with some games/settings.
     
  22. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    That's like saying "OMFG BF3 2011 last-gen game uses 1600MB VRAM maxed out on my 980M 4GB what am I gonna do!!!"
     
    James D likes this.
  23. Wolfpup

    Wolfpup Notebook Prophet

    Reputations:
    128
    Messages:
    4,082
    Likes Received:
    13
    Trophy Points:
    106
    Okay...? Well anyway, point is, while 8GB may end up being overkill, I'd sure prefer to have at least 4GB.
     
  24. HTWingNut

    HTWingNut Potato

    Reputations:
    21,580
    Messages:
    35,370
    Likes Received:
    9,877
    Trophy Points:
    931
    Windows won't use up your vRAM unless it's a 3D app; it will use up your system RAM instead.
     
  25. Wolfpup

    Wolfpup Notebook Prophet

    Reputations:
    128
    Messages:
    4,082
    Likes Received:
    13
    Trophy Points:
    106
    Eh? Anything the GPU is working on, it's storing in its own video RAM. The desktop itself is 3D, but that would be true regardless. At a minimum you'd have the frame buffer(s) there.
     
  26. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    Depends. If you have Optimus it's using the iGPU so system RAM. If you don't have Optimus, Aero uses VRAM on the dGPU. WDDM memory usage in W8 is higher than in W7 plus you don't have the option to disable Aero to free up more RAM/VRAM.
     
  27. Wolfpup

    Wolfpup Notebook Prophet

    Reputations:
    128
    Messages:
    4,082
    Likes Received:
    13
    Trophy Points:
    106
    Yeah. That's interesting that W8 uses more. Do you happen to know why? (Out of curiosity)
     
  28. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    No idea. Maybe caching the Metro UI in VRAM?
     
  29. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    TomJGX likes this.
  30. TR2N

    TR2N Notebook Deity

    Reputations:
    301
    Messages:
    1,347
    Likes Received:
    255
    Trophy Points:
    101
    +1
    I always thought these games were the instigators behind creating new video cards with high vram.
    Shame they aren't graphically as great as their predecessors.
    I would say to the OP that 8GB vram is about 2-3 years away from game engines actually using it, graphically speaking.
     
  31. hfm

    hfm Notebook Prophet

    Reputations:
    2,264
    Messages:
    5,296
    Likes Received:
    3,048
    Trophy Points:
    431
    Or just the fact that a competitive AMD is good for the industry and good for consumers.
     
  32. HTWingNut

    HTWingNut Potato

    Reputations:
    21,580
    Messages:
    35,370
    Likes Received:
    9,877
    Trophy Points:
    931
    Frame buffers... all of a mere few MB each. You don't need much video RAM to run Windows or Aero. Sure, if you were running multiple CAD sessions it would use more, but even then it would likely swap data in and out of the vRAM buffer depending on which CAD session was active.
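    The back-of-the-envelope numbers, assuming a standard 32-bit colour buffer:

        # Size of a single 32-bit-per-pixel buffer at common resolutions.
        for w, h in ((1920, 1080), (2560, 1440), (3840, 2160)):
            print(f"{w}x{h}: {w * h * 4 / 1024**2:.1f} MB")
        # ~8 MB at 1080p, ~32 MB at 4K -- even triple-buffered it's a
        # rounding error next to a multi-GB card.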
     
  33. Deks

    Deks Notebook Prophet

    Reputations:
    1,272
    Messages:
    5,201
    Likes Received:
    2,073
    Trophy Points:
    331
    I am reminded of the fiasco people talked about on these forums years ago regarding discrete mobile mid-range GPUs with very high amounts of vRAM but very constricted bandwidth.
    At the time, most people thought that high vRAM meant better performance, but it was constantly pointed out that the interface limitations and low bandwidth would prevent efficient utilization of all that vRAM.
    Namely, someone stated that by the time the full vRAM would be filled, performance would drop like a stone because the GPU couldn't process the data properly.

    So I don't think vRAM plays too crucial a role when it comes to performance... rather, bandwidth and bus width are more essential.
    The 9600m GT GDDR3 for instance with 512MB of RAM was able to handle games just fine... whereas a slower version of the same gpu with DDR2 for instance was 30% slower, but had 4x more vRAM in some iterations.

    My thought is that GDDR5 might run into similar problems when it comes to high amount of vRAM but constricted bandwidth and bus width... I'm specifically referring to Titan X.

    Hence, I think a 'balance' of sorts can be achieved.
    Namely, if AMD's HBM pans out properly, then even 4GB of vRAM of HBM will offer more than enough performance for discrete graphics cards and will probably outperform 8GB GDDR5 - especially at higher resolutions.

    vRAM seems to play a role when it comes to very large/detailed textures and high resolution gaming, plus potential multi-monitor support... but for laptops, where the majority of gaming is likely to be done on the laptop itself, I doubt there would be that much of a difference between 4GB HBM and 8GB GDDR5 performance-wise, except at high resolutions where HBM has more bandwidth to play with, ergo it can process the data a lot faster than GDDR and do so with less vRAM.
    Plus, GDDR5 is more restrictive than HBM for future prospects. HBM also occupies much less space.

    And, architecture also plays a part in this, as well as how well the games/programs have been optimized in the first place.
     
    Starlight5 and HTWingNut like this.
  34. J.Dre

    J.Dre Notebook Nobel Laureate

    Reputations:
    3,700
    Messages:
    8,323
    Likes Received:
    3,820
    Trophy Points:
    431
    VR will require 6GB or more. 8GB/12GB definitely isn't overkill there. Pretty sure the Titan X was designed with developers in mind.

    I wonder if one day cards will have like 128GB vRAM frame buffers, lol.

    "I can game with the NVIDIA Titan ABCDXYZ with 128GB vRAM on 15k monitor at 120 FPS!" Windows 25 and scaling issues persist. :rolleyes:
     
    Last edited: Mar 21, 2015
    TomJGX and D2 Ultima like this.
  35. Phase

    Phase Notebook Evangelist

    Reputations:
    7
    Messages:
    483
    Likes Received:
    100
    Trophy Points:
    56
    Nvidia says Pascal will have 10 times the performance of Titan X. Way faster bandwidth too. Trying to decide if I should get the Titan X for 4K gaming or wait till Pascal, where midrange Pascal GPUs should blow Titan X out of the water at high resolutions, from my understanding.
     
  36. n=1

    n=1 YEAH SCIENCE!

    Reputations:
    2,544
    Messages:
    4,346
    Likes Received:
    2,600
    Trophy Points:
    231
    Oh dear god not that 10x faster BS again. Let me just quote 2 posts from another forum because I'm too tired to debunk the 10x faster myth:

    tl;dr 10x faster assuming you use 2x as many Pascal GPUs for very specific compute scenarios utilizing FP16. Nothing at all to do with games. None, zilch, nada.
     
    octiceps likes this.
  37. Phase

    Phase Notebook Evangelist

    Reputations:
    7
    Messages:
    483
    Likes Received:
    100
    Trophy Points:
    56
    Any idea how much better the 390X should perform in 4K gaming compared to the Titan X? I'm very excited for AMD to make a comeback, but I'm stuck with Nvidia for the most part because of the CUDA cores and GPU-accelerated effects in Adobe Premiere, After Effects and Photoshop.
     
  38. n=1

    n=1 YEAH SCIENCE!

    Reputations:
    2,544
    Messages:
    4,346
    Likes Received:
    2,600
    Trophy Points:
    231
    btw sorry if that was a bit blunt, I've posted the exact same thing like at least 20 times this week on different forums so I'm a bit on the edge lol

    HBM should give the 390X an edge over Titan X in 4K gaming, assuming the core performance is there.
     
  39. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    nVidia has proven that Maxwell, while much better at higher resolutions than Kepler, is not at the level of GCN, seeing as R9 290X cards regained at 4K the lead they lost at 1080p against Kepler and Maxwell, where Kepler lagged behind. So if the R9 390X even matches the Titan X 1:1, assuming 4K performance did not get "worse", it should hold down the Titan X and make it cry for its mommy at 4K.

    Also, have there been any Tesla GPUs based on Maxwell yet? I wonder if Pascal is going to be Tesla-only, and Volta might be the next step for consumers. Just a thought.
     
  40. n=1

    n=1 YEAH SCIENCE!

    Reputations:
    2,544
    Messages:
    4,346
    Likes Received:
    2,600
    Trophy Points:
    231
    Maxwell's DP is crippled from the ground up, so I've a feeling there will not be any Maxwell Teslas.

    I'm sure James Clerk Maxwell won't be too happy about the lack of Teslas. :D
     
    Last edited: Mar 22, 2015
    TomJGX and octiceps like this.
  41. Deks

    Deks Notebook Prophet

    Reputations:
    1,272
    Messages:
    5,201
    Likes Received:
    2,073
    Trophy Points:
    331
    Titan X seems to have the same, or at least similar, bandwidth to the R9 290X.
    So at 4K it should behave similarly performance-wise (adjusting for drivers and game optimizations)... and since R9 390X has been reported to have 1.9 (practically 2) times the bandwidth of TitanX, I'm thinking that at 4K gaming, the HBM R9 should easily overrun Titan, whether it has 4GB of vRAM or 8GB.

    But I agree that when it comes to the desktop arena, Nvidia seems to have barely been able to catch up to (or somewhat lags behind) AMD in the bandwidth area, and architecturally they aren't as good as GCN at higher resolutions.

    In the mobile sector however, Maxwell is a lot more efficient as it scales better performance-wise and in power use compared to GCN.

    It's a shame AMD was unable to come out with anything new that's better in the mobile sector...
    Though, one thing that nags at me with the Tonga variant of the top-end AMD mobile GPU is that no laptop has come properly EQUIPPED to handle the chip in question... therefore, there's a lot of throttling going on and insufficient power to the GPU for it to be properly tested at its maximum.
     
  42. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    Nope, Titan X has Maxwell memory optimizations, e.g. larger L2 cache and 3rd gen delta color compression, and blows away 290X at all resolutions. Memory bandwidth is only one part of the equation.

    Here is where Titan X stands in a synthetic test:

    [Image: Titan X synthetic benchmark standings]

    Just like no laptop was equipped to handle 880M amirite? It's totally the fault of the laptops themselves and not, you know, an overly hot and power-hungry GPU. :rolleyes:
     
    Last edited: Mar 23, 2015
    D2 Ultima likes this.
  43. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    What? GTX Titan X cards have so much more bandwidth than R9 290X cards you couldn't OC a 290X card to have the same amount of bandwidth. They have 336GB/s + maxwell bandwidth efficiency which puts them somewhere closer to 384GB/s... R9 290X cards have 320GB/s and there's no way you're overclocking them a further 60GB/s.

    No. It's all architecture. The fact that Hawaii murdered Kepler at 4K with less bandwidth but only trades blows with Maxwell at 4K (which it loses to at 1080p, and has ~60GB/s more bandwidth than) means that architecture is the absolute deciding factor here. nVidia sucks at making cards for higher resolutions and that's the long and short of it. Maxwell came out and it still is barely able to keep up at 4K with cards it schools at 1080p... which, while an improvement over how Kepler tore it to shreds at 1080p, equalized at 1440p and then lost at 4K, is still unimpressive for high resolutions. Even if the R9 390X cards lose a bit at 1080p to a Titan X, as long as they demolish at 3K and 4K, I will happily recommend them to everybody who wants high resolutions. Especially if it's cheaper. I'd like some of nVidia's marketing to take a dive because of raw results, because they need to start making cards for the consumer again. And for the love of all hell, nobody gives a crap about boost 1.0, 2.0, 3.0 or whatever .0 version they get to.

    nVidia lost in the early stages of Kepler vs the 7970 cards, but the top end cards (780Ti and Titan Black) took the highest memory bandwidth. AMD went to HBM technology, so it's pointless for nVidia to even fight. But as we all know, raising memory clocks is a small linear increase in FPS... sometimes... and isn't really all that important as far as gaming goes. But if GPU-accelerated work is a thing and memory matters, the R9 390X will be a winner. Or the R9 390 might be the real winner, being almost as strong with the same HBM applications. As far as I recall though, in Fermi and GTX 200 series (I never could find the name of that architecture anywhere) they were winners in bandwidth. #GTX285WithA512BitMemoryBusWasOP

    We don't know how good the M390X will be yet, so we'll wait and see. But I do agree it isn't looking all that good.
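    For reference, all the bandwidth figures being thrown around fall out of one formula; the HBM line below assumes first-gen HBM at 1 GT/s with four 1024-bit stacks, which is what the rumours suggest rather than a confirmed spec:

        # Peak memory bandwidth = (bus width in bits / 8) * data rate in GT/s.
        def bandwidth_gbs(bus_width_bits, data_rate_gtps):
            return bus_width_bits / 8 * data_rate_gtps

        print(bandwidth_gbs(384, 7.0))       # Titan X: 384-bit GDDR5 @ 7 GT/s  -> 336 GB/s
        print(bandwidth_gbs(512, 5.0))       # R9 290X: 512-bit GDDR5 @ 5 GT/s  -> 320 GB/s
        print(bandwidth_gbs(4 * 1024, 1.0))  # four HBM1 stacks, 1024-bit @ 1 GT/s -> 512 GB/s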
     
  44. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    Tesla. Which is also the brand name of Nvidia's lineup of GPUs for supercomputers. Confusing, I know.
     
  45. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    So THAT'S why it was GT200 series....
     
  46. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    Yep, the architecture is always indicated by the second letter of the GPU's code name, the letter right after the G.

    GT - Tesla
    GF - Fermi
    GK - Kepler
    GM - Maxwell
    GP - Pascal
    GV - Volta

    etc.
     
  47. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    Well yeah I knew that, but every time I kept looking for "GTX 200 architecture" I kept finding a lot of dead ends explaining launch date and just overall saying "GT200"
     
  48. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    That's because all the desktop GTX 200 GPUs used the GT200 chip, unlike what we have now where 960 is GM206, 970 and 980 are GM204, and Titan X is GM200. They just didn't tell you what the T stood for.

    But yeah, it's confusing. Tesla was Nvidia's first unified shader architecture and actually debuted 2 generations before in the GeForce 8 Series, but Nvidia didn't switch to its current naming scheme until the GeForce 200 Series. So the 8 and 9 Series cards, although also Tesla, were codenamed G8x and G9x instead of GTxxx.
     
  49. n=1

    n=1 YEAH SCIENCE!

    Reputations:
    2,544
    Messages:
    4,346
    Likes Received:
    2,600
    Trophy Points:
    231
    To make things even more confusing, GTX was also used as a suffix for the high performance cards during the Geforce 7, 8, and 9 series. Thankfully these days it's either GT or GTX, none of this additional GTS and GS bullcrap.

    Although had XFX made ATi cards back in the day, I imagine the XFX X1900 XTX DD edition would've won just on the name alone. :D

    (honestly someone should just grow a pair and release an XXX edition card already, although I guess you could argue Sapphire's Tri-X cards are technically just that)
     
    Last edited: Mar 23, 2015
    TomJGX likes this.
  50. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    Dem suffixes...

    Don't forget Ultra, XT, LE, SE, GTO, GSO, GTX+, GX2. And oh god, GeForce 4 MX.

    And that's just for Nvidia alone. :D
     
    sasuke256 and TomJGX like this.