The Notebook Review forums were hosted by TechTarget, which shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information that had been posted on the forums was preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.

    Any Idea about Nvidia 800M series..

    Discussion in 'Gaming (Software and Graphics Cards)' started by sasuke256, Oct 6, 2013.

  1. R3d

    R3d Notebook Virtuoso

    Reputations:
    1,515
    Messages:
    2,382
    Likes Received:
    60
    Trophy Points:
    66
    It's more of a market segmentation thing than anything; it's easier to mark up the expensive 6GB models that way. Lots of people are complaining online that 3GB isn't enough, especially for multi-monitor/4K.

    There's this tendency to put too much VRAM on the notebook side and not enough on the desktop side, for some reason.
     
  2. Cloudfire

    Cloudfire (Really odd person)

    Reputations:
    7,279
    Messages:
    10,304
    Likes Received:
    2,878
    Trophy Points:
    581
    Well, I do not fully disagree with you.
    But the people who use Quadro cards actually do need more VRAM. I can only imagine how much memory the big 3D projects of entire towns etc. use when everything is loaded on the GPU. Those have a heck of a lot more polygons and textures than just a map in a game.

    But yeah, putting more VRAM on is also a way to separate GeForce cards from professional cards and to justify the price for buyers.
    With multi-display and 4K, yes, I think you will have issues with 3GB. But not many people game on several displays, and 4K is too demanding to enable all the anti-aliasing that eats memory, so 3GB is not too little in the majority of games.
    You can see an example here from a guy who tested the Titan with various resolutions and settings: http://forums.aria.co.uk/showthread.php/127794-How-much-VRAM-do-you-need

    Crysis 3 is the biggest VRAM hog there, but as you can see: 3.3GB with MSAA x4 (only 11 FPS), and it drops down to 2.2GB if you use FXAA.

    So I think 3GB is enough for pretty high-res gaming.
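    That MSAA-versus-FXAA gap makes sense: MSAA multiplies the size of the color/depth render targets by the sample count, while FXAA is a post-process pass over the finished frame and needs no extra samples. A back-of-the-envelope sketch in Python (the 4-byte color and 4-byte depth per sample are simplifying assumptions; real engines use more render targets than this):

        # Rough framebuffer cost; assumes 4 bytes color + 4 bytes
        # depth/stencil per sample, a simplification of real pipelines.
        def framebuffer_mb(width, height, samples=1):
            bytes_per_sample = 4 + 4          # color + depth/stencil
            return width * height * samples * bytes_per_sample / 1024**2

        for w, h, label in [(1920, 1080, "1080p"), (2560, 1600, "1600p"),
                            (3840, 2160, "4K")]:
            print(f"{label}: no AA ~{framebuffer_mb(w, h):.0f} MB, "
                  f"MSAA x4 ~{framebuffer_mb(w, h, samples=4):.0f} MB")

    The framebuffer itself is only part of the roughly 1GB swing quoted above (G-buffers and other render targets also scale with sample count), but it shows why MSAA's memory cost grows with resolution while FXAA's doesn't.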
     
  3. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,441
    Messages:
    58,202
    Likes Received:
    17,917
    Trophy Points:
    931
    6GB of VRAM is needed for running three Ti cards together at 4K resolutions.

    The 6GB versions should be hitting early next year.
     
    reborn2003 likes this.
  4. Robbo99999

    Robbo99999 Notebook Prophet

    Reputations:
    4,346
    Messages:
    6,824
    Likes Received:
    6,112
    Trophy Points:
    681
    Good general points, but when you talk of SLI having '16GB between the two', that's not how SLI works. You can't double the RAM when you're referring to the VRAM of SLI cards. Two 3GB cards in SLI, for instance, have 3GB of space for storing textures and other information (not 6GB) - exactly the same amount as one of those cards would have operating on its own. Both cards have to store the same duplicate information in their VRAM.
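    A minimal sketch of the point, with hypothetical card sizes: in SLI the usable pool is the per-card VRAM, not the sum, because every card mirrors the same data.

        # SLI mirrors textures/buffers on every GPU, so the usable pool
        # is the smallest per-card VRAM, not the total across cards.
        def usable_vram_gb(cards_gb):
            return min(cards_gb)

        print(usable_vram_gb([3, 3]))   # two 3GB cards in SLI -> 3, not 6
        print(usable_vram_gb([8, 8]))   # two 8GB cards -> 8, not 16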
     
    reborn2003 likes this.
  5. Robbo99999

    Robbo99999 Notebook Prophet

    Reputations:
    4,346
    Messages:
    6,824
    Likes Received:
    6,112
    Trophy Points:
    681
    You know, I agree with you in some ways. It's not by accident that the PS4 was designed with 8GB of fast-access GDDR5 RAM (bandwidth similar to today's high-end cards, and additionally shareable amongst CPU & GPU). This makes me think that in a couple of years, as game developers code for that 8GB of GDDR5, it would be advantageous for PCs to have 8GB (maybe 6GB as a minimum) of fast-access VRAM available to the GPU for maximum performance efficiency. If I had a desktop and I was going to be buying a GPU in the next year, I wouldn't buy one with less than 6GB (maybe 4GB) of VRAM - not if I wanted it to last more than 2 years.
     
    reborn2003 likes this.
  6. Robbo99999

    Robbo99999 Notebook Prophet

    Reputations:
    4,346
    Messages:
    6,824
    Likes Received:
    6,112
    Trophy Points:
    681
    Yes, but I'd pay the extra for 770M SLI! The 765M is pretty lowly - the memory bus just sucks - and 770M SLI offers better value for performance than 765M SLI.
     
    reborn2003 likes this.
  7. J.Dre

    J.Dre Notebook Nobel Laureate

    Reputations:
    3,700
    Messages:
    8,323
    Likes Received:
    3,820
    Trophy Points:
    431
    Yeah, I know. I'm aware of how it is.
     
  8. HTWingNut

    HTWingNut Potato

    Reputations:
    21,580
    Messages:
    35,370
    Likes Received:
    9,878
    Trophy Points:
    931
    The 8GB is shared with the system, different from how PC systems run. The GPU has GDDR5, the CPU has DDR3. They are different architectures, and x86/64 is not designed to utilize GDDR5 for general computing. More VRAM does not mean squat if the rest of the GPU and system are not up to snuff. The MSI GX60/70 are a perfect example: throw in a powerful 256-bit GPU with a meager CPU, and it doesn't matter if it has 1GB or 8GB, it just won't perform well. If you want to manage high resolutions, like triple monitor, or with 3K and 4K monitors coming into play, more VRAM may be needed. But even an 880M likely isn't powerful enough to sustain 4K at 60 FPS in most games.
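    The GDDR5/DDR3 split described here is also a bandwidth split. As a rough illustration (the clock speeds below are assumptions typical of 2013-era parts, not measured figures), theoretical peak bandwidth is the effective transfer rate times the bus width:

        # Peak bandwidth (GB/s) = transfer rate (MT/s) * bus width (bits) / 8
        def bandwidth_gbs(mtps, bus_bits):
            return mtps * 1e6 * bus_bits / 8 / 1e9

        # Assumed: 256-bit GDDR5 at 5000 MT/s (typical high-end mobile GPU)
        print(f"GDDR5, 256-bit: {bandwidth_gbs(5000, 256):.0f} GB/s")          # ~160
        # Assumed: dual-channel DDR3-1600 (2 x 64-bit channels)
        print(f"DDR3-1600, dual channel: {bandwidth_gbs(1600, 128):.1f} GB/s")  # ~25.6

    That order-of-magnitude gap is why the GPU gets its own GDDR5 pool on PCs, and why the PS4 handing one unified GDDR5 pool to both CPU and GPU is a genuinely different design point.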
     
    reborn2003 likes this.
  9. Robbo99999

    Robbo99999 Notebook Prophet

    Reputations:
    4,346
    Messages:
    6,824
    Likes Received:
    6,112
    Trophy Points:
    681
    Fair enough, it sounded like you didn't know based on how you talked about it. At least the people reading these messages will be better informed now about the truth of the matter rather than having false perceptions about SLI and how VRAM works with it.
     
  10. Robbo99999

    Robbo99999 Notebook Prophet

    Reputations:
    4,346
    Messages:
    6,824
    Likes Received:
    6,112
    Trophy Points:
    681
    Yes, I agree with you. I know the PC architecture is established around GDDR5 for the GPU and DDR3 for the CPU; I wasn't disputing that. I was just pointing out that in the PS4, the difference is that the GPU has access to nearly 8GB of GDDR5 at a pretty high bandwidth. If that's the case, then I would imagine that future games will be designed around that architecture, and therefore it might make sense for future-proofing purposes for PC GPUs to have close to the same amount of fast-access VRAM available (e.g. close to 8GB - I suggested 4 or 6GB as a sensible amount). It's just my hunch and understanding of the matter; it doesn't mean I'm right, but that's my viewpoint on the future of PC gaming, based on the fact that it can often be influenced by the hardware/design of consoles.

    ((Chicken or Coke!)) Chicken FTW!
     
  11. Cloudfire

    Cloudfire (Really odd person)

    Reputations:
    7,279
    Messages:
    10,304
    Likes Received:
    2,878
    Trophy Points:
    581
    The PS4 has GDDR5, and we PC users may not even get to touch DDR4 next year.

    Same with SATA Express, which supposedly isn't supported by Intel's next mobile chipset for Broadwell.
    It's messed up.
    :/
     
    reborn2003 likes this.
  12. J.Dre

    J.Dre Notebook Nobel Laureate

    Reputations:
    3,700
    Messages:
    8,323
    Likes Received:
    3,820
    Trophy Points:
    431
    I'm aware the two cards do not "add up" to make 16GB of VRAM. I shouldn't have said that. I was trying to say that if there were two 8GB cards, there would be 16GB of usable memory (between the two) across which the game would be able to split the workload. Is that wrong?

    EDIT: Either way, that's overkill for a laptop. :D
     
  13. Robbo99999

    Robbo99999 Notebook Prophet

    Reputations:
    4,346
    Messages:
    6,824
    Likes Received:
    6,112
    Trophy Points:
    681
    Yes, with SLI the workload is split between the two GPUs - each one renders alternate frames. The VRAM, however, is not split, as we've said; the same information is stored in the VRAM of each card.
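    A tiny sketch of that division of labor (alternate frame rendering): frames round-robin across the GPUs, but the memory contents do not split.

        # AFR: frames alternate between GPUs, while each GPU still holds
        # a full mirrored copy of the scene data in its own VRAM.
        NUM_GPUS = 2

        def gpu_for_frame(frame_index):
            return frame_index % NUM_GPUS

        for f in range(4):
            print(f"frame {f} -> GPU {gpu_for_frame(f)}")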
     
  14. svl7

    svl7 T|I

    Reputations:
    4,719
    Messages:
    3,758
    Likes Received:
    134
    Trophy Points:
    131
    A Tesla card certainly isn't a Quadro card :D And I thought there was green blood running through your veins (jk).
    It's pretty hard to do CAD work with a device that doesn't have a video output... that's why I'd prefer a Quadro card for such work.

    3GB works fine for 1440p/1600p with reasonable settings, but there are several popular games that will easily use more than 3GB of VRAM when played on triple screens or even just with downsampling.
    3GB is definitely not "more than enough" for resolutions above 1080p. On the other hand, on a mobile system, 8GB for gaming is total overkill, even with screen resolutions above 1080p.


    For x86/64 it doesn't really matter what kind of memory is used. Different RAM might be addressed differently, but that's up to the memory controller to handle.
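    For a sense of why triple-screen and downsampling blow past 3GB so much faster than 1080p, the pixel counts alone tell most of the story (a quick sketch; render-target memory scales roughly, not exactly, with these factors):

        # Pixels relative to 1080p; VRAM used for render targets scales
        # roughly with this factor.
        base = 1920 * 1080
        setups = {
            "1080p":          1920 * 1080,
            "1600p":          2560 * 1600,
            "triple 1080p":   3 * 1920 * 1080,
            "4K downsample":  3840 * 2160,
        }
        for name, px in setups.items():
            print(f"{name}: {px / base:.1f}x the pixels of 1080p")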
     
  15. iLiftCars

    iLiftCars Notebook Guru

    Reputations:
    17
    Messages:
    74
    Likes Received:
    12
    Trophy Points:
    16
    Wait, I think you may be confused. PC users do have access to GDDR5 in GPUs and DDR3 in CPUs, just like the PS4. But we will soon have DDR4 in CPUs, whereas the PS4 won't be upgraded to it.

    Sent from my SAMSUNG-SGH-I747 using Tapatalk
     
  16. Cloudfire

    Cloudfire (Really odd person)

    Reputations:
    7,279
    Messages:
    10,304
    Likes Received:
    2,878
    Trophy Points:
    581
    lol, I thought Tesla and Quadro were kinda the same. Turns out Tesla = computing, Quadro = workstation. Best to have something to look at while doing CAD, yes :D
    I don't think there are any games that surpass 3GB up to 1600p. Maybe if you turn SSAA up to the max and such, but then we are back to the whole "it will run at too low an FPS anyway" problem.

    Nah, I'm not confused. The PS4 has GDDR5 as system memory, not just video memory. I just commented that PC users may not even get DDR4 as system memory next year :/
    I wonder which is better as system memory, GDDR5 or DDR4? They say GDDR5 has high latency, but according to PS4 developers, they optimized the PS4 architecture so it's negligible anyway.
     
  17. sasuke256

    sasuke256 Notebook Deity

    Reputations:
    495
    Messages:
    1,440
    Likes Received:
    449
    Trophy Points:
    101
    They can have GDDR7 as main RAM; their chips will forever stay a boosted HD 7850. An overclocked GTX 770M can already equal or beat an HD 7870 ;)
     
    Cloudfire likes this.
  18. HTWingNut

    HTWingNut Potato

    Reputations:
    21,580
    Messages:
    35,370
    Likes Received:
    9,878
    Trophy Points:
    931
    If you threw GDDR5 into a Wintel PC, the processing would be really slow, because Windows CPU workloads are more linear than the way GDDR5 manages data - if I have interpreted what I've read about it correctly.


    Beamed from my G2 Tricorder
     
  19. Cloudfire

    Cloudfire (Really odd person)

    Reputations:
    7,279
    Messages:
    10,304
    Likes Received:
    2,878
    Trophy Points:
    581
    Hehe, that's true. :D

    In a perfect world, my next notebook would have:
    GTX 880M SLI, a minimum 3K IPS display (like the GT60 has) but at 17 or 18 inches instead, DDR4, and an SSD running on a SATA Express connection.

    Sadly, I don't think much of that will happen next year. Which is even more sad because all of this technology actually exists today - it is just not commercialized :(
     
  20. Kevin

    Kevin Egregious

    Reputations:
    3,289
    Messages:
    10,780
    Likes Received:
    1,782
    Trophy Points:
    581
    I was going to ask the sellers on that Chinese site for the 880M's specs, but there are no messaging options, not even if you sign up.
     
  21. Cloudfire

    Cloudfire (Really odd person)

    Reputations:
    7,279
    Messages:
    10,304
    Likes Received:
    2,878
    Trophy Points:
    581
  22. Cloudfire

    Cloudfire (Really odd person)

    Reputations:
    7,279
    Messages:
    10,304
    Likes Received:
    2,878
    Trophy Points:
    581
    I wonder if this is legit, Kevin...

    :hi2:
     
  23. Cloudfire

    Cloudfire (Really odd person)

    Reputations:
    7,279
    Messages:
    10,304
    Likes Received:
    2,878
    Trophy Points:
    581

    Maxwell Geforce cards shipping by March 2014
     
    reborn2003, TBoneSan and Robbo99999 like this.
  24. TBoneSan

    TBoneSan Laptop Fiend

    Reputations:
    4,460
    Messages:
    5,558
    Likes Received:
    5,798
    Trophy Points:
    681
    Can't wait to see. The question is, is it worth sitting out if the first release is 28nm? I think I'll wait to see some performance results.
    Sounds like they'll be beasts.
     
    reborn2003 likes this.
  25. Robbo99999

    Robbo99999 Notebook Prophet

    Reputations:
    4,346
    Messages:
    6,824
    Likes Received:
    6,112
    Trophy Points:
    681
    That's a good question. For me, I'd wait till the die shrink to 20nm - that's how they're gonna get that valuable performance per watt, which is so crucial and the major limiting factor in laptops. Everything I've read so far points to the end of next year for the die shrink. I think I might wait till Volta to upgrade because I only just got my card in the summer, but I might buy a whole new laptop at that point. Maxwell's gonna be good though, I'm excited about it - interesting that they have an ARM CPU stuck on the card too. It will be interesting to see how that's used (maybe it will make main CPU performance less important).
     
    reborn2003 likes this.
  26. sasuke256

    sasuke256 Notebook Deity

    Reputations:
    495
    Messages:
    1,440
    Likes Received:
    449
    Trophy Points:
    101
    I think the first 28nm samples are gonna have a somewhat higher TDP and at most +10% performance...
     
  27. J.Dre

    J.Dre Notebook Nobel Laureate

    Reputations:
    3,700
    Messages:
    8,323
    Likes Received:
    3,820
    Trophy Points:
    431
    I think 10% is a little low. Even the 780M is more than 10% faster than the 680M, and that was a re-brand. You're talking about a new architecture... it will be more than 10%. It may not be 30%+ or something, but definitely more than 10%, at least on the high-end cards. Perhaps only 10% on the low-end cards.
     
    sasuke256 likes this.
  28. sasuke256

    sasuke256 Notebook Deity

    Reputations:
    495
    Messages:
    1,440
    Likes Received:
    449
    Trophy Points:
    101
    Exactly - I was thinking more about the GTX 860M/850M.
     
  29. J.Dre

    J.Dre Notebook Nobel Laureate

    Reputations:
    3,700
    Messages:
    8,323
    Likes Received:
    3,820
    Trophy Points:
    431
    Ah, okay. :D
     
  30. R3d

    R3d Notebook Virtuoso

    Reputations:
    1,515
    Messages:
    2,382
    Likes Received:
    60
    Trophy Points:
    66
    The 780M is not a rebrand of the 680M. It has the full GK104 core vs. the cut-down one the 680M has.
     
    TBoneSan likes this.
  31. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,441
    Messages:
    58,202
    Likes Received:
    17,917
    Trophy Points:
    931
    You could see a similar situation as last time.

    770M -> 865M
    780M -> 875M

    While we wait for the new generation chips to come in and go into the 870M and 880M brackets.
     
    sasuke256 likes this.
  32. J.Dre

    J.Dre Notebook Nobel Laureate

    Reputations:
    3,700
    Messages:
    8,323
    Likes Received:
    3,820
    Trophy Points:
    431
    All stock.

    880M > 680M SLI
    880M SLI > GTX 780Ti

    My hopes for this next generation. Could be (and probably is) a long shot, but one can dream... :D
     
    Cloudfire likes this.
  33. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,441
    Messages:
    58,202
    Likes Received:
    17,917
    Trophy Points:
    931
    If it's a process shrink, that should hold true. Tweaking a pair of 780Ms puts me on the same level as a water-cooled and overclocked Titan as it is.
     
  34. sasuke256

    sasuke256 Notebook Deity

    Reputations:
    495
    Messages:
    1,440
    Likes Received:
    449
    Trophy Points:
    101
    Exactly what I'm expecting...
    860M = downclocked 865M or OC'd GTX 765M
     
  35. peppergrass

    peppergrass Notebook Guru

    Reputations:
    0
    Messages:
    59
    Likes Received:
    6
    Trophy Points:
    16
    Leave for 6 months and come back to this!!!

    I'm sorry, but people are getting scammed - they just put out the 780.


    I know of no other hobby that will eat your money faster than PC gaming... hell, a PS4 looks attractive.
     
  36. J.Dre

    J.Dre Notebook Nobel Laureate

    Reputations:
    3,700
    Messages:
    8,323
    Likes Received:
    3,820
    Trophy Points:
    431
    Yeah, it does eat away at you, and your wallet. It's just another one of those expensive hobbies... and there are tons of those.

    As an enthusiast, the best thing to do is to upgrade your GPU every other year. You'll see at least a 40% increase in performance in doing so, and up to 60% [or more] on occasion with architecture changes (i.e. 680M to 880M is going to be closest to the latter). The 780M is about 20% better than the 680M (depending on the game or benchmark) at stock, and that was basically a re-brand. The 880M should be at least 25% better than the 780M, if not more, which means it will be about 50% better than the 680M (at the very least). I'm hoping it's a little more than that, along with overclocking headroom like the 680M offered, on top of the 50%+ gain.

    EDIT: Lately we've seen a lot of weird things going on with Intel (Haswell). I'm hoping Broadwell doesn't hold us back...
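    The arithmetic behind that "about 50%" checks out, because generational gains compound rather than add (a quick sanity check; the percentages are the poster's estimates, not benchmarks):

        # +20% (680M -> 780M) followed by +25% (780M -> 880M) compounds
        # to +50%, not +45%.
        gain_780m = 0.20   # estimated gain, stock vs stock
        gain_880m = 0.25   # hoped-for minimum

        total = (1 + gain_780m) * (1 + gain_880m) - 1
        print(f"880M over 680M: +{total:.0%}")   # +50%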
     
  37. TBoneSan

    TBoneSan Laptop Fiend

    Reputations:
    4,460
    Messages:
    5,558
    Likes Received:
    5,798
    Trophy Points:
    681
    For the 880M to interest me, it would need to be more efficient, since the PSU is my only bottleneck with 780M SLI as far as performance goes. I don't know if that will happen with a Q1 release that lacks the die shrink.
     
  38. baii

    baii Sone

    Reputations:
    1,420
    Messages:
    3,925
    Likes Received:
    201
    Trophy Points:
    131
    Well, they have no choice but to make them more efficient.
    As of now, laptops have no way to dissipate much more heat (maybe a little more, but not much) or draw much more power (maybe, maybe not - I'm not sure how MXM and the motherboards are rated and with what limits).

    What gaming temps do you get at the edge of the PSU blowing up?
     
  39. TBoneSan

    TBoneSan Laptop Fiend

    Reputations:
    4,460
    Messages:
    5,558
    Likes Received:
    5,798
    Trophy Points:
    681
    Temps aren't the problem - they are really good. It's just that 330 watts is woefully inadequate for even minor OCing. The 780Ms don't get to stretch their legs even a little for many users, simply because of PSU limitations.
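    A rough power budget shows why 330W is so tight for 780M SLI (every figure below is an assumption for illustration, not a measured spec):

        # Hypothetical load breakdown on a 330W brick, 780M SLI machine.
        adapter_w     = 330
        cpu_load_w    = 90    # assumed mobile quad-core under game load
        system_misc_w = 40    # assumed board, drives, fans, display

        gpu_budget = adapter_w - cpu_load_w - system_misc_w
        print(f"Left for two GPUs: {gpu_budget} W (~{gpu_budget // 2} W each)")
        # ~100W each is right around a 780M's stock draw, so even a modest
        # overclock adding 20-30W per card runs the brick at its limit.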
     
  40. HTWingNut

    HTWingNut Potato

    Reputations:
    21,580
    Messages:
    35,370
    Likes Received:
    9,878
    Trophy Points:
    931
    It's getting a little ridiculous, IMHO. Machines with top-end SLI cards are not that portable, especially if you take into account the size/weight of the power brick. You're almost better off getting an mITX SFF desktop and monitor for about the same weight and significantly cheaper - close to half the cost.
     
  41. TBoneSan

    TBoneSan Laptop Fiend

    Reputations:
    4,460
    Messages:
    5,558
    Likes Received:
    5,798
    Trophy Points:
    681
    They definitely aren't the kind of thing you can effortlessly throw in your bag and trot off to work with. They have their purpose though. I can't see myself lugging a mini-ITX + K/M + monitor around when I need to be away from home.
     
  42. HTWingNut

    HTWingNut Potato

    Reputations:
    21,580
    Messages:
    35,370
    Likes Received:
    9,878
    Trophy Points:
    931
    I used to take my Shuttle SFF in a duffel bag with a KB/M and a strap that attached my monitor to the bag. Not something I'd do every day, but I did it a few times a month.

    Beamed from my G2 Tricorder
     
  43. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,441
    Messages:
    58,202
    Likes Received:
    17,917
    Trophy Points:
    931
    It's got nothing on many hobbies actually.
     
  44. HTWingNut

    HTWingNut Potato

    Reputations:
    21,580
    Messages:
    35,370
    Likes Received:
    9,878
    Trophy Points:
    931
    Compared to flying, racing, car restoration, even golf... it's not horribly expensive. Maybe $2k/year if you upgrade annually using top-end components.
     
    reborn2003 likes this.
  45. CyberTronics

    CyberTronics Notebook Consultant

    Reputations:
    172
    Messages:
    202
    Likes Received:
    15
    Trophy Points:
    31
  46. sasuke256

    sasuke256 Notebook Deity

    Reputations:
    495
    Messages:
    1,440
    Likes Received:
    449
    Trophy Points:
    101
    When is it gonna come out? :)
     
    reborn2003 likes this.