It's more of a market segmentation thing than anything. Easier to mark up expensive 6GB models that way. Lots of people are complaining online that 3GB isn't enough, especially for multi monitor/4k.
There's this tendency to put too much vram on the notebook side and not enough on the desktop side for some reason.
-
Well I do not fully disagree with you.
But the people who do use Quadro cards actually need more VRAM. I can only imagine how much the big 3D projects, like models of entire towns, use when everything is loaded on the GPU. Those have a heck of a lot more polygons and textures than just a map in a game.
But yeah, putting more VRAM on is also a way to separate GeForce cards from professional cards and to justify the price for buyers.
With multi display and 4K I think you will have issues with 3GB, yes. But not many people game on several displays, and 4K is too demanding to enable all the anti-aliasing that eats memory, so 3GB is not too little in the majority of games.
You can see example here from a guy who tested Titan with various resolutions and settings: http://forums.aria.co.uk/showthread.php/127794-How-much-VRAM-do-you-need
Crysis 3 is the biggest VRAM hog there, but as you can see: 3.3GB with 4x MSAA (at only 11 FPS), dropping to 2.2GB if you use FXAA instead.
So I think 3GB is enough for pretty high res gaming -
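A rough way to see why MSAA eats memory in the Titan numbers above: multisampling multiplies the size of the render targets by the sample count. This is only a back-of-the-envelope Python sketch (real engines also keep multisampled G-buffers and other targets, which is why the Crysis 3 jump is a full gigabyte rather than the few hundred MB estimated here, and textures, which usually dominate VRAM, aren't counted at all):

```python
# Back-of-the-envelope estimate of render-target memory only.

def render_target_mb(width, height, msaa_samples=1):
    """Rough size of color + depth buffers in MB.

    Assumes 4 bytes/pixel color (RGBA8) and 4 bytes/pixel depth/stencil,
    each multiplied by the MSAA sample count, plus a resolved 4-byte
    color buffer that is presented to the display.
    """
    pixels = width * height
    multisampled = pixels * msaa_samples * (4 + 4)  # color + depth
    resolve = pixels * 4                            # resolved frame
    return (multisampled + resolve) / 1024**2

for w, h in [(1920, 1080), (2560, 1440), (3840, 2160)]:
    for samples in (1, 4):
        label = "no MSAA" if samples == 1 else f"{samples}x MSAA"
        print(f"{w}x{h} {label}: ~{render_target_mb(w, h, samples):.0f} MB")
```

Even at 4K with 4x MSAA the raw render targets are only a few hundred MB; it's the per-sample G-buffers of deferred renderers that push the total past a gigabyte.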
Meaker@Sager Company Representative
6GB of VRAM is needed for running 3 Ti cards together at 4K resolutions.
The 6GB versions should be hitting early next year. -
Robbo99999 Notebook Prophet
Good general points, but when you talk of SLI having '16GB between the two', that's not how SLI works. You can't double the VRAM when referring to SLI cards. So, for instance, two 3GB cards in SLI have 3GB of space for storing textures and other information (not 6GB) - exactly the same amount as one of those 3GB cards would have operating on its own. Both cards have to store the same duplicate information in their VRAM. -
Robbo99999 Notebook Prophet
You know, I agree with you in some ways. It's not by accident that the PS4 was designed with 8GB of fast access GDDR5 RAM (bandwidth similar to today's high end cards, and additionally shareable amongst CPU & GPU). This makes me think that in a couple of years, as game developers code for that 8GB of GDDR5, it would be advantageous for PCs to have 8GB (maybe 6GB as a minimum) of fast access VRAM available to the GPU for maximum performance efficiency. If I had a desktop and was going to buy a GPU in the next year, I wouldn't buy one with less than 6GB (maybe 4GB) of VRAM - not if I wanted it to last more than 2 years. -
Robbo99999 Notebook Prophet
Yes, but I'd pay the extra for 770M SLI! The 765M is pretty lowly - the memory bus just sucks. 770M SLI offers better value performance than 765M SLI. -
Yeah, I know. I'm aware of how it is.
-
The 8GB is shared with the system, different from how PC systems run: the GPU has GDDR5, the CPU has DDR3. They are different architectures, and x86/64 is not designed to utilize GDDR5 for general computing. More VRAM does not mean squat if the rest of the GPU and system are not up to snuff. The MSI GX60/70 are a perfect example: throw a powerful 256-bit GPU in with a meager CPU, and it doesn't matter whether it has 1GB or 8GB, it just won't perform well. If you want to manage high resolutions, like triple monitor setups or the 3K and 4K monitors coming into play, more may be needed. But even an 880M likely isn't powerful enough to sustain 4K at 60FPS in most games.
-
Robbo99999 Notebook Prophet
Fair enough, it sounded like you didn't know based on how you talked about it. At least the people reading these messages will now be better informed about the truth of the matter rather than having false perceptions about SLI and how VRAM works with it. -
Robbo99999 Notebook Prophet
Yes, I agree with you, I know the PC architecture is established around GDDR5 for the GPU and DDR3 for the CPU, I wasn't disputing that. I was just pointing out that in the PS4 the difference is that the GPU has access to nearly 8GB of GDDR5 at a pretty high bandwidth. If this is the case, then I would imagine that future games will be designed around that architecture & fact, therefore it might make sense for future-proofing purposes for PC GPUs to have close to the same amount of fast access VRAM available (e.g. close to 8GB of VRAM - I suggested 4 or 6GB as a sensible amount for future proofing). It's just my hunch & understanding of the matter; it doesn't mean I'm right about it, but that's my viewpoint on the future of PC gaming based on the fact that it often can be influenced by the hardware/design of consoles.
((Chicken or Coke!)) Chicken FTW! -
The PS4 has GDDR5, and we PC users may not even get to touch DDR4 next year.
Same with SATA Express, which supposedly isn't supported by Intel's next mobile chipset for Broadwell.
It's messed up.
:/ -
I'm aware the two cards do not "add up" to 16GB of VRAM; I shouldn't have said that. I was trying to say that if there were two 8GB cards, there would be 16GB of usable memory between the two, which the game would be able to split the workload across. Is that wrong?
EDIT: Either way, that's overkill for a laptop.
-
Robbo99999 Notebook Prophet
Yes, with SLI the workload is split between the two GPUs - each one renders alternate frames. The VRAM, however, is not split, as we've said; the same information is stored in the VRAM of each card. -
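The point can be sketched in a few lines of Python (the function names are made up for illustration): under alternate-frame rendering, frames alternate between the GPUs, but the usable VRAM is that of a single card, because every asset is mirrored on both.

```python
# Toy model of alternate-frame rendering (AFR) in SLI.

def effective_vram(cards_gb):
    """Usable VRAM of an SLI setup: the smallest card's, not the sum,
    since each GPU stores its own full copy of every texture/buffer."""
    return min(cards_gb)

def assign_frames(num_frames, num_gpus=2):
    """Under AFR, frame i is rendered by GPU i modulo the GPU count."""
    return [frame % num_gpus for frame in range(num_frames)]

print(effective_vram([3, 3]))  # two 3GB cards in SLI -> 3GB usable
print(assign_frames(6))        # frames alternate: [0, 1, 0, 1, 0, 1]
```

So doubling the cards roughly doubles the frame rate (each GPU renders every other frame), while the texture budget stays exactly where it was.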
A Tesla card certainly isn't a Quadro card.
And here I thought there was green blood running through your veins (jk).
It's pretty hard to do CAD work with a device that has no video output... that's why I'd prefer a Quadro card for such work.
It works fine for 1440p/1600p with reasonable settings, but there are several popular games that will easily use more than 3GB of VRAM when played across triple screens or even just with downsampling.
3GB is definitely not "more than enough" for resolutions above 1080p. On the other hand, on a mobile system 8GB for gaming is total overkill, even at resolutions above 1080p.
For x86/64 it doesn't really matter what kind of memory is used. Different RAM might be addressed differently, but that's up to the memory controller to handle. -
Wait, I think you may be confused. PC users do have access to GDDR5 in GPUs, and DDR3 in CPUs, just like the PS4. But, we will soon have DDR4 in CPUs whereas the PS4 won't be upgraded to it.
-
lol, I thought Tesla and Quadro were kinda the same. Turns out Tesla = computing, Quadro = workstation. Best to have something to look at while doing CAD, yes.
I don't think there are any games that surpass 3GB up to 1600p. Maybe if you turn SSAA up to the max and such, but then we're back to the whole "it will run at too low an FPS anyway".
Nah, I'm not confused. The PS4 has GDDR5 as system memory, not just video memory. I was just commenting that PC users may not even get DDR4 as system memory next year :/
I wonder which is best as system memory, GDDR5 or DDR4? They say GDDR5 has higher latency, but according to PS4 developers they optimized the PS4 architecture so it's negligible anyway. -
They can have GDDR7 as main RAM; their chips will forever stay a boosted HD 7850. An overclocked GTX 770M can already equal or beat an HD 7870.
-
If you threw GDDR5 into a Wintel PC, the processing would be slow, because CPU workloads are more linear and latency-sensitive than the way GDDR5 manages data - if I've interpreted what I've read about it correctly.
-
Hehe, that's true.
In a perfect world, my next notebook would have:
GTX 880M SLI, minimum a 3K IPS display (like the GT60 has) but in 17 or 18 inch instead, DDR4, and an SSD running on a SATA Express connection.
Sadly, I don't think much of that will happen next year. Which is even sadder, because all of this technology actually exists today - it's just not commercialized.
-
I was going to ask the sellers on that Chinese site for the 880M's specs, but there are no messaging options, not even if you sign up.
-
We scared them away :/
(Or they are sold.)
This is what it looked like:
-
I wonder if this is legit Kevin....
:hi2: -
-
Can't wait to see. Question is, is it worth sitting out the first release if it's 28nm? I think I'll wait to see some performance results.
Sounds like they'll be beasts. -
Robbo99999 Notebook Prophet
That's a good question, and for me I'd wait till the die shrink to 20nm - that's how they're gonna get that valuable performance per Watt, which is so crucial and the major limiting factor in laptops. Everything I've read so far points to the end of next year for the die shrink. I think I might wait till Volta to upgrade because I only just got my card in the summer, but I might buy a whole new laptop at that point. Maxwell's gonna be good though, I'm excited about it - interesting that they have an ARM CPU stuck on the card too; it will be interesting to see how that's used (maybe it will make main CPU performance less important). -
I think the first 28nm samples are gonna have a somewhat higher TDP and at most +10% performance...
-
I think 10% is a little low. Even the 780M is more than 10% faster than the 680M, and that was a re-brand. You're talking about a new architecture... it will be more than 10%. It may not be 30%+ or anything, but definitely more than 10%, at least on the high-end cards. Perhaps only 10% on the low-end cards.
-
Exactly, I was thinking more about the GTX 860M/850M.
-
Ah, okay.
-
The 780M is not a rebrand of the 680M. It has a full GK104 core vs the cut-down one that the 680M has.
-
Meaker@Sager Company Representative
You could see a similar situation as last time.
770M -> 865M
780M -> 875M
While we wait for the new generation chips to come in and go into the 870M and 880M brackets. -
All stock.
880M > 680M SLI
880M SLI > GTX 780Ti
My hopes for this next generation. Could be (and probably is) a long shot, but one can dream...
-
Meaker@Sager Company Representative
If it's a process shrink, that should hold true. Tweaking a pair of 780Ms already puts me on the same level as a water-cooled and overclocked Titan.
-
Exactly what I'm expecting...
860M = downclocked 865M or an overclocked GTX 765M -
Leave for 6 months and come back to this!!!
I'm sorry, but people are getting scammed - they just put out the 780.
I know of no other hobby that will eat your money faster than PC gaming... hell, a PS4 looks attractive -
Yeah, it does eat away at you, and your wallet. It's just another one of those expensive hobbies... And there are tons of these types.
As an enthusiast, the best thing to do is to upgrade your GPU every other year. You'll see at least a 40% increase in performance in doing so, and up to 60% [or more] on occasion with architecture changes (i.e. 680M to 880M is going to be closest to the latter). The 780M is about 20% better than the 680M at stock (depending on the game or benchmark), and that was basically a re-brand. The 880M should be at least 25% better than the 780M, if not more, which means it will be about 50% better than the 680M (at the very least). I'm hoping it's a little more than that, along with overclocking headroom like the 680M offered, in addition to the 50%+ gain.
EDIT: Lately we've seen a lot of weird things going on with Intel (Haswell). I'm hoping Broadwell doesn't hold us back... -
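The arithmetic in the estimate above can be sanity-checked quickly - generational gains compound by multiplying, not adding:

```python
# 20% (680M -> 780M) followed by 25% (780M -> 880M) compounds to 50%.
gain_780m = 0.20   # 780M over 680M (rough figure from the post)
gain_880m = 0.25   # hoped-for 880M over 780M
total = (1 + gain_780m) * (1 + gain_880m) - 1
print(f"880M over 680M: {total:.0%}")  # 50%
```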
For the 880M to interest me it would need to be more efficient, since the PSU is my only bottleneck as far as 780M SLI performance goes. I don't know if this will happen with a Q1 release without the die shrink.
-
Well, they have no choice but to make them more efficient.
As of now, laptops have no way to dissipate much more heat (maybe a little more, but not much) or draw much more power (maybe, maybe not - I'm not sure how MXM and the motherboards are rated, and with what limits).
What gaming temps do you get at the edge of the PSU blowing up? -
Temps aren't the problem, they are really good. It's just that 330 watts is woefully inadequate for even minor OCing. The 780Ms don't get to stretch their legs even a little for many users, simply because of PSU limitations.
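A back-of-the-envelope power budget makes the point concrete. Every component figure below is a ballpark assumption, not a measurement:

```python
# Rough power budget for a 330W brick feeding a 780M SLI laptop.
PSU_WATTS = 330
budget = {
    "CPU (quad-core, loaded)": 55,
    "2x GTX 780M at stock (~100W each)": 200,
    "display, board, drives, fans": 40,
}
used = sum(budget.values())
print(f"stock draw ~{used} W, leaving ~{PSU_WATTS - used} W headroom")
```

With only a few tens of watts spare at stock, even a modest overclock on two GPUs runs straight into the brick's limit.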
-
It's getting a little ridiculous imho. Machines with SLI top end cards are not that portable, especially if you take into account the size/weight of the power brick. You're almost better off getting an mITX SFF desktop and monitor for about the same weight and significantly cheaper, close to half the cost.
-
They definitely aren't the kind of thing you can effortlessly throw in your bag and trot off to work with. They have their purpose though. I can't see myself lugging a mini-ITX + keyboard/mouse + monitor around when I need to game away from home.
-
I used to take my shuttle sff in a duffel bag with kb/m and a strap that attached my monitor to the bag. Not something I'd do every day but did it a few times a month.
-
Meaker@Sager Company Representative
It's got nothing on many hobbies actually. -
Compared to flying, racing, car restoration, golf even... it's not horribly expensive. Maybe $2k/year if you upgrade annually using top end components.
-
CyberTronics Notebook Consultant
Lenovo has announced a laptop with a GTX 860M:
NVIDIA GeForce GTX 860M - NotebookCheck.net Tech
https://www.youtube.com/watch?v=1xJ2wtEqE2s&feature=youtube_gdata_player
Any Idea about Nvidia 800M series..
Discussion in 'Gaming (Software and Graphics Cards)' started by sasuke256, Oct 6, 2013.