I'm down to the Lenovo T410 or Dell E6410 as my next notebook. The only big difference is that the T410 has 256MB of RAM on its NVS3100m GPU, whereas the E6410 has twice that amount on the same GPU. Some tests already show the T410's performance to be in line with the HP 8440w, which has a 512MB FX380M GPU (which should in theory give better performance than the NVS3100m). Is there a substantial performance difference that I would see from the increased graphics RAM?
-
Doesn't really matter for a 64-bit card.
-
No performance difference in games/settings that will run on both. 256MB might crash or stutter heavily if you choose ultra-quality textures, high resolutions, or install mods with high-quality textures, like Oblivion at 1920x1080 with QTP.
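For a rough sense of where 256MB runs out, you can tally the big allocations yourself. This is a back-of-the-envelope sketch, not a measurement; the buffer counts and the 2048x2048 texture size are illustrative assumptions:

```python
# Rough VRAM budget at 1920x1080 (illustrative numbers, not measured).
WIDTH, HEIGHT = 1920, 1080
BYTES_PER_PIXEL = 4  # 32-bit RGBA

# Framebuffer: front + back buffer plus a depth/stencil buffer.
framebuffer_mb = 3 * WIDTH * HEIGHT * BYTES_PER_PIXEL / 2**20

# One uncompressed 2048x2048 RGBA texture; a full mipmap chain adds
# roughly 1/3 on top of the base image.
texture_mb = 2048 * 2048 * BYTES_PER_PIXEL * 4 / 3 / 2**20

print(f"framebuffer: ~{framebuffer_mb:.0f} MB")       # ~24 MB
print(f"one 2048^2 texture: ~{texture_mb:.0f} MB")    # ~21 MB
```

A texture pack with a few dozen textures that size blows past 256MB quickly, which is where the stuttering and crashing comes from.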
-
Realistically I don't need the highest detail textures. I've been working with an ATI Radeon 7500 for the last six years so anything will be a huge improvement. Are there any games out there right now or coming out in the near future that will require 512MB of VRAM?
-
Meaker@Sager Company Representative
If they REQUIRE 512MB of RAM, your card will already be choking anyway.
-
The biggest issues are that your card has just 16 shaders and a 64-bit memory bus. The amount of video memory available is essentially irrelevant when dealing with such a low-end GPU.
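The 64-bit bus point can be put in numbers: peak memory bandwidth is bus width times effective memory clock. The clock below is an approximate figure for this part, so treat the result as a ballpark:

```python
def bandwidth_gbs(bus_bits: int, mem_clock_mhz: float, pumps: int = 2) -> float:
    """Peak memory bandwidth in GB/s (DDR transfers data twice per clock)."""
    return bus_bits / 8 * mem_clock_mhz * pumps / 1000

# NVS 3100M: 64-bit bus, DDR3 at roughly 790 MHz (approximate spec).
print(f"NVS 3100M:  ~{bandwidth_gbs(64, 790):.1f} GB/s")   # ~12.6 GB/s
# A midrange 128-bit card at the same memory clock doubles that,
# which matters far more than 256MB vs 512MB of capacity.
print(f"128-bit bus: ~{bandwidth_gbs(128, 790):.1f} GB/s")  # ~25.3 GB/s
```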
-
If having more shaders makes a huge difference, why is the ATI HD 3450 (which has 40 shaders) delivering half the performance of the NVS3100m in benchmarks?
-
1 Nvidia shader/CUDA core != 1 ATI stream processor
It's a different architecture; you can't compare them on a 1-to-1 basis.
As a rough comparison, people generally use 1 Nvidia shader = 5 ATI stream processors, but that's a rule-of-thumb type of thing.
-
Meaker@Sager Company Representative
ATI shaders = 5 way
Nvidia shaders = 1 way
Each ATI shader (divide the reported number by 5) is better than an Nvidia one. However, you have to remember Nvidia runs theirs in a high-clocked shader domain, typically 1200-1700MHz, while ATI's entire core runs on one clock domain, usually 725-850MHz.
It depends how well each game uses ATI's 5-way system: if a game could keep all five slots full, ATI's shader power would make Nvidia's look like a pocket calculator.
However, much like quad core vs dual core, that rarely happens. It looks like games typically fill 2-3 of the 5 slots.
-
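The "2-3 out of 5 slots" point can be sketched as a peak-throughput comparison. The shader counts and clocks below are approximate published specs for these two parts, and the utilization factor is the rule of thumb described above, not a measurement:

```python
def gflops(shaders: int, clock_ghz: float, flops_per_clock: int = 2) -> float:
    """Peak single-precision GFLOPS, assuming one MAD (2 flops) per shader per clock."""
    return shaders * flops_per_clock * clock_ghz

# NVS 3100M: 16 shaders in a ~1.47 GHz shader domain (approximate).
nvidia_peak = gflops(16, 1.47)

# Mobility HD 3450: 40 stream processors (8 VLIW units x 5) at ~0.6 GHz core.
ati_peak = gflops(40, 0.6)

# If a game only keeps 2-3 of the 5 VLIW slots busy, effective ATI
# throughput drops to roughly half of its paper figure.
ati_effective = ati_peak * 2.5 / 5

print(f"NVS 3100M peak:     ~{nvidia_peak:.0f} GFLOPS")    # ~47
print(f"HD 3450 paper peak: ~{ati_peak:.0f} GFLOPS")       # ~48
print(f"HD 3450 effective:  ~{ati_effective:.0f} GFLOPS")  # ~24
```

On paper the two cards land in the same ballpark, but the utilization penalty is exactly the kind of thing that could explain the "half the performance in benchmarks" observation above.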
So if ATI shaders are superior technology, why does there seem to be a preference for nVidia GPUs in current commercial notebooks? The Lenovo T410, Dell E6410, and HP 8440w all use the NVS3100m. I can't imagine the nVidia GPUs are any less expensive. Are there any applications out there that take full advantage of the ATI architecture? I did notice that the shader speed is far faster on the NVS3100m than on the HD 3450, which I would assume makes a difference.
-
In your case, no difference at all.
-
Meaker@Sager Company Representative
Nvidia GPUs could well be cheaper; also, until recently Nvidia was the only one with an active mobile driver platform. They seem to have more links in the industry too.
I never said one was superior to the other. I just described how they work differently.
ATI: fewer, more complex shaders at a lower clock
Nvidia: more, less complex shaders at a higher clock
Substantial difference going from 256 to 512MB GPU RAM?
Discussion in 'Gaming (Software and Graphics Cards)' started by MattB85, Jun 9, 2010.