Hello there people. I just noticed dell.de now offers 680M SLI as an option in the M18x. The price is double compared to the AMD 7970 CF.
Lets just not talk about that.
First question.
1)On dell.de it says 680m sli 2gb gddr5.
Is it 2GB or 4GB? I heard the 680M should have 4GB per chip. If it's double the price, at least we should be getting double the memory.
2) Does anyone own a 680M SLI rig so that I can see some benchmarks?
-
the 4GB 680M will be added later on
as far as I know, everyone's 680M SLI rig is still in production at this point -
Dell reps have come out and told us that they will ONLY carry the 2GB version of the 680M. Clevo has a 4GB version that MIGHT be compatible, should you buy it and install it yourself.
Orders for the 680M SLI probably won't be in for several days more. -
Well thats a bummer.
Perhaps one can find a 680m sli 4gb kit that will fit the x7200. Then there would be no reason to upgrade to m18x. -
Basically, there will be zero performance difference between the 2GB and 4GB cards unless you're running games with very high resolution textures across 3 screens - in which case, the cards won't perform quickly enough to get playable framerates even if they have 4GB of VRAM. -
As for me, I refuse to pay for the 2 gig model if I can get the 4 gig. The price is almost identical, so I want the top end model. Otherwise, if we're talking best bang for your buck, then none of these cards hit that mark. -
What do you need 4GB for?
-
That's right, Dell confirmed 2GB is the only option that will be available. It's also right that more VRAM matters in the case of multi-display setups or very, very high single-display resolutions, and since the M18x is 1080p, the only real difference between the 2GB and 4GB versions would be the higher price of the latter.
From what I've observed, the only game that actually approached 2GB of VRAM usage on a 580M-equipped R1 was Crysis 2 with the hi-res patch. -
The idea that 4GB is better than 2GB for multi-display setups is a rumour that is very skewed.
Look at the 2GB 680 vs 3GB 7970 reviews and you will see that the 680 still beats the 7970 when gaming on 3 screens.
Here is an example
Review: Three-screen GeForce GTX 680 vs. Radeon HD 7970 - Graphics - HEXUS.net
Question is: How much do we save on buying 2GB instead of 4GB 680M? -
Laptop Video Cards :: nVidia GTX 680M 4GB Mobile Video Card - R&J Technology, Clevo Barebone Notebook kits, Laptop and desktop system builder
Alienware on their website has the SLI (2 gig version) for an "upgrade price" of $1100, and a single upgrade price of $550. That means each card is over $550, since the upgrade price already includes the 660M price.
So the price increase is slight, and let's face it, if you're paying top dollar for a 680M, why not really get the best model out there?
Otherwise, if you want best bang for your buck, the 7970 is the card you want.
As for your argument about not needing 4 gigs, sorry, but my 6990 Crossfire could not cut it with my three 30" monitors (2560 x 1600 per screen).
I used Hydra on my 6990 CF setup to create one big screen from these monitors, but the graphics cards just could not cut it.
So for me, why should I pay top dollar for a 2 gig card when I can get the 4 gig version instead?
EDIT: Here is a link to my screens http://accessories.us.dell.com/sna/...=dellSearch&baynote_bnrank=0&baynote_irrank=2 -
Has anyone ordered an M18x with SLI? How long did it take to get your rig from "preparation" to "production"?
thanks -
The average time to go from production to shipping right now is misleading. Figure two weeks at the most right now. Once their stock of 680m improves that number should drop quite a bit.
-
But you've got a bigger problem: 6990 crossfire is not fast enough to run games at that sort of resolution, regardless of the amount of VRAM.
680M's will be the same. Even with 680M SLi, you're not going to be able to run games at 7680x1600 and high detail, even if you have 4GB of VRAM. The GPUs just aren't powerful enough for that.
My point is that the GPU power and shader count will bottleneck the system before the amount of VRAM will. So what's the point of having more VRAM? -
Just to back EviLCorsaiR on this one; running games at something in the neighborhood of 7680x1600 is going to death-grip the hell out of almost any set of graphics cards, laptop or desktop, currently out there. If you were to set up a 4xSLI or 4xCF desktop rig with some of the high end [670, 680, 7990] cards out there right now you might get away with it but even then it could be dicey.
Remember: even jumping from a 1920x1080 to a 2560x1600 monitor is DOUBLING your pixel count. So when you jump from a single 1920x1080 screen to a triple screen setup with 2560x1600 monitors, you're dumping a huge amount of extra load on the graphics cards in a system.
While laptop graphics have come a long way and boast some pretty nifty capabilities with this generation, they just don't have the compute power [regardless of the amount of VRAM you throw at them] to power resolutions that high. -
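To make the pixel arithmetic above concrete, here's a quick back-of-the-envelope check in Python (the resolutions are the ones discussed in the thread; the script itself is just illustrative):

```python
# Rough pixel-count comparison for the resolutions discussed above.
res_1080p = 1920 * 1080        # 2,073,600 pixels
res_1600p = 2560 * 1600        # 4,096,000 pixels -- roughly double 1080p
triple_1600p = 3 * res_1600p   # 12,288,000 pixels for three 30" panels

print(res_1600p / res_1080p)     # ~1.98x the pixels of a single 1080p panel
print(triple_1600p / res_1080p)  # ~5.93x the load of a single 1080p screen
```

That near-6x jump in rendered pixels is why raw GPU throughput, not VRAM, becomes the bottleneck first.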
I ordered mine on Monday, July 2nd. It's already in production, but the delivery date isn't until the 23rd. Sucks because I am impatient as all get out. I am a US customer, and I ordered a 680M SLI rig.
-
@Bohti, hope mine will be in production soon; ordered it the 27th of June, still in preparation, delivery date is the 24th of July ... (French customer)
I assume the building and testing time won't last more than two weeks, so you'll have it soon -
So why would I take a step backwards on settings and limit myself on current and future advances offered in the application when purchasing a new system?
Thank you,
Monnie -
Talking about the amount of VRAM vs. multi-display setups: let's start from the fact that 3 monitors cannot be connected to a single M18x in a proper NVIDIA Surround setup. You cannot mix digital with analogue output signals to go Surround, and there is only one mini-DP and one HDMI out on the M18x.
In other words, the 680M won't run Surround regardless of whether it has 2GB of VRAM or 10.
BTW, the card will be obsolete next year anyway, so what's the commotion about VRAM all of a sudden? Will this year's games run fine maxed out on the 2GB version? Yes they will. There, done. -
Was just asking because I saw the 680M comes with 4GB, then I saw the 2GB model on Dell.
I installed EVGA Precision X, and with SLI enabled in Diablo 3, all maxed out and adaptive vsync on, I get a stable 60 fps and approx 430MB of each card's memory is occupied.
I guess 2GB should be enough for most of the games out there, if not for all.
Was just thinking, since one pays a load of money, one might as well receive the 4GB model.
Also wanted to ask about the temperature of my cards: 78-79 for the main and approx 70 for the second when I play Diablo 3, with usage at about 50-60%. I know it's hot in my room, but aren't these temps a bit too hot?
Also, if one gets 680M SLI with 4GB each, totalling 8GB of video memory, one could keep up with the golden rule that always stands: 4 times more RAM than video memory, meaning the 32GB RAM in the M18x R2.
Now I have 460M SLI with 1.5GB each (3GB video RAM) and 12GB RAM in the x7200. Coincidence?
I guess the 4GB model would be useful with a quad full-HD retina-like display in a laptop. But until then, even quad HD will be an improvement. It seems we can't get away from 1080p resolutions. It seems we are stuck with them. -
Firstly, it doesn't 'total' 8GB video memory in actual usage. Because of the way SLi (and Crossfire on AMD cards) works, both cards have the same files on their VRAM at any one time, so the total usable VRAM is that of a single card - 4GB in this case - no matter how many GPUs you have linked up.
Secondly, suggesting you need that much RAM is ridiculous. 8GB, if anything, is still overkill for today's games. Very few games will utilise even half of that, most graphically intensive games I play use about 1.5GB-2GB of RAM, possibly up to 2.5GB.
For the vast majority of people, the only reason you'd ever want more than 8GB of RAM today (on non-server machines) is if you wanted to run a RAMdisk, or if you're running CAD applications or virtual machines. This applies no matter what GPU you're using. -
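A tiny sketch of the VRAM-mirroring point above (the function name is made up purely for illustration):

```python
def effective_vram(per_card_gb, num_gpus):
    """In SLI/Crossfire each GPU keeps its own copy of the same textures
    and buffers in its local VRAM, so the usable pool is that of a single
    card, not the sum across cards."""
    # num_gpus adds rendering power, not memory capacity
    return per_card_gb

print(effective_vram(4, 2))  # 680M SLI with 4GB cards -> 4GB usable, not 8GB
```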
Apparently the 4GB vs 2GB of VRAM on the desktop 680 doesn't make any difference even with high-res monitors, as it's choked by the 256-bit bus.
-
It's like on the old M14x R1, the 555M came in 1.5GB and 3GB versions. The 3GB was only marginally more expensive...but what's the point of paying that little bit extra when there's no performance benefit? -
Bottom line we don't "need" the 680m or the 7970 to run most games at 1080p but we choose to upgrade! So since I am choosing to upgrade I am also choosing to upgrade to the best version out there. If I wanted the best bang for my buck it would be the 7970 but since I want to go nvidia this time round and pay the inflated price then I might as well get their top tier version of the card. -
I think you are misusing the term "more bang for your buck." Just because the 7970 has more vRAM does not mean it is more powerful than the 680M; in fact, the 680M is the more powerful card even with less vRAM, so it is worth the money over the 7970. I also understand what you mean about getting the best card that Nvidia offers. I too want the 4GB card, because why not? I am already going to pay a ton of money, why not go all the way, and I am sure I will find a use for the extra vRAM at some point. It is also a bragging thing to an extent. I can say "I have dual 4GB GTX 680m SLI, what do you have?" Haha, that sounds nice. Are they going to perform differently from the 2GB cards? No, so I would recommend, if you are getting a new system, sticking with the 2GB from Dell when having one built. For someone like me with the R1, either is a viable option really.
-
I thought I read somewhere that the 7970s were 3GB cards. Oh well, but I would rather the Nvidia due to driver support and PhysX.
-
Rep+ for you! -
Something to keep in mind is that any person's use of the term "best" presupposes a specific context or frame of reference. What context one person uses may not perfectly equate to the context of another.
So while a 680M with 4GB of GDDR5 may be the "best" for some because of the specific context they are operating within [specific software/use requirements], it may not be the "best" for others. While the performance hit may be minuscule, going from 2GB to 4GB of memory will require an ever so slight power increase and will create more heat when overclocked [more memory at a given overclock creates more heat]. While this may not affect the person who requires 4GB for their specific application, it may be a consideration for others.
All I'm trying to point out is that in order to determine which card, or variant of a given card, is the "best" we should be clear about the context we are operating in. Additionally, if someone else is operating in a different context we should be respectful and understanding of the fact that their context may dictate another card/variant as the "best" for that application.
We see this in things like the variation between cards in given games. For some games an ATI/AMD card may perform optimally, while in others an Nvidia card may give a higher frame rate. We can't say that either card is un-qualifiedly [is that a word?] the best. We can only say that in the context of a specific game one card gives better performance than the other. -
Unequivocally is the word you are looking for, and it is very much a matter of perspective.
-
Let me ask another interesting question.
I play Diablo III, 1080p, all maxed out, with adaptive vsync ON. I get a constant 60 fps.
What I learned is that if, without vsync, the game reaches say 150 fps and you limit it to 60 fps, then the GPU will not pull so much current, because it won't need to reach those higher framerates. That makes power consumption lower and perhaps temperatures lower.
Anyway, with vsync ON I get both GPUs at about 50-55% load.
The question now is: since the 680M is much more powerful than the 460M, it will probably reach much higher FPS unimpeded by vsync.
If vsync is ON, will the load then be lower than in the case of the 460M?
Will that result in even lower temperatures and even lower power consumption?
If so, that alone is a reason to upgrade. -
VSYNC is a frames-per-second limit. 60 is the max, unless you have a 120Hz monitor, in which case it will be 120 FPS.
Without VSYNC, the 680M will get more FPS with the exact game settings that the 460M has - if the settings are higher when using the 680M, that may not be the case.
The load would be lower, and temps may be lower depending on the fan profile. -
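A very simplified model of why a faster GPU shows lower load under the same vsync cap (the FPS numbers below are made-up examples, not measurements from either card):

```python
def estimated_gpu_load(uncapped_fps, cap_fps):
    """Very rough model: if a GPU could render uncapped_fps frames per
    second but vsync caps output at cap_fps, the GPU sits idle for the
    rest of each frame interval, so load scales with the ratio."""
    return min(1.0, cap_fps / uncapped_fps)

# A card barely exceeding the cap vs a much faster card under the same cap:
print(estimated_gpu_load(110, 60))  # ~0.55 -- in the ballpark of the 50-55% reported
print(estimated_gpu_load(200, 60))  # 0.30 -- a faster GPU idles more at the same cap
```

Real load also depends on boost clocks, driver behavior, and how the frame limiter works, so treat this as intuition rather than a prediction.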
BTW we still need someone to volunteer for installing the 680M in the x7200 -
Hello,
I tried searching for this but came up empty-handed. Anyway, I have an M18x R1 with 6990 CrossFireX. I would like to retrofit the system with a single 680M. Has anyone successfully done this, and does anyone have the list of parts that are required?
Also, do you have to modify the .inf file for driver install even if you order a 680M from Dell?
Thanks in advance! -
Yes, we are already starting to see this upgrade. It is compatible with the R1 and works very well. You may need to mod the driver for now, as the hardware device ID changes when installed in the R1 (much like it was initially with the 7970M video cards). It will also show up in Futuremark bench results as a Generic VGA until they add the device ID variation to their database. They may have already done so... haven't checked.
5150Joker has a single 680M in his R1 and it's blistering fast (3DMark11 7563 with only one GPU, which is similar, if not slightly higher than a single 7970M).
The parts are not available from Dell at this time. You will need to buy the GPU from Clevo for now, and you can expect to pay nearly $800 for one GPU from them, but it is a more robust GPU with twice the amount of vRAM. The heat sinks for the 580M will work with the 680M. -
Thanks for your reply, Mr. Fox.
So basically, I would need to get the 680M card from Clevo, a 580/680 heat sink from Dell, and some thermal paste to do this retrofit?
Do you suggest that I wait to order the part from Dell or just order it from Clevo? Think there will be any differences between the cards? -
If you are only planning to install one 680M, yes... a heat sink for the left side, and support plate (aka back plate or heat plate). The 580M heat sink comes with pads and thermal interface material applied already if purchased new from Dell, so buying paste would be optional. If you plan to install 680M SLI you will need an SLI bridge cable as well as a right heat sink and second support plate. As I mentioned, the Clevo cards appear to be more robust with twice the vRAM. There are some differences in the vBIOS, they are not compatible between the GPUs, and the information I have seen reported suggests the Clevo cards offer slightly more aggressive performance. I don't have a recommendation either way. There would be pluses to ordering them from Dell, but they are not available for purchase at this time.
-
Got my 680M installed in my M18x today and it worked like a charm. HOWEVER:
The Optimus feature, which lets me switch between the discrete GPU and integrated GPU at the press of a button to save power on battery, did not work: neither Windows nor the BIOS would read the integrated GPU (the Intel HD 3000 on-die with the CPU) AT ALL, so there was NO way to switch. I had to put in an SSD anyway, so I reinstalled Windows, and now not only does that still not work, but upon reinstalling Windows, drivers and Alienware programs, some of the quick keys do not work: the quick key for mute does nothing, volume up lowers the volume, volume down raises the volume, and a few more things like that!!!
Is Optimus working for others with the 4GB 680M in the M18x R1? Any ideas? -
Have you tested it in games? I don't think Optimus is available on Alienware laptops!
-
In my case (SLI machines), Dell/AW does provide manual GPU switching, which requires a keystroke + reboot and is "technically" not "Optimus". -
That is my issue here:
I just have the discrete GPU to choose. -
Based on what you have described, you haven't finished installing everything yet. You need to download Alienware Command Center and Alienware OSD before those keys will work in your new Windows installation. You can download both from Dell Support.
Alienware OSD - Drivers and Downloads | Dell [United States]
Alienware Command Center - Drivers and Downloads | Dell [United States]
NVIDIA GTX 680M Thread - M18x R2 and M18x R1 Upgrade Discussion
Discussion in 'Alienware 18 and M18x' started by Bytales, Jul 4, 2012.