To me, part of what you're paying for is better drivers. I agree the premium seems a bit much at times, though it remains to be seen what the pricing will actually be. But buying the cheaper 6970s and getting crummy drivers, after YEARS of AMD claiming to have gotten it right, is basically "you get what you pay for". The very fact that you can't simply download and install drivers straight off AMD's site with zero fuss is proof they just don't put much effort into getting their drivers right. Remember the BIOS lock issues with the R2? Just not the way I expect a premium product to run.
If the only high end chip for M18x is ultimately 6970 CF, that's the way things go. I don't think that's how things will turn out but if that is the case, then I'll have no need of the machine.
-
You can and do get the latest drivers for the 6970M and other AMD products from their website. And so far the 6970M Xfire drivers are working really well, and AMD is pushing out hotfixes for games that need them quite fast. Can't really say the same for nVidia, but if a big corporation isn't adopting nVidia's high end, it has to be for a very good reason. nVidia's driver myth is best left in the past at this point. Scaling is better with AMD now and they have more regular driver updates. -
Without a 485M SLI or 580M SLI offer I won't be switching from my M17x-R2, because then there is, imho, not enough performance and feature increase to justify the M18x and the 4000 Euro cost.
-
cookinwitdiesel Retired Bencher
-
M18x eats the R2 for breakfast and then poops it out. There's a huge performance difference between the two systems. IMO ATi makes better video cards and drivers than nVidia this generation. Crossfire scales better than SLi, driver updates are quite regular (as are hotfixes), the cards are cheaper, and you don't hear of AMD cards burning out. At this point, with all the regular driver updates, it wouldn't surprise me to see 6970M Xfire as fast as, if not faster than, 485M SLi overall. -
I was one of the most die-hard Nvidia fans back in the day. 7800GTX, 7900GTX, 9800GTX, I had them all and loved them. Interesting changes began to happen when ATI released the new 4800 series of cards. Although I never bought one, I definitely started considering ATI as a contender finally. When I went to build a new rig in 2009 I couldn't stop doing research; I'm actually diagnosed with mild OCD about certain things, and spending large amounts of money was going to take a lot to justify. After 3 months of straight research, emails, and talking to people who work at the company, I concluded there was no way not to buy ATI. Look at performance per watt and per dollar, and NOTHING touches ATI.
Fast forward to the 6000 series and ATI cemented themselves among the best of the best. How? THEY LISTEN TO THEIR CUSTOMERS! The biggest complaint from a lot of the 5000-series users was scaling. What does ATI do with the 6000 series? Destroys SLI, and does it for a FRACTION of the price of an Nvidia setup. ATI releases their drivers monthly, on time, AS PROMISED, and has kept up with hotfixes when necessary, a la Witcher 2.
Bottom line is, how can anyone pick Nvidia anymore? They cost more, their drivers may come out fast but they have literally destroyed people's cards, and the scaling is not as good as ATI's. Why would you want Nvidia unless you are die-hard into 3D? I can't wait to see what ATI does with the 7000 series, and there are NO signs of ATI being beaten anytime soon.
Dammit another TL;DR post by me. -
notebook-wise ati wins this year. i paid $150 just to add another 6970m card in my m18x. it's almost too good to be true.
ever since cat 11.4 ati has been on a huge roll. crossfire 6970m has been a huge improvement over the gtx 280m sli i had earlier. -
-
You don't have to go far to see the continuing glitches; indeed, some of them you yourself, Joker, found with the 5970s and 6970s. I don't buy the "they are just fine now" statement. There are issues all over these forums where graphics errors happen, and that's not imaginary. First day out you couldn't install the drivers; you needed a workaround and a complete driver wipe to get them to work. That is not how it should work in my book. If all you care about is speed, and you don't mind taking the time to tinker to get things working right, then it's good enough I suppose. I don't, and I don't expect to for the price of these things.
I can see many of you don't agree. Yet I watched as you struggled the first days out of the gate with this machine and the R3 as well. But we don't have to agree. And I don't think 6970's are out and out bad, just not what I'd prefer for the money spent.
Anyway, let's remember it's only a month old, so there are many changes that can come. Some may come soon, some later; we'll see. -
Looks like the M18x is indeed getting the 560m
NVIDIA refreshes mobile graphics with GeForce GTX 560M, attracts ASUS, MSI, Toshiba and Alienware -- Engadget -
That's who we are, we like to mess with the systems and push beyond limits, upgrading every possible component, doing custom paint jobs, etc.
If you think Nvidia cards have fewer driver issues, think again. Look in the X7200 benchmarking thread, for example. There are multiple cases of driver-related issues with SLI (cards don't upclock/downclock normally), etc.
One of the famous Nvidia issues is bad soldering and a high failure rate after 1-2 years (remember all the oven-baking of the 8800/9800 GTX cards?).
And the major stuttering/latency problem that plagued all SLI systems (260M/280M/285M) and Nvidia chipsets two years ago? It took Nvidia almost a year to fix the problem.
Another point to consider is that Nvidia exposes only the core/die temps for its cards, so many gamers and benchers remain oblivious to what's happening to the memory and other components under heavy load. Yes, 85C is fine for the core in Furmark, but do you know the memory is possibly hitting 100C+? AMD gives you more info: Memory, Shader and Disp I/O sensors.
Two days ago I ran a neck-and-neck comparison between a decked-out X7200 (990X + 2x485M) and an M18x (2630QM + 2x6970M). Both on stock drivers, no tweaking, no tinkering. See for yourself.
-
chewietobbacca Notebook Evangelist
No
Kepler isn't until 2012
The 560M is just a higher-clocked (14-15% higher) GTX 460M with some core improvements. I'd estimate 10-15% faster overall, since clocks don't scale linearly, unfortunately. -
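That back-of-the-envelope estimate is easy to sketch. The 14-15% clock figure comes from the post above; the scaling-efficiency factor below is purely an illustrative assumption about how much of a clock bump shows up as real-world frames:

```python
# Rough estimate of GTX 560M uplift over GTX 460M from the clock bump alone.
# The 14-15% clock increase is from the post; the efficiency factor (what
# fraction of a clock increase turns into actual fps) is a hypothetical
# assumption, since clocks don't scale linearly.
def estimated_uplift(clock_increase, efficiency=0.75):
    """Return the estimated performance gain as a fraction."""
    return clock_increase * efficiency

for clocks in (0.14, 0.15):
    print(f"{clocks:.0%} higher clocks -> ~{estimated_uplift(clocks):.1%} faster")
```

With an assumed 75% efficiency, the 14-15% clock increase lands in the 10-11% range, consistent with the 10-15% estimate above.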
-
http://forum.notebookreview.com/gam...28nm-mobilty-hd7000-series-coming-2012-a.html -
SaosinEngaged Notebook Evangelist
This debate is purely subjective. There is no "Nvidia is clearly better" or "AMD is clearly better."
A lot of it comes down to personal history with either product and which brand the individual is simply more comfortable with.
For me, that's always been Nvidia. I do LOVE the 5870's in my R2, and I love the 6970 in my R3. BUT, at the end of the day, if the m18x ever gets released with a top end Nvidia option, I wouldn't think twice about the price premium.
Maybe it's because of nostalgia, maybe it's because I feel more comfortable with Nvidia cards in my system, maybe it's even because I subconsciously think paying more means you're getting more, IDK.
All I know is I would jump on a top end Nvidia configuration, even though the 6970's are monster cards; great performing, excellent PPP, and good driver support.
It's one of the reasons I was excited to pick up an m14x; I'm happy to own another Nvidia based system. -
-
I know the slide you're talking about, and if you think Southern Islands is only a revised HD6000 then you should consider Kepler even less of a change from Fermi. Kepler's die shrink of the solid Fermi arch to 28nm HKMG will by itself account for most of the increase in the DP GFLOPS/watt that slide markets... not that double-precision floating point performance means much at all to gamers. -
chewietobbacca Notebook Evangelist
Nvidia and AMD both don't manufacture their own GPUs. They use TSMC to manufacture them, and if TSMC doesn't have 28nm ready, new cards won't come out. It's that simple -
fermi released in march, which is not that deep into the year :/....
the only info on kepler is better efficiency, but it's supposed to be a new architecture according to "talk". i dunno where you're getting your info from, but there's no way kepler is just a die shrink of fermi. nvidia has never done that in all their years.
the current info on the HD7000 that i read is that it IS a die shrink and not a new architecture. what sources are you reading your info from?
http://www.bit-tech.net/news/hardware/2011/04/15/radeon-hd-7000-due-june-or-july/
this is pretty recent, and by the sounds of it the hd7000 series is just the 69xx on steroids. -
-
yea but the 8800s came out in november 2006. gtx 2xx came out in mid 2008, which is about the right time frame to release a new architecture. nvidia sold the 9800gtx/9800gt/8800gts as the mid level since the gtx 260/280 were the ultra-high end. that's excluding the mobile brands.
the article states it's based on the HD6900 series, but you were saying kepler is less of a change than what HD7000 is to HD6000. well, there's no news that kepler isn't a completely new architecture. it's not supposed to have anything to do with fermi. so why did you say it's a die shrink of the GF114?
it's almost confirmed that hd7000 is just hd6900 on steroids, which isn't a new architecture. it's just a revised 6900 with double the shaders.
TL;DR: what i'm saying is GTX 6xx = something brand new, HD7000 = 6970 on steroids. -
holy crap, the fps on the 560 sample sheet is very impressive for a single card .... and in SLI ... ? this should be very, very nice
NVIDIA GeForce GTX 560M brag sheets - Engadget Galleries -
-
So at least one portion of the prediction came true: 560M in June for the M18x.
580M = 2H of the year, so 50% of the prediction failed; 2H means they just don't know when in the year it will be coming, which usually means "late".
560M SLI is nice, but not enough for me to get an M18x yet. It will be interesting to see how it performs, but I suspect a single 6970M or 485M will beat 560M SLI. -
No they aren't as good as 6970 x-fire
-
i don't think a single 6970 is going to beat it .... (sli 560 that is)
-
I didn't realize the 6970m's could get such high FPS. :O
-
As an FYI: I hit 69 fps (xfire) vs 37, and even if you give Nvidia 100% SLI scaling, you can see you get simply destroyed.
Sorry, but the 560s are glorified 460s. Not an option for someone who is vendor-agnostic on GPUs. I don't care what it is or who it's from as long as it performs. I've had good luck with ATI and Nvidia, and bad luck with both as well, so really performance is all that matters to me... oh, and bang for your buck. The Nvidia next-gen solution is rumored to be $800 where the xfire is $400, and no matter what Nvidia pulls out of its hat, there's no way the 580m is going to double the 6970m. -
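The price/performance point at the end of that post can be made concrete with a little arithmetic. The $800 and $400 figures are the rumored prices from the post itself; the 20% uplift is an optimistic illustrative assumption, not a confirmed spec:

```python
# Rumored prices from the post above (treat both as rumors, not facts).
nvidia_price = 800   # rumored next-gen Nvidia solution
xfire_price = 400    # 6970M CrossFire

price_ratio = nvidia_price / xfire_price   # 2.0x the price

# To match CrossFire on pure price/performance, the Nvidia option would need
# the same 2x ratio in frames, i.e. a 100% performance uplift.
required_uplift = price_ratio - 1

# A hypothetical, optimistic 20% uplift falls far short of doubling.
assumed_uplift = 0.20

print(f"needed: +{required_uplift:.0%}, assumed: +{assumed_uplift:.0%}")
```

Under these rumored numbers, the Nvidia option would have to double the 6970M's performance just to break even on value, which is the poster's point.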
katalin_2003 NBR Spectre Super Moderator
You probably meant Dell. -
-
An Nvidia 560m is hardly an upgrade; it is marginally better than a 460m, and only because the clocks are faster. In fact, when it comes down to it, the 560m in my opinion is just a slightly overclocked 460m, and the 580m isn't much better. Dell needs to look long and hard at what the competition is offering and get their heads out of their a$$es.
-
The m17x just got the 580m (2GB) as an option, and it costs $350 more than the 6970m (2GB). Does this mean that when the m18x gets it, it will be $700 more for these in SLI? I thought the 6970 was supposed to be faster. If not, what gives with the markup?
-
The 580M is supposed to be 15-20% faster than the 6970M. Plus, Nvidia always charges more for their cards, especially when they hold the performance crown.
Anyway, I hope AMD will answer with the 6990M soon, for the same/similar price as the 6970M, and minimize the performance gap. -
As I expected, AW goes 580m without any delay. This pretty much solves the whole '485m not available for AW laptops' mystery.
-
My bad, was confusing it with the 560m
-
NVIDIA announces GeForce GTX 580M and 570M, availability in the Alienware M18x and MSI GT780R -- Engadget
This is great news for us AW enthusiasts.
P.S. If you haven't realized already, the M17x has already gotten the 580M and may get the 570M. -
OMG !!!! I just fricken ordered my m18x last week ... grrrrrr
-
cookinwitdiesel Retired Bencher
The GTX 580m is nothing but a very slightly higher clocked GTX 485m, people...
The memory clocks are identical, the core logic and size are identical, and the shader clock is only 90 MHz higher... meaning the core will be like 35-40 MHz higher. This is hardly groundbreaking.
It should beat the 6970m by about 5%, if that. That is my prediction...
The GTX 580m should overclock well though, if the transistor design reflects the same changes made in the desktop 500-series chips (heat permitting, of course).
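The ~5% prediction above is easy to reproduce. The 90 MHz shader delta is from the post; the 485M's 1150 MHz base shader clock is a widely quoted spec but should be treated as an assumption here, as should the efficiency factor:

```python
# Figures: the 580M's shader clock is said to be 90 MHz above the 485M's.
# The 485M base shader clock (1150 MHz) is an assumed spec for illustration.
shader_485m = 1150
shader_delta = 90

clock_gain = shader_delta / shader_485m  # ~7.8% higher shader clock

# Clocks rarely translate 1:1 into frames; with an illustrative efficiency
# factor of ~0.6, the predicted uplift lands right around the 5% above.
print(f"clock gain ~{clock_gain:.1%}, estimated uplift ~{clock_gain * 0.6:.1%}")
```

So even on generous assumptions, a clock bump this small only buys a single-digit percentage gain, which is why the post calls it hardly groundbreaking.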
m18x to get GTX 560m and 'other exciting GPU options' in June
Discussion in 'Alienware 18 and M18x' started by Shaden, May 14, 2011.