The 485m has been out for a while and Alienware isn't offering it in the M17xR3, so I don't see why it would magically be offered in the M18x.
As far as pricing, I had read an estimate of pricing starting at $2,000-2,100. I'm going to assume this is with a low-end Sandy Bridge CPU, sub-par RAM (4GB, maybe 6GB) and a 460m or 5850m. Then you're going to pay the prerequisite couple hundred bucks to upgrade to a 6970 and a more adequate CPU. So $2,500 non-SLI. In comparison, a 1080p M17xR3 with the 2630 (2.2GHz) i7, 4GB RAM, whatever HD, and a 6970 is $1,899.
Basically you're paying an extra half a grand (presumably) for the capability and infrastructure to allow SLI then or in the future, and I'm hoping 1080p standard (on an 18.4" screen anything less than 1080p standard is just weird).
Given what you could pay for an M17xR2 before it was taken down a couple days/weeks ago, that sounds about right. I think an SLI 5870 configuration of the R2 was about $2,500, maybe higher.
-
-
Jubei Kibagami Notebook Consultant
@ Matt, What is an SLI 5870? You mean CrossFire 5870....lol
-
I've seen nothing that says you'll be ABLE to buy an M18x with a single GPU. I think that's a leap of faith that may fall flat. I believe the answer for those desiring a single GPU will be the R3.
-
-
The only review available initially ran a single 460m because there was a problem with the SLI bridge; once that was fixed, the review was updated with 460m SLI benchmarks.
And second, there will definitely be a single GPU option, the 460m for sure, and most probably the 6970 as well. -
But it would be amazingly stupid to rule out the possibility of a single card... I mean... ?
but then I would just buy it and sell one of the cards by myself lmao. -
chewietobbacca Notebook Evangelist
-
The R3 is hailed as the only 3D-capable Alienware laptop; that's its selling point, and the fact that the M18x offers single GPU options doesn't mean it's gonna compete with the R3, different target markets.
It would be stupid if Dell didn't offer single GPUs in the M18x. Some people want the bigger screen and the better build quality but can't afford dual GPU configurations, so giving them the option of a single card configuration is obvious. -
SillyHoney Headphone Enthusiast
I've been through tons of Nvidia cards over the years before landing in the red camp for the first time with the XFX 5870M. And now I'm not missing the green camp one bit.
-
Because the R3 gives the single card option, all the way up to a 6970, do not be at all shocked if only dual GPUs are offered in the M18x. They may offer a "cost effective" level like dual 460's and maybe dual 6950's, but I bet even the lowest configuration runs two GPUs.
This is purely guessing, but it's the way things look to be shaping up. The only thing fueling our concept of a single GPU option is that that's how the R2 came, but with the advent of the R3/18x market split, it seems practically obvious that they would keep the 18x dual card only. And in fact that is exactly how the literature describes it. Now the literature could be wrong, and I could need some salt when I eat my hat, but that's how I see things panning out. -
Guys, why are you debating whether a single GPU option is gonna be offered or not? It DEFINITELY will be, accept it.
I know you're thinking, "but wait, how can they offer a single card option while the R3 also offers that?" But remember that adding an extra GPU is, as I said, an EXTRA; it's not a basic solution. Many people don't want or need it, they just want a bigger screen and better build quality with better speakers.
And at a starting price of $1999, don't even dream of getting dual 460m cards. Dell aren't stupid, trust me -
-
Dell is promoting ATI brands in my mind, as ATI cards definitely have the better price/performance ratio. Thus, I'd like to choose 6970 CF... you know the 485m SLI is just tooooooooo expensive, and the 460m is pretty weak.
-
-
Are there single GPU options on any of the current config sites?
-
cookinwitdiesel Retired Bencher
There is no config site for the M18x that I know of.
But the M17x R1 and R2 both had single GPU options. It would be just downright stupid not to offer it. -
I fail to see how any of this relates to 485m vs 6970m SLI/CF performance, by the way.
How about enough speculation, and we wait and see in a few weeks? -
Think we are all hoping the waiting doesn't take too long
-
Can you guys tell if the golden colored cable in the picture below is the SLI/Crossfire bridge?
http://www.hardwareheaven.com/reviewimages/alienware-m18x/alienware-m18x_inside2.jpg
EDIT: If it is (it is labeled as a VGA connector cable in the service manual), is anybody else besides me concerned about the short term/long term effects of it sitting directly on top of two heatsinks/heatpipes? -
cookinwitdiesel Retired Bencher
That is the SLI bridge and the heatsinks are not a problem. The M17 and M17x both had the cable going over heatsinks and never had a single problem due to it
-
To me it looks like in the M17x chassis the SLI/Crossfire cable barely (like barely barely maybe) touches the CPU heatpipe on the right side just before it connects to the right MXM module.
-
cookinwitdiesel Retired Bencher
It is resting on the heatpipes for the CPU heatsink
just left of the piece of tape
-
In the M18x it looks like it is sitting directly on top of the GPU chips on both sides....like directly on top of them.
I don't know, you might be right about it not being a problem. I am an electrical engineer and I am not really sure about it; I am still concerned. Perhaps some fellow engineers on the forum might want to take a look at this too and see what they have to say. -
Yea that might be worse than over the heatsinks as the heatpipes get much hotter, no?
-
-
cookinwitdiesel Retired Bencher
Higher temperature raises a conductor's electrical resistance and can introduce instability at higher signaling frequencies, since the electrons scatter more off the vibrating metal lattice as they travel through the conductor, which slows them down.
The amount of heat given off by the heatpipes, however, is VERY little due to lack of surface area - air is a HORRIBLE conductor of heat/energy. That is why the fins are needed at the end, to transfer that heat to the air that passes over them. Not only that, but the signaling speed of the SLI link is already established to be stable at normal operating temps and then some (a safety factor when designing a solution - design it to work in worse than ideal conditions).
As I mentioned, the M17 and M9750 both had designs that draped the CF/SLI cables across much hotter scenarios than this; I would not worry about it for one second. They are pretty good at designing these laptops, after all. The Clevo dual GPU solution would suffer much more than the M18x IF a flaw with such a design existed, as its SLI cable runs right across the top of the GTX 485m GPUs.
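The "very little heat through the air" point can be sanity-checked with a back-of-the-envelope natural-convection estimate. All numbers below (heat transfer coefficient, heatpipe size, temperatures) are illustrative assumptions, not measurements from the M18x:

```python
import math

# Newton's law of cooling: Q = h * A * dT (rough sketch with assumed values)
h = 10.0             # W/(m^2*K), upper end for still-air natural convection
d = 0.006            # m, assumed heatpipe diameter (~6 mm)
length = 0.10        # m, assumed stretch of pipe under the SLI cable
A = math.pi * d * length   # exposed cylindrical surface area
dT = 80.0 - 40.0     # K, assumed pipe surface vs. air inside the chassis

Q = h * A * dT       # heat shed to the air over that stretch
print(f"~{Q:.2f} W leaves this section of heatpipe via the air")
# Compared with the ~100 W a GPU dumps into its cooling system, well
# under 1% of it reaches the cable through the air gap - which is the
# poster's point about air being a poor heat carrier.
```

Even doubling every assumed value leaves the figure in the low single watts, spread along the whole cable.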
-
I also lost you on the surface area part. If the surface area is large then the heat will be more distributed and not as high at any given point on the surface. If it's a single point with a small surface area then it will be pretty hot... I mean, the heatpipes are designed to transfer most of the heat off the GPU chip to the heatsink... so I will be astonished if the end of the heatpipe is hot, the start of the heatpipe is hot... and the middle sheds "VERY" little.
I know that there are ribbon cables that can withstand high temperatures (the Teflon kind), but this one looks a little bit too thin for that (I might be wrong). I'll have to see its part number.
Further, how do we know that the SLI link is stable sitting directly on top of the heatpipes for this particular system? How has that been established? If you know the specs then we can look at them.
http://media.bestofmicro.com/R/U/265386/original/malibal-nine_x7200-sli.jpg
I am not really sure "they" are pretty good at designing these laptops. It's not the first time "these" laptops shipped with issues. I am glad you are not worried, and I look forward to your review and first impressions of the system once you get it -
cookinwitdiesel Retired Bencher
While it is hot to the touch, that is exactly it: to the touch. As in a semi-liquid/semi-solid body (a human finger) touching it. The air next to the heatpipe absorbs very little of the heat, as air itself is a poor medium for transferring that energy. Energy (in many different forms) travels MUCH more efficiently through liquid and solid bodies than through air/gaseous bodies. Due to this, even though the heatpipe may get hot, that heat is not being transferred to the SLI cable. I cannot speak to the characteristics of the insulator they used but have confidence in its abilities. The fact that my M17x R1 is still alive and kicking is a testament to this.
And the picture attached shows how the SLI/CF cable will be situated, the blue one that crosses the top of the notebook.
And for the Clevo pic you linked, that orange one is the SLI cable. Note that it passes right across the backside of the PCB, which I would expect to be as hot as any heatpipe, as it is the opposite side of the GPU.
-
-
For the Clevo: it's right in between the two GPU cards, close to the PCB, yes. I don't know how far below the GPU PCBs it extends. I guess if it extends all the way to below the GPU it will be relatively hot. I have no idea how hot the bottom of a GPU gets. Maybe I should find out, might be useful.
I do understand, however, the info you provided that other manufacturers place this cable close to heat sources with no issues. It might not have any effects. Personally I would have avoided the M18x SLI cable placement. I have no objection to the M17x's; it doesn't look as bad. Thank God I am not designing laptops.
Let's see what happens when the final product arrives. Hopefully all will be good. I really need to get a new PC and I have had my eyes on this for a while. -
cookinwitdiesel Retired Bencher
I have never measured the actual heatpipe temps, just have the die temps as reported in software. Maybe I should look into an IR thermometer, quite useful for these sorts of discussions haha
The backside of a PCB with a 100W GPU cooking away on the front side will get PLENTY hot. This I can say with certainty from experience gained with my R1 (and desktop cards as well).
We also need to keep this all in perspective. The majority of the time a CPU will not exceed 80C if the cooler is applied well. That is the hottest point in the system, in the CPU die itself. The heatplate on the CPU will absorb most (but not all) of that heat and transfer it via heatpipe, suffering yet another inefficiency in energy transfer. Keeping all this in mind and assuming a worst case (meaning 100% efficient transfer), we are still only looking at somewhere around 80C being exposed to the SLI cable (and I am sure we can agree that is an easily tolerable worst case for cable insulation). I mention the CPU because those are the heatpipes that the cable actually rests on while running between the GPUs.
I do not want to come across as argumentative on this, but let me say that I am 110% confident that there will be nothing to worry about. I do not know your experience with laptops, but so far in mine I have seen these or similar configurations and not had anything to worry about. I am confident that traces in PCBs get considerably hotter than anything this SLI cable will ever be exposed to while being smaller, and they still survive just fine. -
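That worst-case budget can be written out as simple arithmetic. The 80C figure comes from the post above; the insulation ratings are typical continuous-use values for common flat-cable materials (around 105C for PVC, roughly 200C for polyimide), used here as assumptions rather than the M18x cable's actual spec:

```python
# Worst-case thermal margin for the SLI cable insulation (illustrative sketch).
cable_exposure_c = 80.0  # assume the cable sees the full CPU die temperature

# Typical continuous-use ratings for flat/ribbon cable insulation (assumed):
insulation_ratings_c = {"PVC": 105.0, "polyimide": 200.0}

for material, rating in insulation_ratings_c.items():
    margin = rating - cable_exposure_c
    print(f"{material}: {margin:.0f} C of headroom even in this worst case")
```

Even the cheapest common insulation keeps 25C of headroom against a worst case that already assumed lossless heat transfer, which is the "safety factor" argument in numbers.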
-
cookinwitdiesel Retired Bencher
Of course, it is a purely academic exercise, as neither of us is going to solve the world's problems in regards to laptop design haha
-
Just put tape underneath the cable that crosses over the heatsink and problem solved.
-
Almost 2 years in and the CFX bridge is far from a concern. I'd be more concerned about their south bridge (now that the north bridge is in the CPU, it's pretty much the only place they can screw up) or the fact that this will be their first 330W system and might run into issues -
Ok guys, I tried to do a little research on dual GPUs and SLI and CrossFire and couldn't understand it all (since I have never used dual GPUs, no experience either).
One problem that comes up a lot on Google is people saying that this game is not supported, or xyz driver is not there, or create this or that profile, and what not.
So I was wondering if anyone who has used dual GPUs or knows about these things could tell me: suppose I have some games such as NFS Most Wanted or NFS Shift, could I actually see the benefits of dual GPUs, or is there some benefit only in certain games?
Like for example, take NFS MW. It is quite an old game and probably doesn't have drivers and profiles or blah blah blah or dual GPU support. Would I still get smoother gaming and higher fps?
Basically what I am asking is: if I buy dual GPUs in the M18x, will I see benefits in all games without having to constantly worry about updating this or that driver, making so and so profiles?
If not, what are the problems faced with dual GPUs? -
If for some reason you ever have a problem with a game and need to update drivers, it sounds much more daunting than it is; as long as you follow the steps, you'll be fine, and both Nvidia and ATI have programs that help you with that.
As for performance, yeah it's noticeable, but it's quickly getting to the point that folks are talking about 1080p screens at 60 frames per second and max resolution. It's become a game to rake every single pixel over the coals in pursuit of the max settings possible without bursting into flames. If you just play a game on high settings, you'll be fine and likely won't notice the difference. It's once you're trying to make out the color of the eyes of the racer 3 cars behind you in the reflection of the bumper ahead of you that it'll start to make a difference.
Slight exaggeration, but that's the general spirit.
As for straight compatibility, as I said, I don't know of any games that straight up won't work, though there might be some. Someone else would have to answer that, and potentially the whys of it.
Short version, you will certainly see the benefits of performance with graphics turned up higher, and the problems have been in my experience few and far between (non-existent), unless you go tweaking and tinkering and trying to milk every last drop out of them. -
If a game doesn't work on a multi-GPU system, either you are the issue or the game was coded wrong.
And if the game doesn't run on multi-GPU, disable CFX or SLI; the game shouldn't require the power of two anyway -
I sort of like the old M17 SLI cable arrangement a little better than the current M18x one. At least the SLI cable in the M17 was in a sleeve and sat on top of the metal plates which house the heatpipes on both GPU ends. It also looks like the SLI sleeve has some sort of standoffs that give it some distance from the heatpipe enclosure.
5150Joker :
Duct-taping (ok, properly taping it) might indeed improve things (if there are things to improve, we shall see), but I don't think that's the point.
-
cookinwitdiesel Retired Bencher
I loved my M17 when I had one, just not the GPU or CPU cooling: too loud, too little.
-
If you look at my desktop config at home, is it worth waiting for the M18x? I mostly use the laptop for WoW, L4D2, and Portal 2. I feel like it might be a little overkill for what I need... Considering my desktop is an absolute beast of a computer, I'm not sure I need a beast CFX laptop.
-
cookinwitdiesel Retired Bencher
Having both your desktop and an M18x would not make sense; one or the other would be reasonable. Your desktop is already very strong though, much more so than the M18x for graphics.
-
Get an M14x or M17x-R3. The M18x would be overkill for your needs. -
-
OT: How did you get dell to send you an R3 to try out? -
cookinwitdiesel Retired Bencher
I would like to do a trial on an M18x in that case.....lol
-
I have an amazing Dell rep that helped me out, and considering the amount of business our company does with Dell, they honored it.
Do you guys need a rep for a nice discount? I got someone that will help you out BIG time. -
Seeing as almost every PC game is capped by GPU power at the moment, how long do you think Dell will support the M18x R1? Obviously, knowing Alienware, there will be an M18x R2 that will most likely support Z68 or Ivy Bridge, but even then I won't expect the new chipset until summer 2012. The new ATI 7000m series is roadmapped as being available Q1 2012. Do you think that Alienware will support the next gen of GPUs with a BIOS update in the M18x R1, or will they shaft us?
-
It's hard to tell; they usually support each new platform with at least one GPU upgrade during its lifetime. The M17x-R2 had the 4870M and then later the 5870M. If Dell keeps up that trend, we should hopefully see the 580 series and maybe the 7000 series for the M18x-R1. Maybe even Ivy Bridge, since it uses the same socket. -
Yeah, I was wondering if HM67 would be able to accommodate Ivy Bridge. I think that will be a deciding factor for a lot of people due to Sandy Bridge CPUs' inability to be overclocked, aside from the 2920xm. This seems like the first time that Alienware has made a nice upgrade path that allows the M18x R1 to last for quite a long time in the computer world (excluding ThrottleStop and the 920xm).
-
cookinwitdiesel Retired Bencher
For dual-card junkies: GeForce GTX 485M SLI vs. Radeon HD 6970M CF
Discussion in 'Alienware 18 and M18x' started by zAzq, Apr 21, 2011.