The Notebook Review forums were hosted by TechTarget, which shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information posted on the forums is preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.
Problems? See this thread at archive.org.

    For dual-card junkies: GeForce GTX 485M SLI vs. Radeon HD 6970M CF

    Discussion in 'Alienware 18 and M18x' started by zAzq, Apr 21, 2011.

  1. Matt Woller

    Matt Woller Notebook Evangelist

    Reputations:
    13
    Messages:
    359
    Likes Received:
    0
    Trophy Points:
    30
    The 485m has been out for a while and Alienware isn't offering it in the M17xR3, so I don't see why it would magically be offered in the M18x.

    As far as pricing, I had read an estimate of pricing starting at $2,000-2,100. I'm going to assume this is with a low-end Sandy Bridge CPU, sub-par RAM (4GB, maybe 6GB) and a 460m or 5850m. Then you're going to pay the prerequisite couple hundred bucks to upgrade to a 6970 and a more adequate CPU. So $2,500 non-SLI. In comparison, a 1080p M17xR3 with the 2630 (2.2GHz) i7, 4GB RAM, whatever HD, and 6970 is $1,899.

    Basically you're paying an extra half a grand (presumably) for the capability and infrastructure to allow SLI then or in the future, and I'm hoping 1080p standard (on an 18.4" screen anything less than 1080p standard is just weird).

    Given what you could pay for an M17xR2 before it was taken down a couple days/weeks ago, that sounds about right. I think an SLI 5870 configuration of the R2 was about $2,500, maybe higher.
     
  2. Jubei Kibagami

    Jubei Kibagami Notebook Consultant

    Reputations:
    13
    Messages:
    282
    Likes Received:
    4
    Trophy Points:
    31
    @Matt, what is an SLI 5870? You mean Crossfire 5870....lol
     
  3. FXi

    FXi Notebook Deity

    Reputations:
    345
    Messages:
    1,054
    Likes Received:
    130
    Trophy Points:
    81
    I've seen nothing that says you'll be ABLE to buy an M18x with a single GPU. I think that's a leap of faith that may fall flat. I believe that the answer for those desiring a single GPU will be the R3.
     
  4. Zero989

    Zero989 Notebook Virtuoso

    Reputations:
    910
    Messages:
    2,835
    Likes Received:
    583
    Trophy Points:
    131
    lol the only review available is with a single gpu
     
  5. Cerberus

    Cerberus Notebook Evangelist

    Reputations:
    212
    Messages:
    364
    Likes Received:
    68
    Trophy Points:
    41
    Okay both of you are wrong.
    The only review available was at first with a single 460m because there was a problem with the SLI bridge, but that was fixed and the review was updated with the 460m SLI benchmarks.
    And second, there will definitely be a single GPU option, the 460m for sure, and most probably the 6970 as well.
     
  6. M18x

    M18x Notebook Guru

    Reputations:
    9
    Messages:
    59
    Likes Received:
    0
    Trophy Points:
    15
    But it would be amazingly stupid to deny the possibility of a single card... I mean... ?

    But then I would just buy it and sell one of the cards myself lmao.
     
  7. chewietobbacca

    chewietobbacca Notebook Evangelist

    Reputations:
    515
    Messages:
    459
    Likes Received:
    1
    Trophy Points:
    31
    Again, why offer a single GPU option for the high end notebook when the m17xR3 comes in single GPU format already?
     
  8. Cerberus

    Cerberus Notebook Evangelist

    Reputations:
    212
    Messages:
    364
    Likes Received:
    68
    Trophy Points:
    41
    The same way the M17X-R2 had single GPU options.
    The R3 is hailed as the only 3D-capable Alienware laptop, that's its selling point, and the fact that the M18X offers single GPU options doesn't mean it's gonna compete with the R3; different target markets.
    It would be stupid if Dell didn't offer single GPUs in the M18X. Some people want the bigger screen and the better build quality but can't afford dual GPU configurations, so giving them the option of a single card configuration is obvious.
     
  9. SillyHoney

    SillyHoney Headphone Enthusiast

    Reputations:
    543
    Messages:
    1,202
    Likes Received:
    1
    Trophy Points:
    55
    I've been through tons of Nvidia cards for years before landing in the red camp for the first time with the XFX 5870M. And now I'm not missing the green camp one bit :D
     
  10. FXi

    FXi Notebook Deity

    Reputations:
    345
    Messages:
    1,054
    Likes Received:
    130
    Trophy Points:
    81
    Because the R3 gives the single card option, all the way up to a 6970, do not be at all shocked if only dual GPUs are offered in the M18x. They may offer a "cost effective" level like dual 460's and maybe dual 6950's, but I bet even the lowest configuration runs two GPUs.

    This is pure guesswork, but it's the way things look to be shaping up, and the only thing fueling our notion of a single GPU option is that that is how the R2 came. With the advent of the R3/18x market split, it seems practically obvious that they would keep the 18x dual card only. And in fact that is exactly how the literature describes it. Now the literature could be wrong, and I could need some salt when I eat my hat, but that's how I see things panning out.
     
  11. Cerberus

    Cerberus Notebook Evangelist

    Reputations:
    212
    Messages:
    364
    Likes Received:
    68
    Trophy Points:
    41
    Guys, why are you debating whether a single GPU option is gonna be offered or not? It DEFINITELY will be, accept it :)
    I know you're thinking: but wait, how can they offer a single card option while the R3 also offers that? But remember that adding an extra GPU is, as I said, an EXTRA, not a basic solution; many people don't want nor need it, they just want a bigger screen and a better build quality with better speakers.
    And at a starting price of $1999, don't even dream of getting dual 460m cards. Dell aren't stupid, trust me ;)
     
  12. Matt Woller

    Matt Woller Notebook Evangelist

    Reputations:
    13
    Messages:
    359
    Likes Received:
    0
    Trophy Points:
    30
    Oh darn, used the wrong term to mean two video cards running synchronously. D:
     
  13. qianlicc

    qianlicc Notebook Enthusiast

    Reputations:
    0
    Messages:
    15
    Likes Received:
    0
    Trophy Points:
    5
    Dell is promoting the ATI brand, in my mind, as ATI cards definitely have the better price/performance ratio. Thus, I'd like to choose 6970 CF... you know the 485 SLI is just tooooooooo expensive, and the 460 is pretty weak.
     
  14. JCrichton

    JCrichton Notebook Evangelist

    Reputations:
    152
    Messages:
    530
    Likes Received:
    0
    Trophy Points:
    30
    Absolutely. There is no chance that a single GPU configuration won't be offered.
     
  15. FXi

    FXi Notebook Deity

    Reputations:
    345
    Messages:
    1,054
    Likes Received:
    130
    Trophy Points:
    81
    Are there single GPU options on any of the current config sites?
     
  16. cookinwitdiesel

    cookinwitdiesel Retired Bencher

    Reputations:
    4,365
    Messages:
    11,264
    Likes Received:
    263
    Trophy Points:
    501
    There is no config site for the M18x that I know of

    But the M17x R1 and R2 both had single GPU options. It would be just downright stupid to not offer it.
     
  17. Matt Woller

    Matt Woller Notebook Evangelist

    Reputations:
    13
    Messages:
    359
    Likes Received:
    0
    Trophy Points:
    30
    No, but the M17xR2 is also no longer offered.

    I fail to see how any of this relates to 485m vs 6970m SLI/CF performance, by the way.

    How about enough speculation, and we wait and see in a few weeks?
     
  18. FXi

    FXi Notebook Deity

    Reputations:
    345
    Messages:
    1,054
    Likes Received:
    130
    Trophy Points:
    81
    Think we are all hoping the waiting doesn't take too long :)
     
  19. vzachari

    vzachari Notebook Evangelist

    Reputations:
    468
    Messages:
    334
    Likes Received:
    8
    Trophy Points:
    31
  20. cookinwitdiesel

    cookinwitdiesel Retired Bencher

    Reputations:
    4,365
    Messages:
    11,264
    Likes Received:
    263
    Trophy Points:
    501
    That is the SLI bridge, and the heatsinks are not a problem. The M17 and M17x both had the cable going over heatsinks and never had a single problem due to it ;)
     
  21. vzachari

    vzachari Notebook Evangelist

    Reputations:
    468
    Messages:
    334
    Likes Received:
    8
    Trophy Points:
    31
    Where? Is the image below the M17x SLI/Crossfire?...am I going blind :)

    To me it looks like in the M17x chassis the SLI/Crossfire cable barely (like barely, barely, maybe) touches the CPU heatpipe on the right side just before it connects to the right MXM module.

    [image: M17x chassis interior showing the SLI/Crossfire cable routing]
     
  22. cookinwitdiesel

    cookinwitdiesel Retired Bencher

    Reputations:
    4,365
    Messages:
    11,264
    Likes Received:
    263
    Trophy Points:
    501
    It is resting on the heatpipes for the CPU heatsink ;) just left of the piece of tape
     
  23. vzachari

    vzachari Notebook Evangelist

    Reputations:
    468
    Messages:
    334
    Likes Received:
    8
    Trophy Points:
    31
    Yes, that's what I was saying above (all the barelys), but it's just barely, and actually it's not physically touching the heatpipes; the CPU cover is in between.

    In the M18x it looks like it is sitting directly on top of the GPU chips on both sides....like directly on top of them.

    I don't know, you might be right about it not being a problem. I am an electrical engineer and I am not really sure about it; I am still concerned. Perhaps some fellow engineers in the forum might want to take a look at this too and see what they have to say.
     
  24. littleone562

    littleone562 Notebook Deity

    Reputations:
    1,417
    Messages:
    993
    Likes Received:
    59
    Trophy Points:
    66
    Yea, that might be worse than over the heatsinks, as the heatpipes get much hotter, no?
     
  25. vzachari

    vzachari Notebook Evangelist

    Reputations:
    468
    Messages:
    334
    Likes Received:
    8
    Trophy Points:
    31
    I think heatpipes are involved in both the M18x and M17x SLI cable situations. It's just that, the way I see it (as in my opinion), it's not as bad in the M17x.
     
  26. cookinwitdiesel

    cookinwitdiesel Retired Bencher

    Reputations:
    4,365
    Messages:
    11,264
    Likes Received:
    263
    Trophy Points:
    501
    I too am an electrical engineer, here are some physics to chew on:

    Higher heat raises electrical resistance and introduces instability at higher signaling frequencies, since the electrons will collide with more nuclei while moving through the conductor, which slows them down.

    The amount of heat transmitted off the heatpipes, however, is VERY little due to lack of surface area - air is a HORRIBLE conductor of heat/energy. That is why the fins are needed at the end, to transfer that heat to the air that passes over them. Not only that, but the signaling speed of the SLI link is already established to be stable at normal operating temps and then some (a safety factor when designing a solution - design it to work in worse than ideal conditions).

    As I mentioned, the M17 and M9750 both had designs that draped the CF/SLI cables across much hotter scenarios than this; I would not worry about it for one second. They are pretty good at designing these laptops after all ;). The Clevo dual GPU solution would suffer much more than the M18x IF a flaw with such a design existed, as its SLI cable runs right across the top of the GTX 485m GPUs.
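    A quick sanity check of the "air transfers very little heat" argument, using Newton's law of cooling. Every number below is an illustrative assumption (the convection coefficient, contact area, and temperatures are guesses, not measurements from the M18x):

```python
# Back-of-the-envelope estimate of heat reaching the SLI cable through the
# thin air gap near a heatpipe, via Newton's law of cooling: Q = h * A * dT.
# All numbers are assumptions, not measurements from the M18x.

h = 10.0             # W/(m^2*K): upper end of free convection in still air
area = 0.02 * 0.005  # m^2: ~2 cm x 0.5 cm patch of cable facing the pipe
delta_t = 50.0       # K: assume an ~80 C pipe region vs a ~30 C cable

q_watts = h * area * delta_t
print(f"heat absorbed across the air gap: {q_watts:.3f} W")
```

    Even with generous assumptions this comes out to a small fraction of a watt, which is consistent with the claim that the cable picks up little heat through air alone; direct solid contact with the pipe is the case that would matter.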
     
  27. vzachari

    vzachari Notebook Evangelist

    Reputations:
    468
    Messages:
    334
    Likes Received:
    8
    Trophy Points:
    31
    This is one of my concerns; also, the properties of the cable could permanently change over time due to long-term exposure to high temps. For example, if it's Teflon-coated (used for high-heat ribbon cables), the Teflon will slowly degrade.

    What is "VERY little"? Is "VERY little" within the specs of the ribbon cable? The heat pipes have a specific temperature range, and it is actually relatively high at the high end (not boiling hot, but high enough for a "do not touch" label to be there).

    I also lost you on the surface area part. If the surface area is large, then the heat will be more distributed and not as high at any given point on the surface. If it's a single point with a small surface area, then it will be pretty hot... I mean, the heat pipes are designed to transfer most of the heat off the GPU chip to the heatsink... so I will be astonished if the end of the heat pipe is hot, the start of the heat pipe is hot... and the middle "VERY little".

    I know that there are ribbon cables that can withstand high temperatures (the Teflon kind), but this one looks a little bit too thin for that (I might be wrong). I'll have to see its part number.

    Further, how do we know that the SLI link is stable sitting directly on top of the heatpipes for this particular system? How has it been established? If you know the specs, then we can look at them.

    I don't know about all Clevos. The x7200 has the SLI cable between the two cards, below the heat pipes where it is not as hot. I am not certain, though, whether what I see in the picture below is the SLI cable (the orange cable between the cards).

    http://media.bestofmicro.com/R/U/265386/original/malibal-nine_x7200-sli.jpg

    I am not really sure "they" are pretty good at designing these laptops. It's not the first time "these" laptops shipped with issues. I am glad you are not worried, and I look forward to your review and first impressions of the system once you get it :)
     
  28. cookinwitdiesel

    cookinwitdiesel Retired Bencher

    Reputations:
    4,365
    Messages:
    11,264
    Likes Received:
    263
    Trophy Points:
    501
    While it is hot to the touch, that is exactly it: to the touch. As in a semi-liquid/semi-solid body (a human finger) touching it. The air next to the heatpipe is absorbing very little of the heat, as air itself is a poor medium for transferring that energy. Energy (in many different forms) travels MUCH more efficiently through liquid and solid bodies compared to air/gaseous bodies. Due to this, even though the heatpipe may get hot, that heat is not being transferred to the SLI cable. I cannot speak to the characteristics of the insulator they used but have confidence in its abilities. The fact that my M17x R1 is still alive and kicking is a testament to this.

    And the picture attached shows how the SLI/CF cable will be situated, the blue one that crosses the top of the notebook.

    And for the Clevo pic you linked, that orange one is the SLI cable. Note that it passes right across the backside of the PCB, which will be as hot as I would expect any heatpipe to be, as it is on the opposite side from the GPU.
     

  29. vzachari

    vzachari Notebook Evangelist

    Reputations:
    468
    Messages:
    334
    Likes Received:
    8
    Trophy Points:
    31
    Poor medium, good medium doesn't mean much here. A cable is touching a hot surface. What matters is how hot the surface is... which I don't think you or I know, so we should probably find out... and what the cable can take. I am not saying it is the end of the world, I am just worried. I will look into it more, as I am not convinced we have the right info yet.

    Ok, so because heat transfer is not as efficient through gaseous bodies as through solid or liquid bodies, the SLI cable is going to be OK? Maybe... maybe not. I don't know how to use this info. The cable is pretty much touching the heat pipes. I guess it is definitely better than if the SLI cable was sitting in a water bath together with the heatpipe.


    Really? I wouldn't vouch for this. Do you have soldering plates? Test it out on the lowest temp you can get.

    Picture is good, thank you. I saw it in the M18x technical manual. Looks like they have a different cable in this pic compared to the one in the hardwareheaven review. I can't tell.

    For the Clevo: it's right in between the two GPU cards, close to the PCB, yes. I don't know how far below the GPU PCBs it extends. I guess if it extends all the way to below the GPU, it will be relatively hot. I have no idea how hot the bottom part of a GPU is :) . Maybe I should find out; might be useful.

    I do take your point, however, from the info you provided, that other manufacturers have this cable close to heat sources with no issues. It might not have any effects. Personally, I would have avoided the M18x SLI cable placement. I have no objection to the M17x; it doesn't look as bad. Thank God I am not designing laptops.

    Let's see what happens when the final product arrives. Hopefully all will be good. I really need to get a new PC, and I had my eyes on this for a while.
     
  30. cookinwitdiesel

    cookinwitdiesel Retired Bencher

    Reputations:
    4,365
    Messages:
    11,264
    Likes Received:
    263
    Trophy Points:
    501
    I have never measured the actual heatpipe temps, just have the die temps as reported in software. Maybe I should look into an IR thermometer, quite useful for these sorts of discussions haha

    The backside of a PCB with a 100W GPU cooking away on the front side will get PLENTY hot. This I can say with certainty from experience gained with my R1 (and desktop cards as well).

    We also need to keep this all in perspective. The majority of the time, a CPU will not exceed 80C if the cooler is applied well. That is the hottest point in the system, in the CPU die itself. The heatplate on the CPU will absorb most (but not all) of that heat and transfer it via heatpipe, suffering yet another inefficiency in energy transfer. Keeping all this in mind and assuming a worst case (meaning 100% efficiencies), we are still only looking at somewhere around 80C temperatures being exposed to the SLI cable (and I am sure we can agree that is an easily tolerable worst case for cable insulation). I mention the CPU because those are the heatpipes that the cable actually rests on while going between GPUs.
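    To put an ~80C worst case next to some numbers: typical temperature ratings for common cable insulation materials leave a comfortable margin. These are generic datasheet figures, not the spec of the actual M18x cable (which nobody in the thread has seen):

```python
# Compare an assumed ~80 C worst-case heatpipe exposure against typical
# insulation temperature ratings. Ratings are generic datasheet values,
# not the actual M18x cable's spec.

worst_case_c = 80  # assumed worst case from the discussion above

ratings_c = {
    "PVC": 105,                 # common wire insulation
    "polyimide (Kapton)": 200,  # typical for flex/ribbon cables
    "PTFE (Teflon)": 260,       # high-temperature cables
}

for insulation, rating in ratings_c.items():
    print(f"{insulation}: rated {rating} C, margin {rating - worst_case_c} C")
```

    Even plain PVC would have ~25C of headroom under this worst-case assumption; the high-temperature materials have far more.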

    I do not want to come across as argumentative on this, but let me say that I am 110% confident that there will be nothing to worry about. I do not know your experience with laptops, but so far in mine I have seen these or similar configurations and not had anything to worry about. I am confident that traces in PCBs get considerably hotter than anything this SLI cable will ever be exposed to, while being smaller, and they still survive just fine.
     
  31. vzachari

    vzachari Notebook Evangelist

    Reputations:
    468
    Messages:
    334
    Likes Received:
    8
    Trophy Points:
    31
    Not at all, we are just talking. I mean, this is what the forum is for, right?
     
  32. cookinwitdiesel

    cookinwitdiesel Retired Bencher

    Reputations:
    4,365
    Messages:
    11,264
    Likes Received:
    263
    Trophy Points:
    501
    Of course, it is a purely academic exercise, as neither of us is going to solve the world's problems in regards to laptop design haha
     
  33. 5150Joker

    5150Joker Tech|Inferno

    Reputations:
    4,974
    Messages:
    7,036
    Likes Received:
    113
    Trophy Points:
    231
    Just put tape underneath the cable that crosses over the heatsink and problem solved.
     
  34. granyte

    granyte ATI+AMD -> DAAMIT

    Reputations:
    357
    Messages:
    2,346
    Likes Received:
    0
    Trophy Points:
    55
    [image: CrossFire bridge routing in an older dual-GPU system]

    Almost 2 years, and the CFX bridge is far from a concern. I'd be more concerned about their south bridge (now that the north bridge is in the CPU, it's pretty much the only place they can screw up), or the fact that this will be their first 330W system and might run into issues.
     
  35. Chaos92

    Chaos92 Notebook Consultant

    Reputations:
    138
    Messages:
    272
    Likes Received:
    1
    Trophy Points:
    31
    Ok guys, I tried to do a little research on dual GPUs and SLI and Crossfire and couldn't understand it all (since I have never used dual GPUs, no experience either).

    One problem that is coming up a lot on Google is people saying that this game is not supported, or xyz driver is not there, or create this or that profile, and what not.

    So I was wondering if anyone who has used dual GPUs or knows about these things could tell me: suppose I have some games such as NFS Most Wanted or NFS Shift, could I actually see the benefits of dual GPU, or is there some benefit only in certain games?

    Like for example, take NFS MW. It is quite an old game and probably doesn't have drivers and profiles or blah blah blah or dual GPU support. Would I still get smoother gaming and higher fps?

    Basically what I am asking is, if I buy dual GPU in the M18x, will I see benefits in all games without having to constantly worry about updating this or that driver and making so and so profiles?

    If not, what are the problems faced with dual GPUs?
     
  36. Pion2099

    Pion2099 Notebook Consultant

    Reputations:
    23
    Messages:
    185
    Likes Received:
    8
    Trophy Points:
    31
    From a slightly informed novice: it's been my experience that I've yet to run across a game that doesn't work. I've been playing PC games on and off for about 8 years now, but didn't really delve into the information side of things until recently, so I was pretty much winging it.

    If for some reason you ever have a problem with a game and need to update drivers, it sounds much more daunting than it is; as long as you follow the steps, you'll be fine, and both Nvidia and ATI have programs that help you with that.

    As for performance, yeah it's noticeable, but it's quickly getting to the point that folks are talking about 1080p screens at 60 frames per second and max resolution. It's become a game to rake every single pixel over the coals in pursuit of the max settings possible without bursting into flames. If you just play a game on high settings, you'll be fine and likely won't notice the difference. It's once you're trying to make out the color of the eyes of the racer 3 cars behind you in the reflection of the bumper ahead of you that it'll start to make a difference.

    Slight exaggeration, but that's the general spirit.
    As for straight compatibility, as I said, I don't know of any games that straight up won't work, though there might be some. Someone else would have to answer that, and potentially the whys of it.

    Short version: you will certainly see the benefits of performance with graphics turned up higher, and the problems have in my experience been few and far between (non-existent), unless you go tweaking and tinkering and trying to milk every last drop out of them.
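    The performance half of that answer can be sketched with rough numbers. Under alternate-frame rendering, a second GPU typically adds somewhere around 60-90% of one card's output in games that scale; the 0.8 factor below is an illustrative assumption, not a measured M18x figure:

```python
# Rough estimate of dual-GPU framerate under alternate-frame rendering (AFR).
# The scaling factor is an illustrative assumption; real scaling varies per
# game and can be ~0 in titles without an SLI/Crossfire profile.

def dual_gpu_fps(single_gpu_fps: float, scaling: float = 0.8) -> float:
    """Second GPU contributes `scaling` times one card's output."""
    return single_gpu_fps * (1 + scaling)

print(dual_gpu_fps(40))               # a 40 fps game would land near 72 fps
print(dual_gpu_fps(40, scaling=0.0))  # no profile: no benefit, stays at 40
```

    The second call is the "unsupported game" case people worry about: without a driver profile the second card sits idle, so you get single-card performance rather than something worse.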
     
  37. granyte

    granyte ATI+AMD -> DAAMIT

    Reputations:
    357
    Messages:
    2,346
    Likes Received:
    0
    Trophy Points:
    55
    If a game doesn't work on a multi-GPU system, either you are the issue or the game was coded wrong.


    And if a game doesn't run on multi-GPU, disable CFX or SLI; the game shouldn't require the power of two cards anyway.
     
  38. vzachari

    vzachari Notebook Evangelist

    Reputations:
    468
    Messages:
    334
    Likes Received:
    8
    Trophy Points:
    31
    The Good old days :)

    I sort of like the old M17 SLI cable arrangement a little bit better than the current M18x one. At least the SLI cable in the M17 was in a sleeve and sat on top of the metal plates which house the heatpipes on both GPU ends. It also looks like the SLI sleeve has some sort of standoffs that give it some distance from the heatpipe enclosure.

    5150Joker:

    Duct-taping (ok, properly taping it) might indeed improve things :) (if there are things to improve, we shall see) but I don't think that's the point.
     
  39. cookinwitdiesel

    cookinwitdiesel Retired Bencher

    Reputations:
    4,365
    Messages:
    11,264
    Likes Received:
    263
    Trophy Points:
    501
    I loved my M17 when I had one, just not the GPU or CPU cooling: too loud, too little.
     
  40. MaynardLD50

    MaynardLD50 Notebook Consultant

    Reputations:
    182
    Messages:
    273
    Likes Received:
    1
    Trophy Points:
    31
    If you look at my desktop config at home, is it worth waiting for the M18x? I mostly use the laptop for WoW, L4D2, and Portal 2. I feel like it might be a little overkill for what I need... Considering my desktop is an absolute beast of a computer, I'm not sure I need a beast CFX laptop.
     
  41. cookinwitdiesel

    cookinwitdiesel Retired Bencher

    Reputations:
    4,365
    Messages:
    11,264
    Likes Received:
    263
    Trophy Points:
    501
    Having both your desktop and an M18x would not make sense; one or the other would be acceptable. Your desktop is already very strong though, much more so than the M18x for graphics.
     
  42. 5150Joker

    5150Joker Tech|Inferno

    Reputations:
    4,974
    Messages:
    7,036
    Likes Received:
    113
    Trophy Points:
    231

    Get an M14x or M17x-R3. The M18x would be overkill for your needs.
     
  43. MaynardLD50

    MaynardLD50 Notebook Consultant

    Reputations:
    182
    Messages:
    273
    Likes Received:
    1
    Trophy Points:
    31
    That is sort of what I figured, thanks for the help guys! I'm typing on an R3 right now that Dell sent to me to try out and I have to say it's pretty amazing, especially coming from a Sager. Time to order one for real :)
     
  44. 5150Joker

    5150Joker Tech|Inferno

    Reputations:
    4,974
    Messages:
    7,036
    Likes Received:
    113
    Trophy Points:
    231

    OT: How did you get dell to send you an R3 to try out?
     
  45. cookinwitdiesel

    cookinwitdiesel Retired Bencher

    Reputations:
    4,365
    Messages:
    11,264
    Likes Received:
    263
    Trophy Points:
    501
    I would like to do a trial on an M18x in that case.....lol
     
  46. MaynardLD50

    MaynardLD50 Notebook Consultant

    Reputations:
    182
    Messages:
    273
    Likes Received:
    1
    Trophy Points:
    31
    I have an amazing Dell rep that helped me out, and considering the amount of business our company does with Dell, they honored it.

    Do you guys need a rep for a nice discount? I've got someone that will help you out BIG time.
     
  47. Räy

    Räy Guest

    Reputations:
    0
    Messages:
    0
    Likes Received:
    0
    Trophy Points:
    0
    Seeing as almost every PC game is capped by GPU power at the moment, how long do you think Dell will support the M18x R1? Obviously, knowing Alienware, there will be an M18x R2 that will most likely support Z68 or Ivy Bridge, but even then I won't expect the new chipset until summer 2012. The new ATI 7000m series is roadmapped as being available Q1 2012. Do you think that Alienware will support the next gen of GPUs with a BIOS update in the M18x R1, or will they shaft us?
     
  48. 5150Joker

    5150Joker Tech|Inferno

    Reputations:
    4,974
    Messages:
    7,036
    Likes Received:
    113
    Trophy Points:
    231

    It's hard to tell; they usually support each new platform with at least one GPU upgrade during its lifetime. The M17x-R2 had the 4870M and then later the 5870M. If Dell keeps up that trend, we should hopefully see the 580 series and maybe the 7000 series for the M18x-R1. Maybe even Ivy Bridge, since it uses the same socket.
     
  49. Räy

    Räy Guest

    Reputations:
    0
    Messages:
    0
    Likes Received:
    0
    Trophy Points:
    0
    Yeah, I was wondering if HM67 would be able to accommodate Ivy Bridge. I think that will be a deciding factor for a lot of people, due to Sandy Bridge CPUs' inability to be overclocked besides the 2920XM. This seems like the first time that Alienware has made a nice upgrade path that allows the M18x R1 to last for quite a long time in the computer world (excluding ThrottleStop and the 920XM).
     
  50. cookinwitdiesel

    cookinwitdiesel Retired Bencher

    Reputations:
    4,365
    Messages:
    11,264
    Likes Received:
    263
    Trophy Points:
    501
    You should get a good 8 months to a year without even having to worry about it. Then you may start seeing upgraded versions, but do not expect them to be huge upgrades compared to the R1.
     