The Notebook Review forums were hosted by TechTarget, which shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information posted on the forums is preserved. For current discussions, many NBR forum users moved to NotebookTalk.net after the shutdown.

    a little disappointed with the m18x

    Discussion in 'Alienware 18 and M18x' started by (((STEREO))), Apr 25, 2011.

  1. (((STEREO)))

    (((STEREO))) Notebook Consultant

    Reputations:
    86
    Messages:
    172
    Likes Received:
    0
    Trophy Points:
    0
Am I the only one who thinks that at 18.4", the M18x should've gotten a desktop CPU?

    Other than that it is indeed a beast.
     
  2. AirJordan

    AirJordan Notebook Evangelist

    Reputations:
    35
    Messages:
    474
    Likes Received:
    27
    Trophy Points:
    41
    The only things I'm disappointed with are the screen and the color choices... a desktop processor really isn't needed and would just create issues. The Sandy Bridge processors are plenty powerful for what they do.
     
  3. 5150Joker

    5150Joker Tech|Inferno

    Reputations:
    4,974
    Messages:
    7,036
    Likes Received:
    113
    Trophy Points:
    231
    No way, there's absolutely no need for a desktop CPU. It would result in thermal, power and further weight issues and be useless for the market AW caters to.
     
  4. FXi

    FXi Notebook Deity

    Reputations:
    345
    Messages:
    1,054
    Likes Received:
    130
    Trophy Points:
    81
Desktop CPUs are absolutely not needed, and they often cause motherboard issues down the road: while not "high" in heat when properly cooled, their slightly stronger heat levels slowly cook the internals.

Best to use portable CPUs for portable applications. Besides, these days, they are plenty powerful enough for 90% of folks and their applications.
     
  5. (((STEREO)))

    (((STEREO))) Notebook Consultant

    Reputations:
    86
    Messages:
    172
    Likes Received:
    0
    Trophy Points:
    0
Useless to the market AW caters to? You mean the super-enthusiast market, where even the slightest bit of extra performance is everything? Joker, this is rare... actually the first time... I don't agree with you. And I was kinda hoping it would shut up X7200 owners :)

I understand the difficulties involved with power/thermal constraints... it's just that at 18.4"... I think they could've pulled it off with a little work...

Let's just hope the 2920XM is TDP/TDC unlocked in TS... anyone know if it is?
If it is, then you're right, and it'll be plenty fast... overtaking many desktop CPUs anyways.


Other than that, the screen should've been over 1080p... and my complaints stop there :)

Don't get me wrong... I love the M18x.

SIDE QUESTION: is the M17x-R2 officially phased out/discontinued? I still have about 2 years' worth of warranty... if it dies in the near future... will they replace it with an M18x? Just wondering...
     
  6. Räy

    Räy Guest

    Reputations:
    0
    Messages:
    0
    Likes Received:
    0
    Trophy Points:
    0
You aren't going to notice any difference between the 990X and the 2920XM in games. If you need 6 cores, then obviously the 990X is your choice. But for most people who aren't going with a 990X or 2920XM, the toss-up is between a 2720 and a 960. Quite simply, the potential of Ivy Bridge CPUs is a factor for the M18x as well. Certainly desktop CPUs like the new Sandy Bridge-E will be huge in terms of performance and benchmarks, but how much will that affect games? It seems that most games have reached a plateau where it all comes down to the GPU.
     
  7. (((STEREO)))

    (((STEREO))) Notebook Consultant

    Reputations:
    86
    Messages:
    172
    Likes Received:
    0
    Trophy Points:
    0
In games, no. In raw CPU processing like encoding and such... yes. Not saying I need that power... I'm saying I'm pretty sure AW could've pulled it off...
     
  8. granyte

    granyte ATI+AMD -> DAAMIT

    Reputations:
    357
    Messages:
    2,346
    Likes Received:
    0
    Trophy Points:
    55
They chose the 4GHz factory overclock instead. I think a 4GHz SNB CPU will kill the 990X.
     
  9. Aikimox

    Aikimox Weihenstephaner!

    Reputations:
    5,955
    Messages:
    10,196
    Likes Received:
    91
    Trophy Points:
    466
You won't need a desktop CPU with an OCed 2920XM. Plus, SB-E is still two quarters away.

I'd prefer a 17" screen (more high-quality panels available, including IPS). Even though it's beyond hope, I'd love to see an RGB LED 18.4"...
     
  10. 5150Joker

    5150Joker Tech|Inferno

    Reputations:
    4,974
    Messages:
    7,036
    Likes Received:
    113
    Trophy Points:
    231


Look at the X7200; it's a disaster. It's thicker than the M18x, weighs more despite being plastic, needs TWO PSUs to power 485M SLI + a desktop processor, and is ugly as sin. Would you really want AW to go down that path? If they had, the M18x would have been off my list immediately. What they did with the M18x is nearly perfect; I say nearly because, as you said, the downgrade from 1200p to 1080p is one factor, as is the lack of IPS/RGB LED.

However, performance-wise, a 2920XM has been shown to match or beat a 980X in games. So the only people who would benefit from a desktop processor are the few that do heavy server work or use applications that require more than 8 threads -- that is probably like 0.0001% of the AW consumer base. Remember, because the M18x has IGP capabilities, you can utilize Quick Sync with mobile SB and get some awesome results for encoding.

    As a trade off for a desktop processor you would see:

    1. increase in system weight due to thicker heat pipes needed
    2. increase in cost to develop a specialized system like this
    3. increase in width/depth because as it is, Dell had trouble squeezing in the CPU heatsink (notice they used a single pipe design).
    4. increase in thickness because bigger and louder fans would be needed
5. the use of dual PSUs to power this disaster of a design

Desktop CPUs have no place in a notebook IMO, especially with the release of Sandy Bridge and soon Ivy Bridge. Frankly, I think Clevo is insane for making a notebook with a desktop processor, and I wonder how many X7200s they have really sold. Probably not a lot.
     
  11. tyranus7

    tyranus7 Notebook Evangelist

    Reputations:
    137
    Messages:
    511
    Likes Received:
    16
    Trophy Points:
    31
I am a bit disappointed; I wanted the SLI GTX 485M, which is around 6% better than the CF 6970M... and that is a lot to me. Besides, I want PhysX, CUDA, 3D Vision, and all that stuff.
     
  12. (((STEREO)))

    (((STEREO))) Notebook Consultant

    Reputations:
    86
    Messages:
    172
    Likes Received:
    0
    Trophy Points:
    0

OK, fair argument, my man... but the question remains: is AW going to leave the TDP unlocked on the 2920XM? And what about that factory OC to 4GHz? I'm assuming that's on one core and not all 4 simultaneously?
     
  13. Zero989

    Zero989 Notebook Virtuoso

    Reputations:
    910
    Messages:
    2,835
    Likes Received:
    583
    Trophy Points:
    131
All we need is that 2.5GHz quad core to hit 3GHz, and you won't need a single MHz more at 1080p. 4GHz on 4 cores wouldn't really give any difference at all except in maybe 5 games.

This M18x will be the best gaming laptop everrr made.
     
  14. aarpcard

    aarpcard Notebook Deity

    Reputations:
    606
    Messages:
    1,129
    Likes Received:
    284
    Trophy Points:
    101
Never thought about that. Yeah, it should have had a desktop CPU. Otherwise it's still weaker than the X7200.

I'm half serious and half joking.
     
  15. cookinwitdiesel

    cookinwitdiesel Retired Bencher

    Reputations:
    4,365
    Messages:
    11,264
    Likes Received:
    263
    Trophy Points:
    501
The 2920XM should be able to match if not beat the 980X... unlike that desktop monster shoved in a laptop, this much more efficient SB CPU will also be able to:

    -Overclock
    -Go into a MUCH lower idle power state
    -Utilize Quick Sync
    -Utilize the IGP for greater than 45 minute battery life
-Lower system power, making only one PSU necessary
-Most of the other things already mentioned by Joker in regards to weight and size

Remember, SB has much better IPC than Gulftown AND higher frequencies... that is a total win. Just look at the desktop space to get an idea... so far, mobile SB is about 95% as powerful as desktop SB, which is phenomenal for laptop owners.
     
  16. 5150Joker

    5150Joker Tech|Inferno

    Reputations:
    4,974
    Messages:
    7,036
    Likes Received:
    113
    Trophy Points:
    231

    Only Dell knows that right now. I'm hoping they did.
     
  17. cookinwitdiesel

    cookinwitdiesel Retired Bencher

    Reputations:
    4,365
    Messages:
    11,264
    Likes Received:
    263
    Trophy Points:
    501
    Regardless, it is a multiplier meaning TS can probably manipulate it (if not the TDP/TDC maybe still the multis :D)
     
  18. koag8087

    koag8087 Notebook Guru

    Reputations:
    4
    Messages:
    56
    Likes Received:
    0
    Trophy Points:
    15
The only thing I'm disappointed with is that it will be released in mid-May.
I thought and hoped to get the M18x when the M14x was released.
     
  19. 5150Joker

    5150Joker Tech|Inferno

    Reputations:
    4,974
    Messages:
    7,036
    Likes Received:
    113
    Trophy Points:
    231
Yeah, this waiting is killing me. I've got about $2800 to spend on an M18x (had $3200) but keep finding reasons to spend the money on other stuff. Just blew $100 on gym clothes today. Hurry up, Dell!
     
  20. cookinwitdiesel

    cookinwitdiesel Retired Bencher

    Reputations:
    4,365
    Messages:
    11,264
    Likes Received:
    263
    Trophy Points:
    501
They've got until I sell my desktop and M17x; anything longer than that is unacceptable!

    lol
     
  21. granyte

    granyte ATI+AMD -> DAAMIT

    Reputations:
    357
    Messages:
    2,346
    Likes Received:
    0
    Trophy Points:
    55
They've got nothing. I'm not upgrading until I find a game I can't just max out and not think about upgrading again, so it will likely be around the HD 8xxx and Haswell.
     
  22. cookinwitdiesel

    cookinwitdiesel Retired Bencher

    Reputations:
    4,365
    Messages:
    11,264
    Likes Received:
    263
    Trophy Points:
    501
    How does your M17 handle DX11?

    ;)
     
  23. 5150Joker

    5150Joker Tech|Inferno

    Reputations:
    4,974
    Messages:
    7,036
    Likes Received:
    113
    Trophy Points:
    231
My CrossFire 5870M setup struggles with Borderlands at max details. FPS drops down to the 30s and below sometimes during heavy action. The M18x will definitely alleviate that.
     
  24. Aikimox

    Aikimox Weihenstephaner!

    Reputations:
    5,955
    Messages:
    10,196
    Likes Received:
    91
    Trophy Points:
    466
I wonder if 330W will be enough for GTX 485M SLI + an OCed 2920XM...
     
  25. 5150Joker

    5150Joker Tech|Inferno

    Reputations:
    4,974
    Messages:
    7,036
    Likes Received:
    113
    Trophy Points:
    231

Should be enough. The 330W probably scales to 360W at least. But we won't have to worry about that; I don't see them offering 485M SLI.
     
  26. cookinwitdiesel

    cookinwitdiesel Retired Bencher

    Reputations:
    4,365
    Messages:
    11,264
    Likes Received:
    263
    Trophy Points:
    501
    It would be very tight on power. I would be more comfortable with the 6970m CF they are offering ;)
     
  27. sticky

    sticky Notebook Consultant

    Reputations:
    16
    Messages:
    200
    Likes Received:
    3
    Trophy Points:
    31
    The only thing I'm disappointed with is that I ordered an M17X and have to deal with Dell making returning it a hassle.

    Can't wait for the M18X... hope it won't be much longer.
     
  28. granyte

    granyte ATI+AMD -> DAAMIT

    Reputations:
    357
    Messages:
    2,346
    Likes Received:
    0
    Trophy Points:
    55
    i hate you
     
  29. cookinwitdiesel

    cookinwitdiesel Retired Bencher

    Reputations:
    4,365
    Messages:
    11,264
    Likes Received:
    263
    Trophy Points:
    501
    Just saying....lol
     
  30. granyte

    granyte ATI+AMD -> DAAMIT

    Reputations:
    357
    Messages:
    2,346
    Likes Received:
    0
    Trophy Points:
    55
Ya, but I don't think I've ever tried to play a game that uses DX11, or at least the DX11 features my card doesn't support, unless SC2 or Assassin's Creed use DX11.
     
  31. SillyHoney

    SillyHoney Headphone Enthusiast

    Reputations:
    543
    Messages:
    1,202
    Likes Received:
    1
    Trophy Points:
    55
Darn! I really hate that RGB couldn't make it to the M18x. I'm doing a lot of Photoshop on both my R2 and 13" MBP, and every single time I switch back and forth between the environments I get annoyed by the difference in display quality. And that's with the MBP display being top notch for a WLED. I looked at the M15x and R3 displays and I was really disappointed :(
     
  32. (((STEREO)))

    (((STEREO))) Notebook Consultant

    Reputations:
    86
    Messages:
    172
    Likes Received:
    0
    Trophy Points:
    0
As much as I love ATI... I think my next GPU(s) will be NVIDIA... just for the PhysX... I find myself needing it more and more lately... and yeah, RGB LED should've made it to the M18x... oh well, it's still an awesome machine.
     
  33. SOS4DELL

    SOS4DELL A Notebook Philosopher

    Reputations:
    865
    Messages:
    969
    Likes Received:
    20
    Trophy Points:
    31
  34. PsiPr0

    PsiPr0 Notebook Evangelist

    Reputations:
    97
    Messages:
    495
    Likes Received:
    3
    Trophy Points:
    31
    Waitin for the M18xR2!!!!!
     
  35. FXi

    FXi Notebook Deity

    Reputations:
    345
    Messages:
    1,054
    Likes Received:
    130
    Trophy Points:
    81
SOS, the screen on the 6600 is probably going to be incredible. Hoping, anyway. For me, though, dual GPUs ensure longer life, so I'll be going M18x.

An OC'd 2920 doesn't hurt the mix either!
     
  36. SOS4DELL

    SOS4DELL A Notebook Philosopher

    Reputations:
    865
    Messages:
    969
    Likes Received:
    20
    Trophy Points:
    31
Indeed, my friend FXi, you're right: dual GPU is amazing and coveted by all means, but besides gaming with a multi-thousand-dollar beautiful machine, I also need to do some media editing, complicated graphics, and other work… The M17xR2 allows for all of this, but I have seen several machines around me (and even checked some) with this so-called FHD white LED… and it doesn't answer the needs. Besides, I must take care of my eyes too…
Have you seen the possible GPU configurations offered in the M6600? They are quite advanced too, and if I'm going to spend more than 5000 dollars on a machine I need a very good "front line," and this is the display… which is VFHD (a 'Very Full High Definition' ;) ) resolution AND gamut!
Let's hope for the best!!! (In the meantime, on the 'front line,' Dellienware is not fully answering the expectations they opened with the M17xR2... and its 1920x1200 RGB-LED...)
     
  37. SillyHoney

    SillyHoney Headphone Enthusiast

    Reputations:
    543
    Messages:
    1,202
    Likes Received:
    1
    Trophy Points:
    55
Actually, I'm thinking about KEEPING my R2. That was not the plan at all before, but the extensive work with Photoshop + the daily comparison between RGB and W + no RGB on the M18x makes it a very feasible option for me :eek:
     
  38. AirJordan

    AirJordan Notebook Evangelist

    Reputations:
    35
    Messages:
    474
    Likes Received:
    27
    Trophy Points:
    41
If I were in your position, I would definitely keep the R2. The M18x may be a relatively significant upgrade in terms of performance, but the screen is a big hit in terms of quality. The R2 still has at least 2 years left to thrive, and I'm sure by then the M18x will seem not so great, so I would just wait. In my position, though, I'm still using a crappy old Dell Inspiron, so it's about time I upgrade. :D
     
  39. SOS4DELL

    SOS4DELL A Notebook Philosopher

    Reputations:
    865
    Messages:
    969
    Likes Received:
    20
    Trophy Points:
    31
I totally agree. This is why I must forget my appetite for the AW M18x and concentrate on the forthcoming M6600. At least it will have the display we need. (But who knows... maybe Dellienware will rethink the display they are currently offering :rolleyes: It's not too late.)
     
  40. Natadiem

    Natadiem Notebook Evangelist

    Reputations:
    216
    Messages:
    392
    Likes Received:
    6
    Trophy Points:
    31
Well, while the M18x will be able to max a modern game at >100 FPS, the M17xR2 will do >60 FPS...
In the end it will look prettier on the M17xR2.

I am waiting to see both together on display before making my final decision, though.
The screen is always the most important part of a gaming device, IMHO.

PSP2 OLED screen, anyone? :)
     
  41. 5150Joker

    5150Joker Tech|Inferno

    Reputations:
    4,974
    Messages:
    7,036
    Likes Received:
    113
    Trophy Points:
    231
    I think the M18x display will be just fine.
     
  42. miahsoul

    miahsoul Notebook Deity

    Reputations:
    75
    Messages:
    1,372
    Likes Received:
    0
    Trophy Points:
    55
    LOL!

Anyways, doesn't it say Turbo Boost up to 4GHz? That means a 3GHz base clock, right, not 4.0GHz constant, I believe. (Since 2.5GHz is stock, and base Turbo is 3.5GHz.) So 4.0GHz isn't going to be as great as they advertise it to be, but I think it'll still be a screamer at 3.0GHz, and with an unlocked TDP, I'm sure you could get perhaps 4.5+GHz for short bursts. :p
     
  43. hizzaah

    hizzaah Notebook Virtuoso

    Reputations:
    1,672
    Messages:
    2,418
    Likes Received:
    289
    Trophy Points:
    101
  44. harmattan

    harmattan Notebook Evangelist

    Reputations:
    432
    Messages:
    642
    Likes Received:
    55
    Trophy Points:
    41
Ouch. $2000 for the absolute base; just under $3k for the base system with dual 6970s. A $900 upgrade for dual 6970s?

Limited card options: single 460 (aka the Best Buy/what-was-I-thinking model), dual 460s, single 6970, dual 6970s.

To be expected, I guess. This is kind of a big m-e-h for me, considering the X7200 has had more powerful options for the past 3 months.
     
  45. andyroo

    andyroo Notebook Guru

    Reputations:
    6
    Messages:
    74
    Likes Received:
    0
    Trophy Points:
    15
    Why would they decide to switch to 1080p when they used a 1200p on the R2? Is it the screen dimensions?
     
  46. chewietobbacca

    chewietobbacca Notebook Evangelist

    Reputations:
    515
    Messages:
    459
    Likes Received:
    1
    Trophy Points:
    31
    It's the latest trend... moving to 16:9 screens instead of 16:10 screens. Cheaper for them too :eek:
     
  47. FXi

    FXi Notebook Deity

    Reputations:
    345
    Messages:
    1,054
    Likes Received:
    130
    Trophy Points:
    81
16:9 is based more on what the panel makers provide in sufficient quantity and at a price that works for a notebook maker, rather than a choice the maker makes to give people wider vs. taller screens. Fabs have been getting more and more "tooled" for 16:9, such that it is getting harder and harder to have production runs of 16:10 screens. Harder means a panel is more rare and may not come in sufficient quantity to supply a notebook line as well as spare parts for that line. And that also means higher cost. Which, at some point, a notebook maker has to surrender to when a 16:10 panel starts costing a multiple of the cost of a 16:9 panel.

    This is a changeover from the panel maker side, much more than the notebook maker side.

    That said, Dell still could have found a way to get us RGB LED because a backlight can be custom ordered, but we're past being able to see anything done about that on the M18x. Love it for what it is, or not. It is what it is now.
     
  48. hizzaah

    hizzaah Notebook Virtuoso

    Reputations:
    1,672
    Messages:
    2,418
    Likes Received:
    289
    Trophy Points:
    101
    wow, they ripped that config site down pretty fast lol
     
  49. (((STEREO)))

    (((STEREO))) Notebook Consultant

    Reputations:
    86
    Messages:
    172
    Likes Received:
    0
    Trophy Points:
    0
Maybe they're saving the RGB LED screen for the M18x R2 :D

Remember, the M17x R1 didn't have an RGB LED... just sayin'... you never know :)
     
  50. iPhantomhives

    iPhantomhives Click the image to change your avatar.

    Reputations:
    791
    Messages:
    1,665
    Likes Received:
    48
    Trophy Points:
    66
Yea~ with Radeon HD 7000-series CrossFire? SLI 485? Maybe? Or a 3rd-gen i7 processor? LoL :D
     