The Notebook Review forums were hosted by TechTarget, who shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information posted on the forums is preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.
Problems? See this thread at archive.org.

    M17X - How reliable is it?

    Discussion in 'Alienware' started by Visu2k7, Aug 30, 2009.

  1. Visu2k7

    Visu2k7 Notebook Guru

    Reputations:
    1
    Messages:
    62
    Likes Received:
    0
    Trophy Points:
    15
    After having burned my fingers on a costly HP machine (dv9500t), I am thinking these days about whether to buy another power-packed notebook (a gaming one this time), and an ultra-expensive M17X at that.

    My primary question is how good the graphics cards being offered (GTX 280M and GTX 260M) are. How long have they been on the market, and how old would the oldest notebook using these cards be? I know they shouldn't have the heating defects of the 8400M-8600M series, but still, how hot do they get? Also, does having an air conditioner on while running games help prolong the life of a GPU like the GTX 280M?

    In "Dual 1GB GDDR3 NVIDIA® GeForce® GTX 280M" what does dual mean? Does it mean that the total graphics RAM is 2 GB? That would be insane :eek:

    Sorry for the newbie questions, as I am really afraid of buying another costly lemon!
     
  2. Charles P. Jefferies

    Charles P. Jefferies Lead Moderator Super Moderator

    Reputations:
    22,339
    Messages:
    36,639
    Likes Received:
    5,087
    Trophy Points:
    931
    Our official M17x review will be up next week; keep checking the homepage (www.notebookreview.com).

    The GTX 260M and 280M are very powerful video cards, the fastest on the market as a matter of fact. The best value is currently dual GTX 260Ms in SLI; the 280Ms are not a whole lot faster. The M17x has a great cooling system so they stay rather cool; I have not seen them go above 80°C, even with the CPU overclocked.

    Having an air conditioner on while playing games could help, but the M17x doesn't need it. You won't necessarily prolong the lifespan by having one on.

    The M17x does not have 2GB of video memory; in SLI mode, the video memory between the cards is mirrored, so you actually have 1GB total. 1GB is more than enough for all modern games and should be enough for some time.
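
    If it helps to picture the mirroring, here is a rough Python sketch (purely illustrative; the class names and numbers are made up, not real driver code): every resource gets uploaded to both cards, so the usable pool is capped at one card's capacity.

# Hypothetical sketch, not real driver code: why two mirrored 1GB cards in SLI
# still behave like 1GB of usable VRAM.

class Gpu:
    def __init__(self, vram_mb):
        self.vram_mb = vram_mb
        self.allocations = {}              # resource name -> size in MB

    def upload(self, name, size_mb):
        self.allocations[name] = size_mb

    def used_mb(self):
        return sum(self.allocations.values())


class SliPair:
    """Every resource is uploaded to BOTH cards (mirrored), so the usable
    pool is limited by one card's capacity, not the sum of both."""

    def __init__(self, gpu_a, gpu_b):
        self.gpus = (gpu_a, gpu_b)

    def upload(self, name, size_mb):
        if self.gpus[0].used_mb() + size_mb > self.gpus[0].vram_mb:
            raise MemoryError("out of VRAM: mirroring caps us at one card's capacity")
        for gpu in self.gpus:              # the same data goes onto each card
            gpu.upload(name, size_mb)

    def usable_mb(self):
        return min(gpu.vram_mb for gpu in self.gpus)


pair = SliPair(Gpu(1024), Gpu(1024))
pair.upload("level_textures", 900)
print(pair.usable_mb())                    # prints 1024, not 2048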
     
  3. Joebarchuck

    Joebarchuck Notebook Virtuoso

    Reputations:
    881
    Messages:
    2,246
    Likes Received:
    0
    Trophy Points:
    55


    That is not correct; the M17X has 2GB of video memory, 1GB for each card, as per the description on the Alienware website and also the information on my M17X when looking in the NVIDIA properties.

    For someone who writes reviews, you should do a better job of informing people.

    Sorry to be so rude, but bad information is already bad enough from newbies; from officials of notebookreview it is not acceptable.

    "Dual NVIDIA® GeForce® GTX 280M, 2GB – SLI® Enabled [Included in Price]"
     
  4. kilthro

    kilthro Floating in Space

    Reputations:
    222
    Messages:
    1,577
    Likes Received:
    0
    Trophy Points:
    55
    Then you don't know how SLI works.. It is not a true 2GB of memory. Items are duplicated as the cards work together to give the fastest possible performance.. For someone to say it is 2GB is not accurate. Yes, 1GB per card, but not 2GB uniquely available for use.

    Chaz was correct in his statement..

    I would suggest you go and read on how SLI works!
     
  5. Pman

    Pman Company Representative

    Reputations:
    327
    Messages:
    1,882
    Likes Received:
    0
    Trophy Points:
    55
    lol

    the cards only use one card's RAM, mate, so only 1GB is used

    PC's 101

    ...
     
  6. Soviet Sunrise

    Soviet Sunrise Notebook Prophet

    Reputations:
    2,140
    Messages:
    6,547
    Likes Received:
    0
    Trophy Points:
    205
    Joebarchuck looks in the mirror every morning thinking that there is a clone of him mimicking his every movement.
     
  7. Joebarchuck

    Joebarchuck Notebook Virtuoso

    Reputations:
    881
    Messages:
    2,246
    Likes Received:
    0
    Trophy Points:
    55
    I think there is a problem here. First of all, I never said there was 2GB per card. There is 2GB total, and yes, it is mirrored with SLI, but nonetheless you still have 2GB of total video memory, though only 1GB is really useful.

    The problem here is Chaz's statement: he says the M17X does not have 2GB of video memory, but technically that's not true; there is!

    SLI works this way: one card renders the top of the screen and the other renders the bottom. They each have 1GB of memory to use for rendering their share of the work, therefore yes, there is 2GB of video memory, but of course neither card can use 2GB. We all got that. The point is there is 2GB of video memory on the M17X.
     
  8. Soviet Sunrise

    Soviet Sunrise Notebook Prophet

    Reputations:
    2,140
    Messages:
    6,547
    Likes Received:
    0
    Trophy Points:
    205
    It doesn't matter. Only 1GB is usable despite technically having 2 x 1GB cards. The only time 2GB is used to describe SLI is in marketing by companies.
     
  9. Joebarchuck

    Joebarchuck Notebook Virtuoso

    Reputations:
    881
    Messages:
    2,246
    Likes Received:
    0
    Trophy Points:
    55
    Well, I guess you could view it both ways, but again, what I wanted to say is that the way Chaz wrote it implies each card has 512MB. Remember that someone who knows nothing of SLI or CrossFire, like the OP, would automatically assume that, so it is necessary to state the truth: each card has 1GB of memory, totalling 2GB of video memory, usable at the rate of 1GB per card.
     
  10. The_Moo™

    The_Moo™ Here we go again.....

    Reputations:
    3,973
    Messages:
    13,930
    Likes Received:
    0
    Trophy Points:
    455
    ok

    the facts, instead of arguing:

    there are 2 gigs of VRAM

    only 1 gig is used
     
  11. The_Moo™

    The_Moo™ Here we go again.....

    Reputations:
    3,973
    Messages:
    13,930
    Likes Received:
    0
    Trophy Points:
    455
    actually i lol'd @ that :p
     
  12. Soviet Sunrise

    Soviet Sunrise Notebook Prophet

    Reputations:
    2,140
    Messages:
    6,547
    Likes Received:
    0
    Trophy Points:
    205
    Okay. The next time I post about someone having two 500GB HDDs in RAID 1, I'm going to say they have 1TB instead of 500GB available.
     
  13. nakedshorts

    nakedshorts Notebook Geek

    Reputations:
    0
    Messages:
    89
    Likes Received:
    0
    Trophy Points:
    15
    I am lost also. So if you are not using the RAM from the 2nd card, what is the use of dual cards? So a 1GB card should be plenty for light gaming like GTA 4 or Sacred 2? I need to know this before I purchase a dual setup. I was going to order one 3 days ago but decided to wait a few weeks.
     
  14. kilthro

    kilthro Floating in Space

    Reputations:
    222
    Messages:
    1,577
    Likes Received:
    0
    Trophy Points:
    55
    More processing power. So more can be done at the same time.
     
  15. Scytus

    Scytus Notebook Deity

    Reputations:
    127
    Messages:
    842
    Likes Received:
    0
    Trophy Points:
    30
    Perfect example
     
  16. azelexx

    azelexx Notebook Evangelist

    Reputations:
    53
    Messages:
    317
    Likes Received:
    4
    Trophy Points:
    31
    1GB for top half of screen + 1GB for bottom half of screen = 1GB for the entire screen

    1 x 0.5 + 1 x 0.5 = 1!!

    Works out mathematically.
     
  17. The_Moo™

    The_Moo™ Here we go again.....

    Reputations:
    3,973
    Messages:
    13,930
    Likes Received:
    0
    Trophy Points:
    455
    You use both cards but not both cards' RAM
     
  18. Ethrem

    Ethrem Notebook Prophet

    Reputations:
    1,404
    Messages:
    6,706
    Likes Received:
    4,735
    Trophy Points:
    431
    Is it possible to dedicate one card to PhysX exclusively, though, rather than SLI? If that were the case, you would be able to use up to 2GB of RAM.
     
  19. The_Moo™

    The_Moo™ Here we go again.....

    Reputations:
    3,973
    Messages:
    13,930
    Likes Received:
    0
    Trophy Points:
    455
    No ...........
     
  20. Ethrem

    Ethrem Notebook Prophet

    Reputations:
    1,404
    Messages:
    6,706
    Likes Received:
    4,735
    Trophy Points:
    431
    I'm sure you could hack PhysX to take over one card... just requires driver linking. I'm just trying to push the argument that there has to be a way to push the RAM to 2GB of usage. :p
     
  21. The_Moo™

    The_Moo™ Here we go again.....

    Reputations:
    3,973
    Messages:
    13,930
    Likes Received:
    0
    Trophy Points:
    455
    Go ahead, if you're smarter than NVIDIA, be my guest
     
  22. a_beast33

    a_beast33 Notebook Consultant

    Reputations:
    8
    Messages:
    160
    Likes Received:
    0
    Trophy Points:
    30
    "jerry jerry jerry"
     
  23. sgilmore62

    sgilmore62 uber doomer

    Reputations:
    356
    Messages:
    1,897
    Likes Received:
    0
    Trophy Points:
    55
    @joebarchuk:
    Can you provide a LINK that supports your description of how SLI works? I was under the impression that it worked somewhat like what was alluded to in Chaz's and Soviet's posts--that SLI was similar to RAID'ed hard drives, except only applications that are coded to utilize... whatever, never mind
     
  24. Ethrem

    Ethrem Notebook Prophet

    Reputations:
    1,404
    Messages:
    6,706
    Likes Received:
    4,735
    Trophy Points:
    431
    It has nothing to do with being smarter than nVidia. It just takes a lot of driver hacking to have the PhysX software think one of the cards is a PhysX card. Probably not worth the effort, but it's definitely possible to do.
     
  25. sgilmore62

    sgilmore62 uber doomer

    Reputations:
    356
    Messages:
    1,897
    Likes Received:
    0
    Trophy Points:
    55
    Have you heard of the Laws of thermodynamics?
     
  26. Kade Storm

    Kade Storm The Devil's Advocate

    Reputations:
    1,596
    Messages:
    1,860
    Likes Received:
    202
    Trophy Points:
    81
    Just wiki the information.

    Dual cards rendering each half of a screen is known as 'split-screen-rendering', and then there's dual cards that take turns rendering each frame, which is known as 'alternate-frame-rendering'. The latter is actually much more efficient and better for a majority of games, especially those that utilise a shader-intensive engine.

    My source: Nvidia Control Panel.

    As for Physx, well you simply deactivate SLi, and the idle card will run your Physx. I thought this was all understood? But how PhysX will ever use 1GB of GDDR3 RAM is beyond me as of the moment.

    I think the real bone of contention here was that whole 'technically, you have a total of 2 gigs of ram,' rhetoric. However, we all know that effectively the rendering performance cuts down to 1 GB of RAM. With SLi, you simply yield the benefits of two cards taking the pressure off each other by either taking turns at rendering a running scene, or rendering one-half of the scene, per card. It never meant that everything is literally doubled. End of story. Chaz was spot-on, and he even made a point to mention that both cards will have 1 GB of RAM, each. How and why anyone would be motivated to pursue such a bloody semantic argument against him, and then grasp at every little straw, escapes logic altogether.
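
    To make the two modes above concrete, here's a toy Python sketch (a made-up scheduler, purely illustrative; this is not NVIDIA's actual driver logic): SFR splits one frame across both GPUs, while AFR hands whole frames to the GPUs in turn.

# Toy scheduler, purely illustrative (not NVIDIA's actual driver logic),
# showing the difference between the two SLI modes described above.

def split_frame_rendering(frame_id):
    # SFR: both GPUs work on the SAME frame, each taking a region of it.
    return {0: f"frame {frame_id}: top region",
            1: f"frame {frame_id}: bottom region"}

def alternate_frame_rendering(frame_id):
    # AFR: whole frames alternate between the GPUs (0 -> GPU0, 1 -> GPU1, ...).
    return {frame_id % 2: f"frame {frame_id}: entire frame"}

for f in range(4):
    print("SFR:", split_frame_rendering(f))
    print("AFR:", alternate_frame_rendering(f))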
     
  27. KracsNZ

    KracsNZ Notebook Evangelist

    Reputations:
    91
    Messages:
    332
    Likes Received:
    0
    Trophy Points:
    30
    No, that's not quite how it works. Here is the wiki...

    http://en.wikipedia.org/wiki/Scalable_Link_Interface

    Split-screen isn't 50% of the screen per card, but measured as 50% of the workload (or it attempts to be). You may have one card actually rendering a lot more of the screen because other areas (water/transparency/complexity) require more processing power.

    Turn on the SLI indicators in a game that properly supports split-screen and watch the lines go all over the place.
     
  28. Kade Storm

    Kade Storm The Devil's Advocate

    Reputations:
    1,596
    Messages:
    1,860
    Likes Received:
    202
    Trophy Points:
    81
    Of course you don't, but we're talking about dedicating a single card to PhysX. When you disable SLi, the idle GPU will most likely become the dedicated PPU for the relevant software using a PhysX engine.

    Ah well. I guess it ain't half and half in that kind of sense. But I did not state that it's efficient or a perfect 50-50 either, because I have yet to see this technique work well for a given game. I'm sure there are some out there, but I couldn't care less.
    -Fin-
     
  29. BatBoy

    BatBoy Notebook Nobel Laureate

    Reputations:
    7,395
    Messages:
    7,964
    Likes Received:
    25
    Trophy Points:
    206
    If you order an M17x you are buying from Dell. If you hit the M17x config page on alienware.com it sends you straight to dell.com. ;)
     
  30. Kade Storm

    Kade Storm The Devil's Advocate

    Reputations:
    1,596
    Messages:
    1,860
    Likes Received:
    202
    Trophy Points:
    81
    I have a harsh bias against Dell these days. However, Alienware is now under Dell's control, so you'll get the same support that other Dell owners get, I believe. And in theory, this is what should be happening.

    I think another M17x owner would be better suited to answer this question. Here in the UK, they've still got different teams for the XPS and the Alienware lines. How about stateside?
     
  31. Joebarchuck

    Joebarchuck Notebook Virtuoso

    Reputations:
    881
    Messages:
    2,246
    Likes Received:
    0
    Trophy Points:
    55


    It really works like the paragraph below, but I used the top of the screen and the bottom of the screen to simplify the process:

    Split Frame Rendering (SFR), the first rendering method. This analyzes the rendered image in order to split the workload 50/50 between the two GPUs. To do this, the frame is split horizontally in varying ratios depending on geometry. For example, in a scene where the top half of the frame is mostly empty sky, the dividing line will lower, balancing geometry workload between the two GPUs. This method does not scale geometry or work as well as AFR, however.
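
    As a rough illustration of that moving dividing line, here is a small Python sketch (the numbers and function name are made up for illustration, not the real heuristic): it slides the split down until the estimated work above and below it is roughly equal.

# Rough sketch of the dividing-line idea above (made-up numbers and function,
# not the real heuristic): slide the horizontal split until the estimated
# workload above and below it is roughly equal.

def balance_split(row_costs):
    """row_costs[i] = estimated rendering cost of scanline i.
    Returns the row where GPU 0's share ends and GPU 1's begins."""
    total = sum(row_costs)
    running = 0
    for i, cost in enumerate(row_costs):
        running += cost
        if running >= total / 2:
            return i
    return len(row_costs) - 1

# Cheap, mostly-empty sky in the top 50 rows, detailed geometry below:
costs = [1] * 50 + [5] * 50
print(balance_split(costs))    # 69 -- the dividing line drops well below the middle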
     
  32. sgilmore62

    sgilmore62 uber doomer

    Reputations:
    356
    Messages:
    1,897
    Likes Received:
    0
    Trophy Points:
    55
    OK, we're getting off topic now, joebarchuck, and need to get back to where you provide a LINK that supports your argument that dual 1GB cards SLI'd = 2GB of VRAM.
     
  33. sgilmore62

    sgilmore62 uber doomer

    Reputations:
    356
    Messages:
    1,897
    Likes Received:
    0
    Trophy Points:
    55
  34. a_beast33

    a_beast33 Notebook Consultant

    Reputations:
    8
    Messages:
    160
    Likes Received:
    0
    Trophy Points:
    30
  35. Xeneize

    Xeneize Notebook Deity

    Reputations:
    824
    Messages:
    1,263
    Likes Received:
    9
    Trophy Points:
    56
  36. Joebarchuck

    Joebarchuck Notebook Virtuoso

    Reputations:
    881
    Messages:
    2,246
    Likes Received:
    0
    Trophy Points:
    55

    I pasted just above your post how Wikipedia describes SLI working. This clearly shows that each graphics card processes a different part of the image, each using the 1GB of memory it has.

    Therefore it's not like RAID 1, where two 256GB drives only give 256GB of available space.

    What is true, though, is that each graphics card only has the possibility of using 1GB, but it's not shared memory. They each have their own 1GB to work from, therefore you can argue it's like having 2GB of video memory.

    Chaz's reply to the OP clearly states there is a shared 1GB of memory, which is absolutely not the case.
     
  37. sleey0

    sleey0 R.I.P. AW Side Topics

    Reputations:
    1,870
    Messages:
    7,976
    Likes Received:
    0
    Trophy Points:
    205
    It is not shared per se, but both memory modules are filled with the same data. It is pointless to argue the whole "I have 2GB of VRAM" deal.

    So we could say that it is redundant in how it works.

    That ok, joe?
     
  38. Charles P. Jefferies

    Charles P. Jefferies Lead Moderator Super Moderator

    Reputations:
    22,339
    Messages:
    36,639
    Likes Received:
    5,087
    Trophy Points:
    931
    I stated it was mirrored, not shared:
    Joe, I made a simple statement to help the original poster out. Would it do them any good if I said "It has 2GB of video memory" to have them come back and say "Hey, only 1GB is usable!"? No. If you have that much of a problem with a member's post, then you should take it up with the member in private via PM instead of derailing a thread for four pages as you did with this one.

    That said, let's stay on-topic here. Here is the original post for those interested:
     
  39. sleey0

    sleey0 R.I.P. AW Side Topics

    Reputations:
    1,870
    Messages:
    7,976
    Likes Received:
    0
    Trophy Points:
    205
    This happens quite a bit, chaz.

    Someone posts something they think is right, but then gets proven wrong.

    Then they'll try and defend it for another 100 posts.....
     
  40. airplaneman

    airplaneman Notebook Deity

    Reputations:
    150
    Messages:
    716
    Likes Received:
    0
    Trophy Points:
    30
    Either graphics card setup will be fine for all modern games at max settings (except Crysis, of course) and it should run them all very smoothly. My single 260 has never gotten over 60 degrees thanks to the M17x's amazing cooling system. I don't think having the AC on while gaming would have any noticeable impact on GPU life. A better thing to do would be to get a notebook cooler such as the NZXT Cryo LX.

    If you haven't figured it out, dual 1GB cards mean 2 cards with 1 GB each for a total of 2GB, but only 1 GB is usable (Not sure why, someone mentioned that they mirror each other). Regardless, 1GB is and will be sufficient for a long time to come.

    You will very much enjoy the M17x if you decide to order it, I can almost guarantee it.
     
  41. Xeneize

    Xeneize Notebook Deity

    Reputations:
    824
    Messages:
    1,263
    Likes Received:
    9
    Trophy Points:
    56
    HAHA QFT!!
     
  42. nakedshorts

    nakedshorts Notebook Geek

    Reputations:
    0
    Messages:
    89
    Likes Received:
    0
    Trophy Points:
    15
    So in other words, a casual gamer that plays Prototype, GTA 4, or Sacred 2 will be better off with a single 280 or 260, or better yet with dual 260s? I was about to order dual 260s thinking I would have 2 gigs of video RAM and would not need to update my video for at least 5 years.

    I mostly deal with DVDs but do not want to be stuck like with my last notebook, where 6 months after I got it I could not play any games. (GeForce Go 5200 32MB video card, wooohoooo.)

    Used that laptop for like 4 years without playing any games with that awesome 32 megs of video RAM.


    Long story short, is a casual gamer better off with a single 280 or 260, or better yet with dual 260s? Gaming maybe 8 hours a week will not put a lot of strain on a single 1GB card, but it's mostly video and surfing 70 hours a week.
     
  43. The_Moo™

    The_Moo™ Here we go again.....

    Reputations:
    3,973
    Messages:
    13,930
    Likes Received:
    0
    Trophy Points:
    455
    VRAM does not matter

    get SLI and forget about it
     
  44. Soviet Sunrise

    Soviet Sunrise Notebook Prophet

    Reputations:
    2,140
    Messages:
    6,547
    Likes Received:
    0
    Trophy Points:
    205
    This thread receives the Soviet Seal of Approval. Joebarchuck, thanks for making us Californians look bad.
     
  45. Scytus

    Scytus Notebook Deity

    Reputations:
    127
    Messages:
    842
    Likes Received:
    0
    Trophy Points:
    30
    Don't make Californians look bad D:
     
  46. SoundOf1HandClapping

    SoundOf1HandClapping Was once a Forge

    Reputations:
    2,360
    Messages:
    5,594
    Likes Received:
    16
    Trophy Points:
    206
    For real.

    +1 char
     
  47. Joebarchuck

    Joebarchuck Notebook Virtuoso

    Reputations:
    881
    Messages:
    2,246
    Likes Received:
    0
    Trophy Points:
    55
    Hold on. First of all, I've been on this forum for many months and I have never argued with nor contradicted anyone. I was always, to the best of my knowledge, helpful or looking for help.

    But here, some things that are not true have been said, like "they both process the same data". NO, each card in SLI mode does not process the same data. They process different parts of the graphics and combine them to make one image, or one frame should I say, therefore the 1GB of video memory for each card is used independently. This is how SLI works.

    I totally understand that if SLI worked as shared memory, meaning both cards use the same video RAM, then yes, there would only be 1GB of video memory no matter how much is advertised, but that's not the case at all.
     
  48. Vitotherm

    Vitotherm Notebook Consultant

    Reputations:
    18
    Messages:
    179
    Likes Received:
    0
    Trophy Points:
    30
    Mmmm...

    I thought I opened up a thread about how reliable an M17X is.. :confused:
    If you want to discuss something like that, fine.. but stay on topic here.
    Start a new thread or send some PMs.

    Don't start over again..
     
  49. Vitotherm

    Vitotherm Notebook Consultant

    Reputations:
    18
    Messages:
    179
    Likes Received:
    0
    Trophy Points:
    30
    So how reliable is it.. ?
     
  50. Doomy

    Doomy Notebook Geek

    Reputations:
    4
    Messages:
    78
    Likes Received:
    0
    Trophy Points:
    15
    Well, the design of the chassis, temps, etc. are excellent.

    There does appear to be a fair few issues with the hybrid GPU: Stealth mode getting stuck, black screens on wake-up, etc., all software/driver related, it appears.

    Some updates seem to have fixed some of these issues already, e.g. BIOS A02.

    I say 8/10 in the reliability stakes.
     