The Notebook Review forums were hosted by TechTarget, who shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information posted on the forums is preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.
Problems? See this thread at archive.org.

    M11x NEXT GPU ATI OR NVIDIA

    Discussion in 'Alienware M11x' started by mightymax86, Oct 28, 2010.

  1. mightymax86

    mightymax86 Notebook Consultant

    Reputations:
    34
    Messages:
    234
    Likes Received:
    0
    Trophy Points:
    30
    Hey there guys, I would love your opinion on this. Do you think the next GPU to find itself in the M11x will be ATI in origin or Nvidia? The reason I ask is that I thought Dell would have slapped a 435M in by now.
     
  2. first_leviathan

    first_leviathan Notebook Consultant

    Reputations:
    19
    Messages:
    203
    Likes Received:
    0
    Trophy Points:
    30
    I'm guessing Nvidia still since Alienware was really enthusiastic about Optimus technology when the R2 came out.

    Alienware is probably also listening to customers who are complaining about the lower battery life of the R2 compared to the R1.

    AMD GPUs (I hate how AMD's graphics cards aren't called ATI anymore) generally use more power, and they don't have the Optimus tech of Nvidia cards.

    Therefore, Alienware will likely stick with Nvidia. :| I'd prefer AMD though, since their graphics cards are stronger at lower prices. And if you know your computers, you don't really need Optimus tech anyway. :p
     
  3. jeremyshaw

    jeremyshaw Big time Idiot

    Reputations:
    791
    Messages:
    3,210
    Likes Received:
    231
    Trophy Points:
    131
    Yeah, unless AMD comes up with something like Optimus, I want an nVidia card, no questions asked.

    It's just too convenient, enough for me to overcome my general dislike of nVidia since the pricing of the 8800 series...
     
  4. roxxor

    roxxor Notebook Evangelist

    Reputations:
    172
    Messages:
    587
    Likes Received:
    0
    Trophy Points:
    30
    Dell is already using the 4xx series in the new XPS machines.
     
  5. first_leviathan

    first_leviathan Notebook Consultant

    Reputations:
    19
    Messages:
    203
    Likes Received:
    0
    Trophy Points:
    30
    Uh... how do the size and power demands of the 4xx series compare to the 3xx? Will a 400 fit in the M11x chassis? How badly will battery life be affected when gaming unplugged?
     
  6. djjosherie

    djjosherie Notebook Consultant

    Reputations:
    42
    Messages:
    224
    Likes Received:
    1
    Trophy Points:
    31
    My Opinion:

    I'm very sure we won't see any CPU changes, since AMD can't compete with the speeds of the i7, especially a mobile low power model...

    As for the GPU, nVidia, because of Optimus, which nVidia has announced is a big target for the upcoming driver releases, between the PunkBuster problems and the few general Optimus problems that haven't been fixed yet...
    From what I've read on other forums and in reviews, the 4xx series didn't show clear performance gains over a 3xxM card of similar value (i.e. 335M vs 435M), which is why Alienware probably isn't going to rush anything while they're still pushing the R2 right now. I just got an email yesterday with special deals for it.
     
  7. Cpt.Zero

    Cpt.Zero Notebook Consultant

    Reputations:
    58
    Messages:
    245
    Likes Received:
    0
    Trophy Points:
    30
    For me, I really wanted to see an ATI GPU in the next M11x revision. ATI GPUs have better performance and lower thermal output and power consumption in every class.
     
  8. Spalding

    Spalding Notebook Consultant

    Reputations:
    14
    Messages:
    180
    Likes Received:
    0
    Trophy Points:
    30
    I feel like Optimus is just the lazy man's tool; just be patient and switch between cards yourself. You may as well have it on 100% of the time when on battery, and for gaming on battery, Fn+F6 is my friend.
     
  9. Name User

    Name User Notebook Consultant

    Reputations:
    0
    Messages:
    114
    Likes Received:
    0
    Trophy Points:
    30
    Then cars are just a lazy man's tool as well... same for trains, boats and airplanes. Do you walk or swim everywhere you go?

    It's not called laziness, it's called efficiency.
     
  10. Spalding

    Spalding Notebook Consultant

    Reputations:
    14
    Messages:
    180
    Likes Received:
    0
    Trophy Points:
    30
    I ride a bike and definitely own an R1 =D.
     
  11. jeremyshaw

    jeremyshaw Big time Idiot

    Reputations:
    791
    Messages:
    3,210
    Likes Received:
    231
    Trophy Points:
    131
    Well, I owned an R1 for a while, then switched to an R2. I like the R2 better, since I can control which app uses the nVidia or Intel GPU. That way, I don't have to hit a manual switch - which isn't an annoying thing to do, but it did bug me endlessly whenever I forgot and only realized *after* starting a game :mad: <--my own fault :p

    IMO, the difference between the R1 and R2 is the level of control (over many different things). The R1 is either "yes" or "no"; the R2 is "how much?"
     
  12. stevenxowens792

    stevenxowens792 Notebook Virtuoso

    Reputations:
    952
    Messages:
    2,040
    Likes Received:
    0
    Trophy Points:
    0
    As I have posted before... it's all about control. I don't need my video card telling me when it wants to kick in; I engage the card when I NEED IT, not the other way around. I don't need the card trying to "think" for me, trying to interpret when it "thinks" I am going to game.

    I know exactly when I am gaming. No thanks optimess.

    Dedicated +1 right here.

    ON TOPIC - The 425M with Optimus may be the next M11x option available. Not sure about power consumption, or whether a board has been designed that allows the UM CPUs to be paired with the Optimus GPUs. Any Dell insiders here?

    :)

    Stevenx
     
  13. jeremyshaw

    jeremyshaw Big time Idiot

    Reputations:
    791
    Messages:
    3,210
    Likes Received:
    231
    Trophy Points:
    131
    lol... :p :D

    I found Optimus to be more of a "fine control." The issues only begin if you leave the default nVidia settings alone :D

    I just set the default to Integrated, and changed the profiles for all of my games to nVidia GPU :)
     
  14. first_leviathan

    first_leviathan Notebook Consultant

    Reputations:
    19
    Messages:
    203
    Likes Received:
    0
    Trophy Points:
    30
    I agree with the last person on topic. :D The 425M doesn't really offer much of a performance gain, so Alienware wouldn't refresh while it still has a lot of R1 and R2 models in stock.

    Anyway, going back to the topic...

    Let's speculate further. If there WERE to be a GPU refresh anytime soon, what Nvidia/AMD card would Alienware stick inside the chassis?

    Also, how would that affect performance/battery life/heat output/fan noise?

    EDIT: Actually... Alienware might want to hold on to that (disputed) claim of "Strongest Notebook under 14 Inches," so maybe a refresh is possible. Who knows. D:
     
  15. oriki77

    oriki77 Notebook Enthusiast

    Reputations:
    0
    Messages:
    44
    Likes Received:
    0
    Trophy Points:
    15
    Hey everyone!
    If you don't mind, I've got 2 questions:

    1. About when do you think the "R3" will be released?
    2. Is it worth waiting, or will the changes probably not be that big?

    Thanks
     
  16. Grimgrak

    Grimgrak Notebook Geek

    Reputations:
    5
    Messages:
    85
    Likes Received:
    0
    Trophy Points:
    15
    It better have USB 3.0, that's all I know.
     
  17. TalonH

    TalonH Notebook Evangelist

    Reputations:
    78
    Messages:
    402
    Likes Received:
    0
    Trophy Points:
    30
    Who cares? The R1 is great and Optimus is more of a pain than anything. If anything, though, I'd love to see Alienware stick with nVidia, as ATI is lame.
     
  18. kopicha

    kopicha Notebook Evangelist

    Reputations:
    194
    Messages:
    582
    Likes Received:
    0
    Trophy Points:
    30

    Every tech has its own pros and cons, but "efficiency" is a really good word to describe Optimus-enabled machines.

    For example: the R1 can choose to run dedicated on either the onboard or the Nvidia GPU. The R2 can as well, but only via the onboard GPU, not dedicated to the Nvidia; simply set the global default to Intel and the Nvidia will never kick in. But the R2 can't set the Nvidia as default, since the final output is still done by Intel. The R1 gets a +1 in that scenario for having more "control". On the other hand, the R2 has the advantage if you are doing more than two things concurrently, e.g. video encoding via the GPU alongside other work that doesn't need GPU acceleration. In that scenario the R2 is more "efficient", so that's a +1 back to the R2. Having total control does not mean being efficient, even if you know what you are doing; switching back and forth every time I want to do something different is already very inefficient. One could argue that I should plan my work, do everything I need on one GPU, then switch over for the remaining tasks. But IMO I buy a computer to work for me, not for me to work for the computer; something is wrong if I have to plan a simple task the computer should have sorted out on its own. On top of that, since Optimus is still being fine-tuned, it's still on a rocky road; the PB issue is one example that would be non-existent on an R1. So it's 2-2 now, and both have their own good and bad points.

    But personally I will give my last point to the R2, making it a 3-2 win over the R1, simply because the Optimus issues can get fixed over time, while you can't fix the inefficiencies of the R1 without us humans giving in. No punks intended. These are just my personal thoughts and not meant to represent what others think, so you can have your own opinion on whichever is better. This is not meant to be a comparison, just how I personally look at both.

    I feel that AW will still go with Nvidia in their next M11x revision, though.
     
  19. jeremyshaw

    jeremyshaw Big time Idiot

    Reputations:
    791
    Messages:
    3,210
    Likes Received:
    231
    Trophy Points:
    131
    I dunno, the Acer 3820TG probably took that title at 13.3"...
     
  20. kopicha

    kopicha Notebook Evangelist

    Reputations:
    194
    Messages:
    582
    Likes Received:
    0
    Trophy Points:
    30

    Where's that claim from? I only see this: "The most powerful 11” gaming laptop in the universe"
     
  21. jeremyshaw

    jeremyshaw Big time Idiot

    Reputations:
    791
    Messages:
    3,210
    Likes Received:
    231
    Trophy Points:
    131
    It used to be on the Alienware m11x site...
     
  22. KSSR1211

    KSSR1211 Notebook Evangelist

    Reputations:
    127
    Messages:
    366
    Likes Received:
    5
    Trophy Points:
    31
    There is reason to think that a 445 nVidia GPU will be used. Whether there is a performance increase or not, the higher number will be used to extract another $100.00 from the consumer and keep their minds off the hinge defect.
     
  23. kopicha

    kopicha Notebook Evangelist

    Reputations:
    194
    Messages:
    582
    Likes Received:
    0
    Trophy Points:
    30
    IMO, it's more important for AW to rethink their processor selection in the next M11x than the GPU. I personally feel that the GT 335M is still good for most stuff out there right now (yeah, I mean most, not all), but the current machine is mostly bottlenecked by the CPU rather than the GPU. So I'd be more concerned with which processor they're going to use than with which GPU.
     
  24. Phinagle

    Phinagle Notebook Prophet

    Reputations:
    2,521
    Messages:
    4,392
    Likes Received:
    1
    Trophy Points:
    106
    In what school of math are 420 and 425 not also higher numbers than 335? ;)
     
  25. first_leviathan

    first_leviathan Notebook Consultant

    Reputations:
    19
    Messages:
    203
    Likes Received:
    0
    Trophy Points:
    30
    I dunno, Nvidia graphics cards are more stable when overclocked, so the M11x can technically beat the Acer since it can overclock to higher rates without exploding. :D

    The Acer wins, but there's a loophole Alienware will insist on exploiting. :)

    The CPU, admittedly, is depressing. The Acer has similar battery life to the R2, and it uses a real full-fledged i5. :| Alienware should really stick in a better CPU and use their "superior" engineers and designers to make it more power-efficient, like what Lenovo does with their ultraportables. D:

    As for the last post, sadly, a higher number doesn't always mean better. :| The Nvidia 8800 GT was way stronger than the 9600 GT. :(
     
  26. jeremyshaw

    jeremyshaw Big time Idiot

    Reputations:
    791
    Messages:
    3,210
    Likes Received:
    231
    Trophy Points:
    131
    lol, any OC I try on the GPU crashes SC2... :(
     
  27. luffytubby

    luffytubby Notebook Deity

    Reputations:
    354
    Messages:
    829
    Likes Received:
    10
    Trophy Points:
    31
    A 12'' screen in the same chassis with a good IPS display... it would be incredible. It would virtually be the best laptop in the universe.

    Look at the new MacBook Air 11. An M12x could be the ultimate laptop in the galaxy... only buyers who care about slimness would be turned off.

    The M11x has amazing aesthetics and a small footprint, yet it's powerful. If they can retain the 5-6 hours of battery and not compromise on the other parts, I don't see any laptop in the world coming close to its combination of power and mobility. The ultimate tradeoff.
     
  28. zeobr

    zeobr Notebook Enthusiast

    Reputations:
    0
    Messages:
    38
    Likes Received:
    0
    Trophy Points:
    15
    But it is heavy...
    The MacBook Air 11 is only 1 kg, half the weight...
     
  29. CubsWin

    CubsWin Notebook Consultant

    Reputations:
    0
    Messages:
    102
    Likes Received:
    1
    Trophy Points:
    31
    Have you tried overclocking without touching the shader speed? I don't own SC2, but from what I've read some games are very finicky about shader overclocks.
     
  30. mightymax86

    mightymax86 Notebook Consultant

    Reputations:
    34
    Messages:
    234
    Likes Received:
    0
    Trophy Points:
    30
    Well, ATI is going to roll out the Mobility 6000 series within a month, so it could go either way. It would be nice to have two options.
     
  31. first_leviathan

    first_leviathan Notebook Consultant

    Reputations:
    19
    Messages:
    203
    Likes Received:
    0
    Trophy Points:
    30
    Hopefully Alienware will be cool and let us select our GPU. Imagine choosing between a GT 445M and a Radeon 6650 or something like that. I'd be torn. :eek: