The Notebook Review forums were hosted by TechTarget, which shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information posted on the forums is preserved. For current discussions, many NBR forum users moved to NotebookTalk.net after the shutdown.
Problems? See this thread at archive.org.

    *Official Alienware M18x AMD CrossFire Discussion Thread (re: Dell/AW Conference Call)*

    Discussion in 'Alienware 18 and M18x' started by Mr. Fox, Oct 29, 2011.

  1. steviejones133

    steviejones133 Notebook Nobel Laureate

    Reputations:
    7,172
    Messages:
    10,077
    Likes Received:
    1,122
    Trophy Points:
    581
    I think we all were under the assumption it was on hold for the time being....

    NP Fox, my pleasure. I too am looking forward to more and more liaison with Dell over future topics. The more discussion, the better!

    As above. Bill, as soon as it is available - drop a note on the linked thread and we will get going on it.
     
  2. nuroo

    nuroo Notebook Consultant

    Reputations:
    68
    Messages:
    156
    Likes Received:
    0
    Trophy Points:
    30
    Mr Fox will post the major discussions from the conference call, but I wanted to post a few quick small things Luis cleared up:

    ALL M18x's are made in China. So are the M14x, M15x, and M17x.

    The numbers on the motherboard near the battery compartment are date stamps, not motherboard revisions. All M18x's have the same motherboard.

    Fan profiles are something they are considering; it's something they can do. No ETA, or even a guarantee.

    Customizations for Dell laptops are rolled into Nvidia drivers.

    Customizations for Dell laptops are NOT rolled into AMD/Intel drivers. They are working with them to make this happen in the future. This is why Dell's driver for the 6990M has brightness control on the function keys and the new drivers from AMD/ATI don't.

    Mr Fox will have a much more in-depth follow-up post. This is just a teaser to answer speculation on where the laptops are made and on motherboard revisions. I remember pages of posts on those two questions.
     
  3. steviejones133

    steviejones133 Notebook Nobel Laureate

    Reputations:
    7,172
    Messages:
    10,077
    Likes Received:
    1,122
    Trophy Points:
    581
    I thought one of the interesting answers was regarding the lack of "on the fly" gfx switching on the M18x - the explanation was informative and something I hadn't considered to be a reason why we don't have it. I particularly liked Louis' answer to why this wasn't available on a single GPU setup, after explaining why dual GPU setups don't have this feature (I think it was Johnksss who asked that question, but my line wasn't that clear)......but then again, you can't configure an M18x with a single GPU, can you? - moot point perhaps?
     
  4. Zero989

    Zero989 Notebook Virtuoso

    Reputations:
    910
    Messages:
    2,835
    Likes Received:
    583
    Trophy Points:
    131
    base m18x is single 560m gtx
     
  5. steviejones133

    steviejones133 Notebook Nobel Laureate

    Reputations:
    7,172
    Messages:
    10,077
    Likes Received:
    1,122
    Trophy Points:
    581
    Damn you're good lol - I forgot about base... :eek:
     
  6. ACHlLLES

    ACHlLLES Notebook Virtuoso

    Reputations:
    303
    Messages:
    2,199
    Likes Received:
    0
    Trophy Points:
    55
    FYI Dell is sending me another M18x with the same spec instead of 580M SLI.

    I was told the Nvidia card is on back order, and we are just going to cross our fingers and hope it doesn't happen with the new system.

    I'm quite happy with the spec, so I'll be happy as long as it doesn't overheat like the one I have now.

    Good luck to you all as well.
     
  7. mharidas

    mharidas VLSI/FAB Engineer

    Reputations:
    340
    Messages:
    948
    Likes Received:
    0
    Trophy Points:
    30
    I brought up the lack of dedicated hardware for controlling frame buffers between the two cards and general on-the-fly switching.

    Louis says both AMD and Nvidia were not forthcoming on how their cards handle the resource allocation buffer and frame buffer allocations.

    They simply don't want to provide a white paper on how things work. This is clearly some sort of capital and investment issue these companies are mulling over; they don't want the hassle of providing support for graphics switching on such platforms.

    Can it be done? Of course it can. Provide the white paper and the detailed layout of how their system works, and we VLSI engineers will come up with a solution for the problems. But that means time and investment from AMD and Nvidia. They are simply saying a straightforward "NO", with no technical or precise info on why they won't. Dell can't do much if these companies don't play ball.

    It's all down to AMD and Nvidia coming forward and being open about how their systems work, as dedicated logic is needed for this purpose. Nvidia has a "Copy express engine" for their single card solutions, and they even talk about multi-GPU scaling in their own white papers, but funnily they won't actually tell any vendor like Dell how it's done.

    There isn't a technical problem stopping anyone from making a solution; it's all bureaucracy from AMD and Nvidia.

    Once a solution is done in layouts and simulations, VLSI engineers can send the design specs and sim models to any contractor in Taiwan to fabricate the chips. And in bulk the prices are going to be on the order of 5-15 USD per chip. That is not a cost increase for a 2 grand system.
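    To put those numbers in perspective, a quick back-of-the-envelope check (the $5-15 per chip and $2,000 system price are the figures quoted above, not verified fab pricing):

```python
# Share of the system price a hypothetical switching chip would add,
# using the bulk-price range quoted in the post above.
system_price_usd = 2000.0

for chip_cost_usd in (5.0, 15.0):
    share = chip_cost_usd / system_price_usd
    print(f"${chip_cost_usd:.0f} chip -> {share:.2%} of the system price")
```

    Even at the high end of the range, the chip is under one percent of the system price.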
     
  8. steviejones133

    steviejones133 Notebook Nobel Laureate

    Reputations:
    7,172
    Messages:
    10,077
    Likes Received:
    1,122
    Trophy Points:
    581
    With that explanation, it does seem rather remiss of Dell not to bother with it....the assumption I took from it was that, as it was a heavy system aimed at being a true DTR, Dell didn't bother with anything to enable on-the-fly switching - basically assuming that no one would want to carry the lump around and use it!

    Thanks for explaining the ins and outs, as I am quite the layman in reality.
     
  9. mharidas

    mharidas VLSI/FAB Engineer

    Reputations:
    340
    Messages:
    948
    Likes Received:
    0
    Trophy Points:
    30
    Dell can't do anything even if they wanted to, unless AMD and Nvidia are willing to share more details. I think you are right; it is AMD and Nvidia who are thinking, why bother giving on-the-fly switching to a DTR. If that's the case, shame on them.

    Dell is completely innocent on this; in fact, their hands are tied.
     
  10. SOS4DELL

    SOS4DELL A Notebook Philosopher

    Reputations:
    865
    Messages:
    969
    Likes Received:
    20
    Trophy Points:
    31
    My congratulations to Mr. Fox and each one of the other organizers and participants (from both sides). I feel that you all represented us in the best possible way.
    I feel this is a promising beginning of a periodical agenda of meetings on the many topics we are all passionate about.
    Thank you all, pals!!!
     
  11. Johnksss

    Johnksss .

    Reputations:
    11,531
    Messages:
    19,452
    Likes Received:
    12,819
    Trophy Points:
    931
    i concur. including the ah-6 overwatch in the background. :D
    the big part you forgot to mention...all motherboards are brand new.... :)
    it was i. :)
    found this out myself today. :mad:
    they are basically watching optimus have many issues... switching when no one wants it to... speculation of course.
    nvidia does not like intel and vice versa. for more than just pr reasons.. ;)


    and a big thanks to mr fox! well done on the front end of things.
    the back end would be the report you're posting in the next few days... :)
     
  12. 5150Joker

    5150Joker Tech|Inferno

    Reputations:
    4,974
    Messages:
    7,036
    Likes Received:
    113
    Trophy Points:
    231
    If all the motherboards are the same revision, then why do earlier boards work and not later ones? Or was there just a big batch of defective boards made?
     
  13. Johnksss

    Johnksss .

    Reputations:
    11,531
    Messages:
    19,452
    Likes Received:
    12,819
    Trophy Points:
    931
    not even sure the boards are defective since 580m gpus work in them.
    it has to do with the ctf circuit on the 6990m and some more technical stuff.
     
  14. nuroo

    nuroo Notebook Consultant

    Reputations:
    68
    Messages:
    156
    Likes Received:
    0
    Trophy Points:
    30
    Two problems:
    1. Something is causing the 6990M GPU circuit not to report the correct temp to the mobo.

    2. When no temperature signal is received, the mobo should max the fans, but it's not.

    Luis gave us a much better explanation, but I'm no engineer. "Basically" there's an interrupt that passes GPU temperature info to the mobo. There is a signal loss or interruption that's causing a problem. When this signal is interrupted, the fans are supposed to go to max speed, but they are going to zero speed instead. Some cards are not causing the problem, some are. They only have one test system that has the problem, but when we send them the captures they'll have more to test. That's it in a nutshell; please wait for Mr Fox's explanation vetted through Luis.
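    For readers following along, the intended failsafe can be sketched roughly like this (the function name, thresholds, and rpm values are illustrative only; the real logic lives in the embedded controller firmware):

```python
# Illustrative sketch of the described fan failsafe, NOT Dell's actual
# EC firmware. gpu_temp_c is None when the temperature interrupt is lost.
MAX_FAN_RPM = 4500  # hypothetical full-speed value

def fan_target_rpm(gpu_temp_c):
    if gpu_temp_c is None:
        # Intended failsafe: no temperature signal -> run fans at max.
        # The bug described above is that fans drop to zero here instead.
        return MAX_FAN_RPM
    if gpu_temp_c >= 80:   # hot: full speed
        return MAX_FAN_RPM
    if gpu_temp_c >= 60:   # warm: medium speed
        return 3000
    return 1500            # cool: low speed
```

    The reported failure mode is the first branch behaving as if it returned zero, so a lost temperature signal silences the fans instead of maxing them.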
     
  15. Aikimox

    Aikimox Weihenstephaner!

    Reputations:
    5,955
    Messages:
    10,196
    Likes Received:
    91
    Trophy Points:
    466
    Now that explains why at times my fans go full blast for no apparent reason with zero load and low idle temps. Interesting... My machine is from the very first batch, btw.
     
  16. Shaden

    Shaden Notebook Deity

    Reputations:
    827
    Messages:
    1,337
    Likes Received:
    7
    Trophy Points:
    56
    Yea Aiki, I have seen that as well ... but it's not a problem, since increased fans are no worries.

    I guess this means we have to put an end to all the AMD bashing that's been going on (as if, as Aiki has pointed out, NV hasn't had their share of failures). There was a developing thread here of ... 'AMD screwed us, never buying AMD again, see ... NV is worth the extra cash' ... or something as such.

    Good to have some clarification ...
     
  17. gderreck

    gderreck Notebook Enthusiast

    Reputations:
    117
    Messages:
    43
    Likes Received:
    1
    Trophy Points:
    16
    I work out of town, Northern Alberta, and just caught up with the thread. Glad you had the opportunity to speak with Dell regarding the issue(s). I am looking forward to reading the synopsis. Read the tease...interesting about the MB "revisions". Perhaps that, in effect, is what they are. It seems that although we end up at the same place (overheating); we get there in different ways. Even my 2 machines (original and replacement) exhibit different symptoms.

    My compliments to Fox and everyone who contributed to this. It has been an education for me, just following the thread.
     
  18. Johnksss

    Johnksss .

    Reputations:
    11,531
    Messages:
    19,452
    Likes Received:
    12,819
    Trophy Points:
    931
    nvidia is only 150 bucks more now. :D
     
  19. 5150Joker

    5150Joker Tech|Inferno

    Reputations:
    4,974
    Messages:
    7,036
    Likes Received:
    113
    Trophy Points:
    231

    In that case I should just push for a new system at this point with 580Ms. No point in holding on to potentially defective hardware.
     
  20. skygunner27

    skygunner27 A Genuine Child of Zion

    Reputations:
    1,694
    Messages:
    2,679
    Likes Received:
    7
    Trophy Points:
    56
    I agree.
    10chars
     
  21. Johnksss

    Johnksss .

    Reputations:
    11,531
    Messages:
    19,452
    Likes Received:
    12,819
    Trophy Points:
    931
    580's are on back order. and now the new setup is only 150 bucks more.

    so now the drama about price performance is hereby "smacked in the face"... :D :D
     
  22. Zero989

    Zero989 Notebook Virtuoso

    Reputations:
    910
    Messages:
    2,835
    Likes Received:
    583
    Trophy Points:
    131
    with this and amds drivers one would literally have to be an idiot to go amd :p
     
  23. mharidas

    mharidas VLSI/FAB Engineer

    Reputations:
    340
    Messages:
    948
    Likes Received:
    0
    Trophy Points:
    30
    That might be for those who had terrible issues with it. I have yet to have a bad experience with either product. I will choose AMD at this time because the overall power draw is lower and I care about my monthly bills.
     
  24. Johnksss

    Johnksss .

    Reputations:
    11,531
    Messages:
    19,452
    Likes Received:
    12,819
    Trophy Points:
    931
    now that is nonsense..rotflmao!
    they both draw 100 watts maxed out.
     
  25. mharidas

    mharidas VLSI/FAB Engineer

    Reputations:
    340
    Messages:
    948
    Likes Received:
    0
    Trophy Points:
    30
    You sure about that? That sounds like a lot of assumptions, even after owning a CrossFire system, John. :D

    My specs don't draw more than 225 watts on most games un-overclocked, and on a single card using F@H it's only 163 watts. Most games hover around the 125-155 watt region on a single card.

    Overclocked I touch 256 watts; can you say the same for the Nvidia?

    Even on Furmark the draw is less than 289 watts un-overclocked. I have already seen many reporting over 300 watts easy on Furmark for the 580Ms, unless those people were reading it wrong.
     
  26. Aikimox

    Aikimox Weihenstephaner!

    Reputations:
    5,955
    Messages:
    10,196
    Likes Received:
    91
    Trophy Points:
    466
    The 580Ms draw more power at max load than the 6990s, about 10-15W each. That's what we have seen in-lab. All stock clocks, we never overclocked.
     
  27. mharidas

    mharidas VLSI/FAB Engineer

    Reputations:
    340
    Messages:
    948
    Likes Received:
    0
    Trophy Points:
    30

    ^^ Exactly, and since they have 160-SP clusters inactive on each of my 6970Ms - which are essentially the same chips that got binned as 6970M instead of the full 6990M - the power draw is lower at max loads compared to the 6990Ms, and lower still compared to the 580Ms.

    There should be no debate over this one, as it's old news.
     
  28. Yeti575

    Yeti575 Notebook Consultant

    Reputations:
    204
    Messages:
    205
    Likes Received:
    19
    Trophy Points:
    31
    Not an option in the UK - the 580s are still an extra £550 versus the 6990, or roughly $825..
     
  29. Johnksss

    Johnksss .

    Reputations:
    11,531
    Messages:
    19,452
    Likes Received:
    12,819
    Trophy Points:
    931
    really....i didn't see you benching at 885/1200 clocks, or did i miss something?
    Xen has some stock numbers as well for the 580's. i'll have to re-run them when they get around to sending my replacement.

    power draw was over 300 watts with an ES 2920XM at a max of 90W

    so, not sure what page you guys are on.
     
  30. mharidas

    mharidas VLSI/FAB Engineer

    Reputations:
    340
    Messages:
    948
    Likes Received:
    0
    Trophy Points:
    30

    At stock, the 2920XM's extra power draw could be a max of 5-10 watts, which is also doubtful, considering even most apps show my 2720QM pulling 55 watts when it's supposed to be 45 watts. Unless it's measured directly on the motherboard, those programs are just approximations.

    I get 289 watts max on Furmark, and without that I get much lower. That is a considerable reduction compared to the 580Ms.

    And I am more than happy with these GPUs; my next target is the HD 7000s.
     
  31. Johnksss

    Johnksss .

    Reputations:
    11,531
    Messages:
    19,452
    Likes Received:
    12,819
    Trophy Points:
    931
    i'll give it a retest when i get my other 580m.

    yeah, i get far more than 289....but at the end of the day...you are right...the 580 SLI and 2960XM, all overclocked, pull up to 380W...the amd was about 15 to 20 watts lower. (while running under dice)


    side note:
    if you're worried about 20 watts....lol...better run the igpu...haha
     
  32. skygunner27

    skygunner27 A Genuine Child of Zion

    Reputations:
    1,694
    Messages:
    2,679
    Likes Received:
    7
    Trophy Points:
    56
    You guys are supposed to be in the "Big Leagues"....who cares about power draw or monthly E bills?

    My IGP draws less power than both of your 580M SLI & 6990M CF!! lol. Don't even make me change my Intel HD 3000 from performance to balanced!!
    Don't even......................
     
  33. mharidas

    mharidas VLSI/FAB Engineer

    Reputations:
    340
    Messages:
    948
    Likes Received:
    0
    Trophy Points:
    30
    ~20+ watts is under extreme loads; under regular loads the difference is larger. I don't see much beyond barely 210 watts on most DX11 titles, and most of the time it's in the 175 watt region with both cards active, depending on the game. And I use my system for more than 11 hours a day at my apartment, either gaming or doing Folding@home.

    So I care about the 4-5 dollars I save every month; that's ~60 USD wasted per year when I am just fine with this setup at the moment. This is in El Paso; pretty soon I will be in Singapore, where electricity bills and room rents are the biggest killers in terms of cost of living. And the moment the HD 7000 launches I am getting rid of my HD 6970Ms, because moles in the know are reporting the TDPs for the HD 7000 mobile series as looking very, very good. It will be nowhere near 75 watts stock if all goes well at TSMC.

    As long as I get smooth frame rates on the vast majority of titles out there and it's cheap, I go with that one, regardless of whether it's AMD or Nvidia. At this time that is AMD.

    It's useless to pay 150 USD more for Nvidia and then pay 60 USD more (in El Paso; in Singapore even more) over the year, all for nothing. It's like I'm paying 150 more, or whatever amount, just so I can pay more in electricity bills. Besides, all three top options - 6970M/6990M/580M - are very playable on most titles, so why the need for the absolute fastest at the expense of power bills? Not for me.
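    The savings math generalizes to any draw delta; here is a rough formula (the 50 W delta, 11 h/day, and $0.11/kWh rate in the example are placeholder assumptions, not measured figures or an actual tariff):

```python
def annual_savings_usd(watts_delta, hours_per_day, usd_per_kwh):
    """Yearly electricity-cost difference for a given average power delta."""
    kwh_per_year = watts_delta / 1000 * hours_per_day * 365
    return kwh_per_year * usd_per_kwh

# Placeholder example: 50 W average delta, 11 h/day, $0.11/kWh.
print(f"${annual_savings_usd(50, 11, 0.11):.2f} per year")  # -> $22.08 per year
```

    Plug in your own measured delta and local rate; the result scales linearly with all three inputs.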
     
  34. Johnksss

    Johnksss .

    Reputations:
    11,531
    Messages:
    19,452
    Likes Received:
    12,819
    Trophy Points:
    931
    :D :D :D

    yeah, i'm just giving him a hard time. he's a good guy though and knows quite a lot.

    side note:
    dude..i run:
    42 tv
    phase change unit - 400W
    portable ac - 1200W
    water chiller - 680W
    two gtx 580s and an overclocked 980X - 1680W


    so 20 watts is of no concern for me.
    :D
     
  35. lqm

    lqm Notebook Consultant

    Reputations:
    109
    Messages:
    205
    Likes Received:
    64
    Trophy Points:
    41
    No issues with thermal shutdown.

    MB Rev. F2 1136

    M18x
    i7-2960XM
    16GB 1600
    2x 6990M
    2x 500GB Momentus XT RAID 0
     
  36. mharidas

    mharidas VLSI/FAB Engineer

    Reputations:
    340
    Messages:
    948
    Likes Received:
    0
    Trophy Points:
    30
    You have the monetary resources for that, bro; I don't. :D

    The max I tried was taking my Athlon Thoroughbreds back in the day from 1.4GHz to 3.15GHz on custom R404A; it was an expensive affair that I am not interested in anymore.
     
  37. mharidas

    mharidas VLSI/FAB Engineer

    Reputations:
    340
    Messages:
    948
    Likes Received:
    0
    Trophy Points:
    30
    Outlet? Or brand new build?
     
  38. FredFlint_

    FredFlint_ Notebook Consultant

    Reputations:
    4
    Messages:
    259
    Likes Received:
    4
    Trophy Points:
    31
    I have been waiting to see if Dell found a solution to the overheating issue, and it looks like it may take some time, so I think I will call them, but I'm not sure what to expect. What is the best way to proceed?

    Thanks.
     
  39. mharidas

    mharidas VLSI/FAB Engineer

    Reputations:
    340
    Messages:
    948
    Likes Received:
    0
    Trophy Points:
    30
    You could push for a 580M swap, citing that you are fed up with the failing fans causing the system to overheat, which will reduce the life of the product due to repeated similar scenarios.

    AW Lead Graphics Engineer Mr. Louis confirmed they have put 6990M replacements on hold as they need to get to the root cause. There's no point in sending out more of the same when they haven't tracked down the exact causes just yet.
     
  40. FredFlint_

    FredFlint_ Notebook Consultant

    Reputations:
    4
    Messages:
    259
    Likes Received:
    4
    Trophy Points:
    31
    The price difference in the UK is big, so I don't think they would do that, and I would prefer to have AMD cards anyway. I just would rather not have to send them the laptop and wait for it to come back, only to have the same or more issues.
     
  41. lqm

    lqm Notebook Consultant

    Reputations:
    109
    Messages:
    205
    Likes Received:
    64
    Trophy Points:
    41
    Ordered brand new from Dell website on 9/8/2011 and received unit on 9/19/2011.
     
  42. mharidas

    mharidas VLSI/FAB Engineer

    Reputations:
    340
    Messages:
    948
    Likes Received:
    0
    Trophy Points:
    30
    Ah, you are in the UK - then forget what I said.

    Stick with HWiNFO64 to run them full blast; just make sure the settings are not more than 3800 rpm for the GPUs, otherwise they seem to cut off at random.

    That's great to know you got a good system without the problems. As per the conference, we learned that the motherboards and GPUs are not physically different.
     
  43. lqm

    lqm Notebook Consultant

    Reputations:
    109
    Messages:
    205
    Likes Received:
    64
    Trophy Points:
    41
    Yeah. My GPU temps under heavy load, like Skyrim, max at about 75C. Fans kick in heavy during those times and usually bring the temps back down to the 60C range. This is all using HWMonitor.
     
  44. mharidas

    mharidas VLSI/FAB Engineer

    Reputations:
    340
    Messages:
    948
    Likes Received:
    0
    Trophy Points:
    30

    Very good, so you don't have to worry about a re-paste for a long while to come.
     
  45. killaz05

    killaz05 Notebook Evangelist

    Reputations:
    12
    Messages:
    373
    Likes Received:
    2
    Trophy Points:
    31
    I don't seem to be able to get HWiNFO64 to control my fans :-/
     
  46. lqm

    lqm Notebook Consultant

    Reputations:
    109
    Messages:
    205
    Likes Received:
    64
    Trophy Points:
    41
    Can't do it in MSI Afterburner either.
     
  47. nuroo

    nuroo Notebook Consultant

    Reputations:
    68
    Messages:
    156
    Likes Received:
    0
    Trophy Points:
    30
    HWiNFO64 can indeed control fans on all M18x's. You can even have a custom rpm setting based on CPU temp. Are you guys using it right?

    Matter of fact, it's one of the two known workarounds for the thermal shutdown problem dual 6990Ms are causing.
     
  48. killaz05

    killaz05 Notebook Evangelist

    Reputations:
    12
    Messages:
    373
    Likes Received:
    2
    Trophy Points:
    31
    Can you let me know what option I have to check to allow me access, or where in the program I can find it? Thank you.
     
  49. nuroo

    nuroo Notebook Consultant

    Reputations:
    68
    Messages:
    156
    Likes Received:
    0
    Trophy Points:
    30


    OK

    First go into Config and uncheck both GPU I2C Support and GPU I2C via NVAPI. That will make it load sensors much faster. Press OK.

    Then go to Sensors:
    at the bottom of the pop-up window, press the little button to the left of "Logging Start". It looks like a fan.

    Use Custom Auto.
    Leave the lower settings alone; change the maximums to only 3900 rpm.
     
  50. killaz05

    killaz05 Notebook Evangelist

    Reputations:
    12
    Messages:
    373
    Likes Received:
    2
    Trophy Points:
    31
    Thank you very much! Very helpful. Any reason to keep it at only 3900rpm?
     