The Notebook Review forums were hosted by TechTarget, which shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information posted on the forums was preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.
Problems? See this thread at archive.org.

    Enable SLI problems M17x - 260M main 280M secondary

    Discussion in 'Alienware' started by GLiDE-86, Sep 25, 2009.

  1. sleey0

    sleey0 R.I.P. AW Side Topics

    Reputations:
    1,870
    Messages:
    7,976
    Likes Received:
    0
    Trophy Points:
    205
    Good luck and let us know how it goes :D
     
  2. GLiDE-86

    GLiDE-86 Notebook Enthusiast

    Reputations:
    21
    Messages:
    24
    Likes Received:
    0
    Trophy Points:
    5
    My head hurts... hehehehe

    Damn I forgot how fun gaming systems can be... LoL!!!

     
  3. cookinwitdiesel

    cookinwitdiesel Retired Bencher

    Reputations:
    4,365
    Messages:
    11,264
    Likes Received:
    263
    Trophy Points:
    501
    Keep us posted.

    I can offer advice on NVflash if needed; I have used both it and NiBiTor a lot.
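
    For anyone wanting to try the same thing, the usual NVflash routine is roughly: back up the existing vBIOS first, then flash the edited one. A minimal sketch, with placeholder file names (not the actual files used here):

        nvflash -b backup_260m.rom
        nvflash -4 -5 -6 modified_vbios.rom

    The -4/-5/-6 switches are the overrides commonly used to skip the ID-mismatch checks when flashing a vBIOS meant for a different board; check the readme for your NVflash version before relying on them, and keep the backup somewhere safe.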
     
  4. sgilmore62

    sgilmore62 uber doomer

    Reputations:
    356
    Messages:
    1,897
    Likes Received:
    0
    Trophy Points:
    55
    Yeah, I thought SLI had to be two identical cards too; curious to see how this flash thing turns out.
     
  5. sleey0

    sleey0 R.I.P. AW Side Topics

    Reputations:
    1,870
    Messages:
    7,976
    Likes Received:
    0
    Trophy Points:
    205
    Glide was able to get the card to flash successfully after a few tries.

    Just waiting to see how it went.....
     
  6. rubyboy79

    rubyboy79 Notebook Evangelist

    Reputations:
    24
    Messages:
    530
    Likes Received:
    0
    Trophy Points:
    30
    According to the NVIDIA SLI FAQ here http://www.slizone.com/page/slizone_faq.html#c7, you can't mix and match two different cards, though I do know of some people in desktop configs running two different cards in SLI mode.

    Basically, my view is that if it works, the more powerful card would downclock to match the weaker card.

    I feel Glide should flash his 260 to a 280 instead of changing the 280 into a 260, because the other way you'll lose some performance: the 280 would run at the speeds of a 260, so you wouldn't be maximizing its cores, etc.

    I guess mixing and matching different cards doesn't work on mobile cards then? Maybe the mobile drivers aren't as mature as the desktop ones.
     
  7. Mandrake

    Mandrake Notebook Nobel Laureate NBR Reviewer

    Reputations:
    4,976
    Messages:
    12,675
    Likes Received:
    65
    Trophy Points:
    466
    Doesn't matter which he flashes since he can change the clock speeds anyway.
     
  8. sgilmore62

    sgilmore62 uber doomer

    Reputations:
    356
    Messages:
    1,897
    Likes Received:
    0
    Trophy Points:
    55
    I think it matters from a heat standpoint: flashing the 280 to a 260 makes more sense since he can software OC. It is better for the GPUs to be running at the lower clocks, and he can software OC when and if he needs it for gaming, rather than flashing the 260 to be overclocked all the time.
     
  9. Mandrake

    Mandrake Notebook Nobel Laureate NBR Reviewer

    Reputations:
    4,976
    Messages:
    12,675
    Likes Received:
    65
    Trophy Points:
    466
    But when flashing it he doesn't need the software.
     
  10. sgilmore62

    sgilmore62 uber doomer

    Reputations:
    356
    Messages:
    1,897
    Likes Received:
    0
    Trophy Points:
    55
    So you are saying he should flash the 260 to a 280?
     
  11. Marvie100

    Marvie100 On a Mission

    Reputations:
    394
    Messages:
    1,221
    Likes Received:
    1
    Trophy Points:
    55
    Didn't Moo already do an experiment like this or was he only trying to flash his single 260 to a 280?
     
  12. Mandrake

    Mandrake Notebook Nobel Laureate NBR Reviewer

    Reputations:
    4,976
    Messages:
    12,675
    Likes Received:
    65
    Trophy Points:
    466
    I'm saying it doesn't matter.
     
  13. ichime

    ichime Notebook Elder

    Reputations:
    2,420
    Messages:
    2,676
    Likes Received:
    3
    Trophy Points:
    56
    If anything, he should put the 280 in the primary slot and the 260 in the secondary. Has he tried that yet?
     
  14. sleey0

    sleey0 R.I.P. AW Side Topics

    Reputations:
    1,870
    Messages:
    7,976
    Likes Received:
    0
    Trophy Points:
    205
    Problem with that is the SLI connector is located differently on each card (I think).

    It'd be a stretch to get the cable to fit, but I am not 100% sure on this.
     
  15. cookinwitdiesel

    cookinwitdiesel Retired Bencher

    Reputations:
    4,365
    Messages:
    11,264
    Likes Received:
    263
    Trophy Points:
    501
    I think they actually are in the same place. Why would Dell want to make two different cards? ;) It's cheaper to just make one.
     
  16. sleey0

    sleey0 R.I.P. AW Side Topics

    Reputations:
    1,870
    Messages:
    7,976
    Likes Received:
    0
    Trophy Points:
    205
    Like I said, I am not 100% sure on that.

    He didn't want to start tearing off heatsinks anyway, and I'm not sure the placement of the cards makes any difference.
     
  17. cookinwitdiesel

    cookinwitdiesel Retired Bencher

    Reputations:
    4,365
    Messages:
    11,264
    Likes Received:
    263
    Trophy Points:
    501
    Just thinking it out in my head, the connectors should be in at least roughly the same place.
     
  18. sgilmore62

    sgilmore62 uber doomer

    Reputations:
    356
    Messages:
    1,897
    Likes Received:
    0
    Trophy Points:
    55
    might have to?
     
  19. cookinwitdiesel

    cookinwitdiesel Retired Bencher

    Reputations:
    4,365
    Messages:
    11,264
    Likes Received:
    263
    Trophy Points:
    501
    It is easy to take off the heatsinks; you should anyway, to replace the crap thermal paste they used.
     
  20. Uroboros

    Uroboros Notebook Evangelist

    Reputations:
    15
    Messages:
    412
    Likes Received:
    0
    Trophy Points:
    30
    I was reading this a few days ago; did he ever get this to work? If you buy a slave card off somebody, does it work alone as the main GPU?
     
  21. sleey0

    sleey0 R.I.P. AW Side Topics

    Reputations:
    1,870
    Messages:
    7,976
    Likes Received:
    0
    Trophy Points:
    205
    For those that wanted to know, even after a successful flash the cards still wouldn't work in SLI.

    He stated that the SLI option in the NVIDIA Control Panel wouldn't show up, even though he could select the second card to run PhysX.

    So, looks like I'll be able to test the 280 in my M15x after all ;)
     
  22. Mandrake

    Mandrake Notebook Nobel Laureate NBR Reviewer

    Reputations:
    4,976
    Messages:
    12,675
    Likes Received:
    65
    Trophy Points:
    466
    I think more people want to know if this works anyway.
     
  23. sleey0

    sleey0 R.I.P. AW Side Topics

    Reputations:
    1,870
    Messages:
    7,976
    Likes Received:
    0
    Trophy Points:
    205
    I have doubts it will work unless I underpower the card a little.

    The 65W limit might hold it back. I understand why Dell did that, since all the new DX11 / 40nm parts max out at 65W (instead of 75W). At least the ATI cards do; not sure about the 3xx series.
     
  24. cookinwitdiesel

    cookinwitdiesel Retired Bencher

    Reputations:
    4,365
    Messages:
    11,264
    Likes Received:
    263
    Trophy Points:
    501
    Why limit the power spec on the slot? :(

    Are they that worried about bad battery life?! Bad call, AW.....
     
  25. sleey0

    sleey0 R.I.P. AW Side Topics

    Reputations:
    1,870
    Messages:
    7,976
    Likes Received:
    0
    Trophy Points:
    205
    I don't know.

    The only thing I can think of is the power supply (150W), and that the full 75W won't be needed when the new cards come out ;)
     
  26. cookinwitdiesel

    cookinwitdiesel Retired Bencher

    Reputations:
    4,365
    Messages:
    11,264
    Likes Received:
    263
    Trophy Points:
    501
    But having the full 75 watts matters A LOT for overclocking; there is no direct connection to the power supply like there is on desktops.
     
  27. sleey0

    sleey0 R.I.P. AW Side Topics

    Reputations:
    1,870
    Messages:
    7,976
    Likes Received:
    0
    Trophy Points:
    205
    The 40nm mobile 5870 is specced at 40-65W. So even at max there should be some headroom to OC.

    The 280M supposedly runs at 75W (which is the limit of the M17x MXM slot), and you can OC that decently, right?
     
  28. cookinwitdiesel

    cookinwitdiesel Retired Bencher

    Reputations:
    4,365
    Messages:
    11,264
    Likes Received:
    263
    Trophy Points:
    501
    They always spec the TDP well above what the chips actually draw.

    SPCR (a pretty good site) did a review comparing the Q9550 and the Q9550s (95W vs. 65W TDP) and found that they put up exactly the same power usage.
     
  29. sleey0

    sleey0 R.I.P. AW Side Topics

    Reputations:
    1,870
    Messages:
    7,976
    Likes Received:
    0
    Trophy Points:
    205
    Processors fluctuate a lot more than GPUs.

    I am sure Dell has their reasons, and we won't know until newer cards come out ;)
     
  30. sleey0

    sleey0 R.I.P. AW Side Topics

    Reputations:
    1,870
    Messages:
    7,976
    Likes Received:
    0
    Trophy Points:
    205
    From tweakers.net

    According to a post by The Source, there are nine separate mobile parts inbound, three GPUs with three variants each. The family is called Manhattan, the chips are Broadway, Madison, and Park. Each has high end XT, mid range Pro, and low end LP variant, but some of those names may change before you see them. Since they are Evergreen parts, they obviously are all 40nm, DX11 and should be notably faster than their mobile M9x predecessors.

    Power is said to be 45-60W for the GDDR5 Broadway XT, dropping to 30-40W for the Pro, and the GDDR3 based LP takes only 29W. Madison uses GDDR5 for the 20-30W XT, either GDDR3 or 5 for the 20-25W Pro, and 15-20W for the GDDR3 based LP. Park goes down from there, 12-15W for the GDDR5 XT, 10-12W and sub-8W for the Pro and LP respectively, both of which use GDDR3. As with the 7xx parts, the memory controller will allow vanilla DDR3 in place of GDDR3 as well.
     
  31. cookinwitdiesel

    cookinwitdiesel Retired Bencher

    Reputations:
    4,365
    Messages:
    11,264
    Likes Received:
    263
    Trophy Points:
    501
    Hopefully. I just can't get too excited over the 4870s, GDDR5 or not, knowing that by the time I manage to sell my GTX 280Ms the new 5K cards (or the GT300) will be out. And I am in no financial position to be swapping around top-shelf GPUs right now :( (maybe later, though)
     
  32. sleey0

    sleey0 R.I.P. AW Side Topics

    Reputations:
    1,870
    Messages:
    7,976
    Likes Received:
    0
    Trophy Points:
    205
    That is exactly the same way I feel, scook.

    I was waiting for GDDR5 4870s forever, and now that they are here I think the Evergreen parts aren't too far off (Q1 2010).

    So I am not quite as pumped now as I would have been a few months ago.
     
  33. cookinwitdiesel

    cookinwitdiesel Retired Bencher

    Reputations:
    4,365
    Messages:
    11,264
    Likes Received:
    263
    Trophy Points:
    501
    The 4870s with GDDR5 should just have been an option on the M17x at launch. The cards have existed for a LONG time at Flextronics; they were originally meant for the W840DI (an MXM 2.1b version, of course).
     
  34. sleey0

    sleey0 R.I.P. AW Side Topics

    Reputations:
    1,870
    Messages:
    7,976
    Likes Received:
    0
    Trophy Points:
    205
    Would GDDR5 have any effect if used in an MXM 2.1 slot?
     
  35. The_Moo™

    The_Moo™ Here we go again.....

    Reputations:
    3,973
    Messages:
    13,930
    Likes Received:
    0
    Trophy Points:
    455
    When the 840 went EOL, it was because Dell contacted them to make the M17x.

    Simple as that.... If Dell hadn't called, they would be out right now ;)
     
  36. cookinwitdiesel

    cookinwitdiesel Retired Bencher

    Reputations:
    4,365
    Messages:
    11,264
    Likes Received:
    263
    Trophy Points:
    501
    GDDR5 would work fine in MXM 2.1; no reason for it not to. The memory subsystem of the GPU is independent of the MXM slot and the chipset/CPU.

    The video card is pretty much a standalone computer system that interfaces with the CPU/RAM via the PCIe lanes.
     
  37. sleey0

    sleey0 R.I.P. AW Side Topics

    Reputations:
    1,870
    Messages:
    7,976
    Likes Received:
    0
    Trophy Points:
    205
    From what I have heard, the MXM 2.1 slot has a lower max for memory bandwidth.

    Isn't half the performance gain of GDDR5 dependent on the increased speed of the memory?
     
  38. cookinwitdiesel

    cookinwitdiesel Retired Bencher

    Reputations:
    4,365
    Messages:
    11,264
    Likes Received:
    263
    Trophy Points:
    501
    The bandwidth would be set by the PCIe spec, not by MXM. And both were PCIe 2.0 x8, so I would imagine there is no difference. I thought the difference between MXM 2 and MXM 3 was just the power and cooling spec (mainly the mounts for cooling).
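
    To put rough, back-of-the-envelope numbers on it (assumed figures, not measured): PCIe 2.0 at x8 is 8 lanes x 500 MB/s = about 4 GB/s each way between the card and the rest of the system, while the GDDR5 talks to the GPU directly on the card; for example, a 256-bit bus at an effective 3.6 Gbps per pin works out to 3.6 x 256 / 8, roughly 115 GB/s. So the GDDR5 gain lives entirely on the card and shouldn't be choked by the MXM 2.1 slot either way.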
     
  39. sleey0

    sleey0 R.I.P. AW Side Topics

    Reputations:
    1,870
    Messages:
    7,976
    Likes Received:
    0
    Trophy Points:
    205
    Yeah, I am not sure, and I haven't found too many resources online to look up MXM specs.

    Whatever. I just want to see the M15x with a 5870 :D
     
  40. Mandrake

    Mandrake Notebook Nobel Laureate NBR Reviewer

    Reputations:
    4,976
    Messages:
    12,675
    Likes Received:
    65
    Trophy Points:
    466
    So are you keeping both?
     
  41. sleey0

    sleey0 R.I.P. AW Side Topics

    Reputations:
    1,870
    Messages:
    7,976
    Likes Received:
    0
    Trophy Points:
    205
    For now ;) lol

    I got a good deal on both, so if I need to sell the M17x I shouldn't lose too much money (if any).
     
  42. sleey0

    sleey0 R.I.P. AW Side Topics

    Reputations:
    1,870
    Messages:
    7,976
    Likes Received:
    0
    Trophy Points:
    205
    And I am pretty stoked to get an Intel chipset with the M15x.

    I am NOT a fan at all of nvidia's chipsets.....
     
  43. Mandrake

    Mandrake Notebook Nobel Laureate NBR Reviewer

    Reputations:
    4,976
    Messages:
    12,675
    Likes Received:
    65
    Trophy Points:
    466
    Going off topic, but why? I love the integrated GPU.
     
  44. sleey0

    sleey0 R.I.P. AW Side Topics

    Reputations:
    1,870
    Messages:
    7,976
    Likes Received:
    0
    Trophy Points:
    205
    I just have had better luck and more stability with Intel chipsets.

    Driver support, etc.

    I won't miss the 9400 too much. lol
     
  45. cookinwitdiesel

    cookinwitdiesel Retired Bencher

    Reputations:
    4,365
    Messages:
    11,264
    Likes Received:
    263
    Trophy Points:
    501
    Intel IGPs may suck, but DAMN do the chipsets OC better. At least this is the case in the desktop world; I am relatively new to the laptop world of performance computing.

    Typically the P series offers awesome performance for a low price as well.

    Glad you are happy, Sleey0!
     
  46. sleey0

    sleey0 R.I.P. AW Side Topics

    Reputations:
    1,870
    Messages:
    7,976
    Likes Received:
    0
    Trophy Points:
    205
    Thanks, scook!

    I never really use the integrated in the M17x so I think it causes more problems than anything (for me at least).
     
  47. GLiDE-86

    GLiDE-86 Notebook Enthusiast

    Reputations:
    21
    Messages:
    24
    Likes Received:
    0
    Trophy Points:
    5
    Thanks to everyone for the help with the attempted 260M/280M SLI config in the M17X. We flashed with every possible combination we could come up with, to no avail. The SLI option never shows up in the NVIDIA Control Panel; the card was only available for PhysX.

    Let me just say, for anyone who didn't already know, Sleey0 is one classy dude to deal with. I felt more like I had failed in returning the card, because I really wanted it to work. :( Thanks for the refund, bro!

    Oh well, I DO now have a very intimate relationship with my M17X and could take the sucker apart in my sleep at this point... LoL!!!

    Once again, thanks to all involved. As I get around to it, you will get rep points from ol' GLiDE... aCk!!! Thfpppttt!!! :p
     
  48. sleey0

    sleey0 R.I.P. AW Side Topics

    Reputations:
    1,870
    Messages:
    7,976
    Likes Received:
    0
    Trophy Points:
    205
    For the record, Glide is one smooth customer!

    No prob, Glide, and anytime, brother!
     