Good luck and let us know how it goes!
-
My head hurts... hehehehe
Damn I forgot how fun gaming systems can be... LoL!!!
-
cookinwitdiesel Retired Bencher
keep us posted
I can offer advice on NVflash if needed; I have used both it and NiBiTor a lot -
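Not from this thread, but since NVflash came up: a rough sketch of the usual back-up-first workflow, wrapped in Python's subprocess module just to show the order of operations. The flag names and file names are from memory and are assumptions; check nvflash --help on your version before trusting them.

```python
# Sketch of a typical NVflash cross-flash workflow (flags are assumptions --
# verify against `nvflash --help` for your nvflash build).
import subprocess

def run(args):
    print(">", " ".join(args))           # show the command being issued
    subprocess.run(args, check=True)      # stop immediately if a step fails

# 1. Dump the card's current BIOS first so there is always a way back.
run(["nvflash", "--save", "backup_280m.rom"])

# 2. Edit the dumped ROM (clocks, device ID, etc.) in NiBiTor and save it
#    as modded.rom -- that part happens in the NiBiTor GUI, not here.

# 3. Flash the edited ROM. "-6" is the commonly cited switch for overriding
#    a PCI subsystem ID mismatch when cross-flashing; treat it as an assumption.
run(["nvflash", "-6", "modded.rom"])
```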
Yeah, I thought SLI had to be two identical cards too; curious to see how this flash thing turns out.
-
Glide was able to get the card to flash successfully after a few tries.
Just waiting to see how it went..... -
According to the NVIDIA SLI FAQ here http://www.slizone.com/page/slizone_faq.html#c7
you can't mix and match two different cards, though I do know of some people with two different cards in SLI mode in desktop configs.
Basically my view is that if it works, the more powerful card would downclock to match the weaker card.
I feel Glide should flash his 260 to a 280 instead of changing your 280 into a 260, because that way you'll lose some performance: your 280 will run at the speeds of a 260 and hence not maximize your cores etc.
I guess mixing and matching different cards doesn't work on mobile cards then? Maybe the drivers aren't mature enough, unlike the desktop ones -
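Side note (not from the thread): a quick back-of-envelope on what "downclocking to match" actually costs, using the commonly quoted reference clocks for the two cards. Treat the numbers as assumptions; the clocks baked into a particular vBIOS can differ.

```python
# Commonly quoted reference clocks (core, shader, memory in MHz) -- assumed
# nominal values, not read from either card in this thread.
clocks = {
    "GTX 280M": (585, 1463, 950),
    "GTX 260M": (550, 1375, 950),
}

for domain, fast, slow in zip(("core", "shader", "memory"),
                              clocks["GTX 280M"], clocks["GTX 260M"]):
    loss = (fast - slow) / fast * 100
    print(f"{domain:6s}: 280M {fast} MHz vs 260M {slow} MHz ({loss:.1f}% lower)")

# Roughly a 6% core/shader deficit either way -- small enough that a software
# overclock can win it back, which is why the flash direction matters little.
```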
Doesn't matter which he flashes since he can change the clock speeds anyway.
-
I think it matters from a heat standpoint: flashing the 280 to a 260 makes more sense since he can software OC. It is better for the GPUs to be running at the lower clocks, and he can software OC when and if he needs it for gaming, rather than flashing the 260 to be overclocked all the time.
-
but when flashing it he doesn't need the software
-
so you are saying he should flash the 260 to a 280?
-
Didn't Moo already do an experiment like this or was he only trying to flash his single 260 to a 280?
-
If anything, he should put the 280 in the primary slot and the 260 in the secondary. Has he tried that yet?
-
It'd be a stretch to get the cable to fit, but I am not 100% sure on this. -
cookinwitdiesel Retired Bencher
I think they actually are in the same place. Why would Dell want to make 2 different cards?
Cheaper to just make 1
-
Like I said I am not 100% on that.
He didn't want to start tearing off heatsinks anyway and I'm not sure the placement of the cards makes any difference. -
cookinwitdiesel Retired Bencher
just thinking it out in my head, the connectors should be in at least roughly the same place
-
cookinwitdiesel Retired Bencher
It is easy to take off the heatsinks; you should anyway, to replace the crap thermal paste they used
-
I was reading this a few days ago; did he ever get this to work? If you buy a slave card off somebody, does it work alone as the main GPU?
-
For those that wanted to know, even after a successful flash the cards still wouldn't work in SLI.
He stated that the SLI option in the NV control panel wouldn't show up, even though he could select the second card to run PhysX.
So, looks like I'll be able to test the 280 in my M15x after all -
I have doubts it will work unless I underpower the card a little.
The 65W requirement might hold it back. I understand why Dell did that, because all the new DX11 / 40nm parts are max 65W (instead of 75W). At least the ATI cards are; not sure about the 3xx series. -
cookinwitdiesel Retired Bencher
Why limit the power spec on the slot?
Are they that worried about bad battery life?! Bad call AW..... -
I don't know.
The only thing I can think of is the power supply (150W), and that the full 75W won't be needed when the new cards come out -
cookinwitdiesel Retired Bencher
But having the full 75 watts matters A LOT for overclocking; there's no direct connection to the power supply like there is for desktops
-
The 40nm mobile 5870 is specced at 40-65W, so even at the max there should be some headroom to OC.
The 280M supposedly runs at 75W (which is the limit of the M17x MXM slot) and you can OC that decently, right? -
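For what it's worth, a rough power-scaling estimate (my own sketch, not from the thread), assuming GPU draw is dominated by dynamic power, which scales roughly with frequency times voltage squared. The wattage figures are illustrative assumptions, not measurements of any specific card.

```python
# Rough OC headroom estimate: dynamic power ~ frequency x voltage^2.
def scaled_power(p0_watts, f_ratio, v_ratio):
    """Estimate power after scaling clock (f_ratio) and voltage (v_ratio)."""
    return p0_watts * f_ratio * (v_ratio ** 2)

stock_watts = 55.0   # assumed stock draw, mid-range of the quoted 40-65W spec
slot_limit  = 65.0   # the M15x MXM power limit discussed above

oc_watts = scaled_power(stock_watts, f_ratio=1.15, v_ratio=1.0)  # +15% clock, stock volts
print(f"+15% core clock at stock voltage: ~{oc_watts:.0f} W (slot limit {slot_limit:.0f} W)")
```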
cookinwitdiesel Retired Bencher
They always spec the TDP well above what the chips actually draw.
SPCR (a pretty good site) did a review comparing the Q9550 and the Q9550S (95W vs 65W TDP) and found that they put up exactly the same power usage. -
Processors' power draw fluctuates a lot more than GPUs'.
I am sure Dell has their reasons, and we won't know until newer cards come out -
From tweakers.net
According to a post by The Source, there are nine separate mobile parts inbound, three GPUs with three variants each. The family is called Manhattan, the chips are Broadway, Madison, and Park. Each has high end XT, mid range Pro, and low end LP variant, but some of those names may change before you see them. Since they are Evergreen parts, they obviously are all 40nm, DX11 and should be notably faster than their mobile M9x predecessors.
Power is said to be 45-60W for the GDDR5 Broadway XT, dropping to 30-40W for the Pro, and the GDDR3 based LP takes only 29W. Madison uses GDDR5 for the 20-30W XT, either GDDR3 or 5 for the 20-25W Pro, and 15-20W for the GDDR3 based LP. Park goes down from there, 12-15W for the GDDR5 XT, 10-12W and sub-8W for the Pro and LP respectively, both of which use GDDR3. As with the 7xx parts, the memory controller will allow vanilla DDR3 in place of GDDR3 as well. -
cookinwitdiesel Retired Bencher
Hopefully. I just can't get too excited over the 4870s, GDDR5 or not, knowing that by the time I manage to sell my GTX 280Ms the new 5k cards will be out (or GT300). And I am in no financial position to be swapping around top-shelf GPUs right now
(maybe later though)
-
That is exactly how I feel, scook.
I was waiting for GDDR5 4870s forever, and now that they are here I think the Evergreen parts aren't too far off (Q1 2010).
So I am not quite as pumped now as I would have been a few months ago. -
cookinwitdiesel Retired Bencher
The 4870s with GDDR5 just should have been an option on the M17x at launch. The cards have existed for a LONG time at flextronics - they were originally meant for the W840DI (an MXM 2.1b version of course)
-
Would GDDR5 have any effect if used in an MXM 2.1 slot?
-
When the 840 went EOL it was because Dell contacted them to make the M17x.
Simple as that .... If Dell hadn't called, they would be out right now -
cookinwitdiesel Retired Bencher
GDDR5 would work fine in MXM 2.1; no reason for it not to. The memory subsystem for the GPU is independent of the MXM slot and chipset/CPU.
The video card is pretty much a standalone computer system that interfaces with the CPU/RAM via the PCI-e lanes -
From what I have heard, the MXM 2.1 slot has a lower max for the memory bandwidth.
Isn't half the performance gain from GDDR5 dependent on the increased speed of the memory? -
cookinwitdiesel Retired Bencher
The bandwidth would be set by the PCI-e spec, not the MXM spec. And both were PCI-e 2.0 x8, so I would imagine there is no difference. I thought the difference between MXM 2 and MXM 3 was just the power and cooling spec (mainly mounts for cooling)
-
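To put some rough numbers on this (my own sketch; the clock and bus-width figures are the commonly quoted ones and should be treated as assumptions): the GDDR5 speed-up is entirely on-card memory bandwidth, which never crosses the MXM/PCI-e link.

```python
# On-card memory bandwidth vs. the PCI-e link: the former is where GDDR5 helps,
# the latter is the only thing the MXM slot actually carries.
def mem_bandwidth_gbs(bus_bits, effective_mts):
    """Bus width (bits) x effective transfer rate (MT/s) -> GB/s."""
    return bus_bits / 8 * effective_mts * 1e6 / 1e9

gddr3_280m = mem_bandwidth_gbs(256, 1900)   # ~950 MHz GDDR3, double data rate
gddr5_4870 = mem_bandwidth_gbs(256, 3600)   # ~900 MHz GDDR5, effectively quad-pumped
pcie2_x8   = 8 * 0.5                        # PCI-e 2.0: ~500 MB/s per lane, 8 lanes

print(f"GTX 280M GDDR3 : ~{gddr3_280m:.0f} GB/s on-card")
print(f"4870 GDDR5     : ~{gddr5_4870:.0f} GB/s on-card")
print(f"PCI-e 2.0 x8   : ~{pcie2_x8:.0f} GB/s between card and system")
```

So even a big jump in memory bandwidth changes nothing about what the slot has to carry, which fits the point above that the MXM 2 vs 3 difference comes down to power and cooling.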
Yeah, I am not sure and haven't found too many resources online to look up MXM specs.
Whatever. I just want to see the M15x with a 5870 -
For now
lol
I got a good deal on both, so if I need to sell the M17x I shouldn't lose too much money (if any). -
And I am pretty stoked to get an Intel chipset with the M15x.
I am NOT a fan at all of nvidia's chipsets..... -
Going off topic, but why? I love the integrated GPU.
-
I just have had better luck and more stability with Intel chipsets.
Driver support, etc.
I won't miss the 9400 too much. lol -
cookinwitdiesel Retired Bencher
Intel IGPs may suck, but DAMN do the chipsets OC better. At least that is the case in the desktop world; I am relatively new to the laptop world of performance computing
Typically the P series offers awesome performance for low price as well
Glad you are happy Sleey0! -
Thanks, scook!
I never really use the integrated in the M17x so I think it causes more problems than anything (for me at least). -
Thanks to everyone for the help with the attempted 260M/280M SLI config in the M17x. We flashed with every possible combination we could come up with, to no avail. The SLI option never comes up in the NVIDIA Control Panel; the card was only available for PhysX.
Let me just say, for anyone who didn't already know, Sleey0 is one classy dude to deal with. I felt more like I had failed in returning the card, because I really wanted it to work. Thanks for the refund, bro!
Oh well, I DO now have a very intimate relationship with my M17X and could take the sucker apart in my sleep at this point... LoL!!!
Once again, thanks to all involved. As I get around to it, you will get rep points from ol' GLiDE... aCk!!! Thfpppttt!!! -
For the record, glide is one smooth customer!
no prob, glide and anytime brother!