If I were to purchase the M17x with just the basic 260m, could I later add a 280m? Would it make a difference which MXM slot they were in?
Also, I am wondering if there are any codes for us Canadians?
This is my first post, but I have been lurking on here for a while. I just never bothered to make an account because I'm on my dad's 40-year-old laptop.
-
dondadah88 Notebook Nobel Laureate
lol. It's fine, you should have made an account sooner.
I called to see if you can make the upgrade; one rep said yes, for 600 dollars, and two reps said no. (I believe the two reps over the one that said yes, because he seemed unsure.)
If I were you and wanted 280m SLI over the 260m SLI, I would wait, get them now, and be over and done with it. Trust me, you don't want to go down the path of upgrading the GPU later; it is not nice (and not just the labor).
It's even worse because you can't just go to a website like Newegg, RJTech, or Eurocom to find the GPUs. You have to search forums and eBay, then have the money on the spot, or lose the auction by a second and start all over. Trust me, it's a pain.
So I would wait, even if it takes you a month and the suspense is killing you. It's worth it in the end. -
No, he's asking if you can install a 260m and a 280m together, lol.
-
dondadah88 Notebook Nobel Laureate
Yeah, you can. You would have to flash them to both be identical, but it's not worth going through all that, and the performance would be unstable because one card would not fire as fast or push as much data as the other one.
-
No, they would theoretically work fine in SLI if you flashed one, like I did.
-
I'm not even going to attempt that.
By the way, Moo, are you getting that QX9300 from HP? -
lol, no. I do a lot of things for this forum, but I can't afford anything else right now.
Sadly, college is expensive, and I don't want a quad; I want an X9100. -
Hehe, yeah, my university is expensive too, $40k a year, damn.
-
They're gonna work without flashing; the GTX 280m will downclock to the same clocks as the GTX 260m and will only use 112 shaders. That's how SLI works.
-
@ big, I thought it needed to have 280 firmware to work correctly....
I know it will work, but I haven't seen SLI on a laptop with 2 different cards... -
Cards with the same generation of GPU work in SLI even if the shader count/clock/memory is different.
Or at least they should.
-
dondadah88 Notebook Nobel Laureate
My belief is that SLI will see both cards, and since they're flashed the same it will treat them the same, but because one has 128 SPs and the other 112 SPs, it will make games unstable.
Example with AFR: a 280m and a 260m, both flashed to 280m's (which sounds bad, so I will still call them the 260 and the 280 to get the point across). You have the odd frames (1, 3, 5) and the even frames (2, 4, 6).
Now a quick explanation of AFR: it makes one card render the odd frames and the other card render the even frames, and it just keeps firing. It does this very fast, lol.
OK, now picture Crysis with shaders on Very High, maxing out both cards (frames will be numbered; the 280m takes the odd ones, the 260m the even ones).
Frame:
1st: the 280m takes charge; it can possibly do 25 fps by itself.
2nd: the 260m takes charge; it can possibly do 20 fps by itself.
3rd: 280m
4th: 260m
etc.
Now, because it's happening so fast in fps (frames per second), let's switch to fph (frames per hour). Much slower, but you can see the point clearly.
The 1st can do 25, running nice and smooth, and then they swap.
The 2nd can do 20 fine, but it drops the pace from 25, because 20 is all it can manage while making every shader look sexy, so it backs off from 25 to 20 for the next frame.
On the 3rd, the better card takes charge again. Following the lead as AFR does, knowing it can do 25 but having just seen the pace drop from 25 fps to 20 fps, it hits 22.
On the 4th, the 260m, after just maxing out at 20 fps, sees that the pace was just at 22, tries to match, runs unstable, and does 18 fps.
Now that you're getting the picture, switch it back from fph to fps and run that through your head. And now:
On the 5th, the 280m sees what's going on, gives up, and bluescreens or just crashes.
That's my belief. Could I be wrong? Yes, but I haven't seen it tested, so that's my belief. -
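The frame-by-frame walkthrough above can be put in code. This is a toy pacing sketch, not how a real driver behaves: it assumes AFR strictly alternates frames between the two cards and that each frame costs that card's full frame time, ignoring pipelining and parallel rendering. The 25 and 20 fps figures are the hypothetical numbers from the post.

```python
# Toy AFR (Alternate Frame Rendering) pacing sketch. Assumption: frames
# strictly alternate between two cards and each frame takes 1/fps seconds
# on whichever card renders it; real drivers pipeline frames in parallel.

def afr_fps(odd_card_fps: float, even_card_fps: float, frames: int = 100) -> float:
    """Effective frame rate when two cards alternate odd/even frames."""
    total_time = 0.0
    for frame in range(1, frames + 1):
        card_fps = odd_card_fps if frame % 2 == 1 else even_card_fps
        total_time += 1.0 / card_fps  # time this card spends on its frame
    return frames / total_time

# Hypothetical 280m (25 fps alone) paired with a 260m (20 fps alone):
print(round(afr_fps(25, 20), 1))  # 22.2 - paced between the two cards
```

Under these assumptions the pair lands near the harmonic mean of the two rates rather than crashing, which is why the instability claim really needs a live test.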
dondadah88 Notebook Nobel Laureate
Someone flashed a 280m to a 9800M GTX in the Sager forum (thanks emike09), and it still read as 128 SP, so you will still have 128. I have done some research on this, but it's 2:10am over here, so I'm pooped. -
The more powerful card will be limited to the same number of shaders by the driver.
And anyway, there's no game that can max out the shaders, so the stuttering isn't going to happen because of that.
People tend to think that just because you have more shaders, they are all being used, which isn't true.
-
Yeah, my only issue is that SLI is picky about cards having the same firmware. I know it will downclock, but the device IDs not matching is my only concern.
-
Yes, SLI works like that (and I already tested it).
But if you want a more trustworthy source:
Mis-Matched Card Capabilities
One of the major roadblocks for SLI originally was the fact that you had to have matching video cards to use it. Worse still, early SLI adopters not only had to worry about the brand of the cards they were using, but also the BIOS versions for those cards. Today’s SLI is much different. While it is my opinion that NVIDIA still has a major hand in making sure SLI-able cards are up to spec, I picked two very different cards to put into my SLI rig. As mentioned above, I’m happy to say that an EVGA 7800 GTX and an MSI 7800 GTX are working well together. Keep in mind that these two cards do not even share the same default GPU and RAM speeds. In spite of the fact that the cards are from different manufacturers, and that they run at different GPU and RAM speeds, rest assured that they -- as NVIDIA promised -- work very well together.
Note: You must still use two graphics cards that have the same GPU model. For example, a GeForce 7800 GT must be matched with another GeForce 7800 GT (not with a GeForce 7800 GTX or a GeForce 6800 Ultra).
http://www.hardocp.com/article/2005/10/27/nvidia_sli_ready_for_mainstream/2
And it works with different shader counts too if the GPU is the same (which is the case here).
EDIT: I'm gonna sleep, but I forgot to say one thing.
The card will still read as 128 SP, but the extra SPs are not going to be in use. -
So the question still remains: who wants to be the guinea pig to test this theory?
-
dondadah88 Notebook Nobel Laureate
MSI 9800 GTX + EVGA 9800 GTX, to see the performance difference
MSI 9800 GTX + MSI 9800 GTX
MSI 9800 GTX + MSI 9800 GT flashed to GTX
and a lot more examples. -
dondadah88 Notebook Nobel Laureate
You don't think so? Will the desktop 260 card with 192 SPs produce the same results as the 260 with 216 stream processors? And will the 280 and the 295 be close, given the 295 is using two GPUs on one card?
On ATI's side, will the 4870 with 160 shaders (800 in NVIDIA SP terms) be the same as the 4870X2 with 320 (1600 SP)?
Here's a review:
http://www.hardwarecanucks.com/foru...e-216-216-sp-superclocked-edition-review.html -
cookinwitdiesel Retired Bencher
The games don't use the shaders..... the drivers do. The driver creates as many simultaneous threads as the GPU can use; it knows this based off the device ID (that tells the driver what it has to work with).
This is pretty much the same as a single vs. dual vs. quad core CPU comparison. The same OS will run on all of them, but it can do more with more cores.
But back on topic, I believe this concept would work with CrossFireX, as mentioned, but NOT SLI. If it does work, then both cards will run at the specs of the lower card, for better parallelism. -
already covered -
cookinwitdiesel Retired Bencher
Just reiterating for anyone who is skipping to the last page, haha.....
-
..... yeah, sadly, people don't even read any more....
In one thread about recovering your lost AlienRespawn, I answered the question two posts in; then, ten posts later, someone asked how to do it.... I just gave up and told him I wouldn't tell him. -
cookinwitdiesel Retired Bencher
now you know why they say it takes patience to be a teacher haha
-
... I'm no teacher, I just answer questions.
-
So, what is the final answer to this question? lol, j/k. But yeah, personally I enjoy reading the whole thread, if it is not too long.
-
cookinwitdiesel Retired Bencher
The FINAL answer is: we don't know, and we won't until someone tries it (which is very unlikely).
But it would make MUCH more sense to just run GTX 260m SLI; this will cost less, and we know it will work. -
I say it will work.
... -
dondadah88 Notebook Nobel Laureate
I would say if you flash them, it would work, but it will be unstable.
-
How is a card unstable when downclocking?
Overclocking, maybe (not really), but no, it would work fine.
SOMEONE send me a card. -
cookinwitdiesel Retired Bencher
You say, but you don't know - not until it has been done, unless you very thoroughly perused the driver to find out.
It still makes much more sense to just get 260m SLI.... it would be the same performance, guaranteed to work, and cheaper. -
I agree with your second part, not the first.
-
cookinwitdiesel Retired Bencher
good enough lol
-
No, because the 280 would drop down to exactly what the 260 is.
-
dondadah88 Notebook Nobel Laureate
So you think it will just use 112 SP instead of 128?
-
SLI = RAID 0
-
cookinwitdiesel Retired Bencher
If you flash the GTX 280m down, the SPs will still be there, but the driver will only be sending threads for 112 SPs (because that is what the device ID will tell it), not the full 128. I don't know how the GPU will decide which SPs to use or not, though.
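That device-ID point can be sketched as a toy lookup. The SP counts match the cards being discussed, but the table and the ceiling-division batching are purely illustrative assumptions, not how NVIDIA's driver is actually structured.

```python
# Toy sketch of the device-ID argument: the driver sizes its work batches
# from the shader count it looks up for the *reported* device ID, so any
# silicon the ID doesn't advertise never gets scheduled. Illustrative only.

SP_BY_DEVICE = {
    "GTX 260M": 112,  # shader processors the driver will schedule for
    "GTX 280M": 128,
}

def batches_needed(reported_device: str, shader_tasks: int) -> int:
    """Batches the driver issues, sized by the reported device's SP count."""
    sp = SP_BY_DEVICE[reported_device]
    return -(-shader_tasks // sp)  # ceiling division

# A 280M flashed to report itself as a 260M gets 260M-sized batches,
# so its extra 16 SPs sit idle:
print(batches_needed("GTX 260M", 1000))  # 9
print(batches_needed("GTX 280M", 1000))  # 8
```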
-
So basically you can put a 280 with a 260, but the 280 will run at the 260's settings and use the same amount of shaders? So there's no point in doing this instead of just getting dual 260m's?
If I understand correctly. And thanks for the quick help. -
To be honest, I can't be bothered to read the whole thread, but I thought this information (if not already posted) might help. Even if it just helps the people who don't understand the complicated stuff you guys talk about:
http://uk.slizone.com/page/slizone_faq.html#c3 -
Will they have/sell the 280m as a single-card configuration on the website?
-
I don't think they plan on offering that, but if they did, it would probably be cheaper to sell the 260 on eBay and buy the 280m for 500-ish.
-
cookinwitdiesel Retired Bencher
Running a single 280m is kind of pointless.... if you need higher performance, just get dual 260m's - the cost is nearly the same, and you will have higher performance.
There is NO market for buying MXM 3.0 cards right now, so you cannot plan on selling what you are not using at this point.... -
What about someone with an M17x with just one 260m who wants a second one? Even I check eBay for a second one...
-
If you can eliminate all the other factors (clock, mem, etc.), yes (unless you find a game which uses 3 passes and more geometry than is humanly possible).
And no, drivers aren't magic and can't magically divide one thread into two.
If you use a single pass and decide to do something with the GS, you will create a thread which is gonna run in a shader, and that's it.
-
cookinwitdiesel Retired Bencher
The driver acts like a compiler, creating the instruction threads from the 3D game data, or at least that has been my understanding..... the driver will make X threads based off how many SPs it thinks there are, based off the device ID?
Keep in mind I have never written or looked at a driver, but I have taken computer design and programming, so that is how I arrived at what I just said. -
The driver creates the threads based on which instructions you used in the game, but what you run in shaders is generally simple tasks which need to run in a sequence, which makes asymmetric processing impossible.
If I want to create 3 geometries using the GS across 3 passes, I'm gonna use 9 shader processors, and nothing can make it use more.
-
cookinwitdiesel Retired Bencher
Well, I didn't do badly for completely talking out of my a$$, lol.
Couple of Questions, Can you have a 260m and a 280m?
Discussion in 'Alienware' started by Alekx, Aug 4, 2009.