I currently own a 2009 Sager NP9280 and was wondering just how beneficial an upgrade from a 280M to a 675M would be. I'd have to purchase a whole new laptop, since the 6xx series isn't compatible with my current one.
-
-
I know about the specs; I'd like someone's opinion on how the performance actually pans out.
-
Should be a nice, sizable performance increase of nearly 100%, if not more, depending on the game.
The 280M is quite old at this point, and the 580M is much more powerful. Go for it. Huge increase in performance. -
conscriptvirus Notebook Evangelist
It's hard to find gaming benchmarks for the 675M since it's a pretty new card, but it would be a quite significant improvement in my opinion. Based on 3DMark scores, I would guess around a 30-40% increase in general games, and probably an even bigger increase in higher-end games.
First, you get DX11, which could provide optimizations over DX10.
Also, you would get a newer/better CPU,
so I'm guessing there would probably be a large performance increase.
If you want to look more into FPS differences in games, try comparing the 580M (NVIDIA GeForce GTX 580M - Notebookcheck.net Tech) with the 280M and check out their gaming FPS (at the bottom of the page). -
So if I were to then choose between the 580M and the 675M, wouldn't the 580M perform better? What's the percentage difference there?
-
-
So there wouldn't really be a point in ordering the 675M when I could just order the 580M, save money, and still get the better performance?
-
Unless you absolutely want the newer tech, the 580m is the way to go. Now the GTX 680m on the other hand will be a good bit better than the 580m but it probably won't be released until summer at the earliest.
-
So PowerNoteBooks.com is out of the 580's under the Sager brand. Are there any other reputable dealers that still have the 580 in stock?
-
FORCE 1761 / MSI 1761 - XOTIC PC
15/17" respectively. -
Wait for the GTX 660M if you want new technology, based on Kepler at 28nm. No point in getting 2010-era hardware in 2012. A GTX 660M nearly holds its own against a GTX 675M while taking 40-45W TDP vs. the 100W of the GTX 675M.
NVIDIA Announces the GeForce 600M series Enabling High-end gaming on Smaller Laptops
A GTX 660M will be in thin Ivy Bridge laptops and will be 28nm. Kepler is roughly 2x better performance per watt, which shows in most benchmarks.
Put it this way: matching the performance of a 100W GTX 580M should now take around 50W, in a much smaller form factor and with far better idle consumption.
It's like a Pentium 4 vs. a Core 2 Duo in terms of performance per watt. -
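The performance-per-watt claim above can be sanity-checked with back-of-the-envelope arithmetic. A minimal Python sketch, using the rough TDP figures quoted in this thread; the 0.90 relative-performance number for the 660M is an assumption for illustration, not a measured result:

```python
# Back-of-the-envelope performance-per-watt comparison.
# TDPs are the rough figures quoted in this thread; the relative
# performance numbers are assumptions for illustration only.
cards = {
    "GTX 580M": {"relative_perf": 1.00, "tdp_w": 100},
    "GTX 660M": {"relative_perf": 0.90, "tdp_w": 45},  # assumed ~90% of a 580M
}

for name, card in cards.items():
    ppw = card["relative_perf"] / card["tdp_w"]
    print(f"{name}: {ppw:.4f} relative performance per watt")
```

With these assumed numbers the 660M lands at exactly 2x the performance per watt of the 580M, which is the kind of Kepler-vs-Fermi ratio being claimed here.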
-
-
Meaker@Sager Company Representative
A bit, though the higher memory clocks help alleviate that.
I am very interested in what the 660M will clock to. -
Have you not seen the benchmarks at the link? There is not much difference: about 10 fps between a GTX 670M and a GTX 660M. That shows the GTX 670M is basically only 10% faster than the 570M, so the GTX 660M should be nearly as powerful as a 570M. And the 570M takes over 2x as much power to achieve that performance. It's a no-brainer to me: a GTX 660M that takes 40-45W vs. a card that takes 95W-ish just to perform slightly better, 10 fps at most.
NVIDIA GeForce GTX 670M - Notebookcheck.net Tech -
Isn't a 280M now about the same as, or even slightly worse than, a GT 555M? Of course this is to be expected, but I just realized how much the notebook size-to-performance ratio has improved.
-
You are basically saying that just because it is 128-bit, its power consumption is reduced. It's better due to the 28nm process and the Kepler architecture. Have you not seen the benchmarks on games at 1080p? It more than does enough to show it's capable. Put it this way: you could overclock it to perform better than the GTX 670M in games, and it should take half the power. Kepler is amazing.
I can't wait until we see exact benchmarks for the GDDR5 version, if those numbers are for the GDDR3 version. Who knows: those could be GDDR3 results, and the GDDR5 version could outperform the GTX 670M while taking less power. -
I don't see your argument. Where are you getting 40-45W TDP? The 670M is max 75W TDP, and the 660M isn't even out, so there is no official TDP; the 675M and 580M are 100W. Yes, the 660M is based on new technology so it should max out around 45W, but that doesn't mean anything yet.
Have fun waiting for the gtx 660m to come out and paying a premium for it. -
Articles : GeForce 600M Notebooks: Powerful and Efficient - GeForce
Look at this. Have fun with a laptop that acts like a heater and weighs like a brick, just to play the same game at the same FPS. With Ivy Bridge promising 50% less power consumption at the same performance as Sandy Bridge, and Kepler offering 2x performance per watt over Fermi in a slim form factor, I am sure I know which is better for a laptop.
Lenovo IdeaPad Y480 and Y580 Specs Announced
This shows that with a GTX 660M the computer starts at $899, which is roughly £563 in UK terms, although the UK price will probably be around £700 for one of these machines, as they normally cost more in the UK.
This image tells you everything you need to know about why everyone is waiting on Kepler graphics. Imagine how thin you could get it with Ivy Bridge as well. Not only does it weigh half as much, its battery life is nearly 3 times better.
http://www.geforce.com/Active/en_US...-efficient-and-powerful/600M-2xEfficiency.png
This shows how thin a GTX 660M computer will be, so a 40-45W card seems about right, and since 28nm GPUs and 22nm CPUs should allow for slimmer cases, this design makes sense.
Gigabyte’s New P2542G Gaming Notebook Packs Quad-Core Power, GeForce GTX660M GPU -
Nvidia’s 28nm mobile lineup leaked | SemiAccurate
AMD’s 28nm mobile lineup leaked too | SemiAccurate
The N13GE is listed as a 40-45W card and is known as the GTX 660M. -
Meaker@Sager Company Representative
That's an 8-month-old Charlie article :/
-
On that graph, 675m is roughly around 50% faster on average than the 660m.
The 570M takes 66% more power (assuming TDP to be power consumption, as in your example) and gets you around 25% better performance. It is a faster card by all means, but sure, you would need a stronger cooling solution.
You have no idea the magnitude of difference 10 fps makes when you are barely maintaining 30 fps in a game. 30 fps vs. 40 fps is quite a lot. If the difference were more like 60 fps vs. 70 fps, your point would be understandable.
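The point that a flat 10 fps matters much more at low frame rates is just relative-change arithmetic; a minimal illustration using the fps figures from the post above:

```python
def relative_gain(base_fps: float, new_fps: float) -> float:
    """Percentage improvement of new_fps over base_fps."""
    return (new_fps - base_fps) / base_fps * 100

# The same +10 fps is a much bigger relative jump at low frame rates.
print(f"30 -> 40 fps: {relative_gain(30, 40):.1f}% faster")
print(f"60 -> 70 fps: {relative_gain(60, 70):.1f}% faster")
```

Going from 30 to 40 fps is a ~33% improvement, while 60 to 70 fps is only ~17%, so the same absolute gap is twice as significant near the playability threshold.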
But the 660m is not that powerful at all... at least compared to what's available. It is, however, much more power efficient.
And yeah the 580m is the way to go performancewise right now. -
Well they're not. So there goes that picture.
And I don't use my laptop for gaming, so I couldn't care less. A lot of members come here looking for gaming notebooks; they don't use them on battery and don't expect to. They want the best performance possible, and the GTX 660M is not going to offer that, plain and simple.
-
Wow, those are some ridiculous sources quoted. Notebookcheck included.
To OP: Would you upgrade your GTX 280M to GTX 580M?
You can compare numbers between 280M and 580M, they are publicly available.
If yes, get the GTX 675M... or the GTX 580M.
If no, then wait further for GTX 680M.
Reason being GTX 675M is a rebranded GTX 580M. -
I owned a Sager 8690 with a 280M. I promptly sold it to get my 8150 with a 485M, which is at least 150% the speed of the 280M.
The 580M and 675M are still faster than my card, so coming from a 280M, you would definitely feel the difference. Plus you are also getting DX11 and Optimus to boot.
Would I recommend you buy the 675M? Maybe not. The foremost reason is that the 680M, which is NVIDIA's true high-end Kepler successor, should be out by June. It won't be cheap, but if you're looking for the best, that's going to be it.
The second reason is that the 280M is still a fairly capable DX9 card, especially if you're willing to go down to 900p for more demanding games. In my experience, my Sager downscaled wonderfully to 900p for games such as Metro 2033, and there also aren't many games out there that make use of DX11.
So to answer your question: yes, there is a significant difference between the 280M and the 675M. But if you were going to upgrade anyway, have the cash, and don't mind waiting a few months, just wait for the real high-end Kepler. -
s2odin
You do realise the GTX 660M will be far better for gaming by the summer, and will be portable, unlike the big gaming laptops you have now.
A GT 640M with a better CPU would probably come close to matching the GTX 460M; per Notebookcheck, there's only a 0-15 fps difference between them in most games.
The 660M can be overclocked to make up the difference. At the end of the day, we don't know yet whether the GTX 660M in those benchmarks is using GDDR5 memory. This technology is an incredible upgrade, and the first real one since AMD got the 5000-series graphics cards out for laptops and Intel shipped 32nm dual cores.
The fact of the matter is that the 2x performance-per-watt figure is, I believe, based on DX11 games, and around 70% better performance per watt overall in most games is good enough. This makes gaming on laptops practical, with low idle power even with discrete graphics. -
Meaker@Sager Company Representative
The GTX660M must be paired with GDDR5.
-
I think I will try to get a 580M, as I know I won't be able to afford the premium pricing of the 680M when it drops in summer 2012. It still seems like a very significant upgrade. Who knows, if I can find a supplier that will sell them in SLI, I might get away with that as well.
Thank you all for your valuable input and feedback. I didn't expect this thread to go this far (2+ pages), so thanks again!
-
You might want to wait for the 7970M. AMD has always been cheaper than Nvidia, so perhaps the 7970M won't be that much more expensive than the 580M while offering a great performance boost over it.
Alienware is showcasing the 7970M next month, I think, with the GPU ready for sale in May. -
TheBluePill Notebook Nobel Laureate
I would still buy a 6990M and save the green.
-
On another topic, I would not underestimate AMD's GCN; it's quite a nice design. On a clock-for-clock basis, the HD 7970 and GTX 680 are almost identical. GPU designs are getting more similar as compute performance is a newish business opportunity for both companies. It's going to be drivers that determine first and second place...
There should be a large boost in performance this cycle, and almost certainly AMD's solution will be priced more attractively. -
The GTX 680 easily has better performance per watt, which is the main thing. It doesn't matter what the clock rate is; as long as it runs cool enough and draws fewer watts, I am happy. AMD GCN's performance per watt is a fair bit behind Nvidia's Kepler.
-
But yeah, in the end the 680 is better in performance per watt and overall performance. Running cool is entirely down to the cooling solution, though; both cards run at very similar temperatures because they both have decent coolers. -
Meaker@Sager Company Representative
As you slow a chip down, it will lose performance much more quickly than a chip targeted at that lower frequency in the first place.
The 680 has a LONG way to get down to 100W. -
GTX 680, 670 Ti, and 670 should all be GK104. The GTX 660 is rumored to be GK106.
-
-
-
Meaker@Sager Company Representative
-
True, so maybe they won't disable cores.
But my thought is that they initially planned to make the GTX 670 Ti with the full 1536 cores but at more normal clocks, say around 700 or 800 MHz, which would mean a lower TDP and most likely a voltage decrease. When things fell through with the GK100/GK110, they decided to put all of their effort into getting really high-quality GK104 yields so they could clock it very high and market it as the GTX 680. So the higher-quality GK104 chips that would have become the GTX 670 Ti instead became the higher-clocked GTX 680. This is probably also why we have not seen any GTX 670 Tis released yet: they didn't plan to use GK104 as the GTX 680, so there are now not enough chips left to launch the GTX 670 Ti.
Just my guess as to what happened, and a reason why I think the GTX 680M will be based on the GK104.
http://webcache.googleusercontent.c...as-GTX-670-Ti-.html+&cd=6&hl=en&ct=clnk&gl=us -
Meaker@Sager Company Representative
We know it's going to be based on the 104, it certainly is not the 107. It's going to have to shed a fair few Mhz and millivolts. That's all I am saying.
-
GK107 already maxes out at 384 cores, IIRC.
NVIDIA GTX280M vs. NVIDIA GTX675M
Discussion in 'Gaming (Software and Graphics Cards)' started by spartan948265, Mar 27, 2012.