The Notebook Review forums were hosted by TechTarget, which shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information posted on the forums is preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.
Problems? See this thread at archive.org.

    NVIDIA GTX280M vs. NVIDIA GTX675M

    Discussion in 'Gaming (Software and Graphics Cards)' started by spartan948265, Mar 27, 2012.

  1. spartan948265

    spartan948265 Notebook Enthusiast

    Reputations:
    0
    Messages:
    14
    Likes Received:
    0
    Trophy Points:
    5
    I currently own a 2009 Sager NP9280 and was wondering just how beneficial an upgrade from a 280M to a 675M would be. I'd have to purchase a whole new laptop, since the 6xx series isn't compatible with my current one.
     
  2. omnivor

    omnivor Notebook Consultant

    Reputations:
    0
    Messages:
    116
    Likes Received:
    0
    Trophy Points:
    30
  3. spartan948265

    spartan948265 Notebook Enthusiast

    Reputations:
    0
    Messages:
    14
    Likes Received:
    0
    Trophy Points:
    5
    I know about the specs. I'd like someone's opinion on how the performance actually pans out.
     
  4. ryzeki

    ryzeki Super Moderator Super Moderator

    Reputations:
    6,552
    Messages:
    6,410
    Likes Received:
    4,087
    Trophy Points:
    431
    Should be a nice, sizable, nearly 100% performance increase, if not more, depending on the game.

    The 280M is quite old at this point, and the 580M is much more powerful. Go for it. Huge increase in performance.
     
  5. conscriptvirus

    conscriptvirus Notebook Evangelist

    Reputations:
    89
    Messages:
    334
    Likes Received:
    9
    Trophy Points:
    31
    It's hard to find gaming benchmarks for the 675M as it's a pretty new card, but it'd be quite a significant improvement in my opinion. Based on 3DMark scores, I would guess you'd see something like a 30-40% increase in performance in general games, and probably a larger increase in more demanding games.

    First, you get DX11, which could provide optimizations over DX10.
    Also, you would get a newer/better CPU,

    so I'm guessing there would probably be a large performance increase.

    If you want to look more into FPS differences in games, try comparing the 580M (NVIDIA GeForce GTX 580M - Notebookcheck.net Tech) with the 280M and check out their gaming fps (at the bottom of the page).
     
  6. spartan948265

    spartan948265 Notebook Enthusiast

    Reputations:
    0
    Messages:
    14
    Likes Received:
    0
    Trophy Points:
    5
    So if I were to choose between the 580M and the 675M, wouldn't the 580M perform better? What's the percentage difference there?
     
  7. s2odin

    s2odin Merrica!

    Reputations:
    1,085
    Messages:
    859
    Likes Received:
    2
    Trophy Points:
    31
    0-1% at most. The 675m is just a rebranded 580m.
     
  8. spartan948265

    spartan948265 Notebook Enthusiast

    Reputations:
    0
    Messages:
    14
    Likes Received:
    0
    Trophy Points:
    5
    So there wouldn't really be a point to order the 675m when I could just order the 580m and save money as well as still get the better performance?
     
  9. s2odin

    s2odin Merrica!

    Reputations:
    1,085
    Messages:
    859
    Likes Received:
    2
    Trophy Points:
    31
    Unless you absolutely want the newer tech, the 580m is the way to go. Now the GTX 680m on the other hand will be a good bit better than the 580m but it probably won't be released until summer at the earliest.
     
  10. spartan948265

    spartan948265 Notebook Enthusiast

    Reputations:
    0
    Messages:
    14
    Likes Received:
    0
    Trophy Points:
    5
    So PowerNoteBooks.com is out of the 580's under the Sager brand. Are there any other reputable dealers that still have the 580 in stock?
     
  11. s2odin

    s2odin Merrica!

    Reputations:
    1,085
    Messages:
    859
    Likes Received:
    2
    Trophy Points:
    31
  12. nissangtr786

    nissangtr786 Notebook Deity

    Reputations:
    85
    Messages:
    865
    Likes Received:
    0
    Trophy Points:
    0
    Wait for the GTX 660M if you want new technology based on Kepler on 28nm. No point in getting 2010-type hardware in 2012. A GTX 660M nearly holds its own against a GTX 675M while taking 40-45W TDP vs the 100W of the GTX 675M.

    NVIDIA Announces the GeForce 600M series Enabling High-end gaming on Smaller Laptops

    A GTX 660M will be in thin Ivy Bridge laptops and will be 28nm. Kepler has 2x better performance per watt, which shows in most benchmarks.

    Put it this way: the performance of a 100W GTX 580M will now take about 50W, in a much smaller form factor with way better idle consumption.

    It's like a Pentium 4 vs a Core 2 Duo in terms of performance per watt.
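    As a quick sanity check of that math, here is a minimal Python sketch. The 2x performance-per-watt figure and the 100W GTX 580M TDP are the numbers quoted in this post, not measured or vendor data:

        # Back-of-the-envelope check of the "2x performance per watt" claim above.
        # The 2x Kepler-vs-Fermi ratio and the 100 W GTX 580M TDP are the figures
        # quoted in this thread; nothing here is benchmark data.

        def watts_for_same_performance(baseline_watts: float, perf_per_watt_gain: float) -> float:
            """Power needed to match a baseline card's performance,
            given a relative performance-per-watt improvement."""
            return baseline_watts / perf_per_watt_gain

        gtx580m_tdp = 100.0   # W, as quoted above
        kepler_gain = 2.0     # "2x performance per watt", as claimed

        print(watts_for_same_performance(gtx580m_tdp, kepler_gain))  # -> 50.0 W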
     
  13. spartan948265

    spartan948265 Notebook Enthusiast

    Reputations:
    0
    Messages:
    14
    Likes Received:
    0
    Trophy Points:
    5
  14. s2odin

    s2odin Merrica!

    Reputations:
    1,085
    Messages:
    859
    Likes Received:
    2
    Trophy Points:
    31
    The 660M is also only 128-bit, which automatically gimps the card. And it is slightly slower than the 570M; since 580M > 570M, it follows that 675M > 660M. The 660M looks to be basically a 560M.


    Yes it will.
     
  15. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,436
    Messages:
    58,194
    Likes Received:
    17,902
    Trophy Points:
    931
    A bit, though the higher memory clocks help alleviate that.

    I am very interested in what the 660M will clock to.
     
  16. nissangtr786

    nissangtr786 Notebook Deity

    Reputations:
    85
    Messages:
    865
    Likes Received:
    0
    Trophy Points:
    0
    Have you not seen the benchmarks in that link? There is not much difference in fps, only about a 10 fps gap between a GTX 670M and a GTX 660M. It shows the GTX 670M is basically 10% faster than the 570M, so the GTX 660M should be nearly as powerful as a 570M. And the 570M takes over 2x as much power to achieve that performance. It's a no-brainer to me to get a GTX 660M that takes 40-45W vs a card that takes around 95W just to perform slightly better, 10 fps at most.

    NVIDIA GeForce GTX 670M - Notebookcheck.net Tech
     
  17. Gearsguy

    Gearsguy Notebook Deity

    Reputations:
    570
    Messages:
    1,592
    Likes Received:
    0
    Trophy Points:
    55
    Isn't a 280M about the same as, or even slightly worse than, a GT 555M now? Of course this is to be expected, but it makes you realize how much smaller notebooks are getting for the same level of performance.
     
  18. nissangtr786

    nissangtr786 Notebook Deity

    Reputations:
    85
    Messages:
    865
    Likes Received:
    0
    Trophy Points:
    0
    You are basically saying that just because it is 128-bit, it reduces the power consumption. It's better due to the 28nm process and the Kepler architecture. Have you not seen the benchmarks in games at 1080p? It does more than enough to show it's capable. Put it this way: you could overclock it to perform better than the GTX 670M in games, and it should take half the power. Kepler is amazing.

    I can't wait till we see exact benchmarks, like whether those are for the GDDR5 version or the GDDR3 version. Who knows, that could be the GDDR3 version, and the GDDR5 version could outperform the GTX 670M while taking less power.
     
  19. s2odin

    s2odin Merrica!

    Reputations:
    1,085
    Messages:
    859
    Likes Received:
    2
    Trophy Points:
    31
    And the 660M has a 20 fps difference in some games in those benchmarks.

    I don't see your argument. Where are you getting 40-45W TDP? The 670M is 75W TDP max, the 660M isn't even out so there is no actual TDP, and the 675M and 580M are 100W. Yes, the 660M is based on new technology, so it should max out around 45W, but that doesn't mean anything.

    Have fun waiting for the GTX 660M to come out and paying a premium for it.
     
  20. nissangtr786

    nissangtr786 Notebook Deity

    Reputations:
    85
    Messages:
    865
    Likes Received:
    0
    Trophy Points:
    0
    Articles : GeForce 600M Notebooks: Powerful and Efficient - GeForce

    Look at this. Have fun with a laptop that acts like a heater and weighs like a brick to play the same game at the same fps. With Ivy Bridge's 50% lower power consumption at the same performance as Sandy Bridge, and Kepler's 2x performance per watt over Fermi, all in a slim form factor, I am sure I know which is better for a laptop.

    Lenovo IdeaPad Y480 and Y580 Specs Announced

    This shows that a computer with a GTX 660M starts at $899, which is roughly £563, although the UK price for one of these machines will probably be around £700, since things normally cost more in the UK.

    This image tells you everything you need to know about why everyone is waiting on Kepler graphics. Imagine how thin you could get it with Ivy Bridge as well. Not only does it weigh half as much, its battery life is nearly 3 times better.

    http://www.geforce.com/Active/en_US...-efficient-and-powerful/600M-2xEfficiency.png

    This shows how thin a GTX 660M computer will be, so a 40-45W card seems about right, and since 28nm GPUs and 22nm CPUs should allow for slimmer cases, this design is about right.
    Gigabyte’s New P2542G Gaming Notebook Packs Quad-Core Power, GeForce GTX660M GPU
     
  21. nissangtr786

    nissangtr786 Notebook Deity

    Reputations:
    85
    Messages:
    865
    Likes Received:
    0
    Trophy Points:
    0
  22. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,436
    Messages:
    58,194
    Likes Received:
    17,902
    Trophy Points:
    931
    That's an 8-month-old Charlie article :/
     
  23. ryzeki

    ryzeki Super Moderator Super Moderator

    Reputations:
    6,552
    Messages:
    6,410
    Likes Received:
    4,087
    Trophy Points:
    431
    Dudeeeeeeeeeeee.............. You don't seem to understand how percentages work. In general there is a 20 fps or more difference between the 660M and the 675M.

    On that graph, the 675M is roughly 50% faster on average than the 660M.

    The 570M takes 66% more power (assuming TDP equals power consumption, as in your example) and gets you around 25% better performance. It is a faster card by all means, though sure, you would need a stronger cooling solution.

    You have no idea the magnitude of difference 10 fps makes when you are barely maintaining 30 fps in a game. 30 fps vs 40 fps is quite a lot. If the performance difference were more like 60 fps vs 70 fps, it would be understandable.

    But the 660M is not that powerful at all... at least compared to what's available. It is, however, much more power efficient.

    And yeah, the 580M is the way to go performance-wise right now.
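    For anyone following the percentage argument, here is a small Python sketch of the arithmetic. The fps and wattage figures are just the rough numbers quoted in this thread, used purely for illustration, not benchmark results:

        # Shows why the same 10 fps gap matters more at low frame rates,
        # and what "more performance for more power" looks like in percent.
        # All numbers are the rough figures quoted in this thread.

        def pct_faster(new: float, old: float) -> float:
            """Relative improvement of `new` over `old`, in percent."""
            return (new / old - 1.0) * 100.0

        # Same absolute 10 fps gap, very different relative gain:
        print(pct_faster(40, 30))   # ~33.3% faster (30 fps -> 40 fps)
        print(pct_faster(70, 60))   # ~16.7% faster (60 fps -> 70 fps)

        # 570M vs 660M as discussed: roughly 25% more performance for
        # roughly 66% more power (75 W vs 45 W, using TDP as a stand-in).
        print(pct_faster(75, 45))   # ~66.7% more power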
     
  24. s2odin

    s2odin Merrica!

    Reputations:
    1,085
    Messages:
    859
    Likes Received:
    2
    Trophy Points:
    31
    That image shows a 285 being equal to a 460.
    Well they're not. So there goes that picture.

    And I don't use my laptop for gaming, so I couldn't care less. A lot of members come here looking for gaming notebooks; they don't use them on battery and don't expect to. They want the best performance possible, and the GTX 660M is not going to offer that, plain and simple.
    Aug 24th 2011 :rolleyes:
    Lol +1
     
  25. yknyong1

    yknyong1 Radiance with Radeon

    Reputations:
    1,191
    Messages:
    2,095
    Likes Received:
    8
    Trophy Points:
    56
    Wow, those are some ridiculous sources being quoted, Notebookcheck included.

    To the OP: would you upgrade your GTX 280M to a GTX 580M?
    You can compare numbers between the 280M and the 580M; they are publicly available.


    If yes, get the GTX 675M... or the GTX 580M.
    If no, then wait for the GTX 680M.

    The reason being that the GTX 675M is a rebranded GTX 580M.
     
  26. trvelbug

    trvelbug Notebook Prophet

    Reputations:
    929
    Messages:
    4,007
    Likes Received:
    40
    Trophy Points:
    116
    I owned a Sager 8690 with a 280M. I promptly sold it to get my 8150 with a 485M, which is at least 150% the speed of the 280M.
    The 580M and 675M are still faster than my card, so coming from a 280M, you would definitely feel the difference. Plus you are also getting DX11 and Optimus to boot.

    Would I recommend you buy the 675M? Maybe not. The foremost reason is that the 680M, which is the true Kepler high-end GPU successor for Nvidia, should be out by June. It won't be cheap, but if you're looking for the best, that's going to be it.
    The second reason is that the 280M is still a fairly capable DX9 card, especially if you're willing to go down to 900p for more demanding games. In my experience my Sager downscaled wonderfully to 900p for games such as Metro 2033. Also, there aren't many games out there that make use of DX11 yet.

    So to answer your question: yes, there is a significant difference between the 280M and the 675M. But if you were going to upgrade, have the cash, and don't mind waiting a few months, just wait for the real high-end Kepler.
     
  27. nissangtr786

    nissangtr786 Notebook Deity

    Reputations:
    85
    Messages:
    865
    Likes Received:
    0
    Trophy Points:
    0
    s2odin

    You do realise the GTX 660M will be far better for gaming when it arrives in the summer, and will be in portable machines instead of the big gaming laptops you have now.

    A GT 640M with a better CPU would probably come close to matching the GTX 460M; on Notebookcheck there's something like a 0-15 fps difference between them in most games.

    The 660M can be overclocked to make up the difference. At the end of the day, we don't know yet if the GTX 660M in those benchmarks is using GDDR5 memory. This technology is an incredible upgrade, and the first real one since AMD got the 5000 series graphics cards out for laptops and Intel released its 32nm dual cores.

    Fact of the matter is, the 2x performance per watt I believe is based on DX11 games, and around 70% better performance per watt overall in most games is good enough. It makes gaming on laptops practical while keeping idle power low even with discrete graphics.
     
  28. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,436
    Messages:
    58,194
    Likes Received:
    17,902
    Trophy Points:
    931
    The GTX660M must be paired with GDDR5.
     
  29. spartan948265

    spartan948265 Notebook Enthusiast

    Reputations:
    0
    Messages:
    14
    Likes Received:
    0
    Trophy Points:
    5
  30. Cloudfire

    Cloudfire (Really odd person)

    Reputations:
    7,279
    Messages:
    10,304
    Likes Received:
    2,878
    Trophy Points:
    581
    You might want to wait for the 7970M. AMD have always been cheaper than Nvidia, so perhaps the 7970M won't be that much more expensive than the 580M while offering a great performance boost over it.

    Alienware is showcasing the 7970M next month, I think, with the GPU ready for sale in May.
     
  31. TheBluePill

    TheBluePill Notebook Nobel Laureate

    Reputations:
    636
    Messages:
    889
    Likes Received:
    0
    Trophy Points:
    30
    I would still buy a 6990M and save the green. :)
     
  32. Drummerboy2212

    Drummerboy2212 Notebook Enthusiast

    Reputations:
    0
    Messages:
    34
    Likes Received:
    0
    Trophy Points:
    15
    Not knowing what the performance or price will be, that's hard to say as of now...

    On another topic, I would not underestimate AMD's GCN; it's quite a nice design. On a clock-for-clock basis the HD 7970 and GTX 680 are almost identical. GPU designs are getting more similar as compute performance becomes a newish business opportunity for both companies. It's going to be drivers that determine winner and second place...

    There should be a large boost in performance this cycle, and AMD's solution will almost certainly be priced more attractively.
     
  33. nissangtr786

    nissangtr786 Notebook Deity

    Reputations:
    85
    Messages:
    865
    Likes Received:
    0
    Trophy Points:
    0
    The GTX 680 easily has better performance per watt, which is the main thing. It doesn't matter what the clock rate is; as long as it runs cool enough and draws fewer watts, I am happy. AMD GCN's performance per watt is a fair bit behind Nvidia Kepler's.
     
  34. ryzeki

    ryzeki Super Moderator Super Moderator

    Reputations:
    6,552
    Messages:
    6,410
    Likes Received:
    4,087
    Trophy Points:
    431
    Just a tiny bit, actually. Especially at stock, where the difference in power consumption is between 5 and 20W on average.

    But yeah, in the end the 680 is better in performance/watt and overall performance. Running cool is entirely up to the cooling method though. Both cards run at very similar temperatures because they both have decent cooling solutions.
     
  35. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,436
    Messages:
    58,194
    Likes Received:
    17,902
    Trophy Points:
    931
    As you slow a chip down, it loses performance much more quickly than a chip targeted at that frequency in the first place.

    The 680 has a LONG way to go to get down to 100W.
     
  36. Cloudfire

    Cloudfire (Really odd person)

    Reputations:
    7,279
    Messages:
    10,304
    Likes Received:
    2,878
    Trophy Points:
    581
    The GTX 680, 670 Ti and 670 should all be GK104. The GTX 660 is rumored to be GK106.
     
  37. aduy

    aduy Keeping it cool since 93'

    Reputations:
    317
    Messages:
    1,474
    Likes Received:
    0
    Trophy Points:
    55
    Well, not really. I bet the GK104 was originally meant to run at lower clocks, because it was designed to be the GTX 670 Ti. Basically the 680M will be a downclocked and undervolted GTX 670 Ti, not a GTX 680. Therefore it will have fewer cores and probably lower clocks, which will make it easier to lower the overall TDP.
     
  38. harmattan

    harmattan Notebook Evangelist

    Reputations:
    432
    Messages:
    642
    Likes Received:
    55
    Trophy Points:
    41
    Yep, it's like the 8800M GTX rebrand to the 9800M GT all over again, although this will not stop Dell from charging $100 more for the newer card during the interval when both are listed :/
     
  39. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,436
    Messages:
    58,194
    Likes Received:
    17,902
    Trophy Points:
    931
    As you disable cores your efficiency goes down.
     
  40. aduy

    aduy Keeping it cool since 93'

    Reputations:
    317
    Messages:
    1,474
    Likes Received:
    0
    Trophy Points:
    55
    True, so maybe they won't disable cores.

    But my thought is that they initially were going to make the GTX 670 Ti with the full 1536 cores, but at more normal clocks, say around 700 or 800, which would have a lower TDP and most likely a decrease in voltage. But when things fell through with the GK100 or GK110, they decided to put all of their effort into getting the GK104 to have really high-quality yields so that they could clock it very high and market it as the GTX 680. So the higher-quality GK104 chips that would have become the GTX 670 Ti have become the higher-clocked GTX 680. This is probably also the reason we have not seen any GTX 670 Tis released as of yet: they didn't plan to use the GK104 chips as the GTX 680, so there are now not enough of them to launch the GTX 670 Ti.

    Just my guess as to what happened, and a reason why I think the GTX 680M will be based on the GK104.

    http://webcache.googleusercontent.c...as-GTX-670-Ti-.html+&cd=6&hl=en&ct=clnk&gl=us
     
  41. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,436
    Messages:
    58,194
    Likes Received:
    17,902
    Trophy Points:
    931
    We know it's going to be based on the GK104; it certainly is not the GK107. It's going to have to shed a fair few MHz and millivolts. That's all I am saying.
     
  42. yknyong1

    yknyong1 Radiance with Radeon

    Reputations:
    1,191
    Messages:
    2,095
    Likes Received:
    8
    Trophy Points:
    56
    GK107 already maxes out at 384 cores, IIRC.