The Notebook Review forums were hosted by TechTarget, which shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information that had been posted on the forums was preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.

    HURRAY: Nvidia 600 series not just Fermi!! (Kepler)

    Discussion in 'Gaming (Software and Graphics Cards)' started by Cloudfire, Mar 2, 2012.

  1. XXVII

    XXVII Notebook Geek

    Reputations:
    52
    Messages:
    99
    Likes Received:
    0
    Trophy Points:
    15
    You just summarized my entire thought process, verbatim, before I pulled the trigger and opted for the 570m and the machine in my signature. I also really needed a new notebook, so that made the decision even easier. Glad I went with a 570m.
     
  2. OCNbluedevil

    OCNbluedevil Notebook Consultant

    Reputations:
    0
    Messages:
    100
    Likes Received:
    0
    Trophy Points:
    30
    Did anyone else notice that the new GTX 660M will have "up to 384 CUDA cores", while the 570M and 670M are still at 336, and the 580M and 675M are still at 384? I suspect the higher-end models will show up in Sagers and MSIs, and the gimped GTX 660M (sub-384 CUDA cores) will go to Asus.

    GTX 660m GDDR5 128-bit
    http://www.geforce.com/hardware/notebook-gpus/geforce-gtx-660m/specifications

    GTX 670m GDDR5 192-bit
    http://www.geforce.com/hardware/notebook-gpus/geforce-gtx-670m/specifications

    GTX 675m GDDR5 256-bit
    http://www.geforce.com/hardware/notebook-gpus/geforce-gtx-675m/specifications

    The only thing that has changed is the memory bus/type from model to model in the GTX lineup.
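
    For a rough sense of what those bus widths mean on their own, here is a quick back-of-the-envelope script (the 4 GT/s GDDR5 data rate is just an assumed round number, not pulled from the spec pages):

    # Peak GDDR5 bandwidth estimate: bus width is the only spec changing between
    # these models, so peak bandwidth scales directly with it.
    # data_rate_gt_s = 4.0 is an assumed round number for illustration.

    def bandwidth_gb_s(bus_bits, data_rate_gt_s=4.0):
        """Peak memory bandwidth in GB/s = bus width in bytes * data rate."""
        return (bus_bits / 8) * data_rate_gt_s

    for name, bus in [("GTX 660M", 128), ("GTX 670M", 192), ("GTX 675M", 256)]:
        print(f"{name}: {bus}-bit -> {bandwidth_gb_s(bus):.0f} GB/s")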
     
  3. awakeN

    awakeN Notebook Deity

    Reputations:
    616
    Messages:
    1,067
    Likes Received:
    4
    Trophy Points:
    56
    Poor Asus :( Looks like the new G55&G75 series is going to be a horrible fail.


    Haha, but I probably can't afford a 680M, and if the 660M doesn't come out with a 192-bit bus (it better), then a 570M I shall buy.

    Time to head to the MSI forum to figure out more :p
     
  4. Torment78

    Torment78 Notebook Evangelist

    Reputations:
    261
    Messages:
    555
    Likes Received:
    0
    Trophy Points:
    30
    You are right about the performance, but look at the link I posted earlier: the 680 at full OC was drawing just under 400 watts while the 7970 was just below 500.
     
  5. maxheap

    maxheap caparison horus :)

    Reputations:
    1,244
    Messages:
    3,294
    Likes Received:
    191
    Trophy Points:
    131
    I see, you are right; I think we are thinking pretty much the same thing. I thought you were saying it is not a good card. It is really good, but it actually shines in performance/power :) I was not expecting it to crush the 7970, but it is clearly the winner (mostly in benches though; as you are saying, in games they trade blows). We'll see. What I am hoping for right now is a really nice GK104-based GTX 680M (maybe wishful thinking :))
     
  6. aduy

    aduy Keeping it cool since 93'

    Reputations:
    317
    Messages:
    1,474
    Likes Received:
    0
    Trophy Points:
    55
    Basically, the new Kepler CUDA cores are not as fast as the old Fermi cores, but since the Kepler cores are significantly more efficient and smaller, they can fit many more of them on a chip. My guess is that the 680M will be based on the GK104 and will have somewhere between 768 and 1536 CUDA cores.

    Then, after Nvidia gets their GK100 all straightened out, we will see a GTX 685 as their new flagship.
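
    To put the "Kepler cores are not as fast as Fermi cores" point in rough numbers: Fermi ran its shaders at a hot clock of roughly twice the core clock, while Kepler runs them at the core clock, so Kepler needs about twice the cores for the same peak throughput. A quick sketch, with assumed round-number core counts and clocks (not leaked specs):

    # Peak single-precision throughput ~= CUDA cores * 2 ops (FMA) * shader clock.
    # Fermi shaders run at a hot clock (~2x core clock); Kepler shaders run at
    # the core clock, so it takes roughly twice the cores to match the same peak.
    # All figures below are assumed round numbers for illustration only.

    def peak_gflops(cuda_cores, shader_clock_mhz):
        return cuda_cores * 2 * shader_clock_mhz / 1000.0

    fermi_like  = peak_gflops(384, 1200)   # hot-clocked shaders
    kepler_like = peak_gflops(768, 600)    # shaders at core clock

    print(f"Fermi-like : {fermi_like:.0f} GFLOPS")   # ~922
    print(f"Kepler-like: {kepler_like:.0f} GFLOPS")  # ~922: twice the cores at half the clock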
     
  7. Molite

    Molite Notebook Enthusiast

    Reputations:
    0
    Messages:
    43
    Likes Received:
    0
    Trophy Points:
    15
    The GTX 660M is better price/performance, right? I mean, I only play games like Diablo, WoW, and SC2, maybe GW2 if it's good. So it would fit me right with the new Ivy Bridge stuff.

    I'm hoping to get a Clevo/Sager NP8131, or any 15" model. Which would you recommend I pre-order?
     
  8. TheBluePill

    TheBluePill Notebook Nobel Laureate

    Reputations:
    636
    Messages:
    889
    Likes Received:
    0
    Trophy Points:
    30
    Can't go wrong with a Clevo... or an Alienware. But as for pre-orders, I would just wait until everything is available and then decide.


    Tom's review puts it best:

     
  9. long2905

    long2905 Notebook Virtuoso

    Reputations:
    2,443
    Messages:
    2,314
    Likes Received:
    114
    Trophy Points:
    81
  10. Cloudfire

    Cloudfire (Really odd person)

    Reputations:
    7,279
    Messages:
    10,304
    Likes Received:
    2,878
    Trophy Points:
    581
    Looks very impressive in my humble opinion but very little 60FPS gaming with those settings (no wonder) :)
     
  11. ryzeki

    ryzeki Super Moderator Super Moderator

    Reputations:
    6,552
    Messages:
    6,410
    Likes Received:
    4,087
    Trophy Points:
    431
    The thing is, in that particular test you are comparing an over-30% overclock of the Radeon with an overvolt applied, against roughly 20% on the GTX 680.

    That's why the power consumption of the Radeon skyrocketed. On average the difference between them at stock is around 8 watts to 25 watts... which is also in the article you posted.

    Hell, even in SLI/Crossfire the difference was 53 watts, with the Radeons having lower power consumption at idle.
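
    Rough rule of thumb for why the overvolted overclock makes the power draw skyrocket: dynamic power scales roughly with frequency times voltage squared. A quick sketch, with an assumed 250 W baseline and +10% voltage bump (not numbers from the article):

    # Dynamic power ~ f * V^2, so a 30% OC with an overvolt costs far more power
    # than a 20% OC at stock voltage. The 250 W baseline and +10% voltage bump
    # are assumed figures for illustration only.

    def scaled_power(base_w, clock_gain, volt_gain=0.0):
        return base_w * (1 + clock_gain) * (1 + volt_gain) ** 2

    print(f"+20% clock, stock volts: {scaled_power(250, 0.20):.0f} W")        # ~300 W
    print(f"+30% clock, +10% volts : {scaled_power(250, 0.30, 0.10):.0f} W")  # ~393 W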
     
  12. Cloudfire

    Cloudfire (Really odd person)

    Reputations:
    7,279
    Messages:
    10,304
    Likes Received:
    2,878
    Trophy Points:
    581
    Monsternotebooks with the P150EM:
    GTX 680M "July-September 2012 Delivery"
    Monster

    :(
     
  13. awakeN

    awakeN Notebook Deity

    Reputations:
    616
    Messages:
    1,067
    Likes Received:
    4
    Trophy Points:
    56
    This could just be because it's in Turkey. In the US they should ship out around April 29th/30th.
     
  14. Cloudfire

    Cloudfire (Really odd person)

    Reputations:
    7,279
    Messages:
    10,304
    Likes Received:
    2,878
    Trophy Points:
    581
    Really? Where have you heard that?
     
  15. awakeN

    awakeN Notebook Deity

    Reputations:
    616
    Messages:
    1,067
    Likes Received:
    4
    Trophy Points:
    56
  16. omnivor

    omnivor Notebook Consultant

    Reputations:
    0
    Messages:
    116
    Likes Received:
    0
    Trophy Points:
    30
    That link has the 675M as a $399 upgrade from the 670M; that seems really high, even after conversion.

    EDIT: Actually, that just sounds like the upgrade price we've had for going from the 570M to the 580M.
     
  17. Kevin

    Kevin Egregious

    Reputations:
    3,289
    Messages:
    10,780
    Likes Received:
    1,782
    Trophy Points:
    581
  18. kevmanw4301

    kevmanw4301 Notebook Deity

    Reputations:
    146
    Messages:
    1,476
    Likes Received:
    1
    Trophy Points:
    56
    They are actually rather close in performance. Even though the 460M has more shaders, the 285M's shaders are stronger, and the bandwidth is close.
     
  19. Torment78

    Torment78 Notebook Evangelist

    Reputations:
    261
    Messages:
    555
    Likes Received:
    0
    Trophy Points:
    30
    I was not saying that the 7970 is not a great card.
    And if you read through the overclocking part, you will see that the 680 is at its top stable OC, as is the 7970.
    And to be honest, the 680 is a bit of a disappointment in my eyes. Hope the GK100/110 comes soon; I think that's the one that will do the job properly :)
     
  20. Bullit

    Bullit Notebook Deity

    Reputations:
    122
    Messages:
    864
    Likes Received:
    9
    Trophy Points:
    31
  21. Deks

    Deks Notebook Prophet

    Reputations:
    1,272
    Messages:
    5,201
    Likes Received:
    2,073
    Trophy Points:
    331
    Lol...
    If this is the case, it's a major fail.
    But it's also possible that Nvidia is actively working on separating the gaming and 'professional' cards intentionally (even though the hardware between the two is exactly the same) so people would be forced to get either one or the other (A LOT more expensive for 'pro cards') depending on their needs.
     
  22. maxheap

    maxheap caparison horus :)

    Reputations:
    1,244
    Messages:
    3,294
    Likes Received:
    191
    Trophy Points:
    131
    Guys, the 680M is the best gaming product in ages, and I think it is enough for it to be that way; you cannot do everything. Yeah, I think they will separate the professional cards further (they already started with Tesla, who cares about Quadro).
     
  23. Botsu

    Botsu Notebook Evangelist

    Reputations:
    105
    Messages:
    624
    Likes Received:
    0
    Trophy Points:
    30
    Yep the GTX 680 sucks in that respect. Even the Radeon 7870 fares better iirc. But I personally don't give a damn and I'm willing to bet that 98% of its potential buyers don't either. GPGPU focus results in big, complex, expensive, inefficient chips (Fermi) and can be detrimental to gaming performance (Cayman).

    With Fermi Nvidia basically said screw that, let's build the fastest gaming & compute chip, even if it has to be >500mm² and inefficient. It proved quite troublesome at first but in the end they achieved their goal. AMD stayed reasonable for Tahiti but the fact that Nvidia improved efficiency with Kepler and went for a ~300mm² chip specifically designed for gamers looking for the best bang for the buck is a bummer for them, as they used to rely on that as their competitive advantage, at least since the 4000 series.

    If you factor in that the green team is scheduled to release their own big GPGPU-oriented chip later, I think AMD has taken the risk of looking plain silly in the upper-mid/high-end segments: they no longer have the efficiency crown, they are not in a position to set the trend (since their chip is bigger and more complex), and Nvidia will have two better-performing chips covering both the gaming and GPGPU markets.
     
  24. Deks

    Deks Notebook Prophet

    Reputations:
    1,272
    Messages:
    5,201
    Likes Received:
    2,073
    Trophy Points:
    331
    Except that the pro and regular cards are identical.
    They mostly separate them via VBIOS or software... or through some other measure.
    Besides, CUDA is a fairly recently introduced product... why didn't they simply remove it from gaming GPUs in the first place?

    It's idiotic and unfair.
     
  25. maxheap

    maxheap caparison horus :)

    Reputations:
    1,244
    Messages:
    3,294
    Likes Received:
    191
    Trophy Points:
    131
    Dude, not for Nvidia; Quadro/Tesla/GeForce are totally different (well, spec-wise; from a manufacturing point of view I have no idea).
     
  26. Prema

    Prema Your Freedom, Your Choice

    Reputations:
    9,368
    Messages:
    6,297
    Likes Received:
    16,485
    Trophy Points:
    681
  27. Cloudfire

    Cloudfire (Really odd person)

    Reputations:
    7,279
    Messages:
    10,304
    Likes Received:
    2,878
    Trophy Points:
    581
  28. lordbaldric

    lordbaldric Notebook Consultant

    Reputations:
    2
    Messages:
    203
    Likes Received:
    6
    Trophy Points:
    31
    This is totally untrue. They are the same cards.
     
  29. maxheap

    maxheap caparison horus :)

    Reputations:
    1,244
    Messages:
    3,294
    Likes Received:
    191
    Trophy Points:
    131
    So you are saying an equal number of CUDA cores makes them the same cards... I said I don't know the manufacturing process, but every spec except the CUDA core count is different. Show me the Quadro 6000's equivalent in the GTX family?
     
  30. technerd

    technerd Notebook Guru

    Reputations:
    0
    Messages:
    50
    Likes Received:
    0
    Trophy Points:
    15
    graz`zt is right; Quadro and Tesla cards have more VRAM and are GPGPU oriented, whereas the GTX series is gaming oriented.
     
  31. Prema

    Prema Your Freedom, Your Choice

    Reputations:
    9,368
    Messages:
    6,297
    Likes Received:
    16,485
    Trophy Points:
    681
    Our Chinese friend 慕容蛛蛛 just said that the Clevo GTX680M 4GB is now expected end of June... ;)
     
  32. yknyong1

    yknyong1 Radiance with Radeon

    Reputations:
    1,191
    Messages:
    2,095
    Likes Received:
    8
    Trophy Points:
    56
    4GB will not matter for most gamers. Most of the notebooks it is destined for support a max of 2 displays at any one time. What I am interested to see is the 2GB variant. ;)
     
  33. awakeN

    awakeN Notebook Deity

    Reputations:
    616
    Messages:
    1,067
    Likes Received:
    4
    Trophy Points:
    56
    That sounds like overkill o_O about as much overkill as having 32 GB of RAM.
     
  34. technerd

    technerd Notebook Guru

    Reputations:
    0
    Messages:
    50
    Likes Received:
    0
    Trophy Points:
    15
    Not to mention the 64GB RAM capabilities of those new X79 boards. But back on topic: why would Nvidia put 4GB on a mobile chip and only have 2GB on their desktop version?
     
  35. maxheap

    maxheap caparison horus :)

    Reputations:
    1,244
    Messages:
    3,294
    Likes Received:
    191
    Trophy Points:
    131
    Probably they are considering surround display for the mobile platform too (otherwise, no idea).
     
  36. yknyong1

    yknyong1 Radiance with Radeon

    Reputations:
    1,191
    Messages:
    2,095
    Likes Received:
    8
    Trophy Points:
    56
    4GB option for the GTX 680 in June!
     
  37. aduy

    aduy Keeping it cool since 93'

    Reputations:
    317
    Messages:
    1,474
    Likes Received:
    0
    Trophy Points:
    55
    Well, actually the new P150EM has DVI, HDMI, and DisplayPort, and Nvidia says the GK104 can output to up to 4 displays, so it makes perfect sense to have that much VRAM if it can support that many displays. And anyway, they always stick more VRAM in notebooks.
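
    For what it's worth, the framebuffers themselves are only a small slice of 4GB even with four screens attached; here is a rough estimate (the resolutions and triple buffering are assumptions, and the real budget goes to textures and render targets):

    # Rough framebuffer footprint for a 4-display setup:
    # width * height * 4 bytes per pixel * buffers per display * displays.
    # Resolutions and triple buffering are assumptions for illustration.

    def framebuffer_mb(width, height, displays=4, buffers=3, bytes_per_pixel=4):
        return width * height * bytes_per_pixel * buffers * displays / 2**20

    print(f"4x 1920x1080: {framebuffer_mb(1920, 1080):.0f} MB")  # ~95 MB
    print(f"4x 2560x1600: {framebuffer_mb(2560, 1600):.0f} MB")  # ~188 MB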
     
  38. Cloudfire

    Cloudfire (Really odd person)

    Reputations:
    7,279
    Messages:
    10,304
    Likes Received:
    2,878
    Trophy Points:
    581
    Thanks man. Nice of you to keep us updated. I wonder if I can wait all the way to July, but it's tempting :)

    Yup, agree with this. For people using a single monitor, 4GB is total overkill though.

    It makes you wonder if you have to pay a premium for the card, since they stuffed so much VRAM inside to cater to everyone.

    ---------------------

    The 7970M is due to release any time now, btw. The new Alienwares will offer the 675M, 660M and 7970M, plus Clevo/Sager have now discontinued the 6970M and 6990M. Something is coming pretty soon :)

     
  39. abhi1986

    abhi1986 Notebook Enthusiast

    Reputations:
    0
    Messages:
    17
    Likes Received:
    0
    Trophy Points:
    5
    When can we expect to see one in a Dell XPS 17? I am thinking of buying one in May. I'll wait another month or 2 if it releases.
     
  40. Deks

    Deks Notebook Prophet

    Reputations:
    1,272
    Messages:
    5,201
    Likes Received:
    2,073
    Trophy Points:
    331
    Unless you are working across several monitors and using the full power of your GPU, more VRAM will hardly matter.
    As for being more GPGPU oriented... that's only because the manufacturer of the cards forces them toward OpenGL (via VBIOS or another type of modification) while gaming GPUs use Direct3D (both have the same hardware support though... the primary difference is in the software).

    Hardware-wise they ARE identical.
    You also have gaming GPUs with different VRAM sizes/types; that doesn't mean it does them any good in gaming though.
    :D
     
  41. maxheap

    maxheap caparison horus :)

    Reputations:
    1,244
    Messages:
    3,294
    Likes Received:
    191
    Trophy Points:
    131
    Man, seriously? He wasn't talking about performance; the VRAM being different plainly and simply makes it a different card, get it? It is not a BIOS modification or driver difference that makes the RAM differ... *facepalm*
     
  42. yknyong1

    yknyong1 Radiance with Radeon

    Reputations:
    1,191
    Messages:
    2,095
    Likes Received:
    8
    Trophy Points:
    56
    The difference between most consumer and professional GPUs is double- vs single-precision performance... Nothing more.
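
    Roughly what that gap looks like in numbers; the FP32 figures and the FP64:FP32 ratios here (consumer Kepler heavily capped, Fermi-generation Tesla at half rate) are illustrative assumptions rather than quoted specs:

    # Consumer cards typically run double precision at a small fraction of their
    # single-precision rate, while the pro/compute parts run it much faster.
    # The FP32 numbers and the ratios below are assumptions for illustration.

    def fp64_gflops(fp32_gflops, fp64_ratio):
        return fp32_gflops * fp64_ratio

    consumer_kepler = fp64_gflops(3000, 1 / 24)  # heavily capped FP64
    fermi_tesla     = fp64_gflops(1300, 1 / 2)   # half-rate FP64

    print(f"Consumer Kepler FP64: ~{consumer_kepler:.0f} GFLOPS")  # ~125
    print(f"Fermi Tesla FP64   : ~{fermi_tesla:.0f} GFLOPS")       # ~650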
     
  43. nissangtr786

    nissangtr786 Notebook Deity

    Reputations:
    85
    Messages:
    865
    Likes Received:
    0
    Trophy Points:
    0
    Google Translate

    This shows the GT 650M should play games like Dirt 3 without AA at over 30 FPS on high, I would say. Looks like a GTX 660M is needed for proper gaming.
     
  44. Cloudfire

    Cloudfire (Really odd person)

    Reputations:
    7,279
    Messages:
    10,304
    Likes Received:
    2,878
    Trophy Points:
    581
    Oh nice, a review. Thanks :)

    They are testing the DDR3 version of the GT 650M. It is the Samsung Q470, which also has a weak dual core without turbo, which makes it all the more impressive that the DDR3 version crushes the 555M in benchmarks and, shockingly, is not that far from the GTX 560M lol.
    I see that the notebook does very well in Battlefield with high settings and in Resident Evil with high settings too. Throw in a better CPU and it would do a lot better in Dirt 3. Or use the GDDR5 version :)

    I tried to rep you but have given away too much rep today. You'll have to wait.
     
  45. nissangtr786

    nissangtr786 Notebook Deity

    Reputations:
    85
    Messages:
    865
    Likes Received:
    0
    Trophy Points:
    0
    I saw the GPU-Z and it is DDR3. Maybe the GT 650M GDDR5 would be as powerful as a GTX 560M then. My Acer 5930G, though, is good enough except that the CPU is poor, so I may wait till the Dell Outlet sells the L7..x and L521x for half the retail price.
     
  46. TheBluePill

    TheBluePill Notebook Nobel Laureate

    Reputations:
    636
    Messages:
    889
    Likes Received:
    0
    Trophy Points:
    30
    The only exception I can see is for people using SLI'd 680Ms on a high-res (4K) external display. That would be pretty handy. But those folks are few and far between.

    A 4GB desktop card would be much more useful.


    Hardware-wise, yes. But the drivers make the biggest difference. The gamer card drivers are tweaked often to maximize game performance and patch bugs. That's not the case with the professional/CAD video card drivers.
     
  47. Deks

    Deks Notebook Prophet

    Reputations:
    1,272
    Messages:
    5,201
    Likes Received:
    2,073
    Trophy Points:
    331
    That's my point.
    The only main difference between pro and gaming cards is in the drivers (software), which specialize the GPUs in a single direction.
    Hardware-wise, they are effectively the same; therefore a gaming GPU, if successfully modded to be recognized as its pro equivalent, would behave in 3D programs like a pro card.

    And Nvidia/AMD both overcharge for the difference at hand, which is plain and simple robbery.
     
  48. sancez

    sancez Notebook Guru

    Reputations:
    0
    Messages:
    60
    Likes Received:
    0
    Trophy Points:
    15
    How can you say it is robbery? NV/AMD are getting paid to support their products for a market where money isn't as important as stability and good support for the programs needed.

    It's simply another market. If the products are the same, who cares? They serve a different purpose, thus the different price.
     
  49. maxheap

    maxheap caparison horus :)

    Reputations:
    1,244
    Messages:
    3,294
    Likes Received:
    191
    Trophy Points:
    131
    For Nvidia this is just wrong. Please look at the specs of the Quadro 6000 (which indeed was a technological BREAKTHROUGH), GTX 560 Ti, GTX 570, GTX 580; ALL are different... (that is why they are charging 2k for a Tesla GPU and the world at large is buying it; otherwise, trust me, companies are NOT stupid...) An equal number of CUDA cores does NOT make the cards the same...

    For AMD, it is right, as the M8900 is equal to the 6970M spec-wise (in the mobile world; I haven't checked desktop specs yet).
     
  50. AlphaMagnum

    AlphaMagnum Notebook Consultant

    Reputations:
    19
    Messages:
    117
    Likes Received:
    1
    Trophy Points:
    31
    My big concern at this point is what comes next haha.

    I mean, the 600M series has largely been a disappointment imo, due to the long string of rebadges. If the 680M isn't absolutely phenomenal, I wouldn't be surprised if Nvidia releases something ridiculous, say, 6 months after the release of the 680M. Like what the GTX 485M was to the GTX 480M, if you're looking for a comparison.
     