The Notebook Review forums were hosted by TechTarget, which shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information posted on the forums is preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.
Problems? See this thread at archive.org.

    NVIDIA Geforce GTX 660M Release Information + CUDA Core Count

    Discussion in 'Gaming (Software and Graphics Cards)' started by yknyong1, Mar 22, 2012.

  1. Cloudfire

    Cloudfire (Really odd person)

    Reputations:
    7,279
    Messages:
    10,304
    Likes Received:
    2,878
    Trophy Points:
    581
I don't think the Kepler chips themselves run much hotter than Fermi, to be honest. It is Acer that has screwed the pooch. All the reviews of the Acer Timeline say there is a hotspot under the laptop that gets extremely hot. After all, the Acer is an ultrabook. You won't find many GT 555Ms inside ultrabooks, btw...

I think Acer used some light and very thin materials around the GPU (or whatever is causing the heat), thus creating a risk of burns.

Here is a guy who has been OVERCLOCKING his GT 650M inside the Lenovo Y480.

His GPU is 44°C while idling, and maxes at 71°C under HEAVY overclock at full load. So yeah...
Next-gen architecture revealed! Real GT 650M specs! Surprises inside! _ Notebook General Forum _ PConline product forums
     
  2. gamba66

    gamba66 Notebook Evangelist

    Reputations:
    63
    Messages:
    491
    Likes Received:
    3
    Trophy Points:
    31
Isn't that the Samsung Q470? At least GPU-Z says the card manufacturer is Samsung... but I haven't really checked.

Like I said, we will just have to wait and see; from this point on everything is just guessing and speculating. The few hard facts we have don't give us a lot of info about throttling or the new Kepler running hot...

Also, the problem isn't the GPU getting too hot, but the combination of the GPU and CPU. I can only repeat: IVB still has the same throttling limits but offers more performance; in combination with Kepler it could become very hot and throttle. We won't know until the first IVB laptops with Kepler GPUs are tested.

A respected member of a German Alienware forum (supposedly an engineer) even claims that IVB is more likely to throttle than Sandy Bridge... So let's wait and see :)

Also, turbo boost is an indication of the cooling solution not being able to handle the full power of a CPU. Turbo boost is only supposed to be enabled for a short time when the CPU is in intensive situations, not for long stretches like when playing Battlefield 3...

Also, that the upcoming MSI GE60 and GE70 fat gamer notebooks feature a 650M doesn't really suggest the 650M runs cool.

    http://www.notebookjournal.de/news/kommende-msi-gaming-boliden-setzen-auf-neue-nvidia-gpus-4388
     
  3. yknyong1

    yknyong1 Radiance with Radeon

    Reputations:
    1,191
    Messages:
    2,095
    Likes Received:
    8
    Trophy Points:
    56
With GPU Turbo pushing the clocks until TDP is reached... it is little wonder the cards are hot.
     
  4. gamba66

    gamba66 Notebook Evangelist

    Reputations:
    63
    Messages:
    491
    Likes Received:
    3
    Trophy Points:
    31
Not really. TDPs aren't exact measures, only theoretical values...
     
  5. yknyong1

    yknyong1 Radiance with Radeon

    Reputations:
    1,191
    Messages:
    2,095
    Likes Received:
    8
    Trophy Points:
    56
In fact, the cards now make the most of their thermal and power capacity, so you know the result.
     
  6. gamba66

    gamba66 Notebook Evangelist

    Reputations:
    63
    Messages:
    491
    Likes Received:
    3
    Trophy Points:
    31
You should read this then...

    Source
     
  7. yknyong1

    yknyong1 Radiance with Radeon

    Reputations:
    1,191
    Messages:
    2,095
    Likes Received:
    8
    Trophy Points:
    56
    TDP differs by manufacturer though...
     
  8. Cloudfire

    Cloudfire (Really odd person)

    Reputations:
    7,279
    Messages:
    10,304
    Likes Received:
    2,878
    Trophy Points:
    581
Oops, my bad. It is a Samsung Q470 and not a Lenovo :)
Guess we will have to wait and see some reviews before coming to a conclusion about the heat with Kepler, yes. But I sincerely hope the turbo isn't limited by heat like Sandy Bridge's is, and is instead able to turbo fully throughout the gaming session. I imagine this turbo as a power-saving feature that downclocks while idling or playing older, easier games, while being able to run full turbo in demanding games.

But you're right, turbo means more heat. Laptop OEMs are responsible for making a cooling system and using materials that can handle this turbo heat without destroying the laptop or hurting the owners, though.

Oh well, still a few questions that need to be answered :)
     
  9. Botsu

    Botsu Notebook Evangelist

    Reputations:
    105
    Messages:
    624
    Likes Received:
    0
    Trophy Points:
    30
    sorry for double posting.
     
  10. Botsu

    Botsu Notebook Evangelist

    Reputations:
    105
    Messages:
    624
    Likes Received:
    0
    Trophy Points:
    30
    Yep, I expect the mobile GK 104 to be launched as GTX 680M and then the 700 series a bit later, or GTX 780M outright. The first case would be just like what they did with the GTX 485M.

That would make no sense. Nvidia doesn't have a track record of playing with the last digits to squeeze new products into the line of existing ones, and they'd be right for once, as it would be plain stupid. The GTX 670/675M are not fated to last, so Nvidia has no interest in those soon-to-be-outdated chips overshadowing their new mobile Kepler.

Releasing new mobile parts featuring the GK 106 (768 CC) under names like "GTX 665M" would only "work" if it performed significantly worse than the GTX 670M, which is highly unlikely (in fact it will probably trounce it), and it would in any event make the nomenclature incomprehensible.

Nvidia won't show concern for the implied "rule" that says you can't release a new series too soon after the previous one for no apparent good reason, especially not with their recent track record, and they aren't likely to be bothered by a crowded nomenclature of GTX 660M, 665M, 670M, 675M, 680M, etc. They'll launch the 700 series sooner than anyone thinks and be done with it, and few people will find reasons to object.
     
  11. gamba66

    gamba66 Notebook Evangelist

    Reputations:
    63
    Messages:
    491
    Likes Received:
    3
    Trophy Points:
    31
Turbo boost isn't about downclocking a CPU/GPU to save power, but rather overclocking it in intensive tasks. GPUs have already had lower clocks at idle for a long time, for example, so that isn't anything new.

    Turbo Boost is ideal for situations where the CPU needs a short extra burst of speed to get a task done a bit quicker. If it's a prolonged task, then Turbo Boost is going to kick on and off depending on how well the CPU is being cooled and generally throw off a lot of heat in the process.
     
  12. Cloudfire

    Cloudfire (Really odd person)

    Reputations:
    7,279
    Messages:
    10,304
    Likes Received:
    2,878
    Trophy Points:
    581
What I meant was that in a super-weak game, let's say Pacman 3D, you got full clocks with a 580M, along with lots of heat and lots of energy consumption. You also had the same full clocks with Battlefield 3. But with this Kepler turbo feature the GPU doesn't have to run at full speed in Pacman 3D to make 60 FPS (or whatever); it downclocks itself to save power while still giving you smooth gameplay (60 FPS). Battlefield, however, gets full clocks all the time, like previous generations had. They didn't downclock the turbo but could run at max forever. That is what I'm hoping Kepler has too. And hopefully not the throttling problem you speak of, which Sandy Bridge had due to bad cooling :)
     
  13. yknyong1

    yknyong1 Radiance with Radeon

    Reputations:
    1,191
    Messages:
    2,095
    Likes Received:
    8
    Trophy Points:
    56
    Do you think GTX 660M may have a castrated DDR3 version?
     
  14. Deks

    Deks Notebook Prophet

    Reputations:
    1,272
    Messages:
    5,201
    Likes Received:
    2,073
    Trophy Points:
    331
    The 'hot clocks' scenario is idiotic for prolonged use given the heat/energy it consumes.
    Either keep the clocks at a certain level where the heat production and energy consumption remains relatively minor or force the manufacturers to implement superior cooling technology so they can keep up with the hardware.

It's idiotic to keep things 'cheap' when they could already spend minute amounts of cash on superior materials.

    Someone needs to step on EVERYONE'S toes to get their rear ends out of the gutter.
    This has been going on too long.

As for the 660M possibly having a 'castrated' DDR3 version... oh, it's entirely possible.
I think Asus managed to gimp the 555M and 560M because the 'regularly clocked' versions were too hot (for them).
     
  15. plancy

    plancy Notebook Evangelist

    Reputations:
    56
    Messages:
    550
    Likes Received:
    0
    Trophy Points:
    30
I don't think so. Isn't the 650M like that? GTX marks the higher-end side of mobile GPUs, so I don't think they would do that; the 560M didn't have a DDR3 version.
     
  16. SlickDragon

    SlickDragon Notebook Consultant

    Reputations:
    80
    Messages:
    143
    Likes Received:
    88
    Trophy Points:
    41
    Great thread with interesting speculation and hordes of useful information, so thanks guys!

I usually agree with Cloudfire's guesses/prophecies, but I highly doubt there will be a GTX 665M Kepler variant that many of us would desire. Instead we will have to wait till the GTX 7xx series (or mobile GCN) to find true next-gen replacements for our current GTX 5xx cards.

While the (Kepler versions of the) GT 640M, GT 650M, and GTX 660M are decent mid/low-tier cards, they ultimately are just the same card with different memory/bus/clocks, in the same vein as the 555M and its hordes of variations (it even propagated them all to its crappy rebadged self, the 635M). Nonetheless, it's extremely likely that there will be a gimped version of this card still called the "GTX" 660M even while castrated (think Asus G53SX).
     
  17. Cloudfire

    Cloudfire (Really odd person)

    Reputations:
    7,279
    Messages:
    10,304
    Likes Received:
    2,878
    Trophy Points:
    581
If a 665M isn't made (which I think is likely, though there's a slight possibility it will be), the 700 series has to arrive very soon, because if GK104 is the GTX 680M, then Nvidia would have to make room for GK106, which is also arriving pretty soon on the desktop side and consists of the GTX 650, 650 Ti and perhaps GTX 660.

Which means we'd have a GK104 GTX 680M in July and a 700 series with GK106 out almost at the same time?
     
  18. yknyong1

    yknyong1 Radiance with Radeon

    Reputations:
    1,191
    Messages:
    2,095
    Likes Received:
    8
    Trophy Points:
    56
    It happened with the 500m series and 485m, so I wouldn't be surprised.
     
  19. Cloudfire

    Cloudfire (Really odd person)

    Reputations:
    7,279
    Messages:
    10,304
    Likes Received:
    2,878
    Trophy Points:
    581
I don't understand why you guys keep mentioning the 485M and the 500 series. The 485M came out early January 2011, the GTX 560M came early June 2011 and the GTX 570M early July 2011. That is 5 months after the 400 series. Which means that if Nvidia follows this, the 700 series should arrive December 2012 at the earliest.

It makes sense, however, to put the GK106 GPUs in the 700 series, because Nvidia just had to use some of the names in the 600 series for their rebadging (curse you, Nvidia).

    :confused:
     
  20. Deks

    Deks Notebook Prophet

    Reputations:
    1,272
    Messages:
    5,201
    Likes Received:
    2,073
    Trophy Points:
    331
Lol... if people are buying new systems with these GPUs, I think it's best they focus on the 640M and 650M (or just OC the 640M to 650M, since I think they might be the same card with only a difference in clocks), and those who want the higher-end GPUs should hold off until Nvidia releases the 7 series, or just get the 5xx-series GPUs at a lower price.
     
  21. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,436
    Messages:
    58,194
    Likes Received:
    17,902
    Trophy Points:
    931
What do you have to base this on? Nvidia's next chip, the GK110, is going to be a monster chip which will never find its way into a notebook.

    Nvidia are releasing their small kepler first on the notebook front and middle kepler on desktop.

    Expect to see the desktop version of small kepler fairly soon and middle kepler eventually release as the 680M.

    Remember Nvidia only has two chips this round since big kepler (GK100) was scrapped.
     
  22. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,436
    Messages:
    58,194
    Likes Received:
    17,902
    Trophy Points:
    931
Kepler annihilates Fermi for power efficiency; it totally and utterly destroys it, there is no competition. It's also now doing better than AMD's chips.


    Source:

    http://www.anandtech.com/show/5699/nvidia-geforce-gtx-680-review/19

    Slightly more power than an HD7950, for more performance than the 7970.
     
  23. maxheap

    maxheap caparison horus :)

    Reputations:
    1,244
    Messages:
    3,294
    Likes Received:
    191
    Trophy Points:
    131
Kepler is like 4 times more power efficient than Fermi (which is what they originally claimed it would be, back in 2009), which makes me wonder how good Maxwell will be :)
     
  24. Cloudfire

    Cloudfire (Really odd person)

    Reputations:
    7,279
    Messages:
    10,304
    Likes Received:
    2,878
    Trophy Points:
    581
What is very funny is that the GTX 680 isn't that far away from the GTX 590 and yet is 200W+ below it in power consumption.

And the GTX 680 is 30-40% better than the GTX 580 in games while using 60W less. And the 680 is 295mm^2 while the 580 is 520mm^2.

    LOL

I can't wait to see how the real high end of Kepler will perform. The GK110 is the real high end and is about the same size as the 580 (550mm^2), increasing the die size by 255mm^2 from the 680, stuffing in 2.5 billion more transistors than the 680, and increasing the memory bus from 256-bit to 512-bit. It ain't gonna be pretty :p

    LOL :D
     
  25. yknyong1

    yknyong1 Radiance with Radeon

    Reputations:
    1,191
    Messages:
    2,095
    Likes Received:
    8
    Trophy Points:
    56
    Question is, will it even make it into notebooks?
     
  26. long2905

    long2905 Notebook Virtuoso

    Reputations:
    2,443
    Messages:
    2,314
    Likes Received:
    114
    Trophy Points:
    81
    being optimistic I'd say GTX 780M :D (which will probably be due in mid 2013)
     
  27. yknyong1

    yknyong1 Radiance with Radeon

    Reputations:
    1,191
    Messages:
    2,095
    Likes Received:
    8
    Trophy Points:
    56
    And I noticed GT 630M and 620M are shrunk Fermi dies.
     
  28. Phinagle

    Phinagle Notebook Prophet

    Reputations:
    2,521
    Messages:
    4,392
    Likes Received:
    1
    Trophy Points:
    106
It wasn't just the 400M -> 500M rebrand.

When 40nm wasn't ready for them back in March '09, Nvidia used re-branded 9000M cards to launch the 100M (low/mid range) and 200M (GTX cards) series at the same time. Then a few months later, in June '09, the 65nm GT 100M cards were replaced by 40nm GT 200M parts.

In Jan. '10, so as not to appear a generation behind AMD's HD 5000 series launch, those 200M cards got rebranded as the GT 300M series. A few of Nvidia's 40nm chip designs got scrubbed and a top-end GTX 380M was never launched. The GTX 285M... an overclocked GTX 280M... lasted until the crap Fermi-based GTX 480M was "ready".
     
  29. TheBluePill

    TheBluePill Notebook Nobel Laureate

    Reputations:
    636
    Messages:
    889
    Likes Received:
    0
    Trophy Points:
    30
Those were kind of the "Dark Days" for mobile GPUs. A lot of it was "me-too", and most attempts at a mobile GPU were to appeal to a small niche or to fill the experimental/trial machines of the OEMs. None of the manufacturers were really serious about it until a year or two ago.

I think companies now realize the importance, and growth, of the platform. I am not shocked by Nvidia's half-rebrand, half-new approach, though I do find it odd they are producing the 675M (I think it will only be on the market for a short time). Most companies relegate the rebadges to the lower end of the line. I think that the buying public that does any research will only buy the new parts, and that, in turn, will mean the OEMs only stock and sell the 640/660/680M parts.

After this generation, I think the product lines will tighten up a bit more.
     
  30. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,436
    Messages:
    58,194
    Likes Received:
    17,902
    Trophy Points:
    931
No, because then we would have another 480M on our hands. They might do it for workstations for the compute, but it won't be faster than the 680M, simply because of where the chips are optimised for power consumption.
     
  31. TheBluePill

    TheBluePill Notebook Nobel Laureate

    Reputations:
    636
    Messages:
    889
    Likes Received:
    0
    Trophy Points:
    30
We will likely get a derivative of the desktop 660 as the 680M, just as we did last generation.

Now, I would love to see a workstation "Quadro Mobile" part based on the full desktop 680. It would probably have to fit in an 18" chassis and be a single-card solution... but it would be pretty killer. Never happen... but still...
     
  32. nissangtr786

    nissangtr786 Notebook Deity

    Reputations:
    85
    Messages:
    865
    Likes Received:
    0
    Trophy Points:
    0
    I really want a gtx 660m.

    NVIDIA Announces the GeForce 600M series Enabling High-end gaming on Smaller Laptops

    This is valuable information as it shows the benchmark details.

Looks like the GT 640M is for 720p gaming and the GTX 660M for 1080p gaming. I so want the GTX 660M; even if it has a 40-45W TDP, it is worth getting for the extra grunt. I am sure I can play most of the newest games at 60fps at 1080p with one or two settings at medium-high-ish.
     
  33. maxheap

    maxheap caparison horus :)

    Reputations:
    1,244
    Messages:
    3,294
    Likes Received:
    191
    Trophy Points:
    131
Looks that way, buddy. I think the GTX 660M will be a big success in mainstream/enthusiast gaming :)
     
  34. ryzeki

    ryzeki Super Moderator Super Moderator

    Reputations:
    6,552
    Messages:
    6,410
    Likes Received:
    4,087
    Trophy Points:
    431
Man... that chart looks wrong. The GT 650M looks faster in that same chart.

And several generations later... still with a mere ~60 GB/s of bandwidth? That's barely enough for high-resolution gaming. They should have gone to at least the 70s. They basically have the same memory bandwidth as cards dating back to the 9800M GTX.

The mid range got a nice bump in performance. The rest remained more or less the same.
     
  35. Speedy Gonzalez

    Speedy Gonzalez Xtreme Notebook Speeder!

    Reputations:
    5,447
    Messages:
    3,143
    Likes Received:
    27
    Trophy Points:
    116
    if they can just put that 660m on a macbook pro and price it under 2k :) just dreaming here :D
     
  36. ryzeki

    ryzeki Super Moderator Super Moderator

    Reputations:
    6,552
    Messages:
    6,410
    Likes Received:
    4,087
    Trophy Points:
    431
I say you should aim for a 570M/670M or higher. If the 660M is aimed below the 570M, you will be out of luck playing the newest games at high settings and 60fps. I certainly can't maintain such fps unless I dial down more than one setting haha.
     
  37. nissangtr786

    nissangtr786 Notebook Deity

    Reputations:
    85
    Messages:
    865
    Likes Received:
    0
    Trophy Points:
    0
    ryzeki

That chart is not wrong. The GT 640M column shows games run at 1366x768 vs the GTX 660M running at 1920x1080. The GTX 660M is the card to get, and it's supposed to be a 40-45W card. My current 9600M GT is a 23W card, but I am sure a 40-45W card won't matter that much if you are future-proofing yourself, as it's so much more powerful.

My dream is to get a 3612QM (35W) and a GTX 660M (40-45W), so around 30W more than my current laptop, which takes 70W when stressed while gaming; that makes about 100W, which is less than a 2630QM plus GT 555M.
     
  38. TheBluePill

    TheBluePill Notebook Nobel Laureate

    Reputations:
    636
    Messages:
    889
    Likes Received:
    0
    Trophy Points:
    30
    Sounds like what i figure Razer will do with the next Blade.
     
  39. nissangtr786

    nissangtr786 Notebook Deity

    Reputations:
    85
    Messages:
    865
    Likes Received:
    0
    Trophy Points:
    0
You do realise the GTX 660M is the highest-end 28nm card Nvidia will release first; it will fit in thin laptops and take less than half the power of your 570M. What I mean is, those charts show 1080p on ultra detail, or the highest possible, at 35-50fps. The 670M is a pointless card: all it is is a big, power-hungry card that will require massive cooling. I'd rather have 2x the performance per watt than the 670M, and although I won't overclock, the performance should be similar at half the power.

Besides, my current 5930G runs games at 35-50fps on high: Dirt 3 at 1280x800, the FIFA 11 demo at 60fps, Batman: Arkham Asylum on high at 35fps, or Test Drive Unlimited 2 at 35fps on medium-high-ish. My CPU probably bottlenecks TDU2 a bit.

I reckon the GT 650M with GDDR5 is the bare minimum for gaming. The GTX 660M is perfect for most people, as it is the 28nm Kepler that is a 40-45W card.

22nm Ivy Bridge with 22nm integrated graphics and a 28nm Kepler GTX 660M is great for virtually everything. I won't mind playing every game at 1080p with shadows down to medium if it gets to 60fps, I have great battery life and my laptop only takes 100W to game. That's perfect really, as 1080p gaming on a portable 15.6" laptop that weighs 2.5kg will be great. I prefer a laptop not to be thin if the cooling is affected. Besides, with new driver updates games will run better.
     
  40. Cloudfire

    Cloudfire (Really odd person)

    Reputations:
    7,279
    Messages:
    10,304
    Likes Received:
    2,878
    Trophy Points:
    581
Put it this way: if the GTX 660M comes with two memory bus configurations like the GTX 460M did when Fermi was introduced, you get these bandwidths:

    1. 128 bit GDDR5: 128bit/8 = 16, 16 x 2000 = 32 000, 32 000 x 2 = 64 000MB/s = 64 GB/s

    2. 192 bit GDDR5: 192bit/8 = 24, 24 x 2000 = 48 000, 48 000 x 2 = 96 000MB/s = 96 GB/s

    Comparisons:

    GTX 675M/GTX 580M 256bit GDDR5: 96GB/s
    GTX 660M 192bit GDDR5: 96GB/s
    GTX 670M/570M 192bit GDDR5: 72GB/s
    GTX 660M 128bit GDDR5: 64GB/s
    GTX 560M 192bit GDDR5: 60GB/s

    So yeah, I bet it is easy for Nvidia to bump up the performance of GTX 660M in wait for 700 series. And I bet that this 192bit configuration will beat the 570M easily ;)

Now the question remains: Would they make this? Is their digit system really so important that they keep the 660M in chains? I really couldn't give a rat's *ss about it.
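The arithmetic in the post above can be sketched as a tiny helper. Note the 2000 MHz data clock and the x2 DDR factor are the post's assumptions for these rumored configurations, not confirmed specs:

```python
def gddr5_bandwidth_gb_s(bus_width_bits: int, data_clock_mhz: int = 2000) -> float:
    """Memory bandwidth using the formula from the post:
    (bus width / 8) bytes per transfer x data clock x 2 (double data rate)."""
    bytes_per_transfer = bus_width_bits // 8
    mb_per_s = bytes_per_transfer * data_clock_mhz * 2
    return mb_per_s / 1000  # MB/s -> GB/s

# The two rumored GTX 660M bus widths:
print(gddr5_bandwidth_gb_s(128))  # 64.0 GB/s
print(gddr5_bandwidth_gb_s(192))  # 96.0 GB/s
```

The same helper reproduces the 675M/580M figure with a 1500 MHz data clock on a 256-bit bus.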
     
  41. TheBluePill

    TheBluePill Notebook Nobel Laureate

    Reputations:
    636
    Messages:
    889
    Likes Received:
    0
    Trophy Points:
    30
I would say you are spot on here. That is what we will wind up with.
     
  42. TheBluePill

    TheBluePill Notebook Nobel Laureate

    Reputations:
    636
    Messages:
    889
    Likes Received:
    0
    Trophy Points:
    30
    Weird.. Double Post. Please delete.
     
  43. PaKii94

    PaKii94 Notebook Virtuoso

    Reputations:
    211
    Messages:
    2,376
    Likes Received:
    4
    Trophy Points:
    56
Can we expect the GTX 660M in a 15.6" laptop under 6 lbs and not very bulky? In other words, not in bulky gaming-behemoth laptops?
     
  44. Cloudfire

    Cloudfire (Really odd person)

    Reputations:
    7,279
    Messages:
    10,304
    Likes Received:
    2,878
    Trophy Points:
    581
I know the 15.6-inch Lenovo Y580 will have the GTX 660M. It weighs 2.7kg and is 1.4 inches high.
And the Asus G55, which also has a GTX 660M and is also a 15-incher.

    Probably more but these are the ones I know of.
     
  45. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,436
    Messages:
    58,194
    Likes Received:
    17,902
    Trophy Points:
    931
    I doubt there is a 192bit bus on the chip.

It's a replacement for the 550 Ti: a 128-bit memory controller capable of faster speeds, giving the chip the bandwidth it needs.
     
  46. Cloudfire

    Cloudfire (Really odd person)

    Reputations:
    7,279
    Messages:
    10,304
    Likes Received:
    2,878
    Trophy Points:
    581
    GREAT news everyone. GTX 660M with 192bit bus confirmed!! :cool: :cool:

It's impossible to build a 128-bit GPU with 1.5GB of memory. Eurocom will have the 192-bit bus in their Neptune 2.0. What we will be seeing is exactly the same as with the GTX 460M/560M, where OEMs sell 2GB 128-bit and 1.5GB 192-bit versions. Don't let yourself be fooled.


    Cheers :) :)
    Eurocom Adds NVIDIA GeForce GTX 675M, GTX 670M, GTX 660M and GT 650M To Its Lineup | techPowerUp
     
  47. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,436
    Messages:
    58,194
    Likes Received:
    17,902
    Trophy Points:
    931
  48. Cloudfire

    Cloudfire (Really odd person)

    Reputations:
    7,279
    Messages:
    10,304
    Likes Received:
    2,878
    Trophy Points:
    581
Yeah, I know the specs at Nvidia say 128-bit only. Perhaps it is temporary? I don't know. But the Eurocom press release says 1.5GB GDDR5 with the 660M. Not just a typo from TechPowerUp. The 675M gets 2.0GB, which matches 256-bit. The 670M gets 1.5GB, which matches 192-bit.
    Press release

If it is not a typo, we will be seeing the same memory bandwidth with the 660M as with the GTX 675M/580M. Crazy :)
     
  49. maxheap

    maxheap caparison horus :)

    Reputations:
    1,244
    Messages:
    3,294
    Likes Received:
    191
    Trophy Points:
    131
Man, sorry to break your spirit, but I also think it is a typo :( Don't worry, our 680M will be a 256-bit, 2GB GDDR5, at-least-768-CUDA-core beast; that will be enough :)
     
  50. Cloudfire

    Cloudfire (Really odd person)

    Reputations:
    7,279
    Messages:
    10,304
    Likes Received:
    2,878
    Trophy Points:
    581
We will see. I just hope it's not, since I want to see what it can do. I too am awaiting the 680M, btw. I just want to keep myself occupied with Kepler reviews until June (it's gonna be 2 looooong months :p)
     