I don't think the Kepler chips themselves run much hotter than Fermi, to be honest. It is Acer that has screwed the pooch. All the reviews of the Acer Timeline say there is a hotspot under the laptop that gets extremely hot. After all, the Acer is an ultrabook. You won't find many GT 555Ms inside ultrabooks btw...
I think Acer used some light and very thin materials around the GPU, or whatever is causing the heat, thus creating a burn risk.
Here is a guy who has been OVERCLOCKING his GT 650M inside the Lenovo Y480.
His GPU sits at 44°C while idling, and maxes out at 71°C under a HEAVY overclock at full load. So yeah...
Next-generation architecture revealed! Real GT 650M specs! Surprises inside! _ Notebook General Forum _ PConline Product Forums
-
-
Like I said, we will just have to wait and see; from this point on everything is just guessing and speculating.. The few hard facts we have don't give us a lot of info about throttling or the new Kepler running hot...
Also, the problem isn't the GPU getting too hot, but the combination of the GPU and CPU.. I can only repeat.. IVB still has the same throttling limits but offers more performance; in combination with Kepler it could become very hot and throttle. We won't know until the first IVB laptops with Kepler GPUs are tested.
A respected member from a German Alienware forum (supposedly an engineer) even claims that IVB is more likely to throttle than Sandy Bridge.. So let's wait and see.
Also, Turbo Boost is an indication of the cooling solution not being able to handle the full power of a CPU; Turbo Boost is only supposed to be enabled for a short time when the CPU is in intensive situations, not for a long stretch like when playing Battlefield 3..
Also, the fact that the upcoming MSI GE60 and GE70 fat gamer notebooks feature a 650M doesn't really argue for the 650M running cool.
http://www.notebookjournal.de/news/kommende-msi-gaming-boliden-setzen-auf-neue-nvidia-gpus-4388 -
With GPU Turbo pushing the clocks until TDP is reached... it is little wonder the cards run hot.
-
-
In fact the cards now make full use of their thermal and power headroom, so you can guess the result.
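The mechanism is simple enough to sketch. Here's a toy Python loop showing the general idea being described; the power cap, base clock, bin size and power model are all illustrative assumptions, not Nvidia's actual GPU Boost implementation:

```python
# Toy sketch of a TDP-bound boost loop. This is NOT Nvidia's actual GPU
# Boost algorithm -- just an illustration of "raise clocks until TDP is hit".
TDP_WATTS = 45          # assumed power cap for a mid-range mobile part
BASE_MHZ = 835          # GTX 660M base clock
STEP_MHZ = 13           # size of one boost bin (illustrative guess)

def estimated_power(clock_mhz):
    # Crude stand-in for a real power sensor: draw scales with clock.
    return 30.0 + 0.05 * (clock_mhz - BASE_MHZ)

clock = BASE_MHZ
while estimated_power(clock + STEP_MHZ) < TDP_WATTS:
    clock += STEP_MHZ   # keep boosting while there is power headroom

print(f"settled at {clock} MHz, ~{estimated_power(clock):.1f} W")
```

Under a model like that, the card always runs right up against its thermal/power limit, which is exactly why the heat output is at a maximum.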
-
-
TDP differs by manufacturer though...
-
Guess we will have to wait and see some reviews before coming to a conclusion about the heat with Kepler, yes. But I sincerely hope the turbo isn't limited by heat like Sandy Bridge's is, and is instead able to turbo fully throughout a gaming session. I imagine this turbo as a power-saving feature that downclocks while idling or running older, easier games, while doing full turbo in demanding games.
But you're right, turbo means more heat. Laptop OEMs are responsible for building a cooling system and using materials that can handle this turbo heat without destroying the laptop or hurting the owner, though.
Oh well, still a few questions that need to be answered -
sorry for double posting.
-
Releasing new mobile parts based on the GK106 (768 CC) under names like "GTX 665M" would only "work" if they performed significantly worse than the GTX 670M, which is highly unlikely; in fact it's probable they would trounce it, and it would at all events make the nomenclature incomprehensible.
Nvidia won't show concern for the implied "rule" that says you can't release a new series too soon after the previous one and for no apparent good reason, especially not with their recent track record, and they aren't likely to let themselves be bothered by a crowded nomenclature of GTX 660M, 665M, 670M, 675M, 680M, etc. They'll launch the 700 series sooner than anyone thinks and be done with it, and few people will find reasons to object. -
Turbo Boost is ideal for situations where the CPU needs a short extra burst of speed to get a task done a bit quicker. If it's a prolonged task, then Turbo Boost is going to kick on and off depending on how well the CPU is being cooled and generally throw off a lot of heat in the process. -
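If you want to watch that on/off behaviour yourself, a rough sketch like this (using the third-party psutil package; frequency reporting isn't available on every platform) logs the CPU clock once a second. Run it during a long Battlefield 3 session and you'd expect the clock to sag as heat builds:

```python
# Rough sketch: log the CPU clock once a second to watch Turbo Boost engage
# and back off under sustained load. Needs the third-party psutil package.
import time
import psutil

for _ in range(60):                 # sample for one minute
    freq = psutil.cpu_freq()        # may return None on some platforms
    if freq is not None:
        print(f"{time.strftime('%H:%M:%S')}  {freq.current:.0f} MHz")
    time.sleep(1)
```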
-
Do you think GTX 660M may have a castrated DDR3 version?
-
The 'hot clocks' scenario is idiotic for prolonged use given the heat/energy it consumes.
Either keep the clocks at a certain level where the heat production and energy consumption remains relatively minor or force the manufacturers to implement superior cooling technology so they can keep up with the hardware.
It's idiotic to keep things 'cheap' when they could spend a minute amount of extra cash on superior materials.
Someone needs to step on EVERYONE'S toes to get their rear ends out of the gutter.
This has been going on too long.
As for the 660M possibly having a 'castrated' DDR3 version... oh, it's entirely possible.
I think Asus managed to gimp the 555M and 560M because the 'regularly clocked' versions were too hot (for them). -
-
Great thread with interesting speculation and hordes of useful information, so thanks guys!
I usually agree with Cloudfire's guesses/prophecies, but I highly doubt there will be a GTX 665M Kepler variant of the kind many of us would desire. Instead we will have to wait till the GTX 7xx series (or mobile GCN) to find true next-gen replacements for our current GTX 5xx cards.
While the Kepler versions of the GT 640M, GT 650M, and GTX 660M are decent mid/low tier cards, they ultimately are just the same card with different memory/bus width/clocks, in the same vein as the 555M and its hordes of variations (it even propagated them all to its crappy rebadged self, the 635M..). Nonetheless it's extremely likely that there will be a gimped version of this card still called a "GTX" 660M even while being castrated (think Asus G53SX). -
If a 665M isn't made (which I also think is the likely outcome, though there's a slight possibility it will be), the 700 series has to arrive very soon, because if GK104 is the GTX 680M, then Nvidia would have to make room for the GK106, which is also arriving pretty soon on the desktop side as the GTX 650, 650 Ti and perhaps GTX 660.
Which means we would have a GK104 GTX 680M in July and a 700 series with GK106 out almost at the same time? -
It happened with the 500M series and the 485M, so I wouldn't be surprised.
-
It makes sense, however, to put the GK106 GPUs in the 700 series, because Nvidia already had to use up some of the 600 series names for their rebadges (curse you Nvidia).
-
Lol... if people are buying new systems with these GPUs... I think it's best they focus on the 640M and 650M (or just OC the 640M to 650M speeds, because I think they might be the same card with the only difference being clocks), and those who want the higher-end GPUs should hold off until Nvidia releases the 7 series, or just get the 5xx series GPUs for a lower price tag.
-
Meaker@Sager Company Representative
Nvidia is releasing their small Kepler first on the notebook front and middle Kepler on desktop.
Expect to see the desktop version of small Kepler fairly soon, and middle Kepler eventually released as the 680M.
Remember Nvidia only has two chips this round since big Kepler (GK100) was scrapped. -
Meaker@Sager Company Representative
Source:
http://www.anandtech.com/show/5699/nvidia-geforce-gtx-680-review/19
Slightly more power draw than an HD 7950, for more performance than the 7970. -
Kepler is like 4 times more power efficient than Fermi (which is what they originally claimed it would be back in 2009), which makes me wonder how good Maxwell will be
-
What is very funny is that the GTX 680 isn't that far behind the GTX 590 and yet draws 200W+ less.
And the GTX 680 is 30-40% better than the GTX 580 in games while using 60W less. And the 680 is 295mm² while the 580 is 520mm².
LOL
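Putting rough numbers on that (the ~35% figure is from above; the TDPs are the commonly quoted board specs, so treat this as back-of-envelope only):

```python
# Back-of-envelope perf-per-watt comparison using the figures above.
# TDPs are the commonly quoted board specs (244W / 195W), not measurements.
perf_580, tdp_580 = 1.00, 244   # GTX 580 as the baseline
perf_680, tdp_680 = 1.35, 195   # ~35% faster at roughly 50-60W less

ratio = (perf_680 / tdp_680) / (perf_580 / tdp_580)
print(f"GTX 680: ~{ratio:.2f}x the performance per watt")   # ~1.69x
```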
I can't wait to see how the real high end of Kepler will perform. The GK110 is the real high end and is about the same size as the 580 (550mm²), increasing the die size by 255mm² over the 680, stuffing in 2.5 billion more transistors than the 680, and increasing the memory bus from 256-bit to 512-bit. It ain't gonna be pretty.
LOL -
-
Being optimistic, I'd say GTX 780M
(which will probably be due in mid 2013)
-
And I noticed GT 630M and 620M are shrunk Fermi dies.
-
When 40nm wasn't ready for them back in March '09, Nvidia used rebranded 9000M cards to launch the 100M (low/mid range) and 200M (GTX cards) series at the same time. Then a few months later, in June '09, the 65nm GT 100M cards were replaced by 40nm GT 200M parts.
In Jan. '10, so as not to appear a generation behind AMD's HD 5000 series launch, those 200M cards got rebranded as the GT 300M series. A few of Nvidia's 40nm chip designs got scrubbed, and a top-end GTX 380M was never launched. The GTX 285M... an overclocked GTX 280M... lasted until the crap-Fermi based GTX 480M was "ready". -
TheBluePill Notebook Nobel Laureate
I think companies now realize the importance, and growth, of the platform. I am not shocked by Nvidia's half-rebrand, half-new approach, though I do find it odd they are producing the 675M (I think it will only be on the market for a short timeframe). Most companies relegate the rebadges to the lower ends of the line. I think that the buying public that does any research will only buy the new parts, and that, in turn, will mean the OEMs only stock and sell the 640/660/680M parts.
After this generation, I think the product lines will tighten up a bit more. -
TheBluePill Notebook Nobel Laureate
Now, I would love to see a workstation "Quadro Mobile" part based on the full desktop 680. It would probably have to fit in an 18" chassis and be a single-card solution... but it would be pretty killer. Never happen... but still... -
I really want a GTX 660M.
NVIDIA Announces the GeForce 600M series Enabling High-end gaming on Smaller Laptops
This is valuable information as it shows the benchmark details.
Looks like the GT 640M is for 720p gaming and the GTX 660M for 1080p gaming. I so want the GTX 660M; even if it is a 40-45W TDP part, it is worth getting for the extra grunt. I am sure I can play most of the newest games at 60fps at 1080p with one or two settings at medium-high-ish. -
Looks that way buddy, I think the GTX 660M will be a big success in mainstream/enthusiast gaming
-
Man... that chart looks wrong. The GT 650M looks faster in that same chart.
And several generations later... still with a mere 60-something GB/s of bandwidth? That's barely enough for high-resolution gaming. They should have gone with at least 70+. They basically have the same memory bandwidth as cards dating back to the 9800M GTX.
Mid-range got a nice bump in performance. The rest remained more or less the same. -
Speedy Gonzalez Xtreme Notebook Speeder!
If they can just put that 660M in a MacBook Pro and price it under $2k
just dreaming here
-
-
ryzeki
That chart is not wrong. The GT 640M runs the games at 1366x768 vs the GTX 660M running at 1920x1080. The GTX 660M is the card to get, and it's supposed to be a 40-45W card. My current 9600M GT is a 23W card, but I am sure a 40-45W card won't matter that much if you are future-proofing yourself, as it's so much more powerful.
My dream is to get a 3612QM (35W) and a GTX 660M (40-45W), so around 30W more than my laptop, which takes 70W when stressed while gaming. That's about 100W, which is less than a 2630QM and GT 555M. -
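Summing that up (the CPU/GPU numbers are the TDPs quoted above; the figure for the rest of the system is just a rough guess):

```python
# Quick power-budget sum using the TDPs mentioned above.
cpu_tdp = 35   # i7-3612QM
gpu_tdp = 45   # GTX 660M, top of the quoted 40-45W range
rest    = 20   # rough guess: screen, drives, RAM, motherboard
print(f"estimated draw under gaming load: ~{cpu_tdp + gpu_tdp + rest} W")
```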
Besides, my current 5930G runs games at 35-50fps on high: DiRT 3 at 1280x800, the FIFA 11 demo at 60fps, Batman: Arkham Asylum on high at 35fps, and Test Drive Unlimited 2 at 35fps on medium-high-ish settings. My CPU probably bottlenecks TDU2 a bit.
I reckon the GT 650M with GDDR5 is the bare minimum for gaming. The GTX 660M is perfect for most people, as it is 28nm Kepler and a 40-45W card.
22nm Ivy Bridge with its 22nm integrated graphics plus a 28nm Kepler GTX 660M is great for virtually everything. I won't mind turning shadows down to medium if I can play every game at 1080p at 60fps, with great battery life and a laptop that only takes 100W to game. That's perfect really: 1080p gaming on a portable 15.6" laptop that weighs 2.5kg will be great. I prefer a laptop not to be thin if the cooling is affected. Besides, with new driver updates, games will run better. -
Put it this way: if the GTX 660M comes with two memory bus configurations like the GTX 460M did when Fermi was introduced, you get these bandwidths:
1. 128-bit GDDR5: 128 bit / 8 = 16 bytes, 16 × 2000 MHz = 32,000, 32,000 × 2 = 64,000 MB/s = 64 GB/s
2. 192-bit GDDR5: 192 bit / 8 = 24 bytes, 24 × 2000 MHz = 48,000, 48,000 × 2 = 96,000 MB/s = 96 GB/s
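The same arithmetic as a quick Python sketch, for plugging in other configurations:

```python
def gddr5_bandwidth_gbs(bus_width_bits, memory_clock_mhz):
    """Peak bandwidth: (bus width in bytes) x clock x 2 (DDR), as above."""
    return bus_width_bits / 8 * memory_clock_mhz * 2 / 1000

print(gddr5_bandwidth_gbs(128, 2000))   # 64.0 GB/s
print(gddr5_bandwidth_gbs(192, 2000))   # 96.0 GB/s
```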
Comparisons:
GTX 675M/GTX 580M 256bit GDDR5: 96GB/s
GTX 660M 192bit GDDR5: 96GB/s
GTX 670M/570M 192bit GDDR5: 72GB/s
GTX 660M 128bit GDDR5: 64GB/s
GTX 560M 192bit GDDR5: 60GB/s
So yeah, I bet it would be easy for Nvidia to bump up the performance of the GTX 660M while waiting for the 700 series. And I bet that this 192-bit configuration would beat the 570M easily.
Now the question remains: would they make this? Is their naming scheme really so important that they keep the 660M in chains? I really couldn't give a rat's *ss about it -
-
Can we expect the GTX 660M in a 15.6", sub-6 lb laptop that's not very bulky? In other words, not a bulky gaming behemoth?
-
And the Asus G55, which also has a GTX 660M and is also a 15-incher.
Probably more but these are the ones I know of. -
Meaker@Sager Company Representative
I doubt there is a 192-bit bus on the chip.
It's a replacement for the 550 Ti: a 128-bit memory controller capable of faster speeds, giving the chip the bandwidth it needs.
GREAT news everyone. GTX 660M with a 192-bit bus confirmed!!
Cheers
Eurocom Adds NVIDIA GeForce GTX 675M, GTX 670M, GTX 660M and GT 650M To Its Lineup | techPowerUp -
Meaker@Sager Company Representative
Are you sure? It may be a typo:
http://www.geforce.com/hardware/notebook-gpus/geforce-gtx-660m/specifications -
Press release
If it is not a typo, we will be seeing the same memory bandwidth on the 660M as on the GTX 675M/580M. Crazy -
Don't worry, our 680M will be a 256-bit, 2GB GDDR5, at-least-768-CUDA-core beast; that will be enough
-
We will see. I just hope it's not a typo, since I want to see what it can do. I too am awaiting the 680M btw. I just want to keep myself occupied with Kepler reviews until June (it's gonna be 2 looooong months)