Nice and tidy chart, via NotebookCheck:
Explain to me why the 650M boosts higher than the 660M.
-
I think even Nvidia guys can't answer that.
-
So, what do you all think about the 660m's performance?
-
Officially Lean and Mean. NVIDIA...
-
aaaaaargh, just got home to my computer. Took a quick look through the pages. So much to read. Where to begin lol
-
No need to read LOL. Just use Anandtech, nicely summarised.
-
TheBluePill Notebook Nobel Laureate
-
A few questions/remarks:
1. I see the GT 650M is clocked 15MHz higher than the GTX 660M. Why? Since GT 650M and GTX 660M have the same core count, what could be separating them? Shader clocks perhaps?
2.
Is that disappointing?
But then we have this, which puts the 660M 15% ahead of the GTX 560M, while the 570M is 11% ahead of the 660M. BAAAAH, brain hurts
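Quick sanity check on those two percentages, just compounding the figures quoted above (a rough sketch, nothing official):

```python
# Compounding the two gaps quoted above to see if they are at least consistent:
# 660M ~15% ahead of the 560M, and 570M ~11% ahead of the 660M.
gtx560m = 1.00            # baseline
gtx660m = gtx560m * 1.15  # ~15% ahead of the 560M
gtx570m = gtx660m * 1.11  # ~11% ahead of the 660M

print(f"660M vs 560M: +{(gtx660m - 1) * 100:.0f}%")  # +15%
print(f"570M vs 560M: +{(gtx570m - 1) * 100:.0f}%")  # ~+28%
# So the ordering 560M < 660M < 570M is internally consistent; the numbers
# just come from different comparisons, which is what makes it feel confusing.
```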
-
Aha, thaaaanks. So perhaps the GT 650M is 336 @ 850MHz while GTX 660M is able to have all 384 @ 835MHz?!
Still god damn confusing. We need reviews
Are they able to put out reviews of the mobile GPUs now since GTX 680 reviews are already out?
EDIT: Nvidia homepage says GT 650M:
"CUDA cores 384"
"Graphics Clock (MHz): 850 MHz with DDR3/735MHz with GDDR5"
GREAT not confusing at all
GTX 660M:
"CUDA cores 384"
"Graphics Clock (MHz): 835 MHz"
EDIT #2: Not in the "zone" today. Cloudfire out -
It's the GT 555M situation again, except this time it can be any of three names: 640M, 650M, or 660M.
Save us from this madness, AMD. -
Well, the GT 650M can't be a GT 555M since the 650M seems to crush the 555M. Maybe the GT 640M? Yes, where the hell is AMD, Kevin?! I want to see what they can offer too.
The Samsung Q470 guy is again testing the laptop with the GT 650M DDR3, an i3-2350M (2.3 GHz, no turbo) and 4GB RAM.
This time it is with "High" settings instead of "Ultra" and the 4xMSAA disabled @ 1366x768
Compare these results with this.
These pictures were taken by wsshei110.
Source: Benyouhui -
What I mean by that is that with the 600M series we have a whole bunch of versions of the same card, with small differences, just like the GT 555M.
-
Ah, I see. Yes, agreed. They could at least let the 660M have more cores than the other two.
-
Looks like Nvidia for me this gen.
-
2 Kepler cores ~= 1 Fermi core (from notebookcheck though...)
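A rough sketch of what that rule of thumb means in numbers. The GTX 570M figures used here (336 cores at a ~1150 MHz shader hot clock) are the commonly quoted specs, not something stated in this thread, so treat them as an assumption:

```python
# Rough illustration of "2 Kepler cores ~= 1 Fermi core": Fermi shaders ran at
# a "hot clock" (~2x the core clock), while Kepler shaders run at the core
# clock, so per second one Fermi core does roughly the work of two Kepler cores.
# Assumed GTX 570M specs (336 cores @ 1150 MHz shader clock) are the commonly
# quoted ones, not taken from this thread.

def peak_gflops(cores, shader_clock_mhz):
    # cores * shader clock * 2 (a fused multiply-add counts as 2 FLOPs)
    return cores * shader_clock_mhz * 2 / 1000

gtx570m = peak_gflops(336, 1150)  # Fermi: shaders at the doubled hot clock
gtx660m = peak_gflops(384, 835)   # Kepler: shaders at the core clock

print(f"GTX 570M peak: ~{gtx570m:.0f} GFLOPS")  # ~773
print(f"GTX 660M peak: ~{gtx660m:.0f} GFLOPS")  # ~641
# Despite having more cores, the 660M's paper peak is lower, which is why
# paper specs alone can't settle the 660M vs 570M question.
```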
-
I am scared right now that there is no news from Nvidia (whatsoever) that there will be a 680M. I thought they were going to talk about it a little... What are the chances the yields at TSMC were so bad (for mobile) that they didn't beat Fermi with the new architecture (twice as many shaders clocked at core speed)?
-
So how much longer before I can actually order a laptop with a GTX 660M or GTX 670/675m? Looking towards xoticpc and clevo models. Maybe looking at Gentech as well. Anyone know the release dates?
-
TheBluePill Notebook Nobel Laureate
-
I bet that 670M is running hot as hell. But yeah, the 670M/675M are here to keep the suckers occupied while they work on the 680M.
It sucks because I would buy the 680M in a heartbeat. I just hope June is a confirmed month for it, so that we don't have to wait not knowing exactly when it is coming. I've been there, done that so many times, from games to hardware, and it sucks big time -
Not buying till the 680M comes out.
-
Anyone have a 675M vbios? -
Now all we have to do is wait for the Ivy Bridge announcement. Any speculation on when that's released? And when can I at least buy a 660M in a Clevo model?
-
Yeah I'd definitely choose GCN over Fermi, but not GCN over Kepler, and yes GCN will be faster than Fermi with lower power consumption....
Purchase, not counting build+ship time.
http://www.geforce.com/whats-new/articles/nvidia-geforce-gtx-680-R300-drivers-released?sf3581820=1/ -
So now that it is official, what is the speculation in the ultrabook world? I am waiting to see what AMD is doing with the Trinity platform and their version of the Ultrabook, but these Kepler gpus sure are tempting.
Why haven't we seen anything from AMD on the ultrabook front since Jan? -
Well, to be honest, we all expected this level of performance from the get-go. Only the highest end would see a big percentage increase, since they will introduce a new high-end card, but the rest should be relative performance increases over current products.
The 680M should be THE card, just as the HD 7990M or whatever the equivalent is should bring the big performance jump.
Other than that, I think we will see most product ranges around the same level.
Now I wonder if I should wait for the HD 7000 series or go ahead and get an HD 6990M...
Both Kepler and GCN are quite awesome right now, but I wonder when the high-end cards will actually be available to buy. I am interested to see if the high-end GPU from AMD will be a 7870 equivalent with lower clocks, since that seems to be their current best bang-for-buck card. -
Something fishy is happening right now with AMD. So weird to not hear a single word about anything from them. Not even some small details about the low specced GCNs. No leaks on the internet. Nothing. Nada. Zero
Although most of the rumors about Kepler were wrong, there was at least something to discuss. -
AMD, I think, hit the wall too hard atm. I can bet anything they don't have squat to compete against the 680 (desktop), and they are trying their best to come up with something better, anything
(that is why they are working their a$$ off to get 8000 out asap...)
-
You give way too much credit to the GTX 680 when the actual performance difference is nowhere near as great, haha.
AMD can compete with lower prices instead of rushing another revision.
-
Don't forget it's 256-bit memory and not 384-bit, but I believe you've exaggerated a bit on the power consumption and performance. For the 7970 the rated TDP is actually way higher than the real-world draw, and the 680's performance lead over the 7970 is also definitely not 20%... synthetic benchmarks like 3DMark Vantage are in Nvidia's favor.
-
I'm just hoping the 660M does beat the 570M in both efficiency and performance, as well as price.
Or am I just wishing too much? -
Guess we will have to see some reviews to find out. They can't be that far away now that the 680 has been reviewed.
-
Hehehehe I'm waiting for either the 660M or a 675M/580M price decrease
-
AlwaysSearching Notebook Evangelist
efficiency.
The 660M scores about 11.8k in 3DMark Vantage and the 570M about 13k, according to Nvidia. -
And memory capacity does not dictate performance... they could both be 1.5GB, and only at 2560 res with 16x AA/AF would the memory be a bottleneck.
It does not consume 55W less power. See the charts; it varies per game. The GTX 680 can draw up to 225W, since that is the maximum allowed by the PCIe slot plus the dual 6-pin connectors (75W + 75W + 75W); 195W is the average, 225W when the boost clock and the game allow it. The HD 7970 is rated at 250W but doesn't always consume that either, so while the GTX 680 does indeed perform better on average and consumes less, the difference is nowhere near as massive as you make it out to be.
Don't get me wrong, it is an amazing card, especially when you consider how much Kepler evolved from Fermi to higher performance and much less power. But it is not really that far beyond the HD 7970 series at all. Both are great and brought improvements. I am excited to see what they release next, especially with driver updates from both fronts.
In the end, when buying these super high-end cards, you end up overclocking, and that's when the performance difference between the two simply vanishes. -
You can wait for the newer cards, but if you are in a hurry to get a new machine, you can't go wrong with a 570m. -
AlwaysSearching Notebook Evangelist
of the time mine is on my lap.
They could have easily made the 660M perform better than the 570M, but for some reason gimped it.
As it is, IMO they had to bump the clock on the 670M (a rebadged 570M) a little so there is a little more separation between the models, or there is really no reason to have the 670M at all.
Nvidia must have a lot of older components sitting around that they need to clear out. -
Except Intel decided to be stupid and wait for the Sandy Bridge stock to sell out first, delaying Ivy Bridge by 3 weeks when they could have IB ready by tomorrow. >.>
I'm still waiting for some of the new cards, or for new reviews to come out, before I decide on the laptop I want. My MacBook is breaking down: frequent kernel panics and BSODs in Boot Camp >:OOOO -
GTX 660M is not entirely "gimped"
Memory Bandwidth:
GTX 570M/670M, 192-bit: 72.0 GB/s
GTX 660M, 128-bit: 64.0 GB/s
GTX 560M, 192-bit: 60.0 GB/s
The faster GDDR5 memory in the GTX 660M (2000 MHz) compared to the 560M (1250 MHz) makes up for the narrower memory bus.
Now, if the GTX 660M also came with a 192-bit bus like the 460M did, it would surpass the 570M (at least bandwidth-wise), because the 570M's memory runs at 1500 MHz while the 660M's runs at 2000 MHz.
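Quick check of those figures (a Python sketch; it assumes the quoted memory clocks are the double-data-rate figure, i.e. the effective transfer rate is twice the quoted clock, which is the convention the GB/s numbers above follow):

```python
# Checking the bandwidth figures above. These GB/s numbers follow the convention
# that the quoted memory clock is half the effective transfer rate (double data
# rate), so: bandwidth = (bus_width / 8 bytes) * quoted_clock * 2.

def bandwidth_gbs(bus_width_bits, quoted_clock_mhz):
    """Peak memory bandwidth in GB/s, with data rate = 2x the quoted clock."""
    return (bus_width_bits / 8) * (quoted_clock_mhz * 2) / 1000

print(f"GTX 570M/670M, 192-bit @ 1500 MHz: {bandwidth_gbs(192, 1500):.1f} GB/s")  # 72.0
print(f"GTX 660M,      128-bit @ 2000 MHz: {bandwidth_gbs(128, 2000):.1f} GB/s")  # 64.0
print(f"GTX 560M,      192-bit @ 1250 MHz: {bandwidth_gbs(192, 1250):.1f} GB/s")  # 60.0

# The hypothetical 192-bit 660M this post is hoping for:
print(f"192-bit 660M @ 2000 MHz (hypothetical): {bandwidth_gbs(192, 2000):.1f} GB/s")  # 96.0
```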
Here's hoping -
Not impossible I guess
-
Now I'm going to sit back and watch this thread die a painful death
Time to move on to IB and the new laptops which will come out with IB/Kepler
HURRAY: Nvidia 600 series not just Fermi!! (Kepler)
Discussion in 'Gaming (Software and Graphics Cards)' started by Cloudfire, Mar 2, 2012.