Nice and tidy chart, via NotebookCheck:
Explain to me why the 650M boosts higher than the 660M.
-
Maybe only the DDR3 GT 650M boosts to that high a clock.
-
I think even Nvidia guys can't answer that.
-
So, what do you all think about the 660m's performance?
-
Officially Lean and Mean. NVIDIA...
-
aaaaaargh, just got home to my computer. Took a quick look through the pages. So much to read. Where to begin lol
-
No need to read LOL. Just use Anandtech, nicely summarised.
-
TheBluePill Notebook Nobel Laureate
I agree. There are probably several versions of each. -
A few questions/remarks:
1. I see the GT 650M is clocked 15MHz higher than the GTX 660M. Why? Since GT 650M and GTX 660M have the same core count, what could be separating them? Shader clocks perhaps?
2. I compared the GTX 660M score with the scores @ Notebookcheck (Click to see GTX 560M). From what I can gather, the GTX 560M scores an average of P9200 while the GTX 660M will score P11500, which makes it 25% better than the GTX 560M. The GTX 570M scores P10800 on average (Click to see GTX 570M), so the GTX 660M is 6% faster.
Is that disappointing?
But then we have this, which puts the 660M 15% ahead of the GTX 560M while the 570M is 11% ahead of the 660M. BAAAAH, brain hurts
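For what it's worth, the percentages quoted above do follow from the raw P-scores; here's a quick sketch (Python, using only the average scores quoted in this post):

```python
# Relative-performance math from the quoted 3DMark Vantage P-score averages.
def pct_faster(score_a, score_b):
    """How much faster score_a is than score_b, in percent."""
    return (score_a / score_b - 1) * 100

gtx560m, gtx570m, gtx660m = 9200, 10800, 11500  # averages quoted above

print(f"660M vs 560M: {pct_faster(gtx660m, gtx560m):.0f}% faster")  # 25%
print(f"660M vs 570M: {pct_faster(gtx660m, gtx570m):.0f}% faster")  # 6%
```

Different review sites average different game/benchmark sets, which is why the other comparison lands on different percentages.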
-
-
Aha, thaaaanks. So perhaps the GT 650M is 336 cores @ 850MHz while the GTX 660M is able to have all 384 @ 835MHz?!
Still god damn confusing. We need reviews
Are they able to put out reviews of the mobile GPUs now since GTX 680 reviews are already out?
EDIT: Nvidia homepage says GT 650M:
"CUDA cores 384"
"Graphics Clock (MHz): 850 MHz with DDR3/735MHz with GDDR5"
GREAT not confusing at all
GTX 660M:
"CUDA cores 384"
"Graphics Clock (MHz): 835 MHz"
EDIT #2: Not in the "zone" today. Cloudfire out
-
It's the GT 555M situation again, except this time it can be any of three names: 640M, 650M, or 660M.
Save us from this madness, AMD. -
Well, the GT 650M can't be a GT 555M, since the 650M seems to crush the 555M. Maybe the GT 640M? Yes, where the hell is AMD, Kevin?! I want to see what they can offer too.
Samsung Q470 guy again testing the laptop with GT 650M DDR3, i3 2350M (2.3 GHz without turbo) and 4GB RAM
This time it is with "High" settings instead of "Ultra", and with 4xMSAA disabled @ 1366x768
Compare these results with this.
These pictures are taken by wsshei110.
Source: Benyouhui -
What I mean by that is that with this 600M series we have a whole bunch of versions of the same card, with small differences, just like the GT 555M.
-
Ah, I see. Yes, agreed. They could at least let the 660M have more cores than the other two.
-
Looks like Nvidia for me this gen.
-
2 Kepler cores ~= 1 Fermi core (from notebookcheck though...)
-
Confirmed? Also only around 65-70W?
-
I am scared right now that there is no news from Nvidia (whatsoever) about a 680M. I thought they were going to talk about it a little... What are the chances the yields at TSMC were so bad (FOR MOBILE) that they didn't beat Fermi with the new architecture (twice as many shaders clocked at core speed)?
-
So how much longer before I can actually order a laptop with a GTX 660M or GTX 670M/675M? Looking towards XoticPC and Clevo models, maybe Gentech as well. Anyone know the release dates?
-
TheBluePill Notebook Nobel Laureate
They don't need a 680M yet, as the 670M will overtake the 6990M in performance. -
I bet that 670M is running hot as hell. But yeah, the 670M/675M are here to keep the suckers occupied while they work on the 680M.
It sucks because I would buy the 680M in a heartbeat. I just hope June is a confirmed month for it, so that we don't have to wait not knowing exactly when it is coming. I've been there, done that so many times, from games to hardware, and it sucks bigtime -
Not buying till 680m comes out
.
-
No it won't. The 670M is only 192-bit :/ You mean the 675M... which is almost the same.
Anyone have a 675M vBIOS? -
Not only the 670M; even the 675M is gonna have a hard time crushing the 6990M, lol. Here are some benches of the 675M, barely 3-5% faster than the 6990M in Vantage GPU and '11. Not to mention that it will probably cost $800. The 7970M should be 15-20% faster, IMHO.
-
Now all we have to do is wait for the Ivy Bridge announcement. Any speculation on when that's released? And when can I at least buy a 660M in a Clevo model?
-
In 2-3 weeks
-
Pre-order or receive xD
-
Yeah, I'd definitely choose GCN over Fermi, but not GCN over Kepler. And yes, GCN will be faster than Fermi with lower power consumption...
Purchase, not counting build+ship time.
http://www.geforce.com/whats-new/articles/nvidia-geforce-gtx-680-R300-drivers-released?sf3581820=1/ -
So now that it is official, what is the speculation in the ultrabook world? I am waiting to see what AMD is doing with the Trinity platform and their version of the Ultrabook, but these Kepler gpus sure are tempting.
Why haven't we seen anything from AMD on the ultrabook front since Jan? -
Well, to be honest, we all expected this level of performance from the get-go. Only the highest end would see a big % increase, since they will introduce a new high-end card, while the rest should be relative performance increases over current products.
the 680m should be THE card, just as the HD7990m or whatever equivalent should bring the big performance.
Other than that, I think we will see most product ranges around the same level.
Now I wonder if I should wait for the HD 7000 series or go ahead and get an HD 6990M...
Both Kepler and GCN are quite awesome right now but I wonder when they will be available to obtain (high end cards). I am interested to see if the high end GPU from AMD will be a 7870 equivalent with lower clocks, since it seems their current best bang for buck card. -
Something fishy is happening right now with AMD. So weird to not hear a single word about anything from them. Not even some small details about the low specced GCNs. No leaks on the internet. Nothing. Nada. Zero
Although most of the rumors about Kepler were wrong, there was at least something to discuss. -
AMD hit the wall too hard atm, I think. I can bet anything they don't have squat to compete against the 680 (desktop), and they are trying their best to come up with something better, anything
(that is why they are working their a$$ off to get the 8000 series out asap...)
-
Haha, the HD 7970 doesn't trail that far behind, and when both OC they are on the same level of performance.
You give way too much credit to the GTX 680 when the actual performance difference is nowhere near as great.
AMD can compete with a lower price instead of rushing another revision.
Yeah, it's weird... no info on ANYTHING. It sucks. -
Bro! Are you kidding? 20% better in 3DMark 11, 55W lower power consumption, 2GB VRAM as opposed to 3GB, and you're calling this barely better??? Look at some BF3 benchmarks, the 680 bashes the 7970 squarely in the face...
-
Don't forget it's 256-bit memory and not 384-bit. But I believe you've exaggerated a bit on the power consumption and performance; for the 7970 the TDP is actually way higher than the real-world result, and the 680's performance over the 7970 is also definitely not 20%... synthetic benchmarks like 3DMark Vantage are in Nvidia's favor.
-
This
I'm just hoping the 660M does beat the 570M in both efficiency and performance, as well as price.
Or am I just wishing too much? -
Guess we will have to see some reviews to find out. They can't be that far away now that the 680 has been reviewed
-
Hehehehe I'm waiting for either the 660M or a 675M/580M price decrease
-
I'm pretty sure a bit of OC will do just that easily, don't worry. The 570M is actually quite powerful, but since the stock clocks are low it doesn't show its strength from the get-go.
-
AlwaysSearching Notebook Evangelist
The 660M will beat a 560M but not a 570M in performance. It will beat a 570M in efficiency.
The 660M scores about 11.8k in 3DMark Vantage and the 570M about 13k, per Nvidia. -
Haha, synthetics are amazing, but sadly gamewise it's not the same story. Sure it runs Battlefield 3 exceptionally well, but then again you can also point to Metro 2033, where the HD 7970 beats it.
And memory size does not dictate performance... they could both be 1.5GB, and only at 2560 res with 16x AA/AF would the memory be a bottleneck.
It does not consume 55W less power; see the charts, it varies per game. The GTX 680 consumes up to 225W, since that's the maximum allowed by the dual 6-pin connectors. 195W is the average, 225W when turbo boost and the game allow it. The HD 7970 is rated at 250W but doesn't always consume that either, so while the GTX 680 does indeed perform better on average and consumes less, the difference is nowhere near as massive as you make it out to be.
Don't get me wrong, it is an amazing card, especially when you consider how much Kepler evolved from Fermi towards higher performance and much lower power. But it is not really that far beyond the HD 7970 series at all. Both are great and brought improvements. I am excited to see what they release next, especially with driver updates on both fronts.
In the end, when buying these super high-end cards, you end up OCing, and that's when the performance difference between the two simply vanishes. -
Yey, that makes me wonder whether I should get a 570M or a 660M. I'm usually a stay-at-a-desk, plugged-in person.
-
The 570M is a great card. It is also very overclockable, so it should easily outperform the 660M, no problem, especially when overclocked. Sure it's nice to have better efficiency, but in the end, if you will be plugged in all the time, it won't matter.
You can wait for the newer cards, but if you are in a hurry to get a new machine, you can't go wrong with a 570M. -
AlwaysSearching Notebook Evangelist
I am plugged in too but wouldn't mind a more efficient GPU, especially since 50% of the time mine is on my lap.
They could have easily made the 660M perform better than the 570M but for some reason gimped it.
As it is, IMO they had to bump the clock on the 670M (570M) a little so there is a little more separation between the models, or there would be no reason to have the 670M at all.
Nvidia must have a lot of older components sitting around that they need to clear out. -
Sounds like Intel, with their Sandy Bridge stock.
Except Intel decided to be stupid to us and wait for the Sandy Bridge stock to sell out first, delaying Ivy Bridge by 3 weeks when they could have IB ready by tomorrow. >.>
I'm still waiting for some of the new cards, or for new reviews to come out, before I decide on the laptop I want. My MacBook is breaking down: frequent kernel panics and BSODs in Boot Camp >:OOOO -
GTX 660M is not entirely "gimped"
Memory bandwidth:
GTX 570M/670M, 192-bit: 72.0 GB/s
GTX 660M, 128-bit: 64.0 GB/s
GTX 560M, 192-bit: 60.0 GB/s
The faster GDDR5 memory in the GTX 660M (2000MHz) compared to the 560M (1250MHz) makes up for the smaller memory bus.
Now if the GTX 660M also came with a 192-bit bus like the 460M did, it would surpass the 570M (at least bandwidth-wise), because the 570M memory is 1500MHz while the 660M is 2000MHz.
Here's hoping
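Those GB/s figures fall straight out of bus width times data rate; here's a quick sketch (Python, clocks and bus widths taken from this post, assuming the listed clocks are doubled for the effective data rate):

```python
# Rough memory-bandwidth math behind the numbers above.
# GDDR5 as listed here moves two transfers per listed clock,
# so GB/s = (bus_bits / 8 bytes) * clock_MHz * 2 / 1000.
def bandwidth_gbs(bus_bits, mem_clock_mhz):
    return bus_bits / 8 * mem_clock_mhz * 2 / 1000

cards = {
    "GTX 570M/670M (192-bit @ 1500MHz)": (192, 1500),
    "GTX 660M      (128-bit @ 2000MHz)": (128, 2000),
    "GTX 560M      (192-bit @ 1250MHz)": (192, 1250),
    "660M if it were 192-bit @ 2000MHz": (192, 2000),
}
for name, (bus, clk) in cards.items():
    print(f"{name}: {bandwidth_gbs(bus, clk):.1f} GB/s")
```

A hypothetical 192-bit 660M would land at 96 GB/s, well past the 570M's 72 GB/s, which is why the narrower bus feels like a deliberate cut.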
-
I wish the 660M was 192-bit. Perhaps Nvidia did this on purpose so the numbers in the lineup made sense and the 660M wouldn't surpass the 570M and the 670M.
-
Not impossible I guess
-
Now I'm going to sit back and watch this thread die a painful death
Time to move on to IB and the new laptops which will come out with IB/Kepler
HURRAY: Nvidia 600 series not just Fermi!! (Kepler)
Discussion in 'Gaming (Software and Graphics Cards)' started by Cloudfire, Mar 2, 2012.