tell me about it!
-
Meaker@Sager Company Representative
-
(But I have no idea, honestly. They managed to trim 70 W from the 560 Ti at the expense of no CUDA cores, only the memory interface, core clock, and memory clock; that definitely involves a ton of electrical engineering I don't know squat about.)
Also, half a GTX 680 is roughly equal to a GTX 580. Can you imagine GTX 580 performance in a laptop?
-
Meaker@Sager Company Representative
No, the 680 is not twice as fast as the 580.
It will most likely have clock/voltage reductions since its TDP is not too bad already. -
The GTX 660M is a 40-45 W, 128-bit card. I have seen nowhere that says it's 192-bit.
-
Meaker, the GTX 680M should be 2x performance per watt if it's a Kepler.
-
Meaker@Sager Company Representative
I think the post was edited, but since Kepler no longer has a shader clock and two shaders are needed to do the same work as before, it does not really work that way.
Also, at no point was performance per watt mentioned, nissangtr. -
Articles : GeForce 600M Notebooks: Powerful and Efficient - GeForce
This shows it is 2x the performance per watt of the 500M series.
The GT 525M scores 912 in 3DMark 11 vs 1830; both draw virtually identical power. Maybe 2x the performance per watt in DX11 games too.
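A quick sanity check on the 2x claim from the scores quoted above (the shared power figure below is a placeholder, not a measured TDP; it cancels out of the ratio):

```python
# Rough check of the "2x performance per watt" claim using the
# 3DMark 11 scores quoted above (912 vs 1830), with power draw
# assumed identical for both cards, as stated.

fermi_score = 912     # GT 525M, 3DMark 11
kepler_score = 1830   # 600M-series counterpart, 3DMark 11
power_w = 30          # hypothetical common power draw; cancels out

fermi_perf_per_watt = fermi_score / power_w
kepler_perf_per_watt = kepler_score / power_w

ratio = kepler_perf_per_watt / fermi_perf_per_watt
print(f"perf/W ratio: {ratio:.2f}x")  # 2.01x
```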
Review Fujitsu Lifebook AH531 Notebook - Notebookcheck.net Reviews
Review Acer Aspire Timeline Ultra M3-581TG Ultrabook - Notebookcheck.net Reviews -
But don't be delusional about its performance. Half of those games are not run at maximum detail at all. Only Deus Ex and Call of Duty are confirmed maxed out; Batman has two higher presets, Battlefield one more, etc. And read the graph carefully: you are talking about a 30-40 fps experience at those settings, at best. Only DiRT 3 managed higher performance. From NVIDIA's own graph, the 570M is around 20% faster at stock.
I imagine this is because 384 CUDA cores running at around half the shader speed of current Fermi is basically equivalent to the 192 cores of the GTX 460/560M, which is why performance ends up similar. So the 680M should run at least 768 cores at lower power, and should bring a nice speed bump.
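The core-count argument can be sketched with rough numbers; the clocks below are illustrative assumptions, not confirmed specs:

```python
# Relative shader throughput ~ cores * shader clock.
# Fermi ran shaders at a 2x "hot clock"; Kepler dropped it,
# so twice the cores at roughly half the clock nets out similar.

fermi_cores, fermi_shader_mhz = 192, 1550    # GTX 560M-class hot clock (illustrative)
kepler_cores, kepler_shader_mhz = 384, 835   # GT 650M-class core clock (illustrative)

fermi_throughput = fermi_cores * fermi_shader_mhz
kepler_throughput = kepler_cores * kepler_shader_mhz

ratio = kepler_throughput / fermi_throughput
print(f"Kepler/Fermi throughput: {ratio:.2f}x")  # ~1.08x, i.e. roughly on par
```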
In the end, it's good to have a low-power card that fits in a smaller package. I am not against such progress. I am just pointing out that the performance isn't enough for 60 fps at 1080p on high settings. I am talking only about performance; in my case, I am interested in higher-performing products, not the same performance in a lower power envelope (for my machine).
That's why I am interested in the higher-end versions of Kepler (680M) and AMD's GCN equivalent. I want to see what a 70-100 W TDP can get me now. -
Oh Cloudfire, sorry to burst your bubble, but NVIDIA has been known to use interleaved memory configurations on the memory bus, something like 'flex' interleaving. So you can't simply judge the bus width from the memory size nowadays. Hope the source below is informative enough:
AnandTech - NVIDIA's GeForce GTX 550 Ti: Coming Up Short At $150
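For context, the 'flex' scheme described in the linked GTX 550 Ti article mixes memory chip densities across the 32-bit channels, so total capacity no longer pins down bus width. A minimal sketch of that card's commonly reported configuration:

```python
# GTX 550 Ti: a 192-bit bus is six 32-bit channels, yet the card
# ships 1 GB total, which is impossible with six equal-density chips.
# NVIDIA mixed densities instead: four 128 MiB chips + two 256 MiB chips.

channels_mib = [128, 128, 128, 128, 256, 256]  # one chip per 32-bit channel
bus_width_bits = 32 * len(channels_mib)
total_mib = sum(channels_mib)

print(f"{bus_width_bits}-bit bus, {total_mib} MiB total")  # 192-bit bus, 1024 MiB total
```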
And the chart quoted is for the GTX 560M, not 660M.
-
Meaker@Sager Company Representative
-
And I posted the chart to give people an idea of what to expect going from 128-bit to 192-bit, which I thought was pretty clear. Since Kepler has turbo boost, perhaps it will greatly benefit from pushing those precious turbo gigabytes through a wider memory bus. Or perhaps the 1.5GB was a typo and 128-bit really is enough for GK107. I don't know. -
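As a rough yardstick for the 128-bit vs 192-bit question, peak memory bandwidth scales linearly with bus width at a given data rate (the GDDR5 rate below is an illustrative assumption, not a spec for any particular card):

```python
# Peak memory bandwidth (GB/s) = bus width (bits) / 8 * effective data rate (GT/s)
def bandwidth_gbs(bus_bits: int, effective_gtps: float) -> float:
    return bus_bits / 8 * effective_gtps

gddr5_rate = 4.0  # GT/s, illustrative GDDR5 effective rate

print(f"128-bit: {bandwidth_gbs(128, gddr5_rate):.0f} GB/s")  # 64 GB/s
print(f"192-bit: {bandwidth_gbs(192, gddr5_rate):.0f} GB/s")  # 96 GB/s
```

So at the same memory clock, the 192-bit configuration would carry 50% more bandwidth, which is why the question matters for a bandwidth-hungry part.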
-
-
Put some Clevo W110ER benches with GT650M DDR3 2GB up:
http://forum.notebookreview.com/sag...ries-w110er-w150er-w170er-21.html#post8406849 -
-
I would like to see if it stacks up to the 560M in real-world performance. I'm probably going to get a 660M rather than an $800 GPU, game at 720p, and maybe put the extra money into a desktop.
-
TheBluePill Notebook Nobel Laureate
-
10 char -
-
I am pretty sure we will see benchmarks of the 660M very soon. Pictures of the internals of the Lenovo Y580 have been posted in various forums recently.
-
Meaker@Sager Company Representative
I wonder what the real frequencies were.
It will have to clock very well to match the 570. -
I wish there was a 192-bit variant, that would be insane performance...
-
Ouch, NVIDIA, is that what you are capable of?
What's New - GeForce -
Meaker@Sager Company Representative
I don't get it, what are you saying about that link?
-
"If you ask us, the GeForce GTX 680 is our best graphics card ever"
I felt a bit sour that NVIDIA did not release its best chip, the GK110, for desktop. -
TheBluePill Notebook Nobel Laureate
Articles : Introducing The GeForce GTX 680 GPU - GeForce -
-
Meaker@Sager Company Representative
There are a couple of realistic reasons why the top end chip was not launching IMO:
1) The chip is too defective and yields are not where they need to be.
2) The chip is not clocking where it should.
3) The GK104 was so close in performance it made no sense to release it.
I think it's 1 or 2.
Also Long, because they are not really that far ahead, maybe a few %, but if you had a higher-end chip you could slap it on a £600 card, claim victory, and rake in the cash. -
-
GK110 was not done because:
A) A 256-bit mid-range GPU could beat a 384-bit high-end AMD GPU, and therefore
B) They didn't have to rush and build a monstrous GPU, and
C) They wanted to show what a small Kepler could do in terms of efficiency, and therefore didn't go overboard with heat, consumption, etc. Meanwhile,
D) AMD is working on the 8000 series and the dual-GPU 7990, which is what the GK110 should be aiming for, and which is also why NVIDIA waits.
But yeah, your points could also be valid, Meaker. After all, NVIDIA seems desperate now that the CEO has reached out to Intel asking whether they could use Intel's fabs due to low yields.
And I agree with lordbaldric. The 680 is weak at things other than gaming; the 7970 is miles ahead. -
The true Big Kepler, GK110, is NVIDIA's 7-billion-transistor monster. It will never make it into laptops, not this generation at 28 nm by any means.
Kepler Architecture to Support 7-Billion Transistor Chip? - HardwareZone.com -
Yeah, I don't see that hot thing going inside a laptop
-
Watercooled laptop, anyone?
-
Meaker@Sager Company Representative
-
-
-
TheBluePill Notebook Nobel Laureate
The first photo of the inside of the Y580 and the new processor was cool;
-
Hahaha, that made my day. :laugh:
Too bad I can't give you rep (will later):
"You must spread some Reputation around before giving it to TheBluePill again." -
Meaker@Sager Company Representative
If you like that, then the eve cluster runs 10,000 of those in parallel to generate the computational power needed.
-
TheBluePill Notebook Nobel Laureate
-
Cute thing dude
-
I'm wondering: given great cooling, would a GT 650M outperform a GTX 660M?
-
-
Now I wonder if overvolting is possible on Kepler.
When overclocking, would the card still have the boost? -
Meaker@Sager Company Representative
You can disable boost, but if you don't, then yes, it will still try to boost itself. Since the 650M (7xx MHz) and 660M (8xx MHz) will have different core voltages, there may be some wiggle room for 650M users at least.
-
Yeah, a repeat of GF108: the GT 525M through GT 550M were all the same chip... One could easily OC the 525M in the Dell XPS to 540M clocks.
-
Are there any Chinese overclocking results for the 650M/660M, since Meaker has been sharing sources with us?
-
Here is another 660M with what looks like a 192-bit configuration. Weird.
-
You can tack 3GB onto a 128-bit bus though...
If it is 192-bit, that card would sit between the 670M and 675M, which makes me question why the 670M would exist at all. So I can only imagine that NVIDIA wouldn't do something THAT confusing to the market.
NVIDIA Geforce GTX 660M Release Information + CUDA Core Count
Discussion in 'Gaming (Software and Graphics Cards)' started by yknyong1, Mar 22, 2012.