Looks like 670M will be 10-20% faster than 570M.
NVIDIA GeForce GTX 670M - Notebookcheck.net Tech
NVIDIA GeForce GTX 570M - Notebookcheck.net Tech
-
No way that is going to be faster by 10-20%. The core clock increase is a mere 23MHz.
-
Shader/Memory clocks are also higher
670M
Core Speed: 598 MHz
Shader Speed: 1196 MHz
Memory Speed: 1500 MHz
570M
Core Speed: 575 MHz
Shader Speed: 1070 MHz
Memory Speed: 1150 MHz
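For what it's worth, the clock figures quoted above work out like this (a rough sketch only; real-world gains depend on the workload, not just clocks):

```python
# Percentage clock increases of the GTX 670M over the GTX 570M,
# using the spec figures quoted above (in MHz).
specs_670m = {"core": 598, "shader": 1196, "memory": 1500}
specs_570m = {"core": 575, "shader": 1070, "memory": 1150}

for clock in specs_670m:
    gain = (specs_670m[clock] / specs_570m[clock] - 1) * 100
    print(f"{clock}: +{gain:.1f}%")
# core: +4.0%, shader: +11.8%, memory: +30.4%
```

So the core bump really is small, and most of the headroom is in the memory clock.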
Check benchmarks and games, but it's still ~5-10% below 6970M. -
Why anyone would get a 40nm 670M over a 28nm Kepler 660M, with 2x the performance per watt and ultra-thin designs, is anyone's guess.
Now this will be amazing, weighing only 6.5 kg.
Eurocom squeezes four GPUs into Panther 5.0 - Tans Computer News -
That's because shader speed is 2x the core speed; they follow each other.
Edit: No, the 570M does not hum weird..
Edit #2: Yes they do. The specs you quoted are wrong.
-
Cough... April Fools joke... cough... sorry.
-
Dude, you have been April fooled..
ahaha
-
Quad 675Ms would have been hardcore lol
-
Don't blame copy/paste, blame NVIDIA GeForce GTX 570M - Notebookcheck.net Tech.
-
All these stories were on the 2nd of April though.
Eurocom Corporation - Number 1 in Desktop Replacement Notebook Technology
Looks like it is a trick, although I thought the article was fantasy in terms of this part:
thus ensuring Eurocom notebooks will support the next generation of 130 Watt high performance notebook GPUs coming out in the near future.
Anyway, no way will a laptop have the cooling to do it.
It would still be amazing if that happens in 2020 with stacked CPUs, GPUs, etc. -
IT'S FINALLY HERE GUYS. THE DAY A LOT OF US WERE WAITING FOR.
INTRODUCING THE GTX 680M
680M stock
680M overclocked (assuming I read the translation correctly)
Source: http://dell.benyouhui.it168.com/thread-2046420-1-1.html -
Engineering samples from Alienware, technically taken out of the factory illegally, according to the Chinese poster.
-
Here is a comparison with the GTX 580M and 6990M
The GTX 680M scores a little over 30% more than the 580M in 3DMark11. If it's a 75W TDP (like this early chart suggests), then it's a significant win for Kepler.
GTX 680M
GTX 580M
6990M
Source: http://www.pcper.com/reviews/Graphics-Cards/Mobile-GPU-Comparison-GeForce-GTX-580M-and-Radeon-HD-6990M/3DMark11-Quick-Bat -
I WANT ONE!!!
so nice, +rep Cloudfire, man you are like our newspaper/CNN
-
TheBluePill Notebook Nobel Laureate
Now that is the improvement i was hoping to see!
Now for the 7970Ms.. -
Pretty cool. Anyone know why desktop GPUs score so much higher than the mobile versions? The 560 Ti scores almost double over the 680M...
-
You might wanna recheck your numbers. The GTX 680M is faster than a stock 560 Ti, which is awesome.
-
TheBluePill Notebook Nobel Laureate
Completely different parts. Mobile parts have very strict thermal and power requirements. -
Yeah, looks very impressive indeed. I think Nvidia is aiming for SLI power in a single chip, which is why they plan on releasing the GTX 680M with 4GB of GDDR5 RAM.
Thanks man. Trying my best to keep this forum updated.
680M looking good, ey? I am SO buying this GPU when summer comes. Not a chance in hell I'm buying the 670M/675M. Poor people who do that without knowing what hits the market in 2-3 months.
-
Lol, I know......... but more like these scores are completely skewed, and 3DMark is measuring something that doesn't even matter in modern gaming... the 580M is definitely not less than half the performance of the 560 Ti in games.
Edit: Oh damn, just saw the score chart Cloudfire posted... apparently http://www.3dmark.com/search scores are way off. -
Oh man, that's a nice bump, better than I expected. It can't come out soon enough! I'm weighing between getting a whole new R4 or just upgrading the GPU.
Anyway, thanks for the news, and rep'd.
-
That is like, "Ouch!" ...
-
Maybe already asked (couldn't find any posts on this...), but will the 680M use the same MXM standard as 485m, 580m, 6990m etc. i.e. could I plop two 680ms in my m18x?
-
That's what I read from the Clevo forums: the new cards are totally compatible with our mobo/BIOS.
-
How about the MSI?
Sent from my Galaxy Nexus using Tapatalk -
MSI too as they also use MXM 3.0b
-
It will have the price tag to justify an "ouch" you can bet on that!
-
Good to hear I can upgrade it later. Just bought a setup with a 580M.
Sent from my Galaxy Nexus using Tapatalk -
Looks at 480, 485 and 580...
-
That 4 GB 680M is going to cost at least $3k :O
-
When I say expensive, I don't mean to that extent...
Is MXM 3.1 OK with MXM 3.0 cards? -
You have to be kind of insane to pay that much, plus I'm sure that AMD will come up with something similar for less.
-
Hasn't some reseller in the Clevo forum already confirmed that?
-
They talk of a rumored GTX 685M, which they couldn't confirm, as the GTX 680M with the GK106 core is already reaching 100W.
-
Man, that is going to be almost as powerful as a GTX 560 Ti [DESKTOP version].
That's pretty darn impressive.
Now I hope my GTX 670M will be good enough for my games.
I feel that it will though; if not, maybe I'll spring for the GTX 675M...
Depending on how much I work this next month haha. -
Well, as soon as the 680M is here, I am on board.
The 685M will be followed by the 780M within less than a year..
-
TheBluePill Notebook Nobel Laureate
Advancement never bothers me. I'm all for faster and quicker. -
Totally. I still wish I had bought an NP8150 with a 6970M back in April; I think it was a really good buy (even though the 6990M followed a mere 4 months later). Instead I bought a darned G53SW at the end of March to play Crysis 2 (what a fool..).
-
I'm all for faster, smaller, quicker, cooler and cheaper.
On a second note: the source says the power consumption of the GTX 680M touched 100W sometimes. I wonder what the TDP of this card is going to turn out to be. That is what is most interesting for me: heat. Not battery life, since Optimus will take care of that for me.
Now what I'm curious about is whether power consumption and TDP go hand in hand. Are they linear? Meaning, if power consumption increases 5x, does the TDP have to do that too? Especially between two architectures like Kepler and Fermi. Could Kepler be more efficient than Fermi at using all the watts, and not output as much heat...? -
Let's hope so. If it touches 100W, that is a bit too much, considering desktops touch 150W and that's not even their full TDP..
-
TheBluePill Notebook Nobel Laureate
You bought a Machine to play Crysis 2.. i mean.. i could see for Mass Effect 3 (what i did).. but Crysis 2????
(I'm playin'...)
-
What's with these Chinese dudes never showing the GPU scores? No one cares about the P score, because it's meaningless.
With notebook GPUs, the AMD- or Nvidia-provided TDP is a measurement of the max power consumption. -
So TDP = power consumption? That's disappointing. I thought they improved TDP with newer architectures, and made each one more power efficient and cooler than the last. A 100W draw meant 100W TDP in generation 1; a 100W draw meant 90W TDP in generation 2 due to more efficient use of the power. It could mean we have a 100W TDP 680M then, if the information from the source is correct. And here I thought we could have a powerful 75W GPU, ugh. I hope the source is wrong.
Why is it meaningless though? It's a pretty good indicator of how well the GPU performs in general, though you don't see exact details about GPU performance in the different test types. But yeah, the GPU score would have been better, like you say.
The GTX 680M was tested with a 3720QM. Find another test with a 580M and a CPU that is equally powerful, and you know how much better the GPU is at 720p with DirectX 11, since the only thing that separates the systems is the GPU.
The 3720QM should be equal to a 2760QM.
Here is the 580M coupled with the 2860QM, a faster CPU. The 680M scores 31% more.
Mobile GPU Comparison: GeForce GTX 580M and Radeon HD 6990M | PC Perspective -
I always thought TDP is the maximum amount of heat that the notebook is required to dissipate.
-
It is, which is why I think it's weird that newer architectures don't improve the heat output without cutting down the power consumption. Maybe I'm just wishing too much.
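A minimal sketch of the point being debated here (the scores below are hypothetical, purely for illustration): virtually all electrical power a GPU draws ends up as heat, so TDP tracks max power draw, and an architecture's efficiency gains show up as more performance per watt rather than less heat per watt.

```python
# Virtually all power a GPU draws ends up as heat, so the heat the
# cooling system must remove (TDP) tracks the max power draw.
# Efficiency improvements show up as performance per watt instead.
# The scores below are hypothetical, for illustration only.
def perf_per_watt(score: float, watts: float) -> float:
    """Benchmark points delivered per watt drawn."""
    return score / watts

older = perf_per_watt(score=2600, watts=100)  # hypothetical Fermi-class
newer = perf_per_watt(score=3400, watts=100)  # hypothetical Kepler-class

# Same 100 W of heat either way; the newer chip just does more with it.
print(f"Older: {older:.0f} pts/W, Newer: {newer:.0f} pts/W")
```

In other words, a 100W Kepler wouldn't run cooler than a 100W Fermi; it would just be faster at the same heat output.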
-
Interesting... Might not be fake, because the Chinese manufacturers are the ones designing the 28nm process. All Nvidia and AMD are doing is designing their GPU cores to fit a 28nm standard.
-
They realise people will still buy a notebook if it has a 100W GPU, so it doesn't make sense not to make one.
HURRAY: Nvidia 600 series not just Fermi!! (Kepler)
Discussion in 'Gaming (Software and Graphics Cards)' started by Cloudfire, Mar 2, 2012.