Wow.. high TDP.. but, all of their desktop cards have been power pigs.
-
TheBluePill Notebook Nobel Laureate
-
Let's not get too pessimistic all of a sudden; I don't think Nvidia was bluffing when they commented on the 7970. I am sure there must be something to Kepler..
-
So basically, Nvidia's plan all along was to move the existing mobile cards over to the 28nm process, and that's what they've done.
That explains why the chips aren't 100W anymore, and the reason companies like Asus and Samsung have jumped on board.
But Kepler and the desktop chips are a completely different matter. -
IF that is all they do, then screw Nvidia. I mean, who the hell cares about reducing the 580M down to, let's say, 70W, when they could have increased frequency or put in more cores and stayed at 100W. At least do *something* other than rebadging
This better not be true, because if it is, and Nvidia first does a die shrink, waits a few months to work on Kepler mobile GPUs, and only comes out with a working Kepler in August or whenever, I'll have bought an AMD GPU a loooong time before then -
It's the GTX 200M series all over again. The desktop users' 600 series is next-gen architecture while our 600M is a die shrink of the previous generation's scraps.
We probably won't get Kepler until the 700M series rolls around.
Look at us poor mobile users, all back of the bus and ****. -
Wow, I hope this isn't true. If it is, the 635M is already a 555M. What will the 650, 660, 670, and so forth be? They have to be actual upgrades and not rebadges on 28nm as they go up, right? If not, they will run out of old cards to redo on 28nm before reaching too high.
-
Guys, come on, this is all too pessimistic all of a sudden. Maybe this is some AMD fans attacking Nvidia's fanbase? I am sure there will be a 15-20% increase in performance; all the news until now has pointed that way (OK, this 3DMark06 run of the 670M shows a 5-10% increase over the 570M, but I am not sure if it is entirely correct)
-
Damn. I was hoping the 660m would be better than a 570m.
;( -
Remember the 650M?
It really just seems like a rehash of the 550M now!
Nearly the same specs; same situation as with the 660M/670M/675M
650m Specs
550m Specs
Now I do believe that the 635M (555M rebrand) is stronger than the 650M
Code:
GeForce GT 550M 96@740MHz
Codename: N12P-GT
Pipelines: 96 - unified
Core Speed: 740 MHz
Shader Speed: 1480 MHz
Memory Speed: 900 MHz
Memory Bus Width: 128 Bit
Memory Type: DDR3/GDDR5
Max. Amount of Memory: 1536 MB
Shared Memory: no
DirectX: DirectX 11, Shader 5.0
Technology: 40 nm
Notebook Size: large
Date of Announcement: 06.01.2011
But who knows.. Maybe we are all wrong all along.. hopefully!!
It was very interesting that you checked the AIDA release notes!
There is some more info! The GeForce 640 is also a GF116 rebadge, just like the 620M..
Now where is our 650M? I really want to know if it's a rebadge! -
One thing is clear: GT 650M comes with Optimus.
But if the Samsung clocks and it being Fermi are true, GT 650M performance is worse than the GT 525M's. A 405MHz core can't do wonders. -
Well, maybe it's really 28nm after all, but a mid-class GPU on 28nm and the high-end ones as 40nm parts? Doesn't make sense, especially since the 650M is a lot more mainstream than the high-end cards, which would mean a higher demand for Kepler chips.
Or maybe they are all manufactured on 28nm but with only minor performance gains?
Very confusing ^^
Another interesting fact: the Samsung Series 7 gaming notebook with the 675M, which accidentally went up for preorder, had a 200-watt power brick
-
GTX 675M is GF114 IIRC as stated in the changelog of a monitoring s/w.
-
Lol...
Remember the GTX 480M?
The GTX 580M was the re-badge with only 5-10% more power to it on a smaller manuf. process (and a lot more expensive).
Nvidia's history of re-badging and giving incremental revisions (while charging hectic amounts of money for them) is nothing new.
I agree that they could have reduced the manuf. process but retained the 100W and put a lot more juice into the GPU... or, instead of 100W, reduced it to 90W (or 85W) on a smaller manuf. process and still pumped most of the juice into it (without having to sacrifice a lot by going down to 70W). -
We can only hope that the 680M is late because it's either
1. A beefed-up 28nm Fermi chip with more shader cores, ROPs, and TMUs
2. A true Kepler card -
Sounds like you're talking about the 485M.
The 480M was early Fermi GF100, which was later bested by even the 470M (GF104). The 485M (GF104), which, despite the name, was a significant improvement over the 480M, is on the other hand basically a less efficient version of the 580M (GF114).
Hopefully the 680M will be a jump in performance from the 675M, just like the 485M was from the 480M. -
if this is true then...
pros: (at least) half a year more until our high end cards become obsolete.
cons: well, no true new cards.
what about AMD? -
AMD is probably using the 7800 series for the 7900M line, which means they are going to smash Nvidia's die-shrunk refinements to bits.
-
That's depressing. I was hoping the 650M would be as powerful as the 560M in a 45W envelope, because that Clevo 11" has a full-power dual core + 650M. If it's only equal to the 525M... why even bother? The HD 4000 would be virtually the same performance.
-
That is if they are even shrinking the dies. The funny thing would be if they actually dedicated 28nm wafers to producing rehashed dies.
I bet NVIDIA was not planning to rebrand at first, but is doing so now due to poor 28nm yields, courtesy of NV engineering. Huang himself already mentioned poor yields. -
This probably means the death of low end graphics.
Intel Haswell iGPU: there's something for everyone by VR-Zone.com -
No no, it looks like the 650M is more the follow-up to the 550M in the Y570, which is still pretty good
-
This news probably means Kepler is not up to standard.
AMD said to be providing GPU for the PlayStation 4 by VR-Zone.com -
That is exactly what's happening. Kepler yields have been a disaster.
But I don't see how the 675M could be between 70-75W if it's not a die shrink. There's no other way to trim that much voltage off of the 580M. -
If the benchmarks/slides are to be believed, it performs better than a 560m.
-
You guys are just blown away by little winds (forget about storms), dudes, DUDES. When they say we will have a 70% increase, you jump on board without thinking! And when they say we will sacrifice performance, the same thing... and less than 2 weeks in between..
Kepler will save power and bring a 10-20% increase; anyway, when it is released we will see
-
No. The true mobile Kepler will bring at least 40% increases at every level. A mere 10-20% is literally impossible.
-
we'll see Kevin
Anyway, really, don't be this pessimistic. These are probably fake AMD attacks on Nvidia's fanbase. I am still thinking we will see something good this fall (maybe not real Kepler; the way real Fermi is the 500M and the 400M was the entry to Fermi), but we will see some initial Kepler release with a nice performance boost and lower power consumption. Real Kepler will happen with the 700M, I think
-
I wish more companies would implement AMD's switching that uses two MUXes; then we could have the benefits of AMD graphics... I mean, look at the 6990M vs the 580M: 90% of the performance for 50% of the price. We'd get all the benefits of switchable graphics and the full power of the AMD card, since it doesn't route through the Intel graphics.
-
I'm not trying to prove anyone wrong
I hope for a worthy follow-up mid-class GPU with Optimus that can be seen in sub-14, or even better, 13-inch notebooks!
I hope there will be a follow-up to the LG P330 with Ivy/Kepler -
Disaster? Way to blow things out of proportion
NVIDIA Kepler Yields Lower Than Expected. | techPowerUp -
Meaker@Sager Company Representative
No, what it means is that Nvidia has angered both Sony and Microsoft because of business dealings gone bad. -
As far as meeting their timeline? Yes, I consider having your flagships pushed back to Q3-Q4 a disaster.
AMD could be set to release the 8000 series by the time Nvidia almost catches up. -
I think it depends. Since the 600M line are 28nm Fermi chips, a GTX 680M being a shrunken GTX 560Ti 448 might not be a bad thing. However, if the 7870 ends up being as fast as a GTX 580 and the 7970M ends up being based on that, then it could be gg for nVidia till a true kepler mobile arrives.
However, I honestly think AMD might disappoint us by going the 75W route and severely neutering these mobile chips. I'm sure they can leave the core clocks from the desktop variants untouched with these new Pitcairn chips and still keep it at a 100W TDP in the mobile version. Tahiti chips will probably be out of the question since there isn't enough room to line up 12 memory banks around the die on an MXM board. -
I think the problem is that this is all speculation based on probably fake screenshots. I'm optimistic about the performance of Kepler, and with the amount of talking-up Nvidia has done about it, I can't see them shipping rebranded chips. I seriously feel that 670M leak was fake. The main problem is, there's nothing foolproof to prove a chip is real until someone more knowledgeable has one in their hands to test, which will probably be a German source somewhere between March 6th and 12th. Hopefully they speak proper English, because Google Translate on Chinese forums is laughable at best.
-
TheBluePill Notebook Nobel Laureate
You are absolutely correct. 15 Minutes with Photoshop can seriously change the market conditions for enthusiasts in favor of / or away from a certain company. There are millions of dollars on the table and a "leaked" image or forgery can sway people in droves.
I am ALWAYS suspicious until I see hard proof.. 20 years in the tech industry does that to ya. -
Do you believe all of this is fake, even the BIOS screen?
Seriously, when's the last time we saw such subterfuge in the mobile GPU game? -
TheBluePill Notebook Nobel Laureate
Could be an early engineering sample or any number of things that could have influenced the results on the 3D Mark stuff. It may be real, it may not be indicative of what the final 670M is capable of.. Or, it could be exactly what it says it is and a real leak. I just always take things with a grain of salt.
I really do prefer to wait until we get finished products into the hands of known reviewers..
It's fun to speculate, but I have seen so many screenshots and benchmarks for things, even weeks before launch, that turned out to be either wrong, fake, or just plain not representative of the final product.. Ya know? -
Meaker@Sager Company Representative
Well if that's true it certainly is performing like a 570M.
-
Color me disappointed with this disturbing rebadging.. Guess it's time to jump on the AMD bandwagon with the introduction of Ivy Bridge, since at least their chips are purported to be the real deal..
-
The 600M won't be a rebadge; I can bet almost anything with anyone
Also, I am a long-time ATI user, and I just fell in love with Nvidia as soon as I got my 580M. Marvelous GPU
-
LOL. I still have a GTX260M laptop hah.
-
Although the 600 series are rebadges, there is a possibility that the 700 series will arrive very soon with Kepler. Nvidia may be pushing out Fermis on 28nm because they know this architecture inside out and better than Kepler, and therefore have the ability to push out cooler 570M/580Ms. God knows how many core models they have released with Fermi, each better than the last. And at the same time they push out 28nm Fermi, they can also start releasing Keplers.
So who knows, Nvidia may come out with Kepler at the same time as AMD comes out with GCN mobile GPUs. It is not just mobile Kepler we have heard nothing about; AMD has also been dead silent about their next mobile GPUs. -
The problem here is the assumption that NVIDIA will shrink Fermi to 28nm. What suggests NVIDIA won't continue to use 40nm Fermi for re-badging?
-
Lower TDP? I am guessing, of course. The 670M could still be 40nm like you say
Is it possible to manufacture both 40nm and 28nm with the same equipment, btw? I thought that once they started producing smaller dies, there was no turning back. -
This is probably, realistically, what is going to happen, which means it's not worth waiting for. Wait another year before worrying about upgrading. The new Nvidia and ATI lineups are going to be a modest/marginal increase in performance with a premium price tag attached. For those waiting for "Ivy Bridge" and the new GPUs, you're basically wasting your time. Grab a good deal on what's available now and worry about the next best, latest-greatest hardware in another year's time, when the improvements MIGHT be worth it.
-
Kepler and GCN will blow everything from the previous gen out of the water. Not marginally better. Not 10-20%. A better performance increase than that
-
I've heard this song and dance soooooooooooo many times... theoretical performance potential and real-world performance are always two different animals.
Sorry, there is hype and marketing, and then there is what you really get. Kepler and GCN, whenever they do get released, will show improvements, but they will be marginal or modest on first-gen releases and drivers.
-
Yeah, but this time it will really be a bigger jump, since it goes from 40nm down to 28nm.
5000M/6000M
GTX400M/500M
were all 40nm parts, so the last two years were rather boring; I know what you mean. If we don't get at least a 30% gain with 28nm now, then it will never be the case in the upcoming years/decade.
Also desktop 7770 (70W) has the performance of a desktop 6850 (120W).
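Quick back-of-the-envelope on that claim (assuming the two really do perform about the same): 120 W / 70 W ≈ 1.7, i.e. roughly a 70% improvement in performance per watt.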
I wouldn't call that a 'small' boost in performance/watt -
Theoretical?
I can show you real numbers for VLIW (6970) vs GCN (7970). GCN is on average 46% faster than VLIW. And that was from the very first review of the 7970: new product with new drivers vs old product with improved drivers. The 7970 will only get better with new drivers
-
Man, you mean real Kepler and GCN; we will just see the entry products to those. And I don't think the 500-to-600 transition will be better than 200-to-400 (definitely not like the 285M to the 485M, that was revolutionary). These newer chips are more power efficient, not more powerful (that's what the manufacturers are saying, right?)