Finally released by Nvidia! I wonder how long it will take to see it in Alienware, and God knows how much it will cost? Get your livers prepared and packaged for sale, lol.
Source GeForce GTX 675M - GeForce
-
chewbakaats58 Notebook Evangelist
Nice! This must be released today along with the announcement of the GeForce GTX 680. I have an m18x coming next week with 580m SLI.. do I need to return it within the 30 days to get SLI 675m? How much of a performance boost do you think? Could I eventually upgrade to SLI 675m with the m18x? -
I would honestly wait for the 675 if I were you; it's worth the wait if you are not in a rush.
-
chewbakaats58 Notebook Evangelist
Ruh roh.. I really can't wait anymore as I ordered this thing on 3/3.. way too impatient. As long as they release a 675m version within 30 days of 3/27, I'm good to go; I'll just return it for the new spec, right? -
I regret that I bought my notebook
-
Speaking from experience, I was in the same situation before. Would you rather have a temporary system and go through the hassle of returning and re-ordering, or just wait a bit longer and get what you want?
-
chewbakaats58 Notebook Evangelist
Maybe so, if Alienware releases new m18x models with SLI 675m within 30 days. If it takes longer than that, then I'll know the new cards weren't for me and I made the right decision. I still need to see benchmarks to see if there is a notable increase from 580m SLI to 675m SLI. -
Same hardware, same specs, different vbios
-
chewbakaats58 Notebook Evangelist
Yeah rumors were that there would not be a huge performance increase from 580m to 675m...any benchmarks out there yet? -
Let's see the vbios first
Otherwise it seems to be the same die, but perhaps the voltage will be lower, like 485M > 580M. I doubt it, though.
-
It's almost exactly the same card as the 580m.
-
chewbakaats58 Notebook Evangelist
If it is mainly a difference in power consumption, then that is something I couldn't care less about, as I will always keep my m18x plugged in. Though if we are talking about increases in performance of 10+ fps... I think it would be worth my time to reconsider... but I need to be convinced first by some results/experience.
-
I wouldn't get worked up until we see a proper 680m. The 675 seems to be an oddball number to placate us while we wonder when the "next gen" laptop chips will arrive. In retrospect it's a sensible naming scheme (well, as sensible as this crazy industry allows for).
-
Benchmark according to Nvidia
GeForce GTX 675M - GeForce -
chewbakaats58 Notebook Evangelist
Yeah, you're right, Shadow. I was wondering why, if this is really the next big guy in town, they are naming it 675m rather than 680m. And yes, don't get me started on the naming scheme of the mobile video cards. You would think the desktop GeForce GTX 680 = mobile GeForce GTX 680m/675m... haha....... -
Did I read correctly that the 675 is a rebranded 580 with the 40nm core, the 670 is a rebranded 570 with the 40nm core, and the 660 is the new Kepler 28nm? So... guess I'm lost. The 660 is still slower, if I read correctly, by quite a bit compared to the 675 and 670. So why get the 675 or 670 when the 580 and 570 are basically the same? No new package... without the new core...
-
chewbakaats58 Notebook Evangelist
That performance chart tells me nothing. It would help if there was a GTX 580m on that chart along with the others.... -
Hope this helps, from NotebookCheck:
NVIDIA GeForce GTX 675M - Notebookcheck.net Tech -
The 675M is roughly 5% faster than the 580M. Not a big deal, IMHO.
-
chewbakaats58 Notebook Evangelist
Yes, this is much better..thanks for the information. So practically just a re-branded 580m...meh. -
" Due to the same clock speed, the GeForce GTX 675M is exactly as fast as the old GTX 580M"
But then it contradicts itself:
"It is based on the GF114 Fermi core and just a renamed version of the GeForce GTX 580M with higher core clock rates."
http://www.notebookcheck.net/NVIDIA-GeForce-GTX-675M.70785.0.html
The difference lies in different drivers used for the benchmarks, etc, that's all. -
Yea, until there is a legit 680m I'm not gonna worry about it, and even then the 580m is still a beast.
-
TheBluePill Notebook Nobel Laureate
Yup.. Anyone that has a 580M shouldn't fret one bit about the 675M.. And probably not too much about the 680M. -
Unless the 680m uses a real Kepler core instead of being a rebrand.
-
I wonder if we will be able to re-flash 580Ms with the 675M's vBIOS. So far the two cards appear exactly the same. That 5% difference in the published benchmarks could just as well be measurement error/inaccuracy.
-
Totally agree!
-
steviejones133 Notebook Nobel Laureate
Seems there is nothing much to get worked up about when comparing 580m to 675m......and I bet Nvidia will want a huge chunk of cash for a small performance gain....
-
Since it's the same core and the same manufacturing process with the same features...
The only difference between a 675m and a 580m will simply be how much more refined the process has become over time (yields usually get better as time goes on).
Thus the 675m might OC slightly better due to slightly fewer imperfections.
No one should have buyer's regret over a 580m (or a 570m), as the new 675m and 670m are not really a big deal and are really just BIOS flashes (which most 570m users have already done). -
It's like a less than 5% increase in performance. Nothing else has changed, it's still Fermi, and the power usage is the same. Basically, Nvidia took the 580M, and changed the name to 675M.
-
AlwaysSearching Notebook Evangelist
The 675 is the 580. Check the Nvidia site, they are identical, just a different number.
Any difference in benchmarks will just be the luck of the draw and
getting one that performs slightly better.
gtx675m -
Just got myself a 580M for $350. If it's possible to flash the 675M vBIOS onto a 580M then I couldn't be happier.
-
Yes, exactly. No buyer's remorse whatsoever given the recent news, plus I really needed a new notebook for work purposes as well.
-
Xoticpc and Powernotebooks both have the NP9270 configurable with the 675M now. The upgrade to 675M SLI + extra AC adapter + power converter costs $450, so I imagine that puts a 675M around $300? Not sure if you can just do the math like that, though.
-
I bet NV is waiting for AMD to release the 7990M so they can strike back, haha. lol
-
Well good for us 580M owners. This means 675M just prolongs the usability of our cards since 580M = 675M. Let's see what 680M brings. This one can actually make some difference, yet don't expect Nvidia to release a true monster so early. It should be some 20% faster than 580M/675M though, I'd say.
I bet the true capability of Kepler will come with 7xx or maybe even 8xx line. Till that moment, we will be milked out of cash by their "just a bit faster" designs. -
Time to hit the snooze button on the panic alarm.
Wake me up when high-end Kepler hits haha.
Btw I think Kepler will just be branded 680m after they've sold off all their sneakily renamed 580s. -
Kingpinzero ROUND ONE,FIGHT! You Win!
Nvidia will probably use the same approach they used with the 485m/580m.
Once a mid-range Kepler desktop card surfaces, the same technology will be used to create a top-end mobile card.
The 485m/580m were both based on the great GTX 560 Ti with small differences (clocks and memory speed).
So my guess is that when Nvidia releases the GTX 660 Ti, the same core/specs will be the starting point for the future 680m.
That would fit their usual release schedule on both the mobile and desktop sides: usually the top-end card comes to light in summer, so I bet the desktop GTX 670/660 Ti will be announced in May/June.
The GTX 680m will probably come around by the end of June/first week of July.
The only question, since there are a lot of months ahead, is whether Clevo will support the newer cards in our existing p150hm/170hm (not Ivy). At this point I'm starting to doubt it, but only time will tell. -
Meaker@Sager Company Representative
I'm very interested in the GTX660M, it might be worth going to it from the 570M just for the power saving/clocking potential along with the newer features.
-
The 128-bit bus really restricts the 660M IMO D:
-
Meaker@Sager Company Representative
The memory controller is much faster (stock mem clocks are 50% higher on the 680 compared to the 580) on kepler and it looks like it will be able to make up a lot of ground based on clocks. -
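For anyone who wants to put numbers on that: peak GDDR5 bandwidth is just bus width times effective data rate. A rough sketch in Python; the clock and bus figures below are assumptions pulled from public spec sheets, not from this thread:

```python
# Peak theoretical memory bandwidth:
#   bandwidth (GB/s) = bus width (bits) / 8 * effective data rate (GT/s)

def bandwidth_gbps(bus_width_bits, effective_rate_gtps):
    """Peak theoretical GDDR5 bandwidth in GB/s."""
    return bus_width_bits / 8 * effective_rate_gtps

# Assumed spec-sheet values (bus width in bits, effective rate in GT/s)
cards = {
    "GTX 580 (desktop)": (384, 4.008),  # ~192 GB/s
    "GTX 680 (desktop)": (256, 6.008),  # ~192 GB/s despite the narrower bus
    "GTX 570M":          (192, 3.0),    # ~72 GB/s
    "GTX 660M":          (128, 4.0),    # ~64 GB/s
}

for name, (bus, rate) in cards.items():
    print(f"{name}: {bandwidth_gbps(bus, rate):.0f} GB/s")
```

If those numbers are in the right ballpark, the faster memory on Kepler does claw back most of what the 128-bit bus gives up, which is the point being made above.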
I currently have the 570m and there is no way I'd waste money upgrading to the 660m, which probably won't have any real-world performance increase over our current 570m. I am going to wait and see what the 680m looks like. Then I may upgrade my 570m to that, but upgrading to the 660m is just a waste of money.
-
Actually, the 485m is based on a fully enabled GF104, the chip used in the GTX 460 (where not all of the cores were enabled). The 580m and the 570m are the only mobile cards based on the GF114 that is used in the GTX 560 Ti.
-
Afaik the 580m doesn't have Optimus and this has it. I think that's gonna be the main draw of this card.
Sent from my samsung galaxy s2 using tapatalk -
Nvidia could easily make the 680M from the desktop 680 if they wanted. The 680 has a smaller die size than the 560 Ti and only uses 20-30 more watts. If they lowered the clock from 1006MHz to the 560 Ti's 822MHz, I imagine the power draw would be equal or less. Then they'd just have to downclock it like they did the 580M.
The 680M, if derived from the desktop 680, would have around 50-60% of the 680's performance or better. That puts it around the performance of the desktop 570. Some overclocking would get it to 580 levels.
You'll be disappointed if you're expecting the 660M to be comparable to the 570M. The 660M has 384 cores compared to the 570M's 336, but Kepler cores are about 1/3 as powerful as Fermi cores (the desktop 680 has 3x the number of cores as the 580, plus a much higher clock, to give the performance we see). The 660M would need about double the cores to compete. -
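That cores-times-clock reasoning can be written out as a quick back-of-envelope estimate. A minimal sketch in Python, taking the 1/3 per-core weighting above at face value and assuming stock core clocks of roughly 575MHz (570M) and 835MHz (660M) from public spec sheets, neither of which comes from this thread:

```python
# Back-of-envelope shader throughput: score = cores * clock * per-core weight.
# The 1/3 Kepler weighting is the poster's estimate, not a measured figure.

FERMI, KEPLER = 1.0, 1.0 / 3.0

def score(cores, clock_mhz, weight):
    """Crude relative-throughput score; units are arbitrary."""
    return cores * clock_mhz * weight

gtx570m = score(336, 575, FERMI)   # assumed stock core clock
gtx660m = score(384, 835, KEPLER)  # assumed stock core clock

print(f"660M / 570M ~ {gtx660m / gtx570m:.2f}")  # roughly 0.55
```

By this (very crude) math the 660M lands at barely over half a 570M, which is exactly why the post above says it would need about double the cores to compete. Real results will differ; memory bandwidth, drivers, and boost clocks all muddy simple core arithmetic.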
I agree with GTRagnarok, I have a feeling the 680m will be based off the 680. This is because the 680 isn't the top-of-the-line card its name suggests, but really more of a successor to the 560 Ti. So if 560 Ti = 580m and the 680 replaces the 560 Ti, then 680 = 680m (with lower clocks, of course).
I'm really hoping that's true, because it would mean the 680m could be desktop-powerful, in a nice portable form factor. -
Actually it does, and even the 485m does; it's just that it was never implemented by the notebook manufacturers.
-
Beats me why they don't bother to.
-
Because it's very complicated, and some gamers believe it causes more trouble than it's worth. In my opinion they should just scrap it and have the GPU disable parts of itself in order to use less power. It could lower the number of cores turned on, which would greatly decrease power consumption, and then there wouldn't be a need for a second graphics card.
-
I believe you mean power gating most parts of the GPU chip... Right?
-
yes, if that means turning them off or nearly off.
GeForce GTX 675M
Discussion in 'Gaming (Software and Graphics Cards)' started by Qumars, Mar 22, 2012.