I don't think the 485M had Optimus built in, whether implemented or not. In its generation, I think the 460M and 560M were the highest-performing Nvidia GPUs with Optimus.
http://www.geforce.com/hardware/notebook-gpus/geforce-gtx-485m
edit:
However, the 580M had it, as per the Nvidia website:
-
GeForce GTX 485M - GeForce
second blurb
I think all of the Fermi cards had it built in, although it was only implemented on the 460M and below. I don't think the 485M was one of them, seeing as the 485M was only offered in the higher-end Clevos, none of which actually had Optimus.
Edit: also, I like how Nvidia didn't put "the world's fastest notebook card" under the 675M, like they usually do.
-
Meaker@Sager Company Representative
Ha, we'll see.
It's not the performance quite so much I'm thinking of.
-
The only people who would think a GTX 680M = GTX 680 are noobs and Best Buy employees. However, I do agree that Nvidia shouldn't have released the 675M as a rebadge.
-
Mobile GPUs should be named with a number that corresponds to their desktop counterpart in terms of power (with an M to signify we are talking about a mobile part).
For example, the GTX 580M should actually be called the GTX 560M (because it's the equivalent of a desktop 560 Ti, if I'm not mistaken).
Aside from that, I really don't get what the big deal about new high-end mobile GPUs is.
Performance-wise, they end up essentially the same or only a few percent better, with increased efficiency, while at the same time they will undoubtedly overcharge for them.
What a fraud.
The only segment that experienced a tangible performance increase is the mid-range one. -
The only segment that experienced a tangible performance increase is "AMD"
-
People continue to get hung up on this... as long as the card has enough memory bandwidth not to be hamstrung, it's fine.
As mentioned before, the memory and controller will be fast stuff. While I will not say "it won't matter at all," I will say "it probably won't matter anywhere near as much as you think."
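For anyone who wants the napkin math, here's a rough sketch in Python - the bus widths and effective memory clocks below are my guesses, not confirmed specs:

```python
# Peak theoretical memory bandwidth: bus width (bits) / 8 * effective data rate (MT/s).
def bandwidth_gb_s(bus_bits: int, effective_mt_s: int) -> float:
    return bus_bits / 8 * effective_mt_s / 1000  # GB/s

# Guessed figures: a 660M-class card with a 128-bit bus and ~4000 MT/s GDDR5,
# versus a 570M-class card with a 192-bit bus and ~3000 MT/s memory.
print(bandwidth_gb_s(128, 4000))  # 64.0 GB/s
print(bandwidth_gb_s(192, 3000))  # 72.0 GB/s - the narrower bus nearly catches up on clocks
```

So a narrow bus with fast GDDR5 can land in the same ballpark as a wider, slower one - hence "probably won't matter anywhere near as much as you think."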
The 660M might be a very interesting card due to just how much you can squeeze out of it for how efficient it is.
Imagine a card that performs between the 560M and 570M, doesn't need Optimus drivers, and still delivers on battery life AND low-enthusiast gaming at a reasonable price. Now imagine that laptop is a relatively unbulky and cool-running 15".
I'd be interested... -
^^ and <1" thick - that is a BUY for me
-
I would prefer slightly thicker and cool and comfy to use.
Every "thin and ultra-light" laptop I have ever used either has major problems with heat or feels flimsy at best. Most have major problems with durability as well.
I don't need a tank, and reasonably thin and light is good, but I have never understood why people need anything <1" thick or X lbs.
Laptop developers... please make a laptop that ISN'T primarily a fashion statement. It can be easy on the eyes and be portable... however...
First and foremost, make sure it can handle its heat and daily use, and focus on performance!
Thank you
(a person who uses and fixes laptops) -
580M to 675M - not worth the upgrade
580M to 680M - worth it -
Meaker@Sager Company Representative
580M to 675M is buying the same card.
-
steviejones133 Notebook Nobel Laureate
Yup. Nvidia doing the usual "rebadging" tactic - it might be slightly faster, but likely that's only down to the vBIOS... and I am sure someone will get a 580M flashed to a 675M fairly soon after release... -
Am I reading it wrong? Why does Nvidia have the GTX 675M lower than the GTX 580M in the 3DMark Vantage Chart? (15000 vs. 16000+)
GeForce GTX 675M - GeForce
GeForce GTX 580M - GeForce -
TheBluePill Notebook Nobel Laureate
I figure the newer drivers will close the gap between the 580M and 675M even more as time goes on. -
Meaker@Sager Company Representative
PhysX most likely.
Gap? What gap? AFAIK the clocks are not even getting a small change. It's an absolute rebadge; only the 670M is getting a token clock bump. -
One thing makes everything obvious: companies have already started selling the 670M and 675M (but notice we have none of the 640M, 650M, or 660M). This is because the 670M and 675M are exactly equal to the 570M and 580M; they are just flashing a new BIOS and shipping them.
-
Do you want better power management or something? I still think you will have to use the power adapter in order to take full advantage of the 660M. I wouldn't mind having an Ultrabook with a 660M, but I would want something with a little more power in a full-blown gaming laptop. Back to the Ultrabook: I will probably buy one if they put a 660M in it. I think Acer is making something with the 555M or the 660M, but I don't want an Acer. I'd rather have a Samsung, HP, or Dell. Maybe Asus.
-
If 11k in Vantage (the 660M looks that way right now) gets inside an Ultrabook, I am buying that no matter what...
-
Totally agree on all that!
-
Meaker@Sager Company Representative
I'm thinking more of noise. I have to use my notebook with my wife around, and when it's really going for it, she finds it irritating. If I can do an Ivy Bridge/660M combo, I could get the power right down and tune the fan to be a little easier on the ears.
Also, when clocked up, I would not be surprised if the 660M took a slight lead just from sheer clock advantage. -
I see you have an MSI barebones laptop. I do as well (1761). When playing BF3, my fans used to scream pretty much non-stop until I purchased a laptop cooler. Now my fans will only spool up to 100% every 15 minutes or so. Seriously, you barely ever hear them anymore. When I'm not gaming, my fans never even seem to kick in. You literally cannot hear them unless you're gaming. Ever since adding the Thermaltake cooler, my GPU never gets above 75°C when playing BF3 for a couple of hours.
In the end, if the 660M is just a rebranded 560M, you would be taking a downgrade in performance, but yes, it would require less power and would run much quieter. Kind of like the Asus G74s that use the 560M. They are very quiet. -
I'm not sure if you meant 560m or 570m. The 660m is not a simple rebrand of the 560m since it's Kepler and all. The 670m, however, seems to be rebranded from the 570m.
-
Oh, OK. Thanks for the clarification. Why would Nvidia make their budget GTX card Kepler but rebrand the better card? The 670M will have to outperform the 660M. It just doesn't make sense at all.
-
GK104 production will be in short supply in the coming months. NVIDIA will have problems getting enough GTX 680 cards out for sale. For high-end mobile cards, most are binned. Given the current 28nm production situation, you need to give Huang some time.
-
TheBluePill Notebook Nobel Laureate
The 660M will be THE card to beat in the mid-range. The 675M, though... it just doesn't seem appealing at all. Might as well wait for the 680M.
-
Why even start a thread about the 675M...
-
Kingpinzero ROUND ONE,FIGHT! You Win!
I'm just curious to see how well it overclocks. Granted, it's a renamed 580M without even changing the hardware, but maybe something differs.
If it can hit or surpass stock GTX 560 Ti clocks, there's a good boost compared to the 485M/580M.
IIRC, even at 860 MHz the difference was the RAM speed compared to the desktop 560 Ti, which runs at 2000-2100 MHz. -
Yep, exactly. I do hope the 675M will have some optimizations/enhancements over the 580M like the 580M had over the 485M: lower voltage, better temps, better overclocks.
If the 675M can go past 900-925 MHz core clocks and 2000+ MHz memory clocks, it might have found a buyer.
-
Meaker@Sager Company Representative
Because capacity is critically low and they are missing a tier of chip in the desktop market (big Kepler - so they only have little and mid Kepler), supplies are short.
Also, their mobile lineup is very coarse: there will be a large difference between mid Kepler and little Kepler, and they don't want to sell disabled Kepler units at the moment (likely saving them for a desktop model down the line, where there is a similar issue).
We will see a 680M based on mid Kepler; it will be fast, hot-ish, and expensive. -
It even runs at identical voltage levels... I posted a FurMark earlier that shows 0.87 V in 3D mode at 96°C...
-
shiitt
then they are exactly 100% the same...
-
I know most on here aren't too excited about the 675m, but if Nvidia can keep the price down, it might be a good option for those that don't want to wait for/spend the money on the new 680m, right? The 580m is still a great card as of now.
-
Nvidia, keep the price DOWN?
It's not unheard of, but given what's happened before, there's a good chance that Nvidia will actually raise the price even if there's zero difference in the hardware (except in name alone).
Look at the fiasco between the 485M and 580M... a 5-10% difference in performance, otherwise the same hardware with bumped-up clocks on a smaller manufacturing process (if I'm not mistaken)... with over a $200 price difference.
Honestly, I don't get it.
If Nvidia wanted to do a rebadge of the higher-end GPUs, then they could have at least lowered the manufacturing process and the TDP and bumped up performance somewhat... but they didn't do even that.
They merely renamed an existing GPU with everything else being the same.
I wouldn't bother with those GPUs... I'd rather get something up to a 650M, and for the high-end models wait until they release the 28nm Keplers. -
To me, it all comes down to pricing. I know Nvidia likes to go the BMW route with pricing, but given that they made no changes between the 580m/675m, they might (emphasis on might) be realistic with pricing. If that happens, then we could have reason to keep an eye on the 675m as a viable alternative for those that a) need a card before July and b) don't want to shell out what will surely be a high price for the 680m.
-
TheBluePill Notebook Nobel Laureate
Seems to me it would be better to just offer four models: 620M, 640M, 660M, and 680M. Keep the product lines simplified and easier to support, and still meet the needs of all the OEMs and their price points.
-
Probably.
But when was the last time Nvidia or the manufacturers did things in a reasonable way?
Example:
They LOVE to put 2GB (or 3GB) of VRAM onto a GPU with a 128-bit bus and charge more for it than for the same GPU with 1GB of VRAM.
It's a well-known marketing gimmick that lets them get rid of extra memory modules (which is stupid, because it would be better to recycle them into raw material and make GDDR5 VRAM out of it), instead of increasing the bus bandwidth (provided the rest of the hardware on the GPU would benefit from it).
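Toy numbers to show why (same napkin formula as earlier in the thread; the figures here are made up):

```python
# Peak bandwidth depends on bus width and data rate - VRAM capacity never enters the formula.
bus_bits, effective_mt_s = 128, 4000  # hypothetical mid-range card
bandwidth = bus_bits / 8 * effective_mt_s / 1000
print(f"{bandwidth} GB/s with 1GB of VRAM")  # 64.0 GB/s
print(f"{bandwidth} GB/s with 3GB of VRAM")  # still 64.0 GB/s - extra modules add capacity, not speed
```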
Instead, they will produce two or three versions of one GPU, price them differently, and voila.
They complicate things for what, exactly?
Money and profit?
Seriously, it's enough to have several GPUs in a specific category... just use them as-is, without creating variations on those GPUs and confusing people even more (well, such tactics won't work on people like us - mostly). -
That would be reasonable, but Nvidia likes money. And lately it seems they like money a lot.
IMO they should put a 6990M price tag under their "new" GTX 675M and sell the 580M for cheap. After all, an overclocked 580M is a 675M anyway... and an overclocked 580M will still be close to a 7870M, or even better.
But that won't happen because Nvidia loves money. -
TheBluePill Notebook Nobel Laureate
There are really several factors involved in these product decisions.
When there is so much fragmentation in the market, like with PCs, each manufacturer will take a reference design and then split it out into different versions and memory configs to compete with their competitors' products.
The other issue is that Nvidia tries to meet every possible price point by having a wide range of offerings. Too many, if you ask me. When combined with the OEMs' sub-versions, you might have as many as 40 different variants on the market at the same time.
Sometimes I think they want this confusion, to keep consumers from being informed enough to make a rational decision, so they can upsell on ignorance. -
So the 7870M has been established as a superior card to the 580M? I'm just now getting in "late" on the new cards. Any links for 7xxx-series information (mainly the high end)?
Also, it's established that July is when the 680M will drop? And still no real "news" on what to expect on it, right? -
There's nothing on the AMD side. Not a single word, anywhere.
-
TheBluePill Notebook Nobel Laureate
That is concerning... no leaks, no talk, nothing... -
Maybe 28nm fabrication wasn't as easy as one would imagine.
Neither camp wants to admit it first, so here we are, the customers, waiting for new GPUs, without realizing that the "few" 28nm wafers are being used on desktop GPUs instead.
-
TheBluePill Notebook Nobel Laureate
I would say you are correct. Hopefully they can get the bugs worked out (they always do). Next thing you know, we'll be on 14nm... and then 6... Ah, technology. -
This thread should have been named "580M" -
they ARE the same...
-
Whew, so I did well buying my 17" with a GTX 580M 2GB GDDR5 instead of waiting?
-
The question: will you be able to flash the GTX 580M into a GTX 675M?
-
only one way to find out!
-
There is no official information yet. It was just my guess, based on speculation.
-
Is there any point, seeing how both are one and the same?
Besides, one can undervolt the 580M and apply a small OC of, say, 15% to 20% afterwards to get higher performance and a lower temperature/power draw.
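Napkin math on that (dynamic power scales roughly with voltage squared times frequency; the 0.87 V baseline is from the FurMark post above, the undervolt and OC figures are hypothetical):

```python
# Dynamic power scales roughly as P ~ C * V^2 * f, so a voltage drop can offset a clock bump.
v_stock, v_undervolt = 0.87, 0.80  # volts; 0.87 V from the FurMark post, 0.80 V hypothetical
clock_scale = 1.15                 # hypothetical +15% core overclock

relative_power = (v_undervolt / v_stock) ** 2 * clock_scale
print(f"{relative_power:.2f}x stock power")  # ~0.97x - slightly less power despite +15% clocks
```

Whether the chip is actually stable at that combination of voltage and clock is another matter, of course.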
So... unless people actually GAIN something by flashing the vBIOS, I don't think it's worth the hassle. -
Undervolt and add a "small" overclock of 15-20%??
Now that must be a new kind of magic... if my card runs too hot, I just lower the voltage to get cooler temps, and in addition I can simply overclock it as well to get better performance?
Seriously, that's just not how it works.