I don't think the 485M had Optimus built in, whether implemented or not. In its generation, I think the 460M and 560M were the highest-performing Nvidia GPUs with Optimus.
http://www.geforce.com/hardware/notebook-gpus/geforce-gtx-485m
edit:
However, the 580M had it, per Nvidia's website.
-
second blurb
I think all of the Fermi cards had it built in, although it was only implemented on cards at or below the 460M. I don't think the 485M was one of them, seeing as the 485M was only offered in the higher-end Clevos, none of which actually had Optimus.
Edit: also, I like how Nvidia didn't put "the world's fastest notebook card" under the 675M like they usually do.
-
Meaker@Sager Company Representative
It's not the performance quite so much I'm thinking of.
-
-
Mobile GPUs should be named with a number that corresponds to the desktop counterpart in terms of power (with an M to signify we are talking about a mobile part).
For example, the GTX 580M should actually be called the GTX 560M (because it's the equivalent of a desktop 560 Ti, if I'm not mistaken).
Aside from that, I really don't get what the big deal about new high-end mobile GPUs is.
Performance-wise they end up essentially the same, or only a few percent better with increased efficiency, and at the same time they will undoubtedly overcharge for them.
What a fraud.
The only segment that saw a tangible performance increase is the mid-range one. -
-
As mentioned before, the memory and controller will be fast stuff. While I will not say "it won't matter at all," I will say "it probably won't matter anywhere near what you think."
The 660M might be a very interesting card due to just how much you can squeeze out of it for how efficient it is.
Imagine a card that performs between the 560M and 570M, doesn't need Optimus drivers, and still delivers on battery life AND low-end-enthusiast gaming at a reasonable price. Now imagine that laptop is a relatively unbulky and cool-running 15".
I'd be interested... -
^^ and < 1" thick, that is a BUY for me
-
Every "thin and ultra-light" laptop I have ever used either has major problems with heat or feels flimsy at best. Most have major problems with durability as well.
I don't need a tank, and reasonably thin and light is good, but I have never understood why people need anything <1" thick or X lbs.
Laptop developers... please make a laptop that ISN'T primarily a fashion statement. It can be easy on the eyes and be portable... however...
First and foremost, make sure it can handle its heat and daily use, and focus on performance!
Thank you
(a person who uses and fixes laptops) -
580M to 675M - not worth the upgrade
580M to 680M - worth it -
Meaker@Sager Company Representative
580M to 675M is buying the same card.
-
steviejones133 Notebook Nobel Laureate
-
Am I reading it wrong? Why does Nvidia have the GTX 675M lower than the GTX 580M in the 3DMark Vantage Chart? (15000 vs. 16000+)
GeForce GTX 675M - GeForce
GeForce GTX 580M - GeForce -
TheBluePill Notebook Nobel Laureate
-
Meaker@Sager Company Representative
PhysX most likely.
-
One thing makes everything obvious: companies have already started selling the 670M and 675M (but notice we have nothing of the 640M, 650M, or 660M). This is because the 670M and 675M are exactly equal to the 570M and 580M; they are just flashing a new BIOS and shipping them.
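If you want to see what your own card reports, the PCI device ID is essentially all that distinguishes a rebadge from the original as far as the driver is concerned. A minimal sketch for checking it on a Linux box via standard sysfs paths (illustrative; the class prefix 0x03 just picks out display controllers):
```python
# List PCI display controllers and the vendor/device IDs they report.
# A rebadged card shows up with a new device ID on identical silicon.
from pathlib import Path

for dev in sorted(Path("/sys/bus/pci/devices").iterdir()):
    pci_class = (dev / "class").read_text().strip()
    if pci_class.startswith("0x03"):  # 0x03xxxx = display controller
        vendor = (dev / "vendor").read_text().strip()   # 0x10de = NVIDIA
        device = (dev / "device").read_text().strip()   # this is what a reflash changes
        print(f"{dev.name}: vendor={vendor} device={device}")
```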
-
-
If 11k in Vantage (the 660M looks that way right now) is getting inside an ultrabook, I am buying that no matter what...
-
Totally agree on all that!
-
Meaker@Sager Company Representative
Also when clocked up I would not be surprised if the 660M took a slight lead just from sheer clock advantage. -
In the end, if the 660M is just a rebranded 560M, you would be taking a downgrade in performance, but yes, it would require less power and would run much quieter. Kind of like the Asus G74s that use the 560M. They are very quiet. -
-
-
GK104 production will be in short supply in the coming months. NVIDIA will have problems getting enough GTX 680 cards out for sale. For high-end mobile cards, most are binned chips. Given the current 28nm production situation, you need to give Huang some time.
-
TheBluePill Notebook Nobel Laureate
The 660M will be THE card to beat in the mid-range. The 675M though... it just doesn't seem appealing at all. Might as well wait on the 680M
-
Why even start a thread about the 675M.............
-
Kingpinzero ROUND ONE,FIGHT! You Win!
I'm just curious to see how well it overclocks. Granted, it's a renamed 580M without even changing the hardware, but maybe something differs.
If it can hit or surpass stock GTX 560 Ti clocks, there's a good boost compared to the 485M/580M.
IIRC, even at 860MHz the difference was the RAM speed compared to the desktop 560 Ti, which runs at 2000-2100MHz. -
Yep, exactly. I do hope the 675M will have some optimizations/enhancements over the 580M, like the 580M had over the 485M: lower voltage, better temps, better overclocks.
If the 675M can go past 900-925MHz core clocks and 2000MHz+ memory clocks, it might have found a buyer
Meaker@Sager Company Representative
Also, their mobile market is very coarse; there will be a large difference between mid Kepler and little Kepler, and they don't want to sell disabled Kepler units atm (likely saving them for a desktop model down the line, where there is a similar issue).
We will see a 680M based off mid Kepler; it will be fast, hot-ish and expensive. -
-
shiitt
they are then exactly 100% the same
-
I know most on here aren't too excited about the 675M, but if Nvidia can keep the price down, it might be a good option for those that don't want to wait for/spend the money on the new 680M, right? The 580M is still a great card as of now.
-
Nvidia to keep the price DOWN?
It's not unheard of, but given what's happened before, there's a good chance that Nvidia will actually raise the price even if there's zero difference in the hardware (except in name alone).
Look at the fiasco between the 485M and 580M... a 5-10% difference in performance, the same hardware otherwise with bumped-up clocks on the same 40nm process (if I'm not mistaken)... with over a $200 price difference.
Honestly, I don't get it.
If Nvidia wanted to do a rebadge of the higher-end GPUs, then they could have at least shrunk the manufacturing process, lowered the TDP, and bumped up performance somewhat... but they didn't do even that.
They merely renamed an existing GPU with everything else staying the same.
I wouldn't bother with those GPUs... I'd rather get something up to a 650M, and for the high-end models wait until they release the 28nm Keplers. -
To me, it all comes down to pricing. I know Nvidia likes to go the BMW route with pricing, but given that they made no changes between the 580m/675m, they might (emphasis on might) be realistic with pricing. If that happens, then we could have reason to keep an eye on the 675m as a viable alternative for those that a) need a card before July and b) don't want to shell out what will surely be a high price for the 680m.
-
TheBluePill Notebook Nobel Laureate
Seems to me it would be better to just offer 4 models: 620M, 640M, 660M and 680M. Keep the product lines simplified, easier to support, and still meet the needs of all the OEMs and their price points.
-
But when was the last time Nvidia or the manufacturers did things in a reasonable way?
Example:
They LOVE to put 2GB (or 3GB) of DDR VRAM onto a GPU with a 128-bit bus and charge more for it than for the same GPU with 1GB of VRAM.
It's a well-known marketing gimmick so they can get rid of extra memory modules (which is stupid, because it would be better to recycle them into raw material and use that to make GDDR5 VRAM), as opposed to increasing the bus bandwidth (provided the rest of the hardware on the GPU would benefit from it).
Instead, they will produce 2 or 3 versions of 1 GPU, price them differently, and voila.
They complicate things for what, exactly?
Money and profit?
Seriously, it's enough to have several GPUs in a specific category... just use them as-is, without creating variations on those GPUs and confusing people even more (well, such tactics won't work on people like us - mostly).
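To put a number on the 128-bit-bus point: doubling VRAM capacity does nothing for bandwidth, while widening the bus (or using faster memory) does. A quick sketch with assumed, purely illustrative clocks:
```python
# Capacity doesn't move the bottleneck; bus width and memory clock do.
def bandwidth_gbps(quoted_mhz: float, bus_bits: int) -> float:
    return quoted_mhz * 2 * (bus_bits // 8) / 1000

print(bandwidth_gbps(1800, 128))  # 1GB or 3GB on a 128-bit bus: same ~57.6 GB/s
print(bandwidth_gbps(1800, 256))  # same memory on a 256-bit bus: ~115.2 GB/s
```
-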
That would be reasonable, but Nvidia likes money. And lately it seems they like money a lot.
IMO they should put a 6990M price tag under their "new" GTX 675M and sell the 580M for cheap. After all, an OC'd 580M is a 675M anyway... and an OC'd 580M will still be close to a 7870M, or even better.
But that won't happen because Nvidia loves money. -
TheBluePill Notebook Nobel Laureate
There is so much fragmentation in the market, like with PCs, that each manufacturer will take a reference design and then split it out into different versions and memory configs to compete with their competitors' products.
The other issue is Nvidia tries to meet every price point possible with a solution by having a wide range of offerings. Too many, if you ask me. When combined with the OEMs' sub-versions, you might have as many as 40 different variants on the market at the same time.
Sometimes I think they want this confusion, to keep consumers from being informed enough to make a rational decision so they can up-sell on ignorance. -
Also, it's established that July is when the 680M will drop? And still no real "news" on what to expect on it, right? -
There's nothing on the AMD side. Not a single word, anywhere.
-
TheBluePill Notebook Nobel Laureate
-
Maybe 28nm fabrication wasn't so easy as one would imagine.
Neither camp wants to admit it first, so here we are, the customers, waiting for new GPUs, without realizing that the "few" 28nm wafers are being used on desktop GPUs instead -
TheBluePill Notebook Nobel Laureate
-
they ARE the same..
-
-
The question: Will you be able to flash a GTX 580M into a GTX 675M?
-
only one way to find out!
-
-
Besides, one can undervolt the 580M and apply a small OC of, say, 15% to 20% afterwards to get higher performance and lower temperature/power draw.
So... unless people actually GAIN something by flashing the vBIOS, I don't think it's worth the hassle. -
Undervolt and add a "small" overclock of 15-20%??
Now that must be a new kind of magic... if my card runs too hot, I just lower the voltage to get cooler temps, and on top of that I can simply overclock it as well to get better performance.
Seriously, that's just not how it works.
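For what it's worth, both sides have a point here: dynamic power scales roughly as C·V²·f, so a chip shipped with voltage margin can sometimes hold a higher clock at a lower voltage, but whether any particular card is stable there is silicon lottery. A rough sketch (the stock voltage below is an assumption, not a published spec; 620MHz is the 580M's stock core clock):
```python
# Rough dynamic-power scaling, P ~ V^2 * f, relative to stock settings.
# Stability at any given point is chip-dependent; numbers are illustrative.

def relative_power(volts: float, mhz: float, stock_volts: float, stock_mhz: float) -> float:
    return (volts / stock_volts) ** 2 * (mhz / stock_mhz)

STOCK_V, STOCK_MHZ = 0.90, 620  # assumed 580M-class figures

# Undervolt by 0.05V and overclock ~19%: power lands near stock, IF stable.
print(relative_power(0.85, 740, STOCK_V, STOCK_MHZ))  # ~1.06x stock power
```
So the claim isn't magic in principle; it just only works if the individual chip happens to have that much margin.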