ZOMG, we get NVIDIA GPU Boost 2.0 this time around with the 700M series. It means the turbo boost is controlled by temperature, but it also means we get higher turbo clocks than before (GPU Boost 1.0).
GTX Titan was the first GPU to feature it. On paper the max turbo boost is 876MHz, but in reality, with GPU Boost 2.0, it easily clocks itself up to 1000MHz+ if temperatures are OK. Really good news guys :hi2:
The midrange was announced today; the high end will have to wait a little longer. This looks very promising for the upcoming GPUs.
As a comparison, the GT 650M has turbo boost up to 900MHz. The GT 750M has turbo boost up to 967MHz PLUS the extra thermal boost from GPU Boost 2.0, so it could clock itself up to 1000MHz+ if the cooling in the notebook is sufficient. The memory gets a whopping jump from a 900MHz top on the GT 650M to a 1250MHz top on the GT 750M.
Anandtech says that some of the chips below have the new GK208 core, but doesn't know which ones yet.
Sources:
NVIDIA Releases New GeForce GT 700M Mobile GPUs | PC Perspective
AnandTech | NVIDIA
-
-
Ah god damn. I have been outdated. Nooooooooo!
By the way, the GT 650M goes up to 950MHz with GPU Boost, not just 900MHz. At least mine does in my Samsung - but maybe that's a Samsung factory overclock? -
GeForce GT 650M | Specifications | GeForce
Asus: 950MHz/900Mhz
Samsung: 950MHz/900MHz
Dell: 835MHz/1000MHz
GT 750M: 967MHz+*/1250MHz
GT 740M: 980MHz+*/1250MHz
* (GPU Boost 2.0 will clock it higher if thermals allow)
I have more news. Not from an official news site this time, but from my own digging.
GTX 670MX
Here is the upcoming GTX 770M, ladies and gentlemen. A brand new GPU, GK114. A huge jump in both GPU clock and memory clock.
-
Still bandwidth bottlenecked... waiting for the 780m release
thanks Cloud!
-
Well holy crap. The GT 750M is a lot faster than the GT 650M.
Look at the graphics score: GT 650M = 2145, GT 750M = 2690.
That means the GT 750M is a whopping 25% faster than the GT 650M.
The GT 750M scores about the same as a GTX 670M...
I'm not sure if the dude who tested the GT 750M had the drivers that enable GPU Boost 2.0, but the results speak for themselves.
GT 650M when it launched
GT 750M.
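A quick sanity check of that 25% figure, using only the two scores quoted above:

```python
# Verify the "25% faster" claim from the 3DMark11 graphics scores above.
gt650m_score = 2145
gt750m_score = 2690

gain_pct = (gt750m_score - gt650m_score) / gt650m_score * 100
print(f"GT 750M is {gain_pct:.1f}% faster")  # ~25.4%
```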
-
Wahhhh I hate how technology progresses yet my wallet doesn't :'(
-
I like the look of the 750 vs the 650, seems like a nice boost. Is it looking like the 745 is a bit faster than the old 650? I would love to find something lightweight (a la Sony VAIO S 15) with a card like the 745 if it delivers 650~+ performance.
Edit: Also, considering that the major draw of Haswell was the iGPU and it isn't coming out till June (and in limited numbers even then), do you think it's worth pulling the trigger on a 7xxM laptop just a few months before Haswell/Windows Blue etc.? -
bigtonyman Desktop Powa!!!
Looks pretty sweet. I'll probably wait till the next M14x revision and pull the trigger then when it gets haswell, a 700 series gpu, and hopefully some more goodies as well.
-
GT 745M = 837MHz+ GPU boost 2.0/1250MHz
GT 650M = 950MHz/900MHz
I don`t know how these will compare to be honest. The 745M might be on pair with it I think.
-
Karamazovmm Overthinking? Always!
I still don't find it feasible that those GPUs gained so much performance from a simple clock increase. That also means the TDP of those things has been raised
-
With earlier GPUs, we had a base clock and a max turbo boost.
With GPU Boost 2.0, we still have a base clock and a max turbo boost, but we also get an additional boost that goes beyond the max turbo boost. It can overclock further IF the temperature is within a limit, probably chosen by the notebook OEMs. So on average it will still stay within the TDP envelope, since it is controlled by temperature, though the max TDP might increase.
You can see from the specs in the OP that the GT 750M goes up to 967MHz (turbo boost) PLUS boost (GPU Boost 2.0).
GPU boost 2.0 is explained in this picture.
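My understanding of that behaviour, as a rough sketch. All clocks, the step size and the temperature limit here are illustrative assumptions, not NVIDIA's actual firmware values, and the real algorithm also factors in voltage and board power:

```python
MAX_TURBO = 967    # MHz, advertised max turbo boost of the GT 750M
BOOST_STEP = 13    # MHz per boost bin (assumed)
TEMP_LIMIT = 80    # degrees C, limit presumably chosen by the notebook OEM

def next_clock(current_mhz: int, temp_c: float) -> int:
    """Step the clock up while under the temperature limit; fall back
    toward the advertised max turbo boost when over it."""
    if temp_c < TEMP_LIMIT:
        return current_mhz + BOOST_STEP            # extra Boost 2.0 headroom
    return max(MAX_TURBO, current_mhz - BOOST_STEP)
```

With good cooling, repeated steps carry the clock past 1000MHz; with poor cooling it settles back at the advertised 967MHz.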
-
Karamazovmm Overthinking? Always!
-
Correct, although some of the GPUs in the OP are GK208. Nobody knows which yet since Nvidia won't say.
GPU-Z won't recognize the GT 750M either, so I'm not sure. GPU: 0FE4?!
-
The 740M is GK208...
-
You're right, Prema. As always.
The GT 740M comes with both GK107 and GK208. The GK208 version has a 64-bit bus and DDR3. LOL
-
Speedy Gonzalez Xtreme Notebook Speeder!
So is it safe to assume that the late 2013 rMBP will include a GT 750M? If it does, this may affect temps even more due to the increased TDP.
Or if they pair this 750M with a Haswell CPU that runs cooler, that may be the trick to keeping temps under control, since the heatsink is shared and the CPU affects GPU temps.
Not a lot of people seem interested in midrange GPUs, but it's cool to see a 1/4-inch machine running games decently -
^^ Speedy, I am dying to get my hands on a 13" lappy that is < 1" thick and scores 4k in 3DMark11. Oh man, I would love that machine
-
Karamazovmm Overthinking? Always!
-
Do people have any idea when the enthusiast level cards are coming?
-
#1: How many notebooks can you find with the 8870M? I tried searching Newegg and Amazon; nobody offers any. I know of one notebook that does: the Samsung Chronos. So you are kinda stuck with one model. And here lies AMD's problem, which is discussed in the other thread: too few models with their GPUs.
#2: Your TDP information is wrong. The TDP of the 7750 is 55W, and the 8870M's thermals should be around 12% more: 55W x 1.12 = 62W for the 8870M. The GT 750M TDP is around 45W.
#3: You get almost the same 3DMark11 score out of a GT 750M as out of an 8870M. The 8870M scores 2986 (GPU score), the GT 750M 2690 - about 11% more for the 8870M.
So to conclude: you get an ~11% better 3DMark11 score with the 8870M, but you also get a GPU with a roughly 37% higher TDP. -
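Spelling out that arithmetic (the 12% thermal margin and the 45W GT 750M figure are the post's own estimates, not official numbers):

```python
# TDP and 3DMark11 comparison using the figures quoted in the post.
tdp_7750 = 55.0                     # W, desktop HD 7750
tdp_8870m = tdp_7750 * 1.12         # assumed +12% thermals -> ~62 W
tdp_750m = 45.0                     # W, estimated GT 750M TDP

score_8870m, score_750m = 2986, 2690
score_gain = (score_8870m / score_750m - 1) * 100   # ~11%
tdp_gain = (tdp_8870m / tdp_750m - 1) * 100         # ~37%
```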
Karamazovmm Overthinking? Always!
1) It doesn't matter how many models you have with X GPU; it matters whether it fits the thermal and power budget. The drivers are a concern here, though not as much as at launch.
2) AMD FirePro M6000 | techPowerUp GPU Database - same GPU actually, just with lower clocks, so I figured around 45W or a little higher, nowhere near 62W. And the 7800M and 8800M series, which are the same, are based on the 7770, not the 7750. I saw some benchies from a guy with an 8570w pushing 1GHz on the clock, and yes, the performance is that of a 7770 - a little higher actually, which is strange.
3) I really don't believe those 750M benchies, sorry. And that's still 200 pts -
Speedy Gonzalez Xtreme Notebook Speeder!
I am not an NVIDIA fanboy, but if I can get a green GPU for the same price, I'll take it.
Also, the 750M should be pretty close to 670M performance, no? -
You know that the 7770 and 7750 are the same GPU, right? Both are Cape Verde.
HD 7750: 512 cores @ 800MHz. GDDR5 @ 1125MHz. TDP = 55W
7870M: 640 cores @ 800MHz. GDDR5 @ 1125MHz. TDP = less than 55W? How do you come up with that?
8870M: 640 cores @ 725MHz. GDDR5 @ 1125MHz. TDP = ?
The GT 750M 3DMark11 score is pretty much like the 670M's, it seems. Going to be interesting to see some Vantage scores though. -
Karamazovmm Overthinking? Always!
I really don't believe that an overclock can bring the GPU performance to 670M levels. Let's face it, the 750M clocks are the same as your 650M in the rMBP - are you at the level of the 670M? Nope.
2) The 7750 and the 7770 are actually different GPUs from the same family
AnandTech | AMD Radeon HD 7750 & Radeon HD 7770 GHz Edition Review: Evading The Price/Performance Curve
The 7750 was in the 7700M series; the 7770 was in the 7800M series.
And Cloud, in the same manner that the 7770 has a TDP around 80W, guess what the TDP of the 650 is - which covers the whole 640M/650M/660M range?
3) I still find it hard to believe such a boost in performance comes from such low clocks. I think it's going to land around the 660M, and if it holds up, a little better.
4) We still have the heat problem; those clocks don't come at no cost. -
Speedy Gonzalez Xtreme Notebook Speeder!
Let's say both GPUs run as hot and both get the same performance, or very close according to Cloud's screenshot. Why do you prefer AMD over NVIDIA if Apple is going to charge the same for the laptop with either of those GPUs? Again, I am not a fanboy and don't want to start an argument; I just want to know your point of view.
If you want to know mine: I would prefer AMD if it is 15% faster or more. -
The desktop 7850 is a 1024-core part with an 860MHz core clock at 130W. The 7970M is a 1280-core part at 850MHz with the same memory clock as the HD 7850. Is it a 145W card? Nope, it's 100W. -
What logic is this, really? It's like saying I'm a millionaire. Why am I a millionaire? Because I live in a country close to gold-rich soil. But that doesn't make me a millionaire. I still have to FIND the gold and wash it out from tons and tons of soil.
Something similar can be said about the 7870M and 8870M. It's there, but you can't really hold it in your hand, since there are almost no products that feature it and you can't buy just the card. I understand your point though. The 8870M is a fine piece of hardware. Gold is a fine piece of metal too
GT 650M: 384 cores @ 958MHz. GDDR5 @ 900MHz. TDP = 10% less for the cores, so we are down to 58W. God knows what the VRAM contributes to the TDP. 45W doesn't seem far-fetched to me.
But there is one factor I forgot all about, which is really important: voltage. Core clocks in mobile GPUs are lowered from the desktop models. I couldn't find any 7870M/8870M examples, but looking at the 6870M and HD 7770, the 6870M runs @ 0.950V while the HD 7770 runs @ 1.2V. I guess we can expect the same from the 7870M/8870M too, so TechPowerUp and you are probably right about the TDP.
The same can be seen with the mobile Nvidia GPUs: the GTX 680M has a max voltage of 1.037V, the GTX 670 1.175V.
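That voltage argument can be sanity-checked with the usual first-order rule that dynamic power scales with frequency times voltage squared (P ~ f * V^2). This is a simplification (it ignores leakage), and the equal-clock assumption below is mine:

```python
def relative_power(v_mobile: float, v_desktop: float,
                   clock_ratio: float = 1.0) -> float:
    """Mobile power as a fraction of desktop power under P ~ f * V^2."""
    return clock_ratio * (v_mobile / v_desktop) ** 2

# 6870M at 0.950V vs HD 7770 at 1.2V, same clock assumed for illustration
frac = relative_power(0.950, 1.2)
print(f"~{frac:.0%} of desktop power")   # ~63%
```

So the voltage drop alone cuts power by roughly a third, which is why the mobile TDP estimates land so far below the desktop parts.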
-----------------------------------------------------------------------------------------------
@ R3d: All good points. It's down to getting the mobile GPUs to run at lower voltage, I think. -
TechPowerUp says the TDP of the GT 750M is 33W. lol
NVIDIA GeForce GT 750M | techPowerUp GPU Database -
But will it be the same on the GDDR5 versions?
-
Karamazovmm Overthinking? Always!
- Tim Cook: well, we have 2 GPUs there that we can shove in the rMBP 15
- Jobs' spirit: But you know, AMD doesn't exist aside from that one notebook from that hideous company! And you know my green body matches Nvidia, so let's go with that
And the 650M is 45W.
TDP was always about temps. I don't see any change aside from them doing what Intel did with the P4: they determine an average load and use that as the TDP, so in the end the TDP of said chip can be much, much higher under load. For OEMs this is good - they can use those golden charts from Nvidia while designing the cooling for that average load, not the real high load - so in the end you pay for more and receive way less.
But Cloud, I think you are right in the performance blurb
http://forum.notebookreview.com/app...p-overclocking-results-gaming-benchmarks.html
1035core1729mem.jpg Photo by jakubp12 | Photobucket
I guess I really forgot those, sorry
PS: that performance is going to be around what that thing that shall not be spoken of is when fully activated
PPS: I forgot to post these as well
http://www.3dmark.com/3dm11/4891670
http://www.3dmark.com/3dm11/5069841
the latter is the lowest I could find for the 7850M
Now speedy, you can guess which performance you like -
Speedy Gonzalez Xtreme Notebook Speeder!
Did you even read my question? LOL.
At the end of the day we have to settle for whatever the laptop manufacturers use; it's not like we can replace soldered cores or swap proprietary GPUs on midrange laptops -
HOLY CRAP. Here you can see how GPU Boost 2.0 works.
When it's within the temperature limit, it overclocks the GPU further.
As for the GT 750M inside the Lenovo Y400: GPU Boost overclocks the GT 750M from 967MHz to 1099MHz. Automatically, without the user doing anything!!
Can you people imagine how powerful the GTX 780M will be with this GPU Boost? 1000MHz+ clocks without touching anything. Everything is automatic.
GT 750M GPU-Z. You can see the max speed there (967MHz) when GPU Boost 2.0 is not active.
GT 750M when GPU Boost 2.0 is active: 1099MHz. Holy crap.
-
failwheeldrive Notebook Deity
-
It will have 1536 cores
GPU Boost 2.0 is a far better feature than an overclocked GPU, in my opinion. With an overclocked GPU you get full speed all the time, and high temperatures. With GPU Boost 2.0 it will only exceed the max turbo boost if the temperature is within a limit. So it is very important to have a notebook with a decent cooling system: if the GPU fan can hold the temperature down, you get full clocks. If you have a notebook with a crappy cooling system, GPU Boost 2.0 won't overclock it further and you will be left with stock clocks. It is basically a system that pushes out max clocks at all times based on temperature, so you don't have to monitor the GPU temperature yourself and apply higher clocks manually when temperatures are low. It checks temperatures and applies higher clocks automatically with the 700M series -
If it has 1536 cores then it would be awesome (coupled with TB 2.0 and an M17x for nice heat management). Fingers crossed
-
failwheeldrive Notebook Deity
Assuming it works well, it'll be a nice feature for future Nvidia cards. I still don't see it offering better performance than a manual overclock performed by an experienced user.
Oh, and if it's 1536 cores, I may end up upgrading the 680ms -
If the 780M has only 1536 cores it will be a disappointment for me; I expected more progress.
-
failwheeldrive Notebook Deity
-
Karamazovmm Overthinking? Always!
This GPU Boost 2.0 is an even worse sham than turbo boost. At least there you have the excuse of getting it done quickly, and turbo boost still stays within the TDP computational margin.
This is the P4 all over again. Ridiculous thing - we not only have to check whether the cooling is alright, but also whether it will give us the full power of the GPU or just a little bit. This is a sham. -
All GPUs in the 700M series will get extra clocks from TB 2.0, but how much depends on the cooling. If you have a notebook with a crappy cooling system, you might cap out at 900MHz because that already pushes the GPU to 80C. Another system with a dedicated GPU fan will only reach 80C at 1050MHz. So as long as this stays within the GPU's power cap - let's say 55W for the GT 750M - the notebook with a dedicated GPU fan will perform better. That's my understanding of it.
It's not a sham, it's genius. -
We'll have to see how manual overclocking will work, because this boost will probably be overly conservative and will limit overclocking potential, not to mention overvolting possibilities.
-
So with a custom VBIOS for the 700M series there might be another variable to change for the people who make these VBIOSes.
-
failwheeldrive Notebook Deity
That's really interesting, I didn't realize that's how overclocking worked with the Titan. I hope this won't cause unforeseen issues with overclocking mobile GPUs though.
-
Karamazovmm Overthinking? Always!
It's still the P4 all around, and frankly this is borderline absurd. I wonder if you can sue them for that -
Again the same
Suing someone for their legitimate, quality-assured & well-polished product... one that is functioning within specifications & advertising. If you don't like it, don't buy it.
-
-
Source: Anandtech
Well, that wasn't really the message Anandtech was portraying.
It seems the 740M-750M use existing Kepler parts from last year, but of course with the feature tweaks and enhancements, including higher factory clocks. In essence, it looks a lot more like minor tweaks.
As for GK208, the way Anandtech portrays it, it appeals more to the ultrabook & ULV segment; GK208 does not seem like it'll go into the 740M-750M. But GK208 seems promising for what it can do for the thin and light.
Then again, if GK106 becomes a 750M, I don't know - but there could be increased power consumption. -
Karamazovmm Overthinking? Always!
-
That is why they say "but only if you use these tires". They will perform as well as they can, given the proper circumstances. If the laptop manufacturer doesn't put in an adequate cooling system, is that Nvidia's problem?
-
Karamazovmm Overthinking? Always!
It's kind of like: you are an engineer, and you can build to this good standard, or to this one where it will sometimes rain inside
NVIDIA Releases New GeForce GT 700M Mobile GPUs
Discussion in 'Gaming (Software and Graphics Cards)' started by Cloudfire, Apr 1, 2013.