Can't see there being much difference. Both Optimus and Enduro just put the GPU in a low-power state, and that should be a fraction of what the other components are using.
-
No, it completely turns off. AMD has a separate low-power feature that has nothing to do with Enduro.
-
Optimus and Enduro are supposed to do the same thing, but their implementations are different. In the past, Optimus would absolutely murderize Enduro, but apparently with the 7970M they've made it much better.
-
Murder bacon? No, it wouldn't. Switching is actually a good idea, but the implementation was terrible, sometimes even breaking games so they wouldn't start at all. Both were terrible; now, if you want to pinch pennies... -
I wonder if Enduro now works with OpenGL. AMD's old solution never did.
-
Everything I'm reading about Enduro on the new cards seems to indicate they've made significant improvements, not just for switching to the HD 4000 but for shutting down parts of the card as they aren't needed and turning them back on when they are.
One of the other threads mentioned that when running 3DMark Vantage/11 on the 7970M it was running at 68 degrees, which makes me wonder if it's shutting parts down even during a benchmark or if the cooling is just that good. -
The cooling is that good.
It wouldn't be shutting parts down during benchmarks. I don't think it shuts parts down at all actually. I think the whole dGPU is either on or off.
-
That is Zero-Core, but it's not relevant here.
Zero-Core will not work on these laptops. Zero-Core is only used when there is CrossFire and more than one card: it completely shuts down all but the first GPU, to the point that in a desktop PC the fans on the auxiliary GPUs even stop spinning.
ATI's Enduro works by shutting down the entire GPU except the PCIe bus portion, allowing it to receive commands so it can wake up quickly. Enduro and Zero-Core are somewhat similar, but for totally different purposes.
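If it helps, here is that difference as a toy Python sketch, based purely on the descriptions in this thread; the state names and wake triggers are my own simplification, not AMD's actual driver logic:

```python
# Toy model of the two power schemes described above. State names and
# wake triggers are illustrative guesses, not AMD's real driver logic.

class EnduroGPU:
    """Laptop dGPU: everything gated off except the PCIe block,
    so an incoming command can wake the rest of the chip quickly."""
    def __init__(self):
        self.state = "off_except_pcie_block"

    def pcie_command(self):
        self.state = "on"          # PCIe block wakes the whole GPU
        return self.state

class ZeroCoreGPU:
    """Desktop card: near-fully gated when idle (screen blanked, or an
    idle secondary CrossFire card); fans stop, draw falls to a few watts."""
    def __init__(self):
        self.state = "on"

    def go_idle(self):
        self.state = "zero_core"   # chip almost completely powered down
        return self.state

    def wake(self):
        self.state = "on"
        return self.state

laptop, desktop = EnduroGPU(), ZeroCoreGPU()
print(laptop.state, "->", laptop.pcie_command())  # off_except_pcie_block -> on
print(desktop.go_idle(), "->", desktop.wake())    # zero_core -> on
```
-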
You sure? Zero-Core has been shown to work in single-GPU desktop systems as soon as the screen is switched off, enabling the GPU to idle at a couple of watts, so it should be the same principle in laptops. Power gating, on the other hand, would describe the partial deactivation of the GPU in low-load/2D scenarios.
-
7970 inside a Clevo (German website)
Power consumption is always below the 675M's. Looks like a very potent card for a laptop - might be too much for my needs
-
Google Translate -
Outstanding!
-
unpossible!
-
That is some serious hurt, man. Wow.
I'm getting more and more interested now.
Hopefully 2013 will be an awesome year.
-
Wow... I am actually kind of disappointed in the performance versus what was quoted in the past, i.e. "50% faster" - I'm seeing GTX 580 -> Radeon 7970 (desktop) style gains. Which I guess makes sense: Nvidia's top-of-the-line "last gen" card versus AMD's current top-of-the-line "next gen" card. My guess is the 680M will change the game again... with another 10-15% speed boost, very similar to the desktop GTX 680 versus the Radeon 7970.
I'm still very excited to receive my 7970M-equipped laptop.
-
Kepler GTX 680M specs and benchmark leaked
Half a GK104: first GTX 680M photos and benchmarks - NVIDIA, Kepler, GK104, GTX 680M - MyDrivers (in Chinese)
256-bit 4GB VRAM, 768 stream processors, 100W TDP, 3DMark 11 P4905 (with a 3720QM) -
that looks familiar somehow....
-
The average fps lead looks like 40% or more in nearly every test. I see nothing wrong there.
-
Don't know if it was posted before:
NVIDIA GeForce GTX 680M Detailed and Pictured | VideoCardz.com -
Hmmm, looks like Nvidia is not going to do very well this year. I just might have to go AMD.
-
Was expecting more out of Nvidia. They might have to actually compete on price for a change.
-
Would that article happen to include a price?
-
It doesn't, but it's safe to assume that it'll be much more expensive than the 7970M.
-
It will probably go back to the same price the 580M had; then nobody will buy it once the 7990M comes out, because it will both outperform it and be several hundred dollars cheaper.
-
No, it's not safe to assume. Nvidia has not always been more expensive than AMD; they can always return to a more decent pricing policy.
The only way the 680M will survive on the market against the 7970M is if it's 50 USD cheaper than the 7970M.
I wanted Nvidia to at least match the 7970M in terms of performance so I could buy a 3D laptop. Now this will make matters a little more complicated, since I will have to buy a 3D laptop but ask for a GPU swap, and few resellers will agree to that without killing me on the price. -
There probably won't be a 7990M because there's basically no need for one. The 7970M is already 100W, anyway.
-
Kevin makes a good point: the 6970M was launched as a 100W GPU, thus... if there is a 7990M, it's probably going to be based on another chip. Which one, I don't know, but there is a possibility. -
I suppose. But when it turns into a 5-15 FPS difference it's not really that impressive. Not sure what I expected. I guess it's the whole thing of synthetic benchmarks versus actual games.
-
There's always a possibility. We'll have to see what nVIDIA cooks up first... if it beats AMD by some margin, AMD is surely making a 7990M (I would presume they would)...
...and nVIDIA would be forced to make another update, just as we've made it to the next tock XD -
There is no chip better suited for the high-end mobile slot than the Pitcairn XT currently used in the 7970M. AMD can't unlock cores (unlike the 6970M -> 6990M), so the hypothetical 7990M would get a mild clock-speed boost at best.
-
Or a gutted 7950 die (not likely, though), or the unicorn 7890. -
Yeah, a 7890 would be most likely. If there is a 7990M (which I'm hoping for), there will probably be a 685M too, and it might perform more like what we're hoping for.
-
The problem is that a 7890 would enter the 7950's market segment.
-
Yeah, the original 7890 leak is junk...
...because none of the 7800 cards in it ended up being correct. It's best to assume that it never did and never will exist, until AMD pulls it out of a hat.
Now, AMD could Frankenstein a 1408-shader, 256-bit card and call it something like a 7990M.
Looking only at the 1080p averages paints a clearer picture.
DIRT 3:
- 0xAA = 67.2 vs 49.3
- 2xAA = 63.1 vs 45.5
Crysis 2:
- 0xAA = 48.6 vs 30.7
- 4xAA = 48.4 vs 28.4
Metro 2033:
- DOF off = 36 vs 19
- DOF on = 25 vs 15
Those are massive leads, and in the last two games the difference is that the 7970M gives you playable framerates, while the 675M ranges from barely hanging on to failing miserably. Even in the worst-case scenario, rates of 25 vs 15 may not look like much, but if you were forced to switch between those two values while actually playing, you'd notice a tremendous difference. A year from now, that 40 to 50 percent lead will present many more scenarios like the latter, meaning the 7970M will let you play at settings at which the 675M could not maintain 30fps.
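For reference, here is how those averages work out as percentage leads; a quick Python check using only the numbers quoted above:

```python
# 7970M vs 675M percentage lead, computed from the 1080p averages above.
results = {
    "DIRT 3, 0xAA":        (67.2, 49.3),
    "DIRT 3, 2xAA":        (63.1, 45.5),
    "Crysis 2, 0xAA":      (48.6, 30.7),
    "Crysis 2, 4xAA":      (48.4, 28.4),
    "Metro 2033, DOF off": (36, 19),
    "Metro 2033, DOF on":  (25, 15),
}

for test, (amd, nv) in results.items():
    lead = (amd / nv - 1) * 100
    print(f"{test}: {amd} vs {nv} fps -> 7970M leads by {lead:.0f}%")
```

That works out to leads between roughly 36% and 89% depending on the game, broadly consistent with the "40% or more in nearly every test" reading earlier in the thread.
-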
Hmm. Given how delayed the 675Ms are, I wonder if I would lose my spot in the GPU line if I switched to the 7970M...
-
It's hardly a thinking matter anyway. Upgrade your GPU. Speak to your reseller.
-
Yeah, but my school year ends two weeks before most normal ones, and I don't want to wait all summer to show it off. But I'm thinking I'll call them in the morning and switch; it would just be one more week if I lose my spot. The main issue is money: I'm already $50 over budget.
-
Actually, neither the 6970M nor the 6990M had a 100W TDP; they were more like 90W or even less. There were some threads here debating the TDP of these cards, and the conclusion was that they were not 100W TDP parts, but we could never really pinpoint their exact TDP.
-
Indeed, but it was launched as a 100W part nonetheless. -
Yes, their power consumption is about 90W, but their dedicated heatsink is rated for a 100W TDP (thermal design power). -
here we go again with TDP vs. power consumption
oh btw, DGDXGDG, you're the devil! (666 posts, rep power 6, and 6 green boxes!)
-
I swear to god, since you said that, he hasn't posted once!!
-
Yikes, it's Damien Thorne!
That one's too good not to save for later. Nice catch, jaybee. Regardless, it's probably best that they were not. They always ran a lot hotter when heavily OC'd for me than the GTX 580M has. I've never been able to trip the PSU on the M18x with 6970M/6990M CF, but pushed hard enough, the PSU has tripped several times with the 580s. The 330W AC adapter doesn't offer a whole lot of wiggle room when an XM proc and two GPU beasts are pushed super hard.
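For a sense of scale, here is a rough back-of-the-envelope power budget; every component number below is an estimate for illustration, not a measured value:

```python
# Back-of-the-envelope power budget for a heavily pushed M18x.
# All component draws are rough estimates, not measurements.
adapter_w = 330          # rated AC adapter output
cpu_w     = 90           # overclocked XM quad-core, estimated
gpu_w     = 2 * 110      # two heavily OC'd 100W-class GPUs, estimated
rest_w    = 40           # display, drives, board, fans, estimated

total = cpu_w + gpu_w + rest_w
print(f"Estimated draw: {total}W against a {adapter_w}W adapter")
print(f"Headroom: {adapter_w - total}W")   # negative -> adapter can trip
```

With estimates like these the total lands around 350W, past the 330W rating, which is consistent with the adapter tripping under extreme overclocks.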
-
LOL Poor guy's gonna remain stuck there till his rep power gets incremented with time...
-
Come on, I dare someone to rep him and upset the devil.
-
I'm not sure if that'll increase his rep power; it only adds to the rep received, indicated by the green squares.
And as a matter of fact, there are SIX of them now!
-
Now that you mention it... he even has SIX GREEN BOXES!!!
Corrected my post above accordingly.
-
Oh, he must be SO proud!
-
I'll have a go then.
+1 for the dark lord.