And I'll take that as confirmation that AMD dropped the ball and that Barts is not fit for laptops.
-
I was about to conclude that the GTX 485M would be about 15% faster than the 6970M based on those tests, but then I realized that the GTX 485M has the backing of the most powerful notebook CPU (i7-2920XM) while the 6970M system is using an i7-740QM. So it's 1.7GHz of old tech vs 2.5GHz of new tech. That'll definitely make a difference in games, even the seemingly CPU-independent ones.
-
I must say, if that is all AMD could get out of 100W, I'm seriously disappointed.
It probably is a CPU bottleneck. The 740QM isn't exactly adequate now that we're getting into mobile GPUs that are faster than a 5770. -
I think it's a fine GPU. People are expecting too much out of 40nm; it's faster than the 5870M, almost as fast as the GTX 485M, and costs less. I think it's a good GPU.
-
20% slower is not "almost." The difference has to be less than 10% to be acceptable, imo.
-
We don't need to know how the GTX 485M does with Sandy Bridge though...we've already got those benchmarks...we need to know how the HD6970M does with Sandy Bridge.
The HD6970M benchmarks were run with an i7-740QM, which in multi-threaded games like Far Cry 2 and World in Conflict only sees clocks of 1.83GHz. In Far Cry 2, even your 2.4GHz Q9200 should gain the HD6970M some FPS over the i7-740QM.
Gains from a faster CPU do plateau in a lot of games, but a CPU doesn't have to be pegged at 100% before a game will see gains from a faster one.
Sounds more like total system draw. I don't see the default 120W power brick that comes with the W860CU breaking 150W.
-
I took the first sentence as pertaining to the GPU under load alone, and the second to be CPU + GPU. 100W GPU + 45W CPU might explain the 150W load number.
Could be wrong.
Obviously I also assume they were using a larger PSU. Or a conspiracy theorist might say that PSU tripping was causing the low test scores. -
Yeah, and for the G53 also. I'd be psyched.
-
It's not 20% when they use the same CPU. You're comparing 1.7GHz vs 2.5GHz, nice...
It's more like 10% at half the price...
6970M rocks! -
Even if AMD measured TDP as power consumption, GPUs aren't rated under a maximum stress load...and there are still other components (chipset, screen, etc.) adding to total system draw. -
Well, I'm getting the 485M with an i7-2630QM and will do some benchmarking, which should settle any questions one way or another.
-
I think that we're just going to have to wait until one of us has this in his/her hands (or at least a review that we don't need to use an error-prone translator for).
-
I wonder if the 6970m will have switchable graphics. I think that alone might push me to get it over the 485M.
-
Every discrete GPU from (at least) the last four generations supports switchable graphics.
-
Hello adbot. How you doing?
-
Test AMD Radeon HD 6970M Grafikkarte - Notebookcheck.com Tests
I think this counts as a "half-way decent review".
-
"740QM"
"Unfortunately, our test unit had to struggle with a serious problem: In many benchmarks, the same screen directly after the start of black, which each made a reboot necessary. Reason: Eurocom accidentally had the wrong radiator mounted in, causing temperatures in critical regions grew quickly. With the right cooler improved the situation, although significant, still occurred even at high resolutions and graphics-intensive settings often still " Black Screens "on."
Could be better, but okay. -
Eurocom sent them an engineering sample 6970M.
Don't take anything said in that review without a heavy dose of skepticism. -
Or the W860CU doesn't have the cooling capacity for that card... which would put it beyond a 75W TDP.
-
Yeah, I think we're at the point where it can only be a 100W card. Really disappointing performance in that case, though...
-
The system power draw looks to be that of a 100W card, but that could be because they're engineering samples...
-
Perhaps, but I highly doubt it.
-
There are many different "Ors" that could be causing skewed results. That's the problem.
Again I have to comment on where it's written that wattages are measured in increments of 25. If it's not 75W it must be 100W?
Eurocom saying the HD6970M has a TDP of 75-100W is the same as saying they're not sure either. -
Well, the total system draw was significantly higher than that of a system with a GTX 480M, so I'd have to say that makes 85W the absolute minimum, with 90+ being the most likely. Whatever it is, it's too high for the level of performance.
-
A bad engineering sample, and/or an unoptimized BIOS can jack up power draw.
Justin@XoticPC posted the official word from Sager on the HD6970M here.
According to Sager the HD6970M is still being debugged by Clevo and no official release is ready yet. -
Either way, the 100W TDP for the 6970M makes sense considering that the desktop 6850 has a max board TDP of 127W. The 6970M, though it has a lower core voltage, also has 2GB memory modules vs. the 1GB memory modules in the 6850, which could push power draw high enough to make it a 100W card. My guess would be between 85-100W after everything is finalized.
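For what it's worth, here's the back-of-the-envelope scaling behind that guess, assuming dynamic power goes roughly as frequency times voltage squared. The 775MHz and 680MHz numbers are the published reference core clocks; both voltage figures are my own guesses, not official specs.
```python
# Rough scaling of the desktop 6850's board power down to 6970M clocks/voltage.
# Dynamic power scales roughly with frequency * voltage^2; this ignores leakage
# and the extra memory, so treat it as a ballpark only.

desktop_board_power = 127.0  # W, max board power quoted for the desktop 6850
desktop_clock = 775.0        # MHz, reference 6850 core clock
mobile_clock = 680.0         # MHz, 6970M core clock

desktop_vcore = 1.10         # V, assumed desktop core voltage (guess)
mobile_vcore = 0.95          # V, assumed mobile core voltage (guess)

scaled = desktop_board_power * (mobile_clock / desktop_clock) * (mobile_vcore / desktop_vcore) ** 2
print(f"Scaled board power estimate: {scaled:.0f} W")  # ~83 W before the extra memory
```
That lands right around the low end of the 85-100W range once you add something for the doubled memory.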
-
I still think that hoping for much more is being a bit optimistic, and the fact that they're taking so long with the card makes me worried that they're running into major issues.
-
Hynix 40nm 2GB modules can draw up to 20% less juice than the 50nm 1GB modules depending on what voltage they're set at. They'll draw the same juice at the same voltages.
-
I also presume 6970Ms are better-binned parts than their desktop counterparts, given that they can sell for more.
-
That depends on the reseller; they generally get differently binned parts,
and so far it seems the Dell ones are the highest-binned among the Mobility cards. -
6970M benchmarks.
AMD's Mobility Radeon HD 6970 In CrossFire On Eurocom's Panther : AMD Attacks On the Mobile Front
Horrible CrossFire performance (driver issues?), but 10 to 17% better than a 470M as a single card. Great alternative to the 485M if priced right. -
The X7200 is not yet ready for the 6970M. We need an updated BIOS, vBIOS, and drivers. And most of all we need finalized cards, not the preview samples.
-
It looks like Tom's took Eurocom's guesstimate of 100W for the HD6970M's TDP, but their actual power draw tests tell a different story.
If you haven't realized the obvious yet, advertised TDP numbers have become about as obsolete for judging a GPU as 3DMark06. TDP is more a marketing label than a real GPU spec. -
post removed. wrong card.
-
So what's the TDP? We know the 485M is 100W...
-
Even if we knew, it would be meaningless without a standard.
For Nvidia, TDP is power draw. For Intel and AMD, TDP is thermals...though they each measure under different conditions. -
Between us, let's ignore AMD and use the term TDP to mean load power draw. That's the meaning the majority goes by when using the term, and it works for me.
Once a retail 6970M is paired and tested with Sandy Bridge, I'll come up with my own estimate. Unfortunately, after today's events, we know that won't be happening for quite a while.
So the "Intel just ruined my future" guess of the day is 87W. -
Define load...because if you want to use max GPU load, then we don't know the TDP of any card, since no one measures TDP under that condition.
Generally TDP is rated at "normal" stress loads, but normal is arbitrarily defined by each company.
E.g., Nvidia rates the GTX 480M's TDP at 100W (whole card), but according to Tom's test, stressing a second GPU under max load jumps total system power up by 140W over the single GTX 480M.
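To put a rough number on why that 140W jump matters: converting the at-the-wall delta to DC power needs a PSU efficiency figure, and the ~85% below is my assumption, not something Tom's measured.
```python
# Converting an at-the-wall power jump into a rough DC-side draw for the card.
# 140 W is the system-level delta quoted above; 85% AC-to-DC efficiency is assumed.

wall_delta = 140.0       # W, extra wall draw when stressing the second GTX 480M
psu_efficiency = 0.85    # assumed AC-to-DC conversion efficiency of the brick

dc_delta = wall_delta * psu_efficiency
print(f"Approx. extra DC power for the second card: {dc_delta:.0f} W")  # ~119 W vs. the 100 W rating
```
Even with a generous efficiency figure, that card is pulling well past its 100W rating under that kind of load. -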
Either way, TDP can't exceed the maximum amount of power drawn by the card. So it's possible to have a card that consumes 120-ish watts of power but generates 100W of conductive heat.
-
Apparently the 6490M and 6750M are in the new MacBook Pro, with dynamic switchable graphics... Anyone care to comment?
-
That's good news for the 6750M; on paper it looks sweet. 480 SPs is 20% more than a 5650M, and being Northern Islands arch I expect at least another 20% more performance on top. Finally, it has GDDR5, so twice the bandwidth; it should perform in a category of its own compared to previous mid-range mobile GPUs (rough math below).
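Just to spell out how those guesses stack; the 20% architecture bump is pure speculation on my part, and only the shader counts and memory type come from the spec sheets.
```python
# Stacking the rough multipliers above; these are guesses, not benchmark results.

shader_gain = 480 / 400   # 6750M SPs vs. 5650M SPs -> 1.20
arch_gain = 1.20          # assumed Northern Islands per-shader improvement (guess)

combined = shader_gain * arch_gain
print(f"Combined shader-side estimate: ~{(combined - 1) * 100:.0f}% over a 5650M")  # ~44%
# On top of that, GDDR5 roughly doubles bandwidth, which should help most at
# higher resolutions and with AA, where the 5650M's DDR3 is the bottleneck.
```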
The only trouble being the MBP 15 packing it is, well, not cheap. -
Leaked drivers for HP's upcoming dv6 and dv7 suggest that they'll use the 6470M and 6770M.
-
I'll be shocked if the MacBook uses a GDDR5 card. That memory can run pretty hot, which doesn't bode well for their ultra-thin design philosophy.
-
Consider me genuinely shocked. Every generation of MacBook Pro has people complaining about heat, and something tells me that this will be no different.
-
Well, AMD hardware does run cooler than Nvidia's.
I'm still shocked that they actually packed the 6750M in there. -
I won't be able to say Apple computers' GPUs ain't worth crap anymore.
-
It definitely appears stronger than typically expected, but I'm going to be cautious, as it's possible Apple has underclocked parts of the 6750M to manage thermals.
Hopefully it isn't by too much, as this is a nice jump spec-wise, and I hope it delivers. -
I just hope more laptops will pack these; then maybe it'll put pressure on Nvidia, as the 6750M/6770M will beat the crap out of their standard 96-core + slow DDR3 parts.