The Notebook Review forums were hosted by TechTarget, which shut them down on January 31, 2022. This static, read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information posted on the forums was preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.

    ATi Mobility HD 6000 series Roadmap

    Discussion in 'Gaming (Software and Graphics Cards)' started by Arioch, Jun 10, 2010.

  1. Kevin

    Kevin Egregious

    Reputations:
    3,289
    Messages:
    10,780
    Likes Received:
    1,782
    Trophy Points:
    581
    That report means absolutely nothing to the mobile sector.

    Barts has been in mass production for a long time now.
     
  2. Botsu

    Botsu Notebook Evangelist

    Reputations:
    105
    Messages:
    624
    Likes Received:
    0
    Trophy Points:
    30
    That's actually quite funny because here they say the exact opposite (that the outlook for 6950 supply is grim).
     
  3. Blacky

    Blacky Notebook Prophet

    Reputations:
    2,049
    Messages:
    5,356
    Likes Received:
    1,040
    Trophy Points:
    331
  4. Phinagle

    Phinagle Notebook Prophet

    Reputations:
    2,521
    Messages:
    4,392
    Likes Received:
    1
    Trophy Points:
    106
    Over in the AW forums, yyduinihao0 posted a pic of what looks like Crossfire HD6900M in the new M17xR3.

     
  5. Aikimox

    Aikimox Weihenstephaner!

    Reputations:
    5,955
    Messages:
    10,196
    Likes Received:
    91
    Trophy Points:
    466
    ^^ I was hoping those were single GPU results ;)
     
  6. Blacky

    Blacky Notebook Prophet

    Reputations:
    2,049
    Messages:
    5,356
    Likes Received:
    1,040
    Trophy Points:
    331
    I love how he gets 16000 points on Intel HD Graphics. It's probably due to Optimus... but I am still puzzled over how exactly it works.

    As for the 6970M scores... those are actually better than those of the 480M SLI in the X7200. The X7200 gets 20000 points but with a desktop i7-960 CPU; the R3 gets that with an i7-2720QM. That puts the 6970M where...? 5% faster than the 480M? If Nvidia releases its 580M at CES, I bet it's going to be a very tight battle.
     
  7. Aikimox

    Aikimox Weihenstephaner!

    Reputations:
    5,955
    Messages:
    10,196
    Likes Received:
    91
    Trophy Points:
    466
    The 960 in the X7200 scores about 5300 pts in 06; the 2720QM hits 5681. In this review of the X7200, the system with the 960 and 2x 480M scores 20144 in 06. Based on that, you can assume that the 480M is actually more powerful than the 6970M, especially if you keep in mind that Fermi cards aren't that good in synthetics. I'm afraid we've got no new king here. Again, I hope the R3's 20k was achieved with a single 6970M.
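
    (Side note on the math: 3DMark06 folds the CPU sub-score into the total with a fairly small weight relative to graphics, so the 960-vs-2720QM difference can only move the total a little. Below is a minimal sketch of that weighting; the constants are my recollection of the Futuremark whitepaper, not verified numbers, so treat the absolute values as approximate.)

        # Rough check of how much the CPU sub-score alone can shift a 3DMark06 total.
        # The overall score combines a graphics score (GS) and a CPU score in a
        # weighted harmonic mean; the weights/scale below are from memory of the
        # Futuremark whitepaper and are only meant to show the relative effect.

        def overall_06(gs, cpu, w_gs=1.7, w_cpu=0.3, scale=2.5):
            """Approximate 3DMark06 total from graphics and CPU sub-scores."""
            return scale / (w_gs / gs + w_cpu / cpu)

        def graphics_from_total(total, cpu, w_gs=1.7, w_cpu=0.3, scale=2.5):
            """Invert the weighting: graphics score implied by a total and a CPU score."""
            return w_gs / (scale / total - w_cpu / cpu)

        # X7200 (desktop i7-960, CPU sub-score ~5300) hitting ~20144 total:
        gs_x7200 = graphics_from_total(20144, 5300)

        # The same graphics score paired with the 2720QM's ~5681 CPU sub-score:
        print(round(overall_06(gs_x7200, 5681)))  # only a few hundred points higher

        # So the ~7% better CPU sub-score can't explain a 20K total by itself;
        # the graphics side has to carry almost all of it.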
     
  8. Phinagle

    Phinagle Notebook Prophet

    Reputations:
    2,521
    Messages:
    4,392
    Likes Received:
    1
    Trophy Points:
    106
    A single desktop HD6870 scores 21987 in 3DMark06 with an i7-920.

    EDIT:

    You were serious? What does the Mobility HD5870 in Crossfire with an i7-940XM score in 3DMark06? I thought it was like 15K-ish.
     
  9. Blacky

    Blacky Notebook Prophet

    Reputations:
    2,049
    Messages:
    5,356
    Likes Received:
    1,040
    Trophy Points:
    331
    Then there is something very wrong with that benchmark. A quad core at 2.2 GHz cannot possibly beat one at 3.2 GHz, or be twice as fast as the previous generation's CPUs.
     
  10. Devenox

    Devenox Notebook Evangelist

    Reputations:
    65
    Messages:
    417
    Likes Received:
    9
    Trophy Points:
    31
    So you think it's single?
    But isn't 3DMark06 a bad test once you pass 20,000 points?
    Sure, it's quite reliable if <15K.

    @blacky
    Consider that it's Sandy Bridge with Turbo Boost (let's say 2.5 GHz on all four cores).
    CPUs also start to throttle when they get too hot. The i7-960 is 130W? The new Sandy Bridge is 45W. So it wouldn't be that strange.
    I'm not saying it's the truth, but it's a possibility.
     
  11. miahsoul

    miahsoul Notebook Deity

    Reputations:
    75
    Messages:
    1,372
    Likes Received:
    0
    Trophy Points:
    55
    That makes 20100 on a single mobile Barts with a "lowly" 2720 pretty feasible if you ask me. :0
    Just can't wait to see how the new 69xxMs OC. If they're anything like the 58xxMs then they're gonna be IMPRESSIVE.
     
  12. ichime

    ichime Notebook Elder

    Reputations:
    2,420
    Messages:
    2,676
    Likes Received:
    3
    Trophy Points:
    56
    If AMD's roadmap is accurate, this 6900M series would be based off a Barts chip, which would suggest that it's a single-GPU score, though SLI and CFX scale too poorly in 3DMark06 to make a fully confident call on that.
     
  13. granyte

    granyte ATI+AMD -> DAAMIT

    Reputations:
    357
    Messages:
    2,346
    Likes Received:
    0
    Trophy Points:
    55
    There is something else to take into consideration: Alienware cheats with how Intel Turbo Boost is set up. In the R2, a 920XM could have all 4 of its cores using turbo at the same time. That gave it a really unfair advantage, and I guess the R3 cheats in the same way. This could explain why the Alienware quad beats the desktop CPU in the Clevo.
     
  14. Blacky

    Blacky Notebook Prophet

    Reputations:
    2,049
    Messages:
    5,356
    Likes Received:
    1,040
    Trophy Points:
    331
    +Rep

    Yes, that must be it. According to the specifications, the i7-2720QM must then have been running all 4 cores at 3.3 GHz, slightly more than the stock 3.2 GHz of the i7-960. But at those speeds, the power draw and heat... will let the laptop last just long enough to finish the benchmark.
     
  15. Ruckus

    Ruckus Notebook Deity

    Reputations:
    363
    Messages:
    832
    Likes Received:
    1
    Trophy Points:
    0
    I don't get this. Intel Turbo Boost is on for any i7, regardless of whether it's mobile or desktop. And Turbo Boost shuts down cores and clocks the remaining 1 or 2 cores higher, depending on the application.

    http://www.intel.com/technology/turboboost/
     
  16. City Pig

    City Pig Notebook Virtuoso

    Reputations:
    483
    Messages:
    2,322
    Likes Received:
    0
    Trophy Points:
    55
    It usually has to shut down cores, but not always, especially if using all of the cores is more efficient.
     
  17. granyte

    granyte ATI+AMD -> DAAMIT

    Reputations:
    357
    Messages:
    2,346
    Likes Received:
    0
    Trophy Points:
    55
    Actually no, Turbo Boost in the M17x-R2 was adjusting to the heat and power draw, so the system was staying within its own limits instead of the artificial limit forced by Intel.
    Also, the CPU probably didn't stay at 3.3 GHz for the whole run; it likely varied depending on the power draw of other components and the heat produced by the turbo.

    Yes, but some parts needed to be implemented in the BIOS for turbo to work correctly. Dell "failed" to implement them correctly, giving a system that could boost all 4 cores if the CPU was unlocked like the 920XM/940XM, whereas the quoted desktop CPU is likely using the reference design.
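
    (To make the "all 4 cores on turbo" point concrete, here's a toy sketch of how per-core-count turbo bins behave. The bin values are made-up placeholders, not the real 920XM/2720QM tables; the point is only that the allowed multiplier normally drops as more cores are active, unless the firmware or an unlocked part raises the many-core bins.)

        # Toy model of per-core-count Turbo Boost bins (hypothetical numbers).
        BCLK_MHZ = 133  # Nehalem-era base clock; Sandy Bridge uses 100 MHz

        # {active cores: max turbo multiplier} -- placeholder values, not real specs
        stock_bins  = {1: 24, 2: 24, 3: 22, 4: 21}
        raised_bins = {1: 24, 2: 24, 3: 24, 4: 24}  # all-core bins opened up

        def max_turbo_ghz(active_cores, bins, bclk=BCLK_MHZ):
            """Highest clock Turbo Boost would allow for this many loaded cores."""
            return bins[active_cores] * bclk / 1000

        print(max_turbo_ghz(4, stock_bins))   # ~2.8 GHz with all four cores loaded
        print(max_turbo_ghz(4, raised_bins))  # ~3.2 GHz if the 4-core bin is raised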
     
  18. Phinagle

    Phinagle Notebook Prophet

    Reputations:
    2,521
    Messages:
    4,392
    Likes Received:
    1
    Trophy Points:
    106
    If it's a single HD6900 card then maybe what they're looking at in the AW forums is an M15x....and the new design adds a numpad.
     
  19. City Pig

    City Pig Notebook Virtuoso

    Reputations:
    483
    Messages:
    2,322
    Likes Received:
    0
    Trophy Points:
    55
    Well, how does the 460M score look in that thread? Does it look like a single card or SLI?

    I wonder about the chances of it being two 6950Ms in Crossfire. That would explain the score very well and look good for AMD.
     
  20. Phinagle

    Phinagle Notebook Prophet

    Reputations:
    2,521
    Messages:
    4,392
    Likes Received:
    1
    Trophy Points:
    106
    12-13K I believe is stock in the ASUS G73JW with a 740QM.
     
  21. City Pig

    City Pig Notebook Virtuoso

    Reputations:
    483
    Messages:
    2,322
    Likes Received:
    0
    Trophy Points:
    55
    Hm. Then there may be a chance that both are single, but it seems unlikely. Though Vantage is a better benchmark than 06 anyway. With 11 out, 06 should be on its way out.
     
  22. Phinagle

    Phinagle Notebook Prophet

    Reputations:
    2,521
    Messages:
    4,392
    Likes Received:
    1
    Trophy Points:
    106
    It actually does, doesn't it?

    Desktop HD5770 scores 16.5K in 3DMark06 with an i7-920 and Mobility HD5870 scores 12.5K with an i7-720QM.

    If desktop Barts is scoring 22K with the same i7-920 then it is possible that a single HD6970M can score 20K with a CPU that scores ~2K higher than the i7-720QM.
     
  23. City Pig

    City Pig Notebook Virtuoso

    Reputations:
    483
    Messages:
    2,322
    Likes Received:
    0
    Trophy Points:
    55
    Hm. Depending on the resolution it was run at, that does sound right for the desktop HD 6850, doesn't it? In fact, it's actually a bit too low to be Crossfire with those results. Well, unless Crossfire can only add around 2000 to the score and the CPU wasn't overclocked.
     
  24. Phinagle

    Phinagle Notebook Prophet

    Reputations:
    2,521
    Messages:
    4,392
    Likes Received:
    1
    Trophy Points:
    106
  25. granyte

    granyte ATI+AMD -> DAAMIT

    Reputations:
    357
    Messages:
    2,346
    Likes Received:
    0
    Trophy Points:
    55
    Is it me, or did they just reuse the HD5870M chart...?
     
  26. Hendrick4life

    Hendrick4life Notebook Guru

    Reputations:
    3
    Messages:
    61
    Likes Received:
    0
    Trophy Points:
    15
    That chart makes me lol. To me that doesn't even look like 10 to 20%.
     
  27. sotoa

    sotoa Notebook Consultant

    Reputations:
    0
    Messages:
    193
    Likes Received:
    2
    Trophy Points:
    31
    I'm itching to hear some info on the 6900m cards. Anything yet?
     
  28. Ruckus

    Ruckus Notebook Deity

    Reputations:
    363
    Messages:
    832
    Likes Received:
    1
    Trophy Points:
    0
    Fudzilla is terrible, just fud.
     
  29. granyte

    granyte ATI+AMD -> DAAMIT

    Reputations:
    357
    Messages:
    2,346
    Likes Received:
    0
    Trophy Points:
    55
    Or it's a well-calculated move: releasing a chart that looks fake so people don't pay too much attention to it.
     
  30. Ruckus

    Ruckus Notebook Deity

    Reputations:
    363
    Messages:
    832
    Likes Received:
    1
    Trophy Points:
    0
    Or Fudzilla is terrible and just fud.
     
  31. Abula

    Abula Puro Chapin

    Reputations:
    1,115
    Messages:
    3,252
    Likes Received:
    13
    Trophy Points:
    106
  32. GapItLykAMaori

    GapItLykAMaori Notebook Evangelist

    Reputations:
    28
    Messages:
    385
    Likes Received:
    0
    Trophy Points:
    30
    I knew this would happen. The problem is that ATI's architecture has little potential left, because all they have been doing is doubling the stream processors and making them more efficient. That was possible thanks to transistor sizes shrinking every generation, and the method was effective and obviously worked extremely well in the 5xxx generation. However, since they are still on 40nm, even after tweaking the architecture the performance gain is pretty minimal.

    Fermi, on the other hand, was still an unfinished and unoptimised architecture at release and had lots of potential left. Even though there are still many flaws with it, the full GF100 delivers performance and is easily the top dog at the moment. The tables may turn again when the next generation starts using 28nm transistors. Rumours are that AMD has moved on to a different approach, but I think they are slightly scared to move to a new architecture. Nvidia has lost a large amount of revenue and profit and by now has probably taken a big slap to the face and woken up. I think the next generation will be a very interesting war for both companies, and I hope a price war erupts, as that is beneficial for all consumers/fanboys.
     
  33. City Pig

    City Pig Notebook Virtuoso

    Reputations:
    483
    Messages:
    2,322
    Likes Received:
    0
    Trophy Points:
    55
    I guess those "fake" slides were more accurate than we thought. :/ Something tells me that it's going to be even worse in the mobile space, where a 580M might obliterate the 6970M, which might struggle against the 470M/480M. AMD has hit a wall.

    Those look like low TDPs, though. I wonder how far OCing will take them...

    Scratch that. These cards are a rip-off.
     
  34. granyte

    granyte ATI+AMD -> DAAMIT

    Reputations:
    357
    Messages:
    2,346
    Likes Received:
    0
    Trophy Points:
    55
    LOL WUTTT, wait, is it me or does the 6950 have lower power consumption than the 6870?
     
  35. Ruckus

    Ruckus Notebook Deity

    Reputations:
    363
    Messages:
    832
    Likes Received:
    1
    Trophy Points:
    0
    Doesn't surprise me. AMD continues the same model as the HD5xxx, providing the best price/performance with excellent power/performance. Cooling could be better, but that can be easily rectified by the user and 3rd party AIBs.

    I would go with HardOCP on this one. That review simply stated that never before have two cards offered so much performance at such a low price.

    And this is good news for laptop gamers. It appears to me we're in for another round of awesome performance for laptops, priced to demolish Nvidia-equipped notebooks.

    FYI, Nvidia lost the Apple laptop contract because they REFUSED to price competitively. This sounds better and better for laptops equipped with performance AMD parts.
    - Good news for those who passed on HD5xxx and are waiting for HD6xxx. You will be able to upgrade and get awesome performance at prices that won't make your wallet cry.
     
  36. djhuydx

    djhuydx Notebook Consultant

    Reputations:
    119
    Messages:
    265
    Likes Received:
    0
    Trophy Points:
    30
    I've seen some reviews of 6970 CrossFire that beat 580 SLI performance, so I'm hoping for about the same with 6970M CrossFire lol
     
  37. Phinagle

    Phinagle Notebook Prophet

    Reputations:
    2,521
    Messages:
    4,392
    Likes Received:
    1
    Trophy Points:
    106
    Not a failed GPU but definitely a failed launch....partly because AMD didn't push to launch before GTX 580 but mostly because they didn't seem to provide reviewers with very good test drivers.

    Optimization for the new VLIW4 arch will correct some performance inconsistencies and better show where the HD6970 stands, but ultimately you only get one shot at a first impression and AMD blew it for the HD6900.
     
  38. City Pig

    City Pig Notebook Virtuoso

    Reputations:
    483
    Messages:
    2,322
    Likes Received:
    0
    Trophy Points:
    55
    Right. It's almost as if they didn't see the GTX 580 coming and just focused on beating the GTX 480 (and did a pretty bad job of that as well). This lack of foresight worries me for the HD 6900M series. I can't believe this happened after Barts was so impressive. Stuff like this is why AMD's CPUs haven't been able to scratch Intel's for years. Nvidia thought two steps ahead, while AMD was stuck one step behind. I really don't like this. Prices will go up if AMD can't remain competitive.
     
  39. Abula

    Abula Puro Chapin

    Reputations:
    1,115
    Messages:
    3,252
    Likes Received:
    13
    Trophy Points:
    106
    It's not as bad as you are thinking. The GPUs are good, just not at GTX 580 level; they are more in line with how AMD/ATI usually operates. The 5870 caught Nvidia with its pants down, but things are back to how they used to be, with Nvidia the better performer and ATI the better value, although right now it's tough with the GTX 570 so close to the 6970. Still, it seems we are back to how it was in the past, like the 4870/4890 never competing directly with the GTX 285.

    But it does make me wonder: if the desktops didn't get that much of an upgrade, how will the mobile refresh end up? I guess in a month or two we'll see.
     
  40. City Pig

    City Pig Notebook Virtuoso

    Reputations:
    483
    Messages:
    2,322
    Likes Received:
    0
    Trophy Points:
    55
    Oh well. I guess fat wallets rule in this business. :/

    And yeah, I'd say not to expect much. Both of them have hit the wall here, thanks to being stuck on 40nm. 28nm will be big, however.
     
  41. MexicanSnake

    MexicanSnake I'm back!

    Reputations:
    872
    Messages:
    1,244
    Likes Received:
    0
    Trophy Points:
    55
    The HD 5000 series was like OMG!! They destroyed Nvidia with such powerful cards. But now... from my point of view, this new HD 6000 series is a joke... I'm very disappointed...
     
  42. Cakefish

    Cakefish ¯\_(?)_/¯

    Reputations:
    1,643
    Messages:
    3,205
    Likes Received:
    1,469
    Trophy Points:
    231
    Will all of the 6000M series be rebrands, with only the 69xxM getting the new architecture?
     
  43. Ruckus

    Ruckus Notebook Deity

    Reputations:
    363
    Messages:
    832
    Likes Received:
    1
    Trophy Points:
    0
    Contrary to what some may think here, I believe the performance upgrade on the mobile side will be as expected, not what those throwing their arms in the air fear.

    The desktop HD6970 and HD6950 matter zilch here; those were never likely to be used for mobile. Why the HD6970 or HD6950 matters to people for laptops baffles me.

    The Mobility HD5870 was based on the HD5770. Performance-wise, an overclocked Mobility HD5870 at 800/1100 performs nearly identically to an HD5770.

    The replacement for the HD5770, as far as we know, is the HD6850/HD6870, and those have significant improvements over the HD5770. So if the mobile HD69xx is based on the HD6850/HD6870, I think you can expect a 25-30% improvement with OC headroom. The HD6870 is not a rebranded HD5xxx. The architecture appears to be based on it, but with significant improvements in tessellation (probably new tessellation units), a jump from 16 to 32 ROPs, faster GDDR5 on a bus widened from 128 to 256 bit, and a new AF/AA algorithm for higher image quality. These cards are also more energy efficient than the HD5xxx.
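
    (Rough bandwidth math behind the 128-bit vs 256-bit point; the memory clocks below are retail-spec numbers from memory, so treat them as approximate. Whatever the per-pin speed, the wider bus is what really moves total bandwidth.)

        # GDDR5 bandwidth = bus width (bytes) * memory clock * 4 transfers per clock
        def gddr5_bandwidth_gbs(bus_bits, mem_clock_mhz):
            return bus_bits / 8 * mem_clock_mhz * 4 / 1000

        print(gddr5_bandwidth_gbs(128, 1200))  # HD5770-class: ~76.8 GB/s
        print(gddr5_bandwidth_gbs(256, 1050))  # HD6870-class: ~134.4 GB/s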

    I would expect the lower end HD5xxx mobiles to be rebrands though.
     
  44. granyte

    granyte ATI+AMD -> DAAMIT

    Reputations:
    357
    Messages:
    2,346
    Likes Received:
    0
    Trophy Points:
    55
    We likely won't get the HD6970 in mobile, but we could see a card based on the new 4-way architecture coming.
     
  45. Cakefish

    Cakefish ¯\_(?)_/¯

    Reputations:
    1,643
    Messages:
    3,205
    Likes Received:
    1,469
    Trophy Points:
    231
    That's what I meant. Will any of the new 6000m series have the new architecture?
     
  46. Blacky

    Blacky Notebook Prophet

    Reputations:
    2,049
    Messages:
    5,356
    Likes Received:
    1,040
    Trophy Points:
    331
    How about we all wait until CES ?
     
  47. anexanhume

    anexanhume Notebook Evangelist

    Reputations:
    212
    Messages:
    587
    Likes Received:
    2
    Trophy Points:
    31
    You can't really expect anything substantive until we go to 28 nm next year.
     
  48. Phinagle

    Phinagle Notebook Prophet

    Reputations:
    2,521
    Messages:
    4,392
    Likes Received:
    1
    Trophy Points:
    106
    That's not what I meant. If AMD misjudged anything about GTX 580 it was likely only when Nvidia would be launching their card.

    The reason I believe AMD should have gotten Cayman out before GTX 580 is because the HD6970's performance would have been better received by consumers against a GTX 480. AMD could have then worked on getting better performance out of drivers in time for GTX 580 launch, when reviewers would have to rerun all their HD6900 benchmarks.

    In a scenario where the HD6970 comes out first, the GTX 580 gets perceived as a small performance bump at an inflated price. However, with how the launch actually happened, Cayman has gotten a black mark on its permanent record that, even if drivers improve things, does not get corrected in the eyes of most consumers.

    AMD fans are now upset (some more than others) because they wanted a bigger Cayman to flaunt the single-GPU card crown over Nvidia fans. In hindsight we can now see that we over-speculated and then remember that AMD doesn't work that way.

    I know it's cliché to say these things, but Cayman matches AMD's sweet-spot strategy, dual-GPU Antilles will be their fastest card, and drivers will improve performance.


    An entirely different horse of another color. Nvidia wishes they had the means (money) to do to AMD what Intel did to AMD.
     
  49. Botsu

    Botsu Notebook Evangelist

    Reputations:
    105
    Messages:
    624
    Likes Received:
    0
    Trophy Points:
    30
    I'm getting pissed off with AMD lately. The 6850/6870 weren't revolutionary but brought good performance at a decent price, yet now (in the EU at least) their price is actually considerably higher. Is this becoming a habit at AMD or what?

    Now the 6950/6970 bring marginal improvements with a bigger chip and more power consumption. I don't expect any significant performance increase from drivers, since I have yet to witness anything like that (except for specific games). So I say Nvidia should release the GTX 560 quickly and force AMD to stop ripping off the consumer with overpriced Barts (here GTX 460: 180€ / Radeon 6870: almost 250€, Oo srsly).
     
  50. City Pig

    City Pig Notebook Virtuoso

    Reputations:
    483
    Messages:
    2,322
    Likes Received:
    0
    Trophy Points:
    55
    True, but even then they only match the GTX 480. I guess we'll wait and see, but hopefully AMD learns a lesson from this; specifically, I hope they learn to always have a back-up plan ready. Any way you look at it, NVidia played it very smart here. Who knows how long the GTX 580 was sitting in their labs, and who's to say that they don't have a GTX 580M ready to release at just about any moment?
     