The Notebook Review forums were hosted by TechTarget, which shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information that had been posted on the forums is preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.
Problems? See this thread at archive.org.

    AMD's 28nm Mobilty HD7000 Series - Coming 2012

    Discussion in 'Gaming (Software and Graphics Cards)' started by Phinagle, Dec 30, 2010.

  1. Phinagle

    Phinagle Notebook Prophet

    Reputations:
    2,521
    Messages:
    4,392
    Likes Received:
    1
    Trophy Points:
    106
    1.5GB might be more than enough in most cases, but the alternative 768MB could put a strain on memory.


    40nm GDDR5 comes in 2GB configurations that don't really cost more to produce and don't create more heat than the 1GB of 55nm GDDR5 that cards were using. A memory die shrink works the same way a GPU die shrink does: just as a GPU shrink can double the shader count from one generation to the next without doubling the card's cost or heat, a memory shrink can double the capacity.

    Bus width determines how many memory chips a card will use, and the amount of VRAM on each chip is determined by what the memory manufacturers are making at the time. You may end up with more GPU memory than you need, but consider system RAM: a year or so ago, 4GB was more than most people needed, yet once the price dropped on 8GB, more and more people started justifying their need for it.
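    The bus-width arithmetic above can be sketched numerically. This is a rough illustration assuming the standard 32-bit interface of a GDDR5 device; the function and its figures are illustrative, not official card specs:

```python
# Back-of-the-envelope sketch of the bus-width/VRAM relationship described
# above. Each GDDR5 chip presents a 32-bit interface, so the bus width fixes
# the chip count; the per-chip density (1 Gbit vs 2 Gbit here) then fixes
# the total VRAM.

GDDR5_CHIP_BUS_BITS = 32  # each GDDR5 device is 32 bits wide

def vram_options(bus_width_bits, chip_densities_gbit=(1, 2)):
    """Return possible VRAM sizes in MB for a given memory bus width."""
    chips = bus_width_bits // GDDR5_CHIP_BUS_BITS
    # 1 Gbit = 128 MB per chip
    return {d: chips * d * 128 for d in chip_densities_gbit}

print(vram_options(192))  # 192-bit bus -> 6 chips -> {1: 768, 2: 1536}
print(vram_options(256))  # 256-bit bus -> 8 chips -> {1: 1024, 2: 2048}
```

    This is why a 192-bit card lands on exactly 768MB or 1.5GB: the bus dictates six chips, and the memory makers' current density dictates the rest.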


    Eyefinity, tessellation, GPGPU, and hopefully Hybrid Crossfire will all help you justify some of that extra VRAM. ;)
     
  2. Blacky

    Blacky Notebook Prophet

    Reputations:
    2,049
    Messages:
    5,356
    Likes Received:
    1,040
    Trophy Points:
    331
    I bet the GTX 460 with 768MB of memory will run games much better than a mid-range mobile card with 1.5GB of VRAM.
     
  3. jacob808

    jacob808 Notebook Deity

    Reputations:
    52
    Messages:
    1,002
    Likes Received:
    0
    Trophy Points:
    55
    I haven't gone through the whole thread, just this last page, but I notice there seems to be an argument about how much memory is enough.

    Which reminds me, wasn't Bill Gates quoted at one time as saying that 640K of memory should be enough for everyone?

    Think about this...
     
  4. jeremyshaw

    jeremyshaw Big time Idiot

    Reputations:
    791
    Messages:
    3,210
    Likes Received:
    231
    Trophy Points:
    131
    Did Gates Really Say 640K is Enough For Anyone?

    though as to your point, it was once reasonable for the balls-out best card (8800GTX) to have only 768MB of memory, and before that, for the 9700 Pro to have 256MB!! Now the top-end cards have 2GB and 1.5GB :eek:

    EDIT: and despite consolization, memory usage is still growing...
     
  5. Phinagle

    Phinagle Notebook Prophet

    Reputations:
    2,521
    Messages:
    4,392
    Likes Received:
    1
    Trophy Points:
    106
    The desktop GTX 460 768MB? Step over 1920x1080 and that card shows its memory limits. Anyone considering running multiple displays, or a higher resolution (externally for notebooks), will need more memory....and a slower GPU doesn't automatically erase the need for more memory to run at higher resolutions.

    Depending on how "high-end/gamer enthusiast" the possible 192-bit Heathrow is, it could potentially choke on 768MB of memory before the GPU caps out on processing more pixels.
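    As a rough illustration of why higher resolutions strain a 768MB card, here is a back-of-the-envelope render-target estimate. The buffer layout (double-buffered multisampled color plus depth/stencil and an MSAA resolve target) is an assumption for the sketch; real games add textures, geometry, and driver overhead on top, so these are lower bounds, not a precise model:

```python
# Estimate render-target memory at a given resolution and MSAA level.
def framebuffer_mb(width, height, msaa=1, bytes_per_pixel=4):
    pixels = width * height
    color = pixels * bytes_per_pixel * msaa   # multisampled color buffer
    depth = pixels * 4 * msaa                 # 24-bit depth + 8-bit stencil
    resolve = pixels * bytes_per_pixel        # resolved back buffer
    # double-buffered color + depth + resolve target
    return (2 * color + depth + resolve) / (1024 * 1024)

for (w, h), msaa in [((1920, 1080), 4), ((2560, 1600), 4)]:
    print(f"{w}x{h} at {msaa}x MSAA: ~{framebuffer_mb(w, h, msaa):.0f} MB")
```

    Render targets alone roughly double going from 1920x1080 to 2560x1600 at the same AA level, and everything left over for textures shrinks accordingly, which is the squeeze being described above.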
     
  6. Blacky

    Blacky Notebook Prophet

    Reputations:
    2,049
    Messages:
    5,356
    Likes Received:
    1,040
    Trophy Points:
    331
    I guess you are right from this perspective, but few people actually do this. In fact most laptops today have resolutions of 1366x768, or 1600x900 at best. It's only the high end that gets higher resolutions and consequently also better cards.
     
  7. HTWingNut

    HTWingNut Potato

    Reputations:
    21,580
    Messages:
    35,370
    Likes Received:
    9,878
    Trophy Points:
    931
    I just opted for a GTX 460 1GB over the 768MB because I run a 1920x1200 monitor. Didn't want the bottleneck.
     
  8. Phinagle

    Phinagle Notebook Prophet

    Reputations:
    2,521
    Messages:
    4,392
    Likes Received:
    1
    Trophy Points:
    106
  9. anexanhume

    anexanhume Notebook Evangelist

    Reputations:
    212
    Messages:
    587
    Likes Received:
    2
    Trophy Points:
    31
  10. Nemix77

    Nemix77 Notebook Deity

    Reputations:
    287
    Messages:
    1,086
    Likes Received:
    26
    Trophy Points:
    66
    Sounds like I'll be wanting a 7500-7700 series card. I just ordered a 5820TG with an AMD 6500 series card, but 28nm = drool.

    Are CPUs going to be 28nm by then?
     
  11. Prydeless

    Prydeless Stupid is

    Reputations:
    592
    Messages:
    1,091
    Likes Received:
    3
    Trophy Points:
    56
    Probably not, unless Ivy Bridge, which is 22nm, is released before 2012.
     
  12. Phinagle

    Phinagle Notebook Prophet

    Reputations:
    2,521
    Messages:
    4,392
    Likes Received:
    1
    Trophy Points:
    106
    The only 28nm CPUs will be AMD's next-gen Bobcat-based netbook APUs: Krishna and Wichita.

    22nm Ivy Bridge is due for launch around the same time we should expect to see the 7000M series.....but personally I'm more eager to see these GPUs paired with the mobile Bulldozer-based Trinity APU (32nm).


    2012 CPU options can be discussed in these threads:

    http://forum.notebookreview.com/har...3-forget-huron-river-22nm-ivy-bridge-way.html

    http://forum.notebookreview.com/har...t-upgrades/505702-amd-fusion-info-thread.html
     
  13. itcomic

    itcomic Notebook Consultant

    Reputations:
    37
    Messages:
    154
    Likes Received:
    0
    Trophy Points:
    30
    More waiting begins..... Oh well :) M17x R4
     
  14. Blacky

    Blacky Notebook Prophet

    Reputations:
    2,049
    Messages:
    5,356
    Likes Received:
    1,040
    Trophy Points:
    331
  15. Phinagle

    Phinagle Notebook Prophet

    Reputations:
    2,521
    Messages:
    4,392
    Likes Received:
    1
    Trophy Points:
    106
    Standard fanboy reporting procedure: belittle the competition's die shrink as nothing special then later harp on how your favored manufacturer used a die shrink to increase transistor count, lower die size, and improve power efficiency with the new & better HKMG silicon. ;)
     
  16. Blacky

    Blacky Notebook Prophet

    Reputations:
    2,049
    Messages:
    5,356
    Likes Received:
    1,040
    Trophy Points:
    331
    Heh, frankly I don't mind if it's just a die shrink. I would expect a 30% performance boost from that alone.

    Now, if AMD could get that Dynamic Switchable Graphics tech to work with any MXM port... I'm sold. Though I will miss PhysX :(.
     
  17. Hungry Man

    Hungry Man Notebook Virtuoso

    Reputations:
    661
    Messages:
    2,348
    Likes Received:
    0
    Trophy Points:
    55
    "just a die shrink" I love die shrinks! I only ever buy tocks because of die shrinks. We're using laptops... die shrinks are important.
     
  18. Phinagle

    Phinagle Notebook Prophet

    Reputations:
    2,521
    Messages:
    4,392
    Likes Received:
    1
    Trophy Points:
    106
    With Intel backing Lucid's version of Optimus (aka Virtu) for desktops, both AMD and Nvidia have got a new light lit under their bums to press forward with switchable graphics.

    I was also thinking the other day how I wouldn't be surprised if LucidLogix took up Ageia's old role and started offering its chip as a 3rd-party physics processor....though I believe Nvidia would fight it, and AMD wouldn't need it because they'll eventually be able to offload physics to the integrated stream processors on their APUs.
     
  19. Blacky

    Blacky Notebook Prophet

    Reputations:
    2,049
    Messages:
    5,356
    Likes Received:
    1,040
    Trophy Points:
    331
    Mmm, didn't know that about Intel. Well, hopefully all this will get implemented soon because I am thinking about getting a new laptop.

    Personally I think PhysX is rather good, and better than what AMD or LucidLogix has to offer, but Nvidia just had to make it proprietary software.
     
  20. granyte

    granyte ATI+AMD -> DAAMIT

    Reputations:
    357
    Messages:
    2,346
    Likes Received:
    0
    Trophy Points:
    55
    i think that i'll be waiting for the R5. At the pace things are going my laptop can handle 99% of the games till then, and by then we will have some crazy 22nm 1600-SP mobile cards with CFX and switchable graphics.

    Of course there's no new architecture; they already got their 4-way arch out, and they are just going to bring it to a die shrink as the SP count increases.
     
  21. daranik

    daranik Notebook Deity

    Reputations:
    57
    Messages:
    865
    Likes Received:
    0
    Trophy Points:
    30
    This is all great and good, but what game will use this fully? I don't think there will be a new console generation until 2015, which effectively stalls anything breakthrough on PCs until then. There may be a game or two that might give it a little push, but realistically even in 2012 we will be in the same situation we are now: games designed for consoles and slightly altered for PC gamers.
     
  22. Botsu

    Botsu Notebook Evangelist

    Reputations:
    105
    Messages:
    624
    Likes Received:
    0
    Trophy Points:
    30
    I admit you have a point. I'm more eager to see the progress made on mid-range mobile graphics and iGP.
     
  23. daranik

    daranik Notebook Deity

    Reputations:
    57
    Messages:
    865
    Likes Received:
    0
    Trophy Points:
    30
    If this is going to be the next-gen card that does more than DX11, then this may be what the next-gen consoles are based on. But AMD sounds like they have other plans for dual CPU/GPU hybrid chips, and if that's the case then we will see one of those land in consoles rather than a single GPU and a single CPU. The savings would benefit both the manufacturer and consumer alike. The other thing to think about as well is whether DX11 is all that amazing; not yet anyway. When the PS3 was coming out, it was a fresh coat of paint on DX9 and the new DX10. DX11 isn't boasting anything radically new and different; it doesn't look much different from what we have already. People will want a new system when they can see the difference.
     
  24. Abula

    Abula Puro Chapin

    Reputations:
    1,115
    Messages:
    3,252
    Likes Received:
    13
    Trophy Points:
    106
  25. Botsu

    Botsu Notebook Evangelist

    Reputations:
    105
    Messages:
    624
    Likes Received:
    0
    Trophy Points:
    30
    I'm not expecting any mobile 28nm AMD GPUs in a 1-year timeframe, best-case scenario included. Roadmaps showed one or two models supposedly for Q4 iirc, which in realistic terms means you can add at least 3 or 4 months before you're able to select from a decent choice of notebooks packing the aforementioned chip. Then Ivy Bridge is expected for H1 2012 (which could even mean later than Q1), and if mobile 7000 is really ahead of Intel's schedule, I wouldn't expect OEMs to put these 28nm GPUs in many Sandy Bridge laptops. I sure would wait for Ivy Bridge if I were already waiting for the new 28nm GPUs, and I suspect a majority of potential buyers would as well.
     
  26. daranik

    daranik Notebook Deity

    Reputations:
    57
    Messages:
    865
    Likes Received:
    0
    Trophy Points:
    30
    How much better will the 7000 series of cards be? Will games look 10X better than their console alternatives? Probably not, and no developer will waste money developing unique code for some special effects that only one platform can benefit from, and only for the select crowd who go out and drop the money, so just a fraction of the whole platform. Is this worth the buy......
     
  27. Deks

    Deks Notebook Prophet

    Reputations:
    1,272
    Messages:
    5,201
    Likes Received:
    2,073
    Trophy Points:
    331
    I think the developers would be better off investing in OPTIMIZING console ports for PCs rather than in some special effects you won't even notice.

    Seriously though, how much of a difference can there be between games for PCs and those for consoles?
    Except for the PC being able to push slightly better visual settings at higher resolutions, not that much.
    There will be no discernible change in 3D meshes, only higher-res/more detailed textures and extra effects.
    And are you REALLY up for having one measly special effect you won't even notice on a PC that will, for example, eat up 20FPS?
    I know I'm not.

    Right now numerous console ports are pathetically coded for the PC.
    LucasArts' Force Unleashed II, for example, runs horribly slowly on my laptop, and the hardware I have is better than what's in the consoles (not that FU 2 was good... the story was way too short, and the graphics were not that much different compared to the first game).
    GTA IV is another example of a horribly done port.
    The PC version needs a quad core to run fine.
    Seriously?
     
  28. Kevin

    Kevin Egregious

    Reputations:
    3,289
    Messages:
    10,780
    Likes Received:
    1,782
    Trophy Points:
    581
    There really aren't many "bad" console -> PC ports.

    Just being able to run at 1080p res with AA and framerates above 30 is enough of an advantage for me.

    The consoles are a technical mess. Their biggest AAA-budget games are sub-HD and average a borderline 30fps.

    I guess my point is, if all the new chips offer me is more AA at higher resolutions, I will be content with that.

    Anti-aliasing is overlooked far too often, imo.
     
  29. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,436
    Messages:
    58,194
    Likes Received:
    17,902
    Trophy Points:
    931
    What's the point? What's the point!?

    Are you F***ING kidding me?

    Maybe we can actually see some higher-res displays in the smaller machines, because the POS graphics cards they have to put up with will finally be able to cope.

    The gulf between the desktop mainstream and notebooks has been widening again.
     
  30. Botsu

    Botsu Notebook Evangelist

    Reputations:
    105
    Messages:
    624
    Likes Received:
    0
    Trophy Points:
    30
    There's no link between the available mobile GPUs with their current performance and low-res displays. 1080p displays are already quite common, and the same chassis is often designed to accept systems ranging from a basic CPU with no discrete GPU to a quad-core CPU with a decent GPU. It's a matter of price, not a matter of the performance available from mobile GPU offerings. Aside from gaming (would you remind me how many people game on their laptop, again?) there's no downside to a 1080p display.
     
  31. Karamazovmm

    Karamazovmm Overthinking? Always!

    Reputations:
    2,365
    Messages:
    9,422
    Likes Received:
    200
    Trophy Points:
    231
    it is also a matter of portability. I don't want to have to lug around 3+ kg of equipment. That is a thing of the past; that is what was acceptable in 2006.
     
  32. daranik

    daranik Notebook Deity

    Reputations:
    57
    Messages:
    865
    Likes Received:
    0
    Trophy Points:
    30
    Resolution??? Really??? You're done with 1080p already? Because I know 1080p is damn enough for my 52-inch; I'd rather have this standard for a while instead of already buying a 4K screen, with new formats and new bandwidth needs. How about better polygon meshes? How about several layers of texture layering with ray tracing to really emulate the way light is absorbed through muscle and skin layers? How about infinite detail? I can wait until there is something that drastically sets a new graphics card apart from the one I have now. There was a huge leap in graphics technology and technique from the beginning of 2000 to 2005 that required a new generation of graphics card, but in the 6 years since there have only been small changes, not enough to amount to a game from 2011 looking worlds better than what we saw in 2005.
     
  33. Kevin

    Kevin Egregious

    Reputations:
    3,289
    Messages:
    10,780
    Likes Received:
    1,782
    Trophy Points:
    581
    You and this "infinite detail" stuff...
     
  34. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,436
    Messages:
    58,194
    Likes Received:
    17,902
    Trophy Points:
    931
    1366x768 displays are horrid. If we had at least 1600x900 as the average, we would need faster GPUs to actually play games on them.

    You can stick a 1366x768 display on there and sure, the current crop of gfx chips is OK with that.

    However, if the GPUs are more powerful, OEMs are going to be tempted to put in higher-res displays that can actually use the power.
     
  35. daranik

    daranik Notebook Deity

    Reputations:
    57
    Messages:
    865
    Likes Received:
    0
    Trophy Points:
    30
    Haha, what's that supposed to mean :p. Regardless, it's something that will happen sometime: when you walk up to textures they won't fall apart into a glob of colours and bump-map effects. That's something I'd like to see, or more polys per model, stuff like that. 1080p is fine for me, and makes things nice and crisp.
     
  36. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,436
    Messages:
    58,194
    Likes Received:
    17,902
    Trophy Points:
    931
    Yeah but creating the art for said content would be hard work lol.
     
  37. daranik

    daranik Notebook Deity

    Reputations:
    57
    Messages:
    865
    Likes Received:
    0
    Trophy Points:
    30
    Possibly. This is where I want to base the bulk of my work when I get into the industry: modelling and texturing. That aside, most games start with high-poly models now, then decimate them into a low-poly model with a normal map, generally, which for the most part gets the job done. Textures are a resolution thing, I think, for the most part; RAGE from id is using MegaTextures in its new engine. It's more work to build a game today than it most likely will be in the future, with all the work limiting the model and making sure it still looks as good.
     
  38. Nemix77

    Nemix77 Notebook Deity

    Reputations:
    287
    Messages:
    1,086
    Likes Received:
    26
    Trophy Points:
    66
    Dell's Vostro 3350 has a Radeon HD 7450M with 512MB GDDR5; I'm unsure if this is the 28nm GPU architecture. Dell Vostro Laptops | Dell Canada

    If it indeed is an HD 7450M and 28nm, then this is an early sample geared towards business purchases, as cheap BETA testing by those who are willing to pay the premium for an early HD 7000M series GPU.

    So it looks like AMD is going to try and release the HD 7000M series before 2012 to regain some laptop GPU market share; currently NVIDIA 500M series GPUs are on almost every Sandy Bridge laptop.
     
  39. Phinagle

    Phinagle Notebook Prophet

    Reputations:
    2,521
    Messages:
    4,392
    Likes Received:
    1
    Trophy Points:
    106
    Or it could be a typo.

    Same core count as the HD6450M but GDDR5.....though Dell has a history of listing the wrong type of memory that comes on their cards.
     
  40. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,436
    Messages:
    58,194
    Likes Received:
    17,902
    Trophy Points:
    931
    Sounds like a rebranded 6490M.
     
  41. Phinagle

    Phinagle Notebook Prophet

    Reputations:
    2,521
    Messages:
    4,392
    Likes Received:
    1
    Trophy Points:
    106
    Wouldn't make a whole lot of sense to start rebranding right before Llano comes out with an HD6000 series IGP.

    Dell's free to put GDDR5 on an HD6450M, just as they're free to put 3GB of VRAM on their GT 555M, so the specs might be right, but I'm going to hold to the notion that someone hit the 7 key instead of the 6 when they were typing out the model number.
     
  42. Kevin

    Kevin Egregious

    Reputations:
    3,289
    Messages:
    10,780
    Likes Received:
    1,782
    Trophy Points:
    581
    There's no way AMD would let the first announced 7000 series chip be a rebranded mobile GPU.
     
  43. Karamazovmm

    Karamazovmm Overthinking? Always!

    Reputations:
    2,365
    Messages:
    9,422
    Likes Received:
    200
    Trophy Points:
    231
    indeed, however the first 6000 series was a rebranded 5600/5700, so there is a precedent
     
  44. Blacky

    Blacky Notebook Prophet

    Reputations:
    2,049
    Messages:
    5,356
    Likes Received:
    1,040
    Trophy Points:
    331
    I doubt it's an actual 7000 series GPU. TYPO!
     
  45. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,436
    Messages:
    58,194
    Likes Received:
    17,902
    Trophy Points:
    931
    Dell is not free to do that if ATI disables the GDDR5 parts of the memory controller, and I don't think the 6450 would support it.

    Just like Dell can't pair the 144-shader 555M with GDDR5.
     
  46. Phinagle

    Phinagle Notebook Prophet

    Reputations:
    2,521
    Messages:
    4,392
    Likes Received:
    1
    Trophy Points:
    106
    AMD doesn't officially specify what kind of VRAM an HD6450M has to use...they just generalize that GDDR5/DDR3 is available on HD6400M cards.

    AMD Radeon™ HD 6400M Series Graphics


    That same kind of loose definition of what VRAM can be used on AMD's cards is why we saw the HD4870M and HD5850M using GDDR3.
     
  47. daranik

    daranik Notebook Deity

    Reputations:
    57
    Messages:
    865
    Likes Received:
    0
    Trophy Points:
    30
    I think there needs to be something more groundbreaking than DX11, frankly, and until the next real big bump in graphics technology and coding comes along, I think we as the ultimate consumers of their products should be laying the pressure on these companies and pushing them to make bigger breakthroughs in graphics technology. I won't downplay the significance of tessellation, but I feel the last 5 years really haven't brought much new to the table. Some might blame consoles for that, but if games were based on, let's say, the RV770 chip, would games be leaps and bounds different?
     
  48. rschauby

    rschauby Superfluously Redundant

    Reputations:
    865
    Messages:
    1,560
    Likes Received:
    0
    Trophy Points:
    55
    Hardware is fine, I think we need software improvements.
     
  49. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,436
    Messages:
    58,194
    Likes Received:
    17,902
    Trophy Points:
    931
    Check out the Unreal demo showing what they could do if everyone owned a dual-GTX580 config.

    Also, for notebooks a shrink to 28nm means a default leap in performance without any arch improvements, due to the power envelope.
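    The power-envelope point can be sketched with the classic dynamic-power relation P ≈ C·V²·f: a shrink cuts per-shader switching capacitance and supply voltage, so a fixed notebook TDP affords more shaders at the same clock. The TDP and 40nm→28nm scaling factors below are made-up illustrative numbers, not measured figures:

```python
# How many shaders fit in a fixed power budget, before and after a shrink.
def shaders_in_budget(tdp_w, power_per_shader_w):
    return int(tdp_w // power_per_shader_w)

TDP = 45.0                            # assumed high-end mobile GPU budget (W)
P_40NM = TDP / 480                    # normalize: 480 shaders fill the budget
cap_scale, volt_scale = 0.75, 0.9     # assumed per-shader C and V reductions
P_28NM = P_40NM * cap_scale * volt_scale**2   # P ~ C * V^2 * f, f held fixed

print(shaders_in_budget(TDP, P_40NM))  # 480 shaders at 40nm
print(shaders_in_budget(TDP, P_28NM))  # more shaders in the same envelope
```

    With these assumed factors the same 45W envelope carries roughly 60% more shaders, which is the "default leap" without any architectural change.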
     
  50. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,436
    Messages:
    58,194
    Likes Received:
    17,902
    Trophy Points:
    931
    That's the general spec for the entire series; I really doubt they would let you pair a 6450 with GDDR5.

    Much like they would not let you pair a 6600 series chip with GDDR5.
     