The Notebook Review forums were hosted by TechTarget, which shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information that had been posted on the forums was preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.
Problems? See this thread at archive.org.

    Next-gen AMD CPUs

    Discussion in 'Hardware Components and Aftermarket Upgrades' started by octiceps, Oct 9, 2014.

  1. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
  2. Marksman30k

    Marksman30k Notebook Deity

    Reputations:
    2,080
    Messages:
    1,068
    Likes Received:
    180
    Trophy Points:
    81
    Hoping it's a decent, efficient quad-issue, short-pipeline design with independent floating-point units per core. It was an absolute nightmare for Bulldozer to share those resources with a weak front-end decoder. Either implement SMT properly or not at all: the beauty of Intel's implementation is that there is virtually no power penalty when it isn't utilized, while Bulldozer's modules suffered a lot when under-utilized.
     
  3. HTWingNut

    HTWingNut Potato

    Reputations:
    21,580
    Messages:
    35,370
    Likes Received:
    9,877
    Trophy Points:
    931
    The name sounds cool, so it must be good.
     
  4. StormJumper

    StormJumper Notebook Virtuoso

    Reputations:
    579
    Messages:
    3,537
    Likes Received:
    488
    Trophy Points:
    151
    That's all it is: a feel-good, sound-good name... nothing else :mad:
     
  5. un4tural

    un4tural Notebook Evangelist

    Reputations:
    53
    Messages:
    666
    Likes Received:
    14
    Trophy Points:
    31
    I'm really hoping AMD will step up their game; Intel needs some competition, not just AMD trailing behind and selling chips cheaper... like in the good old days. Not more FX, just slightly higher-clocked re-re-releases... good for the money, but talk about beating a dead horse...

    On second thought, the APUs might do well in the laptop market, but AMD still needs to sort out the CPU side of things.
     
  6. Deks

    Deks Notebook Prophet

    Reputations:
    1,272
    Messages:
    5,201
    Likes Received:
    2,073
    Trophy Points:
    331
    There is a pretty good indication that 'Zen' could be a pretty potent thing.
    If it will be an APU (which it probably will be), the likelihood is that the CPU part could be massively upgraded to get very close to, if not match, Intel... while the GPU part will be even better compared to what we see today (of course).

    Then there's also HBM stacked memory (which might be featured on Carrizo) with its huge bandwidth, which could make a huge impact on APUs as a whole... both on the pure CPU side as well as the IGP.

    On another note, AMD's problem is that, while it DOES lag behind Intel in regular x86 performance, its APUs were oriented towards things such as OpenCL and especially HSA (the latter being very lacking in terms of mainstream support).
     
  7. King of Interns

    King of Interns Simply a laptop enthusiast

    Reputations:
    1,329
    Messages:
    5,418
    Likes Received:
    1,096
    Trophy Points:
    331
    I hope so. Intel has given AMD plenty of years to step up their game. Since the Nehalem days, I believe, considering that my 920XM can still perform adequately.
     
  8. tijo

    tijo Sacred Blame

    Reputations:
    7,588
    Messages:
    10,023
    Likes Received:
    1,077
    Trophy Points:
    581
    Yes, Ars had a nice article on some AMD blunders, regardless of CPU architecture. AMD also has the problem of not having the same amount of resources to pour into R&D, but even then, it doesn't mean that they can't come up with something good.
     
  9. davidricardo86

    davidricardo86 Notebook Deity

    Reputations:
    2,376
    Messages:
    1,774
    Likes Received:
    109
    Trophy Points:
    81
    How is AMD supposed to make money for R&D when Intel gives their products to OEMs for free, plus money on top? Why pay AMD anything? Just part of the bigger picture, in my opinion.

    There's speculation of Zen being made with the help of Samsung's foundry on a 16nm FinFET process. The Samsung node plus a new design could put them closer to Intel CPUs. Probably not beat them, but hopefully get closer than they have in the past decade.

    AMD is betting on ARM+GCN+HSA over x86. Only time will tell.

    Sent from my XT1049 using Tapatalk
     
  10. Deks

    Deks Notebook Prophet

    Reputations:
    1,272
    Messages:
    5,201
    Likes Received:
    2,073
    Trophy Points:
    331
    One could say that HSA and GPGPU are the future, but the transition to it is painfully slow.
     
  11. Althernai

    Althernai Notebook Virtuoso

    Reputations:
    919
    Messages:
    2,233
    Likes Received:
    98
    Trophy Points:
    66
    One could say that, but I wouldn't bet any money on it. The reason the transition is painfully slow is that the majority of problems people care about are not at all suited to that degree of parallelization. It's quite a challenge to get the full performance boost out of even a multi-core CPU (i.e. parallelism where the number of threads is on the order of 4), and even at this level there are problems for which the parallelism is inherently useless (i.e. 4 cores are no faster than 1 regardless of the implementation). For GPGPU to be optimal, the program needs to effectively use on the order of 100 (or perhaps even 1000) threads. The class of problems which can be solved in this fashion is much smaller than even the multi-threaded ones. It's not negligible -- there are quite a few of them -- but I don't think it is enough to claim that this is the future of computing.
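    The scaling limit described above is essentially Amdahl's law: the serial fraction of a program caps the speedup no matter how many threads you add. A minimal sketch (illustrative only; the function is mine, not from the post):

```python
def amdahl_speedup(parallel_fraction, n_threads):
    """Best-case speedup when only `parallel_fraction` of the work parallelizes."""
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / n_threads)

# Even a program that is 90% parallel tops out below 10x,
# so 1000 GPU-style threads barely beat 100:
for n in (4, 100, 1000):
    print(n, round(amdahl_speedup(0.9, n), 2))  # 4 -> 3.08, 100 -> 9.17, 1000 -> 9.91
```

    This is why, as the post argues, only workloads with a very large parallel fraction benefit from GPGPU-scale thread counts.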
     
  12. Deks

    Deks Notebook Prophet

    Reputations:
    1,272
    Messages:
    5,201
    Likes Received:
    2,073
    Trophy Points:
    331
    I'm hardly 'betting' on anything.
    I'm simply stating where the industry seems to be going.
    The problem here is not so much in writing programs along those lines... it's just that, for the moment, it would be more cost-prohibitive to transition fully to GPGPU.

    Anyway... I would like to post some details regarding AMD's upcoming Carrizo:
    AMD Launching Carrizo APU Featuring Excavator in December

    Seems that entry level is launching in December, with the rest in early 2015.

    Could be worth the wait at the very least.
    Carrizo does seem more like a step up compared to Kaveri... but I don't know how much of a change we can actually expect from it.
    On the CPU side, it will likely be relatively minor... though, we shall see.

    It would be nice if they had HBM for Carrizo... that would probably be a huge change (or as much of a change as one can hope for when it comes to Bulldozer type architecture)
     
  13. Apollo13

    Apollo13 100% 16:10 Screens

    Reputations:
    1,432
    Messages:
    2,578
    Likes Received:
    210
    Trophy Points:
    81
    Do you have a link to that article?

    I think part of the problem is not just that AMD has less money for R&D, but that the difference is an order of magnitude.

    The details still seem rather lacking to me, so I'll reserve judgement. And, 2016 is still a fair ways off. At this point I'd probably be more likely to get Excavator, just because it's arriving faster. But I do agree that all said, the Bulldozer architecture and its derivatives have been letdowns. The integrated graphics have progressed well, but the CPU side just isn't there except in a few edge cases, such as highly parallel workloads on Vishera. Perhaps the best news out of all this is that AMD is going to give the performance x86 market another try after all, and hopefully with more success following the lessons learned from Bulldozer.
     
  14. tijo

    tijo Sacred Blame

    Reputations:
    7,588
    Messages:
    10,023
    Likes Received:
    1,077
    Trophy Points:
    581
    Here are the links, I found it a pretty interesting read.
    The rise and fall of AMD: How an underdog stuck it to Intel | Ars Technica
    The rise and fall of AMD: A company on the ropes | Ars Technica

    As for R&D budget, yeah, it is smaller and it doesn't help for sure. That said, if one of their guys has a genius idea, they may not need the budget. However, as far as incremental research and taking risks goes, Intel definitely has the financial backing to throw money at that and to also try more "frivolous" things. I don't know if Intel is willing to throw cash out the window on the off chance that their researchers find something good with crazy ideas though.
     
  15. Marksman30k

    Marksman30k Notebook Deity

    Reputations:
    2,080
    Messages:
    1,068
    Likes Received:
    180
    Trophy Points:
    81
    This is probably pure hearsay, but I remember reading a while ago about the rather expensive and manpower-intensive R&D method Intel uses to keep CPU progress marching forward. It's something to do with parallel teams. Basically, there are Teams A, B, C, and D. Team A brainstorms all ideas that could possibly improve performance, Team B brainstorms all ideas that improve power consumption, Team C collates data from A and B to find solutions that improve performance with good power-efficiency ratios (currently 2% performance for every 1% of power increase), while Team D takes Team C's data and liaises with the manufacturing guys on viability.
    Basically, they have multiples of these A, B, C, and D groups going at the same time (i.e. there was one for Nehalem and another for Sandy Bridge, staggered but running in parallel). My impression is that they effectively brute-force the solutions to increasing core efficiency.
    Also, something about how if your idea is implemented in the final product, you get a gold balloon to hang above your desk.
     
  16. Deks

    Deks Notebook Prophet

    Reputations:
    1,272
    Messages:
    5,201
    Likes Received:
    2,073
    Trophy Points:
    331
    Here's another interesting read regarding Carrizo-L:
    AMD Carrizo tips up in bench databases

    http://wccftech.com/amd-carrizo-apu-leak-40w-512-sp-benchmarks/

    If these claims turn out to be accurate, it would indicate an IPC increase of roughly 23% on the CPU side - and there's still time until the release date (which might mean further optimizations/improvements, perhaps).

    Now this would certainly be an excellent APU to get for my nephew (without a discrete GPU).
    If AMD also decides to add HBM to this APU, then the GPU part alone should increase in performance dramatically and eliminate bandwidth bottlenecks, and if the CPU part could tap into HBM as well...

    If these preliminary claims turn out accurate, then Carrizo certainly looks promising, and the next architectural change coming after Carrizo/Excavator might result in really great gains on the CPU front.

    Bring on more Carrizo news please.
     
  17. Jarhead

    Jarhead 恋の♡アカサタナ

    Reputations:
    5,036
    Messages:
    12,168
    Likes Received:
    3,133
    Trophy Points:
    681
    I'm just hoping that the desktop versions of these processors will work on AM3+. Would be a nice upgrade from my FX-6300.
     
  18. Deks

    Deks Notebook Prophet

    Reputations:
    1,272
    Messages:
    5,201
    Likes Received:
    2,073
    Trophy Points:
    331
    I think AMD changed the socket with Kaveri/Steamroller.
    Don't think desktop Carrizo will be compatible with AM3+ sockets, because it would seem less than prudent to suddenly go back to an older socket.
     
  19. Jarhead

    Jarhead 恋の♡アカサタナ

    Reputations:
    5,036
    Messages:
    12,168
    Likes Received:
    3,133
    Trophy Points:
    681
    Seems you're right. From Bright Side of News:

    That's pretty disappointing for users who invested in the AM3+ socket.
     
  20. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    4 years out of AM3+ is not bad at all, and much more than one can say for Intel. Hopefully a new socket means AMD is serious about getting back into the CPU performance race.
     
    triturbo likes this.
  21. Jarhead

    Jarhead 恋の♡アカサタナ

    Reputations:
    5,036
    Messages:
    12,168
    Likes Received:
    3,133
    Trophy Points:
    681
    Well, I've only had mine for about a year. I don't regret getting one, but it would have been nice to look forward to upgrades on the same motherboard. But yeah, still a better track record than Intel.
     
  22. Deks

    Deks Notebook Prophet

    Reputations:
    1,272
    Messages:
    5,201
    Likes Received:
    2,073
    Trophy Points:
    331
    Realistically, Intel could have kept an older socket (say, one dating back to Sandy Bridge), probably even including Broadwell in the mix, and then changed the socket with Skylake - but as we have seen, they decided to make slight alterations each year so as to make it impossible to upgrade from SB to IB, or from SB to HW (or IB to HW).
     
  23. Apollo13

    Apollo13 100% 16:10 Screens

    Reputations:
    1,432
    Messages:
    2,578
    Likes Received:
    210
    Trophy Points:
    81
    Sandy Bridge to Ivy Bridge is possible. It's Ivy Bridge to Haswell that breaks compatibility (as well as Westmere to Sandy Bridge). At least on the midrange socket (1155). On the high-end (2011 and such), it may be different.

    But yeah, it would've been nice if Intel had kept sockets compatible.
     
  24. TomJGX

    TomJGX I HATE BGA!

    Reputations:
    1,456
    Messages:
    8,707
    Likes Received:
    3,315
    Trophy Points:
    431
    Since Intel, like NVIDIA, only likes to milk people's money, this was never going to happen...
     
  25. davidricardo86

    davidricardo86 Notebook Deity

    Reputations:
    2,376
    Messages:
    1,774
    Likes Received:
    109
    Trophy Points:
    81