The Notebook Review forums were hosted by TechTarget, which shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information that had been posted on the forums is preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.
Problems? See this thread at archive.org.

    Intel Broadwell GPU+CPU % power increase Revealed

    Discussion in 'Hardware Components and Aftermarket Upgrades' started by HSN21, Aug 14, 2014.

  1. HSN21

    HSN21 Notebook Deity

    Reputations:
    358
    Messages:
    756
    Likes Received:
    94
    Trophy Points:
    41
    AnandTech | Intel Broadwell Architecture Preview: A Glimpse into Core M

    CPU Broadwell’s roughly 5% performance improvement comes at a cost of just a 2.5% increase in immediate power consumption.


    GPU Broadwell’s GPU has been upgraded to support the latest and greatest graphics APIs, an important milestone for Intel as this means their iGPU is now at feature parity with iGPUs and dGPUs from AMD and NVIDIA. With support for Direct3D feature level 11_2 and Intel’s previous commitment to Direct3D 12, Intel no longer trails AMD and NVIDIA in base features.

    Slight increase in GPU performance (still weaker than the desktop GT3), and the iGPU consumes less power now.

    Basically the benefits from Broadwell itself are tiny (+5% CPU, ~30% GPU, still weaker than the desktop iGPU), but less power consumption/more battery life is expected due to the move to 14nm.
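    A quick back-of-the-envelope check of those quoted figures (assuming the +5% performance and +2.5% power numbers apply to the same workload, which the preview doesn't guarantee):

```python
# Rough perf-per-watt math for the quoted Broadwell vs. Haswell figures.
# Assumption: +5% performance and +2.5% power are measured on the same workload.
haswell_perf, haswell_power = 1.00, 1.00    # normalized Haswell baseline
broadwell_perf = haswell_perf * 1.05        # +5% CPU performance
broadwell_power = haswell_power * 1.025     # +2.5% power draw

gain = (broadwell_perf / broadwell_power) / (haswell_perf / haswell_power) - 1
print(f"perf/W improvement: {gain:.1%}")    # prints "perf/W improvement: 2.4%"
```

    So on the CPU side the quoted numbers work out to only about a 2.4% perf-per-watt gain from the architecture itself.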
     
  2. Ethrem

    Ethrem Notebook Prophet

    Reputations:
    1,404
    Messages:
    6,706
    Likes Received:
    4,735
    Trophy Points:
    431
    Another epic fail from Intel. NEXT!
     
  3. J.Dre

    J.Dre Notebook Nobel Laureate

    Reputations:
    3,700
    Messages:
    8,323
    Likes Received:
    3,820
    Trophy Points:
    431
    Not surprising, and I am actually glad to see they've addressed a lot of the issues with Haswell. I see Broadwell as a fixed version of Haswell, that consumes approximately 30% less power, and has substantial potential for overclocking. So, in a sense, this measly 5% doesn't bother me. Haswell is a great line of CPU's with "crippled wiring," so to speak.
     
  4. Ethrem

    Ethrem Notebook Prophet

    Reputations:
    1,404
    Messages:
    6,706
    Likes Received:
    4,735
    Trophy Points:
    431
    I doubt we are going to see much potential for overclocking out of these chips. Not with FIVR. This is just a die-shrunk Haswell; it's highly unlikely to bring any real improvements to high-performance scenarios and will instead cater to the low end.

    I would love to see Intel surprise us but I seriously doubt it.
     
  5. J.Dre

    J.Dre Notebook Nobel Laureate

    Reputations:
    3,700
    Messages:
    8,323
    Likes Received:
    3,820
    Trophy Points:
    431
    Well, all we can do is wait and see. The quad-core CPU's are expected to hit next year. So, it will be a while.
     
  6. HSN21

    HSN21 Notebook Deity

    Reputations:
    358
    Messages:
    756
    Likes Received:
    94
    Trophy Points:
    41
    There is no 30% less power consumption with Broadwell; it actually consumes 2.5% more than Haswell. The benefits come from the move to 14nm, not the architecture itself, a trade that Intel made to buff up their GPU.
     
  7. King of Interns

    King of Interns Simply a laptop enthusiast

    Reputations:
    1,329
    Messages:
    5,418
    Likes Received:
    1,096
    Trophy Points:
    331
    Why do they bother? Intel makes CPU's not GPU's lol.

    At this rate our GPUs will take over the CPUs' work, as Intel has failed to bring out a groundbreaking CPU since the 920XM. That CPU is now 5 years old and can still keep up (no GPU on it :thumbsup:)

    At least this development gives me reason to keep the M15x living a couple more years.....and reason enough to keep pushing the 920xm up a couple more notches....
     
  8. Ethrem

    Ethrem Notebook Prophet

    Reputations:
    1,404
    Messages:
    6,706
    Likes Received:
    4,735
    Trophy Points:
    431
    Isn't the 920 clock-for-clock around 10-15% slower than Haswell though? It's still an improvement. It's just not earth-shattering.
     
  9. King of Interns

    King of Interns Simply a laptop enthusiast

    Reputations:
    1,329
    Messages:
    5,418
    Likes Received:
    1,096
    Trophy Points:
    331
    More like 20%-25%. The 920XM is based on an architecture that came out in 2008 and is only 45nm. Still, when pushed hard enough the thing can move! At 3.8GHz I get around 7600 in PassMark. That puts me level with a stock 2920XM or a 3620QM. Even the 4700QM only gets 8000 points. At 4GHz I could have Haswell. Pretty pathetic considering the age of my chip :(

    Considering the next CPU generation usually trumps the previous one by 50%+ every two years, the 920XM should be dead and buried by now, but it is instead still alive and kicking!
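    For reference, the PassMark figures above can be extrapolated with a naive linear-with-clock scaling (optimistic, since memory and cache don't speed up with the core):

```python
# Estimate a PassMark score at a higher clock, assuming the score scales
# linearly with core frequency. Figures are the 920XM numbers quoted above.
known_score, known_clock = 7600, 3.8   # ~7600 PassMark at 3.8 GHz
target_clock = 4.0

estimated = known_score * target_clock / known_clock
print(f"~{estimated:.0f} at {target_clock} GHz")   # prints "~8000 at 4.0 GHz"
```

    Which matches the claim: ~8000 at 4GHz puts a pushed 920XM in 4700QM territory, at least on this one benchmark.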
     
  10. Cloudfire

    Cloudfire (Really odd person)

    Reputations:
    7,279
    Messages:
    10,304
    Likes Received:
    2,878
    Trophy Points:
    581
    I have given up getting excited for new CPUs from Intel.
    They are trying to steal market share from the discrete graphics market, which you can see from the OP.
    7x upgrade in graphics, +5% in CPU performance... What a tragedy.

    Haswell CPUs ran hot; I'm very interested to see what an even smaller die will do to CPU temps.. :rolleyes:
     
  11. HSN21

    HSN21 Notebook Deity

    Reputations:
    358
    Messages:
    756
    Likes Received:
    94
    Trophy Points:
    41
    Does AMD have anything to counter it? Heck, it's been many, many years since I saw anyone buy a laptop with an AMD CPU unless it's a super cheap laptop. If not, then Intel is simply enjoying beating everyone at CPUs and trying to get rid of NVIDIA; to a degree they have been successful, for example most Apple MacBooks now use Intel GPUs and got rid of NVIDIA ;/
     
  12. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    Happy for you, but it's not exactly an equal comparison between an older XM and a newer QM considering you're pushing your 920XM to the limit of what a notebook can handle. Sure, you're getting around the same performance as an entry-level Ivy Bridge quad, but you're kicking out a ton more heat and drawing more than twice the power. Now if you took, say, a 3940XM, you could push it at least 50% higher with the same power/thermal characteristics.
     
  13. Loney111111

    Loney111111 Notebook Deity

    Reputations:
    396
    Messages:
    828
    Likes Received:
    28
    Trophy Points:
    41
    Look up Kaveri APU. Their 35W APUs' IGPs are competitive against their mid-range Radeon 8000Ms (such as 8750M) based on the amount of GPU cores and their clock rates, if memory bandwidth wasn't an issue. Not sure how much those APUs would throttle when both CPU and GPU are running at full load, but their 19W Kaveri APUs can be undervolted to compensate.

    AMD wanted to play the CPU+IGP game with Llano, and Intel decided to do the same.

    What we will see out of Broadwell and AMD's upcoming Carrizo APU is that NVIDIA is going to have to stop selling Fermi GPUs branded as low-end 800Ms and 900Ms. The fact that the GeForce 820M is simply a higher-clocked version of a 435M is a bit annoying.

    Fermi 40nm, Kepler 28nm, Maxwell 28nm and Maxwell 20nm all under one main model number would cause some major confusion.
     
  14. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    The Fermi rebadges are 28nm.
     
  15. Apollo13

    Apollo13 100% 16:10 Screens

    Reputations:
    1,432
    Messages:
    2,578
    Likes Received:
    210
    Trophy Points:
    81
    Pretty much. The 5% CPU increases each generation really kill any inclination to upgrade. I have a 4 GHz 2500K; there's really no point to upgrade to Haswell or Broadwell, particularly considering I'd have to buy a new motherboard. The power consumption would be slightly lower, but not enough to justify the upfront expense and trouble. And the graphics are still no comparison to a dedicated card.

    The increased graphics performance is slightly more important on a laptop if you want to avoid the extra power consumption of a GPU, but if integrated graphics are your priority, you ought to be buying AMD (at least if they ever get their high-end Kaveri APUs into an OEM model).

    It's kinda nice that it means existing hardware doesn't become obsolete for a lot longer. But it's way less exciting than when each generation was 25 - 100% CPU improvement.
     
  16. King of Interns

    King of Interns Simply a laptop enthusiast

    Reputations:
    1,329
    Messages:
    5,418
    Likes Received:
    1,096
    Trophy Points:
    331
    Yeah but at what financial cost! Entry level ivy quad does me fine for zero investment :)

    I would be hard pushed to sell my machine and find an Ivy or Haswell XM machine for 50% more money... Besides, the XM is capable of over 4GHz. Just don't need the power yet haha
     
  17. n=1

    n=1 YEAH SCIENCE!

    Reputations:
    2,544
    Messages:
    4,346
    Likes Received:
    2,600
    Trophy Points:
    231
    Guess I'm waiting for Cannonlake then.
     
  18. Loney111111

    Loney111111 Notebook Deity

    Reputations:
    396
    Messages:
    828
    Likes Received:
    28
    Trophy Points:
    41
    Considering that Intel had to delay 14nm by 6 months for tablets/smartphones, about 9 months for laptops, and over 12 months for desktops, I would not be surprised if Cannonlake is even more delayed.

    NVIDIA and AMD are twiddling their thumbs with TSMC and GF unable to move to 20nm on time.
     
  19. HSN21

    HSN21 Notebook Deity

    Reputations:
    358
    Messages:
    756
    Likes Received:
    94
    Trophy Points:
    41
    NVIDIA is not releasing a 20nm GPU even if the tech was available; they want to milk Maxwell first (powerful Maxwell GPUs are still not out on laptops or desktops), then a year later they will move to 20nm lol
     
  20. n=1

    n=1 YEAH SCIENCE!

    Reputations:
    2,544
    Messages:
    4,346
    Likes Received:
    2,600
    Trophy Points:
    231
    I know. My point is I'm not paying a couple hundred bucks for an insulting 5% or even 10% improvement, so I'll wait. Plus the day my 4900MQ becomes a bottleneck, I probably need a new laptop anyway.
     
  21. Charles P. Jefferies

    Charles P. Jefferies Lead Moderator Super Moderator

    Reputations:
    22,339
    Messages:
    36,639
    Likes Received:
    5,075
    Trophy Points:
    931
    Wow. I don't think Intel has released a second generation or refresh with a performance difference this small before. In our reviews, we attribute a 5% difference to benchmark error/variation ...
     
  22. TSE

    TSE Notebook Deity

    Reputations:
    235
    Messages:
    889
    Likes Received:
    21
    Trophy Points:
    31
    I really haven't had a reason to upgrade from my 2011 MacBook Pro.

    Just threw in an SSD and upgraded the RAM to 16GB. Probably going to keep it for another 3-5 years. The only thing I really want is a higher-resolution display than my 1680x1050 one.
     
  23. King of Interns

    King of Interns Simply a laptop enthusiast

    Reputations:
    1,329
    Messages:
    5,418
    Likes Received:
    1,096
    Trophy Points:
    331
    but but it can compete with a low-end NVIDIA/AMD dGPU !! :laugh:
     
  24. Marksman30k

    Marksman30k Notebook Deity

    Reputations:
    2,080
    Messages:
    1,068
    Likes Received:
    180
    Trophy Points:
    81
    A less talked about upgrade.
    The FIVR supposedly got a substantial upgrade as well. Broadwell should be using a switching design as opposed to a linear design. This means higher efficiency under heavy load to potentially reduce temperatures and also a significant increase in idle/low power efficiency.
     
    Ethrem likes this.
  25. Dufus

    Dufus .

    Reputations:
    1,194
    Messages:
    1,336
    Likes Received:
    548
    Trophy Points:
    131
    Easy to control, just lower the TDP so it does not produce so much heat and market it to make it sound good. :rolleyes:

    Seriously though, looks like we'll have to wait a bit longer before Broadwell's bigger brethren come to see how well they perform.
     
  26. Ethrem

    Ethrem Notebook Prophet

    Reputations:
    1,404
    Messages:
    6,706
    Likes Received:
    4,735
    Trophy Points:
    431
    It's obviously still not ideal if Intel is dropping FIVR entirely in Skylake, but it would be nice if they got temperatures down.

    Sent from my HTC One_M8 on Tapatalk
     
  27. Cakefish

    Cakefish ¯\_(ツ)_/¯

    Reputations:
    1,643
    Messages:
    3,205
    Likes Received:
    1,469
    Trophy Points:
    231
    Games are almost always bottlenecked by the GPU in notebooks anyway. CPUs are very near desktop-grade power while GPUs trail far behind their desktop equivalents. So it's not the end of the world that we aren't going to see a major performance boost (though that would always be welcome, of course).

    I've been under the impression that high-end Haswells are powerful enough not to bottleneck even 780M/880M SLI in the majority of games?
     
  28. n=1

    n=1 YEAH SCIENCE!

    Reputations:
    2,544
    Messages:
    4,346
    Likes Received:
    2,600
    Trophy Points:
    231
    Yeah even a "base" 4700MQ quad core wouldn't bottleneck 880M SLI.
     
  29. Ethrem

    Ethrem Notebook Prophet

    Reputations:
    1,404
    Messages:
    6,706
    Likes Received:
    4,735
    Trophy Points:
    431
    You don't think a 4700MQ would bottleneck an overclocked 780 Ti? I do and that's basically what 880M SLI is.

    Sent from my HTC One_M8 on Tapatalk
     
  30. Loney111111

    Loney111111 Notebook Deity

    Reputations:
    396
    Messages:
    828
    Likes Received:
    28
    Trophy Points:
    41
    The exception would be strategy games such as Civ 5, Total War and Wargame Red Dragon (if you're playing 10v10).
     
  31. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    I think it would in CPU-bound games, esp. WoT and PlanetSide 2. Single-threaded performance is lacking on the 4700MQ.
     
  32. Loney111111

    Loney111111 Notebook Deity

    Reputations:
    396
    Messages:
    828
    Likes Received:
    28
    Trophy Points:
    41
    That's when you disable two of the cores and OC the other two.

    Or get an i7-4610M.
     
  33. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    3.6 GHz is nothing, PlanetSide 2 will eat it alive. You need at least 4.6 GHz.
     
  34. HSN21

    HSN21 Notebook Deity

    Reputations:
    358
    Messages:
    756
    Likes Received:
    94
    Trophy Points:
    41
    Makes me wonder then if the upcoming 8-core Intel CPU (end of this month, confirmed) is worth it for gaming. It's 8 cores at only ~3GHz; the 6-core might reach 5GHz with good coolers, but there's no way the 8-core reaches anywhere near that.
    Seems like we are stuck with 4 cores that max out at ~4GHz in laptops for at least another 2 years. 6 cores in a laptop would be epic, since all of the upcoming games are optimized for 6 cores due to the PS4/Xbox One.
     
  35. n=1

    n=1 YEAH SCIENCE!

    Reputations:
    2,544
    Messages:
    4,346
    Likes Received:
    2,600
    Trophy Points:
    231
    The mobile quad cores are surprisingly adequate at what they do. At stock, the 4700MQ is roughly equivalent to a desktop 4670K when it comes to multi-threaded applications (thanks to HT being absent on the 4670K). For pure single thread speed the 4670K still wins though.

    The gap between mobile and desktop CPUs is much smaller than the GPUs. Your 4940MX is essentially a mobile 4770K. AnandTech did a piece on this last year.
     
  36. Loney111111

    Loney111111 Notebook Deity

    Reputations:
    396
    Messages:
    828
    Likes Received:
    28
    Trophy Points:
    41
    Imagine trying to stuff a desktop GTX 780 Ti or Radeon R9 290x in a laptop...

    Doable if you have a PCI-E port on the laptop mobo, a PCI-E cable connector, and the desktop GPU along with its heatsink and fans connected horizontally to the mobo.

    Though you would need like two 250W PSUs. And that laptop is going to be one gigantic brick.

    EDIT: I recall AMD mentioning that when they decreased the TDP of some of their desktop APUs from 65W to 45W, they lost relatively little performance.

    I would not be surprised if Intel's silicon process also favored power optimization over clock speed, thus allowing their high-end mobile CPUs to be competitive against desktop CPUs.
     
  37. n=1

    n=1 YEAH SCIENCE!

    Reputations:
    2,544
    Messages:
    4,346
    Likes Received:
    2,600
    Trophy Points:
    231
    Meaker told me you could theoretically engineer a 250W mobile card to use both MXM slots (one solely for power delivery). Though the amount of engineering it'll require and the final pricing essentially means it'll be a niche product in an already niche market, so unlikely to ever happen.
     
  38. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    I think the only thing desktop CPU's really have going for them ATM is better overclocking. Once Haswell-E octa-cores come out though, the gap between mobile and desktop CPU's will definitely widen.

    But give credit where credit is due for Intel's amazing architectural efficiency. The fact that you can have a laptop i7 with 90% of the performance of a desktop i7 while sipping 50% of the power is no small feat indeed. AMD can only dream of such efficiency.
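    Taken at face value, the "90% of the performance while sipping 50% of the power" claim implies the mobile chip is roughly 1.8x more efficient per watt:

```python
# Perf-per-watt ratio implied by the mobile-vs-desktop claim above
# (90% of the performance while drawing 50% of the power; both relative
# to a desktop i7 normalized to 1.0).
mobile_perf, mobile_power = 0.90, 0.50
desktop_perf, desktop_power = 1.00, 1.00

ratio = (mobile_perf / mobile_power) / (desktop_perf / desktop_power)
print(f"mobile perf/W is {ratio:.1f}x desktop")   # prints "mobile perf/W is 1.8x desktop"
```

    Most of that comes from binning and lower clocks rather than a different architecture, but the headline ratio is striking either way.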
     
  39. n=1

    n=1 YEAH SCIENCE!

    Reputations:
    2,544
    Messages:
    4,346
    Likes Received:
    2,600
    Trophy Points:
    231
    There's no competing with the hex/octa cores that's for sure. I do worry that 5960X is going to become a runaway nuclear reactor once OC'd...
     
  40. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    So maybe Intel should pull an FX-9590 and make them available only through OEM's and boutiques. Or bundle liquid cooling and require special mobos.
     
  41. Cakefish

    Cakefish ¯\_(ツ)_/¯

    Reputations:
    1,643
    Messages:
    3,205
    Likes Received:
    1,469
    Trophy Points:
    231
    Mmm Rome II is depressingly CPU constrained.
     
  42. Loney111111

    Loney111111 Notebook Deity

    Reputations:
    396
    Messages:
    828
    Likes Received:
    28
    Trophy Points:
    41
    AMD had a more extreme trade-off with their 28nm silicon process. Desktop Kaveri chips had their clock speed reduced, allowing Richland chips to stay competitive.

    The laptop Kaveri chips gained clock speed, which means a 19W Kaveri FX-7500 can compete against the two-year-old 35W Trinity A10-4600 and falls just barely short of the 35W Richland APUs.


    Does it at least properly support quad-cores?
     
  43. TomJGX

    TomJGX I HATE BGA!

    Reputations:
    1,456
    Messages:
    8,707
    Likes Received:
    3,315
    Trophy Points:
    431
    Broadwell looks like it's a fail already... 5%, really? Well, hopefully the battery usage decrease makes up for it.
     
  44. TSE

    TSE Notebook Deity

    Reputations:
    235
    Messages:
    889
    Likes Received:
    21
    Trophy Points:
    31
    I feel like it will be about the same as with Ivy Bridge. The die shrink will allow Intel to either increase performance by 5-10% without improving battery life, or keep performance flat while increasing battery life by 5-10%.

    The GPU being about 30% stronger isn't too bad. I think die shrinks and CPU optimizations are never as strong as the new architecture Intel comes out with every other generation.
     
  45. Marksman30k

    Marksman30k Notebook Deity

    Reputations:
    2,080
    Messages:
    1,068
    Likes Received:
    180
    Trophy Points:
    81
    Ivy did greatly improve performance (in addition to the small IPC boost) in the sense that the lower current draw allowed the chips to turbo boost almost indefinitely, as much as the cooling allows, compared to Sandy Bridge, which was burstier. I am hoping Broadwell will do the same.
     
  46. n=1

    n=1 YEAH SCIENCE!

    Reputations:
    2,544
    Messages:
    4,346
    Likes Received:
    2,600
    Trophy Points:
    231
    Just realized Haswell-E takes DDR4 only. I pity the early adopters of X99, they're going to pay a premium for 1st gen DDR4 that will probably just match or even perform worse than DDR3 at the same price point.

    X79 may still be the way to go until 2016.