AnandTech | Intel Broadwell Architecture Preview: A Glimpse into Core M
CPU: Broadwell's roughly 5% performance improvement comes at a cost of just a 2.5% increase in immediate power consumption.
GPU: Broadwell's GPU has been upgraded to support the latest and greatest graphics APIs, an important milestone for Intel, as this means their iGPU is now at feature parity with iGPUs and dGPUs from AMD and NVIDIA. With support for Direct3D feature level 11_2 and Intel's previous commitment to Direct3D 12, Intel no longer trails AMD and NVIDIA in base features.
Slight increase in GPU performance (still weaker than a desktop GT3 part), and the iGPU consumes less power now.
Basically, the benefits from Broadwell itself are tiny (+5% CPU, ~30%? GPU, still weaker than a desktop iGPU), but lower power consumption/longer battery life is expected due to the move to 14nm.
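To put those headline numbers in perspective, here's a quick back-of-the-envelope calculation (the +5%/+2.5% figures are from the article summary above; the rest is just arithmetic):

```python
# Back-of-the-envelope: what +5% performance at +2.5% power means
# for performance-per-watt (figures from the AnandTech summary above).
perf_gain = 1.05    # Broadwell CPU performance vs. Haswell
power_gain = 1.025  # Broadwell power draw vs. Haswell

perf_per_watt = perf_gain / power_gain
print(f"Perf/W vs. Haswell: {perf_per_watt:.3f}x "
      f"(~{(perf_per_watt - 1) * 100:.1f}% better)")
# -> roughly 1.024x, i.e. only ~2.4% better perf/W under load;
#    the bigger battery-life wins come from 14nm idle/low-power gains.
```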
-
Another epic fail from Intel. NEXT!
-
Not surprising, and I am actually glad to see they've addressed a lot of the issues with Haswell. I see Broadwell as a fixed version of Haswell, one that consumes approximately 30% less power and has substantial potential for overclocking. So, in a sense, this measly 5% doesn't bother me. Haswell is a great line of CPUs with "crippled wiring," so to speak.
-
I doubt we are going to see much potential for overclocking out of these chips. Not with FIVR. This is just a die-shrunk Haswell and is highly unlikely to bring any real improvements to high-performance scenarios, instead catering to the low end.
I would love to see Intel surprise us but I seriously doubt it. -
Well, all we can do is wait and see. The quad-core CPUs are expected to hit next year. So, it will be a while.
-
King of Interns Simply a laptop enthusiast
Why do they bother? Intel makes CPUs, not GPUs lol.
At this rate our GPUs will take over the CPUs' work, as Intel has failed to bring out a groundbreaking CPU since the 920XM. That CPU is now 5 years old and can still keep up (no GPU on it :thumbsup:)
At least this development gives me reason to keep the M15x living a couple more years.....and reason enough to keep pushing the 920XM up a couple more notches.... -
King of Interns Simply a laptop enthusiast
Considering the next GPU generation usually trumps the previous one by 50%+ every two years, the 920XM should be dead and buried by now, but it is instead still alive and kicking!
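For what it's worth, compounding that rule of thumb over the 920XM's five years looks like this (a rough sketch; the 50%-per-two-years rate is just the rule of thumb above, not measured data):

```python
# Compound the "+50% per GPU generation, one generation every two
# years" rule of thumb over the 920XM's ~5-year lifespan.
gain_per_gen = 1.5               # assumed gain per generation
years = 5
generations = years / 2          # ~2.5 generations in 5 years
total = gain_per_gen ** generations
print(f"Expected cumulative gain: ~{total:.1f}x")  # ~2.8x
```
-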
I have given up getting excited for new CPUs from Intel.
They are trying to steal market share from the discrete graphics market, which you can see from the OP.
7x upgrade in graphics, +5% in CPU performance... What a tragedy.
Haswell CPUs ran hot; I'm very interested to see what an even smaller die will do to CPU temps... -
AMD wanted to play the CPU+IGP game with Llano, and Intel decided to do the same.
What we will see out of Broadwell and AMD's upcoming Carrizo APU is that NVIDIA is going to have to stop selling Fermi GPUs branded as low-end 800Ms and 900Ms. The fact that the GeForce 820M is simply a higher-clocked version of a 435M is a bit annoying.
Fermi 40nm, Kepler 28nm, Maxwell 28nm, and Maxwell 20nm all under one main model number would cause some major confusion. -
The increased graphics performance is slightly more important on a laptop if you want to avoid the extra power consumption of a GPU, but if integrated graphics are your priority, you ought to be buying AMD (at least if they ever get their high-end Kaveri APUs into an OEM model).
It's kinda nice that it means existing hardware doesn't become obsolete for a lot longer. But it's way less exciting than when each generation was a 25-100% CPU improvement. -
King of Interns Simply a laptop enthusiast
I would be hard pushed to sell this machine and find an Ivy or Haswell XM chip machine for 50% more money... Besides, the XM is capable of over 4GHz. Just don't need the power yet haha -
Guess I'm waiting for Cannonlake then.
-
NVIDIA and AMD are twiddling their thumbs with TSMC and GF being unable to move to 20nm on time. -
NVIDIA is not releasing a 20nm GPU even if the tech were available; they want to milk Maxwell first (the powerful Maxwell GPUs are still not out on laptops or desktops), then a year later they will move to 20nm lol
-
Charles P. Jefferies Lead Moderator Super Moderator
Wow. I don't think Intel has released a second generation or refresh with a performance difference this small before. In our reviews, we attribute a 5% difference to benchmark error/variation ...
-
I really haven't had a reason to upgrade from my 2011 MacBook Pro.
Just threw in an SSD and upgraded the RAM to 16GB. Probably going to keep it for another 3-5 years. The only thing I really want is a higher-resolution display than my 1680x1050 one. -
King of Interns Simply a laptop enthusiast
The FIVR supposedly got a substantial upgrade as well. Broadwell should be using a switching design as opposed to a linear design. This means higher efficiency under heavy load, to potentially reduce temperatures, and also a significant increase in idle/low-power efficiency.
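To illustrate why switching vs. linear matters: a linear regulator can't do better than Vout/Vin efficiency, while a buck (switching) converter stays near-flat across the step-down ratio. A rough sketch (the voltages, load, and 90% figure are illustrative assumptions, not Intel's published FIVR specs):

```python
# Why a switching FIVR beats a linear design: a linear (LDO-style)
# regulator burns the whole voltage drop as heat, so its best-case
# efficiency is Vout/Vin, while a buck converter holds a roughly
# flat efficiency regardless of the step-down ratio.
# NOTE: all numbers below are illustrative assumptions.
v_in, v_out = 1.8, 0.8   # example input rail and CPU core voltage
i_load = 20.0            # example load current (amps)

linear_eff = v_out / v_in   # ~44% best case for a linear regulator
buck_eff = 0.90             # typical for a modern buck converter

for name, eff in (("linear", linear_eff), ("switching", buck_eff)):
    p_out = v_out * i_load   # power delivered to the core
    p_in = p_out / eff       # power drawn from the input rail
    print(f"{name}: {eff:.0%} efficient, {p_in - p_out:.1f} W lost as heat")
```
-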
Seriously though, looks like we'll have to wait a bit longer before Broadwell's bigger brethren come to see how well they perform. -
Games are always bottlenecked by the GPU in notebooks anyway. CPUs are very near desktop-grade power while GPUs trail far behind their desktop equivalents. So it's not the end of the world that we aren't going to see a major performance boost (though that would always be welcome, of course).
I've been under the impression that high-end Haswells are powerful enough not to bottleneck even 780M/880M SLI in the majority of games? -
Yeah, even a "base" 4700MQ quad core wouldn't bottleneck 880M SLI.
-
The exception would be strategy games such as Civ 5, Total War and Wargame Red Dragon (if you're playing 10v10).
-
That's when you disable two of the cores and OC the other two.
Or get an i7-4610M. -
-
Makes me wonder then if the upcoming 8-core Intel CPU (end of this month, confirmed) is worth it for gaming? It's 8 cores at only ~3GHz; the 6-core might reach 5GHz with good coolers, but no way for the 8-core to reach anywhere near that.
Seems like we are stuck with 4 cores that max out at 4GHz for laptops for at least another 2 years. 6 cores for laptops would be epic, since all of the upcoming games are optimized for 6 cores due to the PS4/Xbox One
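Whether 8 slower cores beat 4 faster ones comes down to how much of a game's frame time actually parallelizes; here's a quick Amdahl's-law sketch (the clocks and parallel fractions are illustrative assumptions):

```python
# Amdahl's-law sketch: 8 cores @ ~3GHz vs. 4 cores @ ~4GHz for games.
# Which wins depends on how much of the frame time parallelizes.
def effective_speed(clock_ghz, cores, parallel_fraction):
    """Throughput relative to a 1-core 1GHz baseline, per Amdahl."""
    serial = 1 - parallel_fraction
    return clock_ghz / (serial + parallel_fraction / cores)

for p in (0.50, 0.75, 0.95):  # illustrative parallel fractions
    four_fast = effective_speed(4.0, 4, p)
    eight_slow = effective_speed(3.0, 8, p)
    winner = "4x4GHz" if four_fast > eight_slow else "8x3GHz"
    print(f"parallel={p:.0%}: 4x4GHz={four_fast:.1f}, "
          f"8x3GHz={eight_slow:.1f} -> {winner}")
# Games with mostly-serial main threads favor the 4 faster cores;
# only heavily threaded workloads tip toward the 8-core chip.
```
-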
The gap between mobile and desktop CPUs is much smaller than with GPUs. Your 4940MX is essentially a mobile 4770K. AnandTech did a piece on this last year. -
Imagine trying to stuff a desktop GTX 780 Ti or Radeon R9 290x in a laptop...
Doable if you have a PCI-E port on the laptop mobo, a PCI-E cable connector, and the desktop GPU along with its heatsink and fans connected horizontally to the mobo.
Though you would need like two 250W PSUs. And that laptop is going to be one gigantic brick.
EDIT: I recall AMD mentioning that when they decreased the TDP of some of their desktop APUs from 65W to 45W, they lost relatively little performance.
I would not be surprised if Intel's silicon process also favored power optimization over clock speed, thus allowing their high-end mobile CPUs to be competitive against desktop CPUs.
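That squares with the textbook approximation that dynamic CPU power scales roughly with f·V², and voltage scales roughly with frequency, so P ∝ f³. A quick sketch of what that implies for AMD's 65W→45W drop (an approximation, not vendor data):

```python
# Why cutting TDP costs relatively little performance: to first
# order, dynamic power ~ f * V^2, and V scales roughly with f in
# the usable range, so P ~ f^3. (Textbook approximation only.)
tdp_old, tdp_new = 65.0, 45.0
clock_ratio = (tdp_new / tdp_old) ** (1 / 3)
print(f"A {1 - tdp_new / tdp_old:.0%} power cut costs only "
      f"~{1 - clock_ratio:.0%} clock speed under P ~ f^3")
# -> a ~31% power cut maps to only a ~12% clock (and roughly
#    performance) loss, which matches AMD's claim above.
```
-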
Meaker told me you could theoretically engineer a 250W mobile card to use both MXM slots (one solely for power delivery). Though the amount of engineering it'll require and the final pricing essentially means it'll be a niche product in an already niche market, so unlikely to ever happen.
-
But give credit where credit is due for Intel's amazing architectural efficiency. The fact that you can have a laptop i7 with 90% of the performance of a desktop i7 while sipping 50% of the power is no small feat indeed. AMD can only dream of such efficiency. -
There's no competing with the hex/octa cores, that's for sure. I do worry that the 5960X is going to become a runaway nuclear reactor once OC'd...
-
The laptop Kaveri chips gained clock speed, which means the 19W Kaveri FX-7500 can compete against the two-year-old 35W Trinity A10-4600M and falls just barely short of the 35W Richland APUs.
-
Broadwell looks like it's a fail already... 5%, really? Well, hopefully the battery usage decrease makes up for it...
-
The GPU being about 30% stronger isn't too bad. I think die shrinks and the optimizations of CPUs are never as strong as the new architectures Intel comes out with every other generation. -
-
Just realized Haswell-E takes DDR4 only. I pity the early adopters of X99; they're going to pay a premium for first-gen DDR4 that will probably just match or even perform worse than DDR3 at the same price point.
X79 may still be the way to go until 2016.
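The latency worry is easy to sanity-check: first-word latency is CAS cycles divided by the memory clock (half the transfer rate). A quick sketch with typical kits of the era (the speed/CL pairs are illustrative, not benchmarks):

```python
# Sanity check on the DDR4 latency worry: first-word latency in ns
# is CAS latency / memory clock, where clock (MHz) = transfer rate
# divided by 2. The speed/CL pairs are typical retail kits of the
# era (an assumption, not measured results).
kits = [
    ("DDR3-1600 CL9",  1600,  9),
    ("DDR3-2133 CL11", 2133, 11),
    ("DDR4-2133 CL15", 2133, 15),
]
for name, mt_s, cl in kits:
    latency_ns = cl * 2000.0 / mt_s  # cl / (mt_s/2 MHz), in ns
    print(f"{name}: {latency_ns:.1f} ns first-word latency")
# -> early DDR4-2133 CL15 (~14 ns) is slower to first word than
#    mature DDR3 kits (~10-11 ns), despite the higher transfer rate.
```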