Everything I have read in the past 12 months says Maxwell is 20nm. Where have you seen anything stating it is 28nm? There was even a little hope, granted, 6 months ago that the x200 series was on 20nm, but that turned out to be false. I find it highly unlikely they'd release another architecture on the same 28nm process.
-
HopelesslyFaithful Notebook Virtuoso
-
It is because the 20nm manufacturing process is not ready yet; it needs another half year or more, and Maxwell is coming earlier than that. Obviously then they are keeping 28nm. Plus, here is some reading about that.
But really, we do need an updated roadmap, because with the extension of Kepler and the delay of 20nm, everything has been turned on its head. I'm just expecting bigger and hotter 28nm chips at this point, however. -
Don't expect 20nm chips until the second half of 2014, and as usual AMD might be the first one out with a 20nm GPU. Now as for which company will come out with 20nm chips first overall, it's probably Samsung.
-
HopelesslyFaithful Notebook Virtuoso
....seriously, it is bad... Intel will have had 22nm for 2-3 years before everyone else -_- It used to be only 1 year. No wonder Intel isn't keeping to their old time frame... no reason to :/ I do think Intel is smart to use up their extra fab space for resale, though.
I am surprised they haven't in the past. Imagine if Intel used the extra space for AMD's GPUs.
14nm GPU next year OMG!!! (geeking out as you can tell)
I am curious how Intel is staying so far ahead. Are they just smarter, or does their premium pricing give them a bigger R&D budget to just throw money at it? -
As to why they're renting out fab space, traditional processor markets are slowing or declining, so Intel (presumably) can't fully fill their fabs with their own chips anymore. As a result, they're letting others use them, provided the others don't directly compete with Intel. -
Karamazovmm Overthinking? Always!
they rent fab space because otherwise those machines would become scrap.
not everything that goes inside is made on the same process node; for example, Crystalwell and various parts like the PCH.
but that still doesn't cover what those factories can dish out, so they try to get contracts for the rest.
a similar example would be how the US car makers dumped low-quality cars in south america to utilize machinery that otherwise would have gone to the trash, so they brought that machinery over and made the cars those machines could make, i.e. garbage. -
Karamazovmm Overthinking? Always!
that raises the question: how much depressed CPU demand is intel expecting? -
Jayayess1190 Waiting on Intel Cannonlake
-
^So according to that, unless Apple switches to the H series in the 13" Retina MBPs, we're going to be stuck with 2 cores and GT3 (non-e) graphics again.
-
Karamazovmm Overthinking? Always!
you mean skylake
my hopes of broadwell coming with ddr4 are getting smaller by the minute, albeit with the industry slowing down production of ddr3 and prices going up considerably -
-
Let's revisit that article you guys got that from, shall we?
http://www.computerworld.com/s/article/print/9244215/DDR4_memory_may_not_find_way_into_PCs_tablets_until_2015
Broadwell will support DDR4.
So yeah, like I said, odds are improving that mobile broadwell in June 2014 will also support DDR4.
Micron and Samsung are already mass-producing DDR4 as we speak. Since it will be available to consumers by June 2014 and desktop users will already be using DDR4, why shouldn't we also get it? -
Karamazovmm Overthinking? Always!
we already knew that haswell E is going to have DDR4, and I didn't get that from that article; I haven't read it.
it's a simple matter of: given that haswell E is going to have it, there is a much smaller chance that broadwell has it. while it's not impossible, it certainly doesn't play in its favour.
I don't know, I don't have much in terms of expectations. I do hope that next year there is DDR4 and SATA Express, because I really want a new pc; it's going to be 3 years already with this one. -
-
You are just making up assumptions based on nothing. -
HopelesslyFaithful Notebook Virtuoso
DDR4 @ 2133MHz (which is shown in that slide above) consumes 25% less power than DDR3 @ 1866MHz (source), so if anything that makes the move more needed in notebooks with batteries than in desktops, if you ask me. -
Karamazovmm Overthinking? Always!
and I'm not even saying it ain't going to happen, just saying that I don't believe it will. -
Meaker@Sager Company Representative
Not all chipsets supported it. Now that the IMC is on the CPU, it becomes a different game. With the iGPU getting stronger and stronger, Intel has a large incentive to push DDR4 to keep an edge over AMD's APUs, which are about their only threat.
-
Jayayess1190 Waiting on Intel Cannonlake
-
18 cores? What?
-
Karamazovmm Overthinking? Always!
-
Also, for processes that are not highly parallel, there is a sharp drop in performance gained per core added. Functionally, this means it is not worth the cost of producing a processor of more than eight cores for client workloads. Higher core counts are only useful in graphics and other programs that are specially designed to be nearly 100% parallel (as in supercomputers).
-
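That drop-off per added core can be sketched with Amdahl's law (a rough model; the 90% parallel fraction below is just an illustrative pick, not a measured figure):

```python
# Amdahl's law: overall speedup on n cores when a fraction p of the
# work is parallelizable. Shows the sharp drop in gain per added core.
def speedup(p, n):
    return 1.0 / ((1.0 - p) + p / n)

p = 0.90  # illustrative: a fairly parallel (90%) workload
prev = 1
for n in (2, 4, 8, 16):
    # marginal speedup contributed by each core added since the last step
    extra = (speedup(p, n) - speedup(p, prev)) / (n - prev)
    print(f"{n:2d} cores: {speedup(p, n):.2f}x total, +{extra:.2f}x per added core")
    prev = n
```

Even at 90% parallel, the gain per added core falls from roughly +0.82x (going 1 to 2 cores) to roughly +0.21x (going 8 to 16), which is why eight cores is about where client workloads stop paying off.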
Karamazovmm Overthinking? Always!
when you move to the enthusiast platform you are left with 6 cores tops -
I remember hearing about Intel's 18-core Haswell Xeon processors a few weeks back on a podcast; I think it was on Twich.
Could any workstation-type application or process even make use of that many cores, or is it more of a proof of concept?
From the business side of things, the payoff (meaning productivity) has to be worth it for the user to justify the price, no? (I am sure it will carry a hefty price tag.)
Sent from my Nexus 5 using Tapatalk -
Karamazovmm Overthinking? Always!
here are some examples
AnandTech | The Mac Pro Review (Late 2013) -
Interesting article, but how does a review of a Mac workstation relate to my question? Maybe I missed something.
Meaker@Sager Company Representative
Because it looks at threaded applications due to the 2011 socket used?
-
Confused here as well.
-
Karamazovmm Overthinking? Always!
imagine that other apps like MASSIVE make fair use of multiple cpu cores as well. -
Aside from that I don't see many changes from Haswell. -
I don't even think MASSIVE Prime could make fair use of 18 cores. On their product page they state: "a workstation with four cores may run sims up to 3.9x faster." By that logic 18 cores would increase sims by 17.55x; I am willing to bet that isn't even remotely near the actual performance gain.
http://www.rfx.com/products/49
My conjecture is that there isn't currently software available to make use of 18 cores... that's okay though, the apps will catch up.
But I digress, I don't want to get the thread off track.
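For what it's worth, the 17.55x figure assumes perfectly linear per-core scaling. Plugging RFX's own "4 cores, up to 3.9x" number into Amdahl's law (a rough model, treating the parallel fraction as fixed) predicts something lower:

```python
# Back out the parallel fraction implied by "4 cores -> up to 3.9x"
# via Amdahl's law, then see what that predicts for 18 cores.
def speedup(p, n):
    return 1.0 / ((1.0 - p) + p / n)

# 3.9 = 1 / ((1 - p) + p/4)  =>  p = (1 - 1/3.9) / (1 - 1/4)
p = (1.0 - 1.0 / 3.9) / (1.0 - 1.0 / 4.0)
print(f"implied parallel fraction: {p:.4f}")                # ~0.9915
print(f"predicted 18-core speedup: {speedup(p, 18):.1f}x")  # ~15.7x, short of 17.55x
```

So even taking the marketing number at face value, the same model caps 18 cores at roughly 15.7x, and real sims are likely further below that.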
-
Jayayess1190 Waiting on Intel Cannonlake
-
HopelesslyFaithful Notebook Virtuoso
This is F@H on a 3720QM (full speed is 3.5GHz and ~45W TDP):
2000 MHz
17 W TDP
117.6 MHz/W (perf/W)
57% of max perf
38% of max TDP
151% efficiency
2GHz was better than 1.8GHz and 2.6GHz, but 2.2GHz might have been better; I never tested that frequency to find the exact value.
With those numbers you could get away with a 16-core chip at 22nm at 2GHz; it would run ~60W TDP. I assume it would be around 60 and not 68 because some things would not have to be duplicated, like extra GPU cores idling and other parts of the CPU.
I have not gotten my hands on a Haswell chip to test scaling, but I bet it would be better down the road with the new things being done. A 16-core 2GHz chip is very doable in a good laptop like an m18x or a Clevo.
2.6GHz was 23.5 watts BTW, so if you had a beast of a cooling system you could get a 16-core 2.6GHz 22nm IB chip in a laptop. I was able to jury-rig my G51j to support 68 watts at 80C, IIRC. (920XM)
You could get away with a 16-core CPU in an m18x if it was programmed correctly. You could run 4 cores at 4-5GHz, 6-8 cores at ~4GHz, and 16 cores at 2-2.6GHz. It would work; they just haven't programmed Windows to manage the CPU well enough to know what parameters to follow. Plus no one really thinks it is worthwhile to do.
I want to get my hands on a Haswell, but preferably a Broadwell, to see how the IVR works and how it compares to an IB CPU. If the IVR does work, then a 16-core laptop is easily doable.
For whatever reason no one wants to get creative with technology :/ The tech is out there but everyone is OK with the status quo. -
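The efficiency figures above can be reproduced in a few lines (using clock speed as the performance proxy, as the post does, and the 3720QM's 3.5GHz / 45W stock numbers):

```python
# Reproduce the F@H perf/W figures for a 3720QM throttled to 2 GHz.
# Performance is approximated by clock speed, per the post.
stock_mhz, stock_w = 3500, 45.0   # full-speed clock and TDP
test_mhz, test_w = 2000, 17.0     # measured at the 2 GHz test point

perf_per_watt = test_mhz / test_w                    # MHz per watt
pct_max_perf = test_mhz / stock_mhz
pct_max_tdp = test_w / stock_w
efficiency = perf_per_watt / (stock_mhz / stock_w)   # relative to stock perf/W

print(f"{perf_per_watt:.1f} MHz/W, {pct_max_perf:.0%} of max perf, "
      f"{pct_max_tdp:.0%} of max TDP, {efficiency:.0%} efficiency")
# -> 117.6 MHz/W, 57% of max perf, 38% of max TDP, 151% efficiency
```

The same arithmetic gives the ~68W ceiling for a hypothetical 16-core part at 2GHz (4 x 17W), which the post rounds down to ~60W assuming shared blocks aren't duplicated.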
When you figure in economics, there comes a point where it is no longer cost effective for companies to invest in adding more CPU cores when they will have less benefit. For programs that are 50% or more parallel, a dual-core processor is between 1.5-2x as powerful as a single-core processor, which is definitely worth it compared to the ideal of 2x. A quad-core processor is between 1.75-3.5x as powerful as a single-core, so not quite as good a deal compared to the ideal (4x), but still okay for more parallel programs. 8-core CPUs are between 1.9-6x more powerful, and 16-cores are between 2-9x more powerful than single-core CPUs.
So while it is definitely possible for companies to put out 16-core processors now, customers are not going to see much real-world performance improvement over quad-core processors of the same design unless the programs they use are all highly parallel. Not enough programs are that parallel for the vast majority of customers to justify the substantial (roughly 4x) price increase a 16-core processor would require, when it delivers a 50% or smaller performance increase over quad-cores in most real-world applications.
I disagree with your conclusion that higher-core processors are not available because no one wants to get creative or because people are satisfied with the status quo. Instead, I believe many-core processors have not proliferated further because there aren't enough programs parallel enough to take advantage of them, and they aren't cost-effective to make or purchase for the vast majority of consumers. Combine that with the slowing gains at each new process node, and performance is approaching the physical limits of silicon; completely new technologies will need to be developed if significant performance gains are to once again be possible. -
HopelesslyFaithful Notebook Virtuoso
I personally could see a use for a 6-8 core CPU now... 16 cores? Not so much, aside from running F@H on 16 cores just because. But 6-8 cores I definitely could use. -
Details of Intel's upcoming 9-series chipset (the one required to run Broadwell CPUs):
So the only improvement in my eyes is support for PCIe M.2 SSDs. That is what we will see in notebooks this year: a slot for M.2 SSDs running at 1GB/s, with RAID 0 support for 2GB/s.
Specs and highlights of Intel -
HopelesslyFaithful Notebook Virtuoso
EDIT: sorry, referring to desktop... I am good with my M17x for now. Granted, desktop is always easier to upgrade ^^ so swapping the mobo and CPU isn't that bad. -
Business-wise, I wouldn't think all innovations would be given in one hit; they need to save some to make the next thing appealing. Maybe start off with 6 cores and leave the 8 for later. -
Are these "U" CPUs any good? Do they match the performance of "M" CPUs?
Forget Intel Haswell, Broadwell on the Way
Discussion in 'Hardware Components and Aftermarket Upgrades' started by Jayayess1190, Mar 16, 2010.