Hardware has to keep up with software. For software to sell, it needs more and more features. More features = more resources needed. Therefore, until software stops developing, hardware is going to have to keep pace.
-
-
Graphene sounds like less of a pipe dream, but there is still a massive gulf between having a few experimental integrated circuits and a chip with a billion transistors on it. I am sure IBM, Intel and AMD are all looking into it, but if it were something that could be done by now, IBM would have done it. The incentive is the same greed that you blame for stagnation: if IBM could come up with something truly revolutionary (self-cooling, 30GHz, etc.), it would leapfrog Intel and pick up a lot of server share.
You don't seem to understand how hard it is to work at these scales. For example, you disparage Intel's Tri-Gate advance, but it will take TSMC, GlobalFoundries, etc. years to catch up. What you call "revisions" require multi-billion-dollar investments; a "revolution" like graphene would definitely require tens of billions of dollars. The reason you aren't seeing them isn't that Intel is greedy, it's that nobody wants to gamble that kind of money on something they're not sure of. -
-
Karamazovmm Overthinking? Always!
Have you ever seen the coding olympics? Those guys are the best we can produce in terms of elegance, and they are so few.
The question is do I want it to run, or to run great? -
Intel to launch Pelican Lake shrink on 7nm in 2016 | KitGuru -
-
-
So when is Ivy supposed to be released? We are not talking CES 2012, are we? From the videos it looks like Intel already has everything up and running. There is one video where they have a working Ivy CPU in a laptop and a desktop.
-
-
That sucks. I had my hopes up when I read H2 2011. CES is probably the best way for Intel to advertise and reach out to people too, so it makes sense. Dang.
-
The thing is, the 22nm process is the last major die shrink that can be done without serious issues. Everything after that is much more complicated, and while Intel wants 7nm by 2016, I doubt it will happen. The best we will get is probably 16nm by 2015 and 7nm somewhere around 2018, maybe. That tick-tock philosophy won't work anymore.
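To see why 7nm by 2016 looks optimistic, here is a minimal back-of-the-envelope sketch, assuming a strict two-year cadence and the textbook ~0.7x linear shrink per full node (illustrative assumptions, not Intel's actual roadmap):

```python
# Rough node projection: 2-year cadence, ~0.7x linear shrink per node,
# starting from 22nm in 2011 (assumptions for illustration only).
node_nm = 22.0
year = 2011
shrink = 0.7

while node_nm * shrink > 7:
    year += 2
    node_nm *= shrink
    print(f"{year}: ~{node_nm:.0f}nm")
```

Even on that optimistic schedule, ~7-8nm doesn't land until around 2017, and any slip in the cadence pushes it toward the 2018 figure above.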
But truth be told, as someone said above, CPU power is more than enough nowadays. I have my QX9300 and it has yet to show any signs that it is an old CPU. The issue now concentrates more on GPU power, where improvements in the past years have been mostly due to die shrinks rather than architecture changes. -
-
IBM bakes new 3D circuit design - CNET News
Intel unfurls experimental 3D transistors - CNET News
The 3D circuit concept will be a decade old before Ivy Bridge comes to market... which should give those people asking "When graphene?" a good hint at their answer. -
... obviously it's been thought of before. You think they'd throw a new product out there without years of testing and perfecting? Every piece of hardware you've got in your computer was thought of years before it was released.
This is like saying "Why aren't we on 14nm? They came up with that tech years ago." -
Mr_Mysterious Like...duuuuuude
I really don't think the vast majority of us need any more CPU computing power... that's why people are saying that tablets are cannibalizing the laptop market.
Mr. Mysterious -
It takes a maximum of 2 years to implement, test and perfect new concepts.
Aeroplane designs back in the '60s took that much time, and today it should be even faster, what with the computers we are using.
If the above articles were published at the exact times those concepts were discovered (and there's a possibility they were not, and that the information was out of date even back then), we could have been using 3D stacking in silicon chips 7 years ago.
Heck... graphene is older as a concept than that, and IBM has already made several demonstrations (the last one being in 2010 at 100GHz).
Seriously... for a commercial market, we should have seen implementation of graphene by now.
But at the rate we are going, it won't happen for another 9 years (if even then).
I know that some of you are content with the 'toys' we have on the market, even while somewhat more efficient tech like high-capacity SSDs remains sorely out of reach due to stupidly high prices, but other people DO in fact require more powerful/faster chips than what we have today and would like to see our technology be actual 'state of the art', not this primitive trash that's being used to milk our pockets (I'm sorry, but that's what it is when you look at it from a broader point of view). -
"It takes a maximum of 2 years to implement, test and perfect new concepts.
Aeroplane designs back in the '60s took that much time, and today it should be even faster, what with the computers we are using."
Are you comparing two completely unrelated ideas to each other? -
So what if they are unrelated?
The point stands that it doesn't take vast amounts of time to implement something into practice.
Human history is full of such examples irrespective of how 'complex/different' certain technologies in unrelated fields are.
Going to the moon was a major engineering feat back in the late '60s, yet we did it in a short time frame with technology that cannot compare to what we have today.
Why?
Because there was incentive to do so.
The industry on the other hand has little incentive to throw advanced technologies into the market.
They'd lose billions by skipping the incremental revisions and giving us 30GHz graphene quad-core CPUs next year, for example.
And since it would take time for software to catch up, there would be a pretty large pause (of at least half a decade) before people would be prompted to upgrade to the 100GHz version (which IBM demonstrated what... a mere year or two after 30GHz... and which amounts to roughly a 230% difference in raw performance on the same technology - now THAT is a viable upgrade).
A mere change in architecture that yields roughly 35% performance enhancement clock for clock every 2 years is NOT what I would call a 'major' improvement.
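Just as a sanity check of those percentages (the 35%-per-generation figure above is the poster's rough estimate, not a measured number), a quick bit of arithmetic:

```python
# Illustrative arithmetic only, using the figures quoted above.
# 30GHz -> 100GHz on the same (graphene) technology:
jump = (100 - 30) / 30 * 100
print(f"30GHz -> 100GHz: ~{jump:.0f}% more raw clock")  # ~233%

# Compounding ~35% clock-for-clock gains every 2 years:
perf = 1.0
for gen in range(1, 6):
    perf *= 1.35
    print(f"after {gen * 2} years: {perf:.2f}x baseline")
```

At ~35% per step it takes roughly four generations (about eight years) to accumulate the kind of jump a single 30GHz-to-100GHz move would represent.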
For the life of me... every time Intel shrinks the manufacturing process, it brings dingy architectural improvements, limits the clock speeds AND has the audacity to OVERCHARGE for CPUs that bring only a 15% speed increase...
Lol... I have to laugh at those who claim that technology is 'evolving'.
Moore's law is a very convenient method of bringing out technologies in a capitalist environment... it ensures maximum profits on the same technology without the need to move on to superior approaches.
But as we already established... businesses do this for profits, not 'progress'. -
Meaker@Sager Company Representative
Deks, not only do you have to come up with a new technology, you have to make sure it can be produced in LARGE quantities, with multi-billion-transistor chips yielding at a high percentage.
You are trivialising some extremely hard problems.
On one side you point out that Intel would make billions more if it released a new-tech chip (and if someone like AMD did it first, they would win); on the other side you say Intel is just happy with what it makes now.
You need to think this through more carefully and do more research on the subject. -
"The point stands that it doesn't take vast amounts of time to implement something into practice."
Based on what? Airplanes? My point is that coming up with the idea to go get eggs takes me 5 minutes; that doesn't mean that once you come up with an idea, it takes 5 minutes to implement. -
Kind of off topic, but shouldn't we be worrying about how fast we hit the bottom of the oil barrel and what we are going to do when we come out the other side? Processor technology is exciting enough as it is, with more processors on one die, lower power consumption, the threshold for how small we can go before electrons can't pass, the competition, and the imagination/concepts for newer processor technology. Light and solar, and the mix of the two, are the energies of the future. There's been talk of optical processors and optical storage (not as we know it now) that consume far less energy and produce less heat, and they are in the early concept stages. I don't think a lot of people in industrialized societies realize how much we need to get off our dependence on oil. I don't want to go too far off topic, but electricity as you know it now, and even the way the internet is webbed together, are going to change drastically because of oil, or the lack of it to be more precise. Distributed networking is still in its infancy, but imagine how that will play into the mix also.
-
-
-
I do!
I think what you have to realize is that moving to graphene is not some innovative idea. It's a new material. 3D transistors and new architectures are innovative and very exciting. -
-
Chief River 2012 platform has fast flash standby
Ivy Bridge to process media up to 4X faster
Enjoy. I am all for Ivy Bridge and I am happy I waited for it. -
Mr_Mysterious Like...duuuuuude
Lol, after the SB recall I'm not sure I trust Intel when it comes to release dates anymore. They'd want to be more careful, so we can expect it after the halfway point of the year.
Mr. Mysterious -
-
What kind of difference in power consumption can we expect between SB and IB?
-
-
Karamazovmm Overthinking? Always!
-
Noise levels?
The only noise attributed to a laptop (at least in my case) would be the fan.
That's the fault of air-cooling.
As for performance benefits... we still don't know the raw numbers between SB and IB, but it's possible this new 3D stacking will allow for up to a 50% gain.
However, it could just as easily amount to 'meaningless hype' and nothing more than lower power requirements.
What I wanted to know is: when exactly can we expect 6-core and 8-core CPUs in laptops?
SB is the second generation of quads, and desktops are well into the 8-core range. -
50% from Sandy to Ivy? I don't see that happening. More like around 20%.
As for power consumption, the transistors use 50% less power, but not the entire CPU. Increased frequency and more EUs will take their toll. Intel has an obligation to keep the TDP under a certain level to be able to cater to OEMs that put them in notebooks. Which means there is only so much speed you can get out of the CPUs with the current technology (die shrink etc.) before you hit a wall. And from the specs released so far we know that Ivy will have an identical TDP to Sandy Bridge. That should tell us something about the power consumption vs Sandy... -
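A rough way to see why the per-transistor saving doesn't halve package power (all the numbers below are placeholder assumptions, not Intel's figures):

```python
# Dynamic power scales roughly as P ~ units * C * V^2 * f.
# Placeholder assumptions: -50% switching power per transistor,
# ~10% higher clocks, 16 EUs vs 12 in the iGPU.

def relative_power(per_transistor, freq, units):
    return per_transistor * freq * units

sandy = relative_power(1.00, 1.00, 1.00)   # baseline
ivy = relative_power(0.50, 1.10, 16 / 12)  # assumed scaling factors

print(f"Ivy vs Sandy package power: {ivy / sandy:.2f}x")  # ~0.73x, not 0.5x
```

So at a fixed TDP, most of the headroom gets spent on clocks and extra EUs rather than showing up as lower power draw.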
Karamazovmm Overthinking? Always!
http://en.wikipedia.org/wiki/Noise_(electronics) -
-
-
Well, I hope not everybody is expecting astronomical battery life. Think about it: only the CPU is getting the benefit here. Be realistic and expect an extra hour tops with the batteries we're using now. Maybe idle times will go up, but the gain under load will be much smaller. We're probably getting more benefit from the die shrink than from the 3D transistors.
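For what it's worth, here's the kind of rough battery math behind "an extra hour tops" (every number below is a placeholder assumption, not a measurement):

```python
# If the CPU is only part of total platform draw, cutting CPU power
# by X% gains far less than X% in battery life.
platform_w = 15.0    # assumed total platform draw under light load (W)
cpu_share = 0.35     # assumed fraction of that drawn by the CPU
cpu_saving = 0.30    # assumed CPU power reduction from the shrink
battery_wh = 60.0    # assumed battery capacity (Wh)

old_hours = battery_wh / platform_w
new_hours = battery_wh / (platform_w * (1 - cpu_share * cpu_saving))
print(f"{old_hours:.1f}h -> {new_hours:.1f}h (+{new_hours - old_hours:.1f}h)")
```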
BTW, aren't Ivy Bridge processors able to be dropped in to replace Sandy Bridge processors? I'm pretty sure that's been mentioned before. -
-
Compatible with H67 / P67 / Z68 mobos with a BIOS update on the desktop side. Nothing certain for laptops though.
-
Ivy Bridge to launch in March / April 2012 - Blah! Now I have to wait all the way to April.
-
They seem to really be going full steam ahead with mobile chips. Last week Otellini said they are shifting their central focus to mobile, and since then they've had a spat with Microsoft and proclaimed that Apple leads the way for them. In other words, Intel knows that in order to survive they need to compete with ARM in the mobile sector while maintaining their relationship with Apple, wherever Apple goes. But since Apple is rumored to be planning a switch to ARM in a couple of years, Intel could be in serious trouble.
I don't mean to digress, but these points I feel are relevant in understanding that Intel needs to stretch out profits from traditional PC chips as long as possible. -
Karamazovmm Overthinking? Always!
-
And it gets even better: Only quad-core mobile Ivy Bridge in March / April
Looks like the first ones to come out are the quads, with the duals coming out later ... -
While I'm excited about Ivy Bridge, I pulled the trigger on Sandy Bridge and found it worthwhile. I'm getting significant battery life increases over my Penryn-2 laptop (though I don't go unplugged that often), as well as gains in speed and graphics performance (Optimus vs the ATI switchable graphics in the past system). mSATA has also been a great development, convincing me to go with my first SSD.
I found little compelling reason for Clarkdale or Arrandale (and skipped them), but I'm happy with this choice. I'll probably skip Ivy Bridge because I'm not made of money, but I'll watch for its successor.
As for the desktop arena, I'm still using a Q9650. I can't justify the cost of a mainboard, CPU, and RAM swap when I have 8GB of DDR2 and a processor that is still competitive enough for my needs.
We've come a long way since my first home-built system (386DX-20MHz). -
Forget Ivy Bridge. Let's talk Haswell.
It's being developed by the same team that brought us the Nehalem architecture, by far the most innovative product Intel has managed to deliver.
-
-
Nehalem was certainly a great addition but not as great as Conroe imo.
-
Meaker@Sager Company Representative
Which was not as good as the original Pentium M.
"Why is your 2GHz Pentium M outperforming my 3.6GHz Pentium 4?" -
Was it really that bad?
-
Jayayess1190 Waiting on Intel Cannonlake
Will get updated when we learn more.
Forget Huron River, 22nm Ivy Bridge on the Way