More Haswell info from IDF today.
AnandTech - Intel Haswell Architecture Disclosure: Live Blog
-
-
Karamazovmm Overthinking? Always!
I think I made the right choice going for SB; Haswell is looking more interesting every day -
HopelesslyFaithful Notebook Virtuoso
Glad to hear — looks like I'm getting a new netbook in the near future. I like my SB Samsung netbook, but I always thought it could use a better GPU, and this sounds like 7 W less TDP plus a 50% CPU and 100% GPU boost over my SB chip. Woot!
-
Karamazovmm Overthinking? Always!
it looks lovely, I wonder what the rmbp 13 is going to pack
-
HopelesslyFaithful Notebook Virtuoso
IB
I'll give Apple some credit for finally somewhat keeping up with the pace of tech. I always laughed at Apple before, because we were at SB and they were still selling Core 2s lol
-
UPDATE
Intel just confirmed Haswell with HD 5000 is coming in 2013 and is going to be around DOUBLE the speed of the current HD 4000. Insane -
tilleroftheearth Wisdom listens quietly...
Link?
-
Karamazovmm Overthinking? Always!
-
very excited about Haswell... will be buying a GT3-equipped Ultrabook at release (pending benchmarks of course)
-
Twice as fast, three times as fast, yadda yadda yadda.
All lip service. It'll be "faster" but I can't imagine why people get so excited by the marketing spin. -
Karamazovmm Overthinking? Always!
-
that is true -- hence pending benchmarks
and that Skyrim demo means absolutely nothing. Not a single mention of FPS, settings, etc. -
Karamazovmm Overthinking? Always!
Intel's predictions (LOL, they have the chips with them for testing) have been pretty accurate regarding their CPU and GPU performance ever since the Core i line.
-
Call of Duty 3, 1080p, high details. No information about FPS, but it looks playable.
Skyrim, 1080p, high details. No information about FPS, but it looks playable.
[Embedded YouTube video: http://www.youtube.com/v/H3VoD4IkHCY]
Last edited by a moderator: May 6, 2015 -
-
Intel's driver support is so bad, they will never "kill" midrange GPUs.
-
Karamazovmm Overthinking? Always!
-
They are setting the benchmark for machines without a dedicated GPU... and that is amazing. Hard to believe the current HD 3000 is better than my old 8600M GT -
HopelesslyFaithful Notebook Virtuoso
-
Haswell might replace Fermi midrange mobile GPUs (555M and below), but no way in hell is it replacing Kepler midrange GPUs (GT 640M+).
This will continue every year with every CPU Intel releases, and with the low price of the Kepler midrange, I hardly think Intel will offer any real competition. I also think Intel will hit a dead end pretty soon, since the iGPU doesn't have the same free resources as a dedicated GPU — it has to share resources with the CPU. After a while Intel will face the question: just make a CPU, start manufacturing GPUs, or do both.
imo -
Karamazovmm Overthinking? Always!
However, the idea that Intel is really aiming for the midrange ain't bull. It may not be the high midrange (640M, 7750M); it may well be the low midrange (630M, 7670M); but they are gunning for it. Every year since the introduction of the Core i series we've received a 10-15% CPU performance bump, but on the GPU side it's no less than 50% each year. Sincerely, I hope Intel gets it right, so that it forces AMD (as both a CPU and GPU maker) and Nvidia to work harder.
For example, it's ridiculous that it took 2+ years to surpass the performance of the 5870M and the 460M in the higher midrange, while on the high end of gaming we've seen a tremendous boost in performance since those cards -
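As a back-of-envelope illustration of what those growth rates imply when compounded, here is a quick sketch. The three-generation horizon and the 12.5% CPU midpoint are my own assumptions, not figures from the thread:

```python
# Compounding the yearly gains quoted above:
# ~10-15% CPU vs ~50% iGPU per generation.
def compound(rate, generations):
    """Relative performance after `generations` at `rate` gain per generation."""
    return (1 + rate) ** generations

cpu_gain = compound(0.125, 3)   # midpoint of 10-15%, three generations
gpu_gain = compound(0.50, 3)    # 50% per generation, three generations

print(f"CPU after 3 gens:  {cpu_gain:.2f}x")   # ~1.42x
print(f"iGPU after 3 gens: {gpu_gain:.2f}x")   # ~3.38x
```

If those rates held, the iGPU would more than double its relative standing against the CPU every three generations — which is why "Intel is gunning for the midrange" is plausible even from a weak starting point.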
HopelesslyFaithful Notebook Virtuoso
EDIT: and the 660M is high-midrange/low-high-end -
Karamazovmm Overthinking? Always!
but yes, sorry, 650m is the high and 640m is the low. -
-
HopelesslyFaithful Notebook Virtuoso
note though it had no shadows or AA, so that is like 50% of the GPU power
-
TDP is only relevant to battery life under load, though. In light-load cases like browsing, the CPU is already pretty much at zero power. Unfortunately, the rest of the platform does not scale, so Haswell focuses heavily on bringing down platform idle power.
At idle with Ultrabooks you are talking:
6-7W total system power
~3W display
~3-4W platform
Haswell drops the platform idle power to 1/20th of that, while techniques like Panel Self Refresh save ~0.5W on the display.
Think of how, today, only the CPU scales up and down under power management, while the rest of the platform (all I/O devices: SSD, PCH, PCI Express, USB) stays relatively constant. With Haswell, the rest of the platform will also scale down in power when it's not in demand.
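A rough battery-life sketch using the idle numbers above. The 50 Wh battery capacity is my assumption for a typical Ultrabook; the "1/20th platform power" and "0.5 W Panel Self Refresh saving" figures are taken at face value from the post, and CPU idle power is treated as negligible:

```python
# Idle runtime estimate: battery capacity divided by idle draw.
def idle_hours(battery_wh, display_w, platform_w):
    """Hours of idle runtime for a given battery and idle power breakdown."""
    return battery_wh / (display_w + platform_w)

BATTERY_WH = 50.0  # assumed Ultrabook battery

before = idle_hours(BATTERY_WH, display_w=3.0, platform_w=3.5)
after = idle_hours(BATTERY_WH, display_w=3.0 - 0.5, platform_w=3.5 / 20)
print(f"idle runtime before: {before:.1f} h")  # ~7.7 h
print(f"idle runtime after:  {after:.1f} h")   # ~18.7 h
```

Even as a crude estimate, that's why the platform-idle work matters far more for everyday battery life than the CPU's TDP number.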
-
Karamazovmm Overthinking? Always!
actually, for me the most interesting part is that Intel introduced a new power state — an intermediate lower-power state
-
-
Haswell is going to bring true Ultrabooks.
When battery life is longer than 10 hours and you still have the power of an i3/i5, then you can truly be a road warrior!
I don't miss the days of 3-4 hour battery life. -
I've been seeing various release timelines all over the net :s
March-May next year, summer, Q3 or Q4... all over the place :S -
Jayayess1190 Waiting on Intel Cannonlake
-
Karamazovmm Overthinking? Always!
AnandTech - Making sense out of Intel Haswell Transactional Synchronization eXtensions
another interesting tidbit -
tilleroftheearth Wisdom listens quietly...
Mr MM,
Just finished reading that piece and saw you beat me to posting it here!
I was just wondering when we'll get more cores, and whether the TSX gains shown in the graphs here:
See:
AnandTech - Making Sense of the Intel Haswell Transactional Synchronization eXtensions
can be applied as easily to today's single-core applications. If so, we should be seeing 16/48/96+ core processors much sooner.
Can't wait!
And a side bonus will be that using big/hot/noisy/power-hungry GPUs for 'compute' will finally be a relic of the distant past (especially as HD 3000/4000 iGPU graphics are already more than good enough for productivity workstations).
Seems like nothing can stop Intel's onward and upward momentum - hope they continue to have even more surprises for us over the next few years/decades. -
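For readers unfamiliar with what TSX buys developers: the real mechanism is hardware (the RTM instructions XBEGIN/XEND, with transparent conflict detection and abort), but the programming model it enables — try the critical section optimistically, retry on conflict, and fall back to a real lock after repeated aborts — can be sketched in software. Everything below (class name, version-counter stand-in for hardware conflict detection, retry limit) is my own illustration, not Intel's API:

```python
import threading

# Software analogy of TSX-style lock elision. A version counter plays
# the role of the hardware's conflict detection; after a few aborted
# attempts we fall back to taking the real lock, as RTM code must.
class ElidedCounter:
    MAX_RETRIES = 5

    def __init__(self):
        self.value = 0
        self.version = 0
        self.lock = threading.Lock()

    def increment(self):
        for _ in range(self.MAX_RETRIES):
            seen = self.version            # "begin transaction": snapshot state
            new_value = self.value + 1     # speculative work, no lock held
            with self.lock:                # "commit": validate no conflict
                if self.version == seen:
                    self.value = new_value
                    self.version += 1
                    return
            # someone else committed in between -> abort and retry
        with self.lock:                    # fallback path: plain locking
            self.value += 1
            self.version += 1
```

The point the article makes is that hardware does all of this invisibly for existing lock-based code, so coarse locks behave like fine-grained ones when threads don't actually conflict — which is why it matters for scaling to many cores.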
-
tilleroftheearth Wisdom listens quietly...
If sustained progress (by Intel) like this doesn't light a fire under AMD's butt, what will?
Yeah, I agree with you - but AMD seems to have been on the wrong path (imo) for a long time now. Intel is picking up and exploring AMD's best ideas (iGPUs/APUs), while AMD seemingly ignores (to its own detriment...) Intel's strengths and direction - no matter how obviously better they are.
But I guess that is real competition - both companies exploring entirely different branches (if they were merely copying each other, that would be boring too). -
I feel Intel's main competitor is going to be ARM and not AMD. Especially once they have quad-core A15 tablets at a cheap price.
-
tilleroftheearth Wisdom listens quietly...
I agree that AMD is not even on Intel's 'keep a watch on' radar for a while, but Intel is going to be ARM's competitor (not the other way around).
See:
Intel Atom S ''Centerton'' Specs Leaked
Sure, this isn't quite 'on target' yet, but I'm sure ARM is just as worried as AMD is at this point... -
Karamazovmm Overthinking? Always!
I agree that in the end this is one major move to let developers gain more performance out of not-so-good multithreaded (MT) code. Fine-tuning for MT is a high-cost, time-consuming job, and you really get gray hairs (or pull them all out) before you do an extremely good job of it.
So Haswell or Broadwell will be when I change my PC. I hope we get more Broadwell news near my purchase time, i.e. around this time next year. -
-
HopelesslyFaithful Notebook Virtuoso
-
Not impressed. Krait quad cores are a lot more power efficient and fit the phone envelope. The 2 GHz Centerton seems to have an 8.5 W TDP; a 10 W Ivy Bridge would be more impressive. Atom will look woeful until it gets the new architecture.
Plus, I doubt Intel will ever be bigger than ARM in low-power devices. For one, Apple, Samsung, and Qualcomm will continue to develop their own cores for differentiation, and those will dominate the majority of devices; plus ARM chips will be significantly cheaper than Intel's. I doubt Intel will ever want to get into the sub-$20 market.
I think ARMv8 will definitely challenge Intel. The ARM ecosystem is so robust that Intel will be stretched for sure. -
tilleroftheearth Wisdom listens quietly...
I agree; Atom is not what I'm impressed about.
The direction Intel is heading in is what is interesting - I'm sure in less than a decade we'll have Intel IB-level chips powering future smartphones/tablets/toys/etc. at less than 1 W power consumption.
Krait quad cores? I don't see those ramping up to IB+ specs (ever)... Intel, on the other hand, is in a better position to bring their already powerful processors down to acceptable power and thermal levels.
Will Intel ever sell for $20/pc? No, probably not - but why should they? I would rather pay more for more - not less for barely-good-enough (in real-world usage...) 2/3/4/5-year-old designs that cost even less than the $20 'target' I assume Krait is aiming for. -
HopelesslyFaithful Notebook Virtuoso
that's what I hate about ARM: the current 28nm quad cores are just barely good enough that you don't want to dump them for the next best thing in a year. All the SoCs before that were like that - they worked, but as soon as something new came out, it was abandon ship!
-
H.A.L. 9000 Occam's Chainsaw
The thing is at this point, Apple has set a precedent for tailoring their own SoC (A6). Apple is so vertically integrated at this point that I wouldn't put it past them to start implementing higher power versions of their own SoC designs into products like the MBA (or something like it). ARM designs certainly have enough power to push that form factor. Imagine a MBA with a true 15+ hour battery life.
It's not so much the hardware that makes people want the next best thing. It's just the way people are. Galaxy S IIs from more than a year ago can still mostly keep up with brand-new Krait handsets with 2 GB of RAM... it's just that Krait has more floating-point power. -
Karamazovmm Overthinking? Always!
the problem with that idea is how people would develop apps for ARM designs instead of x86. We'll get a glimpse of how that works out with Windows RT - and, going the opposite direction, with the Medfield-equipped phones (though they are emulating the arch in Android, AFAIK)
-
Karamazovmm Overthinking? Always!
AnandTech - Intel's Haswell Architecture Analyzed: Building a New PC and a New Intel
yep another article
I haven't finished reading - currently on page 8 - but here is the thing: power, power, power. That's the order of business, along with a much wider execution engine (meaning a more GPU-like workflow) -
That is a very long, bullish read about a 5-15% CPU performance increase over Ivy - except for one page, which reveals that Intel's uncompetitive part (the GPU) will stay uncompetitive in the Haswell generation too:
" Haswell builds on the same fundamental GPU architecture we saw in Ivy Bridge. We won't see a dramatic redesign/re-plumbing of the graphics hardware until Broadwell in 2014."
" Overall performance gains should be about 2x for GT3 (presumably with eDRAM) over HD 4000 in a high TDP part. In Ultrabooks those gains will be limited to around 30% max given the strict power limits."
After all this, I can imagine AMD's Kaveri will be delayed to 2014; there is no reason to hurry. And I think by this time last year we had already heard Trinity news... -
Karamazovmm Overthinking? Always!
No one here is expecting AMD levels of iGPU in Haswell. No one.
AMD is far ahead in that game as far as I'm concerned, and only with Broadwell will Intel start to think they can do more there, since the iGPU is slated to receive an overhaul. Do I still expect they will be toe to toe when Broadwell hits? No. -
Oh, I'm glad to read your posts, because I've had to correct a few people before who claimed there is only a 20% difference between the Trinity and Ivy GPUs, and that Haswell will be faster - even as fast as Kaveri.
Forget Intel Ivy Bridge, Haswell on the way
Discussion in 'Hardware Components and Aftermarket Upgrades' started by Jayayess1190, Jan 28, 2011.