If I'm reading your post correctly, you're saying that the CPU and/or GPU will throttle if the whole chip would go over TDP (which is true, at least in the case of the HD 4000). Since HTWingNut is saying that his chip isn't throttling (first post on this page), I think it's not going over TDP. I was just trying to point that out in my first post.
Also, you're right that TDP has nothing to do with temperature. Temperature is not the same as heat (e.g. a candle flame has a higher temperature than a tub of boiling water, but the latter contains more heat).
-
HopelesslyFaithful Notebook Virtuoso
TDP is the value Intel has designed to limit a CPU so it does not overheat a system, and it gives manufacturers a guide for how to build the heatsink. If the CPU uses its full 45w TDP at max load, then the GPU cannot be used because there is no thermal headroom. If both are working, then they have to share. As I stated with my u2357m (I think that is the name), it has a 17 watt TDP and either the CPU or the GPU can use it all up by itself, or come very close. So when I play a game, the CPU and GPU have to share... period. They are not allowed to break that threshold. Note that if the ULV or mobile chip has Turbo Boost, it can go over the standard TDP for a set window as long as temps allow, but from my understanding it will never stay there forever. It still has a TDP window, but it is called the short-term TDP window. So the 3720QM has a 65w TDP for the short term and a 45w TDP for the long term. (A toy sketch of the sharing follows.)
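To make the sharing idea concrete, here's a toy Python sketch of a shared package budget. This is my own simplification, not Intel's actual balancing logic, and the proportional split is just an assumption for illustration:

```python
# Toy model of a shared package TDP budget (my simplification, not Intel's
# real power-balancing logic): when CPU + iGPU demand exceeds the package
# limit, both get scaled back so the total stays at the limit.

def share_tdp(cpu_demand_w, gpu_demand_w, package_tdp_w):
    """Return (cpu_w, gpu_w) after clamping total draw to the package TDP."""
    total = cpu_demand_w + gpu_demand_w
    if total <= package_tdp_w:
        return cpu_demand_w, gpu_demand_w  # headroom left, no throttling
    scale = package_tdp_w / total          # assume a proportional split
    return cpu_demand_w * scale, gpu_demand_w * scale

# A 17w ULV part where CPU and iGPU can each eat ~17w alone:
print(share_tdp(17.0, 17.0, 17.0))  # -> (8.5, 8.5): both run at ~50%
```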
Now towards you directly, HTWingNut... my god, has HTWingNut used Wikipedia as a source??!?!?! How many times have you trolled me for using Wikipedia and notebookcheck.net? You have trolled me big time for using notebookcheck.net, and it is by far more reputable than Wikipedia. I think you even trolled me for using TFT Central, even though they have the best reviews, period. So it amazes me how you can be a hypocrite about this. Nonetheless, I will let this slide, but let's learn how to play nice in this forum and get off our high horse!

So anyways, HT, by your logic that means if I am running my 920XM at 30C, I can overclock it to 4GHz while leaving the TDP at 55w, right? Bollocks! We have to up the TDP because otherwise it will limit the overclock. Hence why you see the power draw fluctuate as it does different tasks: it has nothing to do with heat, only power draw (the heat follows from the power, not the other way around). Also note that a 45w TDP does not mean the chip actually uses 45w of energy; it is just a value Intel uses to budget power and heat, and actual energy use tends to be higher than TDP. Same with my 3720QM: I can set it to 3.8GHz, but it'll only run at that for the period the short-term TDP window allows... which is still a TDP window. If I set the short-term window to 45w, I will never go higher than 45w even if temps are low. Do you understand how this works now? I don't know AMD chips at all, but they should follow the same principle. (Removed my last statement because I am not sure it was accurate, and it was confusing.) A minimal sketch of the short-term vs long-term window follows.
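Here's a minimal sketch of the short-term vs long-term limit, using the 3720QM numbers from above (45w/65w). The real hardware tracks a weighted energy average rather than a fixed timer, and the 28-second window length here is made up:

```python
# Rough sketch of short-term (PL2) vs long-term (PL1) power limits.
# Real silicon tracks an exponentially weighted energy average; the fixed
# timer and the 28s window here are simplifying assumptions.

PL1_W, PL2_W, TURBO_WINDOW_S = 45.0, 65.0, 28.0

def power_limit(requested_w, seconds_at_high_power):
    """Clamp package power to PL2 inside the turbo window, PL1 after it."""
    limit = PL2_W if seconds_at_high_power < TURBO_WINDOW_S else PL1_W
    return min(requested_w, limit)

print(power_limit(80.0, 5.0))   # early in the window -> capped at 65.0
print(power_limit(80.0, 60.0))  # window expired      -> capped at 45.0
```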
As far as I understand, this is how it all works. This is based on Intel's statements and what I have read and tested myself... mostly what I have tested with my ULV, 920XM, and 3720QM.
R3d, did I get everything right? Any opinions?
EDIT: BTW, I am still stoked to see some tests showing the idle of this chip... I hope it delivers. I saw techPowerUp had a picture of clock-for-clock and it was not impressive, but that's not what is interesting. The new GPU and independent voltage regulation is what's cool with this chip. -
Will these be compatible with current Ivy Bridge computers? I have a Sager NP9150.
-
No. Haswell will have a different socket.
-
Oh well, I might as well just post it since nobody else has. Apparently some Russian guys have gotten their hands on a Haswell engineering sample CPU and have tested it at 2.8GHz against an Ivy Bridge CPU at the same speed to see the clock-for-clock difference. I don't believe these are legit results, but feel free to share your thoughts.
Source: Intel "Haswell" Quad-Core CPU Benchmarked, Compared Clock-for-Clock with "Ivy Bridge" | techPowerUp -
ES vs OEM isn't a fair comparison. Some improvements will still be made, and the OEM chip will therefore be better.
-
Compared to Ivy Bridge, is Haswell a tick or a tock? Also, will it be released in March or June?
-
It's a Tock due in June.
-
I have a hard time coming to terms with Haswell being only 3-5% faster than Ivy -
HopelesslyFaithful Notebook Virtuoso
Again, the biggest thing Intel is going for with this chip is the GPU and voltage controls. I don't think Intel cares too much about the CPU, plus I don't really get the point in improving clock-for-clock. Not like it really matters, since they are both 22nm and the performance difference (performance per watt) will still not be affected.
just my humble opinion -
No AMD to put pressure on Intel = milking consumers as slowly as possible... but that's life. Still gonna buy it, as it's more a hobby for me than a tool. But IMO what Intel should do is give us at least six cores on high-end desktops (non-E version)... but I don't see a reason why they would.
-
If they concentrated on just the IGP, then a big FU to Intel.
Improving clock-for-clock means getting a more efficient CPU. Either you get a faster one at the same clock as Ivy chips, or you can lower the clocks by, say, 10% and it will be as powerful as a 10%-higher-clocked Ivy, but with less heat.
-
HopelesslyFaithful Notebook Virtuoso
Also, it appears that Intel over the last several years has been shifting focus much more to the mobile sector than the desktop sector. It seems that ever since the i7 came out, Intel has been focusing much harder on mobile. In today's age most people get laptops over desktops, it seems. I think that is largely because laptops can actually do a lot more than back in the P3/P4 era (P6 architecture, right? or mostly P6). I remember the Core Duo and Core 2 Duo mobile chips were still very crappy compared to a desktop chip, especially with the iGPU being so weak. Well, with IB a 55w laptop chip is only 22w less than its desktop counterparts ^^ That's why I always gun for the die shrinks: they provide the best bang for your buck when it comes to performance per watt.
Also, a 10% reduction in clock isn't even going to get close to a 10% reduction in heat. I would be surprised if it got even a quarter of that (25% of the 10%, so a 2.5% total reduction). I would love to see documentation showing a typical drop in energy per clock. I know that when you break the 3GHz area the efficiency starts to tank big time, because I remember the Terascale project clearly showed 3GHz is about the sweet spot. Going from 3GHz to 5GHz required, I think, 2.5x the juice... one sec... never mind, it was 4 times the power to go from 3 to 5GHz. I am sure that varies a lot between chips, but it really shows how anything above 3GHz gets stupidly wasteful (hence why I think desktops would benefit the most and laptops wouldn't notice a lot). Rough math on that below the link.
http://en.wikipedia.org/wiki/Teraflops_Research_Chip
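Here's the back-of-envelope math behind that roughly 4x figure, assuming the textbook dynamic-power relation P ~ V² · f and voltage rising roughly linearly with clock. Both are simplifying assumptions; real voltage/frequency curves differ per chip:

```python
# Back-of-envelope dynamic power scaling: P ~ C * V^2 * f. If voltage must
# rise roughly linearly with frequency, power grows roughly with f^3.

def relative_power(f_ratio, voltage_tracks_frequency=True):
    """Power multiplier for scaling the clock by f_ratio (P ~ V^2 * f)."""
    v_ratio = f_ratio if voltage_tracks_frequency else 1.0
    return v_ratio ** 2 * f_ratio

print(relative_power(5 / 3))       # ~4.6x power for 3GHz -> 5GHz, close to
                                   # the ~4x the Terascale chip needed
print(relative_power(0.9))         # ~0.73x power for a 10% underclock
print(relative_power(0.9, False))  # 0.9x if the voltage can't be lowered
```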
EDIT: Question: what is the 920XM's 4-core turbo speed at 55 watts? I get my 920XM to 3.2GHz at 65-72w TDP, I think, so I was wondering about the difference per frequency per watt, to see how the efficiency plays out. -
@HopelesslyFaithful
Just replying to your post on the last page as requested. AFAIK you can't measure TDP separately between the CPU and iGPU (for Intel processors at least; there's AMD System Monitor for AMD APUs, but even that doesn't give TDP allocation).
Everything else you wrote fits with what I've seen around the web. What is interesting, though, is that going from 45w to 17w (i.e. about a 60% reduction in TDP) yielded only about a 30% FPS hit for Intel IVB CPUs (with virtually no throttling for the 45w quad), as per the AnandTech article. This doesn't necessarily apply to AMD APUs, but it seems to me that even accounting for TDP limitations, HTWingNut is still getting less FPS than he should. So it could be a combination of factors. (Quick arithmetic on those numbers below.)
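Just normalizing the figures already quoted above (my reading of the article, nothing new measured):

```python
# Normalize the AnandTech figures quoted above: a 45w -> 17w TDP cut
# produced only a ~30% FPS drop, so FPS clearly scales sub-linearly
# with TDP.

tdp_full, tdp_ulv = 45.0, 17.0
fps_full, fps_ulv = 1.0, 0.7          # full-TDP FPS normalized to 1.0

print(1 - tdp_ulv / tdp_full)         # ~0.62: a ~62% TDP reduction
print((fps_ulv / tdp_ulv) / (fps_full / tdp_full))  # ~1.85x FPS-per-watt at 17w
```
-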
HopelesslyFaithful Notebook Virtuoso
In HWiNFO you have core TDP, GT core TDP, and package TDP, from what I remember. I can look tomorrow. GT core TDP is the iGPU.
-
So if there really are hardly any CPU performance gains with Haswell, are the power savings at least going to be out of this world again? I'm honestly not screaming for better battery life in my laptops anymore, and while the GPU is going to make another big jump with Haswell, it's still going to be a classic Intel GPU: a day late and a dollar short. Honestly, I don't think I'll be seriously interested in an Intel GPU until it can be used to accelerate production apps like Photoshop and Premiere, and that's not going to happen for a while simply because of the low amount of VRAM in Haswell. -
HopelesslyFaithful Notebook Virtuoso
EDIT: Oh, the one other way I know Intel is milking the market is by delaying the release of their chips by 6 months... they are now 6 months behind schedule, but after this release they will be 9 months behind!!! (if memory serves me correctly with all the "delays" they have had). -
The fewer cores their processors have, the cheaper they are to make. If their top-end laptop processors have 4 cores, they will still sell for as much as a 6-core processor would, but it costs them less to make. Also, the 4-core processor can clock higher and still be within the same power envelope, so that takes away a decent chunk of the performance advantage of 6 cores as well. (Crude model of this below.)
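A crude sketch of that trade-off, assuming per-core power scales roughly with f³ (voltage tracking clock). The model and the numbers are made up for illustration, not measurements:

```python
# Why 4 faster cores close some of the gap to 6 slower cores in a fixed
# power envelope. Assumes per-core power ~ f^3 (voltage rising with clock),
# which is a crude model, not measured data.

def max_clock(cores, budget_w, watts_per_core_at_1x):
    """Highest clock multiplier so cores * f^3 * base watts fits the budget."""
    return (budget_w / (cores * watts_per_core_at_1x)) ** (1 / 3)

BUDGET = 45.0
BASE_W = BUDGET / 6                 # chosen so 6 cores land exactly at 1.0x
f4 = max_clock(4, BUDGET, BASE_W)   # ~1.14x clock for the quad
f6 = max_clock(6, BUDGET, BASE_W)   # 1.00x clock for the hex
print(f4, f6)
print(4 * f4 / (6 * f6))            # ~0.76: the quad trails by only ~24%
                                    # on perfectly threaded work, not 33%
```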
-
Jayayess1190 Waiting on Intel Cannonlake
-
Some performance numbers? If these are true, Haswell is 10% faster than IVB at the same clock, for the Fritz chess benchmark at least. (Note: the link is in Chinese, so get Google Translate ready if you don't use Chrome.)
-
HopelesslyFaithful Notebook Virtuoso
-
Here is some more info I could dig up:
CPU power consumption as a function of frequency and voltage - AnandTech Forums
Frequency scaling - Wikipedia, the free encyclopedia -
HopelesslyFaithful Notebook Virtuoso
-
nvm. misread.
-
The 22nm heat issue also won't be solved because they use the same "tick" (?).
Well, I am not an expert in this stuff, but it seems like everyone in the thread later on pretty much agrees that Haswell's improvement isn't as significant as they were expecting -
HopelesslyFaithful Notebook Virtuoso
What 22nm heat issue?
They are exactly what I was expecting... a better GPU, plus the added bonus of per-core voltage regulators. Only so much can be done on the same process. -
Jayayess1190 Waiting on Intel Cannonlake
-
Now that's as confusing as ever.
-
Karamazovmm Overthinking? Always!
They never made the distinction between the higher-clocked models and the rest. Dunno why now? Lawsuit?
-
Preview from Tom's Hardware out. Doesn't look that great... I hope Intel can optimize it some more before release.
-
HopelesslyFaithful Notebook Virtuoso
Exactly what I expected... improved idle and GPU. This design was for mobile more than desktop... or a desktop for the average Joe.
-
Personally, I'm already impressed by the GT2's benchmarks. Skyrim on Low (which looks terrible, but that's not the point) at 1080p, 37 FPS average. More demanding games around 30.
The GT3 you're going to be able to game on. Not max settings, but it won't be the stupidest idea to game on an integrated Intel GPU anymore.
The issue is going to be that the new consoles coming out will raise the lowest common denominator way past what the GT3 can do, and that's something Broadwell won't be able to gain enough ground on next year. (Any word on expected GPU improvements with Broadwell yet?) -
Karamazovmm Overthinking? Always!
Does anyone know about the rumor of USB 3 being faulty?
-
HopelesslyFaithful Notebook Virtuoso
-
Karamazovmm Overthinking? Always!
-
Jayayess1190 Waiting on Intel Cannonlake
-
HopelesslyFaithful Notebook Virtuoso
Damn, those numbers are confusing... why change a good naming scheme... again? -
Karamazovmm Overthinking? Always!
I still think we may be in for some surprises in the lineup -
HopelesslyFaithful Notebook Virtuoso
-
We consumers are too stupid to figure anything out ourselves beyond the dork at Best Buy telling us we need to buy our grandmothers i7-4773MQZ's.
I miss the days when Intel named processors for what they were. So let me get this straight: we have three variables in each name.
First it is Celeron, Pentium, i3, i5, or i7. This is then followed by a random 4-digit number, always starting with 4. After that, we have HQ, MQ, MX, and U. What a disaster. -
HopelesslyFaithful Notebook Virtuoso
-
Karamazovmm Overthinking? Always!
Well engineers, the little monkeys of science, never trust them to name things.
Full build-out of the Intel HD Graphics
-
HopelesslyFaithful Notebook Virtuoso
What I wish we had was adjustable TDP for the U CPUs. You hit 17w with either the CPU or the GPU, so when you play a game you hit 50% on both because of the TDP limit. It would be awesome to allow a 35 watt TDP in a plugged-in state so that you can use both at full speed. (A rough sketch of how that could work is below.)
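For what it's worth, on Linux something like this is already possible through the intel-rapl powercap interface. A hedged sketch: the sysfs node names vary by machine, it needs root, and firmware or the embedded controller can still clamp whatever you write:

```python
# Hedged sketch: raise the long-term power limit on AC power via the Linux
# intel-rapl powercap interface. Paths vary by machine (the AC supply is
# often "ADP1" or "ACAD" instead), root is required, and firmware may
# override the value anyway.

RAPL = "/sys/class/powercap/intel-rapl:0"    # package 0; name may differ
AC = "/sys/class/power_supply/AC/online"     # supply name differs per laptop

def set_long_term_limit(watts):
    # constraint_0 is the long-term (PL1) limit, expressed in microwatts
    with open(RAPL + "/constraint_0_power_limit_uw", "w") as f:
        f.write(str(int(watts * 1_000_000)))

def on_ac_power():
    with open(AC) as f:
        return f.read().strip() == "1"

# 35w budget on wall power, stock 17w on battery:
set_long_term_limit(35.0 if on_ac_power() else 17.0)
```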
-
Karamazovmm Overthinking? Always!
-
The only GT3 parts (HD 5000/5100) will be BGA-only in the first 2-4 months. Alongside them, we get normal Haswell CPUs but with GT2 (HD 4600). After that, in Q3 (rumored October), we get normal, non-soldered Haswell CPUs with GT3 (HD 5200). They will be called Core i7-4850HQ and Core i7-4950HQ.
Not sure if I want a notebook with BGA CPUs like the Core i7-4558U (GT3) or Core i7-4550U (GT3). I feel bad for buying the GT2 CPUs listed above in the picture, but they are clocked higher than the GT3 models, probably because they have more thermal headroom thanks to the weaker IGP. I rarely use the IGP and do like faster CPUs. But this time around the GT3 might be powerful enough that I won't have to fire up the dedicated GPU.
Hmmm, what to pick. -
HopelesslyFaithful Notebook Virtuoso
Yes, true, but it would give the manufacturers the ability to limit it, probably in the same way they do now with PROCHOT and such. Plus it would allow me to figure out some ghetto way to properly cool it and have an awesome ultraportable notebook ^^
EDIT: also forgot this
I am excited to see the new design for the single reason of independent voltage regulation for all cores. I am curious to see if a 35w TDP chip will have the same, or virtually the same, idle as a ULV. The problem with all previous designs is that the 35-45w CPUs use 7-10w at idle and can't touch the low idle power draw of the ULVs. To be honest, I couldn't care less about a ULV if this new design fixes that... it would pretty much defeat the point of a ULV ^^