If I'm reading your post correctly, you're saying that the CPU and/or GPU will throttle if the whole chip will go over TDP (which is true, at least in the case of the HD4000). Since HTWingNut is saying that his chip isn't throttling (first post on this page), I think that it's not going over TDP. I was just attempting to point that out in my first post.
Also, you're right that TDP has nothing to do with temperature. Temperature is not the same as heat (e.g. a candle flame has a higher temperature than a tub of boiling water, but the latter has more heat).
-
HopelesslyFaithful Notebook Virtuoso
So the TDP link about SB (which also covers IB) talks about how the secondary TDP, called the short term TDP window if I remember correctly, comes from Intel realizing that "temps in CPUs take 10-60s to reach max temp" (quoting myself ^^). Intel assumes that manufacturers will build a system that can maintain 45W TDP at adequate temps, and they use this knowledge to give CPUs an extra kick for a short period. In Intel XTU you can set the short term and long term TDP; the defaults for IB on the 3720QM are 68 and 45. These values have nothing to do with temps. I repeat: these values have nothing to do with temps.
R3d was nice enough to find a source showing exactly what I am talking about, but to no surprise AnandTech has again failed to do a good job of showing the issue. Why in the hell they don't include the TDP of the core, the GT, and the total package blows my mind. I realized this back on SB using the ULV, so it blows my mind this isn't a well known issue. If they recorded/showed the power draw for every section of the CPU, then we could see which game is being limited by which part. Another way of saying it is, "seeing which game is demanding more of one part than the other. If BF3 requires more CPU, then you will see 30W for the CPU and 15W for the GPU."
Also note that with the short term TDP window, under low load you get that window back and get a brief higher frame rate, but with an iGPU and such a limited TDP the game should require the full TDP the whole time. You can see that most games in the AnandTech test throttle to some level because of this. This is also where a higher end CPU will be pointless if you're playing games on it, because a 3720QM will be limited to the same performance as a 3610QM if they both have a 45W TDP. If the TDP window is not big enough, then it will throttle. I will repeat myself again from above.
TDP is the value Intel uses to limit a CPU so it does not overheat a system, and it gives manufacturers a guide on how to build the heat sink. If the CPU uses the full 45W TDP at max load, then the GPU cannot be used because there is no thermal headroom. If both are working, then they have to share. As I stated with my u2357m (I think that is the name), it has a 17 watt TDP and either the CPU or the GPU can use it up by itself, or very nearly. So when I play a game, the CPU and GPU have to share... period. They are not allowed to break that threshold. Note that if the ULV or mobile chip has Turbo Boost, it can go over the standard TDP for a set window as long as temps allow, but it will never stay there forever, from my understanding. Again, it still has a TDP window, but it is called the short term TDP window. So the 3720QM has a 65W TDP for short term and a 45W TDP for long term.
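To show what I mean, here is a rough toy model of how that short term / long term window behaves. The PL values, the 28 second window, and the simple averaging are my own assumptions for illustration; this is not Intel's actual algorithm:
# Toy model of the short term vs long term TDP window described above.
# PL1/PL2 values and the averaging window are assumed, not Intel's real governor.
PL1 = 45.0   # long term TDP limit in watts (e.g. a 3720QM)
PL2 = 65.0   # short term TDP limit in watts
TAU = 28     # length of the averaging window in seconds (assumed)

def allowed_power(recent_watts, requested):
    # Allow up to PL2 until the rolling average over the last TAU seconds
    # has used up the PL1 budget, then clamp back down to PL1.
    avg = sum(recent_watts) / len(recent_watts) if recent_watts else 0.0
    limit = PL2 if avg < PL1 else PL1
    return min(requested, limit)

history = []
for second in range(10):            # a game asking for 60 W the whole time
    granted = allowed_power(history[-TAU:], requested=60.0)
    history.append(granted)
    print(second, granted)
# Prints 60 W for the first second, then 45 W from there on: a brief kick,
# after which the CPU and iGPU have to share the long term 45 W between them.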
Now towards you directly, HTWingNut... my god, has HTWingNut used Wikipedia as a source??!?!?!?!?! How many times have you trolled me for using Wikipedia and notebookcheck.net? You have trolled me big time for using notebookcheck.net and it is by far more reputable than Wikipedia. I think you even trolled me for using TFT Central even though they have the best reviews, period. So it amazes me how you can be a hypocrite about this. Nonetheless, I will let this slide, but let's learn how to play nice in this forum and get off our high horse! So anyways, HT, by your logic that means if I am running my 920XM at 30C I can overclock it to 4GHz while leaving the TDP at 55W, right? Bollocks! We have to raise the TDP because it will limit it. Hence why you see the TDP fluctuate when it does different tasks. It has nothing to do with heat, but power draw. Note that heat is directly related to it, but not the other way around. Also note that a 45W TDP does not mean it actually uses 45W of energy. It is just a value Intel uses to base its energy and heat budget on. Energy tends to be higher than TDP. Same with my 3720QM: I can set it to 3.8GHz, but it'll only run at that for the short period that the short term TDP window allows... which is still a TDP window. If I set the short term window to 45W, I will never go higher than 45W even if temps are low. Do you understand how this works now? I don't know AMD chips at all, but they should follow the same principle. Removed last statement because I am not sure if it was accurate and it was confusing.
As far as I understand, this is how it all works. This is based off of Intel's statements and from what I have read and tested myself... mostly from what I have tested with my ULV, 920XM, and 3720QM.
R3d, did I get everything right? Any opinions?
EDIT: BTW I am still stoked to see some tests showing the idle of this chip... I hope it delivers. I saw techPowerUp had a picture of clock for clock and it's not impressive, but that's not what is interesting. The new GPU and independent voltage regulation is what's cool with this chip. -
Will these be compatible with current Ivy Bridge computers? I have a Sager NP9150.
-
No. Haswell will have a different socket.
-
Oh well, I might as well just post it since nobody else has. Apparently some Russian guys have gotten their hands on a Haswell engineering sample CPU, and have tested it @ 2.8GHz against an Ivy Bridge CPU at the same speed to see the clock for clock difference. I don't believe these are legit results, but feel free to share your thoughts.
Source: Intel "Haswell" Quad-Core CPU Benchmarked, Compared Clock-for-Clock with "Ivy Bridge" | techPowerUp -
ES vs OEM isn't a fair comparison. Some improvements will be made and the OEM will therefore be better.
-
Compared to Ivy Bridge, is Haswell a tick or a tock? Also, will it be released in March or June?
-
It's a Tock due in June.
-
Depends what ES revision it is. It could be the latest stepping that is identical to OEM. I'm thinking this is a very early ES, since it's so close to Ivy Bridge in performance and the Haswell CPU is even beaten in SuperPi 1M.
I have a hard time coming to terms with Haswell being only 3-5% faster than Ivy -
HopelesslyFaithful Notebook Virtuoso
Again, the biggest thing Intel is going for with this chip is the GPU and voltage controls. I don't think Intel cares too much about the CPU, plus I don't really get the point of improving clock for clock. Not like it really matters, since they both are 22nm and the performance per watt will still not really be affected.
just my humble opinion -
No AMD to put pressure on Intel = milking consumers as slowly as possible... but that's life. Still gonna buy it, as it's more a hobby for me than a tool. But IMO what Intel should do is give us at least six cores on high end desktops (non-E version)... but I don't see a reason why they would.
-
The whole point of moving to a new architecture is getting better performance. It's in their Tick-Tock model.
If they concentrated on just the IGP, then a big FU to Intel.
Improving clock for clock gives a more efficient CPU. Either you get a faster one at the same clock as Ivy chips, or you can lower the clocks by, let's say, 10%, and it will be as powerful as a 10% higher clocked Ivy, but with less heat.
Yes, perhaps that is the case here. It seems that Intel might try their best to catch up to AMD's APUs and their GPU strength. And I don't like it. I still think the test was done on a very early ES that is a bit off from OEM performance -
HopelesslyFaithful Notebook Virtuoso
Being on the same process (22nm), reducing the clocks while adding more calculations per cycle will make hardly any difference in energy usage, from my understanding. I know the higher the frequency the more inefficient it is, but going from 4GHz to 3.5GHz while processing the same amount of information isn't going to save a lot of juice from my understanding, though I could be wrong. Also note I am talking about laptops here with a set TDP... the savings in energy I think would be a trifle. Now if we are talking about desktops, I think that would be a different story. With a desktop having an "unlimited" TDP, I think a reduction in clock while shifting towards more being done per cycle would do a lot, since desktops have the ability to push the frequency much higher. Again, my humble opinion... could be wrong, but that's what I understand unless you can explain to me why there would be such a huge difference from just expanding how many things are done in a single cycle.
Also, it appears that Intel over the last several years has been shifting focus much more to the mobile sector than the desktop sector. It seems that ever since the i7 came out, Intel has been focusing much harder on mobile. These days most people get laptops over desktops, it seems. I think that is largely because laptops can actually do a lot more than back in the P3/P4 era (P6 architecture, right? or mostly P6). I remember the Core Duo and Core 2 Duo mobile chips were still very crappy compared to a desktop chip, especially with the iGPU being so poor. Well, with IB a 55W laptop chip is only 22W below its desktop counterparts ^^ This is why I always gun for the die shrinks: they provide the best bang for your buck when it comes to performance per watt.
Also, a 10% reduction in clock isn't even going to get close to a 10% reduction in heat. I would be surprised if it would get even a quarter of that (25% of the 10%, so a 2.5% total reduction). I would love to see documentation showing a typical drop in energy per clock. I know when you break the 3GHz area the efficiency starts to tank big time, because I remember the Terascale project clearly showed 3GHz is about the sweet spot. Going from 3GHz to 5GHz required, I think it was, 2.5x the juice. Having a hard time remembering the difference... one sec... never mind, it was 4 times the power to go from 3 to 5GHz. I am sure that varies a lot between chips, but it really shows how anything above 3GHz gets stupidly wasteful (hence why I think desktops would benefit the most and laptops wouldn't notice a lot).
http://en.wikipedia.org/wiki/Teraflops_Research_Chip
EDIT: Question: what is the 920XM's 4 core turbo speed for 55 watts? I get my 920XM to 3.2GHz at 65-72W TDP, I think, so I was wondering what the difference is per frequency per watt, to see how the efficiency plays out. -
@HopelesslyFaithful
Just replying to your post on the last page as requested. AFAIK you can't measure TDP separately between the CPU and iGPU (for Intel processors at least; there's AMD System Monitor for AMD APUs, but even that doesn't give the TDP allocation).
Everything else you wrote fits with what I've seen around the web. What is interesting though is that going from 45W to 17W (i.e. about a 60% reduction in TDP) yielded about a 30% FPS hit for Intel IVB CPUs (with virtually no throttling for the 45W quad), as per the AnandTech article. This doesn't necessarily apply to AMD APUs, but it seems to me that even accounting for TDP limitations, HTWingNut is still getting less FPS than he should. So it could be a combination of factors. -
HopelesslyFaithful Notebook Virtuoso
In HWiNFO you have core TDP, GT core TDP, and package TDP, from what I remember. I can look tomorrow. GT core TDP is the iGPU.
-
That would make a lot of sense to me, as well. Or maybe their thought is eight "cores" are enough for consumers and we'll keep on building out the more parallel GPU in the meantime for highly parallel apps.
So if there really are hardly any CPU performance gains with Haswell, are the power savings at least going to be out of this world again? I'm honestly not screaming for better battery life in my laptops anymore, and while the GPU is going to be making another big jump with Haswell, it's still going to be classic Intel GPU: a day late and a dollar short. Honestly, I don't think I'll be seriously interested in an Intel GPU until it can be used to accelerate production apps like Photoshop and Premiere, and that's not going to happen for a while simply because of the low amount of VRAM in Haswell. -
HopelesslyFaithful Notebook Virtuoso
Are you saying 8 cores in a laptop or 8 cores in a desktop? We are still a long way from an eight core in a laptop due to the thermal envelope. I do think Haswell Extreme for desktop is going to be an 8 core this time around... I thought I read that somewhere. Also, I don't think Intel is really milking the consumer market besides the Extreme chips for desktops. The fact that it goes from a 4 core (non-Extreme) to a 6 core Extreme is stupid. The top of the line consumer chip should be a 6 core and the Extreme should be an 8 core. They are clearly milking the high end desktop chips. Laptops I don't think they are milking at all, except for the fact that they keep them locked!!!!! GRRRRRRRR, but at least they have somewhat unlocked them now.
EDIT: Oh, the one other thing I know Intel is milking the market on is delaying the release of their chips by 6 months... they are now 6 months behind schedule, but after this release they will be 9 months behind!!! (if memory serves me correctly with all the "delays" they have had). -
The fewer cores their processors have, the cheaper they are to make. If their top end laptop processors have 4 cores, they will still sell for as much as a 6 core processor would, but it costs them less to make. Also, the 4 core processor can clock higher and still be within the same power envelope, so that takes away a decent chunk of the performance advantage of 6 cores as well.
-
Jayayess1190 Waiting on Intel Cannonlake
-
I didn't see this before. You can actually calculate relative power consumption rather easily. Power consumption at a given voltage goes up and down linearly with frequency, and power consumption at a given frequency goes up and down with the square of the voltage. The equation is something like this: Power = (constant for the specific CPU) x (Frequency) x (Voltage)^2
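In code form it looks something like this (a minimal sketch: the chip-specific constant cancels out when you compare two operating points of the same CPU, and the clocks and voltages below are just made-up illustrative numbers, not measured values):
# Relative power between two operating points of the same CPU,
# using P = C * f * V^2; the chip-specific constant C cancels in the ratio.
def relative_power(f_old, v_old, f_new, v_new):
    return (f_new / f_old) * (v_new / v_old) ** 2

# Example: same 3.5 GHz clock, but dropping the voltage from 1.10 V to 1.00 V
# cuts power by roughly 17% (illustrative numbers only):
print(relative_power(3.5, 1.10, 3.5, 1.00))   # ~0.83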
-
Some performance numbers? If these are true, Haswell is 10% faster than IVB at the same clock, for the Fritz chess benchmark at least. (Note: the link is in Chinese, so get Google Translate ready if you don't use Chrome.)
-
HopelesslyFaithful Notebook Virtuoso
What is the constant? Amps? That is interesting, I have never heard of that before. Where did you get that? This would explain a few things. I was thinking about the numbers for my 920XM, and I calculated its efficiency and it was about the same from 2GHz to 3.2GHz, and I bet that would be because the voltage was the same (granted, I bet 2GHz would be better if I lowered the voltage)... at least according to the equation you gave. I think that is why the Terascale chip went crazy inefficient at higher frequencies: it required a lot more voltage. -
Every microchip design is different, so each has a different constant in the equation.
Here is some more info I could dig up:
CPU power consumption as a function of frequency and voltage - AnandTech Forums
Frequency scaling - Wikipedia, the free encyclopedia -
HopelesslyFaithful Notebook Virtuoso
So I assume I was right with what I said, that a 10% reduction in clock with a 10% increase in work per clock cycle would not affect the total power consumption unless voltage was affected, right? -
nvm. misread.
-
No. A 10% reduction in clock frequency while maintaining the same voltage would reduce power consumption by 10%. Power consumption goes up and down linearly with clock frequency.
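For example, plugging numbers into the same formula from above (the voltages are made-up illustrative values, since I don't have the real Terascale figures), a 10% clock drop at a fixed voltage lands at 90% power, while the 3GHz to 5GHz Terascale jump mentioned earlier works out to roughly 4x once the voltage has to rise along with the clock:
# Same P ~ f * V^2 rule of thumb; the voltages are illustrative assumptions.
def relative_power(f_old, v_old, f_new, v_new):
    return (f_new / f_old) * (v_new / v_old) ** 2

print(relative_power(4.0, 1.20, 3.6, 1.20))   # 0.9  -> 10% lower clock, same voltage
print(relative_power(3.0, 0.95, 5.0, 1.45))   # ~3.9 -> close to the 4x quoted earlier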
-
Apparently the test isn't so accurate because Turbo Boost was on.
The 22nm heat issue also won't be solved, because they use the same 22nm process (the same "tick"?).
Well, I am not an expert in this stuff, but it seems like everyone in that thread later on pretty much agrees that Haswell's improvement isn't as significant as they were expecting -
HopelesslyFaithful Notebook Virtuoso
what 22nm heat issue?
They are exactly what I was expecting... a better GPU and the added bonus of the core voltage regulators. Only so much can be done on the same process. -
Jayayess1190 Waiting on Intel Cannonlake
-
Now that's as confusing as ever.
-
It's annoying that the Ultrabook models won't have dedicated memory. My next laptop will be an Ultrabook (or, more likely, a MacBook Air), but I'm not really interested until I can get a model with a sizable chunk of dedicated RAM for minor production work. And preferably a quad-core. Maybe with Broadwell we'll see our first quad cores in Ultrabooks.
-
Karamazovmm Overthinking? Always!
That was expected. I don't know why they are doing it this way, though; they could have made it easier for people, like: the 5200 comes with everything, the 5000 everything minus the eDRAM, the 4600 the GT2 core.
They never made the distinction between the higher clocked models and the rest before, dunno why now? Lawsuit?
That is actually not confirmed. The eDRAM would make sense on ultrabooks and lower form factors; quads are usually coupled with higher end hardware that usually has dedicated GPUs, so no need there. Although I do expect the 5200 to appear on quads, at least I hope so, because I still have hopes that Apple will put that in the rMBP 13 -
Preview from Tom's Hardware out. Doesn't look that great... I hope Intel can optimize it some more before release.
-
HopelesslyFaithful Notebook Virtuoso
Exactly what I expected... improved idle and GPU. This design was for mobile more than desktop... or a desktop for the average Joe.
-
Personally, I'm already impressed by the GT2's benchmarks. Skyrim on Low (which looks terrible, but that's not the point) at 1080p 37FPS average. More demanding games around 30.
The GT3 you're going to be able to game on. Not max settings, but it won't be the stupidest idea to game on an integrated Intel GPU anymore.
The issue is going to be that the new consoles coming out are going to raise the lowest common denominator way past what the GT3 can do, and that's something Broadwell won't be able to gain enough ground on next year. (Any word on expected GPU improvements with Broadwell yet?) -
Karamazovmm Overthinking? Always!
Does anyone know about the rumor of USB 3 being faulty?
-
HopelesslyFaithful Notebook Virtuoso
Besides what NBC and a couple of other places have said, no -
Karamazovmm Overthinking? Always!
I tracked it down to only one not-so-reliable source, then that thing spread, though obviously not like wildfire -
Jayayess1190 Waiting on Intel Cannonlake
-
HopelesslyFaithful Notebook Virtuoso
Damn, those numbers are confusing... why change a good naming scheme... again -
Karamazovmm Overthinking? Always!
Yeah, when we get used to it, they change.
I still think we may be in for some surprises in the lineup
HopelesslyFaithful Notebook Virtuoso
And this is the last Core i lineup, so it changes again with the next generation chips... sigh -
We consumers are too stupid to figure anything out ourselves beyond the dork at Best Buy telling us we need to buy our grandmothers i7-4773MQZs.
I miss the days when Intel named processors for what they were. So let me get this straight: we have three variables in each name.
First it is Celeron, Pentium, i3, i5, i7. This is then followed by a random 4 digit number always starting with 4. After that, we have HQ, MQ, MX, and U. What a disaster. -
HopelesslyFaithful Notebook Virtuoso
I was starting to get the new scheme when it came to PGA and BGA, but now seeing the full lineup my brain is in overload... -
Karamazovmm Overthinking? Always!
Well, engineers, the little monkeys of science, never trust them to name things.
"Vollausbau der Intel HD Graphics" (German for "full build-out of the Intel HD Graphics")
-
I trust that the engineers would give us much better names that actually make sense than this gibberish Intel's marketing department has been throwing at us.
-
HopelesslyFaithful Notebook Virtuoso
What I wish we had was adjustable TDP for the U CPUs. You hit 17W with either the CPU or the GPU alone. When you play a game you hit 50% on both because of the TDP limit. It would be awesome to allow a 35 watt TDP in a plugged-in state so that you can use both at full speed.
-
Karamazovmm Overthinking? Always!
Well, the names are the engineers' doing; no marketing dept would come up with a name like that. They would come up with Retina display and some stuff that makes the consumer go "aww wow uoooww".
That would depend on the cooling provided. The Asus U21 could do that, but its cooling can't handle higher TDPs, only the specified one and the lower one -
The only GT3 parts (HD 5000/5100) in the first 2-4 months will be BGA only. Along with them, we get normal Haswell CPUs but with GT2 (HD 4600). After that, in Q3 (rumored October), we get normal non-soldered Haswell CPUs with GT3 (HD 5200). They will be called Core i7-4850HQ and Core i7-4950HQ.
Not sure if I want a notebook with BGA CPUs like the Core i7-4558U (GT3) or Core i7-4550U (GT3). I feel bad for buying the GT2 CPUs listed above in the picture, but they are clocked higher than the GT3 models, probably because they have more thermal headroom to work with thanks to the weaker IGP. I rarely use the IGP and do like faster CPUs. But this time around the GT3 might be powerful enough that I won't have to fire up the dedicated GPU.
Hmmm, what to pick. -
HopelesslyFaithful Notebook Virtuoso
Yeah, I'll take Intel's naming scheme over Retina dumb stuff any day.
Yes, true, but it would probably give the manufacturers the ability to limit it in the same way they do now with PROCHOT and such. Plus it would allow me to figure out some ghetto way to properly cool it and have an awesome ultraportable notebook ^^
Remember what I said about the TDP limits, and realize that in gaming it probably won't make a difference. At least my SB i3 ULV model is bottom of the barrel and it throttles in any game, so buying an i7 version would not have made a single difference, but maybe these new ones are a little more efficient. Also, what do you need a dedicated GPU for besides games and rendering? Integrated should be fast enough for everything from HD video to Flash... anything within normal Windows usage, and a test somewhere shows that Intel's iGPU is much faster than any dedicated GPU for regular day to day stuff.
EDIT: also forgot this.
I am excited to see the new design for the single reason of independent voltage for all cores. I am curious to see if a 35W TDP chip will have the same, or virtually the same, idle as a ULV. The problem with all previous designs is that the idle power usage for 35-45W CPUs is 7-10W, and they can't touch the low idle power draw of the ULVs. To be honest, I couldn't care less about a ULV if this new design fixes that... it pretty much defeats the point of a ULV ^^