The Notebook Review forums were hosted by TechTarget, which shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information posted on the forums was preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.
Problems? See this thread at archive.org.
← Previous page | Next page →

    Forget Intel Ivy Bridge, Haswell on the way

    Discussion in 'Hardware Components and Aftermarket Upgrades' started by Jayayess1190, Jan 28, 2011.

  1. R3d

    R3d Notebook Virtuoso

    Reputations:
    1,515
    Messages:
    2,382
    Likes Received:
    60
    Trophy Points:
    66
If I'm reading your post correctly, you're saying that the CPU and/or GPU will throttle if the whole chip would go over TDP (which is true, at least in the case of the HD 4000). Since HTWingNut is saying that his chip isn't throttling (first post on this page), I think it's not going over TDP. I was just attempting to point that out in my first post.

Also, you're right that TDP has nothing to do with temperature. Temperature is not the same as heat (e.g. a candle flame has a higher temperature than a tub of boiling water, but the latter has more heat).
     
  2. HopelesslyFaithful

    HopelesslyFaithful Notebook Virtuoso

    Reputations:
    1,552
    Messages:
    3,271
    Likes Received:
    164
    Trophy Points:
    0
So the TDP link about SB (which also covers IB) talks about how the secondary TDP, called the short term TDP window if I remember correctly, comes from Intel realizing that, "Temps in CPUs take 10-60s to reach max temp" (quoting myself ^^). Intel assumes manufacturers will build a system that can maintain 45W TDP at adequate temps, and uses that headroom to give CPUs an extra kick for a short period. In Intel XTU you can set the short term and long term TDP; the defaults for IB are 68 and 45 for the 3720QM. These values have nothing to do with temps. I repeat: these values have nothing to do with temps.

R3d was nice enough to find a source showing exactly what I am talking about, but to no surprise AnandTech has again failed to do a good job of showing the issue. Why they don't include the power of the core, GT, and total package blows my mind. I noticed this back on SB using the ULV, so it amazes me this isn't a well known issue. If they recorded the power of every section of the CPU, we could see which game is limited by which part. Another way of saying it: if BF3 requires more CPU, you will see 30W for the CPU and 15W for the GPU.

Also note that with the short term TDP window, under low load you get that window back and can get a brief higher frame rate, but with an iGPU and such a limited TDP a game should demand the full TDP the whole time. You can see that most games in the AnandTech test throttle to some level because of this. This is also where a higher end CPU is pointless for gaming: a 3720QM will be limited to the same performance as a 3610QM if they both have a 45W TDP. If the TDP window is not big enough, it will throttle. I will repeat myself again from above.

TDP is the value Intel designed to limit a CPU so it does not overheat a system, and it gives manufacturers a guide on how to build the heatsink. If the CPU uses the full 45W TDP at max load, then the GPU cannot be used because there is no thermal headroom; if both are working, they have to share. As I stated with my u2357m (I think that is the name), it has a 17 watt TDP, and either the CPU or the GPU can use it all up by itself, or very close. So when I play a game, the CPU and GPU have to share... period. They are not allowed to break that threshold. Note that if the ULV or mobile chip has Turbo Boost, it can go over the standard TDP for a set window as long as temps allow, but from my understanding it will never stay there forever. Again, it still has a TDP window, but it is called the short term TDP window. So the 3720QM has a 65W TDP for short term and 45W for long term.
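To make the sharing idea concrete, here is a minimal Python sketch of a fixed package budget being split between the CPU and iGPU. Only the 45W/65W figures come from the discussion above; the 28-second boost window and the proportional-sharing policy are illustrative assumptions, not Intel's actual firmware behavior.

```python
# Minimal sketch of a package power budget shared by CPU and iGPU.
# 45 W long term / 65 W short term are the 3720QM figures discussed
# above; the window length and the sharing policy are assumptions.

def package_limit_w(elapsed_s, short_term_w=65.0, long_term_w=45.0,
                    window_s=28.0):
    """Package power limit at a given time under sustained load."""
    return short_term_w if elapsed_s < window_s else long_term_w

def share_budget(cpu_demand_w, gpu_demand_w, limit_w):
    """Scale CPU and iGPU power down proportionally over the limit."""
    total = cpu_demand_w + gpu_demand_w
    if total <= limit_w:
        return cpu_demand_w, gpu_demand_w
    scale = limit_w / total
    return cpu_demand_w * scale, gpu_demand_w * scale

# A game asking for 30 W of CPU and 25 W of iGPU: fine during the
# short term window, throttled once the long term limit kicks in.
for t in (0, 10, 30, 60):
    cpu_w, gpu_w = share_budget(30.0, 25.0, package_limit_w(t))
    print(f"t={t:>2}s  CPU {cpu_w:4.1f} W  iGPU {gpu_w:4.1f} W")
```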

Now towards you directly, HTWingNut... my god, has HTWingNut used Wikipedia as a source??!?! How many times have you trolled me for using Wikipedia and notebookcheck.net? You have trolled me big time for using notebookcheck.net, and it is by far more reputable than Wikipedia. I think you even trolled me for using TFTCentral even though they have the best reviews, period. So it amazes me how you can be a hypocrite about this. Nonetheless I will let this slide, but let's learn how to play nice in this forum and get off our high horse!

So anyway, HT, by your logic, if I am running my 920XM at 30C I can overclock it to 4GHz while leaving the TDP at 55W, right? Bollocks! We have to raise the TDP because it will limit the chip. Hence why you see the power draw fluctuate when it does different tasks. It has nothing to do with heat but with power draw (heat is directly related to power, but not the other way around). Also note that a 45W TDP does not mean the chip actually uses 45W of energy; it is just a value Intel uses to characterize its energy and heat usage, and actual energy draw tends to be higher than TDP. Same with my 3720QM: I can set it to 3.8GHz, but it'll only run at that for the short period the short term TDP window allows... which is still a TDP window. If I set the short term window to 45W, I will never go higher than 45W even if temps are low. Do you understand how this works now? I don't know AMD chips at all, but they should follow the same principle. (Removed my last statement because I am not sure it was accurate and it was confusing.)

As far as I understand, this is how it all works. This is based off Intel's statements and what I have read and tested myself... mostly what I have tested with my ULV, my 920XM, and my 3720QM.

R3d, did I get everything right? Any opinions?

EDIT: BTW, I am still stoked to see some tests showing the idle power of this chip... I hope it delivers. TechPowerUp had a picture of a clock-for-clock comparison and it's not impressive, but that's not what is interesting. The new GPU and independent voltage regulation are what's cool about this chip.
     
  3. LEoR

    LEoR Notebook Enthusiast

    Reputations:
    0
    Messages:
    25
    Likes Received:
    0
    Trophy Points:
    5
Will these be compatible with current Ivy Bridge computers? I have a Sager NP9150.
     
  4. danielschoon

    danielschoon Notebook Deity

    Reputations:
    241
    Messages:
    1,473
    Likes Received:
    42
    Trophy Points:
    66
    No. Haswell will have a different socket.
     
  5. Cloudfire

    Cloudfire (Really odd person)

    Reputations:
    7,279
    Messages:
    10,304
    Likes Received:
    2,878
    Trophy Points:
    581
  6. danielschoon

    danielschoon Notebook Deity

    Reputations:
    241
    Messages:
    1,473
    Likes Received:
    42
    Trophy Points:
    66
ES vs OEM isn't a fair comparison. Some improvements will be made, and the OEM will therefore be better.
     
  7. CORNandTOOTHPICK

    CORNandTOOTHPICK Notebook Enthusiast

    Reputations:
    0
    Messages:
    11
    Likes Received:
    0
    Trophy Points:
    5
Compared to Ivy Bridge, is Haswell a tick or a tock? Also, will it be released in March or June?
     
  8. HTWingNut

    HTWingNut Potato

    Reputations:
    21,580
    Messages:
    35,370
    Likes Received:
    9,877
    Trophy Points:
    931
    It's a Tock due in June.
     
  9. Cloudfire

    Cloudfire (Really odd person)

    Reputations:
    7,279
    Messages:
    10,304
    Likes Received:
    2,878
    Trophy Points:
    581
Depends what ES revision it is. It could be the latest stepping, which is identical to OEM. I'm thinking this is a very early ES since it's so close to Ivy Bridge in performance, and the Haswell CPU is even beaten in SuperPi 1M.

I have a hard time making peace with the idea that Haswell will only be 3-5% faster than Ivy.
     
  10. HopelesslyFaithful

    HopelesslyFaithful Notebook Virtuoso

    Reputations:
    1,552
    Messages:
    3,271
    Likes Received:
    164
    Trophy Points:
    0
Again, the biggest thing Intel is going for with this chip is the GPU and the voltage controls. I don't think Intel cares too much about the CPU, plus I don't really get the point of improving clock-for-clock performance. Not like it really matters, since both are 22nm and the performance per watt will still not really be affected.

just my humble opinion
     
  11. Abula

    Abula Puro Chapin

    Reputations:
    1,115
    Messages:
    3,252
    Likes Received:
    13
    Trophy Points:
    106
No AMD to put pressure on Intel = milking consumers as slowly as possible... but that's life. I'm still gonna buy it, as it's more a hobby for me than a tool. But IMO what Intel should do is give us at least six cores on the high end desktops (the non-E versions)... but I don't see a reason why they would.
     
  12. Cloudfire

    Cloudfire (Really odd person)

    Reputations:
    7,279
    Messages:
    10,304
    Likes Received:
    2,878
    Trophy Points:
    581
The whole point of moving to a new architecture is getting better performance. It's in their tick-tock model.
If they concentrated on just the IGP, then a big FU to Intel.

Improving clock-for-clock means getting a more efficient CPU. Either you get a faster one at the same clock as the Ivy chips, or you can lower the clocks by, let's say, 10%, and it will be as powerful as a 10% higher clocked Ivy, but with less heat.

Yes, perhaps that is the case here. It seems Intel might be trying their best to catch up to AMD's APUs and their GPU strength, and I don't like it. I still think the test was done on a very early ES that is a bit off from OEM performance.
     
  13. HopelesslyFaithful

    HopelesslyFaithful Notebook Virtuoso

    Reputations:
    1,552
    Messages:
    3,271
    Likes Received:
    164
    Trophy Points:
    0
On the same process (22nm), reducing the clocks while adding more calculations per cycle will make hardly any difference in energy usage, from my understanding. I know the higher the frequency the more inefficient it is, but going from 4GHz to 3.5GHz while doing the same amount of work isn't going to save a lot of juice from my understanding, though I could be wrong. Also note I am talking about laptops here, with a set TDP... the savings in energy would be a trifle, I think. Now desktops, I think, would be a different story: with an "unlimited" TDP, a reduction in clock combined with doing more per cycle would do a lot, since desktops can push the frequency much higher. Again, my humble opinion... I could be wrong, but that's what I understand, unless you can explain to me why there would be such a huge difference from just expanding how much is done in a single cycle.

Also, it appears that over the last several years Intel has been shifting focus much more to the mobile sector than the desktop sector; ever since the i7 came out, Intel has been focusing much harder on mobile. These days most people get laptops over a desktop, it seems. I think that is largely because laptops can actually do a lot more than back in the P3/P4 era (P6 architecture, right? or mostly P6). I remember the Core Duo and Core 2 Duo mobile chips were still very crappy compared to a desktop chip, especially with the iGPU being so weak. Well, with IB a 55W laptop chip is only 22W less than its desktop counterparts ^^ That's why I always gun for the die shrinks: they provide the best bang for your buck when it comes to performance per watt.

Also, a 10% reduction in clock isn't even going to get close to a 10% reduction in heat. I would be surprised if it got even a quarter of that (25% of the 10%, so a 2.5% total reduction). I would love to see documentation showing a typical drop in energy per clock. I know that when you break the 3GHz area, efficiency starts to tank big time, because I remember the Terascale project clearly showed 3GHz is about the sweet spot. Going from 3GHz to 5GHz required, I think, 2.5x the juice... having a hard time remembering the difference... one sec... never mind, it was 4 times the power to go from 3GHz to 5GHz. I am sure that varies a lot between chips, but it really shows how anything above 3GHz gets stupidly wasteful (hence why I think desktops would benefit the most and laptops wouldn't notice a lot).

    http://en.wikipedia.org/wiki/Teraflops_Research_Chip

EDIT: Question: what is the 920XM's 4-core turbo speed at 55 watts? I get my 920XM to 3.2GHz at, I think, 65-72W TDP, so I was wondering about the difference per frequency per watt, to see how the efficiency plays out.
     
  14. R3d

    R3d Notebook Virtuoso

    Reputations:
    1,515
    Messages:
    2,382
    Likes Received:
    60
    Trophy Points:
    66
    @HopelesslyFaithful

Just replying to your post on the last page as requested. AFAIK you can't measure TDP separately between the CPU and iGPU (for Intel processors at least; there's AMD System Monitor for AMD APUs, but even that doesn't give the TDP allocation).

Everything else you wrote fits with what I've seen around the web. What is interesting, though, is that going from 45W to 17W (i.e. about a 60% reduction in TDP) yielded only about a 30% FPS hit for Intel IVB CPUs (with virtually no throttling for the 45W quad), as per the AnandTech article. This doesn't necessarily apply to AMD APUs, but it seems to me that even accounting for TDP limitations, HTWingNut is still getting fewer FPS than he should. So it could be a combination of factors.
     
  15. HopelesslyFaithful

    HopelesslyFaithful Notebook Virtuoso

    Reputations:
    1,552
    Messages:
    3,271
    Likes Received:
    164
    Trophy Points:
    0
In HWiNFO you have core TDP, GT core TDP, and package TDP, from what I remember. I can look tomorrow. GT core TDP is the iGPU.
     
  16. Mr. Wonderful

    Mr. Wonderful Notebook Evangelist

    Reputations:
    10
    Messages:
    449
    Likes Received:
    6
    Trophy Points:
    31
That would make a lot of sense to me as well. Or maybe their thought is that eight "cores" are enough for consumers, and they'll keep building out the more parallel GPU in the meantime for highly parallel apps.

So if there really are hardly any CPU performance gains with Haswell, are the power savings at least going to be out of this world again? I'm honestly not screaming for better battery life in my laptops anymore, and while the GPU is going to make another big jump with Haswell, it's still going to be a classic Intel GPU: a day late and a dollar short. Honestly, I don't think I'll be seriously interested in an Intel GPU until it can be used to accelerate production apps like Photoshop and Premiere, and that's not going to happen for a while, simply because of the low amount of VRAM in Haswell.
     
  17. HopelesslyFaithful

    HopelesslyFaithful Notebook Virtuoso

    Reputations:
    1,552
    Messages:
    3,271
    Likes Received:
    164
    Trophy Points:
    0
Are you saying 8 cores in a laptop or 8 cores in a desktop? We are still a long way from an eight-core laptop due to the thermal envelope. I do think Haswell Extreme for desktop is going to be an 8-core this time around... I thought I read that somewhere. Also, I don't think Intel is really milking the consumer market besides the Extreme chips for desktops. The fact that it goes from a 4-core (non-Extreme) to a 6-core Extreme is stupid; the top-line consumer chip should be a 6-core and the Extreme an 8-core. They are clearly milking the high-end desktop chips. Laptops I don't think they are milking at all, except for the fact that they keep them locked!!!!! GRRRRRRRR, but at least they have somewhat unlocked them now.

EDIT: Oh, the one other thing I know Intel is milking the market on is delaying the release of their chips by 6 months... they are now 6 months behind schedule, but after this release they will be 9 months behind!!! (if memory serves me correctly, with all the "delays" they have had).
     
  18. Qing Dao

    Qing Dao Notebook Deity

    Reputations:
    1,600
    Messages:
    1,771
    Likes Received:
    304
    Trophy Points:
    101
The fewer cores their processors have, the cheaper they are to make. If their top end laptop processors have 4 cores, they will still sell for as much as a 6-core processor would, but cost less to make. Also, the 4-core processor can clock higher and still be within the same power envelope, which takes away a decent chunk of the performance advantage of 6 cores as well.
     
  19. Jayayess1190

    Jayayess1190 Waiting on Intel Cannonlake

    Reputations:
    4,009
    Messages:
    6,712
    Likes Received:
    54
    Trophy Points:
    216
    Core i7 4765T is 35W quad Haswell
    (Desktop)

    Intel to launch Haswell ULV CPUs at Computex show

     
  20. Qing Dao

    Qing Dao Notebook Deity

    Reputations:
    1,600
    Messages:
    1,771
    Likes Received:
    304
    Trophy Points:
    101
I didn't see this before. You can actually calculate relative power consumption rather easily. Power consumption at a given voltage goes up and down linearly with frequency, and power consumption at a given frequency goes up and down with the square of the voltage. The equation is something like this: Power = (constant for the specific CPU) × (Frequency) × (Voltage)²
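In code form, the constant cancels when you compare two operating points of the same chip, so you can get relative power without knowing it. A minimal sketch (the voltages below are made-up illustration values, not measurements):

```python
# Relative power from P = k * f * V^2; the chip-specific constant k
# cancels when comparing two operating points of the same chip.

def relative_power(f1_ghz, v1_volts, f2_ghz, v2_volts):
    """Power at (f2, v2) divided by power at (f1, v1)."""
    return (f2_ghz * v2_volts**2) / (f1_ghz * v1_volts**2)

# Hypothetical voltages: 3.0 GHz @ 1.00 V vs 5.0 GHz @ 1.55 V lands
# near the ~4x figure quoted earlier for the Teraflops Research Chip.
print(relative_power(3.0, 1.00, 5.0, 1.55))  # ~4.0
```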
     
  21. R3d

    R3d Notebook Virtuoso

    Reputations:
    1,515
    Messages:
    2,382
    Likes Received:
    60
    Trophy Points:
    66
Some performance numbers. If these are true, Haswell is 10% faster than IVB at the same clock, for the Fritz chess benchmark at least. (Note: the link is in Chinese, so get Google Translate ready if you don't use Chrome.)
     
  22. HopelesslyFaithful

    HopelesslyFaithful Notebook Virtuoso

    Reputations:
    1,552
    Messages:
    3,271
    Likes Received:
    164
    Trophy Points:
    0
What is the constant? Amps? That is interesting; I have never heard of that before. Where did you get that? This would explain a few things. I was thinking about the numbers for my 920XM: I calculated its efficiency and it was about the same from 2GHz to 3.2GHz, and I bet that would be because the voltage was the same (granted, I bet 2GHz would be better if I lowered the voltage)... at least according to the equation you gave. I think that is why the Terascale chip went crazy inefficient at higher frequencies: it required a lot more voltage.
     
  23. Qing Dao

    Qing Dao Notebook Deity

    Reputations:
    1,600
    Messages:
    1,771
    Likes Received:
    304
    Trophy Points:
    101
Every microchip design is different, so each has a different constant in the equation.

    Here is some more info I could dig up:
    CPU power consumption as a function of frequency and voltage - AnandTech Forums
    Frequency scaling - Wikipedia, the free encyclopedia
     
  24. HopelesslyFaithful

    HopelesslyFaithful Notebook Virtuoso

    Reputations:
    1,552
    Messages:
    3,271
    Likes Received:
    164
    Trophy Points:
    0
  25. R3d

    R3d Notebook Virtuoso

    Reputations:
    1,515
    Messages:
    2,382
    Likes Received:
    60
    Trophy Points:
    66
    nvm. misread.
     
  26. Qing Dao

    Qing Dao Notebook Deity

    Reputations:
    1,600
    Messages:
    1,771
    Likes Received:
    304
    Trophy Points:
    101
    No. A 10% reduction in clock frequency while maintaining the same voltage would reduce power consumption by 10%. Power consumption goes up and down linearly with clock frequency.
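As a quick worked check using the same P = k × f × V² relation from earlier in the thread (the constant and voltage below are arbitrary placeholders):

```python
# At a fixed voltage, power tracks frequency linearly: P = k * f * V^2.
k, v = 1.0, 1.0              # arbitrary chip constant and voltage
p_full = k * 3.0 * v**2      # at 3.0 GHz
p_cut = k * 2.7 * v**2       # at a 10% lower clock
print(p_cut / p_full)        # 0.9 -> 10% less power
```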
     
  27. etismyname

    etismyname Notebook Consultant

    Reputations:
    4
    Messages:
    132
    Likes Received:
    0
    Trophy Points:
    30
Apparently the test isn't so accurate because Turbo Boost was on.
The 22nm heat issue also won't be solved, because they use the same "tick" (?).

Well, I am not an expert in this stuff, but it seems like everyone later in the thread pretty much agrees that Haswell's improvement isn't as significant as they were expecting.
     
  28. HopelesslyFaithful

    HopelesslyFaithful Notebook Virtuoso

    Reputations:
    1,552
    Messages:
    3,271
    Likes Received:
    164
    Trophy Points:
    0
What 22nm heat issue?

These chips are exactly what I was expecting... a better GPU, and then the added bonus of per-core voltage regulators. Only so much can be done on the same process.
     
  29. Jayayess1190

    Jayayess1190 Waiting on Intel Cannonlake

    Reputations:
    4,009
    Messages:
    6,712
    Likes Received:
    54
    Trophy Points:
    216
    Intel GT3 5200, 5100, 5000, 4600 explained

     
  30. HTWingNut

    HTWingNut Potato

    Reputations:
    21,580
    Messages:
    35,370
    Likes Received:
    9,877
    Trophy Points:
    931
    Now that's as confusing as ever.
     
  31. Mr. Wonderful

    Mr. Wonderful Notebook Evangelist

    Reputations:
    10
    Messages:
    449
    Likes Received:
    6
    Trophy Points:
    31
It's annoying that the Ultrabook models won't have dedicated memory. My next laptop will be an Ultrabook (or, more likely, a MacBook Air), but I'm not really interested until I can get a model with a sizable chunk of dedicated RAM for minor production work. And preferably a quad core. Maybe with Broadwell we'll see our first quad cores in Ultrabooks.
     
  32. Karamazovmm

    Karamazovmm Overthinking? Always!

    Reputations:
    2,365
    Messages:
    9,422
    Likes Received:
    200
    Trophy Points:
    231
That was expected. I don't know why they are doing it this way, though; they could have made it easier for people, like: the 5200 comes with everything, the 5000 has everything minus the eDRAM, and the 4600 is the GT2 core.

They never made the distinction between the higher clocked models and the rest. Dunno why now? Lawsuit?

That is actually not confirmed. The eDRAM would make sense on Ultrabooks and lower form factors; quads are usually coupled with higher end hardware that usually has a dedicated GPU, so there's no need there. Although I do expect the 5200 to appear on quads, at least I hope so, because I still have hopes that Apple will put it in the rMBP 13.
     
  33. R3d

    R3d Notebook Virtuoso

    Reputations:
    1,515
    Messages:
    2,382
    Likes Received:
    60
    Trophy Points:
    66
  34. HopelesslyFaithful

    HopelesslyFaithful Notebook Virtuoso

    Reputations:
    1,552
    Messages:
    3,271
    Likes Received:
    164
    Trophy Points:
    0
Exactly what I expected... improved idle power and GPU. This design was for mobile more than desktop... or a desktop for the average Joe.
     
  35. Mr. Wonderful

    Mr. Wonderful Notebook Evangelist

    Reputations:
    10
    Messages:
    449
    Likes Received:
    6
    Trophy Points:
    31
Personally, I'm already impressed by the GT2's benchmarks. Skyrim on Low (which looks terrible, but that's not the point) at 1080p at a 37 FPS average. More demanding games around 30.

The GT3 you're going to be able to game on. Not at max settings, but it won't be the stupidest idea to game on an integrated Intel GPU anymore.

The issue is that the new consoles coming out are going to raise the lowest common denominator way past what the GT3 can do, and that's something Broadwell won't be able to gain enough ground on next year. (Any word on expected GPU improvements with Broadwell yet?)
     
  36. Karamazovmm

    Karamazovmm Overthinking? Always!

    Reputations:
    2,365
    Messages:
    9,422
    Likes Received:
    200
    Trophy Points:
    231
    Does anyone know about the rumor of USB 3 being faulty?
     
  37. HopelesslyFaithful

    HopelesslyFaithful Notebook Virtuoso

    Reputations:
    1,552
    Messages:
    3,271
    Likes Received:
    164
    Trophy Points:
    0
Besides what NBC and a couple of other places have said, no.
     
  38. Karamazovmm

    Karamazovmm Overthinking? Always!

    Reputations:
    2,365
    Messages:
    9,422
    Likes Received:
    200
    Trophy Points:
    231
I tracked it down to only one not-so-reliable source; from there the thing spread, though obviously not like wildfire.
     
  39. Jayayess1190

    Jayayess1190 Waiting on Intel Cannonlake

    Reputations:
    4,009
    Messages:
    6,712
    Likes Received:
    54
    Trophy Points:
    216
    Intel desktop and mobile CPU roadmap for 2014

    1303_SS_MM24109_Intel_1040J.jpg
     
  40. HopelesslyFaithful

    HopelesslyFaithful Notebook Virtuoso

    Reputations:
    1,552
    Messages:
    3,271
    Likes Received:
    164
    Trophy Points:
    0
  41. Karamazovmm

    Karamazovmm Overthinking? Always!

    Reputations:
    2,365
    Messages:
    9,422
    Likes Received:
    200
    Trophy Points:
    231
Yeah, just when we get used to it, they change it.

I still think we may be in for some surprises in the lineup.
     
  42. HopelesslyFaithful

    HopelesslyFaithful Notebook Virtuoso

    Reputations:
    1,552
    Messages:
    3,271
    Likes Received:
    164
    Trophy Points:
    0
And this is the last Core i lineup, so it changes again with the next generation of chips... sigh.
     
  43. Qing Dao

    Qing Dao Notebook Deity

    Reputations:
    1,600
    Messages:
    1,771
    Likes Received:
    304
    Trophy Points:
    101
We consumers are too stupid to figure anything out ourselves beyond the dork at Best Buy telling us we need to buy our grandmothers i7-4773MQZs.

I miss the days when Intel named processors for what they were. So let me get this straight: we have three variables in each name.
First it is Celeron, Pentium, i3, i5, or i7. This is then followed by a random 4-digit number always starting with 4. After that, we have HQ, MQ, MX, and U. What a disaster.
     
  44. HopelesslyFaithful

    HopelesslyFaithful Notebook Virtuoso

    Reputations:
    1,552
    Messages:
    3,271
    Likes Received:
    164
    Trophy Points:
    0
I was starting to get the new scheme when it came to PGA and BGA, but now, seeing the full lineup, my brain is in overload...
     
  45. Karamazovmm

    Karamazovmm Overthinking? Always!

    Reputations:
    2,365
    Messages:
    9,422
    Likes Received:
    200
    Trophy Points:
    231
Well, engineers, the little monkeys of science: never trust them to name things.

Full configuration ("Vollausbau") of the Intel HD Graphics lineup:

    hd_graphics-naming-rules-665x437.png

    mobile_H-series-665x181.png
     
  46. Qing Dao

    Qing Dao Notebook Deity

    Reputations:
    1,600
    Messages:
    1,771
    Likes Received:
    304
    Trophy Points:
    101
    I trust that the engineers would give us much better names that actually make sense than this gibberish Intel's marketing department has been throwing at us.
     
  47. HopelesslyFaithful

    HopelesslyFaithful Notebook Virtuoso

    Reputations:
    1,552
    Messages:
    3,271
    Likes Received:
    164
    Trophy Points:
    0
What I wish we had was an adjustable TDP for the U CPUs. You hit 17W with either the CPU or the GPU alone; when you play a game, you hit 50% on both because of the TDP limit. It would be awesome to allow a 35 watt TDP in the plugged-in state so that you can use both at full speed.
     
  48. Karamazovmm

    Karamazovmm Overthinking? Always!

    Reputations:
    2,365
    Messages:
    9,422
    Likes Received:
    200
    Trophy Points:
    231
Well, the names are the engineers' doing; no marketing dept would give a name like that. They would come up with "Retina display" and stuff that makes the consumer go aww wow uoooww.

That would depend on the cooling provided. The Asus U21 could do that, but the cooling can't handle higher TDPs, only the specified one and the lower one.
     
  49. Cloudfire

    Cloudfire (Really odd person)

    Reputations:
    7,279
    Messages:
    10,304
    Likes Received:
    2,878
    Trophy Points:
    581
The only GT3 parts (HD 5000/5100) will be BGA-only for the first 2-4 months. Along with them, we get normal Haswell CPUs but with GT2 (HD 4600). After that, in Q3 (rumored October), we get normal non-soldered Haswell CPUs with GT3 (HD 5200). They will be called Core i7-4850HQ and Core i7-4950HQ.

[IMG]

Not sure if I want a notebook with BGA CPUs like the Core i7-4558U (GT3) or Core i7-4550U (GT3). I feel bad for buying the GT2 CPUs listed above in the picture, but they are clocked higher than the GT3 models, probably because they have more thermal headroom thanks to the weaker IGP. I rarely use the IGP and do like faster CPUs, but this time around the GT3 might be powerful enough that I wouldn't have to fire up the dedicated GPU.

    Hmmm, what to pick.
     
  50. HopelesslyFaithful

    HopelesslyFaithful Notebook Virtuoso

    Reputations:
    1,552
    Messages:
    3,271
    Likes Received:
    164
    Trophy Points:
    0
Yeah, I'll take Intel's naming scheme over Retina dumb stuff any day.

Yes, true, but it would give the manufacturers the ability to limit it, probably in the same way they do now with PROCHOT and such. Plus it would allow me to figure out some ghetto way to properly cool it and have an awesome ultraportable notebook ^^

Remember what I said about the TDP limits, and realize that in gaming it probably won't make a difference. At least with my SB i3 ULV model, which is bottom of the barrel, it throttles constantly in any game, so buying the i7 version would not have made a single difference; but maybe these new ones are a little more efficient. Also, what do you need a dedicated GPU for besides games and rendering? Integrated should be fast enough for everything from HD video to Flash... anything within normal Windows usage. A test somewhere showed that Intel's iGPU is much faster than a dedicated GPU for regular day-to-day stuff.


EDIT: Also forgot this.

I am excited about the new design for the single reason of independent voltage for all cores. I am curious to see whether a 35W TDP chip will have the same, or virtually the same, idle power draw as a ULV. The problem with all previous designs is that the 35-45W CPUs use 7-10W at idle and can't touch the low idle power draw of the ULVs. I couldn't care less about a ULV, to be honest, if this new design fixes that... it would pretty much defeat the point of a ULV ^^
     
← Previous page | Next page →