The Notebook Review forums were hosted by TechTarget, who shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information that had been posted on the forums is preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.
Problems? See this thread at archive.org.

    Forget Intel Ivy Bridge, Haswell on the way

    Discussion in 'Hardware Components and Aftermarket Upgrades' started by Jayayess1190, Jan 28, 2011.

  1. Atom Ant

    Atom Ant Hello, here I go again

    Reputations:
    1,340
    Messages:
    1,497
    Likes Received:
    272
    Trophy Points:
    101
    DDR4 should come as soon as possible, because integrated GPUs really could benefit from it. Maybe with Broadwell...
    There will be variants with 32MB and 64MB too. It might be inaccurate to call these memories cache; rather, it is a memory dedicated to the GPU only.
     
  2. HopelesslyFaithful

    HopelesslyFaithful Notebook Virtuoso

    Reputations:
    1,552
    Messages:
    3,271
    Likes Received:
    164
    Trophy Points:
    0
    lol looked at OP and it says uses DDR4 and DDR3, sweet. Everyone should buy up DDR3 while they can on the cheap, because in a year or two the price will start to go up. Also i must have missed this discussion... where is this RAM being held? Mobo?
     
  3. Atom Ant

    Atom Ant Hello, here I go again

    Reputations:
    1,340
    Messages:
    1,497
    Likes Received:
    272
    Trophy Points:
    101
    Yes, it is on the motherboard, very similar to what AMD introduced in 2008 with the 790GX, called "sideport" memory. However, AMD discontinued using it because it was not effective enough in real use and added extra power consumption. Maybe Intel found the solution; we will see in later tests. I'm just afraid those AnandTech tests won't represent real gaming situations and settings...
     
  4. HopelesslyFaithful

    HopelesslyFaithful Notebook Virtuoso

    Reputations:
    1,552
    Messages:
    3,271
    Likes Received:
    164
    Trophy Points:
    0
    yea if it was a 512MB chip then i could see real world gain. Also i am surprised DDR3 would really slow it down... i just find it hard to believe an iGPU would need that much power. Also i wonder if it'll even matter once some high powered DDR4 comes out
     
  5. Karamazovmm

    Karamazovmm Overthinking? Always!

    Reputations:
    2,365
    Messages:
    9,422
    Likes Received:
    200
    Trophy Points:
    231
    nobody has any solid info on Crystalwell; the RAM is called eDRAM, and we don't know where it's going to be.

    we don't know the size, we don't know which models will feature it, we don't know anything. What we do know is that there is an added layer of functionality in the GT3; however, a real revamp of the iGPU architecture is only slated for Broadwell, not for Haswell.

    What we also don't know is how many EUs will be available, nor their clocks, nor their thermal budget. One thing that we can't deny is how much progress there has been from the first 4500 HD to today's HD 4000, that is, since the introduction of the Nehalem arch: what was an afterthought became one of the main priorities for the engineering team. That is still no excuse for the comparatively poor performance against what AMD offers in their APUs; I hope they deliver performance similar to what the 7660G does, or more.

    according to intel time tables, DDR4 will be available for broadwell and certainly for skylake; the idea is that DDR4 is already being produced, so that there is enough stock for next year.
     
  6. Atom Ant

    Atom Ant Hello, here I go again

    Reputations:
    1,340
    Messages:
    1,497
    Likes Received:
    272
    Trophy Points:
    101

    Believe me, iGPUs jump with faster memory. AMD's Trinity gains even from tighter memory timings... It depends how strong the iGPU is, because a more powerful one needs more food from the memory ;).

    Where else could it be than on the motherboard? Surely there is no room on the chip for 128MB...
     
  7. Karamazovmm

    Karamazovmm Overthinking? Always!

    Reputations:
    2,365
    Messages:
    9,422
    Likes Received:
    200
    Trophy Points:
    231
    it can be on the PCH; it can be on a special connection to the CPU, avoiding the PCH completely so that the bandwidth is as large as possible (which, according to the leaks and semi-official info, it is); it can be on the die itself. the mobo is a very large area; basically, unless we know how it works, there are several places the eDRAM can be.
     
  8. R3d

    R3d Notebook Virtuoso

    Reputations:
    1,515
    Messages:
    2,382
    Likes Received:
    60
    Trophy Points:
    66
    It's on the CPU package. Meaning, not on the same die, but on the same PCB as the die, kinda like what Intel did with the iGPU for Arrandale.
     
  9. Karamazovmm

    Karamazovmm Overthinking? Always!

    Reputations:
    2,365
    Messages:
    9,422
    Likes Received:
    200
    Trophy Points:
    231
    where did you get that from? if we assume that the bandwidth is really high, it makes a ton of sense
     
  10. R3d

    R3d Notebook Virtuoso

    Reputations:
    1,515
    Messages:
    2,382
    Likes Received:
    60
    Trophy Points:
    66
  11. Karamazovmm

    Karamazovmm Overthinking? Always!

    Reputations:
    2,365
    Messages:
    9,422
    Likes Received:
    200
    Trophy Points:
    231
  12. HTWingNut

    HTWingNut Potato

    Reputations:
    21,580
    Messages:
    35,370
    Likes Received:
    9,877
    Trophy Points:
    931
    Finally! Good find R3d! I figured it would happen eventually. I think 256MB would be the sweet spot considering its performance, but take whatever dedicated vRAM you can get.
     
  13. t.saddington

    t.saddington Notebook Geek

    Reputations:
    1
    Messages:
    85
    Likes Received:
    0
    Trophy Points:
    15
    LOL I just purchased a top of the line DV6 with Ivy Bridge!! Should have waited! Oh well, it's not like this is slow or anything!
     
  14. DackEW

    DackEW Notebook Consultant

    Reputations:
    47
    Messages:
    178
    Likes Received:
    24
    Trophy Points:
    31
    Yuj :(.

    Why only 128MB then?

    Is the space too limited for a bigger package?
     
  15. Jayayess1190

    Jayayess1190 Waiting on Intel Cannonlake

    Reputations:
    4,009
    Messages:
    6,712
    Likes Received:
    54
    Trophy Points:
    216
  16. Jayayess1190

    Jayayess1190 Waiting on Intel Cannonlake

    Reputations:
    4,009
    Messages:
    6,712
    Likes Received:
    54
    Trophy Points:
    216
    Availability date of Haswell processors confirmed

     
  17. HTWingNut

    HTWingNut Potato

    Reputations:
    21,580
    Messages:
    35,370
    Likes Received:
    9,877
    Trophy Points:
    931
    Doesn't look like anything to get too excited about. So I can wait until June 2014 when nVidia Maxwell and Intel Broadwell will collide. Just hope Broadwell actually isn't only a BGA chip. Intel i7-5920QM + GTX 890m in 18 months and counting. ;) Saving my pennies now.

    I'll likely bump up to an i7-3740QM in the interim and overclock the hell outta my 680m if needed. :)
     
  18. sreesub

    sreesub Notebook Consultant

    Reputations:
    8
    Messages:
    281
    Likes Received:
    1
    Trophy Points:
    31
    I don't see why anyone would jump after just one generation. Haswell is good for folks with Merom/Penryn laptops that are showing their age.

    I have a Penryn ThinkPad R61 (2.5 GHz). Since I put in an SSD and replaced Vista with Ubuntu, its performance has been great, and I have replaced the battery twice. I will wait for Broadwell or Skylake and buy a convertible, either with Mac OS (I hope there will be an MBA with touch) or a Windows equivalent with Android in a super-saver battery mode. I am hoping we will see SSDs of at least 512GB at a reasonable price, and AMOLED screens as well.
     
  19. Qing Dao

    Qing Dao Notebook Deity

    Reputations:
    1,600
    Messages:
    1,771
    Likes Received:
    304
    Trophy Points:
    101
    Also on the upgraded integrated graphics: the GT3 is going to be available only with BGA processors. I think this is a way for Intel to protect Nvidia and AMD on the graphics front, or at least to keep forcing more expensive "ultrabooks" on consumers. Intel has powerful integrated graphics for the thin and light "ultrabooks" they are trying to push, while users with normal laptops will still be forced to purchase Nvidia and AMD graphics if they want something powerful enough to actually game on.
     
  20. HopelesslyFaithful

    HopelesslyFaithful Notebook Virtuoso

    Reputations:
    1,552
    Messages:
    3,271
    Likes Received:
    164
    Trophy Points:
    0
    i also think it is done that way for strategic reasons. People buy bigger laptops for performance, and those usually come with a good CPU and some sort of GPU. If i was buying a laptop that was not an ultrabook, i wouldn't want die space wasted on an iGPU; I would rather have the extra CPU power. To throw in a strong enough iGPU you have to give up some CPU, so personally I prefer the product plan Intel is following. It is definitely smart, giving the best performance for everyone.
     
  21. Cloudfire

    Cloudfire (Really odd person)

    Reputations:
    7,279
    Messages:
    10,304
    Likes Received:
    2,878
    Trophy Points:
    581
    People who need an IGP are mostly those who buy cheap notebooks and ultrabooks and don't upgrade the GPU/CPU anyway
    Not a big fan of soldered hardware anyway, grrr
     
  22. HopelesslyFaithful

    HopelesslyFaithful Notebook Virtuoso

    Reputations:
    1,552
    Messages:
    3,271
    Likes Received:
    164
    Trophy Points:
    0
    which is what i said, right? or did i just say it wrong in my sleep this morning ^^
     
  23. superparamagnetic

    superparamagnetic Notebook Consultant

    Reputations:
    402
    Messages:
    252
    Likes Received:
    56
    Trophy Points:
    41
    There actually is a good engineering reason for using BGA. Solder conducts heat better than pins, so it might help with heat dissipation on what's likely to be a 47W to 57W chip.

    Most likely it's targeted towards space constrained systems. 57W is about the TDP of a ULV chip + midrange discrete graphics. While you might be able to dissipate that kind of heat, the space you save by going down to a single chip instead of two could be used for things like a bigger battery or better cooling. In this market it doesn't make sense to waste space on a socket anyways.

    Then again it could just be Intel up to its old market segmentation schemes again.
     
  24. HopelesslyFaithful

    HopelesslyFaithful Notebook Virtuoso

    Reputations:
    1,552
    Messages:
    3,271
    Likes Received:
    164
    Trophy Points:
    0
    the BGA vs. pin question doesn't matter. The "heat" is still electricity at that point, so it will have no effect on temps. Now, the point that a thinner package could fit a better heatsink in there definitely could be argued.
     
  25. Qing Dao

    Qing Dao Notebook Deity

    Reputations:
    1,600
    Messages:
    1,771
    Likes Received:
    304
    Trophy Points:
    101
    BGA reduces production costs, if only by a little bit, for both Intel and the laptop manufacturer. It also reduces the height of the chip on the PCB by a few millimeters, which can translate to a few millimeters shaved off the thickness of a computer if they design it correctly. It also makes it more of a headache for end users trying to upgrade, as well as for laptop manufacturers performing warranty service. A motherboard problem also costs them a CPU, unless they go through the annoying process of reclaiming it from the motherboard. And if they have multiple CPU options for a single laptop model, it is a headache to make and stock a handful of different motherboards for a single PC.

    Personally, I hate BGA CPUs. The laptop I'm typing on has one, and it is starting to get too slow for me with a lot of multitasking.
     
  26. HopelesslyFaithful

    HopelesslyFaithful Notebook Virtuoso

    Reputations:
    1,552
    Messages:
    3,271
    Likes Received:
    164
    Trophy Points:
    0
    they make great ultrabooks, really. BGAs are good for standard computing and getting a new computer every 2-3 years (2 years in my opinion). Though it is a trade-off. I would love to see a report on what is better: buying a new computer every 2-3 years or holding on to it for 3-5 years, assuming performance is enough. As in, do the performance and reduced power usage justify the added waste? I would assume that for someone running a high-powered desktop or laptop close to max load all the time, an upgrade to a new system would be more environmental and economical, but for someone who just uses it for general computing i bet a 5 year old model would be less wasteful
     
  27. Qing Dao

    Qing Dao Notebook Deity

    Reputations:
    1,600
    Messages:
    1,771
    Likes Received:
    304
    Trophy Points:
    101
    Well after 3 years, upgrading the processor should be pretty economical. I bought my laptop two years ago, and it is perfectly fine for me for the foreseeable future except for the processor. I'm looking for a replacement, but if I was able to upgrade to a better processor from the same series, it would have been able to hold me over for at least one more year.

    It depends on how much it costs to upgrade, and how much benefit there is from an upgrade. In lots of cases it makes no sense because you are almost throwing your money at something that will not get much better. But in others, a small amount of money can greatly boost system performance.
     
  28. HopelesslyFaithful

    HopelesslyFaithful Notebook Virtuoso

    Reputations:
    1,552
    Messages:
    3,271
    Likes Received:
    164
    Trophy Points:
    0
    if i were you i'd wait for a die shrink. i will always upgrade on a die shrink if i can
     
  29. superparamagnetic

    superparamagnetic Notebook Consultant

    Reputations:
    402
    Messages:
    252
    Likes Received:
    56
    Trophy Points:
    41
    Err no. Heat is heat. For every watt the chip consumes as electricity, it has to dissipate a watt as heat. It primarily dissipates into the heatsink, but there's no reason it can't also dissipate heat into the motherboard through the electrical contacts. After all, metals are pretty good thermal conductors.

    BGA is better than PGA for dissipating heat into the motherboard. In effect the motherboard acts as a secondary heatsink (with a sizeable surface area). For low power chips like Intel's PCH the motherboard is enough to act as the primary heatsink on a BGA package.

    Intel has a nice spiel on BGA thermal characteristics: Ball Grid Array Packaging: Packaging Databook Ch 14
    See 14.10.1 if you want to read about it.
     
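    To put rough numbers on the board-as-heatsink idea, here is a minimal Python sketch. The theta (junction-to-board thermal resistance) values and the temperature delta below are illustrative placeholders, not figures from any Intel datasheet:

        # Rough comparison of how much heat a package can dump into the
        # motherboard, modeled as a single junction-to-board thermal
        # resistance (theta_jb). All values are illustrative assumptions.

        def board_path_watts(delta_t_c, theta_jb_c_per_w):
            """Steady-state heat flow Q = dT / theta into the board path."""
            return delta_t_c / theta_jb_c_per_w

        delta_t = 30.0    # assumed die-to-board temperature difference, deg C
        theta_bga = 4.0   # assumed: solder balls couple the package well
        theta_pga = 12.0  # assumed: socket + pins add contact resistance

        print(f"BGA board path: ~{board_path_watts(delta_t, theta_bga):.1f} W")
        print(f"PGA board path: ~{board_path_watts(delta_t, theta_pga):.1f} W")

    On these assumptions the board path moves roughly 7.5 W for BGA versus 2.5 W for PGA: a meaningful fraction of a ~5 W PCH's budget, but only a small slice of a 45 W CPU's.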
  30. HopelesslyFaithful

    HopelesslyFaithful Notebook Virtuoso

    Reputations:
    1,552
    Messages:
    3,271
    Likes Received:
    164
    Trophy Points:
    0
    maybe for a 5 watt processor, but for a 45 watt processor it won't make a difference. A whole 1 or 2 watts of heat out the bottom of the board is irrelevant; the difference between a plastic plate for the pins and a soldered BGA SoC is virtually nothing. Plus, i would rather not have the added heat in the motherboard anyway, if it were true. Also, heat is heat, but energy is not heat while in transfer through wires: you have HVAC or HVDC systems that transfer more than 1,800 kilowatt hours, or 6,141,854 BTU, per hour, yet almost no heat is lost. So heat is heat, but energy is not heat until used. Nonetheless, even if a BGA chip dissipates more heat into the board, it cannot be more than a watt or two more than a PGA (?) chip. I still doubt BGA adds anything worthwhile to cooling unless you're talking about 1-5 watt systems. I don't think there would even be a noticeable difference on a 17 watt IB chip.
     
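    For what it's worth, the power-unit conversion quoted above checks out, assuming the poster means a sustained transfer rate of 1,800 kW (i.e. 1,800 kWh per hour). A quick Python check:

        # 1 W = 3.412142 BTU/hr; verify 1,800 kW against the quoted BTU figure
        BTU_PER_HR_PER_WATT = 3.412142

        kw = 1_800
        btu_per_hr = kw * 1_000 * BTU_PER_HR_PER_WATT
        print(f"{kw} kW = {btu_per_hr:,.0f} BTU/hr")  # ~6,141,856 BTU/hr

    That matches the 6,141,854 figure to within rounding of the conversion factor.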
  31. HTWingNut

    HTWingNut Potato

    Reputations:
    21,580
    Messages:
    35,370
    Likes Received:
    9,877
    Trophy Points:
    931
    So deep in entropy here. :p
     
  32. Jayayess1190

    Jayayess1190 Waiting on Intel Cannonlake

    Reputations:
    4,009
    Messages:
    6,712
    Likes Received:
    54
    Trophy Points:
    216
    Haswell GT3 graphics to launch in Q3 2013 via Fudzilla here and here.

     
  33. Jayayess1190

    Jayayess1190 Waiting on Intel Cannonlake

    Reputations:
    4,009
    Messages:
    6,712
    Likes Received:
    54
    Trophy Points:
    216
    Core i7 3940MX is Intel’s new mobile king
    Intel is working on Haswell Core i7-4930MX processor

    M-line Haswell, turbo to 3.9GHz, 57W TDP

     
  34. HopelesslyFaithful

    HopelesslyFaithful Notebook Virtuoso

    Reputations:
    1,552
    Messages:
    3,271
    Likes Received:
    164
    Trophy Points:
    0
    isn't M for mobile? and isn't there an XM and an MX chip? MX is soldered and XM is socketed?

    huh?
     
  35. CoolMod

    CoolMod Notebook Consultant NBR Reviewer

    Reputations:
    155
    Messages:
    104
    Likes Received:
    0
    Trophy Points:
    30
    Kind of debating whether or not to wait for Haswell...
     
  36. HopelesslyFaithful

    HopelesslyFaithful Notebook Virtuoso

    Reputations:
    1,552
    Messages:
    3,271
    Likes Received:
    164
    Trophy Points:
    0
    what are you getting? If you are getting an ultrabook then i would wait... it'll be a lot better (assuming you can).

    otherwise waiting is relatively pointless.
     
  37. HTWingNut

    HTWingNut Potato

    Reputations:
    21,580
    Messages:
    35,370
    Likes Received:
    9,877
    Trophy Points:
    931
    The only real benefit I see is if you want a thin and light notebook with just integrated graphics; then the HD 5200/GT3 or whatever the heck it is would be worth waiting for, imho. But the marginal increase in CPU performance isn't much to write home about. Even then, GT3 graphics won't even match the integrated GPU performance AMD's Trinity has today.

    Heck, I ran the HD 4000 overclocked by 200MHz with DDR3 RAM at 2133MHz, and 3DMark11 went from ~600 to ~750; while that is a 25% gain, the end result is still laughable. The 7660G in the A10-4600M scores ~1350 stock @ 1600MHz RAM. The point is that Intel has a lot of work to do to at least double the performance of their current design, which isn't likely. If integrated graphics are important to you, I'd say go for AMD. If you want CPU power and decent graphics performance, go Intel + dedicated GPU; even a low end one will be better than Intel's IGP.

    One issue I've found even with AMD's IGP is that CPU hungry games like BF3 fight for system resources with the IGP, so while on paper it looks to be about equivalent to a 630M, in some cases it can't even manage 25 FPS where it should be able to pull 40 FPS. A dedicated card has its own memory and architecture to use, independent of the CPU.
     
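    The arithmetic in that post is easy to verify; the scores below are the approximate figures quoted above, not new measurements:

        # Sanity-check on the quoted 3DMark11 numbers
        hd4000_stock = 600   # ~score, HD 4000 at stock
        hd4000_oc = 750      # ~score, +200MHz GPU, DDR3-2133
        a10_7660g = 1350     # ~score, stock A10-4600M @ DDR3-1600

        gain = (hd4000_oc - hd4000_stock) / hd4000_stock
        gap = a10_7660g / hd4000_oc
        print(f"OC gain: {gain:.0%}")     # 25%
        print(f"7660G lead: {gap:.1f}x")  # ~1.8x, i.e. near-double needed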
  38. CoolMod

    CoolMod Notebook Consultant NBR Reviewer

    Reputations:
    155
    Messages:
    104
    Likes Received:
    0
    Trophy Points:
    30
    Thanks for the information!
     
  39. Mr. Wonderful

    Mr. Wonderful Notebook Evangelist

    Reputations:
    10
    Messages:
    449
    Likes Received:
    6
    Trophy Points:
    31
    The only thing disappointing about Haswell for me is that it still takes a top of the line mobile CPU to have a non-Turbo base clock of 3.0GHz. It's really too bad for workflows where high frequencies matter.
     
  40. HopelesslyFaithful

    HopelesslyFaithful Notebook Virtuoso

    Reputations:
    1,552
    Messages:
    3,271
    Likes Received:
    164
    Trophy Points:
    0
    the reason for that is TDP.
     
  41. HTWingNut

    HTWingNut Potato

    Reputations:
    21,580
    Messages:
    35,370
    Likes Received:
    9,877
    Trophy Points:
    931
    That's hard to justify or explain. Are you saying that if you could drive more voltage into it, it would improve performance? The CPU doesn't throttle, the GPU doesn't throttle; it's just fighting for resources. Thermals are below 70C at all times too, so that's not the issue either. I noticed a similar issue with the Llano GPUs, even though we could control the clock speed and voltage at will. I do admit that Llano did handle BF3 on low detail much better than Trinity, but then again, the A8 Llanos have more CPU computing power than Trinity.
     
  42. HopelesslyFaithful

    HopelesslyFaithful Notebook Virtuoso

    Reputations:
    1,552
    Messages:
    3,271
    Likes Received:
    164
    Trophy Points:
    0
    no, a chip is limited to a set TDP. So if it has a 17 watt TDP, and the CPU can use 17 watts on full load and the GPU can use 16 watts on full load, then they have to share the 17 watts, so the CPU and GPU each get, say, 5 and 12 respectively if that's what the program needs. I have an SB ULV notebook. The CPU can use 17 watts in Prime95, and in FurMark the iGPU can use 16-17 watts, so with a total TDP of 17, only one can run at full speed or they each get throttled. I can play DC Universe Online or Dungeon Siege 3 at 15 FPS because the package can only use 17 watts. When you look at the CPU it is at 50% and the GPU is at 30-50% (if i remember), the total TDP is at 17 watts, and the CPU is using like 10 while the GPU is at 7.

    So take an Intel CPU and run FurMark and Prime95 separately and see how much TDP each can use, and then run a game, or both benchmarks together, and compare.

    You'll see that i am right.


    EDIT: What i am wondering is if a 45 watt 22nm Intel (IB) would do better than a 32nm 35 watt Trinity because of this issue. Even if Intel's GPU is slower, do the larger total TDP and smaller process give it better game performance than the AMD counterpart? I know AMD will win in synthetics when it's just stressing one part, but what about combined? What are the comparisons in 3DMark 11 in the combined tests, not the total?


    Also remember i did this with a ULV... it has a very limited TDP... a 45 watt processor may not have this issue, or not as severely. If my ULV could run at a 35 watt TDP then i could actually play DC Universe Online and DS3 at some level, but 15-25 FPS is a joke. Also, FPS varied in game from 10-30 overall, and while watching HWiNFO i noticed the FPS sagged when the cpu needed more juice while maintaining the same overall TDP. So iGPUs are great if the game is not CPU intensive... if it is, GPU performance will tank
     
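    The shared-budget argument in that post can be written down as a toy model. This is only a minimal sketch of the idea: the 17 W / 16 W standalone draws come from the post, while the proportional split rule is an illustrative assumption (real firmware uses its own allocation policy):

        # Toy model of CPU and iGPU sharing one package TDP
        def share_tdp(cpu_demand_w, gpu_demand_w, package_tdp_w):
            """Scale both demands down proportionally when over budget."""
            total = cpu_demand_w + gpu_demand_w
            if total <= package_tdp_w:
                return cpu_demand_w, gpu_demand_w
            scale = package_tdp_w / total
            return cpu_demand_w * scale, gpu_demand_w * scale

        # Numbers from the post: 17 W ULV package, CPU alone pulls 17 W
        # (Prime95), iGPU alone pulls 16 W (FurMark)
        cpu_w, gpu_w = share_tdp(17.0, 16.0, 17.0)
        print(f"CPU ~{cpu_w:.1f} W, iGPU ~{gpu_w:.1f} W under combined load")

    Under this split neither unit gets near its standalone draw (about 8.8 W and 8.2 W here), which is the claimed cause of the low in-game frame rates.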
  43. Cakefish

    Cakefish ¯\_(?)_/¯

    Reputations:
    1,643
    Messages:
    3,205
    Likes Received:
    1,469
    Trophy Points:
    231
    Forget Haswell, Broadwell is on the way!

    Forget Broadwell, Skylake is on the way!

    Forget Skylake, Skymont is on the way!
     
  44. HopelesslyFaithful

    HopelesslyFaithful Notebook Virtuoso

    Reputations:
    1,552
    Messages:
    3,271
    Likes Received:
    164
    Trophy Points:
    0
    i keep saying skylark in my head lol because i read it wrong once when i was reading fast and it stuck lol
     
  45. R3d

    R3d Notebook Virtuoso

    Reputations:
    1,515
    Messages:
    2,382
    Likes Received:
    60
    Trophy Points:
    66
    The CPU and iGPU would throttle to stay under TDP. If there's no throttling, then the lower performance is probably due to the sharing of resources (e.g. saturating the memory controller or something).

    Though in the case of BF3, imo it's just more likely that the game is CPU limited. (could check with render.showperfoverlay 1 or something)
     
  46. HopelesslyFaithful

    HopelesslyFaithful Notebook Virtuoso

    Reputations:
    1,552
    Messages:
    3,271
    Likes Received:
    164
    Trophy Points:
    0
    EDIT: typed it on my phone in class and it botched what i typed

    With BF3 i would bet it is TDP and not limited by the CPU. I don't see how the CPU would be too slow. Also, if it slows down due to the CPU not keeping up, then it will be affected by TDP before the CPU is too slow.

    a 45w TDP CPU will use 100% of its TDP, aka 45w, on max load... so how can an iGPU run if the game requires that much CPU? Again, no matter what type of game it is, if it requires a lot of CPU or GPU it will throttle at the TDP limit. A desktop chip would not do this since it has a nearly unlimited TDP (if a K chip). Even then, the desktop chip probably doesn't have an iGPU that needs its 77w TDP, so i am sure TDP will never get in the way. (could be wrong, but i would be amazed if that happened. Well, i could see it if you're trying to play Total War, particularly Rome or Medieval 2. Running 25k units on screen requires a ton of GPU and CPU, so i could see it being limited by CPU or TDP.)


    If a game on a 45w chip can really be limited by RAM i would be surprised. That would have to be using a ton of bandwidth at such low settings. If that is the case, then maybe DDR4 will fix this problem.

    Another point: I can see a game limited by a slow CPU and not by the iGPU or RAM if it is a single threaded game like what I stated above, aka CS Source with bots, or Total War, or a crazy SimCity 4 map, or RollerCoaster Tycoon 3. In RCT3 i get 30-120 FPS with a max sized map when i use tools, since it is single threaded.


    EDIT: @Cakefish... beyond that we will switch to a different material than silicon, and then quantum CPUs!!!!

    BTW: r3d... you do realize you repeated exactly what i said in your first statement, right?
     
  47. HTWingNut

    HTWingNut Potato

    Reputations:
    21,580
    Messages:
    35,370
    Likes Received:
    9,877
    Trophy Points:
    931
    BF3 is a very CPU and thread hungry game in multiplayer. It can use 8 threads no problem, and does best with a fast hyperthreaded quad core Intel CPU. Go ahead and test it yourself with ThrottleStop on an Intel quad core: adjust the speed of the CPU from 2GHz to 3GHz and you will see a marked difference in BF3. The AMD A10 doesn't even come close to the performance of an Intel i7 quad core, so you can imagine the resultant performance drop. TDP would only really be a factor if temps were too high, but in the case of Trinity they are not; they are 70-72C max. Most other games I've tried so far, however, perform well with the Trinity IGP.
     
  48. HopelesslyFaithful

    HopelesslyFaithful Notebook Virtuoso

    Reputations:
    1,552
    Messages:
    3,271
    Likes Received:
    164
    Trophy Points:
    0
    TDP has nothing to do with temps... it has to do with power usage. If i run a 45w TDP chip at 0C, i am not going to get more than 45w out of it compared to at 70C.

    Anyways, i am going from my experience with Intel chips... maybe an AMD chip does not reach max TDP at 100% CPU. Maybe 100% on an AMD chip is only 20 watts out of 35 watts; I don't know, i've never used one, but i know what i am telling you is 100% correct. Just go test what i told you.

    Fact is, if the AMD cpu uses its 35 or 45 watt TDP window at 100% CPU utilization, then playing a game that is CPU intensive will kill the GPU performance due to a lack of TDP headroom. The two WILL fight for TDP, which will kill your gaming performance.

    If i remember correctly... DC Universe Online would require more CPU than DS3. I would get 10 watts CPU and 7 watts GPU in DCUO, and in DS3 i would get the opposite, if i remember correctly. I did this about a year ago so i can't quite remember. I don't really feel like proving it to you since i know i am right, so do whatever you want: test it and see for yourself, or blindly believe what you wish. It is your own prerogative.
     
  49. superparamagnetic

    superparamagnetic Notebook Consultant

    Reputations:
    402
    Messages:
    252
    Likes Received:
    56
    Trophy Points:
    41
    I'm not sure where you get your info, HopelesslyFaithful, but everything I've read says TDP is both power and thermals.

    The Sandy Bridge architecture specifically allows a CPU to turbo up above TDP if there's thermal headroom, so yes you can get more than 45W out of your 45W chip. Source: AnandTech - Intel's Sandy Bridge Architecture Exposed

    Anand also explains how Intel calculates TDP: AnandTech - Intel Brings Core Down to 7W, Introduces a New Power Rating to Get There: Y-Series SKUs Demystified
    Thermals play a big role.
     
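    The turbo-above-TDP behavior Anand describes can be illustrated with a toy budget model: the package may exceed TDP briefly as long as the average power over a sliding window stays at or below TDP. The window length and draw values below are made-up numbers for illustration, not Intel's actual parameters:

        from collections import deque

        TDP_W = 45.0
        WINDOW_S = 28  # assumed averaging window, seconds

        def can_turbo(history: deque, proposed_draw_w: float) -> bool:
            """Allow the draw if the windowed average power stays <= TDP."""
            window = list(history)[-(WINDOW_S - 1):] + [proposed_draw_w]
            return sum(window) / len(window) <= TDP_W

        cool = deque([20.0] * WINDOW_S, maxlen=WINDOW_S)  # chip was idling
        hot = deque([45.0] * WINDOW_S, maxlen=WINDOW_S)   # pegged at TDP
        print(can_turbo(cool, 60.0))  # True: headroom was banked while idle
        print(can_turbo(hot, 60.0))   # False: no headroom left

    So a cool chip can briefly pull 60 W, while one already sitting at TDP cannot, which squares with both "TDP is power" and "TDP is thermals."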
  50. HTWingNut

    HTWingNut Potato

    Reputations:
    21,580
    Messages:
    35,370
    Likes Received:
    9,877
    Trophy Points:
    931
    @HopelesslyFaithful - TDP primarily has to do with heat. Of course this is directly related to energy consumed, but TDP guidelines are there to tell laptop manufacturers what cooling to provide, not how much energy the CPU should draw. And I'm not sure how you can measure the power consumption split between the CPU and IGP, if that's what you were saying in your last comment.

    Even Wikipedia states: Thermal design power - Wikipedia, the free encyclopedia

    " The thermal design power (TDP), sometimes called thermal design point, refers to the maximum amount of power the cooling system in a computer is required to dissipate."

    " This ensures the computer will be able to handle essentially all applications without exceeding its thermal envelope..."

    " For example, a laptop's CPU cooling system may be designed for a 20 watt TDP, which means that it can dissipate up to 20 watts of heat without exceeding the maximum junction temperature for the computer chip."
     