The Notebook Review forums were hosted by TechTarget, which shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information that had been posted on the forums is preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.
Problems? See this thread at archive.org.

    Forget Intel Ivy Bridge, Haswell on the way

    Discussion in 'Hardware Components and Aftermarket Upgrades' started by Jayayess1190, Jan 28, 2011.

  1. HTWingNut

    HTWingNut Potato

    Reputations:
    21,580
    Messages:
    35,370
    Likes Received:
    9,877
    Trophy Points:
    931
  2. Cloudfire

    Cloudfire (Really odd person)

    Reputations:
    7,279
    Messages:
    10,304
    Likes Received:
    2,878
    Trophy Points:
    581
    LOL I agree HT. I think I understand it now.

    - All of these CPUs are for Ultrabooks, hence the "U" at the end of their names, and all of them are BGA (soldered to the motherboard).

    - Then we have the Haswell CPUs for "normal" sized notebooks: 4700MQ, 4800MQ and 4900MQ. All of these are regular CPUs that are not soldered to the motherboard. They all use HD 4600 (GT2). Notice that they end with "MQ", and that their GT2 is called HD 4600 instead of HD 4400 like on the Ultrabook CPUs.

    - Later this year, in Q3, we will get Haswell CPUs for normal sized notebooks (non-soldered) called 4750HQ, 4850HQ and 4950HQ. These will finally have GT3, called HD 5200. Notice that these GT3 models always end with "HQ", and that their GT3 is called HD 5200 instead of HD 5100 like on the Ultrabook CPUs.

    - Along with all of this we will also get overclockable Haswell CPUs. One is called 4930MX. Notice that this CPU, like all the others, has the letters rearranged compared to Sandy Bridge: there it was XM and QM, now it is MX and MQ. Plus the HQ (the GT3 model).
     
  3. Loney111111

    Loney111111 Notebook Deity

    Reputations:
    396
    Messages:
    828
    Likes Received:
    28
    Trophy Points:
    41
    I thought mobile CPUs were completely locked. Or is it just that laptop manufacturers, except on a very few models, prefer to lock out any BIOS OCing options?
     
  4. HopelesslyFaithful

    HopelesslyFaithful Notebook Virtuoso

    Reputations:
    1,552
    Messages:
    3,271
    Likes Received:
    164
    Trophy Points:
    0

    I thought they said the HQ chips are soldered and the MQ ones are not soldered?
     
  5. Cloudfire

    Cloudfire (Really odd person)

    Reputations:
    7,279
    Messages:
    10,304
    Likes Received:
    2,878
    Trophy Points:
    581
    They did? Ok you might be right about that.

    Then we might not see GT3 on PGA, only on soldered CPUs.
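    The suffix rules being pieced together in this exchange can be sketched as a tiny lookup table. This is only a sketch of the thread's reading of the leaks; the package and GPU assignments below are this thread's guesses, not confirmed Intel specs:

    ```python
    # Hypothetical decoder for the Haswell mobile naming rules discussed above.
    # The suffix -> (package, iGPU) mapping reflects this thread's speculation,
    # not an official Intel specification.
    SUFFIX_RULES = {
        "U":  ("BGA (soldered, Ultrabook)", "Ultrabook graphics (GT2/GT3)"),
        "MQ": ("PGA (socketed)",            "GT2 (HD 4600)"),
        "MX": ("PGA (socketed, unlocked)",  "GT2 (HD 4600)"),
        "HQ": ("BGA (soldered)",            "GT3 (HD 5200)"),
    }

    def decode(model: str):
        """Return (package, graphics) for a Haswell mobile model like '4700MQ'."""
        for suffix in ("MQ", "MX", "HQ", "U"):
            if model.endswith(suffix):
                return SUFFIX_RULES[suffix]
        raise ValueError(f"unknown suffix in {model!r}")

    print(decode("4700MQ"))  # socketed, GT2
    print(decode("4850HQ"))  # soldered, GT3
    ```

    Under this reading, the suffix alone tells you both the package and which GPU tier to expect.
    
    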
     
  6. Karamazovmm

    Karamazovmm Overthinking? Always!

    Reputations:
    2,365
    Messages:
    9,422
    Likes Received:
    200
    Trophy Points:
    231
    Since I do enjoy complicated things, like trying to watch a good show on TV (as you are aware, that's a very tough proposition), I will say that we don't have enough info. We will probably see more GT3e, since it doesn't make a lick of sense to put it only in a 47W CPU; 35W parts should also be available. It makes sense: most quads are coupled with a dGPU of some kind, and add to that the fact that those iGPUs would be beneficial for smaller designs where the cooling is not enough for CPU + dGPU.
     
  7. HopelesslyFaithful

    HopelesslyFaithful Notebook Virtuoso

    Reputations:
    1,552
    Messages:
    3,271
    Likes Received:
    164
    Trophy Points:
    0
    I disagree. I would love to have a great iGPU in a $400 or so laptop with 1080p, or even a good $600 Ultrabook with a powerful i7. I bet if someone tried what I did with my i3, but with an i7, running the CPU and GPU separately and checking the max draw for both combined, you would get somewhere in the 65-75W TDP range. My 3720QM reaches 55 watts on just the CPU, and that doesn't count the iGPU, which I have never tested. If you throw in a powerful 35W TDP iGPU, you will be hitting walls like it's cool in games. I wonder, if they put a GT3 GPU in the XM... err, MX chip, what would its TDP be at stock clocks if the TDP limit was opened to 100W? (With no clock changes.) I bet it would reach close to the 75W range. That is a lot of power you can cram into a small device... given good enough cooling.
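    The back-of-envelope math in the post above can be written out explicitly. The CPU-only figure comes from the post's own measurement; the iGPU figure is a pure assumption for illustration:

    ```python
    # Power-budget sketch for the scenario above. All wattages are the poster's
    # rough measurements or guesses, not datasheet values.
    cpu_only_draw_w = 55.0    # CPU-only draw measured on an i7-3720QM (per the post)
    igpu_draw_guess_w = 15.0  # assumed iGPU draw under load (hypothetical)
    rated_tdp_w = 45.0        # i7-3720QM rated TDP

    combined_w = cpu_only_draw_w + igpu_draw_guess_w
    over_budget_w = combined_w - rated_tdp_w
    print(f"combined ~{combined_w:.0f} W, {over_budget_w:.0f} W over the rated TDP")
    ```

    With these assumed numbers, the combined draw lands in the 65-75W range the post estimates, well past the rated TDP, which is why the chip throttles ("hits walls") when both are loaded.
    
    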
     
  8. Karamazovmm

    Karamazovmm Overthinking? Always!

    Reputations:
    2,365
    Messages:
    9,422
    Likes Received:
    200
    Trophy Points:
    231
    You can already see what that iGPU TDP costs the series: 400MHz down. That's a lot of oomph that goes away because the iGPU needs that thermal headroom.

    I'm not disagreeing with you in any way, except on the MX thing, which doesn't make sense.

    The problem is that:

    1) this is more like a trial run

    2) those cpus are going to cost more

    3) they will probably be in higher end models because of the 2nd reason, and that's where we expect the benefit of the added power, given the trade-off for thinness/weight/design choices/or whatever is up on HBO
     
  9. davidricardo86

    davidricardo86 Notebook Deity

    Reputations:
    2,376
    Messages:
    1,774
    Likes Received:
    109
    Trophy Points:
    81
    Not trying to derail the topic, but an AMD Trinity or upcoming Richland APU would be the solution for you (and you don't have to wait, you can get one now). The problem is that some OEMs don't implement some of those premium features into AMD APU-based laptops whatsoever. We can count the number of 1080p/IPS AMD APU-based laptops on one hand. That's just sad.

    However, Intel still owns the CPU title, can't deny that. An AMD APU won't beat an i7 anything, they barely keep up with an i3/i5 at best in terms of CPU performance so that would have to be something you can be comfortable with.

    Can't wait to see some head to head benchmark and real-world comparison between Haswell and Richland APUs.
     
  10. HopelesslyFaithful

    HopelesslyFaithful Notebook Virtuoso

    Reputations:
    1,552
    Messages:
    3,271
    Likes Received:
    164
    Trophy Points:
    0
    Yeah, but I bet AMD has an even worse time with the TDP limit... unless AMD does not use a TDP limit. There is no way a 32nm CPU with an iGPU can keep up with a 22nm CPU with an iGPU if they both have a 35W TDP. I would love to see a test run showing the power draw and performance of an AMD chip's CPU and iGPU vs an Intel option. With a 35W TDP, I would put my money on the 22nm option.
     
  11. Loney111111

    Loney111111 Notebook Deity

    Reputations:
    396
    Messages:
    828
    Likes Received:
    28
    Trophy Points:
    41
    In raw CPU performance, Intel takes the cake.

    In gaming, most likely the i5 (or maybe even the i7) is going to choke before the A10, unless the game is Starcraft 2, the Total War series, Planetside 2, or another CPU-heavy game, or the manufacturer hobbles the A10 with a single-channel 1333 MHz RAM stick.

    I'm looking forward to the reviews, as both Haswell and Richland have additional power saving features over their predecessors.
     
  12. davidricardo86

    davidricardo86 Notebook Deity

    Reputations:
    2,376
    Messages:
    1,774
    Likes Received:
    109
    Trophy Points:
    81
    I would like to see that! Yeah, that's a tough one (for the 32nm APU), as we don't have the necessary information just yet. But 28nm Kaveri with GCN graphics should easily outpace GT3 HD 5200, even while "trailing" behind in process size. You will see, all while remaining at a 35W TDP.

    I'm putting my money on Richland (32nm) giving Haswell GT3 5200 a run for its money too. And if things work out how I hope they do, Richland will be a drop-in replacement for (socket-based) Trinity APUs.

    GPU Type: Radeon HD 8650G (A10-5750M, VLIW4)
    Shader cores: 384
    Base frequency: 533 MHz
    Maximum frequency: 720 MHz
    DDR3-1866


    Just something to think about I suppose if iGPU performance is of utmost concern.

    Edit: Take this with a grain of salt:
     
  13. Qing Dao

    Qing Dao Notebook Deity

    Reputations:
    1,600
    Messages:
    1,771
    Likes Received:
    304
    Trophy Points:
    101
    Wow, you actually think the engineers who designed these chips have ANYTHING to do with the screwy names given to them? I mean, come on, the engineers would come up with some wacky, confusing naming system that makes life harder for themselves, and then the marketing department would just throw their arms in the air because there was nothing they could do about it? Get a clue.

    The engineers designed one processor, Haswell. Then the marketing department came up with everything else, within the limits of what the processor could do: names, multipliers, socket, cache, GPU, etc.
     
  14. HopelesslyFaithful

    HopelesslyFaithful Notebook Virtuoso

    Reputations:
    1,552
    Messages:
    3,271
    Likes Received:
    164
    Trophy Points:
    0
    engineers that now work in the marketing department ^^
     
  15. Karamazovmm

    Karamazovmm Overthinking? Always!

    Reputations:
    2,365
    Messages:
    9,422
    Likes Received:
    200
    Trophy Points:
    231
    You are aware that for them it makes more sense than it does to us; it's been like that for ages.

    3635QM - that's worth a hanging for any marketing guy who came up with it. Actually, they usually don't put that in the product charts in brick and mortar stores, for example; they just put 2.4GHz or 3.4GHz. That's about all people can take in, in terms of PCs.

    If we are to assume that MH is just the BGA ones, then it was just the replacement for the 5 at the end of that thing. And the list goes on for the changes. I never could grasp the method behind the naming scheme of the ULV CPUs.

    Nomenclatures actually help them, mostly not us.

    What marketing did was Pentium, Celeron, i3, i5, i7.
     
  16. Qing Dao

    Qing Dao Notebook Deity

    Reputations:
    1,600
    Messages:
    1,771
    Likes Received:
    304
    Trophy Points:
    101
    But what about Sandy Bridge (32nm) and Ivy Bridge (22nm)? Or how about Northwood (130nm) vs Prescott (90nm)? A smaller die doesn't always mean less power consumption and less heat.
     
  17. R3d

    R3d Notebook Virtuoso

    Reputations:
    1,515
    Messages:
    2,382
    Likes Received:
    60
    Trophy Points:
    66
    Remember, temperature is not heat. Ivy Bridge consumes less power (and therefore puts out less heat).

    A smaller die/smaller transistors don't necessarily mean less power consumption/heat (especially when comparing different fabs and processes). Power consumption also depends on clock speed, voltage and architecture... but I highly doubt GloFo 32nm is going to have better power characteristics than Intel 22nm.
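    The dependence on clock speed and voltage mentioned above is usually summarized by the textbook CMOS dynamic-power relation, P ≈ C·V²·f. A minimal sketch with made-up numbers (not measurements of any real chip):

    ```python
    # Textbook CMOS switching-power approximation: P_dyn ≈ C * V^2 * f.
    # The capacitance/voltage figures below are illustrative only.
    def dynamic_power(cap_farads: float, volts: float, freq_hz: float) -> float:
        """Approximate dynamic (switching) power of a CMOS circuit in watts."""
        return cap_farads * volts ** 2 * freq_hz

    p_small_node = dynamic_power(1.0e-9, 1.1, 2.5e9)  # smaller node: lower C and V
    p_large_node = dynamic_power(1.5e-9, 1.2, 2.5e9)  # larger node, same clock
    print(p_small_node < p_large_node)  # node shrink helps, but V and f matter just as much
    ```

    The quadratic voltage term is why a shrink alone does not guarantee lower power: a chip on a smaller node run at higher voltage or clocks can still draw more.
    
    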
     
  18. HopelesslyFaithful

    HopelesslyFaithful Notebook Virtuoso

    Reputations:
    1,552
    Messages:
    3,271
    Likes Received:
    164
    Trophy Points:
    0
    Yes it does... compare what a 32nm SB can do with what a 22nm IB can do within 35W of TDP... IB wins every time... therefore, lower nm wins.

    32nm AMD vs 22nm Intel, both with a 35W TDP window... 22nm wins. They may use the same amount of energy and produce the same amount of heat, but that does not mean they produce the same amount of work.
     
  19. Atom Ant

    Atom Ant Hello, here I go again

    Reputations:
    1,340
    Messages:
    1,497
    Likes Received:
    272
    Trophy Points:
    101
    Excited to see the new Haswell Ultrabook (17W) APUs and how much they can improve GPU performance in that TDP range. If they have made a big boost, Haswell is gonna be the no-doubt best APU for Ultrabooks, since the 17W Ivy Bridge is already faster on the CPU side than the quad core AMD A10-4655M (25W) and close in GPU performance (based on my test). I personally hope to see a quad core 25W Haswell with GT3 graphics... :cool:
     
  20. Mr. Wonderful

    Mr. Wonderful Notebook Evangelist

    Reputations:
    10
    Messages:
    449
    Likes Received:
    6
    Trophy Points:
    31
    I didn't even think about that. Intel has made so little progress in CPU performance the last few years that AMD might actually be able to catch up in performance.
     
  21. Karamazovmm

    Karamazovmm Overthinking? Always!

    Reputations:
    2,365
    Messages:
    9,422
    Likes Received:
    200
    Trophy Points:
    231
    It's not like Piledriver has been stellar either. Wait, I can be less powerful than my predecessor? Count me in! And the improvements in that camp have been around the same as Intel's, just 5-20% increases in performance.

    The most problematic thing was that AMD didn't make the kind of extreme jump on quad mobile CPUs that Intel made going from Clarksfield to Sandy Bridge.

    There is still hope when AMD abandons Piledriver.
     
  22. Link4

    Link4 Notebook Deity

    Reputations:
    551
    Messages:
    709
    Likes Received:
    168
    Trophy Points:
    56
    No, there is no need for AMD to abandon the modular architecture. Simply put, it would be stupid to do that, since it would cost them a lot of extra money on die size and all the thermals would go up.
    Steamroller should be a great improvement from what I've heard; some people claim it will be 30% faster than Piledriver. The whole front end will see a massive improvement, along with a few other things. And as for HSA, Kaveri will be the first to properly utilize it. If new applications start using it, there is no hope for Intel to keep up, and as far as the PS4 goes, that's the direction it's headed.
    And talking about Intel keeping up in the graphics segment, I don't see that happening anytime soon. Not just because it cannot keep up with the GCN GPU in Kaveri utilizing GDDR5 memory, but also because the usage of GT3 graphics will be very limited. Even worse, there are only 2 quad core mobile Haswell chips with a 47W TDP that will use the decent HD 5200 graphics (the ones with the L4 cache in them). You won't get that cache in anything else, as ULV chips will be limited to cache-less HD 5000/5100, and the 37W chips most likely won't have it anytime soon.
    GT3 is limited to BGA (there go Intel's schemes of standardizing BGA and trying to get rid of sockets).
    I don't see anyone adopting GT3 HD 5200, because it's a high end 47W chip that's most likely going to end up in workstation/gaming laptops with discrete GPUs. Even worse, those BGA chips are clocked lower (400MHz) than similarly priced 47W i7 parts, and them being BGA is BAD in a gaming system, as the OEMs send their gaming laptops to companies such as XoticPC who need to be able to customize them. Here is all the info about GT3 so far: Intel to launch Core i7-4850HQ and i7-4950HQ CPUs in Q3 2013.
    So Intel will be unable to compete in graphics except in ULV chips.
    And as for Kaveri, there are some indications that it will come with up to 6 cores (hopefully in mobile too). And Kaveri will use TSMC 28nm, I believe, which is well optimized for power consumption (all mobile SoCs use it), so Intel ends up having one less advantage.
     
  23. Deks

    Deks Notebook Prophet

    Reputations:
    1,272
    Messages:
    5,201
    Likes Received:
    2,073
    Trophy Points:
    331
    AMD's problem with the present generation of multi-core CPUs is that 2 cores share 1 FPU... which basically makes an AMD A10 quad core more equivalent to an Intel i5 dual core (more or less).
    Intel still beats AMD in clock-for-clock performance. However, if AMD fixes the shared-FPU issue and shortens the pipeline (something Kaveri/Steamroller is due to address), it 'should' be able to come relatively close to Intel in both single and multi-core performance, if not surpass it. This is based on what AMD itself stated and observed during their own testing; of course, we have yet to see what the actual real-world results will say when they release these chips. That, and moving most computations to the GPU via HSA, which should make programming easier. At any rate, we shall see, as we cannot make any conclusive statements yet.
     
  24. Megol

    Megol Notebook Evangelist

    Reputations:
    114
    Messages:
    579
    Likes Received:
    80
    Trophy Points:
    41
    I respectfully disagree. The problem isn't the shared FPU - even most FP intensive tasks have a large integer component, and (if recompiled) the FPMADD support can increase performance up to 2x. The problem is that some design choices only made sense if the clock frequency was higher; the small write-through L1 data cache is one example. Another limitation was the decoder design, which can decode at most two instructions per integer core per clock and is susceptible to stalls.
    Steamroller will double the decode throughput, be less sensitive to stalls, and improve the cache paths, among other improvements. It's what BD should have been.
     
  25. Karamazovmm

    Karamazovmm Overthinking? Always!

    Reputations:
    2,365
    Messages:
    9,422
    Likes Received:
    200
    Trophy Points:
    231
    Oh wait, I said abandon Piledriver; your arguments are based on Steamroller, which is incredibly different from Piledriver. Again, what was the problem?

    Ah yes: pipeline too long, not enough instructions per cycle, and a slow cycle. So Steamroller fixes those problems? Yes. Is it different from Piledriver? Yes. I don't see the argument you guys are trying to raise against me for saying that they should abandon Piledriver.
     
  26. HopelesslyFaithful

    HopelesslyFaithful Notebook Virtuoso

    Reputations:
    1,552
    Messages:
    3,271
    Likes Received:
    164
    Trophy Points:
    0
    Whoever was spreading that rumor of Intel dropping sockets is either a fool or some punk trying to stir up trouble. Intel would be absolutely stupid to drop sockets, on so many levels. There is no reason to ever get rid of sockets... you cannot even give one good reason to do it. There will always be PGA and BGA no matter what... and for good reasons.
     
  27. Loney111111

    Loney111111 Notebook Deity

    Reputations:
    396
    Messages:
    828
    Likes Received:
    28
    Trophy Points:
    41
    Intel stated that they'll keep LGA for high end laptops and desktops. Ultrabooks, low-end laptops, and low-end desktops will most likely get the BGA treatment, because BGA CPUs are cheaper to manufacture than LGA.
     
  28. Link4

    Link4 Notebook Deity

    Reputations:
    551
    Messages:
    709
    Likes Received:
    168
    Trophy Points:
    56
    I am not saying that they are going to get rid of sockets, but they were thinking about it, and it caused a lot of criticism, so they changed their minds. As far as desktops go, I don't think they will get rid of sockets anytime soon, but in the mobile space it doesn't seem that way from their actions. The fact that only BGA chips include GT3 points to them trying to squeeze PGA out of the market slowly, so as to not make it seem like they did it on purpose (to reduce costs, make motherboards even cheaper and get their CPUs into machines at lower price points, or to sell their overpriced chips at higher prices and still end up at the same price point for the overall system; all of this spells greed to me). But that's exactly what they are trying to achieve, and that's bad news for enthusiasts who like upgrading their chips to get better performance/features. Now I can only hope that OEMs don't give in to this crap and their selfish plots fail.
     
  29. HopelesslyFaithful

    HopelesslyFaithful Notebook Virtuoso

    Reputations:
    1,552
    Messages:
    3,271
    Likes Received:
    164
    Trophy Points:
    0
    I get the $400 throwaway laptops, but anything else is pretty stupid, and all those systems already have BGA... I can't think of any cheap systems that use PGA. To me it seems it is business as usual... nothing has changed.
     
  30. HTWingNut

    HTWingNut Potato

    Reputations:
    21,580
    Messages:
    35,370
    Likes Received:
    9,877
    Trophy Points:
    931
  31. HopelesslyFaithful

    HopelesslyFaithful Notebook Virtuoso

    Reputations:
    1,552
    Messages:
    3,271
    Likes Received:
    164
    Trophy Points:
    0
    Still not buying it... it could be an incomplete roadmap or a fake... I can't really see any reason to do such a stupid thing. I can't believe Intel would be that short-sighted and dumb... they would pretty much be asking AMD to steal a large market share... I would probably dump Intel if that happened, out of sheer principle.

    EDIT: BTW, I did some reading on mPGA, rPGA, PGA, LGA and BGA, and I just noticed... I have never had an LGA computer, lol. I didn't realize LGA was the opposite of PGA, haha.
     
  32. Loney111111

    Loney111111 Notebook Deity

    Reputations:
    396
    Messages:
    828
    Likes Received:
    28
    Trophy Points:
    41
    Intel is already switching socket types every tick-tock and has used thermal paste instead of solder for the Ivy Bridge CPUs (thus giving them worse thermal performance).

    Why?

    Because it's cheaper, and they can get away with it because AMD can't compete in a desktop/laptop CPU war.


    Why improve your products if your main competitor can't match them?
     
  33. HopelesslyFaithful

    HopelesslyFaithful Notebook Virtuoso

    Reputations:
    1,552
    Messages:
    3,271
    Likes Received:
    164
    Trophy Points:
    0
    We have not seen Haswell yet, so we don't know if Intel is maintaining that dumb decision. They have received a lot of flak, and it would surprise me if they continued with it. Also note that they could not get away with paste on the Haswell chips. Look at Intel's history: when a die shrink happens, the stock power draw drops. It went from 97W to 77W, if I remember correctly. They could not get away with cheap paste on a ~100W chip. Well, they could, but I doubt people would buy Intel and support them if they did.
     
  34. Cloudfire

    Cloudfire (Really odd person)

    Reputations:
    7,279
    Messages:
    10,304
    Likes Received:
    2,878
    Trophy Points:
    581
    Well, we notebook users don't have anything to worry about, Loney. ;)

    Mobile CPU = die exposed directly to the heatsink
    Desktop CPU = over the die there is a heat spreader (IHS), and on top of that you put the heatsink.

    The reason desktop people were mad about Ivy Bridge is that Intel put a not-so-great paste between the die and the heat spreader. It wasn't as effective at transferring heat from the die to the heat spreader as what Intel used previously on earlier CPUs (actually something called fluxless solder).

    Desktop CPU


    Mobile CPU
     