The Notebook Review forums were hosted by TechTarget, which shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information that had been posted on the forums was preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.

    Haswell GT3e to crush Nvidia and AMD low-end market GPUs?

    Discussion in 'Gaming (Software and Graphics Cards)' started by fantabulicius, Apr 11, 2013.

  1. fantabulicius

    fantabulicius Notebook Consultant

    Reputations:
    82
    Messages:
    226
    Likes Received:
    0
    Trophy Points:
    30
    So what do you think?

    AnandTech | Haswell GT3e Pictured, Coming to Desktops (R-SKU) & Notebooks

    If this holds up and quad cores come with it, we could see a huge advantage for Intel versus the 650M/7850M and lower-tier GPUs, for notebooks ranging from $600-1200 (I guess).
    What seals the deal for me is the possible $200 or so price reduction from not having a dedicated old-style GPU.

    Competition is always a good thing; let's see how it turns out.
     
  2. Deks

    Deks Notebook Prophet

    Reputations:
    1,272
    Messages:
    5,201
    Likes Received:
    2,073
    Trophy Points:
    331
    I personally wouldn't know exactly, but I doubt that Haswell can surpass AMD in iGPU performance just yet (let alone the GT 650M).
    Intel's Ivy Bridge iGPU performs worse than AMD Trinity, and preliminary reports indicated a marginal increase in Haswell iGPU performance, which would still keep it under Trinity or make it its equal at best.

    I recommend a wait and see strategy... while in the meantime taking everything Intel says with a pinch of salt.

    The 650M is about 40%-50% better than the AMD 7670M... while AMD's Richland A10 mobile APU was said to be equal in graphical performance to the 7670M.

    If the above reports are accurate, Intel will in no way threaten AMD's iGPU performance (yet), let alone the mid-range mobile segment of AMD and Nvidia.
     
  3. Qing Dao

    Qing Dao Notebook Deity

    Reputations:
    1,600
    Messages:
    1,771
    Likes Received:
    305
    Trophy Points:
    101
    Here is one problem with your theory. GT3e is only available on the top-end, BGA Haswell processors. It is a high-end part. It isn't competing with any AMD iGPU or low-end Nvidia GPU. This is just for Intel's "ultrabook" initiative: putting a more powerful CPU and GPU in a thin and light package and charging an arm and a leg for it. These GT3e GPUs are going to be a staple of thin and light laptops in the $2000 class.
     
  4. Mobius 1

    Mobius 1 Notebook Nobel Laureate

    Reputations:
    3,447
    Messages:
    9,069
    Likes Received:
    6,376
    Trophy Points:
    681
    No software-based overclock? Ha, right.
     
  5. Karamazovmm

    Karamazovmm Overthinking? Always!

    Reputations:
    2,365
    Messages:
    9,422
    Likes Received:
    200
    Trophy Points:
    231
    I think it will smash the 28nm Fermi rebadges that we currently have; however, I don't expect it at all to win against anything higher than the 640M.

    The problem is where to find one, and how much money the OEMs are willing to throw at this new initiative. I have info that this will only be on 47W TDP quads; surely there will be 37W parts, but how many is the real problem.
     
  6. HTWingNut

    HTWingNut Potato

    Reputations:
    21,580
    Messages:
    35,370
    Likes Received:
    9,878
    Trophy Points:
    931
    If AMD's IGP isn't "crushing" the low end dedicated GPU market then GT3e surely won't either. What people keep forgetting is that there are many limitations with an IGP that even a low end GPU doesn't have to contend with. Dedicated GPUs have their own dedicated vRAM. They have their own processing power. An IGP has to rely on the system RAM amount and bandwidth. It has to share resources with the CPU. It's limited to the CPU's TDP. It's definitely a good thing, but all this "end of video cards" bunk is way, way overblown. No way a 17-25W TDP CPU with an on-die GPU will even come close to competing with a dedicated low end GPU with 25-30W TDP of its own.
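
    To put rough numbers on that bandwidth point (a quick sketch; the module speeds below are typical 2013-era assumptions, not figures from this thread):

    Code:
    # Rough peak-bandwidth comparison: shared system RAM (IGP) vs dedicated vRAM.
    # Module speeds are assumptions for typical 2013-era parts.
    def peak_gbs(bus_bits, mega_transfers_per_s):
        # bytes per transfer * transfers per second, in GB/s
        return bus_bits / 8 * mega_transfers_per_s / 1000

    igp_shared = peak_gbs(128, 1600)   # dual-channel DDR3-1600: ~25.6 GB/s, shared with the CPU
    dgpu_vram  = peak_gbs(128, 4000)   # 128-bit GDDR5 at 4 GT/s: ~64 GB/s, all for the GPU
    print(igp_shared, dgpu_vram)       # 25.6 64.0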
     
  7. Zymphad

    Zymphad Zymphad

    Reputations:
    2,321
    Messages:
    4,165
    Likes Received:
    355
    Trophy Points:
    151
    Umm, it seems the GT3e will destroy anything AMD has for an iGPU and probably make AMD's low to mid-range dGPUs obsolete. And I hope so; AMD's switchable Enduro tech is so bad, it needs to be killed. Bring back Larrabee and destroy AMD and Nvidia, please.

    http://www.youtube.com/watch?v=VPrAKm7WtRk
     
    Last edited by a moderator: May 6, 2015
  8. Cloudfire

    Cloudfire (Really odd person)

    Reputations:
    7,279
    Messages:
    10,304
    Likes Received:
    2,878
    Trophy Points:
    581
    That is very misleading. Actually, the CPUs with GT3 (HD 5200) and its own memory die are mostly going inside high end gaming notebooks. The GT3 you are referring to (HD 5100) will only be on 17W CPUs, called i7-xxxxU, and those will not have their own memory die. Those CPUs will mostly be found in ultrabooks. The CPUs with the GT3 the OP is talking about will be 47W, and they are called i7-4850HQ and i7-4950HQ.


    It does seem like Intel is stepping up their game toward GPUs. This is the first time they will introduce an on-package die with its own memory. And the memory isn't just any memory: it's eDRAM on a 512-bit bus, which offers the same bandwidth as GDDR5 on a 128-bit bus.

    GT2 sees about a 23% gain over Ivy Bridge (see the Tom's Hardware preview), 20 EUs vs 16 EUs, and that's without its own memory die. GT3 will have a whopping 40 EUs, double the amount of GT2, plus that memory with the bandwidth of GDDR5 on a 128-bit bus. I think it's going to be really fast. A great leap over Ivy Bridge. How much I don't know, but it will beat a big bunch of dedicated GPUs. Not the GT 650M though, but probably not so far away either :)
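
    A quick sanity check of that "512-bit eDRAM = 128-bit GDDR5" bandwidth claim (the transfer rates here are illustrative assumptions, not published specs):

    Code:
    # A bus 4x as wide needs only a quarter of the transfer rate for equal bandwidth.
    def gbs(bus_bits, mega_transfers_per_s):
        return bus_bits / 8 * mega_transfers_per_s / 1000  # GB/s

    assert gbs(512, 1000) == gbs(128, 4000) == 64.0  # assumed 1 GT/s eDRAM vs 4 GT/s GDDR5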
     
  9. HTWingNut

    HTWingNut Potato

    Reputations:
    21,580
    Messages:
    35,370
    Likes Received:
    9,878
    Trophy Points:
    931
    /shrug/ AMD showcased the same thing with their Llano APU a couple years ago with DiRT 2.
     
  10. Karamazovmm

    Karamazovmm Overthinking? Always!

    Reputations:
    2,365
    Messages:
    9,422
    Likes Received:
    200
    Trophy Points:
    231
    The problem with AMD is the low availability of notebooks with their CPUs. They indeed could have killed the low range; they didn't do it because few models can be found.
     
  11. Qing Dao

    Qing Dao Notebook Deity

    Reputations:
    1,600
    Messages:
    1,771
    Likes Received:
    305
    Trophy Points:
    101
    I wasn't referring to the HD 5100. I don't know what is wrong with Intel these days, but they say a lot of mumbo-jumbo about laptops. The GT3e is going into expensive computers and making them a little thinner and lighter, since they can dispense with their mid-range discrete GPU. Intel can call these "high end gaming notebooks" until they are blue in the face, but we all know that is just a bunch of hot air. I was just pointing out that the GT3e is going on Intel's most expensive chips and as such is in no way competing with any AMD iGPU, or with most implementations of cheaper Nvidia GPUs, since it will be entirely absent in the $600-1200 price range the OP was talking about.
     
  12. HTWingNut

    HTWingNut Potato

    Reputations:
    21,580
    Messages:
    35,370
    Likes Received:
    9,878
    Trophy Points:
    931
    I just did the same thing they did, with the A10-4600M 7660G IGP (HP, on left) vs my GT 650M (Sager, on right), at 1080p high settings (well, 1200p, since the 650M's monitor wouldn't do 1080p), and my 650M is overclocked too.

    http://www.youtube.com/watch?v=kUn-TiWsclk

    It doesn't really prove anything other than that both can run it. They didn't show end results, so I left mine out because they left theirs out. What do you think the end FPS result was for both?
     
    Last edited by a moderator: May 6, 2015
  13. Deks

    Deks Notebook Prophet

    Reputations:
    1,272
    Messages:
    5,201
    Likes Received:
    2,073
    Trophy Points:
    331
    Well said, HTWingNut.
    I was just about to comment on the very obvious lack of detailed FPS information in that Haswell vs 650M video.

    The point is... both can run the game smoothly at those settings, and the visual evidence confirms that... however, we have no idea what the actual FPS numbers were for either setup.

    And as you so nicely showed, the A10-4600M runs the game smoothly as well (and Llano had a similar demonstration a couple of years ago).

    However... this could also serve to demonstrate a point: regardless of whether the Haswell IGP turns out slower than AMD Trinity or Richland, it's just as 'viable' as AMD on the IGP front for light to mid-range gaming at 'high' settings. (And let's face it, virtually no game offers that much of a visual advantage past 'High' settings, because beyond that point you end up with a few minor effects you may or may not notice, which tend to impact performance severely while offering little to nothing in return - for now at least.)

    Of course, the price/performance factor is still something to consider, because Intel systems remain rather costly compared to AMD offerings... and the problem with AMD is that their offerings were/are next to non-existent in the mobile market segment (especially in the mid-range of dedicated mobile GPUs, which have appeared in maybe one or two obscure laptop offerings).

    Intel might be stepping up their graphics game, but keep in mind that AMD Kaveri (which seems to offer similar design 'enhancements' to what Intel is doing) will be released in a similar time frame to Haswell, and if what AMD says about 'fixing' CPU performance holds true, then Kaveri will likely be a real viable alternative in the quad core segment (on CPU performance) with an IGP that could easily surpass Haswell's.
    It was mentioned, if I'm not mistaken, that Kaveri's APU will exceed 1 teraflop.
    Not sure how that compares to Intel, however.

    One also has to pose the question of Intel increasing the TDP on various CPUs.
    Wasn't there a chart of Intel Haswell CPUs released some time ago that showed a TDP increase for all CPUs (probably because of the IGP modifications)?
     
  14. Zymphad

    Zymphad Zymphad

    Reputations:
    2,321
    Messages:
    4,165
    Likes Received:
    355
    Trophy Points:
    151
    It doesn't seem to make sense to me for any OEM to go with an AMD CPU if Intel's offerings are better. And cost-wise, it's not that big of an issue.

    Thanks for that. I guess then it's just a testament to Codemasters that their DX11 engine can run so well on even the low end GPUs.
     
  15. aooga12

    aooga12 Notebook Consultant

    Reputations:
    0
    Messages:
    147
    Likes Received:
    0
    Trophy Points:
    30
    Do you think that these new integrated graphics solutions will produce much more heat, since it's such a leap over Ivy? I was going to wait and get a 13" MBP with Haswell, but if that thing is going to heat up like a mf, I'll look elsewhere.
     
  16. HTWingNut

    HTWingNut Potato

    Reputations:
    21,580
    Messages:
    35,370
    Likes Received:
    9,878
    Trophy Points:
    931
    Probably only when running games with the IGP. Otherwise the IGP will be running at lowest clocks.
     
  17. aooga12

    aooga12 Notebook Consultant

    Reputations:
    0
    Messages:
    147
    Likes Received:
    0
    Trophy Points:
    30
    Okay, with that being said, how well do games like LoL run on Intel HD 4000? At, say, 720-800p?
     
  18. HTWingNut

    HTWingNut Potato

    Reputations:
    21,580
    Messages:
    35,370
    Likes Received:
    9,878
    Trophy Points:
    931
    LoL will run on pretty much anything. HD 4000 should be able to play at close to max settings at 720p.

    Don't get me wrong, I think it's great that Intel is stepping up their IGP game; it's just that the expectations are a little unrealistic, I think. The physics of it don't make much sense from a thermal, power, and performance standpoint. Also, demonstrations like this are used a lot to make people think a product is at least as good as another, when in reality the actual numbers talk.

    Many people, though, just want games to work. Lots of younger people who can't afford a $1500 gaming laptop are happy to be able to play games at 30fps, even at low settings. I can fully appreciate that. And I only expect the same if I am using an IGP. As long as it works, many people are happy. But there's always the newest game that pushes the limits, like Crysis or Battlefield, and makes that IGP feel quite inadequate.
     
  19. Cloudfire

    Cloudfire (Really odd person)

    Reputations:
    7,279
    Messages:
    10,304
    Likes Received:
    2,878
    Trophy Points:
    581
    That is something that remains to be seen. Since the die is the same size as Ivy Bridge's, the difference will be pretty small, I think. The TDP of the two CPUs with GT3, the 4850HQ and 4950HQ, is 47W; Ivy Bridge is 45W. The extra 2W is most likely from the integrated voltage regulator that is now within the CPU die, which used to be on the motherboard. So all in all, I think they will be just about equal in heat.

    Well, if Intel is right that the two CPUs with GT3 only go inside gaming computers, yeah, that won't be a $600 price, that's for sure. I know we have a few notebooks with the i7-3610QM without a dGPU, and you can find those for around $700 today. So who knows, a notebook with GT3 and without a dedicated GPU might go for around $1000. Guess we will have to wait and see.
     
  20. Althernai

    Althernai Notebook Virtuoso

    Reputations:
    919
    Messages:
    2,233
    Likes Received:
    98
    Trophy Points:
    66
    I disagree. Intel effectively provides the "sea level" of the GPU market, since it is both the largest supplier of laptop GPUs and the one with the weakest GPUs. The reason AMD isn't crushing the low end dedicated GPU market is that most of the laptops out there use Intel CPUs and thus still benefit from low end discrete cards. Unfortunately for AMD and Nvidia, Intel has been improving their GPUs much faster than the low end discrete stuff has been improved. This is not that difficult a task given that "new" low end cards are often renamed GPUs from the previous generation. I do agree with you about the potential of new Intel GPUs being somewhat overblown (equal to the 650M? I'll believe it when I see benchmarks), but the impact on low end cards should not be underestimated.

    Isn't Haswell's main new feature that it will have its own memory on the high-end versions? And IGPs have always had their own processing power.
     
  21. Tuxberg

    Tuxberg Notebook Geek

    Reputations:
    17
    Messages:
    96
    Likes Received:
    2
    Trophy Points:
    16
    Based on the information given, I think this conjecture is premature (at best). Intel hasn't exactly improved their integrated solutions quickly, and relative to the demands of the consumers whose needs aren't met by current iGPUs (photo editors, video editors, 3D designers, high-end gamers), they still have an extremely long way to go. Everyone else can already buy an A6-10 or Ivy Bridge with HD 4000 and have their needs met.

    Push come to shove, I think that whatever momentum anyone has in this market, it is not enough to disrupt the current balance of the oligopoly. Most consumers will still be content without a discrete GPU, and most who aren't will still be completely underwhelmed by the offering. I wouldn't count on anyone's dream ultrabook being powerful enough to push a modern game to ultra settings this season, or the next.
     
  22. Karamazovmm

    Karamazovmm Overthinking? Always!

    Reputations:
    2,365
    Messages:
    9,422
    Likes Received:
    200
    Trophy Points:
    231
    Nobody is arguing that.

    Both CPU makers have made long strides in the iGPU market, and you can clearly see that mainstream needs to catch up with high end (the gap is really too great), and that, unless investments are made (are they worth it?), entry level will probably eventually disappear.

    The problem here is one that AMD similarly faces: availability of the CPUs that come with this more powerful GPU. AMD doesn't offer the A10 where it should be offering it: form factors smaller than 15" and with good build quality. The price of these GT3e cores is going to be expensive, and that is probably where the idea would fail.

    All in all, casual gamers are eventually going to be deterred from buying entry level GPUs and will just be happy with their iGPU.
     
  23. Zymphad

    Zymphad Zymphad

    Reputations:
    2,321
    Messages:
    4,165
    Likes Received:
    355
    Trophy Points:
    151
    Well, we all know the reason why Intel's IGP efforts will be a different experience than AMD's. No one cares about AMD now outside of desktops and consoles. Phones, tablets, laptops, etc. are all Intel, ARM and Nvidia.

    And for laptops, I see no reason why anyone should go with an AMD solution. For GPU or CPU right now, none at all. Not even for the cost benefit, because you pay so dearly for it in other ways.

    If notebooks already have the high end IGP, then I see no reason why anyone would want to go with a dedicated entry level GPU; it doesn't make any sense to me. In that regard, I think Intel will just continue to take over the GPU market for notebooks and computers. It seems that for the average person, this Intel graphics will be more than enough for indie games, Facebook games, and low/medium detail on the more graphically sophisticated games.
     
  24. Cloudfire

    Cloudfire (Really odd person)

    Reputations:
    7,279
    Messages:
    10,304
    Likes Received:
    2,878
    Trophy Points:
    581
    So I'm curious: we have all seen that GT3 Haswell can play Dirt 3 alongside the GT 650M. They didn't say a word about FPS numbers for either GPU, but it looks like both were playable.

    Anyone want to guess which GPU GT3 will compete against? GT 540M? GT 555M? GT 650M? :)
     
  25. Kallogan

    Kallogan Notebook Deity

    Reputations:
    596
    Messages:
    1,096
    Likes Received:
    19
    Trophy Points:
    56
    I'd go for P1600-1700 in 3DMark 11 for the GT3e ;) So near GT 640M perf.
     
  26. HTWingNut

    HTWingNut Potato

    Reputations:
    21,580
    Messages:
    35,370
    Likes Received:
    9,878
    Trophy Points:
    931
    Based on what measure? If you find a penny and turn it into five cents you've quintupled your money, but it's still only five cents. The point is that when you're at the bottom you have nowhere to go but up. It's easier to improve on weak performance than when you're at the top end. And while Intel has improved their CPUs only a meager 10-15% year over year, dedicated GPUs have done more than that: every two years they roughly double their performance.

    I guess it all depends on how the games are coded, too. If they are coded so low detail can achieve 30+ FPS, then does it really matter any more? I know it's a good thing, but to say it's going to compete with mid-range GPUs is bunk, unless mid-range GPUs end up being in the 15W TDP range themselves, at which point they could effectively be embedded in the chip. There's no way to fit the performance of a 35-40W TDP discrete card into a single CPU running at 45W.

    No they haven't. They share TDP, RAM, cache, etc. Just because they have shaders doesn't mean the IGP is doing all the work. Even a dedicated card relies on the CPU, but there the CPU's TDP and power are fully available to support the GPU, unlike an IGP that has to share power and thermal headroom as well as RAM and other bandwidth.
     
  27. IntelUser

    IntelUser Notebook Deity

    Reputations:
    364
    Messages:
    1,642
    Likes Received:
    75
    Trophy Points:
    66
    It's probably important to note that the HD 5100 branding only applies when the ULT CPU is running in 25W cTDP-up mode; at 15W it's called HD 5000. So there's a noticeable difference there, considering it gets a different branding altogether.
     
  28. IntelUser

    IntelUser Notebook Deity

    Reputations:
    364
    Messages:
    1,642
    Likes Received:
    75
    Trophy Points:
    66
    It definitely will run hotter, because a quad core Ivy Bridge chip with the iGPU running games only uses 35W or so. The GT3 graphics, if it's running at a constant 1.3GHz, will probably push that up to TDP levels, which is 47W.
     
  29. Althernai

    Althernai Notebook Virtuoso

    Reputations:
    919
    Messages:
    2,233
    Likes Received:
    98
    Trophy Points:
    66
    The measure I would use is being able to play a large fraction of recently released games without them looking terrible. That is, it doesn't need to use all of the bells and whistles located way past the point of diminishing returns, but the game does need to run at a reasonable frame rate without turning the settings down to the point where it looks like it was made 10 years ago. By that measure, Intel has been catching up. And yes, it doesn't really help the discrete GPUs that so few games are made with the high end as the primary target (and even fewer of them are worth playing...).

    They've always had shaders (which is what I meant) and, starting with Ivy Bridge, they also have their own cache.
    Starting with Haswell, they get their own memory as well, so it's basically a miniature GPU on the CPU die. It does share TDP with the CPU, but this is not so different from laptops with discrete cards (if your cooling can't handle both, one or the other has to throttle).
     
  30. Quagmire LXIX

    Quagmire LXIX Have Laptop, Will Travel!

    Reputations:
    1,368
    Messages:
    1,085
    Likes Received:
    45
    Trophy Points:
    66
    And that does sum up Nut's point. I was always amazed at how fluid Dirt 3 felt even in the low 20s FPS. Its performance is misleading compared to every other popular game genre.
     
  31. fantabulicius

    fantabulicius Notebook Consultant

    Reputations:
    82
    Messages:
    226
    Likes Received:
    0
    Trophy Points:
    30
    I have been doing some napkin math and came to the conclusion that GT3 (no eDRAM) will be between the 720M and the 730M.
    The GT3e version will be better than the 730M; I don't know by how much.
     
  32. HTWingNut

    HTWingNut Potato

    Reputations:
    21,580
    Messages:
    35,370
    Likes Received:
    9,878
    Trophy Points:
    931
    And how did you calculate that with your "napkin math"?

    There's a huge difference between the 630M and 640M, which is what the 720M and 730M will be, respectively. Basically because the 630M has 96 shaders and the 640M has four times that, with 384. The AMD 7660G (the IGP in the A10-4600M) is about on par with the 630M-635M. I'll be glad to do a head to head of AMD Trinity (even Richland, if socket compatible with my HP) vs Haswell GT3 when I get my new Haswell laptop with a quad i7. I still think AMD's current gen will defeat Intel's Haswell IGP.
     
  33. aooga12

    aooga12 Notebook Consultant

    Reputations:
    0
    Messages:
    147
    Likes Received:
    0
    Trophy Points:
    30
    I wonder which one Apple will put in their Air and Pro models.
     
  34. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,431
    Messages:
    58,194
    Likes Received:
    17,902
    Trophy Points:
    931
    As stated earlier, integrated GPUs have a performance ceiling where adding more shaders has little impact. You can see this with Trinity: the memory has to be overclocked to get more gains from core overclocking.

    Intel is trying to band-aid this problem with faster memory controllers (we could see systems shipping with 2133MHz memory) and by putting this small amount of caching memory on the package.

    But it is only a small cache, so don't expect wonders from it. Graphics cards want a decent amount of room, so as you increase the resolution on this, performance is going to TANK.
     
  35. Karamazovmm

    Karamazovmm Overthinking? Always!

    Reputations:
    2,365
    Messages:
    9,422
    Likes Received:
    200
    Trophy Points:
    231
    It's going to tank as all entry level GPUs would: cores too few, bandwidth strained, clocks too slow; you know the drill better than I do.
     
  36. nipsen

    nipsen Notebook Ditty

    Reputations:
    694
    Messages:
    1,686
    Likes Received:
    131
    Trophy Points:
    81
  37. Karamazovmm

    Karamazovmm Overthinking? Always!

    Reputations:
    2,365
    Messages:
    9,422
    Likes Received:
    200
    Trophy Points:
    231
  38. Loney111111

    Loney111111 Notebook Deity

    Reputations:
    396
    Messages:
    828
    Likes Received:
    28
    Trophy Points:
    41
    I'm going to wait for benchmark results.


    Hypes in the past:

    Rambus: Too expensive for its performance, then was outclassed by later DDR versions.

    Itanium: First version was a joke, second version fixed some of the problems, third version never acquired the needed marketshare.

    Pentium 4: The Willamettes (1.3 and 1.4 GHz) were sub-par; some P3 models outclassed them. It took HT, a larger cache, and a much higher clock rate to be able to somewhat compete against AMD's offerings.

    Pentium D: The glued-together P4 cores had to communicate outside of the CPU die. That resulted in a major performance hit.

    Early Phenom versions: Bug in the translation lookaside buffer. Lockups/data-corruption or 10% hit in performance.

    Early SLI and Crossfire: You thought your 680 or 7970 CF was bad? Try the late 2000's versions.

    Fermi: Nvidia assured everyone it would be groundbreaking in one of their press releases. It turned out to be a mini-oven.

    Bulldozer: 'Nuff said.

    Ivy Bridge HD Graphics: A demo of it turned out to be merely a video recording. Good luck running a DX 11 game on it.

    North Korea and its threats of attacking US mainland: Lolololololol
     
  39. BangBangPlay

    BangBangPlay Notebook Consultant

    Reputations:
    199
    Messages:
    241
    Likes Received:
    0
    Trophy Points:
    30
    Intel has been promising that their processors will eventually trump dGPUs, but I think it is all relative. As hardware becomes more advanced, the game engines will only evolve with it, so the need for a dGPU will always be there if you want the best performance. I am not saying that it won't surpass lower end GPUs, but it will likely never make the dGPU extinct. Maybe in smaller devices and ultrabooks, but not desktops and high end laptops.

    Surely Intel would like to take over that share of the market, but not quite yet.
     
  40. Loney111111

    Loney111111 Notebook Deity

    Reputations:
    396
    Messages:
    828
    Likes Received:
    28
    Trophy Points:
    41

    With the PS4 and Xbox 720 coming out in less than a year, future games are inevitably going to be much more taxing, given the new, higher performing hardware.

    Initially the games are going to be unoptimized, but give it 2-4 years and all quality games are going to come with native 8-thread and HSA support, designed for a GPU that is between a 7850 and a 7870.

    In short, today's low to midrange GPUs are going to become insufficient very rapidly, though GCN/HSA-based APUs are going to last somewhat longer, since the new consoles will have similar APUs (except for the Jaguar CPU), and thus more of the console optimizations will also carry over natively.

    Higher end GPUs are also going to take a beating until PCs have enough raw computational power to overcome any optimizations for the PS4 and Xbox 720.
     
  41. Karamazovmm

    Karamazovmm Overthinking? Always!

    Reputations:
    2,365
    Messages:
    9,422
    Likes Received:
    200
    Trophy Points:
    231
    I just did some creative math; those numbers are from 3DMark11, P setting:

    HD 4000 = 700-800

    HD 4600 = HD 4000 + 20% = 840-960

    HD 5100 = HD 4600 + 50% = 1260-1440

    Thus the HD 5200 for me is around P1600, i.e. HD 5100 + 27% (from the low end) or + 11% (from the high end), and that is still a whole lot for just 128MB of a very wide and fast eDRAM.
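
    (A quick script reproducing that creative math; the +20% and +50% scaling factors are this post's assumptions, not measured results:)

    Code:
    # 3DMark11 P-score napkin math from the post above. All factors are assumptions.
    hd4000 = (700, 800)                               # reported HD 4000 range
    hd4600 = tuple(round(s * 1.20) for s in hd4000)   # +20% -> (840, 960)
    hd5100 = tuple(round(s * 1.50) for s in hd4600)   # +50% -> (1260, 1440)
    # A P1600 guess for HD 5200 implies this uplift over HD 5100:
    uplift = tuple(round((1600 / s - 1) * 100) for s in hd5100)
    print(hd4600, hd5100, uplift)                     # (840, 960) (1260, 1440) (27, 11)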
     
  42. BangBangPlay

    BangBangPlay Notebook Consultant

    Reputations:
    199
    Messages:
    241
    Likes Received:
    0
    Trophy Points:
    30
    I will admit that Intel's graphics offerings have slowly crept up the benchmark lists every year, but that doesn't mean they will eventually make dGPUs extinct. Maybe the lower end ones, but even that is questionable. Intel's HD 5200 (the highest of their offerings, with 40 EUs) is said to be 2x as powerful as Intel HD 4000 graphics. That type of performance may satisfy the enthusiast or average family, but it won't cut it for high end anything, especially as games and software continue to become more complex; it is all relative. Despite the HD 5200 being predicted to be as powerful as a GT 445M, I am willing to bet that in real world processing it won't perform as well as a laptop that has a Haswell processor and a similar low end GPU. It may be cheaper for the consumer, and maybe a bit more power efficient, but it won't stack up against the GT 720M and GT 735M just yet.

    I can't see Intel competing for high end gaming systems any time soon, but they may have an impact on the market. Intel will try to convince some brand name manufacturers to ditch the dedicated GPU on their mid range systems, and they could be successful. They will promise cheaper builds (maybe?), lower power consumption, more internal space, and more efficient cooling, and surely some manufacturers will go for that, especially for ultrabooks and tablet hybrids. But the Intel HD 5200 simply won't be able to compete with the upper range GPUs, and it won't sway gamers and users who need graphics power. I think that many manufacturers will recognize this fact and will still offer models with dGPUs, or customization options at least. I read that computer gaming has eclipsed the console market, so there are plenty of gamers out there who will always need faster computers. The average gamer isn't going to spring for Intel graphics (at least not yet) and will demand a computer with a dedicated GPU. The one issue I foresee is that if Intel manages to cut into the mid range market and make some of the lower end GPUs obsolete, this could force Nvidia to cut down its offering and possibly charge higher premiums on dedicated GPUs.

    If it hasn't already, this possibility has to have lit a fire under Nvidia's and AMD's asses. They won't get by year after year with rebrands and self-overclocking GPUs. They will have to follow suit and make the next evolutionary jump in graphics processing. Whether it be smaller, faster, lower power consumption, cooler, or all of the above, they both need to push the envelope and force games to the next level. This is the only way that Intel's slow but steady progression will be left in the dust. Intel may expand their grip on a small percent of the market, but there is no way they are going to dominate it any time soon. Before we buy the hype, let's see actual benchmarks of the new Intel graphics vs some of the new lower end GPUs (with Haswell CPUs) before we pass judgement. I can see Intel getting a firmer grip on the smaller ultrabook and tablet hybrid market with their graphics, but not gaming or rendering PCs any time soon...
     
  43. Loney111111

    Loney111111 Notebook Deity

    Reputations:
    396
    Messages:
    828
    Likes Received:
    28
    Trophy Points:
    41
    There are quite a few i5 and i7 laptops with Intel's GPU. Only some people need a mobile i5 or i7 but not a powerful GPU, and having dozens or even hundreds of such non-dedicated-GPU laptop models is a tad overkill.

    An i3 is more than sufficient for movies and web browsing...
     
  44. Karamazovmm

    Karamazovmm Overthinking? Always!

    Reputations:
    2,365
    Messages:
    9,422
    Likes Received:
    200
    Trophy Points:
    231
    Nobody is saying that iGPUs will replace dGPUs. We have several brackets of performance; iGPUs have theirs at the low end, and dGPUs should go for mainstream and high end.

    iGPUs are even more limited than their dedicated counterparts. It's simple: their TDP is much more constrained, and while we see AMD and Intel installing eDRAM to boost performance, system memory is too slow for GPUs and has very low bandwidth. Heat is the primary concern; these things are designed as a balance.

    For example, the 680M = P6500-6800 in 3DMark11; that's at least 3x as powerful as my creative math there. Hopefully with Maxwell, and whatever AMD is calling their GPUs, this will be met with 30-50% more power at the high end. The main problem I see here is that despite last year's good increase in mid range performance (it had stalled since the Fermi launch, with just a very slight bump from the 4600M), we are still miles and miles behind the high end. That gap must be filled with less power consuming GPUs and ones with better TDP. Another example: if you overclock the heck out of a 650M, you are still 200% behind the 680M.

    TL;DR: Nvidia and AMD need to get their game together and invest in the mainstream for more performance, to make a clear cut decision of always leaving iGPUs to extremely mobile devices and the low end. dGPUs won't disappear for now; who knows in 10 years or more?
     
  45. Fat Dragon

    Fat Dragon Just this guy, you know?

    Reputations:
    1,736
    Messages:
    2,110
    Likes Received:
    305
    Trophy Points:
    101
    AMD's Trinity APUs offered better graphics power a year ago than Haswell will whenever it's finally released. They also use better graphics drivers that allow them to use that power more effectively. Intel's more-powerful CPU performance is lost on most typical laptop users, while many of them would benefit from the better iGPU of a Trinity, Richland, or upcoming Kaveri processor. The problem is that most consumers are ignorant of this fact, so they'll buy a laptop with "Intel Inside" even if it is in every relevant way inferior to the competing (and cheaper) AMD equivalent, simply because they think the Intel processor makes it a superior machine.

    Unfortunately, if Intel starts moving into the tablet space more aggressively, the "Intel Inside" label will buy them a lot of customers in spite of the fact that their last foray in the field, Atom, was a complete debacle. And that means more money for Intel and less competitive power for their competitors who are producing superior products and selling them for less.
     
  46. HTWingNut

    HTWingNut Potato

    Reputations:
    21,580
    Messages:
    35,370
    Likes Received:
    9,878
    Trophy Points:
    931
    Good math. But anyone who has ever used an IGP for gaming knows that while the benchmarks may show its performance at level X, in gaming reality it doesn't match up. The bandwidth and TDP limitations rear their ugly heads and completely limit performance. A perfect example is the AMD A10-4600M with the 7660G. It has similar benchmark performance to an Nvidia 630M, but in system-taxing games like Battlefield 3, the 630M, with its dedicated vRAM and resources, can perform flawlessly at 720p, whereas the 7660G tends to stutter and drop FPS frequently, making it unplayable. Intel CPUs are the same, but at the moment you wouldn't even consider playing something like BF3 on HD 4000.
     
  47. Qing Dao

    Qing Dao Notebook Deity

    Reputations:
    1,600
    Messages:
    1,771
    Likes Received:
    305
    Trophy Points:
    101
    We have had powerful iGPs in the past, at the same relative performance that is offered by Intel's GT3e. This isn't a game changer by any stretch of the imagination.
     
  48. Mr.Koala

    Mr.Koala Notebook Virtuoso

    Reputations:
    568
    Messages:
    2,307
    Likes Received:
    566
    Trophy Points:
    131
    Would you please give a link? 1 TFLOPS is almost twice as high as the GT 650M, and more than half of the GTX 680M.

    It doesn't feel fluid at all, especially if you're actually playing it (and care about your times).
    Besides, those P -R people are smart enough to choose a snowy level. When pretty much everything flying at the camera is white, it's hard to tell whether it's smooth or not.

    This NBR forum system replaces "P -R" (without the hyphen) with "Google Page Ranking". Whisky tango foxtrot?! (The short version of the previous sentence is also censored.)
     
  49. BangBangPlay

    BangBangPlay Notebook Consultant

    Reputations:
    199
    Messages:
    241
    Likes Received:
    0
    Trophy Points:
    30
    I totally agree, and that is the extent of the effect it will have on the market. We may initially see more small notebooks with Intel graphics and no dGPU, but the natural progression of both hardware and software will only return things to business as usual. I was speaking hypothetically when considering whether it could cause a rise in the price of dedicated graphics, although I am sure that is their goal. Even cutting slightly into either manufacturer's market share could achieve this, although who knows if buyers or manufacturers will buy the hype.

    Although the market is driven by sales, it will be up to manufacturers what to put in their products. I guess we'll have to wait and see what kind of impact it has on the newest notebooks. I just don't see even the casual gamer compromising and going without a dedicated GPU. Maybe if consoles start using integrated graphics, then Nvidia and AMD should really start to worry, but until then it is only marketing hype.
     
  50. Karamazovmm

    Karamazovmm Overthinking? Always!

    Reputations:
    2,365
    Messages:
    9,422
    Likes Received:
    200
    Trophy Points:
    231
    Necessarily, for me, unless you put in at least 512MB, the performance variations in heavy moments will bring unwanted fluctuations. And frankly, 512MB is getting rather short for today's demanding games that use more than 2GB, but we do have the space concern; I think if they put in 512MB, the RAM would be pretty much the size of the entire CPU core.

    With the move to DDR4 in Broadwell, coupled with the redesign of the iGPU, we may see more interesting performance. After all, the base speed for DDR4 is 2133MHz, which is basically the speed of the fastest DDR3 for notebooks (and for desktops, where it starts to get insanely expensive). DDR3's initial speed was 800MHz, though we rarely saw it; let's consider 1066MHz.

    But I don't know if Haswell's performance matches the creative math I did there. I'm going to be very interested in how Kaveri and its successor show Intel how to make a more powerful iGPU in Broadwell. AMD needs to get some design wins this year.
     