So what do you think?
AnandTech | Haswell GT3e Pictured, Coming to Desktops (R-SKU) & Notebooks
If this holds up and quad cores come with it, we could see a huge advantage for Intel versus GPUs in the 650M/7850M class and below, for notebooks ranging from $600-1200 (I'm guessing).
What seals the deal for me is the potential $200 or so price reduction from not needing a dedicated old-style GPU.
Competition is always a good thing, let's see how it turns out.
-
-
I personally wouldn't know exactly, but I doubt that Haswell can surpass AMD in IGP performance just yet (let alone GT 650m).
Their Ivy Bridge IGP performs worse than AMD's Trinity, and preliminary reports indicated only a marginal increase in Haswell IGP performance, which would still keep it below Trinity, or make it its equal at best.
I recommend a wait and see strategy... while in the meantime taking everything Intel says with a pinch of salt.
The 650M is about 40-50% faster than the AMD 7670M, while AMD's Richland A10 mobile APU was said to offer graphical performance equal to the 7670M.
If the above reports are accurate, Intel will in no way threaten AMD's IGP performance (yet), let alone the mid-range mobile segment of AMD and Nvidia. -
Here is one problem with your theory. GT3e is only available on the top-end, BGA Haswell processors. It is a high-end part. It isn't competing with any AMD IGP or low-end Nvidia GPU. This is just for Intel's "ultrabook" initiative: putting a more powerful CPU and GPU in a thin and light package and charging an arm and a leg for it. These GT3e GPUs are going to be a staple of thin-and-light laptops in the $2000 class.
-
No software based overclock? Ha, right.
-
Karamazovmm Overthinking? Always!
The problem is where to find one, and how much money the OEMs are willing to throw at this new initiative. I have info that this will only be on 47W TDP quads; surely there will be 37W parts, but how many is the real question. -
If AMD's IGP isn't "crushing" the low-end dedicated GPU market, then GT3e surely won't either. What people keep forgetting is that an IGP has many limitations that even a low-end GPU doesn't have to contend with. Dedicated GPUs have their own dedicated vRAM. They have their own processing power. An IGP has to rely on the system RAM's capacity and bandwidth. It has to share resources with the CPU. It's limited to the CPU's TDP. It's definitely a good thing, but all this "end of video cards" bunk is way overblown. No way a 17-25W TDP CPU with an on-die GPU will even come close to competing with a dedicated low-end GPU with a 25-30W TDP of its own.
-
Umm, it seems the GT3e will destroy anything AMD has for an iGPU and probably make AMD's low-to-mid-range dGPUs obsolete. And I hope so; AMD's switchable Enduro tech is so bad, it needs to be killed. Bring back Larrabee and destroy AMD and Nvidia please.
<embed src="http://www.youtube.com/v/VPrAKm7WtRk?hl=en_US&version=3" type="application/x-shockwave-flash" allowscriptaccess="always" allowfullscreen="true" height="315" width="560"> -
It does seem like Intel is stepping up their game on GPUs. This is the first time they will introduce an on-die package with its own memory. And the memory isn't just any memory: it's eDRAM on a 512-bit bus, which offers the same bandwidth as GDDR5 on a 128-bit bus.
GT2 sees about a 23% gain over Ivy Bridge (see the Tom's Hardware preview), 20 EUs vs 16 EUs, without its own die with the fast memory. GT3 will have a whopping 40 EUs, double the amount of GT2, plus that eDRAM with GDDR5-on-128-bit-class bandwidth. I think it's going to be really fast. A great leap over Ivy Bridge. How much, I don't know, but it will beat a big bunch of dedicated GPUs. Not the GT 650M, but probably not so far away either -
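The bandwidth-parity claim above is easy to sanity-check with napkin math: peak bandwidth is just bus width (in bytes) times transfer rate. The transfer rates below are illustrative assumptions for the comparison, not confirmed Intel specs.

```python
# Napkin-math check of the "512-bit eDRAM ~ 128-bit GDDR5" bandwidth claim.
# Transfer rates are assumptions for illustration, not confirmed specs.

def bandwidth_gbps(bus_bits: int, transfers_gt_s: float) -> float:
    """Peak bandwidth in GB/s = (bus width in bytes) * (transfers per second)."""
    return bus_bits / 8 * transfers_gt_s

gddr5_128bit = bandwidth_gbps(128, 5.0)   # typical 5 GT/s GDDR5
edram_512bit = bandwidth_gbps(512, 1.25)  # assumed 1.25 GT/s eDRAM

print(gddr5_128bit, edram_512bit)  # both come out to 80.0 GB/s
```

A wide, slow bus can match a narrow, fast one; that appears to be the trade Intel is making with the on-package eDRAM.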
-
Karamazovmm Overthinking? Always!
-
-
<iframe width='560' height="315" src="http://www.youtube.com/embed/kUn-TiWsclk" frameborder='0' allowfullscreen></iframe>
It doesn't really prove anything other than that it can run it. They didn't show end results, and I left mine out because they did. What do you think the end FPS result was for both? -
Well said HTWingNut .
I was just about to comment on the very obvious lack of detailed FPS information on that Haswell vs 650m video.
Point is... both can run the game fluently at those settings, and visual evidence confirms that... however, we have no idea what the actual FPS numbers were for either setup.
And as you so nicely showed, the A10-4600m runs the game fluently as well (and Llano had a similar demonstration a couple of years ago).
However... this also serves to demonstrate a point: regardless of whether the Haswell IGP turns out slower than AMD Trinity or Richland, it's just as 'viable' as AMD on the IGP front for light-to-mid-range gaming at 'high' settings. And let's face it, virtually no games offer that much of a visual advantage past 'High' settings; beyond that point you end up with a few minor effects you may or may not notice, which tend to impact performance severely while offering little to nothing in return (for now at least).
Of course, the price/performance factor is still something to consider, because Intel systems remain rather costly compared to AMD offerings... and the problem with AMD is that their offerings were/are next to non-existent in the mobile market segment (especially mid-range dedicated mobile GPUs, which have appeared in maybe one or two obscure laptop offerings).
Intel might be stepping up their graphics game, but keep in mind that AMD Kaveri (which seems to feature similar design 'enhancements' to what Intel is doing) will be released in a similar time-frame to Haswell, and if what AMD says about 'fixing' CPU performance holds accurate, then Kaveri will likely be a real, viable alternative in the quad-core segment (on CPU performance) with an IGP that could easily surpass Haswell's.
It was mentioned if I'm not mistaken that Kaveri's APU will exceed 1 Teraflops.
Not sure how much that compares to Intel however.
One also has to question Intel increasing the TDP on various CPUs.
Wasn't there a chart of Intel Haswell CPUs released some time ago that showed an increase in TDP across the board (probably because of the IGP modifications)? -
-
-
-
-
Many people, though, just want games to work. Lots of younger people who can't afford the $1500 gaming laptop are happy to be able to play games at 30fps, even at low settings. I can fully appreciate that, and I only expect the same if I am using an IGP. As long as it works, many people are happy. But there's always the newest game that pushes the limits, like Crysis or Battlefield, and makes that IGP feel quite inadequate. -
-
-
Based on the information given, I think this conjecture is premature (at best). Intel hasn't exactly improved their integrated solutions quickly, and relative to the demands of consumers whose needs aren't met by current iGPUs (photo editors, video editors, 3D designers, high-end gamers), they still have an extremely long way to go. Everyone else can already buy an A6-A10 or Ivy Bridge with HD 4000 and have their needs met.
If push comes to shove, I think that even if anyone has momentum in this market, it is not enough to disrupt the current oligopoly. Most consumers will still be content without a discrete GPU, and most who aren't will still be completely underwhelmed by the offering. I wouldn't count on anyone's dream ultrabook being powerful enough to push a modern game to ultra settings this season, or the next. -
Karamazovmm Overthinking? Always!
If you look at the long strides both CPU makers have made in the iGPU market, you can clearly see that mainstream needs to catch up with high end (the gap is really too great), and that, unless investments are made (are they worth it?), entry level will eventually disappear.
The problem here is one that AMD similarly faces: availability of the CPUs that come with this more powerful GPU. AMD doesn't offer the A10 where it should: form factors below 15 inches, with good build quality. These GT3e cores are going to be expensive, and that is probably where the idea would fail.
All in all, casual gamers are eventually going to be deterred from buying entry-level GPUs and just be happy with their iGPU. -
Well, we all know why Intel's IGP efforts will be a different experience than AMD's. No one cares about AMD now outside of desktops and consoles. Phones, tablets, laptops, etc. are all Intel, ARM, and Nvidia.
And for laptops, I see no reason why anyone should go with a AMD solution. For GPU or CPU right now, none at all. Not even for the cost benefit, cause you pay so dearly for it in other ways.
If notebooks already have the high end IGP, then I see no reason why anyone would want to go with a dedicated entry level GPU, doesn't make any sense to me. In that regard, I think Intel will just continue to take over the GPU market for notebooks and computers. It seems for the average person, this Intel GFX will be more than enough for indie games, facebook games and on low/medium detail on the more graphically sophisticated games. -
So I'm curious: we have all seen that GT3 Haswell can play Dirt 3 alongside the GT 650M. They didn't say a word about FPS numbers for either GPU, but it looks like both were playable.
Anyone want to guess which GPU GT3 will compete against? GT 540M? GT 555M? GT 650M? -
So near 640M GT perf.
-
I guess it all depends on how the games are coded too. If they are coded so low detail can achieve 30+ FPS, then does it really matter any more? I know it's a good thing, but to say it's going to compete with mid-range GPUs is bunk, unless mid-range GPUs end up in the 15W TDP range themselves, at which point they could effectively be embedded in the chip. There's no way to fit the performance of a 35-40W TDP discrete card into a single CPU running at 45W.
-
-
-
-
Quagmire LXIX Have Laptop, Will Travel!
-
I have been doing some napkin math and came to the conclusion that GT3 (no eDRAM) will land between the 720M and the 730M.
The GT3e version will be better than the 730M; I don't know by how much. -
There's a huge difference between the 630M and 640M, which is what the 720M and 730M will be, respectively. Basically because the 630M has 96 shaders and the 640M has four times that, with 384. The AMD 7660G (the IGP in the A10-4600M) is about on par with the 630M-635M. I'll be glad to do a head-to-head of AMD Trinity (even Richland, if socket-compatible with my HP) vs Haswell GT3 when I get my new Haswell laptop with a quad i7. I still think AMD's current gen will defeat Intel's Haswell IGP. -
I wonder which one Apple will put in their Air and Pro models.
-
Meaker@Sager Company Representative
As stated earlier, integrated GPUs have a performance ceiling where adding more shaders has little impact. You can see this with Trinity: the memory has to be clocked up to get more gains from core overclocking.
Intel is trying to band-aid this problem with faster memory controllers (we could see systems shipping with 2133MHz memory) and by putting this small amount of caching memory on the package.
But it is only a small cache, so don't expect wonders from it. Graphics cards want a decent amount of room, so as you increase the resolution, performance is going to TANK. -
Karamazovmm Overthinking? Always!
-
AnandTech | Intel Haswell GT3e GPU Performance Compared to NVIDIA's GeForce GT 650M
Seems to me the Anandtech guy is the easiest person in the world to trick. ..as long as it's trickery in favour of Intel, of course. -
Karamazovmm Overthinking? Always!
You forgot this: you can put "Nvidia fan" on Jared's tab as well -
I'm going to wait for benchmark results.
Hypes in the past:
Rambus: Too expensive for its performance, then was outclassed by later DDR versions.
Itanium: First version was a joke, second version fixed some of the problems, third version never acquired the needed marketshare.
Pentium 4: The Willamettes (1.3 and 1.4 GHz) were sub-par; some P3 models outclassed them. It took HT, a larger cache, and a much higher clock rate to be able to somewhat compete against AMD's offerings.
Pentium D: The glued-together P4 cores had to communicate outside of the CPU die. That resulted in a major performance hit.
Early Phenom versions: Bug in the translation lookaside buffer. Lockups/data-corruption or 10% hit in performance.
Early SLI and Crossfire: You thought your 680 or 7970 CF was bad? Try the late 2000's versions.
Fermi: Nvidia assured us it would be groundbreaking in one of their press releases. It turned out to be a mini-oven.
Bulldozer: Nuff' said.
Ivy Bridge HD Graphics: A demo of it turned out to be merely a video recording. Good luck running a DX 11 game on it.
North Korea and its threats of attacking US mainland: Lolololololol -
Intel has been promising that their processors will eventually trump dGPUs but I think it is all relative. As hardware becomes more advanced the game engines will only evolve with it, so the need for a dGPU will always be there if you want the best performance. I am not saying that it won't surpass lower end GPUs, but it will likely never make the dGPU extinct. Maybe in smaller devices and ultrabooks but not desktops and high end laptops.
Surely Intel would like to take over that share of the market, but not quite yet. -
With the PS4 and Xbox 720 coming out in less than a year, future games are inevitably going to be much more taxing given the new, higher performing hardware.
Initially the games are going to be unoptimized, but give it 2-4 years and all quality games will come with native 8-thread and HSA support, designed for a GPU somewhere between a 7850 and a 7870.
In short, today's low-to-midrange GPUs are going to become insufficient very rapidly, though GCN/HSA-based APUs will last somewhat longer, since the new consoles will have similar APUs (aside from the Jaguar CPU) and thus more of the console optimizations will carry over natively.
Higher end GPUs are also going to take a beating until PCs have enough raw computation power to overcome any optimizations for the PS4 and Xbox 720. -
Karamazovmm Overthinking? Always!
I just did some creative math; these numbers are from the 3DMark11 P preset:
HD 4000 = 700-800
HD 4600 = HD 4000 + 20% = 840-960
HD 5100 = HD 4600 + 50% = 1260-1440
Thus the HD 5200 for me is around P1600 (HD 5100 + 13-33%), and that is still a whole lot for just 128MB of very wide and fast eDRAM -
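The "creative math" above is just compounded percentage uplifts; a quick sketch makes the arithmetic explicit. All the multipliers are the guesses from the post, not measured benchmarks.

```python
# Reproduce the post's 3DMark11 (P preset) scaling guesses.
# The percentage uplifts are speculation from the post, not benchmarks.

def scale(score_range, uplift_pct):
    """Apply a percentage uplift to a (low, high) score range."""
    low, high = score_range
    return (round(low * (1 + uplift_pct / 100)),
            round(high * (1 + uplift_pct / 100)))

hd4000 = (700, 800)
hd4600 = scale(hd4000, 20)  # ~20% over HD 4000 -> (840, 960)
hd5100 = scale(hd4600, 50)  # ~50% over HD 4600 -> (1260, 1440)
print(hd4600, hd5100)
```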
I will admit that Intel's graphics offerings have slowly crept up the benchmark lists every year, but that doesn't mean they will eventually make dGPUs extinct. Maybe the lower-end ones, but even that is questionable. Intel's HD 5200 (the highest of their offerings, with 40 EUs) is said to be 2x as powerful as Intel HD 4000 graphics. That type of performance may satisfy the enthusiast or average family, but it won't cut it for high-end anything, especially as gaming and software continue to become more complex; it is all relative. Despite the HD 5200 being predicted to be as powerful as a GT 445M, I am willing to bet that in real-world use it won't perform as well as a laptop with a Haswell processor and a comparable low-end GPU. It may be cheaper to the consumer, and maybe a bit more power-efficient, but it won't stack up against the GT 720M and GT 735M just yet.
I can't see Intel competing for high-end gaming systems any time soon, but they may have an impact on the market. Intel will try to convince some brand-name manufacturers to ditch the dedicated GPU on their mid-range systems, and they could be successful. They will promise cheaper builds (maybe?), lower power consumption, more internal space, and more efficient cooling, and surely some manufacturers will go for that, especially for ultrabooks and tablet hybrids. But the Intel HD 5200 simply won't be able to compete with the upper-range GPUs, and it won't sway gamers and users who need graphics power. I think many manufacturers will recognize this fact and will still offer models with dGPUs, or at least customization options. I read that computer gaming has eclipsed the console market, so there are plenty of gamers out there who will always need faster computers. The average gamer isn't going to spring for Intel graphics (at least not yet) and will demand a computer with a dedicated GPU. The one issue I foresee is that if Intel manages to cut into the mid-range market and make some of the lower-end GPUs obsolete, this could force Nvidia to cut down its offerings and possibly charge higher premiums on dedicated GPUs.
If it hasn't already, this possibility has to have lit a fire under Nvidia's and AMD's asses. They won't get by year after year with rebrands and self-overclocking GPUs. They will have to follow suit and make the next evolutionary jump in graphics processing. Whether it be smaller, faster, lower power consumption, cooler, or all of the above, they both need to push the envelope and force games to the next level. This is the only way Intel's slow but steady progression will be left in the dust. Intel may expand their grip on a small percent of the market, but there is no way they are going to dominate it any time soon. Before we buy the hype, let's see actual benchmarks of the new Intel graphics vs some of the new lower-end GPUs (with Haswell CPUs) before we pass judgement. I can see Intel getting a firmer grip on the smaller ultrabook and tablet-hybrid market with their graphics, but not gaming or rendering PCs any time soon... -
There are quite a few i5 and i7 laptops with Intel's GPU. Only some people need a mobile i5 or i7 but not a powerful GPU, and having dozens or even hundreds of such non-dedicated-GPU laptop models is a tad overkill.
An i3 is more than sufficient for movies and web browsing... -
Karamazovmm Overthinking? Always!
iGPUs are even more limited than their dedicated counterparts. It's simple: their TDP is much more constrained. While we see AMD and Intel installing eDRAM to boost performance (system memory is too slow for GPUs and has very low bandwidth), heat is the primary concern; these things are designed as a balance.
For example, the 680M = P6500-6800 in 3DMark11; that's at least 3x as powerful as my creative math there. Hopefully with Maxwell and whatever AMD is calling their GPUs, this will be met with 30-50% more power at the high end. The main problem I see here is that despite last year's good increase in mid-range performance (it had stalled since the Fermi launch, with just a very slight bump from the 4600m), we are still miles and miles behind the high end. That gap must be filled with less power-hungry GPUs with better TDPs. Another example: if you overclock the heck out of a 650M, you are still 200% behind the 680M.
TL;DR: Nvidia and AMD need to get their game together and invest in the mainstream for more performance, making a clear-cut decision to always leave iGPUs to extremely mobile devices and the low end. dGPUs won't disappear for now; who knows in 10 years or more?
-
Fat Dragon Just this guy, you know?
Unfortunately, if Intel starts moving into the tablet space more aggressively, the "Intel Inside" label will buy them a lot of customers in spite of the fact that their last foray in the field, Atom, was a complete debacle. And that means more money for Intel and less competitive power for their competitors who are producing superior products and selling them for less. -
-
We have had powerful IGPs in the past, at the same relative performance offered by Intel's GT3e. This isn't a game changer by any stretch of the imagination.
-
Besides, those P -R people are smart enough to choose a snowy level. When pretty much everything flying at the camera is white, it's hard to tell if it's smooth or not.
This NBR forum system replaces "P -R" (without the hyphen) with "Google Page Ranking". Whisky tango foxtrot?! (The short version of the previous sentence is also censored.) -
Although the market is driven by sales, it will be up to manufacturers what to put in their products. I guess we'll have to wait and see what kind of impact it has on the newest notebooks. I just don't see even the casual gamer compromising and going without a dedicated GPU. Maybe if consoles start using integrated graphics, then Nvidia and AMD should really start to worry, but until then it is only marketing hype. -
Karamazovmm Overthinking? Always!
With the move to DDR4 in Broadwell, coupled with a redesign of the IGP, we may see more interesting performance. The base speed for DDR4 is, after all, 2133MHz, which is basically the speed of the fastest DDR3 for notebooks (and for desktops, where it starts to get insanely expensive). DDR3's initial speed was 800MHz, though we rarely saw it; let's consider 1066MHz.
But I don't know if Haswell's performance will match the creative math I did there. I'm going to be very interested in how Kaveri and its successor show Intel how to make a more powerful iGPU in Broadwell. AMD needs to get some design wins this year.
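For the DDR3 vs DDR4 point, a rough dual-channel peak-bandwidth comparison shows why the jump matters for an iGPU. Standard 64-bit channels are assumed; real sustained bandwidth is lower than these peaks.

```python
# Rough peak bandwidth for dual-channel DDR memory.
# Assumes standard 64-bit channels; sustained bandwidth is lower in practice.

def ddr_bandwidth_gbs(mt_per_s: int, channels: int = 2, bus_bits: int = 64) -> float:
    """Peak GB/s = channels * (bus width in bytes) * (megatransfers/s) / 1000."""
    return channels * bus_bits / 8 * mt_per_s / 1000

print(ddr_bandwidth_gbs(1066))  # DDR3-1066: ~17 GB/s
print(ddr_bandwidth_gbs(2133))  # DDR4-2133: ~34 GB/s
```

Doubling the transfer rate doubles the peak bandwidth the iGPU has to share with the CPU, which is exactly the bottleneck the earlier posts describe.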
Haswell gt3e to crush Nvidia and Amd low end market gpus ?
Discussion in 'Gaming (Software and Graphics Cards)' started by fantabulicius, Apr 11, 2013.