More and more people consider integrated graphics 'good enough'. They handle all general-use tasks with ease and even run heavy Flash and JavaScript pages without breaking a sweat. The higher-end iGPUs have even gotten to the point where they can run many of the latest titles at low settings.
With the introduction of iGPUs like the Intel Iris Pro 6200 that can supposedly compete with cards like the GT 750M, do you guys think that the market for low and mid-range dGPUs is going to die out in the next couple of years?
-
In my opinion - yes - and that is what makes it ridiculous when GPU manufacturers keep cutting corners by "rebranding" existing chips. There will come a time when the rebranded low-end and midrange dedicated GPUs perform worse than Intel's integrated chips. Then NVIDIA and AMD will have to deliver proper GPUs across their line-ups, all of them delivering at least upper-midrange performance. It'll certainly take another 5 years until that point is reached, though. For now, dedicated low/midrange chips still have one advantage or another over integrated graphics.
sasuke256 likes this. -
I feel like the only reason one would want to buy a laptop with a mid-range dGPU nowadays is if they actually need that dedicated VRAM and don't want to spend extra to get the higher-end ones.
-
Since a dedicated GPU always gets its own additional TDP, and the mid-range ones get something like 35-50W, which is comparable to an entire CPU + iGPU, they should still be able to provide additional performance. What Intel does is establish the "floor": while that does mean a rather long list of GPUs from previous generations would never make sense in a new laptop, GPUs like the Maxwell 850M/860M will still be significantly above it.
franzerich and HTWingNut like this. -
No, they aren't dead. As Althernai noted, they get their own TDP, plus their own dedicated video RAM that is usually much faster than system RAM. I recently got an Asus thin-and-light with a GT 840M in it, and it's a cool-running, low-TDP GPU with its own dedicated DDR3 vRAM. Intel makes a significant step in IGP every 4-5 years, then barely improves it in the generations in between. To me, unless a GPU can manage pretty much every game released, even if that means low detail at 720p, it's not really a replacement for anything.
Intel's Iris Pro is about the only thing comparable to a CPU + dedicated GPU setup, but then again, it also consumes the same amount of power and usually costs as much as or more than a CPU + dedicated GPU combo. -
Skylake will be extremely interesting, that's for sure.
You have the Iris Pro 5200 with 40 Execution Units getting about 2000 points in the 3DMark11 GPU test. That's Haswell, i.e. the 4950HQ etc.
Then you have the Iris Pro 6200 with 48 EUs, which is Broadwell. Not tested yet, because we are about to jump to Skylake.
Now here comes Skylake.
Iris Pro xxxx, with 72 EUs. With its own eDRAM, but also with DDR4 support, which will benefit the IGP greatly. On a 14nm process vs 22nm for Haswell.
Wouldn't surprise me to see a 3000-4000 GPU score somewhere for this IGP.
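For what it's worth, here's the back-of-envelope math behind that guess as a quick Python sketch. It just scales the Iris Pro 5200 score quoted above linearly with EU count; real chips won't scale perfectly (bandwidth, clocks, thermals), so there's also a pessimistic factor. The scaling assumption and the efficiency number are mine and purely illustrative, not benchmarks:

```python
# Naive estimate: assume the 3DMark11 GPU score scales roughly linearly with EU count.
# Base figures are the rough numbers quoted above, not measured benchmarks.

def scaled_score(base_score, base_eus, new_eus, efficiency=1.0):
    """Linearly scale a GPU score by execution-unit count.

    efficiency < 1.0 crudely accounts for sub-linear scaling
    (memory bandwidth limits, shared eDRAM, thermal headroom).
    """
    return base_score * (new_eus / base_eus) * efficiency

iris_pro_5200_score = 2000   # ~3DMark11 GPU score quoted above (40 EUs, Haswell)
skylake_eus = 72             # EU count quoted above for the top Skylake IGP

print(scaled_score(iris_pro_5200_score, 40, skylake_eus))        # ~3600 with perfect scaling
print(scaled_score(iris_pro_5200_score, 40, skylake_eus, 0.8))   # ~2880 at 80% scaling
```

So anywhere from just under 3000 up to about 3600, which is roughly that 3000-4000 ballpark.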
The GT 840M gets about a 2400 GPU score. -
But the Intel chip with Iris Pro will cost as much as a laptop with a regular iGPU + dedicated GPU.
Everything is also on a single die, which means the heat is a lot denser and harder to cool with conventional means. Starlight5 and franzerich like this. -
$839 including shipping for this one
http://www.amazon.com/HP-15-k220nr-...ie=UTF8&qid=1428678643&sr=8-1&keywords=4750hq
From a performance standpoint Intel is gaining, mostly because of their process advantage. I think Intel has gained market share quarter after quarter for the last few years, and a lot of that is because average Joes, and even some gamers, don't bother to check specs too much.
Totally agree with you about the heat for the upcoming CPUs. Haswell was already hot, so it will be very interesting to see how Skylake turns out in terms of heat. We know Intel wrote in their specification files that OEMs must increase cooling for Broadwell CPUs to accommodate them, so we shall see.
What you'll need is a thick copper block on top of the GPU with like four heatpipes running through it to pull the heat off. You can buy 850M/860M/950M/960M-equipped machines for under $1000 as well. The only way I see an advantage is cost, and it sure isn't there with the Iris Pro.
-
Iris Pro systems are overpriced. AMD has a better chance of marginalizing low-end and low-midrange dGPUs with its APUs.
-
They may be overpriced, but they have advantages for manufacturers, who don't have to leave room for a separate GPU core with its circuitry and RAM soldered to the board. The average consumer doesn't care about high-performance PC gaming; that's what their consoles are for, in their minds.
I think with Skylake we will start to see more machines ship without a dGPU - and sell really well too.
I got by for quite a while with an Intel HD 3000 by moving my gaming to my PS3. I would still be console-only if I hadn't gotten the money I did last year.
Laptops for gaming are going the way of the dodo bird... It's sad but true. -
Now that's something interesting we haven't heard much about lately. Is there any info on what the CPU performance will be with those APUs?
-
No.
-
I really think it's going to go the other way. I think this console generation might be the last traditional console generation, especially with the proliferation of Steam boxes. Laptop gaming might not be growing, but since it is PC gaming, it will go along for the ride.
Sent from my Nexus 5 using Tapatalk. HTWingNut likes this. -
I'm a consumer. I don't care what advantages it has for manufacturers. By that token, BGA has advantages for manufacturers too. Ultimately these things only hurt end-users.
The average consumer also isn't spending $800-$1000 on a craptop w/o a GPU. And the slightly more educated ones for sure aren't paying 850M/860M money for 820M performance. Last edited: Apr 11, 2015 -
Simple answer, NO.
Why?
1. They are a money grab.
2. LoL (yes, I mean League of Legends), and let's not forget all the other MOBAs that are coming.
Sent from my 306SH -
I agree. I think laptops will proliferate for gaming, and desktops will start to lose some market share. Because everyone wants faster and smaller, as technology shrinks it will be easier and easier to fit these components into smaller packages. HOWEVER, I just don't like that games have been so dumbed down: before, we had console ports that were "dumbed down" on the PC, and now PC games are becoming ports of smartphone games.
We're really regressing in gaming technology. Not that Atari 2600 games weren't fun, just that we've moved well beyond that, yet we seem to be going back to that simplistic style of game. If the gameplay is great I don't care; I don't need 4K, 144Hz, G-Sync, and 26GB of textures to make a game fun. But I don't want games that only need two buttons to control when tech has progressed significantly beyond that.
Yeah, they might be a money grab, but high-end iGPUs are also a money grab for Intel. Also, high-end iGPUs can handle LoL and other MOBAs just fine.
This is my point. High-end iGPUs have progressed to a point where they can rival low-end dGPUs. While not even the best Iris Pro can compete with an 860M at the moment, iGPUs are getting faster and faster, arguably at a rate faster than dGPUs.
The next gen xx40M may not be able to compete with an Intel Iris Pro 7300. In terms of performance they could be similar, but the fact that Intel controls most of the laptop component business and the fact that iGPUs are generally more efficient may give them the push they need to make low and mid-range dGPUs obsolete. -
Your point is null because it assumes that iGPUs exist in a vacuum irrespective of perf/price. Even if they can compete with low-end dGPUs in terms of performance, they're priced like upper midrange/low high-end dGPUs. Which is why nobody in his right mind should buy such systems.
No they aren't. For every 50% improvement per generation that Intel boasts of, Nvidia and AMD eclipse that on a regular basis without Intel's process advantage (see Titan X: 60% faster than Titan Black on the same 28nm) while introducing a plethora of new hardware and software features that Intel doesn't begin to scratch.
No they're not. -
I agree with you on the fact that low-end dGPUs are better when it comes to price/performance, but again, remember that Intel controls most of the laptop component market.
The GT 840M is about 30% faster than the GT 740M. The Intel HD Graphics 5600 is slower than a 640M, yet the 6200 may even outperform the 840M.
EDIT: I should have compared the Iris 6200 to the Iris 5200, not the HD 5600. Even so, my point still stands.
Hmmm, I was under the impression that they were.
Just looked it up; yeah, you're right, at least if we're judging solely by TDP. Last edited: Apr 11, 2015 -
What's your point?
850M is 75% faster than 750M. 860M is 50% faster than 760M. 965M is 90% faster than 765M. All 28nm.
Meanwhile HD 5600 (14nm) is 20% faster than HD 4600 (22nm). Iris 6200 (14nm) is 20% faster than Iris 5200 (22nm).
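To put numbers on how those rates compound, here's a tiny Python sketch. The per-generation gains are just the rough percentages quoted in this post, treated as hypothetical sustained rates; the units are arbitrary and this is illustration, not measured data:

```python
# Compound the per-generation gains quoted above over a few generations.
# Rates are the rough percentages from this post, treated as hypothetical
# sustained per-generation improvements; units are arbitrary.

def compound(start, rate, generations):
    """Apply the same fractional gain once per generation."""
    return start * (1 + rate) ** generations

baseline = 100.0  # arbitrary starting performance for both

for gens in (1, 2, 3):
    dgpu = compound(baseline, 0.50, gens)  # ~50%+ per generation (NVIDIA/AMD figures above)
    igpu = compound(baseline, 0.20, gens)  # ~20% per generation (Intel IGP figures above)
    print(f"after {gens} gen(s): dGPU ~{dgpu:.0f} vs iGPU ~{igpu:.0f}")

# after 1 gen(s): dGPU ~150 vs iGPU ~120
# after 2 gen(s): dGPU ~225 vs iGPU ~144
# after 3 gen(s): dGPU ~338 vs iGPU ~173
```

If rates anywhere near those hold, the gap widens rather than closes. -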
I have no doubt that low-end GPUs will always exist because they're so useful for marketing purposes, regardless of their performance.
-
Good point, selling it with a GeForce GPU will make users want it or make it seem more powerful.
-
Yep, just look at all those prebuilt "gaming" desktops in big-box stores with an i5/i7 and a GT 610 or Radeon 7450.
-
I guess you guys are right about mid-range dGPUs; they will probably still have a place for years to come. Is there really still a place for lower-end stuff like the 820M, though (aside from being a marketing gimmick)?
-
The 820M can die a fiery death. Honestly, those are just a joke and a waste of silicon: 64-bit bus, low shader count, low RAM speed. An IGP can replace that without much issue. *HOWEVER*, if you're looking at a ULV + 820M, the 820M still makes a little bit of sense, because it's still 40-50% faster than the HD 4400 that usually comes with ULV CPUs, but it still isn't gaming-worthy, so I don't quite see the point. You'll get 15 FPS instead of 11 at 800x450. Who cares. TomJGX likes this.
-
My point exactly. The new Intel HD 5600 will easily outperform the 820M. It's the same card as a 540M, if I remember correctly.
-
All Fermi parts can burn in hell as far as I'm concerned. They don't deserve to exist in 2015!
-
So sorry about the sheer number of Fermi chips.
Feel free to write a letter to your government explaining that they should help fund upgrades.