I was wondering what the community thinks about the possibility of discrete video cards becoming more and more a line of "pure enthusiast" hardware. According to NotebookCheck's Intel Iris Pro Graphics 5200 page, Intel's Iris Pro can run most games on medium settings at or around 30 FPS.
If Intel continues to push their APUs, and assuming they can continue exponential performance gains rather than hit some sort of bottleneck, will discrete video cards become less relevant? As of now the Iris Pro is only 10-15% behind the GT 650M and GT 750M and is faster than AMD's 8650G. What does this mean for Nvidia and AMD?
I think with the advent of 4K, APUs may have a more difficult time keeping pace with dedicated GPUs, but who knows what Skylake or beyond might bring. Intel's roadmap is shooting for 5 nm by 2020, and that is only six years away.
-
-
The high-end ones won't become obsolete, but ever since I saw the Iris Pro 5200 I started wondering about mid-range and lower GPUs. "Dedicated graphics" won't carry much weight as a marketing point on its own anymore, since many ads don't actually name the video card model.
-
Haha... No, they will not become obsolete. A laptop with Iris Pro 5200 graphics costs more than a machine with a GTX 765M, which has over twice the performance. Iris Pro can run games at 720p; beyond that it simply has too little bandwidth to run at 1080p. Maybe the lowest-end GPUs will be replaced by IGPs eventually, but even today the standard HD 4600 can't compete with the lowest-end 720M or Radeon 8550M. And yes, at larger resolutions IGPs will fall WAY behind, again due to limited bandwidth.
Iris Pro is just an experiment, nothing more. Perhaps they will take lessons learned from it and incorporate them into future products, but I doubt they will continue down the path of making Iris Pro the norm. -
Totally agree, but it's still pretty impressive nonetheless.
Do you think it may be possible in the future to link an APU with a current video card in an SLI/CrossFire-type configuration? I assume it is mainly a driver/controller issue rather than a pure architectural issue, no? If that were possible you would be able to squeeze a lot more life out of your setup, and if Iris Pro turns out to be a successful experiment it could be worth it from a price-to-performance perspective. -
In general? Nope, I don't think so.
But eating up the low-end space, yep. IGPs from Intel, APUs from AMD, SoCs from Nvidia/ARM: they are all making low-end notebooks obsolete because the devices they are in are so much cheaper.
High-end gaming, meaning gamers who want to play at high settings in at least 1080p, still needs a dedicated GPU. The IGPs/APUs/SoCs can handle 1080p too, but mostly for old games.
4K/3K is the big fuss right now, and even the greatest mobile GPUs (780M etc.) can play only a few games in 4K. You need SLI notebooks to utilize a 4K display to the full in games.
imo -
It's not like those SLI notebooks do so much better at 4K on ultra settings either. If people are really that into 4K, just get a desktop.
-
-
GTX 780M SLI could play many games in 4K.
Just check out GTX Titan results.
Nvidia GTX780Ti Review (1600p, Ultra HD 4K) | KitGuru
Of course it wouldn't be 60 FPS, but a playable 30-50 FPS in most of them.
Maxwell top-end cards in SLI would be amazing with a 4K display. You hear that, Alienware? -
It would be interesting to see the "theoretical" benchmarks of the Iris Pro 5200 and 780M in an asymmetrical configuration. I know it's not going to happen, but I am just curious what it would look like. -
SLI/CrossFire works best with identically performing cards. -
On-topic: The asymmetrical builds have results all over the place. AMD should stop creating them and focus on more equivalent Crossfire setups. -
In theory, some rendering pipeline designs can be distributed across different GPUs, resulting in slightly higher total throughput with little added lag. But I'm not aware of any implementation of this kind in any type of real-time graphics. -
No, discrete cards are never going to disappear. And if a really good IGP does come out, it would, at best, replace the lowest-end cards. From what AMD showed at CES, I think it will be them who does it, though, and not Intel.
-
The HD 4600 is a decent chip, but even then it's simply a bigger version of the HD 4000 rather than an architectural improvement. Besides, the new consoles are out and requirements are going to shoot up, meaning the Intel IGPs will struggle to play even on low again.
It's great news that they are improving so much, but it's still a while off; at least now you can play older games comfortably and the occasional new game with decent settings. -
If anything, dedicated desktop cards will die out first. Until laptops can hit parity with desktops, discrete cards will still be around. 1080p performance may be fairly easy to reach (as it should be, since "2K"-class resolutions (1600x1200 / 1080p) have been the standard for well over a decade). Until 4K displays are the standard, and games are programmed to be rendered natively at that level, we won't begin to see the end of discrete cards. There's just too much room for improvement. We'd probably already be there if consoles weren't destroying progress by watering down what is "acceptable".
-
Well, no real surprise that Iris Pro's cost-to-performance ratio is pretty high; it's still a product in its infancy. Despite the enthusiasm for high-end GPUs, the low and mid range still make up the vast majority of sales. I can see Intel's Iris series becoming a pretty big threat to AMD and Nvidia.
-
Blame teh consoles!
-
If their largest and most profitable segment is shrinking, what does that mean for innovation in dGPUs? Gamers, enthusiasts, and supercomputers can't fill the gap forever, or can they? -
Maybe not obsolete, but not a requirement either.
The Iris Pro 5200 is close enough to a 750M that if they make a similar leap when Skylake comes out in 2015, I can see an IGP at roughly the 680M/7970M level.
Considering the 880M and R9 290M (or whatever) are just rebadges of those chips, that is scary.
Also, with the PS4 running a Pitcairn derivative, I think developers will target that level. So the only reason for more power will be the 4K resolutions that PCs can offer. -
Where is the IGP gonna get that memory bandwidth from? Can we has 8-16GB unified GDDR5 on a 256-bit or greater bus in every gaming laptop by 2015? Can we has a lot of software, specifically games, written to take advantage of HSA while maintaining backward compatibility with the traditional memory architecture by 2015? LOL
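To put rough numbers on that bandwidth gap, here is a quick back-of-the-envelope sketch; the DDR3-1600 dual-channel, DDR4-3200 and 5.5 GT/s GDDR5 figures are illustrative assumptions, not the specs of any particular laptop:

```python
# Theoretical peak bandwidth: effective transfer rate (MT/s) * bus width (bits) / 8 bits per byte
def peak_bandwidth_gb_s(transfer_rate_mt_s, bus_width_bits):
    return transfer_rate_mt_s * bus_width_bits / 8 / 1000  # GB/s

configs = {
    "DDR3-1600, dual channel (128-bit)": (1600, 128),  # typical IGP memory today
    "DDR4-3200, dual channel (128-bit)": (3200, 128),  # the 'DDR4 doubles bandwidth' case
    "GDDR5 @ 5.5 GT/s, 256-bit bus":     (5500, 256),  # console-style unified memory
}

for name, (rate, width) in configs.items():
    print("{}: {:.1f} GB/s".format(name, peak_bandwidth_gb_s(rate, width)))
```

That works out to roughly 25.6, 51.2 and 176 GB/s respectively, which is the gap the question above is really about.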
-
Without discrete graphics cards, how on earth are we supposed to fry eggs now?
-
-
Absolutely not. In fact, they will continue to get more powerful. APUs will have their niche for playing Pokémon, and content creators will use the real GPUs.
-
Attached Files:
-
-
BTW, those fighters look good with those balls; at least each of them has a ball.
(Sounds weird, but it needs to be singular, am I right?) -
I would imagine that GDDR5 could be implemented that way for PCs on a smaller manufacturing process, just like it was done for the consoles (hence the enormous bandwidth there).
Carrizo (the successor to Kaveri) might feature this option, but it's more likely to come with DDR4 support (which should double the existing bandwidth, I think).
HSA is there to SIMPLIFY things for developers and take advantage of the shared memory on the APU. To my understanding, it's not difficult to implement HSA support in existing software; if I'm not mistaken, it's as simple as integrating it into a patch, and it was specifically designed so it can be adopted relatively quickly. All you essentially need is developers behind it, and the HSA Foundation already provides that; Windows 8 and upcoming/existing Adobe products are HSA-ready.
Plus, Kaveri and HSA-capable chips in general do not have to copy data from the CPU and convert it into a GPU-readable format; thanks to the shared memory on the APU, data goes directly from the CPU to be processed by the GPU.
Kaveri is the first implementation of HSA, which is why existing software needs a patch to use this feature... however, if I understood the premise behind the Excavator design, it should be able to do this at the hardware level without prior software support (so the CPU and GPU can really work in unison automatically).
This is why Kaveri, for example, is touted as having 12 compute cores (4 Steamroller CPU cores and 8 GCN cores), and the GPU can do things far more efficiently and faster than a CPU in numerous areas.
As for maintaining backward compatibility... the APUs are still x86, are they not?
As for the far future, if you want backward compatibility, I imagine you will have to use something akin to DOSBox or a virtual machine (besides, we have to get away from this ancient design as it is). -
We are talking about a 20-50% difference.
The higher the resolution, the bigger the difference. -
-
Yup, once you hit the IGPs with MSAA and such, or higher resolutions, they can't keep up with dedicated graphics cards.
AnandTech's HD 5200 review covered it pretty well.
AnandTech | Intel Iris Pro 5200 Graphics Review: Core i7-4950HQ Tested
I welcome 4K. Once they hit 17"/18" notebooks I will be one happy dude. Right now I'm really pissed that 13-15" notebooks with a weak midrange GPU get the high-res panels, while 17 and 18 inch notebooks with the greatest hardware are still stuck with 1080p displays.
Or that mobile phones are starting to get 1440p displays while those big notebooks remain at 1080p.
Somewhere in the companies that make displays for notebooks, there is one gigantic troll of a CEO/engineer sitting in his office with a coffee cup, laughing at us. Every day I dream that the guy will drop some really hot coffee on his lap and get severely burned. -
1) Those are "higher" resolutions, not 4K displays.
2) Desktop replacements only account for 10% of the laptop market.
3) There's more to a display than mere pixels.
I'd prefer a quality HD display over one that merely adds pixels. After the camera industry's megapixel-race debacle, you'd think we would have realized that by now. -
I agree with #3. That said, MSI's 3K display is actually very, very good.
-
-
I know. Let's go back to 16:10 and throw all of the 16:9 crap in the garbage.
-
[ Insert mandatory 4:3 super race comment here. ]
-
-
Yes, it looks horrible because the PPI (pixel density) is poor. 960x540 is just a poor resolution on any size of screen. It's not blurry because of the scaling; each pixel is BIG because it's made up of 4 smaller pixels. There is no magic, math is math. If a 1080p image looks good on a 1080p screen, it will look the same on a 4K screen of the same size. Likewise, if a 540p image looks bad on a 1080p screen, it will look just as bad on a native 540p screen.
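A minimal sketch of that "each pixel is made of 4 smaller pixels" point, using a toy 2x2 image and plain integer upscaling (NumPy assumed purely for illustration):

```python
import numpy as np

# A tiny 2x2 "image": each number stands for one pixel's colour
low_res = np.array([[10, 20],
                    [30, 40]])

# 2x integer upscale: every source pixel becomes a solid 2x2 block,
# so nothing is interpolated and nothing is blurred
high_res = np.repeat(np.repeat(low_res, 2, axis=0), 2, axis=1)
print(high_res)
# [[10 10 20 20]
#  [10 10 20 20]
#  [30 30 40 40]
#  [30 30 40 40]]
```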
On a side note, it seems that everywhere I go I find you posting wrong info, Octiceps.
-
" Some observations have indicated that the unaided human eye can generally not differentiate detail beyond 300 PPI;"
So sure, if you have 1920x1080 and run at 960x540 it will look blurry on a 15" display; that panel is only about 140 PPI. But crank the pixel density high enough (over 300 PPI) and it won't really matter; it will look the same whether it's a native 960x540 screen or a 3840x2160 one.
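For reference, that ~140 PPI figure follows directly from the panel geometry; a quick check, assuming a 15.6-inch diagonal:

```python
import math

def ppi(width_px, height_px, diagonal_inches):
    # pixels per inch along the panel diagonal
    return math.hypot(width_px, height_px) / diagonal_inches

print(round(ppi(1920, 1080, 15.6)))  # ~141 PPI for a 15.6" 1080p panel
print(round(ppi(3840, 2160, 15.6)))  # ~282 PPI for a 15.6" 4K panel
```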
Too many Pixels? | SamMobile
And nowhere do you see me comparing LCDs and CRTs. I just stated (again) that it is something that was lost when going from CRT to LCD, which is a fact. Once we regularly achieve >300 PPI it won't matter again. -
540p is only marginally better than traditional TV, so of course it will look blurry. 1080p on 4K will give you the same effective density as native 1080p, even though the pixel transitions might look smoother or blurrier depending on the resampling algorithm, which differs from implementation to implementation. If the algorithm is simple nearest-neighbour, it should look almost identical to native. Which algorithm is best depends on what is being displayed, and on personal preference as well.
But if you were seeing any kind of aliasing due to resampling from 540p to 1080p, there was something wrong with your configuration. -
PlanetSide 2 Examples
1080p Rendering Quality 100% (1080p native):
1080p Rendering Quality 50% (540p native):
Rendering Quality 100%, Low Preset:
Rendering Quality 50%, High Preset:
That should put to rest any misconceptions that running 1/4 of native resolution will somehow look just as sharp as running native resolution. -
-
The point was that if you pack 4x the number of pixels into a 15 inch display and then run it at 1x resolution, it will look just the same as a 15 inch display with that 1x native resolution.
That is, if I spread 1920x1080 pixels over a 15 inch 4K screen, it will look just the same as 1920x1080 on a native 1080p screen.
-
-
-
I completely agree about 1080p on 4K looking like 1080p on 1080p. I think one of the biggest problems we've had thus far, though, is a lot of bad scaling software, which is why it's taken forever to go past 1080p: so many of the unwashed masses were scared about the text size on anything with a higher resolution.
As to the 4:3 comment, um, no. The GOLDEN RATIO is where it's at: 1.6180339... 16:10 is 1.6 and 16:9 is 1.777 repeating, so 16:10 is practically identical to the golden ratio, whereas 16:9 is quite a bit off (and 4:3 is even further off). To add insult to injury, the entire point of 1080p is to avoid letterboxing... which is still common with the new 22:9 standard that many movies are shot in.
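The arithmetic behind that, for anyone who wants to check how far each common ratio sits from the golden ratio:

```python
golden = (1 + 5 ** 0.5) / 2  # ~1.6180339

for name, value in [("16:10", 16 / 10), ("16:9", 16 / 9), ("4:3", 4 / 3)]:
    off = abs(value - golden) / golden * 100
    print("{} = {:.3f} ({:.1f}% off the golden ratio)".format(name, value, off))
```

16:10 comes out about 1% off, 16:9 nearly 10%, and 4:3 almost 18%. -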
Took a PNG format screenshot from here.
I scaled this image down to 540p as the GPU input, then scaled it back up to 1080p as the panel output. The first one was done with nearest-neighbour resampling to simulate a native screen, since 4 pixels with exactly the same colour value appear as one bigger pixel. The second was done with cubic resampling, a smooth interpolation algorithm commonly used. The third shows the two images 50-50.
Open the four images in separate tabs and scale to 1:1 to view the effect.
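If anyone wants to reproduce this without an image editor, here is a minimal sketch with Pillow; the filename is just a placeholder for any 1920x1080 PNG:

```python
from PIL import Image

src = Image.open("screenshot_1080p.png")           # any 1920x1080 source image

small = src.resize((960, 540), Image.BICUBIC)      # simulate a 540p render target

# Scale back up to panel resolution two different ways
nearest = small.resize((1920, 1080), Image.NEAREST)  # blocky, "native 540p" look
cubic = small.resize((1920, 1080), Image.BICUBIC)    # smooth but blurry interpolation

nearest.save("540p_nearest_to_1080p.png")
cubic.save("540p_bicubic_to_1080p.png")
```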
-
OK great, so you know how to use Photoshop, but show me a game which does this?
-