When will integrated graphics be strong enough to run demanding games?
-
Never, since developers will continue to code games to push the hardware designed last year. For instance, the games out today were designed to push the limits of the last generation of dedicated laptop cards like the 580M and 6990M (as a general rule; obviously not every game is that demanding), and now that the 7970M and 680M are out, you can rest assured that companies are coding games to push them as well. It's a never-ending cycle.
-
masterchef341 The guy from The Notebook
Never is not a very realistic answer. Seven years. Whether the GPU should be integrated or not on high-performance setups depends entirely on the tradeoff of latency and bandwidth versus heat restrictions. At some point, high-end graphics will be integrated with the CPU on a single chip.
Now, this answers the topic question, which I read as: "when will there be no more dedicated graphics?"
As for your post question, "When will integrated graphics be strong enough to run demanding games?" - I would say that time has already come.
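To put the heat side of that tradeoff in rough numbers, here is a toy Python sketch; all the wattages are illustrative assumptions for the sake of the example, not real chip specs. Integration only works when the combined CPU+GPU package fits what the chassis can actually cool:

# Toy feasibility check for putting a GPU on the CPU package:
# the combined TDP has to fit within the chassis's cooling capacity.
# All wattages are illustrative assumptions, not real chip specs.
CHASSIS_COOLING_W = 100   # rough cooling budget of a gaming laptop (assumed)
CPU_TDP_W = 45            # typical quad-core mobile CPU (assumed)

def can_integrate(gpu_tdp_w: float) -> bool:
    """True if CPU + GPU on one package stays within the cooling budget."""
    return CPU_TDP_W + gpu_tdp_w <= CHASSIS_COOLING_W

print(can_integrate(35))   # True:  a midrange-class GPU fits on-package
print(can_integrate(100))  # False: a high-end-class GPU blows the budget

-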
I disagree. Consider this: 7 years ago, the graphics that are currently integrated into the chips (HD3000 / HD4000, for example) would have been considered the pinnacle of graphics performance. Now, 7 years later, thanks to developers continually pushing the envelope, those same integrated graphics struggle mightily to produce playable frame rates at most resolutions in modern games.
So yes, I believe integrated graphics (specifically, on-chip graphics processing) will continue to advance, possibly (probably) to the point of a hypothetical HD10000 internal graphics processor being able to run BF3 at ultra settings and 50 fps or greater, like our 680s and 7970s do now. But at the same time, I think the (equally hypothetical) NVIDIA 1360M and AMD 15950M will be running BF6 at frame rates the HD10000 couldn't touch. -
Karamazovmm Overthinking? Always!
Not necessarily; I agree with masterchef.
The thing that will push dedicated GPUs out (or keep them in) is, as always, the heat and power constraints of a notebook. That won't change; we are moving toward smaller, lighter, and thinner devices. That will also happen, to a lesser degree, in the gaming and workstation lines, but it will happen; it already has.
The performance gains in integrated graphics have been more interesting than what the entry and middle ranges offer, not to mention what AMD has done with their APUs; an AMD APU is more powerful than NVIDIA's middle-range lineup from last year, and the same goes for AMD's own. That's indeed incredible, especially since the core isn't even GCN-based. What could happen is the HD 5000 with eDRAM performing almost the same as a 640M. I don't think it will perform like a 650M, it won't, but it might match a 640M; I have been saying this for quite a while. -
But even performing like a 640M (or even a 650M) is not nearly enough to push out dedicated graphics from a gamer's perspective. Remember, just because we can't imagine it doesn't mean it won't happen; technology will always advance.
-
Karamazovmm Overthinking? Always!
No, for people who buy high-end GPUs it will still be meaningless. However, for people who don't care that much about maxing everything out, it will be plenty, and that means a very large portion of consumers.
I'm almost happy with the gaming the HD3000 provides me, but if I want more I use an old revived 4670M that I have lying around. Neither gives me extreme settings, but it lets me kill 1-3 hours in a strategy game here and there. -
You are of course correct: the vast majority of consumers aren't gamers and don't care about dedicated graphics. That is true even today; integrated graphics are more than adequate for the needs of probably 80% of consumers. That being the case, why do companies like Dell, Asus, Samsung, Toshiba (the list goes on and on) continue to produce machines equipped with high-end dedicated graphics? The answer is simple: the segment of consumers who will buy them is willing to pay the premium for them. This isn't going to change. There will always be a portion of the consumer base that will pay for the best in graphics performance, so there will always be a push on the part of developers to take advantage of the capabilities of the cards that are made, and the companies making the cards will always try to push something bigger, better, faster, stronger out the door. It's the cycle of technology, and it's not going to stop just because you and the majority of other people out there are satisfied with your HD3000 and your 4670M.
-
When CPUs are so far ahead in terms of raw power that Intel/AMD can start spending some of the CPU's resources (die area and power budget) on just the IGP.
I have no idea where we are now or whether we'll ever get there... -
Meaker@Sager Company Representative
Problems such as pin count and the different types of RAM could stop the high-end chips from integrating.
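Pin count is largely a bandwidth problem: an IGP shares the CPU's memory pins, while a dedicated card gets a wide bus of its own. A rough back-of-envelope sketch in Python (the dual-channel DDR3-1600 and 256-bit GDDR5 at 5 GT/s figures below are just typical 2012-era configurations, not any specific chip):

# Peak theoretical bandwidth (GB/s) = transfer rate (MT/s) * bus width (bytes) / 1000
def bandwidth_gbs(transfer_rate_mts: float, bus_width_bits: int) -> float:
    """Peak theoretical memory bandwidth in GB/s."""
    return transfer_rate_mts * (bus_width_bits / 8) / 1000

# Integrated GPU sharing dual-channel DDR3-1600 with the CPU (2 x 64-bit)
igp = bandwidth_gbs(1600, 128)    # ~25.6 GB/s, and the CPU wants some of it too
# Dedicated GPU with GDDR5 at 5 GT/s effective on its own 256-bit bus
dgpu = bandwidth_gbs(5000, 256)   # ~160 GB/s, all for the GPU

print(f"Integrated (DDR3-1600, dual channel): {igp:.1f} GB/s")
print(f"Dedicated  (GDDR5, 256-bit):          {dgpu:.1f} GB/s")

That roughly 6x gap is also why the eDRAM mentioned above matters: it adds bandwidth without adding pins.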
-
Karamazovmm Overthinking? Always!
As I see it, we are still going to move towards SoCs. Shark Bay, for example, doesn't even have the PCH anymore, and we expect it to serve as the basis for the Broadwell "chipsets", or rather the lack of them. And Skylake might really be the one to ditch the PCH. -
The integrated graphics we've got now can already play the most demanding games of 7 years ago. Yes, next year (as late as November 13, 2014) we will be able to play Crysis on medium on integrated graphics; I will bet anything.
7 years from now, we'll be able to play Battlefield 3 and Far Cry 3 (just some badly optimized examples) on Medium/High on integrated graphics (roughly).
.. Or not -
moviemarketing Milk Drinker
HD4000 Metro 2033 Benchmarks
HD4000 Skyrim Benchmarks
HD4000 Battlefield 3 Benchmarks