
    No more dedicated graphics

    Discussion in 'Gaming (Software and Graphics Cards)' started by fantomasz, Jan 16, 2013.

  1. fantomasz

    fantomasz Notebook Deity

    Reputations:
    147
    Messages:
    1,113
    Likes Received:
    0
    Trophy Points:
    55
    When will integrated graphics be strong enough to run demanding games?
     
  2. SlimShady

    SlimShady ΜΟΛΩΝ ΛΑΒΕ

    Reputations:
    806
    Messages:
    979
    Likes Received:
    20
    Trophy Points:
    31
    Never, since developers will continue to code games to use the hardware designed the year before. For instance, the games out today were designed to push the limits of the last generation of dedicated cards, like the 580m and 6990m in laptops (as a general rule; obviously not every game is that demanding), and now that the 7970m and 680m are out you can rest assured that companies are coding games to push them as well. It's a never-ending cycle.
     
  3. masterchef341

    masterchef341 The guy from The Notebook

    Reputations:
    3,047
    Messages:
    8,636
    Likes Received:
    4
    Trophy Points:
    206
    "Never" is not a very realistic answer; I'd say 7 years. Whether the GPU should be integrated or not on high-performance setups depends entirely on the tradeoff of latency and bandwidth vs. heat restrictions. At some point, high-end graphics will be integrated with the CPU on a single chip.

    Now, this answers the topic question, which I read as: "when will there be no more dedicated graphics?"

    As for the question in your post, "When will integrated graphics be strong enough to run demanding games?", I would say that time has already come.
     
  4. SlimShady

    SlimShady ΜΟΛΩΝ ΛΑΒΕ

    Reputations:
    806
    Messages:
    979
    Likes Received:
    20
    Trophy Points:
    31

    I disagree. Consider this: 7 years ago, the graphics that are currently integrated into the chips (HD3000 / HD4000, for example) would have been considered the pinnacle of graphics performance. Now, 7 years later, thanks to developers continually pushing the envelope, the integrated graphics we have would struggle mightily to produce playable frame rates at most resolutions in modern games.

    So yes, I believe integrated graphics (specifically, on-chip graphics processing) will continue to advance, possibly (probably) to the point where an HD10000 (hypothetical designation) integrated graphics processor could run BF3 at ultra settings and 50 fps or greater, like our 680s and 7970s do now. But at the same time, I think the Nvidia 1360m and the AMD 15950m (equally hypothetical) will be running BF6 at frame rates that the HD10000 couldn't touch.
     
  5. Karamazovmm

    Karamazovmm Overthinking? Always!

    Reputations:
    2,365
    Messages:
    9,422
    Likes Received:
    200
    Trophy Points:
    231
    Not necessarily; I agree with masterchef.

    The thing that will push dedicated GPUs out (or keep them in) is, as always, the heat and power constraints of a notebook. That won't change; we are moving toward smaller, lighter, and thinner devices. That will also happen, to a lesser degree, in the gaming and workstation lines, but it will happen; it already has.

    Integrated graphics' performance gains have been more interesting than what the entry and mid-range can offer, not to mention what AMD has done with their APUs: an APU from AMD is more powerful than Nvidia's mid-range lineup from last year, and the same goes for AMD's own. That's indeed incredible, especially since the graphics core isn't even based on GCN. What could happen is that the HD5000 with eDRAM performs almost the same as a 640m. I don't think it will perform like a 650m, it won't, but it might match a 640m; I have been saying this for quite a while.
     
  6. SlimShady

    SlimShady ΜΟΛΩΝ ΛΑΒΕ

    Reputations:
    806
    Messages:
    979
    Likes Received:
    20
    Trophy Points:
    31
    But even performing like a 640m (or even a 650m) is not nearly enough to push out dedicated graphics from a gamer's perspective. Remember, just because we can't imagine it doesn't mean it won't happen. Technology will always advance.
     
  7. Karamazovmm

    Karamazovmm Overthinking? Always!

    Reputations:
    2,365
    Messages:
    9,422
    Likes Received:
    200
    Trophy Points:
    231
    No, for people who buy high-end GPUs that will still be meaningless. However, for people who don't care that much about maxing everything, it will be plenty, and that means a very large portion of consumers.

    I'm almost happy with the gaming the HD3000 provides me; when I want more, I use an old revived 4670m that I have lying around. Neither gives me extreme settings, but it lets me kill 1-3 hours in a strategy game here and there.
     
  8. SlimShady

    SlimShady ΜΟΛΩΝ ΛΑΒΕ

    Reputations:
    806
    Messages:
    979
    Likes Received:
    20
    Trophy Points:
    31
    You are of course correct: the vast majority of consumers aren't gamers and don't care about dedicated graphics. That is true even today; integrated graphics are already more than adequate for the needs of probably 80% of consumers.

    This being the case, why do companies like Dell, Asus, Samsung, Toshiba, and so on continue to produce machines equipped with high-end dedicated graphics? The answer is simple: the segment of consumers who will buy them is willing to pay the premium. This isn't going to change. There will always be a portion of the consumer base that will pay for the best graphics performance, so there will always be a push on the part of developers to take advantage of the capabilities of the cards being made, and the companies making the cards will always try to push something bigger, better, faster, stronger out the door. It's the cycle of technology, and it's not going to stop just because you and the majority of other people out there are satisfied with your HD3000 and your 4670m.
     
  9. Cloudfire

    Cloudfire (Really odd person)

    Reputations:
    7,279
    Messages:
    10,304
    Likes Received:
    2,878
    Trophy Points:
    581
    When CPUs are so far ahead in terms of raw power that Intel/AMD can start spending some of the CPU's resources on just the IGP.

    I have no idea where we are now or whether we'll ever get there...
     
  10. masterchef341

    masterchef341 The guy from The Notebook

    Reputations:
    3,047
    Messages:
    8,636
    Likes Received:
    4
    Trophy Points:
    206
    I'm not talking about old or current graphics hardware. I'm talking about something that will happen in the future: graphics processors will be integrated onto the same physical chip as the CPU, for performance reasons.
     
  11. SlimShady

    SlimShady ΜΟΛΩΝ ΛΑΒΕ

    Reputations:
    806
    Messages:
    979
    Likes Received:
    20
    Trophy Points:
    31
    But you have to consider that if technology advances to the point where you can integrate graphics onto a CPU die and get the performance we're getting today out of it, then imagine the performance the same technology would provide on a dedicated GPU.
     
  12. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,431
    Messages:
    58,194
    Likes Received:
    17,902
    Trophy Points:
    931
    Problems such as pin count and the different types of RAM could stop the high-end chips from integrating.
     
  13. Karamazovmm

    Karamazovmm Overthinking? Always!

    Reputations:
    2,365
    Messages:
    9,422
    Likes Received:
    200
    Trophy Points:
    231
    I have already answered that.

    The gaming and workstation lines have always commanded a premium, and as long as they exist they will. There are, as you point out, people willing to pay for it, so the market is there.

    As I see it, we are still moving toward SoCs. Shark Bay, for example, doesn't even have the PCH anymore, and we expect it to serve as the basis for the Broadwell "chipsets", or rather the lack of them. And Skylake might really be the one to ditch the PCH.
     
  14. jaug1337

    jaug1337 de_dust2

    Reputations:
    2,135
    Messages:
    4,862
    Likes Received:
    1,031
    Trophy Points:
    231
    The integrated graphics we've got now can handle the most demanding games of 7 years ago. Yes, by next year (as late as November 13, 2014) we will be able to play Crysis on medium; I will bet anything.

    Likewise, 7 years from now we'll be able to play Battlefield 3 and Far Cry 3 (just some badly optimized examples) on Medium/High on integrated graphics (roughly).

    .. Or not :D
     
  15. moviemarketing

    moviemarketing Milk Drinker

    Reputations:
    1,036
    Messages:
    4,247
    Likes Received:
    881
    Trophy Points:
    181