The Notebook Review forums were hosted by TechTarget, which shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information posted on the forums was preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.
Problems? See this thread at archive.org.

    Nvidia 780s next month

    Discussion in 'Gaming (Software and Graphics Cards)' started by Harlon21, May 4, 2013.

  1. Harlon21

    Harlon21 Notebook Evangelist

    Reputations:
    558
    Messages:
    464
    Likes Received:
    2
    Trophy Points:
    0
  2. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,431
    Messages:
    58,194
    Likes Received:
    17,902
    Trophy Points:
    931
    Since it's the same silicon, I would not be surprised.
     
  3. steviejones133

    steviejones133 Notebook Nobel Laureate

    Reputations:
    7,172
    Messages:
    10,077
    Likes Received:
    1,122
    Trophy Points:
    581
    Might coincide with a M18x revision.... ;)
     
  4. Siflyn

    Siflyn Newbie

    Reputations:
    0
    Messages:
    9
    Likes Received:
    0
    Trophy Points:
    5
    Can't wait to see these new cards in the M18x. :D
     
  5. vs3074

    vs3074 Notebook Evangelist

    Reputations:
    349
    Messages:
    588
    Likes Received:
    159
    Trophy Points:
    56
    I am hoping the next crop of M18x will come with a choice of 780M/8970M :D

    Even if it's the same silicon but rebranded, they might have been able to reduce temps a bit.
     
  6. J.Dre

    J.Dre Notebook Nobel Laureate

    Reputations:
    3,700
    Messages:
    8,323
    Likes Received:
    3,820
    Trophy Points:
    431
  7. lif3t4k3r

    lif3t4k3r Notebook Consultant

    Reputations:
    119
    Messages:
    288
    Likes Received:
    2
    Trophy Points:
    31
  8. J.Dre

    J.Dre Notebook Nobel Laureate

    Reputations:
    3,700
    Messages:
    8,323
    Likes Received:
    3,820
    Trophy Points:
    431
    That's ridiculous, lol. More so than what I referenced.
     
  9. lif3t4k3r

    lif3t4k3r Notebook Consultant

    Reputations:
    119
    Messages:
    288
    Likes Received:
    2
    Trophy Points:
    31
    I wish I could find the article again. Either way, it's just speculation until they release it; hopefully it is a beast of a card, because it will be right up my alley.
     
  10. vulcan78

    vulcan78 Notebook Deity

    Reputations:
    590
    Messages:
    1,523
    Likes Received:
    352
    Trophy Points:
    101
    I'm simultaneously excited and depressed, for hot on the heels of the 780 will be the 880, and so on ad infinitum. Whatever you have will become obsolete in less than a year.

    I am willing to wager that both AMD and Nvidia have hardware much faster than the 780 and 8990; in numerical model terms they probably approximate a 1580 and 15990. But if they were to release that onto the market now, tech that represents the actual state of the art, they would have a failed business model, as it would probably take them 5-10 years to surpass it in engineering terms. So by slowly feeding us subtle speed improvements year over year (has anyone wondered why there is a marked but limited improvement every year or so over the predecessor? More curious is how nearly identical the hardware from two separate companies is, i.e. 680 vs. 7990?), they can continue to finance their operations, at considerable profit I might add.

    Sitting this next round out; going to run my 680s until post-warranty failure warrants their replacement.
     
  11. wheth4400

    wheth4400 Notebook Evangelist

    Reputations:
    284
    Messages:
    312
    Likes Received:
    100
    Trophy Points:
    56
    Umm, that could open up the possibility of quadfire on the M18x. Now that would give an advantage to AMD on these laptops :)
     
  12. vs3074

    vs3074 Notebook Evangelist

    Reputations:
    349
    Messages:
    588
    Likes Received:
    159
    Trophy Points:
    56
    Let's not go into conspiracy theories. It's all too common for a company to keep privileged information to itself and keep tech from evolving too quickly.

    But if that were the case, then jumps like 580 -> 680 wouldn't have been so big; same goes for 6990 -> 7970. And it's nothing new to re-brand old silicon and launch it as a new card.


    To be honest, I doubt we will see two dies on one card inside a laptop; there is just not enough space with MXM. Unless Dell goes back to the M1730-style GPU solution.
     
  13. wheth4400

    wheth4400 Notebook Evangelist

    Reputations:
    284
    Messages:
    312
    Likes Received:
    100
    Trophy Points:
    56
    I think it is possible to have two dies on one card in a laptop, just not probable right now.
     
  14. lif3t4k3r

    lif3t4k3r Notebook Consultant

    Reputations:
    119
    Messages:
    288
    Likes Received:
    2
    Trophy Points:
    31
    Like I said, it's just a rumour; the disadvantage would be the massive power setup it would need to run, though.
     
  15. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,431
    Messages:
    58,194
    Likes Received:
    17,902
    Trophy Points:
    931
    Not possible on the current MXM system, no.
     
  16. vulcan78

    vulcan78 Notebook Deity

    Reputations:
    590
    Messages:
    1,523
    Likes Received:
    352
    Trophy Points:
    101
     
    Last edited by a moderator: May 12, 2015
  17. TBoneSan

    TBoneSan Laptop Fiend

    Reputations:
    4,460
    Messages:
    5,558
    Likes Received:
    5,798
    Trophy Points:
    681
    I wouldn't exactly call it a conspiracy theory, more so the unfortunate way companies often release their technology on a drip to maximize profits. Intel does the same thing to consumers with its mediocre performance increases each iteration. This just highlights why healthy competition is good for the consumer.
    Want to talk about conspiracy theories? Have a look at all the bought-out and shelved tech sitting in the patent office.
     
  18. Serephucus

    Serephucus Notebook Deity

    Reputations:
    205
    Messages:
    1,002
    Likes Received:
    6
    Trophy Points:
    56
    Bear in mind, folks, that the GTX 780 is a GK110 part, based on the same die as the Titan and Tesla K20. As such, we might not see a mobile part for some time, and there's nothing to say that if we do see a 780M, it won't simply be a 680M with slightly higher clocks.
     
  19. HTWingNut

    HTWingNut Potato

    Reputations:
    21,580
    Messages:
    35,370
    Likes Received:
    9,878
    Trophy Points:
    931
    Oh boy, here we go again... :rolleyes:
     
  20. daveh98

    daveh98 P4P King

    Reputations:
    1,075
    Messages:
    1,500
    Likes Received:
    145
    Trophy Points:
    81
    bleh, it's not conspiracy stuff. There are road maps that show releases with approximate dates and the fab process anticipated for each release. The next huge jump for Nvidia is Maxwell. It will be beautiful, and likely a bigger jump than 580->680 was. All the releases now are just incremental upgrades, since they're on the same fab process with existing/similar architecture. That's pretty much the gist of it. Now, the real conspiracy is when game developers purposely add crap to their code to create the buzz that a game is a "GPU KILLER" and get people all excited to upgrade... but I digress; that's a topic for a whole other thread :)


    Fin.
     
  21. neoideo

    neoideo Notebook Consultant

    Reputations:
    72
    Messages:
    171
    Likes Received:
    4
    Trophy Points:
    31
    Regarding upgrades,

    I always think about the advantages and disadvantages of upgrading to a newly released GPU architecture versus upgrading to that same architecture in its ending phase. This assumes I have an older GPU architecture (upgrading within the same architecture is not worthwhile, in my opinion).
     
  22. ryzeki

    ryzeki Super Moderator Super Moderator

    Reputations:
    6,547
    Messages:
    6,410
    Likes Received:
    4,085
    Trophy Points:
    431
    Haha, dude, this is normal. It's not magic, nor a particular conspiracy. R&D has plans and designs that allow for the performance they want. The problem is, sometimes the cost is a bit too prohibitive for the consumer, so they choose to maximize profit while remaining competitive. You can see trends and specifications, even protocols, detailed well in advance (3G and 4G were conceptual and created several years before market introduction, for example), and thus it takes time and market development for some technologies to arrive.

    Now, there are many variables, including competing technologies that can adapt better and become better, thus killing off other techs. Then there is the issue of lack of competition, which can sometimes stagnate the progress introduced to the market. Cost, competition, etc., are driving forces in many developments. Even with current cellphone technologies, you can see detailed ARM CPUs and GPUs designed years ago that far surpass current popular offerings, but due to cost, production, manufacturing feasibility, and time to market, they haven't found their way into products.

    Now, while technology is advancing every year, that does not mean you are "obsolete" whatsoever. We can develop everything we want, but we still need people to buy it. And if people can't afford it or feel it is not time to buy, then we must first convince them. Since most people have particular budgets, everything tends to cater to the masses, and thus even GPU solutions from years ago are more than relevant today.

    For example, I just read an article on Tom's Hardware comparing Core 2 Duo and Quad CPUs to current entry-level offerings. On some points, even a Core 2 Quad gave similar or more playable gaming performance, and these CPUs are 5 years old. Sure, they consume quite a bit more power and are much slower in other areas, but they are still relevant. Hell, there are people still gaming on a Pentium 4 or Athlon 64 with mid-range GPUs from years ago (HD4650) today. Sure, they are gaming at the lowest end by now, but they still enjoy it.
     
  23. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,431
    Messages:
    58,194
    Likes Received:
    17,902
    Trophy Points:
    931
    The 680M is hardly going to be obsolete quickly.
     
  24. ryzeki

    ryzeki Super Moderator Super Moderator

    Reputations:
    6,547
    Messages:
    6,410
    Likes Received:
    4,085
    Trophy Points:
    431
    Exactly. Nothing to worry about yet.
     
  25. Karamazovmm

    Karamazovmm Overthinking? Always!

    Reputations:
    2,365
    Messages:
    9,422
    Likes Received:
    200
    Trophy Points:
    231
    Stop lying meaker, the 780m has turbo boosted rainbows now, nothing can beat that
     
  26. R3d

    R3d Notebook Virtuoso

    Reputations:
    1,515
    Messages:
    2,382
    Likes Received:
    60
    Trophy Points:
    66
    AMD has been bleeding money for years. If they have their architectures 8 generations in advance, why are they so uncompetitive?

    Look at the $1 billion in R&D that Nvidia spent, or the $1.3 billion that AMD spent, last year. All of that goes into developing newer microarchitectures.

    Not to mention that much of the technology required for these performance increases depends on other companies and other segments of the industry. Why don't AMD and Nvidia have finished 14nm next-gen GPUs hidden away? Because TSMC and GloFo don't have a 14nm process yet.

    No doubt Nvidia, AMD, Intel (and Qualcomm, ARM, etc.) all have some next-gen engineering samples and plans. But those are far from being the technologically and commercially viable products people make them out to be.
     
  27. J.Dre

    J.Dre Notebook Nobel Laureate

    Reputations:
    3,700
    Messages:
    8,323
    Likes Received:
    3,820
    Trophy Points:
    431
    Lol, made me laugh. :D