The Nvidia 780 is going to be released next month. I guess the 780Ms won't be too much later. Nvidia Rumored to Release GeForce GTX 780 Graphics Card on May 23 | Maximum PC
-
Meaker@Sager Company Representative
Since it's the same silicon, I would not be surprised.
-
steviejones133 Notebook Nobel Laureate
Might coincide with an M18x revision...
-
Can't wait to see these new cards in the M18x.
-
I am hoping the next crop of M18x machines will come with a choice of 780M/8970M.
Even if it's the same silicon rebranded, they might have been able to reduce temps a bit.
-
I'm simultaneously excited and depressed. Hot on the heels of the 780 will be the 880, and so on ad infinitum; whatever you have will become obsolete in less than a year. I am willing to wager that both AMD and Nvidia have hardware much faster than the 780 and 8990; in numerical model terms it probably approximates a 1580 and 15990. But if they were to release that onto the market now, tech that represents the actual state of the art, they would have a failed business model, as it would probably take them 5-10 years to surpass it in engineering terms. So by slowly feeding us subtle speed improvements year over year (has anyone wondered why there is a marked but limited improvement every year or so over the predecessor? More curious is how nearly identical the hardware from two separate companies is, e.g. 680 vs. 7990), they can continue to finance their operations, at considerable profit I might add. Sitting this next round out; going to run my 680s until post-warranty failure warrants their replacement.
-
But if that were the case, jumps like the 580 -> 680 wouldn't have been so big; the same goes for the 6990 -> 7970. And it's nothing new to rebrand old silicon and launch it as a new card.
-
Meaker@Sager Company Representative
-
Want to talk about conspiracy theories? Have a look at all the bought-out and shelved tech sitting in the patent office.
Bear in mind, folks, that the GTX 780 is a GK110 part, based on the same die as the Titan and the Tesla K20. As such, we might not see a mobile part for some time, and there's nothing to say that if we do see a 780M, it won't simply be a 680M with slightly higher clocks.
-
Bleh, it's not conspiracy stuff. There are roadmaps that show releases with approximate dates and the fab process anticipated for each one. The next huge jump for Nvidia is Maxwell; it will be beautiful and likely a bigger jump than the 580 -> 680 was. All the releases right now are just incremental upgrades because they are on the same fab process with existing/similar architecture. That's pretty much the gist of it. Now, the conspiracy is when game developers purposely add crap to their code to create the buzz that the game is a "GPU KILLER" and get people all excited to upgrade... but I digress, and that's a topic for a whole other thread.
Fin.
Regarding upgrades,
I always weigh the advantages and disadvantages of upgrading to a newly released GPU architecture versus upgrading to that same architecture near the end of its cycle. This assumes I have an older GPU architecture (upgrading within the same architecture is not worthwhile in my opinion).
Now, there are many variables, including competing technologies that can adapt and improve faster, thus killing off other techs. Then there is the issue of lack of competition, which can sometimes stagnate the progress brought to market. Cost, competition, etc., are driving forces in many developments. Even with current cellphone technologies, you can find detailed ARM CPU and GPU designs from years ago that far surpass current popular offerings, but due to cost, production, manufacturing feasibility, and time to market, they haven't found their way into products.
Now, while technology advances every year, that does not mean your hardware is "obsolete" whatsoever. We can develop everything we want, but we still need people to buy it, and if people can't afford it or feel it is not time to buy, then they must first be convinced. Since most people have particular budgets, everything tends to cater to the masses, and thus even GPU solutions from years ago are more than relevant today.
For example, I just read an article on Tom's Hardware comparing Core 2 Duo and Quad CPUs to current entry-level offerings. In some cases, even a Core 2 Quad gave similar or more playable gaming performance, and those CPUs are five years old. Sure, they consume quite a bit more power and are much slower in other areas, but they are still relevant. Hell, there are people still gaming on a Pentium 4 or Athlon 64 with mid-range GPUs from years ago (HD 4650) today. Sure, they are gaming at the low end by now, but they still enjoy it.
Meaker@Sager Company Representative
The 680M is hardly going to be obsolete quickly.
-
Karamazovmm Overthinking? Always!
-
Look at the $1 billion in R&D that Nvidia spent, or the $1.3 billion that AMD spent last year. All of that goes into developing newer microarchitectures.
Not to mention that much of the technology required for those performance increases depends on other companies and other segments of the industry. Why don't AMD and Nvidia have some finished 14nm next-gen GPUs hidden away? Because TSMC and GloFo don't have a 14nm process yet.
No doubt Nvidia, AMD, Intel (and Qualcomm, ARM, etc.) all have next-gen engineering samples and plans, but those are far from being the technologically and commercially viable products people make them out to be.