The Nvidia 780 is going to be released next month. I guess the 780Ms won't be too much later. Nvidia Rumored to Release GeForce GTX 780 Graphics Card on May 23 | Maximum PC
-
Meaker@Sager Company Representative
Since it's the same silicon, I would not be surprised.
-
steviejones133 Notebook Nobel Laureate
Might coincide with an M18x revision...
-
Can't wait to see these new cards in the M18x.
-
I am hoping the next crop of M18x machines will come with a choice of 780M/8970M.
Even if it's the same silicon rebranded, they might have been able to reduce temps a bit. -
Looks like there will be an 8990M. http://forum.notebookreview.com/gam...-radeon-hd-8970m-unveiled-11.html#post9186820
-
I've even heard rumours of the 8990M being two 8970Ms on one card. We'll have to wait and see.
-
That's ridiculous, lol. More so than what I referenced.
-
I wish I could find the article again. Either way, it's just speculation until they release it. Hopefully it's a beast of a card, because it would be right up my alley.
-
I'm simultaneously excited and depressed, for hot on the heels of the 780 will be the 880, and so on ad infinitum. Whatever you have will become obsolete in less than a year. I am willing to wager that both AMD and Nvidia have hardware much faster than the 780 and 8990; in model-number terms it probably approximates a 1580 and a 15990. But if they were to release that onto the market now, tech that represents the actual state of the art, they would have a failed business model, as it would probably take them 5-10 years to surpass it in engineering terms. So by slowly feeding us subtle speed improvements year over year (has anyone wondered why there is a marked but limited improvement every year or so over the predecessor? More curious is how nearly identical the hardware from two separate companies is, e.g. the 680 vs. the 7990) they can continue to finance their operations, at considerable profit I might add. I'm sitting this next round out; I'm going to run my 680s until failure post-warranty warrants their replacement.
-
Umm, that could open up the possibility of quadfire on the M18x. Now that would give an advantage to AMD on these laptops.
-
Let's not go into conspiracy theories. It's all too common for a company to keep privileged information to itself and keep tech from evolving too quickly.
But if that were the case, jumps like the 580 -> 680 wouldn't have been so big; the same goes for the 6990 -> 7970. And it's nothing new to re-brand old silicon and launch it as a new card.
To be honest, I doubt we will see two dies on one card inside a laptop; there is just not enough space with MXM. Unless Dell goes back to the M1730-style GPU solution. -
I think it is possible to have two dies on one card in a laptop, just not probable right now.
-
Like I said, it's just a rumour; the disadvantage would be the massive power setup it would need, though.
-
Meaker@Sager Company Representative
Not possible on the current MXM system, no. -
I wouldn't exactly call it a conspiracy theory, more so the unfortunate way companies often release their technology on a drip to maximize profits. Intel does the same thing to consumers with its mediocre performance increases each iteration. This just highlights why healthy competition is good for the consumer.
If you want to talk about conspiracy theories, have a look at all the bought-out and shelved tech sitting in the patent office. -
Bear in mind, folks, that the GTX 780 is a GK110 part - based on the same die as the Titan and the Tesla K20. As such, we might not see a mobile part for some time, and there's nothing to say that, if we do see a 780M, it won't simply be a 680M with slightly higher clocks.
-
Oh boy, here we go again...
-
Bleh, it's not conspiracy stuff. There are roadmaps that show releases with approximate dates and the fab process anticipated for each release. The next huge jump for Nvidia is Maxwell. It will be beautiful, and likely a bigger jump than the 580 -> 680 was. All the releases now are just incremental upgrades due to being on the same fab process with existing/similar architecture. That's pretty much the gist of it. Now, the conspiracy is when game developers purposely add crap to their code to create the buzz that the game is a "GPU KILLER" and get people all excited to upgrade... but I digress, and that's a topic for a whole other thread.
Fin. -
Regarding upgrades, I always think about the advantages and disadvantages of upgrading to a newly released GPU arch vs. upgrading to that same GPU arch at the end of its lifecycle. This assumes I'm starting from an older GPU arch (upgrading within the same arch is not worthwhile, in my opinion). -
Haha, dude, this is normal. It's not magic, nor a particular conspiracy. R&D has plans and designs that allow for the performance they want. The problem is, sometimes the cost is too prohibitive for the consumer, so they choose to maximize profit while remaining competitive. You can see trends, specifications, and even protocols detailed years in advance (3G and 4G were conceived and specified several years before market introduction, for example), and thus it takes time and market development for some technologies to arrive.
Now, there are many variables, including competing technologies that can adapt better and become superior, thus killing off other techs. Then there is the issue of lack of competition, which can sometimes stagnate the progress introduced to the market. Cost, competition, etc. are driving forces in many developments. Even with current cellphone technologies, you can see detailed ARM CPUs and GPUs designed years ago that far surpass current popular offerings, but due to cost, production, manufacturing feasibility and time to market, they haven't found their way into products.
Now, while technology is advancing every year, that does not mean you are "obsolete" whatsoever. We can develop everything we want, but people still need to buy it. And if people can't afford it or feel it is not time to buy, then we must first convince them. Since most people have particular budgets, everything tends to cater to the masses, and thus even GPU solutions from years ago are more than relevant today.
For example, I just read an article on Tom's Hardware comparing the Core 2 Duo and Quad to current entry-level offerings. On some points, even a Core 2 Quad gave similar or more playable gaming performance, and those CPUs are 5 years old. Sure, they consume quite a bit more power and are much slower in other areas, but they are still relevant. Hell, there are people still gaming today on a Pentium 4 or Athlon 64 with mid-range GPUs from years ago (HD4650). Sure, they are gaming at the lowest end by now, but they still enjoy it. -
Meaker@Sager Company Representative
The 680M is hardly going to be obsolete quickly.
-
Exactly. Nothing to worry about yet.
-
Karamazovmm Overthinking? Always!
Stop lying, Meaker; the 780M has turbo-boosted rainbows now, nothing can beat that -
AMD has been bleeding money for years. If they have their architectures 8 gens in advance, why are they so uncompetitive?
Look at the $1 billion in R&D that Nvidia spent, or the $1.3 billion that AMD spent last year. All of that goes into developing newer microarchitectures.
Not to mention that much of the technology required for these performance increases depends on other companies and other segments of the industry. Why don't AMD and Nvidia have some finished 14nm next-gen GPUs hidden away? Because TSMC and GloFo don't have a 14nm process yet.
No doubt Nvidia, AMD, Intel (and Qualcomm, ARM, etc.) all have some next-gen engineering samples and plans. But those are far from being the technologically and commercially viable products people make them out to be. -
Lol, made me laugh.