But it's ooooold!
-
mine just turned two and still going strong
Sent from my Galaxy Nexus using Tapatalk 2 -
King of Interns Simply a laptop enthusiast
Mine turned three and a half a while back.
I haven't had it that long though; I'm just taking care of it through its retirement
-
Mine will be 5 years old in December 2013. I can't believe it's still working, to be honest.
-
ha, wow, blacky definitely beat us there
haha
-
King of Interns Simply a laptop enthusiast
If only you and I still had our old C90S's mate. Then we would have him beat
-
-
Yes, I originally had the 9800M GTX and upgraded to the 280M. It was a good upgrade.
-
the lack of new info in this thread makes me sad
-
I'm not sure what you expected. It's been said over and over that the entire 7xxM series are likely to be 6xxM rebrands. Which makes total sense since it's still based on Kepler on the same 28nm fabrication. Nvidia and AMD go through this pattern all the time, it's predictable. Year 1, new architecture, new fabrication. Year 2, rebadge. Rinse and repeat.
Well anyway, if you are so sad, I would suggest finding new information and then you can share it in this thread. -
Hm yeah, but if the 780m is the 680mx, it's still a decent upgrade. 30% estimated performance increase, and then overclocking the 780m on top of it all..
I'm sure I can expect the usual price of about $800 or $900 for it, yes? -
The 780m is likely not to be clocked the same as the 680mx. Most likely same or slightly slower than the 680m. The extra cores and faster vRAM is what will make the difference and keep it under 100W TDP. I'm sure there will be some overclock headroom though, just your PSU and cooling in the laptop had better be up to the task.
-
Holy Shiva! According to the poster, he has spoken with a guy who has tested the GTX 780M, and it will be a FAST GPU.
Specifications of the upcoming GTX 780M:
GK110: 2112 cores @ 513 - 620MHz. GDDR5 @ 320-bit @ 1250MHz
Features GPU Boost 2.0 like the rest of the 700M series for extra clocks.
Max power consumption of the card is 113W. Leaked tests have revealed that the GTX 780M is up to 33% faster than GTX 680MX.
I double-checked with GK110 to see if the cores match. They do.
Full GK110 = 2880 cores and 15 SMX. 1 SMX = 192 cores. GTX 780M will have 11 SMX (4 disabled). 11 SMX * 192 = 2112 cores. It matches.
113W at those clocks sounds legit too, especially since the GTX 680M doesn't use nearly as much energy as the 580M did, so they do have a lot of headroom for the 780M.
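A quick sketch of that SMX arithmetic, using only the figures quoted in the rumor above (none of this is confirmed spec):

```python
# Sanity-check the rumored GTX 780M core count against GK110's layout.
# Figures from the rumor: full GK110 = 15 SMX units of 192 CUDA cores each;
# the 780M would supposedly ship with 4 SMX disabled.
CORES_PER_SMX = 192
FULL_GK110_SMX = 15
DISABLED_SMX = 4

full_cores = FULL_GK110_SMX * CORES_PER_SMX                    # 2880
rumored_780m_cores = (FULL_GK110_SMX - DISABLED_SMX) * CORES_PER_SMX  # 2112

print(full_cores, rumored_780m_cores)
```

So the rumored count is at least internally consistent with an 11-SMX GK110 part.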
What if this really is true?
Translated source: Google Translate -
HaloGod2012 Notebook Virtuoso
If this is true then I'll never have to go back to a desktop for gaming. Hopefully the M17x R4 can take this thing. The specs mentioned seem too good to be true, but who knows with Nvidia. Looks like the 8970M is going to get crushed
-
The first ever > 256-bit card comes without a die shrink or architectural shift? -
Holy mother of...
60% more cores? Although clocked 200MHz slower, if that 513 means stock and 620 means max boost? But nice that it will have 2500MHz GDDR5.
320-bit is pretty amazing too. If it's truly 33% faster than the 680MX, then it's likely 50% faster than the 680M. Pretty phenomenal. I guess it's just wait and see to know for certain. -
-
No way, Jose. I'll only believe THAT kind of performance jump when I see a legit review
Sent from my Galaxy Nexus using Tapatalk 2 -
If all this is true, this card is gonna be freaking expensive. The 680m is ridiculously overpriced now, so you can expect the 780m to be the same or even more.
-
so it'll be like: new laptop or new gpu, both at 2k
jk
Sent from my Galaxy Nexus using Tapatalk 2 -
Don't get me wrong, I want that to be true, since I'm going with the GTX 780M no matter what.
I just don't see how it can have specs that crush the 680MX so soundly, yet it comes in at an even lower power consumption of 113W. -
I don't trust for a second that the GTX 680M is 100W TDP either, so don't read too much into the database from TechPowerUp. -
failwheeldrive Notebook Deity
If those #s are correct, I'm going to be pretty bummed that I didn't wait a few months to get my M18x. The M18x wouldn't handle two 113w cards, so my only option to get some 780m goodness would be an entirely new laptop
-
I really hope it's true. I'm drooling at that 320-bit bus
Can't you just get a bigger PSU, failwheel? -
failwheeldrive Notebook Deity
-
Wow, if this is true then the 780m should have even more overclocking headroom than the 680m.
-
failwheeldrive Notebook Deity
-
GTX 680: TDP = 195W
Power consumption during gaming = 166W
-
The GTX 680 will draw more than its TDP too if they push it: 228W.
The GTX 650 Ti Boost is also a GPU where TDP and power consumption don't go together.
TDP = 134W.
Yet the maximum power consumption when pushed is 118W, and the peak is 112W.
So it looks pretty mixed to me. Agreed? -
King of Interns Simply a laptop enthusiast
Nvidia, good job! I like that they are playing very competitively from the start this time. AMD are going to be the ones left catching up for once. Unless they indeed bring out an 8990M lol
If they don't, I guess the 780M will cost $800 or so -
Well we know nothing for certain at this time of course. So it's fun to speculate, but it's still just that, speculation. We will see, we will see.
-
Sent from my Galaxy Nexus using Tapatalk 2 -
King of Interns Simply a laptop enthusiast
Lol my bad. Make it $1k
-
Wonder what the GPGPU performance will be like, and whether they'll cut it down like the 6xx cards.
Sent from my Nexus 4 -
So that is actually another reason why Nvidia might be leaning toward GK110 for the GTX 780M: to offer a graphics card that is capable in both gaming and computation. The previous GTX 580M was a decent GPGPU performer; it ran FP64 at 1/8 the FP32 rate. Maybe that is what Nvidia wants to recreate. -
AMD massively outperform Nvidia in OpenCL at the moment. -
Sorry, I wrote that wrong. The GTX 580M was based on the 560 Ti, not the GTX 580, so the 580M had a 1/12 FP64 rate. Still way better than the GTX 680M. So you can only imagine if the GTX 780M comes with 1/3: it would leave the 580M in the dust. Depends on whether the rumor is true though.
Adobe still offers support for CUDA, right? I think you are safer on CUDA than OpenCL for most programs (?) if you can find an Nvidia GPU with decent computation performance. -
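For rough intuition on those FP64-rate fractions (1/3, 1/8, 1/12), here is how they translate into theoretical throughput. The clock and core figures below are placeholders taken from the rumor in this thread, not real specs:

```python
# Theoretical peak throughput for a GPU: each core does 2 FLOPs per cycle
# (one fused multiply-add), and FP64 runs at some fraction of the FP32 rate.
def peak_gflops(cores: int, clock_mhz: float, rate_fraction: float = 1.0) -> float:
    return cores * clock_mhz * 2 * rate_fraction / 1000  # GFLOPS

# Placeholder numbers: 2112 cores at 600 MHz, per the rumored 780M specs.
fp32 = peak_gflops(2112, 600)            # full single-precision rate
fp64_third = peak_gflops(2112, 600, 1/3)    # Tesla/GK110-style 1/3 rate
fp64_twelfth = peak_gflops(2112, 600, 1/12)  # 580M-style 1/12 rate

print(fp32, fp64_third, fp64_twelfth)
```

The gap between 1/3 and 1/12 is why the GPGPU crowd cares so much about which rate a consumer part ships with.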
If Nvidia cut the GPGPU performance down in favour of their Quadro cards, then AMD cards would offer the most performance for the cost.
Sent from my Nexus 4 -
Meaker@Sager Company Representative
Nvidia is not adding a 1/3rd FP64 rate to these GPUs; that would KILL gaming performance. It would be worse than the current chips.
-
Okay, checking those leaked specs and doing a little math, the supposed 780M specs that Cloudfire posted come out quite... fishy.
Firstly, the memory bandwidth beats the GTX 680, never mind the 680M or 680MX: 200GB/s vs 192GB/s. Secondly, the core clock is about 15% faster than the 680MX and the memory bandwidth is 25% faster than the 680MX, so I'm not sure where a 33% boost in performance would come from, unless games are THAT memory-dependent these days. The 33% I would believe versus the 680M itself, where it has something like 80% extra memory bandwidth and 35% extra core speed, which should tear through performance charts in comparison. If that's the case though, as someone pointed out, the card would be extremely expensive and would probably cost as much as SLI 680Ms do now.
The core performance with no overclocking would be within 10% of the GTX 680, and its memory bandwidth already beats it, making it a sort of sidegrade to that card; nothing like that exists or has even been leaked or talked about. In short, I don't believe the technology is there for them to build this for a notebook, and if they do, it will NOT run cool. It'll probably get as hot as the GTX 480M did or worse. Reduced clocks are great and all, but they already have trouble cooling a 680MX in a laptop chassis; such a massive boost to cores and the memory bus, while keeping the 1250MHz base memory clock and keeping it powered and cool, just isn't feasible.
Also, I'd like to point out: earlier, when people said the max TDP is a guideline that some cards cross and some cards don't even hit, that may be true, but if the card is already pushing its limits and its draw fluctuates, it may overdraw power and shut down, or starve other components of power, causing performance fluctuations or crashes. I think they also design cards and machines to err on the side of caution, so that if the card does draw too much power or generate a little too much heat, there is headroom to absorb the overdraw. It could run at that elevated state, but it isn't designed to; if it were, it could fluctuate even higher still. I might be confusing some of you right now; it's often hard to explain things the way my head sees them. But yes, I hope you understand the rest of the post.
TL;DR - I HIGHLY doubt that those specs are legitimate at the current level of tech, at least what's available to the public. *cues conspiracy theories* -
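The bandwidth comparison in that post is easy to check. GDDR5 transfers 4 bits per pin per command-clock cycle, so the rumored 1250MHz clock on a 320-bit bus and the GTX 680's roughly 1500MHz on 256-bit work out as follows (clocks per the rumor and public 680 specs, so treat this as a sketch):

```python
# Memory bandwidth for GDDR5: bus width (bits -> bytes) times the effective
# data rate, where GDDR5 moves 4 bits per pin per command-clock cycle.
def gddr5_bandwidth_gbs(bus_width_bits: int, command_clock_mhz: float) -> float:
    effective_rate_gts = command_clock_mhz * 4 / 1000  # effective GT/s
    return bus_width_bits / 8 * effective_rate_gts     # GB/s

rumored_780m = gddr5_bandwidth_gbs(320, 1250)  # 200.0 GB/s
gtx_680 = gddr5_bandwidth_gbs(256, 1500)       # 192.0 GB/s (~6 GT/s effective)

print(rumored_780m, gtx_680)
```

Which is exactly the 200GB/s vs 192GB/s figure quoted above: the rumored mobile part would out-bandwidth the desktop GTX 680.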
If those specs stick, I may have to re-open "Prema Shop" during the summer break and bring some cards along for you guys...
-
Prema shop! yehaw!
-
Do we have an estimated release date for the 780M, and will it fit in most standard new laptops?
-
Those estimated performance increase figures seem a little overoptimistic to me. Still, I can't wait
New details regarding upcoming GTX 780M
Discussion in 'Gaming (Software and Graphics Cards)' started by Cloudfire, Mar 1, 2013.