Straight line? There is plenty of software, and plenty of scenarios, where sequential speed comes into play, Meaker. Especially when working with big files (loading big maps in games too...). There are other benefits (and cons) but that belongs in a different thread, so let us not have that discussion. I like the bragging rights that come with it. Let's just leave it at that!
Except the 680M does not have GPU Boost 2.0, and it's not the same as the boost the current 600 series has...
I find it strange that someone like you who mods GPUs doesn't know the difference.
-
-
TheBlackIdentity Notebook Evangelist
-
I think it will be 1536 CUDA cores alright, but I wouldn't have upgraded if I had a 680M. I still may not upgrade if AMD is indeed releasing the 8990M on 20nm fabrication this year; that would be huge.
-
Especially at high resolutions with AA and other effects enabled, the extra memory speed should help.
What Meaker means is that traditional Windows use etc. won't really change much, because you are still bound by small-file I/O performance, not so much by huge files transferring in a single second.
Still, it sounds mighty impressive! -
Yeah, I think it is 1536 cores too. Even if an overclocked 680M reached the same potential, from a marketing standpoint, and given they already have the 680MX designed and ready, it gives them much more bragging rights with the average customer, for whom "more cores is better" regardless of everything else. It will be easier to sell at a premium compared to just an OC'd version.
-
If the 780M is just a 680M with GPU Boost, there won't be any difference other than having to do the "GPU Boost" manually on the 680M. Or OC the memory. Lower voltage is always nice though.
They doubled the core count on the GTX 760M compared to the 660M, so that might be a sign that they will increase cores with the GTX 780M too.
Several possibilities -
TheBlackIdentity Notebook Evangelist
-
Seriously... that's exactly what I was talking about earlier today: use your brain before posting.
Boost is just a vbios feature, the new crap isn't any better than the old, that's why I disabled it in most of my mods, even on the Titan. It seems to me that you've never used anything with your much-praised Boost 2.0... you just see slides from Nvidia and therefore conclude it must be something great. -
"Use your brain before posting"
Says the guy who doesn't understand that people have different needs. Seriously svl, that was a little short-sighted and rude of you to say.
For the overclocker: it might be a hassle and not worth it, since overclockers already have their own profiles for whatever they do and control temperature that way. I still think the temperature slider is a nice feature.
For the average Joe: it's great, because it automatically overclocks the GPU based on temperature and available power and will never exceed the temperature target, while the system ensures the user gets maximum performance from the GPU at all times. The user who wants better performance doesn't have to deal with Afterburner, create profiles or overclock.
For OEMs: great. As long as the base specs fit the thermal envelope of their notebook, whether it's slim, fat, big or bulky, they now have a system that ensures the GPU gives the customer the best performance, without having to deal with fried GPUs or other components because they miscalculated TDP or put too hot a GPU in their system. With notebooks and the restricted thermal room available, I'm pretty sure GPU Boost 2.0 will be gladly accepted among notebook OEMs. My rough mental model of the boost loop is sketched right below this list.
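Just to be clear about how I picture it (a minimal sketch only; the real logic lives in Nvidia's vbios/driver and isn't public, and every number below (temperature target, power limit, bin size, clocks) is a made-up example, not an actual 780M spec):

```python
# Simplified, hypothetical sketch of a boost control loop.
# All numbers are illustrative assumptions, not actual 780M figures.

TEMP_TARGET_C = 80    # the user-adjustable temperature slider in Boost 2.0
POWER_LIMIT_W = 100   # TDP budget the OEM designed the chassis around
BIN_MHZ = 13          # size of one clock step ("bin")
BASE_MHZ = 771        # guaranteed base clock
MAX_BOOST_MHZ = 993   # highest boost bin

def next_clock(current_mhz, temp_c, power_w):
    """Step the clock up while there is thermal and power headroom,
    step it back down as soon as either limit is hit."""
    if temp_c < TEMP_TARGET_C and power_w < POWER_LIMIT_W:
        return min(current_mhz + BIN_MHZ, MAX_BOOST_MHZ)
    return max(current_mhz - BIN_MHZ, BASE_MHZ)

print(next_clock(771, 65, 70))   # headroom left -> steps up to 784
print(next_clock(993, 82, 95))   # over the temp target -> steps down to 980
```

The point is simply that the card keeps stepping up until it hits whichever limit comes first, which is why the same GPU can sit in a thin chassis or a bulky one and still stay inside its envelope.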
No, I haven't tried it yet, but I have a few friends who own the Titan and overclock theirs, and I have seen the system in action.
It's kind of like Nvidia's program that automatically picks the best settings in any game based on your hardware. Some hate it and want to find out what fits them best, while other people like it. -
No dude, GPU Boost 2.0 or any other variant is pure garbage concocted by NVIDIA to protect themselves and their AIB partners from RMA returns. They don't want people doing real overclocking, so they introduced power limits and b.s. "boost" that really just scales up to the card's pre-defined TDP (it never exceeds it no matter how good your cooling).
As a result, cards like the Titan come with a weak VRM, and that's why 1.215V is the hard limit of the card--to exceed it you have to do a zombie mod. If they had properly built a $1000 card, none of this would have been necessary. It's all about the $$$, but fortunately AMD hasn't gone down that road. See what you made me do? You made me make AMD look good and now svl7 is gonna rub it in.
P.S. SVL7 is right, you should read up on what this stuff is about first. If anyone knows about boost and the inner workings of nvidia vbios, it's him. -
You are welcome to tell me where I'm wrong, Joker, since you say I need to read up on it.
"NVIDIA to protect themselves and their AIB partners from RMA returns"
And how is that not related to OEMs (which I talked about)? And how on earth can you guys compare desktops, which have almost unrestricted thermal capacity, against notebooks with a 100W TDP limit? It's like two totally different worlds.
I understand the serious overclockers' concern about the Titan. I know about the restrictions GPU Boost puts on the user who really doesn't want yet another feature clamping down on overclocking. I wrote about it in the post above yours. You don't have to tell me this; I have been involved with different vbioses for the Titan and seen what they do. But it's wrong to say GPU Boost is not meant for notebooks when its own pros seem tailored to exactly that sort of system.
Not everyone is a hardcore overclocker. The system still allows you to overclock. I know many who will be happy with it and don't see it as crap. I see great potential in using this in a mobile environment. I'd like to see you guys come up with an argument for why it should not be used by the average Joe... -
-
Also, I'm not really sure how you are in a position to criticise svl, given his contributions to the community and proven expertise with GPUs. It's also worth pointing out how you don't react to the posts that make remarks on your quite biased views on nVidia GPUs and your general behaviour on the forum. Giving an explanation of why you are constantly exaggerating and discarding opposing opinions might increase your credibility.
Anyway, I think there is a very real chance that nVidia will just release an overclocked 680M, given that the 485/580/675 were the same cards as well as far as I know, and that AMD doesn't seem to have anything that would beat the 780M significantly in this scenario either. nVidia is also not exactly known for releasing the best stuff they have when they don't have to. -
As for your Nvidia bias BS, troll harder next time, eh? You are clearly flamebaiting.
I didn't criticise SVL, I tried to have a debate with him. He does know his way around vbios and that stuff, but that does not mean he is always right. I disagree with him about GPU Boost 2.0. I'm allowed to have a different opinion, right? -
-
I totally understand and agree that to some extent GPU Boost does mess up overclocking at some point. So I guess it's ok to have different views on it. Let's just leave it at that. I personally think it's interesting and welcome it. The rest can join forces with SVL to get rid of the godforsaken crap -
That's a very fundamental rule for a reasonable discussion.
It's not as if I expect you to admit that anything Nvidia does could possibly not be the best idea for everyone, but that's how it is. We're talking about the highest-end GPU here, which costs almost as much as a Titan, and the boost is just castrating it; Boost 2.0 is even worse than the first-gen stuff when it comes to this. You haven't used 2.0 yet, right?
It's just as Joker says: boost has no business in the enthusiast market.
Also that stupid policy of Nvidia which results in only reference cards being released... but that's another story.
-
BTW did you guys hear that AMD plans to release their 20nm chips in 2013? Probably will have a paper launch in December with real cards (both mobile and desktop) hitting in early 2014 but it should be out on the market before NVIDIA can answer with their own die shrink.
-
The 780M costs almost as much as a Titan, yes. Well, that may be right, but not everyone overclocks it either. The same thing happened to some Titan owners. Some grew tired of Boost, changed to higher power settings, tried tons of vbioses; some didn't mind. Why should the 780M be any different?
And I disagree with the whole "it costs $800 so they must be enthusiasts" argument. I know several people who just want to game. They don't know much about hardware, they own a 680M, they just want enough power for the newest games.
I have seen Prema's data on the voltage difference between the 650M and 745M. It got a pretty big voltage reduction, so yes, I can understand that they have a lot of voltage headroom to play around with. Hence why I had no problem believing a 680MX at 100W TDP. -
omg, these threads turn so quickly into reality shows
-
Very good point. Nvidia, just like any other company, is using marketing strategies here (to beat its rival, AMD, on paper). Check this out:
1. Nvidia plans the 675MX as the 680M last year, leaking screenies to figure out what AMD has in its pocket.
2. AMD, seeing Nvidia aiming at a moderate increase in power, challenges them with the 7970M in an instant.
3. Nvidia, realizing they are about to lose the high-end market to AMD, pulls next year's product, the actual 680M, forward a year.
4. Nvidia, realizing they are about to lose a year of profit from the next gen, underclocks AND overvolts the 680M, so that it performs much worse at stock than it could (and comes in on par with the 7970M).
5. Nvidia pulls the clocks and voltage back to normal on the 680M and re-releases it, for a huge gain in performance (at stock), a very nice illusion.
6. Question is, will AMD seriously challenge Nvidia again with 20nm this year?
I wouldn't call it trolling, but certainly profit-oriented
+rep for the beautiful reasoning there Svl!
-
Hahahaha I agree maxheap. People, don't fight so much, and don't get so offended, Cloud. Enthusiasts will work to overcome the limitations that come with GPU Boost. Users oblivious to the whole deal will enjoy their high-end gaming.
-
-
-
As others said though, I think nVidia can easily get away with a tweaked 680M this year as the 8970M is pretty much an overclocked 7970M as well, and keep the 680MX for next year or release a 780MX later on maybe.
-
Unless nVidia wants to delay Maxwell, I think they will offer some form of optimized 680MX as the 780M this year, likely in the next month. Whether it will truly be a 680MX with just lower-voltage parts and lower clocks is yet to be confirmed. I can easily see them just bumping up the clock speed of the 680M, considering how easily it can overclock at reasonable temperatures, and maybe that's all it will be.
-
Meaker@Sager Company Representative
Power consumption does go up a lot with frequency though, so a 30% increase in frequency means drawing 20-30% more power.
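(Rough back-of-the-envelope only: dynamic power scales roughly with voltage squared times frequency, and the clock/voltage numbers in this little sketch are made-up examples, not real 680M/780M figures.)

```python
# Back-of-the-envelope dynamic power scaling: P is roughly proportional
# to V^2 * f. All clock/voltage numbers are illustrative assumptions,
# not real 680M/780M specs, and static/leakage power is ignored.

def relative_power(f_old, f_new, v_old, v_new):
    """New power as a fraction of old power for the same chip."""
    return (f_new / f_old) * (v_new / v_old) ** 2

# 30% higher clock at the same voltage -> roughly 30% more power
print(relative_power(720, 936, 1.00, 1.00))   # ~1.30

# 30% higher clock on a better-binned part at slightly lower voltage
print(relative_power(720, 936, 1.00, 0.95))   # ~1.17, i.e. ~17% more
```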
-
-
Meaker@Sager Company Representative
TDP is power though really.
The way boost works, a benchmark like 3DMark 11 is going to see the biggest increase while a game benchmark like Crysis 3 will see the smallest gain.
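(Just a toy way of picturing it; the per-workload power figures below are made-up assumptions, not measurements, and the helper function is purely illustrative.)

```python
# Toy illustration of why a lighter, bursty benchmark scene can hold higher
# boost clocks than a heavy, sustained game load. All numbers are made up.

POWER_LIMIT_W = 100
BASE_MHZ = 771
MAX_BOOST_MHZ = 993
BIN_MHZ = 13

def sustained_clock(watts_per_100mhz):
    """Highest boost bin whose estimated draw still fits the power budget."""
    clock = MAX_BOOST_MHZ
    while clock > BASE_MHZ and clock / 100 * watts_per_100mhz > POWER_LIMIT_W:
        clock -= BIN_MHZ
    return clock

print(sustained_clock(9))    # lighter load -> holds the top bin (993 MHz)
print(sustained_clock(12))   # heavier load -> drops bins to ~824 MHz
```

The heavier the sustained load, the less headroom there is for boost to cash in. -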
The 780M a rebranded 680MX? If you compare the specs of the 780M and 680MX at the links below, you'll see they're exactly the same.
700m series: Comparison of Nvidia graphics processing units - Wikipedia, the free encyclopedia
600m series: Comparison of Nvidia graphics processing units - Wikipedia, the free encyclopedia -
Meaker@Sager Company Representative
Yes, but that's based off rumored specifications; I think at one point it was claiming it would be Titan-based.
-
I see. So as of now the most reliable thing on the 780M is the 3DMark11 scores you found, run by someone in China?
And I also think I read somewhere on here, from you or someone else, that the 8970M is a 7970M with higher clocks? -
-
I will wait till the next generation; this just isn't a big enough leap. My 680M reaches insane scores and runs everything at high to ultra settings.
I can deal with it -
-
Also I think with the 790M you can feel the graphics, you can feel water and gunshots, and then AMD goes bankrupt and they buy Intel.... cannot wait!!
What a hype -
I'm hoping Dell goes for the 4GB model this time. Games today are chomping up VRAM, especially on high-res external monitors.
-
How many FPS do I get? -
LOL. Don't get me wrong. Just going from a 680M to a 780M won't be that significant. From even a 580M or anything less powerful it will be a world of difference. But heck, a 280M to SLI 780Ms is like ecstasy. -
@HT: With multiple monitors it will. I'd still prefer 4GB over 2GB, especially if I don't plan to upgrade for a few years. Personal preference I suppose. -
-
Edit: And if you were wondering how I have room for multiple monitors, I don't. I sneak into a lab at night and set up monitors to game, haha. Haven't gotten in trouble yet. -
16 x 9 x 8 inches. A smaller footprint than most 17-inch notebooks.
-
-
Haven't looked into those types. Can all the hardware fit in that thing? It looks small, like the X51.
-
-
failwheeldrive Notebook Deity
Yup, you can go all the way up to 690s and Titans with mini-ITX builds. People even fit full water-cooling loops in them these days. You should also check out micro-ATX cases like the Corsair 350D, J.Dre. They're not much bigger than mini-ITX cases and can fit multiple graphics cards. Pretty crazy what you can do with desktops these days.
-
I think the GTX 780M will probably be the 680MX but tweaked a bit to meet the 100W TDP of most high-end notebooks. Of course you never know; NVIDIA could very well get greedy and just toss an overclocked 680M in there and still whip AMD. -
I used to take my Shuttle "desktop" everywhere. It fit in a duffle bag, and I hung a 17" monitor off the side of it for use wherever I went. Shuttles are/were great; the problem is the proprietary motherboard, power supply and form factor, so you're stuck with their products and replacement parts are expensive. Plus they weren't always the best quality. But it was great to have.
-
failwheeldrive Notebook Deity