It's not Titan; the Sager user manuals that were published have proved that, so that little theory can be put to bed now.
We also know that GK104 is getting no revision, just slightly different binning so there will be no IPC improvements.
(GK104-400-A2 = GTX680, GK104-425-A2 = GTX770) on the desktop.
The desktop GTX 780 will be the Titan LE, which may be where your friend was getting confused; it will have a 320-bit GDDR5 bus and fewer SMXs than the current Titan.
There may also be a fully enabled Titan coming out.
So the best hope is that we get GK104-425-A2 on the 780M which would be very nice.
-
Meaker@Sager Company Representative
-
^^ seconded!! gtx 770 in mobile platform, make it happen!
-
Yup, it looks like we will get a 4GB 780M and not 5GB. That rules out a 320-bit bus and GK110.
Another thing I noticed: the PSU is still 180W on the 780M models. Not sure if I like that -
Meaker@Sager Company Representative
I noticed that too, which means the TDP is still 100W, since any increase would need a larger PSU.
-
Unless Nvidia somehow releases an improved chip, a GK114 exclusive to the mobile GTX 780M, I'm expecting either a GTX 680M with GPU Boost 2.0 or a GTX 780M (680MX) that eats up most of what that 180W can deliver.
So much for overclocking potential... -
Meaker@Sager Company Representative
No, we know GK114 is going to be the next generation of chip. There will not be a mobile-specific chip like that when it would have huge implications for the desktop market.
It would also have to sell for $5000 just to make back the costs of development :/ -
GK114 might not even be a reality, Meaker. Since Nvidia is focusing on GK104 and GK110 for the GTX 7xx series, the next generation will be GMxxx, out next year.
Oh boy, now I don't know what to think about the upcoming card. If it's a GTX 680M with GPU Boost 2.0, that's not very tempting, since it's just automatic overclocking on a 680M. Manual still works.
If it's a 1536-core GPU on a 180W PSU, what difference does it make vs a 680M? We can reach 680MX clocks by overclocking a 680M; the cores can handle that easily. If they couldn't, there would be a reason to use a 1536-core GPU, since the load would be distributed over more cores instead of stressing a smaller core count with higher clocks.
What do you think? -
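The cores-vs-clocks argument can be sanity-checked with a rough shader-throughput estimate (cores × clock). This is only a ballpark sketch; the figures below are assumptions from published specs (GTX 680M: 1344 cores at 720MHz, GTX 680MX: 1536 cores at 720MHz), and real performance also depends on memory bandwidth and boost behavior:

```python
# Rough shader-throughput comparison: throughput ~ cores * clock.
# Assumed specs: GTX 680M = 1344 cores @ 720MHz, GTX 680MX = 1536 cores @ 720MHz.
m680 = 1344 * 720     # GTX 680M at stock
m680mx = 1536 * 720   # GTX 680MX at stock

print(f"680MX advantage at stock: {m680mx / m680 - 1:.1%}")  # 14.3%

# Clock a 680M would need to match the 680MX's raw shader throughput:
match_clock = 1536 * 720 / 1344
print(f"680M needs roughly {match_clock:.0f}MHz to match")   # ~823MHz
```

So an overclocked 680M in the low 800MHz range lands in the same raw-throughput ballpark as a stock 680MX, which is exactly the point made above.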
Meaker@Sager Company Representative
The finer details are very hard to pin down. A slightly more aggressively binned core could mean we see the same RAM but a fully unlocked core, maybe with some form of turbo: great for 3DMark 11 P runs, but not such impressive gains in 1080p gaming.
Of course, if you can tweak the RAM to go with it you will see gains over the 680M, but baby steps rather than leaps.
Either way, I think my 240W PSU was a great investment, and getting hold of a larger brick and modifying it may be a staple for some time with the 15-inch Sagers and MSI machines if you want to do any serious tweaking. -
That reminds me, I have to give my PSU mod a second try. Didn't you just solder the tip on and not at the PSU unit? Seems to be holding up just fine?
-
failwheeldrive Notebook Deity
Source? Last I heard, the rumor mill was saying that the 780 won't be anywhere near as powerful as the Titan, let alone the Titan LE. Why release a $1k card, then release an even more powerful one for $500 a few months later? Doesn't make a whole lot of sense imo (not that Nvidia hasn't done retarded things in the past lol) -
TheBlackIdentity Notebook Evangelist
The Titan LE is a cut-down version of the Titan with 2496 cores, a 320-bit bus and 5GB of RAM. It is rumored that the 780 will be the Titan LE with only 2.5GB of memory and possibly lower clocks. That would still be an awesome deal for 500 bucks. I'd love to see Asus make a non-reference card with that core.
It's also rumored that there will be a Titan Ultra with the fully unlocked GK110 core clocked at 950MHz base. That should be an awesome performer as well. It'll probably match the 690.
-
failwheeldrive Notebook Deity
I see, I was under the impression that the Titan LE would be more powerful than the Titan Superclocked. That would be awesome if the 780 ended up being GK110. -
And now let our imaginations go all out and picture a 790 with two Titan Ultras on one PCB
Sent from my Galaxy Nexus using Tapatalk 2 -
TheBlackIdentity Notebook Evangelist
That's unlikely, however two Titan LEs would be more than sufficient to destroy the new 7990. Hell, even the Titan Ultra would be right up there with it, and I'd definitely take a huge single core from Nvidia over two smaller ones from AMD.
What I really want is Nvidia allowing custom PCBs. Imagine an Asus DirectCU II TOP Titan Ultra!
-
Meaker@Sager Company Representative
No, mine was a simple two-terminal design, so I replaced the cable.
Two Titan LE cards will also take up more space and use more power. -
Whatever the GTX 780 ends up performing at, the Maxwell GTX 880 will be at least 50%, maybe 100%, better! That's what I am excited about. Also unified memory.
-
TheBlackIdentity Notebook Evangelist
The GK110 core has better performance/watt figures, so it would outperform the 7990 for the same amount of power.
7970 perf/watt: 14.59 GFLOPS/W
GK110 perf/watt: 16.81 GFLOPS/W -
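Those perf/watt numbers can be roughly reproduced as peak single-precision throughput divided by board TDP. The specs below are assumptions taken from the public FirePro W9000 and Tesla K20X datasheets, not measured gaming numbers, so the result lands close to (not exactly on) the figures quoted:

```python
# Perf/watt = peak single-precision GFLOPS / board TDP in watts.
# Assumed specs: FirePro W9000 ~3990 GFLOPS at 274W, Tesla K20X ~3950 GFLOPS at 235W.
cards = {
    "Tahiti (FirePro W9000)": (3990, 274),
    "GK110 (Tesla K20X)": (3950, 235),
}
for name, (gflops, watts) in cards.items():
    print(f"{name}: {gflops / watts:.2f} GFLOPS/W")
# Tahiti works out to ~14.56, GK110 to ~16.81 GFLOPS/W
```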
Meaker@Sager Company Representative
The 7990 uses hand-picked cores for lower power consumption and is rated at 375W, so I don't know why you are bringing the 7970 up. In fact it hovers around the usage of two GTX 680 cards, so about 140W less than a pair of Titans under normal gameplay.
-
There is a world of difference between the efficiency of a GK110/GK104 and 7990.
Yep, 240W would be ideal for a 680MX. What concerns me, however, is that there might be a thermal reason why they chose a 180W instead of a 240W PSU. Perhaps 180W is sort of the limit for the GPU; bigger than that and it becomes too hot on stock cooling? Meaning it might not have the juice we hoped it had. Or maybe we are still stuck with a GTX 680M, only with GPU Boost. Unless we get GK114, that might be a game changer. -
Remember, all NVIDIA needs to do to lower the watts is undervolt the same old chips; they don't even need to pick better ones for the 7-series or count on improved production.
I am currently running a "slightly edited" GT740M vBIOS on my GT650M, and guess what: to reduce power consumption it uses the EXACT voltage tables that users of my undervolt mod have already been running on their GK107s for over a year... so it was sort of a proven "open beta" model... ha ha ha
-
Interesting. So Kepler can be undervolted a lot? How many watts are we talking here with undervolting?
-
I haven't measured pull from the wall, but they lowered the voltage for 135-405MHz both to 0.8120V (135-270 was 0.8870 and 405 was 0.9250), 3D from 1.0500 to 0.9870 and Turbo from 1.0870 to 1.0250...
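As a rough estimate of what that undervolt buys: dynamic power scales with roughly the square of voltage at a fixed clock (P ≈ C·V²·f), so the relative saving is about 1 − (V_new/V_old)². A minimal sketch using the 3D and Turbo voltages quoted above, ignoring static/leakage power:

```python
# Relative dynamic-power saving from an undervolt at unchanged clocks:
# saving ~= 1 - (V_new / V_old)^2, ignoring static/leakage power.
states = {
    "3D":    (1.0500, 0.9870),
    "Turbo": (1.0870, 1.0250),
}
for name, (v_old, v_new) in states.items():
    saving = 1 - (v_new / v_old) ** 2
    print(f"{name}: roughly {saving:.0%} less dynamic power")
# works out to roughly 12% (3D) and 11% (Turbo)
```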
-
dayum, that's quite significant!
thx for the heads-up
-
Yeah, that's a lot. Even GPU Boost has a lower voltage than 3D mode on the 600 series. Well, that leaves me optimistic again for the 780M: an undervolted full GK104, with GPU Boost like the rest of the 700 series. Thanks Prema for restoring my hope for the future
-
Meaker@Sager Company Representative
Well, considering the 680M can get higher than 950MHz at stock voltage, yes, there is a lot of room for tweaking. Not good news for overclockers though.
Also, Cloud, you are comparing a single Titan to a dual-card setup; the 7990 offers 7970 performance per watt while already factoring in scaling losses. And don't look at the all-resolution bar when that includes 1280x800 tests, as anyone who buys a 7990 or a Titan setup for that res needs their brain replaced. -
TheBlackIdentity Notebook Evangelist
I'm using the 7970 because that's the only thing I could find a figure for. Actually it's the FirePro W9000's performance/watt, which is a 7970 core. The GK110 figure is from the K20X.
Anyway, who says Nvidia can't specifically choose low-consumption cores? Face it, Meaker: it won't take much effort for Nvidia to beat the 7990 if they want to. Even a Titan Ultra should be right up there with it, and seeing as it would have better frametimes, being one massive core, it would be the winner any day in my book. Framerates are completely meaningless now.
Thanks for that graph, Cloud! I really hope AMD's next arch will be something more efficient. If Nvidia pulls off that 80% perf/watt improvement with Maxwell, then AMD will be waaay behind. Look at what happened with this gen: AMD messed up, so we only got the mid-range NV core as the 680. This year they're not bringing out anything, so the GK110, which should have been the 680, is selling for a thousand bucks. First AMD's incompetence messes up the CPU market and now the GPU market. I really don't like where this is going. -
I'm just going to respond with a big *SIGH* to this...........
-
TheBlackIdentity Notebook Evangelist
I had the same reaction when I saw the 7990's performance results. lol
Like it or not, a 790 would destroy the 7990. Whether we'll see a 790 or not is another question. Personally I doubt it. -
I think there's a lot to be gained from looking at what hardware is being produced by each, but to claim one or the other is incompetent likely far exceeds your own level of expertise.
-
TheBlackIdentity Notebook Evangelist
What do you mean flagship gen to flagship original? -
Disregard that comment, sleep deprivation is starting to take its toll. -
TheBlackIdentity Notebook Evangelist
Well, what would you call AMD's inability to make high-performance CPUs or efficient GPUs? I'd call it the incompetence of their engineers. Alternatively, they could be lacking the funds to do so. After all, AMD is putting most of their money into those cheap garbage APUs of theirs, and they do beat Intel's low-end stuff, but that's meaningless for us enthusiasts.
I just hope their Volcanic Islands GPUs will be better next year. They're supposed to be revolutionary, but from what I've read all I see is them copying Nvidia's Project Denver. -
Well, let us remember some facts before we blatantly destroy AMD. They are a cute little company struggling against giants like:
Intel, market cap 115.74B
Nvidia, market cap 8.26B
And let us take a look at AMD: market cap 1.88B
I mean, with almost 1/60 the value of Intel and 1/5 that of Nvidia, they are still competing with those companies in the CPU and GPU departments respectively. I would say they are doing a good job! -
Let's also remember they are in this state because they failed to advance their technology like those other two did, starting 3-6 years ago (especially on the CPU side).
-
You know, I will say this one thing. If AMD got their driver stuff together, PERIOD, they'd flat out blow nVidia out of the water, especially in the mobile market with their still-cheap GPUs. I cannot feel any empathy toward AMD if they themselves are the reason why many users have sworn them off, and even half of their fanboys have pointed out exactly how retarded some of their stuff is. Like, my buddy has three monitors and two 7870s, but he can either use one card + Eyefinity with 3 monitors OR two cards in CrossFireX + 2 monitors. He is, BY DESIGN, unable to use 3 monitors while using CrossFireX. Like, no, AMD. Fix it, and more people will buy your stuff.
-
TheBlackIdentity Notebook Evangelist
Well, tbh I wish they could sell more of their FX-8350s. Those are damn good value, tying even the 3770K here and there for 60% of the price. I wonder if that rumor about the 5GHz FX-8350 is true or not.
Very true, their drivers are what keep me away. I'd love to save a few bucks, but I'm simply not willing to put up with their slow driver development and horrible frame latencies.
I mean look at all those red spikes: http://uk.hardware.info/reviews/404...e-gtx-660-frametimes-review-hitman-absolution
http://uk.hardware.info/reviews/404...a-geforce-gtx-660-frametimes-review-far-cry-3
Their cards stutter like crazy, and those games are supposed to be optimized for AMD. -
So for some reason you didn't even pay attention to the 7970GHz there? It's just about where the 7990 is in terms of efficiency. Not really shocking when it really is two 7970s inside the 7990...
Also, if you remove that 800p resolution and use 1050p, 1200p and 1600p, the Titan comes out 32% ahead of the 7970GHz in terms of performance/watt ( source). Everyone knows that Tahiti XT is really inefficient, to the point that it gets trumped by their own Pitcairn chip. As for the high end, GK110 is the most efficient chip out there, which is why it's no surprise Nvidia wants to make not just one card out of it, but three. -
Not only tying, but BEATING the 3770K in a lot of places: livestreaming at high resolutions, gaming above 1080p, etc. In regular use, gaming at 1080p or below or programs like Photoshop, you won't notice the difference though. But livestreaming is getting more popular, and a lot of people use it for spare income or even a full-time job. And so many people are looking for 1440p screens now too; it'll help there as well, by a good bit according to Tek Syndicate's comparison and review.
It does run hotter, though. That is a fact. But it's mostly a hardware 8-core, so that's to be expected. -
TheBlackIdentity Notebook Evangelist
Not surprising really. Eight weaker cores will still be better than four stronger ones in multitasking, and above 1080p the CPU gets less important for the game. I wish AMD made more money off those CPUs.
PS: It runs hotter because it's on 32nm and has a 125W TDP. Intel is waaay ahead in the efficiency department. -
I'm not sure which tests you are referring to, but an i5-3570K beats the FX-8350 in the clear majority of games. In a few games the FX-8350 is within 2-3 FPS of the Intel chip. The 3570K only costs $20 more, and it's a world of difference in power consumption between those two
-
TheBlackIdentity Notebook Evangelist
There are a lot of newer tests where the 8350 performs much better now, especially on Windows 8, which is better optimized for it. Obviously in games that favor single-core performance it'll lose, but otherwise it's ahead of or equal to the 3570K, and sometimes even the 3770K.
http://www.youtube.com/watch?v=eu8Sekdb-IE
http://www.youtube.com/watch?v=YHnsfIJtZ2o
There were two fixes in Win7 for it, plus you need to disable core parking for it to get fully used. I'd also recommend that for Intel CPUs. With it disabled, BF3 uses all 8 threads on my 2670QM. The game feels smoother now and has no hangs. -
Aha. I specifically remember that the 3570K beat the 8350 in just about every single review I read, but that was on Windows 7. Don't you have any other reviews than those two videos? I've seen both Linus and razetheworld put out ridiculous benchmarks many times before.
Well, never mind. That discussion belongs in a different thread. Interesting, though, that Win8 gives better performance for the FX-8350. -
TheBlackIdentity Notebook Evangelist
Yeah, I remember those old reviews too, but since the fixes for Win7 arrived it has improved a lot. Obviously it'll still consume more power because it's on 32nm. -
BlackIdentity has it: you need to install hotfixes for Win7 and such, but as I said before, it remains neck and neck with the 3570K and 3770K UNLESS you use livestreaming programs or game above 1080p resolution. The video with their tests is the first link he posted, too. If someone's just going to be gaming and is worried about heat, or is simply not very computer savvy, a 3570K would probably fit the bill better. But I'm talking about doing real high-end stuff that people buy these high-end CPUs for. I mean no offense, but if you've got a 3770K and all you're doing is playing Guild Wars 2 and World of Warcraft with your CPU OC'd to 4.5GHz, then you're just wasting your time, power and the life of your CPU, because my i7-950, which is WORLDS weaker due to lower speed and the lack of SB/IB's chipset optimizations, eats all those games for breakfast at stock, almost never passing 50% usage. In fact, the ONLY game I've ever seen pass 50% CPU usage on my CPU is BF:BC2 in DX9 mode with max settings, and the DX9 API has been said to be more CPU-dependent than the DX10 or DX11 APIs. And even then it only used maybe 55%.
Wow, yeah, really went off topic there. Oh well. -
Damn this is interesting
-
Meaker@Sager Company Representative
If we want to see them on the mobile side their power efficiency needs to improve. I had a Phenom II X6 running at 4GHz with a highly overclocked northbridge, and it could rival the i7s of the time, but boy did it eat power.
-
TheBlackIdentity Notebook Evangelist
I doubt we'll ever see them in the mobile high end again, not with a standalone CPU. Maybe an APU in a few years; that's where everything's headed anyway, and they're pretty good at those. -
If only their APUs were compatible with their higher-end GPUs for Hybrid CrossFire...
But either way, AMD + AMD has always run better for people, for as long as I've known people. Anytime people mix Intel and AMD, or AMD and nVidia, they've had problems. Including my desktop, which loves bluescreening through two different motherboards. >_>
TheBlackIdentity Notebook Evangelist
Intel and Nvidia work well together. I haven't seen a single BSOD ever on my lappy, except when I overclocked the BCLK too high, but that's my fault, not the system's. Can't speak for desktops. I guess I'll find out if I decide to get a 4770K and a Titan Ultra.
-
Intel and nVidia aren't specifically meant to work together, though. I think it's just that because AMD doesn't like anything other than AMD, Intel + nVidia is the default "works together" combo. If there were a third CPU or GPU manufacturer, their mismatched parts might work well together too.
New details regarding upcoming GTX 780M
Discussion in 'Gaming (Software and Graphics Cards)' started by Cloudfire, Mar 1, 2013.