Wow, so those default/stock GPU scores were all with an undervolt applied! As you say, that would make the gap between OC'd vs stock even bigger in terms of performance per watt if your default GPU scores had actually been at stock voltage. Well, it certainly supports my findings that overclocking in a TDP-limited scenario will improve performance & efficiency. (+rep for the testing & presentation in a graph)
Robbo99999 Notebook Prophet
Robbo99999 Notebook Prophet
Wow, this is pretty cool: spurred on by the benefits of overclocking in TDP-limited scenarios, I decided to test my card with various TDP limits to work out the most efficient TDP that operates at a 'silent' fan level. Not only that, but I decided to see (on the off chance!) whether reducing the TDP would allow for a higher stable overclock - crazy, right! This is a graph of my TDP vs Superposition 4K benchmark score (chosen because it uses lots of power):
You can see that the efficiencies start to drop off more dramatically after the 70% TDP value, however my card is 'silent' & cool at 80% TDP on Auto Fans, so that's the TDP I'm gonna choose. Note that my GTX 1070 Amp Edition has a high TDP of 220W, so 80% of that is still 176W, which is still quite a lot for a GTX 1070 as the Founders Edition cards are only 150W cards.
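The wattage arithmetic above is worth making explicit; this is just an illustrative sketch (the `power_limit` helper is my own naming, not any tool's API), using the 220W Amp Edition and 150W Founders Edition figures from the post:

```python
# TDP slider percentage -> absolute watts, for the cards discussed above.
AMP_EDITION_TDP_W = 220   # GTX 1070 Amp Edition power limit (per the post)
FOUNDERS_TDP_W = 150      # GTX 1070 Founders Edition power limit

def power_limit(base_tdp_w: float, percent: float) -> float:
    """Absolute power limit in watts for a given TDP slider percentage."""
    return base_tdp_w * percent / 100

for pct in (70, 80, 100, 120):
    watts = power_limit(AMP_EDITION_TDP_W, pct)
    print(f"{pct:>3}% TDP on a {AMP_EDITION_TDP_W} W card = {watts:.0f} W")
# 80% of 220 W is 176 W - still above the 150 W Founders Edition ceiling,
# which is why the 'reduced' profile is not as restrictive as it sounds.
```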
I then wanted to experiment to see if I could gain a higher stable overclock now that I have a reduced TDP of 80% - my theory being that the high-load / high-voltage & high-frequency points would be cut out by the reduced TDP ceiling, so perhaps the card is operating in a more efficient & more overclockable/stable zone on the curve. Well, it turned out to be true: I managed to increase the core clock by 2 overclocking notches without even increasing voltage - an extra 25MHz on the core. Here is a quick summary of my different stable overclocks, two of which will be used for testing today:
1) +75MHz (1682MHz Base Clock), no added voltage, 120% TDP, Aggressive Manual Fan Curve
2) +87MHz (1694MHz Base Clock), 50% added voltage, 120% TDP, Aggressive Manual Fan Curve
3) My new overclock found today based on 80% TDP allowing me to overclock higher: +100MHz (1707MHz Base Clock), no added voltage, 80% TDP, Auto Fans ('silent').
Below are my results comparing Profile 2 vs Profile 3:
Results:
Profile 2) Previous Max Stable Overclock at 120% TDP with Added Voltage & Noisy Aggressive Fan Curve (1694MHz Base Clock):
Profile 3) New Max Stable Overclock at 80% TDP / no added voltage / 'Silent' Auto Fans (1707MHz Base Clock):
From the above results: the 80% TDP overclock scores the same in Firestrike (slightly faster, in fact) and drops just 1% performance in 4K Superposition (the reason being that Superposition wants to draw more power than Firestrike, even though Firestrike is still TDP-limited for the most part at 80% TDP). This new higher overclock at 80% TDP is a big win for me over the previous overvolted 120% TDP profile I've been using historically - same performance, but a lot quieter, with power draw typically 20W lower!
Who'd have thought that lowering the TDP would allow for a greater overclock! (from the previous 1682MHz Base Clock up to the new 1707MHz Base Clock)
Footnote: stability of both the new & old overclocks was tested with a looped 2hr+ run of Firestrike Extreme Graphics Test 1. Historically, for me, this finds an unstable core overclock within 45mins if it's going to fail; otherwise it tends to be stable for 8hrs+. (Use the TimeSpy Stress Test to weed out unstable VRAM overclocks - it's the most sensitive.)
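If you want to see how often the power limiter actually engages during a stress loop, one approach (a rough sketch of mine, not the method used in the post) is to log power draw once per second with nvidia-smi and count the samples sitting at the limit. The helper name, log contents and tolerance below are assumptions:

```python
# Count how many logged samples sat at (or near) the power limit.
# Assumes a log produced by something like:
#   nvidia-smi --query-gpu=power.draw --format=csv,noheader,nounits -l 1 > power.log
def fraction_power_limited(lines, limit_w, tolerance_w=2.0):
    """Fraction of logged samples within tolerance of the power limit."""
    samples = [float(s) for s in lines if s.strip()]
    if not samples:
        return 0.0
    limited = sum(1 for w in samples if w >= limit_w - tolerance_w)
    return limited / len(samples)

# Made-up samples against a 176 W (80% of 220 W) limit:
log = ["174.8", "176.1", "150.3", "175.9"]
print(fraction_power_limited(log, 176))  # 0.75 - three of four samples at the limit
```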
How about trying no added voltage, a higher TDP allowance, and a higher overclock? Think you could make 2100MHz?
Robbo99999 Notebook Prophet
2100MHz is actually achievable now with my newly discovered overclock (#3 option above):
It happens to be the highest boost clock available to the GPU when running a Base Clock of 1707MHz. It won't stay there during gaming though - depending on temperatures & load - and remember the 80% TDP restriction of course. But for most games, most of the time, the 80% limit is not reached, so in those I'm getting increased performance vs the benchmarks I've shown today. I use the Firestrike benchmarks for stability testing because, in my experience, Graphics Test 1 is the most sensitive to unstable core clocks - I tune to that & then it's stable in all my games as a result.
Robbo99999 Notebook Prophet
That's a good idea you have there. My 120% TDP setting is a little misleading though, because my card never gets close to 120% TDP - it reaches about 93/94% TDP during the Firestrike Extreme Graphics Test 1 - so my TDP reduction to 80% is not quite as extreme as it initially sounds. So I don't think the small reduction to 90% would make much of a difference, but it's worth a try actually - the only problem is that it negates the reason for me choosing the silent 80% TDP profile. Would be interesting to try for Science though, so I'll try it.
Robbo99999 Notebook Prophet
Well I guess you've got a lot of checking to do xD
Robbo99999 Notebook Prophet
Robbo99999 Notebook Prophet
EDIT:
+150MHz @80% TDP: not stable, failed within 1 minute
+125MHz @80% TDP: not stable, failed within 5 minutes
+113MHz @80% TDP: testing, but to be honest that's only 1 notch above the +100MHz stable overclock I achieved earlier today, so perhaps not worth the testing!
Robbo99999 Notebook Prophet
1) The temperature of the GPU has decreased about 4-5 degC since I did some case mods to improve GPU airflow, combined with liquid metal on the GPU core (although the liquid metal alone only gave me a 1 degC drop).
2) Has Firestrike had updates in the last few months that somehow make it less punishing on overclock stability? (This was my method of overclock stability testing.)
3) Any changes in NVidia drivers that allow for a higher overclock? (Although historically, on every GPU I've ever owned, the driver hasn't made any difference to a 100% stable overclock.)
4) Windows 10 Creators Update perhaps somehow allowed for higher stable overclock?
Those are just some of the changes that have happened since I last tested my overclock. I'd recommend you guys re-test your max stable overclock - perhaps something has changed that is common to our ecosystems and allows a higher overclock?
Here's a screenshot of my now balls-to-the-wall max stable overclock with 50% overvolt at the 120% TDP setting in a 45min Dirt Rally benchmark loop (+113MHz overclock, 1720MHz Base Clock, 2113MHz max boost clock, 2075MHz stable boost clock seen throughout the latter portion of the run):
Although, one thing my testing has shown is that if I want a quiet system I can just run +100MHz at zero added voltage at 80% TDP on the 'Silent' Default Auto Fan Profile, sacrificing only 2% performance while saving 20W (in situations where the TDP limiter does kick in - it doesn't in Dirt Rally) - so this is the one to run most of the time. As seen in the following screenshot, this yields a consistent 2050MHz in the Dirt Rally benchmark - really not giving much performance away here!
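The trade-off described above comes down to score per watt; here's a minimal sketch with purely illustrative numbers (not the actual benchmark scores from these runs):

```python
# Performance-per-watt comparison of a quiet power-limited profile vs a
# louder high-TDP profile. All numbers below are illustrative only.
def perf_per_watt(score: float, watts: float) -> float:
    """Benchmark points per watt of power drawn."""
    return score / watts

quiet = perf_per_watt(score=9800, watts=176)   # 80% TDP, no added voltage
loud = perf_per_watt(score=10000, watts=196)   # high-TDP profile, ~20 W more draw
print(f"quiet: {quiet:.1f} pts/W, loud: {loud:.1f} pts/W")
# Giving up ~2% of the score for ~20 W less draw still wins on points per watt.
```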
Apologies once again for the incorrect conclusion I drew in my quoted post.
@D2 Ultima , @Mr. Fox , @Coolane , @Papusan , @dspboys , @Ionising_Radiation , @Miguel Pereira , @hmscott , @ThePerfectStorm (in case you guys aren't subscribed, and you've all been active in the latest convos with likes or comments).
EDIT: and if anyone wants to post Volta news then please do so - I'm aware that I've been instrumental in derailing this thread, although to be fair no one had posted in it for months - but you're welcome to post any Volta news & thoughts!
ThePerfectStorm Notebook Deity
ThePerfectStorm Notebook Deity
There was an update earlier this year where OCs crashed a lot in Firestrike. I think it has since been resolved. There were people with superclocked cards who couldn't finish a run at stock.
So, any word on Volta GPUs? What can we expect? Improved power efficiency and more power?
While the 10xx series has been pretty amazing for the most part, I think it's time for a new series to release.
Robbo99999 Notebook Prophet
https://www.pcgamesn.com/nvidia/nvidia-volta-gpu-specifications
And here's another one I posted earlier in this thread (but ignore the 32 inch squared in the article - that must be a typo or something!):
https://www.notebookcheck.net/Nvidia-reveals-more-details-on-the-upcoming-Volta-GPUs.244332.0.html
NVidia seem to be very tight-lipped about Volta; there doesn't seem to be much news nor many leaks about it recently. Personally, I'm not particularly interested in Volta, but that's probably because I won't be upgrading to it - Pascal performs stellar at 1080p (and not only there), and I just don't think there's much hunger in the market for a new generation of cards right now - on top of that you have AMD struggling to be competitive with Vega. BAHH, who needs a new architecture, right! Ha!
A 1080 Ti handles 1440p 120Hz+ remarkably well - even a 1080 for that matter (to make it relevant for laptops). I can't think of a reason I'd want to upgrade a GPU unless it's going to give me the same frames at 4K - which I doubt the next generation will.
I love high-frame-rate gaming but it's a real Pandora's box. 60fps/Hz almost immediately becomes a deal breaker after about 1 week's use.
Those who are well content (as I used to be) with 60fps/Hz... "don't go 120Hz" unless you're prepared to be disappointed with anything less down the road. It's totally limited my selection of future laptops - even non-gaming laptops. Sometimes I wish I never opened that box.
Ark Survival Evolved notebook and desktop benchmarks (09/02/2017)
"Now that Survival Evolved has been officially launched, it is time to take a look at its system requirements to figure out whether or not the game gulps up computing power as fast as it does player avatars."
Robbo99999 Notebook Prophet
Yeah, as someone who hasn't been spoiled by high refresh rates and enjoys gaming at 1080p, I see little reason to upgrade to Volta. A laptop with a GTX 1060 can max out most titles at 1080p@60fps. Eventually I may want to upgrade to a laptop that can handle 4K@60fps at max settings, but I doubt Volta will offer that--at least not on mid-range GPUs.
But you're right about a 1060 being plenty for 1080p @ 60fps.
If they can make their Volta version of the 1070 easily push 4K @ 60FPS, I'm ready. Otherwise, meh. I'd much rather see 3K/2.5K (2880x1620 / 2560x1440) quality LCDs with current tech at cheaper prices. Or even better, 16:10 equivalents of 3K/2.5K. The current 1070 and 1080 can easily push 2.5K @ 60FPS.
I think I'm going big this time - whether they call it the 1180 or 2080, I want it in my next laptop.
Good enough is never enough.
Ionising_Radiation Δv = ve*ln(m0/m1)
Man, a GTX 1170/2070 upgrade over my 860M is going to be huge. Imagine cramming a GTX 1080 Ti into a 15" notebook...
Fire Strike Graphics scores jumping from ~5000 to ~30000. That's a sixfold improvement.
Robbo99999 Notebook Prophet
Ionising_Radiation Δv = ve*ln(m0/m1)
GTX Titan Volta: HBM2
GTX 1180 Ti: HBM2/GDDR6
GTX 1180: GDDR6
GTX 1170: GDDR5X
GTX 1160: GDDR5X
GTX 1150 Ti: GDDR5
GTX 1150: GDDR5
GTX 1130/150MX/1140: GDDR5
GDDR6 is already postulated to be as fast as HBM2, and HBM2 - as demonstrated by AMD Vega - is prohibitively expensive. I don't think Nvidia will want to use it widely until prices fall to a more reasonable level. By then, we'll have HBM3, which is supposedly faster, less power-hungry and also cheaper.
Yeah, I need a Volta Titan for a good gaming experience.
A Titan Xp isn't enough for 5K (5120 x 2880). I also want to move to 8K in the future.
Ionising_Radiation Δv = ve*ln(m0/m1)
Jokes aside... I think the pared-down GV104 chip, whatever it's called, will more than suffice for whatever gaming I want to do. Going by history, the x70 series has always outdone the previous generation's x80 Ti: 970 > 780 Ti, 1070 > 980 Ti, 770 > 590/690. Like I said earlier, the cut-down GV104 chip will very likely perform as well as the GP102 (1080 Ti) we have today. 4K 60 FPS gaming will become a reality, and 1440p 120 FPS gaming will become commonplace...
I think you are expecting too much. I need a minimum of a Volta Titan; 8K will likely require the one after that.
8K will probably require 2x Titan V for a smooth 60fps+ experience.
A single Titan V should be able to provide at least 4K 60fps+, maybe 120fps+.
5K @ 60fps will probably be doable with a single Titan V.
I don't think the 1180 will be much faster than a 1080 Ti though, maybe 15%.
Miguel Pereira Notebook Consultant
That tendency you talk of is recent.
Progress toward consumer release now that commercial release has been reached?
Nvidia Ships first Volta-based DGX Systems
http://www.anandtech.com/show/11824/nvidia-ships-first-volta-dgx-systems
"This Wednesday, NVIDIA has announced that they have shipped their first commercial Volta-based DGX-1 system to the MGH & BWH Center for Clinical Data Science (CCDS), a Massachusetts-based research group focusing on AI and machine learning applications in healthcare. In a sense, this serves as a generational upgrade as CCDS was one of the first research institutions to receive a Pascal-based first generation DGX-1 last December. In addition, NVIDIA is shipping a DGX Station to CCDS later this month.
At CCDS, these AI supercomputers will continue to be used in training deep neural networks for the purpose of evaluating medical images and scans, using Massachusetts General Hospital’s collection of phenotypic, genetics, and imaging data. In turn, this can assist doctors and medical practitioners in making faster and more accurate diagnoses and treatment plans."
NVIDIA Ships First Volta DGX-1
https://www.top500.org/news/nvidia-ships-first-volta-dgx-1/
"CCDS is well-versed with NVIDIA gear. It received the first-generation DGX-1 last December, which was equipped with eight Pascal-based Tesla P100 GPUs. With 170 teraflops of 16-bit floating point performance, it was the most computationally dense GPU-based platform you could buy at the time.
However, that P100-based appliance pales in comparison to the new V100-powered DGX-1, which, courtesy of Volta’s Tensor Core technology, offers 960 teraflops of mixed FP16 and FP32 matrix math. That’s nearly a petaflops-worth of deep learning performance in a 4U box."
NVIDIA Ships First Volta-based DGX Systems
https://www.reddit.com/r/hardware/comments/6ynvrt/nvidia_ships_first_voltabased_dgx_systems/
Ionising_Radiation Δv = ve*ln(m0/m1)
@Papusan - yeah, I still want to get a desktop.
Just that I want a notebook, too. Can't bring my desktop into lecture halls, can I?
Or maybe a smartphone. They are already past 6" and could maybe be an outsider.
Robbo99999 Notebook Prophet
A History of Nvidia GeForce, Part 1 - Fierce Competition
It's not as good when jumping around - you'll miss the subtle nuances - so be sure to watch Part 1 first, and then Part 2 afterwards.
Robbo99999 Notebook Prophet
EDIT: Sh*t, same approach - the only intro is that it's a 'history' of NVidia GPUs, and still no main synopsis given at the beginning of the vid. I get the feeling he's trying to compare increases in GPU performance from NVidia generation to generation, but I'm not willing to spend 40+ minutes listening to an annoyingly syncopated voiceover to finally get to the point the guy's trying to make at the end. Now, if he'd come out at the beginning of the vid with what he found in a couple of short sentences and then said "now I'm gonna show you the detail", I could live with it; otherwise it's just a waste of time. Anyone know the final/ultimate point this guy is making in his two 25min vids?
Second, part two discusses how Nvidia pulled the switcheroo, filling the 80 series with mid-range chips when that number used to mean high-end. It then further discusses Nvidia's techniques to milk and bilk the market. It does note where Nvidia did very well, such as Maxwell, but also shows how they were screwing consumers in the process. He addresses fanboys' common statements (on both sides), while concluding that AMD gave up and decided to bilk their fanboys as well (so there is some AMD bashing in the video too).
Finally, you don't pay me to perform analysis, so don't act like a sniveling twit and respond wanting me to do the work for you. If you have no interest in it, move on with your day. Don't respond. Don't tell me what I should do. Just let it go, as you've made up your mind not to watch it, which is your right.
Robbo99999 Notebook Prophet
Robbo99999 Notebook Prophet
I hope you guys get through those awful storms - they do sound horrendous!
Volta: NVIDIA's Next Generation GPU Architecture (2017-2018)
Discussion in 'Gaming (Software and Graphics Cards)' started by J.Dre, Aug 14, 2016.