Early is good as far as I am concerned. Nothing wrong with that.
Even if the motivation is to do that, there is still nothing wrong with it. I would not blame Intel for pulling "an NVIDIA" on AMD. All is fair in love and war. They are competitors. Only the strong survive. At least, that's how I look at it. I hate the effect that monopolies have on an industry, but if they're in it to win it, they should do everything within their power to make their competitors look incompetent. That's smart business, and we should not get all butt-hurt about Intel or NVIDIA making strategic business maneuvers. Just my two cents.
-
-
I was more getting at... we hear this in 50 different threads a day. Can we get back to this being the overclocking thread and not the political-analyst thread about everything other than overclocking?
Side note: I went searching for information about doing some things to my Synology NAS the other day, and do you know what I found? 200 reasons why not to do something instead of the 1 reason on how to actually do it. And a bunch of other stuff not pertaining to what I was looking for.
So now I totally understand what people mean by derailing a thread.
Last edited: Apr 20, 2017
Papusan, bloodhawk, ajc9988 and 1 other person like this. -
-
I disagree with the incompetent part. Better products don't necessarily make the competitor look stupid; rather, I'd term it insufficient. AMD often gets short shrift, but they do have interesting IP that is good (although execution has been poor). But I definitely agree with the sentiment that competition is good. I point out what and why; please don't mistake that for complaining, because I support this.
I do not support trying to use deals to restrict the ability to purchase competing products, and the other anti-competitive methods that have been used in the past. That is wrong. But I applaud what they have done here. Same with Nvidia moving specific cards up. I also love that AMD caused these reactions in the market! But watching how and when everything will happen is fun to me (analyzing companies, markets, and strategies).
Ashtrix and leftsenseless like this. -
Meaker@Sager Company Representative
Yes, healthy competition is good, as are informed consumers; anti-competitive and sheep-like behaviour are bad. -
So, has anyone benched Unigine Superposition yet?
Sent from my SM-G900P using Tapatalk -
Never heard of it. I will go find it and see what's up.
Have you tried it? Like it? -
Not on the DM3 yet.
thattechgirl_viv, Johnksss, Papusan and 1 other person like this. -
I will have to try it. I have zero interest in VR crap, but it looks like an interesting benchmark.
Hmm... the video looks kind of boring, but I will give it a shot.
Last edited: Apr 20, 2017 -
Haven't tried it yet, but since I saw it, forgot about it, then just found it again, figured I'd share.
Sent from my SM-G900P using Tapatalk -
Ionising_Radiation Δv = ve*ln(m0/m1)
The thing is that 'VR-Ready' is a fairly consistent marker of a given hardware assembly's performance. 2 x 1920 x 1080 @ 90 FPS is no joke; that's about 373 megapixels per second. -
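As a quick back-of-the-envelope check of that pixel-rate arithmetic (a rough sketch using the dual-1080p-at-90-FPS figure quoted above, ignoring any supersampling headsets may apply):

```python
# Pixel throughput for a dual-1080p VR target at 90 FPS.
eyes = 2
width, height = 1920, 1080
fps = 90

pixels_per_frame = eyes * width * height      # ~4.15 million pixels per frame
pixels_per_second = pixels_per_frame * fps    # ~373 million pixels per second

print(f"{pixels_per_second / 1e6:.0f} megapixels per second")  # prints "373 megapixels per second"
```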
Yes, I understand that part... it takes muscle to run and it has value as a stress test/benchmark. I'm just not interested in VR, that's all.
@bloodhawk - Looks like SLI is borked by Unigine on this. Only one GPU is utilized. Setting AFR1 makes it worse. AFR2 uses both GPUs, but the performance is not much better than with SLI disabled. Guess that figures... slop it out the door since most gamer-boys only run a single GPU. If I have to settle for one, it had better be an extra good one, like your 1080 Ti. -
Yep, exactly what 1080 Ti SLI people observed. It has something to do with how VR handles rendering. IIRC, VR doesn't really support SLI.
How is the scaling for you in VRMark? -
Low GPU utilization. Look... 1080p extreme is almost the same result as 4K optimized. This is just GPUs running stock.
Ashtrix, TomJGX, Johnksss and 1 other person like this. -
Damn. What are the scores like with a single 1080?
I should be home in about 2 hours, will do a few runs then.
Mr. Fox likes this. -
Don't know yet. I was working on trying to get SLI working. I ran single GPU for about 10 seconds and escaped out of it to tweak things.
I missed your previous question. I'm not sure on VR mark. Haven't run it in a while. Last time I checked there were no HWBOT points available for it, so I wasn't that interested. -
Higher. LOL.. 100% GPU utilization. That really sucks. Here is a stock run with SLI disabled.
Ashtrix, Johnksss, ajc9988 and 1 other person like this. -
-
Probably won't, unless they get off their lazy two-bit heinies and make SLI work, or start making some crazy wicked GPUs with as much horsepower as two of today's high-end GPUs. Or maybe they will just dumb down the graphics to make it less demanding. Some will be so goo-goo-gah-gah about VR that they might not even notice. It already seems like that may be what "4K Optimized" means on this benchmark. "Optimized" is a nice-sounding word that is sometimes secret code for "sucky" in today's world.
Last edited: Apr 20, 2017
-
I believe VR rendering is less optimized than 3D, which simply renders the same thing at a different angle. I don't know the full specifics, but due to the rendering technique, for some stupid reason, it ends up needing to calculate things twice. Some data can be re-used, but some cannot. So the GPU essentially needs to calculate the same frames over again, not just render double the frames. In SLI, that overhead kills it.
More inter-GPU bandwidth would help here; x16/x16 SLI should do better than the x8/x8 we have. Bloodhawk could check. The SLI profile could also most likely be improved over plainly forcing AFR2. NVPI would be useful for that, but I don't know of a good SLI profile for it.
The HB bridge poured all its optimizations into improving frame pacing (making games feel smoother) from, I think, 372.54 onward. So, assuming no driver after that has VR optimizations, trying an older driver like 368.81 (I remember that being a driver version) might give you better performance, if it put the HB bridge's inter-GPU bandwidth toward performance and not just better frame pacing.
Welcome to why I make noise about the HB bridge being a stupid band-aid on the SLI bandwidth issue... and why AMD is significantly better for VR in multi-GPU. XDMA gives each GPU 8GB/s to and from the other in a PCIe 3.0 x8/x8 configuration, compared to the paltry 1GB/s plus a minuscule PCIe share for a standard SLI bridge, 1.625GB/s for a LED bridge, or 3.25GB/s for the HB bridge. That's on top of GCN's architectural benefits for mGPU (remember I said before that Maxwell, and to a lesser extent Pascal, are not designed to be mGPU-friendly?).
Edit: @bloodhawk if they actually bloody use 3D Vision render tech, it'll significantly improve the performance of VR, and you could use 1 GPU per "eye". -
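A quick numeric sketch of the inter-GPU bandwidth gap described above (these are the figures as quoted in the post, not independently measured values):

```python
# Inter-GPU link bandwidths as quoted in the post above (GB/s, one direction).
# Poster's figures, not measurements.
links = {
    "SLI bridge": 1.0,
    "LED bridge": 1.625,
    "HB bridge": 3.25,
    "XDMA (PCIe 3.0 x8)": 8.0,
}

baseline = links["HB bridge"]
for name, gbps in sorted(links.items(), key=lambda kv: kv[1]):
    # Show each link relative to the HB bridge.
    print(f"{name:>20}: {gbps:5.3f} GB/s ({gbps / baseline:.2f}x HB bridge)")
```

By these numbers, XDMA has roughly 2.5x the bandwidth of even the HB bridge, which is the gap the post is complaining about. -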
Ionising_Radiation Δv = ve*ln(m0/m1)
Simple solution: use AMD CrossfireX.
-
Can you fire up Windoze 7 and compare the results with X?
-
HA.
Here, I posted this on reddit about crossfire earlier:
AMD Crossfire is about the most worthless thing you could spend money on with a PC, barring buying the i7-6950X.
- Crossfire does not work unless an AMD driver has a profile for the game, making it worthless for most indie titles.
- Crossfire does not work unless you are in true fullscreen.
- Crossfire has fewer default profiles than SLI, and I find SLI's default profile list is already lacking.
- Crossfire cannot be forced on games that do not support it via a driver-editing program like nVidia Profile Inspector (which is the ONLY reason I consider SLI worth anything these days).
- Crossfire has a higher tendency to be stuttery, especially with newer games. It usually takes a while before it settles, and the chance is even higher with newer cards.
- All the downsides of SLI also apply, such as not being able to use it with 3 screens unless you span the game across all 3 monitors; otherwise the other 2 screens go black, etc.
I only recommend SLI with the single strongest card on the market as the first card used: 1080N SLI for laptops, or 1080Ti/Titan X Pascal/Titan Xp for desktops. The number of times it doesn't work, and the requirement of being a tinkerer with NVPI, make it something only the most tech-loving users or benchmarkers will see real benefit from.
And Crossfire? Their strongest card is somewhere in the vicinity of the GTX 1070 at its peak; which I wouldn't SLI. RX 480s and RX 580s and Fury X cards are the best AMD has to offer and with all the added downsides AMD refuses to fix, there's no reason to do it.
If you're considering crossfire, just sell the card and buy a stronger nVidia card like a 1080Ti. I dislike giving this advice, but it's the only advice that makes sense.
Ashtrix likes this. -
Ionising_Radiation Δv = ve*ln(m0/m1)
-
Welcome to "Why D2 Ultima is always so annoyed"
-
Unfortunately, CrossFire has never been any good. It has pretty much sucked for as long as I can remember. When AMD bought ATI, things started circling the drain. The last good mobile AMD GPU I owned was the ATI Radeon 5870M in the Alienware M17xR2, and the CrossFried part of it kind of sucked even back then. In contrast, SLI has been good for me for many years. Only recently, as crappy console ports and sucky game developers became the status quo, has SLI not been as good as it used to be. That's not the fault of SLI though... it's the fault of the sucky developers, crappy console ports, and cheesy game engines. I will still take a multi-GPU NVIDIA-powered beast over a single-GPU machine any day of the week. When it doesn't work, it works like a single-GPU PC. When it does work, it annihilates a single-GPU PC.
Ashtrix likes this.
-
Here you go. This time on the Tornado F5. W7 vs Creators Abortion... statistically irrelevant difference. No GPU overclock. This one does have a @Prema vBIOS mod and pulled 220W, versus the stock Clevo vBIOS on the DM3, which pulls less than 200W in the same test.
Attached Files:
Last edited: Apr 21, 2017 -
-
-
I can see the BIG gain with Crematory Abortion, LOL
BTW, is it a coincidence that the graphics in both machines maxed out at 88°C?
Your previous test with Alien Killer II maxed the graphics temp at 67.0°C.
-
The magic of a vapor chamber + single GPU.
To be honest, 67°C looks "high" to me, considering his is re-padded. -
They are working on it, but Radeon has given no real challenge since the 6000/7000 series against the 400 series. They have created a single developer's card for developing crossfire and are reaching out to software companies to get support and adoption. Further, Vega is the first card in half a decade or more not designed around APU integration, meaning Radeon did not have to report to the CPU side....
Three points:
1) I'd actually recommend waiting until the Vega release next month before giving a buy recommendation. With the release being so close, there is no reason not to wait to see performance and any pricing changes;
2) where AMD has differentiated itself is in choosing to create the Pro Duo, a single card for developers that helped achieve 100% scaling. This doesn't negate your point about forcing crossfire on titles without support, but it does show AMD is making an effort. That effort is not yet tangible, though, so I do agree with your assessment; and
3) if you can wait longer, finding out whether the rumor that Nvidia moved Volta up is true could be worthwhile.
Edit: M$ "feature updates" and support timeframe:
http://m.hexus.net/tech/news/softwa...ll-offer-biannual-feature-updates-windows-10/
Sent from my XR6P10 using Tapatalk
Last edited: Apr 21, 2017 -
Was this on top of an AC?
-
Nope, in my lap, on top of the less powerful U3, sitting in the recliner watching TV. The @Prema vBIOS mod does not have the "room temperature" throttling problems that a stock 1080 notebook GPU is famous for. The clocks stayed around 1850MHz until it hit 86-87°C and then dropped to something like 1750-1795.
bloodhawk likes this.
-
Nice. Now I wanna run some.
Let's see if I can beat that score on stock. Tonight!
Pulled the trigger on that Black+Decker. And got rid of that old noisy Honeywell one for $60 on Craigslist XD
Mr. Fox likes this.
I think this revision is probably more accurate:
" Microsoft's biannual feature updated versions of Windows 10 will be "serviced and supported for 18 months". The theory is that this clockwork schedule and relatively short support periods could encourage more businesses and enterprise to stay up to date with the latest, most secure, give them an excuse to milk a new revenue stream by bombarding businesses with multiple releases of Windows."
No, this is not a coincidence. That temperature is normal for the Tornado F5 under sustained GPU load. Part of the reason is that the fans are too weak and do not push nearly the volume of air through the heat sink radiators that they need to. I wish the fans from the DM3 would work in the Tornado F5. I think it would run much cooler, and I honestly do not mind the extra fan noise. I can hardly even hear the fans running at max speed on the Tornado F5. Running the fans at max has almost no effect on temperature versus letting them run at "auto" speed.
Last edited: Apr 21, 2017
Ashtrix, D2 Ultima, ajc9988 and 1 other person like this. -
-
-
-
-
-
Meaker@Sager Company Representative
Everyone uses thicker pads to help with heatsink variation; you can do this with any vendor's machines. -
With the amount of space and the rigidity of the frame, that should not be the case. Especially at the cost of this system. 2 different heat sinks with 2 different thermal pad thicknesses, like wth.
At the end of the day I WILL end up using thicker pads and more than the necessary amount of thermal paste. But that's a different discussion altogether. -
Meaker@Sager Company Representative
You say the cost of the system, but it has more engineering than any other machine in its class by using a desktop CPU, and yet it is the cheapest. Even with this it still wrecks faces.
-
Well, the point of having all the best hardware and then ending up with all these problematic variations defeats the purpose of having such a high-end machine, doesn't it?
And it's not cheap by any means, so hiding behind the "it's the cheapest" argument doesn't hold up.
At least those other lower-performing machines don't have these issues over multiple generations, and there is visible effort on the manufacturers' part to improve things (while moving backwards and using ****tier hardware).
I only know of the first 4-5 machines that worked properly and don't have issues with heat sinks.
Last edited: Apr 23, 2017
Ashtrix, Scerate, Papusan and 1 other person like this. -
Meaker@Sager Company Representative
Those other lower-performance machines that cost more don't have an issue? I think the issue lies right there.
-
You need to read a bit more carefully instead of paraphrasing.
I didn't say they don't have any issues; rather, their issues are slowly ironed out over generations, and most if not all of them don't have these particular issues. More so, those manufacturers are at least trying to fix those things. The hardware in those machines being BGA and all is a totally different discussion altogether.
Someone in your position will keep on defending Clevo's faults, and I cannot fault you for that.
But almost every single buyer of the KM1/DMx laptops will more or less agree with what I stated earlier. I can definitely go ahead and fix these issues on my own, but, sure as hell, I will not recommend these machines to anyone else.
Last edited: Apr 24, 2017 -
Meaker@Sager Company Representative
That's like saying an A- is worse than a B+ because the B+ is more refined.
At stock the clevo still beats the competition for less money, you can just take it further if you know what you are doing.
Also if you think those other manufacturers never repeat their mistakes.... -
*Sigh* How many of those systems do you need to mess with to the extent of these *HIGH END* DTRs? Get the point now?
Whatever, it's pointless arguing with you.
Clevo best, Clevo best system out there. Because I cherry-picked mine or something. -
Meaker@Sager Company Representative
I am the wrong person to ask that question; I have messed with every single notebook to a massive extent, some more so than the DTRs.
The stock performance is what I am talking about, though: they are cheaper and faster than any machine out there, which in my opinion are some of the most important aspects.
Clevo Overclocker's Lounge
Discussion in 'Sager/Clevo Reviews & Owners' Lounges' started by Spartan@HIDevolution, Mar 4, 2016.