-
Robbo99999 Notebook Prophet
Cool, looking good! I noticed that the GPUs weren't maxed out that close to 100% when you disabled V-sync, with your fps at the 59-84 you mentioned and GPU usage getting up to a maximum of maybe 90% on both cards. Do you know why it wouldn't run at 100% GPU usage, were you limited by the CPU? Or does SLI rarely ever get to 100% usage on each GPU in games anyway (V-sync disabled of course)? (I could understand if it only stabilised at 98-99% GPU usage, because that's what I see from the usage on my single card.) -
I noticed in GPU-Z that utilization is not 98 to 100%; that is why I tried disabling V-sync, but it is still only about 75-90%. I don't think it's limited by the CPU, which is OCed to 4.6GHz on all cores. Yes, in most games GPU usage is around 98-99%. I think we need a better 980M driver.
-
My 3.5GHz 4800MQ limits my two 780Ms at 950/6000 in Crysis 3, so a 4.6GHz Ivy Bridge limiting two 980Ms doesn't seem like a stretch to me. If your CPU was hitting 85-90% usage then it was CPU limited.
Also, those temps are bloody amazing. Wish I could get things that cool here =O -
I didn't buy the card. It was given to me for testing. Tell your friend to contact Hansung, who is selling the card. But I think the cheapest place is RJ.
-
Ah, ah, ah (shaking finger)...no videos of you staring at the sky. You need to be in a heated gun battle.
Let me know when you get to this part. Everything was maxed out and I'm pretty sure this is running MSAA and full motion. Might have to retest this though. Looking for my game saves now.
Bad quality because I did not know what the heck I was doing.
-
Looks like it may need some work to reach its full potential...
-
Robbo99999 Notebook Prophet
Ah, that's good that you get 98-99% GPU usage in most other games; that proves Crysis 3 to be an exception. Like D2 Ultima said, if your CPU usage is very high (around 90%), then that is probably limiting you. It's also possible that only one or two of the cores are maxed out really close to 100%, which could limit you too (but Crysis 3 is really well threaded, so probably not that). So it seems that 980M SLI is definitely not bugged in performance on your laptop; you're getting the performance you should if you've got good GPU usage in most games. It's just the Auto Fan Throttling Bug that remains, right?
EDIT: Just looked at Johnkss's video of Crysis 3 using the same settings as you and in the same place in the game; he's getting quite a lot higher framerate than you, just with his 780M SLI at 850MHz, with a constant 98% or so GPU usage. There's definitely something not right with your 980M SLI setup, or perhaps your CPU is throttling, or as you say it might be the drivers. A bit more investigation is warranted, I think, and perhaps some more thorough testing of other games too, to prove or rule out that it's a problem across the board. (I haven't seen any convincing game performance of 980M SLI in Alienwares yet, but I have to confess I haven't gone looking for it!) -
That's... odd... that last video was with 780Ms, correct? I didn't get near that framerate as far as I remember. I was more around 45-60 with the same settings, except with 1x SMAA (which should EASILY be less demanding). Maybe my memory is broken and I should try it again O_O. I was getting 80+ in all the multiplayer maps, though.
From what I saw in his video, his CPU appeared to be fluctuating from 4.5GHz to 1.1GHz on various cores... unless I am reading the video entirely incorrectly, as I do not use that monitoring tool. -
Robbo99999 Notebook Prophet
Well spotted; if that's the case then maybe Peter needs to check his CPU performance. It will be useful to hear from Peter on this; at the moment I'm not convinced that 980M SLI is functioning without bugs in Alienware machines. -
@Peter - check your NVIDIA Control Panel and be sure you are set to "Use advanced 3D image settings" (not "Let the 3D application decide"), and confirm the Global and Crysis 3 profile settings are not using "Adaptive" (set everything to prefer performance). I got equal or better results with 680M and 780M SLI (average 100+ FPS) compared to what you are getting with 980M SLI. Could be drivers, or it could be wrong NVIDIA Control Panel settings. You should be getting higher FPS than you are.
-
Thank you, godfafa...
So is my 3920XM in my M18x R2 any good if I want to run GTX 980M SLI? No bottleneck whatsoever if I run it at 4.4GHz while gaming with GTX 980M SLI? -
GTX 980M SLI is undoubtedly beastly enough that you will need a good CPU to keep up. The CPU is basically running the equivalent of two GTX 780s in SLI.
There are some concerns that a 4710HQ @ 3.3GHz (4 cores active) bottlenecks a single GTX 980M. You will be running another GTX 980M with your CPU at 3.6GHz (stock, 4 cores active), so I would be highly suspicious and would experiment with CPU OC to figure this out -
Do you know how to play the game?
What are you doing with 980M SLI?! That's like someone owning a sports car and not knowing how to drive stick. Come on, man! Let's see some battles. :thumbsup: -
Robbo99999 Notebook Prophet
I don't think it's got much to do with what graphics card you pair your CPU with; I think it's mostly down to the fps you're running in games. If you're gaming on a 60Hz monitor there's not much to be gained by running more than 60fps, so you don't really need a CPU better than a 4710HQ. If you're gaming at 120Hz and aiming for 120fps, then that's when you want a faster CPU, if you can justify the expense. HTWingNut did some testing with a single 980M in a GT72 and the CPU wasn't limiting framerates until over 100fps.
Then why are A10 APUs from AMD bottlenecking a 7970M down to GTX 660M levels at framerates well below 60FPS? And why does the APU work perfectly well with GPUs that are weaker than the 7970M?
Even AnandTech stated that the GTX 980M did worse than expected due to a weak 4710HQ. HTWingNut got better results in games running above 60FPS, but it could just be that those games are more CPU dependent than the rest. -
lol 5800 dpi (too much COD)
Well, it was just for testing
-
If you look at the first video, I was getting between 50 and 100, but that was over a year ago and I was running an overclock.
I also tested it in dual core mode, but have long since forgotten where that stuff is.
I sure hope you were not staring at the sky there either. That's a lot of sky to be "camping" under.
-
I tried the settings you suggested; there is no change in performance. I think it's a driver bug, avg. GPU usage is around 70%.
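(If anyone wants to log usage over a whole session instead of glancing at GPU-Z, here is a rough sketch that polls nvidia-smi once a second and writes a CSV. It assumes nvidia-smi is on your PATH and that your driver exposes utilization through it; the file name and sample count are just placeholders.)

```python
import csv
import subprocess
import time

# Rough sketch: poll nvidia-smi once per second and log per-GPU utilization
# to a CSV, so average SLI load over a session can be checked afterwards.
# Assumes nvidia-smi is on the PATH and the driver exposes utilization.
def log_gpu_usage(out_path="gpu_usage.csv", interval_s=1.0, samples=300):
    with open(out_path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["elapsed_s", "gpu_index", "utilization_pct"])
        start = time.time()
        for _ in range(samples):
            out = subprocess.check_output(
                ["nvidia-smi",
                 "--query-gpu=index,utilization.gpu",
                 "--format=csv,noheader,nounits"],
                text=True)
            for line in out.strip().splitlines():
                idx, util = (s.strip() for s in line.split(","))
                writer.writerow([round(time.time() - start, 1), idx, util])
            time.sleep(interval_s)

if __name__ == "__main__":
    log_gpu_usage()
```

Start it, play for five minutes, then average the utilization_pct column per GPU; a sustained ~70% on both cards over a whole session says more than a momentary GPU-Z reading.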
-
How about you try without running SLI and see what happens?
-
Because the AMD A10 is laughably weak. Even the best chip, the A10-5750M, is only about on par with the i5-4210U in the AW13. (I kid you not, check the benchmarks.) So yeah, not even close to the 4710HQ's performance.
And then I'm going to let D2 Ultima explain the rest because I'm tired of repeating myself:
-
So if I get D2 right, there isn't such a thing as a small bottleneck, say 10%? I don't believe that. A CPU can run near the limit of what is required and come within 90% of what a GPU should score with a proper CPU; benchmarks are indications of that. You have a CPU dragging down the GPU score by 10% there as well, compared to other systems with a better CPU and the same GPU. Isn't that called a bottleneck? Likewise, if you have a CPU driving a GPU without bottlenecking it, you don't gain 10% GPU score if you overclock the CPU by +1000MHz. That is a method to reveal whether your CPU is good enough or not.
-
I get around 40 to 56fps, GPU usage 80-86%. I don't know why this game is not using 99% of the GPUs. I'm gonna try Crysis 1 later on; maybe the 980M driver is not optimized.
-
Robbo99999 Notebook Prophet
A10 APUs are rubbish in comparison to Intel for gaming performance, so that's why. Sure, a few games would benefit from a faster CPU than a 4710HQ, but not many at all when you're talking 60Hz gaming. I agree with what n=1 wrote a few posts above me, with the exception that you don't always see a performance increase from getting a faster CPU (it's not always 10%; if the CPU is not being pushed at all, then it's gonna be 0% difference). If you're seeing, say, 85-90% CPU usage (it doesn't/wouldn't be 100% usage), then you could probably get a few more frames from a faster CPU, but with a 4710HQ that can maintain full boost with no thermal throttling, that is a very rare usage level to see in a game.
Yeah, as I said above, the best AMD A10 is equivalent to an i5-4210U, the one used in the AW13 that gets blasted to hell and back. Not even remotely comparable to the 4710HQ.
I think the point was that given a sufficiently non-demanding game, you can always artificially create a CPU "bottleneck". So if we're getting technical about semantics, yes, it's a "bottleneck". But practically, if you're already close to or over 100 FPS and gain less than 5% FPS by going +500MHz on the CPU, calling that a "bottleneck" isn't exactly the best idea, since it gives the wrong impression.
I still have Crysis 1 as well. I may load it to see what happens.
-
Robbo99999 Notebook Prophet
Ah, that's not too good. I'm pretty sure there's something off with your system or its interaction with the 980Ms; you should definitely be very close to 100% GPU usage when running just the one GPU in Crysis 3. I don't think testing Crysis 1 is gonna be a good way to check for high GPU usage. I ran it a couple of days ago, and when lots of action started, GPU usage dropped to around 80% (no CPU bottleneck), whereas without lots of action it was at 100%. I'm talking about the opening parts of the game, from the beginning through to where you see the sea from the cliff; it was shortly after the cliff-top scene, when I started fighting multiple targets, that GPU usage went down. I suppose you could use the very beginning of the game to test, but I don't think it's the best title for this because I don't think it's optimised very well. Try some Far Cry 3. -
I think it is related to Crysis 3, or the driver is not optimized for the 980Ms. I'm going to try AC Unity. lol, never played the Far Cry series
-
Go to the NVIDIA Control Panel "Program Settings" tab and select Crysis 3 from the drop-down menu. If the SLI rendering mode for Crysis 3 is "NVIDIA Recommended", try Alternate Frame Rendering 1 (AFR1) and Alternate Frame Rendering 2 (AFR2) and see if either works better than the default SLI rendering mode. You will need to close the game each time you switch between AFR1 and AFR2, as these setting changes do not work on the fly.
I would also be interested in seeing how BF4, Crysis 1 and 2, COD: Ghosts, COD: Advanced Warfare and Witcher 2 run if you have any of those titles. FRAPS sucks at video recording, but its benchmark tool does a nice job of documenting minimum, maximum and average FPS.
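(If you'd rather crunch those numbers yourself from FRAPS' raw output, here's a rough sketch that parses one of its benchmark frametimes files. The filename and the "Frame, Time (ms)" column layout are assumptions based on how FRAPS normally dumps its benchmark CSVs.)

```python
import csv

# Rough sketch: compute min/avg/max FPS from a FRAPS benchmark "frametimes"
# CSV. Assumes the usual FRAPS layout (a "Frame, Time (ms)" header, then one
# cumulative millisecond timestamp per frame); the filename is a placeholder.
def fraps_stats(path):
    with open(path, newline="") as f:
        reader = csv.reader(f)
        next(reader)  # skip the "Frame, Time (ms)" header row
        times_ms = [float(row[1]) for row in reader if len(row) >= 2]
    # Instantaneous FPS = 1000 ms over the gap between consecutive frames
    gaps = [b - a for a, b in zip(times_ms, times_ms[1:]) if b > a]
    fps = [1000.0 / g for g in gaps]
    avg = 1000.0 * len(gaps) / (times_ms[-1] - times_ms[0])
    return min(fps), avg, max(fps)

if __name__ == "__main__":
    lo, avg, hi = fraps_stats("bf4 frametimes.csv")
    print(f"min {lo:.1f} / avg {avg:.1f} / max {hi:.1f} FPS")
```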
Well then you could call it a "small bottleneck" versus a "large bottleneck" and my advice still stands. If you only get an incremental increase (a couple FPS every few hundred MHz), then is it really worth the extra heat and power draw? In a game where +300MHz gave me about 20fps (up from anywhere between 50 and 80 fps, as I have a 120Hz screen), I would DEFINITELY try to get my CPU running at +300MHz constantly. That's a huge bottleneck. But if it only gave me about 5fps and I was already in a decent FPS range... that's not really worth it. Then again, it's pretty much impossible not to have a bottleneck in a PC at all. Either your CPU is doing it or your GPU is doing it, no matter how slightly. It's just how it is.
Also, I would like to point out that benchmarks like 3DMark are very... weird. The fact that the CPU has little effect on the graphics score shows that the tests which measure graphics score are very likely NOT CPU limited... which I understand. The FPS is very low, and the scenes are likely too advanced, deliberately lacking the optimizations that would ease rendering, solely for the purpose of testing raw number crunching on the GPUs, so the CPUs aren't overworked by them. 3DMark-type benchmarks are not a real-world comparison for CPU bottlenecks for that reason, because as I mentioned in the post n=1 quoted from me, there are instances where OCing a CPU can grant massive FPS increases. And these increases are often well above 100fps, where the CPU simply cannot prepare low-quality frames to feed to the GPUs fast enough.
The best bottleneck tests are various gaming tests. Some game benches will reveal something like +5 FPS over an already high count like 130 FPS. Other game benches will show healthier increases of 20 FPS or so. Also, some games make more use of architecture changes too. It's VERY rare, but I've seen games get +10% performance simply from switching from Sandy Bridge-E to Ivy Bridge-E. No overclock or anything; just a CPU swap.
TL;DR? CPU is always a bottleneck until GPU becomes the bottleneck. Whether it's a large enough bottleneck to warrant overclocking or changing the CPU depends entirely on the game, the target FPS of the player and the benefit from overclocking, and also most importantly whether the heat/power tradeoffs are worth the increase considering the previous 3 points. -
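(To make that TL;DR concrete: if you treat each frame as costing some CPU time and some GPU time, with the two overlapped, your frame rate is roughly capped by whichever side takes longer per frame. A toy sketch, with every number made up purely for illustration:)

```python
# Toy model of the bottleneck idea: with CPU and GPU work overlapped,
# frame rate is roughly 1000 / max(cpu ms per frame, gpu ms per frame).
# All numbers below are made up purely for illustration.
def fps(cpu_ms_per_frame, gpu_ms_per_frame):
    return 1000.0 / max(cpu_ms_per_frame, gpu_ms_per_frame)

print(fps(9.0, 12.0))         # GPU-bound: ~83 FPS
print(fps(9.0 / 1.12, 12.0))  # a ~12% CPU OC changes nothing while GPU-bound
print(fps(9.0, 7.0))          # lighter GPU load: the CPU becomes the wall at ~111 FPS
```

Which is exactly why the same OC gives you +20fps in one game and nothing in another: it only shows up once the CPU side is the longer of the two.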
Just tried the AFR1 setting: frames dropped to only 13-24fps, and AFR2 = same frames as the Nvidia global SLI profile (both GPUs' usage around 40 to 50%). The COD: AW PC version is completely broken; I just tried all high settings and I'm only getting about 15 to 20fps. I'll upload it when I get a fix for it. BF4 and Crysis are in the queue.
I don't have COD: Ghosts (I hate it) or Witcher 2.
Edit
I noticed that Nvidia Control Panel > Program Settings > COD AW > SLI rendering mode is AFR1 by default? I tried to change it to the Nvidia global setting but it reverts back to AFR1. Bug... -
I've been away for a while and didn't want to read through all the pages I missed. Did we ever figure out how to get 980M SLI working on the R2?
-
Yeah, something is definitely not working correctly. I get outstanding results with the COD: AW single player campaign. Yes, AFR1 is the default in the driver profile for this game. You may want to uninstall your drivers with DDU (using the "Install New GPU" method in Safe Mode) and reinstall them again. If you did not do that when you were tinkering with the INF mod, something may be goofed up in the drivers or the Windows registry, or both. Using that feature in DDU will clear out everything so you can start fresh.
Do you have Metro 2033 or Last Light? Those would also be great tests. You can use the built-in benchmark tool with either of them. The benchmark EXE files are in the game installation folders, but Steam does not have shortcuts to the benchmark in the Steam Client UI.
When you do the INF mod, simply use Notepad's Find>Replace feature to replace all 11A0 with 13D7 in the NVDMI.INF file. You can do this in a couple of mouse clicks. Then go down to [Strings] and replace the "680M" with "980M" in the line with 13D7.0550.1028 and you'll be good to go. This will set up the driver and registry exactly the same as 680M SLI. If you are only doing the driver mod for yourself, it won't matter that the INF file changes code for other laptops.
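(For anyone who would rather script that Find>Replace than do it by hand, here is a minimal sketch of the same two substitutions; the file path is a placeholder, and the script keeps a backup of the original INF first.)

```python
from pathlib import Path

# Rough sketch of the INF mod described above: swap every 11A0 (the 680M
# device ID) for 13D7 (the 980M) in NVDMI.INF, then fix the 680M display
# name. The path is a placeholder; a .bak copy of the original is kept.
# Note: the blanket "680M" -> "980M" swap is cruder than editing only the
# 13D7.0550.1028 [Strings] line by hand, but is harmless for a personal mod.
inf = Path("NVDMI.INF")  # point this at the INF in the extracted driver folder
original = inf.read_text(encoding="utf-8", errors="ignore")
inf.with_suffix(".bak").write_text(original, encoding="utf-8")

modded = original.replace("11A0", "13D7").replace("680M", "980M")
inf.write_text(modded, encoding="utf-8")
```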
-
Has anyone tested a 980M, single or SLI, in the new AW17 or AW18?
My folks on the Korean Alienware forum say their GPU clock fluctuates too much.
More than it should.
Edit:
http://www.3dmark.com/3dm11/9038672
Look at the graphics score. The GPU core was OCed to 1173.
This is my single 980M 3DMark11 run at stock clocks. No OC.
http://www.3dmark.com/3dm11/9039415
Just look at the graphics score. -
Actually I was searching for a fix and found that there is no official SLI profile for AW from Nvidia. Tomorrow I'm going to use Nvidia Inspector to fix the AW SLI profile.
Nope, I don't have Metro 2033 or Last Light; yes, they are graphics-hungry games. Ahh, thank you very much for the quick driver INF mod trick.
Battlefield 4 will be up in a few mins :thumbsup:
Now it's time someone tested dual 980Ms in the AW18.
Yes, the card clocks fluctuate frequently; I noticed it especially with benchmark software. We need new drivers.
-
I wonder if this happens with the Clevo machines? I suppose Meaker could tell us.
-
No it doesn't.
980M SLI in the M18x R2 and P375SM works very well, as it is supposed to.
Due to boost, the clock doesn't stay fixed, but I believe it is meant to work that way.
And this is why we all wanted unlocked vBIOSes back in the old days.
But in the AW17, my two friends are experiencing the same "too much fluctuation of clock" issue.
It is very sad that about 3-5% of the graphics score is known to be lost in the AW17 and AW18 with the 780M/880M, and now something more serious is going on with the AW17. As Peter said, this might be a driver problem, but I am thinking differently, because 980M single/SLI works well in the old M18x R2 and Clevo machines with the same driver.
Well, I think we really need a Dell GTX 980M vBIOS... I am shocked that Dell is not upgrading the lineup. Worst of all, are they waiting for E3 2015 to unveil a new lineup, or are they giving up on the AW18 completely? Also, what about the AW17? At least they should give the GTX 980M to the AW17; then we could get a vBIOS from it.
Also really waiting for Nvidia to release a new driver to see some performance optimizations... -
You need to check multiplayer. SP usually runs better, by far. Also, your CPU still appears to be fluctuating from 1.2GHz to 4.5GHz? Odd. The framerates you're getting at ultra are nicely high, though. I should try SP on ultra and check. Brb, I'll do that right now.
-
Awesome, Peter.
I hope my 980M comes back fast. They said they will send me a new 980M to replace the broken one
-
pathfindercod Notebook Virtuoso
My 9377 with 980M SLI doesn't fluctuate at all. I average 120-140 FPS on 64-man servers (BF4).
-
About that CPU bottleneck: it's not true that a 4700MQ will bottleneck 980M SLI. First, the 4700MQ has around the same processing power as a desktop i5-4690K, and 980M SLI is equivalent to dual GTX 970s. There is no bottleneck here, as our modern-day CPUs are still years ahead of our graphics cards. Secondly, as we go to higher and higher resolutions, games become more and more taxing on graphics cards relative to the CPU. So basically I suspect Peter's 980M SLI doesn't have the best drivers at the moment to go to full load, as his Ivy Bridge Extreme is faster than a 4700MQ as well.
-
So I put the game on hard and ran around shooting everything bull-rush-style with a FAMAS. Went until I died. Video will be up soon, kekekeke. Also, I checked back on your video, and your FPS is mucho high =O. I want it. I WANT IT NAO. GIMME. Also, disabling AA apparently gives me FPS comparable to yours. Interesting indeed =O. Seems like BF4 really, really hates MSAA.
-
And here is my really stupid BF4 gameplay for le ultra preset.
-
So the M18x R2 works fully (maybe with the GPUs not running at 98-100% all the time) if running Windows 8 or newer with UEFI. Is the SLI cable different? I am planning to order my GTX 980Ms soon, and getting the SLI cable is extra $$, but it's cheaper to get it from the same place than from somewhere else like eBay. Is the GTX 780M SLI cable compatible, or has everyone who upgraded from 780Ms/880Ms to 980Ms needed a new SLI cable?
Thanks
As far as I know it should be; Peter's cable just crapped out on him. Dell isn't selling 980Ms now, so it can't be a new cable.