Beautiful...
-
Robbo99999 Notebook Prophet
-
I noticed that GPU-Z shows utilization is not at 98 to 100%; that is why I tried disabling V-Sync, but it is still only about 75-90%. I don't think it is limited by the CPU; it is OCed to 4.6GHz on all cores. Yes, in most games GPU usage is around 98-99%. I think we need a better 980M driver.
-
Also, those temps are bloody amazing. Wish I could get things that cool here =O -
-
Let me know when you get to this part. Everything was maxed out and I'm pretty sure this is running MSAA and full motion. Might have to retest this though. Looking for my game saves now.
Bad quality because I did not know what the heck I was doing.
-
Looks like it may need some work to reach its full potential...
-
Robbo99999 Notebook Prophet
EDIT: Just looked at Johnkss's video of Crysis 3 using the same settings as you and in the same place in the game; he's getting a considerably higher framerate than you with just his 780M SLI at 850MHz, and with about 98% GPU usage constant. There's definitely something not right with your 980M SLI setup, or perhaps your CPU is throttling, or as you say it might be the drivers. A bit more investigation is warranted, I think, and perhaps some more thorough testing of other games too, to prove or rule out that it's a problem across the board. (I haven't seen any convincing game performance figures for 980M SLI in Alienwares yet, but I have to confess I haven't gone looking for them!) -
-
Robbo99999 Notebook Prophet
-
@Peter - check your NVIDIA Control Panel and be sure you are set to "Use advanced 3D image settings" (not "Let the 3D application decide"), and confirm the Global and Crysis 3 profile settings are not using "Adaptive" power management (set everything to prefer performance). I got equal or better results with 680M and 780M SLI (average 100+ FPS) than what you are getting with 980M SLI. Could be drivers, or it could be wrong NVIDIA Control Panel settings. Either way, you should be getting higher FPS than you are.
-
Thank you, godfafa...
So is my 3920XM in my M18x R2 any good if I want to run GTX 980M SLI? No bottleneck whatsoever if I run it at 4.4GHz while gaming with GTX 980M SLI? -
GTX 980M SLI is undoubtedly beastly enough that you will need a good CPU to keep up; the CPU is basically driving the equivalent of two GTX 780s in SLI.
There are some concerns that a 4710HQ @ 3.3GHz (4 cores active) bottlenecks even a single GTX 980M. You will be running another GTX 980M with the CPU at 3.6GHz (stock, 4 cores active), so I would be highly suspicious and would experiment with CPU OC to figure this out. -
What are you doing with 980M SLI?! That's like someone owning a sports car and not knowing how to drive stick. Come on, man! Let's see some battles. :thumbsup: -
Robbo99999 Notebook Prophet
-
Even Anandtech stated that the GTX 980M did worse than expected due to a weak 4710HQ. HTWingNut got better results in games running above 60 FPS, but that may just be because those games are more CPU dependent than the rest. -
-
I also tested it in dual core mode, but I have long since forgotten where that stuff is.
-
How about you try without running SLI and see what happens?
-
And then I'm going to let D2 Ultima explain the rest because I'm tired of repeating myself:
-
So if I get D2 right, there isn't such a thing as a small bottleneck, say 10%? I don't believe that. A CPU can run near the limit of what is required and come within 90% of what a GPU should score with a proper CPU; benchmarks are indications of that. You have a CPU dragging down the GPU score by 10% there as well, compared to other systems with a better CPU and the same GPU. Isn't that called a bottleneck? Likewise, if you have a CPU driving a GPU without bottlenecking it, you don't gain 10% GPU score if you overclock the CPU by +1000MHz. That is a method to reveal whether your CPU is good enough or not.
-
Robbo99999 Notebook Prophet
-
Yeah, as I said above, the best AMD A10 is equivalent to an i5-4210U, the one used in the AW13 that gets blasted to hell and back. Not even remotely comparable to the 4710HQ.
-
-
Robbo99999 Notebook Prophet
-
-
I would also be interested in seeing what BF4, Crysis 1 and 2, COD: Ghosts, COD: Advanced Warfare and Witcher 2 run like, if you have any of those titles. If you have FRAPS: it sucks at video recording, but its benchmark tool does a nice job of documenting minimum, max and average FPS.
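If you want to collate numbers across several runs yourself, here's a minimal sketch that recomputes min/max/average FPS from a FRAPS per-frame "frametimes" CSV. The two-column layout (frame index, cumulative milliseconds after a header row) is an assumption about the dump format, so check it against what your FRAPS run actually produced.
```python
# Minimal sketch: recompute min/max/avg FPS from a FRAPS per-frame
# "frametimes" CSV. Assumes two columns (frame index, cumulative time
# in milliseconds) after a header row -- check your dump's layout.
import csv
import sys

def fps_stats(path):
    with open(path, newline="") as f:
        reader = csv.reader(f)
        next(reader)  # skip the header row
        times = [float(row[1]) for row in reader if len(row) >= 2]
    # Instantaneous FPS from the delta between consecutive timestamps.
    fps = [1000.0 / (b - a) for a, b in zip(times, times[1:]) if b > a]
    avg = (len(times) - 1) / ((times[-1] - times[0]) / 1000.0)
    return min(fps), max(fps), avg

if __name__ == "__main__":
    lo, hi, avg = fps_stats(sys.argv[1])
    print(f"min {lo:.1f} / max {hi:.1f} / avg {avg:.1f} FPS")
```
-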
Also, I would like to point out that benchmarks like 3DMark are very... weird. The fact that the CPU has little effect on the graphics score shows that the tests which measure graphics score are very likely NOT CPU limited, which I understand: the FPS is very low, and the scenes are likely too advanced and deliberately lack the optimizations that would ease rendering, purely to test the raw number crunching of the GPUs, so the CPUs aren't overworked by them. 3DMark-type benchmarks are not a real-world comparison of CPU bottlenecks for that reason, because, as I mentioned in the post n=1 quoted from me, there are instances where OCing a CPU can grant massive FPS increases. Those increases are often well above 100 FPS, where the CPU simply cannot prepare low-quality frames fast enough to feed the GPUs.
The best bottleneck tests are various gaming tests. Some game benches will reveal something like +5 FPS over an already high count like 130 FPS. Other game benches will show healthier increases of 20 FPS or so. Some games also make more use of architecture changes. It's VERY rare, but I've seen games gain +10% performance simply from switching from Sandy Bridge-E to Ivy Bridge-E; no overclock or anything, just a CPU swap.
TL;DR? The CPU is always a bottleneck until the GPU becomes the bottleneck. Whether it's a large enough bottleneck to warrant overclocking or changing the CPU depends entirely on the game, the player's target FPS, the benefit from overclocking, and, most importantly, whether the heat/power tradeoffs are worth the increase given the previous three points.
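A quick way to run the overclock test described above is to benchmark the same scene at two CPU clocks and see how far FPS scales with the clock. This is a minimal sketch of that arithmetic; the clocks and FPS values in it are hypothetical placeholders, not measurements from this thread.
```python
# Minimal sketch of the overclock test: same benchmark, two CPU
# clocks. If FPS rises nearly 1:1 with the clock, the CPU was the
# limiter; if it barely moves, the GPU (or an engine cap) was.
def cpu_scaling(clock_a_mhz, fps_a, clock_b_mhz, fps_b):
    clock_gain = clock_b_mhz / clock_a_mhz - 1.0
    fps_gain = fps_b / fps_a - 1.0
    return fps_gain / clock_gain

# Hypothetical placeholder numbers, not measurements from this thread.
ratio = cpu_scaling(3600, 92.0, 4600, 121.0)
print(f"FPS scaled at {ratio:.0%} of the clock increase -> "
      f"{'CPU-bound' if ratio > 0.5 else 'mostly GPU-bound'}")
```
-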
I don't have COD: Ghosts (I hate it) or Witcher 2.
Edit:
I noticed that NVIDIA Control Panel > Program Settings > COD: AW has the SLI rendering mode set to AFR1 by default. I tried to change it to the NVIDIA global setting but it reverts back to AFR1. Bug... -
I've been away for a while and didn't want to read through all the pages I missed; did we ever figure out how to get 980M SLI working on the R2?
-
Do you have Metro 2033 or Last Light? Those would also be great tests. You can use the built-in benchmark tool with either of them. The benchmark EXE files are in the game installation folders, but Steam does not have shortcuts to the benchmark in the Steam Client UI.
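Since the Steam client won't point you at them, here's a minimal sketch that hunts down benchmark executables under the game folders; the steamapps path is an assumption, so point it at whatever drive your games are installed on.
```python
# Minimal sketch: find the benchmark EXEs Steam doesn't give you
# shortcuts for. The steamapps path is an assumption -- point it at
# whatever drive your games are installed on.
from pathlib import Path

steamapps = Path(r"C:\Program Files (x86)\Steam\steamapps\common")
for exe in steamapps.rglob("*.exe"):
    if "bench" in exe.name.lower():
        print(exe)
```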
When you do the INF mod, simply use Notepad's Find > Replace feature to replace all instances of 11A0 with 13D7 in the NVDMI.INF file. You can do this in a couple of mouse clicks. Then go down to [Strings] and replace the "680M" with "980M" in the line with 13D7.0550.1028 and you'll be good to go. This will set up the driver and registry exactly the same as 680M SLI. If you are only doing the driver mod for yourself, it won't matter that the INF file changes code for other laptops.
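For anyone who would rather script it, here's a minimal sketch of the same two substitutions. The file name and strings are taken from the description above, so verify them against your actual extracted driver package; note it replaces every 680M string rather than just the 13D7.0550.1028 line, which, as noted above, is harmless if the mod is only for your own machine.
```python
# Minimal sketch of the INF mod described above: swap the 680M device
# ID (11A0) for the 980M's (13D7) and update the display string.
# File name and strings are taken from the post -- verify against your
# extracted driver package, and back up before installing.
from pathlib import Path
import shutil

inf = Path("NVDMI.inf")  # inside the extracted NVIDIA driver folder
shutil.copy(inf, inf.with_suffix(".bak"))

text = inf.read_text(encoding="utf-8", errors="ignore")
text = text.replace("11A0", "13D7")  # device ID: 680M entries -> 980M
text = text.replace("680M", "980M")  # [Strings] display names
inf.write_text(text, encoding="utf-8")
```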
-
Has anyone tested a 980M, single or SLI, in the new AW17 or AW18?
Folks on the Korean Alienware forum say their GPU clock fluctuates too much.
More than it should.
Edit:
http://www.3dmark.com/3dm11/9038672
Look at the graphics score. The GPU core was OCed to 1173MHz.
This is my single 980M 3DMark11 run at stock clocks, no OC:
http://www.3dmark.com/3dm11/9039415
Just look at the graphics score. -
Nope, I don't have Metro 2033 or Last Light; yes, they are graphics-hungry games. Ahh, thank you very much for the quick driver INF mod trick.
Battlefield 4 will be up in a few minutes :thumbsup: -
Yes, the card clocks fluctuate frequently; I especially noticed it with benchmark software. We need new drivers.
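If you want to put a number on the fluctuation, here's a minimal sketch that polls the core clock and load once a second. It assumes nvidia-smi exposes clock queries on this mobile GPU; if it prints N/A, GPU-Z's log-to-file option records the same data on Windows.
```python
# Minimal sketch: log the core clock and GPU load once a second to
# quantify the fluctuation. Assumes nvidia-smi exposes clock queries
# on this mobile GPU; if it prints N/A, GPU-Z's "Log to file" option
# records the same data on Windows.
import subprocess
import time

for _ in range(60):  # one sample per second for a minute
    out = subprocess.run(
        ["nvidia-smi", "--query-gpu=clocks.sm,utilization.gpu",
         "--format=csv,noheader"],
        capture_output=True, text=True,
    ).stdout.strip()
    print(time.strftime("%H:%M:%S"), out)
    time.sleep(1)
```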
-
980M SLI in the M18x R2 and P375SM works very well, as it is supposed to.
Due to boost the clock doesn't stay fixed, but I believe it is meant to work that way.
And this is why we all wanted an unlocked vBIOS back in the old days.
But in the AW17, my two friends are experiencing the same excessive clock fluctuation.
It is very sad that about 3-5% of the graphics score is known to be lost in the AW17 and AW18 with the 780M/880M, and now something more serious is going on with the AW17. As Peter said, this might be a driver problem, but I am thinking differently, because 980M single/SLI works fine in the old M18x R2 and Clevo with the same driver. -
Well, I think we really need a Dell GTX 980M vBIOS... I am shocked that Dell is not upgrading the lineup. Worst of all, are they waiting for E3 2015 to unveil a new lineup, or are they giving up on the AW18 completely? Also, what about the AW17? At least they should give the GTX 980M to the AW17; then we can get a vBIOS from it...
Also really waiting for NVIDIA to release a new driver to see some performance optimizations. -
-
I hope my 980M comes back fast. They said they will send me a new 980M for the broken one. -
pathfindercod Notebook Virtuoso
My 9377 with 980M SLI doesn't fluctuate at all. I average 120-140 FPS on 64-man servers (BF4).
-
About that CPU bottleneck: it's not true that a 4700MQ will bottleneck 980M SLI. First, the 4700MQ has around the same processing power as a desktop i5-4690K, and 980M SLI is equivalent to dual GTX 970s. There is no instance of a bottleneck, as our modern-day CPUs are still years ahead of our graphics cards. Secondly, as we go to higher and higher resolutions, games become more and more taxing on the graphics cards rather than on the CPU. So basically I suspect Peter's 980M SLI doesn't have the best drivers at this moment to go on full load, as his Ivy Bridge Extreme is faster than a 4700MQ as well.
-
So I put the game on hard and ran around shooting everything bull-rush-style with a FAMAS. Went until I died. Video will be up soon kekekeke. Also, I checked your video again, and your FPS is mucho high =O. I want it. I WANT IT NAO. GIMME. Also, disabling AA apparently gives me comparable FPS to yours. Interesting indeed =O. Seems like BF4 really, really hates using MSAA.
-
And here is my really stupid BF4 gameplay for le ultra preset.
-
So the M18x R2 works fully (maybe with the GPU not running at 98-100% all the time) if running Windows 8 or newer with UEFI. Is the SLI cable different? I am planning to order my GTX 980Ms soon, and the SLI cable is extra $$, but it's cheaper to get it from the same place than from somewhere else like eBay. Is the GTX 780M SLI cable compatible, or has everyone who upgraded from 780Ms/880Ms to 980Ms needed a new SLI cable?
Thanks -