That does make sense actually, I only have the 4770k
Sent from my SGH-T999 using Tapatalk 2
-
If you run 3DMark 11 on a stock 4770K with 780 Ti SLI and then do a new run at 4.5 GHz, you will see that your GPU score will increase.
-
34500 GPU score with the CPU at 4.6 GHz and 34000 GPU score with the CPU at 3.9 GHz
NVIDIA GeForce GTX 780 Ti video card benchmark result - Intel Core i7-4770K,ASUSTeK COMPUTER INC. MAXIMUS VI FORMULA
NVIDIA GeForce GTX 780 Ti video card benchmark result - Intel Core i7-4770K,ASUSTeK COMPUTER INC. MAXIMUS VI FORMULA
But I swear I have seen the GPU score increase somewhere when I increased the CPU clock..
Anyway, I'm getting my 880M's on Tuesday, thanks to great service from upgrademonkey. Hopefully I will manage to get it running with the R2.
-
Meaker@Sager Company Representative
There are lots of weird bottlenecks that can come in.
-
-
So is the 880M plug-and-play ready in the (2013) Alienware 17 currently running a stock 780M? Does it need modified drivers, a vBIOS flash and all those shenanigans seen in previous upgrades of earlier AW 17-inchers? I'm considering ordering one from eBay, so thanks a lot for your advice, guys
-
For now perhaps, but the new Alienware 17 and 18 (Haswell machines) should be supported without an INF mod now that they are shipping with 880M.
In fact, it looks like this driver has official 880M support for both machines. (I checked the INF file and it's there for the 17 and 18.)
NVIDIA GeForce R337.61 BETA Hotfix Display Driver Release
You'll still want a vBIOS mod to have the best possible performance... NVIDIA can't (won't) help with that part. -
Mr. Fox likes this.
-
[Quick Tips] More INF modding help...
NVIDIA INF modding is very simple. As soon as you understand the concept, it can be done in a matter of seconds. It can be a bit confusing at first; the concept is easy, but it's somewhat of a challenge to explain in a way that everyone understands clearly. You change the hardware ID for an "authorized" GPU on the same platform to the new GPU hardware ID, or change the motherboard hardware ID for an "authorized" configuration to your "unauthorized" configuration. It's the same principle, just coming at it from opposite angles. I will try to explain using actual hardware ID numbers.
For example, the M18xR1 motherboard is 048F. The M18xR2 motherboard is 0550. GTX 680M hardware ID is 11A0. GTX 580M hardware ID is 1211. The 780M ID is 119F and the 880M ID is 1198. It doesn't matter whether you decide to change the motherboard or GPU hardware ID as long as you are consistent.
If you search the NVDM.INF file you will find both of the above configurations, along with the hardware ID for each GPU option that shipped from the factory in a stock configuration. So, to make the driver work you change the GPU/mobo hardware ID from a stock configuration to the new GPU/mobo hardware ID for your non-stock configuration. If you are only modding an INF for yourself and don't care if it works on a variety of platforms you can use Notepad's Find/Replace feature to do it in a couple of mouse clicks.
Starting with the hardware ID for a valid configuration (in this case the Alienware 18, which is motherboard hardware ID 05AB) to make the driver work for an M18xR1 with GTX 880M...
Under [NVIDIA_SetA_Devices.NTamd64.6.1], [NVIDIA_SetA_Devices.NTamd64.6.2] and [NVIDIA_SetA_Devices.NTamd64.6.3] you would change the line:
%NVIDIA_DEV.1198.05AB.1028% = Section341, PCI\VEN_10DE&DEV_1198&SUBSYS_05AB1028
to...
%NVIDIA_DEV.1198.048F.1028% = Section341, PCI\VEN_10DE&DEV_1198&SUBSYS_048F1028
... under [Strings] change:
NVIDIA_DEV.1198.05AB.1028 = "NVIDIA GeForce GTX 880M "
to...
NVIDIA_DEV.1198.048F.1028 = "NVIDIA GeForce GTX 880M "
For the M18x R2 you would replace 05AB with 0550 instead of 048F. If you were to take the approach of changing the GPU hardware ID, you would need to also change the GPU name to match the new GPU hardware ID in the [Strings] section.
Using Find/Replace to change the motherboard hardware ID is somewhat easier and faster. For our purposes, using the Notepad "Replace All" feature to change all occurrences of 05AB (Alienware 18 mobo) to 0550 for the M18xR2 would automatically allow that driver package to install with any GPU that came stock in an Alienware 18. So, GTX 765M, 770M, 780M and 880M would be supported in an M18xR2 by that simple INF mod.
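If you would rather script that Replace All than click through Notepad, here is a minimal Python sketch of the same step. Treat it as a starting point rather than a tested tool: the INF path is only an example (point it at the NVDM.INF inside your own extracted driver package), and it assumes the IDs appear exactly as quoted above.
Code:
# Minimal sketch of the "Replace All" mobo-ID swap described above.
# Assumption: the path below is an example only -- change it to the NVDM.INF
# inside your own extracted driver package, and check the file encoding.
from pathlib import Path

inf_path = Path(r"C:\NVIDIA\DisplayDriver\Display.Driver\nvdm.inf")

text = inf_path.read_text(encoding="utf-8", errors="ignore")
# Alienware 18 mobo ID (05AB) -> M18xR2 mobo ID (0550), every occurrence
patched = text.replace("05AB", "0550")

inf_path.with_name(inf_path.name + ".bak").write_text(text, encoding="utf-8")  # keep a backup
inf_path.write_text(patched, encoding="utf-8")
print(f"{text.count('05AB')} occurrences of 05AB changed to 0550")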
If you go the other route and change the GPU hardware ID (say changing 11A0 for 680M to 119F for 780M, or 1198 for 880M) the mod will work equally well, but it would only work for that one GPU/mobo configuration. This is OK if the mod is for you alone, or another R2 owner with 880M. However, you would also need to make one extra mod under [Strings] to change the GPU name to match. If you change 11A0 to 1198 it will still say "680M" at the end of that line of code. The driver will install, but the GPU(s) will be identified incorrectly as 680M. So, you have to manually change 680M to 880M. However, if you change the mobo hardware ID the GPU name will already be correct under the [Strings] section.
In other words, if you change NVIDIA_DEV.11A0.0550.1028 = "NVIDIA GeForce GTX 680M "
for use with 880M the editing looks like this: NVIDIA_DEV.1198.0550.1028 = "NVIDIA GeForce GTX 880M"
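A scripted version of that GPU-ID route might look like the sketch below. Same caveats as the earlier sketch: the path is an example, and the stock 680M/M18xR2 lines should be checked against your own NVDM.INF before running anything, since exact spacing can vary between driver releases.
Code:
# Minimal sketch of the GPU-ID route: swap 680M (11A0) -> 880M (1198) for the
# M18xR2 (0550) entries only, then fix the [Strings] name so the card is not
# reported as a 680M. Path is an example; verify the lines in your own INF.
from pathlib import Path

inf_path = Path(r"C:\NVIDIA\DisplayDriver\Display.Driver\nvdm.inf")
text = inf_path.read_text(encoding="utf-8", errors="ignore")

swaps = [
    ("NVIDIA_DEV.11A0.0550.1028", "NVIDIA_DEV.1198.0550.1028"),   # device sections + [Strings] key
    ("DEV_11A0&SUBSYS_05501028", "DEV_1198&SUBSYS_05501028"),     # PCI hardware ID
    ('NVIDIA_DEV.1198.0550.1028 = "NVIDIA GeForce GTX 680M',      # marketing name fix
     'NVIDIA_DEV.1198.0550.1028 = "NVIDIA GeForce GTX 880M'),
]

patched = text
for old, new in swaps:
    patched = patched.replace(old, new)

inf_path.with_name(inf_path.name + ".bak").write_text(text, encoding="utf-8")  # backup first
inf_path.write_text(patched, encoding="utf-8")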
Hopefully, this is enough detail to make sense without creating too much confusion.
For additional examples, see these posts:
[Quick Tips] GPU Driver INF Modding for 'Aftermarket Upgrades'
[More Tips] GPU Driver INF Modding for 'Aftermarket Upgrades' -
-
Yup... like anything else, using the right tool for the job is important. If you use something for a purpose that was not intended, you'll have to tolerate compromise and put up with decreased performance. Using a GeForce card for production work isn't going to be a great experience and using a Quadro for gaming isn't going to yield results that rival the GeForce card. You have to make an intelligent decision about which intended purpose is most important to your situation before deciding which machine to purchase.
-
Quadro K5100M has about 90% of 780M's performance (at stock). Of course you can't SLI it up, but not bad at all considering it's a workstation card yet can double as a slightly slower 780M. Goes for about $1000 on eBay, so price isn't too bad either. Probably the best compromise if you need CAD performance but also want to game.
-
Hmmm, a friend of mine was running SolidWorks on his ASUS laptop. If that was CPU-only then it was moving pretty damn well for it.
-
If you were a girl (and maybe a little younger) I would kiss you, Miss Fox!
Mr. Fox likes this. -
Mr. Fox likes this.
-
Meaker@Sager Company Representative
-
Sent with love from my Galaxy S4
reborn2003 likes this. -
Will test it out later today. I have SolidWorks & AutoCAD, although I have no clue how to make it work.
So I will let a friend test it for me.
deadsmiley, reborn2003 and Mr. Fox like this. -
There might be something that needs to be downloaded from the NVIDIA Developer Zone.
-
Oh yeah, what am I looking for Brother Fox?
Edit: That NVIDIA FaceWorks video is looking pretty damn spectacular! -
-
Mr. Fox likes this.
-
I just pretend the opposing set of knuckles is a face and then it's a lot more fun.
-
I just downloaded and checked out that FaceWorks demo that you mentioned bro Johnksss.
I had seen it in an NVIDIA demo video and it looked amazing... now it's running on my machine and it looks even more amazing.
Makes me want to see a whole game made from this kind of quality. -
I can settle for a bearhug! -
Mr. Fox likes this.
-
Meaker@Sager Company Representative
Yeah they made slight hardware differences between them.
-
Yep. The Quadro gets ECC RAM. For some reason the memory bandwidth is nearly double the GTX.
Sent with love from my Galaxy S4 -
Meaker@Sager Company Representative
No, the bandwidth is the same. It's just a small part of the core that IDs it as a Quadro or not and allows the use of the Quadro drivers, since those are what you pay for.
Mr. Fox likes this. -
So... how do I keep my 880M from throttling due to power limits?
Sent with love from my Galaxy S4 -
-
Dunno. I know it throttles below 954 MHz running 3DMark 11. P0 is active.
Sent with love from my Galaxy S4 -
-
I have looked at the modified GTX 780M BIOS vs. the stock GTX 880M BIOS. It looks very different. So far my powers of deduction have failed me. I thought I might learn something but all I learned is that I need to learn more.
-
woodzstack Alezka Computers , Official Clevo reseller.
-
How did you get the heatsink to fit? Just got mine and the backplate uses much smaller screws than the 680M heatsink uses.
Mr. Fox likes this. -
Posted this answer in the other thread... if you're using what was provided with a Clevo GPU it is the wrong support plate (aka back plate, "X" bracket)... 680M heat sinks and SLI bridge will work perfectly for 780M or 880M. Not only is having larger screws a little better, the heat spreader is there for a reason. You place the heat spreader on the bank of vRAM on the outer edge of the PCB, not the bank of vRAM that spans the width of the card.
deadsmiley and johnksss like this. -
Meaker said this:
Re: HTWingNut Review of Sager NP8268 / Clevo P150SM-A
The TDP of the 880M will never let it fully turbo in an intense game, raise that up and you should be good to go.
In response to my statement that I was underclocking my 880M in my Sager NP8278 by 50MHz.
Not sure how to go about increasing the TDP. I am underclocking because it gets quite warm (right at 90°C max). Turning on max fans while gaming helps. I haven't recorded the max temp without underclocking with the fans on max.
I know this is the AW section. It is also the most complete review I have seen for the 880M, which is why I am posting here.
Mr. Fox likes this. -
Indeed, Brother Johnksss did a nice review and he is far more thorough in his testing than most. You can take what you read in most "professional" reviews with a grain of salt, because they typically cover only the very basics for stock performance or very mild overclocking. As such, most of those "professional" reviews are not very useful if you are a performance enthusiast.
You change the TDP using EVGA Precision X or NVIDIA Inspector. See the indicated adjustments in the screenshot. For best results, prioritize achieving the power target over maintaining temperature. Move both the Power and Temperature Target sliders all the way up, as far as they allow. There is a sensor for this in HWiNFO64 that shows the results. If you use NVIDIA Inspector, you have to change and apply the new settings for each GPU independently using the drop-down menu; NVIDIA Inspector does not change these settings synchronously across GPUs the way EVGA Precision X does.
If those adjustments do not work, you will need to wait until SVL7 and Johnksss release an "unlocked" 880M vBIOS. If you are using MSI Afterburner and it does not work, don't assume anything because it may work with EVGA Precision X and/or NVIDIA Inspector even though it does not with MSI Afterburner. (GTX 780M Voltage Control does not work with MSI Afterburner, but it does with both of the other utilities using an unlocked vBIOS.)
deadsmiley likes this. -
Meaker@Sager Company Representative
As for temperatures, they can be brought down by replacing the thermal pads with thinner ones (if you can) and doing a good paste job with a high-quality compound.
deadsmiley likes this. -
-
If it is locked with EVGA Precision X as well, then until an unlocked vBIOS makes the Power Target available, uncheck 'prioritize temperature' and move that slider to the right. You can also experiment with NVIDIA Inspector batch files to see if inputting the commands allows you to change the Power Target even though the software UI does not.
You can put this into a batch file in the NVIDIA Inspector folder and see if it works when you run it. If you need to, modify the path to where you have NVIDIA Inspector unzipped. I have it at "D:\nvidiaInspector" on my system. If that changes your power targets, then you can edit the rest of the batch file to set the core and memory clocks the way you want them. If you have one GPU, the second line of code can be deleted. It applies to the second GPU in an SLI system.
Code:
D:\nvidiaInspector\nvidiaInspector.exe -setBaseClockOffset:0,0,135 -setMemoryClockOffset:0,0,500 -setVoltageOffset:0,0,12500 -setPowerTarget:0,222 -setTempTarget:0,0,93
D:\nvidiaInspector\nvidiaInspector.exe -setBaseClockOffset:1,0,135 -setMemoryClockOffset:1,0,500 -setVoltageOffset:1,0,12500 -setPowerTarget:1,222 -setTempTarget:1,0,93
deadsmiley likes this. -
Ok, I will play with this. I tried a couple of things before with batch files, but used different arguments.
-
My results:
AW18 with GTX 880M SLI: NVIDIA GeForce GTX 880M video card benchmark result - Intel Core i7-4900MQ,Alienware 0FT9KT
AW18 with a single GTX 880M (SLI Disabled) NVIDIA GeForce GTX 880M video card benchmark result - Intel Core i7-4900MQ,Alienware 0FT9KT
These are all done at stock speeds on Fire Strike 1.1. I am (very) far from your own results, where a single card is almost as fast as my SLI setup!
Any idea what might explain those differences? -
Mr. Fox likes this.
-
Where did you get those drivers?
Here's a link to the current beta drivers: http://www.geforce.com/drivers/results/74637 -
http://nvidia.custhelp.com/app/answers/detail/a_id/3491
I needed the hotfix for Hyper-V.
Sent from my SM-N9005 using Tapatalk -
I am not running Hyper-V, so I didn't need the hotfix.
It makes me wonder what the hotfix broke.