Can we just use the NVIDIA drivers from NVIDIA's site, or is it always the custom Razer ones?
-
Eventually you will be able to use the drivers straight from NVIDIA, but not yet. The latest driver fails to install, boo.
-
mindinversion Notebook Evangelist
Quoted from a few pages back, this is how to hack the new drivers to install.
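The quoted walkthrough lives a few pages back, but for context: the usual way to install a driver that doesn't yet list a brand-new mobile GPU variant is to add the machine's hardware ID to the driver's INF before running setup (with driver-signature enforcement disabled, since the edited INF no longer matches the signed catalog). A rough sketch of what such an edit looks like; the section name and SUBSYS value below are illustrative placeholders, not the Blade Pro's actual values, so pull the real string from Device Manager > Display adapters > Details > Hardware Ids:

```inf
; ILLUSTRATIVE ONLY - section name and SUBSYS value are placeholders.
; In the INF the installer unpacks (e.g. Display.Driver\nv_dispi.inf),
; copy an existing GTX 1080 line and swap in the ID Device Manager reports:

[NVIDIA_Devices.NTamd64.10.0...14393]
%NVIDIA_DEV.1B80.XXXX.YYYY% = Section001, PCI\VEN_10DE&DEV_1B80&SUBSYS_XXXXYYYY

[Strings]
NVIDIA_DEV.1B80.XXXX.YYYY = "NVIDIA GeForce GTX 1080"
```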
Hoof, if you're still out there, did you see any difference in temps, performance, benchmarks... anything that would give a before/after picture with the new drivers? Also, does it fix the G-Sync loading-screen flashes affecting some games? [I know all Blizzard titles do it] -
No offense here, but if you're just playing Blizzard titles, then getting a 1080 is simply overkill. It's well known that Blizzard games are optimized to run on almost any machine, toaster ovens included. I'm not saying Blizzard games are bad, but they are a very low measuring stick and really shouldn't be used as the marker. I love Blizzard games as much as anyone, if not more (I've dumped hundreds of dollars on their products), but I can also play these titles very comfortably on a MacBook Pro under Boot Camp.
-
So does this lead us to believe that Razer has no certification from NVIDIA for the 1080, or is it a case of them not having gotten around to supporting the Blade yet?
-
New device variant. NVIDIA will add the IDs eventually.
-
I have 2 x Samsung 950 Pro 512GB drives.
I know that the NVMe driver is broken with my Alienware; would the same be true of the Razer? -
mindinversion Notebook Evangelist
OK, how about Elite Dangerous VR at ultra settings [tweaked with a little more ambient occlusion and high-quality effects added where applicable] and dev options bumped to 1.5x? I know it'll choke a 4.7GHz desktop 6700K with a reference GTX 1080, and the Blade Pro handles it at the same levels.
-
In all honesty, my review made the fans sound a lot louder than they are, because my camera was close to the machine, I made sure the room was dead silent, and I didn't check whether my camera's microphone was on default settings. But at full load the fans sound like the low setting on a hair dryer. Really not that bad, seeing as my temps are hovering around 80C at full load.
-
That's good to know
-
Just updated my driver to 376.19. As far as Blizzard titles go, I only play SC2. I had screen jumpiness on the driver that was delivered with the notebook (375.74). After updating to 376.09 and now 376.19, the SC2 jumpiness is fixed (didn't have to disable G-Sync). Haven't really done any benchmarks. Just now, after updating to 376.19, I quick-tested Witcher 3, Witcher 2, Dishonored 2, Skyrim SE (64-bit), Skyrim (Enderal) (32-bit), Fallout 4, and SC2. Basically I just checked that each game ran at max resolution (3840x2160), ran my character around a little, listened to the fans spin up, and exited the game (at work now). Maybe Thursday sometime I will look at more details (temps and benchmarks). Anyone have suggestions as to what a taxing recent game might be? Watch Dogs 2?
-
Hoof, I had the Titan GT80 just like you. How do you like the Blade keyboard compared to it? I would say try Battlefield for an intensive game, or Watch Dogs 2. I think my CPU got up to 82C in Watch Dogs.
-
I actually prefer the GT80 keyboard, but bringing that monster to work and back is a chore. I am getting used to the Blade keyboard, though. Probably will try WD2. Trying to stay away from Origin titles. I am a bit upset with MSI right now: no 1080 MXM upgrade modules like they promised. You have to do a trade-in.
-
Yeah, I was bummed over the MXM issue as well; ended up just selling mine. The Blade is a much more portable machine.
-
I ran Watch Dogs 2 yesterday at 4K medium settings and got a constant 59-60 fps. Fun game lol. Played for 6 hours, and MSI Afterburner said the max temp was 87C for the CPU, but the GPU and CPU mostly sat around 79-82C. Every now and then in intense scenes I would get a frame drop to maybe 50 for a second, then right back up. Couldn't get the MSI overlay to run either, but had it logging in the background. The game looked gorgeous at 4K medium settings. Also wanted to note that 375.74, or whatever driver razersupport.com has, is extremely buggy; I kept getting a lot of frame drops with it. I used Hoof's method and am on 376. 369.(something) is the one that caused screen flashes during loads for me.
-
Actually, it might have been the ship-with driver, now that I think about it, that was causing the SC2 jumpy-screen issue. I think it was fine with the one on Razer's site (375.74).
-
Yeah, the 369 ship-with driver had the jumpy screen; 375.74 was giving me frame drops sometimes, but that was a V-Sync/G-Sync issue.
-
Can one of you guys do a 3DMark Firestrike Standard and a Firestrike Ultra run?
Then post the respective links.
Much appreciated. -
-
Hmm do you remember what the GPU/Graphics score was?
-
Graphics was 18154 (?) and CPU was 91_ _(?); I'm trying to read it off my review video, but it looks blurry.
I'll run some more tests and link them when I get a chance.
-
If you create a login on futuremark.com, your final scores will be kept for you on their website, so you can log in to your account and view them at any time.
www.3dmark.com/login
It's also really handy to share URLs to our scores, and even URLs to comparisons between our scores - like out-of-box score vs best OC score - rather than posting screenshots or text info.
It's really nice to get those URLs to do comparisons against our own scores, because they contain CPU/GPU info with the speeds run during the tests. Last edited: Dec 8, 2016 -
Oh sweet, OK. Was wondering if I could do that, since I stupidly paid $30 for 3DMark lol. Didn't realize it was free.
-
Yeah, I keep posting that they are free from their website, and most people only share the "free" test results anyway.
I think you can use the Steam version to save the results on the futuremark.com site too.
Or you could use your Futuremark key from Steam and download the version(s) from Futuremark directly; those should upload results automatically. Last edited: Dec 6, 2016 -
Ah gotcha.
Thank you for looking into it!
Appreciate it. -
I ran Standard and got 9840. Ultra says Advanced Edition only, and I don't want to pay for that.
http://www.3dmark.com/3dm/16532865 -
That seems a little too low. Was G-Sync on by any chance?
You should be in the 17k-18k territory for the graphics score. -
That looks like a score from battery mode.
That's a 1080, so you should be getting roughly double that score plus 25%.
Wow, you gotta dig into your settings and get it out of power-saving mode... -
Clevo 1080s, the true desktop-class ones, score about 22-24k depending on the temperature.
The best I have seen an RBP do as of yet is 186XX. That's around where an overclocked 1070 would peak.
Most other lower-TDP 1080s peak out below 19k, at best 20k. -
A friend of mine is looking into buying one. I told him to get it only if he cares about the build and a decent keyboard.
Performance-wise it really isn't worth the $4000 Razer is charging for it. -
You need to turn off G-Sync and V-Sync, and use a tool like Process Lasso or ParkControl to turn off core parking.
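On the core-parking piece, a third-party tool isn't strictly required; Windows exposes the same knob through powercfg. A sketch, assuming an elevated PowerShell prompt on Windows 10 (CPMINCORES is the built-in alias for the "Processor performance core parking min cores" setting):

```shell
# Unhide the core-parking setting in the Power Options UI (optional)
powercfg -attributes SUB_PROCESSOR CPMINCORES -ATTRIB_HIDE

# Keep 100% of cores unparked on AC and battery, then re-apply the scheme
powercfg -setacvalueindex SCHEME_CURRENT SUB_PROCESSOR CPMINCORES 100
powercfg -setdcvalueindex SCHEME_CURRENT SUB_PROCESSOR CPMINCORES 100
powercfg -setactive SCHEME_CURRENT
```

This is a Windows-only config fragment; Process Lasso/ParkControl do roughly the same thing with a friendlier UI.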
-
Could you guys please run one or two of your favorite games at 1920x1080 resolution for several minutes and tell us how it looks?
Try not to compare it with 4K (of course it would look worse), but with another 17-inch native-1080p display.
Depending on the upscaling algorithm, the image may look jagged (with nearest-neighbor, 1 pixel of the 1080p signal becomes 4 pixels of the same color in 4K, and each set of 4 pixels looks more like a square than one big pixel) or blurred (more complex algorithms).
I would like to know which case we have here with the RBP.
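To make the two cases concrete, here's a small sketch (a hypothetical toy image, nothing RBP-specific): integer nearest-neighbor scaling only duplicates each pixel into a 2x2 block and invents no new colors, while a naive bilinear-style scaler produces blended in-between values, which reads as blur:

```python
import numpy as np

# Tiny stand-in "1080p" image: 2x2 pixels with distinct values.
src = np.array([[10, 20],
                [30, 40]], dtype=np.uint8)

# Nearest-neighbor 2x upscale (the 1080p -> 4K case): every source
# pixel is duplicated into a 2x2 block of the identical value.
nn = src.repeat(2, axis=0).repeat(2, axis=1)

def bilinear_upscale(img, factor):
    """Naive bilinear resampling: sample between source pixels,
    producing averaged in-between values (perceived as blur)."""
    h, w = img.shape
    out = np.empty((h * factor, w * factor))
    for y in range(h * factor):
        for x in range(w * factor):
            # Map output coordinates back into source space.
            sy = min(y / factor, h - 1)
            sx = min(x / factor, w - 1)
            y0, x0 = int(sy), int(sx)
            y1, x1 = min(y0 + 1, h - 1), min(x0 + 1, w - 1)
            fy, fx = sy - y0, sx - x0
            top = img[y0, x0] * (1 - fx) + img[y0, x1] * fx
            bot = img[y1, x0] * (1 - fx) + img[y1, x1] * fx
            out[y, x] = top * (1 - fy) + bot * fy
    return out

bl = bilinear_upscale(src.astype(float), 2)
# nn contains only the original four values; bl contains blends like 15 and 25.
```

Laptop scalers and GPU scaling each pick some variant of the second approach unless integer scaling is explicitly used, which is why opinions differ on how "clean" 1080p-on-4K looks.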
Thank you! -
Yeah, it has jagged edges, though not terribly bad. I occasionally play games at 1080p because of UI scaling.
-
It will look exactly like 1080p, because 4K is an exact 4x multiple of 1080p. -
Ideally - yes.
But it's a little more complicated.
So...
Nope
-
Thank you!
Here is an explanation from another forum:
-
Saying there's no quality loss is objectively a false statement.
People love throwing out the math, but in practice it just isn't true. Put a 1080p monitor next to a 4K monitor running at 1080p, and you'll immediately be able to tell which is the native 1080p.
It also has a lot to do with how good the monitor's scaler is, and laptops tend not to include the best of that technology. And it goes without saying that GPU scaling is far from perfect. Last edited: Dec 8, 2016 -
mindinversion Notebook Evangelist
Just a quick update and word of caution:
I've been having G-Sync issues [the screen kept flashing between color and almost black-and-white] on loading screens in Heroes of the Storm and Overwatch, and I was told to grab the updated driver from Razer's website. The machine then started shutting off during HotS [specifically, every other match].
So I did a recovery restore, downloaded the same driver, and it's gotten worse. Now I don't even need to drop into a match before it shuts down. Temps aren't anywhere near the danger zone [the machine never broke 82C], so it looks like either I got a bum unit or there may be a rash of questionable power supplies out there. I suspect the PSU, as Razer support specifically asked me to take a pic of the sticker on it and send it to them.
I FINALLY get a machine from them that doesn't BSOD or spike to 98C, and it can't handle me.
Go figure >< I've requested an RMA, waiting to hear back.
I'm disappointed as heck; the machine is like a living entity under my hands during gaming... but by now we all know these things happen. I just wish they'd quit happening to me... -
I'm not sure about this one. Have you done that? I put my Acer XB271HK next to my buddy's 1080p 27" IPS when I first got it. You'd really be hard-pressed to tell the difference. You're 100% right for anything but 1080p, but 1080p just takes 4 pixels and treats them as one, like has been said. That, combined with the high quality of the XB271HK's display in general, really makes it look identical to a native 1080p display. It's not like you can see the pixels, so having 4 of them treated as one doesn't really look gimmicky or anything. 1440p, for example, I think looks worse than 1080p on the XB271HK. I suppose the post above about the pixels having clearer edges could be correct, but that's not really an image-quality loss.
I'll concede you may technically be correct about quality loss, but it's nothing you can tell in a real usage scenario.
My Clevo P650RS is not quite the same story with a PenTile 4K display, but even its scaling isn't terrible. -
That's pathetic. Sorry to hear of your troubles, but if you spend 4 grand and have problems playing Blizzard titles, then in my book that's pathetic... Blizzard's requirements are minimal, their games are well optimized, and they should probably be among the first Class A/1 titles tested...
I am literally speechless. I mean, not only are you having issues, but the system is shutting down, so it isn't just lag or anything... Awful... -
This is why I was skeptical as well. The less demanding the title, the more heat is generated, especially if the fps is uncapped or the system is pushing something like 4K: with no cap, the GPU renders as many frames as it can, so even a light title keeps it at full load.
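That intuition can be sketched in a few lines (a toy pacing loop, nothing RBP-specific): with a frame cap, the loop sleeps away the unused frame budget; uncapped, it spins flat out, which is exactly why an "easy" title with no fps limit still pushes the hardware to full load:

```python
import time

def render_frame():
    # Stand-in for the actual per-frame rendering work.
    pass

def run(frames, fps_cap=None):
    """Render `frames` frames, sleeping to hold an fps cap if one is set."""
    frame_budget = (1.0 / fps_cap) if fps_cap else 0.0
    start = time.perf_counter()
    for _ in range(frames):
        t0 = time.perf_counter()
        render_frame()
        if fps_cap:
            # Sleep off the unused slice of the frame budget; with no cap
            # the loop never idles, so the GPU/CPU sit at full load.
            leftover = frame_budget - (time.perf_counter() - t0)
            if leftover > 0:
                time.sleep(leftover)
    return time.perf_counter() - start

elapsed_capped = run(30, fps_cap=60)   # paced: roughly 0.5 s of mostly idle time
elapsed_uncapped = run(30)             # unpaced: finishes almost instantly
```

This is also why capping fps (or enabling V-Sync/G-Sync within range) tends to lower temps in undemanding games.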
The Gsync issues are mainly NVIDIA dropping the ball. -
The most demanding game I've ever run on the 1080 is Warframe at 4K. It pushed the 1080 to 280W (desktop). -
-
It loads the CPU fairly well too.
Warframe is very easy on the CPU. -
mindinversion Notebook Evangelist
And it's handling the HEAT just fine: 82C max on the CPU and the occasional spike to 72C on the GPU, both TOTALLY within spec [Tjunction on the 6700HQ is 100C, and the desktop 1080 SHOULD be able to handle similar temps; obviously we're talking Pascal here, not Fermi... but STILL].
On the one hand, I know Razer contracts some company to build these, and with the stock drivers it DOES kinda sorta work. If the updated drivers are enough to bump the power profile to the point where either the motherboard or the PSU can't handle it, it kinda makes sense. I get that.
On the other hand, you're absolutely right: for 4 grand the thing SHOULD be flawless. The fact that I got two responses from Razer support and then no word for over 24 hours is incomprehensible and EXTREMELY frustrating. I spent the few days before following their instructions, tolerating the original issue, and the machine is absolutely MAGICAL! It's almost soul-crushing.
Now I can hear all the haters crying about how overpriced the machine is, how there are better values out there [there aren't], how I should get X machine with this 1080p matte screen [which I can't stand, and is why I stuck with Razer], or this other huge 12lb monstrosity from Asus or MSI [been there, no thanks].
And it kills me to have yet ANOTHER issue with Razer's machines [my 5th or 6th over the years now], because I have no ammo at this point to defend against said haters.
For anyone contemplating this machine: it's a dream. It's SO much better than the confined space of a desktop, to be able to move and game and work. I'd love to say the chances of this happening to you are minimal [they probably are], but at this point I can't personally vouch for it. Eventually I'll get a working unit, and if I have to send 8 units back to find one, I'll do it. It just kills me that I can't spend all that time... all THIS time... USING a machine I fell in love with that turned out not to work ><
Sorry for the /emo, had to vent a bit. -
mindinversion Notebook Evangelist
Sorry for the double, browser froze up.
-
You should edit the NVIDIA 376.19 driver as shown earlier in this thread; it will fix the graphics issues you experienced. 376.19, no problems here.
-
Problem is, the power limits aren't software-based. They are in hardware.
They can be unlocked to a certain extent using a custom vBIOS, but a full unlock needs hard mods.
Also, this 1080 is hitting 82C AFTER being TDP-limited to 160W. The 180W ones in other laptops obviously have better cooling and run at about 80C max, but they have much more headroom.
Then again, such is the cost of the form factor.
I wanted to recommend this to my friend who wanted a portable, well-built notebook with a 1080, but at $4k the performance doesn't justify the cost.
This 1080 is performing like an overclocked 1070, at best. I've yet to see this system score more than 19k graphics points in Firestrike. And if I'm paying $4k for a laptop with a 1080 in it, it had better perform like a 1080 and not like its little brother.
I wanted to like this machine so badly, ever since I tried it. But first Min's arrogance and literal lying to the audience's face ticked me off too much, and then the driver issues. Not to mention their customer support, which is deteriorating by the day. -
I thought the exact same thing you are saying while I was waiting for my 3200x1800 RB, regarding HD (1600x900) res; it ended up being wasted hope, to be honest.
There's no such thing as a clean 4:1 ratio in practice: HD looks bad, while FHD (1080p) looks almost just like 3200x1800 in any full-screen app/game. It just doesn't work that way at all, as explained above, sadly.
It's probably down to the scaling process used; NVIDIA might be doing a better job than Intel (which would explain why Windows scaling isn't so good in any case, while full-screen apps look good), but still: the higher the res, the better the result, IMHO.
Razer Blade Pro 17" (1080 GPU/late 2016) Owner's Lounge
Discussion in 'Razer' started by reloader-1, Oct 20, 2016.