So I was looking at GTX 765M and GTX 860M 3DMark Fire Strike graphics results and am a bit confused about why the GTX 765M scored higher.
GTX 765M
NVIDIA GeForce GTX 765M video card benchmark result - Intel Core i7-4700HQ,ASUSTeK COMPUTER INC. G750JW
GTX 860m
NVIDIA GeForce GTX 860M video card benchmark result - Intel Core i7-4700HQ,ASUSTeK COMPUTER INC. G750JM
-
Probably because the driver on the 860M isn't as new; the result page says "driver not approved" on the newer card.
The 860M should be faster than the 765M in all cases, more so for the Maxwell 860M, by at least 30%. Check around the forums here first before making judgment calls based on those bench runs.
Often enough, those scores are difficult to compare, since most of them don't show the actual clock speeds that were used on each machine.
-
Oh, that makes sense. Thank you for clarifying!
-
EDIT: It's not Kepler. The review I watched was incorrect. It is indeed Maxwell.
-
Ran Fire Strike this morning: 1242 MHz after boost (which is perpetual on this low-temperature machine) and 6383 MHz on the memory, scoring 4173 and 4206 (I beat the max GTX 860M score listed on NotebookCheck).
Also, the only time I have been able to get past (or saturate) 1.5 GB of VRAM was in Heaven 4.0 at ultra/extreme, 3840x2160, with 4x antialiasing...
Before 4x AA, 2x AA was only sipping ~1.5 GB (AA seemingly engorges VRAM, not pixels : ) )
Temps on this laptop never, ever go above 60°C, and in LoL at 4K, 458-605 MB of VRAM is used.
Stone Giant, Heaven, Fire Strike: none of them can practically touch the full 2 GB. Practical gaming sits between 460 and 1300 MB. Yes, this is the 4K UHD touch Lenovo Y50.
The GPU becomes challenged (if you want to call it that...) in all practical workloads before you hit a perfect 4K 2 GB saturation, which takes maxed 4K Heaven at 8x AA. 4x AA is right at the limit without quite saturating it, so it's perfectly adequate. And while I definitely see an improvement going to 4x AA, standard 4K gaming is by no means bad. : ) I'm very happy here and don't find AA practical or life-altering at this resolution. My life will go on, even if I daresay it undoubtedly looks better. I've wanted a 30-50 megapixel 15.6-inch screen for some time...
Please give me my 8K 15.6-inch computer display. Then I will be much closer to happy. Mind the colors too, if not 240 Hz. I want a 33.17-megapixel 15.6-inch display now. At least 14 megapixels...
Here's hoping that Apple... next year... I will buy it if it has a decent GPU and a 14-megapixel display. All I need it for is mathematics projects and engineering stuff (and the pretty, obligatorily)~
^^ At that point (8x AA) I'm guessing 2300-2450 MB is 'needed'. 2x AA is ~1500 MB, and 4x AA is 1905-2041 MB (the grand majority of the time it was 1905 bang on). No AA at 4K with ultra and extreme settings is 1050-1200 MB. It sits very pretty.
Metro: Last Light (the most intensive game I know) also doesn't go beyond 1.735 GB under any circumstance. AA eats VRAM.
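The "AA eats VRAM" observation lines up with back-of-the-envelope framebuffer math: MSAA stores multiple samples per pixel, so the sample count multiplies the render-target footprint while the pixel count stays fixed. A rough sketch, assuming a simple 32-bit color plus 32-bit depth/stencil target per sample (real drivers add compression, tiling, and extra buffers, so this is illustrative only):

```python
# Rough MSAA render-target footprint at a given resolution.
# Assumes 4 bytes color + 4 bytes depth/stencil per sample;
# actual driver behavior (compression, extra buffers) will differ.
def framebuffer_mb(width, height, samples, bytes_per_sample=8):
    return width * height * samples * bytes_per_sample / 1024**2

for samples in (1, 2, 4, 8):
    mb = framebuffer_mb(3840, 2160, samples)
    print(f"{samples}x MSAA at 3840x2160: ~{mb:.0f} MB")
```

Even before textures and geometry, going from no AA to 8x at 4K adds several hundred megabytes of sample storage under these assumptions, which is consistent with the jumps measured above.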
Now, can anyone please tell me how to circumvent/remove this +135 MHz 'core' cap? I want to take it to ~1400 MHz, and maybe push the boost clock further. It runs very cool, again never beyond 60-61°C. The grand majority of the time it sits around 48-56°C while stressed/maxed, regardless of title or benchmark. It idles at 38°C flat with a 7-9 GB memory workload (meaning other programs are open; probably Optimus with the GPU virtually off). It's been at 37°C flat right now.
I'm all ears. I remember doing this with my 2012 and 2013 rMBP; however, it's been a while. Does anybody have the hand-applied Inspector settings or a workaround? Please assist with an overclock a couple of standard deviations beyond +135 MHz.
That score you mentioned is overclocked. But my memory is overclocked a few MHz beyond that, given that his score is only 4110. Or he's having throttling issues, though that's probably not likely with a 45W 860M, even if his machine runs around 85°C.
Pleese halp!
Oranjoose
I can't get this icky itch/scratch back there. Will someone oblige me? -
The 860M in the ASUS G750JM is Maxwell. That's HaloGod's score, and there's an entire megathread about it here: http://forum.notebookreview.com/gaming-software-graphics-cards/751352-860m-beast.html
Asus G750JM-T4014H Notebook Review Update - NotebookCheck.net Reviews -
I suppose the review I watched must have been wrong. It said the GPU wasn't soldered on, which, from what I recall, would have meant it was Kepler.
-
It is soldered on, just like the CPU.
-
Then it's a steal at $1249 on Amazon.
-
Sure, as are most compared to Alienware.
-
Alienware? What made you bring that up?
-
You called the G750JM a steal for $1250, which it really isn't considering the Lenovo Y50 is cheaper and the fact that I paid $300 less for the same performance 1.5 years ago with my Y500. It's only a steal if you're used to the Alienware tax.
-
Ah, I see. Well, I've owned both Lenovo and ASUS as well. The Y50 isn't much less than the G750JM.
-
The JW has a 1600x900 screen; the JM has a 1920x1080 screen. They ran the JW at 1600x900 and the JM at 1920x1080. That's your difference. Try plugging the JM into a 1080p screen and see the results.
edit: Never mind, that's not it. I plugged my 860M machine into an old 1024x768 VGA monitor I had, ran it, and ended up with http://www.3dmark.com/3dm/3726555
That Fire Strike score with the 765M is higher than a stock 880M's. Something isn't right; it's an anomaly. -
King of Interns
Hmm, 1300 bucks for a laptop with a soldered-on GPU/CPU! A rip-off, not a steal!
You could fully spec out an M15x with SSD/920XM/680M for that money and wipe the floor with that Asus, lol (at least in games, where it matters).
I am not happy to say this... more sad that a 6-year-old laptop can still do this to a brand-new machine. -
Sarcasm detection error.
-
Nice backtrack.
-
Yeah, thought it was pretty good...
-
Not even close; way too obvious.
-
Way too much testosterone in this thread
-
Hnnnnggggg
-
Seems pretty balanced to me.
-
That's because the 765M score in the OP is bogus. It was obtained using Lucid Virtu MVP GPU virtualization software to artificially boost the 3DMark score. It's kind of like SLI between the Nvidia and Intel GPUs, and basically useless for anything but comparing 3DMark e-peen, since it hurts performance in actual games.
-
more like nerdosterone
-
Geek, not nerd. Noob.
GTX 765M overclock scores vs GTX 860M overclock scores
Discussion in 'Gaming (Software and Graphics Cards)' started by Bor, Aug 3, 2014.