Nvidia support? Maybe
Intel support? Fat chance
-
ratchetnclank Notebook Deity
“This feature can give gamers an extra boost in gameplay FPS on Intel’s new 11th generation H/S and select 10th generation systems when paired with supported graphics cards, including NVIDIA GeForce RTX 30 Series GPUs.”
https://www.nvidia.com/en-us/geforce/news/geforce-rtx-30-series-resizable-bar-support/
However, it's unlikely laptop manufacturers would release the VBIOS and BIOS updates needed for it, I guess.
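For what it's worth, if those updates ever do arrive, one way to sanity-check whether Resizable BAR is actually active is to compare the BAR1 aperture against total VRAM via nvidia-smi. A rough sketch, assuming nvidia-smi is on PATH and that the memory report keeps its usual layout (field layout can vary by driver version):

```python
import re
import subprocess

# Query the memory report from nvidia-smi. This assumes the usual
# "FB Memory Usage" section followed by a "BAR1 Memory Usage" section,
# each with a "Total" line; layout may differ on some driver versions.
out = subprocess.run(
    ["nvidia-smi", "-q", "-d", "MEMORY"],
    capture_output=True, text=True, check=True,
).stdout

totals = [int(m) for m in re.findall(r"Total\s*:\s*(\d+)\s*MiB", out)]
fb_total, bar1_total = totals[0], totals[-1]  # framebuffer first, BAR1 last

# With Resizable BAR active, the BAR1 aperture typically spans (roughly)
# the whole framebuffer; without it, it is usually just 256 MiB.
print(f"VRAM: {fb_total} MiB, BAR1 aperture: {bar1_total} MiB")
print("Resizable BAR looks ACTIVE" if bar1_total >= fb_total
      else "Resizable BAR looks inactive")
```
-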
If you shrug off the speculation... people who are under NDA and have the mobile 3080 in their possession are claiming a 40 percent upgrade over the 2080, and that's about as big of a jump as we have ever seen...
I think it's decided: I'm upgrading to the 3080 Max-P, no point in the 3060 or 3070. Looking forward to 120+ FPS in older-ish titles, as I have about 300 AAA games in my backlog... time to get back into gaming for more than an hour a week -
Of course they will be supporting it in products they are planning to sell. -
Kunal Shrivastava Notebook Consultant
MXM, good hope???
As for actual performance, do you guys agree we will see a 40 percent bump, or was that guy talking best case? -
https://www.clevo.com.tw/ces_2021/X170KM-G.html
The same chassis is used. -
"With our new laptops we’re introducing Dynamic Boost 2.0, which uses AI to balance power between the CPU, GPU and GPU memory. The AI networks in Dynamic Boost 2.0 manage power on a per-frame basis, so your laptop is constantly determining where power is needed the most and optimizing for maximum performance."
-
As I have said from the beginning: back to the old days with castrated mobile cards.
http://forum.notebookreview.com/thr...scale-on-laptops.834039/page-21#post-11064334
For their latest generation, while not being explicitly noted by NVIDIA, it looks like a change in nomenclature is at hand. In the past couple of generations the company has simply referred to their mobile parts with the same name as their desktop parts – e.g. GeForce RTX 2080. However for the RTX 30 series, we’re seeing parts listed as “laptop GPU”, e.g. “GeForce RTX 3080 Laptop GPU”. It’s a small change, but an important one: as we look at the specifications, it’s clear that NVIDIA isn’t always using the same GPU in a mobile part as they are in its desktop counterpart, so any kind of true equivalency between laptop and desktop has gone out the window.
Yeah, Nvidia got it as they wanted: doubled up with crippled M-branded cards. Some of us saw this coming when they added the Max-Q lineup. And here we sit in 2021 with only Max-Qrippled in all shapes and forms (different SKUs with different TGP and vRAM, depending on how crippled you want it).
How many will be screwed with cards maxed out at the lowest possible TGP? Branding it "Laptop GPU" is a new way to try to lure people into a disaster purchase. Nice!
NVIDIA’s Dynamic Boost is designed to take advantage of the fact that in many laptop designs, the GPU and the CPU share a common thermal budget, typically because they are both cooled via the same set of heatpipes. In practice, this is usually done in order to allow OEMs to build relatively thin and light systems, where the cooling capacity of the system is more than the TDP either of the CPU or GPU alone, but less than the total TDP of those two processors together. This allows OEMs to design around different average, peak, and sustained workloads, offering plenty of headroom for peak performance while sacrificing some sustained performance in the name of lighter laptops.
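NVIDIA hasn't published how Dynamic Boost actually decides, so purely as a toy illustration of that shared-budget idea, here is a minimal sketch with invented wattages (none of these numbers come from NVIDIA):

```python
# Toy model of a shared CPU+GPU thermal budget (all numbers invented).
# The chassis can sink less than the sum of both processors' max TDPs,
# so a boost algorithm shifts watts to whichever side is the bottleneck.

CHASSIS_BUDGET_W = 135          # total sustained cooling capacity (assumed)
CPU_RANGE_W = (35, 65)          # min/max CPU package power (assumed)
GPU_RANGE_W = (80, 115)         # min/max GPU TGP (assumed)

def allocate(gpu_bound_fraction: float) -> tuple[float, float]:
    """Split the budget given how GPU-bound the current frame is (0..1)."""
    cpu_min, cpu_max = CPU_RANGE_W
    gpu_min, gpu_max = GPU_RANGE_W
    spare = CHASSIS_BUDGET_W - cpu_min - gpu_min   # watts left to assign
    gpu_w = min(gpu_max, gpu_min + spare * gpu_bound_fraction)
    cpu_w = min(cpu_max, CHASSIS_BUDGET_W - gpu_w)
    return cpu_w, gpu_w

for frac in (0.2, 0.5, 0.9):    # CPU-bound -> balanced -> GPU-bound
    cpu_w, gpu_w = allocate(frac)
    print(f"GPU-bound {frac:.0%}: CPU {cpu_w:.0f} W, GPU {gpu_w:.0f} W")
```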
"It’s also worth noting that even with the Max-Q designation, it’s likely that some of the new functionality available here will also show up in non Max-Q laptops"
Edit.
See also my post here... http://forum.notebookreview.com/thr...m-owners-lounge.831618/page-374#post-11070420
-
So 3060 vs. 2060 is a 40 percent bump,
2080 Super vs. 3080 is 15 percent,
and the 3070... well, how about that 3070. I think I'll pass.
I have a 1060; I get 10000-12000 in Fire Strike graphics, while the 3080 gets 35000 or so. But I'm getting 40 FPS in the best-looking game (Breakpoint), and I have to debate, like many, how important 100 FPS is in a third-person shooter. Will the game be more fun at a higher framerate, will it be more immersive... I really don't know. I guess I'll have to wait and see prices on a thin-and-light BGA turdbook with the filthy 3080 Max-Q crap. -
-
Yeah, I meant 2080 Super lol... whoops, nice catch Mr. Fox.
-
Why pass on the 3070, though? Seems like the best value of the bunch to me.
-
yrekabakery Notebook Virtuoso
The 3060M looks like the best perf/price to me, since 3060M laptops are supposed to start at $999. The 3060M has more cores than the desktop 3060 (3840 vs. 3584) and a power limit up to 115W (130W with Dynamic Boost 2.0), so it should get fairly close to the desktop version (170W), although it might need a little V/F curve tuning to get there.
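A rough back-of-envelope on that claim, treating performance as cores × clocks and guessing how much clock the lower power limit costs (the power-to-clock exponent below is a hand-wavy assumption, not a measurement):

```python
# Back-of-envelope: 3060 Laptop GPU vs. desktop 3060. Core counts and
# power limits are from the post above; the exponent is a guess.
mobile_cores, desktop_cores = 3840, 3584
mobile_power, desktop_power = 115, 170   # watts (130 W with Dynamic Boost 2.0)

# Clocks scale far less than linearly with power near the top of the
# V/F curve; assume clock ~ power**0.3 as a crude placeholder.
clock_ratio = (mobile_power / desktop_power) ** 0.3
core_ratio = mobile_cores / desktop_cores

estimate = core_ratio * clock_ratio
print(f"Estimated mobile/desktop perf ratio: {estimate:.2f}")  # ~0.95
```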
-
The figures we're seeing for the 3070 laptops so far are proving that. -
Uh, nope, there are a couple of 3060 laptops close to the $1k price. I was skeptical at first too, but here you go.
https://www.bestbuy.com/site/msi-gf...-ssd-8gb-memory-black/6448272.p?skuId=6448272
https://www.bestbuy.com/site/asus-t...pse-grey-eclipse-grey/6448933.p?skuId=6448933
And once Tongfang gets going... -
yrekabakery Notebook Virtuoso
-
Nvidia's recommended pricing for RTX 3070 machines was what... $1299? That's my point. -
Why would 3070 machines be cheaper than used 2070 machines? Nvidia and laptop manufacturers would lose a lot of money.
2070 laptop - Best Buy...
The 3060 is the best bang for the buck... it's not even close.
It's kind of like desktop 3080 vs. 3090: is 500 more bucks worth the 7-12 percent boost, if that...
Nvidia is banking on people wanting the greatest... kind of like cars: do you really need a new Porsche every time a new one comes out?
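The arithmetic behind that "is $500 worth 7-12 percent" question, using the desktop launch MSRPs as the reference point:

```python
# Is +$500 worth a 7-12% uplift? Perf-per-dollar at desktop launch MSRPs.
msrp_3080, msrp_3090 = 699, 1499  # USD

for uplift in (0.07, 0.12):
    ratio = ((1 + uplift) / msrp_3090) / (1.0 / msrp_3080)
    print(f"At +{uplift:.0%}, the 3090 gives {ratio:.0%} "
          f"of the 3080's performance per dollar")  # ~50-52%
```
-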
Nvidia said laptops with 3060s would start at $999; that's been proven, regardless of various opinions, nuff said. What's next, claiming that they "really" meant with tax included, xx GB of RAM, xxx Hz screen, etc., etc.? LOL
-
yrekabakery Notebook Virtuoso
-
lol... nice
To prove it for or against, post a picture of all 8 threads capped at 98-100 percent in any modern game... long story short, ain't gonna happen -
The consoles have 8-core CPUs now, having half as many is a death sentence.
The proof you want is easily attainable, to the point of not being worth addressing. Max out four cores/eight threads? Come on man. -
-
I'm sure there will be 6+ core options in other models this year; we haven't seen all the 3060 models so far. I certainly don't think 4-core CPUs are obsolete, that's crazy.
-
yrekabakery Notebook Virtuoso
In my experience, modern open-world and large-scale multiplayer AAA games at high refresh rates are more limited by memory bandwidth than by CPU core/thread count when the GPU is not the bottleneck. 6-8 core CPUs have been commonplace in gaming laptops for a few years, but RAM has not kept pace. Mainstream mobile CPUs now have 50-100% more cores/threads, yet RAM improvements have barely mattered since the 2400-2666MHz of the quad-core days, because the usual 2933-3200MHz you find in gaming laptops these days runs atrocious JEDEC timings (e.g. CL22), so its effective bandwidth is much lower than the frequency suggests, resulting in limited scaling to additional CPU cores/threads.
When I upgraded from an i5-8600K to an i7-8086K, my minimum FPS drops in Battlefield V and large-scale CoD modes (Warzone and Ground War in MW 2019, Combined Arms in Black Ops Cold War) at competitive low settings didn't improve, because I was more memory-limited than thread-limited. I saw a bigger benefit from tuning my memory from the default 2666MHz CL18 JEDEC profile to 3100MHz CL15 with optimized secondary and tertiary timings than I did from overclocking the CPU.
Hopefully once DDR5 gets rolling, we get decent SO-DIMMs that are actually usable in most laptops, so that mobile CPUs are less bottlenecked by memory bandwidth.
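That timings point is easy to quantify: first-word latency in nanoseconds works out to 2000 × CL / (transfer rate in MT/s). A quick sketch with the kits mentioned above:

```python
# First-word CAS latency in nanoseconds: 2000 * CL / MT/s.
# Shows why JEDEC 3200 CL22 is actually no better than the old
# quad-core-era kits, while a tuned 3100 CL15 profile is a real win.
kits = {
    "DDR4-2666 CL18 (JEDEC)": (2666, 18),
    "DDR4-3200 CL22 (JEDEC)": (3200, 22),
    "DDR4-3100 CL15 (tuned)": (3100, 15),
}
for name, (mts, cl) in kits.items():
    latency_ns = 2000 * cl / mts
    print(f"{name}: {latency_ns:.1f} ns")
# DDR4-2666 CL18: 13.5 ns, DDR4-3200 CL22: 13.8 ns, DDR4-3100 CL15: 9.7 ns
```
-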
Kunal Shrivastava Notebook Consultant
And also because of horrible temps on that badly binned chip. -
4 cores paired with a 3080. What a disgusting combo... https://www.notebookcheck.net/The-G...with-an-Nvidia-GeForce-RTX-3080.514777.0.html -
I'm eyeballing the AMD/3080 combo that ASUS is offering. Gonna have to see how things bench there and elsewhere with the rest of the goods coming out before getting too far ahead of myself. Ambivalence is high; I'm running a 1080 DT, which certainly still works, but with Cyberpunk I've had to drop down to HD, a trend I expect to continue. I'm financially independent, but I didn't get there by throwing money away. I gotta see something tangible.
-
Edit: maybe I just forgot how poorly a quad handles games these days... I'd say half can still run well, though.
It looks like 8 cores should be the minimum, according to this video.
-
Well, it's all about what you want to pay for the performance you want. Laptops are mostly used with the built-in screen at native resolution, and the manufacturers aren't going to mismatch GPUs/CPUs too badly, regardless of core counts etc. On desktops it matters more, because users are free to interchange almost everything, resulting in the possibility of large CPU/GPU mismatches and wasted power/money.

Trying to future-proof a laptop past a couple of years is crazy expensive and not a good idea, given how long most gaming laptops last, barring some insane $xxx-per-year warranty. Not to mention they're not CPU/GPU upgradable in most cases anymore. It is frustrating that they won't put in, say, a 9750H hexacore over a 10300H quad core, but I think laptop manufacturers are contractually locked into only the latest generation of CPUs by Intel, I'm guessing. And pricing can't get too far out of control.

Other than multicore/thread stuff there really isn't that much benefit, but sure, consoles do drive minimum specs. I'm kind of skeptical about the real performance of these consoles, tbh; are they really getting 4K/60 in top-level new games for $500?
-
Kunal Shrivastava Notebook Consultant
Nothing running in the background for me except Steam and a couple of other launchers.
Multi-threaded quad cores still have a year or 2 perhaps; single-threaded quads are obsolete. -
I honestly thought a 1080 was enough for ultra in Cyberpunk... so yeah, if you have Pascal or don't mind running medium, now's the time to upgrade -
Kunal Shrivastava Notebook Consultant
I have these settings with a 1080p upscale + 50% NVCP sharpening on a 1440p screen, and honestly, it looks damn good and plays at 50-60. -
Oh nice, what GPU do you have?
-
Kunal Shrivastava Notebook Consultant
Still holding its own; in fact, it still plays great at 1080p.
AC Valhalla gives me about 60 at optimized settings + reworked colours (ReShade preset) + 50% NV sharpen + 80% scale @1440p. Looks better than vanilla too!
After briefly toying around with unlocked frequency for CP2077, I went back to base frequency + 170mV undervolt + DF 'optimized' settings. 40-55 FPS on average + G-Sync is good enough for me, especially with temps not going beyond 65 (very silent fans; 50% max fan speed).
Crysis Remastered plays nice at 1080p with performance RT + high physics (still got to have those reacting palm tree leaves, don't we) + 66% sharpen: ~75 FPS average, 1% low 56 FPS.
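For reference, the pixel math behind those scale settings, as a quick sketch (per-axis scaling assumed, which is how most in-game resolution sliders work):

```python
# Pixel-count math for resolution scaling on a 2560x1440 panel.
native = (2560, 1440)

def render_res(scale: float) -> tuple[int, int]:
    """Per-axis resolution scale, as most in-game sliders apply it."""
    return (round(native[0] * scale), round(native[1] * scale))

for scale in (1.0, 0.8, 0.75):  # native, the '80% scale', 1080p-equivalent
    w, h = render_res(scale)
    frac = w * h / (native[0] * native[1])
    print(f"{scale:.0%}: {w}x{h} = {frac:.0%} of native pixels")
# 80% scale renders only 64% of the pixels; 75% is exactly 1920x1080.
```
-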
Yeah, I almost bought an m15 with a 2070 for 1500 CAD; instead I bought a 1060 locally for 1700... MSI GS63VR 8RE... I prefer to support local businesses instead of giant online corps... but man, do I wish I got the m15. Games got demanding in a hurry... luckily I don't mind 25+ FPS (old school nerd/old timer)
-
Kunal Shrivastava Notebook Consultant
-
Yeah, I know I'm old school; I always use mixed low/high settings even if there is a slight difference.
Some games I run all low, like Crysis 3... really, can you spot the difference when playing? -
Kunal Shrivastava Notebook Consultant
-
I turn everything up to Ultra/Psycho when I want to take a screenshot. -
yrekabakery Notebook Virtuoso
Even 6C/6T is obsolete, unless you like 100% CPU usage and bad frame times. For example (a sketch for computing the 1% lows behind graphs like these follows the list):
Deus Ex: Mankind Divided (2016) - i5-8600K (6C/6T):
Call of Duty: Black Ops Cold War (2020) - i5-8600K (6C/6T):
The Witcher 3: Wild Hunt (2015) - i5-8600K (6C/6T) vs. i7-8086K (6C/12T):
(Look at the huge lurches on the i5 frame time graph)
Battlefield 1 (2016) - i5-8600K (6C/6T) vs. i7-8086K (6C/12T):
Battlefield V (2018) - i5-8600K (6C/6T) vs. i7-8086K (6C/12T):
Crysis 3 (2013) - i5-8600K (6C/6T) vs. i7-8700K (6C/12T):
Just Cause 3 (2015) - i5-8400 (6C/6T) vs. i7-8700K (6C/12T):
The Witcher 3: Wild Hunt (2015) - i5-8400 (6C/6T) vs. i7-8700K (6C/12T):
Watch Dogs 2 (2016) - i5-8400 (6C/6T) vs. i7-8700K (6C/12T):
PlayerUnknown's Battlegrounds (2016) - i5-8400 (6C/6T) vs. i7-8700K (6C/12T):
Deus Ex: Mankind Divided (2016) - i5-8400 (6C/6T) vs. i7-8700K (6C/12T):
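If you want to quantify lurches like those yourself, the usual metric is the 1% low: the average FPS over the slowest 1% of frames. A minimal sketch over a captured frame-time log (the file name is hypothetical; export one column of milliseconds-per-frame from your capture tool of choice):

```python
# Compute average FPS and 1% low from a frame-time capture (ms per frame),
# e.g. a single CSV column exported from a frame-time capture tool.
def one_percent_low(frametimes_ms: list[float]) -> float:
    worst = sorted(frametimes_ms, reverse=True)
    worst = worst[: max(1, len(worst) // 100)]   # slowest 1% of frames
    return 1000.0 / (sum(worst) / len(worst))

def average_fps(frametimes_ms: list[float]) -> float:
    return 1000.0 * len(frametimes_ms) / sum(frametimes_ms)

# "capture.csv" is a hypothetical file name; substitute your own log.
with open("capture.csv") as f:
    frametimes = [float(line) for line in f if line.strip()]

print(f"Avg: {average_fps(frametimes):.1f} FPS, "
      f"1% low: {one_percent_low(frametimes):.1f} FPS")
```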
-
Oh wow... am I OK with 12 threads at 60Hz, or should I upgrade? Also, rep for that; learning is a lifelong journey.
-
Even my 9700K's lack of multi-threading is starting to be a hindrance in pushing framerates. I really need to find a cheap 9900K to get me by until 2023. -
yrekabakery Notebook Virtuoso
I remember when I upgraded from an i7-4720HQ (4C/8T 3.6GHz) with DDR3 1600MHz CL9 to an i5-8600K (6C/6T 4.7GHz) with DDR4 3100MHz CL15, my performance in some games more than doubled...
-
How will Ampere scale on laptops?
Discussion in 'Gaming (Software and Graphics Cards)' started by Kunal Shrivastava, Sep 6, 2020.