Nvidia support? Maybe
Intel support? Fat chance
-
-
ratchetnclank Notebook Deity
I think Intel already announced support for both 10th and 11th gen CPUs.
“This feature can give gamers an extra boost in gameplay FPS on Intel’s new 11th generation H/S and select 10th generation systems when paired with supported graphics cards, including NVIDIA GeForce RTX 30 Series GPUs.”
https://www.nvidia.com/en-us/geforce/news/geforce-rtx-30-series-resizable-bar-support/
However, it's unlikely laptop manufacturers will release the VBIOS and BIOS updates needed for it, I guess.
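If the updates do land, there's a quick way to sanity-check Resizable BAR yourself: with ReBAR active, the BAR1 aperture should roughly match total VRAM instead of the classic 256 MiB. A minimal Python sketch, assuming nvidia-smi's usual text layout:

```python
# Hedged check: parse nvidia-smi's memory report (exact format may vary by driver).
import re
import subprocess

out = subprocess.run(["nvidia-smi", "-q", "-d", "MEMORY"],
                     capture_output=True, text=True, check=True).stdout

bar1 = re.search(r"BAR1 Memory Usage\s*\n\s*Total\s*:\s*(\d+)\s*MiB", out)
vram = re.search(r"FB Memory Usage\s*\n\s*Total\s*:\s*(\d+)\s*MiB", out)
if bar1 and vram:
    bar1_mib, vram_mib = int(bar1.group(1)), int(vram.group(1))
    print(f"BAR1 aperture: {bar1_mib} MiB, VRAM: {vram_mib} MiB")
    # A small fixed aperture (typically 256 MiB) means ReBAR is off.
    print("Resizable BAR looks", "ON" if bar1_mib >= vram_mib else "OFF")
```
-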
If you shrug off the speculation... people who are under NDA and have the mobile 3080 in their possession are claiming a 40 percent upgrade over the 2080, and that's about as big a jump as we have ever seen...
I think it's decided: I'm upgrading to the 3080 Max-P, no point in the 3060 or 3070. Looking forward to 120-plus FPS in older-ish titles, as I have about 300 AAA games in my backlog... time to get back into gaming for more than an hour a week -
I meant bringing support to their older generations of CPUs.
Of course they will be supporting it in products they are planning to sell. -
Kunal Shrivastava Notebook Consultant
I think we'll have to rely on dsanke/prema BIOS for that; the main question is physical compatibility of the 3080 MXM boards with older form factors. -
mxm good hope???
As for the actual performance, do you guys agree we will see a 40 percent bump, or was that guy talking best case? -
Linus from Linus Tech Tips mentioned January 26th as the day the embargo lifts.
It's pretty much guaranteed. Clevo released a marketing video previewing the X170KM-G, which is the 11th Gen Core and RTX 3000 refresh of the X170SM-G.
https://www.clevo.com.tw/ces_2021/X170KM-G.html
The same chassis is used. -
I was talking about Dynamic Boost 2.0
"With our new laptops we’re introducing Dynamic Boost 2.0, which uses AI to balance power between the CPU, GPU and GPU memory. The AI networks in Dynamic Boost 2.0 manage power on a per-frame basis, so your laptop is constantly determining where power is needed the most and optimizing for maximum performance."
Maybe best case scenario, with RT and DLSS enabled, so the RTX 3080 Max-P takes advantage of the 2nd-gen RT cores and the new 3rd-gen Tensor cores; that should give an extra boost in favor of the RTX 3000 series. -
NVIDIA Announces GeForce RTX 30 Series for Laptops: Mobile Ampere Available Jan. 26th tomshardware.com
As I have said from the beginning: back to the old days with castrated mobile cards.
http://forum.notebookreview.com/thr...scale-on-laptops.834039/page-21#post-11064334
For their latest generation, while not being explicitly noted by NVIDIA, it looks like a change in nomenclature is at hand. In the past couple of generations the company has simply referred to their mobile parts with the same name as their desktop parts – e.g. GeForce RTX 2080. However for the RTX 30 series, we’re seeing parts listed as “laptop GPU”, e.g. “GeForce RTX 3080 Laptop GPU”. It’s a small change, but an important one: as we look at the specifications, it’s clear that NVIDIA isn’t always using the same GPU in a mobile part as in its desktop counterpart, so any kind of true equivalency between laptop and desktop has gone out the window.
"Laptop graphics adapters have always been their own product family, and now by returning to having distinct naming and specifications for their laptop adapters, NVIDIA has gone back to more clearly defining this structure"
Yeah, Nvidia got it the way they wanted. They doubled up with crippled M-branded cards. Some of us saw this coming when they added the Max-Q lineup. And here we sit in 2021 with only Max-Qrippled in all shapes and forms (different SKUs with different TGP and vRAM depending on how crippled you want it).
How many will be screwed with cards maxed out at the lowest possible TGP? Branding it "Laptop GPU" is a new way to try to lure people into a disaster purchase. Nice!
NVIDIA’s Dynamic Boost is designed to take advantage of the fact that in many laptop designs, the GPU and the CPU share a common thermal budget, typically because they are both cooled via the same set of heatpipes. In practice, this is usually done in order to allow OEMs to build relatively thin and light systems, where the cooling capacity of the system is more than the TDP either of the CPU or GPU alone, but less than the total TDP of those two processors together. This allows OEMs to design around different average, peak, and sustained workloads, offering plenty of headroom for peak performance while sacrificing some sustained performance in the name of lighter laptops.
"It’s also worth noting that even with the Max-Q designation, it’s likely that some of the new functionality available here will also show up in non Max-Q laptops"
Edit.
See also my post here... http://forum.notebookreview.com/thr...m-owners-lounge.831618/page-374#post-11070420
-
Finally... an admission of guilt. Selling compromised garbage that is poorly designed and [mal]functions 'as intended' is status quo for an industry run by dishonest losers. All this nonsense is bestowed on us under the auspices of offering really cute trash that is easy for wimps to pack around like a glorified smartphone. Keep the money flowing and they'll continue to produce that broken trash in equal measure, at a price as high as the market of silly people will allow.
-
So 3060 vs 2060 is a 40 percent bump,
2080 Super vs 3080 is 15 percent,
and the 3070... well now, how about that 3070. I think I'll pass.
I have a 1060 and get 10000-12000 in Fire Strike Graphics; the 3080 gets 35000 or so. But I'm getting 40 FPS in the best-looking game (Breakpoint) and have to debate, like many, how important 100 FPS is in a third-person shooter. Will the game be more fun at a higher framerate, will it be more immersive... I really don't know. I guess I'll have to wait and see prices of a thin-and-light BGA turdbook with the filthy 3080 Max crap.
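Napkin math on those Fire Strike numbers, taking the quoted scores at face value:

```python
# Rough scaling from the Fire Strike Graphics scores quoted above.
gtx1060_score = 11000      # midpoint of the 10000-12000 range
rtx3080m_score = 35000     # "35000 or so"

scale = rtx3080m_score / gtx1060_score
print(f"~{scale:.1f}x")                  # ~3.2x
# If Breakpoint stays GPU-bound at the same settings, 40 fps today would
# land near 40 * 3.2 = ~127 fps. Synthetic scores rarely translate 1:1
# to games, so treat this as a best-case upper bound.
print(f"~{40 * scale:.0f} fps best case")
```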
-
I think there was a typo in his post. Probably meant 2080 Super.
-
Yeah, I meant 2080 Super lol... whoops, nice catch Mr. Fox.
-
Why pass on the 3070, though? It seems like the best value of the bunch to me.
-
yrekabakery Notebook Virtuoso
3060M looks like the best perf/price to me, since 3060M laptops are supposed to start at $999. The 3060M has more cores than the desktop 3060 (3840 vs. 3584) and a power limit up to 115W (130W with Dynamic Boost 2.0), so it should get fairly close to the desktop version (170W), although it might need a little V/F curve tuning to get there.
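Rough back-of-envelope on that claim (real performance depends on clocks, binning and cooling, so treat it as a sketch):

```python
# Napkin comparison of 3060M vs. desktop 3060 from the specs above.
cores_mobile, cores_desktop = 3840, 3584
watts_mobile, watts_desktop = 115, 170

print(f"shader units: {cores_mobile / cores_desktop:.2f}x")  # ~1.07x for mobile
print(f"power budget: {watts_mobile / watts_desktop:.2f}x")  # ~0.68x for mobile
# GPUs run well past their efficiency sweet spot at desktop power levels, so
# ~68% of the watts should cost a lot less than 32% of the performance,
# especially with the extra cores and a tuned V/F curve.
```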
-
Not happening. Nvidia's "recommended" pricing was purely PR.
The figures we're seeing for the 3070 laptops so far are proving that. -
Uh, nope, there are a couple of 3060 laptops close to the $1k price. I was skeptical at first too, but here you go.
https://www.bestbuy.com/site/msi-gf...-ssd-8gb-memory-black/6448272.p?skuId=6448272
https://www.bestbuy.com/site/asus-t...pse-grey-eclipse-grey/6448933.p?skuId=6448933
And once Tongfang gets going... -
yrekabakery Notebook Virtuoso
Not sure what you’ve seen, but the prices I saw looked similar to 20 Series, with 3060M laptops between $1000 and $1500, and 3070M laptops between $1500 and $2000. -
Maybe I've missed some. Link to $1k USD RTX 3060 laptops?
Nvidia's recommended pricing for RTX 3070 machines was what... $1299? That's my point. -
Why would the 3070 machines be cheaper than used 2070 machines? Nvidia and laptop manufacturers would lose a lot of money.
2070 laptop - Best Buy..
3060 is the best bang for buck... it's not even close.
It's kinda like desktop 3080 vs 3090: is 500 more bucks worth the 7-12 percent boost, if that?
Nvidia is banking on people wanting the greatest... kinda like cars, do you really need a new Porsche every time a new one comes out? -
Nvidia said laptops with 3060s would start at $999; that's been proven, regardless of various opinions. Nuff said. What's next, claiming that they "really" meant with tax included, xx GB of RAM, xxx Hz screen, etc., etc. LOL
-
I've not seen 3070 machines below $1500+, but the markup on stuff like gaming laptops is even higher than on your average item. The $999 claim, from what I saw, was only for 3060 versions; I never saw it for anything above that. They could easily cut profit margins by a large chunk and still be profitable companies. Hopefully the consoles can knock Nvidia upside the head.
-
yrekabakery Notebook Virtuoso
$999 2060 laptops also generally came with quad-core CPUs, so idk why you’re moving the goalposts now. -
lol....nice
The 8750H is commonly used, but yeah, quad-core CPUs are not obsolete.
To prove it one way or the other, post a picture of all 8 threads capped at 98-100 percent in any modern game... long story short, ain't gonna happen.
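If anyone wants to actually measure it instead of posting screenshots, a minimal Python sketch with psutil (pip install psutil) logs per-thread usage while the game runs:

```python
# Log per-logical-core usage once a second for ~30 seconds while gaming.
import psutil

for _ in range(30):
    per_core = psutil.cpu_percent(interval=1.0, percpu=True)
    print(" | ".join(f"{p:5.1f}%" for p in per_core))
    if all(p >= 98.0 for p in per_core):
        print("All logical cores pegged at 98-100%!")
```
-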
Held the exact same opinion of those, just wasn't relevant to voice in an Ampere thread.
Quad cores are done. In a year system requirements will be listing 6-core CPUs as minimums.
The consoles have 8-core CPUs now, having half as many is a death sentence.
The proof you want is easily attainable, to the point of not being worth addressing. Max out four cores/eight threads? Come on man. -
Dude, I have almost every game and never max out my 8750H 6-core CPU... heck, even AoE 3 Definitive Edition doesn't max it out. 4 cores is not obsolete quite yet. As for the new consoles, you are correct, but wait a year.
-
I'm sure there will be 6+ core options in other models this year; we haven't seen all the 3060 models so far. I certainly don't think 4-core CPUs are obsolete, that's crazy.
-
yrekabakery Notebook Virtuoso
In my experience, modern open-world and large-scale multiplayer AAA games at high Hz are more limited by memory bandwidth than by CPU core/thread count when the GPU is not the bottleneck. 6-8 core CPUs have been commonplace in gaming laptops for a few years, but RAM has not kept pace. Mainstream mobile CPUs have 50-100% more cores/threads now, yet RAM has barely nudged past the 2400-2666MHz used during the quad-core days, because the usual 2933-3200MHz you find in gaming laptops these days uses atrocious JEDEC timings (e.g. CL22). Their effective bandwidth is much lower, resulting in limited scaling to additional CPU cores/threads.
When I upgraded from an i5-8600K to i7-8086K, my minimum FPS drops in Battlefield V and large scale CoD modes (Warzone and Ground War in MW 2019, Combined Arms in Black Ops Cold War) at competitive low settings didn't improve because I was more memory than thread limited. I saw a bigger benefit tuning my memory from the default 2666MHz CL18 JEDEC profile to 3100MHz CL15 with optimized secondary and tertiary timings, than I saw overclocking the CPU.
Hopefully once DDR5 gets rolling, we get decent SO-DIMMs that are actually usable in most laptops, so that mobile CPUs are less bottlenecked by memory bandwidth.
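Putting numbers on the JEDEC-timings point: first-word latency in ns is 2000 * CL / (transfer rate in MT/s), and peak dual-channel bandwidth is transfer rate * 8 bytes * 2 channels. A quick Python check:

```python
# Latency/bandwidth math for the kits discussed above.
kits = {
    "DDR4-2666 CL18 (old JEDEC)": (2666, 18),
    "DDR4-3200 CL22 (new JEDEC)": (3200, 22),
    "DDR4-3100 CL15 (tuned)":     (3100, 15),
}
for name, (mts, cl) in kits.items():
    latency_ns = 2000 * cl / mts          # first-word latency
    bandwidth_gbs = mts * 8 * 2 / 1000    # peak, dual channel
    print(f"{name}: {latency_ns:.1f} ns, {bandwidth_gbs:.1f} GB/s")
# DDR4-3200 CL22 (13.8 ns) is actually slower to first word than
# DDR4-2666 CL18 (13.5 ns); the bandwidth gain hides a latency regression.
```
-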
Kunal Shrivastava Notebook Consultant
This is the reason I upgraded from the 7700 to the 9900KF:
And also because of horrible temps on that badly binned chip.
The new thing is 35W quad cores and high clocks.
4 cores paired with 3080. What a disgusting combo.... https://www.notebookcheck.net/The-G...with-an-Nvidia-GeForce-RTX-3080.514777.0.html -
I'm eyeballing the AMD/3080 combo that ASUS is offering. Gonna have to see how things bench there and elsewhere with the rest of the goods coming out before getting too far ahead of myself. Ambivalence is high; I'm running a 1080 DT, which certainly still works, but with Cyberpunk I've had to go down to HD, which is a trend I expect to continue. I'm financially independent, but I didn't get there by throwing money away. I gotta see something tangible.
-
Dude, how were you getting 100 percent on all four cores / 8 threads? I suspect something is running in the background... in BFV I get like 30-40% usage with 6 cores and the GPU around 90 percent. I'm downloading BFV right now to confirm. Also, do you have a video of two other people experiencing this? I say this because before my 8750H laptop I had an i5-9300H and was never cut short by the CPU in about 100 games.
Edit: maybe I just forgot how poorly a quad handles games these days... I'd say half can still run well though...
It looks like 8 cores should be the minimum, according to this video
-
Well, it's all about what you want to pay for the performance you want. Laptops are mostly used with the built-in screen at native resolution, and the manufacturers aren't going to mismatch GPUs/CPUs too badly, regardless of core counts etc. With desktops it matters more, because users are free to interchange almost everything, resulting in the possibility of large CPU/GPU mismatches and wasted power/money. Trying to future-proof a laptop past a couple of years is crazy expensive and not a good idea given how long most gaming laptops last, barring some insane $xxx-per-year warranty. Not to mention they're not CPU/GPU upgradable in most cases anymore. It is frustrating that they won't put in, say, a 9750H hexacore over a 10300H quad core, but I think laptop manufacturers are locked into only the latest generation of CPUs by Intel, contractually, I'm guessing. And pricing can't get too much more out of control. Other than multicore/thread stuff there really isn't that much benefit, but sure, consoles do drive minimum specs. I'm kind of skeptical about the real performance of these consoles tbh, are they really getting 4K/60 in TOP LEVEL NEW games for $500?
-
Kunal Shrivastava Notebook Consultant
Here, and notice it's bottlenecking a 1070.
Nothing running in the background for me except steam and a couple of other launchers.
Multi-threaded quad cores (4C/8T) still have a year or two perhaps; single-threaded quads are obsolete. -
Yeah, you're right... very glad I have a 6-core now... I had no idea quads were obsolete.
I honestly thought a 1080 was enough for ultra in Cyberpunk... so yeah, if you have Pascal, or don't mind running medium, now's the time to upgrade. -
Kunal Shrivastava Notebook Consultant
About Cyberpunk, did you have a look at the Digital Foundry optimization guide? Ultra is not worth it on any GPU. Not just Pascal, even a 3080 has trouble there (especially Psycho reflections/RT), all for marginally better visuals.
I have these settings with a 1080p upscale + 50% NVCP sharpening on a 1440p screen and honestly, it looks damn good and plays at 50-60. -
Oh nice, what GPU do you have?
-
Kunal Shrivastava Notebook Consultant
GTX 1080.
Still holding its own, in fact; it still plays great at 1080p.
AC Valhalla gives me about 60 at optimized settings + reworked colours (ReShade preset) + 50% NV sharpen + 80% scale @ 1440p. Looks better than vanilla too!
After toying around with unlocked frequency briefly for CP2077, I went back to base frequency + 170mV undervolt + DF 'optimized' settings. 40-55 FPS on average + G-Sync is good enough for me,
especially with temps not going beyond 65 (very silent fans; 50% max fan speed).
Crysis Remastered plays nice at 1080p with performance RT + high physics (still got to have those reacting palm tree leaves, don't we) + 66% sharpen: ~75 FPS average, 1% lows 56 FPS. -
Yeah, I almost bought an m15 with a 2070 for 1500 CAD; instead I bought a 1060 locally for 1700... MSI GS63VR 8RE... I prefer to support local businesses instead of giant online corps... but man, do I wish I got the m15, games got demanding in a hurry... luckily I don't mind 25-plus FPS (old-school nerd / old timer).
-
Kunal Shrivastava Notebook Consultant
-
Yeah, I know, I'm old school; I always use mixed low/high settings even if there is a slight difference.
Some games I run all low, like Crysis 3... really, can you spot the diff when playing?
Kunal Shrivastava Notebook Consultant
No, unless you stop moving and squint at the grass. -
Yeah I run DF's optimized settings and it runs well.
I turn everything up to Ultra/Psycho when I want to take a screenshot. -
Yeah, I took a pic of all low, then very high, and there was such a slight change I chuckled.
DF is awesome for this type of thing, they really break it down... it's not just Cyberpunk, they have done it a lot and it's very helpful... yeah, since all we get is ports, I say it's mandatory to use mixed settings. -
yrekabakery Notebook Virtuoso
If you have a high refresh rate display, quad cores have been obsolete since 2016.
Even 6C/6T is obsolete, unless you like 100% CPU usage and bad frame times, for example (see the 1% low math sketch after this list):
Deus Ex: Mankind Divided (2016) - i5-8600K (6C/6T):
Call of Duty: Black Ops Cold War (2020) - i5-8600K (6C/6T):
The Witcher 3: Wild Hunt (2015) - i5-8600K (6C/6T) vs. i7-8086K (6C/12T):
(Look at the huge lurches on the i5 frame time graph)
Battlefield 1 (2016) - i5-8600K (6C/6T) vs. i7-8086K (6C/12T):
Battlefield V (2018) - i5-8600K (6C/6T) vs. i7-8086K (6C/12T):
Crysis 3 (2013) - i5-8600K (6C/6T) vs. i7-8700K (6C/12T):
Just Cause 3 (2015) - i5-8400 (6C/6T) vs. i7-8700K (6C/12T):
The Witcher 3: Wild Hunt (2015) - i5-8400 (6C/6T) vs. i7-8700K (6C/12T):
Watch Dogs 2 (2016) - i5-8400 (6C/6T) vs. i7-8700K (6C/12T):
PlayerUnknown's Battlegrounds (2016) - i5-8400 (6C/6T) vs. i7-8700K (6C/12T):
Deus Ex: Mankind Divided (2016) - i5-8400 (6C/6T) vs. i7-8700K (6C/12T):
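For anyone who wants to quantify "bad frame times" in their own runs: 1% lows are just the average FPS over the slowest 1% of frames. A minimal sketch; column names in CSVs from tools like CapFrameX or PresentMon vary, so the input here is just a plain list of frame times:

```python
# Compute average FPS and 1% lows from frame times in milliseconds.
import statistics

def one_percent_low(frametimes_ms: list[float]) -> float:
    """Average FPS over the slowest 1% of frames."""
    worst = sorted(frametimes_ms, reverse=True)
    n = max(1, len(worst) // 100)
    return 1000 / statistics.mean(worst[:n])

# A "smooth" 60 fps run with periodic 50 ms lurches, like the i5 graphs:
frames = [16.7] * 990 + [50.0] * 10
print(f"avg: {1000 / statistics.mean(frames):.0f} fps")   # ~59 fps
print(f"1% low: {one_percent_low(frames):.0f} fps")       # 20 fps
```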
-
Oh wow... am I OK with 12 threads at 60Hz, or should I upgrade? Also, rep for that; learning is a lifelong journey.
-
Yuuuuuup. That's a great point that I forgot to bring up.
Even my 9700K's lack of Hyper-Threading is starting to be a hindrance in pushing framerates. I really need to find a cheap 9900K to get me by until 2023. -
yrekabakery Notebook Virtuoso
I remember when I upgraded from an i7-4720HQ (4C/8T 3.6GHz) with DDR3 1600MHz CL9 to an i5-8600K (6C/6T 4.7GHz) with DDR4 3100MHz CL15, my performance in some games more than doubled...
-
How will Ampere scale on laptops?
Discussion in 'Gaming (Software and Graphics Cards)' started by Kunal Shrivastava, Sep 6, 2020.