https://www.digitaltrends.com/computing/nvidia-geforce-rtx-3080-ampere-design-leak/
https://www.tomsguide.com/news/nvid...chmarks-just-leaked-and-amd-should-be-worried
Looks like the new cards, not even the Ti variants, will boost 25 to 30 percent over prosumer cards like the RTX Titan and at least 35 percent over the 2080 Ti...
I'm personally stuck with my 1060 laptop, but I'll most likely be getting a 3080 Ti eGPU setup to run PS5 ports, and I'm pretty sure the 8750H will hold up well against the Ryzen 2 architecture. I'll be gaming at 1080p (to the skeptics), and I'm still not sure it's even possible to get PS5 ports running at 1080p 60fps, even if on PS5 they run at 4K 120fps... optimization this upcoming generation is going to be insane.
-
I've read all the leak articles. I'm going to save my speculation until we see reviews of actual launch hardware.
-
And this coming from Linus... a known shill.
Regardless, I won't hold my breath for the 3000 series. Most games I play suffer enough from utilization problems as it is, though I suppose the real potential is actually 4K 120Hz+ gaming. -
Over the past few years I've found that jumping a generation between purchases is good value even for high-end hardware. Before the current 2080 I had a 980M laptop, and while it was still powerful enough and nowhere near obsolete, the jump in performance to the 2080 laptop was big enough to offset the higher price. In terms of time frames, this translates to 2-3 years.
On the other hand, I'm not impressed by the build quality of the current generation of laptops, and if things don't drastically change in the next couple of years, I'm going back to building a desktop. -
-
It's funny people keep saying Linus is a shill.
The rumors are all over the place, so yeah. We should have something concrete by the end of August. -
shill
/SHil/
INFORMAL•NORTH AMERICAN
noun
- an accomplice of a hawker, gambler, or swindler who acts as an enthusiastic customer to entice or encourage others.
Linus is more negative than positive and I don't see how he's a shill. He has 11M subs and you probably have 200... stop being so jealous. The guy came from nothing into something great. I love his input and at least he knows what he's talking about. Like, seriously, you think you're better? -
-
More like 100 percent of the time it is!!
Like, dude, I don't know if you just joined NBR, but these leaks have never been wrong. It's a leak, not speculation. -
-
electrosoft Perpetualist Matrixist
The upside, as always, is that 2080 prices will drop even further at retail (as they clear out stock) and on the used market, as a glut of 2080 Tis makes its way to eBay and other outlets.
Or I might go "new" again and get a 3000 series (or wait for Big Navi and see what they have to offer).
I'm running a 5700xt + an old HP ZR30 QHD.
My next logical step is a >30" 4K display and a card powerful enough to drive games at 4K. -
Yeah, I know, dude... I'm rarely serious. I was raised in a sarcastic, joking household.
-
But when you cut the power to laptop levels, will it really be a significant improvement?
-
Well, 1060 laptop vs. desktop wasn't too big of a jump; hopefully they cut the difference again and laptop vs. desktop becomes negligible.
OK, so I looked up a Valley benchmark of the 1060: it scored 62.1 fps, while my laptop, which has really good cooling, gets 65 fps. So yeah, the laptop and desktop versions are basically equal, and there's a chance the 3080 versions will be tied the same way. -
Everyone knows this will be a 50 percent jump, and if you're in the market it would be wise to wait for the mobile 3080.
-
ratchetnclank Notebook Deity
-
No one wants to sink 25% more power into a laptop and have it go into hair-dryer mode to get massive gains. The 3090's board power is rumored to be 400W, far more than the 250W the 2080 Ti is rated for. If I recall, the rumored 3080 board power was 350W. I don't trust any of it, though, until reviewers like Hardware Unboxed and Gamers Nexus et al. get their hands on them. If it all turns out to be true, great; I couldn't buy a rumored, unreleased product anyway, so it doesn't matter.
But let's also remember that we're not seeing these numbers in any kind of context, and IT'S ALL STILL RUMOR: we have no verifiable facts at all. I say we just wait until products are actually shipping before trying to speak definitively about them.
I'm going to be in the market for a new desktop card, since I'm not going back to a laptop dGPU and will use an eGPU instead. I'm definitely going to wait and see what AMD releases before buying anything. If it means playing Cyberpunk 2077 on my current system, so be it. -
cj_miranda23 Notebook Evangelist
I'm thinking Nvidia naming their initial high-end card the 3090 means a Ti version may be released later? What do you guys think? If the rumored $1,400 price for the 3090 is true, we might see a $2K-plus 500W Ti version, lmao!
-
I'm thinking the mainstream laptop high end is going to use branding like "Max-Q" and operate at much lower power consumption. Even the dual-power-supply Clevo laptops would need bigger bricks; I'm not saying no one will go there, but it won't be anything close to mainstream. I've actually enjoyed the so-called 2K res with my desktop 1080: it's enough power to drive everything I've tried so far to decent frame rates at that resolution, and to my eye it looks great on a 17" screen. It also means I can run the desktop at native res, not something this old man could do with 4K. I'd want more with a bigger screen, but I did a 19" laptop once and that was too much trouble to pack around, so 17" is really my target. I think it could be pretty good with whatever their top mobile Max-Q-ish part is: 2K, 17", good enough for my laptop. Or at least that's my hope; the parts will have to prove me right before I go there.
-
Yeah, the TDP does seem high... probably a small gain over the mobile 2080 Super, as they'll have to cut the power draw by two thirds.
-
-
Yeah, at the time of posting I was unaware of the TDP values, and unless Nvidia works some magic this could be a disappointing year.
-
I'll need to see how the RTX 3070/3080 actually performs in medium-sized laptops before I jump to the conclusion that mobile 3000 cards will only be a hair faster than the mobile 2000 series. Nvidia did claim a 1.9x perf/watt improvement. Even if that's just marketing BS, GPUs do get disproportionately more efficient as they're underclocked, so we may get 70% of the performance at 40% of the power draw.
To be honest, I hope the inefficiency of the 3000 series drives up average GPU TDPs in laptops, as it will force laptop makers to innovate and create really efficient cooling designs. And if it forces Nvidia to kill off Max-Q GPUs, good riddance.
That said, I do plan on waiting for 5nm Nvidia Hopper for an efficiency boost. In the meantime I'll be investing in a VR setup to tide me over. -
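The underclocking claim a post up (roughly 70% of the performance at 40% of the power) follows from the classic dynamic-power approximation, power ~ frequency x voltage squared. A minimal sketch in Python; the 0.70 clock and 0.75 voltage scales are purely illustrative assumptions, not measured values for any real card:

```python
def relative_power(clock_scale, voltage_scale):
    """Dynamic power approximation: P ~ f * V^2 (classic CMOS model)."""
    return clock_scale * voltage_scale ** 2

# Hypothetical underclock: 70% of stock clocks at ~75% of stock voltage.
perf = 0.70                          # performance roughly tracks clock speed
power = relative_power(0.70, 0.75)   # 0.70 * 0.75^2 = 0.39375
print(f"~{perf:.0%} performance at ~{power:.0%} power")  # ~70% at ~39%
```

This ignores static/leakage power and memory power, so real mobile parts land somewhat worse than the idealized curve, but it shows why a big underclock costs less performance than power.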
TGP aside, it uses a 320-bit memory bus, so I doubt we'll see this in a laptop. We'll have to wait and see.
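For context on why the bus width matters: memory bandwidth is just bus width (in bytes) times the per-pin data rate. A quick sketch, assuming the publicly reported 19 Gbps GDDR6X rate on the desktop 3080:

```python
def bandwidth_gb_s(bus_width_bits, data_rate_gbps):
    """Peak memory bandwidth in GB/s: (bus width in bytes) * per-pin rate."""
    return bus_width_bits / 8 * data_rate_gbps

# Desktop 3080: 320-bit bus at 19 Gbps GDDR6X.
print(bandwidth_gb_s(320, 19))  # 760.0 GB/s
# A narrower 256-bit mobile-style bus at the same rate:
print(bandwidth_gb_s(256, 19))  # 608.0 GB/s
```

A wider bus means more memory packages routed on the board, which is part of why full desktop configurations rarely fit laptop form factors.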
-
OK guys, I'm on a cell during my vacation and I just watched Digital Foundry's exclusive look at the 3080... brace yourselves... I was wrong, it's not a 50 percent increase... it's a whopping 60 to 80 percent faster... un-fricken-believable.
-
Imagine if the 3080 Max-Q comes in at 80/110W vs. the real 3080 at 320W. -
Honestly, I'm not very impressed. It looks like around a 35% performance-per-watt improvement at around 150W, despite a major process node change. Most of that 35% improvement comes from a bigger core allowing lower voltages while maintaining performance. Clocks are down across the entire 30 series, indicating a big voltage dropoff. PPW improvement will likely be only around 20% in the 100W range of mobile parts.
This is interesting for the modding scene, though. Power limit mods are going to bring major performance increases, but also insane amounts of power draw. With how fast these cards are, though, you'd need a 4K 120Hz screen to actually need the extra speed. -
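The perf-per-watt estimate above is back-of-envelope arithmetic. A sketch with illustrative numbers only (a hypothetical card ~70% faster drawing 320W, against a 250W baseline; neither figure is a benchmark result):

```python
def ppw_gain(new_perf, new_watts, old_perf, old_watts):
    """Relative perf-per-watt improvement of the new part over the old."""
    return (new_perf / new_watts) / (old_perf / old_watts) - 1.0

# Hypothetical: +70% performance, 320W vs. a 250W baseline card.
gain = ppw_gain(1.70, 320, 1.00, 250)
print(f"perf/watt gain: {gain:.0%}")  # roughly 33%
```

Plugging in different rumored performance and wattage figures shows how sensitive the "efficiency improvement" headline is to which numbers you believe.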
Nvidia has always been able to bring desktop and laptop closer together. Correct me if I'm wrong, but isn't this the same kind of jump as from Maxwell to Pascal?
-
https://www.tomshardware.com/uk/news/nvidia-geforce-rtx-3080-everything-we-know -
The card is being tested; it was released to Digital Foundry... they are seeing 80 percent gains.
-
I just saw a rumor somewhere that they expect to be supply constrained until 2021. I guess that makes sense, because:
- It's new, and they are always supply constrained at first.
- They want people who are thinking about waiting to see what AMD does to get anxious and pull the trigger on release day. LOL
-
Saw this advertisement on the right-hand side. Had to share it...
-
Oh wow, that was a 3080; the 3090 is double that.
-
yrekabakery Notebook Virtuoso
-
And I do think the Ampere cards are downclocked to be more efficient, so I'm expecting huge gains from shunt modding the cards. -
Well, I'd like to see some user reviews before I go for it, but I trust Digital Foundry. If that's so, I will jump ship to desktop; I'm tired of waiting for 17", 4K, >120Hz screens to show up, as well as seeing throttled GPUs in laptops from Nvidia.
-
A comparison: the GTX 980 was a 180W card on the desktop. The laptop GTX 980 would eat 100-120W depending on the VBIOS and power delivery components (note, NOT the mobile 980M), and it was still 20-40% away from real desktop 980 performance. Nvidia should have just stuck with the M suffix for their GPUs, because they will never be anywhere near their desktop counterparts.
Who knows what the deal will be for the mobile 3080. The Area-51m tried to deliver desktop 2080 performance and ended up blowing MOSFETs left and right. Dell clamped down, reduced the cards' max power consumption, and nerfed them. Keep in mind the desktop 2080's power consumption was around 225-230W at the time. Ampere is not wildly more efficient than Turing; they just made a fat core and fed it more power. -
I'm hoping this forces laptop makers to innovate and come up with really good designs. -
Can't wait to see Ampere in laptops; I hope we get a 300W TDP for the Max-P GPU. I think laptops can handle that kind of power. My laptop can handle 2x200W SLI GPUs and keep them under 80°C. I know it's not the same, but it can't be that far off either.
-
Not thin and light, but not a desktop replacement either. -
-
I doubt most OEMs will be able to handle 200W for a 3080 Super, let alone 3090 Max-P/Q/A-based mobile chips. One lever they have is soldered DRAM/VRAM above 12GB for caching huge textures, with the GPU clocked down to sub-900MHz, pushing 4K 60 easily with DLSS 2.x. So a combination of super-fast NVMe, big VRAM, and more RAM could turn mobile laptop gaming around, in theory. Even if they manage that, there will be heat, and you'd need separate fans and impressive passive cooling to cool the PCIe 4 SSD, 3800MHz+ DRAM, and a 4.5-4.9GHz all-core CPU alongside the GPU. Thin-and-light books have it tough now.
RTX 3080 trumps 2080ti by a whopping margin
Discussion in 'Gaming (Software and Graphics Cards)' started by JRE84, Jun 24, 2020.