The Notebook Review forums were hosted by TechTarget, which shut them down on January 31, 2022. This static read-only archive was captured by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information posted on the forums is preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.
Problems? See this thread at archive.org.

    RTX 3080 trumps 2080ti by a whopping margin

    Discussion in 'Gaming (Software and Graphics Cards)' started by JRE84, Jun 24, 2020.

  1. Papusan

    Papusan Jokebook's Sucks! Dont waste your $$$ on Filthy

    Reputations:
    42,747
    Messages:
    29,856
    Likes Received:
    59,722
    Trophy Points:
    931
    Outstanding GeForce RTX 3080 and RTX 3090 Samples May Be Hard To Come By Tomshardware.com | Today

    Making the GPU silicon lottery harder.

    Igor's Lab, which is well-connected within the graphics card industry, just got the scoop on the quality of silicon that we can expect from Nvidia's GeForce RTX 30-series (codename Ampere) graphics cards.

    As you would expect, initial yields and silicon quality in the early stages of production won't be phenomenal. As production matures, both gradually improve. The early figures for Ampere actually aren't that bad. According to Igor, vendors divide the quality of the silicon into three main categories: Bin 0 for average quality, Bin 1 for good quality, and Bin 2 for very good quality.

    Ideally, you would want to land a chip from Bin 2. Dies from this category reportedly operate at a significantly lower temperature under full load, with higher boost clock speeds at the same voltage. Additionally, there is more overclocking headroom.

    The German publication's sources claim that 30%, 60%, and 10% of the GeForce RTX 3080's dies belong to Bin 0, Bin 1, and Bin 2, respectively. The data is reassuring, since it suggests that the majority of consumers should get a good sample, unless you're one of the unlucky ones who falls inside the 30%.

    This should be the same for the GeForce RTX 3090 as well, except that the percentage of Bin 2 chips is even lower. This is because manufacturers are only receiving a small number of GA102 dies from the initial production. Plus, the RTX 3090 will necessitate the higher-binned GA102 chips in the first place, since it has 82 SMs instead of only 68.

    That last part will likely impact the overall quality of RTX 3080 chips. If most of the Bin 2 chips end up going into the RTX 3090, and there's a reasonable chance that most of the Bin 0 chips will end up in the RTX 3080, your chance of getting a 'very good' RTX 3080 drops. Beyond that, the AIBs will further bin whatever chips they receive and offer varying models of each GPU, with the best chips being reserved for higher factory overclocks ... and higher prices.
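    A quick sketch of what that reported bin split means for buyers (the 30/60/10 shares are the figures quoted above; the simulation itself is purely illustrative):

        import random

        # Reported early bin split for RTX 3080 dies (per Igor's Lab, quoted above):
        # 30% average (Bin 0), 60% good (Bin 1), 10% very good (Bin 2).
        BIN_SHARES = {"Bin 0 (average)": 0.30, "Bin 1 (good)": 0.60, "Bin 2 (very good)": 0.10}

        def draw_die() -> str:
            """Simulate one pull of the silicon lottery."""
            return random.choices(list(BIN_SHARES), weights=list(BIN_SHARES.values()))[0]

        print("sample draw:", draw_die())

        # Chance of at least one Bin 2 sample among n cards: 1 - (1 - 0.10)^n
        for n in (1, 2, 5):
            print(f"{n} card(s): {1 - 0.90 ** n:.0%}")

    So even buying two cards only gives you about a 19% shot at a 'very good' die under these numbers.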
     
    JRE84 likes this.
  2. JRE84

    JRE84 Notebook Virtuoso

    Reputations:
    856
    Messages:
    2,505
    Likes Received:
    1,513
    Trophy Points:
    181
    Does anyone have benchmarks of the 3090?
     
  3. tps3443

    tps3443 Notebook Virtuoso

    Reputations:
    746
    Messages:
    2,422
    Likes Received:
    3,120
    Trophy Points:
    281
    I’m not even attempting to order an Ampere GPU on 09-17 or 09-24. I know they’ll be sold out for a few years and price-gouged by every retailer under the sun until EOL. It really sucks, but this is how it goes with an Nvidia graphics card.

    People are gonna be paying $1,000-plus for RTX 3080s.

    After seeing these numbers, I honestly think I’d be better off buying another 2080 Ti, especially for only $500-$600. Because we all know that an RTX 3080, being only a little faster, will fetch $1,299 on eBay.

    Towel thrown in! I’m not buying an Ampere GPU.
     
  4. hfm

    hfm Notebook Prophet

    Reputations:
    2,264
    Messages:
    5,299
    Likes Received:
    3,050
    Trophy Points:
    431
    Seems silly to throw in towels before it's actually been available for purchase. You can probably just buy one directly from Nvidia sometime in the next few months. I don't think they'll be changing that $699 price any time soon. We all know it's going to be in short supply for the first few months. After that it should be OK.

    The only thing I'd be a little concerned about is if crypto mining booms again.
     
  5. JRE84

    JRE84 Notebook Virtuoso

    Reputations:
    856
    Messages:
    2,505
    Likes Received:
    1,513
    Trophy Points:
    181
    I don't know what videos you guys are watching, but check this out:

     
  6. seanwee

    seanwee Father of laptop shunt modding

    Reputations:
    671
    Messages:
    1,920
    Likes Received:
    1,111
    Trophy Points:
    181
    Does it really need to be spelled out?
     
  7. JRE84

    JRE84 Notebook Virtuoso

    Reputations:
    856
    Messages:
    2,505
    Likes Received:
    1,513
    Trophy Points:
    181
    Yes, I think getting 165 vs. 65 FPS is kind of a big deal.
     
  8. seanwee

    seanwee Father of laptop shunt modding

    Reputations:
    671
    Messages:
    1,920
    Likes Received:
    1,111
    Trophy Points:
    181
    I T S
    F A K E
     
  9. JRE84

    JRE84 Notebook Virtuoso

    Reputations:
    856
    Messages:
    2,505
    Likes Received:
    1,513
    Trophy Points:
    181
    lol, whoops. Oh really? How's that? It seemed pretty legit and reasonably accurate. I could be mistaken, and probably am. Is the 3090 not out yet?
     
  10. TBoneSan

    TBoneSan Laptop Fiend

    Reputations:
    4,460
    Messages:
    5,558
    Likes Received:
    5,798
    Trophy Points:
    681
    No live FPS counter lol
     
    Aroc and Papusan like this.
  11. hfm

    hfm Notebook Prophet

    Reputations:
    2,264
    Messages:
    5,299
    Likes Received:
    3,050
    Trophy Points:
    431
    On top of that, I'm not sure anyone has a 3090 yet; I think reviewers have only received 3080s so far.
     
    JRE84 likes this.
  12. Tyranus07

    Tyranus07 Notebook Evangelist

    Reputations:
    218
    Messages:
    570
    Likes Received:
    331
    Trophy Points:
    76
    I see these huge differences in the numbers, but I see no fricking difference in the actual footage, lol.
     
  13. JRE84

    JRE84 Notebook Virtuoso

    Reputations:
    856
    Messages:
    2,505
    Likes Received:
    1,513
    Trophy Points:
    181
    Yeah, my bad. I'm only human; no excuse, just an error.
     
  14. yrekabakery

    yrekabakery Notebook Virtuoso

    Reputations:
    1,470
    Messages:
    3,438
    Likes Received:
    3,688
    Trophy Points:
    331
    Nor will you. Videos are re-encoded to a maximum of 60 FPS when uploaded to YouTube.
     
    JRE84 likes this.
  15. Tyranus07

    Tyranus07 Notebook Evangelist

    Reputations:
    218
    Messages:
    570
    Likes Received:
    331
    Trophy Points:
    76
    But one of the video games had a supposed average frame rate of 45 on the 2080 Ti while the 3090 had 108 FPS; that difference should be noticeable. I can't tell the difference between 45 and 60 FPS, especially if V-Sync is enabled.
     
  16. yrekabakery

    yrekabakery Notebook Virtuoso

    Reputations:
    1,470
    Messages:
    3,438
    Likes Received:
    3,688
    Trophy Points:
    331
    Like I said, the video is re-encoded at 60 FPS by YouTube. It’s not as if the left and right halves of the video can run at different frame rates.
     
    hfm and BrightSmith like this.
  17. hfm

    hfm Notebook Prophet

    Reputations:
    2,264
    Messages:
    5,299
    Likes Received:
    3,050
    Trophy Points:
    431
    It's either fake or isn't giving enough information to prove it's real. I'm all but certain this is just to get YouTube views and it's straight-up BS, but there's enough guesswork based on the specs that it COULD be somewhat accurate.

    We should just move on and stop posting rando benchmark videos; we'll be getting the real-deal benchmarks from places we trust for the 3080 on Wednesday, and I'm guessing 3090 benchmarks within the week after that.
     
  18. seanwee

    seanwee Father of laptop shunt modding

    Reputations:
    671
    Messages:
    1,920
    Likes Received:
    1,111
    Trophy Points:
    181
    As I have said before, the video is fake. The signs are all there:

    New account, clickbait format, lack of proof when asked by people in the comments, and performance that is waaay over the moon.
     
  19. JRE84

    JRE84 Notebook Virtuoso

    Reputations:
    856
    Messages:
    2,505
    Likes Received:
    1,513
    Trophy Points:
    181

    Oh, that's probably just where these people got their info and why they made the video, etc.
     
  20. BrightSmith

    BrightSmith Notebook Evangelist

    Reputations:
    143
    Messages:
    640
    Likes Received:
    383
    Trophy Points:
    76
    Why has the NDA for reviews been extended to Wednesday?
     
  21. Papusan

    Papusan Jokebook's Sucks! Dont waste your $$$ on Filthy

    Reputations:
    42,747
    Messages:
    29,856
    Likes Received:
    59,722
    Trophy Points:
    931
    The internet is flooded with the info...
    https://www.tomshardware.com/news/n...d-to-sept-16-rtx-3070-available-on-october-15

    Company representative NV_Tim explained that "Due to COVID, delayed shipping and other issues, we received many requests from folks asking for more time to finish their review of the RTX 3080 Founders Edition."

    ----------------------------------------------------------------------------

    Poor memory OC scaling on the GeForce RTX 3080 might be a blessing in disguise, with closer than expected performance to the GeForce RTX 3090 notebookcheck.net

    Leaked overclocking benchmarks for the upcoming NVIDIA GeForce RTX 3080 Ampere graphics card were recently published. The tests showed memory clocks pushed 9 percent for a mere 2-3 percent performance gain. This raises questions about the RTX 3080’s performance gap with the RTX 3090, which delivers 936 GB/s of memory bandwidth.
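    A back-of-the-envelope read on why that scaling matters for the 3090 comparison (the 760 GB/s figure for the 3080 is an assumption based on its announced 19 Gbps / 320-bit memory specs; everything else comes from the quote above):

        # ~9% memory OC yielded ~2-3% performance, i.e. bandwidth scaling of ~0.25-0.30.
        mem_oc, perf_gain = 0.09, 0.025
        scaling = perf_gain / mem_oc                 # ~0.28

        bw_3080, bw_3090 = 760, 936                  # GB/s
        extra_bw = bw_3090 / bw_3080 - 1             # ~23% more bandwidth on the 3090
        est_gain = extra_bw * scaling                # ~6-7%, if the scaling held linearly
        print(f"{est_gain:.1%} estimated gain from the 3090's extra bandwidth alone")

    If bandwidth really scales that poorly, most of the 3090's lead would have to come from its extra SMs rather than its 936 GB/s.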
     
  22. electrosoft

    electrosoft Perpetualist Matrixist

    Reputations:
    2,771
    Messages:
    4,117
    Likes Received:
    3,990
    Trophy Points:
    331
    COVID strikes again.....

    3080 is clearly the sweet spot IMHO. 3090 is for top dog performance and deep pockets.

    Looking forward to the official reviews to clear up the FUD on both sides of the aisle.
     
    TBoneSan, Vasudev, hfm and 1 other person like this.
  23. hfm

    hfm Notebook Prophet

    Reputations:
    2,264
    Messages:
    5,299
    Likes Received:
    3,050
    Trophy Points:
    431
    Yeah, I'm tired of reading all the garbage, it's time for trusted analysis of the real hardware.
     
    BrightSmith and electrosoft like this.
  24. JRE84

    JRE84 Notebook Virtuoso

    Reputations:
    856
    Messages:
    2,505
    Likes Received:
    1,513
    Trophy Points:
    181
    Except it's pretty accurate, as per usual, and can be trusted... I have never seen a leaked-hardware video that appeared semi-correct turn out to be off, going back to the 8800 GTX days.

    I'm on Earth, and on Earth some people have an accurate idea even if it's just a guess. I think Nvidia is obviously not going to release a GPU for that much more money when the 3080 nets only 10 percent less. I'm not sure what planet you guys are residing on, but since the 8800 GS vs. 8800 GTX days, not much has changed. Also, people saying the divide between desktop and laptop is too great and that they're sticking with desktop is so darn funny, because for decades desktop has always been more powerful; one generation of parity, and when it goes back to the same old, people switch or say they're going to... I'm a firm believer in Einstein's thought that stupidity has no limits. End rant.

    I was right about Maxwell, I was right about Pascal, I was right about Fermi, I was right about Turing... and Ampere should be the exception? And why on earth is that?

    One proof of all this: watch the entire video below.



    Edit: and yeah dude, they made the 3090 triple the size for 10 percent more performance?



    Edit: insert picture of a crackhead
     
    Last edited: Sep 13, 2020
  25. Prototime

    Prototime Notebook Evangelist

    Reputations:
    206
    Messages:
    639
    Likes Received:
    888
    Trophy Points:
    106
    I don't need to wait either. My uncle who works for Nvidia told me everything I need to know. :rolleyes:

    Oh yes, it's "stupid" for people to want equality between laptop and desktop GPUs, and it's "stupid" for people to expect that laptop-desktop GPU parity would continue for more than a single generation after Nvidia dropped the "m" from laptop GPU names. /s

    You were wrong in this very thread, posting a fake benchmark video.

    You might also consider Gandhi's thoughts: "It is unwise to be too sure of one's own wisdom. It is healthy to be reminded that the strongest might weaken and the wisest might err."
     
    Last edited: Sep 13, 2020
  26. Tyranus07

    Tyranus07 Notebook Evangelist

    Reputations:
    218
    Messages:
    570
    Likes Received:
    331
    Trophy Points:
    76
    I'll speak for myself here, guys. Like many, I was used to the most powerful laptop video card of the current gen being at the same level as the most powerful desktop video card of the last gen: for example, the 8800M GTX = 7800 GTX, the GTX 280M = 9800 GTX, and so on, and I was kinda OK with it. I mean, that was how life was back then. But once NVIDIA released the GTX 980 (for laptops), I thought the gap between laptops and desktops was gone. I'll be a bit annoyed if the gap gets wider again, but if the RTX 3080 (laptop) reaches RTX 2080 Ti level, that's OK with me; I'll be happy anyway.
     
  27. hfm

    hfm Notebook Prophet

    Reputations:
    2,264
    Messages:
    5,299
    Likes Received:
    3,050
    Trophy Points:
    431
    Yeah, me too.
     
    ratchetnclank and seanwee like this.
  28. JRE84

    JRE84 Notebook Virtuoso

    Reputations:
    856
    Messages:
    2,505
    Likes Received:
    1,513
    Trophy Points:
    181
    It is fake, but accurate as far as representations go. The 3090 will be 80 percent faster than a 2080 Ti; you can quote me when the card comes out, otherwise it's a waste of effort... but saying I'm wrong before the pie is cooked is kinda ignorant, don't you think?
     
  29. JRE84

    JRE84 Notebook Virtuoso

    Reputations:
    856
    Messages:
    2,505
    Likes Received:
    1,513
    Trophy Points:
    181
  30. hfm

    hfm Notebook Prophet

    Reputations:
    2,264
    Messages:
    5,299
    Likes Received:
    3,050
    Trophy Points:
    431
    We can make some pretty educated guesses based on the fact that Nvidia revealed all the specs. I'd just like to see some REAL benchmarks instead of more prognostication; I really don't care who gets their predictions wrong or right.
     
  31. Tyranus07

    Tyranus07 Notebook Evangelist

    Reputations:
    218
    Messages:
    570
    Likes Received:
    331
    Trophy Points:
    76
    The guy there is just expecting too much of the 3090. That card is meant for 8K gaming, nothing else. The 3090 probably wouldn't give much of an edge over the 3080 at 4K or lower resolutions. 8K has to be for screens over 60 inches or something; IIRC 8K is the resolution used in theaters, so I don't think the 3090 is useful for 99% of the gaming population. Maybe for guys who do a lot of benchmarking too (?)...

    Here is an interesting opinion from this guy:

     
    Prototime and JRE84 like this.
  32. yrekabakery

    yrekabakery Notebook Virtuoso

    Reputations:
    1,470
    Messages:
    3,438
    Likes Received:
    3,688
    Trophy Points:
    331
    This "8K" gaming is 1440p (1/9th resolution) upscaled to 8K using DLSS 2.1
     
    JRE84 likes this.
  33. JRE84

    JRE84 Notebook Virtuoso

    Reputations:
    856
    Messages:
    2,505
    Likes Received:
    1,513
    Trophy Points:
    181
    I'm guessing this time, but the 3080 can do 8K 30. And as far as guesses go, do you really think they would make a card physically two times bigger and only net 5-10 percent more performance? And the price, let's talk about the price: it's through the roof.
     
  34. Papusan

    Papusan Jokebook's Sucks! Dont waste your $$$ on Filthy

    Reputations:
    42,747
    Messages:
    29,856
    Likes Received:
    59,722
    Trophy Points:
    931
    Different design due to over double the amount of vRAM. Regarding the price being through the roof: more of everything won't cost the same :) And $1,500 opens up room for $1,000+ for the 3080 Ti :D
     
    Last edited: Sep 15, 2020
  35. Tyranus07

    Tyranus07 Notebook Evangelist

    Reputations:
    218
    Messages:
    570
    Likes Received:
    331
    Trophy Points:
    76

    I see at least a 50% performance jump compared to the 2080S.
     
  36. ratchetnclank

    ratchetnclank Notebook Deity

    Reputations:
    1,084
    Messages:
    1,506
    Likes Received:
    900
    Trophy Points:
    131

    Most of these don't have DLSS.

    Either way, if we can have 1440p upscaled to look like native, who cares what the internal resolution is? The net benefit is the same.
     
    JRE84 likes this.
  37. yrekabakery

    yrekabakery Notebook Virtuoso

    Reputations:
    1,470
    Messages:
    3,438
    Likes Received:
    3,688
    Trophy Points:
    331
    IIRC Forza has DRS and dynamic settings scaling. R6S has its own reconstruction (an early version of checkerboard rendering) and, being an eSports title, is relatively easy to run, as are the other games.
     
  38. JRE84

    JRE84 Notebook Virtuoso

    Reputations:
    856
    Messages:
    2,505
    Likes Received:
    1,513
    Trophy Points:
    181
    Lol, right again; that took a big three days. See, see the light... it's not exactly quantum physics.

    Yet everyone, well, almost everyone here acts like we know nothing and cannot deduce anything based on what we know... stone-age thinking. Imagine if scientists said we can't guess at all because what we know may or may not be true: every discovery and theory would go out the window, and the scientific method would end at the hypothesis. :p It's really funny, bickering about simple thoughts and opinions. With that line of thinking, basically every cosmological and biological discovery would be unknown to us, and we would essentially have a two-page book on Earth.

    No offense; change is guaranteed, and I highly recommend learning deductive reasoning.
     
    Last edited: Sep 15, 2020
  39. Clamibot

    Clamibot Notebook Deity

    Reputations:
    645
    Messages:
    1,132
    Likes Received:
    1,567
    Trophy Points:
    181
    I completely agree with this sentiment. If an image upscaled from 1440p to 8K looks no different to the end user than an image rendered natively in 8K, then it doesn't matter which one we pick. Since there is no perceptible difference, we should take whatever approach yields the best performance.

    Another example of a performance saving technique is using frustum culling in video games to only render objects that can be seen. There's no reason to render objects that cannot be seen by the player because this wastes performance. To the end user, the visual result is the same whether frustum culling is used or not. Might as well use the approach that yields higher framerates, because that translates to a better experience for the user.
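    For illustration, here is a minimal sphere-vs-frustum visibility test of the kind engines use (a toy sketch, not taken from any particular engine; real implementations extract the six planes from the view-projection matrix):

        from dataclasses import dataclass

        @dataclass
        class Plane:
            nx: float; ny: float; nz: float; d: float   # points inside satisfy n.p + d >= 0

        def sphere_visible(planes, cx, cy, cz, radius):
            """Cull as soon as the bounding sphere lies fully outside any plane."""
            for p in planes:
                if p.nx * cx + p.ny * cy + p.nz * cz + p.d < -radius:
                    return False    # completely behind this plane -> don't draw
            return True             # inside or intersecting all planes -> draw

        # Toy frustum reduced to a single near plane facing +z at z = 1:
        near = Plane(0.0, 0.0, 1.0, -1.0)
        print(sphere_visible([near], 0, 0, 5, 1))    # True: in front of the camera
        print(sphere_visible([near], 0, 0, -5, 1))   # False: behind, culled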
     
    Prototime and JRE84 like this.
  40. JRE84

    JRE84 Notebook Virtuoso

    Reputations:
    856
    Messages:
    2,505
    Likes Received:
    1,513
    Trophy Points:
    181
    Good points, but then again, they have been doing culling since the Dreamcast days; it was one big feature of Shenmue to only render what was seen... the result? A game 8 years ahead of its time.
     
  41. Prototime

    Prototime Notebook Evangelist

    Reputations:
    206
    Messages:
    639
    Likes Received:
    888
    Trophy Points:
    106
    Independent reviews show that the 3080 indeed beats the 2080 Ti by a whopping margin, including with ray tracing on. I'm most excited about the ray tracing + DLSS results. It looks like a promising combination this generation.






    (Jump to 11:49 for benchmarks)

     
  42. yrekabakery

    yrekabakery Notebook Virtuoso

    Reputations:
    1,470
    Messages:
    3,438
    Likes Received:
    3,688
    Trophy Points:
    331
    Looks like an average of 30% over the 2080 Ti at 4K, and with a similar 30% increase in power draw. Also seems to have next to no overclocking headroom due to being power-starved. I’m unimpressed with the efficiency. Perf/W barely moved from Turing.
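    The perf/W claim follows directly from those two numbers (both taken from the reviews as rough averages):

        perf_ratio = 1.30     # 3080 ~30% faster than 2080 Ti at 4K
        power_ratio = 1.30    # ~30% higher power draw
        print(f"{perf_ratio / power_ratio - 1:+.0%} perf/W vs. Turing")   # +0%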
     
    TBoneSan and Papusan like this.
  43. JRE84

    JRE84 Notebook Virtuoso

    Reputations:
    856
    Messages:
    2,505
    Likes Received:
    1,513
    Trophy Points:
    181
    Digital Foundry showed a 68 percent increase in performance, and I would rather trust them.
     
  44. JRE84

    JRE84 Notebook Virtuoso

    Reputations:
    856
    Messages:
    2,505
    Likes Received:
    1,513
    Trophy Points:
    181
  45. yrekabakery

    yrekabakery Notebook Virtuoso

    Reputations:
    1,470
    Messages:
    3,438
    Likes Received:
    3,688
    Trophy Points:
    331
  46. JRE84

    JRE84 Notebook Virtuoso

    Reputations:
    856
    Messages:
    2,505
    Likes Received:
    1,513
    Trophy Points:
    181
    Yeah, I know! And you don't think there will be a 3090 or 3080 Ti?
    Would you compare a 980M to an 880M? Most likely, and the benefit...

    Heck, they will probably have a 3090, a 3080 Ti, and a Titan.
     
  47. electrosoft

    electrosoft Perpetualist Matrixist

    Reputations:
    2,771
    Messages:
    4,117
    Likes Received:
    3,990
    Trophy Points:
    331
    I'm impressed with the performance per dollar vs. the 2080 Ti (~30-35% faster, $400+ cheaper).
    The real comparison is 3080 vs. 2080 (if comparing similar pricing slots).
    Let's see once the real modders get hold of it with shunt mods, etc...
    If I owned a 2080 Ti right now, I wouldn't upgrade to a 3080. I'd either bite the bullet for a 3090 or wait for the inevitable 3080 Ti.
     
    long2905, TBoneSan, seanwee and 3 others like this.
  48. yrekabakery

    yrekabakery Notebook Virtuoso

    Reputations:
    1,470
    Messages:
    3,438
    Likes Received:
    3,688
    Trophy Points:
    331
    On desktop, sure. You can always increase cooling and power (to a point). But mobile Ampere hopefuls should be worried about the lack of efficiency gain despite a full generational jump in process node. If you compare the perf/W of 2080 Ti at 300W versus 3080 at 320W, the improvement is only 10-15%.
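    A rough version of that comparison, with the 2080 Ti side as stated assumptions (FE baseline normalized to 1.0; factory-OC 300W cards taken as ~10% faster than the FE):

        ti_perf, ti_watts = 1.10, 300          # assumed 300W factory-OC 2080 Ti
        ga_perf, ga_watts = 1.30, 320          # 3080 ~30% over the 2080 Ti FE baseline
        gain = (ga_perf / ga_watts) / (ti_perf / ti_watts) - 1
        print(f"{gain:.0%} perf/W improvement")   # ~11%, inside the 10-15% quoted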
     
    seanwee, Papusan and electrosoft like this.
  49. electrosoft

    electrosoft Perpetualist Matrixist

    Reputations:
    2,771
    Messages:
    4,117
    Likes Received:
    3,990
    Trophy Points:
    331
    I agree 100%. We might (most likely will) start to see a separation of desktop and mobile performance again unless there is a major advance in cooling. I tend to focus on DTRs, which give a bit more headroom for cooling hot and heavy GPUs, but in the thinner and lighter gaming laptops? Something is going to have to give, I'm afraid...
     
  50. yrekabakery

    yrekabakery Notebook Virtuoso

    Reputations:
    1,470
    Messages:
    3,438
    Likes Received:
    3,688
    Trophy Points:
    331
    Even DTRs are gonna fall well short of desktop GPU-level performance, unless mobile gets nothing higher than the 3070. Sure, you can undervolt the 3080 to shave off 50W while maintaining stock performance, but that's still 270W, which is an unrealistic jump from the ~200W ceiling for GPUs even in the most gargantuan, can-hardly-still-be-called-laptop systems since 2015.
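    The power-budget gap spelled out (numbers from the post above):

        undervolted_3080 = 320 - 50      # 270W after the ~50W undervolt
        mobile_ceiling = 200             # ~200W max in the biggest DTRs since 2015
        print(f"{undervolted_3080 / mobile_ceiling - 1:.0%} over the mobile budget")   # 35%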
     