The Notebook Review forums were hosted by TechTarget, which shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information posted on the forums is preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.
Problems? See this thread at archive.org.
← Previous page | Next page →

    *Official* NBR Desktop Overclocker's Lounge [laptop owners welcome, too]

    Discussion in 'Desktop Hardware' started by Mr. Fox, Nov 5, 2017.

  1. ajc9988

    ajc9988 Death by a thousand paper cuts

    Reputations:
    1,755
    Messages:
    6,121
    Likes Received:
    8,849
    Trophy Points:
    681
    Hey, cool use for the 1660 Ti: it uses the Turing NVENC and NVDEC blocks and has the same number of encode and decode units as the 2080 Ti. So, on GitHub, there are instructions and the files you need to mod the driver so you are no longer limited to just 2 streams for encode or decode. Do that and use it with a Plex server or another media server that can use NVENC and NVDEC. (Remember, this is hardware-accelerated encode and decode.)
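    A quick way to sanity-check that the session limit is actually gone after the driver mod is to fire up several hardware encodes at once. This is only a rough sketch, assuming an ffmpeg build with NVENC support (h264_nvenc) is on the PATH; the exact unpatched session cap depends on the driver version.

```python
# Rough check of how many concurrent NVENC sessions the driver allows.
# Assumes ffmpeg with NVENC support (h264_nvenc) is installed and on PATH.
import subprocess

SESSIONS = 5  # try more sessions than the unpatched consumer limit

def start_encode(i: int) -> subprocess.Popen:
    # Each process encodes 10 s of a synthetic test pattern, occupying one NVENC session.
    return subprocess.Popen(
        ["ffmpeg", "-y",
         "-f", "lavfi", "-i", "testsrc=duration=10:size=1280x720:rate=30",
         "-c:v", "h264_nvenc", f"nvenc_test_{i}.mp4"],
        stdout=subprocess.DEVNULL, stderr=subprocess.DEVNULL)

procs = [start_encode(i) for i in range(SESSIONS)]
codes = [p.wait() for p in procs]
ok = sum(1 for c in codes if c == 0)
print(f"{ok}/{SESSIONS} concurrent NVENC encodes finished without error")
```

    On an unpatched consumer driver the extra sessions should fail to start; after the mod they should all finish.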
     
    Rage Set, Papusan and Mr. Fox like this.
  2. Mr. Fox

    Mr. Fox BGA Filth-Hating Elitist

    Reputations:
    37,235
    Messages:
    39,339
    Likes Received:
    70,654
    Trophy Points:
    931
    The 7980XE is an amazing CPU. It might be getting long in the tooth, but it takes no prisoners and leaves a wake of devastation in its path.
     
    Last edited: Aug 31, 2020
  3. ajc9988

    ajc9988 Death by a thousand paper cuts

    Reputations:
    1,755
    Messages:
    6,121
    Likes Received:
    8,849
    Trophy Points:
    681
    So, if the rumor is true that the 3070 matches the 2080 Ti, and the $600 price is FE-only so retail is $500, then Nvidia may give you that much (the 2070 Super to 2080 Ti gap is only about 24% according to the TechPowerUp pages: https://www.techpowerup.com/gpu-specs/geforce-rtx-2070-super.c3440).

    I'm waiting to see if the hype is worth it on the Nvidia cards. If the rumors on the leaked Time Spy Extreme benches are real, then that is 57% or so for the 3090 and over 40% for the 3080, and that is a more sizeable upgrade.

    If the other rumors are true (closer to 45% for the 3090, 30% or so for the 3080) and you pay extra money for the extra performance, you are not really getting all of that as generational gain; you will be paying for the performance (around 14% higher price on the 3080, which means you only net roughly 16% more performance generationally; with the 3090 it depends on whether you start at the $1000 or $1200 mark, with the $1200 mark being 16.7% more expensive, so you only net around 28% more performance generationally).
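    If anyone wants to redo that arithmetic, here is the same idea as a tiny worked example. The uplift and price figures below are just the rumored numbers quoted in this post, and dividing the ratios lands close to the rough subtraction above.

```python
# Price-adjusted generational gain: how much extra performance you get per dollar.
def value_gain(perf_uplift: float, price_increase: float) -> float:
    """(1 + perf) / (1 + price) - 1, i.e. performance-per-dollar change."""
    return (1 + perf_uplift) / (1 + price_increase) - 1

# Rumored 3080: ~30% faster generationally, ~14% higher price
print(f"3080 value gain: {value_gain(0.30, 0.14):.1%}")   # ~14%, close to the ~16% above
# Rumored 3090: ~45% faster, 16.7% more expensive off a $1,200 baseline
print(f"3090 value gain: {value_gain(0.45, 0.167):.1%}")  # ~24%, close to the ~28% above
```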

    So I am waiting for reviews and benches until I pass judgment on the Nvidia lineup regarding price and performance.

    But, building on what @Mr. Fox mentioned, next gen is where the real fun begins for BOTH companies. RDNA3 and Nvidia's Hopper will have CHIPLETS. Now, imagine taking smaller, lower-end dies, but not having to clock them past the voltage knee to hit the performance target for the card. That means they can leave WAY MORE OVERCLOCKING PERFORMANCE on the table!!! Yes, overclocking then comes down to binning and whether or not the multiple dies are matched well by Nvidia, but that is yet to be seen (I could see the chiplets being integrated onto the substrate package and that package sent to AIBs, so pre-binning becomes a bigger deal for Nvidia to do right).

    This means we could see a true leap in tech with that switch, similar to how Ryzen turned MOAR COREZ into a thing (which software is starting to catch up to now for some things). So I am truly more interested in the next gen GPUs than this gen (but may have to pick up a card soon anyways because my current GPU is so old).
     
    Rage Set, Mr. Fox and Papusan like this.
  4. Mr. Fox

    Mr. Fox BGA Filth-Hating Elitist

    Reputations:
    37,235
    Messages:
    39,339
    Likes Received:
    70,654
    Trophy Points:
    931
    Tinkering a bit...
     

    Attached Files:

    Rage Set likes this.
  5. Mr. Fox

    Mr. Fox BGA Filth-Hating Elitist

    Reputations:
    37,235
    Messages:
    39,339
    Likes Received:
    70,654
    Trophy Points:
    931
  6. Mr. Fox

    Mr. Fox BGA Filth-Hating Elitist

    Reputations:
    37,235
    Messages:
    39,339
    Likes Received:
    70,654
    Trophy Points:
    931
    Last edited: Sep 1, 2020
  7. Robbo99999

    Robbo99999 Notebook Prophet

    Reputations:
    4,346
    Messages:
    6,824
    Likes Received:
    6,112
    Trophy Points:
    681
    C-states are good.....run C-states and Maximum Performance Power Plan in Windows, and have your CPU locked at your max overclock (don't use the setting that fluctuates the clocks). I did some performance and power (W) comparisons back when I first had my 6700K and C-states enabled didn't reduce performance when used with the settings described in my previous sentence, but it did reduce idle power consumption dramatically, as well as idle voltage. That's my recommendation.

    EDIT: here's the link to the thread I created on the topic which contains my testing too: http://forum.notebookreview.com/thr...formance-responsiveness.803030/#post-10494101
     
    Last edited: Sep 1, 2020
    Rage Set and Mr. Fox like this.
  8. Talon

    Talon Notebook Virtuoso

    Reputations:
    1,482
    Messages:
    3,519
    Likes Received:
    4,695
    Trophy Points:
    331
    New GPUs look great. RTX 3090 is a bomb that just got dropped all over little RDNA2. 10496 Cuda Cores. :)



    Actual benchmarks.
     
    Last edited: Sep 1, 2020
  9. Robbo99999

    Robbo99999 Notebook Prophet

    Reputations:
    4,346
    Messages:
    6,824
    Likes Received:
    6,112
    Trophy Points:
    681
    I think the 3080 is gonna be the card to get: it's got 50% more CUDA cores than the 3070 and it's not 50% more expensive! As long as they free up the power limits enough that we can run those cores at the same speed as the 3070, then I think the 3080 is the one to get. And it's impressive to see something in the region of an 80-90% performance increase for the 3080 vs the 2080! That was more than I initially expected; I was expecting a 70-80% increase, but then leaks seemed to suggest it was gonna be less than that... looks like the leaks were wrong and it was closer to my initial estimation from ages ago lol! I don't know if I can justify getting the 3080 though, but it does seem like the one to get vs the 3070.

    (Here's a link to a Guru3d item comparing the specs where you can see the 3080 has 50% more cores than the 3070: https://www.guru3d.com/articles-pages/geforce-rtx-3080-and-3090-what-we-know,1.html)
     
  10. tps3443

    tps3443 Notebook Virtuoso

    Reputations:
    746
    Messages:
    2,421
    Likes Received:
    3,115
    Trophy Points:
    281
    Nvidia blew it out of the water guys!!!!! Is it possible we may bottleneck PCI-E 3.0 X16 with RTX3090 cards overclocked?? Hope not.

    The RTX3070 is 20TFLOPS
    The RTX3080 is 30TFLOPS
    The RTX3090 is 36TFLOPS

    I am thinking the RTX3080 is a solid choice. Even over a 2080Ti. I will take a good educated guess that the RTX3080 will achieve around 18,500 in timespy graphics right out of the box! The RTX3090 will land around 23,000 in timespy graphics right out of the box.
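    Those TFLOPS figures line up with the usual shader math (two FMA ops per CUDA core per clock). A quick sketch using the announced core counts and approximate boost clocks:

```python
# FP32 TFLOPS = 2 ops (FMA) x CUDA cores x boost clock (GHz) / 1000.
# Core counts are Nvidia's announced Ampere specs; boost clocks are approximate.
cards = {
    "RTX 3070": (5888, 1.73),
    "RTX 3080": (8704, 1.71),
    "RTX 3090": (10496, 1.70),
}
for name, (cores, boost_ghz) in cards.items():
    tflops = 2 * cores * boost_ghz / 1000
    print(f"{name}: ~{tflops:.1f} TFLOPS FP32")
# Prints roughly 20, 30, and 36 TFLOPS, matching the numbers above.
```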
     
    Last edited: Sep 1, 2020
    Rage Set, ajc9988, Papusan and 2 others like this.
  11. ssj92

    ssj92 Neutron Star

    Reputations:
    2,446
    Messages:
    4,446
    Likes Received:
    5,690
    Trophy Points:
    581
    I'll be getting 3090. Now the question is...founders edition or 3090 KINGPIN lol

    I'll let you know if it bottlenecks in PCIe x16 3.0 slot. :p
     
    Rage Set, Mr. Fox and Papusan like this.
  12. Mr. Fox

    Mr. Fox BGA Filth-Hating Elitist

    Reputations:
    37,235
    Messages:
    39,339
    Likes Received:
    70,654
    Trophy Points:
    931
    I will be waiting until we know for certain if there will be a 3090 Ti. The Green Goblin is banking on everyone clamoring to be one of the first to own the flagship GPU that maybe isn't the flagship GPU. That would suck more than words can say.

    And, the price is extremely stupid and unreasonable. I guess they figure that because they got away with raping customers on 2080 Ti pricing they might as well go ahead and awaken the Geforce trouser snake for another monetary booty call. I mean, c'mon NVIDIA. For that kind of money, customers that are willing to bend over and take it like a big boy deserve unlocked voltage and power controls, and removal of firmware signing restrictions... not another gimped GPU that has to be butchered with a soldering iron and SPI programmer to put the appropriate control over the GPU behavior back into the hands of the GPU purchaser.
     
    Last edited: Sep 1, 2020
  13. tps3443

    tps3443 Notebook Virtuoso

    Reputations:
    746
    Messages:
    2,421
    Likes Received:
    3,115
    Trophy Points:
    281
    Have you guys seen the massive market crash on 2080Ti's? lol. I sold both of mine just in time!

    And the next gen is not even out yet. 2080Ti's have fallen.

    $669 OBO!


     
    Robbo99999, Mr. Fox and Talon like this.
  14. ajc9988

    ajc9988 Death by a thousand paper cuts

    Reputations:
    1,755
    Messages:
    6,121
    Likes Received:
    8,849
    Trophy Points:
    681
    I disagree. I was really unimpressed. All of those large gains are with RT and DLSS on. What about games that don't support those? What about games that have one or the other? What about games with DXR or DXML or both?

    My point is, do not rely on numbers from the company.

    What we know is that power consumption is up 30% on the 3090 over the 2080 Ti, and that is for likely 50% more performance without RT and DLSS. The 3080 is likely 30% faster than the 2080 Ti without DLSS and RT. Now that is 40% or so over the 2080 at the $700 price point. That is the gold for consumers (like others mentioned). But at more power.

    The only card I like is the 3080. The 3090 is most likely meant to blend the Ti and the Titan.

    So it depends on where you are coming from. If coming from Maxwell or Pascal, then the 3070 or 3080. If from Turing, eh...

    That explains why things are not quite what people are all excited about. Wait. After benchmarks, things could move. After AMD's announcement, things could change again. AMD won't beat the 3090, but could be competitive with the 3080 and 3070. No idea if they will be competitive on machine learning or ray tracing though.

    Edit: To be clear, the 2070 Super to 2080 Ti gap was about 24% in performance. So, even if the 3070 wins, it will be around or LESS than a 30% jump generationally. Because of this, if you have a 2070, the upgrade does not seem worth it yet. At 40%, the 2080 to 3080 jump is right at that threshold.

    On the 3090, the 8K@60 demo was heavily due to the tensor cores. What most missed is that Jensen said that with DLSS 2.0, you need fewer ray-traced points to extrapolate the machine-learned image. This also explains why they beefed up the shader count and the tensor cores MORE THAN the RT cores. The performance increase comes from machine learning on the 16K images, which helps allow the images to be filled in quicker. It is impressive what it can do. The question is how it deals with DXR and DXML and how much adoption DLSS gets.

    Edit 2: Also, the 3070 seems to be nearly the full 104 die. So they are competing hard enough to put that full die at $500.
     
    Last edited: Sep 1, 2020
    Papusan likes this.
  15. Rage Set

    Rage Set A Fusioner of Technologies

    Reputations:
    1,611
    Messages:
    1,682
    Likes Received:
    5,068
    Trophy Points:
    531
    Likewise for myself. I find it fishy that last gen they had the Titan, Ti, and 80-class cards, and this gen the 90-class is the Titan and the 80 is the "top card". Mark my words, there will be a 3080 Ti AND a "3090 Ti/Super" or an actual Titan. This presentation was a warning shot at AMD. I have no doubt that AMD will have a card that can beat (or match) the 3080 at a slightly lower price, but Nvidia will have the 3080 Ti waiting in the wings for $1,200. The question I have is whether the 3080 Ti will support NVLink, or is the 90 card the last pro-consumer card to feature it?
     
  16. ajc9988

    ajc9988 Death by a thousand paper cuts

    Reputations:
    1,755
    Messages:
    6,121
    Likes Received:
    8,849
    Trophy Points:
    681
    Remember, the guy who did SLI was hired away not too long ago.
     
  17. tps3443

    tps3443 Notebook Virtuoso

    Reputations:
    746
    Messages:
    2,421
    Likes Received:
    3,115
    Trophy Points:
    281
    The Titan RTX 24GB market has collapsed as well!

    I saw one sell for $1,200 already. Another one is on there now for $1,400 OBO, BRAND NEW SEALED lol.

    Man, I might grab a Titan RTX now that the market has crashed. Nvidia has slayed their own product stack haha.

    People have literally just bought these cards for $2,200-2,400+ new or like new. I’d be so pissed right now if I had just purchased an RTX Titan.
     
    Rage Set and ajc9988 like this.
  18. Papusan

    Papusan Jokebook's Sucks! Dont waste your $$$ on Filthy

    Reputations:
    42,712
    Messages:
    29,847
    Likes Received:
    59,649
    Trophy Points:
    931
    Nvidia tries to justify the ugly price point for the 3090 with the massive 24GB of VRAM for 8K. If AMD can't compete, then we won't see the Ti tag for the 3090. Maybe they add in the Super tag for all cards next fall. The Ti tag is the last weapon on the same arch to stay in front of AMD. They won't use it before they have to.


    GPU Market Gets Flooded With Used GeForce RTX 2080 Ti’s After NVIDIA’s RTX 3090 & RTX 3080 Unveil, Prices Below $500 US
    For reference, an ASUS ROG STRIX 2080 Ti 11 GB graphics card is currently listed for $415 US while an open-box ZOTAC RT 2080 Ti AMP! is listed for $500 US.
     
    Last edited: Sep 1, 2020
    jc_denton, Rage Set and ajc9988 like this.
  19. tps3443

    tps3443 Notebook Virtuoso

    Reputations:
    746
    Messages:
    2,421
    Likes Received:
    3,115
    Trophy Points:
    281
    Well I’m fine with using RTX in my games. I remember not using it with my 2080 Super and even my 2080Ti because of the massive performance hit it caused. Either way, these cards are fast. And huge Cuda core numbers do great in low power options like laptops.

    A 2080Ti at just 250-280 watts is a beast because it has 4,300+ Cuda cores. It just eats up any game with good performance, while being extremely efficient.

    I can see a RTX3080 and RTX3090 just chewing stuff to pieces. Next generation laptops are actually going to be super capable in a super thin setup. I know people hate BGA. But, in all honesty, an RTX3080 Max-Q could deliver desktop 2080Ti performance inside of a tiny thin laptop! How amazing is that?

    I mean, the RTX3080 was leaking out sub-19K Time Spy graphics scores, which is equivalent to something like a 2.3GHz RTX Titan. And the 3080 is running stock. I think it's safe to say that for $699, these cards are gonna crush last gen.

    I think the 3080 will be about 40% faster than a 2080Ti. Stock vs stock.

    Either way! I am looking forward to it. I blew my budget on my new build. I went all out on CPU, so I am going to just buy a RTX3080 for $700. I’d love to squeeze all of the juice I can out of it.
     
    ajc9988 likes this.
  20. Clamibot

    Clamibot Notebook Deity

    Reputations:
    645
    Messages:
    1,132
    Likes Received:
    1,566
    Trophy Points:
    181
    The cards from the Ampere lineup will be very good for high framerate gaming at 1080p or 1440p.

    I saw the demo where the RTX 3090 was pushing 60 fps at 8K resolution. I don't really see a point in a resolution that high unless the budget cards have the performance required to target 8K 60 fps.

    Furthermore, 1080p still looks good on laptop and most desktop screens to this day. Sure 4K objectively looks better, but you really need large screens to take advantage of the visual fidelity of ultra high resolutions. The difference is less pronounced on smaller screens, and not everyone wants to connect their gaming rig to a large TV.
     
    Robbo99999 likes this.
  21. tps3443

    tps3443 Notebook Virtuoso

    Reputations:
    746
    Messages:
    2,421
    Likes Received:
    3,115
    Trophy Points:
    281
    I think it’ll be nice to finally use our 2560x1440p 165Hz G-Sync monitors to their fullest! 8K 60FPS? I highly doubt it...

    Even a 2080Ti locked at 2,130MHz core with memory at 16GHz struggles to run demanding titles like RDR2 at 80-100FPS at just 2560x1440p, with no AA and tree tessellation turned off, but all other details maxed out.

    So, we will finally be able to push 165FPS at 1440p with the highest graphics fidelity possible.

    The monitor technology is only 5 or 6 years old.
     
  22. Talon

    Talon Notebook Virtuoso

    Reputations:
    1,482
    Messages:
    3,519
    Likes Received:
    4,695
    Trophy Points:
    331
    Borderlands 3, the first game tested, does not use any DXR. No DLSS, no ray tracing. They found over an 80% increase in performance compared to the RTX 2080. While we don't have a huge selection of games tested, that is very impressive IMO.

    RTX 3080 looks to be the best valued card. Incredible performance for $699. RTX 3090 looks to be the Titan, and I am willing to bet we get an RTX 3080 Ti in about 6-7 months with 9800 cuda cores or something at like $999.

    Nvidia looks to be pricing these to squeeze AMD. 8nm Samsung was cheap for Nvidia, and they've proven with their 12nm chips that they don't need the cutting-edge 7nm process to compete with AMD. I don't have a lot of hope for AMD's GPU department.
     
    Mr. Fox likes this.
  23. tps3443

    tps3443 Notebook Virtuoso

    Reputations:
    746
    Messages:
    2,421
    Likes Received:
    3,115
    Trophy Points:
    281
    I think Nvidia actually put in everything they had with Ampere. And 70-80% is a realistic figure for most games even with RTX off. This generation is so good, people are practically dumping off their old cards at any loss. They simply say, “Don’t care, give me Ampere.”
     
  24. ajc9988

    ajc9988 Death by a thousand paper cuts

    Reputations:
    1,755
    Messages:
    6,121
    Likes Received:
    8,849
    Trophy Points:
    681
    Can you link the slide from the official show? I do not remember it being in there but may have missed it. Or is that the leaked game slide never actually used in the Nvidia release presentation (and therefore is not official)?

    EDIT:
    upload_2020-9-2_9-2-18.png
    There was this slide. That is in 4K and with RT, etc.

    Edit 2:
    https://www.anandtech.com/show/1605...re-for-gaming-starting-with-rtx-3080-rtx-3090
    Considering they show no other slide with game performance and I did scrub through the video again, you are referencing a NON-OFFICIAL LEAK! The authenticity of which was questioned by many. Can you provide an official reference for it?

    Edit 3: Nope, just found it referenced, need to do more digging:
    https://www.eurogamer.net/articles/digitalfoundry-2020-hands-on-with-nvidia-rtx-3080
    upload_2020-9-2_9-17-22.png

     
    Last edited: Sep 2, 2020
    Rage Set, Talon and Papusan like this.
  25. ajc9988

    ajc9988 Death by a thousand paper cuts

    Reputations:
    1,755
    Messages:
    6,121
    Likes Received:
    8,849
    Trophy Points:
    681
    Found the answer on WHY Nvidia selected the games they did for Digital Foundry to test: TAA.
    upload_2020-9-2_11-53-36.png

    Now why is TAA important?
    https://www.tweaktown.com/news/7388...uld-work-on-any-game-that-uses-taa/index.html

    In other words, even though Nvidia did NOT announce DLSS 3.0, their DLSS 2.0 implementation on Ampere, and the drivers provided, very well could be using DLSS on the TAA implementation for Borderlands 3, thereby negating your claim that it doesn't use either technology. Therefore, this score is likely inflated.

    But that DOES make these cards EVEN BETTER, as that score is suggesting the DLSS can be used in situations with TAA, and that means WAY MORE UPSCALING DAY ONE! THAT is AWESOME!

    As to AMD, Samsung 8nm is actually a mature 10nm design. Keep that in mind. TSMC's 12nm was really just a later revision of their 16nm (anything past gen 4 of 16nm). So, TSMC 7nm is more dense and more refined than Samsung 8nm. But Nvidia's designs, generally, have been better on efficiency to date. Let's say that directly.

    So, Nvidia is getting more performance (30-50% is my guess), but at 30% more power consumption (so good scaling per watt), while also having increased scaling for overclocks. We are going to see over 400W when overclocked on the Nvidia 3090s!
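    To put a rough number on "good scaling per watt" using only the guesses above (these are estimates, not measurements):

```python
# Implied efficiency change if the estimates above hold:
# ~50% more performance for ~30% more power on the 3090 vs. the 2080 Ti.
perf_gain = 0.50
power_gain = 0.30
perf_per_watt_gain = (1 + perf_gain) / (1 + power_gain) - 1
print(f"Implied perf/W improvement: {perf_per_watt_gain:.1%}")  # ~15%
```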

    Nvidia was trying to play Samsung versus TSMC. But AMD bought up Apple's 7nm fab time also. They may also buy up the Huawei fab time as well. It is incredible just how much fab time AMD is buying, and since doing GPU, CPU, and embedded there, TSMC understands just how much of their fab time AMD can take up, especially as they grow CPU adoption and if they grow GPU adoption. Plus, rumor has it, they like working with AMD better.

    But you are wrong that the 12nm chips "proved" anything. Look at the 5700 XT. Without that, we would NOT have had the Super lineup, which was when the 2000 series FINALLY showed something above the Pascal lineup in some way (and why the 2080 Ti then led the 2070 and 2080 by only 24% and 12%, respectively).

    Since then, they changed the uarch to fully get rid of GCN, they also increased the max CUs possible on the dies, they changed the instruction set even further and how it is processed on RDNA2, they added RT and machine learning cores, etc. In fact, Nvidia was targeting going up against an RDNA2 card with 72 CUs. AMD was talking about possibly an 80 CU variant.

    Moreover, AMD moved the engineers who made the Ryzen transistors so efficient over to the Radeon group, which is allegedly how they achieved the 50% efficiency jump this generation (chiplets are how RDNA3 is supposed to achieve that jump again), with some of it coming from process changes.

    So, before you say AMD doesn't have hope, wait. Do I think they take the 3090? NO. Do I think they actually compete with the 3080 and 3070? Possibly (excluding RT and DLSS, with it, I have no clue if they can compete).

    As such, you should DIG DEEPER before you say it is without DLSS or RT.

    Edit: And to be clear, I'm currently looking at buying a 3080. So please do not say I am just trying to be an AMD fanboy. I explained changes and nuance. But the TAA thing makes me lean toward Nvidia more because I don't know that AMD's ML will be worth anything first gen.

    Edit 2: look at Control using TAA versus Control using DLSS and RT. (also, the DLSS and RT were using a lower resolution according to the video, then upsampled to 4K).

    Edit 3:
    Rumors of super and Ti cards.
     
    Last edited: Sep 2, 2020
    Rage Set, Papusan and Talon like this.
  26. Mr. Fox

    Mr. Fox BGA Filth-Hating Elitist

    Reputations:
    37,235
    Messages:
    39,339
    Likes Received:
    70,654
    Trophy Points:
    931
    Deja vu. Seems like that happens with every generation. It will be an instant replay when the Ampere successor surfaces. To a great extent that facilitates the ludicrous pricing, and consumer behavior is largely to blame for it. NVIDIA is doing it because they can, and with no meaningful competition they only need to make marketing overpriced products their priority.
     
  27. Papusan

    Papusan Jokebook's Sucks! Dont waste your $$$ on Filthy

    Reputations:
    42,712
    Messages:
    29,847
    Likes Received:
    59,649
    Trophy Points:
    931

    RIP, SLI: Nvidia GeForce RTX 30-series slams the final nail in multi-GPU's coffin

    The multi-GPU dream is effectively dead. Only the $1,500 GeForce RTX 3090 will support it in Nvidia's new GeForce RTX 30-series lineup.

    So yeah, SLI isn’t technically dead, but it’s effectively dead for all but the top 0.1 percent of us. If you’ve got $3,000 to drop on a pair of RTX 3090s, by all means feel free to lord it over your geeky pals. Just don’t expect developers to spend much time making sure it works great in your games.

    Benching in SLI will only be for those who have sponsors or those who need it for work.
     
    jc_denton, Rage Set, Convel and 2 others like this.
  28. Mr. Fox

    Mr. Fox BGA Filth-Hating Elitist

    Reputations:
    37,235
    Messages:
    39,339
    Likes Received:
    70,654
    Trophy Points:
    931
    Sad but true. I always loved SLI. But NVIDIA isn't even putting effort into it anymore. I've noticed benchmark scores are not even enough to make up the difference. Instead of a 60% or greater boost like what used to be typical, it seems more like a 20-25% increase. Multiply the cost by two and it's a stupid idea financially even if you love SLI. Back when all laptops had anemic graphics performance, it took two MXM cards in SLI to match a single GPU of the same model/class on a desktop, and it was essential if you wanted a monsterbook to run like a monsterbook should. Now even that gap is narrowing because of the gimping and metering of desktop GPU performance in order to leave the headroom to resell the same GPU later, less metered, under a new name and a different model number than last year's.
     
    jc_denton, Rage Set, ajc9988 and 2 others like this.
  29. Johnksss

    Johnksss .

    Reputations:
    11,536
    Messages:
    19,461
    Likes Received:
    12,843
    Trophy Points:
    931
    So, I'm sitting on this at the moment.... Haven't cracked it open yet.
    20200902_193632.jpg
     
    jc_denton, Mr. Fox, Rage Set and 3 others like this.
  30. Talon

    Talon Notebook Virtuoso

    Reputations:
    1,482
    Messages:
    3,519
    Likes Received:
    4,695
    Trophy Points:
    331
    That is the board I went with. I originally had the Hero XII, and that board was equally great. I actually returned it and went with the Extreme for the simple reason of wanting PCIe 4.0 when/if it ever came with Rocket Lake. However, it now seems ASUS will support it, even though the chips on the board don't seem to make that possible. So confusing. The Extreme board has the PCIe 4.0 hardware/chip on it though. Both boards overclocked the same, but I also didn't push extreme clocking either. The Apex board is the board for some heavy-duty OC on the memory.
     
    Papusan likes this.
  31. Talon

    Talon Notebook Virtuoso

    Reputations:
    1,482
    Messages:
    3,519
    Likes Received:
    4,695
    Trophy Points:
    331
    Possible but not likely. If Nvidia had the ability to turn on DLSS for any TAA game with the rumored DLSS 3.0, they would have likely talked about that during the presentation. That would be HUGE news and I highly doubt Nvidia would keep that information in the dark. Either way, we'll all know more as the days pass and we eventually get lots of third-party reviews. I think I'm also going with the 3080 and using the EVGA Step-Up program to dump the 2060 Super I grabbed as a holdover card when I sold off my 2080 Ti last month.
     
    Papusan and ajc9988 like this.
  32. ajc9988

    ajc9988 Death by a thousand paper cuts

    Reputations:
    1,755
    Messages:
    6,121
    Likes Received:
    8,849
    Trophy Points:
    681
    Here is why I think Nvidia didn't:

    1) It isn't ready for mainstream. Meaning, it takes implementation by the company to a degree and game ready drivers. Nvidia had fewer companies sign on to implement it than they thought they would (or they planned on a later release, so the other partners just were not ready yet). Either way, this would lead to them not wanting to talk about it.

    2) Nvidia only had so much time to get the game ready driver itself and was trying to iron out other performance gains using other features and just was not ready to announce it.

    Either way, I think this is a hint at what Nvidia will drop in a later announcement. Like a sneak preview. Otherwise, I find it interesting to always use TAA for each of the games tested. Also, on Control, the TAA result was the same percentage of gains as RT+DLSS. Now this could cut either way, because Digital Foundry admitted that RT+DLSS was done at a lower resolution and upscaled. Because of that, I'm assuming the DLSS under TAA is less efficient, to a degree, than DLSS with RT. Either way, it is still a relative percentage over the prior generation. How could RT+DLSS have the same percentage over the prior generation as without using those two features? That is where I think Nvidia's selection of games comes in, which then led me to the TAA and game ready driver hypothesis.

    Personally, I think the TAA thing would be very cool. I just do not see how they would get there without it, or why else they restricted the testing to their pre-approved list (of course you want to show it in its best light, but . . . ). Now, if I am wrong and that performance is there, then the 30% figure I estimated over the 2080 Ti is wrong, and it is closer to 47%, as was shown in the leaked Time Spy Extreme score. That is a possibility, if I am being honest. But I need to find out more, and Nvidia gave very little, even in the historical sense for events like these.
     
  33. Johnksss

    Johnksss .

    Reputations:
    11,536
    Messages:
    19,461
    Likes Received:
    12,843
    Trophy Points:
    931
    Same here. I wanted a 4.0-ready board with all the bells and whistles for when I'm not overclocking.

    That Apex looked really nice as well.
     
    Talon and Papusan like this.
  34. ssj92

    ssj92 Neutron Star

    Reputations:
    2,446
    Messages:
    4,446
    Likes Received:
    5,690
    Trophy Points:
    581
    So it sounds like AMD's Big NAVI might actually be able to beat 3080 and match 3090 based on some rumors.

    Perhaps we will see that 3090Ti/Titan card after all later in the year.
     
    electrosoft, ajc9988 and Rage Set like this.
  35. Rage Set

    Rage Set A Fusioner of Technologies

    Reputations:
    1,611
    Messages:
    1,682
    Likes Received:
    5,068
    Trophy Points:
    531
    I agree with the 3080. AMD will have a card that I suspect will beat the 3080 in non-ray-traced games, but the 3090 will remain the top dog. We all know that Nvidia wouldn't be happy with just a single card being the best, because AMD can lower the price on its cards, thus dragging Nvidia into a pricing war. The 3080 Ti will come near the 3090 in performance but lack the high amount of VRAM, and sit right at $1K to $1.2K. If somehow AMD manages to release a card that goes blow for blow with the 3090, Nvidia is in trouble, even if AMD were to price that card at $1,500.

    This is the first generation where I am actually waiting on both AMD and Nvidia to show their cards before I make my decision.
     
  36. Papusan

    Papusan Jokebook's Sucks! Dont waste your $$$ on Filthy

    Reputations:
    42,712
    Messages:
    29,847
    Likes Received:
    59,649
    Trophy Points:
    931
  37. ajc9988

    ajc9988 Death by a thousand paper cuts

    Reputations:
    1,755
    Messages:
    6,121
    Likes Received:
    8,849
    Trophy Points:
    681
    So, more on the DLSS situation. This nugget is on Reddit ( https://www.reddit.com/r/nvidia/com...tx_30series_community_qa_submit_your/g3mjdo9/)


    "DLSS SDK 2.1 is out and it includes three updates:

    ● New ultra performance mode for 8K gaming. Delivers 8K gaming on GeForce RTX 3090 with a new 9x scaling option.

    ● VR support. DLSS is now supported for VR titles.

    ● Dynamic resolution support. The input buffer can change dimensions from frame to frame while the output size remains fixed. If the rendering engine supports dynamic resolution, DLSS can be used to perform the required upscale to the display resolution."

    So, this lends some credence to Nvidia working on new DLSS implementations and possibly having DLSS working with the TAA in specific games.

    Now, DLSS is evidently built on TAA.

    https://www.anandtech.com/show/15648/nvidia-intros-dlss-20-adds-motion-vectors
    (and the dumpster)
    https://wccftech.com/dlss-ultra-per...rt-coming-soon-to-unreal-engine-4-rtx-branch/
    https://wccftech.com/nvidia-dlss-2-0-behind-the-scenes-how-the-magic-happens/

    These are suggesting that TAA being on may have actually allowed those tensor cores to help. As I said, if true, this could open up a lot of games once game ready drivers and implementations are in place.

    But this might be me liking a *possible* feature that may not come to fruition. Personally, though, I think this is why Nvidia carefully controlled the games and the narrative on performance (because the non-accelerated performance is not as high).

    Edit: and CoD WWII is on sale for $20 (the Digital Deluxe for $33) on steam.
     
    Last edited: Sep 3, 2020
  38. Mr. Fox

    Mr. Fox BGA Filth-Hating Elitist

    Reputations:
    37,235
    Messages:
    39,339
    Likes Received:
    70,654
    Trophy Points:
    931
    That's what I got. Returned the Aorus Master. It was a nice piece of hardware with crap firmware. The Apex works much better, especially with Windows 7.
     
    jc_denton, Rage Set and temp00876 like this.
  39. Johnksss

    Johnksss .

    Reputations:
    11,536
    Messages:
    19,461
    Likes Received:
    12,843
    Trophy Points:
    931
    Were you able to find all the usb drivers for Windows 7?
     
    Rage Set and Mr. Fox like this.
  40. Mr. Fox

    Mr. Fox BGA Filth-Hating Elitist

    Reputations:
    37,235
    Messages:
    39,339
    Likes Received:
    70,654
    Trophy Points:
    931
    Yes, I believe so. The only things in Device Manager with no drivers available are the Aura USB controller, WiFi and Ethernet. I just used the Windows 7 installation that I was running on the X299 Dark. I used the 300 Series Intel SATA drivers. I am using a USB WiFi adapter for internet on Windows 7.

    I'm still trying to figure out the voltage control. From what I can see, Intel made these on purpose to have extremely high idle voltage. I saw where someone posted that Intel's default stock core voltage maximum is somewhere around 1.525V, but I am not finding anything published on it. The ASUS BIOS legend shows that for standard (not LN2) use the core maximum is 1.700V.

    I can't really believe how cool running this CPU is. I guess after running 7960X and 7980XE for a while, I got kind of numb to high temperatures.

    Capture.JPG
    Wraith.jpg
     
    Last edited: Sep 4, 2020
    Ashtrix, jc_denton, Rage Set and 5 others like this.
  41. Papusan

    Papusan Jokebook's Sucks! Dont waste your $$$ on Filthy

    Reputations:
    42,712
    Messages:
    29,847
    Likes Received:
    59,649
    Trophy Points:
    931
    Why there's no GeForce RTX 3080 Ti (yet) pcworld.com

    Bottom line: It makes more sense for Nvidia to wait.

    Why GeForce RTX 3090 instead of 3080 Ti?
    Debuting with a $1,500 GeForce RTX 3090 instead of a GeForce RTX 3080 Ti makes sense for two key reasons: It lets Nvidia continue to push graphics card pricing upward while establishing a hard-to-topple performance benchmark, and it leaves room to maneuver if AMD’s imminent RDNA 2-based “Big Navi” Radeon graphics cards come out with GPUs blazing.

    By pulling this card out with the long-dead xx90 classification, Nvidia can also keep prices high. The company faced fierce criticism and slow starting sales thanks to the higher prices established in the GeForce RTX 20-series, which pushed up costs for each GPU tier (the RTX 2070 and 2080 each cost $100 more than their predecessors) but offered the same traditional gaming performance in each price tier.

    If Nvidia called this the GeForce RTX 3080 Ti but priced it $300 higher than the 2080 Ti, forum-goers would grab their torches and pitchforks. Stuffing this much performance and 24GB of cutting-edge GDDR6X memory into a radically redesigned graphics card isn’t cheap. By calling it the GeForce RTX 3090, Nvidia can squeeze more money out of the price-is-no-object enthusiasts while avoiding unrest.

    The GeForce RTX 2080 Ti was the outlier when it exploded onto the scene at launch--Nvidia likely did so only to put the best possible ray tracing performance on the street as quickly as possible. Don’t forget that Nvidia’s RTX graphics cards were the chicken, while actual ray traced games were the eggs.

    In short, back to normal.



     
    Last edited: Sep 4, 2020
    TBoneSan, jc_denton, Rage Set and 5 others like this.
  42. Johnksss

    Johnksss .

    Reputations:
    11,536
    Messages:
    19,461
    Likes Received:
    12,843
    Trophy Points:
    931
    Hummm, then I'm going to have to search a bit harder for those drivers...

    I have seen it as high as 1.625V and was like, WTH! And that was on auto with AI3 installed.

    True, this runs much cooler than my 7980XE, that's for sure.
     
  43. Rage Set

    Rage Set A Fusioner of Technologies

    Reputations:
    1,611
    Messages:
    1,682
    Likes Received:
    5,068
    Trophy Points:
    531
    Johnksss, Mr. Fox and Papusan like this.
  44. Mr. Fox

    Mr. Fox BGA Filth-Hating Elitist

    Reputations:
    37,235
    Messages:
    39,339
    Likes Received:
    70,654
    Trophy Points:
    931
    Oh sure, now that I figured it was never going to exist and bought a darned ASUS board, here it comes, LOL. That is the one to get.
     
    Johnksss and Rage Set like this.
  45. Mr. Fox

    Mr. Fox BGA Filth-Hating Elitist

    Reputations:
    37,235
    Messages:
    39,339
    Likes Received:
    70,654
    Trophy Points:
    931
    The Comet Lake chipset driver/INF package will install on W7. The USB drivers for Z3XX will work, and so will the AHCI and ME drivers. The difficult part is running Windows 7 Setup, which is why I used the Macrium image from the X299 Dark. I even have the 2080 Ti USB Type-C working on Windows 7 using an AMD USB 3.1 driver that I modded to change the device name. The AMD and NVIDIA hardware IDs are the same because those GPU Type-C controllers are made by the same company.
    Yeah, it is rather unsettling to see it so high, but apparently it is normal. I wish I could find something official from Intel on it. The low voltage I was initially seeing seems to have been due to inaccurate sensors on the Aorus Master. After a firmware update the voltage went high and I returned it thinking it was defective. Then I see the exact same thing with the Apex, LOL.

    The Apex BIOS is definitely better, which is expected with ASUS. But, I probably would have just lived with it had I known what was going on was to be expected. However, there were some issues with Windows 7 on the Gigabyte board that do not exist with the Apex. I think Gigabyte is drinking too much of the Windows OS X wee-wee flavored Kool-Aid. I've always viewed them as being more of a gamer-boy brand, like MSI... not especially tuned in with people that are more serious about overclocking than playing video games.
     
    Last edited: Sep 4, 2020
    Johnksss likes this.
  46. Mr. Fox

    Mr. Fox BGA Filth-Hating Elitist

    Reputations:
    37,235
    Messages:
    39,339
    Likes Received:
    70,654
    Trophy Points:
    931
    This older BIOS seems to be a lot better at keeping the voltage in check. Z490 does a nice job at memory overclocking. Brother @Robbo99999 should like these AIDA64 memory tests.

    4600-CL17.JPG AIDA64-4600CL17b.JPG
     
    Johnksss, Talon and temp00876 like this.
  47. tps3443

    tps3443 Notebook Virtuoso

    Reputations:
    746
    Messages:
    2,421
    Likes Received:
    3,115
    Trophy Points:
    281
    Shouldn’t increasing the water capacity of an expandable AIO cooler help lower CPU temps? I have a decent XSPC reservoir laying around, some 3/8 tubing, and fittings. My Celsius S24 is expandable, and all of the fittings are removable. So, I know these AIOs have hardly any water capacity in them, maybe 100-200mL. I am gonna try to add a reservoir to my current AIO to help it out a little. Maybe it would at least take longer to reach an equilibrium. Or maybe I am just bored and trying to find something to do to my computer lol
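    Rough math on what the extra water actually buys: it mostly slows the warm-up, since the steady-state temperature is set by the radiator, not the loop volume. A sketch with purely illustrative numbers (reservoir size and heat load below are assumptions, not measurements):

```python
# Extra warm-up time from added coolant: t = m * c * dT / P.
# All inputs below are assumptions for illustration.
added_volume_l = 0.25      # extra reservoir volume in litres (~250 mL)
specific_heat = 4186       # J/(kg*K) for water
surplus_heat_w = 250       # watts going into the coolant beyond what the radiator sheds early on
delta_t = 10               # coolant temperature rise of interest, in K

mass_kg = added_volume_l   # ~1 kg per litre of water
extra_seconds = mass_kg * specific_heat * delta_t / surplus_heat_w
print(f"Added coolant delays a {delta_t} K rise by roughly {extra_seconds:.0f} s")
# ~42 s with these numbers: longer to reach equilibrium, same equilibrium temperature.
```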

    I passed 8 hours of RealBench on my 7980XE at 4.4GHz with a x30 mesh frequency, no AVX offsets at all, and only 1.13 volts, DDR4 @ 3733MHz. Darn thing still hit 98C max.

    At least it is stable.
     
    Last edited: Sep 5, 2020
    Johnksss, Rage Set and Mr. Fox like this.
  48. Mr. Fox

    Mr. Fox BGA Filth-Hating Elitist

    Reputations:
    37,235
    Messages:
    39,339
    Likes Received:
    70,654
    Trophy Points:
    931
    Yes, it should improve your thermal capacity. My system holds about 3/4 of a gallon.
     
    Rage Set likes this.
  49. Talon

    Talon Notebook Virtuoso

    Reputations:
    1,482
    Messages:
    3,519
    Likes Received:
    4,695
    Trophy Points:
    331
    @Mr. Fox If you don't already have this... Good for real-time memory tweaking in Windows. Comes on the USB stick for the Extreme board. Not sure if the Apex includes it, and of course it's difficult to find for download.

    https://filehorst.de/download.php?file=dDvyBAAe

    Also tweaking these settings that are only exposed/available in that program (still not available in the BIOS) can get the latency a bit lower. I think around 2ns lower.

    https://rog.asus.com/forum/showthread.php?118328-Asus-Z490-stuff/page14
     
  50. Mr. Fox

    Mr. Fox BGA Filth-Hating Elitist

    Reputations:
    37,235
    Messages:
    39,339
    Likes Received:
    70,654
    Trophy Points:
    931
    Awesome, thank you. It does come with the Apex board, but that is a newer version of the TurboV Core app. It works better than anything else in Windows. No XTU bloat crap. You have to be careful using it though. Sometimes it will read settings from the BIOS incorrectly and cause the system to crash. I look at what it reads before clicking on apply and sometimes have to close and reopen it.

    I will also check out those RAM tweaks.

    Edit: Sorry, I said TurboV Core... although what I said about TurboV Core was accurate, it is a newer version of MemTweakIt than what came with my mobo. It's also newer than the version posted on HWBOT. @Talon
     
    Last edited: Sep 5, 2020
    Johnksss, Talon and Papusan like this.
← Previous page | Next page →