The Notebook Review forums were hosted by TechTarget, which shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information that had been posted on the forums was preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.

    *Official* NBR Desktop Overclocker's Lounge [laptop owners welcome, too]

    Discussion in 'Desktop Hardware' started by Mr. Fox, Nov 5, 2017.

  1. Robbo99999

    Robbo99999 Notebook Prophet

    Reputations:
    4,346
    Messages:
    6,824
    Likes Received:
    6,112
    Trophy Points:
    681
    For seriously designed overclocking cards like MSI Lightning, etc, then I think they should have a mode & software support to say - "Look, you can do whatever you want with any of the parameters and we recommend that you keep within xyz limits with xyz cooling, but whatever you do is at your own risk!". So completely unlocked, and if cards fail then they fail.

    2250Mhz is good though for sure with your 2080ti, so congrats on that. They should have some cards on the market though (perhaps at significantly higher prices) that are super overengineered and without any constraints (my last part of my previous paragraph).
     
    Mr. Fox likes this.
  2. Johnksss

    Johnksss .

    Reputations:
    11,536
    Messages:
    19,461
    Likes Received:
    12,843
    Trophy Points:
    931
Hummm, with 3 drives in an Asus laptop I was able to hit 9000+ MB/s (2 Intel and 1 Samsung - the first two drives have to be Intels, while the 3rd can be whatever).
     
    Rage Set and Mr. Fox like this.
  3. Johnksss

    Johnksss .

    Reputations:
    11,536
    Messages:
    19,461
    Likes Received:
    12,843
    Trophy Points:
    931
Awh shooky dooky, I have that controller. I bought it a while back, along with a Raspberry Pi.
     
    Mr. Fox and ajc9988 like this.
  4. ajc9988

    ajc9988 Death by a thousand paper cuts

    Reputations:
    1,755
    Messages:
    6,121
    Likes Received:
    8,849
    Trophy Points:
    681
    https://www.igorslab.de/en/nvidia-g...-so-important-and-what-are-the-object-behind/

More from Igor's lab on capacitors. Part of it is clearly a jab back at Buildzoid's video saying only Panasonic capacitors are POS-caps (which is technically correct), countering that engineers use the term more broadly, in part calling them Piece of ____ caps.

    But, since we are learning more about the cards and their power delivery, I'm including it for anyone that cares to read more about capacitors and their function.

    Edit:

    And Buildzoid's FE analysis.
     
  5. Papusan

    Papusan Jokebook's Sucks! Dont waste your $$$ on Filthy

    Reputations:
    42,712
    Messages:
    29,847
    Likes Received:
    59,649
    Trophy Points:
    931
If this is the end result from AMD, we won't see Nvidia change course https://videocardz.com/newz/amd-navi-21-to-feature-80-cus-navi-22-40-cus-and-navi-23-32-cus

The coming 3080@20GB cards can help Nvidia hold up the former 2080 Ti price point (sell it at $899/$999/$1099 depending on AMD's best card). Many will buy a $1000 card over the much more expensive 3090, even if the cards only show minor performance gains over the cheaper 10GB card. Big win for Nvidia.

Why there's no GeForce RTX 3080 Ti (yet) pcworld.com | UPDATED 2 days ago | By Brad Chacos

    Nvidia's blowout RTX 30-series launch revealed a GeForce RTX 3080 and a RTX 3090, but no GeForce RTX 3080 Ti. We explain why that might be.

    Why is there no GeForce RTX 3080 Ti? Because Nvidia doesn’t need it….
     
    Last edited: Sep 27, 2020
    ajc9988 and Robbo99999 like this.
  6. Robbo99999

    Robbo99999 Notebook Prophet

    Reputations:
    4,346
    Messages:
    6,824
    Likes Received:
    6,112
    Trophy Points:
    681
Still, there's only 11% performance difference between 3080 and 3090, and the 3080 Ti or 3080 Super (whatever it might be called) can't really perform better than the 3090, so it's only gonna slot into that silly little 11% performance gap between 3080 & 3090....the only reason they'd launch a product in between those two is for 20GB vs 10GB VRAM reasons, and perhaps to toss in 5% extra performance in the form of a few more shaders unlocked, so any 3080 Ti is only gonna be worthy for its 20GB VRAM beyond what the 3080 10GB currently offers....that's the way I see it. (Unless they've got a new & larger chip, larger than the 3090, up their sleeve, but even then I don't think they could release a 3080 Ti performing better than a 3090 - just going on the naming convention and also the pricing.)
     
  7. ajc9988

    ajc9988 Death by a thousand paper cuts

    Reputations:
    1,755
    Messages:
    6,121
    Likes Received:
    8,849
    Trophy Points:
    681
Yeah, the rumored specs on AMD are not as bad as you made out, though. We are talking boost clocks for the higher energy variant of 2.2GHz for the 6900XT, 80CUs, and you cannot compare TFlops. Hell, because of the bottleneck in the 3000 series (which will be fixed in the next series of cards from Nvidia, done on a smaller node), the TFlops figure on the 3000 series cards doesn't scale the way the TFlops calculation did in prior gens. You just cannot compare TFlops from different architectures.

The rumored memory bandwidth is only 512GB/s, much slower than Nvidia's. So they really need a hell of a cache system, and that needs to be probed for its limits.

    The Navi 22 has a boost of 2.5GHz, meaning AMD overcame the logic breakdown issues when working on the GPU for the PS5. Also means that the 2.2GHz 80CU card might have more gas in the tank for overclocking. Remember, it takes power mods and better cooling to even come close to that with Nvidia. But there is the IPC question also, so even that is not a fair comparison until we know more on AMD's cards.

The 6900XT and 6900 (the 2.2GHz and 2.05GHz boost cards) are what are competing with Nvidia's 3080. The 3090 should have both beat. For the XT, the rumor has it at a 300W card, which is using the more efficient and better TSMC 7nm node. Samsung 8nm is actually a refined 10nm node, much like TSMC 12nm was the fourth gen and beyond of TSMC's 16nm node, or like Intel's 14nm++ (two refinements on that Samsung node). So even being 300W, that should be a more efficient 300W (from a process node standpoint), and due to not using GDDR6X, way less is going to the memory (Nvidia's design spends around 60W on memory, whereas the rumored Ampere Quadro 6000 has GDDR6, but with a larger memory bus, so more memory bandwidth than the 3090 even though it only uses GDDR6).

    Even with that, and the expected higher speeds (only 5-10% faster, depending on card for the 3080), they would need both the IPC uplift AND a good amount of changes to hit the target to actually match the 3080. It is possible. But at minimum, these specs do suggest they have reached a 2080 Ti performance level, if not higher.

    Unfortunately, from the leaks and what we know, we are looking at the 6900XT (Navi 21B), the 6900 (Navi 21A), a 40 CU 6700 (Navi 22) and the 32 CU 6500 (Navi 23).

    Either way, I wouldn't count AMD out on that leak.

    [​IMG]

    That is from Newegg and gives CU counts, although some have said the figures may be guesswork rather than final spec numbers.

    Edit: according to the newegg list, the 6800 XT will be a 60CU chip.

    Edit 2: and the reason that Nvidia's shaders seemingly doubled was the inclusion of FP instructions along the int pipeline to make it an int/FP pipeline and a direct FP pipeline. But the int/fp pipeline still needs to do int about 25% of the time. With the bit of overhead, that means you get less efficient processing. This is why it seems Nvidia will do 2xFP dedicated and either an int or int/fp line in their next uarch, which is easier to accomplish on a smaller node. That is why comparing the 29TFlop of the 3080 to the 22TFlop of the 6900 XT is not proper, among many other reasons.

    Edit 3: I don't think I made it clear that when Nvidia had one pipeline for INT and one for FP, they only counted the FP as a cuda core. Now that the INT is an INT/FP, they count it as a cuda core, hence the doubling of cuda cores, but because INT takes up part of the usage of the pipeline, you do not get the actual performance seen when INT is not sucking up part of the FP performance. So prior shader archs from Nvidia and their TFlops are not one to one.
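The effect described in Edits 2 and 3 can be sketched numerically. A minimal sketch, assuming the commonly cited split of 64 dedicated FP32 lanes plus 64 shared INT/FP32 lanes per Ampere SM, and the ~25% integer share quoted above:

```python
# Why "doubled" CUDA cores don't double gaming throughput.
# Assumption (illustrative): 64 dedicated FP32 lanes + 64 shared
# INT32/FP32 lanes per SM; the shared lanes spend ~25% of their
# cycles on integer work.

def effective_fp_lanes(dedicated_fp, shared, int_fraction):
    """FP32 lanes effectively available when the shared pipe is
    busy with integer work int_fraction of the time."""
    return dedicated_fp + shared * (1 - int_fraction)

peak = 64 + 64                             # 128 lanes/SM on the spec sheet
eff = effective_fp_lanes(64, 64, 0.25)     # 112 lanes/SM in a mixed workload
print(f"peak lanes/SM: {peak}, effective: {eff:.0f}")
```

So the marketed lane count overstates mixed-workload FP throughput by roughly the integer share of the shared pipe, which is the point of the edits above.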
     
    Last edited: Sep 27, 2020
    Robbo99999 likes this.
  8. tps3443

    tps3443 Notebook Virtuoso

    Reputations:
    746
    Messages:
    2,421
    Likes Received:
    3,115
    Trophy Points:
    281
    My 2080Ti deal fell through. The guy won’t sell because he apparently cannot find a 3080 or 3090 to buy. This 1660Ti is just not fast enough for me. I hope AMD brings something quick.
     
    Mr. Fox likes this.
  9. ajc9988

    ajc9988 Death by a thousand paper cuts

    Reputations:
    1,755
    Messages:
    6,121
    Likes Received:
    8,849
    Trophy Points:
    681
    Nov. 11. Reviews up Nov. 9th. Event date is Oct. 28th.
     
    Mr. Fox and Robbo99999 like this.
  10. Robbo99999

    Robbo99999 Notebook Prophet

    Reputations:
    4,346
    Messages:
    6,824
    Likes Received:
    6,112
    Trophy Points:
    681
So we're looking at roundabout 3080, or maybe 20% less, performance (when you say better than 2080 Ti performance) for the best upcoming cards from AMD.....so this means we're looking at price point when they come out end of Oct / beginning of Nov....thing is, they may actually be available to buy, unlike the 3080 from NVidia! I've got my preorder in with overclockers.uk, and they say they're gonna create a stock order calculator to work out, based on your pre-order number, how long the expected wait could be - they've not created that yet.....some peeps who have bought them are talking about after Christmas for their cards (their guess only), but I think NVidia & partners will be able to ramp up production and accelerate the release....for instance I ordered mine exactly a week ago and I can't see this card coming after Christmas, I really think that would be crazy, I think they'll sort it by end of Oct / beginning of Nov (hunch).
     
    Mr. Fox and ajc9988 like this.
  11. ajc9988

    ajc9988 Death by a thousand paper cuts

    Reputations:
    1,755
    Messages:
    6,121
    Likes Received:
    8,849
    Trophy Points:
    681
To be honest, it could even potentially BEAT Nvidia's 3080. It should lose to the 3090. My main point is that if you divide the number of shaders cited by Nvidia in half, that is closer to what it really is (or subtract about 30% of the performance implied by the number of shaders cited). Remember, the 2080 Ti only had 13.45 TFlops, yet look how close it is to the 3080 with 29.8 TFlops. This is why comparing AMD's claimed 22.5 TFlops to the 3080's 29.8 TFlops means NOTHING.

    Now, if the game uses DLSS 2.0 or 2.1 and you use that, then Nvidia may give better performance. Rumors have the Navi 2 performance on Raytracing around that of Turing, but less than Ampere. If not using that, then it doesn't matter.

    And one thing to point out, AMD is ALSO releasing their review embargo two days before purchases go live. That means many can know the numbers BEFORE they make the decision to buy Nov. 11. That also says they are decently confident in what reviewers will find.

    So, just going to take time. I'm waiting to see pricing and how close the 6900 is to the 6900 XT and whether they have a scaling issue above number of CUs.
     
    Robbo99999 and Mr. Fox like this.
  12. yrekabakery

    yrekabakery Notebook Virtuoso

    Reputations:
    1,470
    Messages:
    3,438
    Likes Received:
    3,688
    Trophy Points:
    331
    No, you’re looking at potential rasterization performance that smokes the RTX 3080, while having a 16GB frame buffer and a 100W lower TDP. Let me explain.

If you look at the Ampere SM, each SM has 64 FP32 cores, and 64 FP32+INT32 cores capable of either floating-point or integer math. An RTX 3080 has 68 of these SMs, for a total of 4352 FP cores and 4352 FP/INT cores.

    Contrast that with the Turing SM, which has separate blocks for FP and INT. The RTX 2080 Ti also has 68 of these SMs, for a total of 4352 FP cores and 4352 INT cores.

    In a pure float workload, each Ampere SM can execute 128 float ops/clock. So in something like Folding@Home, the 3080 is an absolute monster, and does indeed have double the float shading throughput of the 2080 Ti, around 30 TFLOPS.

    But in a mixed float/int workload, such as games, the Ampere SM runs in concurrent mode and has the same execution as the Turing SM, 64 float + 64 int ops/clock. Nvidia estimates that games average around 70 float/ 30 int per 100 instructions, although this can go as high as 60 float / 40 int in Shadow of the Tomb Raider. In concurrent mode, 3080 is identical to 2080 Ti with 4352 float + 4352 int, which is 15 TFLOPS.

Now, consider that the 6900 XT with 80 CUs (64 stream processors per CU, for a total of 5120) at up to 2200MHz would have around 22.5 real TFLOPS. This is in comparison to the 15 TFLOPS the 3080 has in concurrent mode. And this massive amount of float shading throughput is at an impressive 220-238W according to that article.

    Keep in mind that RDNA1 and Turing have almost identical perf/FLOPS, as the 5700 XT and 2070 Super are identical in TFLOPS and memory bandwidth at the same clockspeed, and RDNA2 is rumored to have increased IPC relative to RDNA1. RDNA1 did not have the ability to execute float and int concurrently, but that did not seem to hurt it in the above comparison from Shadow of the Tomb Raider, which is an integer heavy game as mentioned.
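The arithmetic behind those figures is easy to sanity-check. A quick sketch, assuming an FP32 FMA counts as 2 FLOPs per lane per clock and the 3080's ~1.71GHz reference boost (the 2.2GHz for the 6900 XT is from the rumor above):

```python
# Back-of-envelope TFLOPS check for the lane counts quoted above.
# Assumptions: FMA = 2 FLOPs/lane/clock; 3080 reference boost ~1.71GHz.

def tflops(lanes, clock_ghz):
    return lanes * 2 * clock_ghz / 1000.0  # 2 FLOPs per FMA, result in TFLOPS

rtx3080_pure  = tflops(68 * 128, 1.71)  # all 8704 lanes doing float: ~29.8
rtx3080_mixed = tflops(68 * 64, 1.71)   # concurrent mode, 64 FP lanes/SM: ~14.9
rx6900xt      = tflops(80 * 64, 2.2)    # 5120 stream processors: ~22.5

print(f"3080 pure FP: {rtx3080_pure:.1f}, 3080 mixed: {rtx3080_mixed:.1f}, "
      f"6900 XT: {rx6900xt:.1f} TFLOPS")
```

The numbers land on the ~30, ~15, and ~22.5 TFLOPS figures used in the comparison.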
     
    Robbo99999, Mr. Fox and ajc9988 like this.
  13. Mr. Fox

    Mr. Fox BGA Filth-Hating Elitist

    Reputations:
    37,235
    Messages:
    39,339
    Likes Received:
    70,654
    Trophy Points:
    931
    And, you can count on probably not being able to buy anything desirable or paying WAY MORE than the MSRP any time close to the release date. When I went shopping for my downgrade I explored a variety of options to use the money from selling the X299 Dark and 7980XE and nothing desirable was available, or what little (and I do mean little) was available was grossly overpriced. There were plenty of garbage consumer CPUs and motherboards that no enthusiast would want, like non-K and i3, i5 and i7 processors.

    I even toyed with the idea of going with 3950X as a plan B. Every place I looked was either out of stock or raping people on the price. It seems Intel, NVIDIA and AMD are all really bad about releasing products that you cannot buy at all, or can't buy for anything that resembles the MSRP because stock is so limited and vendors are eager to see what level of robbery they can get away with when people are stupid with their money.

    In other words... figure on getting something obsolete or waiting until Christmas, or later.

    I heard a rumor that retailers were dumping 2080 Ti stock for dirt cheap, but that doesn't seem to be accurate. They're still raping customers on new GPUs that are obsolete. Some are charging more now that they are obsolete, LOL. I haven't seen any examples of them being sold at a loss or for huge discounted prices. The only people selling them for somewhat acceptable prices are end-users selling their used GPUs.
     
    Last edited: Sep 27, 2020
    DreDre, ajc9988, electrosoft and 2 others like this.
  14. tps3443

    tps3443 Notebook Virtuoso

    Reputations:
    746
    Messages:
    2,421
    Likes Received:
    3,115
    Trophy Points:
    281
I have noticed that the timing “tFAW” greatly affects the stability and performance of my DDR4. I couldn’t quite manage a stable setup with tFAW at 30. Default XMP is 48. I think running (4) sticks makes this more difficult; I am not exactly sure. So far I have tried tFAW at 34, which still wasn’t stable, but it made it a lot further. Maybe 36-38 tFAW will work.

    Obviously running it at 30 is stable for benchmarks, or gaming, or even substantially heavy benchmarks. But, designated memory stress testing fails.

I'm glad I tracked down the bad apple in the bunch, though. tFAW helps performance quite a bit on my system.

I am using Intel Burn Test with maximum memory utilization. I know this isn’t a dedicated memory stress test, and some people advise against it. But it finds the problem quickly vs. waiting 30 minutes to see an error or problem if I use another memory stability test.
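As a rough illustration of why a tighter tFAW helps: tFAW is the rolling window in which at most four row ACTIVATE commands may issue, so a smaller window allows a higher sustained activate rate. A sketch assuming DDR4-4000 (2000MHz memory clock) - illustrative numbers, not measurements from this system:

```python
# tFAW limits row ACTIVATEs: at most 4 may issue inside the window.
# Assumption (illustrative): DDR4-4000, i.e. a 2000MHz memory clock,
# so one cycle = 0.5 ns.

def activates_per_us(tfaw_cycles, clock_mhz):
    """Max sustained ACTIVATEs per microsecond when tFAW is the limiter."""
    window_ns = tfaw_cycles * (1000.0 / clock_mhz)  # tFAW in nanoseconds
    return 4 * (1000.0 / window_ns)                 # 4 ACTs per window

for tfaw in (48, 36, 30):  # XMP default vs. the tighter values tried above
    print(f"tFAW={tfaw}: {activates_per_us(tfaw, 2000):.0f} ACT/us max")
```

Going from 48 to 30 cycles raises the ceiling on row activates by 60%, which is why the tighter value shows up in benchmarks even when it is marginal for stability.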
     
  15. electrosoft

    electrosoft Perpetualist Matrixist

    Reputations:
    2,766
    Messages:
    4,107
    Likes Received:
    3,945
    Trophy Points:
    331
Unfortunately it is a market of supply and demand; for some inane reason, inflated demand allows price gouging and rampant bots to soak up the new stock and keep the old stock's prices pumped up.

If you have a 2080 Ti and you can't score a 3080, coupled with the leaks of AMD's new hardware not meeting or exceeding the 3080 (hope that's not true), that leaves the market for 2080 Tis still intact atm.

I sold my AMD 5700xt 50th AE banking on a >=3080 class card, which would net me ~2x the performance of my 5700xt. I even have a 32" 4K display on the way (it won't be here till mid October), but it looks like there's a good chance I'll end up tiding myself over for a while, or just hook up my P870TM1 to it and use that till something makes itself clear and worthy.

    I had a chance to scoop up a 3090, but I just couldn't logically or budget wise rationalize dropping $1700 after tax on a GPU. Even a 3080 was contingent on selling my 5700xt as I had to make room in the budget for the 4k display. In my defense, my 30" HP ZR30w is starting to die with the dreaded blink/off problem and massive, sudden screen distortions while suddenly running much hotter and has gotten much dimmer over the last year. The built in hub died years ago too.

    I have another ZR30 the Mrs. uses for work (connected to an AW18), and it is still running perfect but according to her touching it may result in bodily harm upon my person. :eek:

    GPU market forces win again.... :mad:

    ....for now. :p
     
  16. Robbo99999

    Robbo99999 Notebook Prophet

    Reputations:
    4,346
    Messages:
    6,824
    Likes Received:
    6,112
    Trophy Points:
    681
    Cool, well I hope AMD do well, it sounds like they're not gonna be far off or maybe even equal or better - mixed bag! I'm happy with my 3080 purchase though because of G-Sync monitor / Ray Tracing / 10GB ok for 1080p for quite a few years I imagine, so I won't be pissed if AMD beat my card for a lower price point in general rasterisation.
     
  17. tps3443

    tps3443 Notebook Virtuoso

    Reputations:
    746
    Messages:
    2,421
    Likes Received:
    3,115
    Trophy Points:
    281

I am most likely going to buy a 3090 around tax season next year. Preferably a Galax or Kingpin model. The 360mm AIO Kingpin looks interesting. I bet I could squeeze some good performance from a GPU like that. Unless I can come across something pretty cheap right now, it just isn’t worth being impatient and buying one just to get it now. 2080 Ti prices were so good right when the 3080 was announced. I should have picked one up then. They have gone back to nearly the normal pricing from before the 3080 was even announced.

As for the rest of my system, I am happy with it. Everything is tuned and really fast. My CPU is at an easy 4.6GHz with really good temperatures, I have managed to get 3.2 on my mesh, with 4,000MHz DDR4 at really tight timings and very low latency, and I see this platform working well for a while. Except for my 1660 Ti, which is the weakest link. Or as my 8 year old son would say, “This thing sucks”.
     
    Johnksss and Mr. Fox like this.
  18. Johnksss

    Johnksss .

    Reputations:
    11,536
    Messages:
    19,461
    Likes Received:
    12,843
    Trophy Points:
    931
  19. tps3443

    tps3443 Notebook Virtuoso

    Reputations:
    746
    Messages:
    2,421
    Likes Received:
    3,115
    Trophy Points:
    281
    Johnksss, Papusan and Mr. Fox like this.
  20. electrosoft

    electrosoft Perpetualist Matrixist

    Reputations:
    2,766
    Messages:
    4,107
    Likes Received:
    3,945
    Trophy Points:
    331
    Papusan and Johnksss like this.
  21. Johnksss

    Johnksss .

    Reputations:
    11,536
    Messages:
    19,461
    Likes Received:
    12,843
    Trophy Points:
    931
    4K 120-144Hz?
     
  22. Papusan

    Papusan Jokebook's Sucks! Dont waste your $$$ on Filthy

    Reputations:
    42,712
    Messages:
    29,847
    Likes Received:
    59,649
    Trophy Points:
    931
I really hope NVIDIA doesn't go down the same path as when Dell fixed their burning-up Area-51 MESS:D See... crippling your own products. Doing it with firmware or drivers is equally pathetic.

    [​IMG]
    Tested: Nvidia's new drivers fix RTX 3080 crashes by sacrificing clock speed pcworld.com | Today

    Nvidia's new Game Ready 456.55 drivers fix the issues we've had with a GeForce RTX 3080 crashing during games, but slightly limits the top GPU Boost clock speed.

As I said, with the original drivers, the HZD benchmark ran at a mostly consistent 2010MHz on this GPU until it hit the 2025MHz wall and crapped out. With Nvidia’s new 456.55 drivers, the GPU clock speed in Horizon Zero Dawn now flitters between 1980MHz and 1995MHz (though it can still hit a stable 2010MHz in menu screens). Nvidia’s fix appears to be dialing back maximum performance with the GPU Boost feature. I’ve asked Nvidia for more details about what the new drivers do to improve stability and will update this with more info if I hear back.
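For illustration only: "dialing back maximum performance with GPU Boost" amounts to clamping the top bins of the card's voltage/frequency curve. A toy sketch with made-up values (the real curve and the exact 456.55 change are not public):

```python
# Hypothetical sketch of a driver-side V/F curve clamp. All voltage and
# frequency values are invented for illustration; this is not Nvidia's
# actual implementation.

def clamp_vf_curve(curve, max_mhz):
    """Return a copy of a {voltage_mV: boost_MHz} curve with every
    boost bin capped at max_mhz."""
    return {mv: min(mhz, max_mhz) for mv, mhz in curve.items()}

stock = {900: 1905, 950: 1965, 1000: 2010, 1050: 2025}  # hypothetical bins
fixed = clamp_vf_curve(stock, 1995)
print(fixed)  # the 2010MHz and 2025MHz bins are now held to 1995MHz
```

The observable effect matches the article: clocks that used to reach 2010-2025MHz now top out in the 1980-1995MHz range, while lower bins are untouched.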
     
    Last edited: Sep 28, 2020
    DreDre, Robbo99999, Johnksss and 4 others like this.
  23. Mr. Fox

    Mr. Fox BGA Filth-Hating Elitist

    Reputations:
    37,235
    Messages:
    39,339
    Likes Received:
    70,654
    Trophy Points:
    931
    The mere concept or general idea that they have the ability (which is already well known and documented, not new news) to manipulate processor clock speeds using drivers is totally unacceptable. I also honestly believe they do surreptitiously manipulate benchmark scores to some degree using drivers in conjunction with firmware, especially when a product is new or getting close to EOL with something else about to replace it. It's dishonest as well as overstepping boundaries on top of being an unacceptable practice.
     
    DreDre, Talon, electrosoft and 3 others like this.
  24. electrosoft

    electrosoft Perpetualist Matrixist

    Reputations:
    2,766
    Messages:
    4,107
    Likes Received:
    3,945
    Trophy Points:
    331
I can't sacrifice gamut or stomach VA viewing angles, and I quickly discovered I really do not like curved displays at all (one reason I returned a Samsung Odyssey G7) after using a ZR30w for so long, so after weeks of research I went with a BenQ EW3280U. It's 60Hz but supports AMD FreeSync and, unofficially, G-Sync, while having OK HDR and HDRi to apply to SDR gaming, and it didn't destroy my wallet.

I contemplated holding onto the 5700xt, but to see where it stood even with WoW, I increased render to 4K (from the previous 2560x1600) and watched my frames plummet into the 20's in spots while I suddenly rediscovered GPU fans on max 24/7.
I then dragged my PC into the living room, hooked it up to my Sony Z9F 4K, and watched it tank hard @ 4K.

I liked my 5700xt, but it definitely is not a full-on 4K card. Plus the next xpac will be partially RT-enabled and will hit GPUs even harder than the current one does.
     
    Papusan and Johnksss like this.
  25. tps3443

    tps3443 Notebook Virtuoso

    Reputations:
    746
    Messages:
    2,421
    Likes Received:
    3,115
    Trophy Points:
    281
    I have (2) so I hung one on the wall! Haha.


    [​IMG]
    [​IMG]
     
  26. Mr. Fox

    Mr. Fox BGA Filth-Hating Elitist

    Reputations:
    37,235
    Messages:
    39,339
    Likes Received:
    70,654
    Trophy Points:
    931
    I like the tasteful decor.
     
    tps3443 likes this.
  27. Robbo99999

    Robbo99999 Notebook Prophet

    Reputations:
    4,346
    Messages:
    6,824
    Likes Received:
    6,112
    Trophy Points:
    681
I'm also dead against these limitations placed within the driver, read about it on Guru3D just now....I think they need to fix the problem in hardware, and/or AIBs need to test their products better before committing to a boost clock - but don't limit me in the driver arbitrarily, that's a one-size-fits-all approach which is against the spirit of overclocking & maxing out your particular GPU....don't like!
     
  28. ajc9988

    ajc9988 Death by a thousand paper cuts

    Reputations:
    1,755
    Messages:
    6,121
    Likes Received:
    8,849
    Trophy Points:
    681

    Leaked Zen 3 results. 8 core outperforming Intel's 10900K in some games.

Still, take it with salt. The AMD CPU event happens in a little over a week (like Oct. 8).

    Edit: Other rumors have single core hitting 5GHz boost. Doing the math, that is about 10% faster per gen and 15% IPC per gen, coming to 26.5% faster. Now, it is better to wait to know all core boost, and rumors have the new chips at 150W, up significantly (but not far from Intel's real wattage, so). But just more info on things in the pipeline.

    Edit: But see this 10700K beating the 5800X
    https://hothardware.com/news/amd-ryzen-7-5800x-zen-3-benchmark-leak-parallel-single-threading
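The compounding arithmetic in the first edit above is quick to verify:

```python
# Sanity check of the compounded uplift: ~10% clock gain x ~15% IPC gain
# per the rumors quoted above.
clock_gain = 1.10  # rumored frequency uplift per gen
ipc_gain = 1.15    # rumored IPC uplift per gen
total = clock_gain * ipc_gain
print(f"combined uplift: {(total - 1) * 100:.1f}%")  # ~26.5%
```

The gains multiply rather than add, which is where the 26.5% (not 25%) figure comes from.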
     
    Last edited: Sep 29, 2020
    Rage Set likes this.
  29. electrosoft

    electrosoft Perpetualist Matrixist

    Reputations:
    2,766
    Messages:
    4,107
    Likes Received:
    3,945
    Trophy Points:
    331
Agreed, this is very stomach-turning IMHO.
     
    Papusan likes this.
  30. tps3443

    tps3443 Notebook Virtuoso

    Reputations:
    746
    Messages:
    2,421
    Likes Received:
    3,115
    Trophy Points:
    281
    Could someone clarify a few things for me on X299 platform?

    Mesh Voltage
    VCCIO Voltage
    VCCSA Voltage
    Uncore voltage

Increasing uncore voltage helps with memory overclocking. When enabling XMP profiles, it automatically increases in the BIOS. Or you can manually increase it up to +600, or around 1.500 volts. This is not the same as mesh voltage.

    Increasing Mesh voltage does nothing for me, it does not make a higher mesh frequency stable for me. And it doesn’t seem to help with overclocking at all. It just increases power consumption.

    Increasing VCCIO Voltage helps stabilize mesh overclocking. If I set x32 mesh at auto VCCIO, I get a BSOD loading windows. If I set 1.250V VCCIO x32 Mesh it is 100% stable.


A lot of articles online are very confusing and contradict almost everything I say here or have personally experienced. For example: increase mesh voltage for mesh overclocking, or increase VCCIO for memory overclocking. But that doesn’t seem to be the case.

    And is VCCSA used in X299 overclocking?
     
  31. ajc9988

    ajc9988 Death by a thousand paper cuts

    Reputations:
    1,755
    Messages:
    6,121
    Likes Received:
    8,849
    Trophy Points:
    681
    First, you should start by visualizing power planes. I wish they still published these types of slides where they could easily be searched:

    [​IMG]
    [​IMG]
    [​IMG]
    [​IMG]
     
  32. ajc9988

    ajc9988 Death by a thousand paper cuts

    Reputations:
    1,755
    Messages:
    6,121
    Likes Received:
    8,849
    Trophy Points:
    681
Now, on X299, the mesh is a mesh topology interposer on which the actual core chip sits. This isn't one of the smarter active interposer types, or a topology like the one AMD researched and plans to use in the future, or what Intel uses for Lakefield. It has its own speed and replaces the ring. The System Agent and the IOA and IOD are still there. Mesh has limits. Instead of staying on-chip on a ring, the data is routed down through silicon vias to the mesh interposer, and the traffic is then forwarded to its location. Mesh is fine for monolithic chips, but for chiplets, you can get traffic jams. That is a longer discussion and one I talked about at length years ago. Not going to rehash here and now.

Now, increasing SA and IO voltages can both affect memory stability. Increasing mesh voltage should help with mesh stability. And VCCIO obviously can help stabilize the signal, which can help a bit elsewhere aside from just memory.

    I just couldn't quickly find the power plane diagrams for X299. But that and uarch diagrams showing the interaction between these would help you to visualize what you are doing, possibly.
     
    Rage Set, Papusan and tps3443 like this.
  33. Mr. Fox

    Mr. Fox BGA Filth-Hating Elitist

    Reputations:
    37,235
    Messages:
    39,339
    Likes Received:
    70,654
    Trophy Points:
    931
    7960X and 7980XE with 32x mesh ran fine for me with cache/mesh voltage set at 1.300V. I set just about all of those voltages on that page in the BIOS manually, not leaving them set on Auto.

    Follow the color-coding. As long as the text is not red, you should be in the safe zone. You may need to increase the CPU Input voltage. Don't leave it on Auto. Set it to 2.000V unless you are going above 5.0GHz on all cores, then 2.100V is better. What I found success with (assuming you have adequate cooling) is to increase all voltages until they turned from yellow to pink and stop for more modest clock speeds, then for more aggressive, increase the values until the text turns red, then go back down one or two steps so it is in the pink, not in the red.

    I have found on the X299 systems (at least on mine, with the CPUs I had) that leaving things on Auto only works well sometimes if you use Auto for almost everything. The harder you push the system, the less using Auto works well. If you start doing a hodge-podge of custom and Auto voltage settings on that main screen it can throw off other things that are trying to automatically compensate for values that might not be ideal.
     
    Last edited: Sep 29, 2020
    Rage Set and Papusan like this.
  34. tps3443

    tps3443 Notebook Virtuoso

    Reputations:
    746
    Messages:
    2,421
    Likes Received:
    3,115
    Trophy Points:
    281
    I have my mesh voltage at 1.250V and my VCCIO at 1.250V.

    X33 Mesh is 100% stable. I have numerous screenshots of your bios from watching your videos. And I use them as reference quite often for tweaking.
     
    Rage Set, Papusan and Mr. Fox like this.
  35. Papusan

    Papusan Jokebook's Sucks! Dont waste your $$$ on Filthy

    Reputations:
    42,712
    Messages:
    29,847
    Likes Received:
    59,649
    Trophy Points:
    931
Nvidia put all brands' cards under the same hat. Similar to what Dell did for the burning mess: revised models with better components got the same firmware treatment as the burning-up cards. But Nvidia fixed it with their drivers. Most things can be fixed with some driver magic.

    Users of the r/nvidia subreddit report an end to crashing with their RTX 3080 cards in various games. Hardwareluxx.de editor Andreas Schilling reports that the new drivers change the voltage frequency curve on his Asus TUF RTX 3080, which would explain the slight difference in maximum GPU Boost clock speeds in some scenarios.

    upload_2020-9-30_0-2-6.png
     
    Last edited: Sep 29, 2020
  36. ajc9988

    ajc9988 Death by a thousand paper cuts

    Reputations:
    1,755
    Messages:
    6,121
    Likes Received:
    8,849
    Trophy Points:
    681
    So like apple slowing phones down for battery life!
     
    Rage Set and Papusan like this.
  37. Papusan

    Papusan Jokebook's Sucks! Dont waste your $$$ on Filthy

    Reputations:
    42,712
    Messages:
    29,847
    Likes Received:
    59,649
    Trophy Points:
    931
    What's Your Old Graphics Card Now Worth? techspot.com | September 29, 2020

    upload_2020-9-30_5-44-10.png
    upload_2020-9-30_5-51-4.png

    upload_2020-9-30_5-51-56.png
     
    Rage Set, Johnksss and electrosoft like this.
  38. tps3443

    tps3443 Notebook Virtuoso

    Reputations:
    746
    Messages:
    2,421
    Likes Received:
    3,115
    Trophy Points:
    281
    Mr. Fox likes this.
  39. Papusan

    Papusan Jokebook's Sucks! Dont waste your $$$ on Filthy

    Reputations:
    42,712
    Messages:
    29,847
    Likes Received:
    59,649
    Trophy Points:
    931
    Looks like I forgot this. But could someone from the @PremaMod Team put up a Cinebench 2003 score for the stage 4 HWBot Team Cup 2020 Cinebench 2003 Notebook? Short time left. I forgot it :(

    upload_2020-9-30_7-55-4.png

    Remember to add in the Bot wallpaper...

    https://hwbot.org/competition/team_cup_2020_CB2003/stage/5016_ddr4_cinebench_-_2003_(core)/

    Use Team Cup 2020 Background, download here: https://i.postimg.cc/NM7jpw4F/TC2020.jpg
    [​IMG]

    I have added this one... [​IMG] Premamod - 445.75 points - 34 pts

    And have this one
    upload_2020-9-30_8-0-24.png

    It looks like single results from a newer CPU with a higher score give fewer points in the competition. Maybe because of rare CPUs, or because many of the i5-8250U entries in the ranking get a higher score.

    Rookies Of United States - 131 points - 16 pts HWBot Team Cup 2020 Cinebench 2003 Notebook (4192 points with Intel Core i7 7820HK).

    Premamod - 445.75 points - 34 pts HWBot Team Cup 2020 Cinebench 2003 Notebook (3566 points with Intel Core i5 8250U).

    https://oc-esports.io/#!/round/team_cup_2020_CB2003/5016/ddr4_cinebench_-_2003_(core)

    ------------------------------------------------------------------------

    More benches.

    https://hwbot.org/submission/4564467_papusan_pcmark10_core_i5_2430m_2394_marks
    [​IMG]
    Either the Futuremark software has serious problems or the latest NVIDIA drivers for the GT 555M are really screwed up. PCMark 10 won't start properly and crashes after loading the benchmark with the newer/newest drivers. Only older drivers are usable.

    https://hwbot.org/submission/4562573_papusan_cinebench___r11.5_with_benchmate_core_i5_2430m_2.74_pts
    [​IMG]
     
    Last edited: Sep 30, 2020
    Rage Set, Johnksss and Mr. Fox like this.
  40. Mr. Fox

    Mr. Fox BGA Filth-Hating Elitist

    Reputations:
    37,235
    Messages:
    39,339
    Likes Received:
    70,654
    Trophy Points:
    931
    Last edited: Sep 30, 2020
    tps3443 and Papusan like this.
  41. Papusan

    Papusan Jokebook's Sucks! Dont waste your $$$ on Filthy

    Reputations:
    42,712
    Messages:
    29,847
    Likes Received:
    59,649
    Trophy Points:
    931
    Last edited: Sep 30, 2020
    Rage Set, Johnksss and Mr. Fox like this.
  42. Talon

    Talon Notebook Virtuoso

    Reputations:
    1,482
    Messages:
    3,519
    Likes Received:
    4,695
    Trophy Points:
    331
    Trust me, the whole thing was totally blown out of proportion. Some GPUs were crashing. The drivers were a bit of a mess, but this second driver is much better and I gained performance over the first one. Also, some of the quoted losses of 30MHz are a joke. 30MHz is run-to-run variance depending on ambient temps or the rendered scene changing in the slightest. These GPUs are power clamped and running at their limits out of the box, which causes larger clock fluctuations than we've seen on previous gens, so I don't put a lot of stock into a 30MHz loss. The outlets need something to talk about after they all claimed the sky was falling and it was a hardware issue that couldn't be resolved.

    Cross flashing is up and running, but not working at all for me so far. I cross flashed the FTW3 and Asus TUF OC vBIOS to the card, both of which have higher power limits, and no go. They work, but the power limits won't increase as they do on their native cards. Something is preventing the power slider from operating when a vBIOS is running on a card it wasn't intended for. The Asus TUF OC vBIOS is, however, giving me slightly more power at stock and therefore a couple hundred points higher on Port Royal.

    I plan to get rid of this card though. It's been fun and is more than enough for gaming even at 4K, but I want to try out the 3090 FTW3 or more likely the Strix 3090. Hopefully I can snag one soon.

    https://www.3dmark.com/3dm/51086371 -- Power-starved run with about 330-340W pull. I have a decent quality chip and memory, but the card can't maintain boost because of the PWL. I think Nvidia is playing games by forcing AIBs to clamp down the PWL on any 'reference' board. The TUF is technically an aftermarket board, as is the FE. They are treating the reference board as a non-A chip in a sense and locking out a higher PWL somehow, outside of a hardware mod with a crazy number of shunts. Ridiculous for a GPU that costs a premium over the FE and TUF cards, and it makes me somewhat sour towards EVGA TBH.
     
  43. Papusan

    Papusan Jokebook's Sucks! Dont waste your $$$ on Filthy

    Reputations:
    42,712
    Messages:
    29,847
    Likes Received:
    59,649
    Trophy Points:
    931
    Nvidia will still tune drivers around cards that don't work properly. Same as Dell. If a few have problems, they put all of them under the same hat. The only difference is that Nvidia does it with drivers (Dell = all got the same firmware that forced a lower GPU throttle point due to a few bad eggs. Revised or flawed ones, it doesn't matter - all of them have to be crippled). No mercy.

    So there you have it: Drivers Matter™. We suspected there was a good chance Nvidia could implement fixes without the need for firmware updates or hardware recalls. At the same time, the 456.55 drivers certainly aren't the end of the road. Nvidia will continue to tweak and tune its drivers, and there are probably still cards and/or games that have issues. As we noted earlier, higher factory overclocks often seem to be the main culprit, so if you do have an RTX 3080 or RTX 3090 that still isn't running fully stable, and you're not ready to return the card, try using MSI Afterburner, EVGA Precision X1, or a similar utility to drop the GPU clocks by 20-50 MHz. That might be all that's required to go from frustrating crashes to stable high-end gaming bliss.
    https://www.tomshardware.com/uk/new...roves-rtx-30-series-stability-with-new-driver

     
    Last edited: Sep 30, 2020
    Rage Set, Johnksss and Mr. Fox like this.
  44. Talon

    Talon Notebook Virtuoso

    Reputations:
    1,482
    Messages:
    3,519
    Likes Received:
    4,695
    Trophy Points:
    331
    Certainly possible, but I'm not hearing about crashing. All of the guys in my Discord have 3080s and 3090s of varying brands and not one of us has had a crash on either driver. It's not some widespread issue. It was essentially 2 brands having the issue, and it's been fixed. Many owners are reporting they can clock higher now with the new drivers, myself included.

    I don’t put much stock into Tom's Hardware. The fact that they talk about 1250mV on the curve shows me they have no idea what they’re talking about. The cards lock out at 1100mV on Ampere, and under realistic loads you'll be power limited to far less than 1V, making the changes nearly a non-issue. 15-30MHz isn’t something that makes any practical difference. Less than 1fps.
     
  45. ajc9988

    ajc9988 Death by a thousand paper cuts

    Reputations:
    1,755
    Messages:
    6,121
    Likes Received:
    8,849
    Trophy Points:
    681
  46. tps3443

    tps3443 Notebook Virtuoso

    Reputations:
    746
    Messages:
    2,421
    Likes Received:
    3,115
    Trophy Points:
    281
    @Papusan

    I almost bought these: 4x16GB Samsung B-die, 3600MHz 16-16-16. $10 more expensive than what I spent, for double the RAM - yep, I feel kinda stupid lol. He swapped the plastic Neo RGB light covers for the Royal Z covers.

    This is about as fast as it gets for 64GB of DDR4. I was a little concerned about overclocking dual-rank RAM sticks in quad channel, though.


    [​IMG]
    [​IMG]
    [​IMG]
     
  47. Mr. Fox

    Mr. Fox BGA Filth-Hating Elitist

    Reputations:
    37,235
    Messages:
    39,339
    Likes Received:
    70,654
    Trophy Points:
    931
  48. tps3443

    tps3443 Notebook Virtuoso

    Reputations:
    746
    Messages:
    2,421
    Likes Received:
    3,115
    Trophy Points:
    281
  49. Robbo99999

    Robbo99999 Notebook Prophet

    Reputations:
    4,346
    Messages:
    6,824
    Likes Received:
    6,112
    Trophy Points:
    681
    The "good" thing about this, though, is you can just increase the overclock offset, right? I mean, they've not capped the max allowable frequency, right? I'm genuinely asking, to be certain. Yes, looking at those 2 curves side by side, it's the top end of the curve they've changed more than the rest, so a simple offset overclock won't restore the whole curve to its prior state, but an increased overclock offset would make up the difference at the top of the curve... so this driver change shouldn't affect your max overclock, I'm thinking... because you'd just up the offset by +25MHz or whatever the difference is between those 2 curves at the top end. What do you reckon?
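
    The offset arithmetic here can be sketched in a few lines. The curve points and MHz values below are invented for illustration, not measured from any driver.

    ```python
    # Toy voltage/frequency curves (made-up numbers). The "new" driver trims
    # the top of the curve more than the lower points.
    old_curve = {0.80: 1800, 0.90: 1905, 1.00: 1995, 1.08: 2055}  # volts -> MHz
    new_curve = {0.80: 1800, 0.90: 1900, 1.00: 1980, 1.08: 2030}

    def apply_offset(curve, offset_mhz):
        """A flat overclock offset shifts every point by the same amount."""
        return {v: mhz + offset_mhz for v, mhz in curve.items()}

    # Up the offset by the difference at the top point to recover the max boost.
    offset = old_curve[1.08] - new_curve[1.08]   # 25 MHz in this toy example
    restored = apply_offset(new_curve, offset)
    ```

    The max boost point comes back, but the mid-curve points now sit above their old values, so the restored curve is not identical to the original - matching the observation that a flat offset can recover the top of the curve without restoring its whole shape.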
     
    Papusan, Johnksss and Mr. Fox like this.
  50. Mr. Fox

    Mr. Fox BGA Filth-Hating Elitist

    Reputations:
    37,235
    Messages:
    39,339
    Likes Received:
    70,654
    Trophy Points:
    931
    I think you are correct on that. Not locked down, just a lower default clock. As long as there is no impediment to manual override and no new throttling algorithms happening under the hood, a simple lowering of a default clock speed is not a big deal. We don't really know one way or the other yet what the behavioral differences are.
     