The Notebook Review forums were hosted by TechTarget, which shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information posted on the forums is preserved. For current discussions, many NBR forum users moved to NotebookTalk.net after the shutdown.
Problems? See this thread at archive.org.

    Intel Core i9-9900K 8c/16t, i7-9700K 8c/8t, i5-9600K 6c/6t 2nd Gen Coffee Lake CPUs + Z390

    Discussion in 'Hardware Components and Aftermarket Upgrades' started by hmscott, Nov 27, 2017.

  1. ajc9988

    ajc9988 Death by a thousand paper cuts

    Reputations:
    1,750
    Messages:
    6,121
    Likes Received:
    8,849
    Trophy Points:
    681
    It's not common sense to pay the price premium. Seriously, Millennials are cash strapped. So performance per dollar, and making sure it fits one's needs, is more important than fastest at all costs.
     
  2. Talon

    Talon Notebook Virtuoso

    Reputations:
    1,482
    Messages:
    3,519
    Likes Received:
    4,694
    Trophy Points:
    331
    Bingo. The power and heat required to run 8 cores at 5GHz is crazy. I am looking forward to more testing, but from my limited time I am pretty happy and amazed at what they managed with this aging process and chip. If a Ryzen 2700X were able to do 5GHz+ on its 8 cores, it would run just as hot and consume just as much power.

    You were spot on about the soldered chip though, man. I wish they didn't use soldered chips, honestly. I would rather delid and go LM. I could achieve 1-2C core differentials and better thermals. I might break down, delid the chip, and go with that direct-die cooler you linked me to earlier. I've been wanting to go custom loop for a while, but I'm a total noob in that area. Might need some guidance.
     
    ajc9988 and Mr. Fox like this.
  3. ajc9988

    ajc9988 Death by a thousand paper cuts

    Reputations:
    1,750
    Messages:
    6,121
    Likes Received:
    8,849
    Trophy Points:
    681
    Correction, AMD chips would run HOTTER at 5GHz. That is DICE and LN2 territory for their current process with GF and uarch.
     
    Mr. Fox likes this.
  4. Mr. Fox

    Mr. Fox BGA Filth-Hating Elitist

    Reputations:
    37,218
    Messages:
    39,333
    Likes Received:
    70,631
    Trophy Points:
    931
    I'm eager to see that. I hope you can do it soon.

    And custom loop is super easy, especially with flexible hose. A child of ordinary intelligence could do it. As long as you do not spill water on parts that have power to them, you should be golden. The important thing is to buy good quality parts and remember that bigger is better. Hard tubing is more challenging. It looks nice, but it's quite the pain in the butt if you take your system apart a lot like I do.
     
    Talon and ajc9988 like this.
  5. ajc9988

    ajc9988 Death by a thousand paper cuts

    Reputations:
    1,750
    Messages:
    6,121
    Likes Received:
    8,849
    Trophy Points:
    681

    This is a good example of a nice looking soft tube loop that could fit anyone's needs....
     
    Mr. Fox likes this.
  6. Papusan

    Papusan Jokebook's Sucks! Dont waste your $$$ on Filthy

    Reputations:
    42,706
    Messages:
    29,840
    Likes Received:
    59,617
    Trophy Points:
    931
    4.6GHz is too low. You should be able to run at least stock 9900K speeds, aka +25% more performance than you get today. You could wait and see what improvements Clevo makes for Z390, and adopt them for your machine.
    [​IMG]
     
    ole!!!, Mr. Fox and jclausius like this.
  7. Donald@Paladin44

    Donald@Paladin44 Retired

    Reputations:
    13,989
    Messages:
    9,257
    Likes Received:
    5,842
    Trophy Points:
    681
    Isn't the real question whether you prefer running 8 cores at, say, 4.8GHz, or 6 cores at 5.0GHz? Assuming what you are doing will use all 8 cores, won't 8 cores at 4.8GHz get more done than 6 cores at 5.0GHz?
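    As a quick sketch of that arithmetic (assuming throughput scales as cores × clock; a deliberately naive model that ignores IPC, memory, and scaling overhead, not a benchmark):

    ```python
    # Naive throughput model: aggregate "core-GHz" = cores * clock.
    # Ignores IPC, memory bandwidth, and imperfect multi-core scaling.
    def aggregate_ghz(cores: int, clock_ghz: float) -> float:
        return cores * clock_ghz

    eight_at_48 = aggregate_ghz(8, 4.8)  # 38.4 core-GHz
    six_at_50 = aggregate_ghz(6, 5.0)    # 30.0 core-GHz
    print(f"8c @ 4.8GHz: {eight_at_48}, 6c @ 5.0GHz: {six_at_50}")
    ```

    By this crude measure the 8-core part comes out roughly 28% ahead, but only when the workload actually scales across all the cores.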
     
    jaybee83, Robbo99999, ole!!! and 3 others like this.
  8. Papusan

    Papusan Jokebook's Sucks! Dont waste your $$$ on Filthy

    Reputations:
    42,706
    Messages:
    29,840
    Likes Received:
    59,617
    Trophy Points:
    931
    Exactly.
     
    Donald@Paladin44 and Mr. Fox like this.
  9. Mr. Fox

    Mr. Fox BGA Filth-Hating Elitist

    Reputations:
    37,218
    Messages:
    39,333
    Likes Received:
    70,631
    Trophy Points:
    931
    I think that falls into the personal-preference or ability-to-pay buckets. I do agree there is an element of common sense to it, but when you want the fastest regardless of cost, there is a common sense element to that as well. If you pay less and know you are going to get less (in terms of clock speeds), then expecting it to be the fastest demonstrates a lack of common sense. It's all relative to end-user wants, and the common sense can be applied from more than one angle.
     
    ajc9988 and Papusan like this.
  10. ole!!!

    ole!!! Notebook Prophet

    Reputations:
    2,879
    Messages:
    5,952
    Likes Received:
    3,982
    Trophy Points:
    431
    im just too spoiled at the moment so let me rant on lol. hard to swallow the fact that heat and physics are destroying my megahertz and now im megahurting.

    i coulda got a better binned 8700k but missed it thinking the 9 series would be better binned, now i gotta hope intel pulls another refresh with a better refined 14nm+++, or that a better cooling design comes around.
     
    Mr. Fox, ajc9988 and Papusan like this.
  11. Papusan

    Papusan Jokebook's Sucks! Dont waste your $$$ on Filthy

    Reputations:
    42,706
    Messages:
    29,840
    Likes Received:
    59,617
    Trophy Points:
    931
    Be happy you didn't jump on the soldered joke :D And the new one ain't much better. The 8700K is still a nice cpu.
     
    Last edited: Oct 29, 2018
    Mr. Fox likes this.
  12. ajc9988

    ajc9988 Death by a thousand paper cuts

    Reputations:
    1,750
    Messages:
    6,121
    Likes Received:
    8,849
    Trophy Points:
    681
    Now, I'll agree that anyone who doesn't know what they are paying a premium for is an idiot, or has unrealistic expectations. But speaking generally, not specifically about the computer market ATM, the highest priced item is not always the best. Sometimes brands like Alienware, Razer, or Apple charge an unjustified premium where you pay more but get less (also seen in other areas). Then there are Fluke and Fluke clones. The clones cost way less, and if you find a good one you can get equivalent quality to a Fluke for less, but then you have other factors like supporting US-built products, etc. But you get my point.

    Also, I definitely agree that anyone trying to say you get more performance from AMD than Intel on an equal-core-count system is stupid. Price premiums and price-point comparisons aside, Intel has the faster chips (meaning as a percentage of performance rather than literal frequency or IPC values, just so it is easier to compare for future gens where, like Ice Lake, density will be higher, so frequency may be lower, but an IPC boost would still put it ahead of the 14nm++ chips).

    Once price comes in, that is where I put the 9900K in no man's land, hanging out between mainstream and HEDT. I'd tell a person to take the 8700K (or keep it if they already have it) or the 9700K, or jump onto HEDT with a used 7900X (which should see some discounts on the used market with the new 9000X series coming), or grab the 1900X for about $300, the 1920X for $400, the 2920X for $650, or the 1950X for $680-700. That is assuming they have productivity workloads or are streamers, though. I also think the coming mainstream chips from AMD may finally put on pressure for a price war, where we will at least see Intel cut some prices to more reasonable levels. But that is around March to April next year.
     
    Papusan and Mr. Fox like this.
  13. Mr. Fox

    Mr. Fox BGA Filth-Hating Elitist

    Reputations:
    37,218
    Messages:
    39,333
    Likes Received:
    70,631
    Trophy Points:
    931
    Unless a person just wants it for giggles and can afford to subsidize their fetish, it would be silly to upgrade from 8700K to 9900K. This applies for most people, including gamers. Now, if they want it and have the money, then I say "God bless 'em" and go ahead and have whatever you want. Nothing wrong with that.

    In most cases you'd get more in-game FPS upgrading the GPU and running it at stock clocks than the CPU will add to the experience with a max overclock. If you go really nuts, you will actually lose FPS in gaming by buying the strongest CPUs with the highest count of cores and threads. Unless you are an avid overclocker/bencher, or you truly do need the extra cores/threads for some sort of special business purpose, an 8700K is more than adequate for just about anything that arises in the consumer, business and gaming space. Same can be said for all of the Ryzen 7 consumer X processors. But, many people have a hard time distinguishing wants from needs. There is nothing wrong with wanting something, but needs exist whether we like it or not. There are far fewer needs than there are wants.

    Edit: an excellent real world example is my 7960X. There are some things that, no matter how hard I try, I cannot match or beat my previous results with an 8700K and dual channel memory. At the same time, there are things I do with the overclocked 7960X and quad channel memory that completely and utterly obliterate any results I achieved with an overclocked 8700K. If you want something that always does the best at any task you put it to, you need to own several systems and only use the one that works best for a specific task that it excels at.
     
    Last edited: Oct 29, 2018
  14. Mr. Fox

    Mr. Fox BGA Filth-Hating Elitist

    Reputations:
    37,218
    Messages:
    39,333
    Likes Received:
    70,631
    Trophy Points:
    931
    Another example...
    [​IMG]
    [​IMG]

    If you're not sure which one is right for the job, then you should stay home and play with Legos instead. If you cannot figure out which one is more powerful, then you're just stupid.
     
    ajc9988 likes this.
  15. ole!!!

    ole!!! Notebook Prophet

    Reputations:
    2,879
    Messages:
    5,952
    Likes Received:
    3,982
    Trophy Points:
    431
    i was THIISSSSS close to jumping on the gt83vr soldered joke. mr.fox said not to compromise and here i am. 6 cores > 4 cores.
     
    jaybee83, Papusan, Robbo99999 and 2 others like this.
  16. ole!!!

    ole!!! Notebook Prophet

    Reputations:
    2,879
    Messages:
    5,952
    Likes Received:
    3,982
    Trophy Points:
    431
    did some quick comparison samples on caseking.

    8700k 5.1ghz was at 1.42v, 9900k 5.1ghz is at 1.36v.

    so there are definitely some silicon improvements there, except that 2 more cores at 1.36v makes it run too hot.

    i'll just take the plunge and buy one, and regret it later, who cares
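    Rough math behind why those two extra cores run hot, using a first-order CMOS dynamic-power model (P ∝ cores × f × V²; the clocks and voltages are the caseking bins quoted above, and the constant of proportionality cancels in the ratio):

    ```python
    # First-order dynamic power: P is proportional to cores * f * V^2,
    # so only the ratio between two configurations is meaningful here.
    def relative_power(cores: int, f_ghz: float, volts: float) -> float:
        return cores * f_ghz * volts ** 2

    p_8700k = relative_power(6, 5.1, 1.42)  # 8700K bin: 5.1 GHz @ 1.42 V
    p_9900k = relative_power(8, 5.1, 1.36)  # 9900K bin: 5.1 GHz @ 1.36 V
    print(f"9900K / 8700K power ratio: {p_9900k / p_8700k:.2f}")
    ```

    So even with the better silicon (0.06 V less), the 9900K dissipates roughly 22% more by this model, which lines up with the "too hot" observation.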
     
  17. jaybee83

    jaybee83 Biotech-Doc

    Reputations:
    4,125
    Messages:
    11,571
    Likes Received:
    9,149
    Trophy Points:
    931
    your wallet does! but your tech itch will be SO satisfied :p :D
     
    ole!!! and Mr. Fox like this.
  18. ajc9988

    ajc9988 Death by a thousand paper cuts

    Reputations:
    1,750
    Messages:
    6,121
    Likes Received:
    8,849
    Trophy Points:
    681
    So, as promised, the gaming results. Note, if gaming at 1440p or 4K, because those are more GPU limited, the differences are less pronounced; at 4K it doesn't matter which CPU you use, except for the WX series AMD chips. Also, the 1080p gaming was tested with Ultra settings, making it more realistic than the medium or low settings which show pure CPU gaming performance but in no way represent a real gaming scenario (this was adopted after Intel pushed testing at 720p and got laughed out of the room, except by PCPerspective and a couple smaller review outlets that ran with it, which led to testing at 1080p on low or medium settings to do the same thing). But, as seen, Intel takes the top of the charts.

     
    hmscott and Talon like this.
  19. Talon

    Talon Notebook Virtuoso

    Reputations:
    1,482
    Messages:
    3,519
    Likes Received:
    4,694
    Trophy Points:
    331
    Since I play a lot of Battlefield, those results are applicable to me. Ryzen and TR get absolutely clobbered in BF1, and when the BFV beta was tested the results were similar, even at 1440p. The 9900K is truly a beast all around and these gaming results are not surprising at all. It can do workstation tasks and game at the top of the hill. Definitely fills a niche for those who want top-end gaming performance and yet still want cores and threads to get the job done during the day. TR on the other hand can't keep up with high-end gaming but offers more performance in the other arena, although an overclocked 9900K isn't far behind a 1920X or even a 2920X, and it's cheaper to boot. I like how they're using the "inflated" prices for their comparison instead of the MSRP.

    https://www.walmart.com/ip/Boxed-In...-16M-Cache-up-to-5-00-GHz-FC-LGA14A/701836320 Just last week this was $499, and yes, people have taken delivery. Also, Amazon had it listed again just last week for $499. So using the inflated extortion prices isn't exactly fair, nor is saying they cost about the same. The motherboards are far cheaper on Z370/90, and the chip is also ~$150 cheaper when found at $499.
     
    win32asmguy likes this.
  20. Talon

    Talon Notebook Virtuoso

    Reputations:
    1,482
    Messages:
    3,519
    Likes Received:
    4,694
    Trophy Points:
    331
    Cross post but..

    This 9900K is running 5GHz without breaking a sweat on my crappy H100i V2.

    1.208v under load LOL. It just smiled back and asked for more. Looks like I'll shave down more voltage today and keep testing.

    https://imgur.com/a/hBkw8KB
     
    raz8020, hmscott, Robbo99999 and 5 others like this.
  21. ajc9988

    ajc9988 Death by a thousand paper cuts

    Reputations:
    1,750
    Messages:
    6,121
    Likes Received:
    8,849
    Trophy Points:
    681
    Maybe it is just being used to 16 cores and having a purpose for them, along with my limited gaming being at higher resolutions, that has jaded my perspective, but everything is based on what your display can do. If you do not have a 120Hz, 144Hz, or 240Hz display for 1080p, then you are gaming at 60Hz, no matter how high the number in the corner of the screen says you are rendering. The display is the limit. Looking at 1440p, many will have 60Hz displays, same with 4K, although 120Hz displays have existed at 1440p for a while and recently came out (or are about to) for 4K. I have a 4K display that does 4:4:4 chroma @30Hz and 4:2:0 or 4:2:2 @60Hz (don't judge, the TV is a couple years old and is a 50in VA display rather than IPS or TN). As I said, for my setup, jaded, although the display can do 1080p@120Hz 4:4:4 with HDR10 or Dolby Vision, IIRC. So the first step is matching the display and the graphics card, which is why having an 8700K and spending the extra $200 on the GPU seems more worthwhile, if being honest (unless CPU power is needed otherwise).

    But, I disagree; it keeps up just fine in GPU-bound workloads. I don't game much and don't game at high refresh rates, so I am not representative of the target audience, and I recognize that. But if a person is gaming at 1440p@60, I wouldn't say the TR or Ryzen chips can't keep up, as they clearly provide mid-50s and up even on the lows, and averages above 60fps. If your display cannot do above 60Hz, it seriously doesn't matter, and instead people should focus on the VRR implementation of the display, if present, and on what the 1% or 0.1% lows are on a given platform, as that can say whether frame drops will affect gameplay quality. And there are a couple games shown where that comes into play, although the WX chips are most affected on that front.
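    The "display is the limit" point reduces to a one-liner (the numbers here are hypothetical examples; without VRR, frames beyond the panel's refresh are simply never shown):

    ```python
    # Without VRR/adaptive sync, the panel shows at most refresh_hz frames
    # per second; anything the GPU renders beyond that is never displayed.
    def displayed_fps(rendered_fps: float, refresh_hz: float) -> float:
        return min(rendered_fps, refresh_hz)

    print(displayed_fps(130.0, 60.0))   # 60.0 -> a 60 Hz panel hides the extra frames
    print(displayed_fps(55.0, 144.0))   # 55.0 -> below the cap, the GPU is the limit
    ```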

    Then, as to "inflated" prices, you point to transient posts of lower prices. This is what I see on searches done this morning:
    upload_2018-10-30_7-17-41.png
    upload_2018-10-30_7-18-25.png

    So Amazon isn't even returning a result for the processor, and Newegg lists $580. As such, using the inflated pricing is fair, as it is representative of what is normally seen, rather than transient posts like Walmart's, which went out of stock immediately after posting at that price. If the average consumer cannot find the product at the specified price, then use the price they can actually find it at, rather than the MSRP, a delusional $480 that NO CONSUMER has found the product for, which makes it a number not worth mentioning. To be fair, I say ignore MSRP altogether and use the rate that a person will actually find on the market.

    As to the MB argument, sure, there are cheaper ones, but after seeing what happens on the 4-phase power-limited boards, and the reports from Silicon Lottery that the Maximus X Hero couldn't keep up with the 9900K, that argument doesn't hold unless you're buying boards for the 8700K or 9700K. Who would knowingly buy a board that has been shown to limit the power fed to the CPU? Now, the testing with those boards wasn't incorrect; rather, it exposed what happens when that limitation is present, and that a person should look to higher-phase-count boards. To be fair, I think Giga, which I personally don't buy, has a couple of the 12-phase boards in the high $100 range, and I know they have some in the low $200 range. For TR HEDT boards, you are looking at $500 if future proofing with the MEG Creation, or the Zenith Extreme or X399 Taichi, although with the Taichi you may need additional cooling for the VRM (the VRM itself is the same as the Zenith Extreme's). So definitely a good point there, although that puts getting a 1920X at $436 plus a $360 board (about $800; $1K for the 2920X and that board, and slightly above that for a 1950X) in the ballpark of a 9900K ($580) with a high-end board at around $300+ (around $900).

    But that depends on need. Meanwhile, you can pick up an 8700K for around $370, or a 9700K for $400-420, plus a cheaper board that can power those in the sub-$200 range, which puts the total around $600 or less, or $650-680 with something like the Maximus X Hero. That comes out to $200 in savings by going that route without really compromising on frame rate, giving you that much extra to spend on the GPU.
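    Tallying the street prices quoted above (approximate late-2018 USD; board figures are the ballpark numbers from this post, not exact listings, and the 9700K is taken at the $410 midpoint of its quoted range):

    ```python
    # CPU + motherboard platform totals, using the prices discussed above.
    builds = {
        "8700K + sub-$200 board": 370 + 190,   # ~$560
        "9700K + sub-$200 board": 410 + 190,   # ~$600 (9700K at range midpoint)
        "9900K + high-end Z390":  580 + 300,   # ~$880
        "1920X + X399 board":     436 + 360,   # ~$796
    }
    # Print cheapest platform first.
    for name, total in sorted(builds.items(), key=lambda kv: kv[1]):
        print(f"~${total}  {name}")
    ```

    The gap between the 8700K/9700K builds and the 9900K or 1920X platforms is the extra GPU budget the post is talking about.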
     
    Last edited: Oct 30, 2018
    Talon likes this.
  22. Talon

    Talon Notebook Virtuoso

    Reputations:
    1,482
    Messages:
    3,519
    Likes Received:
    4,694
    Trophy Points:
    331
    I believe the Maximus X Hero is an 8-phase board using two 4-phase sets with "doublers".

    The Maximus XI Hero is a "twin" 8-phase board, is certainly on the QVL list for CFL-R, and is driving my 9900K nicely overclocked without issue.

    But fair enough on those cheaper motherboards, I would avoid them anyway. It's similar to the situation where some TR1 boards can't keep up with the high-core-count TR2 chips, especially when overclocking, due to their inferior power delivery.

    I would argue most 1440p monitors today are 144hz minimum. Some being 165Hz+.

    Edit:

    It would be nice to see the supply issue fixed so the prices can "normalize". Just because I got a good deal on mine doesn't mean everyone can find it at those prices or make a deal with their local Microcenter. It helps to befriend those guys and get personal text messages when your CPU is ready for pick up haha.

    Maybe there is hope? https://www.digitimes.com/news/a20181030PD205.html
     
    Last edited: Oct 30, 2018
    hmscott and ajc9988 like this.
  23. ajc9988

    ajc9988 Death by a thousand paper cuts

    Reputations:
    1,750
    Messages:
    6,121
    Likes Received:
    8,849
    Trophy Points:
    681
    And all of that is fair enough. After the reports of power delivery issues, I haven't dug deep enough into which boards will be best for overclocking, instead noting which boards have been ID'd by reviewers or users as potentially problematic, showing large differences in power delivery, or noted as insufficient by binners like SL or caseking/Der8auer. Any mistakes in my statements come from lapses of memory or lack of hands-on experience with the products, ATM.

    As to it being on the QVL, that doesn't mean it isn't having issues with power delivery. The reason I brought that board up specifically is that I thought it was the board Linus used in his review. Could be wrong there, but something worth investigating.

    And definitely a fair point. Even with the Zenith Extreme barely passing, the Taichi with the same VRM FAILED immediately, showing that its heatsink isn't working as well. I water cool my Taichi, and can vouch that my VRM temps were shooting way up until I liquid cooled them, whereas TANWare's VRMs actually stay cooler than mine were at similar clocks (edit: before I liquid cooled; now mine are cooler, at 40C-50C under load, depending on ambient temps).

    And I do agree, many monitors do offer higher refresh rates now, and at more reasonable pricing, but a person who isn't rebuying the display likely has an older 1440p model, so support for high refresh rates varies. I don't want to go through all the considerations with displays, but I did want to mention the supported Hz of a display, as that is something many purchasers and gamers forget, focusing instead on the peak number. (Which is also why I complained about testing at 1080p medium, which no one would use unless they are really going for the 240Hz displays, etc., which are few and far between; Ultra settings at 1080p and 1440p are the most realistic, followed by medium to high settings for 4K, with some doing ultra at 4K, which is limited to 1080 Ti, 2080, and 2080 Ti owners.)

    But definitely fair points.
     
    Last edited: Oct 30, 2018
    Talon likes this.
  24. Mr. Fox

    Mr. Fox BGA Filth-Hating Elitist

    Reputations:
    37,218
    Messages:
    39,333
    Likes Received:
    70,631
    Trophy Points:
    931
    Some of them are fake refresh rates or factory overclocked.

    I have random weird issues with GeFarts driver instability on my Acer Predator XB271HU monitor running 1440p at 165Hz and 144Hz. When it is having a bad day and being temperamental, it only behaves correctly at 120Hz or a lower refresh rate. Other times it works great. In some things I also notice running 144-165Hz lowers benchmark scores compared to 120Hz. I think it has something to do with the G-Stink processor, and now I wish I had not purchased a monitor with G-Stink. I might look into whether there is a way to remove the G-Stink hardware and make it a "normal" monitor.
     
    ajc9988 likes this.
  25. Talon

    Talon Notebook Virtuoso

    Reputations:
    1,482
    Messages:
    3,519
    Likes Received:
    4,694
    Trophy Points:
    331
    ALWAYS disable g-sync when running any type of GPU benchmarking. The lower score with higher refresh rate is definitely confusing and something I don't think I've encountered before, but I could always run a benchmark to test the theory myself.
     
    Mr. Fox and ajc9988 like this.
  26. Mr. Fox

    Mr. Fox BGA Filth-Hating Elitist

    Reputations:
    37,218
    Messages:
    39,333
    Likes Received:
    70,631
    Trophy Points:
    931
    I have G-Stink disabled 24/7. I have no use for it at all and don't like it. I already knew that and regret buying a G-Stink monitor. But, hindsight is always 20/20. Should have ignored all the advice and gone with my gut that told me to continue saying no to G-Stink.

    I think the lower scores (which are random) are being caused by the G-Stink hardware (even though it is disabled, so it shouldn't be affecting scores) or the GeFarts drivers (regardless of version), or a combination of both. But, as I mentioned, it is random. Sometimes the display has artifacts at 144Hz and 165Hz, even just on the desktop or in web browsing, but when it does that, if I temporarily ENABLE G-Stink the artifacts go away. Then I can disable G-Stink again. That's why I think the G-Stink hardware might be causing an issue.
     
    mat89, jclausius and ajc9988 like this.
  27. ajc9988

    ajc9988 Death by a thousand paper cuts

    Reputations:
    1,750
    Messages:
    6,121
    Likes Received:
    8,849
    Trophy Points:
    681
    Yeah, I've heard that the newer drivers basically break performance with G-Sync. Recent drivers don't seem to be helping fix that problem, if Nvidia even gets around to fixing it at all. On one driver, I was getting screen tearing just browsing the internet with the display at 4K@30Hz (you don't need high refresh rates for browsing and typing or non-graphical workloads). First time I had that; I figured my GPU might be dying until it stopped with a different driver.

    I am definitely thinking driver issue. It is also why I don't think the couple-hundred-dollar chip for G-Sync 2, or whatever it is called, is worth it, especially with the upcoming HDMI 2.1 standard supporting VRR (same as AMD's FreeSync, but without the AMD approval), which also supports a wider range of frame variance.
     
    Mr. Fox likes this.
  28. Mr. Fox

    Mr. Fox BGA Filth-Hating Elitist

    Reputations:
    37,218
    Messages:
    39,333
    Likes Received:
    70,631
    Trophy Points:
    931
    I thought my GPU was dying as well. It seems to be an issue with any driver version. Once I figured out that toggling G-Stink on and off, or setting the refresh rate to 120Hz or less, stopped the problem, I started blaming the G-Stink hardware. It could also be that the display clock speed is too high for a chintzy monitor.
     
    ajc9988 likes this.
  29. Papusan

    Papusan Jokebook's Sucks! Dont waste your $$$ on Filthy

    Reputations:
    42,706
    Messages:
    29,840
    Likes Received:
    59,617
    Trophy Points:
    931
    You mean more like this? Unusually high failure rates for GeForce RTX 2080 Ti? [​IMG][​IMG][​IMG][​IMG]:D

    The problems reported are rather diverse, some have unexpected BSODs, other cards just die and other show artifacts. All this does not seem to be related towards tweaking and overclocking


    @ajc9988 Is it so that the lid on mainstream Ryzen chips can't be used with liquid metal in the same way as on the Threadripper chips?
     
  30. ajc9988

    ajc9988 Death by a thousand paper cuts

    Reputations:
    1,750
    Messages:
    6,121
    Likes Received:
    8,849
    Trophy Points:
    681
    I need more explanation on that, because the IHSs on all CPUs are nickel plated copper.

    Sent from my SM-G900P using Tapatalk
     
  31. Papusan

    Papusan Jokebook's Sucks! Dont waste your $$$ on Filthy

    Reputations:
    42,706
    Messages:
    29,840
    Likes Received:
    59,617
    Trophy Points:
    931
    Then it seems some of AMD's workers don't know their products. Almost like Dell :D
    [​IMG]
     
    Last edited: Oct 30, 2018
    Arrrrbol, ajc9988 and Mr. Fox like this.
  32. Mr. Fox

    Mr. Fox BGA Filth-Hating Elitist

    Reputations:
    37,218
    Messages:
    39,333
    Likes Received:
    70,631
    Trophy Points:
    931
    No, not the same. But, I was kind of concerned until I figured out what the deal was.

    I am kind of glad I haven't had the money to upgrade my GPU yet so I can let this RTX mess run its course using somebody else's money, LOL. Seems that most reports involve FE cards, and I have never had any interest in spending money on the FE GPUs.
     
    Arrrrbol, ajc9988 and Papusan like this.
  33. Talon

    Talon Notebook Virtuoso

    Reputations:
    1,482
    Messages:
    3,519
    Likes Received:
    4,694
    Trophy Points:
    331
    Almost all failures I've been reading about seem to be FE cards. Possibly a bad batch or maybe something far worse with the GPU design?
     
    Arrrrbol, Mr. Fox and Papusan like this.
  34. Robbo99999

    Robbo99999 Notebook Prophet

    Reputations:
    4,346
    Messages:
    6,824
    Likes Received:
    6,112
    Trophy Points:
    681
    I might have something similar with my 1080p 144Hz monitor. I was noticing a slight noisy graininess to the display on the desktop and occasionally in games, as well as what looked like poor texture filtering (blurry) on distant textures in game, which I managed to sometimes eliminate by turning G-Sync off/on, or sometimes by changing the refresh rate from 144Hz down to 120Hz and then back to 144Hz. This didn't always work. What I have found that works is turning off all power to my entire PC (monitor & desktop) at night; I leave it off all night and turn it all back on in the morning. This complete power cycling seems to have prevented the screen anomalies. I'm still impressed with 144Hz gaming on my monitor, but I am disappointed that these screens seem to be so finicky and temperamental. I thought it was just my own bad luck, but yours sounds temperamental too.
     
  35. Mr. Fox

    Mr. Fox BGA Filth-Hating Elitist

    Reputations:
    37,218
    Messages:
    39,333
    Likes Received:
    70,631
    Trophy Points:
    931
    That should come as no surprise, and shouldn't be viewed as unique to Dell or AMD. I think people not knowing their products is too common. Probably even the norm. Incompetence is rampant in all lines of business and industries now. Especially with leadership and customer-facing personnel, which is totally counter-intuitive.
     
  36. Mr. Fox

    Mr. Fox BGA Filth-Hating Elitist

    Reputations:
    37,218
    Messages:
    39,333
    Likes Received:
    70,631
    Trophy Points:
    931
    Dang, yeah... that sounds almost exactly the same. My system always seems normal after it has been turned off all night. And, you're right about toggling G-Stink not always fixing it.

    I have both DP and HDMI connected to my GPU now. When it starts having issues with the artifacting misbehavior, I use the control on the monitor to switch the input to HDMI and it instantly works correctly (but at 60Hz). Now I kind of wish I hadn't spent any money on a 1440p 165Hz G-Stink monitor. There was and is nothing wrong with my dual 60Hz 1080p ASUS monitor setup. I'm using them for work.
     
    Arrrrbol, Papusan and ajc9988 like this.
  37. saturnotaku

    saturnotaku Notebook Nobel Laureate

    Reputations:
    4,879
    Messages:
    8,926
    Likes Received:
    4,701
    Trophy Points:
    431
    Are you using an Acer monitor as well? I've never experienced anything like this with the Asus and Dell/Alienware G-Sync displays I've owned.
     
    Papusan likes this.
  38. Talon

    Talon Notebook Virtuoso

    Reputations:
    1,482
    Messages:
    3,519
    Likes Received:
    4,694
    Trophy Points:
    331
    Oddly enough I've never heard of this issue until now. I have a Dell 144Hz G-Sync 1440p, and my brother has the Acer 165Hz 1440p G-Sync and he also has no issues to report. I doubt it's related, but did you guys ever update your 1080 Ti DisplayPort firmware when Nvidia released that update a while back?
     
  39. Mr. Fox

    Mr. Fox BGA Filth-Hating Elitist

    Reputations:
    37,218
    Messages:
    39,333
    Likes Received:
    70,631
    Trophy Points:
    931
    I have never heard anything about this until now. Where do you get this firmware update? Got a link? That might have something to do with it.
     
  40. Talon

    Talon Notebook Virtuoso

    Reputations:
    1,482
    Messages:
    3,519
    Likes Received:
    4,694
    Trophy Points:
    331
    https://www.nvidia.com/object/nv-uefi-update-x64.html

    As always, proceed with caution. It never created any issues for me, but you know how companies can go and bugger **** up with firmware updates. This one simply updated the DP firmware to support the tech that was already there in hardware for high refresh rates and resolutions.
     
    Papusan, Mr. Fox, ajc9988 and 2 others like this.
  41. Robbo99999

    Robbo99999 Notebook Prophet

    Reputations:
    4,346
    Messages:
    6,824
    Likes Received:
    6,112
    Trophy Points:
    681
    I'm connected just through DP. So weird that we see this strange behaviour. It's quite subtle, so a person who isn't paying attention might not notice the screen isn't looking as good as normal; maybe that's why I've not seen lots of reports on it. I googled it but couldn't find anything related. The longer I leave my monitor plugged in (and on standby at night when my computer is off), the more I have a feeling the picture slowly gets worse and worse over a number of days/weeks until it reaches a plateau. Perhaps that's also why people don't notice: it creeps up. Turning everything off at the electrical wall outlet at night seems to reset everything back to hunky dory; I don't know how long it needs to be completely powered off to restore the picture quality, as I leave it off all night. Strange that there's not much to be found on this when googling.
     
    Papusan and Mr. Fox like this.
  42. Robbo99999

    Robbo99999 Notebook Prophet

    Reputations:
    4,346
    Messages:
    6,824
    Likes Received:
    6,112
    Trophy Points:
    681
    I've got AOC G2460PG 1080p 144Hz monitor.

    I've got a GTX 1070, not a GTX 1080 Ti, but what does that update actually do? I don't quite understand it. Is it gonna replace the vBIOS in my GPU?? I don't want it to do that, because I've got an "unofficial" vBIOS from Zotac for my Zotac GTX 1070 that allows for higher overclocking on Micron VRAM - they never released it officially for some reason.
     
  43. Talon

    Talon Notebook Virtuoso

    Reputations:
    1,482
    Messages:
    3,519
    Likes Received:
    4,694
    Trophy Points:
    331
    No, it doesn't mess with your vBIOS; it simply updates firmware elsewhere on the GPU to activate a hardware spec that has been there since release. NVIDIA had to update GPUs when the HDR/G-Sync/4K 144Hz monitors released, to support the required bandwidth over DisplayPort. Even with that update I was still running the XOC vBIOS on my 1080 Ti. It will work on your card too.
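    The bandwidth arithmetic behind that firmware update is easy to sketch. The Python below is a rough back-of-envelope estimate of my own (the function name and the no-blanking simplification are assumptions, not anything from NVIDIA's tool); real requirements run a bit higher once blanking intervals are included.

    ```python
    def required_gbps(width, height, refresh_hz, bits_per_channel=8):
        """Uncompressed RGB video data rate in Gbit/s, ignoring blanking."""
        return width * height * refresh_hz * bits_per_channel * 3 / 1e9

    # Effective link rates over 4 lanes, after 8b/10b encoding overhead.
    DP_EFFECTIVE_GBPS = {
        "DP 1.2 (HBR2)": 17.28,
        "DP 1.3/1.4 (HBR3)": 25.92,
    }

    qhd_165 = required_gbps(2560, 1440, 165)          # ~14.6 Gbit/s
    uhd_144_hdr = required_gbps(3840, 2160, 144, 10)  # ~35.8 Gbit/s

    # 1440p 165Hz fits within DP 1.2, but 4K 144Hz at 10-bit exceeds even
    # HBR3, which is why the first 4K 144Hz HDR monitors fell back to
    # 4:2:2 chroma subsampling at their top refresh rates.
    print(f"1440p 165Hz needs {qhd_165:.1f} Gbit/s")
    print(f"4K 144Hz HDR needs {uhd_144_hdr:.1f} Gbit/s")
    ```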
     
    Papusan, Mr. Fox and Robbo99999 like this.
  44. ajc9988

    ajc9988 Death by a thousand paper cuts

    Reputations:
    1,750
    Messages:
    6,121
    Likes Received:
    8,849
    Trophy Points:
    681
    Yeah, those are just bull posts on reddit and an idiot trying to post on the AMD forums. It's literally the same lid as Epyc, and der8auer showed how to remove LM with acid from a TR lid, and used LM on the 2P Epyc system he used to get the OC record for the Gamescom display.

    Edit: also, that is a forum post, not a response from amd employees

    Sent from my SM-G900P using Tapatalk
     
  45. Talon

    Talon Notebook Virtuoso

    Reputations:
    1,482
    Messages:
    3,519
    Likes Received:
    4,694
    Trophy Points:
    331
    I know this thread has already been derailed enough but does NBR have a discord?
     
    bennyg likes this.
  46. yrekabakery

    yrekabakery Notebook Virtuoso

    Reputations:
    1,470
    Messages:
    3,438
    Likes Received:
    3,688
    Trophy Points:
    331
    Is it compatible/necessary on mobile GPUs?
     
  47. Talon

    Talon Notebook Virtuoso

    Reputations:
    1,482
    Messages:
    3,519
    Likes Received:
    4,694
    Trophy Points:
    331
    No, it doesn't work on mobile; at least it never worked on my 1070 laptop.
     
    yrekabakery likes this.
  48. Robbo99999

    Robbo99999 Notebook Prophet

    Reputations:
    4,346
    Messages:
    6,824
    Likes Received:
    6,112
    Trophy Points:
    681
    Thanks, just went ahead and applied the update. The tool said an update was required, ran without issue, and wanted a restart. GPU still working fine; tested in Firestrike, nothing weird. I also ran the tool again afterward, just to check that the update had applied - it said I already had the latest one, job done. Whether it solves the weird panel issue I've been seeing, I don't know. I might leave my monitor on standby rather than unplugging it from the wall at night, to test whether it's made a difference to the weird screen issue. OK, this was off topic, sorry, I won't post any more about it.
     
    ajc9988 and Mr. Fox like this.
  49. Mr. Fox

    Mr. Fox BGA Filth-Hating Elitist

    Reputations:
    37,218
    Messages:
    39,333
    Likes Received:
    70,631
    Trophy Points:
    931
    Thank you, Brother @Talon . Whether it is relevant or not remains to be seen, but I needed it, so that's good by any measurement. And, I have randomly seen the bug with the black screen after POST, and what seems like a lock-up before the OS UI loads, as mentioned in the release highlights when using DP versus HDMI.

    [Attached screenshots: Capture.JPG, Capture2.JPG]

    Edit: I can see a difference there already, but it is hard to put my finger on why it seems different. In fact, I replaced my original DP cable with a high quality DP 1.4 cable about a week ago hoping to correct some of the glitches and it made no improvement. But, the screen already seems more responsive in some way. I am wondering if some kind of DP bandwidth limitation was causing it for me and @Robbo99999?
     
    Last edited: Oct 30, 2018
    Robbo99999, Papusan and Talon like this.
  50. Papusan

    Papusan Jokebook's Sucks! Dont waste your $$$ on Filthy

    Reputations:
    42,706
    Messages:
    29,840
    Likes Received:
    59,617
    Trophy Points:
    931
    From what I read... He contacted AMD :D And they say...

    What more can be said? :p
     
← Previous pageNext page →