The Notebook Review forums were hosted by TechTarget, which shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information posted on the forums is preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.
Problems? See this thread at archive.org.

    Brand new card, settings might be off?

    Discussion in 'Desktop Hardware' started by Gelynna, Jun 28, 2017.

  1. Gelynna

    Gelynna Notebook Enthusiast

    Reputations:
    0
    Messages:
    36
    Likes Received:
    4
    Trophy Points:
    16
    So, after years of coming around here off and on, I've finally felt like it was time to treat myself and build a decent desktop that can handle more intense art files, and do pretty well in gaming. I know some might tell me not to go lower than certain graphics cards, but I felt like the GeForce GTX 1050 will do just fine for now. I can always upgrade, but I was more worried about having a decent base build.

    My problem right now is that I'm noticing some blotching/artifacting when I view images with intense light-to-dark transitions. With games and images that have this intense fade-to-dark, they look improperly loaded, or like I'd saved an image file over a very slow internet connection. I don't notice it as badly on my recent sunset beach photos, but I do notice it on similar images that I look up. Perhaps that's because I'd transferred my own images directly rather than downloading lower-resolution copies.

    During the planning period for this build, my more intense-gaming friends, who have much nicer builds than me, told me that my setup was fine. I have a 400W power supply, and even though I have seen other sources say that even 450W isn't powerful enough, I'm not sure if that applies to this card. [ Edit: I'd seen that the minimum for the Ti version of my card is 300W, so maybe I can rule that out?] I've also looked at lowering the clock, but when testing that in a game to see if I notice a real difference, the quality often looks much lower. When I use Valley Benchmark on Ultra, it doesn't look like the game I've tested this on, nor like other images that require smoother blending. I'm on a TV monitor, but since I've used a Roku stick on it and the picture was nice and crisp, I know the TV itself isn't the problem.

    I do have the software installed to tweak the card settings if there's something that can help me adjust this. I also have the EVGA OC Scanner, but since I'm new to that, it'll take some more poking around Google first.
     
    Last edited: Jun 28, 2017
  2. Support.2@XOTIC PC

    Support.2@XOTIC PC Company Representative

    Reputations:
    486
    Messages:
    3,148
    Likes Received:
    3,490
    Trophy Points:
    331
    Loose connection somewhere? Is your FPS affected, or just the colors?
     
  3. Gelynna

    Gelynna Notebook Enthusiast

    Reputations:
    0
    Messages:
    36
    Likes Received:
    4
    Trophy Points:
    16
    Just the colors, and no loose connection. The only connection I had trouble with was the wireless antenna connectors, but I got that (after an hour of wanting to kill things). I'm double checking the driver updates, since in my install craze, I may have missed out on what I thought I had already taken care of. I was also talking with a couple friends, to help me think up any simple yet easy to forget tidbit. If this is it, then I had one of the biggest brain farts of this year!

    Edit: So I went back, checked updates, and had only one update through GeForce Experience. The last update available was from about 20 days ago. But even with that, and checking the settings in the Nvidia control panel, I can't find a tweak to fix this issue. I'm not sure how to describe what I'm seeing, nor how to show it off given it's only on my screen. Though, after playing WoW for about an hour, I noticed areas like Bloodmyst Isle looked too vividly red/purple, like a Fauvist painting, lol. However, in areas with lots of blue/green, I saw more green, or sections of color were more washed out.

    I still don't believe it's a connection issue, since I'd expect a "bad connection" type of screen rather than artifacting like a poorly loaded JPEG. This is what has me believing it might be something driver related, but I can't think of what else to check right now since I've been through the usual suspects.

    Edit 2: I was reading this thread [ link] which has similar issues to mine. I tried some of the fixes, even re-calibrating to see if that helped. While the images within the Windows calibration display looked better when I tried some different settings, I didn't notice any changes elsewhere. I doubt rebooting will help since I didn't make any driver changes, but I'll do it again anyway.
     
    Last edited: Jun 29, 2017
  4. Gelynna

    Gelynna Notebook Enthusiast

    Reputations:
    0
    Messages:
    36
    Likes Received:
    4
    Trophy Points:
    16
    I believe this is my real problem: HDTV with HDMI only being read as a TV instead of monitor via Nvidia.

    After even more digging around and using different keywords in my searches, I've found this helpful article. I seriously believe this may be the real issue, since I'm using an HDTV with HDMI input. It's a straight HDMI-to-HDMI cable, and with how the article talks about colors looking washed out, the concept of not getting the full RGB range makes perfect sense -- because that's exactly what I'm not getting. I'd actually already tried a lot of the solutions in the article, but none have worked yet. I even tried the toolkit. I need something that will force a full RGB range for sure.

    Also, when I view this image, I only see 1 shade of black.
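    One way to check black levels without hunting for test images is to generate your own. Here's a minimal Python sketch (purely illustrative, not from any of the linked articles) that writes a black-level test strip as a PPM file: eight patches stepping from code 0 up to 28. On a correctly configured display you should be able to tell most patches apart; if they all merge into one black, dark levels are being crushed.

```python
def write_black_level_strip(path="black_levels.ppm",
                            patches=8, step=4, patch_w=64, height=64):
    """Write a horizontal strip of near-black patches as a binary PPM.

    Patch n is filled with grey level n * step (0, 4, 8, ... 28 by
    default). Crushed blacks make the leftmost patches indistinguishable.
    """
    width = patches * patch_w
    with open(path, "wb") as f:
        f.write(b"P6\n%d %d\n255\n" % (width, height))  # PPM header
        row = bytearray()
        for x in range(width):
            level = (x // patch_w) * step               # patch grey level
            row += bytes([level, level, level])         # R, G, B
        f.write(bytes(row) * height)                    # repeat row

write_black_level_strip()
```

    Open the resulting file in any image viewer that understands PPM (or convert it to PNG first) and view it full screen in a dim room.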
     
  5. Support.2@XOTIC PC

    Support.2@XOTIC PC Company Representative

    Reputations:
    486
    Messages:
    3,148
    Likes Received:
    3,490
    Trophy Points:
    331
    Does your TV have other input types than HDMI? That might help determine if that's the cause.
     
    Gelynna likes this.
  6. Gelynna

    Gelynna Notebook Enthusiast

    Reputations:
    0
    Messages:
    36
    Likes Received:
    4
    Trophy Points:
    16
    I'm going to link instead of hotlink the image because it's big. [ This is what I have in the back.] I've tried the other HDMI port to no avail. I'm not familiar with the "component" inputs listed, but with the L and R right next to them, I want to assume they have to do with audio hookup. This TV is a Vizio with Surround Sound HD. [ This is what pictures look like to me.] It's a phone pic so that I could get the most accurate visuals. The TV has no other hookups that I know of.
     
  7. Support.2@XOTIC PC

    Support.2@XOTIC PC Company Representative

    Reputations:
    486
    Messages:
    3,148
    Likes Received:
    3,490
    Trophy Points:
    331
    The other stuff is component/composite links which are analog, not digital signals. The HDMI might be your only option. If there is a way to connect the system using a different cable and/or to a different display, that might help.
     
  8. Carrot Top

    Carrot Top Notebook Evangelist

    Reputations:
    74
    Messages:
    319
    Likes Received:
    274
    Trophy Points:
    76
    Do this:
    Nvidia Control Panel > Display > Change resolution > 3. Apply the following settings > Output dynamic range > Limited

    Full RGB is for monitors. Limited RGB is for TVs.
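    The Full/Limited distinction comes down to simple level math. As a rough illustration of the standard 16-235 "video levels" convention (my own sketch, nothing specific to the Nvidia driver), here's what each mismatch does:

```python
def full_to_limited(v):
    # Compress a full-range (0-255) code into the 16-235 "video levels" window.
    return round(16 + v * 219 / 255)

def limited_to_full(v):
    # Expand a limited-range code back to 0-255, clipping illegal values first.
    v = min(max(v, 16), 235)
    return round((v - 16) * 255 / 219)

# Mismatch 1: source sends Limited, display assumes Full -> washed out.
# Pure black arrives as code 16 and is shown as dark grey, not black.
print(full_to_limited(0))                       # 16

# Mismatch 2: source sends Full, display assumes Limited -> crushed blacks.
# Every shade from 0 through 16 collapses to the same displayed black.
print(limited_to_full(0), limited_to_full(16))  # 0 0
```

    When both ends agree on the range, the two mappings cancel out and the picture looks right; the problems above only appear when they disagree.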
     
  9. Support.2@XOTIC PC

    Support.2@XOTIC PC Company Representative

    Reputations:
    486
    Messages:
    3,148
    Likes Received:
    3,490
    Trophy Points:
    331
    Didn't think of that; I'm used to newer TVs that seem to get around this somehow.
     
  10. Carrot Top

    Carrot Top Notebook Evangelist

    Reputations:
    74
    Messages:
    319
    Likes Received:
    274
    Trophy Points:
    76
    Well, HDMI displays default to Limited, so HDTVs are fine. It's PC monitors that need to be set to Full (needs to be re-set every time you update the Nvidia driver), because Limited makes everything washed out.

    If you set your TV to Full, it won't look right either as blacks will be crushed.
     
  11. Gelynna

    Gelynna Notebook Enthusiast

    Reputations:
    0
    Messages:
    36
    Likes Received:
    4
    Trophy Points:
    16
    I didn't want to post again until I'd made an exchange for another monitor. I decided on a smaller 20" as my second. But here's the interesting part: after a straight-up plug & play, then setting it up as my #2, I notice the exact same artifacting. I'm using a DVI-to-DVI hookup on this one. It's an Acer K2, nothing fancy, but it does the job.

    Here's the funny thing: it defaulted to Full, so I automatically assumed the signal wasn't actually being pushed through as Full, since I was still getting the washed-out and crushed-blacks display. I'll set the TV to Limited. One of the articles I'd read actually said that a TV should have Full RGB forced on it. Guess they were wrong, heh.

    What I don't understand is why I'm also getting the same washed-out colors and crushed blacks on the Acer. It's 1080p and 32-bit by default. Should I customize this so that full RGB goes through? Granted, I may tinker with this in the meantime.

    I have tried putting the TV on Limited before, and didn't notice a difference. But now that I know it would be fine this way, I'll have another go at it.
     
  12. Carrot Top

    Carrot Top Notebook Evangelist

    Reputations:
    74
    Messages:
    319
    Likes Received:
    274
    Trophy Points:
    76
    PC monitor: Set to Full. Limited = washed out.

    HDTV: Set to Limited. Full = crushed blacks
     
  13. Gelynna

    Gelynna Notebook Enthusiast

    Reputations:
    0
    Messages:
    36
    Likes Received:
    4
    Trophy Points:
    16
    Actually, Limited on the HDTV didn't help. It still has crushed blacks. I've literally tried every resolution setting to see if it helped, but nothing did. Because this TV was used as a normal TV with a USB Roku stick just the other day, and I saw a nice, clear picture with no dither problems, I know the TV itself isn't bad. It's also not an old purchase. So I've ruled out the various hookups and hardware issues, and most of the basic driver/setting issues.

    What I'm also saying is that there are bits where the dithering is very choppy, and game-wise, or with certain high-dither photos, I'll see washed-out patches of color, or smooth dithering replaced by blues/cyans rather than what the colors should be. Given that most games and art shade warm brights with cool darks, this makes sense to me. Edit: I guess you could say I'm seeing crushed colors, not just crushed blacks.

    The only thing that helps me somewhat offset the over-vividness of this issue is toning the contrast in Nvidia down to 20% and Gamma over to 0.85. I also adjusted the Digital Vibrance down to 35%. The crushed blacks are tolerable now. But it makes my sunset photos, which I'd processed myself, look as if I'd saved a heavily compressed JPEG, regardless of what I do right now.

    After calibrating the PC monitor for color and toying a bit with the Nvidia settings, it looks fine given the quality of picture I already expect from it. Whatever fixes I can still make to the driver settings should help smooth things out.
     
  14. Gelynna

    Gelynna Notebook Enthusiast

    Reputations:
    0
    Messages:
    36
    Likes Received:
    4
    Trophy Points:
    16
    After spending a few more days with this, here are some things I'm noticing. Even on my PC monitor, which is a very basic, super-standard $80 model, I notice the same issues as on the TV. It's not as noticeable because it's almost 1/4 the size of my TV, and if I didn't know any better I'd assume it was the monitor's limitations. However, it's 1080p. The TV is more vivid and displays much larger, so naturally I notice the nuances. I want to believe that regardless of what I have, things should display smoothly.

    I've finally figured out the short way of explaining my problem: color banding. I saw this topic on GeForce, and they describe the same problem I'm having. However, the solutions that worked for the other posters aren't working for me. My smaller PC monitor is on a full DVI hookup, yet it color-bands as well.
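    As a quick numeric illustration of what color banding is (my own sketch, not from the linked topic): a smooth 8-bit gradient has 256 distinct levels, but truncating it to 6 bits per channel, as some panels and pipelines do, leaves only 64, so neighboring pixels jump in steps of 4 instead of 1 and form visible bands.

```python
# A smooth 8-bit ramp has 256 distinct levels; truncating to 6 bits
# per channel leaves only 64, so the ramp climbs in visible steps of
# four codes at a time -- those steps are the "bands".
gradient = list(range(256))
quantized = [(v >> 2) << 2 for v in gradient]   # keep the top 6 bits

print(len(set(gradient)), len(set(quantized)))  # 256 64
```

    The jump between adjacent bands is small in absolute terms, but on large smooth areas like skies and fades the eye picks out the contours easily.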
     
    Last edited: Jul 1, 2017
  15. namaiki

    namaiki "basically rocks" Super Moderator

    Reputations:
    3,905
    Messages:
    6,116
    Likes Received:
    89
    Trophy Points:
    216
    Does your TV menu have any such options to adjust the range or processing level of the image?
     
  16. Gelynna

    Gelynna Notebook Enthusiast

    Reputations:
    0
    Messages:
    36
    Likes Received:
    4
    Trophy Points:
    16
    It has basic TV picture settings, and my PC recognizes the TV as a PnP generic monitor. I even went to double check and make sure. However, my Acer monitor displays the exact same color-banding, so checking my TV's processing level won't help.
     
  17. Gelynna

    Gelynna Notebook Enthusiast

    Reputations:
    0
    Messages:
    36
    Likes Received:
    4
    Trophy Points:
    16
    On a whim, I decided to do more digging based on a search I hadn't tried yet: getting rid of color banding in Nvidia's driver itself. What I've been calling color banding is what I'd initially called posterization, but I wasn't sure that was the right term for graphics/driver issues, since we use that term in Photoshop. Here's a link to a Reddit comment update from a user who explained the exact same issue I'm having. Ironically, people told me I should go with Nvidia over AMD, but now I'm seriously considering exchanging my Nvidia card for an AMD one.

    I'll also quote the post, in case people are lazy. But the fix right now is "no Nvidia until they fix everything from the 900 series and up."

    "Sorry for the late reply, In the end I never heard back from anyone at Nvidia on this issue. But from what I found out it's because of the way the windows drivers apply dithering/ Or lack of dithering is more accurate. AMD cards and older Nvidia cards use dithering on 6bit/8bit displays to simulate 10bit colour giving a very smooth albeit not very accurate colour space. The controls for dithering are not available on the Geforce Cards from Nvidia and a solution has never been found. Your only options are to get a monitor that is True 10bit with display port so the driver sends out the correct colour. Or get an AMD card. which is what I will be doing when AMD finally release something around GTX 1080 levels."