The Notebook Review forums were hosted by TechTarget, which shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information posted on the forums is preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.
Problems? See this thread at archive.org.

    Nvidia Thread

    Discussion in 'Hardware Components and Aftermarket Upgrades' started by Dr. AMK, Jul 4, 2017.

  1. Papusan

    Papusan Jokebook's Sucks! Dont waste your $$$ on Filthy

    Reputations:
    42,701
    Messages:
    29,840
    Likes Received:
    59,615
    Trophy Points:
    931
    Nobody knows anything for certain yet. See etc. here. From the TechPowerUp database for the 2080 Mobile: bandwidth 384.0 GB/s.

    And I'm sure Nvidia will screw up this gen of mobile graphics. They started so nicely last time with Max-Q. Then the 1070 Ti for desktops.

    See also
    Nvidia Turing: Mobile RTX graphics chips 2050/2060/2070 announced during CES Guru3d.com | 12/10/2018 09:33 AM

    Over at the pending CES in Las Vegas, Nvidia might be introducing the first mobile offspring based on the Turing graphics architecture. If correct, the first Turing notebooks could already be available in about two months.
     
    Arrrrbol and hmscott like this.
  2. Talon

    Talon Notebook Virtuoso

    Reputations:
    1,482
    Messages:
    3,519
    Likes Received:
    4,694
    Trophy Points:
    331
     
    hmscott likes this.
  3. hmscott

    hmscott Notebook Nobel Laureate

    Reputations:
    7,110
    Messages:
    20,384
    Likes Received:
    25,139
    Trophy Points:
    931
    Not enough of a market for software developers to deliver optimized SLI / Crossfire, and gone are the days of developers all trying to do it in the final optimization before they get moved to new projects.

    It's too bad, the hardware is there, and Nvidia has even faster NVLink hardware.

    DX12 allows better multi-GPU support, but that hasn't happened yet either.

    Adding work for developers is a sure way for such features to get trimmed from the development cycle in order to release on time. Developers get switched to new projects as each one moves toward release, so there's no going back and spending resources to add features, unless the devs are personally motivated.

    Nvidia / AMD can fund such things, encouraging developers, but they can't do it for every game in development, or even for enough of them to make it worthwhile. Hairworks isn't in everything, and neither will RTX features like DLSS or RT be, moving forward.

    SLI / Crossfire is a good heads-up example of how much support cool new hardware features will get from developers: even if you provide great hardware implementations over a long time, it's still not adopted. So it goes for RTX features.
     
    Last edited: Dec 10, 2018
    Talon likes this.
  4. Talon

    Talon Notebook Virtuoso

    Reputations:
    1,482
    Messages:
    3,519
    Likes Received:
    4,694
    Trophy Points:
    331
    By and large, DX12 and mGPU have been a huge disappointment, unfortunately. DX12 was supposed to bring, among other things, mGPU support with different GPUs. SLI/Crossfire hasn't been worth it for years though, which is sad. I never liked SLI myself because I was able to perceive the stutters, and there was the frustration of even a single game not working properly or at all with mGPU. I tried it twice in the last few years and both times I had instant regret. Never again, well, unless something really changes with SLI/Crossfire support.

     
    hmscott likes this.
  5. Robbo99999

    Robbo99999 Notebook Prophet

    Reputations:
    4,346
    Messages:
    6,824
    Likes Received:
    6,112
    Trophy Points:
    681
    Cool, but how much are these things gonna cost? Can you even IMAGINE how much a laptop with these would cost, given how expensive the desktop cards already are!
     
  6. Papusan

    Papusan Jokebook's Sucks! Dont waste your $$$ on Filthy

    Reputations:
    42,701
    Messages:
    29,840
    Likes Received:
    59,615
    Trophy Points:
    931
    It looks like AMD has nothing to offer.
    Ex-Hardware.fr GPU Editor Damien Triolet Jumps Ship from AMD RTG to Intel
    Techpowerup.com | Today, 23:03
    Oh hey, remember this news post from July last year? Damien Triolet's work history of late has been one of many such recent stories. These tend to begin with AMD, and RTG in particular, getting a cash infusion and growing in 2016 and 2017 to where they hired some of the best engineers and marketing personnel from the industry, media or otherwise. This follows a more stagnant GPU division in 2017-2018, Intel deciding to dip their toes back into the discrete GPU market, and in turn persuading many to cross over to the blue side.

    According to Damien's LinkedIn and Facebook profiles, he started working for Intel on November 26, 2018 in a technical marketing position in their Gaming and Graphics division, a role analogous to his from his days at AMD. Presumably, he joins Raja Koduri and the many others who have followed this exact path of late, and everyone remains curious as to what the finished retail product will be. In the meantime, we here at TechPowerUp wish him the best again for his new venture. We had the pleasure of interacting with Damien on multiple occasions in the past, some as colleagues in the media giving hardware manufacturers a hard time, and others when he was hosting us as an AMD employee. His tenure at Hardware.fr has been inspiring to us, with excellent reviews that no doubt were what caught the eyes of AMD in the first place, and Intel will definitely gain from his presence.
     
  7. ajc9988

    ajc9988 Death by a thousand paper cuts

    Reputations:
    1,750
    Messages:
    6,121
    Likes Received:
    8,849
    Trophy Points:
    681
    That is a marketing guy, just like the last one and just like Ryan Shrout. Wake me when it's engineers; the only one of those they took was Raja.

    Sent from my SM-G900P using Tapatalk
     
    hmscott and bennyg like this.
  8. Papusan

    Papusan Jokebook's Sucks! Dont waste your $$$ on Filthy

    Reputations:
    42,701
    Messages:
    29,840
    Likes Received:
    59,615
    Trophy Points:
    931
    Put it this way... Desktop cards have met/approached the mobile graphics price point. But expect to pay a premium, at least for the "high end" mobile version. Both for the Max-Qripple and for the normal mobile version of the cards.
     
  9. hmscott

    hmscott Notebook Nobel Laureate

    Reputations:
    7,110
    Messages:
    20,384
    Likes Received:
    25,139
    Trophy Points:
    931
    Intel's got stacks of cash and lots of bad decisions to fund, so that's a logical place to go.

    Maybe one of them can save Intel. I doubt it. :(
     
    Last edited: Dec 11, 2018
  10. Robbo99999

    Robbo99999 Notebook Prophet

    Reputations:
    4,346
    Messages:
    6,824
    Likes Received:
    6,112
    Trophy Points:
    681
    Yep, exactly!
     
    Papusan likes this.
  11. Talon

    Talon Notebook Virtuoso

    Reputations:
    1,482
    Messages:
    3,519
    Likes Received:
    4,694
    Trophy Points:
    331
    DLSS is now in its first game, with the RTX 2080 Ti delivering up to 2x the FPS of the 1080 Ti with better image quality.



    DLSS has been added to the PUBG config file; I am willing to bet we'll see DLSS implemented in PUBG with the "official" release of the new snow map in the next big update.
     
  12. hmscott

    hmscott Notebook Nobel Laureate

    Reputations:
    7,110
    Messages:
    20,384
    Likes Received:
    25,139
    Trophy Points:
    931
    It's not out yet (test / Beta isn't a release), so don't increment your DLSS count from Zero, not quite yet. ;)

    The Beta is Monday, with the release scheduled for December 19th for PC, and if they get DLSS out in that update, Nvidia will squeak by with 1 DLSS game in 2018, but where are the other 24 games promised at release??

    But this PC release info says nothing about DLSS:

    PUBG Vikendi Snow Map Release Date for PC, PS4, and Xbox One Revealed
    Rishi Alwani, 10 December 2018
    https://gadgets.ndtv.com/games/news...date-for-pc-ps4-and-xbox-one-revealed-1960387

    None of the search results for the pubg snow map + DLSS show DLSS in any recent info...

    This early reddit thread naively thinks it's going to be available when the RTX cards are released:

    When will PUBG support DLSS?
    https://www.reddit.com/r/PUBATTLEGROUNDS/comments/9crucr/when_will_pubg_support_dlss/?sort=new

    NightKev 3 points 3 months ago
    "It was already announced that PUBG will have DLSS support by Nvidia, so presumably the day the RTX cards are released or whenever the next PUBG update after that is."
     
    Last edited: Dec 13, 2018
  13. Talon

    Talon Notebook Virtuoso

    Reputations:
    1,482
    Messages:
    3,519
    Likes Received:
    4,694
    Trophy Points:
    331

    Nope, the DLSS patch is already out, as are the drivers, so DLSS is in a game. Users are reporting higher image quality, and watching the video: RIP 1080 Ti. Getting near double the FPS at 4K with better image quality is nothing to sneeze at. It's HUGE.

    In fact I think it's in 2 games now, including that Chinese game that released a while back as a non-US release.

    As for those other games, I'm sure we will eventually get patches for them. Some of those games aren't even out yet.
     
    c69k and hmscott like this.
  14. hmscott

    hmscott Notebook Nobel Laureate

    Reputations:
    7,110
    Messages:
    20,384
    Likes Received:
    25,139
    Trophy Points:
    931
    Why Does Everyone Hate RTX Graphics Cards?
    Joker Productions
    Published on Dec 11, 2018
    Check out the Comments Section for this video. :)
    Ever since the NVIDIA RTX cards were announced in August there has been a huge amount of backlash over the price, performance and lack of games supporting DXR.
     
    Talon likes this.
  15. Dr. AMK

    Dr. AMK Living with Hope

    Reputations:
    3,961
    Messages:
    2,182
    Likes Received:
    4,654
    Trophy Points:
    281
    DFKI - How AI Can Improve Disaster Management
     
    Vasudev likes this.
  16. hmscott

    hmscott Notebook Nobel Laureate

    Reputations:
    7,110
    Messages:
    20,384
    Likes Received:
    25,139
    Trophy Points:
    931
    DLSS FFXV Patch is out. 15.4 GB
    Submitted 1 day ago by BnanaRepublic
    https://www.reddit.com/r/nvidia/comments/a5oln3/ffxv_patch_is_out_154_gb/

    apoppin Editor 2 points 1 day ago
    "Where is the option to enable DLSS? I see the new quest.
    Nevermind - I found it under Anti-aliasing - you must be at 3840x2160 resolution to enable it. :O"

    Aemony 6 points 1 day ago
    DLSS 1x uses a render resolution of 1440p alongside an upscaling algorithm to simulate 3840x2160 running with some 64xSSAA or some other antialiasing.

    The reason it is only available when 4K is selected is because the algorithm is only trained to handle 1440p->2160p upscaling. It can’t upscale any other resolution than 2560x1440 to a simulated 3840x2160 resolution.
    So it only being available when 4K is selected both gives an insight into its limitations but also its intended user-purposes (it is a cheaper alternative than running native 4K). Having to select 1440p and then DLSS would’ve been more confusing to consumers since Nvidia is pushing it as a cheaper 4K alternative.

    I both agree and disagree with Nvidia’s choice myself, as having it only available when 4K is selected also inadvertently gives consumers the false notion that it is running at native 4K render resolution, which just isn’t happening. DLSS 2x is another mode Nvidia have that does use a native 4K render resolution, but said mode is so slow it isn’t meant for real-time applications (it compares horribly vs. even native 4K without DLSS 1x).

    That said, 1440p + DLSS 1x is in many ways more comparable to a native 4K + other post-processing anti-aliasing filters than anything else, I guess, so.... ¯\_(ツ)_/¯
    Freeloader 4 points 19 hours ago
    " so DLSS is only for 4k resolutions ? way to go Nvidia, you forgot to mention that in your keynote also huh"

    Final Fantasy XV DLSS versus TAA IQ and Performance Analysis
    By Mark Poppin - December 13, 2018
    https://babeltechreviews.com/final-fantasy-xv-dlss-vs-taa-review/

    "...After good work from NVIDIA’s deep learning using DLSS and the latest GeForce 417.35 diver, together with Square Enix large content update, Final Fantasy V becomes much more fluid and playable at 4K on the very highest settings with the RTX 2080 Ti and DLSS.

    If you want to see DLSS for yourself in action you will need a Turing RTX GeForce video card. We also expect that a RTX 2080 will also be able to play at 4K/DLSS with some setting compromises, and perhaps with a GSYNC enabled display, although RTX 2070 gamers can not. We also look forward to the other 24 games for which DLSS support has been promised."

    The Worst CPU & GPU Purchases of 2018
    Hardware Unboxed
    Published on Dec 13, 2018
    RTX Section starts at 09:00...

    Intel XE GPU Announced + NVIDIA RTX Mystery Unboxing

    Joker Productions
    Published on Dec 13, 2018
    Intel has revealed their GPU roadmap for discrete graphics cards in 2019 & 2020 with Gen 11 and Intel XE. Also, a mystery package from NVIDIA is UNBOXED!
     
    Last edited: Dec 14, 2018
    Vasudev likes this.
  17. Vasudev

    Vasudev Notebook Nobel Laureate

    Reputations:
    12,035
    Messages:
    11,278
    Likes Received:
    8,814
    Trophy Points:
    931
    So that was the reason Nvidia's CEO kept quoting 4K... 4K... 60 FPS... 60 FPS... on every scene in the keynote.
     
    hmscott likes this.
  18. Talon

    Talon Notebook Virtuoso

    Reputations:
    1,482
    Messages:
    3,519
    Likes Received:
    4,694
    Trophy Points:
    331


    DLSS is amazing. The image quality is far better, with better-looking blacks, and it doesn't have the blur that TAA creates, all while getting better FPS. DLSS is certainly the future.
     
    Vasudev, Robbo99999 and hmscott like this.
  19. hmscott

    hmscott Notebook Nobel Laureate

    Reputations:
    7,110
    Messages:
    20,384
    Likes Received:
    25,139
    Trophy Points:
    931
    I could have sworn Nvidia also discussed running games through the Nvidia Artificial Intelligence "Gonculator" (the Nvidia Saturn V DGX-based supercomputing cluster) at other resolutions to create the rule-sets needed for output other than 4K, to be installed through Nvidia GeForce Experience updates - either as part of the drivers or as a new class of update for DLSS rule-sets.

    Hogan's Heroes S04E02 Klink Vs the Gonculator
    Robert E. Santiago
    Published on Jan 14, 2017
    Hogan bamboozles Burkhalter and Klink into believing that Carter's rabbit trap is a "Gonculator": a top secret contraption that will help Germany win the war. Great episode!

    Could this be why Nvidia is so darned late delivering DLSS? Why it's taking so long to deliver DLSS rule-sets for games? Is it because the Nvidia DLSS "Gonculator" is fritzing out? :D
     
    Last edited: Dec 17, 2018
    Vasudev likes this.
  20. Talon

    Talon Notebook Virtuoso

    Reputations:
    1,482
    Messages:
    3,519
    Likes Received:
    4,694
    Trophy Points:
    331
    If you're referring to only 4K being available, that's correct for now. 1440p should be coming, as this is a "beta" version of DLSS in the game. As seen with BFV, I'm sure we will see improvements and updates. :)

    If you want to use DLSS now with a 1440p monitor, simply select DSR, launch the game and voilà, it works on 1440p monitors.

    https://imgur.com/a/IwHpkUE#Yxc3r00
     
    hmscott likes this.
  21. hmscott

    hmscott Notebook Nobel Laureate

    Reputations:
    7,110
    Messages:
    20,384
    Likes Received:
    25,139
    Trophy Points:
    931
    Yeah, someone else in that reddit thread mentioned selecting DSR to get around the 4K limit, but that is silly: DSR reduces performance only to then invoke DLSS to save performance? One negating the other, ending in reduced FPS overall plus double scaling?

    DSR + DLSS seems like a "Rube Goldberg device", and a waste of time making unnecessary contortions on the data to fit an incomplete DLSS implementation.
    [Image: Rube Goldberg inventions USPS stamp]
    DLSS as a service is not viable until Nvidia provides native-resolution DLSS rule-sets for each output resolution, not just 4K. Maybe limit rule-set processing to 4K, 1440p, and 1080p output resolutions. But DLSS would be best if there were seamless scaling across any output resolution, fueled by rule-sets scaling all the way up to the 8K DLSS source rendering.
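    To put some numbers on that double-scaling chain, here's a minimal back-of-the-envelope sketch in Python - assuming the 2.25x DSR factor needed to expose 3840x2160 on a 2560x1440 panel, and the 1440p DLSS internal render resolution described earlier in the thread:

        # DSR workaround on a 1440p panel: the GPU ends up shading roughly native-1440p worth
        # of pixels anyway, just with an extra upscale (DLSS) and downsample (DSR) in the chain.
        panel       = (2560, 1440)                              # native monitor resolution
        dsr_target  = (panel[0] * 3 // 2, panel[1] * 3 // 2)    # 2.25x DSR (1.5x per axis) -> (3840, 2160)
        dlss_render = (2560, 1440)                              # DLSS 1x internal render at a 4K target

        px = lambda r: r[0] * r[1]
        print("panel:", panel, px(panel))
        print("DSR target the game thinks it outputs:", dsr_target, px(dsr_target))
        print("DLSS internal render:", dlss_render, px(dlss_render))
        # shaded pixels ~= native 1440p, plus a 1440p->2160p upscale (DLSS)
        # and a 2160p->1440p downsample (DSR) - the "double scaling" above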
     
    Last edited: Dec 15, 2018
    Talon likes this.
  22. hmscott

    hmscott Notebook Nobel Laureate

    Reputations:
    7,110
    Messages:
    20,384
    Likes Received:
    25,139
    Trophy Points:
    931
    Laptop RTX Discussion
    PCWorld
    Streamed live on Dec 12, 2018
    Nvidia RTX Laptop discussion starts @ ~35:20
    Join The Full Nerd gang as they talk about the latest PC hardware topics. In today's show we talk about Intel's big CPU plans for 2019 and beyond, rumors of Nvidia RTX in laptops, and Corsair's wild and fake RGB RAM. As always we will be answering your live questions so speak up in the chat.
     
    Last edited: Dec 15, 2018
  23. hmscott

    hmscott Notebook Nobel Laureate

    Reputations:
    7,110
    Messages:
    20,384
    Likes Received:
    25,139
    Trophy Points:
    931
    The Disappointment PC 2018: Worst Parts of the Year
    Gamers Nexus
    Published on Dec 14, 2018
    Nvidia RTX release discussion starts @ 09:10, but don't miss the haunting "RTX just works" intro at the beginning of the video, from 00:00 - 03:00. :)
    This PC has the worst parts of 2018, combining into the 2018 Disappointment PC. Starring RTX and featuring Walmart, et al.
    Buy the Disappointment Build Shirt in cotton ( http://geni.us/EVMKAWK) or Tri-Blend ( http://geni.us/lpc5a3Y) to support GN & commemorate a year of letdowns!

    The Disappointment Build is specially accompanied by our most complicated short feature film to-date, including a custom mini-game we made in Unreal Engine 4 to show the "RTX Off" world with PS1 graphics, offering a cameo of the PS1 Classic disappointment. The film includes several remarkable quotes from 2018, eventually leading into our more detailed walkthrough of the components that let us down the most for the year. To support future high-effort content like this, please consider buying one of our Disappointment PC shirts (linked above) or donating via Patreon: http://www.patreon.com/gamersnexus

    PART 1 2017: https://www.youtube.com/watch?v=9tbD-...


    Nvidia’s Stock Plunges, Largest Investor Rumored to Exit
    by Nathaniel Mott December 13, 2018 at 11:50 AM
    https://www.tomshardware.com/news/nvidia-gpu-stock-softbank-shares,38243.html

    "What happens if the golden goose starts laying regular eggs? That’s probably a good thing for someone who’d rather eat an egg than melt one down, but for the goose’s owner, going back to selling food instead of precious metal would suck.

    This is pretty much what’s happening with Nvidia. The company’s share price continues to plummet from the highs reached during the cryptocurrency mining craze when eager miners purchased so many graphics cards the entire market changed. Now SoftBank, its largest backer, is reportedly mulling dumping its stock in the graphics giant.
    You probably know what happened next: Nvidia increased supply of its GPUs right as miners lost interest because it wasn’t really feasible to turn a profit anymore. The company thought it would finally meet demand; it ended up exceeding it.

    This was good for enthusiasts—they could finally buy graphics cards for reasonable prices instead of having to watch miners push them out of the market. But for Nvidia, the missed expectations led to oversupply and then to falling share prices.

    The golden goose had gone back to laying edible eggs. Now, according to Bloomberg, that change has inspired SoftBank to sell its stake in Nvidia for a $3 billion profit. (Although the deal isn’t finalized and the investment group’s plans could change.)

    SoftBank’s motivation is clear. Nvidia’s share price has fallen from its $289.36 peak to $147.83 at time of writing. Investment groups don’t hold on to stock like that when the peak’s conditions—in this case, the Ethereum boom—are unlikely to be replicated.

    Nvidia has other problems, too. Some 20 percent of its revenue came from China, and with rising tensions between the U.S. and the country responsible for one-fifth of Nvidia’s revenues, investors are probably concerned by that as well.

    Combine all that with the dubious reaction to RTX graphics cards and their up to $1,200 price tags and things get even murkier. Should the company exacerbate its crypto blunder by eating the cost of existing mid-range cards to offer new options?

    We don’t know. But it seems clear that Nvidia’s share price is unlikely to stabilize any time soon. There are too many factors at play, from the crypto market to America’s relations with China, to predict how the company will fare heading into 2019.

    At least we know what happens if the golden goose can’t live up to its promise: its owner sells it off to someone who doesn’t mind eating a few eggs while they wait to see if things go back to the way they were. It still sucks for the goose, though.
    Comments
    https://www.reddit.com/r/AyyMD/comments/a62gug/woohoo/
     
    Last edited: Dec 15, 2018
  24. Dr. AMK

    Dr. AMK Living with Hope

    Reputations:
    3,961
    Messages:
    2,182
    Likes Received:
    4,654
    Trophy Points:
    281
    RED Digital Cinema and NVIDIA Make 8K Movie Editing a Reality
     
    hmscott likes this.
  25. Robbo99999

    Robbo99999 Notebook Prophet

    Reputations:
    4,346
    Messages:
    6,824
    Likes Received:
    6,112
    Trophy Points:
    681
    4K DLSS definitely looks better than the 4K TAA there; 4K TAA is blurry in comparison whilst also running at a lower framerate. There was one scene where I spotted more visible jaggies with DLSS than with 4K TAA, but I'd take the increased framerate and lack of blurriness in contrast to TAA - I've always hated TAA. Speaking of TAA, has BF V finally managed to include the option to turn off TAA? Last I heard it was enabled by default and was impossible to turn off, as it was an intrinsic part of the rendering engine or something!
     
    hmscott and Talon like this.
  26. yrekabakery

    yrekabakery Notebook Virtuoso

    Reputations:
    1,470
    Messages:
    3,438
    Likes Received:
    3,688
    Trophy Points:
    331
    There actually is a way to turn off TAA in BFV, you have to use the console command WorldRender.LightTileCsPathEnable 0 along with setting Motion Blur to 1% or higher.

    1: TAA on, TAA off
    2: TAA on, TAA off

    (Ignore the frame rate difference, the person who posted those had a CPU bottleneck so FPS was fluctuating a lot.)
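    For anyone wanting to try it, a minimal sketch of how that's usually applied - the tilde console key and the user.cfg autoexec location are assumptions carried over from other Frostbite titles, and the command itself is the one quoted above:

        # Option A: open the in-game console (usually the ~ / ` key) and enter:
        WorldRender.LightTileCsPathEnable 0

        # Option B: put the same line in a plain-text user.cfg in the Battlefield V
        # install folder so it is applied automatically at launch:
        WorldRender.LightTileCsPathEnable 0

        # Either way, also set Motion Blur to 1% or higher in the video options, as noted above.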
     
    Robbo99999 and hmscott like this.
  27. Robbo99999

    Robbo99999 Notebook Prophet

    Reputations:
    4,346
    Messages:
    6,824
    Likes Received:
    6,112
    Trophy Points:
    681
    There's not a lot of difference in the appearance there that I can see; in fact, in example #1 the TAA-off shot looks more blurry in the distant textures of the wall. Pity you have to enable motion blur to get rid of TAA, as that kind of defeats the purpose. Are there other examples where TAA off is crisper than TAA on, or does turning off TAA not really provide improved clarity in this game, do you think?
     
    hmscott and Vasudev like this.
  28. Talon

    Talon Notebook Virtuoso

    Reputations:
    1,482
    Messages:
    3,519
    Likes Received:
    4,694
    Trophy Points:
    331
    DLSS isn't perfect (yet) but it's far better than TAA or other current AA techniques. TAA is an abomination.
     
    Robbo99999 likes this.
  29. Vasudev

    Vasudev Notebook Nobel Laureate

    Reputations:
    12,035
    Messages:
    11,278
    Likes Received:
    8,814
    Trophy Points:
    931
    I accidentally used my scroll wheel on the Guru3D DLSS benchmark results, and the image looked worse at max zoom; the samples looked just like Pixel 3's AI-based zoom (not the best, but average). With no zoom, the DLSS image quality looks stunning, with vivid details.
     
    hmscott and Dr. AMK like this.
  30. yrekabakery

    yrekabakery Notebook Virtuoso

    Reputations:
    1,470
    Messages:
    3,438
    Likes Received:
    3,688
    Trophy Points:
    331
    The overall quality of TAA in Frostbite is much higher than in FFXV. The reason textures look sharper with TAA on is that a sharpening filter gets applied with TAA (BF1 was the same way), which I personally don't care for as I think it degrades image quality. Some people may like sharpening if they find TAA too blurry, but I wish there was a way to disable it because I never thought TAA was blurry at 120Hz.
     
    Robbo99999 likes this.
  31. Robbo99999

    Robbo99999 Notebook Prophet

    Reputations:
    4,346
    Messages:
    6,824
    Likes Received:
    6,112
    Trophy Points:
    681
    I also heard that running TAA High is less blurry than TAA Low; I hadn't realised that when I played the beta version of BF V, and I thought it was blurry as hell. I'll probably buy the game when it's on a major sale, as I only started playing BF 1 over the last 6 months or so, and then I'll experiment with the graphics settings to see if I can rid it of its blurry mess!
     
  32. yrekabakery

    yrekabakery Notebook Virtuoso

    Reputations:
    1,470
    Messages:
    3,438
    Likes Received:
    3,688
    Trophy Points:
    331
    Hmmm, do you find the TAA in BF1 blurry as well?
     
  33. Dr. AMK

    Dr. AMK Living with Hope

    Reputations:
    3,961
    Messages:
    2,182
    Likes Received:
    4,654
    Trophy Points:
    281
    Wow! Nvidia Has Lost HALF Its Value in 2 Months!
     
    hmscott likes this.
  34. Dr. AMK

    Dr. AMK Living with Hope

    Reputations:
    3,961
    Messages:
    2,182
    Likes Received:
    4,654
    Trophy Points:
    281
    Last edited: Dec 16, 2018
    hmscott likes this.
  35. Robbo99999

    Robbo99999 Notebook Prophet

    Reputations:
    4,346
    Messages:
    6,824
    Likes Received:
    6,112
    Trophy Points:
    681
    I run with zero AA in BF1. I'd choose that same option in BF V if I could. The way I see it, I'm always moving or panning the camera, so the natural blur of an LCD is enough to get rid of jaggies - which is another reason to remove any motion blur settings in the graphics menu too.
     
  36. yrekabakery

    yrekabakery Notebook Virtuoso

    Reputations:
    1,470
    Messages:
    3,438
    Likes Received:
    3,688
    Trophy Points:
    331
    Ah OK. I personally find shimmering and aliasing distracting when trying to play competitively, but to each his own.
     
    hmscott likes this.
  37. Dr. AMK

    Dr. AMK Living with Hope

    Reputations:
    3,961
    Messages:
    2,182
    Likes Received:
    4,654
    Trophy Points:
    281
    Best RTX 2080 Ti Video Cards for Overclocking - 2018 PCB Round-Up
     
    Vasudev, hmscott and Papusan like this.
  38. hmscott

    hmscott Notebook Nobel Laureate

    Reputations:
    7,110
    Messages:
    20,384
    Likes Received:
    25,139
    Trophy Points:
    931
    RTX DLSS vs. TAA Analysis: RTX Finally in Use, Is DLSS Crap?
    Gamers Nexus
    Published on Dec 16, 2018
    The first public release of NVIDIA DLSS (deep-learning super sampling), an RTX feature, is out for FFXV. We benchmark and compare vs. TAA. NVIDIA's RTX GPUs launched without any RTX features publicly available, but they're finally beginning to get implemented in games. Part of this hinged on DXR's availability via a Windows update, like the Battlefield V implementation, and part of it was waiting on developers. FFXV already had DLSS for its built-in benchmark, but that benchmark has proven difficult to trust in the past. With its full implementation in the real game, we return to FFXV to benchmark DLSS vs. TAA vs. No AA, then look qualitatively at the differences between DLSS and traditional anti-aliasing methods. If you're wondering how DLSS works, it's explained in this video and accompanied with visual examples of gameplay differences. We may publish an article for this one. TBD. The holiday season means we're slammed.

    Yes, DLSS is crap...fast crap. :D
     
    Vasudev likes this.
  39. bennyg

    bennyg Notebook Virtuoso

    Reputations:
    1,567
    Messages:
    2,370
    Likes Received:
    2,375
    Trophy Points:
    181
    Nvidia had better teach their "supercomputer" about the moiré effect and let it have another crack with some different settings, because that looks just as horrible as you'd expect upscaled 1440p to look.

    Maybe it's designed for increasing RT throughput on blurry shadows and reflections off imperfectly reflective surfaces, where blur and fuzz will be unnoticeable, but used as an AA mode it sucks hard so far.
     
    Vasudev and hmscott like this.
  40. Robbo99999

    Robbo99999 Notebook Prophet

    Reputations:
    4,346
    Messages:
    6,824
    Likes Received:
    6,112
    Trophy Points:
    681
    The fact that I dislike TAA-type antialiasing (for the increased blurriness), combined with the fact that DLSS looks worse than TAA in a few instances there, makes me think that DLSS is pretty rubbish in that implementation. The one slightly saving grace is that it's running at a significantly higher framerate with DLSS enabled, but to be honest I'd rather NVidia develop hardware that is significantly faster so they don't need to use DLSS! I am quite interested in their DLSS x2 mode, which is native rendering (no dirty upscaling) combined with the tensor cores just performing an AA effect based on deep learning - so theoretically this should give really good crisp visuals with zero jaggies and with no rendering performance hit to the CUDA cores (I think), but I don't believe we've seen that mode being used anywhere in demos or games yet.
     
    bennyg, Vasudev and hmscott like this.
  41. hmscott

    hmscott Notebook Nobel Laureate

    Reputations:
    7,110
    Messages:
    20,384
    Likes Received:
    25,139
    Trophy Points:
    931
    Interesting you should say that, as I think the problem is that Nvidia's source material for DLSS rule-sets is at 8K, instead of at the 4K display resolution.

    The problems stem from mapping that 1440p render to an 8K source, only for it to be further processed for display at 4K.

    I think the rule-set needs to be native at the display resolution, not one step higher than the display resolution or two steps higher than the rendering resolution: 1440p render -> 8K source => 4K display output resolution.

    Nvidia oughta give a native 4K source a try and see how it looks; with less up/down processing, the fine lines should be solid instead of flickering or invisible.

    The nice side-effect is that the DLSS "Gonculator" processing will go much faster using a 4K source than an 8K source - an order of magnitude less I/O throughput, RAM footprint, and processing time.

    And if I were Nvidia, I wouldn't try downsizing the existing 8K source files to 4K; I'd regenerate the source files at native 4K for the best results. That way Nvidia can continue forward with the faster process path to a 4K source for rule-sets, should that solve / reduce the flickering problems.
     
    Last edited: Dec 17, 2018
    Vasudev and bennyg like this.
  42. Dr. AMK

    Dr. AMK Living with Hope

    Reputations:
    3,961
    Messages:
    2,182
    Likes Received:
    4,654
    Trophy Points:
    281
  43. hmscott

    hmscott Notebook Nobel Laureate

    Reputations:
    7,110
    Messages:
    20,384
    Likes Received:
    25,139
    Trophy Points:
    931
    SoftBank May Sell Nvidia Shares — But Not Because of Crypto Downturn
    DECEMBER 14, 2018 02:47 CET
    https://www.ccn.com/softbank-may-sell-nvidia-shares-but-not-because-of-crypto-downturn/

    "In August, Nvidia shelved its focus on the cryptocurrency market, blaming the downward trend of cryptocurrency prices. At the time CCN disputed this reasoning by comparing Bitmain’s profitability in an equivalent period to that cited by Nvidia before it exited the market.

    It was not the demand for cryptocurrencies that was impacting Nvidia, but a decline in demand for graphics processing unit (GPU) mining chips. ASIC mining chips and machines were performing much better in the market due to their efficiency, drawn from being designed purposely for cryptocurrency mining.
    At the time of Nvidia’s market exit, the demand for cryptocurrency mining equipment was high, and competitors like Bitmain, Canaan, and Samsung saw growth. The demand for these specialist machines was fueled by large-scale crypto-mining operations which benefit from economies of scale far over the efficiencies of home and small miners.

    In August, Nvidia CFO Collette Kress said: “Whereas we had previously anticipated cryptocurrency to be meaningful for the year, we are now projecting no contributions going forward.”

    Last month, CNBC Mad Money’s Jim Cramer illustrated a more likely reason for Nvidia’s sudden woes:

    “Nvidia still makes the best graphics chips, which have become more powerful than traditional microprocessors. It still has a lead over the competition in a lot of uses, although you could argue that AMD’s catching up to them in the data center while Intel rivals them in self-driving vehicles. I think Nvidia made an honest forecasting mistake.”

    Though Nvidia’s share price has fallen, it is still the largest gaming graphics card maker. Some of the share price fall should be attributed to a correction in demand for gaming chips. During Q3 there was an oversupply of gaming chips to the market, the firm slowed its supply and adjusted fourth-quarter forecasting down. Analyst consensus is that after this adjustment works through, the company will continue to see earnings growth in the region of 15 percent each year for the next five.

    If SoftBank sells its Nvidia share, it is still likely to make $3 billion from the deal, as it constructed a “collar-trade” to protect against a share price decline. The company, incidentally, also recently denied reports that it had participated in Bitmain’s latest pre-IPO funding round."

    Nvidia Stock Slides on Rumors of Major Investor Softbank Departure
    Joel Hruska on December 17, 2018 at 8:15 am
    https://www.extremetech.com/g00/gam...-slides-on-rumors-of-major-investor-departure
    [Image: Nvidia stock, 6-month chart]
     
    Last edited: Dec 17, 2018
    Vasudev and Dr. AMK like this.
  44. hmscott

    hmscott Notebook Nobel Laureate

    Reputations:
    7,110
    Messages:
    20,384
    Likes Received:
    25,139
    Trophy Points:
    931
    GPU Pricing Update, AMD Destroys Nvidia in the Mid-Range, Pascal Is Disappearing
    Hardware Unboxed
    Published on Dec 18, 2018
     
    Vasudev and Dr. AMK like this.
  45. hmscott

    hmscott Notebook Nobel Laureate

    Reputations:
    7,110
    Messages:
    20,384
    Likes Received:
    25,139
    Trophy Points:
    931
    Last edited: Dec 19, 2018
    Vasudev likes this.
  46. Talon

    Talon Notebook Virtuoso

    Reputations:
    1,482
    Messages:
    3,519
    Likes Received:
    4,694
    Trophy Points:
    331
  47. Robbo99999

    Robbo99999 Notebook Prophet

    Reputations:
    4,346
    Messages:
    6,824
    Likes Received:
    6,112
    Trophy Points:
    681
    Well, it was uplifting music! ;-) Ha, but no, what's the difference here with Turing & this demo? Is it saying it renders closer objects with a greater Level of Detail (LOD) than objects that are further away, in order to get a performance uplift while not losing any appreciable visual quality - that's my inference, anyway? Don't games already do that? I've probably missed the point.

    EDIT: ah yes, I missed your link to the article. Yes, I was kind of right in my inference above, but they also do the following (quoted from the article):
    "The Asteroids application can achieve very high frame rates by moving key performance bottlenecks of object list processing off of the CPU and into highly parallel GPU mesh shading programs. Starting from an extremely large dataset comprising trillions of potentially visible triangles at any given time, the shaders efficiently eliminate primitives that will never be seen and shade only those contributing to the pixels displayed."
     
    hmscott likes this.
  48. Papusan

    Papusan Jokebook's Sucks! Dont waste your $$$ on Filthy

    Reputations:
    42,701
    Messages:
    29,840
    Likes Received:
    59,615
    Trophy Points:
    931
    NVIDIA Titan RTX 3DMark Performance Unveiled – Scores Over 40,000 Graphics Points in Firestrike When Overclocked Under Watercooling wccftech.com | Dec 20, 2018

    NVIDIA Titan RTX graphics card is the flagship solution for prosumers who want workstation grade performance and also want to enjoy AAA gaming titles. The Titan series has become the go-to option for users who want the best of both worlds but you have to pay a large premium to get your hands on the best that NVIDIA has to offer



    --------------------------------


    Intel Xe Graphics Card: Rumors, News, and Release Date Tomshardware.com | by Paul Alcorn December 19, 2018

    As with all market upsets, that could be good for the consumer. Nvidia has taken the uncontested performance lead with its 2000-series graphics cards, granting it license to charge steep premiums for its top-end gear. Meanwhile, AMD seems resigned to cater to the entry- and mid-level portions of the market while it sheds talent during the bring-up of its 7nm Navi architecture. These conditions mean the graphics industry is ripe for a shake-up that could ultimately lead to lower prices for the consumer.
     
    Vistar Shook and hmscott like this.
  49. hmscott

    hmscott Notebook Nobel Laureate

    Reputations:
    7,110
    Messages:
    20,384
    Likes Received:
    25,139
    Trophy Points:
    931
    Another RTX model release failure, another disappointing RTX GPU, for 2x the price of the last overpriced and disappointing RTX model release, with even less of a performance uplift.

    The Titan RTX performance is actually 5-10 FPS slower in Battlefield V with "RTX ON" compared to an RTX 2080ti.
    [Image: Titan RTX vs RTX 2080 Ti in BFV with RTX ON]

    To add insult to injury, Jay was looking forward to showing off 2 Titan RTX's in SLI, but the Titan RTX's wouldn't OC as far as 2 RTX 2080ti's in SLI, so the $5000+ Titan RTX SLI performed worse than the ~$3000 RTX 2080ti SLI (WC). :(
    [Image: Titan RTX SLI fail]

    Nvidia face-planted 100% with the Titan RTX release, yet another RTX flop.

    So then, according to Nvidia, neither of these cards is a gaming card?... (Titan RTX / RTX 2080 Ti) - jayztwocents

    RTX Titan vs RTX 2080Ti Gaming Benchmarks

    JayzTwoCents
    Published on Dec 19, 2018
    RTX Titan is here... how does it perform in games?


    Tech With Sean 1 hour ago
    "Dang it would get me 3 more FPS in far cry than the 2080ti! Better take out a loan"

    ONE EYED GAMER! 1 hour ago (edited)
    "Nvidia : we need to release new card
    Employee: lets just make the same thing just change the name and add $1200
    Nvidia: YOU ARE PROMOTED!"

    One Two 41 minutes ago
    "Why does Nvidia want to kill PC gaming?"

    NVIDIA Titan RTX 3DMark Performance Unveiled!

    WccftechTV
    Published on Dec 19, 2018
    NVIDIA just launched the TITAN RTX and while there are no reviews, performance numbers are strolling in from buyers of the ultra premium card.
    NVIDIA Titan RTX 3DMark Performance Unveiled – Scores Over 40,000 Graphics Points in Firestrike When Overclocked Under Watercooling
    By Hassan Mujtaba, 8 hours ago
    https://wccftech.com/nvidia-titan-rtx-turing-flagship-graphics-card-benchmark-3dmark-unveiled/

    "NVIDIA’s flagship Titan graphics card, the Titan RTX, went on sale yesterday and that allowed many PC enthusiasts and content creators to get their hands on the beefiest prosumer aimed Turing 12 nm graphics card. Since its announcement, there has been no official performance data available but with consumers getting their hands on their new, ultra-expensive purchase, we get to see the first performance results in the 3DMark benchmark.

    NVIDIA Titan RTX Benchmarked in 3DMark Firestrike, Scores Well Over 40,000 Graphics Points With Watercooling and Overclock on Both GPU and Memory

    The NVIDIA Titan RTX graphics card is the flagship solution for prosumers who want workstation grade performance and also want to enjoy AAA gaming titles.

    The Titan series has become the go-to option for users who want the best of both worlds but you have to pay a large premium to get your hands on the best that NVIDIA has to offer. So before getting into numbers, let’s talk about the specs of the new Titan.

    NVIDIA Titan RTX Specifications / Pricing / Compute Performance Recap

    The TITAN RTX uses the full TU102 GPU configuration with 6 GPCs, 36 TPCs, 72 SMs and 4608 CUDA Cores arranged within those SMs. There are also 576 Tensor Cores and 72 RT Cores that handle the bulk of AI/DNN and raytracing workloads. The clock speeds will be maintained at 1350 MHz for the base and 1770 MHz for the boost frequency. The card features 24 GB of GDDR6 VRAM along a 384-bit bus interface clocked at 7.00 Gbps (14.00 Gbps effective).

    The card pumps out 672 GB/s of bandwidth and additionally comes with 6 MB of L2 cache. Power is provided through dual 8-pin connectors with a rated board TDP of 280W. The card also packs in the latest display connectivity with 3 DP, 1 HDMI, and a single USB Type-C port.

    If we talk about performance, the card rocks 16.2 TFLOPs of FP32 compute, which is higher than the Titan V's 15.0 TFLOPs. It also comes with 11 Gigarays per second of ray tracing prowess, again slightly higher than the 10 GRays/s of the Quadro RTX 8000 solution. NVIDIA states that the card would provide a 100 GB/s, full-range NVLINK solution when two cards are paired together in a multi-GPU environment. All of this can be yours for a premium price tag of $2499 US which, although lower than the $3000 US of the previous Titan V graphics card, is still twice as much as the RTX 2080 Ti Founders Edition.

    NVIDIA Titan RTX Benchmark Performance
    Moving on to the performance numbers posted by Twitter user “Death” (Via Videocardz), we can see that the Titan RTX graphics card was fitted with a water block from BYSKI that also packs a nice little LCD to display stats such as clocks, voltages, and temps. The graphics card was overclocked to 2070 MHz on the core and 2025 MHz on the memory. This pushed the bandwidth to 778 GB/s over the reference 672 GB/s which is a nice uplift.
    NVIDIA Titan RTX 3DMark Firestrike User Submitted Performance Numbers – Credits: Twitter user “Death” (Via Videocardz)

    Running 3DMark Firestrike, the Titan RTX reported an overall score of 31,862 and a graphics score of 41,109 points. This is impressive considering that a single chip based graphics card can now score well over 40,000 points. When comparing it to my GeForce RTX 2080 Ti overclocked score, I see that the difference between the two cards isn’t that big with the GeForce RTX 2080 Ti scoring 39,958 points on an air cooler with an overclock of 2175 MHz on the core and 2025 MHz on the memory.
    Now, there are a few things to note: the Titan RTX core wasn't pushed as hard as the RTX 2080 Ti overclock which I used, but the Titan RTX does come with more CUDA cores than the RTX 2080 Ti. Also, I can probably squeeze slightly more juice out of the RTX 2080 Ti to push past 40,000 points (graphics).

    Overall, it's a nice demonstration for the Titan RTX and hopefully we get to see more performance numbers in the coming days. For those who want the best gaming performance, the GeForce RTX 2080 Ti is the king of the hill given it's half the cost of a Titan RTX."
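    The headline bandwidth and FP32 figures in that spec recap do check out with simple arithmetic. A quick sketch, assuming the usual GDDR6 convention of an 8x memory-clock-to-effective-rate factor and 2 FLOPs (one FMA) per CUDA core per clock:

        # Sanity-checking the Titan RTX numbers quoted above.
        def gddr6_bandwidth_gbs(bus_bits, effective_gbps):
            # bandwidth = bus width in bytes * effective data rate per pin
            return bus_bits / 8 * effective_gbps

        def fp32_tflops(cuda_cores, boost_mhz):
            # peak FP32 = cores * 2 FLOPs (FMA) * clock
            return cuda_cores * 2 * boost_mhz * 1e6 / 1e12

        print(gddr6_bandwidth_gbs(384, 14.0))       # stock 14 Gbps effective      -> 672.0 GB/s
        print(gddr6_bandwidth_gbs(384, 2.025 * 8))  # 2025 MHz memory OC           -> 777.6 GB/s (the "778 GB/s" above)
        print(fp32_tflops(4608, 1770))              # 4608 cores @ 1770 MHz boost  -> ~16.3 TFLOPs (16.2 quoted)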
     
    Last edited: Dec 20, 2018
    SMGJohn likes this.
  50. hmscott

    hmscott Notebook Nobel Laureate

    Reputations:
    7,110
    Messages:
    20,384
    Likes Received:
    25,139
    Trophy Points:
    931
    The RTX 2060 looks the same as or weaker than the 1070 (6GB vs 8GB), and not even close to a 1070 Ti:
    [Image: RTX 2060 vs 2070 / 1060 / 1070 specifications comparison]

    $400 of Pure Disappointment - RTX 2060 Detailed Rumors

    UFD Tech
    Published on Dec 19, 2018
    • 1:00 - 2060 is Officially RTX 2060
    • 2:00 - RTX 2060 Specs
    • 7:35 - Titan RTX Now For Sale
    • 7:40 - Windows October Update Available for All
    • 8:04 - 3DMark Ray Tracing Benchmark Super Stronk

    • 6:07 - Nvidia's Fake AI Faces
    • 8:51 - Nvidia's MX250 Leaked
    • 9:18 - Twitter Allows for More Chronological
    • 9:42 - Elon Musk's Boring Tunnel Unveiled
    • 10:41 - Gigabyte's Aorus Memory


    RTX 2060 good for nvidia, bad for gamers !!
    not an apple fan
    Published on Dec 19, 2018
     
    Last edited: Dec 20, 2018