The Notebook Review forums were hosted by TechTarget, which shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information posted on the forums would be preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.

    Nvidia Thread

    Discussion in 'Hardware Components and Aftermarket Upgrades' started by Dr. AMK, Jul 4, 2017.

  1. Vasudev

    Vasudev Notebook Nobel Laureate

    Reputations:
    12,035
    Messages:
    11,278
    Likes Received:
    8,814
    Trophy Points:
    931
    hmscott likes this.
  2. Mr. Fox

    Mr. Fox BGA Filth-Hating Elitist

    Reputations:
    37,213
    Messages:
    39,333
    Likes Received:
    70,629
    Trophy Points:
    931
    I am looking forward to testing that. The part that most interests me is this:

    Advanced GPU Boost control for NVIDIA GeForce RTX 20×0 series graphics cards. Extended voltage/frequency curve editor on GeForce RTX 20×0 family graphics cards allows you to tune additional piecewise power/frequency floor and temperature/frequency floor curves. Control points on those new curves allow you to control GPU Boost power and thermal throttling algorithms more precisely than traditional power limit and thermal limit sliders.
    Edit: LOL, I already had this version installed and never noticed the new feature.
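
    For anyone who hasn't poked at the new editor yet, the idea is simple: besides the classic voltage/frequency curve, you place control points on curves that map board power (or temperature) to a frequency floor, and GPU Boost's throttling follows that curve instead of a single slider limit. Here's a minimal Python sketch of how such a piecewise curve evaluates - the control points are invented for illustration, since Afterburner exposes this through its UI rather than any scripting API:

    # Toy piecewise power/frequency floor curve: (board power W, floor MHz) points.
    # Control-point values are made up purely for illustration.
    POWER_FREQ_FLOOR = [(150, 1900), (200, 1800), (250, 1650), (300, 1500)]

    def freq_floor(power_w: float, curve=POWER_FREQ_FLOOR) -> float:
        """Linearly interpolate the frequency floor between control points."""
        if power_w <= curve[0][0]:
            return curve[0][1]
        for (p0, f0), (p1, f1) in zip(curve, curve[1:]):
            if power_w <= p1:
                t = (power_w - p0) / (p1 - p0)
                return f0 + t * (f1 - f0)
        return curve[-1][1]

    print(freq_floor(225))  # halfway between the 200W and 250W points -> 1725.0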


    So, I played about 2 hours of BFV yesterday with DX12 and DXR (ray tracing) enabled and, as I suspected and expressed more than once, DXR is (at this point) what I view as just another worthless gimmick to sell GPUs to the game-crazy. I was not impressed by the look of it. I know it's new, and BFV is being criticized by some as not being a good first example. The gameplay with the Ultra preset and DX12 and DXR modes enabled averages between 55 and 70 FPS, mostly in the 60 to 65 FPS range unless you are looking toward the sky or a background without much detail. Then it jumps up to well over 100 FPS. I was not really looking for high FPS. But, I was looking for a reason to be impressed with new eye candy from ray tracing. I am not impressed. Yes, I have seen the videos from JayzTwoCents, and still shots look impressive in YouTube videos. When I am playing a game I am not paying enough attention to that kind of stuff for it to add any value to the game experience. I am focusing on the action at hand.

    I am going to play it in DX11 mode later today and see if I can tell any difference in how things look compared to yesterday. I'll come back and post something later if my opinion changes after playing BFV in DX11.

    As soon as I read that it was going to be a Windoze OS X and DX12 feature, my expectations were extremely low, so I cannot say that I am disappointed. I do not expect ANYTHING awesome--now or ever--from Windoze OS X or DX12. I bought the GPU for overclocked benching, not ray tracing. I am not disappointed with the performance of the 2080 Ti. I expect the hype over DXR to fizzle out, just as it did with everything else related to DX12. DX11 is still a better product. Windows 7 is still a better product. Vulkan is a better API than DX12. Nothing to see or be excited about here with DXR, kiddos. At this point, not having DXR support for Windows 7 is irrelevant only because DXR seems irrelevant.



    The tax write-off from donating retired AMD RX mining GPUs to charities, like Goodwill and St. Vincent de Paul, might be a better (more beneficial) option financially. I would not pay more than about $50-60 for a used RX 580/590 and no more than about $100-125 for a used Vega 64.
     
    Last edited: Dec 26, 2018
    Vasudev, Installed64, hmscott and 2 others like this.
  3. Robbo99999

    Robbo99999 Notebook Prophet

    Reputations:
    4,346
    Messages:
    6,824
    Likes Received:
    6,112
    Trophy Points:
    681
    Hey Papusan, that's some very interesting news re the automated overclock scanner for MSI Afterburner on Pascal cards! I tested it this afternoon on my GTX 1070; I get higher overclocks just by using my own manual overclocking efforts. OC Scanner gives me +62Mhz on the core, but my highest stable manual overclock is +100Mhz. They also have a "Test" feature in the OC Scanner where you can test your manual overclock by clicking the "Test" button. Interestingly, at stock clocks it says "Test completed, confidence level is 90%", so it thinks it's not 100% certain that my card is stable even at stock clocks (which is not true). At my max manual overclock of +100Mhz it also says "confidence level 90%", but the most interesting part is that if I run +113Mhz through the test, which I know to be unstable, it returns "confidence level 85%" - so it seems that for Pascal cards you might want to aim for a "confidence level 90%" when testing your overclocks using that application. Here's a screenshot of the test, and you can see it pushes a hell of a lot of power through the GPU at times - blips lasting a few seconds of 250W on just a GTX 1070!
    OC Scanner.jpg

    @Vasudev , from my testing & results it looks like the automated OC Scanner is pretty conservative when setting an overclock, so I don't think it will be crashing any games.

    EDIT: Unwinder, the guy who created MSI Afterburner, just gave me some insight over on the Guru3D forums. He said that "Confidence Level 90%" essentially means 100% stable: NVidia, who created the testing algorithms, were not bold enough to ever report a confidence level of 100%, so 90% is the highest you can ever see in that test! Here's where he told me (Post #7 in the following thread):
    https://forums.guru3d.com/threads/m...-pascal-support-extended-sliders-more.424559/
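
    To boil Unwinder's explanation down to something concrete, here's a tiny Python helper encoding the rule of thumb from this thread (90% is the ceiling the test can report and effectively means stable on Pascal; lower readings flagged my known-unstable offset). The threshold is just this thread's heuristic, nothing official from NVIDIA:

    def interpret_oc_scan(confidence: int) -> str:
        """Rough reading of an Afterburner OC Scanner 'Test' result on Pascal.

        Per Unwinder, the test never reports 100%, so 90% is the maximum
        and effectively means 'stable'.
        """
        if confidence >= 90:
            return "treat as stable (max reportable confidence)"
        return "suspect: back the overclock off and retest"

    # Readings from my GTX 1070: stock and +100MHz scored 90%, +113MHz scored 85%.
    for offset, conf in [("stock", 90), ("+100MHz", 90), ("+113MHz", 85)]:
        print(offset, "->", interpret_oc_scan(conf))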
     
    Last edited: Dec 26, 2018
    Vasudev, Papusan and jclausius like this.
  4. jclausius

    jclausius Notebook Virtuoso

    Reputations:
    6,160
    Messages:
    3,265
    Likes Received:
    2,573
    Trophy Points:
    231
    Nice honest opinion. And TBH, from the info I'm gathering by other reviews, posts, etc., not too unexpected. Also, in the spirit of fairness, I'm pretty sure that @hmscott has also been 'dinging this bell' on that aspect of RTX (price vs. payoff) over the past few months.
     
    Last edited: Dec 26, 2018
  5. Mr. Fox

    Mr. Fox BGA Filth-Hating Elitist

    Reputations:
    37,213
    Messages:
    39,333
    Likes Received:
    70,629
    Trophy Points:
    931
    Yeah, he and I totally agreed on that. I haven't harped on it though. It's just not important to me, whether it works well or not, just like G-Stink is unimportant to me. I only wanted an RTX 2080 Ti for overclocked benching. We have been fed nothing but worthless gimmicks revolving around DX12 since 2015 and every bit of it is a crock-load of crap shrouded in lies. Micro$lop and their insane clown posse have colluded to create an impression that DX12 and Windoze OS X are important advancements in technology when, in reality, they are just garbage. Wonder what the next "DX10 only" piece of crap they're going to offer as a dangling carrot will be?
     
  6. Robbo99999

    Robbo99999 Notebook Prophet

    Reputations:
    4,346
    Messages:
    6,824
    Likes Received:
    6,112
    Trophy Points:
    681
    I'm not as negative about ray tracing as you are, but then I'm going on theory rather than actually experiencing it! I do think it's the start of something for the future rather than just a gimmick, but its current implementation is lacking. I just found this review on HardOCP of the RTX 2070 in Battlefield V - a disappointment all in all. First, DX12 has to be enabled, which takes away a large chunk of performance (vs DX11), and then on top of that you have the performance impact of the ray tracing itself (and that's just ray-traced reflections; nothing else is ray traced, everything else is traditional rasterisation):
    https://www.hardocp.com/article/2018/12/17/battlefield_v_nvidia_ray_tracing_rtx_2070_performance/1

    Ha, I just put an entry into a competition on Guru3d to win a Palit RTX 2070 - at least it's faster than my GTX 1070 & will be FREE, but I have to win it first! In the back of my mind I might even be disappointed to win it, because it means I'll feel less worthy to upgrade to the next NVidia gen, which is likely gonna be a whole lot better (on 7nm, plus refinements to the architecture & ray tracing, I guess).
     
    Last edited: Dec 26, 2018
  7. Mr. Fox

    Mr. Fox BGA Filth-Hating Elitist

    Reputations:
    37,213
    Messages:
    39,333
    Likes Received:
    70,629
    Trophy Points:
    931
    It's not that I am negative about the concept, just that it's not (at this point) anything amazing, and I don't like their manipulation crap. Making something a DX12 and Windoze OS X exclusive feature is evidence of manipulation. If it were actually something worth having, then it wouldn't be a Windoze OS X exclusive feature. The only reason it would be is manipulation. Making DX12 only available on Windoze OS X is also manipulation. There is no excuse for that. As a result of their manipulation and greed, DX12 hasn't turned out to be as important as they had hoped it would be. Most things work as well as or better with DX11.
     
    TBoneSan, Vasudev, jclausius and 3 others like this.
  8. Papusan

    Papusan Jokebook's Sucks! Dont waste your $$$ on Filthy

    Reputations:
    42,701
    Messages:
    29,840
    Likes Received:
    59,615
    Trophy Points:
    931
    Sell it if you win. Do the same with your 1070 and add in the needed cash for a 2080. Then do the next upgrade when the 4070/4080 is out :D Or just save the "free" cash, or spend it on games.
     
  9. Robbo99999

    Robbo99999 Notebook Prophet

    Reputations:
    4,346
    Messages:
    6,824
    Likes Received:
    6,112
    Trophy Points:
    681
    Ha, will cross that bridge when I come to it, I've got to win it first, and I'll be sure to let you know if I DO win it!
     
    TBoneSan, Mr. Fox, Vasudev and 2 others like this.
  10. Arrrrbol

    Arrrrbol Notebook Deity

    Reputations:
    3,235
    Messages:
    707
    Likes Received:
    1,054
    Trophy Points:
    156
    To me the only thing that matters is performance. I'd never buy an AMD card over an Nvidia one just because of the manufacturer, and conversely I'd never buy an Nvidia card over a theoretical better-performing AMD one. All companies do scummy things - Nvidia and Intel certainly do, but AMD are guilty as well. But to be honest, none of them have done anything so bad that I would stop buying their products if the performance is there. If your budget is sufficient to buy an RTX 2070, it would be foolish to buy a Vega 64 instead unless you are doing a specific set of tasks that run better on AMD hardware.
     
    jclausius, TBoneSan, Mr. Fox and 3 others like this.
  11. hmscott

    hmscott Notebook Nobel Laureate

    Reputations:
    7,110
    Messages:
    20,384
    Likes Received:
    25,139
    Trophy Points:
    931
    You missed the part where the Vega 64 with new drivers trades the performance advantage back and forth with the 2070 - they are close enough that it doesn't matter in gaming.

    There is no reason to spend more (even if your budget is sufficient) just to get a few more FPS in some games; it's a waste of money.

    Just like with any commodity, there are sufficient purchases that save a lot of money, and overpriced purchases that waste money for the same results.

    All the information is at your disposal, if you will look at things with an even mind, no favorites by brand - take off your Nvidia blinders - and there is plenty of opportunity to not buy Nvidia products and save money.

    Why pay for Nvidia's overpriced GPUs and give them your hard-earned money needlessly? You could buy AMD products, save $, and have the same gaming enjoyment. Stop encouraging Nvidia to price products higher and higher; stop buying Nvidia products so they know you don't like what they are charging or doing, or both.

    Encourage competitors by buying their products, and they will respond. AMD has been very responsive to owner requests in both software and hardware, and if you support them doing that Nvidia might get the idea that it would be good for them to do it too.

    I've given plenty of info on how to save money and get the same result; it's up to you if you decide to blow money needlessly. You are either a wise consumer or an unwise waster of your own resources.

    If you are interested in a 2080, get a 1080ti instead. If you want the performance of the 2070, buy a Vega 64 / 56 and undervolt / tune it (flash the 56 with 64 firmware as well). If you want to spend less, buy used; if you want to buy new and spend less, get an RX 580 / RX 590 or an RX 570.

    There are lots of choices other than succumbing to Nvidia's overpriced siren call.

    You can be a wise shopper and not a fool like Fry :D
    takemymoney.jpg
     
    Last edited: Dec 27, 2018
  12. Mr. Fox

    Mr. Fox BGA Filth-Hating Elitist

    Reputations:
    37,213
    Messages:
    39,333
    Likes Received:
    70,629
    Trophy Points:
    931
    Okey-dokey. Here you go. And, the winner is...

     
    Last edited: Dec 27, 2018
  13. hmscott

    hmscott Notebook Nobel Laureate

    Reputations:
    7,110
    Messages:
    20,384
    Likes Received:
    25,139
    Trophy Points:
    931
    Hard to tell via a video with all the compression and such; it's much better to see it in person. What do you think?

    I'd bet it's not worth it, from all the reports I've read and the videos I've seen - but it really needs to be decided in person. I wouldn't buy one myself to find out, though, as I think it's a boondoggle to get saps to pay 2x or more for the same performance as the 10 series.

    Given that the RTX effects are tacked on, and not replacing existing processing, it's a performance loss no matter how you play it. Light effects, light loss, heavy effects, heavy loss, it's not a matter of graduating to a better method, it's just wasting resources for no good reason.

    Given Nvidia's follow-on 7nm parts (if they have the $ to roll them out), and AMD's new GPUs, I'd really wait unless you really need to build or replace right now.

    Then I'd get an AMD GPU - unless you *really need* 1080ti performance, in which case get one of those used.

    There are plenty of non-RTX GPUs for sale at reasonable prices, so there's no need to get stung by Nvidia's "the more you buy the more you save" BS. :)
     
  14. Arrrrbol

    Arrrrbol Notebook Deity

    Reputations:
    3,235
    Messages:
    707
    Likes Received:
    1,054
    Trophy Points:
    156
    In my country at least, the Vega 64 and 2070 are very similar in price. In that case, it would be silly to buy a last gen GPU over something newer which performs better in most games - unless you only play Forza Horizon 4. In my opinion, it is worth spending an extra £20-50 to get more performance, especially if you plan on keeping your GPU for several years since you will have better performance in future titles.

    Also, what GPU someone gets is influenced by what they are upgrading from. If someone already has a 1080 or 1080Ti there isn't really much point in buying Turing (unless you want to run benchmarks). I wouldn't suggest anyone on a 1060 or 1070 bothers with the 2060 or 2070 since the performance uplift is marginal. If anyone is still using Maxwell or an old Fury/300X etc then it may be worth upgrading now though.

    The same goes for CPUs. If you don't care about overclocking and do a lot of productivity tasks then Ryzen is a better choice than an 8700K for example, but if you only play games and enjoy overclocking then an 8700K would be worth the extra £50 or so.

    There is no choice between Nvidia and AMD on MXM anymore anyway, so I don't have any option other than Nvidia, unfortunately. If AMD start making standard-sized MXM GPUs which perform as well as or better than Nvidia GPUs for a similar price then I'd be happy to buy them, but I don't see that happening any time soon.
     
    jclausius, Robbo99999 and hmscott like this.
  15. Vasudev

    Vasudev Notebook Nobel Laureate

    Reputations:
    12,035
    Messages:
    11,278
    Likes Received:
    8,814
    Trophy Points:
    931
    I think they're running a Monte Carlo simulation on a small dataset to predict the confidence level from parameters such as temps, core clocks, memory clocks, shader clocks, TDP limit, time period, etc.
    Can you see any unknown exe in Task Manager whilst running their automated stress test? The exe should be signed by NVIDIA, MSI, or Unwinder of RTSS.
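
    Purely to illustrate the kind of thing I mean (this is not NVIDIA's actual algorithm - the real pass/fail criteria aren't public), here's a toy Monte Carlo pass in Python that samples stress-blip operating points and reports the fraction staying inside assumed limits as a "confidence" figure. Every threshold below is a made-up placeholder:

    import random

    # Made-up operating limits; NVIDIA's real test criteria are not public.
    LIMITS = {"temp_c": 83, "power_w": 250}

    def sample_run(core_offset_mhz: int) -> bool:
        """One simulated stress blip; higher offsets shift temp/power upward."""
        temp = random.gauss(70 + core_offset_mhz * 0.05, 3)
        power = random.gauss(180 + core_offset_mhz * 0.4, 15)
        return temp <= LIMITS["temp_c"] and power <= LIMITS["power_w"]

    def confidence(core_offset_mhz: int, trials: int = 10_000) -> float:
        """Percentage of sampled runs that stayed within limits."""
        ok = sum(sample_run(core_offset_mhz) for _ in range(trials))
        return 100 * ok / trials

    print(f"+100MHz -> {confidence(100):.0f}% confidence (toy numbers)")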
     
  16. hmscott

    hmscott Notebook Nobel Laureate

    Reputations:
    7,110
    Messages:
    20,384
    Likes Received:
    25,139
    Trophy Points:
    931
    I'd suggest that the "Turing" GPU is also "last generation": the RTX parts - the parts of the die that are currently "useless" - are all that's new.

    Otherwise the Nvidia core is more or less unchanged - a small increase in unit counts to provide some kind of bump in performance.

    The Vega 56 / 64 match 1080 performance, and the 2070 does the same.

    There's no loss in any way for going with the AMD "last gen" over Nvidia's current "last gen" rehash. :D

    At least you don't need to spend mind space on the "RTX" BS, or get steered into buying "RTX" featured games just to get some value out of that 50% of the die that otherwise lies idle.

    With a 10 series Nvidia GPU or an AMD GPU you can be free and clear to enjoy gaming, period.

    All we have to choose from is "last gen" right now, until AMD releases true next-generation GPUs on 7nm next year. :)
     
    Vasudev and Arrrrbol like this.
  17. Robbo99999

    Robbo99999 Notebook Prophet

    Reputations:
    4,346
    Messages:
    6,824
    Likes Received:
    6,112
    Trophy Points:
    681
    It's a 3.32MB file called gpu_stressor.exe; it doesn't have any signing information associated with it, or any indication of who produced it. Unwinder says that it's created by NVidia though.
     
    Vasudev likes this.
  18. hmscott

    hmscott Notebook Nobel Laureate

    Reputations:
    7,110
    Messages:
    20,384
    Likes Received:
    25,139
    Trophy Points:
    931
    Don't Buy RTX GPU's (self.pcmasterrace)
    submitted 10 hours ago by Edgy_Reaper
    https://www.reddit.com/r/pcmasterrace/duplicates/a9vtjs/dont_buy_rtx_gpus/

    "I recently bought a new RTX 2070 on black friday. At the time the reviews were good with only 1 person who said their games started freezing and crashing. At the time or maybe a bit later people found that there were cooling and hardware issues with the RTX 2080ti. I believe it is across all RTX gpu's.

    Now after a month passed and my 30 day return expired, my GPU has now completely crashed when under load. I downgraded my drivers, updated my windows, i played the games without my GPU, and everything points to a problem with my GPU. I go over to newegg and check out my card and see a third of the reviews are 1 star.

    All of the reviews across multiple cards were similar in the fault, games freezing, crashing, artifacts and random pixels showing when the card is under load. Some were saying this was a cooling issue, but to occur between multiple manufacturers makes the problem a lot worse. Some review said this was a cooling issue but after checking my temps and changing around my fan speeds, there was no cooling issues.

    Here below are all the page reviews for a list of cards if anyone needs some proof.

    RTX 2070
    #1 (the card I purchased)
    #2

    RTX 2080
    #1
    #2
    #3
    #4
    #5

    RTX 2080 ti
    #1
    #2
    #3
    #4

    I'm not completely sure but there was news about these problems specifically for the RTX 2080ti's, but the problem is now across all of the RTX series, there is even a video of a youtuber who faced a similar issue, though his situation is a lot worse than mine.

    For now I would hold off in purchasing RTX gpu's, I suggest going for the 10 series for now if you dont want the chance of a dead card in a month. I hope this issue is addressed by Nvidia and fixed before everyone has broken GPU's."

    Wow, there are predominantly bad reviews on newegg for his 2070. I followed his newegg product link to the reviews:
    Gigabyte RTX 2070 reviews on newegg.JPG
    https://www.newegg.com/Product/Product.aspx?Item=N82E16814932091&Description=rtx 2070&cm_re=rtx_2070-_-14-932-091-_-Product

    The other 2070 link he gave on newegg, an EVGA model, has a high percentage of bad reviews too - not as many, but 33% 1-star reviews is poor:
    EVGA RTX 2070 reviews on newegg.JPG

    The other links show the 2080 Ti and 2070 have the worst percentages of 1-2-star reviews, at 30%-40%+; the 2080 is around 20%-35% 1-2-star bad reviews.

    What I also noticed is the prices have crept up even higher...

    icemanthrowaway123 10 hours ago
    "Good list thanks OP.
    Most people on this sub know to avoid them unless you need entry price tensor cores for machine learning.
    The fact that the 2070 exists and breaks so often when people are getting Vega 64s in the $300s off ebay is sad."

    oCuHo 3 points 20 hours ago
    "Yeah, RTX was a fail. The 2070 and 2080 are pointless and the 2080 Ti is too expensive. Not to mention all the problems with them.

    Glad I bought a 1080 Ti instead."
     
    Last edited: Dec 27, 2018
    Vasudev likes this.
  19. Mr. Fox

    Mr. Fox BGA Filth-Hating Elitist

    Reputations:
    37,213
    Messages:
    39,333
    Likes Received:
    70,629
    Trophy Points:
    931
    YouTube butchers video quality to the extent that it is very difficult to tell any difference in some things. As I mentioned in the video, I prefer the gameplay with DXR disabled. That comes as no surprise to me. I suspected that would be the case even before I purchased an RTX card. I like the look and feel in DX11 better than DX12, and that was the case before DXR was ever mentioned by NVIDIA. Leaving all other settings untouched, using Ultra presets and changing nothing but the switch to DX11, the game runs as smooth as glass in DX11, and everything looks clearer to my eyes. Aesthetically, the differences are very small, to the point of being difficult to identify in terms of being beneficial or desirable, and I lean in favor of DX11, no DXR.

    Where our opinions differ on GTX versus RTX is the subject of performance. 2080 Ti performance is noticeably greater than 1080 Ti, and that was the sole reason I got one. They are definitely overpriced, and so are all of the used 1080 Ti graphics cards.

    The AMD RX cards are an OK option for the simplest of gamers with very low performance expectations and a limited budget. The only compelling thing they have to offer is that they are comparatively extremely inexpensive products. They do not manufacture a GPU that anyone can be proud to own, or that they can or should feel proud of putting their brand name on at this point. Suggesting that casual gamers get one makes perfect sense. They can use the money saved on other things they enjoy doing, or just keep the money in the bank (or not incur debt if they don't have the cash). I have been supportive of Radeon cards in the same scenario. You don't need to spend a lot of money for casual gaming at 1080p and modest quality settings. However, suggesting an AMD card to anyone that views themselves as what some call a "hardcore gamer" is nonsense.

    While Intel and NVIDIA have pulled enough shenanigans over the years, as every manufacturer of technology products has and will continue to do, I still have a hard time with all of the hate AMD fanbois have for them. I cannot forgive and forget more than a decade of abject failure, producing one crappy product after another, year over year, while that entire time Intel and NVIDIA were there giving us comparatively superior products that almost everyone has been happy with. AMD GPUs are still lackluster products. Jumping from the green team to the red team because of the latest perceived blunder by the Green Goblin during the same decade or so is pretty fickle, and the fact still remains that their products are vastly superior to anything the Radeon Rats have on offer. AMD's recent progress toward excellence on the CPU side with Zen is noteworthy, but still lacking from my perspective. They suck at overclocking and don't handle memory overclocking gracefully. The gap between Intel and AMD CPUs is the narrowest it has been in more than a decade, and we're all watching that with great anticipation. That being the case, they have to be less expensive in order to be competitive. If they were priced the same, hardly anyone would give them a second thought based on their long track record of selling inferior products.

    TL;DR - everything is screwed up right now with respect to GPUs. AMD graphics cards still suck, DXR is a joke (in my opinion) and NVIDIA products (new and used) are severely overpriced.

    It wears me out watching you burn so many calories on the hate, but I guess if you are enjoying it knock yourself out. Don't get too bummed out when you find it makes no difference in the end. People are going to buy what they want no matter what. Look no further than all of the pathetic BGA filth on the market. You can't fix stupid, bro.
     
    Last edited: Dec 27, 2018
    jclausius and Arrrrbol like this.
  20. Vasudev

    Vasudev Notebook Nobel Laureate

    Reputations:
    12,035
    Messages:
    11,278
    Likes Received:
    8,814
    Trophy Points:
    931
    I don't know what to say; I found DX11 to be perfect on Nvidia and worse on AMD. As always, DX12 on Nvidia is a nightmare in some DX12 titles, unlike on AMD or Intel GPUs.
    99% of the time I tried DX12 in the Hitman game, it crashed midway, taking my save games to oblivion; I prefer stable DX11 over buggy DX12. At the same time, AMD mid-range cards run fine on DX12 - FPS is low, but gameplay is stutter-free 95% of the time.
     
    Last edited: Dec 27, 2018
    jclausius, hmscott and Mr. Fox like this.
  21. Mr. Fox

    Mr. Fox BGA Filth-Hating Elitist

    Reputations:
    37,213
    Messages:
    39,333
    Likes Received:
    70,629
    Trophy Points:
    931
    Just played about an hour's worth of BFV in DX11 mode and Ultra preset. Smooth as silk and very cool (not using water chiller).
    upload_2018-12-27_13-33-53.png
     
    Last edited: Dec 27, 2018
    Vasudev, hmscott, Papusan and 2 others like this.
  22. hmscott

    hmscott Notebook Nobel Laureate

    Reputations:
    7,110
    Messages:
    20,384
    Likes Received:
    25,139
    Trophy Points:
    931
    Which AMD GPU did you test, and how long ago?

    If you look at the performance numbers in the video review I posted, the DX11 results for AMD have gone up steadily over the last year, just as Nvidia's DX12 numbers have improved with driver and game updates.

    Battlefield V DX12 has problems outside of the RTX goop; Nvidia drivers still have trouble delivering smooth gameplay in DX12, while DX11 is fine.

    Also, you can power mod the Vega 56 to Vega 64 power levels and beyond - the mod works on the Vega 64 too - and then both Vega GPUs can compete with the 2070 when tuned.

    This Gamers Nexus video doesn't do all the mods, which can also increase memory performance and reduce power draw - but it does mention them; buildzoid did them last year. It's fun anyway. Looking at the scores, you can see the Vega 64 could also benefit from tuning from stock, as the tuned Vega 64 would perform higher than the tuned Vega 56.

    Beating the RTX 2070 with Vega 56 Mods | Unlimited Power
    Gamers Nexus
    Published on Oct 24, 2018
    We modded an AMD RX Vega 56 card to outperform the RTX 2070 with some powerplay table mods.
    Article: https://www.gamersnexus.net/guides/3382-rtx-2070-versus-power-modded-vega-56
    The AMD RX Vega 56 card has remained a top choice of ours since its launch, but it has been scarcely available over the past year. Vega cards -- including Vega 64 -- have finally become more available with the collapse of mining, and they're particularly interesting when considering the current landscape.
    https://www.gamersnexus.net/guides/3382-rtx-2070-versus-power-modded-vega-56
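
    Mechanically, the Windows version of those powerplay mods comes down to overriding the card's limits with a binary "soft PowerPlay" blob in the registry. A minimal sketch below, for illustration only: the registry value name is the commonly documented soft-PowerPlay override, but the "0000" instance suffix and the byte offsets differ per system and VBIOS, so TDP_OFFSET here is a hypothetical placeholder, not a real constant - for actual offsets follow the buildzoid / Gamers Nexus material above.

    import winreg

    # Commonly documented location of AMD's "soft PowerPlay" override on Windows.
    # The "0000" GPU-instance suffix varies per system; check which one is your card.
    KEY_PATH = (r"SYSTEM\CurrentControlSet\Control\Class"
                r"\{4d36e968-e325-11ce-bfc1-08002be10318}\0000")
    VALUE_NAME = "PP_PhmSoftPowerPlayTable"
    TDP_OFFSET = 0x60  # HYPOTHETICAL byte offset of the power-limit field

    def read_table() -> bytes:
        """Read the current soft PowerPlay blob (REG_BINARY), if present."""
        with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, KEY_PATH) as key:
            data, _kind = winreg.QueryValueEx(key, VALUE_NAME)
            return data

    def bump_power_limit(table: bytes, new_tdp_watts: int) -> bytes:
        """Return a copy of the blob with the (assumed) TDP byte patched."""
        patched = bytearray(table)
        patched[TDP_OFFSET] = new_tdp_watts  # single-byte patch, illustration only
        return bytes(patched)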
     
    Last edited: Dec 27, 2018
    Vasudev likes this.
  23. hmscott

    hmscott Notebook Nobel Laureate

    Reputations:
    7,110
    Messages:
    20,384
    Likes Received:
    25,139
    Trophy Points:
    931
    Gigabyte + Nvidia went nuts with 40 EEC (Eurasian Economic Commission) registrations for variants of the 2060 (see the video / article below), with 6 variants sifting out.

    0:58 - 6 DIFFERENT RTX 2060's
    Published on Dec 27, 2018
    Sources & Timestamps!
    • 0:58 - 6 DIFFERENT RTX 2060's: http://bit.ly/2Rea5fx
    http://bit.ly/2GFMMan
    • 4:18 - Afterburner OC Scanner for Pascal: http://bit.ly/2Ahjvgu
    • 5:12 - Nvidia's New Mesh Shadows: http://bit.ly/2R43hB9
    • 5:39 - 4 New Intel CPUs: http://bit.ly/2SmrIXP
    • 6:20 - Intel's New Fabs: http://bit.ly/2QGU0iP
    • 6:57 - Netflix Happy with Witcher Show: http://bit.ly/2QT3rfh
    • 7:34 - Elder Scrolls Cookbook: http://bit.ly/2ENl0WJ
    • 7:57 - 120Hz OLED Monitors: http://bit.ly/2AaylFr
    • 8:39 - In-Win's Insane Case: http://bit.ly/2UZo8oe
    • 9:15 - FF VII Remake Progressing Smoothly: http://bit.ly/2GyqXcA
    • 9:43 - Video Game Map Sizes: https://youtu.be/UvFdMax3wlA
    • 10:16 - Halo Infinite for PC: http://bit.ly/2SdvVNd
    • 10:49 - MK11 PC Specs: http://bit.ly/2QM4wWf
    • 11:10 - AMD Joins NASDAQ-100: http://bit.ly/2GvK5rB
    http://bit.ly/2UZYC23

    NVIDIA GeForce RTX 2060 to Ship in Six Variants Based on Memory Size and Type
    by btarunr, Tuesday, December 25th 2018
    https://www.techpowerup.com/250924/...in-six-variants-based-on-memory-size-and-type

    "There are at least two parameters that differentiate the six (that we know of anyway): memory size and memory type. There are three memory sizes, 3 GB, 4 GB, and 6 GB. Each of the three memory sizes come in two memory types, the latest GDDR6 and the older GDDR5.

    Based on the six RTX 2060 variants, GIGABYTE could launch up to thirty nine SKUs. When you add up similar SKU counts from NVIDIA's other AIC partners, there could be upward of 300 RTX 2060 graphics card models to choose from.

    It won't surprise us if in addition to memory size and type, GPU core-configurations also vary between the six RTX 2060 variants compounding consumer confusion. The 12 nm "TU106" silicon already has "A" and "non-A" ASIC classes, so there could be as many as twelve new device IDs in all! The GeForce RTX 2060 is expected to debut in January 2019."
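
    Spelling out the combinatorics from that quote in a quick Python enumeration (the 39-SKU figure is Gigabyte's from the article; the device-ID count assumes the quoted A / non-A split):

    from itertools import product

    mem_sizes = ["3 GB", "4 GB", "6 GB"]
    mem_types = ["GDDR6", "GDDR5"]

    variants = list(product(mem_sizes, mem_types))
    print(len(variants))                      # 3 sizes x 2 types = 6 variants

    asic_classes = ["A", "non-A"]
    print(len(variants) * len(asic_classes))  # up to 12 distinct device IDs

    # Gigabyte alone registered up to 39 SKUs spread across those 6 variants.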

    Prima.Vera
    "The leather jacket idiot seems to be loosing completely his mind. By releasing 6 different cards under the same name it’s the proof that nVidia lost it. And lost it hard...Posted on Dec 25th 2018, 21:01"

    GoldenX
    "The whole RTX launch seems something out of Bethesda / EA Games."

    GeForce RTX 2060
    A huge leak or a futureproof entry to EEC registry?
    https://videocardz.com/79452/gigabyte-submits-geforce-rtx-2060-6gb-4gb-and-3gb-variants-to-eec
     
    Last edited: Dec 27, 2018
  24. hmscott

    hmscott Notebook Nobel Laureate

    Reputations:
    7,110
    Messages:
    20,384
    Likes Received:
    25,139
    Trophy Points:
    931
    Another RTX Mobile leak...

    GeForce RTX 20 Mobile Series teased by Chinese OEM
    Published: 26th Dec 2018, 18:43 GMT
    https://videocardz.com/79462/geforce-rtx-20-mobile-series-teased-by-chinese-oem

    "It has been speculated that RTX Mobile series might launch soon, possibly at CES 2019 early next year. We have, in fact, seen numerous notebooks featuring 8th Gen Core Coffee Lake H series, which are set to launch next month. The 9th Gen Core S Series are also to be paired with RTX Mobile series but only in barebone systems which offer full Z370 chipset with desktop socket support.

    Chinese OEM called CJSCOPE confirmed such system featuring RTX 2080 MXM graphics card. The notebook codenamed HX-970 GX will be offered with RTX 2080 MXM featuring 2944 CUDA cores and up to 1847 MHz core clock.

    The optional specs also list RTX 2070 and RTX 2060. The latter will be available on January 15th, this is actually the date that we have heard for the desktop version.

    The clock speeds are slightly higher than desktop Founders Editions. These are probably A variants, which are capable of sustaining higher frequency. The Max-Q variants (even more power efficient SKUs) will probably launch later when thinner designs are ready for launch (such as ASUS Zephyrus)."
    CJSCOPE-GeForce-RTX-2080-MXM.jpg
    GEFORCE-RTX-2080-2070-2060-CJSCOPE-Specs (1).png
    There is also a "performance" comparison chart image, but it's so far off I'm not going to post it. It shows the "1080ti" (instead of a 1080) as the same performance as the 2070, so right there it's wrong already. The whole thing might be bogus given the performance chart is wrong...or it could be an unfortunate typo... adding "ti". :(
     
    Last edited: Dec 28, 2018
    Installed64 and Vasudev like this.
  25. Mr. Fox

    Mr. Fox BGA Filth-Hating Elitist

    Reputations:
    37,213
    Messages:
    39,333
    Likes Received:
    70,629
    Trophy Points:
    931
    LOL, that is absolutely ludicrous. "A chicken in every pot." :bigrolleyes:

    I don't get it. I could see a few different models, but why so many 3GB, 4GB and 6GB variations of the least desirable RTX card? Gigabyte has never impressed me, but now we have evidence they are idiots. I wonder if any of their competitors are going to be equally dumb.

    One of my sons used to work at the Chandler fab location. Quite the facility. It is still under construction and has been for almost 8 years. Huge place.
    upload_2018-12-27_21-32-38.png

    That is a Clevo P7XX. Should be interesting to see how they cool that little beast. Hopefully, they won't be trying to use the same heat sink as before, because it was not adequate for an i7 quad core and a 1080.
     
    Last edited: Dec 27, 2018
  26. bennyg

    bennyg Notebook Virtuoso

    Reputations:
    1,567
    Messages:
    2,370
    Likes Received:
    2,375
    Trophy Points:
    181
    That is an ungodly level of SKU bloat. They surely can't all be 2060s intended to be available at release.
     
    hmscott and Mr. Fox like this.
  27. hmscott

    hmscott Notebook Nobel Laureate

    Reputations:
    7,110
    Messages:
    20,384
    Likes Received:
    25,139
    Trophy Points:
    931
    It's very strange, and it may be nothing more than placeholders that will never ship.

    Or it's another Nvidia BS business move - a cheap, underhanded way to steal a bunch of shelf space in the mid-range, where AMD has a sane / small number of SKUs for the RX 580 / RX 590 / RX 570 / ...

    AMD's normal number of SKUs will hardly be noticeable among the sea of RTX 2060s, if all of those show up and take a place on the shelf.
    The reports were that even with a double-MXM-sized card and more cooling, the 1080 Ti wasn't doable in the highest end Clevo laptop.

    IDK what's changed since then, I guess we are gonna find out if it ships. :)
     
    Last edited: Dec 28, 2018
    Mr. Fox likes this.
  28. Vasudev

    Vasudev Notebook Nobel Laureate

    Reputations:
    12,035
    Messages:
    11,278
    Likes Received:
    8,814
    Trophy Points:
    931
    It was an R5 M330 at 720p with a minor OC. A crap GPU in an enthusiast's eyes.
     
    hmscott likes this.
  29. Donald@Paladin44

    Donald@Paladin44 Retired

    Reputations:
    13,989
    Messages:
    9,257
    Likes Received:
    5,842
    Trophy Points:
    681
    Was this a prototype 1080Ti? I ask because there has never been a laptop 1080Ti.
     
    hmscott and Mr. Fox like this.
  30. hmscott

    hmscott Notebook Nobel Laureate

    Reputations:
    7,110
    Messages:
    20,384
    Likes Received:
    25,139
    Trophy Points:
    931
    Vasudev likes this.
  31. hmscott

    hmscott Notebook Nobel Laureate

    Reputations:
    7,110
    Messages:
    20,384
    Likes Received:
    25,139
    Trophy Points:
    931
    " ...wasn't doable..." - as in never done?

    I searched for posts by @Mr. Fox @Prema @Papusan as I recall some discussions about the feasibility of a 1080ti in a Clevo. Maybe they remember? Clevo P870xxx 1080ti musings? Links?

    What about fitting / cooling an RTX 2080 in Clevo chassis? Any comments? :)

    http://forum.notebookreview.com/threads/nvidia-thread.806608/page-128#post-10838935
     
    Last edited: Dec 28, 2018
    Vasudev likes this.
  32. Vasudev

    Vasudev Notebook Nobel Laureate

    Reputations:
    12,035
    Messages:
    11,278
    Likes Received:
    8,814
    Trophy Points:
    931
    Yeah. It's my sister's laptop. I secretly game on it when she's not at home, just to test it out. Lenovo downclocked it for some reason via a BIOS update, but I can push it higher with a power limit of +20-30% and also put +200MHz on the core and +200MHz on the memory w/o much increase in temps, since the ASIC quality is very good.
     
    hmscott likes this.
  33. Papusan

    Papusan Jokebook's Sucks! Dont waste your $$$ on Filthy

    Reputations:
    42,701
    Messages:
    29,840
    Likes Received:
    59,615
    Trophy Points:
    931
    Nvidia crippled working 1080 chips and created the 1080 Max-Q scam (120W TDP) last year. Binning up Turing chips with a 200W TDP to perform more like the old Pascal 1080 Ti shouldn't be any more difficult.
    1. April :vbthumbsup:
     
    Last edited: Dec 28, 2018
    Aroc, Mr. Fox, Vasudev and 1 other person like this.
  34. Donald@Paladin44

    Donald@Paladin44 Retired

    Reputations:
    13,989
    Messages:
    9,257
    Likes Received:
    5,842
    Trophy Points:
    681
    Yup...April Fools strikes again :)
     
    Vasudev, Papusan and hmscott like this.
  35. hmscott

    hmscott Notebook Nobel Laureate

    Reputations:
    7,110
    Messages:
    20,384
    Likes Received:
    25,139
    Trophy Points:
    931
    Yeah, I remember that post by @Mr. Fox , but that's not what I was thinking of, as there were serious discussions before or after that. Thanks for bringing up that link though. :)

    That's just it: the desktop 1080's 200W trimmed down to a 120W 1080 is what most people got in the laptop 1080 MXM.

    The top end Clevo 1080 MXM cards were 180W/200W, and the desktop 1080ti is 250W stock / 300W+ OC'd, with the desktop 2080 FE at 265W stock, and who knows how much OC'd.

    You'd need at least 200W for a cut-down mobile version of the 1080ti / 2080, and that wouldn't leave any room for OC.

    But there are those (mockup?) ads for a Clevo RTX 2080 - not even a P870 according to @Mr. Fox, but a P7xx, which has less cooling capability(?).

    We heard GTX 1180 Clevo rumors last year, so why not a new 2018 Xmas rumor for the RTX 2080 Clevo? :)

    Update: Found this closed thread, but it was somewhere else I participated in that discussion over a year ago...

    http://forum.notebookreview.com/threads/gtx-1080-ti-in-laptops.801596/

    Maybe here:

    http://forum.notebookreview.com/thr...scussion-thread.794965/page-255#post-10344498
     
    Last edited: Dec 28, 2018
    Aroc, Arrrrbol and Papusan like this.
  36. Papusan

    Papusan Jokebook's Sucks! Dont waste your $$$ on Filthy

    Reputations:
    42,701
    Messages:
    29,840
    Likes Received:
    59,615
    Trophy Points:
    931
    I'm not interested in how Turing in BGA form will perform (crippled design). Pascal mobile cards already have a power cap and need some love to be able to overclock. It will be the same for Turing. Not much has changed.

    Shaving off from 230W down to 200W should still give at least 1080 Ti performance in "some" laptops.
    upload_2018-12-28_11-30-31.png
     
    hmscott and Arrrrbol like this.
  37. Vasudev

    Vasudev Notebook Nobel Laureate

    Reputations:
    12,035
    Messages:
    11,278
    Likes Received:
    8,814
    Trophy Points:
    931
    Nvidia can't save themselves from demise even if they turn the TDP up to 400W on RTX-M GPUs, because RTX is too-expensive BGA and the benefit of owning RTX-M is almost zero. Even a name change for the max-performance tier ("MAX R Rated GTX") won't make people jump to RTX mobile, because the cost will be higher than desktop cards.
     
    hmscott likes this.
  38. Mr. Fox

    Mr. Fox BGA Filth-Hating Elitist

    Reputations:
    37,213
    Messages:
    39,333
    Likes Received:
    70,629
    Trophy Points:
    931
    I do not think that is going to happen. People have grown numb to it and already pay way too much for notebook junk without batting an eye regardless of where the product falls on the performance spectrum. A couple hundred bucks difference (which is what I anticipate) won't be much of a hiccup in the grand scheme of things. I still believe the pricing issues we are seeing with Turing are a residual effect of how bitcoin miners ruined things, not specific to RTX. Once prices inflate it is difficult to normalize them ever again, but high supply and low demand are the best way to get there. We still have the exact opposite (high demand and low supply). Heck, even used 1080 Ti prices are insane due to high demand and residual effect of inflation due to the miner craze, and to some lesser extent, due to recoil against the high demand/low supply pricing of RTX cards.

    Speaking of Clevo 1080, here is a nice desktop versus Clevo monsterbook comparison recently posted by Brother @Prema...

    https://www.3dmark.com/compare/fs/15149060/fs/17595622
     
    Last edited: Dec 28, 2018
    Aroc, Papusan, hmscott and 3 others like this.
  39. Mr. Fox

    Mr. Fox BGA Filth-Hating Elitist

    Reputations:
    37,213
    Messages:
    39,333
    Likes Received:
    70,629
    Trophy Points:
    931
    El Munstro! Ray tracing... meh, whatever. :nah: But, OMG, look at that bad boy run. :biggrin:

    https://www.3dmark.com/vrpor/290292 | http://hwbot.org/submission/4023309_
    https://www.3dmark.com/vrpcr/23279 | http://hwbot.org/submission/4023261_
    https://www.3dmark.com/vrpbr/51120 | http://hwbot.org/submission/4023273_

    Compare it to 1080 Ti (based on 3DMark average score)
    upload_2018-12-28_10-58-31.png
    But, I prefer to compare it to my own best score (higher than shown above), and it is a +43% graphics performance increase. And that's a modded 1080 Ti up against a 2080 Ti with a typical stock throttle-meister cancer vBIOS. Just imagine how it will run with decent mods to correct the NVIDIOT's approach to metering performance. https://www.3dmark.com/compare/vrpbr/51120/vrpbr/38449
    upload_2018-12-28_11-1-43.png
     
    Last edited: Dec 28, 2018
    Aroc, Papusan, hmscott and 2 others like this.
  40. hmscott

    hmscott Notebook Nobel Laureate

    Reputations:
    7,110
    Messages:
    20,384
    Likes Received:
    25,139
    Trophy Points:
    931
    More rumors + vendor puffery...

    NVIDIA GeForce RTX 2060 to cost 349 USD

    Today we present the full, confirmed specs of GeForce RTX 2060.
    https://videocardz.com/79505/nvidia-geforce-rtx-2060-pricing-and-performance-leaked

    Metal Messiah ++ Jerome Subia 2 hours ago
    "Nothing is mentioned by the way, with regards to the in-game graphic settings used, except the resolution used. So, this chart makes little sense, because it lacks proper data.

    These fps numbers are a bit too early to come to any conclusion though, so we need to wait for third party benchmarks.

    As these numbers have been taken from Nvidia's reviewers guide, I would rather wait for actual Gaming benchmarks from other TECH sites."

    Sharkiller 3 hours ago
    "Battlefield V with ray tracing enabled (1080p)
    Battlefield V: RT Off: 90 FPS
    Battlefield V: RT On + DLSS Off: 65 FPS
    Battlefield V: RT On + DLSS On: 88 FPS

    if they are using 1080p for rtx then they are using dlss on 1080p? WTF? so rendering at what? 720p?"

    eddmann 4 hours ago
    "Launch MSRP: (non-founders)
    760: $250
    960: $200
    1060 6GB: $250
    2060: $350?!

    They finally broke their low mid-range card price record."

     
    Last edited: Dec 30, 2018
    Aroc, Vasudev and Talon like this.
  41. Talon

    Talon Notebook Virtuoso

    Reputations:
    1,482
    Messages:
    3,519
    Likes Received:
    4,694
    Trophy Points:
    331
    The most interesting bit for me is the use of, and presumable patch to include, DLSS in BFV alongside ray tracing. DICE devs said before that they would likely use DLSS in the future, and it looks like it's coming soon.

    Navi will be DOA before AMD ever gets it launched. Similar to the 10 series, Nvidia will be a year out in front of AMD before Navi launches, and Navi still won't match or come close to Nvidia's high end. Nvidia will likely slash prices after their 10 series stockpile is gone, sometime around Q1 2019, and will really turn the screws on AMD.
     
    Mr. Fox likes this.
  42. hmscott

    hmscott Notebook Nobel Laureate

    Reputations:
    7,110
    Messages:
    20,384
    Likes Received:
    25,139
    Trophy Points:
    931
    That's why it's best to wait for reviews and more reports from owners before investing, as there are many questions to raise about the settings, configuration, and patch availability - whether it's a future game patch, a future GPU driver update, or additional Windows patches, each with its accompanying delay until a bug-free release finally arrives many months later.

    Nvidia's greed will likely work against them and for AMD, keeping pricing in AMD's favor. And with AMD's 2019 performance and release dates unknown - even whether AMD is releasing new GPUs, and in what range - it's way too early to make pronouncements.

    Nvidia is dying right now with almost $2B of dead inventory - GPUs that would compete against their own offerings, not just AMD's - and although eating all that inventory would cause financial harm, selling it for what Nvidia wants to charge would kill their RTX / 20 series GPU sales.

    If you think about it clearly, the 20 series GPUs are just higher priced 10 series GPUs - the same performance - except for the 2080ti / Titan RTX; those are the only "steps up" in performance from the 10 series.

    The RTX / 20 Series BS is another sneaky Nvidia way of selling the same GTX core performance hardware in "re-named" / "re-badged" skins; they are not the same dies, but they have the same core performance at higher prices.

    2050 = 1060
    2060 = 1070
    2070 = 1080
    2080 = 1080ti

    If you are looking to buy those 20 series models, you are overpaying for last generation performance and would be better off finding new / refurbished / used 10 series GPUs.

    2080ti = 2x too much money for performance value
    Titan RTX = 4x too much money for performance value

    Those are the "I've got more money than sense" SKUs, and each purchase encourages Nvidia to continue gouging us with unfair, monopoly pricing - encouraging Nvidia by enabling them.
     
    Last edited: Dec 29, 2018
  43. Papusan

    Papusan Jokebook's Sucks! Dont waste your $$$ on Filthy

    Reputations:
    42,701
    Messages:
    29,840
    Likes Received:
    59,615
    Trophy Points:
    931
  44. hmscott

    hmscott Notebook Nobel Laureate

    Reputations:
    7,110
    Messages:
    20,384
    Likes Received:
    25,139
    Trophy Points:
    931
    $100...

    From my earlier post:

    eddmann 4 hours ago
    "Launch MSRP: (non-founders)
    760: $250
    960: $200
    1060 6GB: $250
    2060: $350?!"

    http://forum.notebookreview.com/threads/nvidia-thread.806608/page-129#post-10839810

    As we know from long experience, MSRP with Nvidia is usually a "Lure for the Unwary", with the really desirable units costing much more and an additional high "sucker tax" added for "early adopters".
    takemymoney.jpg
     
  45. Papusan

    Papusan Jokebook's Sucks! Dont waste your $$$ on Filthy

    Reputations:
    42,701
    Messages:
    29,840
    Likes Received:
    59,615
    Trophy Points:
    931
    I posted what they charge now. Apples vs. apples can be so many things :D
     
  46. hmscott

    hmscott Notebook Nobel Laureate

    Reputations:
    7,110
    Messages:
    20,384
    Likes Received:
    25,139
    Trophy Points:
    931
    Apples vs Apples would be MSRP release pricing for both products, as a fair comparison.

    Fanboi troll posts would be the MSRP of a product against the "best possible fantasy price" you can find for a "what they charge now" troll post. :p
     
    Vasudev and Papusan like this.
  47. Papusan

    Papusan Jokebook's Sucks! Dont waste your $$$ on Filthy

    Reputations:
    42,701
    Messages:
    29,840
    Likes Received:
    59,615
    Trophy Points:
    931
    HaHa, it could be worse... Nvidia could throw out Radeon RX Vega 56 prices and performance ($400).

    For the record... I don't defend Ngreedia's price policy :) They are ****y.
     
    Last edited: Dec 29, 2018
  48. Talon

    Talon Notebook Virtuoso

    Reputations:
    1,482
    Messages:
    3,519
    Likes Received:
    4,694
    Trophy Points:
    331
    Just looking at the leaked reviewers guide, it would appear the 2060 is actually around 1070 Ti to 1080 performance. In some games it beats both.

    The 2060 is double the 1060's FPS for $50 more MSRP.
     
  49. hmscott

    hmscott Notebook Nobel Laureate

    Reputations:
    7,110
    Messages:
    20,384
    Likes Received:
    25,139
    Trophy Points:
    931
    The GTX 1060 6GB FE MSRP was $299; the 1060 6GB AIB MSRP was $249:

    " The GTX 1060 will enter the market with a range of approximately $250-$300. NVidia's MSRP is $250 for board partners, with scalability to higher prices as those AIB partners feel appropriate. The Founders Edition card will be priced at $300, and is properly limited in production for this run of the FE. "

    Official NVIDIA GTX 1060 Specs, Price, & Release Date
    By Steve Burke Published July 07, 2016 at 9:00 am
    https://www.gamersnexus.net/news-pc/2505-official-nvidia-gtx-1060-specs-price-release-date

    The 2060 "leaked" AIB MSRP is $349...

    That means the price differential between the 1060 6GB AIB MSRP and the 2060 AIB MSRP is $100, not $50.
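
    For anyone checking the arithmetic, the numbers above reduce to this (launch AIB MSRPs from the Gamers Nexus article and eddmann's list; the 2060 figure is the leaked AIB MSRP):

    # x60-class AIB (board-partner) launch MSRPs, USD
    msrp = {"GTX 760": 250, "GTX 960": 200, "GTX 1060 6GB": 249, "RTX 2060": 349}

    print(msrp["RTX 2060"] - msrp["GTX 1060 6GB"])  # $100 AIB-vs-AIB increase
    print(msrp["RTX 2060"] - 299)                   # $50 only vs the 1060 FE's $299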

    Which means the 2060 continues the RTX trend of drastically overpricing each SKU's price point, compared to previous Nvidia releases that for many generations kept the price point the same but gave us more performance.

    Buying RTX GPUs encourages Nvidia to continue this trend of overcharging - gouging us at each price point and pricing many people out of the gaming GPU market - so people end up spending more for less, or the same for performance that has been available for well over 2 years.

    This whole Nvidia RTX release cycle has been a continuation of the rip-off of gamers - the overpriced GPU pricing brought about by the nutzo crypto-mining craze.

    Besides, there are going to be plenty of used and refurbished AMD and Nvidia GPUs for less than $349 that are as good as or better than the 2060 in performance, with RTX GPUs being "sucker taxed" even higher than MSRP.

    The market will adjust, and Nvidia's RTX GPUs will continue to be overpriced - at MSRP or actual prices - with AMD and GTX 10 series GPUs being much better price/performance buys.
     
    Last edited: Dec 29, 2018
  50. Papusan

    Papusan Jokebook's Sucks! Dont waste your $$$ on Filthy

    Reputations:
    42,701
    Messages:
    29,840
    Likes Received:
    59,615
    Trophy Points:
    931