The Notebook Review forums were hosted by TechTarget, which shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information posted on the forums is preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.

    Nvidia Thread

    Discussion in 'Hardware Components and Aftermarket Upgrades' started by Dr. AMK, Jul 4, 2017.

  1. Mr. Fox

    Mr. Fox BGA Filth-Hating Elitist

    Reputations:
    37,213
    Messages:
    39,333
    Likes Received:
    70,629
    Trophy Points:
    931
    If you keep going like this you're probably going to have a heart attack or something. Just chill, dude, LOL. It's OK to hate something passionately, like I hate BGA turdbooks. But, you're burning a lot of calories on trying to make AMD look like their poop doesn't stink and making Intel and NVIDIA look bad. Just like my loathing of BGA filth isn't going to stop the oozing of feces from Father Technology's butt-cheeks, you're not going to change anything. Kind of like going outside with a fly swatter on a mission to rid the world of flies.

    I think you are going to like the video I am making over the next few days. The performance of this RTX 2080 Ti GPU is very impressive in spite of its ludicrous price tag. It's more than a little bit faster than my modded 1080 Ti, even though I am running it with air cooling at the moment. I am basically doing an assortment of benchmarks (synthetic and games). The start of the video will show the "ideal 1080 Ti" running on chilled water at 20°C with the hardware power mod, stock EVGA vBIOS and stock clocks; the ultimate perfect environment for a stock 1080 Ti to shine, with the typical gamer-boy scenario (i.e. hot air cooling and spastic thermal throttling at 50°C like an ordinary 1080 Ti) totally eliminated. I am comparing that "perfect scenario" 1080 Ti to the standard horrible stock usage scenario (i.e. hot air cooling) of the 2080 Ti, then comparing both with the 2080 Ti on chilled water. So, a 3-way comparison: perfect scenario 1080 Ti vs. standard horrible scenario 2080 Ti vs. perfect scenario 2080 Ti. I just finished the first two phases of testing and I can tell you the 2080 Ti more or less demolishes the 1080 Ti at stock versus stock, even with the hot air cooling handicap. I hope to have that ready by this weekend.

    It seems like I am seeing a bit of a pattern. It is too soon in the testing process for me to validate my impression. A lot of the YouTube reviewers are comparing mainstream game performance, which is understandable. But, that seems to be a little bit misleading in a certain way. I have to do more testing to confirm it. If my 2080 Ti is a good one (meaning it survives without dying from artifacting and BSOD issues like a small percentage of them do) and I can finish my testing to confirm this, my impression is that the harder the 2080 Ti is pushed, the more it shines. It is most similar to the 1080 Ti (meaning a small performance bump) under lighter workload scenarios.

    I am testing DX10, DX11 and DX12 under high stress scenarios. It is looking like it really opens a can of whoop ass on the 1080 Ti under a severe workload, but not so much when it doesn't have to work super hard. In a few minutes I am going to pull the GPU and install the Hydro Copper block and over the next few days (after work in the evenings) I will finish my testing and compile the video.

    Here is a teaser... Again, this is stock versus stock, exactly the same driver version... chilled water at 20°C giving ideal functional conditions for the 1080 Ti, so the 1080 Ti has every opportunity to do everything it is capable of doing extraordinarily well, against the ordinary hot-air, thermal-throttling, average gamer-boy scenario on the 2080 Ti.

    https://www.3dmark.com/compare/spy/5449865/spy/5459348
    upload_2018-12-19_22-45-10.png @jaybee83 @Robbo99999
     
    Last edited: Dec 20, 2018
  2. Talon

    Talon Notebook Virtuoso

    Reputations:
    1,482
    Messages:
    3,519
    Likes Received:
    4,694
    Trophy Points:
    331
    Very much looking forward to your video bro. Also can't wait to see what you're able to achieve with chilled water.
     
  3. Mr. Fox

    Mr. Fox BGA Filth-Hating Elitist

    Reputations:
    37,213
    Messages:
    39,333
    Likes Received:
    70,629
    Trophy Points:
    931
    Yeah, me too. Using Precision X1 to let it auto-tune its own overclock, it runs around 2075MHz boost clock at 1.075V using the stock EVGA vBIOS. I haven't benched it overclocked yet because of the testing I am doing for the video, but that's pretty impressive for such a noobish way of establishing an overclock. I am going to have to do some Google searching to see if I can do the same hardware mod that I did to the 1080 Ti. I bought extra resistors when I modded the 1080 Ti, so I have what I need if it works exactly the same.
     
    Robbo99999 and hmscott like this.
  4. hmscott

    hmscott Notebook Nobel Laureate

    Reputations:
    7,110
    Messages:
    20,384
    Likes Received:
    25,139
    Trophy Points:
    931
    My Pro-AMD statements are nowhere near the extreme levels of frothing at the mouth ranting that NBR Anti-BGA'ers exhibit about LGA. I'm a pussy-cat in comparison. Bambi's less bold cousin. A benign and innocuous champion of common sense. :D

    Besides, I am not doing the heavy lifting, Nvidia and Intel are the ones falling over themselves screwing up. I expect Intel and Nvidia will continue to do themselves in in 2019.

    As far as AMD, we'll see how they fare in 2019. AMD has a much more limited budget and a limited ability to branch out into all of their opportunities simultaneously, but overall the trend of failure for Intel and Nvidia should help AMD look better and better as 2019 unfolds.

    AMD has been given a wonderfully large window of opportunity by Intel and Nvidia, and I hope AMD are able to take full advantage of it in 2019. :)
     
    Last edited: Dec 20, 2018
  5. hmscott

    hmscott Notebook Nobel Laureate

    Reputations:
    7,110
    Messages:
    20,384
    Likes Received:
    25,139
    Trophy Points:
    931
    You should consider re-testing the 1080 Ti, as the CPU score looks higher for the 2080 Ti run as well. You may have tweaked things in between to get a better overall system result more recently, and those newer system tunings would give better 1080 Ti results if the runs were redone today alongside the 2080 Ti runs.

    The good news is the Titan RTX is barely a step up from the 2080 Ti, so you dodged a 2x $2500 => $5000 ante to benchmark a pair of Titan RTXs.

    As you mentioned SLI, are you going to up the game to a pair of 2080ti's in SLI?
     
    Last edited: Dec 20, 2018
    Falkentyne likes this.
  6. Mr. Fox

    Mr. Fox BGA Filth-Hating Elitist

    Reputations:
    37,213
    Messages:
    39,333
    Likes Received:
    70,629
    Trophy Points:
    931
    I agree with that statement. Just waiting for the end results to see how much they can accomplish. That fanboi street runs both directions and you're drinking that red Kool-Aid pretty hard, LOL. I just want the "bestest" and don't care about brand. I don't care who wins or loses, and I will spend my money on the winner's toys. That said, I can see the value in low cost parts that can do respectable things without breaking the bank. My previous post about the RX 580 for $179 was about 95% sincere. I can see that as being perfectly suited for my 18 year old grandson that hasn't found his first job yet, or a college student living on Top Ramen and peanut butter and jelly sandwiches. For someone on a fixed income or a teenager gaming on mommy's and daddy's dime, it has a lot of merit and there is a place for that. There is also a legitimate place for those dirt cheap Walmart Overpowered gamer-boy turdbooks as well. $499 for a gamer-kid turdbook is a win for parents or a low-income gamer, or someone who just doesn't care as long as it works.
     
    Last edited: Dec 20, 2018
  7. Mr. Fox

    Mr. Fox BGA Filth-Hating Elitist

    Reputations:
    37,213
    Messages:
    39,333
    Likes Received:
    70,629
    Trophy Points:
    931
    Nope. Nothing was tweaked at all. Look again. There is only a 3.6% difference in CPU performance. Identical BIOS settings. All I did was swap GPUs and reinstall the same driver. And, you're only seeing a teaser. There are other examples where the CPU flips the other way. That tiny difference is well within a normal margin of error. On top of that, Turing may require less CPU involvement than Pascal. All of the huge performance differences are on the GPU.
    I am pretty sure I will never be willing to buy a Titan GPU. I have never been interested in buying one before and I really doubt that will change. I have an expensive hobby to a certain extent, but there is a cap on my insanity. I also doubt that I will have enough discretionary funds for a second 2080 Ti. If I did, I would absolutely love to do that. Mrs. Fox would probably tear a chunk out of my hide if I did that. But, the Titan is out of the question. I probably wouldn't do the Titan even if I were filthy rich. Dumping $5000 on a GPU for playing games or benching is just stupid.
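
    For what it's worth, that "within margin of error" call can be eyeballed with a quick percent-difference check. A minimal sketch in Python, using made-up placeholder scores rather than the actual 3DMark numbers:

    ```python
    # Percent difference between two benchmark sub-scores.
    # The scores below are hypothetical placeholders, not the real runs.
    def pct_diff(baseline: float, other: float) -> float:
        """Percent difference of `other` relative to `baseline`."""
        return (other - baseline) / baseline * 100.0

    cpu_1080ti_run = 10000  # hypothetical CPU score from the 1080 Ti run
    cpu_2080ti_run = 10360  # hypothetical CPU score from the 2080 Ti run

    delta = pct_diff(cpu_1080ti_run, cpu_2080ti_run)
    print(f"CPU score delta: {delta:.1f}%")  # prints "CPU score delta: 3.6%"
    ```

    Anything within a few percent on identical settings is typically run-to-run variance rather than a real tuning change.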
     
    Last edited: Dec 20, 2018
    Arrrrbol and hmscott like this.
  8. hmscott

    hmscott Notebook Nobel Laureate

    Reputations:
    7,110
    Messages:
    20,384
    Likes Received:
    25,139
    Trophy Points:
    931
    In the review benchmarking that Jay did with the Titan RTX (see recent review), he said he got a big bump in 3DMark Graphics Score when he went from a stock 8700K to an OC'd 8700K, so I would expect the same from the 2080 Ti: an improvement in CPU performance will translate into a leveraged increase in 3DMark Graphics Score results.

    Also, I found that review interesting in that the Titan RTX had a larger improvement over the 2080 Ti in 1080p FPS, but the 1440p and 4K FPS increases were in the noise - too small to matter.

    Glad you are having fun with your new 2080ti. :)
     
    Last edited: Dec 20, 2018
    jclausius and Mr. Fox like this.
  9. Mr. Fox

    Mr. Fox BGA Filth-Hating Elitist

    Reputations:
    37,213
    Messages:
    39,333
    Likes Received:
    70,629
    Trophy Points:
    931
    Here is another teaser. Huge difference. And, it looks as if the 2080 Ti plays nicer with the higher core count CPU than a 1080 Ti does. That, or maybe Turing has some architectural changes so it just works better with any CPU. (I have no way to validate that, just a guess.) Everything is identical in this comparison. Same CPU/BIOS settings, same driver, same NVIDIA Control Panel settings and same in-game settings. DX12 with everything maxed out in graphics settings except for MSAA. Both are using TAA. The only change is the GPU hardware. The 1080 Ti is 99% GPU bound at 62 FPS and the 2080 Ti is only 30% GPU bound at 95 FPS, so there is definitely something majorly different with how Turing works compared to Pascal.

    At any rate, I don't believe we are seeing a 100% clear picture of Turing performance and potential from the professional reviewers on YouTube that are just playing some of the most popular games. I'm not saying their results are not accurate, only that it is a narrow view and there is more to be explored that maybe hasn't been fleshed out yet. My impression is that the declaration that it is too much for too little is not entirely accurate. Too much money just for the fun of playing games at a playable framerate, yes... but the gains might be bigger than some are making them out to be.
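
    Those FPS figures translate into a relative uplift like so (a trivial sketch; the 62 and 95 FPS values are the ones quoted above):

    ```python
    # Relative FPS uplift of the 2080 Ti over the 1080 Ti in the same scene.
    fps_1080ti = 62  # 99% GPU bound, per the comparison above
    fps_2080ti = 95  # only 30% GPU bound, per the comparison above

    uplift = (fps_2080ti / fps_1080ti - 1) * 100
    print(f"2080 Ti is {uplift:.1f}% faster here")  # ~53.2% faster
    ```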
     

    Attached Files:

    Last edited: Dec 20, 2018
    Robbo99999 and hmscott like this.
  10. yrekabakery

    yrekabakery Notebook Virtuoso

    Reputations:
    1,470
    Messages:
    3,438
    Likes Received:
    3,688
    Trophy Points:
    331
    Your 1080 Ti appears to be underperforming: https://www.techspot.com/review/1701-geforce-rtx-2080/page3.html

    [​IMG]
     
    hmscott and Mr. Fox like this.
  11. Mr. Fox

    Mr. Fox BGA Filth-Hating Elitist

    Reputations:
    37,213
    Messages:
    39,333
    Likes Received:
    70,629
    Trophy Points:
    931
    Not necessarily. As is often the case, we have no information on the details of the graphics settings other than "1440p [Highest Quality] and DX12" from this "review" LOL. There is a whole different screen of settings that affect performance that is not mentioned. For example, we do not know what they set for AA (which can make a huge difference). I do not see where they have disclosed any finer details. And, that is why synthetic benchmarks are best suited for testing: they remove human error and other inconsistency. I trust my own results more than a professional review. I will show all of my settings so others can replicate and validate. We cannot do that from this review.

    Did my Time Spy result look like the 1080 Ti was underperforming? ;)
     
    Last edited: Dec 20, 2018
    hmscott and jclausius like this.
  12. Robbo99999

    Robbo99999 Notebook Prophet

    Reputations:
    4,346
    Messages:
    6,824
    Likes Received:
    6,112
    Trophy Points:
    681
    Cool, looking forward to the video & seeing the results you get, as well as your insights!
     
    Mr. Fox likes this.
  13. yrekabakery

    yrekabakery Notebook Virtuoso

    Reputations:
    1,470
    Messages:
    3,438
    Likes Received:
    3,688
    Trophy Points:
    331
    They used the “Highest” preset like you did. Here are other results showing you are underperforming in SotTR.

    https://www.guru3d.com/articles_pag..._graphics_performance_benchmark_review,3.html

    [​IMG]

    https://www.legitreviews.com/nvidia-geforce-rtx-2080-ti-and-rtx-2080-benchmark-review_207896/8

    [​IMG]

    https://www.overclock3d.net/reviews/gpu_displays/nvidia_rtx_2080_and_rtx_2080_ti_review/21

    [​IMG]

    Platform differences maybe? It’s relatively common that X299 performs lower than Z370/Z390 and X99 in actual games rather than synthetic benchmarks due to its cache/mesh design.
     
    Mr. Fox and hmscott like this.
  14. Mr. Fox

    Mr. Fox BGA Filth-Hating Elitist

    Reputations:
    37,213
    Messages:
    39,333
    Likes Received:
    70,629
    Trophy Points:
    931
    Yes, that is entirely possible. Intel and AMD HEDT processors all perform slightly worse in games, even compared to a quad core in some cases. There are a number of benchmark runs I did with the 8700K that I cannot match with the 7960X. Superposition and the Gears of War 4 Benchmark are two right off the top of my head. Some titles have a harder time coping with the higher core/thread count than others.

    I will drop the 1080 Ti back in again, DDU the drivers and test once more to confirm it. After flashing the stock vBIOS on the 1080 Ti I did not do a clean driver install and that may have had some effect on it. The hardware ID is totally different. After installing the 2080 Ti I also had to reinstall SOTTR because it refused to launch after the change in GPU.
     
    jclausius, hmscott and yrekabakery like this.
  15. Mr. Fox

    Mr. Fox BGA Filth-Hating Elitist

    Reputations:
    37,213
    Messages:
    39,333
    Likes Received:
    70,629
    Trophy Points:
    931
    If it turns out the same after I re-test, that may also give some validity to my initial impression that maybe the RTX 2080 Ti is a better "partner" when paired with an HEDT i9 or TR CPU than pre-Turing GPUs are. It may have an architectural change that more effectively leverages the extra CPU horsepower. If that is actually true, it could also be very beneficial for the Zen/Zen2 processors with more than 6C/12T.
     
    jclausius, hmscott and yrekabakery like this.
  16. Talon

    Talon Notebook Virtuoso

    Reputations:
    1,482
    Messages:
    3,519
    Likes Received:
    4,694
    Trophy Points:
    331
    Hardware Unboxed did report that with the 2080 Ti and an 8700K, even at 5GHz, they were still getting slightly bottlenecked in BFV at 1440p. It's possible the higher core count CPUs paired with the 2080 Ti help it perform even better. I haven't seen any bottlenecking at 1440p in BFV with my 9900K @ 5GHz though. The extra cores definitely help in that game. I don't have Tomb Raider so I can't comment on that title.
     
    hmscott and Mr. Fox like this.
  17. Mr. Fox

    Mr. Fox BGA Filth-Hating Elitist

    Reputations:
    37,213
    Messages:
    39,333
    Likes Received:
    70,629
    Trophy Points:
    931
    I think there were a couple of issues. First issue... I didn't want to buy TR, so I was using the demo. I had to reinstall it twice and it would no longer run on the 1080 Ti for some reason. I sucked it up and bought the paid version on sale today just for the benchmark (don't really like TR titles/genre) and the paid version works correctly. Second issue... the latest driver seems to be goofed up or buggy. I downgraded to GeFarts driver 411.63 and now the framerate is what a stock 1080 Ti is supposed to look like. So the TR DEMO is the issue with the crashing and instability, and the driver seems to be the issue with the lower than expected framerate. Maybe the Green Goblin is already working on cancer drivers to cripple last gen GPUs to make a new 20-series GPU purchase more attractive.

    There are also some graphics settings that are not working on the free demo. Not sure why the demo would no longer run on the 1080 Ti but would on the 2080 Ti. It would either refuse to launch or freeze, but the paid version has no issues. Since I downgraded the driver, I will need to do that for the 2080 Ti and re-test to get a more accurate comparison.

    Edit: Just my luck... dang-it! :vbmad: Guess what is available now? FTW3! Darn. Already paid for the special warranty and 5 year extension. And, the Hydro Copper won't fit, so I cannot use their "Step Up" program. Maybe I will contact them tomorrow and ask for an exception and see if I can trade the water block and transfer the extended warranty as well.
     
    Last edited: Dec 20, 2018
    hmscott likes this.
  18. Robbo99999

    Robbo99999 Notebook Prophet

    Reputations:
    4,346
    Messages:
    6,824
    Likes Received:
    6,112
    Trophy Points:
    681
    I think it's been proven in a number of places that NVidia haven't gimped drivers (in terms of performance) for older architectures when a new one comes out. It's true that older cards might not get as much time spent on driver optimisations for new titles that are released, but I think it's been proven that they don't gimp existing performance.
     
    Mr. Fox and hmscott like this.
  19. Mr. Fox

    Mr. Fox BGA Filth-Hating Elitist

    Reputations:
    37,213
    Messages:
    39,333
    Likes Received:
    70,629
    Trophy Points:
    931
    Ignoring what happens to prior generation cards is a way of passively gimping them. Or, to be more kind, let's just say they don't give a rat's ass. Did you forget what they did to 780M SLI already? I haven't.
     
    Papusan likes this.
  20. Mr. Fox

    Mr. Fox BGA Filth-Hating Elitist

    Reputations:
    37,213
    Messages:
    39,333
    Likes Received:
    70,629
    Trophy Points:
    931
    Here are some pix. Been working since I got off work to get ready for stage 3 testing.

    [​IMG]
    [​IMG]
    [​IMG]
    [​IMG]
    [​IMG]
    [​IMG]
    [​IMG]
    [​IMG]
     
    Arondel, Falkentyne, Papusan and 3 others like this.
  21. hmscott

    hmscott Notebook Nobel Laureate

    Reputations:
    7,110
    Messages:
    20,384
    Likes Received:
    25,139
    Trophy Points:
    931
    Well, that's a work of art :)

    But, that 2080ti + Hydro is also not $1200, or $1300, or even $1500? With tax and shipping likely closer to $1750?
     
    Last edited: Dec 21, 2018
  22. Mr. Fox

    Mr. Fox BGA Filth-Hating Elitist

    Reputations:
    37,213
    Messages:
    39,333
    Likes Received:
    70,629
    Trophy Points:
    931
    Thanks. It does look pretty nice. Nicer than I thought it would. I wanted black, but they only make silver now.

    No, it was under $1500 for the GPU and water block, no tax and free shipping. I am selling my 1080 Ti and the new Rampage motherboard ASUS is delivering tomorrow and that should cover about half the cost of the upgrade, maybe a little more. So, I will probably be into it for around $600-700 net.
     
    Falkentyne, Papusan and hmscott like this.
  23. hmscott

    hmscott Notebook Nobel Laureate

    Reputations:
    7,110
    Messages:
    20,384
    Likes Received:
    25,139
    Trophy Points:
    931
    Nice price, best prices / availability I could find was these, no direct link:

    P/N: 11G-P4-2487-KR - $1349.99
    https://www.evga.com/products/produ...=GeForce+20+Series+Family&chipset=RTX+2080+Ti

    P/N: 400-HC-1489-B1 - $199.99
    https://www.evga.com/products/product.aspx?pn=400-HC-1489-B1

    Ahh, you got the XC gaming and not the FTW3, got it. :)

    P/N: 11G-P4-2282-KR - $1149.99
    https://www.evga.com/products/produ...=GeForce+20+Series+Family&chipset=RTX+2080+Ti

    P/N: 400-HC-1389-B1 - $179.99
    https://www.evga.com/products/product.aspx?pn=400-HC-1389-B1
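
    Summing the two pairings listed above (prices as quoted in the listings):

    ```python
    # Card + matching Hydro Copper block totals, using the listed prices.
    xc_combo = 1149.99 + 179.99   # XC Gaming + its Hydro Copper block
    ftw_combo = 1349.99 + 199.99  # FTW3 + its Hydro Copper block

    print(f"XC Gaming combo: ${xc_combo:.2f}")  # $1329.98
    print(f"FTW3 combo:      ${ftw_combo:.2f}")  # $1549.98
    ```

    So the XC pairing does indeed come in under $1500 before any tax or shipping.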
     
    Last edited: Dec 21, 2018
  24. hmscott

    hmscott Notebook Nobel Laureate

    Reputations:
    7,110
    Messages:
    20,384
    Likes Received:
    25,139
    Trophy Points:
    931
    Titan RTX is the WORST tech product of 2018
    Timmy Joe PC Tech
    Published on Dec 20, 2018
    2 years after pascal, 8 months of sky high mining prices and GPU shortages and all Nvidia has for us is $600 - $2500 graphics cards and dwindling over priced GTX 1060s? The RTX Titan is a terrible metaphor for the state of Nvidia right now, don't you agree?


    Titan RTX Frozen at 1350MHz: VBIOS Bug on Retail Card
    Gamers Nexus
    Published on Dec 20, 2018
    One of our two Titan RTX cards has a VBIOS bug that limits the frequency to 1350MHz, similar to what we saw on a dying RTX 2080 Ti card previously.
    We have two Titan RTX cards, one of which is presently dysfunctional. Fortunately, we think it's as simple as a VBIOS reflash to fix it, but we are parting with this card to help NVIDIA get to the bottom of the issue. If we fix it, which a reflash probably will do, it will make it difficult for NVIDIA to diagnose and fix. The good news is that we can still work on SLI/NVLINK Titan RTX benchmarks, but our Titan RTX review will hit the channel first, with a Titan RTX tear-down around the same time. In this video, we show how you can use NVFlash64.exe to fix the issue (most likely), although it should be an uncommon one.

    Kommando Kodiak 14 hours ago (edited)
    "Imagine how pissed Nvidia is right now for this particular card to show up of all places in steves lap"

    Osaka2407 15 hours ago (edited)
    "It just works, right? It's briliant! It works! Wait... Except it doesn't."

    Dead End 15 hours ago (edited)
    "Does the R in RTX stand for rushed? I can't recall them launching a line of GPUs that was (or at least very much seemed to be) this prone to issues."

    2018 TITAN RTX vs 2080TI vs 2080 vs 2070 | Tested 13 Games |
    For Gamers
    Published on Dec 20, 2018

    TITAN RTX vs 2080TI SLI vs 2080 SLI | Tested 13 Games |
    For Gamers
    Published on Dec 21, 2018
     
    Last edited: Dec 21, 2018
    Falkentyne likes this.
  25. mitchega

    mitchega Notebook Consultant

    Reputations:
    23
    Messages:
    103
    Likes Received:
    130
    Trophy Points:
    56
    Nice...please detail before and after performance. I've thought about water cooling mine as well, but after the initial $1250...I just can't stomach putting more into the cards performance. Especially since I would have to go full water cooling (OEM water cooled CPU only). I was looking at the EK blocks, but the full setup would be another $750 to fully convert, plus the time to set it up.
     
    Mr. Fox and hmscott like this.
  26. Mr. Fox

    Mr. Fox BGA Filth-Hating Elitist

    Reputations:
    37,213
    Messages:
    39,333
    Likes Received:
    70,629
    Trophy Points:
    931
    I hope to have the video done this weekend.

    Here is a sample of STOCK performance with the GPU on chilled water. It is boosting to 2040 with no core offset, LOL. Crazy. I need to hurry up and get done with this stock versus stock testing crap so I can start having fun overclocking the snot out of it.

    https://www.3dmark.com/3dm11/13071439

    [​IMG]
     
    Papusan, hmscott and jclausius like this.
  27. Robbo99999

    Robbo99999 Notebook Prophet

    Reputations:
    4,346
    Messages:
    6,824
    Likes Received:
    6,112
    Trophy Points:
    681
    I do remember some driver issues you were having with 780M SLI and extreme overclocking. I think you had to use an old driver to avoid some strange downclocking behaviour at high wattage draws? But we're talking about gimping performance in games for old generation cards when new ones come out, and I think that's been proven on various review sites to be bogus. I'm not disputing the strange downclocking behaviour you were seeing, but those 780Ms were overclocked way beyond almost any 780M and drawing more current than almost anyone would have ever been able to push them (vBIOS mods & air con cooling), so that's kind of an outlier situation. For instance, I had my 670MX in my M17xR3 overvolted & overclocked a whole bunch with a modded vBIOS for extra power draw (87% overclock, but only 600MHz at stock so not as impressive as it sounds), and my card didn't have any throttling with the newer drivers, so it was unlikely that many other users would see issues - your 780Ms were drawing insane watts though, I think it must have tripped some kind of bug or something.
    That looks great! Yeah, I'm happy for you to spill the beans in your video and/or written review, so not gonna ask you questions about it now.
     
    Last edited: Dec 21, 2018
    Mr. Fox likes this.
  28. Mr. Fox

    Mr. Fox BGA Filth-Hating Elitist

    Reputations:
    37,213
    Messages:
    39,333
    Likes Received:
    70,629
    Trophy Points:
    931
    OK, enough of the stock baloney. I didn't buy this to run stock, LOL.

    Already got a 7th place on Fire Strike on my first attempt at an overclock. :vbthumbsup: I know it can and will go higher. Need to figure out how to power/shunt mod this to uncork the power limit and unlock the voltage like I did my 1080 Ti. The voltage and power limits are WAY TOO LOW. Needs like a 200% power limit and 1.200V.

    https://www.3dmark.com/fs/17526010 | http://hwbot.org/submission/4016900_
    [​IMG]

    https://www.3dmark.com/3dm/31495595# | http://hwbot.org/submission/4016915_
    [​IMG]
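
    For anyone curious how a shunt mod "uncorks" the power limit: the card's controller infers current from the voltage drop across a tiny sense resistor, so soldering another resistor in parallel lowers the effective resistance and makes the controller under-read power. A rough sketch of the math (the 5 mOhm values are typical illustrative numbers, not measurements from this card):

    ```python
    # Shunt-mod math: paralleling a resistor across the current-sense shunt.
    # Resistor values are illustrative assumptions, not measured from any card.
    def parallel(r1: float, r2: float) -> float:
        """Equivalent resistance of two resistors in parallel."""
        return r1 * r2 / (r1 + r2)

    stock_shunt = 0.005  # 5 mOhm sense resistor (typical order of magnitude)
    added = 0.005        # 5 mOhm resistor soldered on top of it

    effective = parallel(stock_shunt, added)  # 2.5 mOhm
    scale = stock_shunt / effective           # factor by which current is under-read
    print(f"Board can draw ~{scale:.1f}x the reported power")  # ~2.0x
    ```

    In other words, halving the shunt resistance roughly doubles the real power available before the stock limit trips, at the cost of the card no longer knowing its true draw.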
     
    Arondel, Arrrrbol, Papusan and 2 others like this.
  29. hmscott

    hmscott Notebook Nobel Laureate

    Reputations:
    7,110
    Messages:
    20,384
    Likes Received:
    25,139
    Trophy Points:
    931
    Tear-Down of $2500 Titan RTX Video Card
    Gamers Nexus
    Published on Dec 21, 2018
    We take apart NVIDIA's new Titan RTX video card, exposing the new TU102 GPU for all to see. The new card is $2500 and looks familiar to a 2080 Ti.
    The new NVIDIA Titan RTX card deserves some disassembly, and that's what we're doing today. The cooler follows the same design as the 2080 Ti, 2080, and 2070 Founders Edition coolers, except it's been gilded with a champagne finish. The GPU underneath is the TU102, but it's had 4 more SMs enabled over the 2080 Ti and is accompanied by 24GB of GDDR6, rather than 11GB on the 2080 Ti.
     
  30. hmscott

    hmscott Notebook Nobel Laureate

    Reputations:
    7,110
    Messages:
    20,384
    Likes Received:
    25,139
    Trophy Points:
    931
    Expreview: NVIDIA rumored to launch GeForce GTX 11 series: GTX 1160 planned
    Are GeForce GTX 11 series happening after all?
    Published: 21st Dec 2018, 08:35 GMT
    https://videocardz.com/79425/exprev...launch-geforce-gtx-11-series-gtx-1160-planned

    " GeForce RTX 20 vs GeForce GTX 11
    Yesterday I received a tip that NVIDIA is set to launch the GeForce RTX 2060 and GeForce GTX 1160 mid-January. Today, Chinese website Expreview claims that this might indeed be true.

    According to Expreview, this model is called GTX 1660 Ti, but from what I heard it’s just 1160.

    NVIDIA would split branding into GeForce RTX and GeForce GTX series, the former would stick to 20xx naming scheme, while the latter would use 11xx. This is partially what some GPU enthusiasts were expecting months ago.

    The new series would still feature Turing GPUs, but their different variants. It is said that GeForce RTX 2060 will feature TU106-200 GPU, while the GeForce GTX 1160 would feature TU116 instead.

    The GeForce GTX 1660 Ti marketing material posted by Expreview mentions ‘Turing Shaders’, instead of ‘Ray Tracing’, which should remain exclusive to RTX series.

    More importantly, it is said that there will be no GeForce RTX 2050 model. The mid-range and entry-level models will be part of GeForce 11 series only.
    Source: Expreview (article behind a password)
    Many thanks to BullsLab Jay for the tip!"

    NVIDIA GeForce GTX 1160 Turing GPU Sans Ray-Tracing Rumored For 2019 Launch
    by Paul Lilly — Friday, December 21, 2018
    https://hothardware.com/news/nvidia-geforce-gtx-1160-turing-gpu-sans-ray-tracing-2019-launch

    "...A GeForce GTX 11xx series would presumably cost less. For users who just want a performance bump and don't care about real-time ray tracing at the moment, this would give them a cheaper upgrade path to the latest GPU architecture."

    NVIDIA Allegedly Launching GeForce RTX 2060 and GeForce GTX 1160 in January – Both RTX (Ray Tracing) and GTX (Non-Ray Tracing) Lineups To Co-Exist
    By Hassan Mujtaba, 10 hours ago
    https://wccftech.com/nvidia-geforce-gtx-1160-and-geforce-rtx-2060-launch-specs-rumor/

    Nvidia GeForce GTX 1160 Features TU116 GPU | AMD on NASDAQ | TSMC 3nm Plant
    RedGamingTech
    Published on Dec 21, 2018
    A new rumor is swirling that the GeForce GTX 1160 will launch in january, featuring the Turing architecture thanks to a TU116 chip. This chip (if the rumors are accurate) will not have ray tracing, but will indeed still have the other features of the Turing Architecture (such as the mesh shaders and cache improvements). Intel's rather upset at Qualcomm, calling the company out regarding its practices in the industry. AMD make it to the NASDAQ 100 and finally, TSMC 3nm fabs construction will be underway soon!


    Nvidia's GTX 1060 Deceit
    UFD Tech
    Published on Dec 21, 2018
    • Gigabyte Unveils 3 GDDR5X Cards: http://bit.ly/2PP4SpL
    • Excess of Stock: http://bit.ly/2R9kwRl
    • 1060’s with GDDR5X: http://bit.ly/2rQDRZa
    • 1070’s with GDDR5X: http://bit.ly/2EIIxJ4
    • Wootware Listing: http://bit.ly/2PTMVWQ
    • Midrange Cards Could be Delayed: http://bit.ly/2V8ernA
    • Nvidia’s Site with GDDR5X: http://bit.ly/2J9nYoC
    • New 1060 is Cut-Down 1080: http://bit.ly/2qkgBC5
    • Nvidia’s Site with 1070 (no 5x): http://bit.ly/2BBNFL5
     
    Last edited: Dec 21, 2018
  31. Mr. Fox

    Mr. Fox BGA Filth-Hating Elitist

    Reputations:
    37,213
    Messages:
    39,333
    Likes Received:
    70,629
    Trophy Points:
    931
    Runs 16GHz on memory stable (that is a 2.0GHz overclock). And, for those that believe you need Samsung, stop it. This is Micron GDDR6. And, my 1080 Ti also has Micron GDDR5x and it overclocks the memory like a banshee. Core overclock is severely limited due to inadequate ability to overvolt with this vBIOS, but this is expected and can be easily fixed with a better vBIOS (when one surfaces).

    https://www.3dmark.com/sd/5369384 | http://hwbot.org/submission/4016997_
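    What that memory overclock buys you is straightforward arithmetic: effective bandwidth is the per-pin data rate times the bus width. A minimal sketch, assuming the RTX 2080 Ti's 352-bit bus and the 14 Gbps stock / 16 Gbps overclocked rates mentioned above (the helper name is my own):

```python
# Hypothetical helper: effective GDDR bandwidth from data rate and bus width.
def vram_bandwidth_gbs(data_rate_gbps: float, bus_width_bits: int) -> float:
    """Effective memory bandwidth in GB/s (data rate per pin x pins / 8 bits)."""
    return data_rate_gbps * bus_width_bits / 8

# RTX 2080 Ti: 352-bit bus; stock GDDR6 at 14 Gbps vs the 16 Gbps overclock.
stock = vram_bandwidth_gbs(14, 352)  # 616.0 GB/s
oc = vram_bandwidth_gbs(16, 352)     # 704.0 GB/s
print(f"stock: {stock} GB/s, overclocked: {oc} GB/s")
```

    So the 2 Gbps bump is worth roughly an extra 88 GB/s of raw memory bandwidth.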

     
  32. Mr. Fox

    Mr. Fox BGA Filth-Hating Elitist

    Reputations:
    37,213
    Messages:
    39,333
    Likes Received:
    70,629
    Trophy Points:
    931
    Arondel, Arrrrbol, Cass-Olé and 3 others like this.
  33. Talon

    Talon Notebook Virtuoso

    Reputations:
    1,482
    Messages:
    3,519
    Likes Received:
    4,694
    Trophy Points:
    331
    I managed to snag a second 2080 Ti at $999 shipped from EVGA, no tax or shipping. I have the order in and have till Monday to decide whether to cancel or go NVLink 2080 Ti. Other than ridiculous benchmark scores, I have to wonder if it would be worth it. But $999 out the door for a 2080 Ti is a great deal IMO.

    https://www.evga.com/products/product.aspx?pn=11G-P4-2281-KR

    Still in stock. $999.99 2080 Ti.
     
    Robbo99999 likes this.
  34. Mr. Fox

    Mr. Fox BGA Filth-Hating Elitist

    Reputations:
    37,213
    Messages:
    39,333
    Likes Received:
    70,629
    Trophy Points:
    931
    That is a good price (relatively speaking). If I had the money to burn, I'd go SLI/NVLink without any reservations, but only for benching. There's no point in doing that for gaming. Even one 2080 Ti far exceeds the performance needs of most gamers. About the only people that would "need" it are those hell-bent on gaming in 4K.
     
    Arrrrbol, Vasudev and Talon like this.
  35. Vasudev

    Vasudev Notebook Nobel Laureate

    Reputations:
    12,035
    Messages:
    11,278
    Likes Received:
    8,814
    Trophy Points:
    931
    How much did you pay for the 2080 Ti? NVLink will be pointless because the consumer versions are bandwidth limited, unless you have spare money for a pair of NVLink'ed Quadro cards, which blow that crap out of any rendering benchmark but fall behind in gaming benchmarks.
     
  36. yrekabakery

    yrekabakery Notebook Virtuoso

    Reputations:
    1,470
    Messages:
    3,438
    Likes Received:
    3,688
    Trophy Points:
    331
    NVLink isn't pointless. Even the consumer version offers roughly an order of magnitude more throughput than the previous HB bridge, and it fixes all bandwidth-related issues in SLI.
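    For a rough sense of scale, here is the comparison as arithmetic. The figures are assumptions based on commonly cited numbers, not official specs: an HB SLI bridge at roughly 4 GB/s versus the RTX 2080 Ti's single x8 NVLink link at roughly 50 GB/s bidirectional:

```python
# Assumed, commonly cited bridge throughput figures (GB/s) -- not official specs.
HB_BRIDGE_GBS = 4.0        # Pascal-era high-bandwidth SLI bridge, approx.
NVLINK_2080TI_GBS = 50.0   # consumer NVLink on TU102, one x8 link, bidirectional

ratio = NVLINK_2080TI_GBS / HB_BRIDGE_GBS
print(f"NVLink is roughly {ratio:.1f}x the HB bridge")
```

    That works out to around 12x, so "about an order of magnitude" is the fair way to put it.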
     
    Mr. Fox and Vasudev like this.
  37. Vasudev

    Vasudev Notebook Nobel Laureate

    Reputations:
    12,035
    Messages:
    11,278
    Likes Received:
    8,814
    Trophy Points:
    931
    But the cost is very high, correct? And Nvidia doesn't include the bridge; you have to pay extra for it! Mr. Fox uses his GPUs for benching and overclocking, so investing in NVLink is not beneficial unless he does GPU-based data mining, machine learning for object detection, or even real-time ray tracing.
     
  38. yrekabakery

    yrekabakery Notebook Virtuoso

    Reputations:
    1,470
    Messages:
    3,438
    Likes Received:
    3,688
    Trophy Points:
    331
    It’s really not worth an arm and a leg unless you’re gaming at extremely high settings, resolutions, and refresh rates with G-Sync and HDR.
     
    Vasudev likes this.
  39. Vasudev

    Vasudev Notebook Nobel Laureate

    Reputations:
    12,035
    Messages:
    11,278
    Likes Received:
    8,814
    Trophy Points:
    931
    Even arms and legs won't suffice these days; you might have to sell your soul, kidneys, heart, and your brain.
     
  40. hmscott

    hmscott Notebook Nobel Laureate

    Reputations:
    7,110
    Messages:
    20,384
    Likes Received:
    25,139
    Trophy Points:
    931
    Why would Nvidia release new GTX (non-RTX) GPUs in 2019 at lower prices, undercutting their own recent RTX releases? To compete with AMD GPU releases in 2019? 2H 2019 is the time frame AMD has talked about in the past, rumors notwithstanding. :)

    Intel is also moving in a competitive direction by releasing 9th generation CPUs without iGPUs, allowing for lower prices to compete with AMD CPU releases in 2019. That's more likely to happen sooner than later.

    This is how AMD can CRUSH Nvidia
    Coreteks
    Published on Dec 9, 2018
    In this video I discuss the recent leaks from AdoredTV and then move on to propose a different approach for AMD to gain control of the GPU market.
     
    Last edited: Dec 22, 2018
    Vasudev likes this.
  41. Mr. Fox

    Mr. Fox BGA Filth-Hating Elitist

    Reputations:
    37,213
    Messages:
    39,333
    Likes Received:
    70,629
    Trophy Points:
    931
    That's simply not accurate, brother. People that don't really understand SLI talk about it all the time. Benching is where it actually shines brightest. I wouldn't spend money on it for gaming. Nor would I spend money on a single 2080 Ti for playing games at less than 4K resolution. In fact, I'd spend as little as possible to get 60 FPS or better at whatever resolution I am running. If you can accomplish that with an RX 590, a GTX 1060, or a 1070, there is no reason to spend any more than that on a GPU for playing games. Save the money for the grossly overpriced games instead.
     
    jclausius and Vasudev like this.
  42. Vasudev

    Vasudev Notebook Nobel Laureate

    Reputations:
    12,035
    Messages:
    11,278
    Likes Received:
    8,814
    Trophy Points:
    931
    OK, so SLI works 9 out of 10 times? Most people say to get the strongest single GPU you can afford instead of SLI, for less heat and better performance!
     
    jclausius likes this.
  43. yrekabakery

    yrekabakery Notebook Virtuoso

    Reputations:
    1,470
    Messages:
    3,438
    Likes Received:
    3,688
    Trophy Points:
    331
    9 times out of 10 when it comes to synthetic benchmarks. ;)
     
    jclausius likes this.
  44. Vasudev

    Vasudev Notebook Nobel Laureate

    Reputations:
    12,035
    Messages:
    11,278
    Likes Received:
    8,814
    Trophy Points:
    931
    Hmmm... Thanks. I always thought SLI was a PITA.
     
  45. Robbo99999

    Robbo99999 Notebook Prophet

    Reputations:
    4,346
    Messages:
    6,824
    Likes Received:
    6,112
    Trophy Points:
    681
    Yep, for gaming you'd get the strongest single card that you need or can afford, because then you don't have to mess around with SLI compatibility issues - microstuttering, or a game simply not being compatible. For benching you'd get SLI for the good scaling and increased GPU scores, but the SLI experience for gaming is 'disappointing' by many accounts. I suppose if you had a 4K 144Hz monitor and lots of money you'd have to get SLI to have a chance of using some of that monitor's potential, but there would be times when you'd have to fall back on using just one of your cards (rather than both), and with both cards enabled you'd still have more chance of microstuttering. A 4K 144Hz monitor is a bit silly at the moment for those reasons, and I think 1440p 144Hz is the max for sensible high-refresh-rate gaming.
     
    Vasudev likes this.
  46. JasonLLD

    JasonLLD Notebook Geek

    Reputations:
    44
    Messages:
    86
    Likes Received:
    104
    Trophy Points:
    41
    I remember hearing the same thing when the hype about Polaris and Vega was pretty high. Unless AMD makes a dramatic overhaul of their GCN architecture, they won't be able to catch up to Nvidia for a long time.

    While I don't have much hope for their GPU division, they are about to give Intel its strongest challenge since the Athlon 64 days, so at least that part is exciting.
     
    Last edited: Dec 22, 2018
    hmscott, Vasudev and Mr. Fox like this.
  47. Mr. Fox

    Mr. Fox BGA Filth-Hating Elitist

    Reputations:
    37,213
    Messages:
    39,333
    Likes Received:
    70,629
    Trophy Points:
    931
    Nah, take it from someone who was exclusively a multi-GPU user until recently. I'm not now only because of cost; otherwise, I'd still be running SLI. It is NOT a PITA at all. What is a PITA is the crappy console-port game trash and the half-assed developers that don't care whether it works or not. There are almost always tweaks you can do, even for games that do not natively support SLI, and 9 times out of 10 it will outperform a single GPU unless the game is so hopelessly screwed up that it can't. But, yeah... benching is a whole 'nother matter entirely. All benchmarks that are actually worth their salt support it.

    The cool thing about SLI (other than much increased performance) is that when it works there is nothing better. When it does not, you get to enjoy the same experience as the single-GPU users. Just turn it off and you're good to go. You get to enjoy the best of both worlds. The only downside is the added cost.

    The people that are most critical of it either repeat what they have heard from other talking heads, or they didn't have it long enough to figure out how to use it, tweak it, and get the most out of it. They just gave up and bought into the chatter of the talking heads.

    Now, CrossFried is a different story. AMD sucks at GPUs all the way around, and CrossFried just amplifies their suckiness where GPUs are concerned.
     
    bennyg and Vasudev like this.
  48. Vasudev

    Vasudev Notebook Nobel Laureate

    Reputations:
    12,035
    Messages:
    11,278
    Likes Received:
    8,814
    Trophy Points:
    931
    At that time I didn't even have a PC, since one was barely affordable here in India.
    Games and apps that took advantage of SLI were few.
     
    Mr. Fox likes this.
  49. Mr. Fox

    Mr. Fox BGA Filth-Hating Elitist

    Reputations:
    37,213
    Messages:
    39,333
    Likes Received:
    70,629
    Trophy Points:
    931
    That is one of the issues, bro. Besides the plethora of crappy console ports and lazy game devs that don't care, there are too many people that cannot afford it. And, now that the chintzy BGA turdbook crap is all the rage, it's only going to get worse from here. Apathy and ignorance always stifle excellence. Sadly, the people that cannot afford it now may not get the chance to enjoy it some day if their situation improves.

    Heck, I cannot afford it now that GeFarts GPUs require the soul of a first-born son and a donation of one functional body part. At least I have the pleasure of having been there and done that, and loved it.
     
    hmscott and Vasudev like this.
  50. hmscott

    hmscott Notebook Nobel Laureate

    Reputations:
    7,110
    Messages:
    20,384
    Likes Received:
    25,139
    Trophy Points:
    931
    AMD doesn't need to outperform Nvidia, or even Intel, in order to offer you what you need. Most buyers don't need the top end model of Intel or Nvidia. AMD already offers mid-range products right in the sweet spot of performance, what most people want to buy.

    Buy on delivery of performance, not on random hype before release. Speculation can be fun, but the reality is on delivery, not on the speculation beforehand.

    There is this unthinking notion, easily fed by the haters and Nvidia fanbois, that AMD must exceed Nvidia's top models to be in contention for purchase, but that's a lie meant to keep Nvidia's Greed Engine fed.

    Fewer people can afford the highest-end 1080 / 2070 / 1080 Ti / 2080 / 2080 Ti / Titan GPUs. Why compare AMD GPUs to Nvidia GPUs they aren't meant to compete with?

    Compare AMD GPUs at the price/performance level you want to buy - that's what you are buying. Why remove AMD GPUs from your options when most people buy a GPU in the price/performance range where AMD has models to offer?

    When the new AMD GPUs come out and compete on price and performance against the Nvidia models you are interested in - and those are too expensive - buy the AMD GPUs instead of continuing to feed Nvidia.

    If you want a 1060-level GPU, buy an RX 580 / 590, or save some $ and OC an RX 570. If you want a 1070 / 2060, buy a Vega 56, and if you want a 1080 / 2070, buy a Vega 56 OC or Vega 64 OC.

    The new Adrenalin 2018 software release offers new features without Nvidia's login or telemetry tracking, so even on software features AMD is ahead of Nvidia in some areas.

    There really is no reason to keep feeding the Nvidia Greed Monster. Spread your money around intelligently instead of always unthinkingly giving it to Nvidia, Intel, and Apple; they've got enough money.

    AMD is a small company compared to Nvidia and Intel, and it's not possible to compete across the whole range of products at the top level all at once. AMD had to choose where to invest their R&D and product-development dollars. Buy what AMD offers that fits your needs; that's always enough.
     
    Last edited: Dec 22, 2018
    Vasudev likes this.
← Previous pageNext page →