The Notebook Review forums were hosted by TechTarget, which shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information that had been posted on the forums would be preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.
Problems? See this thread at archive.org.

    Nvidia Thread

    Discussion in 'Hardware Components and Aftermarket Upgrades' started by Dr. AMK, Jul 4, 2017.

  1. yrekabakery

    yrekabakery Notebook Virtuoso

    Reputations:
    1,470
    Messages:
    3,438
    Likes Received:
    3,688
    Trophy Points:
    331
    Ah, you’re a TDM player. That makes more sense now. I was about to say, if you only started playing BF1 6 months ago, I doubt you’ve played 2400 Conquest/Operations games in total, let alone won/MVPed 2400. :p

    As far as the potato settings are concerned, all that is is an LOD bias tweak in the driver that forces all textures to the lowest-quality mipmap. It can be done for any game, without an external tool such as Nvidia Profile Inspector, by modifying the registry, and it works on Intel and AMD GPUs as well. It doesn’t even improve performance or reduce VRAM usage. Tbqh, as someone who has played around with it a fair bit (and no, I don’t consider it cheating), I would say it’s more of a disadvantage than an advantage in BF1. The reason being that it also removes important visual cues like sniper scope glint and the AOE of gas/incendiary grenades, and blurs out the crosshair on all scopes, making weapons like the Autoloading 8 .35 Marksman (one of my favorite guns) unusable. The only reason the tweak is even worth mentioning in BFV is that the game’s visibility is just so bad.
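    For illustration, here's a minimal sketch (in Python, with a made-up selected_mip_level helper) of the standard mipmap level-of-detail rule such a tweak leans on: a large positive LOD bias pushes the selected level to the coarsest mip, which is why every texture ends up "potato" quality regardless of the game.

```python
import math

def selected_mip_level(texel_to_pixel_ratio, lod_bias, num_mip_levels):
    """Approximate the mip level a GPU samples, per the usual
    level-of-detail rule: lambda = log2(rho) + bias, clamped to the
    available mip range (0 = full resolution, highest = 1x1)."""
    lam = math.log2(max(texel_to_pixel_ratio, 1e-6)) + lod_bias
    return max(0, min(num_mip_levels - 1, round(lam)))

# A 2048x2048 texture has 12 mip levels (2048 down to 1).
levels = 12
for bias in (0.0, 3.0, 15.0):   # +15 is a typical driver-side maximum
    print(bias, selected_mip_level(1.0, bias, levels))
# bias 0  -> level 0  (sharp)
# bias 3  -> level 3  (blurrier)
# bias 15 -> level 11 (lowest-quality mip: the "potato" look)
```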
     
    Robbo99999 likes this.
  2. Talon

    Talon Notebook Virtuoso

    Reputations:
    1,482
    Messages:
    3,519
    Likes Received:
    4,694
    Trophy Points:
    331
    hmscott likes this.
  3. Talon

    Talon Notebook Virtuoso

    Reputations:
    1,482
    Messages:
    3,519
    Likes Received:
    4,694
    Trophy Points:
    331

    This is the cheapest "buy it now" GTX 1070 Ti I could find on eBay with my search just now. Sure maybe you can score better deals, maybe CL or whatever. Still at $300, nah I'll take a brand new card with warranty for $349 with better driver support going forward and architectural improvements in new games. DLSS and RTX are free at that point.

    https://www.ebay.com/itm/PNY-GeForc...4b5650c:g:iWkAAOSwmUBb3cG3:rk:3:pf:0&LH_BIN=1

    This is the cheapest brand new 1070 Ti on Newegg:

    https://www.newegg.com/Product/Prod...0 Ti&cm_re=GTX_1070_Ti-_-14-137-261-_-Product At $439.99 you would be stupid to pay this price for a card that has been 100% replaced. I'll keep my $90 and get the new card.

    I wish the 2060 was $300 too because at that price this would be a great deal rather than being just a good deal. We will likely see some price slashing IF AMD brings something competitive to the table. I am looking forward to watching their CES event.
     
    Papusan and hmscott like this.
  4. yrekabakery

    yrekabakery Notebook Virtuoso

    Reputations:
    1,470
    Messages:
    3,438
    Likes Received:
    3,688
    Trophy Points:
    331
    The one thing about the 2060 that should give you pause is the lack of VRAM. It’s at a level of performance where the amount of VRAM will limit your ability to crank up settings before you run out of compute performance, especially considering the consoles that AAA game devs target have 8-12GB of shared memory.
     
  5. Robbo99999

    Robbo99999 Notebook Prophet

    Reputations:
    4,346
    Messages:
    6,824
    Likes Received:
    6,112
    Trophy Points:
    681
    Ha, that's right, TDM...I just feel that Conquest/Operations are too random in dying or surviving and feel I can't influence the match as much as a solo player, so TDM made sense for me - I still cooperate with the rest of the team rather than being totally selfish, although being non-selfish in handing out ammo is actually selfish because it increases your score! I don't play in a clan or with known squad members, I just play at random with randoms.

    That's right, I played BF V in the beta and I couldn't see sh*t, so frustrating! The distant blurring was one of the problems, and while I was playing I saw that as the main problem (correctly or incorrectly), and it was the forced TAA that ruined it for me there. I know now, because you told me before, that you can use a console command to turn off TAA as long as motion blur is turned on, but yeah, I'm not gonna buy BF V - I might buy it in a sale, or when I get a 7nm NVidia GPU in 2020 that will be able to run BF V at Ultra with ray tracing turned on at 144 fps!
     
    yrekabakery likes this.
  6. Robbo99999

    Robbo99999 Notebook Prophet

    Reputations:
    4,346
    Messages:
    6,824
    Likes Received:
    6,112
    Trophy Points:
    681
    1080p should be OK though? Currently 6GB is plenty for 1080p and also 1440p I think. I definitely think a 6GB RTX 2060 is a better buy than an 8GB GTX 1070 Ti. (I also think it's a better buy than the GTX 1080, though that's a marginal toss-up, but if my GPU died now I'd be buying the RTX 2060 rather than the GTX 1080.)
     
  7. yrekabakery

    yrekabakery Notebook Virtuoso

    Reputations:
    1,470
    Messages:
    3,438
    Likes Received:
    3,688
    Trophy Points:
    331
    I’m with you on that one. I think DICE ruined Conquest in BF1 with the new scoring system that promotes lopsidedness and makes exciting comebacks basically impossible, and Operations is way too cluster****y for my taste. TDM isn’t my cup of tea either, but I thought Frontlines was a well-designed and competitive game mode; unfortunately there are zero active Frontlines servers in NA, which makes me sad. The overwhelming majority of servers run Conquest/Operations, so if I am playing BF1 it’s usually pubstomping in Conquest with randoms.

    Even at 1080p, it’s limiting in some games and will prevent you from increasing certain settings that a weaker GPU with more VRAM, like an RX 580/590 or Vega 56, can handle. It’s a common misconception that resolution affects VRAM usage a great deal. It does have an impact, but it’s usually far less pronounced than that of texture quality, texture streaming, or high-quality shadow settings, for example.
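    A rough back-of-the-envelope sketch of why that's the case (the buffer count and bytes-per-pixel below are assumptions for illustration, not measurements from any specific engine): the screen-sized render targets that scale with resolution are small compared to a modern texture pool.

```python
def render_target_mb(width, height, bytes_per_pixel, buffer_count):
    """Memory for a set of screen-sized render targets, in MiB."""
    return width * height * bytes_per_pixel * buffer_count / (1024 ** 2)

for name, (w, h) in {"1080p": (1920, 1080), "1440p": (2560, 1440), "4K": (3840, 2160)}.items():
    # Assume ~8 full-screen buffers at 4 bytes/pixel (G-buffer, depth,
    # post-processing targets); the exact count is engine-specific.
    rt = render_target_mb(w, h, 4, 8)
    print(f"{name}: ~{rt:.0f} MB of render targets vs. a multi-GB texture pool")
# Going from 1080p to 4K adds only a couple hundred MB of render targets,
# while an "Ultra" texture setting can add several GB on its own.
```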
     
  8. Robbo99999

    Robbo99999 Notebook Prophet

    Reputations:
    4,346
    Messages:
    6,824
    Likes Received:
    6,112
    Trophy Points:
    681
    Which games at 1080p are limited by 6GB VRAM? I can't think of any. I know a lot of games can cache a fair bit of data in VRAM and use a lot of it, but it's not always essential for them to cache that much (meaning you can get away with less VRAM than the amount a game might want to cache on a card with more VRAM). I don't really monitor my VRAM usage in games, and I don't play many different ones, but I don't often remember it going beyond 4GB whenever I've looked.

    EDIT: scratch that last part, I just saw over 7GB of VRAM usage in Rise of the Tomb Raider during its benchmark. However, the RTX 2060 doesn't suffer due to 6GB of VRAM in Shadow of the Tomb Raider here: https://www.guru3d.com/articles_pages/geforce_rtx_2060_review_(founder),13.html
    So that kind of goes hand in hand with my earlier comment about games caching more VRAM than they actually need. Whether or not 6GB is enough, the fact that I saw over 7GB being used on my GTX 1070, even if it's not all needed, makes me feel more worried about the 6GB of VRAM on the RTX 2060. In the reviews so far 6GB is not hampering its performance today, but tomorrow's games could be a different matter. Hmm, if my GTX 1070 died today, I'd be in a real quandary about which GPU to buy now - I don't want to buy any of them!!!
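    If anyone wants to log this themselves, here's a minimal sketch that polls nvidia-smi (which ships with the NVIDIA driver) once a second and tracks the peak. Note it reports what is allocated, not what the game strictly needs, which is exactly the caching caveat above.

```python
import subprocess
import time

def vram_used_mib():
    """Query current and total VRAM (MiB) for the first GPU via nvidia-smi."""
    out = subprocess.check_output([
        "nvidia-smi",
        "--query-gpu=memory.used,memory.total",
        "--format=csv,noheader,nounits",
    ], text=True)
    used, total = (int(x) for x in out.strip().split(", "))
    return used, total

# Run this in the background, alt-tab into the game or benchmark,
# and watch the peak value.
peak = 0
while True:
    used, total = vram_used_mib()
    peak = max(peak, used)
    print(f"{used} / {total} MiB (peak {peak} MiB)")
    time.sleep(1)
```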
     
    Last edited: Jan 8, 2019
    hmscott likes this.
  9. sniffin

    sniffin Notebook Evangelist

    Reputations:
    68
    Messages:
    429
    Likes Received:
    256
    Trophy Points:
    76
    I feel like a lot of the AAA titles released this year actually push it quite a lot. Fallout 76 and BFV definitely do, possibly others. 6GB is not acceptable for a card like the 2060, at least in my mind.

    RTX is quite VRAM heavy as well if BFV is anything to go by, so going forward this will limit the 2060 even more relative to cards like the 1070/1070 Ti.
     
    Robbo99999 and hmscott like this.
  10. yrekabakery

    yrekabakery Notebook Virtuoso

    Reputations:
    1,470
    Messages:
    3,438
    Likes Received:
    3,688
    Trophy Points:
    331
    Middle-earth: Shadow of War, Deus Ex: Mankind Divided, and Rising Storm 2: Vietnam are the ones I’ve played recently that actually need more than 6GB at 1080p when maxed out. I don’t play many AAA games though, so I’m sure there are others.
     
    Robbo99999 likes this.
  11. Robbo99999

    Robbo99999 Notebook Prophet

    Reputations:
    4,346
    Messages:
    6,824
    Likes Received:
    6,112
    Trophy Points:
    681
    I edited my previous post after testing Rise of the Tomb Raider on my GTX 1070 just now - to test for VRAM usage.
     
    hmscott likes this.
  12. Robbo99999

    Robbo99999 Notebook Prophet

    Reputations:
    4,346
    Messages:
    6,824
    Likes Received:
    6,112
    Trophy Points:
    681
    "Need" or they just cache more than 6GB VRAM if you have it available? RTX 2060 performance isn't suffering due to 6GB VRAM in current games, including some of the ones you listed ( https://www.guru3d.com/articles_pages/geforce_rtx_2060_review_(founder),23.html), but for tomorrows games I'm starting to become worried about the 6GB VRAM given that games are currently caching more than 6GB regardless of whether that extra caching is required or not.

    EDIT: thinking about it, in general, it feels like this generation of RTX cards is just a short stop-gap until the next process shrink down to 7nm in 2020, when we would likely get RTX cards with greatly improved performance because of it. The fact that the RTX 2060 only has 6GB of VRAM just adds credence to that thinking - that it's a stop-gap card until those 7nm GPUs arrive in 2020.
     
    Last edited: Jan 8, 2019
  13. yrekabakery

    yrekabakery Notebook Virtuoso

    Reputations:
    1,470
    Messages:
    3,438
    Likes Received:
    3,688
    Trophy Points:
    331
    Need. I’m aware of caching, but those games are actually smoother with 8GB of VRAM than with 6GB. The thing is, Guru3D is still using the extremely outdated methodology of comparing average FPS, when they should be measuring frame times or 0.1% and 1% low FPS, as those show much more readily the impact VRAM has on the smoothness of the experience.
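    For anyone unfamiliar with the metric, here's a minimal sketch of how 1% / 0.1% lows are computed from logged frame times (the sample numbers are hypothetical): a handful of long frames barely moves the average FPS but tanks the lows, which is how VRAM-related stutter shows up.

```python
def low_fps(frame_times_ms, fraction):
    """Average FPS of the slowest `fraction` of frames (0.01 for 1% lows)."""
    worst = sorted(frame_times_ms, reverse=True)
    n = max(1, int(len(worst) * fraction))
    avg_ms = sum(worst[:n]) / n
    return 1000.0 / avg_ms

# Hypothetical capture: mostly ~7 ms frames with occasional 40 ms spikes,
# the kind of hitching that texture swapping over the PCIe bus produces.
frames = [7.0] * 990 + [40.0] * 10
print(f"average FPS: {1000.0 / (sum(frames) / len(frames)):.0f}")  # ~136
print(f"1% lows:     {low_fps(frames, 0.01):.0f} FPS")             # 25
print(f"0.1% lows:   {low_fps(frames, 0.001):.0f} FPS")            # 25
```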
     
  14. Robbo99999

    Robbo99999 Notebook Prophet

    Reputations:
    4,346
    Messages:
    6,824
    Likes Received:
    6,112
    Trophy Points:
    681
    That's an interesting idea, but Gamers Nexus use a very thorough & stringent methodology for testing GPUs, and frame time is amongst the variables they measure. No frame time issues for the RTX 2060, even in some of those games at 4K:
    https://www.gamersnexus.net/hwrevie...-founders-edition-review-benchmark-vs-vega-56

    I'm quite convinced that 6GB of VRAM is enough for the games on the market today, but for tomorrow's games I'm less optimistic about that amount. That's why I see the RTX 2060 as the epitome of a short-lived stop-gap product until 2020, when the 7nm GPUs are likely to launch - I definitely don't see it being useful beyond that time period. If my GPU did die today it would be a hard toss-up between the GTX 1080 and RTX 2060; either way I would know it would only be a stop-gap product until those 7nm GPUs in 2020, no way I could hang onto either of them longer than that I don't think.

    EDIT: Given that either purchase would only be a stop-gap till 2020, I reckon I'd probably still go for the RTX 2060 - for the novelty value of DXR and DLSS (DLSS being very useful for a somewhat underpowered RTX 2060). Yep, I'd go RTX 2060 on the expectation that the 6GB VRAM amount wouldn't hold me back until after the new 7nm GPUs come out in 2020, at which point I would upgrade again. As it stands I'll probably be using my GTX 1070 until 2020, unless it dies!
     
    Last edited: Jan 8, 2019
  15. yrekabakery

    yrekabakery Notebook Virtuoso

    Reputations:
    1,470
    Messages:
    3,438
    Likes Received:
    3,688
    Trophy Points:
    331
    Unfortunately I’m not familiar with how VRAM-hungry any of those games are.
     
    Robbo99999 likes this.
  16. hmscott

    hmscott Notebook Nobel Laureate

    Reputations:
    7,110
    Messages:
    20,384
    Likes Received:
    25,139
    Trophy Points:
    931
    What does a non-validated FreeSync gaming monitor look like?
    PCWorld
    Published on Jan 8, 2019
    Nvidia says that not all FreeSync monitors are created equal - in fact only 12 out of over 400 are certified to be G-Sync compatible. Gordon checks out some examples of FreeSync monitors that don't stack up.
    http://forum.notebookreview.com/thr...ts-video-articles.826767/page-7#post-10843817

    Daniel Nordlund 12 minutes ago
    "I think the monitor that has the blinking issue is the LG 34UM69G-B, same stand at least and I have had that monitor with a Vega 56 and RX 570 and had no issue with it. Currently I am on my third 34" ultrawide from LG and I have never had any issues at all that couldn't be traced back to something in the drivers."

    Is Nvidia, rather than playing an open and fair hand by providing VESA Adaptive Sync "compatibility", instead using this move to throw shade at existing VESA Adaptive Sync monitors?

    AMD is able to provide inexpensive FreeSync <=> VESA Adaptive Sync compatibility with a satisfyingly good user experience, but Nvidia "can't"?!

    More Nvidia BS, or is G-sync problematic, or both? :)

    Nvidia Admits Defeat, Will Support G-Sync on FreeSync Displays
    Joel Hruska on January 7, 2019 at 3:14 pm
    https://www.extremetech.com/gaming/...feat-will-support-g-sync-on-freesync-displays

    "When Nvidia launched its G-Sync standard, it used customized controllers and its own specialized IP to bring variable refresh rate support to desktop monitors that otherwise lacked this capability.

    The entire point of AMD’s FreeSync standard and the Adaptive Sync standard supported by VESA was to obviate the need for this approach by creating an open standard that wouldn’t slap a huge price premium on the cost of displays.

    Nvidia, however, has steadfastly refused to play ball. It has kept G-Sync in the ecosystem as a halo brand, despite the fact that the capability it once used specialized hardware to provide can now be baked into display timing controllers without additional costs.

    Announcing G-Sync Compatible Monitors
    Today, Nvidia announced that it would finally stop preventing its video cards from using capabilities already built into FreeSync monitors. Nvidia GPUs will now support FreeSync displays, with some displays enabled by default and universal support available via a manual setting change. On January 15, Nvidia will release an updated driver with support for G-Sync on FreeSync displays for Pascal and Turing-class GPUs."

    What does a non-validated FreeSync monitor look like?
    Submitted 2 hours ago by jrruser
    https://www.reddit.com/r/nvidia/comments/ads8cp/what_does_a_nonvalidated_freesync_monitor_look/

    renatofontes NVIDIA GTX 1060 3GB 10 points an hour ago
    "The bad monitor fixed itself after he moved the mouse around.
    Also, does anyone else here thinks that is just shoddy driver support? I don't think that only 12 freesync monitors work as they should... Probably with a future driver update more monitors will be supported."

     
    Last edited: Jan 8, 2019
  17. Robbo99999

    Robbo99999 Notebook Prophet

    Reputations:
    4,346
    Messages:
    6,824
    Likes Received:
    6,112
    Trophy Points:
    681
    True! I just researched them; they all use less than 6GB of VRAM, so that kind of doesn't help prove my point about frame times! Hmm, I'd still go RTX 2060 over GTX 1080 at the moment though. Can you show me any links to frame time or other issues relating to 6GB of VRAM on the RTX 2060?
     
  18. yrekabakery

    yrekabakery Notebook Virtuoso

    Reputations:
    1,470
    Messages:
    3,438
    Likes Received:
    3,688
    Trophy Points:
    331
    Not sure where I’d look for those, as the card just came out, so not many end users have one yet, and reviewers are not good for this sort of thing. That leads me to another thing I want to point out about VRAM requirements: they can be very scenario-specific. For example, most reviewers benchmark Deus Ex: Mankind Divided using its built-in benchmark. Well, that is not representative of the in-game experience, because it does not stress the CPU or the engine’s texture streaming system like the large Prague hub does. 6GB of VRAM is smooth in the benchmark at Ultra settings, but it’s not smooth when you’re running around Prague - I’ve seen 980 Ti users reporting frame pacing issues there.

    Likewise, Rising Storm 2: Vietnam uses 7 to 8+ GB on some maps with texture streaming disabled, and while those maps are smooth for me, other players who use 6GB cards like the 1060 report stuttering due to inadequate VRAM (the game is not demanding on the GPU, but uses a ton of VRAM when texture streaming is disabled.)

    Also, you mentioned you tested the built-in benchmark of Shadow of the Tomb Raider. That’s another one where the benchmark and actual game have vastly different performance profiles (seems to be a Nixxes specialty), for example Geothermal in-game has much higher CPU usage and lower performance than Geothermal in the benchmark. So it wouldn’t surprise me if that disparity extended to VRAM requirement as well, with most reviewers using the built-in benchmark to test GPUs.
     
    hmscott likes this.
  19. Robbo99999

    Robbo99999 Notebook Prophet

    Reputations:
    4,346
    Messages:
    6,824
    Likes Received:
    6,112
    Trophy Points:
    681
    Hmm, we don't know enough about it yet then to determine whether 6GB on the RTX 2060 is an issue with current games or not. 6GB not being enough for Pascal & Maxwell in some games is a worrying observation, but I guess VRAM usage could somehow be better optimised on Turing (I don't know). We need to see actual proof regarding 6GB on the RTX 2060; hopefully this will be cleared up with user reviews and website reviews, let's see what occurs!
     
  20. hmscott

    hmscott Notebook Nobel Laureate

    Reputations:
    7,110
    Messages:
    20,384
    Likes Received:
    25,139
    Trophy Points:
    931
    Nvidia might provide some "assistance" by shipping 4GB and 3GB VRAM versions, matching the 1060's 6GB / 3GB split, even though 3GB and 2GB VRAM GPUs have already proven to be a bad idea in current times, and will remain so in the near and ongoing future.

    There are 1060 6GB vs 3GB tests, HardwareUnboxed had several games that were gimped on 3GB VRAM. I'll see if I can find them later, if you all haven't found them already.

    It's odd that Nvidia didn't put 8GB in the 2060 at its price range, given the 1070 / 1070 Ti both have 8GB and the 2060 otherwise matches their performance.

    Especially odd because RTX features use more VRAM on top of existing game VRAM use.
     
  21. yrekabakery

    yrekabakery Notebook Virtuoso

    Reputations:
    1,470
    Messages:
    3,438
    Likes Received:
    3,688
    Trophy Points:
    331
    I don’t think 8GB would be possible due to the 192-bit bus, barring some weird ROP/MC partitioning a la GTX 970 tree fiddy VRAMgate. And if the 2060 were given the full 256-bit bus and 8GB, it would cannibalize the 2070 too much.

    But you could say the same thing about the 2080 matching the 1080 Ti perf-wise but having less VRAM, or the 2080 Ti and Titan V.
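    A quick sketch of the bus-width arithmetic behind that point, assuming the usual one GDDR6 chip per 32-bit memory controller and the 1GB/2GB chip densities that were common at the time:

```python
def vram_options_gb(bus_width_bits, chip_densities_gb=(1, 2)):
    """Possible VRAM capacities for a given bus width, one chip per 32-bit channel."""
    controllers = bus_width_bits // 32
    return {d: controllers * d for d in chip_densities_gb}

print(vram_options_gb(192))  # {1: 6, 2: 12} -> 6GB or 12GB, not 8GB
print(vram_options_gb(256))  # {1: 8, 2: 16} -> the 1070 / 1070 Ti / 2070 route
# Getting 8GB out of 192 bits would need mixed chip sizes or the kind of
# lopsided partitioning the GTX 970's 3.5GB + 0.5GB arrangement used.
```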
     
    bennyg and hmscott like this.
  22. hmscott

    hmscott Notebook Nobel Laureate

    Reputations:
    7,110
    Messages:
    20,384
    Likes Received:
    25,139
    Trophy Points:
    931
    The specifications across the lineup could have been adjusted to fit game + RTX VRAM requirements, giving the 2060-level GPU 8GB of VRAM, and that could have been realized by re-jiggering the rest of the lineup to provide differentiation vs cost. Maybe GDDR5X in the 2060?

    At this point there's no fixing it, so if you want RTX I'd suggest ponying up for a 2080ti, otherwise get a 10 series card used - or pay a premium for a new 10 series GPU - the value of those cards should have dropped, but since many aren't happy with the RTX series the prices have held.

    Maybe with release and production ramp up of the 2060 the 1070 / 1070ti / 1080 new prices will drop should demand for them dry up, the market could go either way.

    As long as nice cheap used 1080ti mining liquidators are pumping cards into the market, I'd still suggest going for one of those.

    Given that most serious mining operations undervolted their GPUs, and many hardly had time to run more than a few months over the last year - and many shut down months ago - you might even find unopened mining GPUs being liquidated.

    If you really want RTX - IDK why, given the lack of software - and you've got to get it "now", then the 2080 Ti is really the only standout buy. All the other models are either far too expensive (Titan RTX) or have shaved-down RTX features. Really though, by the time enough RTX games arrive, the 2nd gen 7nm RTX GPUs will be here.

    It's a tough time to buy a GPU, but it's been that way for a long time. :)
     
  23. franzerich

    franzerich Notebook Evangelist

    Reputations:
    11
    Messages:
    497
    Likes Received:
    121
    Trophy Points:
    56
    After the statement "40+ RTX laptops to come" I wonder what's with the rumored gtx 1150 and gtx 1160... Will they even come anymore or has Nvidia suddenly ditched them?
     
    hmscott likes this.
  24. sniffin

    sniffin Notebook Evangelist

    Reputations:
    68
    Messages:
    429
    Likes Received:
    256
    Trophy Points:
    76
    If they release 3GB or 4GB versions I will laugh. But it will no doubt catch people out who wanted to save $30-50 which is unfortunate.
     
    hmscott likes this.
  25. hmscott

    hmscott Notebook Nobel Laureate

    Reputations:
    7,110
    Messages:
    20,384
    Likes Received:
    25,139
    Trophy Points:
    931
    The only leak from a laptop maker that I've personally seen was from Lenovo - but Lenovo updated that datasheet from 1160 to 2060, so at this point I doubt there are going to be non-RTX GPUs.

    Mainly because non-RTX GPUs already exist: they are the 10 series GPUs known as Pascal. :)

    Also, I think the recent 1160 "leak" from Lenovo is simply a carry-over from more than a year ago, when Lenovo leaked the same thing in a mention during an in-booth interview...:
    [Attached image: wrong_year.png]
     
    franzerich likes this.
  26. hmscott

    hmscott Notebook Nobel Laureate

    Reputations:
    7,110
    Messages:
    20,384
    Likes Received:
    25,139
    Trophy Points:
    931
    And, if these were 10 series GPUs the 3GB / 4GB models would be bad enough, but trying to then use DLSS / RT features on top of the normal VRAM requirements would, I think, cause noticeable performance problems, or require further gimping of RTX features in games running on those low-VRAM models.

    Here's a thread musing on the inadequacy of 8GB of VRAM with the 2080 vs the 11GB of VRAM that comes with the 1080ti... :)

    One thing to note comparing the 2080 to the 1080Ti
    https://www.reddit.com/r/nvidia/comments/9h6b98/one_thing_to_note_comparing_the_2080_to_the_1080ti/

    carebearSeaman 4 points 3 months ago*
    "18% more bandwidth won't help me in FFXV 4K once the VRAM usage goes above 8GB which it does, and no, it's not just "filling the VRAM up for efficiency" in this case. I know many games do this, but FFXV starts showing poorly rendered textures because of lack of VRAM. It actually needs more than 8GB of VRAM at 4K and ultra textures.

    Rainbow Six Siege at 4K and ultra textures also needs more than 8GB VRAM as in it starts stuttering. Same goes for Wolfenstein 2, proof https://imgur.com/a/kXhhyeM

    As you can see, the 2080 performs terribly in Wolfenstein at 4K simply because it's running out of VRAM.

    It's not unreasonable to think many future games will struggle with 8GB VRAM at 4K."
    [Attached image: t7OtRQo.jpg]
    https://www.hardwarecanucks.com/for...a-geforce-rtx-2080-ti-rtx-2080-review-17.html

    "Believe it or not, the latest Wolfenstien game actually provides one of the most surprising yet not completely unexpected results of this entire review. At 1440P things seem to be going really well for the RTX 2080 Ti and RTX 2080 since they’re able to chew through framerates despite every detail setting being maxed. BUT when the resolution is increased to 4K, the RTX 2080’s performance simply falls through the floor while the GTX 1080 Ti and RTX 2080 Ti surge ahead.

    The reason for this is pretty simple: the 8GB framebuffer and lower memory bandwidth cause a severe bottleneck on the Manhattan mission we chose. Even Vega 64’s higher bandwidth number allows it to finally become sort of competitive."

    So, another good reason to go with the 1080ti as a holdover position until AMD or Nvidia 7nm GPU's are released.
     
    Last edited: Jan 8, 2019
    sniffin likes this.
  27. hmscott

    hmscott Notebook Nobel Laureate

    Reputations:
    7,110
    Messages:
    20,384
    Likes Received:
    25,139
    Trophy Points:
    931
    Aroc and Papusan like this.
  28. franzerich

    franzerich Notebook Evangelist

    Reputations:
    11
    Messages:
    497
    Likes Received:
    121
    Trophy Points:
    56
    Yeah, that seems correct. I just checked that news again, which claimed (and screenshotted) the Lenovo website listing the GTX 1160 on the Lenovo Legion Y530 or Y730 - and the GTX 1160 is not listed anymore. Instead only the GTX 1060 is listed - which is more realistic. Not sure what's with the rumor of the GTX 1150 though... because there is still this benchmark leak and the alleged 896 CUDA cores indicator https://www.tomshardware.com/news/nvidia-geforce-gtx-2050-gtx-1150-specs,38291.html
     
    hmscott likes this.
  29. hmscott

    hmscott Notebook Nobel Laureate

    Reputations:
    7,110
    Messages:
    20,384
    Likes Received:
    25,139
    Trophy Points:
    931
    I can't find it again right now, but I did see the correction to 2060 on the page that had 1160 previously. Maybe Nvidia pulled rank and asked Lenovo to regress that listing one more time and now it says 1060 - at least until the "embargo" is lifted.

    As far as lower end GTX 20 series, IDK if that makes sense as Nvidia is supposed to have had $2B in inventory of mid/low end GPU's still sitting on the shelves, so it really doesn't make sense that Nvidia would make more of those same tier GPU's.
     
  30. Papusan

    Papusan Jokebook's Sucks! Dont waste your $$$ on Filthy

    Reputations:
    42,701
    Messages:
    29,840
    Likes Received:
    59,615
    Trophy Points:
    931
    hmscott likes this.
  31. hmscott

    hmscott Notebook Nobel Laureate

    Reputations:
    7,110
    Messages:
    20,384
    Likes Received:
    25,139
    Trophy Points:
    931
    CES 2019 - Highlights From the NVIDIA Keynote!
    NVIDIA GeForce
    Published on Jan 8, 2019
    If you missed Jensen's presentation, we've got you covered. Julian and Shannon take us through all of yesterday's top highlights: everything from the RTX 2060 reveal to the DLSS tech demonstrations.
     
  32. hmscott

    hmscott Notebook Nobel Laureate

    Reputations:
    7,110
    Messages:
    20,384
    Likes Received:
    25,139
    Trophy Points:
    931
    RTX 2060 vs Every GPU Benchmarks | 53 tests
    Benchmark PC Tech - MultiTechnopark
    Published on Jan 8, 2019
    RTX 2060 BENCHMARK REVIEW / DX12 INCLUDED / GAMING TESTS – 1080p / 1440p / 4K
    CHANNEL MAIN TEST RIGs setup as following:
    Intel i9 9900K
    ASUS STRIX Z390-E Gaming Motherboard
    Corsair Vengeance LPX 16GB DDR4 3200MHz
    Noctua NH-D15 CPU cooler
    Samsung 960 EVO SSD 500GB PCIe NVMe
    MSI RTX 2080 Ti GAMING X TRIO
    Seagate 2TB Barracuda Sata 6GB/s 128MB Cache
    EVGA 850 BG, 80+ GOLD 850W PSU
    Cooler Master MasterCase H500P
    Acer Predator XB271HU 27"
    Windows 10 Pro | USB Flash Drive

    Specs: GeForce RTX 2060 – 6 GB GDDR6 / 192-bit memory bus / Bandwidth 336 GB/s. The GeForce RTX 2060 features 1920 CUDA cores. Base clock 1365 MHz, Boost clock 1680 MHz, 160 W Graphics Card Power, 500 W Recommended System Power, 8-pin connector, 89°C Maximum GPU Temperature, TDP 160 W.

    Games tested: BATTLEFIELD 5, DEUS EX: MANKIND DIVIDED, FAR CRY PRIMAL, GEARS OF WAR, GTA 5, HITMAN 2, SHADOW OF THE TOMB RAIDER, ASSASSIN'S CREED ODYSSEY, TOTAL WAR: WARHAMMER, WATCH DOGS 2, WITCHER 3, STAR WARS BATTLEFRONT 2, DESTINY 2, HELLBLADE: SENUA'S SACRIFICE, MIDDLE-EARTH: SHADOW OF WAR, PREY, PROJECT CARS 2, 3DMARK FIRE STRIKE, 3DMARK TIME SPY.
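    As a sanity check on the quoted bandwidth figure, here's a one-line calculation using the RTX 2060's published 14 Gbps GDDR6 speed and 192-bit bus (note the unit is gigabytes per second):

```python
# Memory bandwidth = per-pin data rate (Gbit/s) x bus width (bits) / 8 bits per byte
effective_rate_gbps = 14   # published GDDR6 speed for the RTX 2060
bus_width_bits = 192

bandwidth_gb_s = effective_rate_gbps * bus_width_bits / 8
print(bandwidth_gb_s)  # 336.0 (GB/s, i.e. gigabytes per second, not gigabits)
```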
     
  33. hmscott

    hmscott Notebook Nobel Laureate

    Reputations:
    7,110
    Messages:
    20,384
    Likes Received:
    25,139
    Trophy Points:
    931
    #rtx2060 #nvidiartx2060 #hardwarecanucks
    RTX 2060 Review - Finally A Reason to Buy RTX?
    HardwareCanucks
    Published on Jan 7, 2019
    The NVIDIA RTX 2060 is here and while it does cost $350, its gaming performance is pretty impressive. Just don't expect this GPU to be affordable.
     
  34. Talon

    Talon Notebook Virtuoso

    Reputations:
    1,482
    Messages:
    3,519
    Likes Received:
    4,694
    Trophy Points:
    331


    No GPU for you!
     
    hmscott likes this.
  35. Robbo99999

    Robbo99999 Notebook Prophet

    Reputations:
    4,346
    Messages:
    6,824
    Likes Received:
    6,112
    Trophy Points:
    681
    Does that mean they're not working on one, or did they just choose not to announce it yet?
     
    hmscott likes this.
  36. Talon

    Talon Notebook Virtuoso

    Reputations:
    1,482
    Messages:
    3,519
    Likes Received:
    4,694
    Trophy Points:
    331
    Nah, they're working on one for when they get an aftermarket card. They said they won't be using the card Nvidia sent them and will instead use an aftermarket card from partners. That said, I feel it's intentional: we all know the prices can be higher for aftermarket overclocked fancy cards, and they love their price-to-perf ratios. This is misleading for obvious reasons.
     
    hmscott likes this.
  37. Talon

    Talon Notebook Virtuoso

    Reputations:
    1,482
    Messages:
    3,519
    Likes Received:
    4,694
    Trophy Points:
    331
  38. Talon

    Talon Notebook Virtuoso

    Reputations:
    1,482
    Messages:
    3,519
    Likes Received:
    4,694
    Trophy Points:
    331
  39. hmscott

    hmscott Notebook Nobel Laureate

    Reputations:
    7,110
    Messages:
    20,384
    Likes Received:
    25,139
    Trophy Points:
    931
    The GTX 11xx rumors continue...

    Cheaper RTX 2080 Leaked, Cheaper Intel “F” CPUs!
    Gamer Meld
    Published on Jan 8, 2019
    The GTX 1180 was spotted. Cheaper version to the RTX 2080?? Intel released more 9th gen CPUs, and 10nm is FINALLY coming! Stay tuned...
     
    steberg and Talon like this.
  40. Talon

    Talon Notebook Virtuoso

    Reputations:
    1,482
    Messages:
    3,519
    Likes Received:
    4,694
    Trophy Points:
    331
    Excellent. This will allow Nvidia to turn the screws on AMD and their 7nm Navi. They can sell more high-end GPUs at a lower cost as the overall die will be far smaller. OR they can sell GPUs with defects in the RT and tensor cores - GPUs that would otherwise have been thrown in the trash - thereby appeasing additional consumers who want faster GPUs at a lower cost. They can simultaneously push RTX while selling cheaper or partially defective GPUs and gaining market share.

    No more G-Sync tax either. Nvidia seems to be positioning themselves to scoop up more of the market.
     
  41. Robbo99999

    Robbo99999 Notebook Prophet

    Reputations:
    4,346
    Messages:
    6,824
    Likes Received:
    6,112
    Trophy Points:
    681
    I'm a bit sceptical about the rumoured 11xx series without RT cores; to me they'd be shooting themselves in the foot by releasing such a product, because they'd be diluting the number of RTX cards in the market and hence not stimulating adoption of RTX by game developers as much. I suppose they could get away with this if they committed to bringing out a new RTX 3xxx series of cards (on 7nm) within the relatively near future, so that RTX had no chance to lose excessive momentum with game developers and the public. Perhaps if they bring out the RTX 3xxx series in early 2020 then they could get away with launching GTX 11xx series GPUs, but having said that, they'd need to launch those GTX 11xx series cards virtually immediately for that to make any sense either - to get a year's run on them before the RTX 3xxx cards come out.
     
    Last edited: Jan 9, 2019
    hmscott likes this.
  42. Robbo99999

    Robbo99999 Notebook Prophet

    Reputations:
    4,346
    Messages:
    6,824
    Likes Received:
    6,112
    Trophy Points:
    681
    Just checked how much it would cost to buy an RTX 2060 here in the UK, £329 on the NVidia website: https://www.nvidia.com/en-gb/geforc...RCE_UK_2060_RTX_Launch_Google_Search_CES_2019

    That's about £50 cheaper than GTX 1070s listed on Amazon! While I'm not impressed with the price increase over the previous GTX 1060, which is down to no competing products from AMD, it's a good buy if you HAVE to buy a GPU right NOW - Vega 64 and GTX 1080 performance for less than current GTX 1070 prices. In the bigger picture it's probably better to wait for 2020 and 7nm NVidia GPUs, but it would be my card of choice for anyone who has to buy NOW. Another option is to see how the used Pascal market responds to the RTX 2060 once it launches, and if Pascal price drops are massive, then it might be worth buying one of those to tide you over until 2020 & 7nm GPUs. But I wouldn't be buying Pascal new now, and wouldn't be buying Pascal now at all unless it's just to tide over for a short time until 2020 - it's just ridiculously old tech now!
     
    hmscott likes this.
  43. hmscott

    hmscott Notebook Nobel Laureate

    Reputations:
    7,110
    Messages:
    20,384
    Likes Received:
    25,139
    Trophy Points:
    931
    Hopefully the new / used 1070s, 1070 Tis, and 1080s will now correct lower in price to match the $349-$399 retail pricing - and hopefully that won't creep up much higher - the Nvidia store prices are of course "protected". :)

    It could be that the opposite happens, the demand for 10 series cards continues or increases keeping prices high, and those that want to avoid RTX will be happy to pay what they need to when they need a new GPU.

    If Nvidia slaps down some 11xx GPU releases - GTX only - then that could help lower prices too.

    Then all that RTX goodness will be available to those that want it, and the rest of us can source new-generation rasterization GPUs for less.

    IDK how the AMD releases will affect things, announcements aren't shipments, so it will depend on when the production starts and shipments begin, as to how those new CPU's / GPU's / APU's affect the market.

    Fun stuff. :)
     
    Robbo99999 likes this.
  44. hmscott

    hmscott Notebook Nobel Laureate

    Reputations:
    7,110
    Messages:
    20,384
    Likes Received:
    25,139
    Trophy Points:
    931
    It looks like we can add "the card's in the mail" to the list of official Nvidia BS statements:

    RTX 2060 Review, Did Nvidia Screw Us? The Possible End of Our GeForce Day One Coverage
    Hardware Unboxed
    Published on Jan 8, 2019
     
    Papusan and Talon like this.
  45. Robbo99999

    Robbo99999 Notebook Prophet

    Reputations:
    4,346
    Messages:
    6,824
    Likes Received:
    6,112
    Trophy Points:
    681
    I'm not so certain he has been 'screwed' by NVidia themselves, because Steve over at Gamers Nexus is not that rosy about the RTX launch either, and he got a review sample & had a review written already. Perhaps Hardware Unboxed were the victim of an administrative mess-up, or maybe the Hardware Unboxed guy has a really bad personal relationship with his NVidia contact, to the point that the NVidia contact himself or herself wanted to make life difficult for him this time.
     
    jclausius and hmscott like this.
  46. hmscott

    hmscott Notebook Nobel Laureate

    Reputations:
    7,110
    Messages:
    20,384
    Likes Received:
    25,139
    Trophy Points:
    931
    Same thing really - Nvidia corporate, or a soured personal relationship with Nvidia, or it's just a case of bad timing for shipping last-minute to Australia.

    If his and Tim's stories are true, Nvidia told them everything was fine and that Nvidia appreciated their honest, if not 100% gung-ho, reviews, so who knows.

    At this point I've heard many such stories from many people, and it's not always personal or even professional at all.

    It's just that there are only so many minutes in the day and so many resources to spread around to many deserving destinations.

    At some point you have to triage, and if someone(s) have to get stiffed for a review copy near the end of the line, then try to move around the fickle finger of fate to a different person or group each time.

    So far it's once, so let's hope someone else gets stuck next time. :)
     
    Last edited: Jan 9, 2019
  47. hmscott

    hmscott Notebook Nobel Laureate

    Reputations:
    7,110
    Messages:
    20,384
    Likes Received:
    25,139
    Trophy Points:
    931
    Just how much will RTX laptops cost?
    PCWorld
    Published on Jan 8, 2019
    Gordon found out just how much of a price premium might hit laptops equipped with RTX while visiting CyberPower's suite at CES. Spoiler: not much! CyberPower's Tracer III 15R and Tracer III Extreme 17R are equipped with H-class Intel CPU's and Nvidia RTX 2070's.


    Critique Truth 5 hours ago
    "Wow! Now I guess GTX laptop gonna be cheaper."

    Bevilliage 3 hours ago
    "Just picked up a legion y730 for $950. it was $400ish off."

    Sitohas Wang 3 hours ago
    "Some rumors show that RTX 2060 laptop version may have 1536 Cuda core rather than its desktop counterpart which has 1920 Cuda Core. So it may perform more like desktop 1070, not 1070 Ti. So a little disappointed here."
     
    Last edited: Jan 9, 2019
  48. bennyg

    bennyg Notebook Virtuoso

    Reputations:
    1,567
    Messages:
    2,370
    Likes Received:
    2,375
    Trophy Points:
    181
    GN Steve said at the start of his 2060 review that they'd had it for weeks, which would have added more salt to the wound if they saw it.

    Either way HUB Steve seems like he's normally the kind of bloke who'd respond with sheelberightism rather than salt, and his number 2 has also seemed salty as they called BS on AdoredTV Jim's Ryzen 3000 leaks, so there's more to it. Don't care what tho
     
    hmscott likes this.
  49. Talon

    Talon Notebook Virtuoso

    Reputations:
    1,482
    Messages:
    3,519
    Likes Received:
    4,694
    Trophy Points:
    331
    I can't believe this guy is surprised, seriously watch his videos. Like I said earlier and he admitted it, he's salty and upset. His review at this point is useless and biased.
     
    jclausius likes this.
  50. Robbo99999

    Robbo99999 Notebook Prophet

    Reputations:
    4,346
    Messages:
    6,824
    Likes Received:
    6,112
    Trophy Points:
    681
    What's sheelberightism? She'll be right'ism? But I still don't get that, ha!
     
    hmscott and bennyg like this.