The Notebook Review forums were hosted by TechTarget, which shut them down on January 31, 2022. This static, read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information posted on the forums is preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.

    Nvidia RTX 20 Turing GPU expectations

    Discussion in 'Sager and Clevo' started by Fastidious Reader, Aug 21, 2018.

  1. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,426
    Messages:
    58,171
    Likes Received:
    17,882
    Trophy Points:
    931
    Why manage expectations? Nvidia will get a backlash if they are not met; that's no forum user's job.
     
    ajc9988 likes this.
  2. ajc9988

    ajc9988 Death by a thousand paper cuts

    Reputations:
    1,750
    Messages:
    6,121
    Likes Received:
    8,848
    Trophy Points:
    681
  3. Stooj

    Stooj Notebook Deity

    Reputations:
    187
    Messages:
    841
    Likes Received:
    664
    Trophy Points:
    106
    I'm not naive when it comes to Nvidia's tactics. They've done it with every card release and benchmarks have always revealed the truth. For the most part, each generation has been a decent upgrade (possibly with the exception of the GeForce 5 FX gen...that wasn't great...).

    As far as being a Nvidia rep...hah! I wish I was, then the price problem wouldn't even matter!

    Plus I live in Australia where we get EXTRA reamed on pricing. Don't assume that I'm oblivious to the huge price difference. The Founders 1080Ti is $1899 AUD here which as a direct comparison is $1391.21 USD.

    All I can say, is it's a hell of a lot cheaper than modding my car :eek:
     
  4. ajc9988

    ajc9988 Death by a thousand paper cuts

    Reputations:
    1,750
    Messages:
    6,121
    Likes Received:
    8,848
    Trophy Points:
    681

    My response to the "buying time" part was: "Also, on buying time, you are buying the time to feel like you overpaid for an unsupported feature and that Nvidia took all your money, which is a hell of an experience!"
     
    Last edited: Aug 25, 2018
    Hackintoshihope likes this.
  5. Fastidious Reader

    Fastidious Reader Notebook Evangelist

    Reputations:
    3
    Messages:
    365
    Likes Received:
    41
    Trophy Points:
    41
    Pretty much sums up Physx and Hairworks in my experience.
     
    ajc9988 likes this.
  6. Hackintoshihope

    Hackintoshihope AlienMeetsApple

    Reputations:
    308
    Messages:
    1,042
    Likes Received:
    227
    Trophy Points:
    81
    This video is amazing and truly explains why one should not hesitate to buy an RTX card now!

    Edit: Now this is truly satire. Haha!
     
    Last edited: Aug 25, 2018
    hmscott and ajc9988 like this.
  7. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,426
    Messages:
    58,171
    Likes Received:
    17,882
    Trophy Points:
    931
    This is Nvidia with a truly dominant market share and a feature they are likely to push for some time; to be fair, it's likely to see decent adoption.
     
  8. Fastidious Reader

    Fastidious Reader Notebook Evangelist

    Reputations:
    3
    Messages:
    365
    Likes Received:
    41
    Trophy Points:
    41
    Maybe. It might be just as well adopted as Hairworks and PhysX.

    It all depends on how well it works out of the gate for the consumer.
     
  9. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,426
    Messages:
    58,171
    Likes Received:
    17,882
    Trophy Points:
    931
    I think it's a bit more significant; it depends on how easy it is to add in, I guess.
     
  10. aaronne

    aaronne Notebook Evangelist

    Reputations:
    321
    Messages:
    466
    Likes Received:
    524
    Trophy Points:
    106
  11. ole!!!

    ole!!! Notebook Prophet

    Reputations:
    2,879
    Messages:
    5,952
    Likes Received:
    3,982
    Trophy Points:
    431
    Going from a 1070 to a 4080 will be a nice, decent jump in performance and efficiency, assuming my laptop can still be upgraded at that time! But this new RT I will put off until it matures a bit and is worth the money.
     
  12. BrightSmith

    BrightSmith Notebook Evangelist

    Reputations:
    143
    Messages:
    640
    Likes Received:
    383
    Trophy Points:
    76
  13. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,426
    Messages:
    58,171
    Likes Received:
    17,882
    Trophy Points:
    931
  14. RampantGorilla

    RampantGorilla Notebook Deity

    Reputations:
    72
    Messages:
    780
    Likes Received:
    313
    Trophy Points:
    76
    At that point, the PCIe 3.0 x16 connection in your laptop would bottleneck the 4080...
     
  15. ole!!!

    ole!!! Notebook Prophet

    Reputations:
    2,879
    Messages:
    5,952
    Likes Received:
    3,982
    Trophy Points:
    431
    Whoa, x8 maybe yes, but x16? The 4080 would have to be a damn sight more powerful than the 1080 to do that, like at least 10-fold maybe? I don't think Nvidia can improve their GPUs that much in just 3 generations.
     
  16. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,426
    Messages:
    58,171
    Likes Received:
    17,882
    Trophy Points:
    931
    Unlikely, as an x4 connection right now is fine so long as it comes directly from the CPU.
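
    For a rough sense of the link bandwidths being debated here, below is a minimal back-of-the-envelope sketch (not from the thread) of theoretical PCIe 3.0 throughput per lane count, assuming the standard 8 GT/s per lane with 128b/130b encoding:

        # Rough one-way PCIe 3.0 bandwidth: 8 GT/s per lane with 128b/130b
        # encoding gives roughly 0.985 GB/s of usable bandwidth per lane.
        TRANSFERS_PER_SEC = 8e9          # PCIe 3.0 signalling rate per lane
        ENCODING_EFFICIENCY = 128 / 130  # 128b/130b line-code overhead

        def pcie3_gbps(lanes: int) -> float:
            """Approximate usable one-way PCIe 3.0 bandwidth in GB/s."""
            bits_per_sec = TRANSFERS_PER_SEC * ENCODING_EFFICIENCY * lanes
            return bits_per_sec / 8 / 1e9  # bits -> bytes -> gigabytes

        for lanes in (4, 8, 16):
            print(f"x{lanes}: ~{pcie3_gbps(lanes):.1f} GB/s")
        # x4: ~3.9 GB/s, x8: ~7.9 GB/s, x16: ~15.8 GB/s

    Whether any single future GPU could saturate x16 is a separate question; the numbers only show how much headroom each lane count leaves.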
     
  17. RampantGorilla

    RampantGorilla Notebook Deity

    Reputations:
    72
    Messages:
    780
    Likes Received:
    313
    Trophy Points:
    76
    Actually, I take that back:
     
  18. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,426
    Messages:
    58,171
    Likes Received:
    17,882
    Trophy Points:
    931
    Running two cards is very different from running one, as the cards have to talk to each other. Plus the Titan V has no interconnect like an SLI bridge or NVLink to supplement the PCIe connection.
     
  19. ole!!!

    ole!!! Notebook Prophet

    Reputations:
    2,879
    Messages:
    5,952
    Likes Received:
    3,982
    Trophy Points:
    431
    Well, assuming no bottleneck or anything, there's also the question of whether the 4080 is worth it; and even if it's a complete rip-off, whether it can work in the current P870TM is another question.
     
  20. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,426
    Messages:
    58,171
    Likes Received:
    17,882
    Trophy Points:
    931
    You can never read that far into the future with tech.
     
  21. RampantGorilla

    RampantGorilla Notebook Deity

    Reputations:
    72
    Messages:
    780
    Likes Received:
    313
    Trophy Points:
    76
    Not really, I'm pretty sure there's no interconnect in laptops with 2 MXM 1080s in SLI, which would mean that two 4080s or 4070s would definitely see a bottleneck when in SLI with each other. You're correct about the two cards talking to each other, though.

    I'm pretty sure when you hook up a 1080 Ti to an MBP 15" 2017 using a TB3 eGPU enclosure, the card's performance puts it somewhere between a 1070 and a 1080. The TB3 controller on the MBP 15" 2017 is connected directly to the CPU. There is a bandwidth bottleneck.
     
  22. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,426
    Messages:
    58,171
    Likes Received:
    17,882
    Trophy Points:
    931
    There is a high-bandwidth SLI bridge; in fact, until the high-bandwidth bridge came along, mobile SLI always had a bandwidth advantage over desktop cards.
     
  23. hmscott

    hmscott Notebook Nobel Laureate

    Reputations:
    7,110
    Messages:
    20,384
    Likes Received:
    25,139
    Trophy Points:
    931
    "Everyman's" laptop, Bravo...

    " Notebook with Core i9-9900K and Geforce RTX 2080 for 9,999 euros"

    The NVLink connector is pretty beefy/wide; I wonder if the distance will be an issue with "SLI" 2-GPU laptops? Will the design need to put the 2 GPUs closer together, rather than at opposite edges of the laptop?
     
    Last edited: Aug 27, 2018
  24. hmscott

    hmscott Notebook Nobel Laureate

    Reputations:
    7,110
    Messages:
    20,384
    Likes Received:
    25,139
    Trophy Points:
    931
    Limiting the RTX 2080 TDP to 90 W from 215 W isn't going to help performance, especially with the Tensor / RT cores for ray-tracing and DLSS features...

    That's where the crazy co-habitation of Tensor / RT cores stealing real estate on the die (and power and thermal headroom) is going to pinch off overall performance, especially when all 3 areas are active, like when doing RTX fluffery.

    The thin Asus Zephyrus is a poor host for RTX GPUs... and so will most laptops be.
     
    Last edited: Aug 27, 2018
    aaronne likes this.
  25. Fastidious Reader

    Fastidious Reader Notebook Evangelist

    Reputations:
    3
    Messages:
    365
    Likes Received:
    41
    Trophy Points:
    41
    It's why I'd be perfectly fine with some "hotrod" 20s for laptops. Strip out the Tensor/RT cores and boost the core counts, memory and clocks. Maybe not as much power as the desktop RTX cards, but with their own strengths.
     
    hmscott likes this.
  26. ajc9988

    ajc9988 Death by a thousand paper cuts

    Reputations:
    1,750
    Messages:
    6,121
    Likes Received:
    8,848
    Trophy Points:
    681
    Can't speak to the software side, but interesting on the hardware side:



    Sent from my SM-G900P using Tapatalk
     
    hmscott likes this.
  27. XMG

    XMG Company Representative

    Reputations:
    749
    Messages:
    1,754
    Likes Received:
    2,197
    Trophy Points:
    181
    Yup, classic guerrilla marketing. Put a product online which doesn't exist, tip off the press and then take it offline as quickly as possible, so it looks like you're going to be first to market but don't get into too much trouble. Happens before every single launch; Clevo seem to be the most abused ODM in this regard, unfortunately. Google cache is dead so I didn't see who it was, but direct Clevo customers who have this information first and then are able to test samples early can't afford to even hint at or leak this kind of information ;-)

    EDIT - Ok, I noticed there are huge discussions on ComputerBase about this topic; I'll leave it alone as it's already run its course.
     
    Last edited: Aug 28, 2018
    ole!!! likes this.
  28. ole!!!

    ole!!! Notebook Prophet

    Reputations:
    2,879
    Messages:
    5,952
    Likes Received:
    3,982
    Trophy Points:
    431
    We all know it's coming eventually. Even if Clevo don't officially sell it, the BIOS is ready to take the 9900K. From the looks of things, though, it'll be the same chassis, so likely the same heatsink, no extra fan, and a similar mobo. For the mobo, we'll have to wait and see how different it is.

    @Papusan Clevo saving money as predicted, everything the same except Z390. I was hoping to see some built-in WiFi/BT; guess that's too cheap to happen :D
     
  29. Fastidious Reader

    Fastidious Reader Notebook Evangelist

    Reputations:
    3
    Messages:
    365
    Likes Received:
    41
    Trophy Points:
    41
    Well that does make sense if you are building something that's meant to run portable VR.

    I mean that's the whole VR market as of right now. Nothing out there worth the time for the average user.

    Just trying to pull an Apple with this isn't encouraging.
     
  30. XMG

    XMG Company Representative

    Reputations:
    749
    Messages:
    1,754
    Likes Received:
    2,197
    Trophy Points:
    181
    Yup, but to clarify - I was referring to the GPU GTX 2080 info included in some of the links rather than the CPU side ;-)
     
    Falkentyne likes this.
  31. Papusan

    Papusan Jokebook's Sucks! Dont waste your $$$ on Filthy

    Reputations:
    42,691
    Messages:
    29,824
    Likes Received:
    59,553
    Trophy Points:
    931
    I don't give a damn about built-in WiFi. Much more important that they add the correct 4th fan and use more powerful fans on all 4 fan headers :) Better-designed, more powerful fans will make a significant difference in cooling capacity. This can be done with equal or only slightly higher fan noise.
     
  32. ole!!!

    ole!!! Notebook Prophet

    Reputations:
    2,879
    Messages:
    5,952
    Likes Received:
    3,982
    Trophy Points:
    431
    From my testing, adding that extra tiny 5V 6mm fan did very little and adds a lot of noise. In my case it could be the poor heatsink contact that I have yet to fix, but that fix is coming; I'm just waiting for a few extra things to arrive.

    Having built-in WiFi + BT would be amazing, assuming it would free up more PCH PCIe lanes; being able to put those towards additional storage is always good.
     
  33. Papusan

    Papusan Jokebook's Sucks! Dont waste your $$$ on Filthy

    Reputations:
    42,691
    Messages:
    29,824
    Likes Received:
    59,553
    Trophy Points:
    931
    It will be different with a fan properly designed for the model's small heatsink. And I'm talking about all 4 fans. They need better fans.
     
    Falkentyne likes this.
  34. ole!!!

    ole!!! Notebook Prophet

    Reputations:
    2,879
    Messages:
    5,952
    Likes Received:
    3,982
    Trophy Points:
    431
    What they need is a CPU vapor chamber heatsink that would distribute the heat quickly to all 4 of the existing pipes. Take an 8700K and put it in; the die sits sideways (horizontal) when viewed from the normal direction while disassembling the machine.

    The iGPU on the die should be at the far right, because my cores 0/1 are both the hottest, and my pictures show contact is poorer in that area as well. It also makes sense that the outer pipe, which is in poor contact with that area, only goes to the thinner grill instead of to the biggest main CPU fan.

    It's really sad that the iGPU at the far right is the part getting good contact with the main heat pipes and the main fan; it's a beyond-dumb layout.
     
  35. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,426
    Messages:
    58,171
    Likes Received:
    17,882
    Trophy Points:
    931
    Oh I agree, it's a cool leap; it's this whole question of how you take the move they're making. The underlying transistor design and sheer die size are fun from an engineering point of view.
     
    ajc9988 likes this.
  36. Falkentyne

    Falkentyne Notebook Prophet

    Reputations:
    8,396
    Messages:
    5,992
    Likes Received:
    8,633
    Trophy Points:
    681
    If this thing is good, I'll buy it first thing and mod the fans myself. Maybe I'll have a 5.5 GHz 9900K before brother Papusan. But I need someone clever, skilled and holy like @Prema to mod the GPU vBIOS so we can properly draw 215 W from these things. Or maybe someone can write another vBIOS editor that requires a programmer. We KNOW that someone here is going to want to pull 300 W through this GPU.

    Right @Papusan ?
     
  37. ole!!!

    ole!!! Notebook Prophet

    Reputations:
    2,879
    Messages:
    5,952
    Likes Received:
    3,982
    Trophy Points:
    431
    Damn, if a GPU is going to be 200+ watts I would only want one of them in my machine. Dual-fan cooling off that giant heatsink is enough.
     
  38. Fastidious Reader

    Fastidious Reader Notebook Evangelist

    Reputations:
    3
    Messages:
    365
    Likes Received:
    41
    Trophy Points:
    41
    For this level of stuff I'd think it would definitely have to be some kind of vapour chamber setup. Just pipes and fans wouldn't cut it anymore.

    I remember I had an old Sapphire card that had a pretty slimline vapour chamber design. That could possibly be effective.
     
  39. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,426
    Messages:
    58,171
    Likes Received:
    17,882
    Trophy Points:
    931
    Those power levels are not too dissimilar from the current higher end cards.
     
    ajc9988 likes this.
  40. hmscott

    hmscott Notebook Nobel Laureate

    Reputations:
    7,110
    Messages:
    20,384
    Likes Received:
    25,139
    Trophy Points:
    931
    That is what I am hoping for too, plain non-RTX upgraded GPUs.

    Nvidia should have done what AMD was doing: providing professional ray-tracing development GPUs for creators to start getting games ready, and then come out with ray-tracing in another process generation or two, when the games are ready and the hardware delivery is optimal.

    I think Nvidia might have been worried about 7nm AMD GPUs coming out, getting "close" to Nvidia's best 12nm, and Nvidia thought they'd confuse the issue by coming out with RT/Tensor cores first.

    First is rarely best, as we all know, so Nvidia might have set themselves up to take the "pioneer arrows" and see confidence in their products slip by releasing way too early.

    But maybe Nvidia will "fix" this by releasing mobile non-RTX GPUs that will find their way into desktops too?
     
  41. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,426
    Messages:
    58,171
    Likes Received:
    17,882
    Trophy Points:
    931
    A worried Nvidia would not take huge risks like dedicating a large piece of silicon to a feature.
     
    Vistar Shook likes this.
  42. hmscott

    hmscott Notebook Nobel Laureate

    Reputations:
    7,110
    Messages:
    20,384
    Likes Received:
    25,139
    Trophy Points:
    931
    Last edited: Aug 29, 2018
  43. Fastidious Reader

    Fastidious Reader Notebook Evangelist

    Reputations:
    3
    Messages:
    365
    Likes Received:
    41
    Trophy Points:
    41
    Any fixing might be a while off, as they seem pretty set on ray tracing and AI work being their next big step in performance.
     
    hmscott likes this.
  44. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,426
    Messages:
    58,171
    Likes Received:
    17,882
    Trophy Points:
    931
    That's a progression of what they have been doing. I still don't see that they have anything to worry about.
     
    Papusan likes this.
  45. hmscott

    hmscott Notebook Nobel Laureate

    Reputations:
    7,110
    Messages:
    20,384
    Likes Received:
    25,139
    Trophy Points:
    931
    IDK, the high power requirement of the RTX sections (50% of the die) and the poor FPS @ 1080p performance even on the 2080 Ti suggest underpowered versions will be useless for RTX games.

    The only way out is to disable those features and allow the legacy 50% of the die to get full performance so it can compete against mobile 10 series laptops.

    People aren't going to be happy getting little to no real game performance upgrade, and poor RTX results, poorer than the desktop gets.

    A 90 W GPU vs a 215 W GPU really cuts into the power budget, even if it all goes to the legacy die sections.

    Given that even at its most "wow", RTX seems to be a worthless extension of eye candy (really the first thing I'd turn off), it already seems like a loser to me.

    It shouldn't take too long for people to realize that 50% of the die is wasted on things they don't use or care about, and that the 50% they do care about is now saddled with junk they can't use and don't want.
     
    Last edited: Aug 29, 2018
    TheDantee likes this.
  46. Papusan

    Papusan Jokebook's Sucks! Dont waste your $$$ on Filthy

    Reputations:
    42,691
    Messages:
    29,824
    Likes Received:
    59,553
    Trophy Points:
    931
    Why not wait on the final judgement until after the new Nvidia graphics cards are out, with proper drivers, and have been tested/reviewed?
     
    jclausius and bennyg like this.
  47. Fastidious Reader

    Fastidious Reader Notebook Evangelist

    Reputations:
    3
    Messages:
    365
    Likes Received:
    41
    Trophy Points:
    41
    You've got a point. I mean, when I heard about RTX features for their commercial-grade cards, I thought it would be excellent for film production, where they've got the cash to really make it shine.

    But on the consumer level there is a lot less space to work with.
     
    TheDantee and hmscott like this.
  48. hmscott

    hmscott Notebook Nobel Laureate

    Reputations:
    7,110
    Messages:
    20,384
    Likes Received:
    25,139
    Trophy Points:
    931
    Yeah, RT and TC have got a future, but it really is "in the future".

    For now RT and TC should remain in the creators' realm; professional and commercial needs can be fulfilled for now.

    Making us all pay for it with higher costs and lower performance than we are used to (<60fps @ 1080p!!) isn't going to fly.

    Flying cars still don't fly and neither will RTX; it's too soon!! :cool:
     
    Last edited: Aug 29, 2018
    hayyan likes this.
  49. Fastidious Reader

    Fastidious Reader Notebook Evangelist

    Reputations:
    3
    Messages:
    365
    Likes Received:
    41
    Trophy Points:
    41
    Its future is another few years down the line with VR, where such a process really comes into its own.
     
    hmscott likes this.
  50. hmscott

    hmscott Notebook Nobel Laureate

    Reputations:
    7,110
    Messages:
    20,384
    Likes Received:
    25,139
    Trophy Points:
    931
    That's true too; the cues given by RT can be used much more easily, and to greater effect, in VR than in 2D flat graphics.

    But, then the FPS and lag is even less forgiving...

    It's a tough balance to have the right performance and cost at the right time in the right application.

    Or, you could be Nvidia and push it all out way before its time, charging a massive price while not delivering anything useful for most buyers, to confuse the market in their favor.

    It's not real ray-tracing, DLSS isn't real 4K, it's all an illusion... except for the $$$$'s leaving your wallet. o_O
     
    Last edited: Aug 29, 2018