Why manage expectations? Nvidia will get the snap-back if they aren't met; that's not a forum user's job.
-
Meaker@Sager Company Representative
-
-
I'm not naive when it comes to Nvidia's tactics. They've done it with every card release, and benchmarks have always revealed the truth. For the most part, each generation has been a decent upgrade (with the possible exception of the GeForce FX 5000 series...that wasn't great...).
As far as being an Nvidia rep...hah! I wish I was; then the price problem wouldn't even matter!
Plus I live in Australia, where we get EXTRA reamed on pricing. Don't assume that I'm oblivious to the huge price difference. The Founders Edition 1080 Ti is $1899 AUD here, which as a direct comparison is $1391.21 USD.
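For anyone sanity-checking that comparison, here's a minimal sketch of the implied exchange rate (the rate below is just what those two quoted figures work out to, not an official one):

```python
# Implied AUD -> USD rate from the two prices quoted above.
aud_price = 1899.00   # Founders Edition 1080 Ti, AUD
usd_price = 1391.21   # quoted direct USD comparison

rate = usd_price / aud_price
print(f"implied rate: 1 AUD = {rate:.4f} USD")  # ~0.7326
```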
All I can say is, it's a hell of a lot cheaper than modding my car.
-
My response to the "buying time" part was: "Also, on buying time, you are buying the time to feel like you overpaid for an unsupported feature and that Nvidia took all your money, which is a hell of an experience!" -
Fastidious Reader Notebook Evangelist
Pretty much sums up PhysX and Hairworks in my experience. -
Hackintoshihope AlienMeetsApple
This video is amazing and truly explains why one should not hesitate to buy an RTX card now!
Edit: Now this is truly satire. Haha! -
Meaker@Sager Company Representative
This is Nvidia with a truly dominant market share and a feature they are likely to push for some time; it's likely to see decent adoption, to be fair. -
Fastidious Reader Notebook Evangelist
Maybe. It might be just as well adopted as Hairworks and PhysX.
It all depends on how well it works out of the gate for the consumer. -
Meaker@Sager Company Representative
I think it's a bit more significant; it depends on how easy it is to add in, I guess.
-
RTX's first appearance on mobile seems related to Asus:
https://www.geeknetic.es/Noticia/14...garan-a-finales-de-ano-con-un-TDP-de-90w.html
https://translate.google.com/transl...ano-con-un-TDP-de-90w.html&edit-text=&act=url
omg 90w max-q -
Going from a 1070 to a 4080 will be a nice, decent jump in performance and efficiency, assuming my laptop can still be upgraded at that point! But this new RT... I'll put it off until it matures enough to be worth the money.
-
BrightSmith Notebook Evangelist
-
Meaker@Sager Company Representative
They had a 1080 Max-Q version of that chassis too. -
At that point, the PCIe 3.0 x16 connection in your laptop would bottleneck the 4080...
-
Whoa, x8 maybe, but x16? A 4080 would have to be way more powerful than a 1080 to do that, like at least 10-fold maybe? I don't think Nvidia can improve their GPUs that much in just 3 generations.
-
Meaker@Sager Company Representative
Unlikely, as an x4 connection is fine right now so long as it comes from the CPU directly.
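For rough numbers, here's a minimal sketch of nominal PCIe 3.0 throughput per link width (nominal per-lane rates; real-world throughput runs a bit lower):

```python
# Nominal one-way PCIe 3.0 throughput: 8 GT/s per lane with
# 128b/130b encoding, i.e. roughly 0.985 GB/s per lane.
gbs_per_lane = 8 * 128 / 130 / 8  # ~0.985 GB/s

for lanes in (4, 8, 16):
    print(f"PCIe 3.0 x{lanes}: ~{lanes * gbs_per_lane:.1f} GB/s")

# Thunderbolt 3 tunnels roughly a PCIe 3.0 x4 link (~3.9 GB/s
# nominal, less in practice), which lines up with the eGPU
# performance loss mentioned later in this thread.
```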
-
Actually, I take that back:
-
Meaker@Sager Company Representative
Running two cards is very different to running one, as the cards have to talk to each other. Plus the Titan V has no interconnect like an SLI bridge or NVLink to supplement the PCI-E connection.
-
Well, assuming no bottleneck or anything, there's also the question of whether the 4080 is worth it; and even if it's a complete rip-off, whether it can work in the current P870TM is another question.
-
Meaker@Sager Company Representative
You can never read that far into the future with tech.
-
Not really; I'm pretty sure there's no interconnect in laptops with two MXM 1080s in SLI, which would mean that two 4080s or 4070s would definitely see a bottleneck when in SLI with each other. You're correct about the two cards talking to each other, though.
I'm pretty sure that when you hook up a 1080 Ti to a MBP 15" 2017 using a TB3 eGPU enclosure, the card's performance lands somewhere between a 1070 and a 1080. The TB3 controller on the MBP 15" 2017 is connected directly to the CPU. There is a bandwidth bottleneck. -
Meaker@Sager Company Representative
There is a high bandwidth SLI bridge; in fact, until the high bandwidth bridge came along, mobile SLI always had a bandwidth advantage over desktop cards. -
"Everyman's" laptop, Bravo...
" Notebook with Core i9-9900K and Geforce RTX 2080 for 9,999 euros"
The NVLink connector is pretty beefy/wide; I wonder if the distance will be an issue with "SLI" 2-GPU laptops? Will the design need to put the 2 GPUs closer together than at opposite edges of the laptop? -
Limiting the RTX 2080 TDP to 90W, down from 215W, isn't going to help performance, especially with the Tensor/RT cores for the ray-tracing and DLSS features...
That's where the crazy co-habitation of Tensor/RT cores stealing real estate on the die - and power and thermal headroom - is going to pinch off overall performance, especially when all 3 areas are active, as when doing RTX fluffery.
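A quick back-of-the-envelope on that cut (a minimal sketch, assuming the rumored 90W mobile limit against the 215W desktop board power; performance doesn't scale linearly with power, so this only sizes the squeeze):

```python
# Assumed figures: desktop RTX 2080 board power vs. the
# rumored 90 W mobile limit from the Asus leak above.
desktop_tdp = 215  # W
mobile_tdp = 90    # W

print(f"mobile budget: {mobile_tdp / desktop_tdp:.0%} of desktop")  # ~42%
# That ~42% has to feed shaders, RT cores and Tensor cores
# all at once whenever ray tracing or DLSS is active.
```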
The thin Asus Zephyrus will be a poor host for RTX GPUs...as will most laptops. -
Fastidious Reader Notebook Evangelist
It's why I'd be perfectly fine with some "hotrod" 20-series cards for laptops: strip out the Tensor/RT cores and boost the core counts, memory, and clocks. Maybe not as much power as the desktop RTX cards, but they'd have their own strengths. -
Can't speak to the software side, but interesting on the hardware side:
-
Yup, classic guerrilla marketing. Put a product online that doesn't exist, tip off the press, and then take it offline as quickly as possible, so it looks like you're going to be first to market without getting into too much trouble. It happens before every single launch; Clevo seems to be the most abused ODM in this regard, unfortunately. The Google cache is dead so I couldn't see who it was, but direct Clevo customers, who get this information first and are able to test samples early, can't afford to even hint at or leak this kind of information ;-)
EDIT - OK, I noticed there are huge discussions on ComputerBase on this topic, so I'll leave it alone as it's already run its course. -
We all know it's coming eventually. Even if Clevo doesn't officially sell it, the BIOS is ready to take the 9900K. From the looks of things, though, it'll be the same chassis, so likely the same heatsink, no extra fan, and a similar mobo. For the mobo we'll have to wait and see how different it is.
@Papusan Clevo saving money, as predicted: everything the same except the Z390. I was hoping to see some built-in WiFi/BT; guess they're too cheap for that to happen.
-
Fastidious Reader Notebook Evangelist
Well that does make sense if you are building something that's meant to run portable VR.
I mean, that's the whole VR market as of right now. Nothing out there is worth the time for the average user.
Just trying to "Apple" this isn't encouraging. -
Yup, but to clarify - I was referring to the GTX 2080 GPU info included in some of the links rather than the CPU side ;-)
-
I couldn't give a damn about built-in WiFi. It's much more important that they add the correct 4th fan and use more powerful fans on all 4 fan headers.
Better designed, more powerful fans will make a significant difference in cooling capacity. This can be done with equal or slightly higher fan noise.
-
From my testing, adding that tiny extra 5V 6mm fan did very little and adds a lot of noise. In my case it could be the poor heatsink contact that I have yet to fix, but that fix is coming; I'm just waiting for a few extra things to arrive.
Having built-in WiFi + BT would be amazing, assuming it would free up more PCH PCIe lanes; anything that can be put toward additional storage is always good. -
It will be different with a fan properly designed for the model's small heatsink. And I'm talking about all 4 fans. They need better fans.
-
What they need is a CPU vapor chamber heatsink that would distribute the heat quickly to all 4 of the existing pipes. Take an 8700K and put it in: the die sits sideways (horizontal) when viewed from the normal direction while disassembling the machine.
The iGPU on the die should be on the far right, because my cores 0/1 are both the hottest, and my pictures show that contact is poorer in that area as well. It also makes sense that the outer pipe, in poor contact with that area, only goes to the thinner grill instead of the biggest main CPU fan.
It's really sad that the iGPU on the far right is what gets good contact with the main heat pipes going to the main fan; it's beyond ridiculous. -
Meaker@Sager Company Representative
Oh I agree, it's a cool leap; it's this whole question of how you take the move they're making. The underlying transistor design and sheer die size are fun from an engineering point of view. -
Falkentyne Notebook Prophet
If this thing is good, I'll buy it first thing and mod the fans myself. Maybe I'll have a 5.5GHz 9900K before brother Papusan. But I need someone clever, skilled, and holy like @Prema to mod the GPU vBIOS so we can properly draw 215W from these things. Or maybe someone can write another vBIOS editor that requires a programmer. We KNOW that someone here is going to want to pull 300W through this GPU.
Right, @Papusan? -
Damn, if a GPU is going to be 200+ watts, I would only want one of them in my machine. Dual-fan cooling off that giant heatsink is enough.
-
Fastidious Reader Notebook Evangelist
For this level of stuff I'd think it would definitely have to be some kind of vapour chamber setup. Just pipes and fans wouldn't cut it any more.
I remember I had an old Sapphire card with a pretty slimline vapour chamber design. That could possibly be effective. -
Meaker@Sager Company Representative
Those power levels are not too dissimilar from the current higher end cards.
-
That is what I am hoping for too: plain, non-RTX, upgraded GPUs.
Nvidia should have done what AMD was doing: provide professional ray-tracing development GPUs for creators to start getting games ready, and then come out with consumer ray tracing another process generation or two later, when the games are ready and the hardware delivery is optimal.
I think Nvidia might have been worried about 7nm AMD GPUs coming out and getting "close" to Nvidia's best 12nm parts, so Nvidia thought they'd confuse the issue by coming out with RT/Tensor cores 1st.
1st is rarely best, as we all know, so Nvidia might have set themselves up to take the "pioneer arrows" and lose confidence in their products by releasing way too early.
But maybe Nvidia will "fix" this by releasing mobile non-RTX GPUs that will find their way into desktops too? -
Meaker@Sager Company Representative
A worried Nvidia would not take huge risks like dedicating a large piece of silicon to a feature.
Nvidia sure seems worried now:
http://forum.notebookreview.com/threads/nvidia-thread.806608/page-61#post-10787732
http://forum.notebookreview.com/threads/nvidia-thread.806608/page-61#post-10787604
http://forum.notebookreview.com/threads/nvidia-thread.806608/page-62#post-10787971 -
Fastidious Reader Notebook Evangelist
Any fixing might be a while off, as they seem pretty set on ray tracing and AI work being their next big step in performance. -
Meaker@Sager Company Representative
That's a progression of what they have been doing. I still don't see they have anything to worry about.
IDK; the high power requirement of the RTX sections (roughly half the die) and the poor FPS @ 1080p even on the 2080 Ti suggest that underpowered versions will be useless for RTX games.
The only way out is to disable those features and let the legacy half of the die get full performance so it can compete against mobile 10-series laptops.
People aren't going to be happy getting little to no real game performance upgrade and poor RTX results - poorer than what desktops get.
A 90W GPU vs a 215W GPU really cuts into the power budget, even if it all goes to the legacy die sections.
Given that even at its most "wow", RTX seems to be a worthless extension of eye candy - really the first thing I'd turn off - it already seems like a loser to me.
It shouldn't take too long for people to realize that half of the die is wasted on things they don't use or care about, and that the half they do care about is now saddled with junk they can't use and don't want. -
Why not hold off on the final judgement until the new Nvidia graphics cards are out with proper drivers and have been tested/reviewed?
-
Fastidious Reader Notebook Evangelist
You've got a point. I mean, when I heard about RTX features for their commercial-grade cards, I thought it would be excellent for film production, where they've got the cash to really make it shine.
But at the consumer level there is a lot less space to work with. -
Yeah, RT and TC have got a future, but it really is "in the future".
For now, RT and TC should remain in the creators' realm, where professional and commercial needs can already be fulfilled.
Making us all pay for it with higher costs and lower performance than we are used to (<60fps @ 1080p!!) isn't going to fly.
Flying cars still don't fly, and neither will RTX; it's too soon!!
-
Fastidious Reader Notebook Evangelist
Its future is another few years down the line with VR, where such a process really comes into its own. -
That's true too; the cues given by RT can be used in VR much more easily and to greater effect than in flat 2D graphics.
But then the FPS and lag requirements are even less forgiving...
It's a tough balance to have the right performance and cost at the right time in the right application.
Or, you could be Nvidia and push it all out way before its time, charging a massive price while not delivering anything useful for most buyers, to confuse the market in their favor.
It's not real ray tracing, and DLSS isn't real 4K; it's all an illusion... except for the $$$$ leaving your wallet.