Yeah, it's starting to be fast enough for 4K but is still too slow for higher resolutions. Hence they have to go for new things, such as improved rendering quality, to be able to keep earning money. There's no point for Nvidia to push much higher old-fashioned graphics performance given where AMD is right now.
-
yrekabakery Notebook Virtuoso
I'm talking about the planar reflections on the windows, mirrors, floor, and water in the opening level. It's an expensive way to do real-time reflections (albeit not as expensive as ray tracing), and it can't account for second order reflections (reflections of reflections), but it still looks good. -
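For anyone curious how that works under the hood, here's a rough sketch of the mirrored-camera math behind planar reflections (my own toy numbers, nothing taken from the game):
```python
# Minimal sketch of planar reflections: the scene is re-rendered from a
# camera mirrored across the reflecting plane, then composited onto that
# surface. Plane and camera values below are made-up illustrative numbers.
import numpy as np

def reflection_matrix(normal, d):
    """4x4 matrix reflecting points across the plane n.x + d = 0."""
    n = np.asarray(normal, dtype=float)
    n /= np.linalg.norm(n)
    m = np.eye(4)
    m[:3, :3] -= 2.0 * np.outer(n, n)   # flip the component along the normal
    m[:3, 3] = -2.0 * d * n             # account for the plane's offset
    return m

# Mirror a camera across a floor plane y = 0 (normal +Y, d = 0).
floor = reflection_matrix([0.0, 1.0, 0.0], 0.0)
cam_pos = np.array([3.0, 2.0, 5.0, 1.0])   # hypothetical camera position
mirrored_pos = floor @ cam_pos              # -> [3, -2, 5, 1]
print(mirrored_pos[:3])
# The scene is then rendered once more from the mirrored camera into a
# texture and sampled on the floor/window/mirror surface -- which is why
# each reflective plane costs roughly an extra scene render, and why
# second-order reflections (reflections of reflections) aren't captured.
```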
yrekabakery Notebook Virtuoso
Eh, it's early days yet. Ray tracing is the future, just like programmable shaders were the future back in 2001.
Vistar Shook and hmscott like this. -
It's all a bunch'a hype from hucksters to separate cash from suckers.
They didn't have a good story to sell with normal performance upgrades, which were already in a traditional cycle of delivering higher performance at the same price points for many generations.
After that sweet "gold rush" of mining suckers who would pay "anything" to get in on the action, Nvidia needed another kind of sucker bait to justify bumping their MSRPs up to "mining level" profits, so they could sell directly to their endless pool of end-user suckers.
Nvidia could have treated this the way AMD has: keep DXR on professional-level GPUs for content creation, participate in the DX12 DXR effort, and hold back wide delivery of RT / Tensor cores until they were actually cost effective. But that would negate the sucker-bait profit they can grab right now.
Technical advances shouldn't require draining the blood of your customers. And, we shouldn't encourage this kind of behavior from vendors.
If you are a real Nvidia fan, buy the old inventory and ignore the RTX GPUs; the 10 series is enough for gaming and gives you adequate performance at a half to a third of the price.
Avoid getting sucked into the peer pressure "leading you around by the benchmark" behavior exhibited by most suckers with more money than sense.
As a dedicated Nvidia fan, or just someone who needs a good, cheap gaming GPU, help Nvidia out of its inventory-backlog black hole of despair by buying 10 series hardware. Or save even more money and buy a nice used GPU for pennies on the dollar, as the collapsing mining craze continues to push miners to fold their operations and liquidate their assets.
http://forum.notebookreview.com/thr...l-transformation.812591/page-47#post-10831411
There is no reason to buy or support RTX GPUs as they are currently released and priced. All 3 of these videos do a good job putting RTX GPU pricing into stark contrast against past releases, and show what these overpriced ripoffs should cost:
http://forum.notebookreview.com/threads/nvidia-thread.806608/page-109#post-10831556
Last edited: Dec 9, 2018 Woodking likes this. -
My perspective may be incorrect; however, RTX is just the 10XX line expanded with a higher TDP, slightly more CUDA cores, and slightly faster memory. That alone doesn't merit the huge price tag, so they threw in ray tracing and renamed GTX to RTX to make it seem unique.
Not really interested in playing that game with Nvidia, but hopefully when I have more money than sense I can participate in those kinds of transactions. hmscott likes this. -
yrekabakery Notebook Virtuoso
Don’t care about your opinion, been posted thousands of times already. Ray tracing is and has always been the future, whether RTX succeeds in the long run is irrelevant.
The title of this thread is “ray tracing is BS” (which is a BS statement), not “RTX is BS”.Vistar Shook and hmscott like this. -
Bigger die + more and better components (power delivery, cooling etc) vs. previous gen has its price. And parts (components) ain’t cheaper than last year. Or when Pascal was released.
Vistar Shook likes this. -
I doubt that it's true at all, certainly in the short run (5-10 years), and probably in the long run too; it's a waste of computing power for no good reason.
Ray tracing is cost-prohibitive over the top; even adding the mere sparkle of ray tracing through RTX costs too much.
Nvidia has blown it already by letting that cat out of the bag too early. RT costs way too much.
The compute requirements to replace what is working just fine are too large, and the current "gingerbread" effect of tacking on a "sprinkling" of RT effects is a joke.
Full-coverage RT isn't what's on offer - that's ******** thinking right there if you think it is. What's on offer is the RTX "bag of magic beans" that won't grow anything of use out of the pile of ******** it's planted in. Last edited: Dec 9, 2018 -
yrekabakery Notebook Virtuoso
Ray tracing is the next frontier. Some very smart people in the industry, like Carmack and Sweeney, have been saying this for many years. Ray tracing will eventually win out. It’s not a matter of if, but when.Vistar Shook and hmscott like this. -
It's incredible looking and definitely the future. For the first time in a long time we are finally pushing graphics technology and image quality thanks to Nvidia. No worries though, AMD will copycat like they always do. Maybe they will come out with FRT/Free-RT or some crap in a couple years when they can figure it out.Vistar Shook likes this.
-
"When" is the key question. It's certainly not RTX 1.0, likely not even RTX 5.0, so that's at least 10 years away.
On-grid fusion plasma power is now advertised as being 15 years away, and many other things we can't do yet are "just over the horizon", but the future can be much further away than you think. Progress is much slower than I expected when I was starting out.
I thought everyone would soon be using wireless networks back in 1976-1984, and they could, for about $100,000 per node pair, then $50k, then $10k. And the hardware was the size of huge assemblages, then the size of luggage. It took until 1999 before wireless was in notebooks.
So it was 20+ years from the time I was first exposed to wireless networks until notebooks got them. RF or optical, the mode doesn't matter; the solutions gradually shrank in size and cost until they were available to the consumer for what was an incidental cost.
I thought very soon everyone would be using UNIX and the Internet back in 1978, and we are still struggling to get everyone online, although I think we can declare a win at this point. Not so much for UNIX for everyone, at least not quite yet.
Ray tracing is going to take even longer, starting from where compute power is now, given how performance doubling has slowed for silicon designs.
Turning off the old methods and relying only on real-time ray tracing - at 144 FPS in 4K, on a card costing $400 or less - is much further off than you think.
It's a nice idea, but like fusion reactors, it's decades away. And that's only if people make it an investment priority and invest tens of billions of dollars over that time.
And is the pay-off really worth it? We can't seem to do it for "free" power, so why would we do it for "free" rays?
Nuclear fusion on brink of being realised, say MIT scientists
Carbon-free fusion power could be ‘on the grid in 15 years’
Fri 9 Mar 2018 00.01 EST
https://www.theguardian.com/environ...on-brink-of-being-realised-say-mit-scientists
Which will we reach first? On-grid Nuclear fusion?, or 100% real-time Ray-tracing? Which do you want us all to invest in and deliver first?
Last edited: Dec 10, 2018 -
It's so cute that you're all so excited about RT performance improvements when they all come from casting fewer rays and covering fewer objects (no leaves rustling).
If you took those freed-up RT resources and applied them to more RT treatment where there is currently zero coverage, you'd quickly be back to 30 FPS.
Even with that huge silicon commitment, there just isn't enough compute power to do 100% RT in any usable real-time form, and hardware isn't going to improve quickly enough to give you the compute needed for a 100% RT-only implementation - not for a long time.
So I'm glad you find these FPS "improvements" so amazing, but it's not amazing at all, it's sad. No one would normally allow a release without those optimizations already done - done many times - with resources reapplied to hit the eye-candy target within an FPS / quality budget.
The RT treatment in BFV was rushed in so fast that it covers only a small percentage of the scene, with optimizations applied only against that minor percentage, and the RTX hardware can't hope to cope with the implementation work left on the table to reach a 100% replacement.
Last edited: Dec 10, 2018 -
yrekabakery Notebook Virtuoso
Old man shakes fist at technological advancement. The boomer memes write themselves.
-
You've lost your way in the discussion, and you have no good comeback. If this is the best you can do in response, then instead of posting drivel, wait until your brain catches up so you can deliver an intelligent response. We'll wait for you. Take your time. Last edited: Dec 10, 2018
-
Newer isn't always better but it's not like I don't understand your point.
Call me when we have better enemy AI; F.E.A.R. still seems to be the bar unless I missed something. I have been rather disenchanted with games of late. Last edited: Dec 10, 2018 hmscott likes this. -
@hmscott
wifi is hardware
raytracing is software
Hardware always takes longer; I'm in the same boat, waiting patiently for the future, but it never comes soon enough. If you're as intelligent as you claim, why not bring up anti-aging medicines? I figure life is more important than connecting wirelessly to McDonald's... so yes, 144 will back yrekabakery... you both have points, and you appear more intelligent due to age. hmscott likes this. -
A WLAN module with no software to make it function is a paperweight, and not a very good one at that. hmscott likes this.
-
RT Cores / Tensor cores, 50% of the area of the RTX GPU die is ruined with that crap.
WiFi is mostly software; the hardware is the RF, modulation, and interfaces supporting it. Software runs all that hardware, along with the protocols and applications that drive it.
As far as length of life vs. quality of life vs. cost, I'd have to say RTX is a complete failure: it provides no quality of life while it exists, and no length of life either, as it's going to be a sharp failure of short duration.
The only successful thing RTX GPUs are doing is gathering lots of money for Nvidia and its partners. At least until people wake up and figure out they are being ripped off. Then they can buy GTX GPUs and AMD GPUs, save a bunch of money, and still game just fine. Last edited: Dec 11, 2018 electrosoft likes this. -
Even without RTX, Nvidia could still charge a premium. No competition. They have a virtual monopoly.
See above.Vistar Shook likes this. -
The ray tracing technique is software that needs new hardware to run... I have no idea how you guys argue the most bizarre stuff.
Ripped from Google's definition:
Ray tracing is a technique that can generate near photo-realistic computer images. A wide range of free software and commercial software is available for producing these images. This article lists notable ray-tracing software.
In computer graphics, ray tracing is a rendering technique for generating an image by tracing the path of light as pixels in an image plane and simulating the effects of its encounters with virtual objects.
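To make that definition concrete, here's a tiny toy sketch (mine, nothing to do with RTX or any real engine): one ray per pixel, tested against a single sphere, with the hits shaded by a simple light. A full ray tracer adds shadow, reflection, and refraction rays at every hit, which is where the workload explodes:
```python
# Toy ray tracer: one primary ray per pixel against a single sphere.
# Purely illustrative -- the scene values are made up.
import math

WIDTH, HEIGHT = 40, 20
SPHERE_CENTER = (0.0, 0.0, -3.0)
SPHERE_RADIUS = 1.0
LIGHT_DIR = (0.577, 0.577, -0.577)   # roughly normalized directional light

def hit_sphere(origin, direction):
    """Distance along the ray to the sphere surface, or None on a miss."""
    oc = [origin[i] - SPHERE_CENTER[i] for i in range(3)]
    b = 2.0 * sum(oc[i] * direction[i] for i in range(3))
    c = sum(x * x for x in oc) - SPHERE_RADIUS ** 2
    disc = b * b - 4.0 * c            # direction is unit length, so a == 1
    if disc < 0:
        return None
    t = (-b - math.sqrt(disc)) / 2.0
    return t if t > 0 else None

for j in range(HEIGHT):
    row = ""
    for i in range(WIDTH):
        # Map the pixel to a ray direction through the image plane -- the
        # "tracing the path of light as pixels" part of the definition.
        x = (i / WIDTH) * 2 - 1
        y = 1 - (j / HEIGHT) * 2
        length = math.sqrt(x * x + y * y + 1.0)
        d = (x / length, y / length, -1.0 / length)
        t = hit_sphere((0.0, 0.0, 0.0), d)
        if t is None:
            row += "."
            continue
        # Shade the hit point by how much its surface normal faces the light.
        p = [t * d[k] for k in range(3)]
        n = [(p[k] - SPHERE_CENTER[k]) / SPHERE_RADIUS for k in range(3)]
        bright = max(0.0, sum(n[k] * LIGHT_DIR[k] for k in range(3)))
        row += " .:-=+*#@"[int(bright * 8)]
    print(row)
```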
144 strikes back. Last edited: Dec 11, 2018 -
Hey, IDK why you insist on arguing with data that isn't from Nvidia - they are the first to add this ray-tracing assist hardware to get closer to real-time ray tracing. Even that quote about software still relies on the underlying hardware assist of the GPU and CPU; software doesn't run on thin air.
Go to Nvidia's Turing / RTX release info; they discuss there that 50% of the die is for RTX features, and that without it they couldn't have accomplished real-time ray tracing or DLSS.
" Turing represents the biggest architectural leap forward in over a decade, providing a new core GPU architecture that enables major advances in efficiency and performance for PC gaming, professional graphics applications, and deep learning inferencing.
Using new hardware-based accelerators and a Hybrid Rendering approach, Turing fuses rasterization, real-time ray tracing, AI, and simulation to enable incredible realism in PC games, amazing new effects powered by neural networks, cinematic-quality interactive experiences, and fluid interactivity when creating or navigating complex 3D models."
https://devblogs.nvidia.com/nvidia-turing-architecture-in-depth/
The graphical lines show a rough % of die usage for context; they don't actually delineate the exact boundaries of use. The actual circuits are mixed in together.
https://www.anandtech.com/show/13282/nvidia-turing-architecture-deep-dive
The ray-tracing interface is through software, but like I said, most things like RTX and WiFi are part hardware and part software; they work together, not independently.
The difference with RTX is that the standard GPU hardware wasn't tuned for ray tracing, so Nvidia built in a hardware assist using RT cores to make it faster than traditional ray-tracing software running on shader / compute hardware - but it's nowhere near powerful enough for a 100% ray-traced replacement. Last edited: Dec 12, 2018 -
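A toy sketch of that hybrid idea (my own illustration, not Nvidia's actual pipeline, with invented names and numbers): rasterize the whole frame cheaply, then spend a limited ray budget only on the pixels flagged as reflective.
```python
# Hybrid-rendering toy: raster pass for everything, ray "assist" only
# where it's flagged and only while the per-frame ray budget lasts.
WIDTH, HEIGHT = 8, 4
RAY_BUDGET = 5           # hypothetical: how many reflection rays we can afford

def rasterize(x, y):
    """Stand-in for the raster pass: return (base_color, is_reflective)."""
    reflective = y >= HEIGHT - 1          # pretend the bottom row is a wet floor
    return ("floor" if reflective else "wall", reflective)

def trace_reflection(x, y):
    """Stand-in for an RT-core style reflection ray; expensive in reality."""
    return "reflection"

frame = []
rays_used = 0
for y in range(HEIGHT):
    row = []
    for x in range(WIDTH):
        color, reflective = rasterize(x, y)
        if reflective and rays_used < RAY_BUDGET:
            color = trace_reflection(x, y)   # only a fraction of pixels get rays
            rays_used += 1
        row.append(color)
    frame.append(row)

print(f"rays used: {rays_used} of {WIDTH * HEIGHT} pixels")
# With these toy numbers only 5 of the 8 "reflective" pixels get a ray; the
# rest keep the raster result -- which is the same reason current "RTX on"
# coverage is partial and falls back to screen-space tricks elsewhere.
```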
thegreatsquare Notebook Deity
Ray-tracing reminds me of a joke about a newlywed wife who is upset to find her husband masturbating and exclaims how she wants to take care of his needs, to which the husband replies, "I appreciate the thought honey, but I don't think you comprehend the workload".
Ray-tracing done right is everything done in Ray-tracing, otherwise you're going to catch things left to screen space reflection. Ray-tracing is tough now and it's not even doing the whole job. Some compare Ray-tracing to tessellation, but it will probably mature like SSAO with increasingly complex types. The hardware and software need to mature.Vistar Shook and hmscott like this. -
Ray-tracing is not a good reason to buy one. Playing games is not a good reason to buy one. There is only one reason (as highlighted below, along with Cuda core count) I would want one, but the pricing is ludicrous.
Vistar Shook, TBoneSan, Arrrrbol and 3 others like this. -
These analyses of what the 20 series pricing would have been go into a lot of detail, and provide good estimates of what the RTX GPUs are really worth, forgetting about the RTX baggage and focusing on generation-to-generation price points:
http://forum.notebookreview.com/threads/nvidia-thread.806608/page-109#post-10831556
If Nvidia was looking for "next generation GPU repellent" so it could sell off the backlog of 10 series inventory, the RTX GPUs are the perfect storm: no value and no benefits to differentiate them, saddled with prices that make people sad.
The 2080 Ti and the Titan RTX are priced way too high for any rational purchase decision, yet they are the only GPUs that exceed the 10 series. Otherwise you can find a GTX GPU that will match performance for much less - far less if you buy used.
Last edited: Dec 12, 2018 -
Yes, it is extremely, absurdly overpriced. I want a 2080 Ti FTW3, but they are not available, and the price makes me want to punch someone in the groin with brass knuckles.
I am bidding on a new EVGA 1080 Ti SC2 Hydro Copper to SLI my current setup. If I get it for the price I am willing to pay, then I'll do the same power mod to the second GPU.
The trouble for people who want the best is that no more 2080 Tis are available new from retail sources. They've sold out everywhere because lots of people want the best of the best. I am unwilling to settle for a lesser GPU unless the pricing falls to AMD levels or lower for some extra cheap HWBOT hardware points. I have no interest in the weaker GPUs since what I already have is many times better.
I suspect NVIDIA is waiting for more people to lower their standards and buy 1060 and 1070 stock before any of the RTX prices fall in line. From the way it looks now, that might take a long time, because the lesser GPUs do not seem to be selling well even at hugely discounted prices. It would be silly for those with the financial means to waste money on an AMD 580, 590, or Vega 64, or a 1060 or 1070, because they are smart enough to know they won't be satisfied with the performance. All of the benchmark comparisons on YouTube constantly highlight their performance inferiority, so only a moron (or someone with a limited budget) would want them. And there is the possibility that the prices will never fall in line, even after all of the 1060 and 1070 stock is gone, as long as there is such disparity in performance and zero competition from AMD in the high-end GPU space. Vistar Shook, Arrrrbol and Papusan like this. -
Honestly, I'm convinced nVidia is attempting to recover from when they pulled that 1080 Ti release and dropped the price while delivering better performance than the Titan Pascal - a $700 card that beat out a ~$1200 card that a lot of people had bought into.
I mean, seriously, a ~30% performance increase (2080 Ti vs. 1080 Ti) for almost double the price? Someone at nVidia (or multiple someones) is hitting the crack pipe hard. hmscott likes this. -
It might be the crack pipe. It could also be that they are just trolling for dumb-dumbs. Normally, I would say you can't blame them for seeing what they can get away with... but the trouble is, if too many people drink the Kool-Aid, asinine becomes the new norm. They blamed it on the bitcoin miners before, but now that mining is no longer a thing they are perpetuating the insanity, and some are numb to the nonsense. If and when the prices do return to normal, we need to brace ourselves for the non-stop expressions of outrage from the dumb-dumbs that paid double for the same product. I may ultimately end up falling into the dumb-dumb camp before it's over, but I am not rushing toward the edge of the cliff and waving my wallet in the air. Vistar Shook, Arrrrbol, Papusan and 1 other person like this.
-
And, then Nvidia has the gall to follow up with another 2x price jump between RTX GPU models, from $1300 -> $2500 for the Titan RTX, and then try to market it to consumers.
As a professional card, fine, it's going to get budgeted for commercial use. But then Nvidia actually tries to sell the Titan RTX to the gamer / consumer crowd by doing that "cute as a button" sneaky sneak of slipping a Titan RTX box into the background of "social media" influencers' videos as a soft release.
Nvidia needs to put triple the effort into delivering RTX DLSS and ray tracing in new games; otherwise people might gather in groups asking for refunds on their RTX GPUs, considering Nvidia has yet to deliver useful RTX software for these high-dollar GPUs.
How long will people wait before calling Nvidia on their release promises of 25 DLSS games and 10+ ray-tracing games? It will soon be 5 months, with 1 RT game and no DLSS games. Sad. Last edited: Dec 12, 2018 -
Yea, I'm not falling for it. I've got a 1080ti and a 980ti in my desktop, and I'm perfectly satisfied with the performance out of them. The 1080ti rocks socks, and honestly, I'm a 2k guy, not a 4k, so I'm not worried about nailing 144fps at 4k. 144 @ 2k is more than enough, and the 1080ti definitely hits that spot, at least on games where the engine isn't complete and utter garbo
Sigh. I miss the days of the Radeon R9 290X. That was AMD's last great, competitive card. Excellent performer, and still relevant even today. After that, they just did a Hawaii XT re-release with the 390X, and then completely tanked. Fury was ****, Vega is an absolute bomb, and the 580/590 **** they're trying to pull now is just a rehash of tech from 2012.
It seems when AMD does well in one field, they completely tank in the other. Hopefully, with the new market share they've got from the Ryzen success, they're able to rebuild their GPU division and actually force nVidia to stop being twats.
I only made the switch to intel/nVidia because AMD started to suffer hard in the CPU field, then their GPUs started slowly tankingVistar Shook, electrosoft, Arrrrbol and 3 others like this. -
Hey, mr fox, sorry to derail this thread, but is there a way to PM you? I've got a question, but it looks like your PMs are off
-
All software runs on hardware... I honestly don't know how to take you guys anymore... it's like the lights are on but nobody's home.
-
I used to respect you guys... but now I realize you're not half as smart as you appear.
-
It'll be a cold day in hell when I reply to you.
Edit: I stopped posting frequently after you guys declared Nvidia GeForce Now obsolete when it's the future... really now... even Linus agrees.
-
yrekabakery Notebook Virtuoso
Do you enjoy talking to yourself?
Vistar Shook, killkenny1, Prototime and 5 others like this. -
saturnotaku Notebook Nobel Laureate
I hope the irony of this post is not lost on anyone.Vistar Shook, killkenny1, electrosoft and 2 others like this. -
Yea, it really bothers me how they tried to market the Titan RTX. 2500 dollars is legit like 2 really high powered gaming PCs. And they actually expect normies to buy that? Wot?
I don't know how nVidia expects to stay in business with these kinds of price asks. Yes, there will always be slobbering, rabid fan-bois that will drop cash at every new release, even if they have to go the next two months without food, but that alone cannot possibly sustain them.
I think nVidia realized that they had a runaway train with the 10 series, when AMD came out with the really weak Vega line, and they're trying to capitalize on it by targeting (as mr fox says) the dummies with more money than sense. It used to be tit for tat when it came to red vs green, now it is just green steamrolling the GPU world. And here we see the monopoly hard at work. This is why competition is good.
I don't see how they can justify asking those prices too, for tech that, like you said, basically doesn't exist. "Yeaaaa, here's this super awesome tech that gives you SHINY THINGS".
K. But where is it?
And when it does finally show up, the performance isn't great. I can forgive that, at least, because it is Gen1 immature tech. I cannot forgive sucking up 50% of a GPU die on a 12nm process for tech that we don't even have, and that requires a ****load of extra time on the development cycle, just to integrate nVidia's ray tracing into the rendering process for a game.
AMD is supposed to be releasing their NAVI line on the 7nm process, and supposedly (unconfirmed though), they've got ray tracing tech in the works too.
Given how well AMD cards seem to do with Vulkan, and the continuing adoption of it (thank ****, I've hated DX for years), their cards might actually outperform nVidia's, given the right conditions, for a fraction of the cost. I mean, with a 290X I was able to do 1080p in Doom 2016 at 150+ FPS on maxed-out settings. That's really good for a card from 2013.
What nVidia really needs to do is take the Turing line and do a second release without the RT cores. The cards would be really good - in fact, they'd probably outperform the hell out of the current Turing line purely because of all the freed-up die space. They could also drop the prices to match last-gen prices. That would make more sense. Like, do a 2065, 2075, 2085, 2085 Ti (please), etc. Last edited: Dec 13, 2018 hmscott likes this. -
electrosoft Perpetualist Matrixist
I’m over Nvidia’s shenanigans atm.
There is no way in hell im paying anywhere near what they’re asking for the 2080ti.
I even regretted buying an EVGA Kingpin 1080ti because it was so expensive. Luckily (why I do not know) the market for Kingpins went back up recently and I was able to sell my Kingpin for the price I paid earlier this year.
I ended up buying a Sapphire Vega 64 brand new with discounts for $321 shipped from Newegg, pocketed the massive difference, and it does the job for what I need.
It will be quite awhile before I return to Nvidia’s camp.*
* I reserve the right to be a hypocrite anytime
hmscott likes this. -
Turing *might* get cheaper when the Pascal stocks finally run dry, but it will likely still end up costing more than the equivalent Pascal GPU did. To be fair to nVidia, these Turing cards probably do cost more to manufacture, but they will still be getting some silly margins on them at the current pricing. The only thing which will bring Turing prices down to a reasonable level is if AMD manage to give us something that can compete with the 2070 and 2080 at a lower price. They should be able to manage that on 7nm, since Turing was a pretty small improvement over Pascal. If by some miracle they beat Turing by a considerable margin then I would expect Turing to be very short lived before nVidia replace it. Not holding my breath for that one though.Vistar Shook, hmscott, Mr. Fox and 1 other person like this.
-
Support.3@XOTIC PC Company Representative
It's not hypocrisy if you get what's good for you at the time. Better not to get overly attached to or overly opposed to any particular brand.hmscott, undervolter0x0309, hfm and 1 other person like this. -
It's only the future if you like giving away control over your games and being totally dependent on the internet. I'd rather run all my stuff in-house. Also, Linus's video is just sponsored content under the most ideal settings. No way it can replace my setup at home: 1440p 165Hz gaming goodness and a 150 Mbit fibre-optic connection. That's pretty much the best you can get for online gaming. No streaming platform can offer such framerates, fidelity, and minimal input lag. Last edited: Dec 15, 2018 Vistar Shook and hmscott like this.
-
saturnotaku Notebook Nobel Laureate
Lest we also forget that a lot of HSI connections in the US are capped now. Assuming my math is correct, playing for 3 hours per day at the recommended 50 Mb/s quality setting will blow through a 1 TB data limit in a little more than 2 weeks.Vistar Shook, hmscott, Papusan and 2 others like this. -
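For what it's worth, the math does check out. A quick back-of-the-envelope sketch (assuming the 1 TB cap means 1,000 GB and the stream actually sustains the full 50 Mb/s):
```python
# Rough data-cap check for 3 hours/day of 50 Mb/s game streaming.
STREAM_MBPS = 50          # megabits per second
HOURS_PER_DAY = 3
CAP_GB = 1000             # assuming a 1 TB cap counted as 1,000 GB

gb_per_day = STREAM_MBPS / 8 / 1000 * 3600 * HOURS_PER_DAY   # Mb/s -> GB/day
days_to_cap = CAP_GB / gb_per_day
print(f"{gb_per_day:.1f} GB/day, cap hit in {days_to_cap:.1f} days")
# ~67.5 GB/day, cap hit in roughly 14.8 days -- a little over two weeks.
```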
I used to be an "AMD fanboi".
I was one of those insufferable drooling spods that would sit there, waving AMD's flag all day. I had been using computers since the great Athlon days, when AMD would release CPUs on par with or better than Intel's for a fraction of the cost.
Then I grew up and realized, brands don't mean jack ****. What matters is what camp is offering the performance per dollar victory.
Admittedly, that did leave me in AMDs camp for a little while longer, but I was at least investigating Intel after that.
Then the 8350FX dropped, and, frothing at the mouth for an 8Core CPU I was like "YES PLEASSSE".
But it slurped hot turd on a sunlight shingle. I think I used that CPU about a year? Then I went with Intel's 6700k. Which is in the very computer I'm using right now, along with a 980TI and a 1080TI.
Now that AMD is back on the rise(en < see what I did there?) at least in the CPU field, I am strongly considering marching right back into their camp.
If Zen2 drops and is as good as everyone is speculating, I am definitely returning. Intel and nVidia have lost their damn minds with these latest price asks and Intel especially with this SKU remix ********. i7 no longer has hyperthreading? i9 does now? And it's SIX HUNDRED FARTING DOLLARS? No. I will not do it
Also, Intel is STILL vulnerable to the occasional hauntings from Spectre. I would like to place myself on a platform that has been taking active steps to mitigate that, even while advancing hardware manufacturing. That's been AMD all the way.
That being said, if Intel smartened the hell up and started changing their tunes, I might stick with them for this next round.
I've been thinking about a MicroATX build for my desktop iteration, because mini-PC gaming is pretty cool. It's lookin like AMD might be powering that build this time aroundhmscott and undervolter0x0309 like this. -
Support.3@XOTIC PC Company Representative
I'm leaning somewhat that way myself for a media center, little Ryzen box for emulators and hosting a media library. I think I can squeeze more life from my main rig's CPU so I'll probably re-evaluate with what's going on in the industry when that becomes necessary.hmscott and undervolter0x0309 like this. -
undervolter0x0309 Notebook Evangelist
Not being loyal to a brand keeps the industry in check and grounded. Brand loyalty, imo, is what creates irrationalities in the industry. Like Apple pushing for forever-throttling laptops and many OEMs capitulating due to the blind/indoctrinated Apple consumer base.
Thank God that's starting to erode very quickly. The industry eventually becomes grounded again, but when customers don't have brand loyalty, the groundedness happens much quicker.hmscott likes this. -
That's the thinking that gets you saddled with $1300-$2500 of RTX sucker bait instead of spending $500 on a used GTX 1080ti.
It's better to go in patiently, informed and awake, and to not buy things with the knee jerk reaction induced by slick marketing and peer pressure.
And, given a long track record of snake like behavior, there's no reason to not have the shovel ready, just in case.
undervolter0x0309 likes this. -
Support.3@XOTIC PC Company Representative
That actually fits with what I said, in many cases, what's good at the time may well be that used 1080ti. After all, I'm notoriously wary of the first gen of anything, so it's unlikely you'll see me singing praises of RTX right away or whatever that thing Intel is working on when that shows up. I just think it's poor form to go "I'll never buy from [insulting portmanteau of manufacturer name]" and leave it as a foregone conclusion that it will never be a good idea to buy something, new or used, from a particular one. -
Well, nVidia's stock just tanked too. They're gonna need to pull a rabbit out of a fat hat if they want to stay afloat: either major price cuts to move more units, or a release on par with last gen's 1080 Ti. Because if they don't do something, they're gonna end up in the situation RTG is in - poorly performing releases and people vacating the company.
hmscott likes this. -
undervolter0x0309 Notebook Evangelist
I'm actually grateful customers are pushing back against these ridiculous prices. Whether it's apple's $1k iphone or graphics cards that offer minimal visual change for a premium price.
A lot of tech workers get paid very well but are apathetic to most of the population. Almost any mediocre engineer can deliver inefficient/expensive technology. Only the few and passionate will strive to deliver high performance that is also very efficient.
These investor-led companies need to learn to let customers breathe and not have to keep pumping out half-baked and expensive tech.
My two cents
Ray tracing another BS
Discussion in 'Gaming (Software and Graphics Cards)' started by Danishblunt, Nov 25, 2018.