My current build, which I put together in 2011, is as follows and has never crashed or blue-screened to this day:
FX-8150 Bulldozer
Radeon HD7970 3GB VRAM
16GB RAM
I recently bought a power supply, a Radeon VII, a Samsung 970 EVO Plus 500GB, and a 3TB Seagate Barracuda, and I'm going to purchase an X570 board and a Ryzen 9 3900X with 16GB of DDR4 next month (can someone recommend a good 16GB kit from Newegg, or 32GB if you think that's necessary?).
However, I bought the VII directly from AMD (the Gold Edition), which is now sold out, and I am conflicted.
I do A LOT of gaming, mostly Unreal games, and I love Resident Evil 2. But I also want to get back into Photoshop and Adobe Lightroom for photography and photo manipulation.
A lot of people on the Nvidia side are trashing the Radeon VII, and even the CEO of Nvidia trashed the VII, calling it a garbage card. I got the card for a good deal ($920 Canadian plus tax, with 2 free games and a shirt).
I do like AMD personally, because this current build I have is all AMD and it has yet to see a single crash and is still going strong.
My goal is literally 4K 60FPS in gaming, which I saw the VII can do on AMD's site, and that's why I waited for an AMD 4K card. My monitor is currently a 1440p one I bought 3 years ago, but within the next few weeks I will definitely be getting a 4K monitor. I don't think I will ever go to a higher resolution, because there is no noticeable difference.
However, everyone is saying the card is "garbage" and it's discouraging me now. I thought buying into ray tracing right now wasn't worth it for me, but should I be worried that this card won't do ray tracing, PhysX, or something called HairWorks?
I would be pretty upset if I spent $1000 on a card that couldn't do ray tracing at a smooth frame rate at 1440p and higher.
Did I make a bad choice? I don't get it: if it's such a bad card, as every YouTuber and reviewer says, why did the Gold Edition SELL OUT and why are people buying it?
-
-
Ray tracing is kind of not worth it across the board right now. If I had a desktop I would be getting the Vega VII; that being said, Navi cards are supposed to be around the corner, though those are supposed to be the mainstream offering, IIRC.
Vega VII was the GPU everyone thought would never make it to the consumer side of things. -
It's a powerful card. Of course it won't compete with a 2080 Ti at stock, but they are nice overclockers from what I've heard. The 16GB of HBM2 will help if you intend on playing at 4K.
Ray tracing is an interesting technology, but personally I wouldn't buy into it until it has matured, 1-2 years from now. -
OK, thanks for the reassurance, but why do all the Nvidia card owners and a lot of YouTubers trash the card and AMD? I don't like trashing companies, whether Nvidia or AMD. I personally just choose AMD because I have had a better history with them in terms of longevity.
There's no way I would purchase a 2080 Ti at its current price. I have the funds for it, but most ray-tracing games are not coming out until late 2020, and by that time Nvidia will already have the next generation out and I would be thinking "well, f***, there goes $1800 CAD". -
There's always been a lot of hate for AMD, and some of it was justified over the years, but in this case it's likely because there was a lot of marketing for Vega 56/64, and while I wouldn't call those cards failures, they were certainly underwhelming, to the point where Raja quit and went over to Intel.
AMD typically plays the performance-per-dollar angle and Nvidia doesn't have to, and outright performance crowns mean a lot to those who seek them, who typically have the loudest voices.
Other than that, in desktop forums like OCN there has always been a lot of Red vs. Green. On NBR it's not as bad, mostly because AMD doesn't really have a presence in that market.
These are my thoughts on it, at least: every time AMD comes out with a new product, everyone hypes themselves into thinking it's going to be the one that crushes Nvidia, and when that doesn't happen, it's obviously because AMD is garbage. -
-
Why did you sell it?
How come everyone is saying that high memory is so important?
Everyone on YouTube is saying that high memory on a graphics card isn't important because games don't use more than 8GB anyway.
Wait, how is a card that is worse than a 2080 a better investment? Can you explain?
Thank you so much. -
Why is the amount of memory so important, and why is everyone on YT saying a large amount of memory on a GPU isn't important? The long explanation, shortened and simplified, is that some applications WILL [properly] utilize the memory to its full capacity and speed. The R7 is an amazing compute card for the price. The YouTubers are short-sighted, like the majority of humanity. You only have to look at older GPUs that still have powerful processing abilities but are hampered by the amount of memory. Make no mistake about it, newer software will start using more of the memory on GPUs like these. Which leads me to your last question.
The R7, while slower than a 2080, will age better simply due to its 16GB of memory. Based on your older hardware, you appear to keep your hardware for some time. This is where the investment comes into play: you will not need to upgrade as quickly as others when games and applications require more memory.
Lastly, take everyone's advice (including my own) as opinion. Ultimately, research with facts will beat popular opinion. The simple fact is that the majority of the negative comments on higher-end components come from people who have never used the hardware and cannot afford it. Keep the card, and that's coming from someone who uses high-end components in both personal and business environments. -
Reminds me of the GTX 580 1.5GB.
It was made kind of useless pretty quickly. -
NVIDIA's real advantage this generation is lower heat and power consumption, which is great for people like me who have small form-factor systems. My RTX 2070 Founders Edition is overclocked out of the box and only uses a single 8-pin PCIe power connector. Were the Radeon VII or Vega 56/64 similarly capable, I'd probably be using one. -
OK, thanks guys, I'll keep the card.
I guess, to be honest, a lot of my buyer's remorse was coming from Nvidia advertising real-time ray tracing in their new cards and the Radeon VII not having it.
A lot of people online kept saying:
8GB GDDR6 with RT cores >> 16GB HBM2
Apparently having a ray tracing card is more important than anything else.
The marketing is so well done that it makes games look like **** without it. Although, to be honest, I think the ray tracing in games isn't done realistically, because reflective surfaces are not that reflective in real life.
Has Nvidia ever marketed a new technology beforehand that then became universal? -
Ray tracing is about as amazing as PhysX at the moment.
As for "new technology," FreeSync works on Nvidia cards that Nvidia "allows." Any time anything flashy comes along, marketing has to push it like it's God's gift to mankind. -
And to answer your last question: no, none of Nvidia's proprietary tech has become a standard. On the other hand, AMD uses a lot of open standards in their products. Take, for example, FreeSync. As @TheReciever stated, Nvidia's take on ray tracing is PhysX at this point. -
How is the 2080 Ti performance-wise? Of course it obliterates every other graphics card on the market, but do you think it's worth double the price of the R7/2080?
I'm actually confused, because the majority of ray-tracing games don't even come out until late 2020, but Nvidia's release cycle is roughly every two years, and by that time, around September 2020, they should be releasing a new set of cards.
Their strategy makes no sense to me. -
The 20xx series was a meh update; ray tracing was thrown in to give it justification, as well as something to market.
-
Is it worth it? To me it is, but what you value is different from what I value. For the majority of people, the 2080 Ti is overkill for their needs. If you aren't overclocking or running an ultrawide or 4K monitor, you don't need a 2080 Ti.