The Notebook Review forums were hosted by TechTarget, which shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information posted on the forums was preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.

    AMD's Ryzen CPUs (Ryzen/TR/Epyc) & Vega/Polaris/Navi GPUs

    Discussion in 'Hardware Components and Aftermarket Upgrades' started by Rage Set, Dec 14, 2016.

  1. hmscott

    hmscott Notebook Nobel Laureate

    Reputations:
    7,110
    Messages:
    20,384
    Likes Received:
    25,139
    Trophy Points:
    931
    Profile: LISA SU - A new AMD is rising
    Coreteks
    Published on May 2, 2019
     
    Vasudev likes this.
  2. hmscott

    hmscott Notebook Nobel Laureate

    Reputations:
    7,110
    Messages:
    20,384
    Likes Received:
    25,139
    Trophy Points:
    931
    Nvidia vs AMD: Moving Past the "GPU War"
    The Good Old Gamer
    Published on Apr 27, 2019
    Deciding which GPU is right for YOU should have NOTHING to do with Brand!


    GTX 1650: Price Premium for ENTRY Level?
    The Good Old Gamer
    Published on Apr 23, 2019
    Nvidia has released its GeForce GTX 1650 today... with NO reviews available for potential customers... and for GOOD Reason! The RX 570 outperforms it for less $.


    Does High-end Gaming Make Sense Anymore?
    The Good Old Gamer
    Published on May 4, 2019
    May the 4th Be With You!
    http://forum.notebookreview.com/threads/does-high-end-gaming-make-sense-anymore.828724/
     
    Last edited: May 5, 2019
  3. Papusan

    Papusan Jokebook's Sucks! Dont waste your $$$ on Filthy

    Reputations:
    42,701
    Messages:
    29,839
    Likes Received:
    59,615
    Trophy Points:
    931
    Last edited: May 3, 2019
    Vasudev and hmscott like this.
  4. hmscott

    hmscott Notebook Nobel Laureate

    Reputations:
    7,110
    Messages:
    20,384
    Likes Received:
    25,139
    Trophy Points:
    931
    The next-generation Sony PS5 and Microsoft Xbox have already been announced to support ray tracing along with other next-generation features.

    AMD supplies the CPUs / GPUs / APUs for the next-gen console hardware. When the ray tracing in console games gets ported to PCs, it will use hardware / software / APIs on PCs that support the console game features hosted on AMD hardware.

    As software-only ray-tracing demos have shown, dedicated hardware may not be necessary to provide real-time ray tracing as implemented.

    I doubt AMD is going to license Nvidia's proprietary hybrid RTX ray-tracing methods for consoles or PC hardware, so that's a dead end.

    Intel can promise anything at this point; it's all smoke and mirrors. So far Intel cancels more projects than it delivers, or ages them out forever, so I wouldn't put much stock in anything Intel promises for their GPUs - they don't seem to have a clue as to what they are really going to deliver.

    In the last couple of months I saw Intel trying to drum up people to do surveys on what consumers want in a next-gen GPU. If Intel is still trying to figure out what people want in their next-gen GPUs, we aren't going to see those requested features implemented and delivered in a commercial consumer product for another 12-24 months.

    Intel wants your feedback on what it should be doing with graphics
    By Paul Lilly 9 days ago
    https://www.pcgamer.com/intel-wants-your-feedback-on-what-it-should-be-doing-with-graphics/

    Intel appears to be clueless and is asking others for a clue as to where to start on their GPU-building quest.

    If Intel is promising hardware-accelerated ray tracing in their data center GPUs, so far it looks like a vague, formless promise, nothing more:

    " ...As David closed his blog he mentioned, “We will look forward to sharing more details on the Intel® Xearchitecture in the months ahead.” I’m pleased to share today that the Intel® Xe architecture roadmap for data center optimized rendering includes ray tracing hardware acceleration support for the Intel® Rendering Framework family of API’s and libraries."
    https://itpeernetwork.intel.com/intel-rendering-framework-xe-architecture/#gs.97iqpj
    https://itpeernetwork.intel.com/intel-xe-compute/

    " ...The blog only mentions that the company's data-center GPUs support the feature, and not whether its client-segment ones do. The data-center Xe GPUs are targeted at cloud-based gaming service and cloud-computing providers, as well as those building large rendering farms....

    "I'm pleased to share today that the Intel Xe architecture roadmap for data center optimized rendering includes ray tracing hardware acceleration support for the Intel Rendering Framework family of API's and libraries," said Jim Jeffers, Sr. Principal Engineer and Sr. Director of Intel's Advanced Rendering and Visualization team. Intel did not go into technical details of the hardware itself...
    "
    https://www.techpowerup.com/255073/intel-xe-gpus-to-support-raytracing-hardware-acceleration

    There is also no mention of real-time ray tracing... Intel's data center ray tracing could be optimized for render farms in a non-real-time environment. As noted, the ray-tracing hardware acceleration is for data-center-only products; no consumer GPU real-time ray-tracing product has been mentioned or promised.
     
    Last edited: May 3, 2019
    Vasudev likes this.
  5. Vasudev

    Vasudev Notebook Nobel Laureate

    Reputations:
    12,035
    Messages:
    11,278
    Likes Received:
    8,814
    Trophy Points:
    931
    I think enterprise and consoles will get the full ray-tracing treatment from AMD, while consumers of AMD GPUs might get those features a little later, because AMD doesn't want them to be DOA like RTX-powered games were without OS support and drivers.
     
    hmscott likes this.
  6. hmscott

    hmscott Notebook Nobel Laureate

    Reputations:
    7,110
    Messages:
    20,384
    Likes Received:
    25,139
    Trophy Points:
    931
    Geez, that's another tactic of Nvidia's that sucks the air out of the industry.

    "RTX" is Nvidia's proprietary BS made up name - a bastardization of their other made up name, "GTX".

    "RTX" is not an abbreviation for Real-Time Ray-Tracing.

    AMD mentioned other methods of accelerating features without dedicating silicon to them, unlike what Nvidia did with their RT / Tensor cores (really just added pipelines), so AMD doesn't have to waste silicon area when those features are not in use.

    Let's wait and see what AMD comes up with and then likely shares with the community as a free and open standard for all to standardize on in the Console, PC, Streaming, and mobile GPU spaces.

    The best way to implement new features is in software, tuning the hardware for unique on-the-fly needs and using the hardware and software together to optimize for all uses dynamically without wasted silicon.
     
    Last edited: May 3, 2019
    Vasudev likes this.
  7. Vasudev

    Vasudev Notebook Nobel Laureate

    Reputations:
    12,035
    Messages:
    11,278
    Likes Received:
    8,814
    Trophy Points:
    931
    I agree, because people only remember who came first; now it's RTX, and even AMD fanboys are in the RTX fantasy world instead of the RT reality world. Once RT picks up the pace in DX, Vulkan and Metal, RTX will be superseded.
     
    hmscott likes this.
  8. hmscott

    hmscott Notebook Nobel Laureate

    Reputations:
    7,110
    Messages:
    20,384
    Likes Received:
    25,139
    Trophy Points:
    931
    Given the prevalence of the terms RTX and "hardware assisted RT", AMD marketing, like Intel marketing, may feel forced to use those terms in some fashion to appear competitive, spinning "hardware assist" in some form to match Nvidia.

    I hope neither actually wastes silicon space doing so for such a minor use percentage; time will tell.
     
    Vasudev likes this.
  9. hmscott

    hmscott Notebook Nobel Laureate

    Reputations:
    7,110
    Messages:
    20,384
    Likes Received:
    25,139
    Trophy Points:
    931
    Analysing Navi - Part 2
    AdoredTV
    Published on May 4, 2019
    What Navi could be, what Navi should be...and what I'm hearing Navi is.

     
    Vasudev likes this.
  10. DackEW

    DackEW Notebook Consultant

    Reputations:
    47
    Messages:
    178
    Likes Received:
    24
    Trophy Points:
    31
    Hi, any info on whether a Windows 7 driver exists for the Ryzen 7 3750H?
     
    hmscott likes this.
  11. hmscott

    hmscott Notebook Nobel Laureate

    Reputations:
    7,110
    Messages:
    20,384
    Likes Received:
    25,139
    Trophy Points:
    931
    Atom Ant likes this.
  12. hmscott

    hmscott Notebook Nobel Laureate

    Reputations:
    7,110
    Messages:
    20,384
    Likes Received:
    25,139
    Trophy Points:
    931
    How Strong Can APU's get in the coming years?
    Moore's Law Is Dead
    Published on Mar 28, 2019
    How strong can APU's get 5 years from now? In my opinion, way more powerful than most realize. But before we estimate the future, it is worth discussing the past....


    The Good Old Gamer 1 month ago
    "Moore's Law Is Dead I think the general issue is that PCMR used to be about pc gaming. Today it’s all about numbers and not about the games. This is why people are over paying so much. They’re sold on bigger, albeit mostly irrelevant, numbers. I’ve been getting grief because there’s no reason to game at less than 144fps apparently. The marketing from Nvidia has been so good that no one remembers most games from last gen (much better games in my opinion,) were mostly capped at 60fps on PC XD."

    ImTheSlyDevil 1 month ago
    "The other day I asked Ed from Sapphire something related to this. Basically, APU's will get to a point where they replace mainstream gpus and he said that Sapphire was open to the idea of making consumer motherboards if this ever happens. Imagine a high end Sapphire SFX motherboard with an embedded APU with HBM."

    LazyGeekgamer HD is dead channel 1 month ago
    "This will be exciting for itx gaming and portable gaming too"

    The Good Old Gamer 1 month ago
    "I like your chart. Spelled out what I was talking about via visuals. Once we have hard numbers on the next gen consoles that'll tell us exactly how far we've come. I suspect 8c/16t 2.5-3Ghz CPU with 16MB L3 Cache (1/2 of DT Zen 2,) with Vega 56-64 performance launching next year. There's nothing stopping AMD from slapping on HBM and selling those to OEMs for DT usage."

    Moore's Law Is Dead 1 month ago
    "Thanks Chris, I actually think the PC gaming world is in for a rude awakening when it comes to the PS5's specs. It is an open secret on my channel that I keep mentioning the PS5 is going to blow away most high-end PC gaming builds people have right now. I am by no means ready to double down on this theory, but this is one of those hunches I feel pretty confident about. (My Cut Down Radeon 9 PS5 Leak Video elaborates)

    People keep forgetting that Sony is willing to sell their consoles at a loss when necessary, and also that a monster gaming console would be the perfect thing to make Streaming Platforms look like junk in comparison. Finally, I really....REALLY don't think most people around here understand just how much they are overpaying for GPU's right now. I will have a series on the coming console war some time this Spring... I just need the time and a few more sources to confirm what I think I know."
     
  13. hmscott

    hmscott Notebook Nobel Laureate

    Reputations:
    7,110
    Messages:
    20,384
    Likes Received:
    25,139
    Trophy Points:
    931
    If RADEON Navi isn't Good, what will also be bad?
    Moore's Law Is Dead
    Published on May 5, 2019
    A lot of hopes are riding on Navi's success at this point - whether it's PS5, RTX 3000, or Gaming Prices in general. Let's discuss a bad outcome to Navi.
     
  14. hmscott

    hmscott Notebook Nobel Laureate

    Reputations:
    7,110
    Messages:
    20,384
    Likes Received:
    25,139
    Trophy Points:
    931
    ATI PRADEON, $96 Grey Market RX 580
    Hardware Unboxed
    Published on May 5, 2019


    DipetSourced 8 hours ago
    > bet this thing didn't work/work poorly
    > still beating 1650 with a 66% of it's price
    > pikachu face

    rush21hit 5 hours ago
    "A shady $96 RX gpu kills 1650 in every tests lmao Talk about insult "

    JRV 2017 8 hours ago
    "Its exactly the same as the MAXSUN branded RX580s you can pick up on aliexpress"

    The Corn RX580 mentioned in the video, actually for sale on Newegg for $105, is the lower-spec 2048SP version, not a full RX580. I re-watched the HU segment, and although he doesn't mention the Corn RX580's reduced 2048SP, the video pans over the Newegg product page and shows it @11:22 (listed as 2048 CUDA Cores).

    CORN AMD RX580 [2048SP] 256-Bit 4GB GDDR5 Graphic Card support DirectX12 with dual fans Video Card RX 580 GPU PCI Express 3.0 DP/DVI-D/HDMI,Play for LOL,DOTA,COD,War Thunder etc.
    Limited time offer, ends 05/11 By Corn Electronics
    $105.99 Sale Ends in 7 Days (Sat) Save: $54.00 (34%)
    https://www.newegg.com/Product/Product.aspx?Item=9SIA4RE93K4231

    Listed on Newegg, Corn has 3 models of RX580: the 2048SP version HU pointed out for $109 (now $105), and 2 other full 2304SP RX580s:
    https://www.newegg.com/Product/Prod...&Description=corn+rx580&N=50120625&isNodeId=1
    [Attached image: newegg corn RX580 models.JPG]
    HU did mention there are plenty of name-brand RX580s for sale for only about $60 more (over the $70-90 Pradeon prices), and HU didn't recommend going with the off-brand Pradeon or other off-brand (Chinese "knock-off") models.

    The "ATI Pradeon" benchmarked like a full RX580 in HU's tests - although they weren't extensive tests - and GPU-Z shows 2304SP (@07:40), so the "ATI Pradeon" is a full RX 580. It is only the Corn RX580 HU pointed out that has 2048SP.

    Corn also lists a full RX570 with 2048SP for a few bucks less than their RX580 2048SP. :)

    CORN AMD RX570 [2048SP] 256-Bit 4GB GDDR5 Graphic Card DirectX12 Video Card GPU PCI Express 3.0 DP/DVI-D/HDMI,Play for LOL,DOTA,COD,War Thunder,Apex etc.
    $159.99 => $102.99 Sale Ends in 7 Days (Sat) Save: $57.00 (36%)
    https://www.newegg.com/Product/Product.aspx?Item=9SIA4RE91E9033
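    For anyone wondering how close the cut-down card is to the full chip, here's the arithmetic behind the "2048SP vs 2304SP" comparison above - a quick sketch in Python using only the shader counts quoted in this thread:

        # Stream-processor counts quoted above: a full RX 580 has 2304 SP,
        # while the "RX 580 2048SP" (and the RX 570) have 2048 SP.
        full_rx580_sp = 2304
        cut_down_sp = 2048

        print(f"{cut_down_sp / full_rx580_sp:.0%} of a full RX 580's shaders")   # ~89%

    So purely on shader count, the 2048SP card carries about 89% of a full RX 580's shaders, which is why it lands closer to an RX 570.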
     
    Last edited: May 6, 2019
  15. Game7a1

    Game7a1 ?

    Reputations:
    529
    Messages:
    3,159
    Likes Received:
    1,040
    Trophy Points:
    231
    Hmmm... Looking at the product overview, this is an RX 580 2048SP. In essence, an RX 570. So it's not as good of a deal as one would expect. Especially considering you can get this Gigabyte card for $120 + 2 games or open box PowerColor RX 570 for less than $100 (pre-tax).
     
    Last edited: May 5, 2019
    hmscott likes this.
  16. hmscott

    hmscott Notebook Nobel Laureate

    Reputations:
    7,110
    Messages:
    20,384
    Likes Received:
    25,139
    Trophy Points:
    931
    HU didn't mention either the ATI Pradeon or the Corn RX580 being limited to 2048SP - thanks for noticing and for posting. :)

    I updated my post to point out the 2048SP, as well as other details (below). I re-watched the HU segment, and although he doesn't mention the Corn RX580's reduced 2048SP, the video pans over the Newegg product page and shows it @11:22 (listed as 2048 CUDA Cores).

    Listed on Newegg, Corn has 3 models of RX580: the 2048SP version HU pointed out for $109 (now $105), and 2 other full 2304SP RX580s:
    https://www.newegg.com/Product/Prod...&Description=corn+rx580&N=50120625&isNodeId=1
    [Attached image: newegg corn RX580 models.JPG]
    HU did mention there are plenty of name-brand RX580s for sale for only about $60 more (over the $70-90 Pradeon prices), and HU didn't recommend going with the off-brand Pradeon or other off-brand (Chinese "knock-off") models.

    The "ATI Pradeon" benchmarked like a full RX580 in HU's tests - although they weren't extensive tests - and GPU-Z shows 2304SP (@07:40), so the "ATI Pradeon" is a full RX 580. It is only the Corn RX580 2048 HU pointed out that is limited to 2048SP.

    Corn also lists a full RX570 with 2048SP for a few bucks less :)

    CORN AMD RX570 [2048SP] 256-Bit 4GB GDDR5 Graphic Card DirectX12 Video Card GPU PCI Express 3.0 DP/DVI-D/HDMI,Play for LOL,DOTA,COD,War Thunder,Apex etc.
    $159.99 => $102.99 Sale Ends in 7 Days (Sat) Save: $57.00 (36%)
    https://www.newegg.com/Product/Product.aspx?Item=9SIA4RE91E9033
     
    Last edited: May 6, 2019
  17. hmscott

    hmscott Notebook Nobel Laureate

    Reputations:
    7,110
    Messages:
    20,384
    Likes Received:
    25,139
    Trophy Points:
    931
    USED Vs. NEW RX 570.... How Risky is it Buying Used?
    Tech YES City
    Published on May 5, 2019
    Buy a New or a Used Graphics Card? That is a question that keeps coming up over time, though today I am comparing a brand new RX 570 to a USED Gigabyte RX 570, one that cost under half that of the new solution. Though what are the caveats of doing this? Let's investigate and discuss.
     
  18. Papusan

    Papusan Jokebook's Sucks! Dont waste your $$$ on Filthy

    Reputations:
    42,701
    Messages:
    29,839
    Likes Received:
    59,615
    Trophy Points:
    931
  19. AlexusR

    AlexusR Guest

    Reputations:
    0
    Arrrrbol and Papusan like this.
  20. TANWare

    TANWare Just This Side of Senile, I think. Super Moderator

    Reputations:
    2,548
    Messages:
    9,585
    Likes Received:
    4,997
    Trophy Points:
    431
    I am not sure about others, but I had not anticipated AMD toppling Nvidia's performance. Unlike Intel, Nvidia has not stagnated in development and performance improvements. While the gains have not been enormous, they have been enough to pretty much stem the AMD tide. Unlike with Vega 64, this may finally be the leveling ground against Nvidia's offerings.
     
    hmscott likes this.
  21. AlexusR

    AlexusR Guest

    Reputations:
    0
    It may, at least for the low to mid-range. The article still says AMD is struggling with Navi 20. These are only unconfirmed rumors, of course, so we will see how it turns out. Hopefully there won't be further delays, because I'd like to see prices for the 2080 start dropping faster, and I like seeing more rapid progress in terms of graphical fidelity - that is as valuable to me as the gameplay itself (all of which now depends on how soon AMD will be able to catch up).
     
    Last edited by a moderator: May 6, 2019
    Papusan likes this.
  22. hmscott

    hmscott Notebook Nobel Laureate

    Reputations:
    7,110
    Messages:
    20,384
    Likes Received:
    25,139
    Trophy Points:
    931
    It's all rumor at this point, and some suggest it runs counter to what Lisa Su has stated - that the Radeon VII is the highest-priced, highest-performance GPU to be released by AMD this year - which would push a higher-cost, higher-performance Navi GPU out into 2020.

    However, the way Lisa Su said it leaves some wiggle room, as the Radeon VII is a different class - and cost - than what Navi will be released under.

    So, maybe, a Navi GPU that is both faster and cheaper than the Radeon VII is possible. :)
     
    Arrrrbol and Papusan like this.
  23. Arrrrbol

    Arrrrbol Notebook Deity

    Reputations:
    3,235
    Messages:
    707
    Likes Received:
    1,054
    Trophy Points:
    156
    If the prices are good it will be fine, but it looks like Navi is still based on GCN. AMD really needs to move on; it's been 7 years and they are still using it, and it has never been power-efficient compared to the competition. In comparison, TeraScale only lasted 4 years.
     
    Papusan and AlexusR like this.
  24. hmscott

    hmscott Notebook Nobel Laureate

    Reputations:
    7,110
    Messages:
    20,384
    Likes Received:
    25,139
    Trophy Points:
    931
    It's all rumors right now. I was under the impression a couple of years ago that when Navi was announced it was going to be the first non-GCN architecture, so IDK what has changed since then if anything.

    The only thing I've seen close to "reality" is a brief mention in a code release for Linux, and that could simply be a compatibility naming issue and not a validation of architecture - it's just a name:

    " These first drops are pretty lightweight stuff, with the definitions including some architectural workarounds and a few new instructions, and don’t really indicate a whole lot by themselves. But the language used in the code seemingly confirms the GCN architecture.

    EF_AMDGPU_MACH_AMDGCN_LAST = EF_AMDGPU_MACH_AMDGCN_GFX1010
    "

    https://www.pcgamesn.com/amd/navi-linux-confirms-gcn-design?amp
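    For anyone curious how that define would actually show up, here's a minimal sketch of reading the AMDGPU "mach" field out of a compiled code object's ELF header. The numeric values are copied from LLVM's ELF definitions of that era and may differ between releases, and the file name is just an example - treat it as illustrative, not as anything AMD has published about Navi:

        import struct

        # Assumed constants from LLVM's BinaryFormat/ELF.h (may vary by LLVM release):
        EF_AMDGPU_MACH_MASK = 0x0FF              # low bits of e_flags select the GPU "mach"
        EF_AMDGPU_MACH_AMDGCN_GFX1010 = 0x033    # the entry the patches added for Navi 10

        def amdgpu_mach(path):
            """Return the AMDGPU 'mach' field from an ELF64 code object's e_flags."""
            with open(path, "rb") as f:
                header = f.read(64)                  # ELF64 header is 64 bytes
            if header[:4] != b"\x7fELF":
                raise ValueError("not an ELF file")
            (e_flags,) = struct.unpack_from("<I", header, 48)   # e_flags: 32-bit LE at offset 48
            return e_flags & EF_AMDGPU_MACH_MASK

        # Hypothetical usage on a compiled GPU code object:
        # print(amdgpu_mach("kernel.hsaco") == EF_AMDGPU_MACH_AMDGCN_GFX1010)

    Either way, the name only tells you what target string the toolchain uses; it doesn't prove anything about how different the underlying architecture is.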
     
    Last edited: May 7, 2019
    Arrrrbol likes this.
  25. Arrrrbol

    Arrrrbol Notebook Deity

    Reputations:
    3,235
    Messages:
    707
    Likes Received:
    1,054
    Trophy Points:
    156
    That was my thinking too. Perhaps they had issues with it and decided to use GCN again until they can get it working? Who knows for sure though, the only thing I hope is that it can compete with Nvidia at competitive prices.

    To be fair to them, they have really stretched GCN out a long way past what other architectures would have managed. The gap between the 7970 and the Radeon VII (or even Vega 64) is huge. I just think it's time to move on to something more efficient now so they can start pushing clock speeds higher without needing a nuclear reactor to power it.
     
    Papusan and hmscott like this.
  26. hmscott

    hmscott Notebook Nobel Laureate

    Reputations:
    7,110
    Messages:
    20,384
    Likes Received:
    25,139
    Trophy Points:
    931
    Check my post again (refresh the page to see additions) - that's the only thing close to an official mention of GCN, and it's not clear to me that it says anything about the architecture. It's just a name, and could be used for any number of reasons, such as continuity of the code naming, given it's all code previously written for GCN hardware.

    Maybe there will be a whole other branch that correctly defines it later; at this point it's just a placeholder.

    Do you (or anyone else) have an official mention from AMD that Navi is GCN?
     
  27. Arrrrbol

    Arrrrbol Notebook Deity

    Reputations:
    3,235
    Messages:
    707
    Likes Received:
    1,054
    Trophy Points:
    156
    There is nothing 'solid' but the evidence so far points to Navi being GCN.

    https://wccftech.com/amds-navi-gpus-confirmed-to-retain-gcn-design/
    https://www.eurogamer.net/articles/digitalfoundry-2019-amd-navi-next-gen-graphics-pcb-leak

    Some people seem to think this will be the last time they use GCN however, which if true is a very welcome change.

    It's all still mostly rumours obviously, but enough of them point to it being true. Not that it really matters if the performance is good for the money, which is the most important thing.
     
    Papusan and hmscott like this.
  28. hmscott

    hmscott Notebook Nobel Laureate

    Reputations:
    7,110
    Messages:
    20,384
    Likes Received:
    25,139
    Trophy Points:
    931
    The "evidence" all comes down to 3 letters in a name in a file that's a placeholder for the real thing down the road. It's not definitive, in fact what else would they put in there? The new name? Of course not, that would give away the whole game before announcement. If anything I'd expect the GCN use in plaintext files that need to be shared ahead of release.

    GCN, or not GCN, we'll know soon enough. And, I'm mostly fine with it either way, although GCN is the more well known quantity so perhaps that's better to believe right now for comfort's sake.

    GCN as implemented through the AMD Radeon VII is awesome, and along with driver updates users are learning how to use the tuning software to reduce voltage, reduce power, reduce temperature, while getting more performance - and once the GPU is tuned the fan curve can be tuned to reduce noise.

    If Navi does as well as the Radeon VII in performance upgrade for GCN on 7nm and is available for less cost and in far greater quantities, there will be little reason to pick Nvidia over AMD on cost performance, the same as with Intel.

    I doubt Nvidia will drop prices, and given the poor performance vs cost for the low end RTX / GTX GPU's AMD should have no trouble placing lower cost performance competitive GPU's into the market.

    GCN might actually be good news for stability and reliability measures, as it's a long term well known quantity. We know how to tune it for best performance, and apps / games written to GCN performance will scale predictably.

    A GCN Navi that costs less and performs better than AMD legacy and current Nvidia GPU's would be a good thing.

    Let's hope that whatever or whenever AMD releases a new GPU architecture past GCN that the new architecture actually outperforms GCN.

    Otherwise AMD would be stuck like Intel is with 10nm and Nvidia is with RTX, both committed to failing costly blunders - IMNSHO.
     
    Last edited: May 7, 2019
    Arrrrbol likes this.
  29. Talon

    Talon Notebook Virtuoso

    Reputations:
    1,482
    Messages:
    3,519
    Likes Received:
    4,694
    Trophy Points:
    331
    https://www.cryengine.com/news/how-we-made-neon-noir-ray-traced-reflections-in-cryengine-and-more#

    That Vega 56 "Ray Traced" demo ran at 1080p 30fps. LOL. So a closed loop, tech demo ran at 1080p 30fps. Yikes.

    "However, RTX will allow the effects to run at a higher resolution. At the moment on GTX 1080, we usually compute reflections and refractions at half-screen resolution. RTX will probably allow full-screen 4k resolution. It will also help us to have more dynamic elements in the scene, whereas currently, we have some limitations. Broadly speaking, RTX will not allow new features in CRYENGINE, but it will enable better performance and more details."
     
    AlexusR likes this.
  30. hmscott

    hmscott Notebook Nobel Laureate

    Reputations:
    7,110
    Messages:
    20,384
    Likes Received:
    25,139
    Trophy Points:
    931
    "Probably" isn't doing it now; it's an estimate. Handling more dynamic elements on RTX GPUs won't sell games to people with non-RTX GPUs.

    For the few people with overpriced RTX GPUs, it's just not worth coding to them, as evidenced by these very developers - they coded for non-RTX GPUs and are only speculating on what might have been if they had actually taken the time to code to RTX features. That never happened, because there just aren't enough overpriced RTX GPUs out there to code to.

    Their comments sound more like a play for some juicy funding from Nvidia to implement RTX features. :)

    IMNSHO, there are far more GPUs without Nvidia RTX for developers to target with new games, and clearly that's what game developers need to do - target GPUs without Nvidia RTX, as they have been doing and continue to do to this day.

    Developers are coding for GPUs that support the DXR / real-time ray-tracing / GI APIs, which far outnumber Nvidia RTX GPUs, and their games need to run well on all GPUs in order to sell enough copies to be profitable.

    The new consoles for 2020 will have real-time ray-tracing and GI features, without Nvidia RTX and without Nvidia drivers / libraries, and those will be the games that make it to PC with those features coded to run on GPUs without requiring Nvidia RTX.

    It doesn't matter that a few 2080 Tis out there can do more; there simply aren't enough of them to make it worthwhile to dedicate resources to reach them with specific Nvidia RTX game features in addition to supporting the far larger pool of non-RTX GPUs.

    Game developers need to sell as many games as possible, and that means coding new features to play on the largest number of GPU targets, without limiting the fun to the overpriced Nvidia RTX GPUs.
     
    Last edited: May 7, 2019
    Talon likes this.
  31. hmscott

    hmscott Notebook Nobel Laureate

    Reputations:
    7,110
    Messages:
    20,384
    Likes Received:
    25,139
    Trophy Points:
    931
    AMD and Cray are building the world's most powerful supercomputer
    Engadget
    Published on May 7, 2019
    It'll be an exascale machine that can do a quintillion calculations per second.


    AMD and Cray partner to build the world's fastest supercomputer

    CNBC Television
    Published on May 7, 2019
    AMD CEO Lisa Su and Cray CEO Peter Ungaro join "Squawk Alley" to discuss their partnership with the U.S. Department of Energy to build a new supercomputer named Frontier.


    Expected World’s Fastest Supercomputer Powered by AMD EPYC CPUs and Radeon Instinct GPUs
    AMD
    Published on May 7, 2019
    AMD joined the U.S. Department of Energy (DOE), Oak Ridge National Laboratory (ORNL) and Cray Inc. in announcing a new exascale-class supercomputer scheduled to be delivered to ORNL in 2021. To deliver what is expected to be more than 1.5 exaflops of processing performance, the Frontier system is designed to use custom AMD EPYC™ CPUs and purpose-built Radeon Instinct™ GPUs.


    AMD EPYC CPUs, AMD Radeon Instinct GPUs and ROCm Open Source Software to Power World’s Fastest Supercomputer at Oak Ridge National Laboratory
    https://www.amd.com/en/press-releas...t-gpus-and-rocm-open-source-software-to-power

    Powering the Exascale Era
    https://www.amd.com/frontier

    AMD back to Super form with Frontier Supercomputer

    Graya Overload
    Published on May 7, 2019
    The Department of Energy (DOE) and Oak Ridge National Laboratory (ORNL) announced a new supercomputer, Frontier, that Cray and AMD will be working on. It is a big win for AMD. Epyc CPUs and Radeon Instinct GPUs will be used.
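    For scale, the "quintillion calculations per second" and "more than 1.5 exaflops" figures quoted above are the same thing expressed two ways - a quick sketch of the arithmetic (the exact delivered performance is whatever ORNL ends up publishing):

        # 1 exaflop = 10**18 floating-point operations per second = one quintillion per second.
        EXAFLOP = 10**18

        frontier_target = 1.5 * EXAFLOP   # "more than 1.5 exaflops" per the announcement above
        print(f"{frontier_target:.2e} FLOPS")                      # 1.50e+18
        print(f"{frontier_target / EXAFLOP} quintillion ops/sec")  # 1.5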
     
    Last edited: May 7, 2019
    Ashtrix likes this.
  32. Talon

    Talon Notebook Virtuoso

    Reputations:
    1,482
    Messages:
    3,519
    Likes Received:
    4,694
    Trophy Points:
    331
    The best part is the rookies who said it was running 4K. LOL. I said it was 1080p 30fps when I first saw the video, and what do you know. RTX isn't proprietary, my dude; it's Nvidia's dedicated code/hardware for DXR via DX12. AMD could do the exact same thing with their drivers and hardware. It's not like they're being blocked somehow. Seems dedicated hardware IS IMPORTANT for RT.
     
  33. hmscott

    hmscott Notebook Nobel Laureate

    Reputations:
    7,110
    Messages:
    20,384
    Likes Received:
    25,139
    Trophy Points:
    931
    The Nvidia hardware design is the proprietary part - unless Nvidia releases the design IP into the public domain like AMD did for FreeSync, and anyone cares enough to pick up on it, which is doubtful.

    Nvidia releasing RTX hardware IP into the public domain? Unlikely, knowing Nvidia's penchant for useless proprietary features that even tank the performance of their own GPUs for nothing special in trade, only to use those features to sell the GameWorks-minded on the idea that it's something special that other vendors don't have.

    IMNSHO, none of the Nvidia eye-candy is worth halving the FPS, none of it.

    But "it's a thing", and Nvidia can claim exclusivity; some of the time you can almost see a difference, and some of that is subjectively interesting - but not critical to game play. In fact, the FPS drop interferes with game play versus not taking the performance hit.

    Nvidia's FPS-stealing eye-candy is just that: BS that Nvidia can put on display and point to, so a certain percentage of shiny-object enthusiasts will pay a huge price for it - but none of it improves game play, and subjectively or not, we can do without Nvidia's BS just fine.

    It's going to be a while before consoles release new games using real-time ray-tracing / GI features, with PC releases running on GPUs that support the DXR / real-time ray-tracing / GI APIs. I hope this eye candy can be disabled on consoles as well as PCs.

    That's enough on that for now, lets see how it plays out over time. :)
     
    Last edited: May 8, 2019
    Talon likes this.
  34. Deks

    Deks Notebook Prophet

    Reputations:
    1,272
    Messages:
    5,201
    Likes Received:
    2,073
    Trophy Points:
    331
    GCN is more compute-oriented. That's one of the main reasons it requires more power to run.
    Navi being GCN-based isn't necessarily bad (or new, for that matter - AMD already said from the start that Navi would be GCN-based). They could have decided to disable/remove a certain amount of compute and increase the geometry engines, ROPs and texture units that games would actually use.
    So, wait and see until Navi is released (at any rate, it's supposed to be the last iteration from AMD that uses GCN).
     
    Last edited: May 8, 2019
    Arrrrbol and hmscott like this.
  35. hmscott

    hmscott Notebook Nobel Laureate

    Reputations:
    7,110
    Messages:
    20,384
    Likes Received:
    25,139
    Trophy Points:
    931
  36. Talon

    Talon Notebook Virtuoso

    Reputations:
    1,482
    Messages:
    3,519
    Likes Received:
    4,694
    Trophy Points:
    331
    Nvidia dropped a new driver today after having time to work with the game - you know, since it was an AMD-sponsored title and all. Large performance uplifts in Vulkan; not surprising, though. The same thing happened a few years ago with another game - was it Doom?

    1080p average with Vulkan: 217 fps. But who games at 1080p with a 2080 Ti?
     
  37. hmscott

    hmscott Notebook Nobel Laureate

    Reputations:
    7,110
    Messages:
    20,384
    Likes Received:
    25,139
    Trophy Points:
    931
    What is the link to the new driver test results? I think I'll wait for a re-review to publish a comparison of the new results.
    Any RTX GPU owner that wants a decent frame rate running with DXR enabled? :D
     
  38. Talon

    Talon Notebook Virtuoso

    Reputations:
    1,482
    Messages:
    3,519
    Likes Received:
    4,694
    Trophy Points:
    331
    Unfortunately, I doubt anyone will take the time to re-review a dead game. Had this game released on Steam, I think it would have been a bigger hit, but Epic store exclusivity ruined it! It's also overpriced for the content available. The game needs mods and more content.

    217fps was with my 2080 Ti at my daily-driver settings (overclocked). Still, previously this was not achievable; the driver has fixed performance issues. Stock performance averages 204fps at less than 90% utilization, so there is still a massive bottleneck. 4K is what the card was designed for, and 4K is where it blows the doors off the competition.

    On release, the 2080 Ti scored 124fps min and 160fps average in 1080p Vulkan at stock. It currently scores 170fps min and 204fps average at stock with the new driver. Amazing what Nvidia can accomplish when it is given access to the game and development isn't choked off to a single company.
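    For context, here's what those before/after numbers work out to, using only the stock figures quoted in this post (a quick back-of-the-envelope check, not a new benchmark):

        # Stock 2080 Ti, 1080p Vulkan - figures as quoted above.
        before_min, before_avg = 124, 160   # release-day driver
        after_min, after_avg = 170, 204     # new driver

        print(f"min uplift: {(after_min / before_min - 1):.0%}")   # ~37%
        print(f"avg uplift: {(after_avg / before_avg - 1):.0%}")   # ~28%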
     
    Last edited: May 9, 2019
  39. Talon

    Talon Notebook Virtuoso

    Reputations:
    1,482
    Messages:
    3,519
    Likes Received:
    4,694
    Trophy Points:
    331
    [Attached image: 1080p Vulkan World War Z.png - benchmark screenshot]
     
    Papusan and hmscott like this.
  40. AlexusR

    AlexusR Guest

    Reputations:
    0
    This is not true. It is not exclusive to the Epic store; it is available on consoles - I played it on PS4. The game has failed because it is mediocre, a fact shown by the average rating on Metacritic even for the console versions. No release on Steam/Origin/GOG would ever have made it "a bigger hit" ;-)
     
  41. hmscott

    hmscott Notebook Nobel Laureate

    Reputations:
    7,110
    Messages:
    20,384
    Likes Received:
    25,139
    Trophy Points:
    931
    Cool, ok, now plug in the AMD Radeon Vega 64 and AMD Radeon VII and do the same runs (with the same game visual settings) with AMD's latest drivers, so we can get a current run comparison. :D
     
  42. hmscott

    hmscott Notebook Nobel Laureate

    Reputations:
    7,110
    Messages:
    20,384
    Likes Received:
    25,139
    Trophy Points:
    931
    Lenovo adds AMD Ryzen Pro-powered laptops to its ThinkPad family
    You can get the first available models at the end of May starting at $939.
    VALENTINA PALLADINO - 5/8/2019, 6:00 AM
    https://arstechnica.com/gadgets/201...n-pro-powered-laptops-to-its-thinkpad-family/

    "Lenovo is adding more choices to its beloved and iconic ThinkPad lineup this year: the new T495, T495s, and X395 laptops are all powered by AMD's Ryzen 7 Pro processors with integrated Vega graphics. With the same design and MIL-spec level of durability, these new ThinkPads will give customers the option to go with AMD without sacrificing what they love about the premium ThinkPad lineup.

    The ThinkPad T495 and T495s models are 14-inch laptops while the X395 measures in at 13 inches. They will look similar to the T490 and T490s Intel-based laptops announced last month because Lenovo essentially took the same frames and stuck AMD APUs inside. That means they all have MIL-spec tested designs and features like far-field mics for VoIP conferences, Lenovo's camera privacy shutter, and optional PrivacyGuide screen filter.

    The 14-inch displays on the T495 and T495s and the 13-inch display on the X395 will be FHD 1920×1080 panels with touch and non-touch options. They will also have AMD's FreeSync technology for improved refresh rates and pixel quality.

    The biggest differences between these laptops and Intel-powered ThinkPads are performance and ports. According to Lenovo, the second-gen AMD Ryzen 7 Pro processors combined with integrated Vega graphics should provide an 18-percent improvement in performance over previous generations, as well as a better multimedia experience.

    Lenovo claims improvements in battery life as well, but those will only be over previous generations—in fact, representatives told Ars that the AMD-powered machines will likely last 45 minutes to one hour less than their Intel-powered counterparts. That's not a huge decrease in battery life, but it's worth taking into account before choosing which CPU you want in one of these premium ThinkPads. The AMD models also do not have Thunderbolt 3 ports as the Intel models do, but they do have USB-A ports and an HDMI 2.0 port.

    Lenovo's new ThinkPads with AMD Ryzen processors and Vega graphics will be available soon: the T495 will be available in late May starting at $939, while the T495s will be available in early June starting at $1,089. The ThinkPad X395 will also be available in June starting at $1,089."

    I think it's great that AMD is getting into more laptops - this is a good start to a new era in laptop tech.
     
  43. custom90gt

    custom90gt Doc Mod Super Moderator

    Reputations:
    7,907
    Messages:
    3,862
    Likes Received:
    4,806
    Trophy Points:
    331
    I just hope the cooling is better than, say, the A485's. It's one of the biggest reasons I'm selling it.
     
    hmscott and tilleroftheearth like this.
  44. TANWare

    TANWare Just This Side of Senile, I think. Super Moderator

    Reputations:
    2,548
    Messages:
    9,585
    Likes Received:
    4,997
    Trophy Points:
    431
  45. hmscott

    hmscott Notebook Nobel Laureate

    Reputations:
    7,110
    Messages:
    20,384
    Likes Received:
    25,139
    Trophy Points:
    931
    Have you tried the latest mobile AMD drivers - do they allow you to tune the voltage curve and fan curve? That might give you better management of the thermal output and reduce the cooling load.

    Thin laptops are the bane of performance; maybe the 7nm spins of the Zen 2 APUs will help when they arrive as well.

    Yup, a bit disappointed in the possible delay, but as mentioned there is only so much 7nm production capacity. Perhaps the extra time will allow for more fleshed-out BIOS and performance tuning - looking on the bright side.
     
  46. custom90gt

    custom90gt Doc Mod Super Moderator

    Reputations:
    7,907
    Messages:
    3,862
    Likes Received:
    4,806
    Trophy Points:
    331
    I have tried the newest AMD drivers and they still do not allow any voltage editing. AMD's laptop solutions have a long way to go, sadly. Until there is undervolting support, I'll stick with Intel.
     
    Dannemand and hmscott like this.
  47. hmscott

    hmscott Notebook Nobel Laureate

    Reputations:
    7,110
    Messages:
    20,384
    Likes Received:
    25,139
    Trophy Points:
    931
    Without demand, there has likely been no justification to apply funds to the problem; hopefully renewed interest in AMD mobile solutions will drive innovation again in the laptop space for AMD.

    It's nice that voltage tuning and undervolting have finally gotten the attention they deserve after decades of pushing for ways to moderate CPU thermal issues in laptops - as well as GPUs. There's still a ways to go for AMD to catch up and provide the same options for their mobile CPUs.

    Did you file a request / complaint with Lenovo / AMD about the need for voltage tuning on the AMD mobile APUs? It might help the cause and give the feature a push toward implementation, at least in the next generation of AMD mobile APUs. :)
     
    custom90gt likes this.
  48. TANWare

    TANWare Just This Side of Senile, I think. Super Moderator

    Reputations:
    2,548
    Messages:
    9,585
    Likes Received:
    4,997
    Trophy Points:
    431
    While AMD has a mobile entry in the market, IMO it is very minimalist. It does not seem they are going hard after that market yet. I think Epyc and consumer desktop are their primary targets right now. At some point they will most likely become more serious about the mobile market.
     
    hmscott likes this.
  49. custom90gt

    custom90gt Doc Mod Super Moderator

    Reputations:
    7,907
    Messages:
    3,862
    Likes Received:
    4,806
    Trophy Points:
    331
    I did not request the ability to tune voltage from Lenovo since that's not their area. For AMD, there are many requests on the internet and in their forums...

    It would be great if they can get back in the game on the mobile side, but right now I'm not very impressed. For sure in the future, if they offer something like Ryzen Master for the mobile CPUs...
     
    ALLurGroceries, Dannemand and hmscott like this.
  50. hmscott

    hmscott Notebook Nobel Laureate

    Reputations:
    7,110
    Messages:
    20,384
    Likes Received:
    25,139
    Trophy Points:
    931
    That's the wonderful thing about AMD's breakout into demand for their new CPUs, APUs, and hopefully GPUs: an abundance of opportunity. But there are too many needs and not enough capital to get everything done at once.

    Lisa Su seems to have a good eye for what can be done, and for doing it in order of importance, but that means some things are bound to be left in the wings waiting for attention.

    AMD full-performance laptops with desktop components still have the best chance, especially with the interest in full performance in large chassis vs thin ones, since those match desktop options most closely.

    AMD has already put a focus on the mobile APU software drivers - supporting direct updates - and hopefully that includes improving the support software too.
     