The Notebook Review forums were hosted by TechTarget, who shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information posted on the forums was preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.

    Mobile RX6000s in 2021

    Discussion in 'Gaming (Software and Graphics Cards)' started by BrightSmith, Mar 11, 2021.

  1. Tyranus07

    Tyranus07 Notebook Evangelist

    Reputations:
    218
    Messages:
    570
    Likes Received:
    331
    Trophy Points:
    76
    Well, at least AMD is calling their GPUs the RX 6000M series, with an "M" at the end so the buyer knows they are not getting the same graphics power as the desktop RX 6000 series, unlike Nvidia, which is quite shady with their naming. On the other hand, I'm glad that AMD is delivering competitive GPUs in the mobile world. Not quite the way I'd like, but at least there is a real alternative to Nvidia.
     
  2. Papusan

    Papusan Jokebook's Sucks! Dont waste your $$$ on Filthy

    Reputations:
    42,747
    Messages:
    29,856
    Likes Received:
    59,723
    Trophy Points:
    931
    AMD Radeon RX 6600M (Navi 23) mobile GPU gets tested videocardz.de | Today

    Yeah, 7nm didn't help much for performance vs. wattage.

    But it's not all bad: the Radeon RX 6600M delivers around a 20-30% performance boost over the GeForce RTX 3050 Ti, so if AMD can manage to price the mainstream Navi 23 chip accordingly, it could sit in a really good position between the RTX 3050 Ti and the RTX 3060.
     
  3. Reciever

    Reciever D! For Dragon!

    Reputations:
    1,530
    Messages:
    5,350
    Likes Received:
    4,377
    Trophy Points:
    431
    Yeah, not looking for AMD to slide in and be king, but damn, finally some competition is around the corner.

    It's a shame that Dell will go out of their way to continue offering AMD products in garbage platforms. Hopefully one of the other brands can offer a healthy candidate; Acer did well with the Helios 500 even if they ghosted on supporting it. Asus tripped at the door and kept laying there hoping nobody would notice with their offering the first go around (can't remember the name), but the G14 seems to have some praise to it.

    Here's hopin'
     
  4. Mr. Fox

    Mr. Fox BGA Filth-Hating Elitist®

    Reputations:
    37,255
    Messages:
    39,358
    Likes Received:
    70,791
    Trophy Points:
    931
    In order for it to be truly successful, people need to be as critical of it as they would be of an Intel or NVIDIA product, and not be too eager to forgive or make lame excuses for its shortcomings simply because it is not an Intel or NVIDIA product. It needs to stand on its own merits and not be propped up simply because it is a new face from a player that has been AWOL for a long, long time.

    It is really sad to see competition surface at a time when notebooks have become castrated, disposable trashbooks. It would have been a lot more meaningful before a death sentence had been leveled against high-performance laptops.
     
    Clamibot and Papusan like this.
  5. Papusan

    Papusan Jokebook's Sucks! Dont waste your $$$ on Filthy

    Reputations:
    42,747
    Messages:
    29,856
    Likes Received:
    59,723
    Trophy Points:
    931
    Nvidia, with all their different castrated cards' TGPs, drags the brand down the drain in benchmarks. Expect the same when AMD pushes out cards with all sorts of dumbed-down TDP numbers as well.

    AMD specifies 145 W+ as the TDP for the AMD Radeon RX 6800M. In detail, this means that AMD's SmartShift boost can also provide up to 15 percent more power.


    Back in the high-end area! AMD Radeon RX 6800M laptop GPU in the performance test
    https://www.notebookcheck.com/Zurue...-Laptop-GPU-im-Performance-Test.548251.0.html
     
    Last edited: Jul 2, 2021
  6. Reciever

    Reciever D! For Dragon!

    Reputations:
    1,530
    Messages:
    5,350
    Likes Received:
    4,377
    Trophy Points:
    431
    Still a bit unsure as to why they went with the 6700 XT instead of the 5700 XT. Maybe the logistics just weren't ripe for it.
     
    Spartan@HIDevolution and hfm like this.
  7. hfm

    hfm Notebook Prophet

    Reputations:
    2,264
    Messages:
    5,299
    Likes Received:
    3,050
    Trophy Points:
    431
    Because 6000 is way better than 5000... like 20% better.. :)
     
    Spartan@HIDevolution likes this.
  8. Reciever

    Reciever D! For Dragon!

    Reputations:
    1,530
    Messages:
    5,350
    Likes Received:
    4,377
    Trophy Points:
    431
    So is the power consumption, though, when things are tuned. The 6700 XT @ 135 W has less performance in mining (47 MH/s) than the 5700 XT @ 90 W (55-61 MH/s), but admittedly that's only from a mining standpoint, so of course my bias is coloring my opinion here.
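    To make the efficiency point concrete, here is a quick back-of-the-envelope per-watt calculation using only the figures quoted above. The midpoint of the 55-61 MH/s range is assumed for the 5700 XT, and the names are purely illustrative:

    # Rough MH/s-per-watt comparison from the numbers quoted in this post.
    cards = {
        "6700 XT @ 135 W": (47.0, 135.0),  # (hashrate in MH/s, tuned board power in W)
        "5700 XT @ 90 W": (58.0, 90.0),    # assumed midpoint of the 55-61 MH/s range
    }
    for name, (mhs, watts) in cards.items():
        print(f"{name}: {mhs / watts:.2f} MH/s per watt")
    # 6700 XT @ 135 W: 0.35 MH/s per watt
    # 5700 XT @ 90 W: 0.64 MH/s per watt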
     
    Spartan@HIDevolution likes this.
  9. Kevin

    Kevin Egregious

    Reputations:
    3,289
    Messages:
    10,780
    Likes Received:
    1,782
    Trophy Points:
    581
    The 6700 XT is 32% faster than the 5700 XT at 4K. So there's no point in using the latter when it is clearly slower.
     
    Spartan@HIDevolution likes this.
  10. Reciever

    Reciever D! For Dragon!

    Reputations:
    1,530
    Messages:
    5,350
    Likes Received:
    4,377
    Trophy Points:
    431
    Well yeah, I didn't fully communicate my thoughts; it's been a long day.

    What I should've posted is that I wonder why they didn't get the mobile variant of the 5700 XT out there during the relevant window of time. I imagine they just didn't have the supply chains and relationships fleshed out, etc.
     
    Spartan@HIDevolution, hfm and Papusan like this.
  11. hfm

    hfm Notebook Prophet

    Reputations:
    2,264
    Messages:
    5,299
    Likes Received:
    3,050
    Trophy Points:
    431
    The 5700M roughly seems like that part. Same arch, same bus width, same memory config, and relatively the same CU count, scaled for mobile deployment and power.
    https://www.notebookcheck.net/Radeon-RX-5700M-vs-Radeon-RX-5700-XT-Desktop_9958_9874.247598.0.html

    I was wondering if perhaps Apple cornered the market on it, but it looks like the MBP 16 topped out at the 5600M.
     
    Spartan@HIDevolution likes this.
  12. JRE84

    JRE84 Notebook Virtuoso

    Reputations:
    856
    Messages:
    2,505
    Likes Received:
    1,513
    Trophy Points:
    181
    Think I'll pass on AMD's offerings...

    No DLSS is a deal breaker when DLSS looks better than native and runs a lot better; I see no point in comparing apples-to-apples performance scores.
     
    Spartan@HIDevolution likes this.
  13. Papusan

    Papusan Jokebook's Sucks! Dont waste your $$$ on Filthy

    Reputations:
    42,747
    Messages:
    29,856
    Likes Received:
    59,723
    Trophy Points:
    931
    Yep, they are in the minority and a rarity :vbbiggrin:

    AMD Radeon RX 6000M: RDNA 2 remains a rarity in notebooks computerbase.de

    Around four months after launch, the Radeon RX 6000M series, consisting of the RX 6800M, 6700M and 6600M, remains a rarity.

    AMD --- 15 notebooks, 6 models, 4 manufacturers

    Nvidia --- 650+ notebooks with a load of different graphics card SKUs.

    Yep, AMD is better off using the silicon for their Ryzen series processors. That way they can get more profit from their limited supply of 7nm wafers.
     
    Last edited: Oct 20, 2021
    Spartan@HIDevolution likes this.
  14. saturnotaku

    saturnotaku Notebook Nobel Laureate

    Reputations:
    4,879
    Messages:
    8,926
    Likes Received:
    4,707
    Trophy Points:
    431
    The MacBook Pro was a strange one because its 5600M was equipped with HBM2 memory, the same as in the Radeon VII. It only had a 50W TGP, but its performance met or beat the RTX 2060 in many cases, at least with standard rasterization.

    Jarrod just compared the 6600M to the 3060 in the Legion 5. The latter comes out ahead in most cases despite its 2 GB vRAM disadvantage. That being said, I did take advantage of Lenovo offering the 6600M model with 32 GB RAM and 2 TB SSD for $1386 last week. I'm picking it up from the depot today.

     
    Spartan@HIDevolution likes this.
  15. hfm

    hfm Notebook Prophet

    Reputations:
    2,264
    Messages:
    5,299
    Likes Received:
    3,050
    Trophy Points:
    431
    Seems like the 6600M is a fine 1080p performer, looks to struggle a bit in some cases at 1440p but probably workable with some settings tinkering.
     
    JRE84 likes this.
  16. saturnotaku

    saturnotaku Notebook Nobel Laureate

    Reputations:
    4,879
    Messages:
    8,926
    Likes Received:
    4,707
    Trophy Points:
    431
    KING19, JRE84 and hfm like this.
  17. saturnotaku

    saturnotaku Notebook Nobel Laureate

    Reputations:
    4,879
    Messages:
    8,926
    Likes Received:
    4,707
    Trophy Points:
    431
    FSR is AMD's DLSS equivalent. While DLSS generally has the edge in terms of image quality across the board, FSR's quality modes are still more than competitive especially at 1440p and 4K resolutions. AMD's option is also apparently easier to implement, so it's possible it may gain more traction more quickly. At this point, though, it's about where DLSS 1.0 was so it will take some time.
     
    KING19 and JRE84 like this.
  18. Kunal Shrivastava

    Kunal Shrivastava Notebook Consultant

    Reputations:
    82
    Messages:
    110
    Likes Received:
    91
    Trophy Points:
    41
    FSR in its current state will never catch up to DLSS in terms of image quality.
    The two technologies are vastly different!
    FSR is just a spatial upscaler.
    It takes a low-resolution base image and upscales it to your native pixel grid, then selectively sharpens the edges and texture cube maps. It may look convincing enough on high-PPI displays, but the technique really breaks down at low pixel density.
    It's basically an evolution of FidelityFX CAS, and there's barely a difference between the two.
    CAS was simply an upscaler with a complete sharpening pass on top (post-process sharpen). That always has the unwanted effect of ruining some of the smaller details, like in shadowed areas.
    Side by side, FSR might look just a little bit better because it uses an edge-detection algorithm that spares a bit of the detail.
    Both will have a "blown out" look at low pixel density. On higher pixel density displays, FSR might look a bit more convincing. It's the spiritual successor to CAS in many ways.

    DLSS and XeSS are not plain upscalers. They actively reconstruct the image via multiple passes through a convolutional auto-encoder, then inject that detail into a TAA frame buffer. All of this is done using motion vectors over multiple frames and multiplying 4x4 frame grids (matrix multiplication of samples). This is why DLSS, and image reconstruction in general, will never work without temporal data (another way of saying it absolutely needs TAA or some temporal component like t1x/t2x/t4x to work).
    It's still very much post-processing... just more complex!
    This is what is known as tensor "extrapolation" (working with existing data); it's not smart enough for "hallucination"... yet.
    The real magic will be when AI gets smart enough to inject real-time data into a rasterized data stream.
    In another couple of generations we will have AI adding extra detail to game worlds, reconstructing cinema-level CGI in-game in real time based on real-world video footage and movie techniques.
    That is when there will likely be no more need for extensive GPU horsepower jumps. Newer generations of cards will just come with newer feature sets instead, more tricks up their sleeve than just brute-force number crunching.
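    To illustrate the two-step spatial idea described above (upscale to the native grid, then sharpen with an edge-aware pass), here is a minimal single-channel NumPy sketch. This is only a conceptual illustration under those assumptions, not AMD's actual FSR 1.0 (EASU + RCAS) shader code, and the function names are made up for the example:

    import numpy as np

    def bilinear_upscale(img, scale):
        # Upscale a 2D luminance image by 'scale' using bilinear interpolation.
        h, w = img.shape
        new_h, new_w = int(h * scale), int(w * scale)
        ys = np.linspace(0, h - 1, new_h)
        xs = np.linspace(0, w - 1, new_w)
        y0 = np.floor(ys).astype(int); y1 = np.minimum(y0 + 1, h - 1)
        x0 = np.floor(xs).astype(int); x1 = np.minimum(x0 + 1, w - 1)
        wy = (ys - y0)[:, None]
        wx = (xs - x0)[None, :]
        top = img[np.ix_(y0, x0)] * (1 - wx) + img[np.ix_(y0, x1)] * wx
        bot = img[np.ix_(y1, x0)] * (1 - wx) + img[np.ix_(y1, x1)] * wx
        return top * (1 - wy) + bot * wy

    def edge_aware_sharpen(img, strength=0.5):
        # Unsharp-mask style pass, weighted by local gradient magnitude so flat
        # regions (e.g. shadowed areas) are sharpened less than hard edges.
        p = np.pad(img, 1, mode='edge')
        blur = sum(p[dy:dy + img.shape[0], dx:dx + img.shape[1]]
                   for dy in range(3) for dx in range(3)) / 9.0  # 3x3 box blur
        detail = img - blur                               # high-frequency detail
        gy, gx = np.gradient(img)
        edge = np.clip(np.hypot(gx, gy) * 4.0, 0.0, 1.0)  # crude edge mask
        return np.clip(img + strength * edge * detail, 0.0, 1.0)

    # Render low, present high: a 720p-ish frame presented at 2x "native" size.
    low_res = np.random.rand(720, 1280)  # stand-in for a rendered luminance frame, 0..1
    native = edge_aware_sharpen(bilinear_upscale(low_res, 2.0))
    print(native.shape)                  # (1440, 2560)

    A DLSS-style reconstructor would instead accumulate samples across several previous frames using motion vectors (temporal data), which is exactly why it cannot be implemented as a purely spatial pass like this one.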
     
    4W4K3, KING19, JRE84 and 1 other person like this.
  19. Mr. Fox

    Mr. Fox BGA Filth-Hating Elitist®

    Reputations:
    37,255
    Messages:
    39,358
    Likes Received:
    70,791
    Trophy Points:
    931
    His friends call him Kunal. That is "Mr. Shrivastava" to the rest of us.
     
    JRE84 likes this.
  20. Kunal Shrivastava

    Kunal Shrivastava Notebook Consultant

    Reputations:
    82
    Messages:
    110
    Likes Received:
    91
    Trophy Points:
    41
    LOL guys, I'm an orthopedic and trauma surgeon currently practicing in Mumbai, India (just finished my residency in September last year, right around the time I posted the Ampere on laptops thread :D).
    I've just had a keen interest in gaming since the retro console days. The PlayStation 2 was my first! Fell in love with God of War and haven't looked back since. Gaming and anything to do with video games has become my go-to anti-anxiety prescription. Also, it helps a lot with my motor skills in the OT.
    I'm currently running a trauma unit here, trying to get a decent fellowship in spine surgery, preferably AO Germany by 2023!
     
    Papusan, Mr. Fox and JRE84 like this.
  21. Reciever

    Reciever D! For Dragon!

    Reputations:
    1,530
    Messages:
    5,350
    Likes Received:
    4,377
    Trophy Points:
    431
    The amount of interest I have in DLSS is the same as for the AMD equivalent: zero.

    The solution is always: overclock harder!
     
  22. Mr. Fox

    Mr. Fox BGA Filth-Hating Elitist®

    Reputations:
    37,255
    Messages:
    39,358
    Likes Received:
    70,791
    Trophy Points:
    931
    LOL. Overclocking harder is a solution I can definitely get behind.

    Gaming would be more fun if playing the games were the primary objective. Most of the gimmicks that supposedly make gaming better are cash grabs.
     
    4W4K3, KING19, Clamibot and 1 other person like this.
  23. Clamibot

    Clamibot Notebook Deity

    Reputations:
    645
    Messages:
    1,132
    Likes Received:
    1,567
    Trophy Points:
    181
    I'm not sure what all the hate on DLSS is about. I think it's an amazing tool.

    Even though I will only play games at 1080p, I still enable DLSS so my GPU usage will decrease. The resulting image looks exactly the same, the GPU usage is significantly lower, and the power draw is significantly lower. I can play games on my X170SM-G with the fans barely running as a result, which is awesome.

    DLSS also makes it possible to achieve 144 fps on battery power in games where that normally wouldn't be possible because of the required GPU power draw induced by high GPU usage.

    In addition, DLSS should also extend the life of the hardware since you're stressing the GPU cores less. I don't see any downsides to this.
     
    Reciever, KING19, Papusan and 3 others like this.
  24. Mr. Fox

    Mr. Fox BGA Filth-Hating Elitist®

    Reputations:
    37,255
    Messages:
    39,358
    Likes Received:
    70,791
    Trophy Points:
    931
    Neither do the haters know. They just like to hear themselves mitch and boan about things.
     
    Clamibot likes this.
  25. Papusan

    Papusan Jokebook's Sucks! Dont waste your $$$ on Filthy

    Reputations:
    42,747
    Messages:
    29,856
    Likes Received:
    59,723
    Trophy Points:
    931
    With or without Jesus, AMD want to try to compete in 2022 :wacko:

    Based on details from his sources, MLID states that the AMD Radeon RX 6000 series is going to be further expanded in Q1 2022 with entry-level products based on the Navi 24 GPU.

    AMD Entry-Level Navi 24 ‘RDNA 2’ Radeon RX GPUs Rumored For Q1 Launch, 120W Super-Clocked Design Aimed at RTX 3050 Ti & Intel ARC

     
    Mr. Fox, JRE84 and Clamibot like this.
  26. KING19

    KING19 Notebook Deity

    Reputations:
    358
    Messages:
    1,170
    Likes Received:
    782
    Trophy Points:
    131
    The only thing I don't like about DLSS is that only RTX GPUs can use it, while other Turing GPUs like the GTX 1650, 1650 Ti, and especially the 1660 Ti are left out in the cold. That's why I hope FSR and XeSS continue to grow, since every GPU can use them as they're software-based, plus they're open source as well. Problem is, just like DLSS, we gotta wait until more games support them.
     
  27. Kunal Shrivastava

    Kunal Shrivastava Notebook Consultant

    Reputations:
    82
    Messages:
    110
    Likes Received:
    91
    Trophy Points:
    41

    This is what AI enhancement is eventually targeting; DLSS was just a first step.
     
  28. Reciever

    Reciever D! For Dragon!

    Reputations:
    1,530
    Messages:
    5,350
    Likes Received:
    4,377
    Trophy Points:
    431
    I could definitely see that perspective. Although it's not the majority opinion, I couldn't help but see Crysis as just an interactive benchmark, or something like The Witcher 2's ubersampling or w/e it was called.

    I just simply don't care about it; no hate, it's just not a selling point for me. The same goes for the open-source iteration. Just because I don't care about it doesn't mean you shouldn't find utility in it, nor should I feel the need to not openly share my thoughts on the matter either. If the entire product stack has DLSS support then it's simply as such; I won't go out of my way to avoid it either. In fact, I believe I have two GPUs with support for it as it currently stands.

    If you find great utility in DLSS and are extremely pleased with what it provides for you, then that is a happy consumer and I am happy to hear it. For me personally, I don't really care if it's present, as I wouldn't use it anyway. I'm not sure what the criticisms even are from those who weren't satisfied with its function.
     
    Clamibot and Mr. Fox like this.
  29. Clamibot

    Clamibot Notebook Deity

    Reputations:
    645
    Messages:
    1,132
    Likes Received:
    1,567
    Trophy Points:
    181
    Yeah sorry, my comment wasn't directed towards you. It was directed towards individuals on other sites who were saying that DLSS was stupid and that they'd rather have enough CUDA cores so they can game at 144 fps at 4K resolution in AAA titles.

    That kind of future is a long ways away. DLSS is much more efficient under this scenario with current technology.
     
    Reciever likes this.
  30. Reciever

    Reciever D! For Dragon!

    Reputations:
    1,530
    Messages:
    5,350
    Likes Received:
    4,377
    Trophy Points:
    431
    Well, I agree with simply wanting a better card, but at the same time I don't see any issue with DLSS being a present feature either. Maybe some other technology that initially started as DLSS will emerge that I might have a high desire for in the future. You never know!

    Some purists, though, were pretty pissed when they bought 2x 3090s just to have SLI for Cyberpunk at 4K, but CDPR never said it would support SLI to begin with. It tends to be an argument born of expectations that should never have been placed in the first place. Though Nvidia does get some of my ire as well for only allowing SLI on the 3090 in an environment where nearly no game supports mGPU to begin with.

    Then again, Nvidia also said 8K gaming, sooooo... probably shouldn't take a salesman at their word regardless, but most here already know that :)
     
    Clamibot likes this.
  31. Kunal Shrivastava

    Kunal Shrivastava Notebook Consultant

    Reputations:
    82
    Messages:
    110
    Likes Received:
    91
    Trophy Points:
    41
    SLI is dead for a reason, but that's another topic entirely!
    Let's just say there's a damn good reason why modern AA techniques will never work with SLI.

    DLSS, though, is here to stay from the looks of it :)
    Also, Nvidia's already working on improving their pipeline; they have a new feature called DLAA that's basically the same tech minus the upscaling.

    https://www.google.com/amp/s/www.digitaltrends.com/computing/what-is-nvidia-dlaa/?amp

    Allegedly it's supposed to make 1080p gaming on a 1080p monitor look like it's being supersampled at 4K. You don't get a performance increase from this one!
     
    KING19 and Clamibot like this.
  32. Reciever

    Reciever D! For Dragon!

    Reputations:
    1,530
    Messages:
    5,350
    Likes Received:
    4,377
    Trophy Points:
    431
    It's dead because Nvidia dumped support onto the devs, case closed. I used SLI every generation since Fermi, and only the odd title here and there didn't work with it officially; the rest was basically covered via profiles. Now you just have to see if mGPU support is enabled, but not many titles use it.

    Might be neat, but for now I am not interested in those technologies.
     
    Clamibot likes this.
  33. ssj92

    ssj92 Neutron Star

    Reputations:
    2,446
    Messages:
    4,446
    Likes Received:
    5,690
    Trophy Points:
    581
    I've also used it since Fermi (GTX 295 as well, actually) and agree it was great tech and could have still been used.

    mGPU had potential, but developers don't care for it.

    Future GPUs will have chiplets, so it will be interesting to see how that works out as well.


    On a side note, I like DLSS, but probably because I just started using it (DLSS 2.0).
    I've heard 1.0 was horrible, and that's probably why a lot of people aren't fans.

    RTX 3000 in my M18xR2 gets a very nice boost thanks to DLSS, so I look forward to 3.0 or w/e the next one is.
     
  34. KING19

    KING19 Notebook Deity

    Reputations:
    358
    Messages:
    1,170
    Likes Received:
    782
    Trophy Points:
    131
    DLSS has more pros than cons. For most people, enabling DLSS/FSR/XeSS gains you a lot more FPS without doing much, and it extends the life of your GPU, depending on the games that support them.
     
    Clamibot and JRE84 like this.
  35. Reciever

    Reciever D! For Dragon!

    Reputations:
    1,530
    Messages:
    5,350
    Likes Received:
    4,377
    Trophy Points:
    431
    Yeah, there was potential for mGPU, but developers don't want to be on the hook if something doesn't work, so it all fell apart rather quickly.

    Chiplets will be interesting for sure; like most things, until it hits the shelves I tend to forget about it lol

    Ok, so? I do not care. I am neither arguing for nor against it and have stated as much multiple times already. If DLSS poofs out of existence or remains, it doesn't change my day in the slightest. If people enjoy the feature then I am happy to see a consumer served well; beyond that, it means nothing to me.
     
    KING19 and JRE84 like this.
  36. Kunal Shrivastava

    Kunal Shrivastava Notebook Consultant

    Reputations:
    82
    Messages:
    110
    Likes Received:
    91
    Trophy Points:
    41
    JRE84 likes this.
  37. Reciever

    Reciever D! For Dragon!

    Reputations:
    1,530
    Messages:
    5,350
    Likes Received:
    4,377
    Trophy Points:
    431
    I don't really mess with image sharpening.
     
    JRE84 likes this.
  38. Reciever

    Reciever D! For Dragon!

    Reputations:
    1,530
    Messages:
    5,350
    Likes Received:
    4,377
    Trophy Points:
    431
    It can be fun to add into a game that you are already modding, but I don't have the time to compare before and after and decide whether I want to keep it included.

    The ME trilogy is probably the only series I had fun modding.
     
  39. Papusan

    Papusan Jokebook's Sucks! Dont waste your $$$ on Filthy

    Reputations:
    42,747
    Messages:
    29,856
    Likes Received:
    59,723
    Trophy Points:
    931
    AMD is getting ready to compete with Nvidia's refreshed RTX 3000 mobile graphics cards coming in spring 2022.

    AMD confirms Radeon RX 6500M and RX 6300M feature Navi 24 GPU videocardz.com

    According to the information featured in the latest Radeon driver “21.40.1”, Navi 24, codenamed “Beige Goby”, will be used in four mobile graphics cards: Radeon PRO W6500M, Radeon PRO W6300M, and their gaming variants, the RX 6500M and RX 6300M. Those four cards are in addition to the already known RX 6500 XT and RX 6400 desktop GPUs, which are to feature the Navi 24 GPU as well.

    Since Navi 24 is considered primarily a mobile GPU (due to its low power requirement), the manufacturer might want to launch its Navi 23 graphics cards around CES 2022, where AMD is also expected to announce its brand new Ryzen 6000 processors codenamed “Rembrandt”.
     
    Spartan@HIDevolution likes this.