The Notebook Review forums were hosted by TechTarget, which shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information posted on the forums would be preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.
Problems? See this thread at archive.org.

    Mobile RX6000s in 2021

    Discussion in 'Gaming (Software and Graphics Cards)' started by BrightSmith, Mar 11, 2021.

  1. Reciever

    Reciever D! For Dragon!

    Reputations:
    1,530
    Messages:
    5,350
    Likes Received:
    4,377
    Trophy Points:
    431
    And here I would be happy with a 175W-225W option, but then again I would prefer to only carry one 330W AC adapter around; for extended stays, dual bricks is an easy ask.
     
  2. Mr. Fox

    Mr. Fox BGA Filth-Hating Elitist®

    Reputations:
    37,255
    Messages:
    39,358
    Likes Received:
    70,791
    Trophy Points:
    931
    If they had just a smidgen of ingenuity they would develop a single 700-800W brick similar to what Eurocom created and then we could skip the dual brick nonsense. It was comparable in size and weight to a single 330W adapter. But, they don't. They just keep recycling the same bad ideas because being awesome and making something awesome are just not very important to them.
     
    SMGJohn, Vasudev and Papusan like this.
  3. Papusan

    Papusan Jokebook's Sucks! Dont waste your $$$ on Filthy

    Reputations:
    42,747
    Messages:
    29,856
    Likes Received:
    59,723
    Trophy Points:
    931
    They could even create a smaller 500W adapter. Nvidia/AMD have castrated the mobile graphics cards so heavily nowadays that 500W is more than enough.
     
  4. Reciever

    Reciever D! For Dragon!

    Reputations:
    1,530
    Messages:
    5,350
    Likes Received:
    4,377
    Trophy Points:
    431
    It would be lovely if they consulted this forum when creating high-end, mid-level, and budget options, actually leveled with us on the difficulties they foresee, like panel supply, component operation, power envelopes, and MUX operation (or lack thereof), and actually followed through.

    They could go in kind of like Google did: you set up the tenets you want in place and work with an OEM/ODM to produce the product.

    Sleep aids hitting me now lol, can't think straight
     
    Vasudev, Mr. Fox and Papusan like this.
  5. Kevin

    Kevin Egregious

    Reputations:
    3,289
    Messages:
    10,780
    Likes Received:
    1,782
    Trophy Points:
    581
    If the RX6700M is a direct translation of the 6700 XT and comes within ~10% of its desktop counterpart like the 5700M was, it would be faster than the mobile RTX 3070.

    We'll see how many Tiger Lake laptops pick it up, assuming AMD is smart enough to have it ready soon.
     
    Last edited: Mar 26, 2021
    BrightSmith, Vasudev and Mr. Fox like this.
  6. Mr. Fox

    Mr. Fox BGA Filth-Hating Elitist®

    Reputations:
    37,255
    Messages:
    39,358
    Likes Received:
    70,791
    Trophy Points:
    931
    That is precisely what needs to happen. It's long overdue and the only reason it would not be feasible is the lameness of notebook design. There is no excuse for the way any of them are gimped other than the need to be because of the digital anorexia phenomenon that has all but destroyed the notebook industry. When you stop and think about it, things are totally idiotic and bass ackwards... gigantic smartphones and mickey mouse laptops... it is so dumb... too stupid for words, even. But, that's the world we live in now.
     
    Last edited: Mar 26, 2021
    ViktorV, Atma, DaMafiaGamer and 6 others like this.
  7. BrightSmith

    BrightSmith Notebook Evangelist

    Reputations:
    143
    Messages:
    640
    Likes Received:
    383
    Trophy Points:
    76
    According to Notebookcheck, however, "the RX 6700M power consumption (TDP settings) ranges from 90 - 135 Watt with different clock speeds," which means it's as severely power limited as the RTX 3070.

    https://www.notebookcheck.net/AMD-Radeon-RX-6700M-GPU-Benchmarks-and-Specs.515472.0.html
     
  8. win32asmguy

    win32asmguy Moderator

    Reputations:
    1,012
    Messages:
    2,844
    Likes Received:
    1,699
    Trophy Points:
    181
    MorePowerTool can override TGP settings for 5600M/5700M. It can help close the gap with the desktop equivalents provided the laptop can handle it. I was able to boost the Alpha 17 by about 15% before it starts tripping the 230W power supply under load.
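    As a back-of-the-envelope sketch of why a modest TGP bump can trip a 230W brick (the baseline GPU and system figures here are assumptions for illustration, not actual Alpha 17 specs; only the ~15% boost and the 230W supply come from the post above):

```python
# Rough power budget sketch. Baseline numbers are assumed for
# illustration; only the 15% boost and 230W supply are from the post.
psu_watts = 230
gpu_tgp = 150          # assumed baseline mobile GPU TGP (hypothetical)
boost = 0.15           # ~15% TGP increase via MorePowerTool
rest_of_system = 60    # assumed CPU + board + display draw under load

total = gpu_tgp * (1 + boost) + rest_of_system
print(f"Estimated draw: {total:.1f}W of a {psu_watts}W supply")
# 172.5 + 60 = 232.5W, just over the adapter's rating
```

    Even a small percentage boost on the GPU alone can push the whole system past the adapter's rating once everything else is drawing power at the same time.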
     
    Clamibot, Mr. Fox and BrightSmith like this.
  9. Reciever

    Reciever D! For Dragon!

    Reputations:
    1,530
    Messages:
    5,350
    Likes Received:
    4,377
    Trophy Points:
    431
    Navi was surprisingly versatile, though it wouldn't have taken any performance crowns, so I guess that might be why AMD or OEMs didn't push for it? I don't know.
     
  10. Kevin

    Kevin Egregious

    Reputations:
    3,289
    Messages:
    10,780
    Likes Received:
    1,782
    Trophy Points:
    581
    OEMs would've used it (my speculation) if AMD had come out with it sooner than late August of 2020. Why would they refresh laptops for the 5700M when CES was a few months away and they were busy preparing Ampere board designs?

    AMD was so late with 5000M it bordered on incompetence. Or it was apathy to the laptop market.

    Who knows, but the late release did the company no favors.
     
    Reciever likes this.
  11. Reciever

    Reciever D! For Dragon!

    Reputations:
    1,530
    Messages:
    5,350
    Likes Received:
    4,377
    Trophy Points:
    431
    Makes sense. I don't follow the timeline, but that does provide reasonable context.

    Kind of a bummer, since on the high end the 5700 XT can tangle with the 1080 Ti. I haven't run any tests at 115W though; maybe 1070 Ti-1080 territory?
     
  12. BrightSmith

    BrightSmith Notebook Evangelist

    Reputations:
    143
    Messages:
    640
    Likes Received:
    383
    Trophy Points:
    76
    With their successful mobile 5000 series AMD might get more invested in the laptop arena, especially when Intel launches its DG2.
     
  13. SMGJohn

    SMGJohn Notebook Evangelist

    Reputations:
    141
    Messages:
    603
    Likes Received:
    356
    Trophy Points:
    76
    I have a strong feeling AMD is going to release their 6000 series for laptops, there will be two laptops ready a month after they're available, and then maybe by the end of the year they'll have a total of six laptops with their GPUs in them.

    It's pretty clear Nvidia has an iron grip on the market, much like Intel used to; the only way to break that is through shock therapy, a total blitzkrieg, and AMD ain't gonna blitzkrieg any GPU market anytime soon.

    I love AMD's control panel over the archaic Nvidia one. The fact that you can overclock and do just about anything in one app is convenient to me; with Nvidia I need three pieces of software to get the same job done.

    I will however say that Alienware WILL use the RX 6800M. Dell Alienware has ALWAYS offered AMD's highest-performing mobile GPU in their Alienware laptops in a limited offering, but an offering nonetheless. They were the ONLY company to offer the R9 M295 and R9 M390 mobile GPUs in a laptop; the only other one to use them was Apple. They also offered the 5700M you people talked about earlier, in limited numbers.

    So it's safe to say there will be an Alienware with the RX 6800M, but these will be very limited, probably available for just a few months and then gone. The RX 5700M is no longer an option on their site.
     
    Clamibot, BrightSmith and JRE84 like this.
  14. krabman

    krabman Notebook Deity

    Reputations:
    352
    Messages:
    1,216
    Likes Received:
    741
    Trophy Points:
    131
  15. win32asmguy

    win32asmguy Moderator

    Reputations:
    1,012
    Messages:
    2,844
    Likes Received:
    1,699
    Trophy Points:
    181
    That model looks impressive. Since it is the Strix instead of the Zephyrus, it should not have soldered memory or WiFi. Hopefully it has a MUX switch as well, but that's doubtful given what we see with the Nvidia variant.

    I hope MSI is also preparing something (a new Delta series?) with the 5900HX / 6800M in a GE76 chassis. If that had a MUX, it could be one of the better all-AMD options of the last few years. Probably still not as great a cooling system as the AMD Acer Predator Helios 500 had, but maybe close.
     
    JRE84 likes this.
  16. JRE84

    JRE84 Notebook Virtuoso

    Reputations:
    856
    Messages:
    2,505
    Likes Received:
    1,513
    Trophy Points:
    181
    well fan-great point. If it has cooling like the Helios I might consider it my next VR gaming rig... hard to believe I went from not caring about VR/AMD/thin-and-light laptops to wanting, even craving, the forbidden apples
     
  17. Kevin

    Kevin Egregious

    Reputations:
    3,289
    Messages:
    10,780
    Likes Received:
    1,782
    Trophy Points:
    581
    So AMD is naming the desktop 6700 XT the 6800M. Gotta play Nvidia's game, I guess.
     
    BrightSmith likes this.
  18. CLASSIF1ED

    CLASSIF1ED Notebook Consultant

    Reputations:
    18
    Messages:
    150
    Likes Received:
    46
    Trophy Points:
    41
    This is nothing new. Mobility Radeon HD 5870 = underclocked desktop HD 5770
     
  19. JRE84

    JRE84 Notebook Virtuoso

    Reputations:
    856
    Messages:
    2,505
    Likes Received:
    1,513
    Trophy Points:
    181
    it's smaller, what did you expect... that's like wondering why your laptop is more powerful than your cell phone... gee, I wonder who's stronger, this elephant or this squirrel

    if you guessed squirrel, I'm sorry
     
    Last edited: Apr 6, 2021
  20. krabman

    krabman Notebook Deity

    Reputations:
    352
    Messages:
    1,216
    Likes Received:
    741
    Trophy Points:
    131
    Yeah but it's got great big teeth and a mean streak a mile wide!
     
    jc_denton and BrightSmith like this.
  21. Kevin

    Kevin Egregious

    Reputations:
    3,289
    Messages:
    10,780
    Likes Received:
    1,782
    Trophy Points:
    581
    We aren't going back over a decade.

    RX 5700M = RX 5700, just last year.

    What changed in one generation? Nvidia again made it cool to obfuscate.
     
  22. JRE84

    JRE84 Notebook Virtuoso

    Reputations:
    856
    Messages:
    2,505
    Likes Received:
    1,513
    Trophy Points:
    181
    "1060 is 1060" means nothing; this year it isn't 1 to 1.

    oh, and his example holds up for 90 percent of the last 10 years
     
  23. krabman

    krabman Notebook Deity

    Reputations:
    352
    Messages:
    1,216
    Likes Received:
    741
    Trophy Points:
    131
    No one got the Monty Python reference? Dang... :(
     
    BrightSmith likes this.
  24. BrightSmith

    BrightSmith Notebook Evangelist

    Reputations:
    143
    Messages:
    640
    Likes Received:
    383
    Trophy Points:
    76
  25. Clamibot

    Clamibot Notebook Deity

    Reputations:
    645
    Messages:
    1,132
    Likes Received:
    1,567
    Trophy Points:
    181
    You're correct, but technology is supposed to improve every year, not get worse.

    Since laptop GPUs reached performance parity with desktop GPUs in 2016, things should've stayed that way. There really isn't any good reason for the reintroduction of a performance gap between laptop and desktop GPUs.

    You could say the thinning of laptops is the reason, but gaming laptops didn't need to get thinner. A laptop engineered for performance doesn't need to be any thinner than 1.5 inches. That's thin enough.

    The second problem is that cooling systems for gaming laptops have been getting crappier because the laptops have been getting thinner. This is more a lack of innovation than an actual limitation. Therefore I'd argue there isn't a valid reason for laptop and desktop GPUs not to be 1 to 1.
     
    Vasudev and BrightSmith like this.
  26. Reciever

    Reciever D! For Dragon!

    Reputations:
    1,530
    Messages:
    5,350
    Likes Received:
    4,377
    Trophy Points:
    431
    While I broadly agree with everything pointed out, that course only continues if power efficiency stays in line, and one of the first things noticed about this generation of hardware was that efficiency went out the window.

    As for the downward spiral, people pay for it, so sadly it encourages the behavior.
     
    Clamibot likes this.
  27. Papusan

    Papusan Jokebook's Sucks! Dont waste your $$$ on Filthy

    Reputations:
    42,747
    Messages:
    29,856
    Likes Received:
    59,723
    Trophy Points:
    931
    There is a good reason for the reintroduction of a performance gap between laptop and desktop GPUs. The bigger the gap, the better. It's called maximizing profits. If they offered the same performance in laptop hardware as what you find in desktops, you could keep your laptop 5+ years down the road. Who would lose $$$$ on this?

    Thinner laptops mean less performance. See it this way... it's a great way to push you onto the next latest and greatest castrated product. You lose, and the OEMs, Nvidia, AMD, and Intel win.
    Thinner and Lighter Laptops Have Screwed Us All

    If Pro users really were Apple's target market, the company could redesign these laptops to use the older, thicker MacBook Pro form factor from 2015. With that available space, and improvements in processor design, it would be able to better cool the same hardware and squeeze out more performance, but it'll never happen. Thicker laptops would mean admitting failure.
     
    Last edited: Apr 9, 2021
  28. krabman

    krabman Notebook Deity

    Reputations:
    352
    Messages:
    1,216
    Likes Received:
    741
    Trophy Points:
    131
    Pro is just a name; the hardware was never pro grade. Same as the "amazing" new features on their phones every year that have usually been available on Android for years and don't seem all that amazing or new to people with functioning brains. The "Pro" is just marketing, and I have to give credit where it's due: it works. Apple isn't run by idiots.
     
  29. Papusan

    Papusan Jokebook's Sucks! Dont waste your $$$ on Filthy

    Reputations:
    42,747
    Messages:
    29,856
    Likes Received:
    59,723
    Trophy Points:
    931
  30. Reciever

    Reciever D! For Dragon!

    Reputations:
    1,530
    Messages:
    5,350
    Likes Received:
    4,377
    Trophy Points:
    431
    145W and above is hardly reassuring, but it's still nice to see.
     
    Papusan likes this.
  31. yrekabakery

    yrekabakery Notebook Virtuoso

    Reputations:
    1,470
    Messages:
    3,438
    Likes Received:
    3,688
    Trophy Points:
    331
    Or just the fact that it’s based on a power-starved 6700 XT with rasterization performance likely between 3070M and 3080M, with weaker ray tracing and no DLSS, isn’t much to get excited over. But at least it’s fairly easy to raise the power limit on AMD GPUs by editing the PowerPlay tables with MorePowerTool.
     
    Papusan likes this.
  32. Kevin

    Kevin Egregious

    Reputations:
    3,289
    Messages:
    10,780
    Likes Received:
    1,782
    Trophy Points:
    581
    I'm calling BS when the 6800M is launching in the Strix G15, and that uses Max-Q power level Nvidia GPUs. Why would ASUS use power crippled 80W 3070s/3080s if they have the thermal budget for 145W 6800Ms?

    I expect a "we allow our partners to set their own TGP budgets" statement when it comes out that this graph is fugazi, and that AMD has allowed a situation just as messy as Nvidia's.
     
    Clamibot, krabman and Papusan like this.
  33. Game7a1

    Game7a1 ?

    Reputations:
    529
    Messages:
    3,159
    Likes Received:
    1,040
    Trophy Points:
    231

    Well, it's not 145W, but it's also not 80W for the RX 6800M in that laptop. So... it's something. Don't know what, though.
     
    Papusan likes this.
  34. Kevin

    Kevin Egregious

    Reputations:
    3,289
    Messages:
    10,780
    Likes Received:
    1,782
    Trophy Points:
    581


    Skip to 4:45 to bypass the AMD PR and get to the benchmarks.

    Bruh
     
    Papusan likes this.
  35. saturnotaku

    saturnotaku Notebook Nobel Laureate

    Reputations:
    4,879
    Messages:
    8,926
    Likes Received:
    4,707
    Trophy Points:
    431
    Yeah, not so hot considering the extra VRAM it has over most RTX 3000 laptop GPUs. Oh well.
     
    Papusan likes this.
  36. Clamibot

    Clamibot Notebook Deity

    Reputations:
    645
    Messages:
    1,132
    Likes Received:
    1,567
    Trophy Points:
    181
    At least these GPUs can match their desktop counterparts if you give them enough power unlike Nvidia's mobile lineup.

    Edit: Never mind, I'm wrong. These GPUs are cut down as well. At this point I'm just going to build myself an ultraportable gaming PC. I'm tired of this cut-down-GPUs-in-laptops crap.
     
    Last edited: Jun 1, 2021
    Papusan likes this.
  37. yrekabakery

    yrekabakery Notebook Virtuoso

    Reputations:
    1,470
    Messages:
    3,438
    Likes Received:
    3,688
    Trophy Points:
    331
    They’re even more cut down than Nvidia’s.

    3080 Desktop: 68 SMs
    3080 Laptop: 48 SMs

    6800XT Desktop: 72 CUs
    6800M: 40 CUs

    You'd have to go back to the Fermi/TeraScale days to see this kind of blatant performance gap between desktop and mobile parts of the same name. At least AMD is calling it the 6800M, so there's no ambiguity about it being crap. :rolleyes:
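    For reference, the cut-down ratios implied by those unit counts can be worked out directly (a quick sketch using only the SM/CU figures quoted above):

```python
# Fraction of compute units the mobile parts retain,
# using the SM/CU counts quoted in the post above.
def retained(desktop_units: int, mobile_units: int) -> float:
    return mobile_units / desktop_units

rtx_3080_laptop = retained(68, 48)  # laptop 3080 vs desktop 3080 SMs
rx_6800m = retained(72, 40)         # 6800M vs 6800 XT CUs

print(f"3080 Laptop: {rtx_3080_laptop:.1%} of desktop SMs")  # 70.6%
print(f"RX 6800M:    {rx_6800m:.1%} of 6800 XT CUs")         # 55.6%
```

    So the 6800M keeps barely over half of the 6800 XT's compute units, while the laptop 3080 keeps roughly seven tenths of the desktop part's, which is what makes the AMD gap the more blatant of the two.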
     
  38. Game7a1

    Game7a1 ?

    Reputations:
    529
    Messages:
    3,159
    Likes Received:
    1,040
    Trophy Points:
    231
    So Anandtech has the presentation for the RX 6000M series up on their website. Looking at the endnotes, we can see the comparison laptops AMD chose were an MSI GE63 Raider (for the RTX 2070 laptop), an ASUS ROG G533QS (for the RTX 3080 mobile laptop), an ASUS ROG G513QR (for the RTX 3070 mobile laptop), and a Razer Blade (for the RTX 3060 mobile laptop).
    At least they used the same laptop design for the RX 6800M comparison vs the RTX 3000 series (the same can't be said for the RX 6600M comparison), but compared to the highest-watt RTX 3080, it looks like the RX 6800M won't compete with it. Considering how LTT, Anandtech, and Jarrod's have all posted benchmarks where the RX 6800M is mostly losing, I have to wonder what AMD was doing with the laptops during their own benchmarking.
     
    unlogic and Clamibot like this.
  39. o.d.

    o.d. Notebook Guru

    Reputations:
    2
    Messages:
    51
    Likes Received:
    2
    Trophy Points:
    16
  40. Kunal Shrivastava

    Kunal Shrivastava Notebook Consultant

    Reputations:
    82
    Messages:
    110
    Likes Received:
    91
    Trophy Points:
    41
    Think about the fact that AMD is enabling this on almost all compatible GPUs, even from the competition; it just makes you respect them a lot more than the greedy corporates. All their technologies are open sourced: CAS was AMD's first until Nvidia took notice and baked it into control panel sharpening, and the same goes for FreeSync, which has made G-Sync hardware completely irrelevant. This is exactly what technology is about: enabling easy access for everyone. I hope the 6800M (and notice it's not called 6800 XT) edges out a "3080" with this tech while costing less than a 3070; that would be a huge win AMD rightly deserves. They have deserved the apex spot since Lisa Su took over.
     
    Clamibot likes this.
  41. Papusan

    Papusan Jokebook's Sucks! Dont waste your $$$ on Filthy

    Reputations:
    42,747
    Messages:
    29,856
    Likes Received:
    59,723
    Trophy Points:
    931
    Cut down is the future for gamingbooks. Nvidia should follow AMD and brand their mobile graphics cards in the same way, or just use the term Max-Q.

    Dell Alienware is already preparing for the future with thinner-than-ever designs. Only Apple is on par, offering minimal I/O ports on the sides of their laptop products. Graphics cards with high TGP have no future in today's or tomorrow's gaming laptops.


    AMD Radeon RX 6800M Review: A worthy competitor to Nvidia's GeForce pcworld.com
    Ryzen 9 5900HX plus Radeon RX 6800M makes for a worthy competitor.


    AMD announces three Radeon RX 6000M GPUs to compete with Nvidia's best pcworld.com
    RDNA 2-based Radeon RX 6800M will duke it out with Nvidia's GeForce RTX 3080
     
    Last edited: Jun 1, 2021
    Spartan@HIDevolution likes this.
  42. JRE84

    JRE84 Notebook Virtuoso

    Reputations:
    856
    Messages:
    2,505
    Likes Received:
    1,513
    Trophy Points:
    181



    looks like a stock 6800M matches an OC'd 3080
     
    Spartan@HIDevolution likes this.
  43. DRevan

    DRevan Notebook Virtuoso

    Reputations:
    1,150
    Messages:
    2,461
    Likes Received:
    1,041
    Trophy Points:
    181
    Jarrod already confirmed that Asus shipped the G15 test unit to him with RAM modules that have bad timings. At default he got 102 fps in Shadow of the Tomb Raider, like in the PCWorld test; however, when he swapped the RAM for better modules, the fps went up to 120 in the same game.
    Ray tracing performance seems really low compared to the laptop 3080...
     
    Papusan likes this.
  44. Ed. Yang

    Ed. Yang Notebook Deity

    Reputations:
    86
    Messages:
    751
    Likes Received:
    199
    Trophy Points:
    56
    It's not that AMD has no interest; it's the call and demand from the Windows gaming crowd.
    With past poor experiences and poor impressions from older gamers, OEMs and laptop makers were the ones not showing interest in AMD GPUs.
    ...even though there are consumers who were quite impressed with the AMD GPU experience on MacBooks...
     
    Spartan@HIDevolution likes this.
  45. Ed. Yang

    Ed. Yang Notebook Deity

    Reputations:
    86
    Messages:
    751
    Likes Received:
    199
    Trophy Points:
    56
  46. Ed. Yang

    Ed. Yang Notebook Deity

    Reputations:
    86
    Messages:
    751
    Likes Received:
    199
    Trophy Points:
    56
    Many possibilities, from the CPU to the driver version for the GPUs. And some games perform better with Intel CPUs.
     
  47. Papusan

    Papusan Jokebook's Sucks! Dont waste your $$$ on Filthy

    Reputations:
    42,747
    Messages:
    29,856
    Likes Received:
    59,723
    Trophy Points:
    931
  48. Deks

    Deks Notebook Prophet

    Reputations:
    1,272
    Messages:
    5,201
    Likes Received:
    2,073
    Trophy Points:
    331
    Yup... similar thing was posted here:

    Buyer beware: Hardware configurations can eat up 25% of Radeon 6800M's performance

    It looks like Asus is continuing the long-established OEM 'tradition' of reducing AMD's overall performance potential with loose RAM timings and bad hardware implementation (I still get irked at the fiasco Asus had with the GL702ZC, with the laptop killing itself after a month of use and excessive noise; it's one of the reasons I steer clear of Asus to this day, that is, until I get clear evidence they've changed their ways).

    What makes this an odd case is the fact that AMD was supposedly working with Asus to get the best out of the system... in which case, it could be partly AMD's fault... but I don't think they'd intentionally make an oversight that would result in a 25-30% performance loss (someone mentioned that AMD's involvement extends to the use of quality materials, proper cooling, better screens, etc.; however, RAM timings and implementation of a MUX switch would be left to the OEMs, aka AMD had no say in what kind of RAM Asus could use, nor do I think they could force this... and Asus, if I'm not mistaken, has a history of not implementing MUX switches in their RTX laptops either).

    There's also a chip shortage (which could have affected Asus's choice of RAM here)... however, we know that OEMs tend to use RAM with poor timings anyway, and have zero support for XMP profiles (at least when it comes to AMD-based laptops and various Intel ones too, which seems stupid because XMP RAM for laptops exists, and XMP as a feature is NOT new).

    Anyway, it seems like for this Asus laptop there is about a 16.8% loss of performance from the lack of a MUX switch, and about a 13.5% loss from using RAM with poor secondary timings.

    If Asus had bothered with a MUX switch... the difference would be less noticeable (obviously)... and it would be up to the user to upgrade the RAM.

    Dell had the same problem, getting higher framerates when testing with an external display on their Zen 2 and RDNA 1 laptop (and the temperatures of the system were lower too, if I'm not mistaken, under those conditions)... yet they also never implemented a MUX switch.

    Correct me if I'm wrong, but while implementing a MUX switch is costlier than not having one, it would still be cheaper than spending extra money on an external monitor and RAM with better timings, would it not?
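    For what it's worth, the two penalties quoted above line up with the headline 25-30% figure if you assume they are independent and combine multiplicatively (a quick sanity check, not a claim about how the article measured them):

```python
# Combining the two quoted penalties multiplicatively,
# assuming they act independently of each other.
mux_loss = 0.168   # quoted loss from the missing MUX switch
ram_loss = 0.135   # quoted loss from loose secondary RAM timings

remaining = (1 - mux_loss) * (1 - ram_loss)
total_loss = 1 - remaining
print(f"Combined performance loss: {total_loss:.1%}")  # 28.0%
```

    That lands at roughly 28%, squarely inside the 25-30% range mentioned earlier in the thread.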
     
    Last edited: Jun 7, 2021
    SMGJohn, Tyranus07 and Papusan like this.
  49. saturnotaku

    saturnotaku Notebook Nobel Laureate

    Reputations:
    4,879
    Messages:
    8,926
    Likes Received:
    4,707
    Trophy Points:
    431
    I'm more inclined to believe the RAM problem is due to component shortages, since we know for sure that Asus and Lenovo are equipping their latest laptops with slower parts. This is why I picked up a 32 GB upgrade kit with tighter tRFC timings now, because I have a feeling those parts are going to become scarcer and more expensive as more of the slower systems get out into the wild.

    That being said, I'm going to be very curious to see what AMD Advantage laptops Lenovo has planned. They've been knocking it out of the park with their latest AMD/Intel + NVIDIA models with high-refresh 16:10 displays, a MUX switch, and full-power GPUs with exceptional cooling. If FidelityFX Super Resolution can look as good, perform similarly, and be as well supported in games as DLSS, I will likely make the switch.
     
    Papusan likes this.
  50. Deks

    Deks Notebook Prophet

    Reputations:
    1,272
    Messages:
    5,201
    Likes Received:
    2,073
    Trophy Points:
    331
    This.
    People are also waiting to see what Lenovo and MSI might do and whether they will implement MUX switches (hopefully they will).
     