The Notebook Review forums were hosted by TechTarget, which shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information posted on the forums is preserved. For current discussions, many NBR forum users moved to NotebookTalk.net after the shutdown.
Problems? See this thread at archive.org.

    AMD's Ryzen CPUs (Ryzen/TR/Epyc) & Vega/Polaris/Navi GPUs

    Discussion in 'Hardware Components and Aftermarket Upgrades' started by Rage Set, Dec 14, 2016.

  1. Talon

    Talon Notebook Virtuoso

    Reputations:
    1,482
    Messages:
    3,519
    Likes Received:
    4,694
    Trophy Points:
    331
    Intel's HEDT lineup is overpriced and needs a price cut. The 2950X seems to be a killer CPU for the money if you need all those cores and threads. I don't, so I will stick with my 6/12 and soon 8/16 core CPUs. I don't really need an 8/16, but I'll grab it for the hell of it and donate the 8700K to the laptop.
     
    hmscott and ajc9988 like this.
  2. ajc9988

    ajc9988 Death by a thousand paper cuts

    Reputations:
    1,750
    Messages:
    6,121
    Likes Received:
    8,849
    Trophy Points:
    681
    It will be the best CPU out there in that class, and I would readily recommend it, or either the 8C/8T or 6C/12T for mainstream, any day of the week for gaming and mainstream OCing. That, I believe, will change in or around April 2019, but that leaves 9 months with that recommendation on the 8700K if you're willing to OC (if not, there are times I might say the 2700X, depending on use, if not strictly gaming and some OC), and 6 months with the upcoming 8C mainstream i9.
     
    Talon and hmscott like this.
  3. hmscott

    hmscott Notebook Nobel Laureate

    Reputations:
    7,110
    Messages:
    20,384
    Likes Received:
    25,139
    Trophy Points:
    931
    Last edited: Aug 14, 2018
    ajc9988 likes this.
  4. ajc9988

    ajc9988 Death by a thousand paper cuts

    Reputations:
    1,750
    Messages:
    6,121
    Likes Received:
    8,849
    Trophy Points:
    681
    That is what I meant by extreme multitasking. Those numbers, rendering two different items at the same time, are phenomenal. Seriously, I've never seen anything like it. But, as I also mentioned, it is hard to stretch this CPU's legs!

    [image: PCWorld 2990WX benchmark screenshot]
    PCWorld review: https://www.pcworld.com/article/329...amds-32-core-cpu-is-insanely-fast.html?page=2
     
    Last edited: Aug 13, 2018
    hmscott likes this.
  5. hmscott

    hmscott Notebook Nobel Laureate

    Reputations:
    7,110
    Messages:
    20,384
    Likes Received:
    25,139
    Trophy Points:
    931
    From the video, these charts stand out as showing the crossover point for threads vs. performance advantage: the left side favors the 7980XE, the right side favors the AMD 2990WX. Note the crossover point is about 25 threads.

    [chart: ThreadRipper 2990WX vs 7980XE, Thread Load Scaling, 1-64 Threads]
    [chart: ThreadRipper 2990WX vs 7980XE, Percent Performance, 1-64 Threads]
    [chart: PCWorld's dollars-per-core chart with the new Threadripper 2 CPUs added]
    Threadripper 2 2990WX in-depth review & benchmarks
     
    ajc9988 likes this.
  6. ajc9988

    ajc9988 Death by a thousand paper cuts

    Reputations:
    1,750
    Messages:
    6,121
    Likes Received:
    8,849
    Trophy Points:
    681


    Sent from my SM-G900P using Tapatalk
     
    Talon, Vasudev, TANWare and 1 other person like this.
  7. TANWare

    TANWare Just This Side of Senile, I think. Super Moderator

    Reputations:
    2,548
    Messages:
    9,585
    Likes Received:
    4,997
    Trophy Points:
    431
    Steve does not seem happy with the 2990WX, and rightfully so. It is a mixed bag: while some selected benchmarks do benefit, some workloads do not. I doubt a 1990X would have fared better, so it seems more reasonable why one never saw the light of day.

    Now, if the CPU scaled up 53% across the board over the 2950X, it would have been awesome and a no-brainer, but that is far from the case.
     
    Talon, Vasudev, Papusan and 1 other person like this.
  8. ajc9988

    ajc9988 Death by a thousand paper cuts

    Reputations:
    1,750
    Messages:
    6,121
    Likes Received:
    8,849
    Trophy Points:
    681
    But Steve also shows he reviewed Phoronix's testing on a custom-compiled Linux with the tweaks for utilizing the 2990WX. That is why he kept stating there may be an issue with the Windows scheduler, which would make sense as a way to reduce latency hits on the other two dies. How much of an effect that is remains to be seen, so I am looking forward to more information on that front.

    Sent from my SM-G900P using Tapatalk
     
  9. TANWare

    TANWare Just This Side of Senile, I think. Super Moderator

    Reputations:
    2,548
    Messages:
    9,585
    Likes Received:
    4,997
    Trophy Points:
    431
    Below is my speculation (salt shaker not only recommended but required):
    I do not know if the Windows task scheduler is that smart. Essentially, under NUMA it should treat the dies as four computers, with two of them faster than the other two. If we use CB R15 to divide the power up: 16 cores give about 3500, so 32 cores should give about 7000, yet the 2990WX yields about 6000. This means the added cores only add 2500 to the score, or about 71% of the power of the first sixteen (2500/3500). If badly scheduled, tasks on the two fast dies could be waiting on the slow ones; worse, since the fast dies are the ones with direct memory access, that could slow tasks down even further.

    So the scheduler needs to take threads that are dependent on other threads' completion and run them on the faster cores. CB R15 has very independent threads, which is why I use it to show the power difference between cores. Now, I know this is an oversimplification, so do not shoot me. I am hoping it points out an issue, though.
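    (If you want to see the asymmetry the scheduler has to deal with, here is a quick sketch, assuming a Linux box with the numactl package installed; on the 2990WX in NUMA mode, two of the four nodes should report no local memory:)

        # List the NUMA nodes, their CPUs, and the memory attached to each
        numactl --hardware

        # Condensed summary of the NUMA layout
        lscpu | grep -i numa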
     
    Last edited: Aug 14, 2018
  10. hmscott

    hmscott Notebook Nobel Laureate

    Reputations:
    7,110
    Messages:
    20,384
    Likes Received:
    25,139
    Trophy Points:
    931
    Phoronix actually has a few articles, and a publicly available open-source benchmark suite you can run to compare your own system's performance against their results.

    A Look At The Windows 10 vs. Linux Performance On AMD Threadripper 2990WX
    Written by Michael Larabel in Operating Systems on 13 August 2018. Page 1 of 4. 30 Comments
    https://www.phoronix.com/scan.php?page=article&item=2990wx-linux-windows&num=1

    AMD Threadripper 2950X Offers Great Linux Performance At $900 USD
    Written by Michael Larabel in Processors on 13 August 2018. Page 1 of 7. 10 Comments
    https://www.phoronix.com/scan.php?page=article&item=amd-tr2950x-linux&num=1

    "If you would like to see how your own Linux CPU performance compares to the results shown in this article, simply install the Phoronix Test Suite and run phoronix-test-suite benchmark 1808102-RA-AMD2950XT81."

    AMD Threadripper 2990WX Linux Benchmarks: The 32-Core / 64-Thread Beast
    Written by Michael Larabel in Processors on 13 August 2018. Page 1 of 11. 61 Comments
    https://www.phoronix.com/scan.php?page=article&item=amd-linux-2990wx&num=1

    "With the Phoronix Test Suite being open-source and designed to be reproducible and delivery fully-automated benchmarking, it is very easy to see how your own Linux system(s) compare to the Intel/AMD Linux CPU benchmarks shown in this article. With the Phoronix Test Suite on your Linux/BSD/macOS system, simply run phoronix-test-suite benchmark 1808115-RA-THREADRIP08 for your own fully-automated, side-by-side benchmarking comparison against the results in this article from test installation to test execution and result analysis."

    AMD Threadripper 2990WX Cooling Performance - Testing Five Heatsinks & Two Water Coolers
    Written by Michael Larabel in Peripherals on 13 August 2018. Page 1 of 5. 5 Comments
    https://www.phoronix.com/scan.php?page=article&item=amd-2990wx-cooling&num=1

    Linux Kernel Expectations For AMD Threadripper 2
    Written by Michael Larabel in AMD on 9 August 2018 at 03:50 PM EDT. 11 Comments
    https://www.phoronix.com/scan.php?page=news_item&px=Threadripper-2-Kernel-Bulletin

    AMD Threadripper 2000 Series Details: Up To 32-Cores / 64-Threads With The 2990WX
    Written by Michael Larabel in Processors on 6 August 2018. Page 1 of 1. 40 Comments
    https://www.phoronix.com/scan.php?page=article&item=amd-tr2990wx-preview&num=1
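    For anyone wanting to try the comparisons quoted above, here is a minimal sketch (assuming a Debian/Ubuntu system; the suite can also be downloaded from phoronix-test-suite.com, and the benchmark IDs are the ones quoted in the articles):

        # Install the Phoronix Test Suite from the distro repositories
        sudo apt-get install phoronix-test-suite

        # Side-by-side comparison against the 2950X article's results
        phoronix-test-suite benchmark 1808102-RA-AMD2950XT81

        # Or against the 2990WX article's results
        phoronix-test-suite benchmark 1808115-RA-THREADRIP08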
     
    Last edited: Aug 16, 2018
    Vasudev, TANWare and ajc9988 like this.
  11. Deks

    Deks Notebook Prophet

    Reputations:
    1,272
    Messages:
    5,201
    Likes Received:
    2,073
    Trophy Points:
    331
    Vasudev, ajc9988 and hmscott like this.
  12. hmscott

    hmscott Notebook Nobel Laureate

    Reputations:
    7,110
    Messages:
    20,384
    Likes Received:
    25,139
    Trophy Points:
    931
    A couple of new articles from Phoronix. :)

    A Look At Linux Gaming Performance Scaling On The Threadripper 2950X
    Written by Michael Larabel in Linux Gaming on 16 August 2018. Page 1 of 4. 10 Comments
    https://www.phoronix.com/scan.php?page=article&item=threadripper-2950x-gaming&num=1

    "There are a lot of real-world Linux workloads that can benefit from 16 cores / 32 threads on the Threadripper 2950X (or even 32 cores / 64 threads with the Threadripper 2990WX), but Linux gaming isn't close to being one of them."

    A Quick Look At The Windows Server vs. Linux Performance On The Threadripper 2990WX
    Written by Michael Larabel in Operating Systems on 16 August 2018. Page 1 of 4. 14 Comments
    https://www.phoronix.com/scan.php?page=article&item=windows-server-2990wx&num=1

    "Windows Server did end up being faster than Windows 10 at the time to run various Git revision control system commands, but that too was still behind the Linux performance."
     
    Last edited: Aug 16, 2018
    Vasudev and ajc9988 like this.
  13. hmscott

    hmscott Notebook Nobel Laureate

    Reputations:
    7,110
    Messages:
    20,384
    Likes Received:
    25,139
    Trophy Points:
    931
    Threadripper 2nd Generation Interview with AMD's James Prior
    Newegg Studios
    Published on Aug 17, 2018
    Trisha and JC talk to James Prior from AMD about the second generation of Threadripper: https://www.newegg.com/promotions/amd/18-2407/index.html

    This massive new CPU has 32 cores and 64 threads, making it a multitasker’s dream. They’ll be diving in deep, and answering all your questions about this incredible new CPU.
     
    ajc9988 and Vasudev like this.
  14. hmscott

    hmscott Notebook Nobel Laureate

    Reputations:
    7,110
    Messages:
    20,384
    Likes Received:
    25,139
    Trophy Points:
    931
    Hopefully there will be lots of used GPUs showing up soon on the second-hand market, and discounts on new Pascal GPUs to follow when the new-gen Nvidia GPUs hit the shelves.

    GeForce GTX 1070 Ti vs. Radeon RX Vega 56, 2018 Update [25 Game Benchmark]

    Hardware Unboxed
    Published on Jun 3, 2018


    Can Custom Vega 64 Beat The GTX 1080?
    2018 Update [27 Game Benchmark]

    Hardware Unboxed
    Published on Aug 16, 2018
     
    ajc9988 and Vasudev like this.
  15. hmscott

    hmscott Notebook Nobel Laureate

    Reputations:
    7,110
    Messages:
    20,384
    Likes Received:
    25,139
    Trophy Points:
    931
    Linux vs. Windows Benchmarks, Threadripper 2990WX vs. Core i9-7980XE
    Hardware Unboxed
    Published on Aug 20, 2018
     
  16. ajc9988

    ajc9988 Death by a thousand paper cuts

    Reputations:
    1,750
    Messages:
    6,121
    Likes Received:
    8,849
    Trophy Points:
    681
    This one confirmed my suspicions on Windows: their optimizations can benefit both sides. And who wants to bet part of it is their telemetry, Cortana, constantly trying to dial home, their tile ********, and wanting to become an advertising platform? The rest is the scheduler and other issues. But at least we now have multiple good reviewers pointing out the problem to users.

    Now, for anyone complaining about kernel choice: Phoronix did optimized kernel compilations AND used the most recent kernel in some of his testing, but I don't think that was done in his Windows vs. Linux run. Any variation between head-to-head tests can largely be attributed to using the -O3 and -march=native switches when compiling, as a note here. Also, the most recent kernel did have a couple of optimizations, IIRC.
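    (For reference, those switches look like this in a generic compiler invocation; this is an illustration only, not Phoronix's actual build line, and mybench.c is a made-up file name:)

        # -O3 enables aggressive optimizations; -march=native tunes the code for the host CPU
        gcc -O3 -march=native -o mybench mybench.c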

    Sent from my SM-G900P using Tapatalk
     
    TANWare, Vasudev and hmscott like this.
  17. TANWare

    TANWare Just This Side of Senile, I think. Super Moderator

    Reputations:
    2,548
    Messages:
    9,585
    Likes Received:
    4,997
    Trophy Points:
    431
    I would dump Windows in a heartbeat if the software I use and paid for ran on Linux. I would even repurchase some of the tools.
     
  18. TANWare

    TANWare Just This Side of Senile, I think. Super Moderator

    Reputations:
    2,548
    Messages:
    9,585
    Likes Received:
    4,997
    Trophy Points:
    431
  19. hmscott

    hmscott Notebook Nobel Laureate

    Reputations:
    7,110
    Messages:
    20,384
    Likes Received:
    25,139
    Trophy Points:
    931
    Windows 10 patch KB4343909 addresses high CPU usage with AMD processors after June/July patches and AMD microcode updates
    https://www.reddit.com/r/Amd/comments/97a5p4/windows_10_patch_kb4343909_addresses_high_cpu/

    Submitted 6 days ago by softskiller - announcement
    • Addresses an issue that causes high CPU usage that results in performance degradation on some systems with Family 15h and 16h AMD processors. This issue occurs after installing the June 2018 or July 2018 Windows updates from Microsoft and the AMD microcode updates that address Spectre Variant 2 (CVE-2017-5715 – Branch Target Injection).
    https://support.microsoft.com/en-us/help/4343909
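    If you want to confirm the update actually landed on your machine, a small sketch (PowerShell, with the KB number from the post above):

        # Query the installed-hotfix list for this specific update
        Get-HotFix -Id KB4343909

        # Fallback via WMI from a classic command prompt
        wmic qfe where "HotFixID='KB4343909'" get HotFixID,InstalledOn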
     
    ajc9988 likes this.
  20. Vasudev

    Vasudev Notebook Nobel Laureate

    Reputations:
    12,035
    Messages:
    11,278
    Likes Received:
    8,814
    Trophy Points:
    931
    I have the same issue on Intel Skylake and its predecessors. I disabled Spectre protection on the Skylake machine, whereas an older machine without the BIOS update (only OS-loaded microcode and OS Spectre protection) doesn't seem to have the issue. The issue occurs when the microcode is in the BIOS and doesn't play nice with Windows. Linux with the retpoline patch doesn't have that issue.
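    For reference, one way to see which Spectre V2 mitigation (retpoline vs. IBRS microcode) a Linux kernel is actually using, assuming a 4.15+ kernel that exposes the sysfs vulnerabilities files:

        # Kernel-reported Spectre V2 mitigation status
        cat /sys/devices/system/cpu/vulnerabilities/spectre_v2

        # Check whether updated microcode was loaded at boot
        dmesg | grep -i microcode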
     
    ajc9988 and hmscott like this.
  21. ajc9988

    ajc9988 Death by a thousand paper cuts

    Reputations:
    1,750
    Messages:
    6,121
    Likes Received:
    8,849
    Trophy Points:
    681
    So, I installed it, but I had to do a complete power off, unplug, and drain the caps, then boot to get it to work right. I tried loading the profile from last time and got 1100 points in CB15. After draining the caps, loading defaults in the BIOS, and then manually entering the settings, it booted without a problem, didn't kick mem errors (still 3466; didn't try to get 3600 stable with the new RAM), and was right back to 3540-3545 in CB15. But it was updating the mdepkg, so having to really kick in those defaults with a clean start kind of makes sense, even though the update was focused on helping with RAID. Haven't done the normal thorough testing, but....

    Other than the initial issue it has seemed stable, so nothing bad to report atm; just that you may want to do the same, loading full defaults and making sure the components are set right in the BIOS.
     
    Vasudev, TANWare and hmscott like this.
  22. TANWare

    TANWare Just This Side of Senile, I think. Super Moderator

    Reputations:
    2,548
    Messages:
    9,585
    Likes Received:
    4,997
    Trophy Points:
    431
    Worked from first boot here without issue.
     
    hmscott likes this.
  23. hmscott

    hmscott Notebook Nobel Laureate

    Reputations:
    7,110
    Messages:
    20,384
    Likes Received:
    25,139
    Trophy Points:
    931
    Overclocked Dual-Epyc CPUs with Der8auer (Ft. Lots of Shade)
    Gamers Nexus
    Published on Aug 21, 2018
    There is so much shade in this interview that we needed more light. Der8auer talks about overclocking two AMD EPYC CPUs for "the world's fastest dual-socket" PC.
     
    ajc9988, Vasudev and TANWare like this.
  24. Deks

    Deks Notebook Prophet

    Reputations:
    1,272
    Messages:
    5,201
    Likes Received:
    2,073
    Trophy Points:
    331
    ajc9988, Vasudev and hmscott like this.
  25. hmscott

    hmscott Notebook Nobel Laureate

    Reputations:
    7,110
    Messages:
    20,384
    Likes Received:
    25,139
    Trophy Points:
    931
    I recall reading this Dropbox blog post a while back; he mentioned Epyc favorably, briefly, a year ago:

    "Specifically, if you go the Intel path, you are looking for at least Haswell/Broadwell and ideally Skylake CPUs. If you are going with AMD, EPYC has quite impressive performance."

    That shows how long the cycle is for bringing in new datacenter hardware. :)

    https://blogs.dropbox.com/tech/2017/09/optimizing-web-servers-for-high-throughput-and-low-latency/

    Here's the recent blog post from AMD:

    Dropbox Designs its Custom-Built Infrastructure with Single-Socket AMD EPYC Platform
    Posted by scott.aylor in AMD Business on Aug 21, 2018 11:01:15 AM
    https://community.amd.com/community...tructure-with-single-socket-amd-epyc-platform

    "As vast as the datacenter market is, it’s a relatively short list of companies working together in the day-to-day business. I don’t typically have the pleasure of engaging closely with a company that literally has hundreds of millions of customers like Dropbox. With over 500 million users and 300,000 Dropbox Business customers accessing its global collaboration platform, Dropbox is the latest big name in cloud to deploy the AMD EPYC™ processor in their custom-built infrastructure.

    “AMD EPYC is a compelling processor option for our compute technology, providing Dropbox with the technical specifications required to support the workloads that matter to teams and our individual users,” said Rami Aljamal, Head of Hardware Engineering and Supply Chain at Dropbox. “We are excited to deploy EPYC processors and look forward to working closely with AMD in the future.”

    Dropbox will leverage AMD EPYC™ 7351P one-socket processor platforms to support future growth beyond its current capabilities and refresh its existing infrastructure for its most demanding compute workloads.

    The AMD EPYC™ 7000 series delivers compelling options for the Dropbox offering, meeting performance demands throughout evaluation, qualification and deployment. With 16 high-performance cores on the EPYC 7351P processor and leading-edge memory bandwidth, AMD continues to drive a strong balance of compute and connectivity while eliminating the need for a second socket."
     
    ajc9988 and Vasudev like this.
  26. hmscott

    hmscott Notebook Nobel Laureate

    Reputations:
    7,110
    Messages:
    20,384
    Likes Received:
    25,139
    Trophy Points:
    931
    GeForce GTX 1080 Ti vs. Radeon RX Vega 64 OC - 4K Gaming Performance (i7-8700K)
    TechEpiphany
    Published on Aug 17, 2018
    00:01 - Far Cry 5
    01:02 - Middle-earth: Shadow of War
    01:52 - Total War: Warhammer II
    02:55 - Deus Ex: Mankind Divided
     
    Vasudev likes this.
  27. ajc9988

    ajc9988 Death by a thousand paper cuts

    Reputations:
    1,750
    Messages:
    6,121
    Likes Received:
    8,849
    Trophy Points:
    681
    Dropbox was an early sampler, though they got on board with sampling after other partners did. So their announcing deployment is just the first of the rain that is coming in Epyc adoption. In fact, this third quarter is only supposed to bring AMD to 1-1.5% market share in the server space, IIRC. It is Q4 that brings them to 4-5% market share in servers. Then, for next year, Intel said holding AMD to 15-20% market share is their goal, meaning they expect AMD to grab at least that much, and that is coming from the competitor and largest player in the space.

    Intel's fear has caused them to announce this:
    [image: Intel's announcement]
    So they are throwing their board partners under the bus by allowing backward compatibility to Haswell platforms, out of fear that forcing customers to buy new platforms favors just switching to AMD. It throws out the two-generation cadence and shows they artificially restricted sockets. This is about capturing drop-in sales rather than full replacement servers. But if the boards are not wired for the specs of the new chips, that will limit which features of the new chips can be used (e.g., 48 lanes cannot be used on boards wired for only 40).

    This just seems really desperate on their part, honestly.
     
    Vasudev and hmscott like this.
  28. Talon

    Talon Notebook Virtuoso

    Reputations:
    1,482
    Messages:
    3,519
    Likes Received:
    4,694
    Trophy Points:
    331
    Nice, a stock 1080 Ti vs. an overclocked Vega.
     
    Vasudev likes this.
  29. Deks

    Deks Notebook Prophet

    Reputations:
    1,272
    Messages:
    5,201
    Likes Received:
    2,073
    Trophy Points:
    331
    16nm high-performance process (NV) vs. 14nm low-clock, mobile-oriented process (AMD).
    AMD also has 40% more stream processors (which are good for compute, yes, but don't translate to gaming well at all), lower clock speeds, and lower memory bandwidth limiting overall performance on Vega (one of the main reasons for this was a manufacturing process designed for low clocks and mobile parts; the voltages for any given frequency are bound to be higher in comparison to NV and will reach the 'threshold' for comfortable levels at relatively lower clocks).

    Considering what AMD had to work with... the results aren't bad at all, and should turn in AMD's favor at 7nm (or at least equalize things with Nvidia's products also on 7nm - unless Nvidia managed to make a new architecture, but I doubt that's the case - at best we are looking at a refresh of Pascal with some improvements as allowed by 12nm, for example, and 7nm).

    If you noticed, Pascal is usually clocked up to 30% higher than Vega on the core alone... thanks in great part to the manufacturing process they use.

    Vega is a multipurpose card... not a solely gaming-oriented GPU.

    Also, isn't the 1080 Ti an overclocked 1080?

    EDIT: I made an obvious error in asking whether the 1080 Ti is an overclocked 1080... there are clear differences between the said GPUs at a hardware level which give the 1080 Ti an advantage in games... but those differences also give it an edge over a Vega 64 that's been overclocked as well... so we shouldn't really be surprised by those results - and yet... it actually amazes me that people claim AMD clearly 'failed' with Vega when in fact they don't have to be at the very top in regards to gaming GPUs.
    Still, the inferior 14nm LPP process didn't work in their favor... whether things improve at 7nm, however, remains to be seen.
     
    Last edited: Aug 26, 2018
    Vasudev likes this.
  30. Papusan

    Papusan Jokebook's Sucks! Dont waste your $$$ on Filthy

    Reputations:
    42,701
    Messages:
    29,839
    Likes Received:
    59,615
    Trophy Points:
    931
    Vasudev likes this.
  31. yrekabakery

    yrekabakery Notebook Virtuoso

    Reputations:
    1,470
    Messages:
    3,438
    Likes Received:
    3,688
    Trophy Points:
    331
    No, it's a completely different GPU.
     
  32. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    I know you hate Nvidia and Intel with an abnormal passion, but you really... REALLY should at least take a basic look at their products.
     
    Papusan and yrekabakery like this.
  33. Deks

    Deks Notebook Prophet

    Reputations:
    1,272
    Messages:
    5,201
    Likes Received:
    2,073
    Trophy Points:
    331
    Don't assume my views of Nvidia or Intel.
    For the record, I don't hate them... I merely elaborated on the difference in manufacturing processes and some other technical data between Vega and Pascal.
    Granted, my statement of the 1080 Ti being an overclocked 1080 was incorrect; the remainder of what I said, however, stands.

    Besides, Vega 64 is an equivalent to the 1080... not the 1080 Ti.
    The 1080 Ti also has more ROPs and texture units than the 1080, along with higher VRAM speed and bandwidth, which give it more 'oomph' beyond clock speed alone.

    The fact that an overclocked Vega 64 gets within 10% of it is not bad... but also not exactly impressive, since a 1080 with a decent overclock could likely do something similar.

    Simply speaking, AMD never made a GPU in the 1080 Ti category... plus, I'm not sure they HAVE to.
    Shooting for the highest performance with their resources doesn't exactly strike me as a good idea, especially since it's a niche market to begin with.

    AMD's approach relies more on 'versatility' of the GPU... which at the moment is relatively under-utilized by an industry that primarily optimizes for NV.
     
  34. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    Yet you vehemently deny any chance of buying an Intel/Nvidia platform, to the point of overpaying for a neutered AMD solution from a known-horrible notebook manufacturer, just because it's AMD? I still remember all the conversations from before Polaris mobile happened, where I pointed out it'd be ill-suited, and you openly said you'd rather take it than a 1060 in a larger notebook with a bigger power brick.

    If you don't hate them, your actions are not reflecting your stated thoughts about them. I see no other logical way you could insist on overpaying for less and be satisfied with it, other than that you cannot stand the thought of the alternative companies.
     
    yrekabakery likes this.
  35. Deks

    Deks Notebook Prophet

    Reputations:
    1,272
    Messages:
    5,201
    Likes Received:
    2,073
    Trophy Points:
    331
    That's an interesting response.
    Now, granted, me saying the 1080 Ti is an overclocked 1080 was an obvious error... but you also said it's 'nice' that a stock 1080 Ti performs favorably against an OC Vega, which deserves a similar reaction.

    Especially when you consider that the 1080 Ti has more ROPs, higher frequencies, etc. compared to an overclocked Vega GPU that's the equivalent of a 1080 at stock settings.
    It's not unexpected at all that the 1080 Ti would perform better.

    First off (and this is something I explained before), I wanted an AMD platform as a change of pace, because the new Zen architecture delivered on its promises and this was the only laptop at the time that offered a desktop-grade 8-core/16-thread CPU for a reasonable price which I could use in my productivity workflow (no other laptop at the time really fit those specs).
    Plus, the initial promise of B350 platform upgradeability presented itself as an opportunity... until Asus decided (much later on) to disclose that they won't be releasing BIOS updates for CPU upgrades (nevertheless, that doesn't make the 1700 any less capable in the long run for my needs).

    Second: I had no idea that Asus were a 'known-horrible notebook manufacturer', and the responses I got about them prior to my purchase were less than descriptive of them as a 'bad company'... as for Asus 'neutering' the hardware: if you're referring to the mobile RX 580 in the GL702ZC, one could say the same about Nvidia's mobile 1060. Both are 'neutered' to the point where they don't have the same performance as their desktop counterparts and have certain TDP limits so they can be placed into a laptop... even though a 17" chassis is more than capable of housing more powerful hardware (and I expressed these concerns after I got the unit).

    Third: the mobile RX 580 in the GL702ZC was already operating at lower frequencies and was limited to a LOWER TDP (meaning it was more efficient) than the mobile 1060 while producing similar or the same performance, depending on the game.
    It wasn't AMD's fault that Asus botched the cooling, or that they decided to drop the frequencies as low as they did (or that they didn't bother to optimize the voltages for the GPU at the given frequencies). And with due respect, I'm hardly clairvoyant, nor am I the first one to make the mistake of purchasing a laptop from a bad OEM; I actually thought that Asus would make an effort and do things right.

    The issues I encountered with the unit went well beyond the time frame any reviewers would be able to catch and also, subsequent units of the same model but ones that were produced at a later date seem to operate within advertised specs... its the initial batch that seems to be having failure issues (and that's the batch I apparently got).

    As for the power brick being large... considering the GL702ZC comes with a 65W TDP desktop-grade CPU, no less, and has no IGP, the thing has specific power demands.
    Large power bricks for laptops that house desktop-grade hardware on either side (Intel/NV or AMD) aren't unheard of.
    Not exactly comfortable, but that's the trade-off.

    If I 'overpaid' for the GL702ZC, I could say that every person who bought a comparable Intel/NV laptop is also overpaying for their purchase.
    Do I think Asus charged more money for the GL702ZC than they should have for what they offered? Of course I do... I even wrote as much when I first got the laptop and tested it.

    For that matter, I think the current prices of laptops with nothing but APUs in them are ridiculously overpriced... and yet EVERYONE keeps saying it's because of increased market prices.

    And, if I recall correctly, I also said I would have asked Asus for a refund, but that option was no longer available to me, so I had to resort to an RMA... but I guess those 'details' don't really matter.
     
    Last edited: Aug 24, 2018
  36. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    Oh boy this is gonna be a long one.
    I am talking about well before Zen even launched, when Polaris had JUST emerged and people wanted MXM RX 480s. I was stating (and correctly so) that Polaris is not efficient enough to compete with the 1060, and that you have a tradeoff either way: a larger brick than people want for smaller laptops (78W TGP for a 1060N plus a 55W spike for an Intel mobile CPU is fine on a physically tiny 150W brick, impossible with an RX 480), making for a power-hungry, hot and heavy unit with an entry-level current-gen GPU. Alternatively, you get severely reduced performance to fit the lower power envelope... which means once again you could buy a much weaker card for similar performance, like a 1050 Ti.

    ASUS chose the latter: a 65W RX 580, when the full RX 480 required 192W to play The Witcher 3 at 1440p at STOCK CLOCKS once you disabled power throttling. The RX 580 barely reaches 980M performance, far less arriving near a 1060N. And if you want to say it can beat a 980M, then overclock the 980M; something you don't have the power envelope to do in that ASUS (or the cooling, apparently, considering how hot it gets while being bafflingly loud... it's about as loud as my P870, and almost as large too).

    As for ASUS not releasing updates... they always. Without a shadow of a doubt. Prevented upgrades. Even when they used MXM, they used a custom PCB design so you couldn't physically insert a later MXM card, even from their own units, into your previous-gen unit... you ALWAYS need to buy anew to upgrade with ASUS. And further to that, I also doubt the VRM solution on that board supports an upgrade in the first place; like most B350 boards, 2700X chips burn them out easily. Though I would forgive you for not realizing this fact, but then the unit isn't sold with a 1700X or 1800X, and those have higher power limits, so... some signs were there.

    To be honest, you've been on this forum long enough that having seen NONE of that talk is impossible. It's all over in threads I know you've seen, unless you've only checked maybe three or four threads since coming to this forum entirely. Which isn't possible with your post count.

    Now onto the 1060N... the 1060N is, worst-case scenario, 15% slower than the desktop card, usually less (without any voltage/frequency tuning, that is; since it was heavily talked about for AMD's side, it's fair game here). The OC vBIOS for the 1060, which is 90W TGP, is much closer in performance to desktop (probably nearly matching with undervolts). The 65W RX 580 is UNDER 980M performance in quite a few games according to Notebookcheck... ignore the synthetics, look at the games. Fortnite and Star Wars Battlefront are both over 15% slower than on a 980M, which is generally 30% slower than a 980, with the 980 being what the 1060 ballparks. That's nowhere CLOSE to desktop-card performance, at all.

    As I said above, the 1060N and RX 580 don't even come close to having the same performance, but have about the same power budget (65W core for the RX 580, 80W TBP), so... what's the comparison? And how is the RX 580 more efficient when the 1060N creams it pretty much everywhere? Even on the Frostbite 3 engine with Star Wars Battlefront II, WHICH IS AN AMD-FOCUSED TITLE on an engine that LOVES extra CPU power and scales to multiple cores like a squirrel scaling a tree trunk. Instead, the 1060N is a whopping 37% faster:
    [image: Star Wars Battlefront II benchmark chart]
    As for the lowered performance and clockspeeds... this is why you wait a short amount of time for reviews like from Notebookcheck, where you can read the raw data. And speaking of reviews, this brings us to my next point:

    ASUS, like Razer, always reviews well, because they send cherry-picked units and in some cases fly techs out to outlets to make sure the unit is "working properly". And if you review the unit badly, guess who is unlikely to get more units in the future? You're never going to find a bad review of a notebook that doesn't completely break (like locking a CPU at 800MHz), because reviewers aren't in the business of giving you a proper picture.

    A full RX 580 would require somewhere around 150W on its own. This instantly limits it to 230W+ power bricks if you want the advertised performance, as opposed to 150W or 180W for a 1060N unit. We've been over this before in the other thread I referenced at the start. Either you neuter performance, or you make a unit designed to house a higher-end part from the other vendor, like a 125W 1070N or even a 150W 1080N, both of which outstrip a 580 so much it's laughable. As for the CPU, we never discussed this before, though I knew from a week after Zen's launch it would be horrible in any notebook. We don't have the RAM to feed it properly.

    What you "overpaid" for, is the performance. You could have had a 1070N and a 7700HQ for $1800 USD with your existing storage/memory configuration easily around the time you bought it. Nvidia cards are indeed overpriced for notebooks. OEMs charge double what they're capable of selling them for to still make profit, despite what anybody else tells you. But what you bought, that cheap B350 chipset with that cheap 1700 and that completely neutered RX 580 for the price you paid and the poor cooling and loud noise when doing said cooling? That's beyond what any decent intel/Nvidia unit needs to run someone. Even the overpriced Aorus line from Gigabyte could be had for your price with 1070N cards and liquid metal repastes from GentechPC at the time.

    No, just like what I said initially to you here, you STILL insist on an all-AMD notebook. Even if price is exorbitant for minimal/reduced performance. Knowing full well you will never achieve full performance out of AMD on notebooks because they won't use 3200MHz+ memory to properly support the CPU, along with the fact that notebooks generally tend to use single rank single sided memory too, so you don't even have the benefit of dual rank memory improvements unless you start with 32GB in four stick configurations or hunt certain kinds of memory sticks.

    So once again, I make the statement: Your actions don't reflect what you said your thoughts are. When I told you RX 480s were ill-suited to mobile, you denied me. Look at the existing implementation now. The other unit that has it (with still-neutered performance) is a HP Omen with 120W TDP and unknown but certainly higher-than-120W TGP/TBP, far above even the high performance 1070N vBIOSes which are 125W TGP, which destroy it in performance, far less after undervolting is applied. Your above post JUST got through defending them doing this, saying it was ASUS' fault and not the tech's fault. You're still insisting Zen/Zen+ is the best case for a notebook, despite all known implementations lacking high speed memory support and any form of overclocking, with a 2700X unit still-to-come, when you could be using 4.8GHz 8700Ks for your production all this time with well over 3000MHz memory in a P870TM1 or even P750TM1 if the load was CPU-only and you bought it from the right vendor... it would not have been much more with a 1060N in it either in terms of price for your storage/memory setup, with better performance across the board.

    But you still insist on all-AMD. What am I supposed to think? And then you didn't even know a 1080 and a 1080Ti were two different cards, AFTER Pascal's entire life cycle ended, meaning you barely even glanced at their product line.

    Edit: Was pointed out that the HP Omen's 580 is ~120W board power and 85W TDP, not 120W TDP and higher board power. Still matches it to a 1070N which cleans its clock, though, but I am corrected on that.
     
    Last edited: Aug 24, 2018
  37. Deks

    Deks Notebook Prophet

    Reputations:
    1,272
    Messages:
    5,201
    Likes Received:
    2,073
    Trophy Points:
    331
    Didn't I maintain that I got the GL702ZC mainly for productivity work and that gaming was secondary?

    It's you who puts too much emphasis on the GPU (and appears to make quite a lot of errors in the process).

    The 7700HQ has about half the performance of the 1700 and wouldn't have been adequate for long-term productivity.
    Which laptop at the time had an 8-core/16-thread CPU of that performance available on the market for the same price?
    None to my knowledge, and that was in October last year (heck, the GL702ZC was actually reviewed two months BEFORE it became available here in the UK).
    So basically, I paid more for having a desktop-grade CPU in a laptop.

    Then there are the infamous issues with Intel CPUs in laptops which end up throttling (I noticed those particular problems being reported here frequently enough) due to a poor IHS (which plagues most Intel CPUs on laptops and desktops) and also poorly applied thermal paste from plenty of other OEMs.

    The GL702ZC's flaws aside, Asus DID manage to apply the thermal paste decently, and compared to the rest of their ROG line (or other laptops) the GL702ZC didn't seem to suffer from throttling issues or overly high temperatures (although I DID mention that Asus could have done FAR better with a 17" chassis).

    As for the RX 580's neutered performance... let's get a few things straight: at the time the early reviews were released, they claimed about a 5-10% performance differential, and honestly, the RX 580 handled everything I threw at it well, to the point I didn't really find any differences worth mentioning.
    Plus, the GL702ZC came with FreeSync... and future drivers DID manage to increase overall performance of the RX 580 in the unit.

    Also consider the fact that the RX 580 was limited to 68W... whereas the mobile 1060 is limited to 80W.
    NV had more room to maneuver and was built on a more efficient process which allowed it to be clocked higher... Asus chose to restrict AMD more than necessary (whereas in reality they could have also limited it to 80W, optimized the voltages and called it a day, at which point performance would likely be similar enough that it wouldn't really be noticed). And as for the AMD title being 36% slower on the RX 580... was that tested in DX11 or DX12? I've noticed Notebookcheck seldom compares games in DX12 and doesn't really publish which mode they are using; in DX12, the differential between the two GPUs is much smaller, or actually goes in the RX's favor.

    I also don't care how much power the full RX 480 or 580 needs... because those are different power demands, using the highest possible numbers on desktops without any voltage optimization (and you should know by now that AMD overvolts their GPUs to increase yields, whereas Nvidia locked its voltages and clocks for the most part and didn't suffer the same production issues AMD did).
    Furthermore, it's not as if the desktop 1060 was particularly low on power consumption.
    When the desktop RX 480 and 580 were undervolted, they both consumed a similar amount of power to the 1060.
    They were still a bit higher than NV in power consumption, but that is attributable to the manufacturing process limitations AMD had to work with.
    Also, correct me if I'm wrong, but the 1060 is clocked higher than Polaris on the desktop (the same is apparent with Pascal vs. Vega) while producing similar or the same performance.

    Furthermore, there were no other all-AMD laptops on the market at the time.
    That was the only unit that presented itself as viable for me and my needs... and when it worked, it worked well.

    As for me insisting on an all-AMD laptop at this time... so what?
    It doesn't mean I hate Nvidia or Intel.
    It actually means I want to support AMD at this time (especially given NV's and Intel's previous shady market practices, which still doesn't mean I fundamentally hate them), and I consider their hardware viable (it is, when properly optimized). This is also what I stated before, but somehow you managed to overlook it (although it's entirely possible you will just discard this reasoning to reinforce what you already said, so forgive me if I no longer try to justify my actions to you... and to be honest, I don't have to. I merely chose to write my reasons now so you can better understand why I chose AMD... and if you won't accept that response, then that's your problem; it also doesn't make your opinion correct).

    OEMs making mistakes with AMD hardware in laptops is an entirely separate issue... OEMs are responsible for optimizing their units, which of course they almost never do properly, and they have a history of cutting corners with AMD.
    Asus at the time seemed as if they were doing things differently with AMD.

    Should this failure issue not get resolved and instead result in a refund, and if no other option presents itself, then I will of course consider an Intel/NV laptop.

    But don't act as if this issue is entirely AMD's fault, or that it's inherently not viable for my use or anyone else's.
    You've been making the same comments about the desktop versions.
    People can get AMD or Nvidia without hating on either company.
    Our choice to support either doesn't have to mean we hate the other... is it so impossible for you to understand that some of us want to give AMD a chance after they already proved themselves on desktop?

    AMD still has some limitations due to the existing manufacturing process which limit its efficiency and performance, but that doesn't mean it cannot be optimized to work properly by OEMs (or users) who choose to use it... and if they do, it's partly their responsibility to make it work properly (it is also AMD's responsibility to optimize their own hardware from the start... but given their limited resources at the time, I'm not sure they could have afforded to do so).

    Also, the failures I experienced with the GL702ZC could have happened with other OEMs... in fact, they tend to occur when people adopt early hardware releases, so it's a risk to a degree with any OEM; Asus is hardly an exception.
     
    Last edited: Aug 24, 2018
    sniffin likes this.
  38. LunaP

    LunaP Dame Ningen

    Reputations:
    946
    Messages:
    999
    Likes Received:
    1,102
    Trophy Points:
    156
    Could you just calm down and get back to actual information discussion, and maybe even READ fully what others are attempting to explain to you? By no means is D2 an AMD hater; we ALL want competition to come up and push things forward. This "so what to X" and "I don't care about any actual information" BS is making the discussion pointless to read. There's nothing wrong with you being a Red fanboy, everyone has their choice of hardware, but get off the whole "I need to make this personal because I'm offended that you're teaching me stuff" attitude. The purpose of this thread/forum (or at least once upon a time) is for people with misconceptions or a lack of knowledge to learn, discuss and debate, and for those WITH information to discuss and utilize it, instead of going on tangents about how factual information supporting an argument is arbitrary and completely irrelevant to the point you're failing to make.

    If you don't care, then why are you discussing it? If it doesn't matter, why is it upsetting? If you feel someone is using hate, then pray tell, re-read, since this is the internet and anyone who gets their heartbeat revved up too quickly can mistake anything for a personal attack.

    SO please can we get BACK to technical discussions vs. sticks and stones? Thanks <3
     
  39. Deks

    Deks Notebook Prophet

    Reputations:
    1,272
    Messages:
    5,201
    Likes Received:
    2,073
    Trophy Points:
    331
    When people start making assumptions about others, don't expect them to go unanswered.

    D2 was the one who pushed the topic in this direction. I openly admitted my mistake in saying the 1080 Ti was an overclocked 1080... everything else was an attempt at explaining the reasons behind my decisions, as he apparently operates under certain assumptions about me which I tried to correct.

    As for your other claims... they are rather bad assumptions, and frankly, I just stopped caring.
    Good night
     
    sniffin likes this.
  40. yrekabakery

    yrekabakery Notebook Virtuoso

    Reputations:
    1,470
    Messages:
    3,438
    Likes Received:
    3,688
    Trophy Points:
    331
    This is misleading. Your RX 580's 68W is only the GPU core power, while a notebook 1060's 78W is total board power, which includes not only the GPU core but also GDDR5 memory and VRMs. If anything, the 1060N is more power efficient when you factor in its higher performance.
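    (For what it's worth, on the Nvidia side you can read the board-level power draw directly, a sketch assuming a card and driver that expose the power sensors through nvidia-smi:)

        # Current board power draw vs. the enforced power limit
        nvidia-smi --query-gpu=power.draw,power.limit --format=csv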

    The title being compared was Star Wars: Battlefront II. This game uses EA/D.I.C.E.'s Frostbite 3.0 Engine, which is notorious for having a DX12 implementation that runs worse than DX11, on both Nvidia and AMD hardware.

    Nvidia's Pascal GPUs also come highly overvolted from the factory. The voltage/frequency curve is overvolted by about 150mV more than necessary across the entire range. That's why on my GTX 1080, I can stably run 1848MHz @ 0.913V, whereas stock would have had me at about 1.050V at that clock speed.

    Voltage and clock manipulation can be done using the curve editor in MSI Afterburner, it is not locked down:

    [image: MSI Afterburner voltage/frequency curve editor]

    (Thinks back to the "1080 Ti is an overclocked 1080" comment.)

    Pot meet kettle, I guess.
     
    D2 Ultima and LunaP like this.
  41. LunaP

    LunaP Dame Ningen

    Reputations:
    946
    Messages:
    999
    Likes Received:
    1,102
    Trophy Points:
    156
    Welcome to the internet. Good people will do their best to help you; others will get defensive, and go sappy if they're still learning things and assume they know everything, then duck out the minute the pressure to acknowledge facts arises. There's no dishonor or bad merit in learning new things and listening to others. Simply saying everyone else is wrong, and using bad information or what you think MAY be the case, only strengthens the need to point out your inability to properly discuss topics (which, for some reason, you appear to be confusing with people ASSUMING... there's a huge difference there which I hope you learn from). Hopefully after a good rest you'll come back, drop the defensive mode, and resume speaking intellectually.

    You can claim everything is wrong, but it's not just about you; it's about ensuring you don't mislead other people here as well. The goal is good information with backup and support, not "guys, I believe it's this way and we should all keep this mindset; don't believe anything anyone else says cuz I said it first".

    Anyways, thank you for at least backing down vs. spilling more into this and requiring further damage control from the more savvy personnel here.
     
  42. sniffin

    sniffin Notebook Evangelist

    Reputations:
    68
    Messages:
    429
    Likes Received:
    256
    Trophy Points:
    76
    Keep this trash out of here. He seemed perfectly calm to me, and anyone with a brain knows that these kinds of statements just serve to annoy people.

    RE the discussion, that 1700 stomps all over anything Intel has released in the notebook market, fast RAM or no.
     
  43. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    Book below
    I pointed out a weak GPU strong CPU system to you for a similar price.
    I made no errors I did not already correct with my last edit (which fair you did not see when quoting this).
    Who said anything about finding you an 8c/16t? Your 8c/16t is smoked by the 8700K recommendation I made, which by your own admission was already out by the time of your purchase, as you can see here, where the original post's last edit was on October 5th, with a working link to the product page on HIDevolution. This is an invalid point; as I said above, you could get a weak GPU and a strong CPU for a similar price.
    I mentioned specifically "from the right vendor" also in my 8700K statement, also invalid point.
    Please don't tell me that laptop has a decent anything in it. We both know it doesn't. I see you admitting that they could have done far better, but just because they're two rungs off the bottom where the rest of their notebooks are doesn't mean they're worth any defense. And before you say I am partial to any vendor of ANY kind, I am not. If they do well, they get praised and recommended. If they don't do well, they get picked apart. Most don't do well. When someone asks me a question, I pick the best thing for their use case that doesn't fail to perform in some way, whether it be overheating, artificial power throttling, or just poor, poor quality control/post-sale service.
    Would be nice to see if you could clarify this against the SWBF2 test I posted, then, if future drivers helped so much. That aside, FreeSync and mobile G-Sync are the same thing. Both are proprietary GPU-vendor software-based tech that manipulate VRR in DisplayPort screens, and BOTH ARE FREE. G-Sync costs nothing, all Nvidia mobile cards come with it, and the end user needs not pay for it. Straight from the horse's mouth, Nvidia told me this much. Anybody charging for it is doing so to pocket more for themselves. There IS a screen whitelist, but that is all, really. And it doesn't have a cost trickling down. I know this because when I wrote the article bashing Max-Q, Nvidia contacted me to correct my statement about the "price" of mobile G-Sync being attached to the cards... to tell me there was no price, so I could not make that statement, and that anybody selling "non-G-Sync" versions of G-Sync units is doing it themselves, i.e., putting in a vBIOS that avoids enabling G-Sync entirely.
    I said TGP, aka TBP, which is full graphics draw (core + memory/VRMs/etc) is 78W on a 1060N, and 80W on that ASUS RX 580. It's not 68. It's 80. By that logic the 1060N is 60W and your point is MORE invalid. The performance is what it is. Trying to find AMD's absolute best case scenario (DirectX 12 in an AMD-focused title) doesn't make the card better than it is.
    Mobile Nvidia has no voltage optimization either... you can undervolt them, though, by adjusting the voltage/frequency curve in MSI Afterburner, and get more out of the lower power limit. It is unlike Maxwell or Kepler or prior, where undervolting either was impossible or required a modded vBIOS. And in fact, mobile Nvidia cards are even WORSE at optimization than their desktop counterparts, because the voltage doesn't filter out at the level it needs to... once you approach the top 5 boost bins (1860, 1873, 1886, 1898 and 1911 MHz), notebook cards automatically pump full voltage into them, where desktop cards will often remain lower, like 1.025V, instead of jumping to 1.063V like notebook cards do. And the cards don't need that voltage AT ALL. Here, look at the voltage required for 4K Firestrike if I adjust the voltage curve a little bit on my cards:
    [image: voltage readout during 4K Firestrike with the adjusted curve]
    Pretty big drop from a flat 1.063V, ain't it? If I didn't adjust that curve, look at how much throttling I'd be having due to power limits (not temperatures, as you can CLEARLY see):
    [image: power-limit throttling readout]

    So like I said... if you want to make the point about undervolting for performance, you need to apply it to Nvidia too... and oh boy does undervolting ever grant quite a bit of performance there on mobile. So, this point is pretty invalid.
    Here it is.
    Listen. I'm autistic. I literally only see in logic. Logically, there is no reason to do what you did unless you have a grudge against other options. You don't set out to get something knowing worse performance and from, as I believe it, a vendor you should know almost never delivers, for more money than it's worth ($60 B350 boards anyone? $220 RX 580 4GB? $275 1700? Near $2000 for laptop? Hello?) and just... no, I have nothing further to add. There is just no logical reason for you to make that decision.
    Well now you know better about ASUS
    Now that I'll be waiting to see.
    Giving AMD a chance when products are comparable is understandable. Giving AMD a chance when their products are inferior, or are inferior on the chosen platform (in this case, in that notebook), while other superior products are available at a competitive price, I cannot understand at all. Which is the point I made above with the P750TM1s.
    I know Zen and Zen+ pretty well inside and out; I don't know about manufacturing processes that are limiting them.
     
    yrekabakery likes this.
  44. yrekabakery

    yrekabakery Notebook Virtuoso

    Reputations:
    1,470
    Messages:
    3,438
    Likes Received:
    3,688
    Trophy Points:
    331
    In what, Cinebench? Clearly you haven't seen some of the overclocked 8700K/8086K CB scores posted on Clevo DTR systems that beat the 1700 handily. Especially in this Asus, which, unlike the Clevos, doesn't have the VRMs and cooling to handle anywhere near a desktop-level OC on the 1700. And need I remind you about the E5-2697 v2 in the P570WM back in the day?
     
    jclausius and bennyg like this.
  45. Deks

    Deks Notebook Prophet

    Reputations:
    1,272
    Messages:
    5,201
    Likes Received:
    2,073
    Trophy Points:
    331
    While your assertion on DX12 vs. DX11 in SW BF2 is accurate... actual gaming performance is higher on the GL702ZC than what Notebookcheck got (what did they do, run the benchmark and round up the averages?).
    How quaint.
    If you check a GL702ZC Star Wars Battlefront II video on YouTube, the player in question managed to stay above 60 FPS most of the time:

    You should also know that Notebookcheck claims Mass Effect Andromeda gets less than 30 FPS at the 1080p High preset on the GL702ZC... whereas in reality the actual gaming performance is far higher, above 60 FPS when I played it.
    The game never stressed my RX 580 to the maximum, and temperatures were in the 75°C range.

    The game that DID manage to stress my GL702ZC's RX 580 to the maximum was Rise of the Tomb Raider... this was fairly evident from the temperature and fan noise, among other things... and the FPS was fairly high, above 60.

    So I'd take those Notebookcheck benchmarks with a pinch of salt, as I rather doubt their 'averages' reflect total gaming performance (especially since the built-in benchmarks of those games aren't usually representative of actual gameplay)... and they could possibly be affected by relatively bad scaling across 8 cores (whereas most of the 1060 laptops the GL702ZC was compared to had 4-core CPUs).
    The main consistency that emerged was that the laptop 1060 was about 10% faster than the RX 580 in the GL702ZC.

    But so what?
    TGP between the two GPUs may not be the same, but I also think people are complaining too much about it.
    The 1060 is more efficient, yes... and I also pointed out why that is the case.

    Besides, D2 keeps saying GL702ZC users (or rather, myself) 'wasted' their money.
    In comparison to what?
    Here in the UK, 8700K laptops with a 1060 and otherwise the same specs (SSD, HDD and RAM setup) are MORE expensive than the GL702ZC.

    Finally, 8700K laptops weren't exactly around here in Europe when the GL702ZC came onto the market.

    Besides, he keeps bringing up something that cannot be changed in the first place.
    I have to go through the RMA with Asus because I have no other choice. Getting a refund is not an option at this stage due to Asus protocols.

    So, pointing out every single thing people think is wrong with the GL702ZC isn't exactly productive.
    Plenty of similar shortcomings can be found in other Intel/NV laptops...
     
    Last edited: Aug 25, 2018
  46. ALLurGroceries

    ALLurGroceries  Vegan Vermin Super Moderator

    Reputations:
    15,730
    Messages:
    7,146
    Likes Received:
    2,343
    Trophy Points:
    331
    Some posts have been deleted by the mod team. One post was un-deleted by the mod team to provide context.

    Please keep in mind that English is not everyone's first language. English grammar can be a challenge even for those who have grown up speaking it natively. A tech forum such as this is not the place to be arguing about grammar. We have noted the posters who attempted to derail this conversation with their linguistic critiques and will be issuing infractions if this behavior continues in the future.
     
  47. hmscott

    hmscott Notebook Nobel Laureate

    Reputations:
    7,110
    Messages:
    20,384
    Likes Received:
    25,139
    Trophy Points:
    931
    Warning: they swapped the colors; green is AMD and red is Nvidia.

    GTX 1060 vs RX 580 - Post RTX Update

    UFD Tech
    Published on Aug 24, 2018
    Which GPU would you choose currently? Which one did you decide on previously? Do you think the landscape between the GTX 1060 6GB and the RX 580 has changed since the RTX 20-series was announced?
     
  48. chris89

    chris89 Notebook Consultant

    Reputations:
    45
    Messages:
    246
    Likes Received:
    13
    Trophy Points:
    31
    Has anyone tried running DDR4-3200 on these AMD Ryzen 2500U CPUs? I believe they could benefit greatly from more and faster RAM... I think it's bottlenecked by the RAM, because sometimes the graphics performance is phenomenal, and other times it's terrible when it runs out of RAM and taps into the pagefile.

    That's why I'm considering upgrading to 2x 16GB DDR4-3200 (and a 10TB M.2 SSD... teasing, haha lmao).
     
  49. yrekabakery

    yrekabakery Notebook Virtuoso

    Reputations:
    1,470
    Messages:
    3,438
    Likes Received:
    3,688
    Trophy Points:
    331
    You won't be running those modules at 3200MHz, I can guarantee you that.
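    One way to check what the modules are rated for versus what they actually train at, a sketch using Windows' built-in WMI tooling from PowerShell (the ConfiguredClockSpeed field needs Windows 10 or newer):

        # Rated SPD speed vs. the speed the memory is actually running at (MHz)
        Get-CimInstance Win32_PhysicalMemory | Select-Object PartNumber, Speed, ConfiguredClockSpeed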
     
  50. chris89

    chris89 Notebook Consultant

    Reputations:
    45
    Messages:
    246
    Likes Received:
    13
    Trophy Points:
    31
    Do you think anyone has tested faster modules on the AMD Ryzen 2500U or any of these Ryzen laptops? I guess no one has posted, or had the cash to buy the RAM?
     