The Notebook Review forums were hosted by TechTarget, which shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information that had been posted on the forums is preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.
Problems? See this thread at archive.org.

    Does High-end Gaming Make Sense Anymore?

    Discussion in 'Hardware Components and Aftermarket Upgrades' started by hmscott, May 5, 2019.

  1. hmscott

    hmscott Notebook Nobel Laureate

    Reputations:
    7,110
    Messages:
    20,384
    Likes Received:
    25,139
    Trophy Points:
    931
    tl;dr - overpriced PC hardware is attracting "hardcore gamer" cash cows who will pay exorbitant amounts of money for slim improvements in PC components, and this is ruining the gaming market - pricing normal gamers out of the market and chasing them to consoles.

    Does High-end Gaming Make Sense Anymore?
    The Good Old Gamer
    Published on May 4, 2019
    May the 4th Be With You!


    yurimodin 18 hours ago
    "I hear you......I have topped out for the next ~4 years: 8700nonK 144Hz gsync 1080p ultrawide 1080Ti. We need better games not hardware (waiting on you Cyberpunk2077)"

    Richard
    17 hours ago
    "A fool and his money are easily parted."

    Han Solar 17 hours ago
    "Never, PC = Freedom!!! Console have its place but my PC stays!"

    Nvidia vs AMD: Moving Past the "GPU War"
    The Good Old Gamer
    Published on Apr 27, 2019
    Deciding which GPU is right for YOU should have NOTHING to do with Brand!
    http://forum.notebookreview.com/thr...ga-polaris-gpus.799348/page-581#post-10905425

    Milking PC Gamers in 2019: Monitors are falling behind TV's in Tech and Price/Perf!
    Moore's Law Is Dead
    Published on May 1, 2019
    OLED TV's with INSTANT response times and 120Hz Native Refresh rates are starting to cost less than "gaming" monitors. We need to talk...


    Is pc gaming dying?
    not an apple fan
    Published on Apr 30, 2019


    Report: 20 Million PC Gamers Could Switch to Consoles by 2022
    Joel Hruska on April 29, 2019 at 9:18 am
    https://www.extremetech.com/gaming/...on-pc-gamers-could-switch-to-consoles-by-2022

    "The PC gaming industry was worth an estimated $33.4B in 2018, accounting for roughly 25 percent of all gaming spending. Consoles account for slightly more market share, at 28 percent, while mobile gaming’s $63.2B is estimated at 47 percent. But if a new report from Jon Peddie Research is correct, up to 20 million PC gamers could decamp for consoles over the next three years, mostly from the lower end of the market.

    The argument appears to be this: As Moore’s law has slowed and streaming services have ramped, the gaps between consoles and PCs have shrunk. At the same time, a plethora of new streaming services, including projects like Google Stadia, are offering more and more ways for consumers to engage with content.

    These services are primarily TV-focused and TV-centric, but with 4K HDR content ramping on both consoles and PCs, the image quality gaps are smaller than they’ve ever been. Meanwhile, the slowing rate of improvements in technology and the efficiency gap between PCs and consoles (games tend to be better-optimized on console due to the much smaller number of supported system configurations) has made it easier than ever for the PC market to bleed customers.

    JPR writes:
    " The majority will come from the low-end (under $1000 full build cost), but because of improvements in TV displays and console semiconductors, as well as console exclusive titles, the ranks of mid-range and high-end PC gamer populations are also affected."

    Is this likely? Maybe. But I’m not sure it practically matters. And at the same time, it’s going to matter for certain kinds of games.

    Taking the Long View
    One of the amusing things about the PC-versus-console debate in the Year of Our Lord 2019 is that PCs and consoles (not to mention the Xbox and PlayStation) have never been more similar to one another. This was not always the case. Compare, for example, the capabilities of the Sega Saturn and the Sony PlayStation, and you’ll find that the two had fundamentally different hardware, with the Saturn excelling at 2D graphics and the PlayStation performing far better with 3D polygons.

    If you go back farther in time, the differences get even larger. As someone who started gaming on the PC in 1987, I remember when my gaming platform of choice was literally incapable of playing the same sorts of games that one saw on the NES. Space Quest, King’s Quest, Wizardry, The Bard’s Tale, and Chuck Yeager’s Advanced Flight Simulator are very different games than Super Mario Bros., Mega Man, Metroid, or even The Legend of Zelda.

    Even if you compare game series that existed on both PC and console, like Ultima, they were so different as to constitute entirely different games. There was virtually no overlap between the two platforms in actual gameplay or content.

    Today, the overlaps are substantial. Even the specific differences that constitute major reasons why I prefer the PC, like modding support, have become more common on consoles than they used to be. This is not to say that PC gamers are willing to leap on consoles, or vice-versa, but that we’ve seen a substantial blurring of the lines over the past 30 years over what, exactly, distinguishes PC gaming from console gaming. So yes. Objectively, it’s possible that we’ll see a major shift from PCs over to streaming devices and televisions, particularly at the lower end of the market.

    But overemphasizing this feels like it would miss the larger point. Increasingly, gaming is something you can access on multiple devices sequentially, as you move through life. Microsoft will stream games to an Xbox from a PC or vice-versa. Game streaming, both from a service or across a local network, is an increasingly common feature. It’s not entirely clear to me that this represents a shift away from PCs specifically so much as a new trend in content consumption.

    And of course, even if this trend proves true, there are games that are never going to move away from PC. The experience of gaming on a PC remains tethered to the mouse and keyboard, and while plenty of games can translate between KBM and controller, again, many can’t. There’s a reason why series like Civilization don’t really head for consoles. Players of these titles don’t have much to worry about.

    Ultimately, I’m not convinced that PC players are going to dump PCs and shift over to TV-focused gaming. But even if they do, I’m also not convinced it would represent a true market transformation so much as a willingness to game on multiple devices at the same time. And frankly, the more PCs and consoles continue to evolve towards each other, the less this distinction matters in the first place. From a hardware perspective, the Xbox One and PS4 are both PCs with custom SoCs, particularly in the Xbox One’s case, given that it runs a variant of Windows.

    In fact, that might be the funniest thing about the way we slice up the console market versus the PC space. If gaming consoles were being invented today, we wouldn’t talk about them as separate, specialized hardware with their own history and heritage, but as cost-optimized, mass-market, PC-based game systems with custom software and a few bits of specialized hardware. From that perspective, shifts in the gaming market towards specialized streaming services that use PC server hardware to deliver PC games to your home would scarcely be shifts at all."
     
    Last edited: May 5, 2019
    Papusan, Starlight5 and jaybee83 like this.
  2. heretofore

    heretofore Notebook Consultant

    Reputations:
    10
    Messages:
    109
    Likes Received:
    37
    Trophy Points:
    41
    10+ years ago, computer games were fun to play on 4:3 displays with low-quality textures.
    Today, almost all review sites test games at ultra settings, as if that's all that matters.
    Currently, I am playing Enderal (a total conversion mod for Skyrim) at 2560x1080 monitor resolution
    with a laptop GTX 1050 with 2 GB of VRAM.

    All draw distances at max. Textures at highest setting. Reflections off. Anti-aliasing off and AF = 2x
    There is a little CPU bottlenecking in cities, but the GTX 1050 can put out 60 fps, and 2 GB of VRAM is more than enough.
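For anyone wondering how 2 GB can be enough at 2560x1080: a back-of-envelope sketch (illustrative Python; the render-target count is a hypothetical, and real usage is dominated by textures, geometry, and driver overhead) shows that the raw framebuffers themselves are a tiny fraction of VRAM:

```python
# Back-of-envelope VRAM math - illustrative only; textures and
# geometry, not framebuffers, are what actually fill VRAM.

def framebuffer_mb(width, height, bytes_per_pixel=4):
    """Size of a single RGBA8 render target in MiB."""
    return width * height * bytes_per_pixel / 2**20

# Hypothetical setup: color buffer + depth buffer + one
# intermediate post-processing target at 2560x1080.
targets = 3 * framebuffer_mb(2560, 1080)
print(f"{targets:.1f} MiB")  # roughly 31.6 MiB of a 2048 MiB card
```

So even tripling the render targets for post-processing, the framebuffers cost on the order of tens of MiB; dialing texture quality (not resolution) is what decides whether 2 GB suffices.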

    Do these graphics (without AA) look bad to you? They look good to me.
    enderal1.jpg enderal2.jpg enderal3.jpg
     
  3. hmscott

    hmscott Notebook Nobel Laureate

    Reputations:
    7,110
    Messages:
    20,384
    Likes Received:
    25,139
    Trophy Points:
    931
    PC Gimmicks vs Features... What do you REALLY need?
    JayzTwoCents
    Published on May 9, 2019
    It's getting hard to sort through all the gimmicks in the PC industry... so today I point out some obvious ones that you shouldn't fall for... Feel we missed some? List what you feel are gimmicks in the description below.
     
    Starlight5 likes this.
  4. Zymphad

    Zymphad Zymphad

    Reputations:
    2,321
    Messages:
    4,165
    Likes Received:
    355
    Trophy Points:
    151
    Never mind current software not utilizing current hardware capability - its performance is held back by it. I think we should be seeing better performance. DX12 has not fulfilled its promises for low-level access to hardware on PC. AAA games like The Division 2 have such a useless implementation of DX12 that it provides zero improvement over DX11.

    I truly believe that current hardware performance is severely held back by terrible software on PC.
     
    hmscott likes this.
  5. AlexusR

    AlexusR Guest

    Reputations:
    0
    Are you serious or you're just trolling?

    https://www.pcgamesn.com/the-division-2-pc-performance-analysis-nvidia-amd-gpu-benchmarks

    [benchmark charts from the linked article]

    Yes. 0 improvements over DX11. /sarcasm

    Here's another 0 improvement over DX11:
    https://www.reddit.com/r/nvidia/comments/azz7dm/the_division_2_dx12_vs_dx11_benchmarks_on_gtx/


    What's next, you gonna argue that these numbers are "0 improvement" in your opinion? Or that you personally "have not seen improvement" and that people should just believe you instead of other data from different sources which does show an improvement?
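For what it's worth, when reading DX11-vs-DX12 comparisons like the ones linked above, percentage uplift and frame-time savings are the useful metrics, not raw FPS deltas. A small sketch (the numbers here are made up for illustration, not taken from either article):

```python
def uplift_pct(fps_old, fps_new):
    """Relative FPS improvement, in percent."""
    return (fps_new - fps_old) / fps_old * 100

def frame_time_ms(fps):
    """Average time spent per frame, in milliseconds."""
    return 1000 / fps

# Hypothetical DX11 vs DX12 averages for illustration only:
dx11, dx12 = 70, 84
print(f"{uplift_pct(dx11, dx12):.0f}% uplift")
print(f"{frame_time_ms(dx11) - frame_time_ms(dx12):.2f} ms saved per frame")
```

The frame-time view also explains why the same percentage uplift matters more at low FPS than at high FPS: the milliseconds saved per frame shrink as the baseline rises.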
     
  6. Papusan

    Papusan Jokebook's Sucks! Dont waste your $$$ on Filthy

    Reputations:
    42,706
    Messages:
    29,840
    Likes Received:
    59,619
    Trophy Points:
    931
    DX12 is more helpful for AMD.
     
  7. AlexusR

    AlexusR Guest

    Reputations:
    0
    Yes, because Ubisoft has partnered with AMD for this game. But this is not the point. The point is DX12 does have performance improvement over DX11 for this game, regardless of hardware brand.
     
  8. Zymphad

    Zymphad Zymphad

    Reputations:
    2,321
    Messages:
    4,165
    Likes Received:
    355
    Trophy Points:
    151
    Yes I am serious. Do you actually play the game? DX12 makes zero difference. Talked about it with my clan; none of us have seen any improvement.

    Actually the majority of us turned it off because it is horrifically buggy, causing stuttering and lag. DX12 in Div 2 is absolute garbage.

    I really don't care about your link. This is my experience and everyone I play the game with.

    Also not going to argue with you about it, since I know what I have experienced, and so does everyone I play with. Literally nothing you could say could sway what I have experienced so far with the rubbish DX12 in Div 2 - it is so bad it is unplayable. We all use a 2070, 2080, or 1080 Ti. We are all on what I consider the higher spectrum of PC gaming.
     
    Last edited: May 10, 2019
    hmscott and Starlight5 like this.
  9. Starlight5

    Starlight5 Yes, I'm a cat. What else is there to say, really?

    Reputations:
    826
    Messages:
    3,230
    Likes Received:
    1,643
    Trophy Points:
    231
    I personally don't care much about graphics settings, and set them all to low for most games, including older titles. In competitive gaming, flashy effects are often distracting, or may even obscure enemies (e.g. Overwatch, until Blizzard patched it); FPS and resolution are very important, though. As for single-player games - I often play them on battery, so setting everything to low(est) and limiting FPS to 30 when possible is my solution for reducing heat and battery use.
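In practice a cap like this is set in the driver or in a tool such as RTSS, but the underlying idea reduces to sleeping out the unused portion of each frame's time budget, which is what lets the GPU and CPU idle and saves battery. A minimal, illustrative Python sketch (render_frame here is a hypothetical stand-in for real game work):

```python
import time

def run_capped(render_frame, fps_cap=30, frames=6):
    """Run frames under an FPS cap by sleeping out the leftover
    frame budget; idling instead of rendering saves power/heat."""
    budget = 1.0 / fps_cap          # e.g. ~33.3 ms per frame at 30 FPS
    start = time.perf_counter()
    for _ in range(frames):
        frame_start = time.perf_counter()
        render_frame()
        elapsed = time.perf_counter() - frame_start
        if elapsed < budget:
            time.sleep(budget - elapsed)   # sleep, don't busy-spin
    return time.perf_counter() - start

# With near-instant "rendering", 6 frames at a 30 FPS cap should
# take roughly 0.2 s of wall time - almost all of it spent idle.
total = run_capped(lambda: None)
```

Real limiters use higher-resolution timers and hybrid spin-waits for better frame pacing, but the power-saving mechanism is the same.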

    p.s. I'm sure there are lots of people who like their competitive games flashy, and in single-player titles gamers generally tend to care about high graphics settings even more. I'm not trying to impose my opinion on others, merely expressing it
     
    hmscott likes this.
  10. AlexusR

    AlexusR Guest

    Reputations:
    0
    Nothing is wrong with personal preferences. Personally I like it when developers aim for maximum graphical fidelity, in either single-player games or MMOs; it makes me feel more immersed (especially when interacting with other player characters in an MMO outside of PvP or PvE loot grinding). I wish progress were even faster, so we would get fully immersive worlds like in Ready Player One sooner, where you can be whatever you wish to be, in maximum graphical fidelity, regardless of whether you are a fat ugly nerd or a person of a specific gender or ethnicity. Unfortunately we are still held up by the hardware companies, such as AMD, who still can't provide proper competition to Nvidia (which in turn slows down Nvidia's own research). Too bad all we have is those two, and not more companies like Matrox, 3dfx and the others who used to make 3D cards for games.

    However I do understand that many other people do not care for that, especially for competitive FPS games. And some cannot afford to buy hardware powerful enough. Different people have different preferences/needs ;-)
     
    Last edited by a moderator: May 10, 2019
    Starlight5 likes this.
  11. Papusan

    Papusan Jokebook's Sucks! Dont waste your $$$ on Filthy

    Reputations:
    42,706
    Messages:
    29,840
    Likes Received:
    59,619
    Trophy Points:
    931
  12. AlexusR

    AlexusR Guest

    Reputations:
    0
    Yeap, "over 20 fps boost", this is with Nvidia card. Really shows the power of DX12 when implemented properly. I wish other games would start using this, games like MMOs should really benefit from that since they can easily become CPU-bound in large battles or large cities.
     
  13. custom90gt

    custom90gt Doc Mod Super Moderator

    Reputations:
    7,907
    Messages:
    3,862
    Likes Received:
    4,808
    Trophy Points:
    331
    I've removed a few posts from this thread because they were getting personal. It's great to have discourse, but when it turns to personally attacking the person you disagree with, then it has to end. Just like all things in life, people have different experiences and opinions. This is not a place to change someone else's mind to conform to your narrative (I'm speaking to BOTH sides here).

    You are free to ignore other users (it works great), but there's no need to post why you are ignoring XYZ...
     
  14. Dannemand

    Dannemand Decidedly Moderate Super Moderator

    Reputations:
    11,330
    Messages:
    4,414
    Likes Received:
    2,161
    Trophy Points:
    231
    Once or twice in the lifetime of a person, they may find themselves to be completely vindicated and proven right about something for which they have been fighting long and hard, and everybody else comes around saying "You were so right about that thing back then, and we were all so wrong. We're glad you pointed us in the right direction, we now know better, and we're all better off for it."

    I am serious, it really DOES happen. Occasionally. I know from experience.

    But if you think about it, you'll probably agree it's not very often. For one thing, it's not a black and white universe, where one side is right in the absolute, and the other is wrong. But more importantly, humans are wired to protect themselves against emotional losses, so admitting wrongness is uncommon for most people (though not all).

    Much energy and suffering has been spent over the centuries pursuing such debates of right and wrong in the areas of science, philosophy, religion, politics or human rights. And some of these fights were necessary and worthy. But they all came at great costs -- and most of them are never fully settled.

    So choose your battles wisely: By all means do invest your energy and emotions in causes and endeavors in life that are necessary and worthy, and which can justify the practical and emotional costs they may inflict on yourself and on others.

    When it comes to other issues in life, such as the most ideal graphics settings in a computer game, probably best to simply state your experiences, listen to the experiences of others, and just move on if you don't agree.

    NBR is meant to be a place for such civil exchanges. They often result in some consensus and valuable information being shared, which are wonderful things. Of course we are not a place for the emotional venom.

    And do not expect moderators to be judges deciding that one viewpoint is right, and ban the others as heretics and trolls.

    Meantime, I see that @custom90gt managed to capture all this in far fewer words. So in this, he is more right :)
     
  15. Papusan

    Papusan Jokebook's Sucks! Dont waste your $$$ on Filthy

    Reputations:
    42,706
    Messages:
    29,840
    Likes Received:
    59,619
    Trophy Points:
    931
    hmscott likes this.
  16. Zymphad

    Zymphad Zymphad

    Reputations:
    2,321
    Messages:
    4,165
    Likes Received:
    355
    Trophy Points:
    151
    I look forward to seeing if Crytek uses this technology for something great. I have zero interest in The Hunt. I want another scifi shooter from Crytek :D

    A Vulkan Crytek game. Yeah drool.
     
    hmscott likes this.
  17. custom90gt

    custom90gt Doc Mod Super Moderator

    Reputations:
    7,907
    Messages:
    3,862
    Likes Received:
    4,808
    Trophy Points:
    331
    You're just much more eloquent and poetic. I'm a pretty boring person, lol.
     
  18. Dannemand

    Dannemand Decidedly Moderate Super Moderator

    Reputations:
    11,330
    Messages:
    4,414
    Likes Received:
    2,161
    Trophy Points:
    231
    No, that's not it at all.

    The truth is most of us have issues where we hopelessly try to convince others. In my case, I hopelessly try to convince others to avoid charging themselves up in the pursuit of hopelessly trying to convince others.

    And now you can rightly slap me for continuing to derail the thread with this hopeless pursuit of mine :D
     
  19. hmscott

    hmscott Notebook Nobel Laureate

    Reputations:
    7,110
    Messages:
    20,384
    Likes Received:
    25,139
    Trophy Points:
    931
    Yes, please clean out the thread of the mod musings, as it's getting in the way of our never ending pursuit of "On topic" convincing. ;)
     
    custom90gt and Dannemand like this.
  20. hmscott

    hmscott Notebook Nobel Laureate

    Reputations:
    7,110
    Messages:
    20,384
    Likes Received:
    25,139
    Trophy Points:
    931
    If the title of the article didn't contain a large enough hint, " But There Is A Catch....", the article goes on to recommend switching back to DX11, as DX12 causes performance issues:

    " If you are facing performance issues like stuttering and low FPS, then check out our The Division 2 stuttering and perforance fix."

    Following that link the site offers solutions, including the best / ultimate solution, switch back to DX11:

    " For Nvidia GPU owners, open Nvidia Control Panel and select “Manage 3D Settings” and then select “Program Settings”. Select the Division 2 from the list and set “Power Management Mode” to “Prefer Maximum Performance” and set “Texture Filtering- Quality” to “High Performance”. This will eliminate stuttering.

    If this doesn’t resolve the stuttering then set Vertical Sync to “Fast”. This will not only eliminate stuttering but, will potentially give you an FPS boost if you have the GPU and the CPU to handle it.

    Also, switch DirectX to 11 as DirectX 12 causes the performance to drop. This might resolve with a future patch but, for now, disable it.
    "

    DX12 is unstable in many games, provides no advantage in many others, and in the rare circumstance that DX12 outperforms DX11, it's an anomaly, not a reason to celebrate.

    Besides, if you try to run DX12 you might be further tempted to enable DXR or DLSS, and that would be a real disappointment, why risk the performance and visual fidelity loss? :D

    Instead, try running Vulkan, on an AMD GPU; try it you'll like it... o_O
    The best part of the article shows that support for Vulkan is key for future uptake across OSes and vendors.

    "Crytek has recently presented a hardware independent solution for ray tracing in its own Cryengine game engine. It is now announced that the next version of the engine will receive support for Vulkan and DirectX 12.

    Many were surprised when Crytek presented the graphics demo Neon Noir. It was remarkable that the graphics were rendered with ray tracing on a Radeon RX Vega 56, which lacks dedicated hardware for the function...

    Now Crytek shows the product plan for the Cryengine game engine, where support for ray tracing is to be added in the upcoming version 5.7. In addition to improved support for rendering virtual reality and the VR headset Oculus Quest, the new version should also receive support for the graphics interfaces DirectX 12 and Vulkan.

    With support for Vulkan and DirectX 12, it also follows that Cryengine 5.7 gains support for asynchronous compute, the technique for executing calls to the graphics chip in parallel that is part of AMD's GCN architecture. Before this, Crytek will release version 5.6 of the engine, which optimizes it for systems with many processor cores. This will be further enhanced when version 5.7 adds support for said interfaces.

    An attractive addition to the planned version 5.8 is support for paired graphics cards in DirectX 12, so-called multi-GPU. If this can be combined with the support for ray tracing, it has the potential to re-create arguments for pairing graphics cards, something game developers have deprioritized in recent years. Cryengine 5.7 will be made available to developers in the spring of 2020.

    Read more about Cryengine:
    NEON NOIR: Real-Time Ray Traced Reflections - Achieved With CRYENGINE
    CRYENGINE
    Published on Mar 15, 2019
    Technology Reveal: Real-Time Ray Traced Reflections achieved with CRYENGINE.

    All scenes are rendered in real-time in-editor on an AMD Vega 56 GPU. Reflections are achieved with the new experimental ray tracing feature in CRYENGINE 5 - no SSR.

    Neon Noir was developed on a bespoke version of CRYENGINE 5.5., and the experimental ray tracing feature based on CRYENGINE’s Total Illumination used to create the demo is both API and hardware agnostic, enabling ray tracing to run on most mainstream, contemporary AMD and NVIDIA GPUs. However, the future integration of this new CRYENGINE technology will be optimized to benefit from performance enhancements delivered by the latest generation of graphics cards and supported APIs like Vulkan and DX12.
     
    Last edited: May 12, 2019
  21. hmscott

    hmscott Notebook Nobel Laureate

    Reputations:
    7,110
    Messages:
    20,384
    Likes Received:
    25,139
    Trophy Points:
    931
    Will AMD take the lead in innovation, saving us all from the nonsensical high-end $$$$ "dead-end" of overpriced, over-promised, yet under-delivering high-performance hardware for "gaming"?

    AMD Leading the Way With Innovation!
    The Good Old Gamer
    Published on May 12, 2019
    Moore's Law is Dead, and AdoredTV, have dropped some MAJOR news on AMD's NEXT GEN CPU Technology with Zen 3!
    APU Future: The Fall of Nvidia - https://youtu.be/gvIPXdtkGYc
    Moore's Law is Dead - Zen 3 Video: https://youtu.be/il5Zl3vGvVw
    AdoredTV - Zen 3/Milan Video: https://youtu.be/3IPlhjv8NiE
    Coreteks - Going Beyond Moore's Law: https://youtu.be/XW_h4KFr9js
    theGoodOldGamer - Live Stream: https://youtu.be/W6K9raAirtU


    Lots of great comments...here's a few:

    8est 8its
    2 hours ago
    "Amd has single-handedly made the pc market interesting again. Intel has got a good kick in the butt and now seems to want to innovate, but it does look like it's going to be too little and too late for them.. assuming they don't get any more delays. We now know why nvidia rushed their rtx launch.. Imagine releasing their cards after vega 56 showing how ray tracing can be done via software only.. What a joke.. Absolutely hate nvidia and their bull. Just need amd to build their war chest.. once they have enough money I think they can do a tonne of goodness with graphics and really boost RTG."

    Naked Hardware 1 hour ago
    "For such a small company AMD are one of the most innovative companies in the world. Yep Intel and nVidia also produce plenty of good products but they also do a ton of shady stuff and (particularly Intel) stagnate tech when they're on top. AMD being competitive is good for tech. Thanks to AMD we're going to see massive leaps in desktop CPU performance, and all because of competition."

    Vignesh Balasubramaniam 2 hours ago
    FYI, the OpenCL and Vulkan projects have merged. Vulkan is now for both graphics rendering and compute. As before with OpenCL, it is an open-source standard.
    Another thing to note with the pervasiveness of CUDA is that NVIDIA makes sure to provide deep discounts and sometimes free GPUs to universities doing research. Because of this, universities use CUDA, because NVIDIA gimps OpenCL performance on their GPUs. At my time at UNC, a good amount of the research has been to reverse engineer the underpinnings of CUDA for the benefit of other researchers and publishing the findings, so that researchers have a better idea of what the GPU is actually doing at any given time. Right now, CUDA is a black box that is a pain to work with, and researchers only use it because they have already gotten deeply invested in it. That's the biggest barrier for AMD, is the massive amount of code that has already been made for CUDA. I mean, even open-source projects like OpenCV (an open source computer vision library) made CUDA the GPU compute solution of choice, even though OpenCL already existed! NVIDIA has that much sway in the market, that a massive open-source project would ignore a fellow open-source project, and instead back a proprietary one as the officially supported solution. That is slowly changing with AMD being amazing in the Linux and open-source community and even putting out open, vendor neutral versions of the popular APIs with ROCm (Radeon Open Compute). There are ROCm versions of Tensorflow, etc.

    Mopantsu 1 hour ago
    "AMD is such a good value right now. Prices on their CPU's and GPU's are falling in price fast before launch. 1700 for crazy low price."

    SkyForce6700 2 hours ago
    "I do not think the scenario with Nvidia being left to cater for the high end gamer's are very likely. Remember Nvidia is a shareholders company. If they see no future growth of the company they will abandon ship and it will sink. Meaning no more Nvidia going forward. In that scenario what has happened to the computer gaming world is essentially a flattened landscape. All gamer's will be playing on similar level hardware similar to what we already see amongst mobile gamer's and the ultra high end gamer will be a relic of the past. This is a scenario that is quite possible."

    Raging Monk 2 hours ago
    "I am not super excited about anything but I watched just the same. lol"
     
    Last edited: May 12, 2019
  22. cooldex

    cooldex Notebook Consultant

    Reputations:
    26
    Messages:
    114
    Likes Received:
    35
    Trophy Points:
    41
    High-end gaming is only for people who can afford it, and mostly for bragging rights (like high-horsepower cars); at some point it gets to "why do you need all that?". But I think every one of us would like to experience it and build a balls-to-the-wall supercomputer, with like three 8K monitors and quad GTX 690s in SLI (lol, I don't even think new games support SLI anymore) and one of those dual Xeon motherboards, water-cooled, with 128 GB of RAM in octa-channel and 100 TB of storage. Aww man, I'm starting to drool. Rendering times for anything (I made) would be nonexistent.
     
    Aroc and hmscott like this.