Testing AMD Ryzen and Intel Kaby Lake For Business Use | PCMag.com
tl;dr: conclusion
Once a second-generation Ryzen is past 4GHz it will compete head-to-head with Intel; clock speed still means a lot. On top of that, once optimizations are in place (BIOS updates and OS updates are being worked on at the moment), these chips will be in a sweet spot that AMD will have to leverage to its full potential.
The question is whether AMD knows they have created a jewel or not.
-
A lot of work went into these videos, but the RX 580/570 are just refreshes of the RX 480/470. Depending on the AIB card's overclock, expect maybe a 5-11% real-world difference in games between a top-OC RX 580 and a stock RX 480 (which had little OC headroom anyway), more or less depending on the game. The refresh does put the RX 580 in better competition with the GTX 1060, and there is a $10 drop in MSRP. There is a big bump in power usage, though, and the top-overclocking cards are really thick, around 2.25 slots wide. AMD also added "Chill," an FPS limiter that cuts unneeded frames above the monitor's refresh rate, and that's a big deal: it reduces power, heat, and noise, and extends component life. The benchmarks have "Chill" disabled.
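The principle behind an FPS limiter like Chill is simple frame pacing: sleep away the remainder of each frame's time budget so the GPU never renders frames the monitor can't display. A minimal sketch of the idea (the function name and 60 FPS default are illustrative, not AMD's implementation, which also adapts the cap to on-screen motion):

```python
import time

def frame_limited_loop(render_frame, target_fps=60, frames=120):
    """Render `frames` frames, sleeping away leftover time so we
    never exceed target_fps. Returns the measured average FPS."""
    budget = 1.0 / target_fps              # time allowed per frame
    start = time.perf_counter()
    for _ in range(frames):
        frame_start = time.perf_counter()
        render_frame()                     # the actual GPU/CPU work
        elapsed = time.perf_counter() - frame_start
        if elapsed < budget:
            # Idle instead of rendering extra frames: less power,
            # less heat, less noise - the same trade Chill makes.
            time.sleep(budget - elapsed)
    total = time.perf_counter() - start
    return frames / total
```

The sleep does nothing when a frame already took its full budget, so the limiter only ever removes surplus frames; it cannot slow down a game that is already below the cap.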
-
http://www.tweaktown.com/news/57177/nvidia-launch-next-gen-volta-based-geforce-q3-2017/index.html
Sent from my SM-G900P using Tapatalk -
-
Also, this time they are pushing up the price, meaning they were bleeding margin on the other cards at the same time as being scared. I just don't know if I'll wait another quarter to see if Vega kills the Ti. Nvidia's recent actions on drivers and laptops make me feel like, if the performance is there, Vega is worth buying just to thumb my nose at Nvidia.
Edit: also, with the schedule moving up, this might not get the 12nm die shrink.
-
It's a tough time to buy something, that's for sure; new stuff is coming around the corner that is likely significantly faster, cheaper, and more power efficient. -
"The all new RX480 Vbios..."
Starts at 28:20...
The Radeon RX 580 8GB Review - Polaris Populism
-
"Intel will unveil its Basin Falls platform, i.e. Skylake-X, Kaby Lake-X processors and X299 chipset, at Computex 2017 in Taipei during May 30-June 3 two months earlier than originally scheduled, and will bring forward the launch of Coffee Lake microarchitecture based on a 14nm process node from January 2018 originally to August 2017, to cope with increasing competition from AMD's Ryzen 7 and Ryzen 5 processors, according to Taiwan-based PC vendors."
Coffee Lake: cheaper 6-core possibly already in August - Notebookcheck.com
Translated from German
-
Does Ryzen Really Need Fast Memory? Guide for Gamers
Does RAM speed REALLY matter?
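Part of why memory speed matters more on Ryzen than on Intel is that the Infinity Fabric interconnect runs at the memory clock, but the raw bandwidth numbers are also easy to work out: DDR4 moves 8 bytes per channel per transfer, so peak bandwidth = MT/s x 8 B x channels. A quick back-of-the-envelope sketch (theoretical peaks only; real-world figures land lower):

```python
def ddr4_bandwidth_gbs(mts, channels=2, bus_bytes=8):
    """Peak theoretical DDR4 bandwidth in GB/s:
    transfers/s * bytes per transfer * number of channels."""
    return mts * 1e6 * bus_bytes * channels / 1e9

for speed in (2133, 2666, 3200):
    # DDR4-3200 in dual channel works out to 51.2 GB/s peak
    print(f"DDR4-{speed}: {ddr4_bandwidth_gbs(speed):.1f} GB/s")
```

So stepping from DDR4-2133 to DDR4-3200 is a ~50% jump in theoretical bandwidth, which is why the reviews above bother testing it at all.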
-
tilleroftheearth Wisdom listens quietly...
I don't see/hear any reaction to Intel in that link at that time spot...
-
It is still a rumor, but if this is true, Intel is willing to kill Kaby Lake early to keep its higher margins while competing with Ryzen.
Meanwhile, AMD's earnings will only include March's numbers, and CNBC says they will come May 1. Considering both Intel and Nvidia moved up releases, if the rumors are true, it means they don't exactly plan this war to be all about pricing.
-
Gigabyte RX 570 4GB Aorus Review: Power, Thermals, FPS
Powercolor Red Devil RX 570 Review & Benchmarks (Overclocked)
Sapphire Nitro+ Radeon RX 570 4GB Review | Good Enough To Be Called A New Line
PowerColor Red Devil Golden Sample RX 580 Review
-
http://www.guru3d.com/files-details/amd-chipset-drivers-download.html
So, updated chipset drivers for those that have AM4 boards. Let us know how it works! -
Sorry if this was posted already but it really does look like Intel is pushing for an early release. A 12c processor...
http://www.guru3d.com/news-story/in...9-chipset-announced-at-computex-may-30th.html
"The X299 chipset will be compatible with Skylake-X and Kaby Lake-X Intel Core X series processors. Skylake-X and Kaby Lake-X, will include 12, 10, 8, 6 and 4 core products. That 12-core version is a new one and seems to be a bit of a panic reaction from Intel towards AMD who might be releasing 12 and 16 core parts."
AMD, time to unleash yours ahead of time. -
-
http://m.hexus.net/tech/news/graphics/104953-rx-vega-launch-within-next-two-months-says-amd/
Edit: http://m.hexus.net/tech/news/graphics/104950-amd-launches-polaris-powered-radeon-pro-duo/
-
More articles on Vega release, including articles about driver development, Vega used with Adobe creative products, etc.:
https://wccftech.com/amd-confirms-rx-vega-launching-quarter-showcases-running-8k-video-editing/
http://www.networkworld.com/article...gas-ability-to-handle-8k-graphics-at-nab.html
http://wccftech.com/amd-teases-vega-prey-preview-gpu-launching-2017/ (from February, but telling they are trying to have a competent day 1 driver)
https://lists.freedesktop.org/archives/mesa-dev/2017-April/152733.html (OpenGL support for Vega) -
tilleroftheearth Wisdom listens quietly...
Thought I would give these links a quick read... to see the demos and other 'showcases' of Vega running 8K video (which one link below indicates is still a few years away).
I didn't see any demos at all. More puffing from the AMD camp?
Or, did I miss anything?
-
-
Id Software praises AMD Ryzen and promises optimizations for the next game engine - sweclockers.com (use Google Translate)
"This means that players will eventually be on more powerful processors than today, something that allows Id Software to put more effort into parallelization. Just this is highlighted for the next major version of Idtech, which will benefit from all the performance available in the multi-core Ryzen processors."
"Ryzen has a super attractive price point, so we’re going to see a lot more capable CPUs in a lot more people’s hands over time and the additional threads and cores allow us to turn up a lot of things, like better frame rate, more AI, more things happening in the game space, more simulations running, more realistic worlds."
"We’re working on the next generation of idTech right now and we’re definitely going to fully optimize for Ryzen. The new engine technology that we’re working now is far more parallel than idTech 6 was; we plan to really consume all the CPU that Ryzen can offer."
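The kind of parallelization being described, fanning per-frame work (AI, simulation, more things happening in the game space) out across however many cores the CPU offers, looks roughly like this in any job-system-style engine. A toy sketch, not idTech code; the entity "work" here is a trivial placeholder:

```python
from concurrent.futures import ThreadPoolExecutor
import os

def simulate_entity(entity_id):
    """Stand-in for per-entity AI/physics work."""
    return entity_id * 2  # trivial placeholder computation

def run_frame(entities, workers=None):
    """Fan per-entity jobs out across available cores, then join
    before the frame ends. More cores => more entities per frame
    at the same frame time - the scaling id is describing."""
    workers = workers or os.cpu_count()
    with ThreadPoolExecutor(max_workers=workers) as pool:
        # map preserves input order, so results line up with entities
        return list(pool.map(simulate_entity, entities))

results = run_frame(range(8))
print(results)  # [0, 2, 4, 6, 8, 10, 12, 14]
```

A real engine would use a native job system rather than Python threads, but the structure is the same: the more hardware threads available, the wider each frame's work can be split.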
And all those who hoped for a long lifespan and opted for the fully locked-down 4-core i7-6700HQ or 7700HQ... Will they feel a little cheated? Or shall we say screwed?
I haven't forgotten all those who paid $4,000+ for machines with the 6700HQ. Not fun if you bought similar machines expecting a 3-5 year lifespan.
-
I know this is off topic, but the Radeon Pro WX 7100 used in those links is roughly equivalent to two RX 580s in CrossFire. I think Vega will be fast, but for the consumer version, unless you CrossFire two of them, 8K may be a stretch, not just for video but especially for gaming.
-
I definitely agree the consumer version would have to be in CrossFire. Considering 8K is 4x 4K, and the 1080 or Ti only plays Wildlands at a 40 FPS average at 4K ultra, you would need something like four cards from Nvidia, and even that wouldn't be good enough given SLI scaling. The best you could hope for is that the Vega consumer flagship meets or slightly exceeds the Ti, and that, with the Pro Duo driving better CrossFire scaling, you could then run multiple GPUs to hit the mark. Considering the 500-series performance, the added CrossFire development, the rumors showing an underclocked Vega (at around 75% of its expected final clocks) matching a 1080, and the expected TFLOP performance of Vega exceeding the Pro Duo, you are still correct that CrossFire is the only way to get there in gaming (and it may scale better than SLI in properly supported titles, especially as Nvidia has tried to kill SLI support).
Just wanted to mention that in some titles, you would likely still need more than two consumer cards to hit that mark.
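The pixel arithmetic behind those estimates: 8K (7680x4320) is exactly four times the pixels of 4K (3840x2160), so a card averaging 40 FPS at 4K lands near 10 FPS at 8K as a crude first-order estimate, before multi-GPU scaling losses. A sketch of that reasoning (assuming FPS scales linearly with pixel count, which real workloads only approximate):

```python
def estimated_fps(base_fps, base_res, target_res):
    """Naive estimate: FPS scales inversely with pixel count."""
    base_px = base_res[0] * base_res[1]
    target_px = target_res[0] * target_res[1]
    return base_fps * base_px / target_px

# The Wildlands figure cited above: ~40 FPS at 4K ultra
fps_8k = estimated_fps(40, (3840, 2160), (7680, 4320))
print(fps_8k)  # 10.0 - one quarter of the 4K figure
```

Which is why even perfect four-way scaling would only just recover the original 40 FPS, and imperfect SLI/CrossFire scaling pushes the required card count higher still.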
-
It is absurd that good optimization is so heavily overlooked; it makes my eyes water. -
tilleroftheearth Wisdom listens quietly...
Rendering and my 'running' were meant to be the same thing. I still don't see it in those links, though.
The content creation I do is with professional still cameras, not video. Editing those images doesn't even need a discrete GPU...
Do the Pro Duos accelerate PS and LR? Then I might be interested (a little).
Thanks.
-
I do agree it is roughly the same thing, but rendering can include applying color changes, edits, etc., into the work, so it can take longer than the film's runtime depending on the frames-per-second rendering speed. I linked reports on what was shown, as I have not found footage of the event yet and was not personally present. I do apologize for using the word "show" when it should have been "tell." Lol.
So, I'd wait to see if it does provide a benefit to those two programs, but it may be something that interests you if it does. Of course, Premiere makes a better showcase because of the longer nature of much of the encoding process, but I'd almost be willing to bet it applies to the entire Creative Suite.
-
tilleroftheearth Wisdom listens quietly...
Thanks, I thought I was going blind trying to find the video links on those pages...
Up to now, I have not seen GPUs have an extensive impact across the entire CC suite. If that has changed, I'll be re-optimizing my workflow (soon!).
Take care.
-
https://helpx.adobe.com/lightroom/kb/lightroom-gpu-faq.html
https://helpx.adobe.com/photoshop/kb/photoshop-cc-gpu-card-faq.html
Neither program officially supports more than one card, but the FAQs suggest it may work if necessary; to try it, make sure the cards are the same make and model, on the same driver version, etc. How this would work with the Radeon Pro Duo, I'm not exactly sure, as they do not list dual-chip cards for LR or PS, though they do list the 295X2 for Premiere.
Also, I do not know how much those two programs care about CUDA vs. OpenCL support (I'm not as familiar with them as I'd like to be, but both APIs are supported by Premiere), but finding that out, and which you would prefer to use, may influence your purchasing decision (or what you fiddle around with in Premiere for fun in your downtime). Except for testing (for which even the mobile 900 series seems to be supported), if you are going to buy, I'd recommend the fastest single-GPU card out there, unless you're certain something like the Radeon Pro Duo would work (shoot Adobe an email to confirm support in those programs). You'll need to look up the advantages of CUDA on the Titan and Quadro lines and check them (and performance reviews) against the FirePro and Pro series, or the upcoming consumer Vega card and the 1080 Ti. But since you asked about it, I hope this helps!
Also, I do not know how much support has been added since 2014 or how the changes would impact your workflow, but considering they have been working on it for the past couple of years (hence saying just about any card from 2014 onward will work), it may have a larger impact now. With that said, programs like these that use graphics acceleration have tended to favor higher IPC over more threads (as I mentioned before). That may change as Adobe optimizes for Naples and Ryzen, but it is worth mentioning again as it may affect your platform purchase decisions. I cannot remember if the higher-IPC preference was seen specifically in Adobe testing, so it would be worth rechecking some of the Ryzen reviews, and again once Naples and Skylake-X are released in the June-August time frame.
I think JayzTwoCents had a video comparing the 6900K with two 1080s squaring off against Ryzen with two Titan X (Maxwell) GPUs, and rendering was a couple of minutes slower on the Ryzen. But in his 30-day video, he said the Ryzen was quite enjoyable, and he loved being able to live stream and capture games at 1080p with the encoding happening on the same CPU rather than using a capture card. Further, Linus's video showed better image encoding for game streaming than Intel's or Nvidia's hardware encoders. Just to replay some of the greatest hits in this thread. -
tilleroftheearth Wisdom listens quietly...
Thanks, ajc9988.
I am aware of the potential of GPU-accelerated workflows, but like I said, in my workloads/workflows the benefit is negligible.
See:
https://www.pugetsystems.com/labs/articles/Adobe-Photoshop-CS6-GPU-Acceleration-161/
The above is an old article showing how closely a discrete GPU and even the Intel HD 4000 iGPU compare on some filters/enhancements (most of which I don't use - and those are the ones that are most significantly accelerated).
See:
https://www.pugetsystems.com/labs/a...15-8-AMD-Ryzen-7-1700X-1800X-Performance-910/
Here is a link showing that Intel is still the platform to beat in the money-no-object camp (mine), when productivity is king and LR is a major part of your workflow. GPU acceleration isn't too important here... (in my own setup/workflow).
See:
https://www.pugetsystems.com/labs/a...CC-2017-NVIDIA-Quadro-Pascal-Performance-938/
With Premiere Pro, that changes, of course: there, the GPU used is king.
When the performance advantage (negligible, in my case) is weighed against the higher cost of a discrete GPU for each of my workstations - not to mention the higher power load, worse battery life, and an additional potential failure point - an iGPU solution such as most Intel CPUs offer is highly desirable.
Put another way: for the cost of a decent discrete GPU (8GB VRAM) giving a ~5% (overall, for my workloads) advantage, I can buy two or more additional SSDs for my DT workstations. These can be used as Scratch Disk space (with 50% or more over-provisioning...), or as fast storage for staging RAW image files to be processed by the main setup. Used like this, SSDs provide much more than mere single-digit improvements to productivity and have always been a better purchase than discrete GPUs for my workflows (at least since iGPUs became a 'thing').
Multiply the savings of a decent discrete GPU across many dozens of workstations and I could buy multiple units of the NAS below, which frees each workstation to concentrate on the work (only), and not also the management of the huge files I produce daily/weekly/monthly.
See:
https://www.qnap.com/en/product/model.php?II=160&event=2
With an i7 quad-core, 16GB of RAM, 7x 8TB drives, 1x 1TB SSD, and 4x 10GbE ports, a NAS like the above performs almost like DAS in some/most of my workflows. That is where I have greatly increased my productivity (time spent on storage/management of my files has drastically decreased, and when I need to expand storage every few months, I simply clone another identical system like the above, maybe with 10TB drives next time).
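A rough sanity check of why a 4x 10GbE NAS can feel like DAS: four bonded 10GbE links give roughly 40 Gb/s, which is about 4.5 GB/s usable after protocol overhead, well above what seven spinning drives can deliver together. The per-drive sequential speed and the 90% efficiency factor below are ballpark assumptions, not measured figures:

```python
def gbe_throughput_gbs(links, gbit_per_link=10, efficiency=0.9):
    """Approximate usable network throughput in GB/s:
    link count * line rate * protocol efficiency / 8 bits per byte."""
    return links * gbit_per_link * efficiency / 8

net = gbe_throughput_gbs(4)   # ~4.5 GB/s usable over 4x 10GbE
disks = 7 * 0.2               # ~1.4 GB/s from 7 HDDs at ~200 MB/s sequential
print(net > disks)  # True: the drive array, not the network, is the bottleneck
```

With the network no longer the limiting factor, the NAS behaves like local storage for sequential transfers, which matches the "almost as a DAS" observation above.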
Thanks again for your in depth answers; greatly appreciated.
But I think now you know that I have explored what there is already available to me (and I'm making the most from it).
Take care.
-
If you want a jaw-dropper and proof AMD will take some server space, check out Intel's new flagship Xeon, 28C/56T at $12K. It really makes you wonder how the 32C/64T Naples will fare, and at what price.
https://m.facebook.com/story.php?story_fbid=10156105872794942&id=49650744941
Also, here is WD's new 12TB:
http://m.hexus.net/tech/news/storage/105067-western-digital-begins-shipping-ultrastar-he12-12tb-hdd/
-
tilleroftheearth Wisdom listens quietly...
With just a 205W TDP, that is some serious firepower in the new Xeons... $12K isn't a lot of money by itself; it depends what profit you can make from something like that (usually on a 24/7/365 schedule).
Some clients I know can have similar high ticket (tech) paid off in a matter of hours or days, for example...
If AMD comes in too low, they'll never be able to raise prices (and if they ignore the long-term strategy they should be focusing on, they'll bottom out again). If they come in too high, they'll get slow traction and may never see profit soon enough. Tough position for them to be in.
Google, Amazon, MS, etc. will be able to custom build as needed with these Naples as their base - but most buyers will need something with a little more incentive than just price.
Of course, for both Intel and AMD, we need to see what the performance is. And when they can actually deliver it.
See:
http://forum.notebookreview.com/threads/intel-optane-coming-april-24th.803009/page-2#post-10515129
See the link above to get an idea of why I think Intel still thinks (and acts) like it's in the driver's seat, AMD Ryzen/Naples or not.
When Micron joins Intel in shipping their version of XPoint (~7 months from now), QuantX, things should get really interesting once again!
See:
https://www.theregister.co.uk/2017/02/03/micron_working_on_nextgeneration_xpoint/
I'm not in a 'buy' or 'test' mode right now; my workflow is optimized, blueprinted and put to bed.
Rather; I'm in 'what the $%# is coming down the road soon' waiting mode!
In the meantime; saving my pennies and eager to spend them all, very soon.
-
I do agree $12K is nothing, but I do want to know more, as I thought from earlier leaks that 2.5GHz was the boost clock and 2.2-2.3GHz the base (they may have, "against [their] will, knocked it up another notch. BAM!"). But your point is noted. Still, if AMD maintained a 40% margin on the Ryzen 8-cores, and the 16-core comes in under $1,300 (hopefully $1,000-1,200) with that margin, they will take a huge chunk, easily paying for the jump to 7nm.
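To put rough numbers on that margin argument (all figures illustrative, using gross margin = profit / price, not AMD's actual cost structure): at a 40% gross margin, a $1,299 part implies roughly $780 of cost and about $520 of gross profit per unit. A quick sketch:

```python
def gross_profit(price, margin=0.40):
    """Gross profit per unit, where margin is the fraction of the
    selling price retained as profit (margin = profit / price)."""
    return price * margin

# Hypothetical 16-core price points from the post above
for price in (999, 1199, 1299):
    cost = price * (1 - 0.40)
    print(f"${price}: cost ~${cost:.0f}, profit ~${gross_profit(price):.0f}")
```

Even at the low end of that range, healthy per-unit profit at prices far below Intel's HEDT parts is what would fund the process-node jump.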
Now, applying that to video cards: the relative pricing levels suggest it costs AMD more than Nvidia to develop their cards, at least in prior generations, and AMD often prices against one tier down from Nvidia's lineup. So that fight is harder, but AMD also has around 30% market share there, compared to under 6% for CPUs before Ryzen.
Just wanted to add some context.
-
What is this? A Dead thread?
Come on boys and girls
Intel expects CPU prices to fall now that AMD's Ryzen is here - PcWorld.com
"During an earnings call on Thursday, Intel’s CEO Brian Krzanich sidestepped a question on whether Ryzen had any role in the projected chip price declines. He said the market dynamics were a reason." -
Everybody here is likely too busy "winning" with their new Ryzen computers to spend time posting... maybe later when the new AMD GPUs arrive.
The new Ryzen CPUs for servers would be nice to see too. -
We Got Inside AMD’s Texas Home - & Learned How To Overclock RAM on Ryzen
Starts @ 3:50
-
With the AMD Vega rumors and "benchmarks" floating around, I thought it might be fun to post Ryzen + Vega coverage from Dec 2016, and then the rumors...
AMD Radeon Vega Benchmarks Star Wars Battlefront 4K & Doom 4K Compilation
AMD Radeon Vega - 2 SKUs Spotted | Benchmarks & Specs
AMD's Ryzen CPUs (Ryzen/TR/Epyc) & Vega/Polaris/Navi GPUs
Discussion in 'Hardware Components and Aftermarket Upgrades' started by Rage Set, Dec 14, 2016.