Profile: LISA SU - A new AMD is rising
Coreteks
Published on May 2, 2019
-
Nvidia vs AMD: Moving Past the "GPU War"
The Good Old Gamer
Published on Apr 27, 2019
Deciding which GPU is right for YOU should have NOTHING to do with Brand!
GTX 1650: Price Premium for ENTRY Level?
The Good Old Gamer
Published on Apr 23, 2019
Nvidia has released its GeForce GTX 1650 today... with NO reviews available for potential customers... and for GOOD Reason! The RX 570 outperforms it for less $.
Does High-end Gaming Make Sense Anymore?
The Good Old Gamer
Published on May 4, 2019
May the 4th Be With You!
http://forum.notebookreview.com/threads/does-high-end-gaming-make-sense-anymore.828724/
-
Will AMD support ray tracing on a hardware basis in coming new GPUs, now that Intel follows in Nvidia's footsteps? Or will they just sit back, wait it out, and see how it works out for Nvidia and Intel first?
Intel Xe GPUs to Support Raytracing Hardware Acceleration - Techpowerup.com
-
AMD supplies the CPU / GPU / APUs for the next gen console hardware. When the ray tracing in console games gets ported to PCs, it will be using hardware / software / APIs on PCs that support the console game features hosted on AMD hardware.
As software-only ray-tracing demos have shown, dedicated hardware may not be necessary to provide real-time ray tracing as implemented.
I doubt AMD is going to license Nvidia's proprietary hybrid RTX ray-tracing methods for consoles, or PC hardware, so that's a dead end.
Intel can promise anything at this point; it's all smoke and mirrors. So far Intel cancels more projects than it delivers, or ages them out forever, so I wouldn't put much stock in anything Intel promises for its GPUs - they haven't a clue what they are really going to deliver.
In the last couple of months I saw Intel trying to drum up people to do surveys on what consumers want in a next gen GPU. If Intel is still trying to figure out what people want in its next gen GPUs, we aren't going to see those requested features implemented and delivered in a commercial consumer product for another 12-24 months.
Intel wants your feedback on what it should be doing with graphics
By Paul Lilly 9 days ago
https://www.pcgamer.com/intel-wants-your-feedback-on-what-it-should-be-doing-with-graphics/
Intel appears to be clueless, and is asking others for a clue as to where to start on its GPU-building quest.
If Intel is promising hardware-accelerated ray tracing in its data center GPUs, so far it looks like a vague, formless promise, nothing more:
" ...As David closed his blog he mentioned, “We will look forward to sharing more details on the Intel® Xearchitecture in the months ahead.” I’m pleased to share today that the Intel® Xe architecture roadmap for data center optimized rendering includes ray tracing hardware acceleration support for the Intel® Rendering Framework family of API’s and libraries."
https://itpeernetwork.intel.com/intel-rendering-framework-xe-architecture/#gs.97iqpj
https://itpeernetwork.intel.com/intel-xe-compute/
" ...The blog only mentions that the company's data-center GPUs support the feature, and not whether its client-segment ones do. The data-center Xe GPUs are targeted at cloud-based gaming service and cloud-computing providers, as well as those building large rendering farms....
"I'm pleased to share today that the Intel Xe architecture roadmap for data center optimized rendering includes ray tracing hardware acceleration support for the Intel Rendering Framework family of API's and libraries," said Jim Jeffers, Sr. Principal Engineer and Sr. Director of Intel's Advanced Rendering and Visualization team. Intel did not go into technical details of the hardware itself..."
https://www.techpowerup.com/255073/intel-xe-gpus-to-support-raytracing-hardware-acceleration
There is also no mention of real-time ray tracing... Intel's data center ray tracing could be optimized for render farms in a non-real-time environment. As mentioned, the ray tracing is for data center only products; no consumer real-time ray-tracing GPU product has been mentioned or promised.
-
-
"RTX" is Nvidia's proprietary BS made up name - a bastardization of their other made up name, "GTX".
"RTX" is not an abbreviation for Real-Time Ray-Tracing.
AMD mentioned other methods of accelerating features without dedicating silicon to the feature, unlike what Nvidia did with their RT / Tensor cores (really just added pipelines), so AMD doesn't have to waste silicon area when those features are not in use.
Let's wait and see what AMD comes up with and then likely shares with the community as a free and open standard for all to standardize on in the Console, PC, Streaming, and mobile GPU spaces.
The best way to implement new features is in software, tuning the hardware for unique on-the-fly needs, using the hardware and software together to optimize for all uses dynamically without wasted silicon.
-
-
I hope neither actually wastes silicon space doing so for such a minor use %; time will tell.
-
-
Hi, any info if Windows 7 drivers exist for the Ryzen 7 3750H?
-
AMD is only releasing Windows 10 drivers...
https://www.amd.com/en/support/apu/...rocessors-radeon-rx-vega-graphics/amd-ryzen-2
-
How Strong Can APU's get in the coming years?
Moore's Law Is Dead
Published on Mar 28, 2019
How strong can APU's get 5 years from now? In my opinion, way more powerful than most realize. But before we estimate the future, it is worth discussing the past....
https://www.guru3d.com/articles_pages...
https://www.anandtech.com/show/6347/a...
https://www.extremetech.com/gaming/17...
https://www.tomshardware.com/reviews/...
https://www.anandtech.com/show/9320/i...
https://www.overclock3d.net/reviews/c...
https://www.pcworld.com/article/32734...
https://ark.intel.com/content/www/us/...
https://ark.intel.com/content/www/us/...
The Good Old Gamer 1 month ago
"Moore's Law Is Dead I think the general issue is that PCMR used to be about pc gaming. Today it’s all about numbers and not about the games. This is why people are over paying so much. They’re sold on bigger, albeit mostly irrelevant, numbers. I’ve been getting grief because there’s no reason to game at less than 144fps apparently. The marketing from Nvidia has been so good that no one remembers most games from last gen (much better games in my opinion,) were mostly capped at 60fps on PC XD."
ImTheSlyDevil 1 month ago
"The other day I asked Ed from Sapphire something related to this. Basically, APU's will get to a point where they replace mainstream gpus and he said that Sapphire was open to the idea of making consumer motherboards if this ever happens. Imagine a high end Sapphire SFX motherboard with an embedded APU with HBM."
LazyGeekgamer HD is dead channel 1 month ago
"This will be exciting for itx gaming and portable gaming too"
The Good Old Gamer 1 month ago
"I like your chart. Spelled out what I was talking about via visuals. Once we have hard numbers on the next gen consoles that'll tell us exactly how far we've come. I suspect 8c/16t 2.5-3Ghz CPU with 16MB L3 Cache (1/2 of DT Zen 2,) with Vega 56-64 performance launching next year. There's nothing stopping AMD from slapping on HBM and selling those to OEMs for DT usage."
Moore's Law Is Dead 1 month ago
"Thanks Chris, I actually think the PC gaming world is in for a rude awakening when it comes to the PS5's specs. It is an open secret on my channel that I keep mentioning the PS5 is going to blow away most high-end PC gaming builds people have right now. I am by no means ready to double down on this theory, but this is one of those hunches I feel pretty confident about. (My Cut Down Radeon 9 PS5 Leak Video elaborates)
People keep forgetting that Sony is willing to sell their consoles at a loss when necessary, and also that a monster gaming console would be the perfect thing to make Streaming Platforms look like junk in comparison. Finally, I really....REALLY don't think most people around here understand just how much they are overpaying for GPU's right now. I will have a series on the coming console war some time this Spring... I just need the time and a few more sources to confirm what I think I know." -
Moore's Law Is Dead
Published on May 5, 2019
A lot of hopes are riding on Navi's success at this point - whether it's PS5, RTX 3000, or Gaming Prices in general. Let's discuss a bad outcome to Navi.
-
ATI PRADEON, $96 Grey Market RX 580
Hardware Unboxed
Published on May 5, 2019
DipetSourced 8 hours ago
> bet this thing didn't work/work poorly
> still beating 1650 with a 66% of it's price
> pikachu face
rush21hit 5 hours ago
"A shady $96 RX gpu kills 1650 in every tests lmao Talk about insult "
JRV 2017 8 hours ago
"Its exactly the same as the MAXSUN branded RX580s you can pick up on aliexpress"
The Corn RX580 mentioned in the video, actually for sale on newegg for $105, is the lower-spec 2048SP version, not a full RX580. I re-watched the HU segment and although he doesn't mention the Corn RX580's reduced 2048SP count, the video pans over the newegg product page and shows it @11:22 (as 2048 CUDA Cores).
CORN AMD RX580 [2048SP] 256-Bit 4GB GDDR5 Graphic Card support DirectX12 with dual fans Video Card RX 580 GPU PCI Express 3.0 DP/DVI-D/HDMI,Play for LOL,DOTA,COD,War Thunder etc.
Limited time offer, ends 05/11 By Corn Electronics
$105.99 Sale Ends in 7 Days (Sat) Save: $54.00 (34%)
https://www.newegg.com/Product/Product.aspx?Item=9SIA4RE93K4231
Listed on newegg Corn has 3 models of RX580, the 2048SP version HU pointed out for $109 (now $105), and 2 other full 2304SP RX580's:
https://www.newegg.com/Product/Prod...&Description=corn+rx580&N=50120625&isNodeId=1
HU did mention there were plenty of name-brand RX580's for sale for only about $60 more (over the $70-90 Pradeon prices), and HU didn't recommend going with the off-brand Pradeon / other off-brand (Chinese "knock-off") models.
The "ATI Pradeon" benchmarked like a full RX580 by HU's tests - although they weren't extensive tests, and GPU-Z shows 2304SP (@07:40) so the "ATI Pradeon" is a full RX 580. It is only the Corn RX580 HU pointed out with 2048SP.
Corn also lists a full RX570 with 2048SP for a few bucks less than their RX580 2048SP.
CORN AMD RX570 [2048SP] 256-Bit 4GB GDDR5 Graphic Card DirectX12 Video Card GPU PCI Express 3.0 DP/DVI-D/HDMI,Play for LOL,DOTA,COD,War Thunder,Apex etc.
$159.99 => $102.99 Sale Ends in 7 Days (Sat) Save: $57.00 (36%)
https://www.newegg.com/Product/Product.aspx?Item=9SIA4RE91E9033
-
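Those "Save" percentages on the listings check out; a quick sketch of the retail math (prices taken from the listings above, assuming the RX580 2048SP's pre-sale price was $159.99 as implied by the $54.00 savings):

```python
def discount_pct(list_price, sale_price):
    """Percent saved relative to the list price, rounded the way retail listings round."""
    return round((list_price - sale_price) / list_price * 100)

# CORN RX580 2048SP: $159.99 list, $105.99 sale -> "Save: $54.00 (34%)"
print(discount_pct(159.99, 105.99))  # 34
# CORN RX570 2048SP: $159.99 list, $102.99 sale -> "Save: $57.00 (36%)"
print(discount_pct(159.99, 102.99))  # 36
```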
-
I updated my post above to point out the 2048SP, as well as other details.
-
USED Vs. NEW RX 570.... How Risky is it Buying Used?
Tech YES City
Published on May 5, 2019
Buy a new or a used graphics card? That is a question that keeps coming up over time. Today I am comparing a brand new RX 570 to a USED Gigabyte RX 570, one that cost under half that of the new card. But what are the caveats of doing this? Let's investigate and discuss.
-
I am not sure about others, but I had not anticipated AMD toppling Nvidia's performance. Unlike Intel, Nvidia has not stagnated in development and performance improvements. While not enormous, their gains have been enough to pretty much stem the AMD tide. Unlike with Vega 64, this may finally be the leveling ground against Nvidia's offerings.
-
It may, at least for the low to mid-range. The article still says AMD is struggling with Navi 20. These are only unconfirmed rumors, of course, so we will see how it turns out. Hopefully there will not be further delays, because I would like to see prices for the 2080 start dropping faster, and I like seeing more rapid progress in terms of graphical fidelity - this is as valuable to me as the gameplay itself (all of which now depends on how soon AMD will be able to catch up).
-
However, the way Lisa Su said it, there is some wiggle room, as the Radeon VII is a different class - cost-wise - than what Navi will be released under.
So, maybe, a Navi GPU that is both faster and cheaper than the Radeon VII is possible. -
-
The only thing I've seen close to "reality" is a brief mention in a code release for Linux, and that could simply be a compatibility naming issue and not a validation of architecture - it's just a name:
" These first drops are pretty lightweight stuff, with the definitions including some architectural workarounds and a few new instructions, and don’t really indicate a whole lot by themselves. But the language used in the code seemingly confirms the GCN architecture.
EF_AMDGPU_MACH_AMDGCN_LAST = EF_AMDGPU_MACH_AMDGCN_GFX1010"
https://www.pcgamesn.com/amd/navi-linux-confirms-gcn-design?amp
-
To be fair to them, they have really stretched GCN out a long way past what other architectures would have managed. The gap between the 7970 and the Radeon VII (or even Vega 64) is huge. I just think it's time to move on to something more efficient now, so they can start pushing clock speeds higher without needing a nuclear reactor to power it. -
Maybe there will be a whole other branch that correctly defines it later; at this point it's just a placeholder.
Do you (or anyone else) have an official mention from AMD that Navi is GCN? -
https://wccftech.com/amds-navi-gpus-confirmed-to-retain-gcn-design/
https://www.eurogamer.net/articles/digitalfoundry-2019-amd-navi-next-gen-graphics-pcb-leak
Some people seem to think this will be the last time they use GCN however, which if true is a very welcome change.
It's all still mostly rumours obviously, but enough of them point to it being true. Not that it really matters if the performance is good for the money, which is the most important thing. -
GCN, or not GCN, we'll know soon enough. And, I'm mostly fine with it either way, although GCN is the more well known quantity so perhaps that's better to believe right now for comfort's sake.
GCN as implemented through the AMD Radeon VII is awesome, and along with driver updates users are learning how to use the tuning software to reduce voltage, reduce power, reduce temperature, while getting more performance - and once the GPU is tuned the fan curve can be tuned to reduce noise.
If Navi does as well as the Radeon VII in performance upgrade for GCN on 7nm, and is available for less cost and in far greater quantities, there will be little reason to pick Nvidia over AMD on cost / performance, the same as with Intel.
I doubt Nvidia will drop prices, and given the poor performance vs cost of the low end RTX / GTX GPUs, AMD should have no trouble placing lower cost, performance-competitive GPUs into the market.
GCN might actually be good news for stability and reliability, as it's a long-term, well known quantity. We know how to tune it for best performance, and apps / games written to GCN performance will scale predictably.
A GCN Navi that costs less and performs better than AMD legacy and current Nvidia GPUs would be a good thing.
Let's hope that whatever or whenever AMD releases a new GPU architecture past GCN that the new architecture actually outperforms GCN.
Otherwise AMD would be stuck like Intel is with 10nm and Nvidia is with RTX, both committed to costly failing blunders - IMNSHO.
-
https://www.cryengine.com/news/how-we-made-neon-noir-ray-traced-reflections-in-cryengine-and-more#
That Vega 56 "Ray Traced" demo ran at 1080p 30fps. LOL. So a closed loop, tech demo ran at 1080p 30fps. Yikes.
"However, RTX will allow the effects to run at a higher resolution. At the moment on GTX 1080, we usually compute reflections and refractions at half-screen resolution. RTX will probably allow full-screen 4k resolution. It will also help us to have more dynamic elements in the scene, whereas currently, we have some limitations. Broadly speaking, RTX will not allow new features in CRYENGINE, but it will enable better performance and more details."AlexusR likes this. -
For the few people with overpriced RTX GPUs, it's just not worth coding to them, as evidenced by these very developers - they coded for non-RTX GPUs and are only speculating on what might have been if they had actually taken the time to code to RTX features - but it never happened, because there just aren't enough overpriced RTX GPUs out there to code to.
Their comments sound more like a play for some juicy funding from Nvidia to implement RTX features.
IMNSHO, there are far more GPUs that don't have Nvidia RTX for developers to target with new games, and clearly that's what game developers need to do - code for GPUs without Nvidia RTX, as they have been and continue to do to this day.
Developers are coding for GPUs that support the DXR / real-time ray-tracing / global illumination APIs, which far outnumber Nvidia RTX GPUs, and their games need to run well on all GPUs in order to sell enough copies to be profitable.
The new consoles for 2020 will have real-time ray-tracing and global illumination features, without Nvidia RTX and without Nvidia drivers / libraries, and those will be the games that make it to PC with those features coded to run on GPUs without requiring Nvidia RTX.
It doesn't matter that a few 2080 Tis out there can do more; there simply aren't enough of them to make it worthwhile to dedicate resources to reach them with specific Nvidia RTX game features in addition to supporting the far larger pool of non-RTX GPUs.
Game developers need to sell as many games as possible, and that means coding new features to play on the largest number of GPU targets, without limiting the fun to the overpriced Nvidia RTX GPUs.
-
AMD and Cray are building the world's most powerful supercomputer
Engadget
Published on May 7, 2019
It'll be an exascale machine that can do a quintillion calculations per second.
AMD and Cray partner to build the world's fastest supercomputer
CNBC Television
Published on May 7, 2019
AMD CEO Lisa Su and Cray CEO Peter Ungaro join "Squawk Alley" to discuss their partnership with the U.S. Department of Energy to build a new supercomputer named Frontier.
Expected World’s Fastest Supercomputer Powered by AMD EPYC CPUs and Radeon Instinct GPUs
AMD
Published on May 7, 2019
AMD joined the U.S. Department of Energy (DOE), Oak Ridge National Laboratory (ORNL) and Cray Inc. in announcing a new exascale-class supercomputer scheduled to be delivered to ORNL in 2021. To deliver what is expected to be more than 1.5 exaflops of processing performance, the Frontier system is designed to use custom AMD EPYC™ CPUs and purpose-built Radeon Instinct™ GPUs.
AMD EPYC CPUs, AMD Radeon Instinct GPUs and ROCm Open Source Software to Power World’s Fastest Supercomputer at Oak Ridge National Laboratory
https://www.amd.com/en/press-releas...t-gpus-and-rocm-open-source-software-to-power
Powering the Exascale Era
https://www.amd.com/frontier
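For scale, "a quintillion calculations per second" is 10^18 FLOPS, i.e. one exaflop. A quick sketch of what AMD's stated 1.5 exaflops target means (the ~13.8 TFLOPS FP32 Radeon VII figure is an approximate published spec, used here only for a rough comparison):

```python
EXAFLOP = 10**18  # one quintillion floating-point operations per second

frontier_target = 1.5 * EXAFLOP  # AMD's stated performance target for Frontier

# Rough comparison against a single 2019 consumer GPU (~13.8 TFLOPS FP32, Radeon VII)
radeon_vii = 13.8e12
print(f"Frontier target: {frontier_target:.2e} FLOPS")
print(f"~ {frontier_target / radeon_vii:,.0f} Radeon VII-class GPUs")
```

That works out to the equivalent of roughly a hundred thousand consumer GPUs' worth of raw FP32 throughput, which is why Frontier is built from custom Epyc CPUs and purpose-built Radeon Instinct GPUs rather than off-the-shelf parts.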
AMD back to Super form with Frontier Supercomputer
Graya Overload
Published on May 7, 2019
The Department of Energy (DOE) and Oak Ridge National Laboratory (ORNL) announced a new supercomputer, Frontier, that Cray and AMD will be working on. It is a big win for AMD. Epyc CPUs and Radeon Instinct GPUs will be used.
-
-
Nvidia releasing RTX hardware IP into the public domain? Unlikely, knowing Nvidia's penchant for useless proprietary features that even tank the performance of its own GPUs, for nothing special in trade, only to use those to sell the GameWorks-minded on the idea that it's something special that other vendors don't have.
IMNSHO, none of the Nvidia eye-candy is worth halving the FPS, none of it.
But, "it's a thing" and Nvidia can claim exclusivity, and some of the time you can almost see a difference, and some of that is subjectively interesting - but not critical to game play - in fact the FPS drops interferes with game play vs. not taking the performance hit.
Nvidia's useless FPS stealing BS eye-candy is just that, BS that Nvidia can put on display, point to it so a certain percentage of shiny object enthusiasts can then pay a huge price for it, but none of it improves game play - subjectively or not we can do with out Nvidia's BS just fine.
It's going to be a while before consoles release with new games using RT RT / GL features with PC releases running on DXR / RT-RT-GI API supporting GPU's. I hope this RT-RT-GL eye candy can be disabled on Consoles as well as PC's.
That's enough on that for now; let's see how it plays out over time.
-
Navi being GCN based isn't necessarily bad (or new for that matter - AMD already said from the start that Navi would be GCN based). They could have decided to disable/remove a certain amount of compute and increase the geometry engines, ROPs and texture units that games would actually use.
So, wait and see until Navi is released (at any rate, it's supposed to be the last iteration from AMD that uses GCN).
-
We Need AMD to have a Turn Dominating The Market
Moore's Law Is Dead
Published on May 7, 2019
I often see people say "I just hope ___ isn't in charge for too long so we can keep competition." Usually I agree, but honestly we need AMD to dominate till 2023 so they never fall apart again.
-
1080p average with Vulkan: 217 fps. But who games at 1080p with a 2080 Ti?
-
-
217 fps was with my 2080 Ti at my daily driver settings (overclocked). Previously this was not achievable; the driver has fixed performance issues. Stock performance averages 204 fps at less than 90% utilization, so there is still a massive bottleneck. 4K is what the card was designed for, and 4K is where it blows the doors off the competition.
On release, the 2080 Ti scored 124 fps minimum and 160 fps average in 1080p Vulkan at stock. It currently scores 170 fps minimum and 204 fps average at stock with the new driver. Amazing what Nvidia can accomplish when given access to the game and development isn't choked off to a single company.
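Worked out as percentages, the driver uplift this post describes looks like this (a sketch using only the numbers quoted above):

```python
def uplift_pct(old_fps, new_fps):
    """Percent FPS improvement from the old score to the new one."""
    return (new_fps - old_fps) / old_fps * 100

# 2080 Ti, 1080p Vulkan, stock: release driver vs. current driver
print(f"average: {uplift_pct(160, 204):.1f}%")  # 27.5%
print(f"minimum: {uplift_pct(124, 170):.1f}%")  # 37.1%
```

A 27-37% gain from drivers alone on the same hardware is substantial, which supports the point about the earlier bottleneck.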
-
Lenovo adds AMD Ryzen Pro-powered laptops to its ThinkPad family
You can get the first available models at the end of May starting at $939.
VALENTINA PALLADINO - 5/8/2019, 6:00 AM
https://arstechnica.com/gadgets/201...n-pro-powered-laptops-to-its-thinkpad-family/
"Lenovo is adding more choices to its beloved and iconic ThinkPad lineup this year: the new T495, T495s, and X395 laptops are all powered by AMD's Ryzen 7 Pro processors with integrated Vega graphics. With the same design and MIL-spec level of durability, these new ThinkPads will give customers the option to go with AMD without sacrificing what they love about the premium ThinkPad lineup.
The ThinkPad T495 and T495s models are 14-inch laptops while the X395 measures in at 13 inches. They will look similar to the T490 and T490s Intel-based laptops announced last month because Lenovo essentially took the same frames and stuck AMD APUs inside. That means they all have MIL-spec tested designs and features like far-field mics for VoIP conferences, Lenovo's camera privacy shutter, and optional PrivacyGuide screen filter.
The 14-inch displays on the T495 and T495s and the 13-inch display on the X395 will be FHD 1920×1080 panels with touch and non-touch options. They will also have AMD's FreeSync technology for improved refresh rates and pixel quality.
The biggest differences between these laptops and Intel-powered ThinkPads are performance and ports. According to Lenovo, the second-gen AMD Ryzen 7 Pro processors combined with integrated Vega graphics should provide an 18-percent improvement in performance over previous generations, as well as a better multimedia experience.
Lenovo claims improvements in battery life as well, but those will only be over previous generations—in fact, representatives told Ars that the AMD-powered machines will likely last 45 minutes to one hour less than their Intel-powered counterparts. That's not a huge decrease in battery life, but it's worth taking into account before choosing which CPU you want in one of these premium ThinkPads. The AMD models also do not have Thunderbolt 3 ports as the Intel models do, but they do have USB-A ports and an HDMI 2.0 port.
Lenovo's new ThinkPads with AMD Ryzen processors and Vega graphics will be available soon: the T495 will be available in late May starting at $939, while the T495s will be available in early June starting at $1,089. The ThinkPad X395 will also be available in June starting at $1,089."
I think it's great that AMD is getting into more laptops, this is a great start to a new era in laptop tech. -
custom90gt Doc Mod Super Moderator
I just hope the cooling is better than say the A485. It's one of the biggest reasons I'm selling it.
If true, I am not happy.
https://www.vgamezone.com/2019/05/0...uld-explain-threadrippers-2019-vanishing-act/
https://www.theinquirer.net/inquirer/news/3075238/amd-3rd-gen-threadripper-delay
https://www.techradar.com/news/amd-ryzen-threadripper-3rd-generation-could-be-delayed-until-2020
-
Thin laptops are the bane of performance; maybe the 7nm spins of the Zen 2 APUs will help when they arrive as well.
-
custom90gt Doc Mod Super Moderator
-
It's nice that voltage tuning and undervolting have finally gotten the attention they deserve after decades of pushing for moderation of CPU thermal issues in laptops - as well as GPUs - though there's still a ways to go for AMD to catch up and provide the same options for their mobile CPUs.
Did you file a request / complaint with Lenovo / AMD about the need for voltage tuning with the AMD mobile APUs? It might help the cause and give your request a push toward implementation, at least in the next generation of AMD mobile APUs.
-
While AMD has a mobile entry in the market, IMO it is very minimalist. It does not seem they are going hard after that market yet. I think Epyc and consumer desktop are their primary targets right now. At some point they will most likely become more serious about the mobile market.
-
custom90gt Doc Mod Super Moderator
It would be great if they can get back in the game on the mobile side, but for right now I'm not very impressed. For sure in the future, if they offer something like Ryzen Master for the mobile CPUs...
-
Lisa Su seems to have a good eye for what can be done and doing it in order of importance, but that means something(s) are bound to be left in the wings waiting for attention.
AMD full-performance laptops with desktop components still have the best chance, especially with the interest in full performance in large chassis vs. thin, as those will match desktop options most closely.
AMD has already put a focus on the APU mobile software drivers - supporting updating directly - hopefully that includes improving the support software too.
AMD's Ryzen CPUs (Ryzen/TR/Epyc) & Vega/Polaris/Navi GPUs
Discussion in 'Hardware Components and Aftermarket Upgrades' started by Rage Set, Dec 14, 2016.