But have you seen the M295X benchmarks? It's not even that much faster than the 7970M and its rebadges. It would have to be really cheap, like $350, to warrant consideration over the 970M.
-
First of all, I thought it was something new.
Other than that, I agree with you. -
This product isn't aimed at gamers. It's essentially an R9 M295X.
Maybe AMD will make a "desktop 390X" in the same form factor as the 200W 980 in the P870DM
I don't think we'll see AMD compete until HBM is on MXM. Once HBM is implemented on MXM, we'll probably see some nice AMD products (hopefully). -
-
What even is this? -
http://www.pcworld.com/article/2987...-quietly-with-just-one-major-customer-hp.html -
-
It's a cut-down chip. Also I'm not saying it would demolish any and everything, just that there's still some performance to be gained given a proper system.
-
Yeah but people make it look worse than it is.
-
The benefit would be for any program that was built to use HSA, not all programs. Currently, that probably isn't very many - it does yield good benefits, but you'd have to add code to optimize for AMD APUs, so it's a non-trivial effort, and more so since AMD's marketshare isn't great, and a good part of AMD's processors in use are still either pre-Kaveri, or not APUs (that is, desktop FX/Athlon parts). I'd be curious to see an up-to-date list of what software does support HSA. -
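Just to illustrate what "add code to optimize for AMD APUs" actually means in practice, here's a minimal sketch (my own illustration, not taken from any HSA-enabled app, and it uses plain OpenCL via pyopencl rather than the HSA runtime itself) of offloading a trivial loop to the GPU side of an APU. Real HSA support goes further (shared virtual memory, no explicit copies), but even this much extra plumbing shows why devs don't bother unless the payoff is big:

```python
# Hypothetical sketch: offload a simple element-wise add to the GPU with pyopencl.
# Assumes pyopencl is installed and an OpenCL-capable device/driver is present.
import numpy as np
import pyopencl as cl

a = np.random.rand(1000000).astype(np.float32)
b = np.random.rand(1000000).astype(np.float32)

ctx = cl.create_some_context()      # picks an available device (ideally the APU's GPU)
queue = cl.CommandQueue(ctx)
mf = cl.mem_flags

# Explicit buffer management -- exactly the kind of extra code a CPU-only app never needed
a_buf = cl.Buffer(ctx, mf.READ_ONLY | mf.COPY_HOST_PTR, hostbuf=a)
b_buf = cl.Buffer(ctx, mf.READ_ONLY | mf.COPY_HOST_PTR, hostbuf=b)
out_buf = cl.Buffer(ctx, mf.WRITE_ONLY, a.nbytes)

# The hot loop has to be rewritten as a kernel in OpenCL C
prg = cl.Program(ctx, """
__kernel void add(__global const float *a, __global const float *b, __global float *out) {
    int i = get_global_id(0);
    out[i] = a[i] + b[i];
}
""").build()

prg.add(queue, a.shape, None, a_buf, b_buf, out_buf)  # launch one work-item per element
result = np.empty_like(a)
cl.enqueue_copy(queue, result, out_buf)               # copy the result back to the host
```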
Maybe it's only useful for certain specific tasks within Photoshop? Or perhaps some setting for HSA wasn't properly enabled when Tom's Hardware did this benchmark. -
Not for gaming but AMD has announced a couple of new mobile workstation cards: http://www.amd.com/en-us/press-releases/Pages/amd-firepro-graphics-2015oct01.aspx
-
According to Tom's review of HSA ( http://www.tomshardware.com/reviews/a10-7850k-a8-7600-kaveri,3725-11.html), it does appear to vary by test - in some tests the i3 4330 is equal to the Kaveri 7850, but in other tests the 7850 matches the quad-core i5 4670K. So your mileage may vary - I don't know enough about Photoshop to know which benchmarks would be relevant to which workflows. Things also may have changed somewhat since Carrizo, Broadwell, and Skylake came out.
The workstation cards are somewhat interesting. I haven't kept up on the latest workstation cards on either the AMD or NVIDIA side, but it's nice to see a refresh. And while perhaps not targeted towards gaming, they can be used for gaming. I've been using a FirePro in my laptop, and have a friend who's using a Quadro, and they do perfectly well (though on a new laptop, they probably wouldn't be the best performance per dollar for gaming). -
-
No, it's not known. Exactly what I meant with:
-
You'll excuse my doubts, but Alienware is no longer Alienware (the performance brand).
-
-
AMD hasn't really been competitive in the notebook space for a while...
-
They stopped competing after 7970M.
Well, I suppose the 8970M was the budget alternative to the 780M, but beyond that AMD definitely stopped even attempting to compete. -
Yep, they left their mobile users like poor triturbo here high and dry
-
The 780M's only saving grace was its ease of overclocking, and the 970M basically destroyed that benefit.
It'll be way too late, but good lord I hope AMD's Arctic Islands cards blow nVidia out of the water... both in mobile and in desktop formats. We all know nVidia can do good work if they want, but they need to be forced into actually moving their feet a bit. -
Well, the 780M and 8970M came out around the same time, while the 870M came almost a year later, so the 870M wasn't contemporaneous with the 8970M. The 8970M had, I think, about 90% of the 780M's performance for 70% of the price. That's why I said it was the budget alternative.
-
What I think is that it offered a more midrange alternative. 770M was a joke compared to 780M, just like 675MX was compared to 680M. As much as people defend those cards, it is what it is. But a 8970M being vastly more powerful than a 770M for only ~$100 more? Perfect. -
-
I just removed the last two posts. Play nice.
-
That defeats the entire purpose of buying a machine to game on. -
-
CAUTION: I might have written a bit of a book. (Are you proud of me @D2 Ultima?)
@triturbo As much as I respect your 'put your money where your mouth is' attitude, you can't expect users not to buy a (relatively) good product. Like it or not, the only option for users who want high-end performance and/or portability in their gaming laptops is to buy Nvidia GPUs. I don't like it either (it's in fact one of the reasons I've managed to hold out on upgrading so far), but that won't change until AMD gets its crap together in the desktop market and starts taking the mobile market seriously (seriously, the Fury X is an amazing piece of engineering and absolutely obliterates anything Nvidia has to offer in terms of raw compute performance, yet it is ~5% slower than a 980 Ti in games, WHAT THE HELL AMD). In the desktop market, on the other hand, I would say it is reasonable to expect more gamers to put their money where their mouths are and buy a slightly slower Fury X, if nothing else for the potential performance improvements via drivers, the promise it shows in DX12 and Vulkan, the fact that it's water-cooled, and that a Fury X CrossFire configuration actually beats a Titan X SLI configuration (this is actually true but never talked about, and I don't get why).
-
Senpai did not notice you. -
Maybe one day I can be worthy of Senpai's attention...
-
-
AMD needs to leave the APU behind and focus on a new, improved architecture: improve processor speed and power while maintaining battery efficiency, and keep their present Radeon offerings as the stock GPU. AMD needs some major gains to stay competitive and needs to catch up to Intel, at minimum, clock for clock and core for core. Anything less will not do.
-
Then on the other hand, you have AMD who charges $649 for the Fury X that barely keeps up with a stock 980 Ti, has barely any overclocking headroom, and will likely get handicapped by the 4GB HBM in the next 2 years. So purely from a technical and "futureproofing" standpoint, the logical decision is to buy a 980 Ti. I was very morally conflicted about this, but in the end caved and bought a 980 Ti because I couldn't justify paying the same for less just to prove a point. I'm sad because had AMD priced the Fury X at $549, it would've been an instant buy for me.
And yes, I'm being a hypocrite, but like I said, it's rarely black and white, and sometimes you just have to make decisions. -
I'll show myself the door now. Merry early Christmas, y'all. -
I keep saying it: $200 price point? R9 380. ~$320 price point? R9 390. $400 price point? R9 390x. $530 price point? R9 Fury, OR save and buy 980Ti. $650 price point? 980Ti.
AMD doesn't get a free pass for the Fury X. I'm not sorry. Zotac's AMP! Extreme 980Ti has something like a 1300MHz+ core clock AT STOCK. And that can be found for under $700. You don't even need to bother with OC potential... you just quite literally buy that card and you're done. There's no reason to purchase a Fury X.
However, and this is the big however, in the mobile sector it isn't two nearly-equivalent cards at the same price points like the 960/380, 970/390, 980/390X/Fury and 980Ti/Fury X are. It's "960M, 965M, 970M and 980M," and nothing AMD has even comes close at the same price points. Especially because of the OC potential of the 970M, which far outstrips the potential of the 980M (not saying the 980M ends up slower, but you can't OC a 980M nearly as much as a 970M in all cases I've seen, which makes the 970M a great choice for many a person).
So yes, to me it's black and white. If the cards are basically the same power, go with the one that has the better ethics and whatnot. If the cards are the same power and one is cheaper, grab that; I don't blame ya. If the cards are NOT roughly the same power but they're the same price... buy the stronger one.
Oh, and just to make you hate nVidia more: they didn't "just lie" about the 970... they ARE STILL LYING. The card is a 224-bit mem bus card and cannot use all 8 32-bit memory controllers in tandem, yet it's marketed as a 256-bit bus card. -
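For anyone who wants the back-of-the-envelope numbers behind that 970 complaint, here's a quick sketch (assumptions are mine: the commonly reported 8 x 32-bit memory controllers, 7 Gbps effective GDDR5, and the 3.5 GB + 0.5 GB split; this is not from nVidia's spec sheet):

```python
# Rough math behind the GTX 970 memory-bus argument (assumed figures, see note above).
controllers = 8
bits_per_controller = 32
gddr5_effective_gbps = 7.0  # per pin, effective data rate

advertised_bus_bits = controllers * bits_per_controller             # 256-bit, as marketed
advertised_bw_gbs = advertised_bus_bits / 8 * gddr5_effective_gbps  # ~224 GB/s on the box

# If the last controller can't be read in tandem with the other seven,
# the fast 3.5 GB partition effectively sees a narrower path:
usable_bus_bits = (controllers - 1) * bits_per_controller           # 224-bit
usable_bw_gbs = usable_bus_bits / 8 * gddr5_effective_gbps          # ~196 GB/s

print(advertised_bus_bits, advertised_bw_gbs)  # 256 224.0
print(usable_bus_bits, usable_bw_gbs)          # 224 196.0
```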
Well, see, by continuing to buy nVidia's products, I'm basically telling them it's OK to cheat and lie, because I'll still buy their crap at the end of the day. On the other hand, there really was no justification for spending the same and getting less with the Fury X. Honestly, if they ditched the CLC and sold a bare Fury X for $569, I'd still take that over the 980 Ti. That, or if Win 10 was any good, AMD would at least have a pretty good shot at taking on nV under DX12.
Also, "nVidia" and "ethics" should never appear in the same sentence, lol. They're morally bankrupt: from anti-competitive practices such as the Gameworks BS to the price gouging from Fermi to Kepler which continued into Maxwell, it's pretty clear the only thing they care about is their margins and bottom line. The sad part about Gameworks is I literally can't think of a single Gameworks game that didn't run like utter garbage even on nVidia's own hardware. Their willingness to make their own owners suffer just to make the competition look worse is beyond appalling. Oh, and I fully expect them to "forget" about optimizing Maxwell when Pascal launches. The only thing we can do is make as much noise as possible, or do whatever it takes to hurt their bottom line so they take notice.
Not saying AMD is a saint, but I think it's quite telling that Gaming Evolved titles run equally well if not better on nVidia's hardware, and except for the launch issues with TressFX in Tomb Raider (which was nVidia's fault anyway), none of AMD's sponsored games have had any issues running on nVidia GPUs. -
You really don't need to remind me of n-launch a new line with midrange cards as overpriced flagships to launch higher end cards later after everyone already upgraded so they have to buy new GPUs again-Vidia and their crap. Really, you don't.
As for Gameworks... I don't even know. I've NEVER seen a single dev known for optimized titles use it. Never. It's Ubisoft-published titles or CDPR. The former is cancer for gamers and the latter is known for pretty games, not well-running games (might I remind us all that Witcher 2 downsampled from 4K ran better than it did with Ubersampling on, which also disabled other forms of AA and effects?). Arkham Knight doesn't even COUNT. It's not like TWIMTBP titles in the past have meant AMD cards were crap; Unreal Tournament games and Borderlands and such have had no problems I know of on AMD cards except lacking PhysX. I'm harsh about this, I know, but I value being fair more than I value bashing nVidia.
I know AMD isn't a saint either. Believe me I know. Their fans are almost cultish too. But I still think a card should be purchased for the user who wants it, and sometimes there's no alternative for the other company. Look at me. I'll *NEVER* own AMD cards until crossfire works in Windowed or Borderless modes. I just won't. Because I'm not getting single GPU. I'm getting multi-GPU probably for as long as I live again. I thought I'd be satisfied with this machine but hooo boy, I didn't know the depths of my own desire for power. But that's my logical decision: buy what works best for me. When people are considering single cards and they've got a price point, the logical decision is the best card at that price point. The 980Ti is the winner at its price point. AMD wins at every other price point. But you're right; if a Fury X was only $570 or so... ONLY THAT. I would be telling everyone to grab them. Screw the 980Ti in that respect.
Also, there have been deals I've seen people getting. There are people buying EVGA B-stock 980s for under $400 with ACX 2.0 coolers. I can't look someone square in the face and tell them to buy an R9 390X for $420 instead of a GTX 980 for $350. There's nothing wrong with the 980, and that price is apt for it. Can't do jack about that. -
On the plus side, here are some good things about the desktop CPU market from AMD, which I hope can be taken at face value (key word: HOPE):
http://www.overclock3d.net/articles...number_crunching_performance_of_steamroller/1 -
They pretty much ceded the high-performance CPU market to Intel, and focused instead on high-value (for cost) APUs. And their GPUs match / trade with nVidia GPUs (at best) on performance, and typically perform worse on heat, noise, and power. I really hope that AMD's HBM architecture on their GPUs can scale up to something awesome in the future; and that they continue down their innovative design thinking with things like the AMD R9 Nano. Fantastic design, fantastic performance; the only thing wrong with that card is that it needs a price drop.
I really want some great competition in the market for both CPUs and GPUs. I'm hoping that AMD's new leadership team can help turn that company around. -
If nVidia is going to push Gameworks, the least they could do is:
- make sure it runs decently on their own hardware
- don't use excessive tessellation/AA/whatever that kills performance for no visual improvement whatsoever (think Hairworks in Witcher 3)
Otherwise, what's the point of having some extra features that nobody can use except MAYBE those with 2 or more GPUs at the highest end (i.e. those willing to spend $1400+)? Not to mention the whole black-box issue, which I won't get into.
I mean, just look at how TressFX was handled after the initial launch gaffe. Because Crystal Dynamics provided nVidia with the source code, they were able to provide a patch within 2 weeks, and their cards ended up performing better than AMD's! This is why Gameworks is such a cancer and really needs to DIAF as fast as possible. And that's besides the (over)tessellation shenanigans we've seen more than a few devs pull as well, most recently in Witcher 3.
Yeah, I hate Gameworks with a passion. -
It really defeats the purpose of it all if it just runs like aids for everyone, and if they're going to partner with Ubicrap they should make sure games with their logo on it actually run. Same with WB games and Arkham Knight. CDPR just doesn't know the word "optimization", just like the Crysis devs. -
The problem wasn't that Crysis was poorly optimized. The problem was that they put in an Ultra-High graphics setting that was unattainable, and it drove gamers nuts. It drove them nuts to know that they couldn't just move the graphics sliders to max and have their expensive new hardware crush that game. If Crytek had just disabled the Ultra-High setting and made High the best graphics setting you could attain, then the whole controversy around Crysis 1 would never have existed.