The Notebook Review forums were hosted by TechTarget, who shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information posted on the forums is preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.

    AMD, where art thou?

    Discussion in 'Gaming (Software and Graphics Cards)' started by Cloudfire, Sep 24, 2015.

  1. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    But have you seen the M295X benchmarks? It's not even that much faster than the 7970M and its rebadges. It would have to be really cheap, like $350, to warrant consideration over the 970M.
     
    TomJGX and sa7ina like this.
  2. sa7ina

    sa7ina Notebook Consultant

    Reputations:
    543
    Messages:
    272
    Likes Received:
    203
    Trophy Points:
    56
    At first I thought it was something new.
    Apart from that, I agree with you.
     
  3. ssj92

    ssj92 Neutron Star

    Reputations:
    2,446
    Messages:
    4,446
    Likes Received:
    5,690
    Trophy Points:
    581
    This product isn't aimed at gamers. It's essentially an R9 M295X.

    Maybe AMD will make a "desktop 390X" in the same form factor as the 200W 980 in the P870DM :p

    I don't think we'll see AMD compete until HBM is on MXM. Once HBM is implemented on MXM, we'll probably see some nice AMD products (hopefully).
     
  4. sa7ina

    sa7ina Notebook Consultant

    Reputations:
    543
    Messages:
    272
    Likes Received:
    203
    Trophy Points:
    56
    You are probably right:
    [image: AMD-Embedded-Radeon_4.png]
     
  5. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    Pshh. If Hawaii could have shrunk its TDP envelope, AMD would have done it already. As I already pointed out, needing a far larger PSU for an R9 390 than for something like a 970 is already a dealbreaker for some users.
     
  6. DataShell

    DataShell Notebook Deity

    Reputations:
    47
    Messages:
    777
    Likes Received:
    354
    Trophy Points:
    76

    What even is this?
     
  7. Ethrem

    Ethrem Notebook Prophet

    Reputations:
    1,404
    Messages:
    6,706
    Likes Received:
    4,735
    Trophy Points:
    431
  8. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    The sound of zero cares being given
     
    TomJGX and sa7ina like this.
  9. triturbo

    triturbo Long live 16:10 and MXM-B

    Reputations:
    1,577
    Messages:
    3,845
    Likes Received:
    1,239
    Trophy Points:
    231
    I've yet to see the R9 M295X performing in a non-throttled machine.
     
  10. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    Look at R9 285 benchmarks; that'll give you a good idea.
     
  11. triturbo

    triturbo Long live 16:10 and MXM-B

    Reputations:
    1,577
    Messages:
    3,845
    Likes Received:
    1,239
    Trophy Points:
    231
    It's a cut-down chip. Also, I'm not saying it would demolish anything and everything, just that there's still some performance to be gained in a proper system.
     
  12. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    Which is clocked higher, so the FLOPS figure is about the same (actually slightly higher on the 285).

    Thanks for stating the obvious, but at the end of the day the potential unthrottled performance is a known quantity and it's nothing special.
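    The arithmetic behind that claim is just theoretical FP32 throughput (2 FLOPs per shader per clock, one FMA); a minimal sketch in Python, assuming the commonly cited shader counts and clocks for the two parts:

        # Theoretical FP32 throughput: 2 FLOPs (one FMA) per shader per clock.
        def tflops(shaders, clock_mhz):
            return 2 * shaders * clock_mhz * 1e6 / 1e12

        # Commonly cited figures for these parts (assumptions, not verified here):
        for name, (shaders, mhz) in {
            "R9 285 (Tonga Pro)":  (1792, 918),
            "R9 M295X (Tonga XT)": (2048, 850),
        }.items():
            print(f"{name}: {tflops(shaders, mhz):.2f} TFLOPS")  # both ~3.3-3.5

    Either way, with numbers in that range the two land close together, which is the point being made.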
     
  13. triturbo

    triturbo Long live 16:10 and MXM-B

    Reputations:
    1,577
    Messages:
    3,845
    Likes Received:
    1,239
    Trophy Points:
    231
    Yeah but people make it look worse than it is.
     
  14. Apollo13

    Apollo13 100% 16:10 Screens

    Reputations:
    1,432
    Messages:
    2,578
    Likes Received:
    210
    Trophy Points:
    81
    Yeah, Kaveri or later APU cores - IIRC, all of them, including non-A10s, support HSA. If Deks is correct that Photoshop can use HSA (I don't use Photoshop, so I can't verify that), then it would get a boost from Kaveri and Carrizo APUs, which have HSA support. Since that would allow Photoshop to take advantage of both CPU and GPU cores at once, it could lead to a significant performance improvement.

    The benefit would be for any program that was built to use HSA, not all programs. Currently, that probably isn't very many - it does yield good benefits, but you'd have to add code to optimize for AMD APUs, so it's a non-trivial effort, and more so since AMD's marketshare isn't great, and a good part of AMD's processors in use are still either pre-Kaveri, or not APUs (that is, desktop FX/Athlon parts). I'd be curious to see an up-to-date list of what software does support HSA.
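    HSA itself is exposed through its own runtime, but the discovery side of "CPU and GPU cores at once" can be illustrated with a short OpenCL sketch (pyopencl assumed installed; on a Kaveri/Carrizo APU, an HSA-aware driver stack lists the CPU and the integrated GPU side by side):

        import pyopencl as cl  # third-party: pip install pyopencl

        # Enumerate every compute device the system exposes. A program built
        # for heterogeneous compute would dispatch work across all of these.
        for platform in cl.get_platforms():
            for device in platform.get_devices():
                kind = cl.device_type.to_string(device.type)
                print(f"{platform.name}: {device.name} [{kind}]")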
     
  15. moviemarketing

    moviemarketing Milk Drinker

    Reputations:
    1,036
    Messages:
    4,247
    Likes Received:
    881
    Trophy Points:
    181
    Hmm... unfortunately, in Photoshop the Kaveri 3.9GHz and 4GHz quad-cores seem to perform around the speed of a Haswell dual-core i3, and for example significantly slower than the old 2.66GHz i7-920: http://www.tomshardware.com/charts/cpu-charts-2015/-29-Adobe-Photoshop-CC,3720.html

    Maybe it's only useful for certain specific tasks within Photoshop? Or perhaps some setting for HSA wasn't properly enabled when Tom's Hardware ran this benchmark.
     
    TomJGX and Kent T like this.
  16. moviemarketing

    moviemarketing Milk Drinker

    Reputations:
    1,036
    Messages:
    4,247
    Likes Received:
    881
    Trophy Points:
    181
    TomJGX likes this.
  17. Apollo13

    Apollo13 100% 16:10 Screens

    Reputations:
    1,432
    Messages:
    2,578
    Likes Received:
    210
    Trophy Points:
    81
    According to Tom's review of HSA ( http://www.tomshardware.com/reviews/a10-7850k-a8-7600-kaveri,3725-11.html), it does appear to vary by test - in some tests the i3 4330 is equal to the Kaveri 7850, but in other tests the 7850 matches the quad-core i5 4670K. So your mileage may vary - I don't know enough about Photoshop to know which benchmarks would be relevant to which workflows. Things also may have changed somewhat since Carrizo, Broadwell, and Skylake came out.

    The workstation cards are somewhat interesting. I haven't kept up on the latest workstation cards on either the AMD or NVIDIA side, but it's nice to see a refresh. And while perhaps not targeted towards gaming, they can be used for gaming. I've been using a FirePro in my laptop, and have a friend who's using a Quadro, and they do perfectly well (though on a new laptop, they probably wouldn't be the best performance per dollar for gaming).
     
  18. triturbo

    triturbo Long live 16:10 and MXM-B

    Reputations:
    1,577
    Messages:
    3,845
    Likes Received:
    1,239
    Trophy Points:
    231
    TomJGX likes this.
  19. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    I don't see how it's better binned, since the 25W reduction is due to underclocking (and undervolting?). And the performance of the underclocked M295X is known: it sits between the 870M and 880M.
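    A back-of-the-envelope way to see why an underclock plus undervolt can shave roughly 25W: dynamic power scales approximately as P ∝ f·V². A minimal sketch with purely illustrative numbers (not measured M295X figures):

        # Rough dynamic-power model: P ≈ C·f·V² (constants folded into p_base_w).
        def scaled_power(p_base_w, f_ratio, v_ratio):
            return p_base_w * f_ratio * v_ratio ** 2

        # Illustrative only -- not measured M295X values:
        p = scaled_power(125, f_ratio=0.90, v_ratio=0.95)  # 10% underclock, 5% undervolt
        print(f"~{p:.0f} W")  # ≈ 102 W, i.e. roughly a 25 W drop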
     
  20. triturbo

    triturbo Long live 16:10 and MXM-B

    Reputations:
    1,577
    Messages:
    3,845
    Likes Received:
    1,239
    Trophy Points:
    231
    No, it's not known. That's exactly what I meant with:

    I'll wait; you can try to question everything I wrote, as you usually do.
     
  21. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    Maybe you should read the review first? The M295X in the AW15 doesn't throttle, and since the FirePro version has identical clocks, common sense dictates that it performs the same. It's possible the performance would improve with better drivers, but that's par for the course with AMD, and with Enduro in particular.
     
  22. triturbo

    triturbo Long live 16:10 and MXM-B

    Reputations:
    1,577
    Messages:
    3,845
    Likes Received:
    1,239
    Trophy Points:
    231
    You'll excuse my doubts, but Alienware is no longer Alienware (the performance brand).
     
    TomJGX likes this.
  23. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    Nice straw man, but that has nothing to do with this
     
  24. uber_shnitz

    uber_shnitz Newbie

    Reputations:
    0
    Messages:
    9
    Likes Received:
    0
    Trophy Points:
    5
    AMD hasn't really been competitive in the notebook space for a while...
     
  25. n=1

    n=1 YEAH SCIENCE!

    Reputations:
    2,544
    Messages:
    4,346
    Likes Received:
    2,600
    Trophy Points:
    231
    They stopped competing after 7970M.

    Well I suppose 8970M was the budget alternative to 780M, but beyond that AMD definitely stopped even attempting to compete.
     
    Last edited: Oct 4, 2015
  26. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    Yep, they left their mobile users like poor triturbo here high and dry
     
    TomJGX likes this.
  27. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    It really wasn't though. The 680M era with the 7970M? For certain. 780M and onward? AMD just lost. Hell, even an 870M is better: going by core strength alone, it beats a 780M at stock, 967MHz boost on 1344 cores vs 823MHz on 1536 cores. And the 870M was cheaper than AMD's offerings too; no reason to get an AMD card after that.

    The 780M's only saving grace was its ease of overclocking, and the 970M basically destroyed that benefit.

    It'll be way too late, but good lord I hope AMD's Arctic Islands cards blow nVidia out of the water... both in mobile and in desktop form. We all know nVidia can do good work if they want to, but they need to be forced into actually moving their feet a bit.
     
  28. n=1

    n=1 YEAH SCIENCE!

    Reputations:
    2,544
    Messages:
    4,346
    Likes Received:
    2,600
    Trophy Points:
    231
    Well, the 780M and 8970M came out around the same time, while the 870M came almost a year later, so the 870M wasn't contemporaneous with the 8970M. The 8970M had, I think, ~90% of the 780M's performance for 70% of the price? That's why I said it was the budget alternative.
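    The "budget alternative" argument is a straight performance-per-dollar ratio; a quick check using the figures quoted above (n=1's estimates, not verified benchmarks):

        # Perf-per-dollar using the estimates from the post above.
        perf_ratio  = 0.90  # 8970M at ~90% of 780M performance
        price_ratio = 0.70  # for ~70% of the price
        print(f"{perf_ratio / price_ratio:.2f}x perf per dollar vs 780M")  # ≈ 1.29x
        # Even with the ~80% figure argued below, 0.80 / 0.70 ≈ 1.14x.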
     
    triturbo and i_pk_pjers_i like this.
  29. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    Well, it wasn't like 90%. It was more like 80%. From what I remember, the 680M and 7970M were ballpark neck and neck before considering overclocking, and the 780M was ~25% faster than the 680M. If that's the case, an 8970M with even a simple 50MHz overclock would have been a lot weaker.

    What I think is that it offered a more midrange alternative. The 770M was a joke compared to the 780M, just like the 675MX was compared to the 680M. As much as people defend those cards, it is what it is. But an 8970M being vastly more powerful than a 770M for only ~$100 more? Perfect.
     
  30. triturbo

    triturbo Long live 16:10 and MXM-B

    Reputations:
    1,577
    Messages:
    3,845
    Likes Received:
    1,239
    Trophy Points:
    231
    Funny that the ones complaining are nGreedia owners. I'm waiting here patiently; I know it's a hard time for AMD, and I'll build an eGPU setup in the meantime. I hope the W7170M hits eBay soon and that my machine deals well with it, though I doubt it will. Ironically, my machine doesn't deal well with AMD cards, but I'm not giving up easily (at all), I just postpone.

    So yeah, I'm waiting; you (nGreedia owners) are the ones crying, because they milk you to hell and back, and you realize it, but you're still fine with it. I don't know how it goes in English, but we have a saying in Bulgarian: it's not the one who takes who's crazy, but the one who gives. As long as you throw your money at nGreedia, I don't see how you'll resolve your problems. Just be happy with whatever they throw at you (as seems to be the case) and call it a day.

    I'll be here hoping that AMD survives, but the news isn't good: they're laying off workers, they most likely won't source enough (if any) HBM2 chips (I see nGreedia and their money here; who would say no to a bulk sale?), and so on. Obviously the Fury line wasn't enough to pull them up off the bottom. I don't see how next year will be any different.
     
  31. Ethrem

    Ethrem Notebook Prophet

    Reputations:
    1,404
    Messages:
    6,706
    Likes Received:
    4,735
    Trophy Points:
    431
    I just removed the last two posts. Play nice.
     
  32. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    As much as I hate nVidia right now, I'm not going to buy an inferior GPU to game on because it has the name AMD tied to it.

    That defeats the entire purpose of buying a machine to game on.
     
    octiceps likes this.
  33. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    Exactly. That's making an emotional rather than rational/logical purchasing decision AKA being a fanboy.
     
    D2 Ultima likes this.
  34. DataShell

    DataShell Notebook Deity

    Reputations:
    47
    Messages:
    777
    Likes Received:
    354
    Trophy Points:
    76
    CAUTION: I might have written a bit of a book. (Are you proud of me @D2 Ultima? :D)

    @triturbo As much as I respect your "put your money where your mouth is" attitude, you can't expect users not to buy a (relatively) good product. Like it or not, the only option for users who want high-end performance and/or portability in their gaming laptops is to buy Nvidia GPUs. I don't like it either (it's in fact one of the reasons I've managed to hold out on upgrading so far), but that won't change until AMD gets its crap together in the desktop market and can start taking the mobile market seriously (seriously, the Fury X is an amazing piece of engineering and absolutely obliterates anything Nvidia has to offer in raw compute performance, yet it is ~5% slower than a 980 Ti in games, WHAT THE HELL AMD). In the desktop market, however, I would say it is reasonable to expect more gamers to put their money where their mouths are and buy a slightly slower Fury X, if nothing else for the potential performance improvements via drivers, the promise it shows in DX12 and Vulkan, the fact that it's water-cooled, and the fact that a Fury X CrossFire configuration actually beats a Titan X SLI configuration (this is actually true but never talked about, and I don't get why).
    /ramble
    Aww, I missed the forum drama. :p
     
  35. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    You call that a book? That's not even a chapter.

    Senpai did not notice you.
     
    TomJGX likes this.
  36. DataShell

    DataShell Notebook Deity

    Reputations:
    47
    Messages:
    777
    Likes Received:
    354
    Trophy Points:
    76
    :( Maybe one day I can be worthy of Senpai's attention...
     
  37. moviemarketing

    moviemarketing Milk Drinker

    Reputations:
    1,036
    Messages:
    4,247
    Likes Received:
    881
    Trophy Points:
    181
    Not expecting to see anything special for mobile, but on the desktop side at least, it appears AMD will launch the Fury Gemini soon.

     
  38. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    While that might be true, CrossFire still doesn't work in windowed/borderless modes, so to me and quite a few other users it's pointless tech.
     
  39. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    Gemini, eh? Sounds like codename for the Fury X2 dual-GPU card.
     
  40. Kent T

    Kent T Notebook Virtuoso

    Reputations:
    270
    Messages:
    2,959
    Likes Received:
    753
    Trophy Points:
    131
    AMD needs to leave the APU behind and focus on a new, improved architecture: improve their processor speed and power while maintaining battery efficiency, and keep their present Radeon offerings as the stock GPU. AMD needs some major gains to stay competitive, and needs to catch up to Intel clock for clock, core for core, at minimum. Anything less will not do.
     
  41. n=1

    n=1 YEAH SCIENCE!

    Reputations:
    2,544
    Messages:
    4,346
    Likes Received:
    2,600
    Trophy Points:
    231
    I wish it were this black and white. nVidia lied about the 970 (and possibly about Maxwell's async compute capabilities), so the logical thing to do is to vote with your wallet and not give them another penny.

    Then on the other hand, you have AMD, who charges $649 for a Fury X that barely keeps up with a stock 980 Ti, has barely any overclocking headroom, and will likely be handicapped by its 4GB of HBM within the next two years. So purely from a technical and "futureproofing" standpoint, the logical decision is to buy a 980 Ti. I was very morally conflicted about this, but in the end I caved and bought a 980 Ti, because I couldn't justify paying the same for less just to prove a point. I'm sad, because had AMD priced the Fury X at $549, it would've been an instant buy for me.

    And yes, I'm being a hypocrite, but like I said, it's rarely black and white, and sometimes you just have to make decisions. :(
     
    TBoneSan likes this.
  42. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    No, but it is this red and green.

    I'll show myself the door now. Merry early Christmas, y'all.
     
    kenny27 and D2 Ultima like this.
  43. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    It seems black and white to me.

    I keep saying it: $200 price point? R9 380. ~$320 price point? R9 390. $400 price point? R9 390X. $530 price point? R9 Fury, OR save up and buy a 980 Ti. $650 price point? 980 Ti.
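    Those price-point picks reduce to a simple lookup; a minimal sketch (the prices and picks are the post's own claims, not verified MSRPs):

        # Price-point recommendations as stated above (not verified MSRPs).
        picks = {
            200: "R9 380",
            320: "R9 390",
            400: "R9 390X",
            530: "R9 Fury (or save up for a 980 Ti)",
            650: "GTX 980 Ti",
        }

        def pick(budget):
            """Best pick at or under the given budget, per the post."""
            eligible = [p for p in picks if p <= budget]
            return picks[max(eligible)] if eligible else "no recommendation"

        print(pick(450))  # -> R9 390X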

    AMD doesn't get a free pass for the Fury X. I'm not sorry. Zotac's AMP! Extreme 980 Ti has something like a 1300MHz+ core clock AT STOCK, and it can be found for under $700. You don't even need to bother with OC potential... you quite literally buy that card and you're done. There's no reason to purchase a Fury X.

    However, and this is the big however, in the mobile sector it isn't two nearly equivalent cards at the same price points, the way the 960/380, 970/390, 980/390X/Fury, and 980 Ti/Fury X are. It's "960M, 965M, 970M and 980M", and nothing AMD has even comes close at the same price points. Especially because of the OC potential of the 970M, which far outstrips that of the 980M (not saying the 980M ends up slower, but you can't OC a 980M nearly as much as a 970M in any case I've seen, which makes the 970M a great choice for many a person).

    So yes, to me it's black and white. If the cards are basically the same power, go with the one that has the better ethics and whatnot. If the cards are the same power and one is cheaper, grab that; I don't blame ya. If the cards are NOT roughly the same power but they're the same price... buy the stronger one.

    Oh, and just to make you hate nVidia more: they didn't "just lie" about the 970... they ARE STILL LYING. The card effectively has a 224-bit memory bus and cannot use all eight 32-bit memory controllers in tandem, yet it's marketed as a 256-bit card.
     
  44. n=1

    n=1 YEAH SCIENCE!

    Reputations:
    2,544
    Messages:
    4,346
    Likes Received:
    2,600
    Trophy Points:
    231
    Well, see, by continuing to buy nVidia's products, I'm basically telling them it's OK to cheat and lie, because I'll still buy their crap at the end of the day. On the other hand, there really was no justification for spending the same and getting less with the Fury X. Honestly, if they'd ditched the CLC and sold a bare Fury X for $569, I'd still take that over the 980 Ti. That, or if Win 10 were any good, AMD would at least have a pretty good shot at taking on nV under DX12.

    Also, "nVidia" and "ethics" should never appear in the same sentence lol. They're morally bankrupt: from anti-competitive practices such as the GameWorks BS, to the price gouging from Fermi to Kepler which continued into Maxwell, it's pretty clear the only things they care about are their margins and bottom line. The sad part about GameWorks is I literally can't think of a single GameWorks game that didn't run like utter garbage even on nVidia's own hardware. Their willingness to make their own owners suffer just to make the competition look worse is beyond appalling. Oh, and I fully expect them to "forget" about optimizing Maxwell when Pascal launches. The only thing we can do is make as much noise as possible, or do whatever it takes to hurt their bottom line so they take notice.

    Not saying AMD is a saint, but I think it's quite telling that Gaming Evolved titles run equally well if not better on nVidia's hardware, and except for the launch issues with TressFX in Tomb Raider (which were nVidia's fault anyway), none of AMD's sponsored games has had any issues running on nVidia GPUs.
     
    Last edited: Oct 5, 2015
  45. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    I actually find the 980 Ti a fine product that's fairly priced and worth buying. The rest of their line is not, which is why I never recommend it. Besides, it might make a point if nVidia notices 980 Ti and 750 Ti cards selling out while literally every other card in their lineup is left high and dry. (And the flip side for AMD too: the R9 390 and R9 380 getting so many sales should tell them something about price-to-performance vs. the competition.)

    You really don't need to remind me of n-"launch a new line with midrange cards as overpriced flagships, then launch higher-end cards later after everyone has already upgraded so they have to buy new GPUs again"-Vidia and their crap. Really, you don't.

    As for GameWorks... I don't even know. I've NEVER seen a single dev known for optimized titles use it. Never. It's Ubisoft publishing or CDPR. The former is cancer for gamers and the latter is known for pretty games, not well-running games (might I remind us all that Witcher 2 downsampled from 4K ran better than with Ubersampling on, which also disabled other forms of AA and effects?). Arkham Knight doesn't even COUNT. It's not like TWIMTBP titles in the past have meant AMD cards were crap; the Unreal Tournament games and Borderlands and such have had no problems I know of on AMD cards except lacking PhysX. I'm harsh about this, I know, but I value being fair more than I value bashing nVidia.

    I know AMD isn't a saint either. Believe me, I know. Their fans are almost cultish too. But I still think a card should be purchased for the user who wants it, and sometimes there's no alternative from the other company. Look at me: I'll *NEVER* own AMD cards until CrossFire works in windowed or borderless modes. I just won't, because I'm not getting a single GPU. I'm probably getting multi-GPU for as long as I live. I thought I'd be satisfied with this machine, but hooo boy, I didn't know the depths of my own desire for power. But that's my logical decision: buy what works best for me. When people are considering single cards and they've got a price point, the logical decision is the best card at that price point. The 980 Ti is the winner at its price point. AMD wins at every other price point. But you're right: if a Fury X were only $570 or so... ONLY THAT... I would be telling everyone to grab them. Screw the 980 Ti in that respect.

    Also, there are deals I've seen people getting. People are buying EVGA B-stock 980s for under $400 with ACX 2.0 coolers. I can't look someone square in the face and tell them to buy an R9 390X for $420 instead of a GTX 980 for $350. There's nothing wrong with the 980, and that price is apt for it. Can't do jack about that.
     
  46. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
  47. kent1146

    kent1146 Notebook Prophet

    Reputations:
    2,354
    Messages:
    4,449
    Likes Received:
    476
    Trophy Points:
    151
    I don't know, man. I mean, for the past few years, AMD hasn't been performance-leading in either CPU or GPU.

    They pretty much ceded the high-performance CPU market to Intel and focused instead on high-value (for the cost) APUs. And their GPUs at best trade blows with nVidia GPUs on performance, and typically do worse on heat, noise, and power. I really hope that AMD's HBM architecture can scale up to something awesome in the future, and that they continue down their innovative design path with things like the AMD R9 Nano. Fantastic design, fantastic performance; the only thing wrong with that card is that it needs a price drop.

    I really want some great competition in the market for both CPU and GPU. I'm hoping that AMD's new leadership team can help turn that company around.
     
    TomJGX likes this.
  48. n=1

    n=1 YEAH SCIENCE!

    Reputations:
    2,544
    Messages:
    4,346
    Likes Received:
    2,600
    Trophy Points:
    231
    The main thing I was getting at is that GameWorks just doesn't seem to serve any purpose whatsoever besides gimping hardware all around. When turned on, the effects kill frames like crazy, and they aren't even that good. Yes, I realize this has as much to do with the devs as with nVidia, but if they're going to provide an effects library to devs, then the least they could do is:

    - make sure it runs decently on their own hardware
    - not use excessive tessellation/AA/whatever that kills performance for no visual improvement whatsoever (think HairWorks in Witcher 3)

    Otherwise, what's the point of having some extra features that nobody can use except MAYBE those with two or more GPUs at the highest end (i.e. those willing to spend $1400+)? Not to mention the whole black-box issue, which I won't get into.

    I mean, just look at how TressFX was handled after the initial launch gaffe. Because Crystal Dynamics provided nVidia with the source code, they were able to ship a patch within two weeks, and their cards ended up performing better than AMD's! This is why GameWorks is such a cancer and really needs to DIAF as fast as possible. And that's besides the (over)tessellation shenanigans we've seen more than a few devs pull, most recently in Witcher 3.

    Yeah, I hate GameWorks with a passion.
     
  49. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    Yeah, I agree.

    It really defeats the purpose of it all if it just runs like garbage for everyone, and if they're going to partner with Ubicrap they should make sure games with their logo on them actually run. Same with WB Games and Arkham Knight. CDPR just doesn't know the word "optimization", just like the Crysis devs.
     
  50. kent1146

    kent1146 Notebook Prophet

    Reputations:
    2,354
    Messages:
    4,449
    Likes Received:
    476
    Trophy Points:
    151
    Crysis 1 actually ran fine on mid-range GPUs at the time it was released. When people complained that it wouldn't run at the highest settings on the highest-end, money-is-no-object hardware, the devs said to turn settings down from Ultra High to High and the game would run fine.

    The problem wasn't that Crysis was poorly optimized. The problem was that they put in an Ultra High graphics setting that was unattainable, and it drove gamers nuts. It drove them nuts to know that they couldn't just move the graphics sliders to max and have their expensive new hardware crush the game. If Crytek had just disabled the Ultra High setting and made High the best graphics setting you could attain, then the whole controversy around Crysis 1 would never have existed.

    Sent from my XT1575 using Tapatalk
     
    TomJGX and TBoneSan like this.