One reason AMD cards have sometimes had higher frame rates in games over the years is that they can arbitrarily leave out some graphical elements. This is one of the reasons Futuremark requires AMD users to keep tessellation enabled. Turning tessellation down or off in games and benchmarks makes the cards seem more powerful than they really are, but only because some things are not being rendered at all. Depending on the game or benchmark it might not be obvious that elements are missing, but in other cases it can be blatant. For example, where grass or vegetation should be visible there can be a blank area with nothing but a colored background.
-
AMD still trails Nvidia in tessellation performance, but they are ahead in compute.
-
Meaker@Sager Company Representative
Yes, but they only put limits in because developers were purposely over-tessellating scenes. It's all a bit sad, but pixel for pixel the AMD cards are usually just fine.
-
There was that anti-aliasing fiasco in Batman: Arkham Asylum a few years back where AMD users were shut out from using AA in the game unless they tricked it into thinking they were running an Nvidia card. That was followed by the episode in Crysis 2 with the invisible tessellated waves beneath the level and overly tessellated concrete slabs. And more recently, there was the fur on the dog's back in CoD: Ghosts, excessive use of line tessellation designed specifically to tank performance on AMD cards (it was a GameWorks title, after all). -
failwheeldrive Notebook Deity
And AMD doesn't have higher memory bandwidth than Nvidia (at least when comparing the 290X to the 780 Ti or Titan Black). The Ti has 336 GB/s of bandwidth versus the 290X's 320 GB/s. Despite having a narrower 384-bit memory bus, the Ti gets the edge over the 290X due to its higher-clocked memory (7 GHz vs 5 GHz effective). -
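For anyone who wants to check those figures, peak bandwidth is just the bus width (in bytes) times the effective memory data rate. A minimal sketch in Python; the bus widths and effective clocks are the commonly quoted specs for these two cards:

```python
def memory_bandwidth_gbps(bus_width_bits: int, effective_clock_ghz: float) -> float:
    """Peak memory bandwidth in GB/s: bus width in bytes x effective data rate."""
    return (bus_width_bits / 8) * effective_clock_ghz

print(memory_bandwidth_gbps(384, 7.0))  # GTX 780 Ti: 384-bit bus, 7 GHz effective -> 336.0 GB/s
print(memory_bandwidth_gbps(512, 5.0))  # R9 290X: 512-bit bus, 5 GHz effective -> 320.0 GB/s
```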
If what you're saying is true, then AMD cards are actually multitudes more powerful than nVidia cards... because as I pointed out earlier, games from 2012 and back that were nVidia The Way It's Meant To Be Played titles ran about as well on nVidia cards as they did on AMD cards, but AMD Gaming Evolved titles required roughly one card tier up from nVidia to run as well (i.e. a 6870 would need something like a 570 to compete, even though a 570 should be a good bit stronger than a 6870). If nVidia gimped AMD cards, but AMD games on AMD cards just bring out their full potential while nVidia cards struggle to catch up in AMD games, then AMD has had the strongest cards for years and nVidia has just been paying off devs to make it look like they're even and/or winning. -
Also, as for AMD, their memory bandwidth has always been "just enough" compared to nVidia's. I seem to remember the GDDR5 on 256-bit bus Radeon HD 4xxx cards being kept up with by nVidia's 280/285 and their GDDR3 memory, thanks to the 512-bit bus. So that bit doesn't surprise me. But I really haven't heard (except from one or two people earlier in this same thread) that CrossFireX was better performance. I know in some games (like CS:GO and the DayZ mod) SLI does jack poop and disabling it gives quite literally the same performance. But in others, even 60% scaling is better performance than a single card (see the quick math below). Maybe not WORTH having the two cards, especially for laptop owners... but definitely better overall. -
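To put a number on that 60% scaling comment, here's a tiny sketch (the 40 FPS single-card figure is made up purely for illustration):

```python
def multi_gpu_fps(single_gpu_fps: float, num_gpus: int, scaling_per_extra_gpu: float) -> float:
    """Estimated frame rate when each extra card adds only a fraction of a full card's worth."""
    return single_gpu_fps * (1 + (num_gpus - 1) * scaling_per_extra_gpu)

print(multi_gpu_fps(40, 2, 0.60))  # 64.0 FPS -- short of an ideal 80, but well above a single card's 40
print(multi_gpu_fps(40, 2, 0.00))  # 40.0 FPS -- the CS:GO/DayZ case where SLI does nothing
```

So even mediocre scaling is still a real gain over one card; whether it's worth the cost and heat is a separate question.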
And then there was that episode years ago when Nvidia was busted for the heavy-handed tessellation in Crysis 2, and it was painfully obvious when you saw all the places where tessellation was needlessly or wastefully employed. I don't think Crytek is that incompetent, so I'm pointing the finger at Nvidia for that one.
You can see how this hurts not just AMD users but gamers and the PC ecosystem as a whole. This is essentially what AMD has been accusing Nvidia of all these years.
And the sad part is, Nvidia haven't changed their ways at all; they've only gotten bolder. Now, with GameWorks, they've got this library of effects that they distribute to contracted game developers as DLLs: essentially black boxes with no source code, already pre-optimized for Nvidia hardware, which prevents the developers from optimizing these effects for AMD. In fact, there's a clause in the developer's contract with Nvidia stipulating that they are prohibited from even showing these effects to AMD. So in the end, AMD cards will always slow down more than Nvidia ones when running these effects, and there's nothing AMD or the dev can do about it.
I can't recall a Gaming Evolved title running significantly slower on Nvidia hardware. Care to list a few examples? The only one I can think of is TressFX in Tomb Raider, but that was because it took a few months for Nvidia to optimize its drivers. After that, it runs exactly the same (which is to say, not very well LOL) on both Nvidia and AMD cards.
On the flip side, I know there have been quite a few big TWIMTBP/GameWorks titles in recent years that have been gimped on AMD hardware, whether in performance or image quality. The most recent examples I can think of are CoD: Ghosts and Watch Dogs. -
Also, BF3 is an nVidia title which saw no significant downgrades on AMD hardware (which is why I said pre-2012). I say mostly pre-2012 because apparently these new consoles have shaken up developers and they are just unoptimizing crap left, right and center, so I think all the games in the last year and a half are just anomalies for the most part.
Anyway, as for AMD games which run severely worse on nVidia hardware, I can remember for certain that Far Cry 3 is one of them. Crysis 3 is another, and it's one of the only games in which a stock R9 290X will clearly beat a GTX 780 Ti (according to Tek Syndicate). BF4 also definitely runs better on AMD hardware even without Mantle enabled. I'm trying to remember some older titles that were worse on nVidia hardware too, but I am awful with names. I'd need to see the title first to tell you, but AMD only lists the most recent ones on its website. I know Tomb Raider ran better for AMD with TressFX on, but that's because AMD cards are stronger in that respect, which is to say in DirectCompute. So I don't blame that game at all.
As for Ghosts, I stand corrected. All the information stated that the dog fur was indeed going to use PhysX, so I did not know it was available for AMD users. Ghosts on the whole is unoptimized as a moocow though, but ah well. I can't counter that bit of your argument. -
If Crytek were incompetent, I don't think its engine would be so advanced, with every rendering trick in the book. You forget that it pioneered the use of ambient occlusion in games; Crysis was the first game to use SSAO back in 2007. Even the damn grass in Crysis 3 is innovative. CryEngine is one of the few engines, along with Frostbite, that can take advantage of 6-8 CPU cores. And the thing about Crysis and Warhead is that they were so far ahead of their time. They've run great on every video card I've owned from 2011 on, and not half-bad either on the 8800 GT before that at tweaked "cheap" Very High settings in DX9. And I loved how tweakable and moddable they were, which you just don't find in new games nowadays.
BF3 is both a Gaming Evolved and TWIMTBP title. Says so right on the back of my game box. It ran better on comparable Nvidia hardware for the first year, until the Catalyst 12.11 Never Settle driver came out and AMD optimized the hell out of its drivers, taking the performance crown in not just BF3 but a bunch of other AAA titles literally overnight.
All those other AMD titles you listed are a toss-up based on which reviews you look at. And most likely, it just took Nvidia a little bit longer to optimize its drivers for them since they didn't have access to the code as early as AMD. Thus the review code showed them performing slightly better on AMD. But in the end, I assure you that Nvidia was able to bring performance up to, if not past, AMD, since we all know how good Nvidia is at optimizing its drivers (no joke).
I've got both AMD and Nvidia cards and I can tell you, out of the games I own from the Gaming Evolved catalog, there is nothing that runs clearly better on one or the other, which can't be said of a few infamously TWIMTBP/GameWorks titles in the recent past. And AMD doesn't have the recent history that Nvidia has of gaming the system with shady and anti-competitive practices.
TressFX in Tomb Raider runs the same for both Nvidia and AMD. It uses DirectCompute. Nvidia's DirectCompute performance is very competitive. I think you're thinking of OpenCL, where AMD clearly leaves Nvidia in the dust. -
failwheeldrive Notebook Deity
It absolutely crushes games that are optimized for CFX (and sucks in games that aren't, lol).
Here's another quick video review of 2-, 3- and 4-way CFX 290Xs at 4K. The scaling is ridiculous in a lot of games. https://www.youtube.com/watch?v=TISoJsWbaSc#t=40 -
-
Meaker@Sager Company Representative
They either have to or beef up the bridge interconnect.
-
Received the monitor today. Tried it on Bioshock Infinite at 4K on Ultra with a couple of things on High and I'm getting 48 FPS average, which is annoying because it's just (well, OK, quite a way) short of 60 FPS.
May try overclocking the cards but don't want to burn them out! -
Meaker@Sager Company Representative
Obviously don't do anything you are not comfortable with, but it's worth testing to see if just pushing up the VRAM clocks will impact the FPS, since 4K requires a lot of bandwidth.
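For a rough sense of why 4K leans so hard on memory bandwidth, here's a back-of-the-envelope sketch (it only counts writing a single 32-bit render target per frame; real games read and write each pixel many times, so treat it as a lower bound):

```python
def framebuffer_traffic_gbps(width: int, height: int, bytes_per_pixel: int, fps: int) -> float:
    """GB/s needed just to write one full-screen render target every frame."""
    return width * height * bytes_per_pixel * fps / 1e9

print(framebuffer_traffic_gbps(1920, 1080, 4, 60))  # ~0.5 GB/s at 1080p
print(framebuffer_traffic_gbps(3840, 2160, 4, 60))  # ~2.0 GB/s at 4K -- four times the pixels
```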
-
Meaker@Sager Company Representative
Well, there are the double shipping costs to consider at this point.
-
That one is £150 more expensive in the UK (got the Samsung for £439.99) and in my opinion doesn't look as good aesthetically as the Samsung. I don't really care about the stand as I will be sitting directly in front of it on a desk and it is just the right height as it is, so I guess I'll keep it.
Does overclocking the VRAM increase the temperature of the core or just the memory? How much of an overclock should I give them in Afterburner?
Thanks -
-
Did a 300 MHz overclock on the memory; it seems to increase the FPS a fair bit, and I'm getting a constant 60 FPS in FFXIV: A Realm Reborn in one of the busy cities. Temps are 78 on one card and 75 on the other. I don't have an unlocked BIOS so I can't up the voltage. Cores are running at ~80% load.
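Rough math on what that memory overclock buys in bandwidth, assuming the +300 MHz offset applies to the ~2500 MHz clock Afterburner reports for the 780M (5 GHz effective GDDR5 on a 256-bit bus); this is just an estimate, not a measured figure:

```python
def bandwidth_gbps(bus_width_bits: int, reported_clock_mhz: float) -> float:
    """Peak bandwidth when the effective data rate is twice the reported memory clock."""
    effective_gtps = reported_clock_mhz * 2 / 1000
    return (bus_width_bits / 8) * effective_gtps

stock = bandwidth_gbps(256, 2500)        # 160.0 GB/s
overclocked = bandwidth_gbps(256, 2800)  # 179.2 GB/s
print(f"{stock:.1f} -> {overclocked:.1f} GB/s ({overclocked / stock - 1:.0%} more)")
```

A bandwidth bump of roughly that size is the kind of headroom that tends to matter most at 4K, where the cards are bandwidth-bound.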
-
At stock speeds, would the temps be lower? I don't really like seeing my cards that close to 80C. How is it that an unlocked BIOS can provide a performance increase?
-
Most computers run cards at 80C. The cards are designed with a limit of around 100C.
I think even the mid 80s is fine -
-
Do you have a link to the thread about the unlocked BIOS?
I can't find it on google!
Thanks again.
EDIT: Nevermind, found it on techinferno. Flashing looks a bit complicated but I may give it a try.
EDIT 2: Nevermind2, both cards are flashed and seem to still operate! What would you advise me to do with regard to further overclocking? I see there is a temperature target section; would this allow me to set the max temp I want my card to run at, and then it will overclock itself to get the maximum performance it can while staying below that temperature? Is it kind of like a self-adjusting overclock depending on how hot the cards are? If so, that sounds great! (See the rough sketch after this post.)
Any help would be much appreciated!
Thanks again for all your help prior as well. -
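That is essentially the idea behind a temperature target: the card keeps nudging its boost clock up while it has thermal headroom and backs off when it reaches the target. Here's a very simplified sketch of that kind of control loop; it is only an illustration of the concept, not Nvidia's actual GPU Boost algorithm, and the clock limits and step size are made-up values:

```python
def adjust_boost_clock(clock_mhz: int, gpu_temp_c: float, temp_target_c: float = 80.0,
                       step_mhz: int = 13, min_mhz: int = 823, max_mhz: int = 993) -> int:
    """One iteration of a simplified temperature-target loop: clock up when cool, down when hot."""
    if gpu_temp_c < temp_target_c and clock_mhz + step_mhz <= max_mhz:
        return clock_mhz + step_mhz   # thermal headroom left, boost one bin higher
    if gpu_temp_c >= temp_target_c and clock_mhz - step_mhz >= min_mhz:
        return clock_mhz - step_mhz   # at or over the target, back off one bin
    return clock_mhz                  # already pinned at a limit, hold

clock = 823
for temp in (65, 70, 74, 79, 82, 84, 79):  # hypothetical readings during a gaming session
    clock = adjust_boost_clock(clock, temp)
    print(temp, clock)
```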
-
Thanks for your reply,
Just played 15 mins of Bioshock Infinite at 4K on High settings (+110/+500 on core and memory respectively, no change to voltage) and my max temp was 87 on one card (the other was only 82). It actually did crash after 15 mins. Is the Alienware thermal paste significantly worse than aftermarket pastes? How many degrees of a temperature drop could I expect using something like Arctic Silver or similar?
I don't think I'd want to do a repaste myself; I wonder if I could get Alienware to repaste it for me under warranty?!
Really jealous of your systems by the way, I bet you wouldn't need to bother with any of this with your 880m SLI config!
Thanks. -
Haha, you'd be surprised at how much better your rig performs compared to mine.
Myself, Mr. Fox, Slv, John and a few others have all compared 880M vs 780M scores. The 780M actually outdoes the 880M by a small margin at stock. This is purely down to Nvidia throttling the 880M via their vBIOS and drivers somehow.
Over the past few days I've been doing some research, and I've come to the conclusion that faster memory = better FPS during gaming. The core clock doesn't really push the FPS all that much. If you want a taste of how my 880Ms run with Slv's and John's vBIOS: 993 MHz/2500 MHz (actual core and memory speeds) is what my cards run at, at the normal stock boost speed.
Edit: it appears that our brothers favour overclocking the core rather than the VRAM. Take a look at the 'Just got my 880m twins' thread. -
Thanks for your reply. I bet the extra 8 GB of memory would be nice for 4K in some games, though.
I'm on a chat with Dell; warranty runs out on the 15th so I've gotta do it quick! -
-
Also, regarding temps, you need to use HWiNFO and let the fans go; the temps will drop dramatically. The stock Dell paste is actually pretty good and very long-lasting. The AW 18's stock fan profile is just way too conservative to adequately cool the machine, in my experience running it. Some games (Far Cry 3, Sleeping Dogs, etc.) push the temps pretty high, but now that the fan profiles can be manipulated, you can expect very cool temps on the stock thermal paste. But seriously, I don't think you need to be overclocking just yet.....
-
Thanks for your advice Dave,
What would you say is a sensible RPM for fans 1 and 2 (it seems I can't control the third one for some reason) if I want to leave them running continuously at the same speed while playing games? I don't really mind how loud they are; I just don't want to knacker the bearings on them too fast.
I guess I'll leave overclocking for now, or maybe only stick on a very modest VRAM overclock.
Is the reason my 1st card runs hotter than my 2nd card the different shape of the heatsink, or is it some other factor? What temps on a stock 780M would justify a repaste under full load?
Thanks again. -
Edit: Here is Mr. Fox's guide for setting up HWiNFO - Link -
In HWiNFO fan control, I can only control one fan, the CPU fan I think (the middle one). That one is controlled by the "Fan 1&2" section; Fan2 and Fan3 are greyed out so I can't edit them.
Are all of them meant to be adjustable?
Thanks
EDIT: Read this on a post:
Warning: Manual fan control is not possible with the Alienware 18. This also needs to be fixed. HWiNFO64 fan control does not function correctly due to issues with the EC. (Almico's SpeedFan also has exactly the same problem.) Fan control for the CPU fan is possible, but doing that cuts power to the GPU fans. They will turn off and cause the video cards to overheat severely. Fan 1 is the CPU fan. Fan 2 normally controls both GPU fans simultaneously. Both of these utilities work fine with the M18xR1/R2 but not the new 18. Leave it set to "System Auto" to avoid damage to your video cards.
So how do you guys control the speed of your GPU fans? -
-
Ah, so it isn't possible to change the fan settings on the Alienware 18 manually.
I saw a petition for Dell to make the fans controllable on the 18, but I don't think it has had any effect yet, unfortunately. Guess I'll just be happy I can play a couple of games at 4K without crazy temperatures. -
-
Meaker@Sager Company Representative
Because it opens the possibility of people cooking their systems most likely. It's a shame.
-
Thanks for the info Mr. Fox.
When in HWiNFO64 (latest beta version v4.41-2250) I don't see all three fans and I don't have the "Dell EC" section, only "Compal EC", which contains only the CPU and GPU1 fans. When I go to the fan button at the bottom, I can only control what is labelled as "Fan1 & 2".
I am on the latest BIOS (A08); is there something that needs to be changed in the BIOS for this to work, or are my settings in HWiNFO wrong, or is it something else?
Appreciate the help.
-
Found a thread you made helping someone setup HWiNFO64 for their Alienware 18 and now have the Dell EC.
I assume System/GPU will control both GPU fans at the same time? Which fans are which? Is the CPU Fan 1 and System/GPU Fan 2?
Thanks.
EDIT: Think I've got it worked out. I wonder why one of the GPU fans only goes to 3300 RPM when the other one goes to 3500 RPM? -
As you discovered, on the new Alienware 18 (and I assume the 14 and 17) you must enable EC support for manual fan control. On the M18xR1/R2 you can enable it but doing so sometimes causes odd system behavior, so it is better to disable EC support on an M18xR1/R2. -
Having used dual GPU solutions from both parties, I can only draw on 3 simple facts:
1. Stuff will break, stuff will never work as advertised; be prepared for workarounds regardless of brand. The only real difference is that workarounds are easier with NVIDIA because of the excellent NVIDIA Inspector.
2. VRAM capacity is king, so get as much as you can: when you go dual/tri/quad you are doubling/tripling/quadrupling rendering power, but VRAM capacity is mirrored, not added (see the sketch after this list).
3. Microstuttering is largely a thing of the past now, as older DX9-grade games run extremely well on single GPUs (though NVIDIA has a mild lead here). Your biggest issue is having optimized game profiles available at all; no profile = no smoothness = no scaling, and this is where NVIDIA has the edge, simply because of their larger driver team. -
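A quick illustration of that second point (a minimal sketch; the 4 GB figure matches a single 780M, the rest is generic):

```python
def effective_resources(vram_per_card_gb: float, cards: int) -> dict:
    """With AFR-style SLI/CrossFire, render power scales (imperfectly) but VRAM is mirrored."""
    return {
        "best_case_render_power": 1.0 * cards,   # before real-world scaling losses
        "usable_vram_gb": vram_per_card_gb,      # every card holds a copy of the same frame data
    }

print(effective_resources(4.0, 2))  # {'best_case_render_power': 2.0, 'usable_vram_gb': 4.0}
```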
IDK, I'd be pretty upset if my CrossFire setup didn't work properly in The Witcher 2, modded Skyrim, or PlanetSide 2, which are all DX9 games that are very demanding at their highest settings and require multi-GPU setups to run well (more so in the case of weaker notebook cards).
And you're right, thank god for Nvidia Inspector for allowing me to fix so many of Nvidia's broken or nonexistent SLI profiles. AMD users had RadeonPro up until a short while ago, but the developer gave it the cold shoulder and left to work on the Gaming Evolved App powered by Raptr. -
Meaker@Sager Company Representative
Do those problems hit the chips using the XDMA engine too?
-
Is it even possible to force CrossFireX in games that don't support it with AMD? I know I can do it with nVidia (like Skyrim + RCRN/ENB: force AFR2, etc., since RCRN/ENB doesn't support SLI by default).
-