The Notebook Review forums were hosted by TechTarget, which shut them down on January 31, 2022. This static read-only archive was pulled together by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information posted on the forums was preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.
Problems? See this thread at archive.org.

    4K gaming on Alienware 18 with 780m SLI possible?

    Discussion in 'Alienware 18 and M18x' started by l701x, Jun 19, 2014.

  1. Mr. Fox

    Mr. Fox BGA Filth-Hating Elitist

    Reputations:
    37,213
    Messages:
    39,333
    Likes Received:
    70,626
    Trophy Points:
    931
    One reason AMD cards have sometimes had higher frame rates in games over the years is that they can arbitrarily leave out some graphic elements. This is one of the reasons Futuremark requires that tessellation be enabled by AMD users. In games and benchmarks, turning tessellation down or off makes the cards seem more powerful than they really are, but only because some things are not being rendered at all. Depending on the game or benchmark it might not be completely obvious that elements are missing, but in other cases it is. For example, where grass or vegetation should be visible there may be nothing but a blank area with a colored background.
     
    D2 Ultima likes this.
  2. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    AMD still trails Nvidia in tessellation performance, but they are ahead in compute.
     
  3. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,426
    Messages:
    58,180
    Likes Received:
    17,889
    Trophy Points:
    931
    Yes, but they only put limits in because developers were purposely over-tessellating scenes. It's all a bit sad, but pixel for pixel the AMD cards are usually just fine.
     
    octiceps likes this.
  4. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    Yep, I agree. Pretty sad indeed. Nvidia's been paying off game developers to purposely gimp performance and/or image quality on its competitor's cards for years. Most developers won't refuse free money and/or hardware, and they have to keep mum about it because an NDA is one of the stipulations in the contract they sign with Nvidia.

    There was that anti-aliasing fiasco in Batman: Arkham Asylum a few years back where AMD users were shut out from using AA in the game unless they tricked it into thinking they were running an Nvidia card. That was followed by the episode in Crysis 2 with the invisible tessellated waves beneath the level and overly tessellated concrete slabs. And more recently, there was the fur on the dog's back in CoD: Ghosts, excessive use of line tessellation designed specifically to tank performance on AMD cards (it was a GameWorks title, after all).
     
  5. failwheeldrive

    failwheeldrive Notebook Deity

    Reputations:
    1,041
    Messages:
    1,868
    Likes Received:
    10
    Trophy Points:
    56
    CFX has had better scaling than SLI for quite a while now. When a game is optimized for cfx, it works really well.

    And AMD doesn't have higher memory bandwidth than Nvidia (at least when comparing the 290X to the 780 Ti or Titan Black). The Ti has 336 GB/s of bandwidth versus the 290X's 320 GB/s. Despite having a smaller 384-bit memory bus, the Ti gets the edge over the 290X due to its higher-clocked memory (7 GHz vs 5 GHz).
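
    For anyone who wants to sanity-check those figures, here's a rough back-of-the-envelope sketch (peak bandwidth = bus width in bytes times the effective memory data rate; the bus widths and effective clocks are the ones quoted above, so treat it as an approximation rather than official spec math):

    # Rough sketch: peak memory bandwidth = (bus width / 8 bytes) * effective data rate (MHz)
    def bandwidth_gbps(bus_width_bits, effective_clock_mhz):
        """Peak memory bandwidth in GB/s."""
        return (bus_width_bits / 8) * effective_clock_mhz / 1000

    print(f"GTX 780 Ti: {bandwidth_gbps(384, 7000):.0f} GB/s")  # ~336 GB/s
    print(f"R9 290X:    {bandwidth_gbps(512, 5000):.0f} GB/s")  # ~320 GB/s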
     
  6. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    Crysis 2's tessellated water map gimped everyone's performance though, if I remember right. I hadn't heard about being unable to use AA in Batman: AA either. As for Ghosts, wasn't the dog fur meant to be PhysX? I did not know it was simply tessellation. My friends with AMD cards ran the game... well, as fine as it could be run? The game runs like poop. But I didn't know anything about that dog fur thing.

    If what you're saying is true, then AMD cards are actually multitudes more powerful than nVidia cards... because as I pointed out earlier, games from 2012 and back that were nVidia: The Way It's Meant To Be Played titles ran about as well on nVidia cards as they did on AMD cards, but AMD: Gaming Evolved titles required approximately one card level up from nVidia to run as well (i.e. a 6870 would require something like a 570 to compete, though a 570 should be a good bit stronger than a 6870). If nVidia gimped AMD cards, but AMD games on AMD cards just bring out their full potential and nVidia cards on AMD games struggle to catch up, then AMD has had the strongest cards for years and nVidia has just been paying off devs to make it look like they're even and/or winning.
     
  7. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    Also the first time I've heard that O_O. Maybe I need to do more digging, or very very few games are optimized for CrossfireX.

    Also, as for AMD, their memory bandwidth has always been "just enough" compared to nVidia's. I seem to remember the GDDR5 on the 256-bit memory bus of the Radeon HD 4xxx cards being kept up with by nVidia's 280/285 and its GDDR3 memory, thanks to the 512-bit bus. So that bit doesn't surprise me. But I really haven't heard (except from, I think, one or two people earlier in this same thread) that CrossfireX gave better performance. I know in some games (like CS:GO and the DayZ mod) SLI does jack poop and disabling it gives quite literally the same performance. But in others, even 60% scaling is better performance than a single card. Maybe not WORTH having the two cards, especially for laptop owners... but definitely better overall.
     
  8. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    Extreme tessellation hurts both Nvidia and AMD, but it hurts AMD a lot more because of their architecture's weaker tessellation performance. Basically, Nvidia pays developers to deliberately slow down their games just because it slows down Radeons even more. In the case of CoD: Ghosts, there are much more efficient ways of rendering fur than the extreme line tessellation that is used.

    And then there was that episode years ago when Nvidia was busted for the heavy-handed tessellation in Crysis 2, and it was painfully obvious when you saw all the places where tessellation was needlessly or wastefully employed. I don't think Crytek is that incompetent, so I'm pointing the finger at Nvidia for that one.

    You can see how this hurts not just AMD users but gamers and the PC ecosystem as a whole. This is essentially what AMD has been accusing Nvidia of all these years.

    And the sad part is, Nvidia haven't changed their ways at all; they've only gotten bolder. Now, with GameWorks, they've got this library of effects that they distribute to contracted game developers as DLLs: essentially black boxes with no source code, already pre-optimized for Nvidia hardware, which prevents the developers from optimizing these effects for AMD. And in fact, there's a clause in the developer's contract with Nvidia stipulating that they are prohibited from even showing these effects to AMD. So in the end, AMD cards will always slow down more than Nvidia ones when running these effects, and there's nothing AMD or the dev can do about it.

    I can't recall a Gaming Evolved title running significantly slower on Nvidia hardware. Care to list a few examples? The only one I can think of is TressFX in Tomb Raider, but that was because it took a few months for Nvidia to optimize its drivers. After that, it runs exactly the same (which is to say, not very well LOL) on both Nvidia and AMD cards.

    On the flip side, I know there have been quite a few big TWIMTBP/GameWorks titles in recent years that have been gimped on AMD hardware, whether in performance or image quality. The most recent examples I can think of are CoD: Ghosts and Watch Dogs.
     
  9. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    I DO believe Crytek is that incompetent. I mean... look at Crysis 1 and Warhead. Even today those things can barely halfway take advantage of a new videocard, and your fps still suffers. I haven't tried Crysis 3 yet though.

    Also, BF3 is an nVidia title which saw no significant downgrades on AMD hardware (which is why I said pre-2012). I say mostly pre-2012 because apparently these new consoles have shaken up developers and they are just unoptimizing crap left, right, and center, so I think all the games in the last year and a half are just anomalies for the most part.

    Anyway, as for AMD games which run severely worse on nVidia hardware, I can remember for certain Far Cry 3 is one of them. Crysis 3 is also one of them, which is one of the only games a stock R9 290x will clearly beat a GTX 780Ti in (according to Tek Syndicate). BF4 also definitely runs better on AMD hardware even without mantle being enabled. I'm trying to remember some older titles that were worse on nVidia hardware too, but I am awful with names. I'd need to see the title first to tell you, but AMD only lists the most recent ones on its website. I know Tomb Raider ran better for AMD with TressFX on, but that's because in that respect, AMD cards are stronger, which is to say the DirectCompute power of their cards. So I don't blame that game at all though.

    As for Ghosts, I do stand corrected. All the information stated that the dog fur was indeed going to use PhysX, so I did not know they had it available for AMD users. Ghosts on the whole is unoptimized as a moocow though, but ah well. I can't counter that bit of your argument.
     
  10. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    If Crytek were incompetent, I don't think its engine would be so advanced, with every rendering trick in the book. You forget that it pioneered the use of ambient occlusion in games. Crysis was the first game to use SSAO back in 2007. Even the damn grass in Crysis 3 is innovative. CryEngine is one of the few, along with Frostbite, that can take advantage of 6-8 CPU cores. And the thing about Crysis and Warhead is that they were so far ahead of their time. They've run great on every video card I've owned from 2011 on, and not half-bad either on the 8800 GT before that at tweaked "cheap" Very High settings in DX9. And I loved how tweakable and moddable they were, which you just don't find in new games nowadays.

    BF3 is both a Gaming Evolved and a TWIMTBP title. Says so right on the back of my game box. It ran better on comparable Nvidia hardware for the first year, until the Catalyst 12.11 Never Settle driver came out and AMD optimized the hell out of its drivers, taking the performance crown in not just BF3 but a bunch of other AAA titles literally overnight.

    All those other AMD titles you listed are a toss-up based on which reviews you look at. And most likely, it just took Nvidia a little bit longer to optimize its drivers for them since they didn't have access to the code as early as AMD. Thus the review code showed them performing slightly better on AMD. But in the end, I assure you that Nvidia was able to bring performance up to, if not past, AMD, since we all know how good Nvidia is at optimizing its drivers (no joke).

    I've got both AMD and Nvidia cards and I can tell you, out of the games I own from the Gaming Evolved catalog, there is nothing that runs clearly better on one or the other, which can't be said of a few infamously TWIMTBP/GameWorks titles in the recent past. And AMD doesn't have the recent history that Nvidia has of gaming the system with shady and anti-competitive practices.

    TressFX in Tomb Raider runs the same for both Nvidia and AMD. It uses DirectCompute. Nvidia's DirectCompute performance is very competitive. I think you're thinking of OpenCL, where AMD clearly leaves Nvidia in the dust.
     
  11. failwheeldrive

    failwheeldrive Notebook Deity

    Reputations:
    1,041
    Messages:
    1,868
    Likes Received:
    10
    Trophy Points:
    56
    I have been out of the mobile crowd for a while so I'm not sure how AMD's drivers and CFX scaling have been doing in that respect, but in the desktop world the 290 and 290X definitely have better scaling than the 780/780 Ti, most notably in 3- and 4-way CFX. SLI tends to show extreme diminishing returns with 3+ cards (I went through this with my 3 780s, where some games actually performed better with only 2 cards, and most games showed minimal gains with 3 lol.) The 290X on the other hand has awesome scaling. Check out this quadfire 295X2 review from HardOCP: HARDOCP - R9 295X2 QuadFire - AMD Radeon R9 295X2 CrossFire Video Card Review

    It absolutely crushes games that are optimized for cfx (and sucks in games that aren't lol.)

    Here's another quick video review of 2, 3 and 4 way cfx 290xs at 4k. The scaling is ridiculous in a lot of games. https://www.youtube.com/watch?v=TISoJsWbaSc#t=40
     
  12. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    Yeah XDMA CrossFire scales really well in Eyefinity and 4K. When it works LOL. And the frame pacing is really good in DX10/11 games. Wonder if Nvidia will follow suit and go bridgeless SLI as high resolutions take off.
     
    failwheeldrive likes this.
  13. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,426
    Messages:
    58,180
    Likes Received:
    17,889
    Trophy Points:
    931
    They either have to or beef up the bridge interconnect.
     
  14. l701x

    l701x Notebook Enthusiast

    Reputations:
    0
    Messages:
    33
    Likes Received:
    5
    Trophy Points:
    16
    Received the monitor today; tried it with Bioshock Infinite at 4K on Ultra with a couple of things on High and I'm getting a 48 fps average, which is annoying because it's just (well, OK, quite a way) short of 60 fps.

    May try overclocking the cards but don't want to burn them out!
     
  15. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,426
    Messages:
    58,180
    Likes Received:
    17,889
    Trophy Points:
    931
    Obviously don't do anything you are not comfortable with, but it's worth testing to see if just pushing up the VRAM clocks will impact the FPS, since 4K requires a lot of bandwidth.
     
  16. dandan112988

    dandan112988 Notebook Deity

    Reputations:
    550
    Messages:
    1,437
    Likes Received:
    251
    Trophy Points:
    101
    I have the same Samsung monitor. Return it ASAP and order the Asus PB287Q. It is the exact same panel and ports as the Samsung, same price, only it has a much improved stand with pivot, swivel, and rotate abilities, unlike the Samsung, which can't do any of that. It also has a 3-year warranty vs Samsung's 1. It's a better buy in every way, especially since it's the same price on Newegg.
     
  17. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,426
    Messages:
    58,180
    Likes Received:
    17,889
    Trophy Points:
    931
    Well, there are the double shipping costs to consider at this point.
     
  18. n=1

    n=1 YEAH SCIENCE!

    Reputations:
    2,544
    Messages:
    4,346
    Likes Received:
    2,600
    Trophy Points:
    231
    Get a desktop. [​IMG]
     
  19. l701x

    l701x Notebook Enthusiast

    Reputations:
    0
    Messages:
    33
    Likes Received:
    5
    Trophy Points:
    16
    That one is £150 more expensive in the UK (got the Samsung for £439.99) and in my opinion doesn't look as good aesthetically as the Samsung. I don't really care about the stand as I will be sitting directly in front of it on a desk and it is just the right height etc. as it is, so I guess I'll keep it.

    Does overclocking the VRAM increase the temperature of the core or just the memory? How much of an overclock should I give them in Afterburner?


    Thanks
     
  20. nightdex

    nightdex Notebook Evangelist

    Reputations:
    189
    Messages:
    436
    Likes Received:
    153
    Trophy Points:
    56
    Just the VRAM. Overclock by +100 to 300 MHz with a slight voltage increase. If it increases your FPS in game, bump it up an extra +50 to 100 MHz. Keep your temps lower than the high-70s region if possible. I tanked +600 MHz at 1.050V. My temps went into the high 70s with ICD 7 though.
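
    To put rough numbers on why a memory offset helps at 4K: the 780M pairs a 256-bit bus with GDDR5 at a 5000 MHz effective rate, which works out to about 160 GB/s stock, and bandwidth scales roughly linearly with the memory clock. This is only a sketch; it assumes the Afterburner offset is applied to the 2500 MHz reported memory clock (half the effective data rate), which matches the numbers quoted later in this thread but is worth confirming on your own card:

    # Sketch: extra bandwidth from a VRAM offset on a 780M (assumed 256-bit bus,
    # 2500 MHz reported clock, effective GDDR5 data rate = 2x the reported clock)
    BUS_WIDTH_BITS = 256
    STOCK_REPORTED_MHZ = 2500

    def bandwidth_gbps(reported_mhz):
        return (BUS_WIDTH_BITS / 8) * (reported_mhz * 2) / 1000

    for offset in (0, 100, 300):
        print(f"+{offset} MHz -> ~{bandwidth_gbps(STOCK_REPORTED_MHZ + offset):.0f} GB/s")
    # +0 -> ~160 GB/s, +100 -> ~166 GB/s, +300 -> ~179 GB/s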
     
  21. l701x

    l701x Notebook Enthusiast

    Reputations:
    0
    Messages:
    33
    Likes Received:
    5
    Trophy Points:
    16
    Did a 300 MHz overclock on the memory; it seems to increase the FPS a fair bit, getting a constant 60 fps in FFXIV: A Realm Reborn in one of the busy cities. Temps are 78 on one and 75 on the other. I don't have an unlocked BIOS so I can't up the voltage. Cores are running at ~80% load.
     
  22. nightdex

    nightdex Notebook Evangelist

    Reputations:
    189
    Messages:
    436
    Likes Received:
    153
    Trophy Points:
    56
    Nice! Impressive FPS there. I take back what I said earlier in this thread; it appears gaming at 4K isn't such a waste. Well done, mate. My advice now: definitely flash Slv's and John's vBIOS to your 780Ms. You won't regret it; even at stock speeds you'll see a vast improvement.
     
  23. l701x

    l701x Notebook Enthusiast

    Reputations:
    0
    Messages:
    33
    Likes Received:
    5
    Trophy Points:
    16
    At stock speeds would the temps be lower? Don't really like seeing my cards that close to 80C. How is it that the unlocked BIOS can provide a performance increase?
     
  24. dandan112988

    dandan112988 Notebook Deity

    Reputations:
    550
    Messages:
    1,437
    Likes Received:
    251
    Trophy Points:
    101
    Most computers run cards at 80°C. The cards are designed with a limit of around 100°C.
    I think even the mid-80s is fine.
     
  25. nightdex

    nightdex Notebook Evangelist

    Reputations:
    189
    Messages:
    436
    Likes Received:
    153
    Trophy Points:
    56
    The unlocked vBIOS removes the throttling that occurs with the stock vBIOS.
     
  26. l701x

    l701x Notebook Enthusiast

    Reputations:
    0
    Messages:
    33
    Likes Received:
    5
    Trophy Points:
    16
    Thanks for your replies,

    Do you have a link to the thread about the unlocked BIOS?

    I can't find it on google!

    Thanks again.

    EDIT: Nevermind, found it on techinferno. Flashing looks a bit complicated but I may give it a try.
    EDIT 2: Never mind again, both cards are flashed and seem to still operate! What would you advise me to do with regard to further overclocking? I see there is a temperature target section; would this allow me to set the max temp I want my cards to run at, so that it overclocks for the maximum performance that keeps them below that temperature? Is it kind of like a self-adjusting overclock depending on how hot the cards are? If so, that sounds great!

    Any help would be much appreciated!
    Thanks again for all your help prior as well.
     
  27. nightdex

    nightdex Notebook Evangelist

    Reputations:
    189
    Messages:
    436
    Likes Received:
    153
    Trophy Points:
    56
    I set the power target to around +120-130%. Set the temperature target to 93°C and uncheck the box above the temperature gauge. I would raise your memory up +500 and test in 3DMark 11. If your driver crashes, up the voltage by one increment. Keep going until you reach your optimal point. Try not to run under extreme heat conditions. The highest temps you want to see are 83°C and lower. If your GPU runs hotter than 83°C, stop overclocking until you get some better thermal paste.
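
    Spelled out as a loop, that tuning process looks roughly like the sketch below. It is only an illustration of the manual routine: run_benchmark and read_peak_temp are hypothetical stand-ins for running 3DMark 11 and watching temps yourself (not a real API), and the step sizes and the 800 MHz cap are placeholder values, since the actual voltage increment and sensible limits depend on your vBIOS.

    # Sketch of the manual tuning loop described above (hypothetical helpers, not a real API)
    def tune_vram(run_benchmark, read_peak_temp,
                  start_offset=500, max_offset=800,
                  start_voltage=1.000, voltage_step=0.0125,
                  temp_limit_c=83.0):
        offset, voltage = start_offset, start_voltage
        while offset <= max_offset:
            crashed = run_benchmark(offset, voltage)  # e.g. a 3DMark 11 run at these settings
            if read_peak_temp() > temp_limit_c:
                break                                 # too hot: stop and repaste first
            if crashed:
                voltage += voltage_step               # one voltage increment, retry same offset
            else:
                offset += 50                          # stable: nudge the memory a bit further
        return offset, voltage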
     
  28. l701x

    l701x Notebook Enthusiast

    Reputations:
    0
    Messages:
    33
    Likes Received:
    5
    Trophy Points:
    16
    Thanks for your reply,

    Just played 15 mins of Bioshock Infinite at 4K on High settings (+110/+500 on core and memory respectively, no change to voltage) and my max temp was 87 on one card (the other was only 82). It actually did crash after 15 mins. Is Alienware's thermal paste significantly worse than other aftermarket pastes? Is there a number of degrees of temperature drop I could expect using something like Arctic Silver or similar?

    I don't think I'd want to do a repaste myself; I wonder if I could get Alienware to repaste it for me under warranty?!

    Really jealous of your systems by the way, I bet you wouldn't need to bother with any of this with your 880m SLI config!

    Thanks.
     
  29. nightdex

    nightdex Notebook Evangelist

    Reputations:
    189
    Messages:
    436
    Likes Received:
    153
    Trophy Points:
    56
    Alienware's paste isn't bad, but it's nothing compared to top-end thermal pastes these days. I recommend using IC Diamond 7. Try calling Alienware for a repaste; the worst that can happen is they say no.

    Haha, you'd be surprised at how much better your rig performs compared to mine.
    Mr. Fox, Slv, John, a few others, and myself have all compared scores, 880M vs 780M. The 780M actually outdoes the 880M by a small margin at stock. This is purely down to Nvidia throttling the 880M via their vBIOS and drivers somehow.

    Over the past few days, I've been doing some research. From my research, I've come to the conclusion that faster memory = better FPS during gaming. The core clock doesn't really push the card in regards to FPS all that much. If you want a taste of how my 880Ms run with Slv's and John's vBIOS: 993 MHz/2500 MHz (actual core and memory speeds) is what my cards run at with the normal stock boost.

    Edit: it appears that our brothers favour overclocking the core rather than the VRAM. Take a look at the 'Just got my 880m twins' thread.
     
  30. l701x

    l701x Notebook Enthusiast

    Reputations:
    0
    Messages:
    33
    Likes Received:
    5
    Trophy Points:
    16
    Thanks for your reply, I bet the extra 8GB of memory would be nice for 4K though in some games :)

    I'm on a chat with dell, warranty runs out on the 15th so gotta do it quick!
     
  31. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    I told ya =3. Lots of games out there already run at 4K (via SSAA) or can run at 4K... just the super pretty ones we WANT to run at 4K aren't anywhere near being ready (like Crysis 3, Star Citizen, BF4, etc.) XD
     
  32. daveh98

    daveh98 P4P King

    Reputations:
    1,075
    Messages:
    1,500
    Likes Received:
    145
    Trophy Points:
    81
    Don't mean to sound harsh, but if you are wary of unlocking the custom vBIOS or doing a simple repaste, then I don't think you should be messing with overclocking sliders. Just my .02.
     
  33. daveh98

    daveh98 P4P King

    Reputations:
    1,075
    Messages:
    1,500
    Likes Received:
    145
    Trophy Points:
    81
    Also, regarding temps, you need to use HWiNFO and let the fans go. The stock Dell paste is actually pretty good and very long-lasting. The temps will drop dramatically. The AW 18 is just way too conservative to adequately cool the machine, in my experience running it. Some games (Far Cry 3, Sleeping Dogs, etc.) run the temps pretty high, but now that the fan profiles can be manipulated, you can expect very cool temps on stock thermal paste. But seriously, I don't think you need to be overclocking just yet.....
     
  34. l701x

    l701x Notebook Enthusiast

    Reputations:
    0
    Messages:
    33
    Likes Received:
    5
    Trophy Points:
    16
    Thanks for your advice Dave,

    What would you say is a sensible RPM for fans 1 and 2 (seems I can't control the third one for some reason) if I want to leave them on continuously at the same speed playing games? I don't really mind how loud they are, just don't want to knacker the bearings on them too fast.

    I guess I'll leave overclocking for now, or maybe only stick on a very modest vram overclock :)

    Is the reason my 1st card runs hotter than my 2nd card due to the different shape of the heatsink, or is it some other factor? What temps on a stock 780m would justify a repaste under full load?

    Thanks again.
     
  35. nightdex

    nightdex Notebook Evangelist

    Reputations:
    189
    Messages:
    436
    Likes Received:
    153
    Trophy Points:
    56
    I personally max my fans out and plug in the old earphones during a heavy gaming session. If I don't plan on gaming for more than an hour, I tend to just let the fans run automatically. The only other advice I can give you is to set your fan tables up manually. Mr. Fox has a very detailed guide on what settings to apply for optimal cooling. I followed his tutorial and wasn't let down. However, I gradually got sick and tired of my CPU fan constantly dropping and maxing out while gaming in Dark Souls 2, so I ended up just maxing my fans out while playing that game.

    Edit: Here is Mr. Fox's guide for setting up HWiNFO - Link
     
  36. l701x

    l701x Notebook Enthusiast

    Reputations:
    0
    Messages:
    33
    Likes Received:
    5
    Trophy Points:
    16
    In HWiNFO fan control, I can only control one fan, the CPU fan I think (the middle one). That one is controlled by the "Fan 1&2" section; Fan2 and Fan3 are greyed out, so I can't edit them.

    Are all of them meant to be adjustable?

    Thanks

    EDIT: Read this on a post:
    Warning: Manual fan control is not possible with the Alienware 18. This also needs to be fixed. HWiNFO64 fan control does not function correctly due to issues with the EC. (Almico's SpeedFan also has exactly the same problem.) Fan control for the CPU fan is possible, but doing that cuts power to the GPU fans. They will turn off and cause the video cards to overheat severely. Fan 1 is the CPU fan. Fan 2 normally controls both GPU fans simultaneously. Both of these utilities work fine with the M18xR1/R2 but not the new 18. Leave it set to "System Auto" to avoid damage to your video cards.

    So how do you guys control the speed of your GPU fans?
     
  37. nightdex

    nightdex Notebook Evangelist

    Reputations:
    189
    Messages:
    436
    Likes Received:
    153
    Trophy Points:
    56
    I don't use HWiNFO on my 18. I use it on my 17, as both fans there can be manually adjusted.
     
  38. l701x

    l701x Notebook Enthusiast

    Reputations:
    0
    Messages:
    33
    Likes Received:
    5
    Trophy Points:
    16
    Ah so it isn't possible to change the fan settings in the Alienware 18 manually.

    I saw a petition for Dell to make the fans controllable for the 18 but I don't think it has had any effect unfortunately as of yet. Guess I'll just be happy I can play a couple of games at 4K without crazy temperatures.
     
  39. nightdex

    nightdex Notebook Evangelist

    Reputations:
    189
    Messages:
    436
    Likes Received:
    153
    Trophy Points:
    56
    I have seen a few threads/ideas/suggestions in regards to allowing us control over our own 18's fans. Why Dell/Alienware won't just allow us to do this is beyond me; it's far beyond frustrating, as their fan tables are screwed up at the moment.
     
  40. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,426
    Messages:
    58,180
    Likes Received:
    17,889
    Trophy Points:
    931
    Because it opens the possibility of people cooking their systems most likely. It's a shame.
     
    Mr. Fox likes this.
  41. Mr. Fox

    Mr. Fox BGA Filth-Hating Elitist

    Reputations:
    37,213
    Messages:
    39,333
    Likes Received:
    70,626
    Trophy Points:
    931
    Yes, it is possible. Alienware fixed that problem already. You need either of the two latest BIOS versions to use HWiNFO64 manual fan controls, but it works like a charm now. It really sucked that it didn't work before. There would be no need for manual fan control if the fans actually functioned in a desirable manner from the factory. Have a look...
    [​IMG]
    Indeed, it was a shame. I'm very glad it's not like that any more.
     
    reborn2003 and D2 Ultima like this.
  42. l701x

    l701x Notebook Enthusiast

    Reputations:
    0
    Messages:
    33
    Likes Received:
    5
    Trophy Points:
    16
    Thanks for the info Mr. Fox.

    When in HWiNFO64 (latest beta version v4.41-2250) I don't see all three fans and I don't have the "Dell EC" section, only "Compal EC" which contains only fans for the CPU and GPU1. When I go onto the fan button at the bottom, I can only control what is labelled as "Fan1 & 2"?

    I am on the latest BIOS (A08). Is there something that needs to be changed in the BIOS for this to work, or are my settings in HWiNFO wrong, or is it something else?

    Appreciate the help.

    hwinfo.jpg
     
  43. l701x

    l701x Notebook Enthusiast

    Reputations:
    0
    Messages:
    33
    Likes Received:
    5
    Trophy Points:
    16
    Found a thread you made helping someone setup HWiNFO64 for their Alienware 18 and now have the Dell EC.

    I assume System/GPU will control both GPU fans at the same time? Which fans are which? Is CPU Fan1 and System/GPU Fan 2?

    Thanks.

    EDIT: Think I've got it worked out, wonder why one of the GPU fans only goes to 3300rpm when the other one goes to 3500rpm?
     
    Mr. Fox likes this.
  44. Mr. Fox

    Mr. Fox BGA Filth-Hating Elitist

    Reputations:
    37,213
    Messages:
    39,333
    Likes Received:
    70,626
    Trophy Points:
    931
    Glad you got it sorted. I renamed those sensors. There are only two sensors for the M18xR1/R2 and the fans cannot be controlled individually. The 18 should have three sensors and the ability to control each fan independently. Those limits are because Dell decided to set the maximum RPM of each fan to a value less than what it is capable of. I don't know why they chose to do that, or make one run faster than the other. My guess would be they are trying to keep the machine quieter than some enthusiasts want them to be. Yes, quiet is nice, but cool is much nicer.

    As you discovered, on the new Alienware 18 (and I assume the 14 and 17) you must enable EC support for manual fan control. On the M18xR1/R2 you can enable it but doing so sometimes causes odd system behavior, so it is better to disable EC support on an M18xR1/R2.
     
    reborn2003 likes this.
  45. Marksman30k

    Marksman30k Notebook Deity

    Reputations:
    2,080
    Messages:
    1,068
    Likes Received:
    180
    Trophy Points:
    81
    Anandtech did mention, when comparing SLI GTX 780 Tis and CrossFire 290Xs, that SLI seemed to have inferior scaling at 4K. The 780 Ti's 15-25% performance lead at 1080p and 1440p mostly melts away at 4K. Though the reason why is less clear; it could've been unoptimized drivers, an SLI bridge bottleneck (highly unlikely), or the lesser ROP performance of GK110.

    Having used dual-GPU solutions from both parties, I can only draw on 3 simple facts:
    1. Stuff will break, stuff will never work as advertised, and you should be prepared for workarounds regardless of brand. The only real difference is that workarounds are easier with NVIDIA because of the excellent NVIDIA Inspector.
    2. VRAM capacity is king, so get as much as you can: when you go dual/tri/quad you are doubling/tripling/quadrupling rendering power, but VRAM capacity is mirrored, not added (see the quick sketch after this list).
    3. Microstuttering is largely a thing of the past now, as older DX9-grade games run extremely well on single GPUs (though NVIDIA has a mild lead here). Your biggest issue is having optimized game profiles available at all; no profile = no smoothness = no scaling, and this is where NVIDIA has the edge, simply because of their larger driver team.
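
    A quick illustration of point 2 (a minimal sketch, not tied to either vendor's implementation): in SLI/CrossFire each card keeps a full copy of the frame data, so usable VRAM stays at a single card's capacity while it is the rendering power that actually multiplies.

    # Sketch of point 2: rendering power scales with card count (ideally),
    # but usable VRAM stays at one card's capacity because the data is mirrored
    def multi_gpu_resources(cards, vram_per_card_gb):
        render_power_multiplier = cards     # best case, with a good SLI/CFX profile
        usable_vram_gb = vram_per_card_gb   # every card holds the same copy
        return render_power_multiplier, usable_vram_gb

    print(multi_gpu_resources(2, 4))  # two 4 GB 780Ms -> (2, 4): ~2x power, still 4 GB usable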
     
  46. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    IDK, I'd be pretty upset if my CrossFire setup didn't work properly in Witcher 2, modded Skyrim, or PlanetSide 2, which are all DX9 games that are very demanding at their highest settings and require multi-GPU setups to run well (moreso in the case of weaker notebook cards).

    And you're right, thank god for Nvidia Inspector for allowing me to fix so many of Nvidia's broken or nonexistent SLI profiles. AMD users had RadeonPro up until a short while ago, but the developer gave it the cold shoulder and left to work on the Gaming Evolved App powered by Raptr.
     
  47. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,426
    Messages:
    58,180
    Likes Received:
    17,889
    Trophy Points:
    931
    Do those problems hit the chips using the XDMA engine too?
     
  48. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    Yes.

    Skyrim_3840x2160MST_PLOT.png Skyrim_3840x2160_PLOT.png
     
  49. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    Is it even possible to force CrossfireX on games that don't support it with AMD? I know I can do it with nVidia (like Skyrim + RCRN/ENB = force AFR2, etc., as RCRN/ENB doesn't support SLI by default).
     
  50. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    Yeah you can force one of the AFR modes in RadeonPro.
     
    D2 Ultima likes this.