The Notebook Review forums were hosted by TechTarget, which shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information posted on the forums is preserved. For current discussions, many NBR forum users moved to NotebookTalk.net after the shutdown.
Problems? See this thread at archive.org.
← Previous page · Next page →

    When are the new Alienwares with Maxwell coming?

    Discussion in 'Alienware' started by Cloudfire, Sep 29, 2014.

  1. Ashtrix

    Ashtrix ψυχή υπεροχή

    Reputations:
    2,366
    Messages:
    2,080
    Likes Received:
    3,274
    Trophy Points:
    281
    Just checked out the Prema Mod BIOS. I envy those Clevo owners for having such good aftermarket upgrade and mod support.

    My next machine will definitely be a Clevo with an X99 chipset + SLI. That would be a true successor to the P570WM3, and I guess most of the Alienware M18x R* and AW users who favor performance will shift to Clevo machines soon...

    With the upcoming release of the P7xxx series (the only downside is the M.2 slots) with a good design + K processors, and with Intel also killing off the most potent MX line-up in newer machines, the time has come for Alienware to lose customers to Clevo from the AW17 or even AW18 because of their limitations, crippled BIOS, and inability to upgrade. Adding to that, with this stupid ultraturd AW13 release they are digging their own grave!!

    Still no news on the 900M series. It's a shameless and inexcusable situation for AW and very unfortunate for us... :nah:
     
    TBoneSan, papusan and TomJGX like this.
  2. tinker_xp

    tinker_xp Notebook Consultant

    Reputations:
    10
    Messages:
    179
    Likes Received:
    36
    Trophy Points:
    41
    Been scanning the Alienware Facebook page (UK), and boy, are those guys taking a lot of flak over the 900 series; it's the same story over on the Dell forums too. Makes me wonder what's behind the decision. Dell/Alienware are only hurting themselves at this point: scan that page and you'll see a lot of people have already become bored and frustrated with waiting and have nearly settled on their next laptop, which isn't an Alienware.
     
  3. Docsteel

    Docsteel Vast Alien Conspiracy

    Reputations:
    776
    Messages:
    2,147
    Likes Received:
    911
    Trophy Points:
    131
    What I find strange is that there has been no formal declaration of the end of the AW 14. Obviously pulling it down from their site the minute the AW 13 arrived points to it being stopped, but OTOH with the M11x and M15x there were formal declarations that they were ceasing. Now Dellienware just doesn't say anything, which is odd, and it's been long enough that a statement should have been forthcoming by now. They are either in great flux with a lack of direction right now, or feel they can take the heat while turning out some new systems that really "wow" the public at PAX/CES, or most likely some combination of the two. I keep thinking how lackluster the 2014 event was... even they looked bored doing it, which says something.

    They really, really need to come out with news on where they are headed.... but I feel certain they are holding back to make a splash in January at the events that month. If they don't have a couple of aces up their sleeve people will be mighty pissed...
     
  4. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,426
    Messages:
    58,174
    Likes Received:
    17,885
    Trophy Points:
    931
    I suggest people wait and see and judge the direction on their next generation of product. I do hope they manage to pull it off though.
     
  5. Mr. Fox

    Mr. Fox BGA Filth-Hating Elitist

    Reputations:
    37,203
    Messages:
    39,332
    Likes Received:
    70,615
    Trophy Points:
    931
    Looks like good news is finally here...
     
    reborn2003 and TBoneSan like this.
  6. J.Dre

    J.Dre Notebook Nobel Laureate

    Reputations:
    3,700
    Messages:
    8,323
    Likes Received:
    3,820
    Trophy Points:
    431
    I believe there will soon be a GM204 replacement for both the 970M and 980M. These issues may be limited to just these two cards.
     
    reborn2003 and Mr. Fox like this.
  7. Mr. Fox

    Mr. Fox BGA Filth-Hating Elitist

    Reputations:
    37,203
    Messages:
    39,332
    Likes Received:
    70,615
    Trophy Points:
    931
    Well, hopefully GM204 will be even more powerful than 980M and overclock like a banshee on steroids. If these new GPUs are only marginally more powerful but don't overclock like there is no tomorrow, it's going to suck real bad, LOL. Using less power is nice if it means there is more discretionary power available to burn. I'm not interested in "saving" power per se. I'd like to be able to take even greater advantage of the dual 330W adapter than I could with 780M SLI. Hopefully, it will benefit from having more headroom for overclocking than what the dual adapter allowed for 780M SLI.
     
  8. woodzstack

    woodzstack Alezka Computers , Official Clevo reseller.

    Reputations:
    1,201
    Messages:
    3,495
    Likes Received:
    2,593
    Trophy Points:
    231
    The Alienware 13 IS THE NEXT GEN. We rightly judge them on the only new-gen stuff they released.
    The older machines are just that: over a year, almost two years old. The 13 is less than a month or so old, making it, rightfully, the latest generation of the Alienware product lineup.

    With it came NO upgrades for the 17 or 18 lineup, and the discontinuation of the 14. Overall, they are regressing. If the next GTX lineup is when Alienware releases a new product, then they will have skipped a year entirely - which is the coming year. So Alienware basically did nothing for us this year. However, many of us like the graphics adapter - just not the soldered-on ULV CPU. I would have bought a 13 if it were not for the crappy CPU option. It's the single thing that stopped me from buying it.

    That's $1,500 less in Dell's pocket, because of that crappy CPU. If I wanted such a crappy CPU, I'd demand they add a graphics adapter for ALL laptops, including their usual Inspiron series... then I could have portability and functionality and power wherever and whenever.

    I bought a Razer Blade instead for my small portable this year.

     
  9. AcE Krystal

    AcE Krystal Notebook Guru

    Reputations:
    5
    Messages:
    59
    Likes Received:
    21
    Trophy Points:
    16
    Same here. I was first thinking of a ThinkPad T440s with an i7-4600U... but I found it not powerful enough for me, because my current 13.1" Vaio Z has an HQ CPU inside it. My minimum is a 4-core HQ. I was disappointed to see a U CPU in the Alienware 13, but then I was really shocked when I saw they would even limit it to an i5...

    I've been searching for a while already for a new heavy-duty ultrabook, but so far I can't find something that satisfies me. I'm thinking about the GS60, or maybe a Razer Blade, but a powerful 13" or 14" Alienware with the carbon-based housing would really have been my choice if it existed. I will be making my final decision around mid-December; I still hope a powerful Alienware 14 is coming.
     
  10. RS4

    RS4 Notebook Consultant

    Reputations:
    32
    Messages:
    188
    Likes Received:
    50
    Trophy Points:
    41
    I don't think we can expect any new announcements before January events.
     
  11. Docsteel

    Docsteel Vast Alien Conspiracy

    Reputations:
    776
    Messages:
    2,147
    Likes Received:
    911
    Trophy Points:
    131
    FWIW - they have stated many times, and tweeted as well, that once Broadwell is available an i7 is forthcoming (and, for some strange reason, i3s). Admittedly we have no reason to believe they won't just be more dual-core ULV chips, but still.... it is an i7...

    I think this is the case too.... there were some rumors of announcements in November, but it's looking like that's not happening.
     
  12. reborn2003

    reborn2003 THE CHIEF!

    Reputations:
    7,764
    Messages:
    2,988
    Likes Received:
    349
    Trophy Points:
    101
    Looks like the i7 is out

    Intel® Core™ i7 4510U (Dual-Core, 4MB Cache, up to 3.1GHz w/ Turbo Boost)

    Intel® Core™ i7 available - will ship after the holidays

    Cheers. :)

     
  13. Mr. Fox

    Mr. Fox BGA Filth-Hating Elitist

    Reputations:
    37,203
    Messages:
    39,332
    Likes Received:
    70,615
    Trophy Points:
    931
    Intel® Core™ i7 4510U (Dual-Core, 4MB Cache, up to 3.1GHz w/ Turbo Boost)

    ...so lame... :nah:

    *lights match behind rear end while passing gas*
     
  14. TBoneSan

    TBoneSan Laptop Fiend

    Reputations:
    4,460
    Messages:
    5,558
    Likes Received:
    5,798
    Trophy Points:
    681
    Yep..

    Let's bring out a laptop with massive external graphics power ...and gimp it with a pocket calculator CPU

    [Image: Derp1.jpg]
     
    Mr. Fox, Ashtrix and pathfindercod like this.
  15. pathfindercod

    pathfindercod Notebook Virtuoso

    Reputations:
    1,940
    Messages:
    2,343
    Likes Received:
    2,345
    Trophy Points:
    181
    The 13 could have been a groundbreaking ultraportable laptop if they had tried a little harder...
     
    Mr. Fox and TBoneSan like this.
  16. TBoneSan

    TBoneSan Laptop Fiend

    Reputations:
    4,460
    Messages:
    5,558
    Likes Received:
    5,798
    Trophy Points:
    681
    I know. I would have been tempted to get it if it had a powerful quad core... If it had an MX I would have pounced on it! Obviously there will be another machine (likely a bigger model) with a quad core that allows the x16 lanes to be used. But as the size of the laptop increases, an external GPU becomes less useful to me. In that case... why not just have it all internal!
     
    Mr. Fox and pathfindercod like this.
  17. Mr. Fox

    Mr. Fox BGA Filth-Hating Elitist

    Reputations:
    37,203
    Messages:
    39,332
    Likes Received:
    70,615
    Trophy Points:
    931
    An Alienware 18 with 4940MX, 980M SLI, 600W AC adapter (or a dual 330W junction box like Clevo offers for their SLI laptops) and the Graphics Amplifier with a desktop GTX 980 or Titan GPU would be outstanding. I cannot imagine an enthusiast not wanting something amazing like that. Nor can I imagine any enthusiast being excited about a product powered by a low TDP dual core processor.
     
    pathfindercod likes this.
  18. Docsteel

    Docsteel Vast Alien Conspiracy

    Reputations:
    776
    Messages:
    2,147
    Likes Received:
    911
    Trophy Points:
    131
    Truer words were never spoken.... Dell is terrified of throttling and thermals and massive support issues (to a degree justly so), but they have not tried to reach a happy medium. Instead, they just retreaded the M11x concept... which doesn't work in today's world at that price/performance point with a ULV dual-core. They are also terrified of disturbing the 17" sales... so we get something that might make sense on paper, but not in the real world. The right way to have done this would have been to offer both: a ULV dual-core at a significantly lower price, and a full-on quad at a price point more appropriate to what the AW 14 had; hell, I bet half the people here would have put down money even at a premium for it.

    Again, there is hope that something else might be in the works; no one knows if the ongoing lineup will just be a 13/17/18 set of systems... and there's no official word either that the AW 14 might not return.
     
    Mr. Fox, Ashtrix and pathfindercod like this.
  19. ssj92

    ssj92 Neutron Star

    Reputations:
    2,446
    Messages:
    4,446
    Likes Received:
    5,689
    Trophy Points:
    581
    Still Haswell; hopefully it'll be better when Broadwell is released.
     
    reborn2003 and Mr. Fox like this.
  20. Mr. Fox

    Mr. Fox BGA Filth-Hating Elitist

    Reputations:
    37,203
    Messages:
    39,332
    Likes Received:
    70,615
    Trophy Points:
    931
    Yup, I don't really like Haswell either. A Haswell MX CPU with Liquid Ultra and a lot of fussing about with settings is way better than any other non-Extreme mobile CPU, but Haswell in general is pretty much a pile of crap.
     
  21. RS4

    RS4 Notebook Consultant

    Reputations:
    32
    Messages:
    188
    Likes Received:
    50
    Trophy Points:
    41
    Don't expect anything other than Haswell for MX and H parts till Q3 2015 in shipping laptops. Broadwell ULV laptops will be rolled out in Q1 and Q2 2015 depending on product timelines.
     
  22. n=1

    n=1 YEAH SCIENCE!

    Reputations:
    2,544
    Messages:
    4,346
    Likes Received:
    2,600
    Trophy Points:
    231
    Isn't the amp limited to 1 GPU for now? I know dual-GPU/single-board products exist, but they still need a bridging PLX chip for "internal SLI", and I'm not sure if it'll actually run both chips or just one. I think for SLI laptops there's no point getting the amplifier, especially if you really are limited to 1 eGPU. I guess you could make a case for games that don't support SLI, where a single strong eGPU would be better than two 980Ms. But in all other cases I'd rather take the 980M SLI over the eGPU.
     
    Mr. Fox likes this.
  23. Mr. Fox

    Mr. Fox BGA Filth-Hating Elitist

    Reputations:
    37,203
    Messages:
    39,332
    Likes Received:
    70,615
    Trophy Points:
    931
    Yes, me too. I agree with you. Two 980M in SLI would definitely be better than one desktop 980. Heck, even two 780M in SLI is better IMHO than one desktop 980 (or one 980M). Multi-GPU is always better to have than just one GPU. The eGPU would just be a geeky toy that's fun to goof around with. It should work fine as a docking station on steroids, since the monitor, mouse and keyboard stay connected to the eGPU box.
     
  24. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    Eh, 780M SLI over a single 980 or 980M is a tough sell. Single GPU is always better than multi-GPU if the single GPU is as fast or faster. You don't have to deal with the myriad multi-GPU issues including microstutter and are always ensured of maximum performance irrespective of drivers/SLI profiles and game compatibility. 980 is definitely faster than 780M SLI, even more so when overclocked considering its ample headroom. 980M is a better choice as well since it's essentially tied with 780M SLI stock-for-stock.
     
  25. pathfindercod

    pathfindercod Notebook Virtuoso

    Reputations:
    1,940
    Messages:
    2,343
    Likes Received:
    2,345
    Trophy Points:
    181

    Tied? Not really. A single 980M in BF4 gets 60-70 FPS and 780M SLI gets 100-120... On paper it might be tied, but not in the real world. In games that thrive on SLI, you just can't beat an SLI setup.
     
    Mr. Fox likes this.
  26. Mr. Fox

    Mr. Fox BGA Filth-Hating Elitist

    Reputations:
    37,203
    Messages:
    39,332
    Likes Received:
    70,615
    Trophy Points:
    931
    I totally agree. And I have never struggled with all of the SLI problems that I often hear people complaining about. It's really simple... if the game is a piece of dog feces, temporarily disable SLI, enjoy playing the half-assed game, then re-enable SLI. Crisis averted, LOL. Otherwise, you leave SLI enabled 24/7 and kick the living crap out of machines with one GPU that is almost twice as powerful. There is no downside to SLI or CrossFire versus having just one GPU. The only negative elements to the whole deal are paying more money for the second GPU and the growing number of incompetent game devs who need to find a new career because they suck at what they do. I will not even consider the option of purchasing any high-performance laptop that lacks dual-GPU capacity. About the only way I would use a machine with a single GPU is if my only option was building it piecemeal because I didn't have enough cash up front to buy all the parts at one time.
     
    Docsteel, Ryan 23 and pathfindercod like this.
  27. Docsteel

    Docsteel Vast Alien Conspiracy

    Reputations:
    776
    Messages:
    2,147
    Likes Received:
    911
    Trophy Points:
    131
    I hope it's earlier than that... I was thinking if they preview some systems in January, then allow for ordering 4 to 6 weeks later, we might get something in March.... If it's Q3.... that's almost backing up into the opening of the window for Skylake, right?
     
  28. J.Dre

    J.Dre Notebook Nobel Laureate

    Reputations:
    3,700
    Messages:
    8,323
    Likes Received:
    3,820
    Trophy Points:
    431
    Intel has a keynote at CES on Jan. 6th. They should present both Broadwell and Skylake there, or at least Broadwell.

    PAX South is Jan. 23rd. That's when we expect to see what Alienware plans to do with the rest of their lineup.
     
  29. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    780M SLI is almost 2x as fast as 980M? Smells fishy. Here is 880M SLI vs. 980M in BF4. Nowhere near the difference you described, and that's with 880Ms. I'd assume 780M SLI, being slower, would be pretty much tied with 980M.
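
    Just to put numbers on it: a quick back-of-the-envelope check of the ranges quoted above (a minimal sketch; the FPS figures are simply the ones claimed in that post, not new measurements):

        # Implied scaling from the BF4 figures quoted upthread (illustrative only).
        single_980m = (60, 70)    # claimed single-980M FPS range
        sli_780m = (100, 120)     # claimed 780M SLI FPS range

        lo = sli_780m[0] / single_980m[1]   # worst case: 100/70 ~= 1.43x
        hi = sli_780m[1] / single_980m[0]   # best case: 120/60 = 2.00x
        mid = (sum(sli_780m) / 2) / (sum(single_980m) / 2)   # 110/65 ~= 1.69x

        print(f"implied SLI advantage: {lo:.2f}x to {hi:.2f}x, midpoint ~{mid:.2f}x")

    Even taking those figures at face value, the midpoint works out to roughly 1.7x, not 2x, and that's in a title known to scale well with SLI.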

    NotebookCheck puts 980M and 780M SLI on fairly even ground in other games as well. 980M wins some games, 780M SLI wins others. The margin of victory isn't big either way, certainly not enough to justify the numerous downsides of SLI. See CoH 2, where 780M SLI gets utterly destroyed because the game does not and will never support multi-GPU. If 780M SLI was 30% faster than 980M across the board, then maybe there's an argument to be made for it. But it isn't, plus 980M overclocks better, runs cooler, and uses waayyy less power. As much as I've ridiculed 980M for being a severely crippled GM204, choosing 780M SLI over it is just plain stupid to me.

    And it's not as if I'm talking out of my butt and making uninformed statements. My setup absolutely depends on SLI to deliver acceptable performance in demanding games. Running on a single GPU isn't an option. Trust me, I've had my fair share of hair-pulling while messing with hex bits in the driver to try to get SLI to work in unsupported games. Hours and hours of trial-and-error testing of different combinations in Nvidia Inspector since there's zero documentation out there and nobody outside of Nvidia knows what the different bits do on a low level.

    If there's anything 2014 has taught me, it's that SLI--hell, multi-GPU in general--isn't worth it anymore. Seemingly every single AAA title released this year has come out the gate with broken or non-existent multi-GPU support, which is either added weeks/months down the line, or not at all, because the game uses rendering techniques incompatible with multi-GPU, an increasingly common occurrence this console generation. Because let's face it, multiplat AAA devs focus first and foremost on the consoles. PC is a second-class citizen. If they can use some AFR-unfriendly motion blur or antialiasing technique to cheaply spruce up visuals on the current-gen consoles and make them look slightly less bad, they will, multi-GPU support be damned. Reworking entire parts of their games, or maybe even their entire rendering pipeline, to make them SLI/CrossFire compatible is not a priority because of the low ROI. Multi-GPU PC gamers are a minority. A very vocal minority, mind you, but still a minority.

    A big part of the blame lies with AMD and Nvidia as well for not pressuring devs to step up their game, and for their own driver inadequacies.

    Anyway, all of this BS is beyond my control. What I can do is stay away from multi-GPU while the outlook is this bleak, with no indication of improving (signs point to quite the opposite, actually). I'd rather spend a premium on the fastest single GPU in existence than get several cheaper GPUs with "potentially" higher combined performance.

    One word: Microstutter. An unavoidable reality of any multi-GPU configuration. Below 40-45 FPS, a single GPU is a vastly better experience. And before anybody comes back with "but I don't have microstutter," yes you do. It's inherent to every multi-GPU setup not running SFR or Mantle; this is a scientifically proven fact. Just because you personally can't feel it doesn't mean it doesn't exist.
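
    To make that concrete: microstutter shows up in frame-time logs as alternating short and long frames even when the average FPS looks healthy. Here's a minimal sketch, assuming per-frame render times in milliseconds (the two traces are made up for illustration, not from a real capture):

        # Quantifying microstutter from a frame-time log (times in ms).
        def microstutter_index(frame_times_ms):
            """Mean absolute difference between consecutive frame times.

            AFR microstutter appears as a short/long/short/long pattern,
            so this stays high even when the average FPS looks fine.
            """
            deltas = [abs(b - a) for a, b in zip(frame_times_ms, frame_times_ms[1:])]
            return sum(deltas) / len(deltas)

        # Both traces average 40 ms/frame, i.e. 25 FPS on paper...
        smooth = [40.0, 40.0, 40.0, 40.0, 40.0, 40.0]   # single GPU, even pacing
        afr = [25.0, 55.0, 25.0, 55.0, 25.0, 55.0]      # AFR-style alternating pacing

        print(microstutter_index(smooth))  # 0.0  -> perceived as fluid
        print(microstutter_index(afr))     # 30.0 -> perceived as stutter

    Same average frame rate, completely different feel: the AFR trace delivers every other frame 30 ms late, which is exactly what gets perceived as microstutter.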

     
    RS4 and Robbo99999 like this.
  30. pathfindercod

    pathfindercod Notebook Virtuoso

    Reputations:
    1,940
    Messages:
    2,343
    Likes Received:
    2,345
    Trophy Points:
    181
    70 FPS with a 980M? My 780M SLI in my AW18 runs between 110-120 FPS in BF4... Those are my real-world, experienced facts. My 9377 with 980M SLI slaughters my 780M SLI, but that's to be expected. I play 3-4 games and it's pretty much the same. I base my decisions on real-world gaming experience, not synthetic benchmarks. I have plenty of hardware to test all scenarios for my needs...
     
    bumbo2 likes this.
  31. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    I'm happy for you, really am. But your good experience in no way discredits the bad experiences many people out there have had with multi-GPU over the years. Plus they probably play much more than just 3 or 4 games. ;)

    Edit: Thought I'd leave this here.
     
  32. J.Dre

    J.Dre Notebook Nobel Laureate

    Reputations:
    3,700
    Messages:
    8,323
    Likes Received:
    3,820
    Trophy Points:
    431
    Those issues are not NVIDIA's fault. The people who develop these games are slackers. Quantity over quality - they want to be paid more than they want to make a good game.

    You can't put all of the blame on SLI.
     
    bumbo2 likes this.
  33. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    Actually, it kinda is Nvidia's fault if they're not working with the developer to implement proper SLI support before a game's release or threatening to pull their marketing dollars and sponsorship if the developer doesn't shape up.

    I'm not putting any blame on SLI at all. It's great technology when it works.
     
    Mr. Fox, bumbo2 and Ashtrix like this.
  34. Mr. Fox

    Mr. Fox BGA Filth-Hating Elitist

    Reputations:
    37,203
    Messages:
    39,332
    Likes Received:
    70,615
    Trophy Points:
    931
    I'm not defending NVIDIA, but I think it's up to the developer to show the initiative to reach out to NVIDIA before a game is released to ensure it is ready for SLI. If they don't, then they're lame... just like the tag line in your signature says. We cannot blame NVIDIA if the developers are just lazy and stupid and never communicate with NVIDIA. When the developers don't do so, it puts NVIDIA in the reactive position of having to tweak the drivers for game support instead of the developer being proactive and accountable.
     
    bumbo2 likes this.
  35. J.Dre

    J.Dre Notebook Nobel Laureate

    Reputations:
    3,700
    Messages:
    8,323
    Likes Received:
    3,820
    Trophy Points:
    431
    It's the other way around. Game companies/developers are supposed to reach out to NVIDIA. They need to be in constant communication with NVIDIA and AMD.
     
  36. Docsteel

    Docsteel Vast Alien Conspiracy

    Reputations:
    776
    Messages:
    2,147
    Likes Received:
    911
    Trophy Points:
    131
    This confirms a concern I have had with the advent of console gaming... my feeling is that if it's a console port, all bets are off for effective multi-GPU support, but a PC-first game on average seems to be better and usually will benefit to a significant degree (albeit with tweaking). I have heard that PC gaming is actually in an upward arc right now; maybe we'll see better support going forward, at least for a few years, if the trend continues. The advent of single-GPU cards capable of pushing what SLI was only a couple of years ago is also going to put some downward pressure on devs... many people will "settle" for gaming at 60 FPS simply because they don't have monitors that go higher, so there often isn't a significant benefit in trying to push 120 FPS (though the average FPS is raised, which helps). Although I have had my fill of dual-GPU gaming on laptops for now, mainly because it's not practical for me to carry a dual-GPU laptop these days, I sincerely hope vendors like Alienware, MSI, etc. don't start to downplay the concept, as it definitely is useful on the laptop side of the equation.
     
  37. RS4

    RS4 Notebook Consultant

    Reputations:
    32
    Messages:
    188
    Likes Received:
    50
    Trophy Points:
    41
    Nope. All the Intel info from reliable sources indicates "Haswell for MX and H parts till Q2 2015 in shipping laptops"; Broadwell ULV in shipping laptops is scheduled for Q1 and Q2 2015, and you can expect Broadwell MX and H parts only in Q3 2015 in shipping laptops.

    Skylake laptop parts will then follow the same schedule: Core M Skylake parts in Q4 2015 in shipping ultrabooks/2-in-1s etc., Skylake ULV in Q1-Q2 2016 in shipping laptops, and so on.

    These are all from reputable websites that have given correct information about Intel timelines in the past; you can check the thread on this topic, which runs several pages, in this forum itself.
     
    bumbo2 and Docsteel like this.
  38. n=1

    n=1 YEAH SCIENCE!

    Reputations:
    2,544
    Messages:
    4,346
    Likes Received:
    2,600
    Trophy Points:
    231
    If you don't notice microstutter then don't worry about it. FWIW stuttering only becomes noticeable for me when FPS drops below 45, so it's a non-issue.
     
  39. Docsteel

    Docsteel Vast Alien Conspiracy

    Reputations:
    776
    Messages:
    2,147
    Likes Received:
    911
    Trophy Points:
    131
    @RS4: thanks for the info - I admit I haven't read up on the release schedules like I should - this helps me pin down my decision to buy or not buy an AW 13 as a replacement for my aging-but-still-useful M11x's.
     
  40. Mr. Fox

    Mr. Fox BGA Filth-Hating Elitist

    Reputations:
    37,203
    Messages:
    39,332
    Likes Received:
    70,615
    Trophy Points:
    931
    There's not much point in trying to keep up with release dates on poor-to-mediocre performance products unless you really are into that sort of thing. They could come and go with hardly a scowl or a glance from folks that care about excellent performance. Sounds like 2015 will be a year of dumbing down and pathetic junk from Intel. *yawn* That's a real bummer, and pretty stinking sad when 2nd Generation Intel i7 processors are something to be more excited about than 5th Generation.
     
  41. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,426
    Messages:
    58,174
    Likes Received:
    17,885
    Trophy Points:
    931
    There is a niche for most machines out there, Mr. Fox. I'll be interested to see more benchmarks showing which titles are particularly impacted by the CPU.
     
  42. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    Communication is a two-way street. I didn't say lackluster multi-GPU support is entirely the fault of the IHVs, but they deserve part of the blame just like game devs do. They can certainly improve drivers on their end. Even just simple things like Nvidia updating driver game profiles to include new executables in a timely manner or, god forbid, not screwing up executable names in the first place.

    I remember when PlanetSide 2 got its 64-bit exe and it took Nvidia months to add it to the driver. So SLI users unfamiliar with Nvidia Inspector were stuck with single GPU all that time.

    Or how about Divinity: Original Sin, whose profile has had the wrong exe name for god knows how many driver releases since the game came out during the summer. Again, no SLI for Average Joe.

    Or how about 8 out of 10 official HBAO+ flags in the driver having serious graphical or performance issues? Once more, enthusiasts are required to painstakingly look for unofficial flags to fix a graphical feature that is advertised in every single driver release.

    This sort of stuff just shows you they don't QA their profiles at all before pushing them through and they don't really care either because they know enthusiasts will always clean up their mess for them.

    And I'm not even gonna get into how Nvidia let PhysX, 3D Vision, and Surround die a slow, painful death. It's not worth mentioning at this point.
     
  43. TomJGX

    TomJGX I HATE BGA!

    Reputations:
    1,456
    Messages:
    8,707
    Likes Received:
    3,315
    Trophy Points:
    431
    The love for NVIDIA is STRONGGG in this one!!!
     
  44. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,426
    Messages:
    58,174
    Likes Received:
    17,885
    Trophy Points:
    931
    Well, it's not really about love; it's about judging actions and products.

    But anyway, I hope AMD come back in a big way, because competition is needed.
     
    FrozenSolid, Docsteel and octiceps like this.
  45. J.Dre

    J.Dre Notebook Nobel Laureate

    Reputations:
    3,700
    Messages:
    8,323
    Likes Received:
    3,820
    Trophy Points:
    431
    Yeah, hopefully AMD does something impressive this year. It has been a long time coming.

    It's only "two-way" after initial contact, but it is the game developers/company's responsibility to make contact in the first place. Not the other way around.

    I agree with the rest of this quote. ;)


    Happy Thanksgiving, everyone! :thumbsup:
     
  46. TomJGX

    TomJGX I HATE BGA!

    Reputations:
    1,456
    Messages:
    8,707
    Likes Received:
    3,315
    Trophy Points:
    431
    I was being sarcastic, Meaker... Facepalm, man!!

    [Image: Godzilla-facepalm.png]
     
  47. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    Bingo. Blindly loving a company who pushes out half-baked products is called fanboyism. No thanks.
     
  48. Docsteel

    Docsteel Vast Alien Conspiracy

    Reputations:
    776
    Messages:
    2,147
    Likes Received:
    911
    Trophy Points:
    131
    The irony is that probably 3/4 of the products on the laptop market right now qualify as half-baked.... and their fans eat it up :D
     
    TBoneSan likes this.
  49. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    No kidding. These companies could learn a thing or two from my dealer on how to get fully baked.
     
  50. Mr. Fox

    Mr. Fox BGA Filth-Hating Elitist

    Reputations:
    37,203
    Messages:
    39,332
    Likes Received:
    70,615
    Trophy Points:
    931
    Well, 'possums smile when they eat poop... and so do their friends.
    [Image: possum-possum.jpg]
    ...some enjoy the ride even when they have no idea where they are going.
    [Image: possum.jpg]
     
    Last edited by a moderator: May 6, 2015
    TBoneSan likes this.
← Previous page · Next page →