The Notebook Review forums were hosted by TechTarget, who shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information posted on the forums is preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.
Problems? See this thread at archive.org.
← Previous page | Next page →

    GTX 900M series officially announced by NVIDIA!!!

    Discussion in 'Gaming (Software and Graphics Cards)' started by Cakefish, Oct 7, 2014.

  1. Sen7inel

    Sen7inel Notebook Consultant

    Reputations:
    242
    Messages:
    140
    Likes Received:
    90
    Trophy Points:
    41
    Here is the thing; I don't. That's an example of flawed thinking. So is claiming that someone buying something other than a Clevo/Sager is vain or otherwise too compromised to make an intelligent purchase. Everyone's situation is different. These are fine laptops, but definitely not a fit for everyone. But we can agree to disagree.

    Here we definitely agree; frankly I'm not looking forward to many of the next-gen console ports. PC gaming isn't what it used to be; perhaps taking an extended hiatus isn't so far-fetched.
     
  2. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    Now see.. THIS here, this is the point where it goes beyond an average joe whose maximum willingness is to clean out his fans with canned air every month or so. I could mod the life out of my machine and probably get it running very nice and never overheating even in my extremely hot temperatures, but what good does that do for, say, someone else in my situation who isn't as willing or knowledgeable to do all of this?

    It's like how Johnksss has a working 880M without issue. He is one of the only users with an 8GB 880M that works... but he has modified his machine and swapped GPUs in and out like crazy, and WILL make his stuff work no matter what. It is beyond what I would be capable of doing, so I would likely end up with a "broken" 880M if I had one. So from what you say, I don't think recommending a GS machine to someone who likely will not go as far as you would be a bad idea, if I can promise it can be made workable. ESPECIALLY if you reduce the turbo clock of the CPU - that's not working as intended. (I did not check your posts, so you may not have done this, but you did mention needing to use XTU.)
     
  3. Hellmanjk

    Hellmanjk Notebook Consultant

    Reputations:
    34
    Messages:
    210
    Likes Received:
    47
    Trophy Points:
    41
    The guy who mentioned the Sager said it was not the best looking. All I see around the forum anymore is "It looks so sexy!" and the like. It is very much a vain thing. I have heard Clevos called ugly pretty frequently. People in this thread have mentioned several times which laptop looks better than another. I am not delusional. You get whatever you want. I think it is very much an aesthetic issue among buyers.
     
  4. Cakefish

    Cakefish ¯\_(ツ)_/¯

    Reputations:
    1,643
    Messages:
    3,205
    Likes Received:
    1,469
    Trophy Points:
    231
    Thanks for the input everyone!

    OK so judging by this review of the Clevo P651, it seems that:

    - P651 has a cooler chassis under load than the P35W v2
    - P651 has quieter fans under load than the P35W v2
    - P651 has significantly shorter battery life than the P35W v2

    Whether the P35X v3 improves in the above areas remains to be seen.

    So, I guess for myself, the main points it all boils down to are:

    - P651 has a 180W power supply vs a 120W power supply for the P35X v3 - would this limit the P35X v3 at stock clocks in any way (as I certainly won't be overclocking in this slim chassis anyway due to heat)?
    - P651 has half the VRAM of the P35X v3 - potential bottleneck for ultra textures in upcoming games perhaps? (Shadow Of Mordor, I'm looking at you!)
    - P651 likely has better GPU cooling than P35X v3 - but CPU cooling does matter and P35X v3 could have an advantage there. Hard to say at this point in time.
    - P651 likely has considerably shorter battery life than the P35X v3 - big bonus to P35X v3 as it will last longer away from the plug. Always handy.
    - P651 has 4x USB 3.0 ports while there is only 2x on the P35X v3 (also 2x USB 2.0) - advantage to the P651, all my external HDDs are USB 3.0 now.
    - P651 has fingerprint scanner while P35X v3 doesn't - I really love the fingerprint scanner on my current Clevo, it really is convenient.

    Really, what concerns me the most is whether that 120W power supply for the Gigabyte will significantly limit the 980M in any way vs the 180W on the Clevo, assuming no overclocking is required.
     
  5. Ningyo

    Ningyo Notebook Evangelist

    Reputations:
    644
    Messages:
    489
    Likes Received:
    271
    Trophy Points:
    76
    Very true, though there are a couple of other things that might need to be taken into consideration too.

    First, is the CPU fan mounted upside down, as it appears? Even upside down it should get some airflow, but it might be reduced, possibly causing the CPU heat problem. Also, if it is mounted wrong, did they get a pre-production model, and has this been fixed?
    Clevo p651se - fan.jpg

    Second, the 980M version will be 4mm thicker, likely to allow thicker GPU fans; however, they may make the CPU fan thicker as well, which, considering it will use the same CPU, could make a big difference.

    Also, that GPU is WAY overcooled; the review said you could hardly tell the GPU fans were on, and they still kept it at around 70°C. Since the GPU and CPU cooling are completely separate, overclocking the GPU might be very worthwhile on this laptop.

    If the Gigabyte pairs only a 120W supply with a 980M, it would almost certainly throttle from insufficient power. Are you certain they don't use a different power supply on the 970M and 980M models?
     


  6. aqnb

    aqnb Notebook Evangelist

    Reputations:
    433
    Messages:
    578
    Likes Received:
    648
    Trophy Points:
    106
    Yup, I couldn't find power supply info for those new v3 models, but for the Gigabyte P35W v2 there were different power supplies for different GPUs:


    mySN.de | SCHENKER XMG | Schenker Technologies - XMG C504 CORE Gaming Notebook 39,6cm (15.6")

    No reason for the GTX 970M / GTX 980M versions to come with the lesser one.
     
    Cakefish likes this.
  7. heibk201

    heibk201 Notebook Deity

    Reputations:
    505
    Messages:
    1,307
    Likes Received:
    341
    Trophy Points:
    101
    The P651SE, according to NBC, has a 150W PSU, which would also cause it to power throttle.
     
  8. epicninja

    epicninja Notebook Consultant

    Reputations:
    5
    Messages:
    126
    Likes Received:
    28
    Trophy Points:
    41
    Well, at least Dragon Age: Inquisition appears to be a good PC release. It's not really a port (they made an entire trailer on how they focused on PC lol), but it is still a next-gen game. The min and recommended specs are low, and from the trailers it looks like the graphics are enough to also challenge high-end cards. Take notes, Ubisoft. Here are the specs:

    This Is What PC Gamers Need For Dragon Age: Inquisition
     
  9. moviemarketing

    moviemarketing Milk Drinker

    Reputations:
    1,036
    Messages:
    4,247
    Likes Received:
    881
    Trophy Points:
    181
    Pretty much the same as BF4, right? That runs at 720p on Xbox One IIRC.
     
  10. aqnb

    aqnb Notebook Evangelist

    Reputations:
    433
    Messages:
    578
    Likes Received:
    648
    Trophy Points:
    106
    Dragon Age Inquisition runs on Frostbite Engine, so its performance characteristics / requirements should be pretty similar to Battlefield 4.

    EA's new strategy of using one engine everywhere seems to be working (Battlefield, Dragon Age, Star Wars, Mass Effect, Mirror's Edge, Need for Speed).

    Ubisoft has multiple engines used by different studios for different games (Snowdrop engine - The Division; Anvil engine - Prince of Persia series, Assassin's Creed series and Rainbow Six; Dunia engine - Far Cry series; Disrupt engine - Watch Dogs), so they kinda need to re-develop things instead of re-using and optimizing.
     
    Ningyo likes this.
  11. moviemarketing

    moviemarketing Milk Drinker

    Reputations:
    1,036
    Messages:
    4,247
    Likes Received:
    881
    Trophy Points:
    181
    So maybe Bethesda will use id Tech for everything! :)
     
  12. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    Don't you mean ZeniMax? And dear god, I hope not...

    And I hope it's not Gamebryo/Creation either. That PoS is almost as bad. Elder Scrolls and Fallout games have always been buggy as hell.
     
  13. Sandwhale

    Sandwhale Notebook Consultant

    Reputations:
    69
    Messages:
    123
    Likes Received:
    83
    Trophy Points:
    41
    These were my two primary considerations as well; however, I've also stumbled upon the Gigabyte P34X v3, which has a 970M (fine for everything at 1080p) and is only 1.7kg. It only has one screen option, which is 1080p matte (kinda nice, because it limits my decisions for me... I think we share this fickleness problem lol). Nevertheless, all of these laptops come out around the same time (within a week or so), so I'll be willing to wait an extra week or so as long as I get the one that I know is the best for me.

    The primary aspects I've been mulling over are:

    -The flex in the Gigabyte keyboards (I had an Aorus X3 Plus and returned it because the adhesive under the keyboard failed and the keys on the left half kept popping out... so I'm wary of Gigabyte build quality and quality control)

    -Heat is obviously a concern for these thin laptops, but I'm sure the P34X v3 will run cooler than the P35X v3 and the P651.

    -The screen on the Aorus X3+ was a matte 3K display and I was very, very pleased with the image quality and crispness (albeit after some tweaking of the various scaling). The P651 has a matte 3K display option, which would be the best, but I could live with the glossy 3K on the P35X v3 or the 1080p matte on the P34X v3.

    -Battery life on the Aorus was great for me, so I'd love at least 3-4 hours to match it. Perhaps the P651 could manage this, but it is likely the P34 will be around 6-7 hours and the P35 around 5.

    I would like some opinions on these aspects from anyone in a similar position or just willing to contribute. So... do you guys think glossy is worth the crispness indoors, given how poorly it fares with glare in brighter areas? Is the 980M simply too much power for the chassis of the P35X v3 to cool reasonably? How do the keyboards and build quality of Sager/Clevo laptops compare to Gigabyte's? Would the cooling system of the P651 cause the CPU to bottleneck the rest of the system, given the single CPU fan (vs two GPU fans)?

    I'd hope to pre-order one of these babies sooner rather than later, so any of these "pre-review speculations" are useful to me to help formulate a decision.

    Thanks

    BTW they changed the color scheme of the P34X v3... It looks MUCH better now and is actually a really attractive option!
    View attachment 116635
     
    steberg and Cakefish like this.
  14. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    Frostbite 3 wasn't made for most of the kinds of games EA wants its studios to use it for. Battlefield? Sure. Star Wars Battlefront? Great. Mirror's Edge? LOL, not with the jumping/mantling physics - everybody will be humping the wall and getting pushed back 30 feet every time they want to build momentum. Theoretically it should work, though. I know Dragon Age and Need for Speed had serious problems with the engine, and their dev teams were griping about it because of the way the engine sees physics/objects or something, but I guess they got it to work.

    If they want to evolve Frostbite into an all-purpose smorgasbord like Unreal Engine, let them go ahead. But it's going to involve many rocky games and launches before that happens, or at least many MANY buggy-as-hell betas as the devs get used to using an FPS engine for non-FPS applications.
     
  15. Sen7inel

    Sen7inel Notebook Consultant

    Reputations:
    242
    Messages:
    140
    Likes Received:
    90
    Trophy Points:
    41
    I can definitely see where you're coming from; it may well seem intimidating.

    But in reality I did all the hard work; at this point it's no more difficult than flashing official updates. As for XTU, it wasn't a temp issue - I actually overclocked a bit :) - mainly undervolted to extend battery life. Cool. Fast. Thin. With Kepler you could have two of the three. The 970M runs much cooler and I'm convinced none of this will be necessary.
     
  16. maxheap

    maxheap caparison horus :)

    Reputations:
    1,244
    Messages:
    3,294
    Likes Received:
    191
    Trophy Points:
    131
    But they are the best games ever! :)

    (sorry for going off topic, but I need to defend my favorite games of all time :))

     
  17. Sen7inel

    Sen7inel Notebook Consultant

    Reputations:
    242
    Messages:
    140
    Likes Received:
    90
    Trophy Points:
    41
    I, for one, dig the Sager's looks; it's all in the eye of the beholder. You know what they say about opinions...
     
  18. maxheap

    maxheap caparison horus :)

    Reputations:
    1,244
    Messages:
    3,294
    Likes Received:
    191
    Trophy Points:
    131
    After reading:

    MSI GS60 Ghost Pro 3K review - Nvidia GTX 970M & 870M versions

    I want to thank Nvidia for making our dreams come true! An ultrabook form factor with 9.5k in 3DMark 11?? I thought we were like 1-2 years away from that! Now the question is:

    Buy the MSI GS60 Ghost Pro 3K NOW, or wait for the Razer Blade 14 2015 with Broadwell + well, Razer...

    Dear god?! why do we need to make such decisions!?!
     
  19. Sen7inel

    Sen7inel Notebook Consultant

    Reputations:
    242
    Messages:
    140
    Likes Received:
    90
    Trophy Points:
    41
    That's an easy decision - there will ALWAYS be something better right around the corner. As the Razer 2015 is coming out, we'll be hearing rumors about the 2016 :) and on and on...
     
    long2905 likes this.
  20. maxheap

    maxheap caparison horus :)

    Reputations:
    1,244
    Messages:
    3,294
    Likes Received:
    191
    Trophy Points:
    131
    ikr! and it is this vicious cycle of joy and pay! I love every second of it :)

     
  21. moviemarketing

    moviemarketing Milk Drinker

    Reputations:
    1,036
    Messages:
    4,247
    Likes Received:
    881
    Trophy Points:
    181
    Hope to see some benchmarks soon for the P35X v3 and Clevo P651, both shipping with the 980M rather than the 970M.
     
    Cakefish likes this.
  22. nipsen

    nipsen Notebook Ditty

    Reputations:
    694
    Messages:
    1,686
    Likes Received:
    131
    Trophy Points:
    81
    :) ..there's actually a good reason why that happened. In Demon's Souls, the actually good game that no one played, they designed the game on top of the PhyreEngine toolkit Sony made. It's a framework for OpenGL with common, useful imports that take advantage of the Cell processor (i.e., prepare some complex math based on current information in the scene, and have it execute and complete in one CPU cycle, and so on). And it basically invites you to write code that makes complex real-time adjustments to the graphics during the drawing of each frame, etc. The ripple effect at the Nexus when you spawn as human is an easily visible example. The hit-detection during the fighting is another. A bunch of cheaply programmed (but hilariously expensive on a PC budget) visual effects also turn up here.

    A "draw-back" in this sense is that you essentially plan the amount of time reserved for calculations before each frame, so it's a completely necessary decision to pick a framerate. And that's a console-convention that turns up more and more in games now. For different reasons, but because it enables you to add certain routines for shaders on PC for example. Or to create threads with faster or slower updates for node-generation in the background depending on the speed of the hardware, things like that. And on limited hardware on consoles, it's essential to get anything out of it. But when you can prepare the entire process-diagram on beforehand, then you could choose to cap the framerate at 30fps to ensure you can make per-frame adjustments to animation, things like that.

    And trying to port that to PC essentially involves dirty rewrites of these function calls, then waiting for those calls to complete at utterly irregular execution times - which would all cause framelocks, since the logic requires completion before each draw call.

    Or, like From Software did, rewriting half the game and simplifying the hit-detection to normal hitscan and statically triggered animation - moonwalking on top of a flat plate.

    So all the super-accurate hit-detection trickery in Demon's Souls was gone in Dark Souls, but the engine still allowed some of the functions, and it still relies on per-frame calculation for certain things. So that game is stuck on PC at a fairly low framerate no matter how light the effects are set. That's just not what determines the framerate here.

    And like you say, some of the animation, some of the hit-detection, some of the objects that suddenly hit other objects at a massive frequency - these cause slowdowns. And that's because these function calls in the OpenGL code need to complete before each frame is drawn. Bad programming practice, perhaps. But coming from PhyreEngine and Demon's Souls, with that amount of Cell-specific (and unique) code, it makes sense that they'd run into that problem.

    PC developers - and Xbox, Xbox 360, PS4 and Xbone developers - don't take that approach at all. In that case, there's none of the per-frame-dependent code in the first place. Very, very few PC games use it, because it's just not very efficient. You phase the semi-complex math you can get away with into static tables, then adjust against those later with constant-time lookups. Or you don't adjust based on game-world context at all, and just trigger static effects that are already loaded, next to whatever other static object is there.
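
    (Again, just a toy sketch of that static-table idea - a hypothetical damage-falloff curve, not code from any real engine:

        import math

        # the "semi-complex" math is done once, at load time...
        FALLOFF = [math.exp(-0.05 * d) for d in range(512)]

        def damage_at(distance, base_damage):
            # ...so the per-frame cost is one constant-time table lookup,
            # not a transcendental-function call per object per frame
            return base_damage * FALLOFF[min(int(distance), 511)]

        print(damage_at(40, 100))  # ~13.5

    Cheap, predictable, and accurate enough - which is exactly why it wins.)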

    Framerate, in the end, is usually determined exclusively by the overlay filters, anti-aliasing, etc.

    Quick-time events, for example - usually a necessity, because pairing up two animations dynamically is actually ridiculously difficult, as well as expensive. And that's why basically all PC games are based on hitscan and static animation playback. The most advanced things we're seeing are either cape-flapping adjusted in the graphics card logic with PhysX (again a framework for OpenGL with certain imports based on hardware capability), or the "Mantle" things in Tomb Raider, for example. And there the hair or the cape flowing is actually completely independent from everything else in the game world other than the nearby object's neck, and the cape doesn't flap when it hits other objects in the world, even though the wind affects it at certain points, etc.

    Same thing with shooters. "Rag-doll" effects look really bad and unnatural. And if the objects really are supposed to interact with all the other objects, the code is too expensive. And the static effects look better in the end anyway. And that's where "AAA" games come from: a huge studio that spends monstrous resources on creating static animation for cutscenes, where the actual game logic consists of checking if the gun is pointed at the figure and then triggering a death animation when you fire.

    Actual hit-dependent animation, area- and object-aware cover systems, animation transitions that aren't 100% scripted (scripted ones therefore dictate the height of the cover, etc.) -- not going to happen under current development /conventions/. And even if it were possible to aim for it, with PhysX and Mantle and OpenCL -- it's not going to be done, because the ones who know how to do business in the industry already know it doesn't pay off to increase the coding budgets.

    Basically, and it's not the first time this has been said, PC gaming used to be, back in the long-long-ago, the industry where the cutting-edge tech for graphics and processors was made. That was where all the new techniques were tried out, where all the latest tech found an actual use. Now it's an established industry, and new tech isn't necessary to make money. So you end up finding the same techniques in cheap indie games, just with worse animation and overall production - because the foundation is so simple.

    While some indie games do push more advanced programming techniques, they are never rewarded for that.

    Meanwhile, the most interesting animation and graphics work lately has come from in-house developers working exclusively for Sony, on the PlayStation 3. :/ ..you know..

    So yeah, PC gaming isn't what it used to be. Not that it's really a huge problem. But what's depressing, in the time I've been writing reviews and trying to get articles about games printed, is that generally people are not impressed by new tech. They don't see the point of it, don't see the difference between Demon's Souls (awesome!) and Dark Souls (crap!), for example. While the "PC gamers" out there are obsessed with fill rate, clock speeds and RAM size. People cram an SLI setup into a monster tower, add a 3000W power supply - and play Candy Crush, without SLI-optimized code, at 90000fps. On a triple-head monitor setup. And Dolby 2ch sound dispersed in a 7.1 surround rig. Etc.

    PC gamers nowadays are bs, I guess that's what I'm saying. Also, get off my lawn and stuff like that.
    Oh, no. The thing is that developers are either prohibited from talking to anyone about what they're doing, on threat of lawsuit and contract breach (at EA, for example, this is serious business - devs in the wild talk as if they are kept under watch by sniper fire), or the ones people are actually talking to when they say they're talking to "devs" are the producer and PR folks. And finally, most people in the games media haven't the faintest idea what they're talking about anyway. Even people like TotalBiscuit or some of the guys at Gamespot and PC Gamer UK, for example - they get as far as "procedural generation", and then hit a wall. And they don't avoid writing about tech because their audience doesn't understand it. They don't write about it because they don't understand any of it themselves.

    ...But the developers who aren't bound by contracts or threat of lawsuit, they are extremely willing to talk your ear off if you actually ask.
     
    moviemarketing likes this.
  23. grandfinale

    grandfinale Notebook Consultant

    Reputations:
    1
    Messages:
    118
    Likes Received:
    6
    Trophy Points:
    31
    Is the keyboard flex thing really a big deal?... I mean from the video demonstration I saw, the guy was practically slamming his finger into the K key just to get it to flex. Who types like that?
     
  24. Phase

    Phase Notebook Evangelist

    Reputations:
    7
    Messages:
    483
    Likes Received:
    100
    Trophy Points:
    56
    So is a laptop with 970M SLI at 3K better or worse than a single 980M at 3K? I saw that the 980M has better bandwidth. Sorry, still a noob.
     
    HopelesslyFaithful likes this.
  25. ole!!!

    ole!!! Notebook Prophet

    Reputations:
    2,879
    Messages:
    5,952
    Likes Received:
    3,983
    Trophy Points:
    431
    Gotta wait for more core configs when 20nm comes around - probably a huge jump.
     
  26. Cakefish

    Cakefish ¯\_(ツ)_/¯

    Reputations:
    1,643
    Messages:
    3,205
    Likes Received:
    1,469
    Trophy Points:
    231
    Cool. Maybe it is the American resellers who got it wrong (I got my info from their store pages). None of the British resellers even have that info, nor does the manufacturer's website!

    Will send Overclockers UK an email, asking them directly. This is a hugely important factor.

    Sent from my Nexus 5
     
  27. HopelesslyFaithful

    HopelesslyFaithful Notebook Virtuoso

    Reputations:
    1,552
    Messages:
    3,271
    Likes Received:
    164
    Trophy Points:
    0
    I doubt a 120, 150 or 180 watt PSU is enough for a Haswell quad and a 980M. My M17x R4 with a 3720QM and 7970M at the time blew the 220W PSU I bought while running F@H. A 240W PSU just barely holds this thing up. If cooling were good on my "new" 3920XM I would probably blow the 240W PSU.

    180 watt "should" work as long as you don't pull 100% load but if you do...PSU bye bye
     
  28. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,431
    Messages:
    58,194
    Likes Received:
    17,901
    Trophy Points:
    931
    180w will be enough for stock.
     
    HTWingNut and Cakefish like this.
  29. asukhama

    asukhama Notebook Enthusiast

    Reputations:
    0
    Messages:
    15
    Likes Received:
    1
    Trophy Points:
    6
    Given that the 970M really overshadows the 860M in all aspects and is only about 200 USD more expensive, I'm thinking about getting a 970M setup instead.
    So far I had this in mind:

    15,6" FHD matte panel (non-IPS)
    i7 4710HQ
    GTX 970M 3gb GDDR5
    8gig 1600Mhz DDR3
    Hitachi 7mm 500GB 7200RPM HDD
    Cooler Master Extreme Fusion X1 thermal paste
    120W charger

    all of this for roughly 1200 euros.

    Is this a reasonable price, you reckon?
    I didn't include an SSD at this point because I want to cut the price a bit. I would be buying this from Cjscope, a Clevo reseller, so it allows for easy upgrades (memory, HDD etc.) in the future. I think it has 2 more free mSATA slots and 3 memory slots that can be filled.

    In any case, I'd appreciate your opinions on whether this laptop is future-proof enough for that price. (I don't do very intensive full-HD gaming, but I feel the 860M would make me tone down graphics too soon in the near future.)
     
  30. Robbo99999

    Robbo99999 Notebook Prophet

    Reputations:
    4,346
    Messages:
    6,824
    Likes Received:
    6,112
    Trophy Points:
    681
    Now that I'm seeing that both of those laptops have tiny power supplies, I wouldn't buy either of them. I certainly can't see how a 120W power supply can power a potentially 100W 980M and a 47W CPU, plus the maybe 20W overhead of the other components! 180W might not be overstressed at stock clocks on everything, but it's cutting it fine I think.
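
    Back-of-envelope, using my rough estimates above (not measured figures):

        # rough power budget from the estimates above - not measured figures
        gpu = 100   # W, potential 980M draw under load
        cpu = 47    # W, quad-core TDP
        rest = 20   # W, screen, drives, RAM, conversion losses

        total = gpu + cpu + rest  # = 167 W

        for adapter in (120, 180):
            print(adapter, "W adapter:", adapter - total, "W headroom")
        # 120 W adapter: -47 W headroom -> can't sustain full load
        # 180 W adapter: +13 W headroom -> OK at stock, but cutting it fine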
     
  31. jegarfor

    jegarfor Notebook Enthusiast

    Reputations:
    0
    Messages:
    12
    Likes Received:
    0
    Trophy Points:
    5
    Hi Meaker, I have seen that you know very well how to upgrade MSI GT models. I have written in the GT60/GT70 thread, but I am not sure the answer I received is complete. Take a look if possible. Thanks in advance.
     
  32. Cakefish

    Cakefish ¯\_(ツ)_/¯

    Reputations:
    1,643
    Messages:
    3,205
    Likes Received:
    1,469
    Trophy Points:
    231
    I may have been wrong with my information. I've sent an email to my reseller to ask for confirmation. 180W I can cope with as there's no thermal headroom to overclock anyway, but 120W would indeed be very small.

    Sent from my Nexus 5
     
    Robbo99999 likes this.
  33. Robbo99999

    Robbo99999 Notebook Prophet

    Reputations:
    4,346
    Messages:
    6,824
    Likes Received:
    6,112
    Trophy Points:
    681
    Would there be an option to buy an aftermarket power supply with greater wattage too? I've used different Dell power adapters on my Dell laptops in the past. When I was insanely overclocking the 8600M GT in my M1530, I upgraded the power supply from 90W to 130W to cope with the extra load - is that possible with the two laptops you're looking at? Sometimes using the wrong adapter, even if the connector, voltage, and wattage are acceptable, can still result in things like the battery not charging due to incompatibilities.
     
  34. HopelesslyFaithful

    HopelesslyFaithful Notebook Virtuoso

    Reputations:
    1,552
    Messages:
    3,271
    Likes Received:
    164
    Trophy Points:
    0
    Only if the adapter exists, and that's a big if. Those systems with a 980M should have 220-240W. If you hit 100% load without an OC, you run a big chance of the PSU shutting off. If you max the OC, that 180W PSU is toast.
     
  35. Robbo99999

    Robbo99999 Notebook Prophet

    Reputations:
    4,346
    Messages:
    6,824
    Likes Received:
    6,112
    Trophy Points:
    681
    Found a P651 review with a 970M from notebookcheck.net, don't know if you've seen it. It gets a bit warm, but not the 970M - just the CPU and the casing:

    Clevo P651SE (Schenker XMG P505) Barebones Notebook Review - NotebookCheck.net Reviews

    (150W supply on the 970M version.) Also, 150W recorded power usage during 100% GPU load on the 970M (albeit Furmark), without any CPU load!! 180W is definitely marginal for the 980M. Notebookcheck recommends a 180-200W power adapter for the 970M! The power adapter gets to 70°C - that will warp your carpet, or you could literally gently cook an egg on it - pasteurisation temperatures achieved! (They diss the 4K screen too, saying it's not really usable with Windows 8.1 - font sizes - and also not useful for gaming, they find.) It's a noisy laptop too, geez; not looking good!
     
    moviemarketing likes this.
  36. Cakefish

    Cakefish ¯\_(ツ)_/¯

    Reputations:
    1,643
    Messages:
    3,205
    Likes Received:
    1,469
    Trophy Points:
    231
    Before the GT72 came about, MSI shipped its GT70 with a 180W PSU. So it's not unheard of.

    @robbo no idea yet, will have a look into it.

    Sent from my Nexus 5
     
  37. HopelesslyFaithful

    HopelesslyFaithful Notebook Virtuoso

    Reputations:
    1,552
    Messages:
    3,271
    Likes Received:
    164
    Trophy Points:
    0
    My G51J came with a 120W PSU for a 720QM and a 260M, and at max load that PSU could brick - and the 260M was a fairly tiny GPU. 150W is just stupid for a 970M, and 180W for a 980M is stupid too.
     
    Robbo99999 likes this.
  38. Robbo99999

    Robbo99999 Notebook Prophet

    Reputations:
    4,346
    Messages:
    6,824
    Likes Received:
    6,112
    Trophy Points:
    681
    Yeah, I don't understand skimping on the power supply - MSI learnt their lesson after their ridiculous 180W power supplies supplemented by their positively marketed NOS battery boost system! That was a marketing miracle, but just pure dressing up of a turd in a tuxedo!
     
  39. Cakefish

    Cakefish ¯\_(ツ)_/¯

    Reputations:
    1,643
    Messages:
    3,205
    Likes Received:
    1,469
    Trophy Points:
    231
    Overclockers UK don't know the answer!

    Sent from my Nexus 5
     
  40. Robbo99999

    Robbo99999 Notebook Prophet

    Reputations:
    4,346
    Messages:
    6,824
    Likes Received:
    6,112
    Trophy Points:
    681
    As a thought, if you really want to upgrade to a 980M, why don't you just sell the 780M card and buy a 980M on eBay? Would it be compatible with your machine, maybe with some help from Prema (not up on the Clevo side of things)? At least you're happy with your laptop not being too gaming-bling, it's got everything else you want from a laptop already, and it will be a better-value proposition too. The 980M will probably run cooler than your 780M anyway, which will in turn allow you to overclock to your heart's content - not possible on those slim laptops with measly power supplies and poor cooling ability!
     
    HopelesslyFaithful likes this.
  41. HTWingNut

    HTWingNut Potato

    Reputations:
    21,580
    Messages:
    35,370
    Likes Received:
    9,877
    Trophy Points:
    931
    The GT72 with a 980M comes with a 19.5V x 11.8A = 230W power supply, and at stock it hasn't drawn more than 170W for the most part, so lots of breathing room.
     
    Robbo99999 likes this.
  42. HopelesslyFaithful

    HopelesslyFaithful Notebook Virtuoso

    Reputations:
    1,552
    Messages:
    3,271
    Likes Received:
    164
    Trophy Points:
    0
    That's a fair amount, but these with 120, 150, 180 are just stupid. I question the 180W for the X7 Pro, but at least it's a super tiny PSU compared to the 240W version. I might get both: one for the backpack (180W) and one for home (240W).
     
  43. HTWingNut

    HTWingNut Potato

    Reputations:
    21,580
    Messages:
    35,370
    Likes Received:
    9,877
    Trophy Points:
    931
    Yeah, 180W on the X7 Pro is too small IMHO. Two 970Ms, even if they draw only 50-60W each - that's still borderline for the system. 180W for a 970M system and 220-230W for a 980M seems about right. The problem is that manufacturers like Aorus and Razer look at stock only, I guess, assuming that users won't push their systems. They assume they're all "dumb" customers who buy a laptop, use it, and don't overclock or mod it in any way. I'm sure from a stock perspective it will satisfy 95% of games out there. But running the PSU close to its max limit will greatly shorten the life of that adapter.
     
  44. HopelesslyFaithful

    HopelesslyFaithful Notebook Virtuoso

    Reputations:
    1,552
    Messages:
    3,271
    Likes Received:
    164
    Trophy Points:
    0
    Edited it, check again.
     
  45. HTWingNut

    HTWingNut Potato

    Reputations:
    21,580
    Messages:
    35,370
    Likes Received:
    9,877
    Trophy Points:
    931
    I like the Ghost Pro 970M 1080p, but that logo just kills me. I do have a black vinyl patch I could put over it. Too bad it's a lit logo and not just some stuck-on emblem that you could peel off. :mad:
     
  46. HopelesslyFaithful

    HopelesslyFaithful Notebook Virtuoso

    Reputations:
    1,552
    Messages:
    3,271
    Likes Received:
    164
    Trophy Points:
    0
    Yeah, my G51J had those lights on 24/7; it looked cool but was super annoying.
     
  47. TomJGX

    TomJGX I HATE BGA!

    Reputations:
    1,456
    Messages:
    8,707
    Likes Received:
    3,315
    Trophy Points:
    431
    I agree... Seems to be a bit of a waste, since the Clevo you have, Cakefish, can probably take a 980M... Also all the advantages Robbo listed...
     
  48. moviemarketing

    moviemarketing Milk Drinker

    Reputations:
    1,036
    Messages:
    4,247
    Likes Received:
    881
    Trophy Points:
    181
    Almost picked up the GS70 Stealth last year, except for that logo. It's called "Stealth" but there is this giant "Gaming Series" dragon coat of arms on the back of the lid. Looks like something an 8-year-old kid might post on his treehouse.
     
    Dabeer and ZerockX like this.
  49. Sewje

    Sewje Notebook Geek

    Reputations:
    26
    Messages:
    95
    Likes Received:
    10
    Trophy Points:
    31
    Heads up for the UK: PCSpecialist are taking orders for the 980M. They have options for an 8GB or a slim-chassis laptop, both 15", which is what I was waiting for. Mine's on order already - an absolute bargain compared to the MSI or Asus ones. Definitely for people who have memory and SSD parts already at hand to swap in. Dang, I thought I could wait... but I want it now!!! :D
     
  50. Dabeer

    Dabeer Notebook Evangelist

    Reputations:
    357
    Messages:
    633
    Likes Received:
    204
    Trophy Points:
    56
    OK, I am firmly on the side of those arguing that it's valid to want thin gaming laptops, but I have to agree that there are aesthetics factored into the decision. I am almost 100% convinced that I'll be buying the Clevo P650SE, due to the fact that I can afford to put better RAM and faster and larger SSD(s) into it, but I still think it's rather ugly, and if I could afford the slimmer, sexier GS60, I might get it anyway, with the hope of being able to afford to upgrade the RAM and SSD later.

    Within a certain market segment, buyers will absolutely choose whatever they think is sexiest (that they can afford). What drives them to that market segment in the first place, however, is NOT always aesthetics! For me, and it sounds like for several others on this forum, the ability to take a gaming-capable machine along for air travel and hotel stays, without worrying about it taking up too much space or cutting grooves into our shoulders when we carry it around, is much more important than being able to play the latest games at top settings at the highest frame rates - and whether you agree with our opinion or not, you need to stop acting like it's an invalid opinion to hold.
     
← Previous page | Next page →