The Notebook Review forums were hosted by TechTarget, which shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information posted on the forums is preserved. For current discussions, many NBR forum users moved to NotebookTalk.net after the shutdown.
Problems? See this thread at archive.org.
← Previous page · Next page →

    Clevo notebooks with 800M series coming out February 2014

    Discussion in 'Gaming (Software and Graphics Cards)' started by Cloudfire, Dec 11, 2013.

  1. 1nstance

    1nstance Notebook Evangelist

    Reputations:
    517
    Messages:
    633
    Likes Received:
    221
    Trophy Points:
    56
    Not trying to defend my brand or anything, but my 780M also stays relatively cool. Never goes above 85 when gaming; most of the time it's around 78. Still, Alienware has awesome cooling. What about ASUS, though?
     
    Cloudfire likes this.
  2. SinOfLiberty

    SinOfLiberty Notebook Evangelist

    Reputations:
    240
    Messages:
    308
    Likes Received:
    34
    Trophy Points:
    41
    25% more cores than 650 Ti

    Expected performance, according to the source it all came from ("Sweclockers"): 25% over the 650 Ti

    Core clock increase 22% over 650 Ti

    Now, how come it only gains 25% over the 650 Ti when the core count is 25% higher and the core clock is 22% higher? This suggests a new architecture.

    On the other hand, it might be a Kepler card waiting to be released (with a slight delay) in the mid-range segment. I could definitely see that.

    Edit: It has very similar specs to the Kepler-based 675MX. In fact, it looks like a down-clocked version of it.
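    To put numbers on the discrepancy: if performance scaled linearly with both core count and clock, the two uplifts would multiply. A minimal sketch using the figures above (all taken from the unverified Sweclockers leak):

```python
cores_uplift = 1.25     # 25% more cores than the GTX 650 Ti
clock_uplift = 1.22     # 22% higher core clock
reported_uplift = 1.25  # ~25% performance gain, per the leak

# Naive linear scaling would predict the two uplifts multiply:
expected_uplift = cores_uplift * clock_uplift
print(f"Expected if linear: {expected_uplift:.2f}x")  # ~1.53x

# The leak reports only 1.25x, so per-core, per-clock efficiency
# appears to have dropped relative to the 650 Ti:
print(f"Implied relative efficiency: {reported_uplift / expected_uplift:.2f}")  # ~0.82
```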
     
  3. Cloudfire

    Cloudfire (Really odd person)

    Reputations:
    7,279
    Messages:
    10,304
    Likes Received:
    2,878
    Trophy Points:
    581
    I didn't mean to bash any brands. It's just my observation that at least the Alienware 18 seems to run cooler than what other brands can offer. 85C with a GTX 780M is not bad either, 1nstance :)

    I think the Asus G750 has pretty good cooling as well, but I'm not sure. They will use the same model with the GTX 880M, so they seem to trust their cooling system.
     
    1nstance likes this.
  4. Robbo99999

    Robbo99999 Notebook Prophet

    Reputations:
    4,346
    Messages:
    6,824
    Likes Received:
    6,112
    Trophy Points:
    681
    Depending on the source, they could get in trouble for revealing details of the GPU before the disclosure date, so perhaps that's why they greyed it out.

    Well, if you're basing Maxwell vs Kepler performance efficiency on the result of this leaked information, then I think you would conclude that Maxwell is less efficient than Kepler, given the results I posted. I can't believe NVidia would develop an architecture that is less efficient on a per-core and per-MHz basis, unless the new architecture allows something like 50% more cores per mm² compared to Kepler, comparing both architectures on the SAME node size (say both on 28nm, for argument's sake).
     
  5. J.Dre

    J.Dre Notebook Nobel Laureate

    Reputations:
    3,700
    Messages:
    8,323
    Likes Received:
    3,820
    Trophy Points:
    431
    Mine never went above 76C.
     
  6. Cloudfire

    Cloudfire (Really odd person)

    Reputations:
    7,279
    Messages:
    10,304
    Likes Received:
    2,878
    Trophy Points:
    581
    GTX 650 Ti:
    768 cores @ 1046MHz (803,328)
    4995 GPU points in 3DMark11

    GTX 750 Ti:
    960 cores @ 1176MHz (1,128,960)
    5608 GPU points in 3DMark11

    5608/4995 = 1.12
    1128960/803328 = 1.41

    Blimey, you are indeed correct. Efficiency has indeed gone down. By a lot.

    Is there any way of reducing power consumption without increasing performance per core with a new architecture? Perhaps that was Nvidia's goal with Maxwell on 28nm: just reduce power consumption, and wait with performance until 20nm is here?
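    The ratio check above can be reproduced in a couple of lines; a quick sketch (the specs and scores are the leaked figures quoted above, so treat them as unverified):

```python
# Leaked specs and 3DMark11 GPU scores quoted in the post above (unverified).
gtx_650_ti = {"cores": 768, "clock_mhz": 1046, "score": 4995}
gtx_750_ti = {"cores": 960, "clock_mhz": 1176, "score": 5608}

def raw_throughput(gpu):
    # Crude proxy for shader throughput: core count times core clock.
    return gpu["cores"] * gpu["clock_mhz"]

score_ratio = gtx_750_ti["score"] / gtx_650_ti["score"]
throughput_ratio = raw_throughput(gtx_750_ti) / raw_throughput(gtx_650_ti)

print(f"Score ratio:      {score_ratio:.2f}")       # ~1.12
print(f"Throughput ratio: {throughput_ratio:.2f}")  # ~1.41

# Performance per core-MHz relative to the 650 Ti; a value below 1 means
# the leaked card does less work per core per clock than the Kepler 650 Ti.
print(f"Relative efficiency: {score_ratio / throughput_ratio:.2f}")  # ~0.80
```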
     
  7. SinOfLiberty

    SinOfLiberty Notebook Evangelist

    Reputations:
    240
    Messages:
    308
    Likes Received:
    34
    Trophy Points:
    41
    This is so boring.

    Once upon a time everyone praised the mighty Kepler; now the audience is sick of it. I mean, it is time for it to fade away and let the successor take the throne.
     
  8. Robbo99999

    Robbo99999 Notebook Prophet

    Reputations:
    4,346
    Messages:
    6,824
    Likes Received:
    6,112
    Trophy Points:
    681
     
  9. Cloudfire

    Cloudfire (Really odd person)

    Reputations:
    7,279
    Messages:
    10,304
    Likes Received:
    2,878
    Trophy Points:
    581
    I'm sorry, but that went way over my head. :p
    Fewer transistors in a core? If that is true for Maxwell, we are looking at a huge increase in core count if they are going to double the transistor count with 20nm, right? And it means performance per core has again gone down with Maxwell vs Kepler. Kepler cores were already 1/3 the performance of a Fermi core, if I remember correctly.

    But you might be on to something, robbo. Sweclockers revealed today that the GTX 750 is also coming along with the 750 Ti, and it requires no external power connector; it can draw its juice directly from the PCI-e slot.
    http://wccftech.com/maxwell-gtx-750-nonti-inbound/
     
  10. Robbo99999

    Robbo99999 Notebook Prophet

    Reputations:
    4,346
    Messages:
    6,824
    Likes Received:
    6,112
    Trophy Points:
    681
    Well, that's OK, as I said I'm not an authority on this either, so I'm just thinking & postulating out loud.

    There's nothing magical or indivisible about a "core". They're only made up of transistors, right? So if NVidia design a new architecture, they can choose to design their "cores" with a different layout, using either more or fewer transistors (and probably changing a bunch of other stuff too that I don't understand). So, if they've designed Maxwell to have fewer transistors per core, then it could well be that we see an improvement in efficiency per core in terms of space saving on the die, and consequently decreased power consumption per "core". If the 960 cores on the 750 Ti in the leaked info were these new "cores", then that's the only way we could say that Maxwell was more efficient than Kepler - based solely on the leaked info that you showed us. As I said though, we might be basing all this discussion on a leak of dubious integrity. It's so hard to explain these technical things in words; I'm doing my best, but not sure I can explain it any better.

    EDIT: to answer your Kepler vs Fermi point. 1 Fermi Core is about as powerful as 2 Kepler Cores. Fermi cores were 'hot clocked' on the shader, at twice the frequency of the core clock (that's where the factor of two comes from). So, yes, that's another example of when NVidia "redefined what a core was".
     
    Cloudfire and deniqueveritas like this.
  11. 1nstance

    1nstance Notebook Evangelist

    Reputations:
    517
    Messages:
    633
    Likes Received:
    221
    Trophy Points:
    56
    That's awesome. Do you own Witcher 2 and Far Cry 3? If you do, what temps did you get?
     
  12. Mr.Koala

    Mr.Koala Notebook Virtuoso

    Reputations:
    568
    Messages:
    2,307
    Likes Received:
    566
    Trophy Points:
    131
    W230SS is 2GB soldered.
     
  13. J.Dre

    J.Dre Notebook Nobel Laureate

    Reputations:
    3,700
    Messages:
    8,323
    Likes Received:
    3,820
    Trophy Points:
    431
    I do have Far Cry 3, but it's not installed. My last system was sent back to Dell; I've got the replacement in my signature now. Unfortunately, I won't be gaming on this system as I'm trying to sell it. The highest temperatures on my previous Alienware 17 were from benchmarks or Battlefield 4 and Crysis 3. They never exceeded 76C while gaming, though I believe during a benchmark the temperature did reach 78C.

    My ambient temperature is quite low and I've got a custom cooling pad as well. That is likely a factor.
     
  14. ThePerfectStorm

    ThePerfectStorm Notebook Deity

    Reputations:
    683
    Messages:
    1,452
    Likes Received:
    1,118
    Trophy Points:
    181
    Then what in the name of god is the Lenovo 860M 4GB? As far as I know, Lenovo do NOT offer MXM. Three versions (4GB soldered, 2GB soldered, 2GB MXM) is straight into the realm of impossibility.
     
  15. Karamazovmm

    Karamazovmm Overthinking? Always!

    Reputations:
    2,365
    Messages:
    9,422
    Likes Received:
    200
    Trophy Points:
    231
    Since when do 2GB and 4GB actually matter in defining a card? It's not like it's the difference between 128-bit and 256-bit, or any value in between.

    dosed, mixed and stirred, not shaken, from taptalk
     
  16. ThePerfectStorm

    ThePerfectStorm Notebook Deity

    Reputations:
    683
    Messages:
    1,452
    Likes Received:
    1,118
    Trophy Points:
    181
    It helps because it gives people a way to determine versions.

    Also, depending on the resolution (3K MSIs) and the game and setting, 4GB can give better performance than 2GB.
     
  17. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,431
    Messages:
    58,194
    Likes Received:
    17,902
    Trophy Points:
    931
    That's just a cut down gk106 core.
     
  18. n=1

    n=1 YEAH SCIENCE!

    Reputations:
    2,544
    Messages:
    4,346
    Likes Received:
    2,600
    Trophy Points:
    231
    I use Arctic MX-4, and my 780Ms used to run about 85-88 C after hours of gaming in Sleeping Dogs with max settings. After finally finding the courage to mess around with my 4k machine, I decided to do a full repaste of everything. Mythlogic did a pretty good paste job as there was almost no spill outside of the die. Since I was (and still am) inexperienced, I naturally applied too much and basically just caked on the MX-4. To my surprise, my 780Ms ran a full 5-7 C cooler when running it through Kombustor. And temps maxed out around 80 C in Sleeping Dogs (average is about 75 C).

    Learning from this, I repasted the CPU by also caking on the MX-4. With this my 4900MQ never runs a hair above 78 C during the XTU stress test (avg = 73 C). Prior to repasting, it would spike up to 91 C, and the average would hover around 85 C.

    I don't want to sound ridiculous here, but at least for my P370SM, it seems that applying gobs of MX-4 actually does more good than applying "just enough" or too little. Perhaps you might want to try just caking on the thermal paste and see if that helps? I used the line method on the 780M and 4900MQ dies, two lines each, about the thickness of a rice grain.

    EDIT: Should really mention that these temps were obtained using the laptop on a U3 cooler with the fans on medium setting. Also, -80 mV undervolt with 4900MQ.
     
  19. Robbo99999

    Robbo99999 Notebook Prophet

    Reputations:
    4,346
    Messages:
    6,824
    Likes Received:
    6,112
    Trophy Points:
    681
     
  20. n=1

    n=1 YEAH SCIENCE!

    Reputations:
    2,544
    Messages:
    4,346
    Likes Received:
    2,600
    Trophy Points:
    231
    Right, so you can probably imagine what a proper paste job (that still involves smacking on the MX-4 but without the air bubbles) could do to the already IMO pretty good temps.

    And yeah I picked the MX-4 specifically for its ease of application and removal (ie n00b-friendly); heard the IC Diamond is a biatch to apply, and will etch dies if not properly removed. Besides I decided to personally boycott the IC Diamond after the whole IC vs Techpowerup fiasco. :D
     
  21. Robbo99999

    Robbo99999 Notebook Prophet

    Reputations:
    4,346
    Messages:
    6,824
    Likes Received:
    6,112
    Trophy Points:
    681
    Yep, I agree with all of that, and what you said about IC Diamond! (Probably doesn't help to go completely nuts about slapping on the paste, though; a little too much is probably fine, too little is not.)
     
    SinOfLiberty likes this.
  22. Robbo99999

    Robbo99999 Notebook Prophet

    Reputations:
    4,346
    Messages:
    6,824
    Likes Received:
    6,112
    Trophy Points:
    681
    Quite Possible!
     
  23. SinOfLiberty

    SinOfLiberty Notebook Evangelist

    Reputations:
    240
    Messages:
    308
    Likes Received:
    34
    Trophy Points:
    41
  24. SinOfLiberty

    SinOfLiberty Notebook Evangelist

    Reputations:
    240
    Messages:
    308
    Likes Received:
    34
    Trophy Points:
    41
  25. sasuke256

    sasuke256 Notebook Deity

    Reputations:
    495
    Messages:
    1,440
    Likes Received:
    449
    Trophy Points:
    101
    well this means probably a rebadge for the GTX 860M :)
     
  26. Cloudfire

    Cloudfire (Really odd person)

    Reputations:
    7,279
    Messages:
    10,304
    Likes Received:
    2,878
    Trophy Points:
    581
    Those specs are wrong. 30 ROPs? Look at the GPU-Z screenshot on the previous page. It says 16 ROPs.
    That is GPUBoss. Do they even have any credibility?
    The card is either Maxwell or it is Kepler. It can't be somewhere in between. GM107 can't be a Kepler refresh; it's called GM for a reason.

    It seems that core-clock efficiency has gone down a lot with the GTX 750 Ti compared to the GTX 650 Ti. So it looks like an entirely different architecture than Kepler: less performance-efficient, but seemingly concentrated on power reduction instead. Meaning they can shove in many more cores when 20nm gets here.

    Either the GTX 750 Ti is GM107/117, aka Maxwell, or it is GK10x, aka Kepler.
     
  27. SinOfLiberty

    SinOfLiberty Notebook Evangelist

    Reputations:
    240
    Messages:
    308
    Likes Received:
    34
    Trophy Points:
    41
    GPUBoss revealed the important part.

    They got the ROP count wrong, but the leaked GPU-Z screen also has some parts grayed out. You can't reveal everything; you either gray it out or mess up a bit of the info. The same happened with the 780: its leaked specs differed from what it actually has, but GK110 was confirmed.
     
  28. Karamazovmm

    Karamazovmm Overthinking? Always!

    Reputations:
    2,365
    Messages:
    9,422
    Likes Received:
    200
    Trophy Points:
    231
    No it doesn't. A game usually uses around 1.5GB of RAM, and the rest is VRAM.

    dosed, mixed and stirred, not shaken, from taptalk
     
  29. Karamazovmm

    Karamazovmm Overthinking? Always!

    Reputations:
    2,365
    Messages:
    9,422
    Likes Received:
    200
    Trophy Points:
    231
    Because only the doubtful things you post are accurate

    dosed, mixed and stirred, not shaken, from taptalk
     
  30. ThePerfectStorm

    ThePerfectStorm Notebook Deity

    Reputations:
    683
    Messages:
    1,452
    Likes Received:
    1,118
    Trophy Points:
    181
    Wrong. At 1440p (less than the 3K MSIs), Crysis 3 can use 2.2GB+ VRAM with Very High settings and 8xMSAA. Take that to Ultra and 16x MSAA, and who knows what you've got.

    How much vram do you actually need? - Graphics Cards - Linus Tech Tips

    Remember I said sometimes, not all the time.
     
    reborn2003 and Cloudfire like this.
  31. Cloudfire

    Cloudfire (Really odd person)

    Reputations:
    7,279
    Messages:
    10,304
    Likes Received:
    2,878
    Trophy Points:
    581
  32. Karamazovmm

    Karamazovmm Overthinking? Always!

    Reputations:
    2,365
    Messages:
    9,422
    Likes Received:
    200
    Trophy Points:
    231
    So you are saying to me that a 32-bit app addressed more than 3.5GB of RAM...

    Oh, I forgot, Crysis is one of the few 64-bit games out there...

    dosed, mixed and stirred, not shaken, from taptalk
     
  33. R3d

    R3d Notebook Virtuoso

    Reputations:
    1,515
    Messages:
    2,382
    Likes Received:
    60
    Trophy Points:
    66
    By the time you turn up the settings that high, you're not going to have playable framerates. So there's no point to the extra VRAM, especially for a mid range card like the 860m.

    And Crysis 3 doesn't have an 'Ultra' setting or 16x MSAA.

    And having 3 configurations of the 860m isn't impossible. The 660m came in 1GB MXM, 2GB MXM, 2GB soldered versions and the 650m came in 512MB soldered, 1GB soldered and 2GB soldered versions iirc.
     
    HTWingNut likes this.
  34. HTWingNut

    HTWingNut Potato

    Reputations:
    21,580
    Messages:
    35,370
    Likes Received:
    9,878
    Trophy Points:
    931
    Right, lol. I like it when I read that people recommend going for the larger video RAM to "future proof". The problem isn't just the amount, but the performance and bandwidth of the card. Sure, a game might use more than 2GB, but those that do are running at maximum AA and super-high resolutions. It's not like an 860M with 8GB GDDR5 will make a difference in the future; it will still be limited by the performance of the GPU.
     
    Cloudfire likes this.
  35. Cloudfire

    Cloudfire (Really odd person)

    Reputations:
    7,279
    Messages:
    10,304
    Likes Received:
    2,878
    Trophy Points:
    581
    I think all high-end GPUs should come with 3GB or 4GB of VRAM, because there are some scenarios in games where usage actually does go over 2GB. Like Skyrim with several mods running @ 1080p uses 2.2-2.4GB of VRAM. Not to mention several games would be dangerously close to 2GB. And you have to pick either 2GB or 4GB for 256-bit GPUs. 2GB would leave me very sceptical while playing with a GTX 880MX, for example.
     
    reborn2003 and Robbo99999 like this.
  36. Robbo99999

    Robbo99999 Notebook Prophet

    Reputations:
    4,346
    Messages:
    6,824
    Likes Received:
    6,112
    Trophy Points:
    681
    Yep, I've seen VRAM usage in the 1800s (MB) for Skyrim (vanilla plus the Bethesda High Resolution Pack). And I've seen 2300MB at the highest point for one of my games; can't remember which one though! I think 3GB is comfortable for high end, but 4GB is more comfortable; above that I don't think it makes sense as far as current GPUs in the mobile arena are concerned.
     
  37. Cloudfire

    Cloudfire (Really odd person)

    Reputations:
    7,279
    Messages:
    10,304
    Likes Received:
    2,878
    Trophy Points:
    581
    Totally agree. 4GB should be standard.

    Anyhow, what happens if you cross 2GB of usage in a game and you only have 2GB? Is the extra VRAM usage put in the DDR3? Or does it start clearing cached stuff that it doesn't need?
     
  38. HTWingNut

    HTWingNut Potato

    Reputations:
    21,580
    Messages:
    35,370
    Likes Received:
    9,878
    Trophy Points:
    931
    Truthfully though, even if a game shows 2.5GB of VRAM usage, does that really mean it's taking advantage of it, or is it just textures staged there for use? If you had 1GB of VRAM and it used 2GB, it would prioritize what it needs, and the other 1GB would just be staged in your system RAM, the way I understand it.
     
  39. Robbo99999

    Robbo99999 Notebook Prophet

    Reputations:
    4,346
    Messages:
    6,824
    Likes Received:
    6,112
    Trophy Points:
    681
    I've always thought the extra textures that wouldn't fit in VRAM would end up being stored in system RAM, which is better than the disk, but still slower than the VRAM. I wonder if the NVidia driver handles that or the OS? Apparently my GPU can use 7926MB of shared system memory (due to having 16GB of RAM; it was a lower figure when I had 8GB). So that's why I assume extra textures that can't fit in the VRAM sit in the RAM - but I could well be wrong about this!

    @HTWingNut, that's an interesting theory; I don't know how it's proved or disproved though. Maybe if a 680M 2GB went up against a 680M 4GB running Skyrim that uses 2.3GB on the 4GB card, and comparing performance results between the two. Don't know if that's been done before though!
     
    reborn2003 and Cloudfire like this.
  40. HTWingNut

    HTWingNut Potato

    Reputations:
    21,580
    Messages:
    35,370
    Likes Received:
    9,878
    Trophy Points:
    931
    This has been done in the past with desktop GPUs, not mobile, but IIRC the performance result was the same at 1080p. It wasn't until 3x 1080p monitors with 8xAA were used that it became an issue, and by then the FPS is cut down drastically anyhow. If I find the link I'll post it.

    edit: Not Skyrim, but here's some benchmarks 2GB vs 4GB single card:
    http://www.pugetsystems.com/labs/articles/Video-Card-Performance-2GB-vs-4GB-Memory-154/
    http://alienbabeltech.com/main/gtx-770-4gb-vs-2gb-tested/3/
     
    reborn2003 likes this.
  41. Robbo99999

    Robbo99999 Notebook Prophet

    Reputations:
    4,346
    Messages:
    6,824
    Likes Received:
    6,112
    Trophy Points:
    681
    Thanks for those links, I took a look at them. Yes, there's not much difference in performance between 2 & 4GB, but it looks like they didn't measure VRAM usage in both of the cards. It would have been interesting to have maybe seen an example of a game where the performance was the same between the 2 cards, but the 4GB card was using say 3GB of VRAM vs the 2GB card using say 1.9GB of VRAM. That would have then proved your theory about it only using the extra VRAM if it's there, but getting away with the same performance on a card with less VRAM.
     
  42. Robbo99999

    Robbo99999 Notebook Prophet

    Reputations:
    4,346
    Messages:
    6,824
    Likes Received:
    6,112
    Trophy Points:
    681
    Strange Double Post!
     
  43. HTWingNut

    HTWingNut Potato

    Reputations:
    21,580
    Messages:
    35,370
    Likes Received:
    9,878
    Trophy Points:
    931
    There was a test, but I just can't find it. I don't think it was Skyrim, but maybe GTA IV or something where vRAM usage was like 2.5GB, and they tested 2GB and 4GB cards and the result was the same at 2k or lower resolutions.

    I dunno. If I had an unlimited budget and that was my life/job, I'd love to test/evaluate stuff like this. But two $800 cards is a bit out of my budget. :p
     
  44. Cloudfire

    Cloudfire (Really odd person)

    Reputations:
    7,279
    Messages:
    10,304
    Likes Received:
    2,878
    Trophy Points:
    581
    I think the GPU puts everything in the VRAM. It doesn't make sense to me to put important things like textures on the system memory, because it's slower, but also because there would be latency from moving files from DDR to VRAM when needed.

    The system memory is obviously used for something, since games do use DDR3, but I'm not sure what exactly. All I know is that the system uses both DDR and VRAM, and the memory usage you see in GPU-Z is only from VRAM, not VRAM + system memory.

    It would be interesting to see what happens to performance if you use 2.5GB and only have 2GB of VRAM. Is the performance hit from clearing the VRAM cache to make room for things that need to be written, or is it from reading game files from the slower system memory? I'd like to think it is because DDR3 is slower, but I'm not sure.
     
  45. Robbo99999

    Robbo99999 Notebook Prophet

    Reputations:
    4,346
    Messages:
    6,824
    Likes Received:
    6,112
    Trophy Points:
    681
    Yeah, that's cool, you don't have to trawl to find the link, if you've seen the evidence of it that's fine, I believe you. I can kind of understand the idea that some games might use the extra VRAM if it's there, for just-in-case marginal scenarios that never or very rarely come about, and therefore performance could be same on lower VRAM cards.
     
  46. Robbo99999

    Robbo99999 Notebook Prophet

    Reputations:
    4,346
    Messages:
    6,824
    Likes Received:
    6,112
    Trophy Points:
    681
    Not sure of all the functions that would reside in RAM for a game, but AI routines or anything that needs to be processed by the CPU would sit in RAM. Sound processing. The base game code might sit in RAM - haha, I probably just made up a term there, I have no idea! Stuff that sits in VRAM would be stuff that the GPU can understand & manipulate before outputting to the screen, I guess; everything else would have to be in the RAM or on disk, I imagine.
     
  47. Karamazovmm

    Karamazovmm Overthinking? Always!

    Reputations:
    2,365
    Messages:
    9,422
    Likes Received:
    200
    Trophy Points:
    231
    Basic programming things

    dosed, mixed and stirred, not shaken, from taptalk
     
  48. Robbo99999

    Robbo99999 Notebook Prophet

    Reputations:
    4,346
    Messages:
    6,824
    Likes Received:
    6,112
    Trophy Points:
    681
  49. ThePerfectStorm

    ThePerfectStorm Notebook Deity

    Reputations:
    683
    Messages:
    1,452
    Likes Received:
    1,118
    Trophy Points:
    181
    The grass and trees look awful, though. They HAVE to fix that. And no, I have no idea when Unreal 4-based games will come out.

    This might help though - UNREAL ENGINE 4 FAQ . Read this before asking.
     
    Robbo99999 likes this.
  50. HTWingNut

    HTWingNut Potato

    Reputations:
    21,580
    Messages:
    35,370
    Likes Received:
    9,878
    Trophy Points:
    931
    It still looks flat with plastic sheen to it. When are engines going to get rid of the sheen on everything? That's what kills it for me. Plus no reflections on the windows. It looks like a miniature scale dollhouse to me. And yeah the grass is just flat textured too. I dunno, not real impressed. I guess I would have to see it in action in a game.

    Here's Unreal 4 engine Hind House:



    And the real Hind House:

    [image]

    [image]
     
    ThePerfectStorm likes this.