The Notebook Review forums were hosted by TechTarget, which shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information posted on the forums is preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.
Problems? See this thread at archive.org.
← Previous page | Next page →

    New gaming notebook unveiled - AORUS X7

    Discussion in 'Gaming (Software and Graphics Cards)' started by Cloudfire, Dec 21, 2013.

  1. imglidinhere

    imglidinhere Notebook Deity

    Reputations:
    387
    Messages:
    1,077
    Likes Received:
    59
    Trophy Points:
    66
    I wouldn't say that it costs upwards of two grand just yet guys... I mean... MSI is known for being ridiculously overpriced and Razer Blade machines are equally so. This is the first original idea I've seen in a LONG time, since I first saw the Clevo P150HM actually.

    The cooling system is ingenious, makes perfect sense in practicality and WORKS very well at that. Two fans, with two vents per fan, makes a great gaming machine sing. Plus it lets the machine stay super thin... >.>

    I say it'll cost no more than $2,000 if it is released. The dual mSATA SSDs are gonna drive the price up, no doubt about that, but I can see more budget-friendly models being released in the future. Without a doubt XoticPC will pick this up. I'll give up my R3 here to snag one. This is an epic design. :D
     
  2. Kevin

    Kevin Egregious

    Reputations:
    3,289
    Messages:
    10,780
    Likes Received:
    1,782
    Trophy Points:
    581
    In which MSI laptops do you see this overpricing?
     
  3. HTWingNut

    HTWingNut Potato

    Reputations:
    21,580
    Messages:
    35,370
    Likes Received:
    9,878
    Trophy Points:
    931
    They already claimed a starting price of $2099 for the base model (which I believe is with HDD only).
     
  4. sangemaru

    sangemaru Notebook Deity

    Reputations:
    758
    Messages:
    1,551
    Likes Received:
    328
    Trophy Points:
    101
    I'm really curious as to whether or not they'll also sell this machine as a barebone to OEMs. That would be REALLY sweet.
     
  5. long2905

    long2905 Notebook Virtuoso

    Reputations:
    2,443
    Messages:
    2,314
    Likes Received:
    114
    Trophy Points:
    81
    Doubtful. They (Gigabyte) haven't in the past, and this is a very strong competitor against the Blade and other slim gaming notebooks, so they really have no reason to.
     
  6. HTWingNut

    HTWingNut Potato

    Reputations:
    21,580
    Messages:
    35,370
    Likes Received:
    9,878
    Trophy Points:
    931
    Barebones would basically be with no RAM or Hard Drive, possibly no Wi-Fi card. That is a total of $150 less than base... /shrug/

    I personally am "planning" on buying a base model and throwing in my own RAM and SSD's.
     
  7. IKAS V

    IKAS V Notebook Prophet

    Reputations:
    1,073
    Messages:
    6,171
    Likes Received:
    535
    Trophy Points:
    281
    Sorry double post:cry:
     
  8. IKAS V

    IKAS V Notebook Prophet

    Reputations:
    1,073
    Messages:
    6,171
    Likes Received:
    535
    Trophy Points:
    281
    Looks great and has the specs to boot, great combo!
    My only concerns, besides how well it actually cools the SLI GPUs/CPU, are the fan noise levels.
    Question for you guys with SLI setups: are micro stuttering and screen tearing still a problem? Has it gotten any better?
    Had a CLEVO with SLI a while ago and the screen tearing and micro stuttering were very distracting and noticeable at times. Wish you could get G-Sync with laptops, because this would be a perfect laptop for it.
     
  9. sponge_gto

    sponge_gto Notebook Deity

    Reputations:
    885
    Messages:
    872
    Likes Received:
    307
    Trophy Points:
    76
    May I ask why you consider screen tearing a multi-GPU issue? I thought it affected any computer without vsync in the same way...

    As for microstuttering, thanks to the awareness that was raised last year, people have been keeping a close eye on the frame time metric. Nowadays I run Afterburner to show FPS and frame time on the OSD. As far as I can tell my SLI cards are doing very well with smooth frame times. A few months ago AMD released a fix for microstuttering on CFX, and my personal experience with it has been positive as well.
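    The frame-time check described above can be sketched in a few lines of Python. This is a toy illustration only: the spike threshold and the idea of flagging frames well above the trace average are assumptions for demonstration, not how Afterburner or FRAPS actually analyze their logs.

```python
# Toy microstutter check over a list of per-frame render times in
# milliseconds (the kind of trace a FRAPS frametimes CSV or an
# Afterburner log would give you -- format assumed here).

def stutter_report(frame_times_ms, spike_factor=1.5):
    """Return (avg_ms, worst_ms, spike_count) for a frame-time trace.

    A "spike" here is any frame that took more than spike_factor
    times the average frame time -- a crude stand-in for the
    short/long frame alternation typical of SLI/CFX microstutter.
    """
    avg = sum(frame_times_ms) / len(frame_times_ms)
    worst = max(frame_times_ms)
    spikes = sum(1 for t in frame_times_ms if t > spike_factor * avg)
    return avg, worst, spikes

# A smooth 60 FPS trace vs. one with alternating short/long frames.
smooth = [16.7] * 8
stuttery = [8.0, 25.4] * 4  # same average FPS, far less smooth

print(stutter_report(smooth))    # 0 spikes flagged
print(stutter_report(stuttery))  # 4 spikes flagged
```

    Both traces average the same frame time (and hence the same FPS counter reading), which is exactly why average FPS alone hides microstutter.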
     
  10. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    Funnily enough, the multi-GPU frame pacing issues in question (runts, tears, and dropped frames) can't be detected using traditional metrics like FRAPS and OSDs because of where they occur in the rendering pipeline. That's why they eluded tech journalists for so long, until some good investigative reporting by the Tech Report and PCPer last year finally brought the issue to light and produced a definitive (albeit very expensive) way to measure it via a capture-based testing setup. Nvidia's own efforts in this arena can't be forgotten either, as its public release of FCAT was also instrumental.
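    The capture-based classification described above boils down to measuring how many scanlines each rendered frame actually occupied in the captured video output. A toy sketch of just that classification step follows; the 21-scanline runt threshold matches what PCPer reported using, but treat all numbers here as illustrative rather than as the real FCAT analysis.

```python
# Toy FCAT-style frame classification: capture analysis yields, per
# rendered frame, the number of scanlines it occupied on screen.
# Frames shown for very few scanlines are "runts"; frames never
# shown at all are "drops". Both inflate the FPS counter without
# contributing visible animation.

def classify_frames(scanlines_per_frame, runt_threshold=21):
    counts = {"displayed": 0, "runt": 0, "dropped": 0}
    for lines in scanlines_per_frame:
        if lines == 0:
            counts["dropped"] += 1
        elif lines < runt_threshold:
            counts["runt"] += 1
        else:
            counts["displayed"] += 1
    return counts

# Hypothetical trace: a 1080p panel has 1080 scanlines per refresh.
trace = [540, 500, 12, 0, 530, 8, 560]
print(classify_frames(trace))  # 4 displayed, 2 runts, 1 dropped
```

    A FRAPS-style counter would report all 7 frames here, while only 4 actually contributed to perceived smoothness, which is the gap the capture setup exposed.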
     
  11. Robbo99999

    Robbo99999 Notebook Prophet

    Reputations:
    4,346
    Messages:
    6,824
    Likes Received:
    6,112
    Trophy Points:
    681
    Have you given up on the idea you had on upgrading the GPU in your R3 then? Would be cheaper! Or are you hankering after some more mobility?
     
  12. lee_what2004

    lee_what2004 Wee...

    Reputations:
    822
    Messages:
    1,137
    Likes Received:
    14
    Trophy Points:
    56
     
    Last edited by a moderator: May 12, 2015
    moviemarketing and RMXO like this.
  13. n=1

    n=1 YEAH SCIENCE!

    Reputations:
    2,544
    Messages:
    4,346
    Likes Received:
    2,600
    Trophy Points:
    231
    No micro stutter whatsoever as long as FPS stays above 40 with 780M SLI. For the most part this will never be an issue as long as you don't go overboard with AA.
     
  14. IKAS V

    IKAS V Notebook Prophet

    Reputations:
    1,073
    Messages:
    6,171
    Likes Received:
    535
    Trophy Points:
    281
    In my personal experience it's not as bad with a single GPU. Maybe SLI has improved greatly, but with my GTX 780M it is not as noticeable. Maybe it's today's drivers that are just more optimized. Anyway, glad to hear SLI/CF has been greatly improved.
    Anyone see how big the PSU is for this?
     
  15. lee_what2004

    lee_what2004 Wee...

    Reputations:
    822
    Messages:
    1,137
    Likes Received:
    14
    Trophy Points:
    56
    Several pages ago,
     
  16. hfm

    hfm Notebook Prophet

    Reputations:
    2,264
    Messages:
    5,296
    Likes Received:
    3,049
    Trophy Points:
    431
    Screen tearing happens when frame updates aren't synced to the monitor's refresh, so you end up with partial frame draws; it's most noticeable when the frame rate exceeds the refresh rate. The better the performance of the GPU (SLI is fast most of the time), the more likely it is to happen. Syncing should fix it, but if you get wild performance swings you can end up with the frame rate clamped low during poorer performance stretches. This is where adaptive vsync and G-Sync can help.
     
  17. hfm

    hfm Notebook Prophet

    Reputations:
    2,264
    Messages:
    5,296
    Likes Received:
    3,049
    Trophy Points:
    431
    Wait, how do you know this? You've seen it in action or deep dive analysis?
     
  18. moviemarketing

    moviemarketing Milk Drinker

    Reputations:
    1,036
    Messages:
    4,247
    Likes Received:
    881
    Trophy Points:
    181
    Is G-Sync incorporated in all Nvidia cards?
     
  19. moviemarketing

    moviemarketing Milk Drinker

    Reputations:
    1,036
    Messages:
    4,247
    Likes Received:
    881
    Trophy Points:
    181
    Wow seems to handle Splinter Cell at 4k resolution quite easily.
     
    Last edited by a moderator: May 12, 2015
  20. hfm

    hfm Notebook Prophet

    Reputations:
    2,264
    Messages:
    5,296
    Likes Received:
    3,049
    Trophy Points:
    431
    No.. But just an example..
     
  21. HTWingNut

    HTWingNut Potato

    Reputations:
    21,580
    Messages:
    35,370
    Likes Received:
    9,878
    Trophy Points:
    931
    Isn't G-Sync independent of the card that's why they need a separate PCB in the monitor?
     
  22. RMXO

    RMXO Notebook Deity

    Reputations:
    234
    Messages:
    722
    Likes Received:
    242
    Trophy Points:
    56
    That's so sexy, it makes me want it even more now. Splinter Cell looks and runs so nice, makes me want 3x 4k monitors too.
     
    Last edited by a moderator: May 12, 2015
  23. hfm

    hfm Notebook Prophet

    Reputations:
    2,264
    Messages:
    5,296
    Likes Received:
    3,049
    Trophy Points:
    431
    Yes, and it requires a Kepler GPU (I can't recall the exact models), but I was only speaking about it as one of the technologies that can stop tearing. Enough about G-Sync.. :)
     
  24. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    G-Sync requires 650 Ti Boost or higher desktop card.

    Done talking about G-Sync. :)
     
  25. moviemarketing

    moviemarketing Milk Drinker

    Reputations:
    1,036
    Messages:
    4,247
    Likes Received:
    881
    Trophy Points:
    181
    Just to make sure, that would not include the 765m in the Aorus, correct?

    OK, that's enough talking about gsync. :p
     
  26. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    Yep no laptop support at all at the moment.
     
  27. colonels

    colonels Notebook Consultant

    Reputations:
    147
    Messages:
    222
    Likes Received:
    18
    Trophy Points:
    31
    Has anyone looked at the actual dimensions of this thing? At 15.4 inches x 10.3 inches x 0.88 inches it is barely bigger than most 15-inch-screen laptops... in fact it has a slightly smaller footprint than Gigabyte's own P35K, which has a 15-inch screen.

    The MSI GS70, with a 17-inch screen, has dimensions of 16.5 x 11.3 x 0.85, nearly an inch larger in both length and width.

    How is this possible? It also looks fairly small for a 17.3-inch-screen laptop in all the videos.
     
  28. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,431
    Messages:
    58,194
    Likes Received:
    17,902
    Trophy Points:
    931
    It makes sacrifices in cooling, processing power and expansion options.
     
  29. Cloudfire

    Cloudfire (Really odd person)

    Reputations:
    7,279
    Messages:
    10,304
    Likes Received:
    2,878
    Trophy Points:
    581
    Isn't it nice? :)

    Forget about comparing with the MSI. They need a notebook that big because it only has 1 fan, so it needs more room for the heat. Plus it has a much hotter spot (where the GTX 780M resides), so the notebook has requirements on the distance from the hot die to keep the chassis from getting too hot to touch.
    The X7 has dual fans and its GPUs don't run nearly as hot, hence why they can put the chassis closer to the GPU without problems.

    It's all about engineering. AORUS has done a much better job here than MSI. Well, if the CPU doesn't get as hot as the Chinese guys measured, that is.
     
  30. R3d

    R3d Notebook Virtuoso

    Reputations:
    1,515
    Messages:
    2,382
    Likes Received:
    60
    Trophy Points:
    66
    He's talking about the MSI GS70. Which has 2 fans.
     
  31. HTWingNut

    HTWingNut Potato

    Reputations:
    21,580
    Messages:
    35,370
    Likes Received:
    9,878
    Trophy Points:
    931
    Haswell CPUs get HOT HOT HOT without a large independent HSF, no matter how you look at it. And I'd much rather have an i7-4700HQ running at 2.5GHz than an i5-4300M at 3.0GHz.

    Now, one thing that would have made this a more interesting gaming notebook is if they had used the i7-4750HQ with the Iris Pro 5200 so you could game on battery. ;)
     
  32. Cloudfire

    Cloudfire (Really odd person)

    Reputations:
    7,279
    Messages:
    10,304
    Likes Received:
    2,878
    Trophy Points:
    581
    @R3d: Oh I see. Well, that makes it even more impressive :)

    @HT: Yeah I know, Haswell is a hot chip for sure. So much packed into such a small chip.
    Personally I think if heat was a problem, then AORUS would be better off with a 35W quad core CPU like Ivy Bridge had. It was actually quite a bit cooler than the 45W CPUs and had no problems running a GTX 680M :)
    But I'm not sure if there exists a 35W Haswell quad?

    I agree, lower clocks are better. A dual core would stink anyway in such a fine notebook.
     
  33. n=1

    n=1 YEAH SCIENCE!

    Reputations:
    2,544
    Messages:
    4,346
    Likes Received:
    2,600
    Trophy Points:
    231
    The Haswell equivalent of the 35W quad would be the 4702MQ, which I think was said to "run just as hot with 10% less performance" in this thread somewhere.
     
    Cloudfire likes this.
  34. Cloudfire

    Cloudfire (Really odd person)

    Reputations:
    7,279
    Messages:
    10,304
    Likes Received:
    2,878
    Trophy Points:
    581
    Thanks.

    It runs 200MHz lower than the 47W CPUs. The 35W Ivy Bridge CPU I linked to above ran 500MHz lower than the 45W ones. So I'm inclined to agree that the 4702MQ will run about as hot as the 47W Haswells. It will maybe be a couple of degrees lower, but that's about it, I think.
     
  35. HTWingNut

    HTWingNut Potato

    Reputations:
    21,580
    Messages:
    35,370
    Likes Received:
    9,878
    Trophy Points:
    931
    I'm assuming the HQs are similar to the MQs, and the 4702MQ @ 37W runs just as hot at a slower clock speed than the 4700MQ @ 47W. So there's nothing gained by going with a 37W quad core; it will just run slower. Plus Intel tends to price the 37W quads higher than the 47W ones, despite the pricing indicated on their website.
     
  36. imglidinhere

    imglidinhere Notebook Deity

    Reputations:
    387
    Messages:
    1,077
    Likes Received:
    59
    Trophy Points:
    66
    Just watched the video regarding the CES 2014 presentation of this machine and the baseline model will cost $2099!

    I will buy this thing if that's true. :D Great pricing by Gigabyte on that one. ^_^
     
  37. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    Sarcasm? LOL
     
  38. Joker13z

    Joker13z Notebook Enthusiast

    Reputations:
    0
    Messages:
    22
    Likes Received:
    5
    Trophy Points:
    6
    About 2k is actually a good price for what you get compared to other models in my opinion.
     
    Cloudfire likes this.
  39. imglidinhere

    imglidinhere Notebook Deity

    Reputations:
    387
    Messages:
    1,077
    Likes Received:
    59
    Trophy Points:
    66
    Actually no. I'm quite happy with that price. It's a great deal for what you get, to be honest. Most machines are over twice as thick and twice as heavy, literally, but only cost a couple hundred less.

    At that point, spending a little more to get the same performance in a smaller machine is usually what people go for, unless they LIKE that huge machine... which is nice, but I move around A LOT with mine... so this M17x is pretty bulky. >w>
     
    Cloudfire likes this.
  40. Wildo

    Wildo Notebook Enthusiast

    Reputations:
    0
    Messages:
    35
    Likes Received:
    1
    Trophy Points:
    16
    Joined and sub'd to post here.
    Price is a bit high but that's what you get for the form/wow/new factor.
    I just placed an order for the Lenovo y510p w/ 755m sli a week ago and didn't see this thread till right after I purchased.
    I need a new laptop now but if the Aorus came out now, I would be very very tempted to cancel the Lenovo and pick this up instead.
     
  41. Cloudfire

    Cloudfire (Really odd person)

    Reputations:
    7,279
    Messages:
    10,304
    Likes Received:
    2,878
    Trophy Points:
    581
    I think we are looking at a March release. I agree with imglidinhere and Joker.

    Take the MSI GS70 for example. It costs $1700.
    With the X7 you get an extra 120GB of SSD (RAID 0) and one extra GTX 765M. The value in that is $400.

    So the price is actually where it should be.
     
    Joker13z likes this.
  42. Joker13z

    Joker13z Notebook Enthusiast

    Reputations:
    0
    Messages:
    22
    Likes Received:
    5
    Trophy Points:
    6
    I hope you never have to use their customer support haha. Their products are good, but I avoid them like the plague now because of the horrible customer service experiences my wife and I have had. -_-
     
  43. Nick11

    Nick11 Notebook Consultant

    Reputations:
    72
    Messages:
    224
    Likes Received:
    8
    Trophy Points:
    31
    Why would they release this laptop at the end of February using the 765M? Why would anyone buy this knowing the next iteration of graphics cards is literally 2 months away... at the most...
     
    Cloudfire likes this.
  44. Cloudfire

    Cloudfire (Really odd person)

    Reputations:
    7,279
    Messages:
    10,304
    Likes Received:
    2,878
    Trophy Points:
    581
    That's a valid and good question, Nick.

    You could try asking on their Twitter and/or Facebook page. They seem to be pretty active on both:

    https://www.facebook.com/AORUSGaming
    https://twitter.com/AORUS_Gaming
     
  45. Joker13z

    Joker13z Notebook Enthusiast

    Reputations:
    0
    Messages:
    22
    Likes Received:
    5
    Trophy Points:
    6
    I can think of two reasons, but I may not be right haha. First, an SLI of, let's say, 865Ms may be more expensive to put out and raise their price point. Or they could simply have spent a long time working out the temps, power, etc. of the SLI 765M and want to get the product out quickly to see how it fares until they invest in producing a newer model.

    But hey, I just buy the things, so who knows haha. I agree with Cloud that you should ask them as well.
     
    Cloudfire likes this.
  46. colonels

    colonels Notebook Consultant

    Reputations:
    147
    Messages:
    222
    Likes Received:
    18
    Trophy Points:
    31
    Is there really a difference between HIGH and ULTRA graphics settings that one can tell on a 1080p display? If there isn't, then a single GTX 765M would get you playable FPS on HIGH in all current games, and you could have 5 hours of battery life in an under-5-pound 15-inch machine such as the announced Lenovo Y50.
     
  47. HTWingNut

    HTWingNut Potato

    Reputations:
    21,580
    Messages:
    35,370
    Likes Received:
    9,878
    Trophy Points:
    931
    I already asked them and no response yet. My thought was that they couldn't use an 800M series card to showcase and advertise at CES, so they used the 765M. Final production may be the 860M, which is more than likely not much faster than the 765M.

    Regarding one 765M vs SLI, there's a huge difference at 1080p, especially considering the 128-bit VRAM.

    Beamed from my G2 Tricorder
     
    Cloudfire likes this.
  48. biyectivo

    biyectivo Notebook Enthusiast

    Reputations:
    145
    Messages:
    19
    Likes Received:
    4
    Trophy Points:
    6
    I don't think they will release this with the upcoming Nvidia chips. They have already stated specs and price.

    Do you think PC modding companies such as XoticPC/Excaliber/etc. could potentially sell this beauty with 800M SLI instead of the 765M? I don't think so, but hope dies last :)
     
  49. HTWingNut

    HTWingNut Potato

    Reputations:
    21,580
    Messages:
    35,370
    Likes Received:
    9,878
    Trophy Points:
    931
    Why not? Like I said, they're not technically able to advertise with the 800M chips yet, so they have to go with what's available for marketing purposes. That may be why they are waiting until March as well, to get the 800M series chips in place.
     
  50. sponge_gto

    sponge_gto Notebook Deity

    Reputations:
    885
    Messages:
    872
    Likes Received:
    307
    Trophy Points:
    76
    Unless they use standard MXM cards, how can we expect the GPUs to be customizable?
     
← Previous page | Next page →