I wouldn't say that it costs upwards of two grand just yet, guys... I mean... MSI is known for being ridiculously overpriced, and Razer Blade machines are equally so. This is the first original idea I've seen in a LONG time, since I first saw the Clevo P150HM actually.
The cooling system is ingenious, makes perfect sense in practice, and WORKS very well at that. Two fans, with two vents per fan, make a great gaming machine sing. Plus it lets the machine stay super thin... >.>
I say it'll cost no more than $2,000 if it is released. The dual mSATA SSDs are gonna drive the price up, no doubt, but I can see more budget-friendly models being released in the future. Without a doubt XoticPC will pick this up. I'll give up my R3 here to snag one. This is an epic design.
-
-
In which MSI laptops do you see this overpricing?
-
They already claimed a starting price of $2099 for the base model (which I believe comes with an HDD only).
-
I'm really curious as to whether or not they'll also sell this machine as a barebone to OEMs. That would be REALLY sweet.
-
Doubtful. They (Gigabyte) haven't in the past, and this is a very strong competitor against the Blade and other slim gaming notebooks; they really have no reason to.
-
Barebones would basically mean no RAM or hard drive, possibly no Wi-Fi card. That's a total of $150 less than base... /shrug/
I personally am "planning" on buying a base model and throwing in my own RAM and SSDs. -
Sorry, double post :cry:
-
Looks great and has the specs to boot, great combo!
My only concerns, besides how well it actually cools the SLI GPUs/CPU, are the fan noise levels.
Question for you guys with SLI setups: is micro stuttering and screen tearing still a problem? Has it gotten any better?
Had a CLEVO with SLI a while ago and the screen tearing and micro stuttering were very distracting and noticeable at times. Wish you could get G-Sync on laptops, because this would be a perfect laptop for it. -
May I ask why you consider screen tearing a multi-GPU issue? I thought it affected any computer without vsync in the same way..
As for microstuttering, thanks to the awareness that was raised last year, people have been keeping a close eye on the frame time metric. Nowadays I run Afterburner to show FPS and frame time on the OSD. As far as I can tell my SLI cards are doing very well with smooth frame times. A few months ago AMD produced a fix for microstuttering on CFX, and my personal experience with it has been positive as well. -
Funnily enough, the multi-GPU frame pacing issues in question (runts, tears, and dropped frames) can't be detected using traditional metrics like FRAPS and OSDs because of where they occur in the rendering pipeline. That's why they eluded tech journalists for so long, until some good investigative reporting by The Tech Report and PCPer last year finally brought the issue to light and produced a definitive (albeit very expensive) way to measure it via a capture-based testing setup. Nvidia's own efforts in this arena can't be forgotten either, as its public release of FCAT was also instrumental.
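For the curious, the basic frame-time analysis these OSD tools expose can be sketched in a few lines (a toy example of my own, not FCAT; the 2x-median threshold and function names are arbitrary choices, and capture-based runt/tear detection can't be reproduced this way):

```python
# Toy frame-time analysis: flag "stutter" frames whose frame time
# spikes well above the typical (median) frame time. This is only a
# rough proxy for what FRAPS/Afterburner show; capture-based tools
# like FCAT measure further down the rendering pipeline.

def frame_times_ms(timestamps_ms):
    """Frame times are the deltas between consecutive frame timestamps."""
    return [b - a for a, b in zip(timestamps_ms, timestamps_ms[1:])]

def stutter_frames(times_ms, factor=2.0):
    """Return indices of frame times exceeding factor x the median."""
    ordered = sorted(times_ms)
    median = ordered[len(ordered) // 2]
    return [i for i, t in enumerate(times_ms) if t > factor * median]

# Smooth ~60 FPS delivery vs. the same run with one 50 ms hitch:
smooth = [i * 16.7 for i in range(10)]
hitchy = smooth[:5] + [smooth[4] + 50.0 + i * 16.7 for i in range(5)]
print(stutter_frames(frame_times_ms(smooth)))  # []
print(stutter_frames(frame_times_ms(hitchy)))  # [4]
```

The point being: a smooth average FPS can hide exactly one of these spikes, which is why frame time, not FPS, is the metric people watch now.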
-
Robbo99999 Notebook Prophet
Have you given up on the idea you had on upgrading the GPU in your R3 then? Would be cheaper! Or are you hankering after some more mobility? -
-
No micro stutter whatsoever as long as the frame rate stays above 40 FPS with 780M SLI. For the most part this will never be an issue as long as you don't go overboard with AA.
-
In my personal experience it's not as bad with a single GPU. Maybe SLI has improved greatly, but with my GTX 780M it's not as noticeable. Maybe today's drivers are just more optimized. Anyway, glad to hear SLI/CF has been greatly improved.
Anyone see how big the PSU is for this? -
Several pages ago.
-
Screen tearing happens when a new frame is presented partway through the monitor's refresh, leaving you with partial frame draws; it's most common when the frame rate of the game exceeds the refresh rate of the monitor. The better the performance of the GPU (SLI is fast most of the time), the more likely it is to happen. Syncing should fix it, but if you get wild performance swings you end up clamping the frame rate low during the poorer-performing stretches. This is where adaptive vsync and G-Sync can help.
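To put numbers on that (a toy model of my own, not a measurement of any real hardware; the 60 Hz panel and the 0.5 ms alignment tolerance are assumptions):

```python
# Toy tearing model: with vsync off, a buffer swap that lands
# mid-refresh splits the displayed image between two frames (a tear);
# vsync'd swaps aligned to the refresh boundary never do.

REFRESH_MS = 1000 / 60  # ~16.67 ms per refresh on a 60 Hz panel

def tear_count(swap_times_ms, tolerance_ms=0.5):
    """Count swaps landing away from a refresh boundary."""
    tears = 0
    for t in swap_times_ms:
        offset = t % REFRESH_MS
        if min(offset, REFRESH_MS - offset) > tolerance_ms:
            tears += 1
    return tears

# A GPU pushing ~77 FPS (13 ms/frame) swaps mid-refresh nearly
# every frame; refresh-aligned (vsync'd) swaps never do.
fast_gpu = [i * 13.0 for i in range(1, 9)]
vsynced  = [i * REFRESH_MS for i in range(1, 9)]
print(tear_count(fast_gpu), tear_count(vsynced))  # 8 0
```

Vsync fixes the tearing but forces the GPU to wait for the next boundary, which is exactly the frame-rate clamping problem that adaptive vsync and G-Sync address.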
-
Wait, how do you know this? Have you seen it in action, or done a deep-dive analysis?
-
moviemarketing Milk Drinker
Is G-Sync incorporated in all Nvidia cards? -
moviemarketing Milk Drinker
Wow, seems to handle Splinter Cell at 4K resolution quite easily. -
No.. But just an example..
-
Isn't G-Sync independent of the card? Isn't that why they need a separate PCB in the monitor?
-
That's so sexy, it makes me want it even more now. Splinter Cell looks and runs so nice, makes me want 3x 4K monitors too.
-
Yes, and it requires a Kepler card (exact GPU models I can't recall), but I was only speaking about it as one of the technologies that can stop tearing. Enough about G-Sync..
-
G-Sync requires a 650 Ti Boost or higher desktop card.
Done talking about G-Sync.
-
moviemarketing Milk Drinker
Just to make sure, that would not include the 765M in the Aorus, correct?
OK, that's enough talking about G-Sync.
-
Yep, no laptop support at all at the moment.
-
Has anyone looked at the actual dimensions of this thing? 15.4 x 10.3 x 0.88 inches. It is barely bigger than most laptops with 15-inch screens... in fact, it has a slightly smaller footprint than Gigabyte's own P35K, which has a 15-inch screen.
The MSI G70 with its 17-inch screen has dimensions of 16.5 x 11.3 x 0.85, nearly an inch larger in both length and width.
How is this possible? It also looks fairly small for a laptop with a 17.3-inch screen in all the videos -
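Running the numbers quoted above (just arithmetic on the posted dimensions; the variable names are mine, and "G70" is as quoted in-thread, probably the GS70):

```python
# Footprint comparison from the dimensions quoted above (inches).
aorus_x7 = (15.4, 10.3)  # 17.3" screen
msi_g70  = (16.5, 11.3)  # 17" screen, as quoted above

def footprint_sq_in(length, width):
    return length * width

aorus = footprint_sq_in(*aorus_x7)  # 158.62 sq in
msi   = footprint_sq_in(*msi_g70)   # 186.45 sq in
print(f"AORUS footprint is {msi - aorus:.1f} sq in smaller")
```

So the AORUS gives up roughly 28 square inches of footprint to the MSI despite the same screen class, which is what makes the cooling question interesting.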
Meaker@Sager Company Representative
It makes sacrifices in cooling, processing power and expansion options.
-
Isn't it nice?
Forget about comparing with MSI. They need to have a notebook that big because it only has 1 fan, so it needs more room for the heat. Plus it has a much hotter spot (where the GTX 780M resides), so the notebook has requirements on the distance from the hot die to keep the chassis from getting too hot to touch.
The X7 has dual fans and its GPUs don't get nearly as hot, hence why they can put the chassis closer to the GPU without problems.
It's all about engineering. AORUS has done a much better job here than MSI. Well, if the CPU doesn't get as hot as the Chinese guys measured, that is. -
He's talking about the MSI GS70. Which has 2 fans.
-
Haswell CPUs get HOT HOT HOT without a large independent HSF, no matter how you look at it. And I'd much rather have an i7-4700HQ running at 2.5GHz than an i5-4300M at 3.0GHz.
Now, one thing that would have made this a more interesting gaming notebook is if they had used the i7-4750HQ with the Iris Pro 5200, so you could game on battery.
-
@Red: Oh I see. Well, that makes it even more impressive.
@HT: Yeah, I know, Haswell is a hot chip for sure. So much packed into such a small chip.
Personally I think if heat was a problem, then AORUS would be better off with a 35W quad-core CPU like Ivy Bridge had. It was actually quite a bit cooler than the 45W CPUs and had no problems running a GTX 680M.
But I'm not sure if a 35W Haswell quad exists?
I agree, lower clocks are better. A dual core would stink anyway in such a fine notebook. -
The Haswell equivalent of the 35W quad would be the 4702MQ, which I think was said to "run just as hot with 10% less performance" in this thread somewhere.
-
Thanks.
It runs 200MHz lower than the 47W CPUs. The 35W Ivy Bridge CPU I linked to above runs 500MHz lower than the 45W ones. So I'm inclined to agree that the 4702MQ will run maybe as hot as the 47W Haswells. It might be a couple of degrees lower, but that's about it, I think. -
I'm assuming the HQs are similar to the MQs, and the 4702MQ @ 37W runs just as hot at a slower clock speed than the 4700MQ @ 47W. So there's nothing gained by going with a 37W quad core; it will just run slower. Plus, Intel tends to price the 37W quads higher than the 47W ones, despite the pricing indicated on their website.
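As rough back-of-the-envelope math (the base clocks and TDPs below are from my memory of Intel's spec sheets, so treat them as assumptions): with the same architecture and core count, the base-clock ratio roughly approximates the sustained performance gap.

```python
# Rough estimate: same Haswell architecture and core count, so the
# base-clock ratio approximates sustained (all-core) performance.
# Clock/TDP figures are assumptions from memory, not from this thread.
cpus = {
    "i7-4700MQ": {"tdp_w": 47, "base_ghz": 2.4},
    "i7-4702MQ": {"tdp_w": 37, "base_ghz": 2.2},
}

def perf_gap_pct(slower, faster):
    """Percent performance lost going from `faster` to `slower`."""
    return (1 - cpus[slower]["base_ghz"] / cpus[faster]["base_ghz"]) * 100

gap = perf_gap_pct("i7-4702MQ", "i7-4700MQ")
watts = cpus["i7-4700MQ"]["tdp_w"] - cpus["i7-4702MQ"]["tdp_w"]
print(f"~{gap:.0f}% slower for {watts} W less TDP")  # ~8% slower for 10 W
```

Which lines up roughly with the "10% less performance" figure mentioned earlier in the thread.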
-
Just watched the video of the CES 2014 presentation of this machine, and the baseline model will cost $2099!
I will buy this thing if that's true.
Great pricing by Gigabyte on that one. ^_^
-
Sarcasm? LOL
-
About 2k is actually a good price for what you get compared to other models in my opinion.
-
Actually, no. I'm quite happy with that price. It's a great deal for what you get, to be honest. Most machines are over twice as thick and twice as heavy, literally, but only cost a couple hundred less.
At that point, spending a little more to get the same performance in a smaller machine is usually what people go for, unless they LIKE that huge machine... which is nice, but I move around A LOT with mine... so this M17x is pretty bulky. >w> -
Joined and sub'd to post here.
Price is a bit high but that's what you get for the form/wow/new factor.
I just placed an order for the Lenovo y510p w/ 755m sli a week ago and didn't see this thread till right after I purchased.
I need a new laptop now but if the Aorus came out now, I would be very very tempted to cancel the Lenovo and pick this up instead. -
-
I hope you never have to use their customer support, haha. Their products are good, but I avoid them now like the plague because of the horrible customer service experiences my wife and I have had. -_-
-
Why would they be releasing this laptop at the end of February using the 765M? Why would anyone buy this knowing the next iteration of graphics cards is literally 2 months away... at the most...
-
That's a valid and good question, Nick.
You could try asking on their Twitter and/or Facebook page. They seem to be pretty active on both:
https://www.facebook.com/AORUSGaming
https://twitter.com/AORUS_Gaming -
I can think of two reasons, but I may not be right haha. First, an SLI of, let's say, the 865M may be more expensive to put out and raise their price point. Or they could simply have spent a long time working out the temps, power, etc. of the 765M SLI and want to get the product out quickly to see how it fares before they invest in producing a newer model.
But hey, I just buy the things, so who knows haha. I agree with Cloud that you should ask them as well. -
Is there really a difference between HIGH and ULTRA graphics settings that one can tell on a 1080p display? If there isn't, then a single GTX 765M would get you playable fps on HIGH in all current games, and you could have 5 hours of battery life in an under-5-pound 15-inch machine such as the announced Lenovo Y50.
-
I already asked them, and no response yet. My thought was that they couldn't use an 800M-series chip to showcase and advertise at CES, so they used the 765M. Final production may be the 860M, which is more than likely not much faster than the 765M.
Regarding one 765M vs SLI, there's a huge difference at 1080p, especially considering the 128-bit memory bus. -
I don't think they will release this with the upcoming Nvidia chips. They have already stated specs and price.
Do you think PC modding companies such as XoticPC/Excaliber/etc. could potentially sell this beauty with 800M SLI instead of the 765M? I don't think so, but hope dies last
-
Why not? Like I said, they're not technically able to advertise with the 800M chips yet, so they have to go with what's available for marketing purposes. That may also be why they are waiting until March: to get the 800M-series chips in place.
-
Unless they use standard MXM cards, how can we expect the GPUs to be customizable..?
New gaming notebook unveiled - AORUS X7
Discussion in 'Gaming (Software and Graphics Cards)' started by Cloudfire, Dec 21, 2013.