If I remember correctly, the Skylake GT62 did output 4K@60Hz, but only with 4:2:0 chroma subsampling. This was the reason owners (myself included) believed it was HDMI 1.4.
I also asked MSI directly in the case of the GT73VR, and they didn't have an official spec other than 4K@60Hz and HDMI 1.4; however, my testing showed it was full HDMI 2.0 with 4:4:4 chroma, so I do believe new models "should" be HDMI 2.0, haha, or at least I hope so.
It's kind of silly that it took so long for HDMI 2.0 to be adopted. But then again, HDMI 2.0 isn't even adequate for 4K@60Hz with HDR, so it doesn't matter, hahaha!
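For anyone curious, the bandwidth math behind that limitation can be sketched in a few lines. This is a back-of-envelope estimate, assuming the common CTA-861 4K timing (4400 x 2250 total pixels including blanking) and HDMI's 8b/10b TMDS encoding, not a spec-exact calculation:

```python
# Rough TMDS bit-rate math for 4K@60 Hz over HDMI.
# Assumes CTA-861 timing (4400 x 2250 total pixels incl. blanking),
# three color components at 4:4:4, and 8b/10b encoding (25% overhead).

def tmds_gbit_s(h_total, v_total, refresh_hz, bits_per_component, components=3):
    pixel_clock = h_total * v_total * refresh_hz          # Hz
    data_rate = pixel_clock * bits_per_component * components
    return data_rate * 10 / 8 / 1e9                       # 8b/10b -> Gbit/s

HDMI_14_LIMIT = 10.2   # Gbit/s aggregate TMDS rate
HDMI_20_LIMIT = 18.0

sdr = tmds_gbit_s(4400, 2250, 60, 8)    # 4K@60 4:4:4, 8-bit SDR
hdr = tmds_gbit_s(4400, 2250, 60, 10)   # 4K@60 4:4:4, 10-bit HDR

print(f"SDR 4:4:4: {sdr:.1f} Gbit/s (fits HDMI 2.0: {sdr <= HDMI_20_LIMIT})")
print(f"HDR 4:4:4: {hdr:.1f} Gbit/s (fits HDMI 2.0: {hdr <= HDMI_20_LIMIT})")
```

8-bit 4:4:4 comes out around 17.8 Gbit/s, just under HDMI 2.0's 18 Gbit/s ceiling, while 10-bit HDR 4:4:4 lands around 22 Gbit/s and doesn't fit, which is why 4K@60Hz HDR over HDMI 2.0 has to fall back to 4:2:2 or 4:2:0.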
-
-
Thanks for the added details
I also thought someone else reported the GT80S Skylake did 4K@60Hz 4:4:4, but I don't have a link for it.
There might be some spec they can't meet to declare full HDMI 2.0 compliance, or maybe MSI doesn't want to pay the $$ to be able to list HDMI 2.0 compliance.
The GPUs are HDMI 2.0 capable, if the supporting hardware passes it through.
I would normally want to use DP anyway, but it's nice for supporting TVs and monitors that have limited connections and include HDMI 2.0.
With TB3 + DP, can you do 3 x 4K@60Hz without using the HDMI (2.0) port? -
Yeah. I mean, Maxwell GPUs were also HDMI 2.0 capable; we just didn't have the physical ports for some reason. I also agree, I would use DP instead.
In theory we could do that... 3 x 4K@60Hz using TB3 + DP, but I'm not sure it's configured or the bandwidth is enabled on these machines, nor do I have the money for such an amazing setup haha.
hmscott likes this. -
FYI - EVGA, with the "bursting into flames" GPUs discussed here months ago, have released a new cooling implementation (ICX vs ACX) for their Pascal desktop GPUs, and there are a bunch of new videos out about the new "fixes" to cooling and monitoring, posted here:
http://forum.notebookreview.com/thr...-vrms-overheating.797645/page-5#post-10459026
I really hope Nvidia takes notice and codifies using more temperature sensors around the board to monitor more points of concern: the GPU, power components, and memory.
It's also a good move to control temperature through split zones across 2 coolers; maybe not 2 fans in a laptop, but 2 heatplates, each independent and fully capable of handling the heat from its own zone.
Lots of good technical info in all the videos.
Last edited: Feb 10, 2017
Atma likes this. -
Hey, I'm not a tech expert and am struggling a bit here; I hope you can help me out. I ordered a new laptop, a GIGABYTE Aero 14 Wv7-OG4 QHD with an NVIDIA GTX 1060 and 7th Gen Intel Core i7,
and I will be using 2 external displays. The thing is, I was also going to upgrade one of my monitors to one with NVIDIA G-Sync support. From what I read from page 60 onward, every other post said that G-Sync might work, or it won't, when using DisplayPort 1.2. So, any info: will my external display work with G-Sync as long as the monitor has it, or is it always disabled since this laptop has Optimus anyway? I'm not quite sure.
hmscott likes this. -
In general, an Optimus laptop will support external G-Sync monitors on one or more external video ports, and if it has a MUX switch you can switch between iGPU (non-G-Sync) and dGPU (G-Sync) support on the internal display.
Consult the vendor product overview and specifications pages, the laptop manual, other documentation, tech support of the manufacturer, or another owner to be sure.
There is a group dedicated to supporting your laptop make / model, where you can get better answers than here:
NEW Aero 14 w/ GTX1060
http://forum.notebookreview.com/threads/new-aero-14-w-gtx1060.797689/
There are some owners in that thread and they likely have already answered the question, so read through that Aero 14 1060 thread for information, and if you don't find exactly what you are looking for you can post a question there.
And, Welcome to NBR
TBoneSan and Publ1cEnemy like this. -
Guys,
Any news on the Acer VN7-573G? I've been waiting since January for more info about availability, but still nothing. -
Asus ROG Strix GL553VE Review
Published on Feb 14, 2017
"Lisa Gade reviews the Asus ROG Strix GL553V, the company’s midrange 15.6” gaming laptop with the 7th gen Intel Kaby Lake Core i7-7700HQ 2.8 GHz CPU and NVIDIA GTX 1050 Ti dedicated graphics. This 5.5 lb. (2.5 kg) laptop series has undergone a cosmetic makeover and it looks much like the higher end ROG GL502VS we reviewed in late 2016 (we compare them in this video). The notebook has a full HD IPS matte display, RGB backlit keyboard, NVIDIA Optimus switchable graphics and a 1.2MP webcam.
It ships with 16 GB DDR4 RAM (32 gigs max), has a 256 GB M.2 SATA SSD and a 1TB 5400 RPM HDD. Intel 7265AC WiFi + Bluetooth, a 48 Whr battery and a 120 watt charger are standard. The laptop sells for $1,299. The GL553VD is $1,099 and it has GTX 1050 graphics vs. the Ti in this model."
ThePerfectStorm likes this. -
If they had a rebirth of the G46, then I'd consider Asus, provided it has at least a 1060 equivalent. I thought about that model a year too late, so I went with the W230ST with the 765M.
Sent from my iPhone using Tapatalk
hmscott likes this. -
Miguel Pereira Notebook Consultant
$200 of difference between the 1050 and 1050 Ti? That much?
Sent from my MHA-L29 using Tapatalk -
The English version of the AW15R3 review: http://www.notebookcheck.net/Alienware-15-R3-Notebook-Review.196584.0.html @hmscott @Phoenix +++
Feel free to post your comments at the bottom of the notebookcheck Review article
ThePerfectStorm and hmscott like this. -
ThePerfectStorm Notebook Deity
-
Sorry bro, but I put +++ in my post
Feel free to post your comments at the bottom in the notebookcheck review
Be honest.
ThePerfectStorm likes this. -
ThePerfectStorm Notebook Deity
No prob, just remember next time
- although I personally don't think anything short of a full CPU heatsink redesign can save AW.
Sent from my SM-G935F using Tapatalk
Papusan likes this. -
ThePerfectStorm Notebook Deity
-
If you feel strongly about warning people, please contact the author of that article, or the site editor, and remind them we have very active forums covering this and other laptops; they can read all the debugging and other helpful posts we've made over the last few months about the new AW laptops.
Last edited: Feb 16, 2017
ThePerfectStorm likes this.
-
But Dellienware continues as if nothing has happened
ThePerfectStorm likes this. -
ThePerfectStorm Notebook Deity
That is why I don't recommend a single Alienware laptop.
Sent from my SM-G935F using Tapatalk
Papusan likes this. -
I was done with them after Mr. Azor went fully BGA
This will tell you everything http://forum.notebookreview.com/thr...tles-to-0-78-ghz-xtu-says-thermal-but.801219/
------------------------------------------------------------
BTW @hmscott http://www.guru3d.com/news-story/could-nvidia-be-prepping-a-volta-release.html
Last edited: Feb 20, 2017
TBoneSan likes this. -
-
I'll save y'all 10 minutes of your life.
No. Unless you're happy with your external 1080 barely beating an internal 1060.
I was surprised how poorly it performed... and my expectations were low. -
It's only really worth it if you have no decent dedicated GPU in your laptop, but I wouldn't opt for anything faster than a 1050 or 1060 because of bandwidth. The benefits of it being a dock are nice too.
hmscott likes this. -
If I could have an LGA CPU-only laptop (no GPU) and an external GPU that wasn't starved by bottlenecks, I'd be all over that.
hmscott likes this.
-
Those results would be very disappointing if true. What about this review from the forums?
http://forum.notebookreview.com/thr...nware-graphic-amplifier-with-gtx-1080.792591/
Last edited: Feb 21, 2017
That's not too shabby and nothing worth splitting hairs about.
The problem with the benchmark in the link you've provided is he's using an external display. If one wants to use the laptop's internal display it's going to perform like in the video @hmscott posted above.
We're almost there, but it really highlights that much more bandwidth is required.
Maybe @bloodhawk can chime in as he's pretty well versed here. -
That doesn't tell me a whole lot unless it's compared with another GPU. The 3DMark graphics score isn't affected as much by the CPU or by bandwidth through Thunderbolt. Plus, several of those were run with G-Sync on.
If you compare those results with the GTX 1080 (single) in the Clevo P870DM3: http://forum.notebookreview.com/threads/sager-np9873-clevo-p870dm3-quick-review-by-htwingnut.795187/
3DMark FireStrike:
Clevo: 16877 GPU / 22096 Score
AW: 15180 GPU / 23162 Score
Witcher 3:
Clevo: 85 FPS
AW: G-sync capped at 60 FPS
Rise of the Tomb Raider:
Clevo: 129 FPS
AW: 88 FPS
GTA V:
Clevo: 103 FPS
AW: 86.9 FPS (although this is just a screen cap, not sure what average FPS was)
Crysis 3:
Clevo: 113 FPS
AW: 63 FPS (again, just a screen cap)
Doom 2016:
Clevo: 143 FPS
AW: 181 FPS
This of course varies wildly depending on where FPS was taken since there's no established bench for this
Hitman:
Clevo: 106 FPS
AW: 58 FPS
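To put the FPS figures above in proportion, here's a quick sketch computing the eGPU's share of native performance, using only the numbers quoted above (Witcher 3 excluded since G-Sync capped it, and screen-cap caveats apply as noted):

```python
# Share of native (Clevo P870DM3, GTX 1080) performance achieved by the
# AW Graphics Amplifier with the same GPU, from the FPS numbers quoted above.
results = {
    # game:                  (Clevo FPS, AW FPS)
    "Rise of the Tomb Raider": (129, 88),
    "GTA V":                   (103, 86.9),   # AW figure is a screen cap
    "Crysis 3":                (113, 63),     # AW figure is a screen cap
    "Doom 2016":               (143, 181),    # outlier; no established bench run
    "Hitman":                  (106, 58),
}

for game, (clevo, aw) in results.items():
    pct = aw / clevo * 100
    print(f"{game}: eGPU at {pct:.0f}% of native performance")
```

Outside the Doom outlier, the eGPU lands roughly in the 55-85% range of the native card, which lines up with the "1080 barely beating an internal 1060" observation earlier in the thread.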
I don't want to take away from what this user did, as it is very much appreciated, and I know the effort it takes to bench all those things. The point is just that while it clearly can run new games very well, it doesn't mean you're gaining a whole lot by running a high-end GPU in it. I think that JayZ video showed there are limitations, and throwing in a fast GPU doesn't help a whole lot. I don't think you'll do much better than 1060 performance even with a 1080 in there, which gets back to my point: if you have a 1060 or better GPU in your laptop, I don't see much benefit in the external GPU. If you have a system with a weak dedicated GPU, or no dedicated GPU at all, it makes a lot of sense. I just wouldn't bother with anything faster than a 1060, though, and would definitely not run higher than 2.5K (2560x1440) resolution.
hmscott likes this.
UPDATE: Retesting the External GPU for accuracy
(internal display vs. external monitor)
-
Ah, you're right. I use an external monitor on my laptop all the time, so I presumed that was the expected use case for the external graphics. Now the poor results shown in the video make much more sense, but again, not everyone would want to use the external graphics on the internal screen.
-
You might still be missing the point.
The external display on the laptop would still be running looped back video from the eGPU.
The laptop is still managing the video traffic out from the laptop and back from the eGPU to the laptop connected internal and external display.
The performance difference gained from the eGPU is higher when you connect an external monitor *to the eGPU*. That way the eGPU doesn't need to send the video results back to the laptop to display locally.
The performance is a little higher, but not as good as a native GPU on a desktop.
Here's the first test result in JZ's Update video:
eGPU 1080 looped back through the Internal laptop display vs external display on eGPU 1080 vs native GPU inside laptop using internal display.
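Back-of-envelope numbers suggest why that loopback path hurts. This sketch assumes TB3's commonly cited ~22 Gbit/s of PCIe bandwidth actually usable out of the 40 Gbit/s link (an approximation, not a spec guarantee):

```python
# The uncompressed frame stream the eGPU must send BACK over TB3 when
# rendering to a laptop-attached display, vs. TB3's usable PCIe bandwidth.

def display_stream_gbit_s(width, height, fps, bits_per_pixel=24):
    """Raw pixel traffic for one display stream, in Gbit/s."""
    return width * height * fps * bits_per_pixel / 1e9

TB3_USABLE_PCIE = 22.0  # Gbit/s, commonly cited approximation

for name, (w, h) in {"1080p60": (1920, 1080), "1440p60": (2560, 1440),
                     "4K60": (3840, 2160)}.items():
    back = display_stream_gbit_s(w, h, 60)
    print(f"{name}: ~{back:.1f} Gbit/s of return traffic "
          f"({back / TB3_USABLE_PCIE:.0%} of usable TB3 PCIe bandwidth)")
```

At 4K60 the return stream alone is roughly 12 Gbit/s, over half the usable link, and it competes with the normal CPU-to-GPU traffic, which is consistent with the loopback penalty JZ measured.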
Last edited: Feb 21, 2017
temp00876, Robbo99999, steberg and 2 others like this.
I'm not following you here. Why would I connect the external display to the laptop rather than the eGPU if I had one? All I said is that I wouldn't be using the eGPU with my internal display in any case.
My point is that the review seems relatively shallow, and I don't think it should be taken as a be-all, end-all review to determine whether or not eGPU solutions are useless.
Last edited: Feb 22, 2017
hmscott likes this.
The 2 videos focused in on exactly what we needed to know: how the eGPU runs looped back through the internal display, how it runs with the monitor on the eGPU, and how the "same" GPU runs natively in the laptop.
Those are the numbers that matter, and that's what we got from JZ.
Could he have done a bunch more games and benchmark comparisons? Yes, and he probably was going to, but when he got the results he got, confirmed over many hours of testing and then verified with Asus, I think JZ lost interest in eGPUs in general, and in this implementation by Asus in particular.
JZ is also on the path to cover the release of AMD Ryzen, AMD Vega, and maybe even the Nvidia 1080 Ti or a complete Pascal refresh in response to AMD (and, now that I think about it, perhaps a surprise release by Intel of new SKUs in response to Ryzen), so he couldn't afford valuable time wasted re-verifying, game after game, that eGPUs are lame with current technology.
I think rather than shallow, JZ was focused and to the point, wisely stopping further testing when eGPU performance wasn't of interest over native GPU performance in the laptop. -
What we talking about with eGPUs nao?
-
It's starting now(?)... delay(?)... 3,708 watching now, none of the feed URLs have anything... "Live Stream will begin in a moment"... it's been 15+ minutes of moments so far
Update: Music started... on both twitch and youtube...
NVIDIA GeForce GTX Gaming Celebration Livestream (4:27 to start, 7PM PT)
Usually watchable here: https://www.twitch.tv/nvidia
And, here: https://www.nvidia.com/page/home.html #ultimategeforce
Gonna miss the broadcast myself, hopefully a playback link will follow later... made it back just in time, but so far it's a no-show show...
Last edited: Feb 28, 2017
Talon likes this.
Got it on via Twitch. They need to get started soon since I'm on east coast time tonight.
hmscott likes this.
-
Either it's broken, not happening, or really late... nothing on any of the URLs... "Live Stream will begin in a Moment"... 15 minutes of moments have already passed
Update: Music started... on both twitch and youtube... -
Here we go! Please don't put Tom in charge of anything!!
hmscott likes this.
-
Update: I forgot the original 1080ti thread (CES 2017, first disappointment) was here:
http://forum.notebookreview.com/thr...7-january-5th-8th.797940/page-5#post-10473273
And, @Galm started a new 1080ti thread here:
http://forum.notebookreview.com/threads/1080-ti-unveiled.802053/
OMG, he's gonna do it!!
Faster than a speeding bullet, the ultimate GeForce...
Twitch died just before he delivered, and Youtube died right after
The Twitch feed has gone offline completely, and Youtube is a stutter / delay / stop fest...
Live Stream content not uploaded yet, but here's a short one Nvidia just posted:
Nvidia finally uploaded the Live event, but it's worse in quality than the choppy stream was... it starts at 30:30... hopefully Nvidia will fix it soon.
Last edited: Mar 1, 2017 -
Watched it on FB lul they were short on VRAM for those...
TBoneSan, Papusan, temp00876 and 1 other person like this. -
specialist7 Notebook Evangelist
With the old Nvidia, they would just castrate the Titan X and turn it into the xx80 Ti, but I'm surprised they kept the CUDA cores and only cut the ROPs and VRAM (by 1 GB), upped the clocks, and cut the price to bump the 1080 down to $499 MSRP. Feels like they won't have anything until the end of this year or next year, so this is their answer to AMD's Vega, which will be released across the first and second half of this year.
Atma likes this. -
The video stream was so choppy, and they haven't posted it yet to verify, but I could have sworn he started out saying the "Ultimate" was faster than the current fastest Nvidia GPU, the Titan XP (he had just given 5 away), then outdated them in his next segment
So that would make the 1080 Ti the fastest GPU Nvidia is shipping, right?
I wonder if Nvidia's gonna stop making the Titan XP or drop the price too?
Seems like the Titan XP is way too overpriced now.
Atma likes this. -
specialist7 Notebook Evangelist
Well, the Titan X, or rather the Titan series, is usually their way of introducing what the new architecture is capable of, usually a cut-down version of the Tesla (workstation) -> Titan -> 1080 Ti, etc...
I remember it was the biggest shock to the world back then when the first Titan came out, and not even a year later (half a year, really) the 780 Ti came out, set at half the price and really close to Titan performance, which got some people upset. The following Ti cards were just castrated versions of the Titan/X/Black... With this one, though, it's surprising because they kept the CUDA count the same, only lowering some specs along with taking out 1 GB of RAM lol. The higher clock rate will make it a faster video card overall.
The Titan X Pascal was seeing 1597 MHz core and max boosts to ~1900-1950 MHz at high temps; they were saying the 1080 Ti will be 1580 MHz or so, boosting/OCing to 2 GHz+ at better temps, so reviews should be showing higher benchmarking/performance results.
@specialist7
There was mention of new high-performance memory SKUs to be released to AIB partners for the 1060 / (1070) / 1080:
1080 Ti: 11 Gbit/s
1080: 10 -> 11 Gbit/s OC SKUs, memory OC
1070: 8 Gbit/s, no mention of memory OC SKUs
1060: 8 -> 9 Gbit/s OC SKUs, memory OC
It seems odd that the 1070 at 8 Gbit/s wouldn't also get a bump to 9 Gbit/s, or even 11 Gbit/s; maybe it will.
A small VRAM-only refresh; IDK how Nvidia will differentiate the naming of the slower-VRAM vs. higher-speed-VRAM boards.
I wonder what the notebook makers will do with these faster memory SKUs; maybe they will show up in their own model refreshes soon.
Last edited: Mar 1, 2017
Atma likes this.
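For context on what those per-pin speed bumps mean: effective memory bandwidth is just bus width x data rate / 8. A rough sketch using the cards' published bus widths (the old/new rates follow the list above):

```python
# GDDR5/GDDR5X effective bandwidth = bus width (bits) * per-pin rate (Gbit/s) / 8.
# Bus widths are the cards' published specs; rates are the refresh SKUs above.
cards = {
    # name:        (bus bits, old Gbit/s, new Gbit/s)
    "GTX 1080 Ti": (352, None, 11),
    "GTX 1080":    (256, 10, 11),
    "GTX 1070":    (256, 8, None),
    "GTX 1060":    (192, 8, 9),
}

def bandwidth_gb_s(bus_bits, rate_gbit_s):
    return bus_bits * rate_gbit_s / 8

for name, (bus, old, new) in cards.items():
    for label, rate in (("old", old), ("new", new)):
        if rate is not None:
            print(f"{name} ({label}, {rate} Gbit/s): "
                  f"{bandwidth_gb_s(bus, rate):.0f} GB/s")
```

So the 1080 goes from 320 to 352 GB/s, the 1060 from 192 to 216 GB/s, and the 1080 Ti's 352-bit bus at 11 Gbit/s lands at 484 GB/s.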
Small correction: Original 1080 had 10Gbps memory, not 9Gbps.
Sent from my iPad using Tapatalk
hmscott likes this.
Thanks for catching that, fixed
-
Looks like
the $699 GeForce GTX 1080 Ti is up to 35% faster than the GeForce GTX 1080, and is even faster in games than the $1200 NVIDIA TITAN X. - http://www.geforce.com/whats-new/articles/nvidia-geforce-gtx-1080-ti
What a great year for new desktop builds!
(or even upgrading)
I wonder why they said "even faster in games" rather than just 'even faster'.
The twitch stream video playback for the Nvidia 1080ti reveal looks clean:
http://www.ustream.tv/NVIDIA?utm_so...dium=visit-channel&utm_campaign=notifications
1080ti reveal starts at 19:30
Last edited: Mar 1, 2017
Atma likes this. -
Meh, 11 Gbps is what all 1080 owners can already achieve. So all they're doing is a simple factory overclock of the vRAM, hence why it says OC memory card. I would assume an existing owner could simply flash their card with that vBIOS and get the same factory overclock spec. Or keep using Afterburner and manually OC.
I'll be buying the 1080 Ti as soon as AIB cards are available. $699 is just too good of a price to pass up. -
I would expect "better VRAM" to also OC... I wonder how much headroom will be left to OC the VRAM in the refreshed Pascal GPUs, and the new 1080 Ti?
The AIB water-cooled 1080 Ti will likely be +$200-300, i.e. $899-$999; to benefit from the full performance, it's gonna be pricey
-
HaloGod2012 Notebook Virtuoso
So it looks like there won't be a mobile version of the 1080 Ti anytime soon. They will probably save that for the second half of the year. It could be that we may never see it at all.
hmscott likes this. -
Now it has been determined yet once again: Acer beat Alienware yet again!! For the 3rd time in a row. @iunlock @rinneh @Mr. Fox @Phoenix @hmscott @Ashtrix +++
And Acer uses, like Alienware, the well-known TRIPOD!!
And battery draining like the newest Alienware's. More of the same: the battery drains to 95% several times a day
------------------------------------------------------------------------------------------
As well, the new Alienware models are beaten by the ASUS ROG G701
Edit: I forgot to mention that Asus too, like AW and Acer, has battery discharge under full load!!
Last edited: Mar 2, 2017
Falkentyne, Cass-Olé, TomJGX and 8 others like this.
*Official* nVidia GTX 10xx Series notebook discussion thread
Discussion in 'Gaming (Software and Graphics Cards)' started by Orgrimm, Aug 15, 2016.
- Alienware really needs to get its act together.