I apologize in advance if my choice of words or terms is wrong or ill-conceived.
I want 10-bit-per-channel output for a 10-bit-per-channel (30-bit) monitor. That is what I want. The NVIDIA GTX 780M has the hardware for it, but the drivers in Windows do not support full color depth. So I read up on the Intel HD 4600, which does not have driver support for 10 bits per channel, but does support 8-bit and 12-bit. I cannot find good info on exactly how to get around this issue, but I am sure it can be done.

One option is to install Ubuntu as a dual boot, as I have read that there is driver support on that platform. The other option I see is the new DisplayLink adapter, the DL-5500, which supports 4K (USB 3.0 -> DisplayPort) and does not run off either graphics card in the Clevo. It is supposed to support 10-bit-per-channel RGB to a 10-bit monitor, up to 3840x2160.

What I want is to be able to do video and photo editing, with the possibility of perfectly rendered, large prints. (No further discussion of that part is needed, please.)
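For what it's worth, here is a minimal sketch of what the Linux route looks like, assuming the proprietary NVIDIA driver is actually driving the external output: the driver's README documents a 30-bit ("Depth 30") mode enabled in the Screen section of xorg.conf. The identifiers "Device0" and "Screen0" below are placeholders, and on an Optimus laptop this only helps if the external port is wired to the NVIDIA GPU rather than routed through the Intel one.

    Section "Device"
        Identifier  "Device0"
        Driver      "nvidia"
    EndSection

    Section "Screen"
        Identifier    "Screen0"
        Device        "Device0"
        DefaultDepth  30          # 10 bits per color channel = 30-bit
        SubSection "Display"
            Depth     30
        EndSubSection
    EndSection

After restarting X the desktop runs at depth 30; whether a given application then requests and uses a 10-bit framebuffer is a separate question.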
Does anybody have any more in-depth info regarding this matter?
Thanks in advance!
-
"probably" need a quadro/firepro card.
-
Meaker@Sager Company Representative
It's a limitation of Optimus, as pointed out; unless you can find a hack for the Intel drivers, it's a no.
-
So, is there anyone out there who knows? Is it possible at all to display true 30-bit color (1.07 billion colors) on an external monitor using the P157SM? The hardware supports it on the Intel HD 4600, and on the GTX 780M as well. It seems that for marketing reasons they limit this capability to Quadro or FirePro? Can I do it in Linux?
This really takes a lot of the joy out of a 4K display purchase, doesn't it?
-
I think the underlying issue is that Photoshop (the only software where 10-bit really matters) requires a workstation card's OpenGL 30-bit support.
Technically, consumer-card hardware can output 30-bit, and this has been shown in various setups (on Linux, or to an HDTV, etc.); a quick way to verify it under Linux is sketched below.
And then the DisplayPort on some Optimus laptops being routed from the Intel GPU makes things even more complicated.
-
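On that note, a simple check (assuming an X11 session with the xdpyinfo utility installed) to confirm the desktop really is running at 30-bit depth:

    xdpyinfo | grep "depth of root window"
    # expected on a 30-bit setup:
    #   depth of root window:    30 planes

This only confirms the desktop depth; whether a particular application actually renders with 10 bits per channel is still up to that application and its driver path.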
So 1.07 billion colors is mostly just a gimmick then? Is there actually a difference between a 4K display advertised as "1.07 billion colors" and a 10-bit-per-channel display? 10 bits/channel x 3 channels = 30 bits, and 2^30 ≈ 1.07 billion colors, so they describe the same thing. (For comparison, 8 bits per channel gives 2^24 ≈ 16.7 million colors.)
Does any OS make use of more than 32-bit RGBA (24-bit color + alpha) "True Color"?
Adobe Premiere Pro and InDesign also make use of 10-bit, I assume.
What I am asking here is: what is the scenario for anyone owning a 4K display capable of displaying 1.07 billion colors? They are selling cheap TN-panel ones now which claim to be capable. Is anyone able to make use of this? And from what source?
The only thing I know for sure right now is that I will get 3840x2160 @ 60 Hz with DisplayPort 1.2. I have already thoroughly checked Intel's pages and tried 2560x1440 @ 60 Hz, so I'm confident of this.
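Rough numbers to back that up (assuming the usual CVT reduced-blanking timings for 4K at 60 Hz, roughly a 533 MHz pixel clock):

    533 MHz x 24 bits/pixel ≈ 12.8 Gbit/s  (8 bits per channel)
    533 MHz x 30 bits/pixel ≈ 16.0 Gbit/s  (10 bits per channel)

DisplayPort 1.2 (HBR2, four lanes) provides about 17.28 Gbit/s of payload bandwidth, so 3840x2160 @ 60 Hz fits even at 10 bits per channel as far as the link is concerned; whether the driver exposes a 10-bit mode is the separate problem discussed above.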
If somebody can shed more light on the subject of 4K color depth, I'd appreciate it. Thanks, baii!