The Notebook Review forums were hosted by TechTarget, which shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information posted on the forums is preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.

    Let's Discuss 4k vs 1080p on a laptop.

    Discussion in 'Hardware Components and Aftermarket Upgrades' started by Luraundo, Oct 2, 2016.

  1. inperfectdarkness

    inperfectdarkness Notebook Evangelist

    Reputations:
    100
    Messages:
    387
    Likes Received:
    102
    Trophy Points:
    56
    I disagree, though. If you look at a given font (let's say Courier New) set to 1/4" in overall height, that font will look better on 4K than on 1080p, even assuming you adjust all other factors so that the actual height on both is precisely 1/4". This is because on 4K, the finer pixel pitch renders a less granular image. In short, it means that you will likely be able to read smaller text on 4K just as clearly as slightly larger text on 1080p. I do benefit while gaming, for sure, but even when browsing the internet it's much sharper too.
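    The pixel-pitch point is easy to put numbers on. A quick sketch (Python, purely illustrative; the 15.6" diagonal is an assumed typical laptop size):

```python
import math

def ppi(w_px: int, h_px: int, diagonal_in: float) -> float:
    """Pixels per inch for a panel of the given resolution and diagonal."""
    return math.hypot(w_px, h_px) / diagonal_in

# Hypothetical 15.6" panels at the two resolutions under discussion
fhd = ppi(1920, 1080, 15.6)   # ~141 PPI
uhd = ppi(3840, 2160, 15.6)   # ~282 PPI

# Pixel pitch in millimetres (25.4 mm per inch)
print(f"FHD: {fhd:.0f} PPI, pitch {25.4 / fhd:.3f} mm")
print(f"UHD: {uhd:.0f} PPI, pitch {25.4 / uhd:.3f} mm")
```

    At the same physical glyph height, the UHD panel has twice the linear pixel density, so each stroke of the font is drawn with twice as many samples.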
     
    bennyg likes this.
  2. tilleroftheearth

    tilleroftheearth Wisdom listens quietly...

    Reputations:
    5,398
    Messages:
    12,692
    Likes Received:
    2,717
    Trophy Points:
    631
    While it seems most here are resisting the urge to have the most pleasant viewing experience (overall) today... technology moves on.

    Here is Dell introducing their 32" 8K monitors. Sure, a cool $5K right now - but I've paid much more for monitors (Sony FD Trinitron GDM-FW900 24") (a $5K monitor from 2000 is worth more $$$$ than a $5K 'anything' today...).

    See:
    http://www.tomshardware.com/news/dell-canvas-8k-monitor-soundbar,33292.html


    While a few go on discussing the merits of 1080p (which is circa 22 years old, at least...), the real tech world moves on. :)


    See:
    http://www.geek.com/games/john-carmack-coded-quake-on-a-28-inch-169-1080p-monitor-in-1995-1422971/
     
    inperfectdarkness likes this.
  3. TBoneSan

    TBoneSan Laptop Fiend

    Reputations:
    4,460
    Messages:
    5,558
    Likes Received:
    5,798
    Trophy Points:
    681
    Once 4K at 120Hz is a thing, then I might dip my toes in. 60Hz is the biggest deal breaker for me.
     
    Spartan@HIDevolution likes this.
  4. HTWingNut

    HTWingNut Potato

    Reputations:
    21,580
    Messages:
    35,370
    Likes Received:
    9,878
    Trophy Points:
    931
    Great point. Refresh. If you want it for gaming and are willing to downscale to 1080p for decent performance, then that 60Hz is also going to be a deal breaker for many.
     
    TBoneSan likes this.
  5. HTWingNut

    HTWingNut Potato

    Reputations:
    21,580
    Messages:
    35,370
    Likes Received:
    9,878
    Trophy Points:
    931
    Bigger and newer isn't always better, Tiller. If you believe that, then you've succumbed to the marketing gluttony of the 21st century. And you're referencing a 32" display vs a 15" display. We're talking about *LAPTOP* LCDs, not desktop.

    And again, it all depends on what you want. If it's primarily for gaming, what's the point? You can't drive 8K natively by any means, and the refresh is sure to be only 60Hz at best (30Hz even maybe). Desktop work, maybe, but again, depends on apps you use and how well it scales.
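    To put a number on "can't drive 8K natively": the raw bandwidth arithmetic alone (ignoring blanking and compression) already rules out uncompressed 8K60 over a single cable of that era. A rough sketch; the DisplayPort 1.4 payload figure (~25.92 Gbit/s effective over HBR3) is quoted from memory:

```python
def uncompressed_gbps(w: int, h: int, refresh_hz: int, bits_per_px: int = 24) -> float:
    """Raw (blanking-free) video bandwidth in Gbit/s."""
    return w * h * refresh_hz * bits_per_px / 1e9

eight_k_60 = uncompressed_gbps(7680, 4320, 60)   # ~47.8 Gbit/s
four_k_60  = uncompressed_gbps(3840, 2160, 60)   # ~11.9 Gbit/s

# DisplayPort 1.4 carries roughly 25.92 Gbit/s of effective payload,
# so uncompressed 8K60 does not fit in one link without compression.
print(f"8K60: {eight_k_60:.1f} Gbit/s, 4K60: {four_k_60:.1f} Gbit/s")
```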
     
    triturbo likes this.
  6. inperfectdarkness

    inperfectdarkness Notebook Evangelist

    Reputations:
    100
    Messages:
    387
    Likes Received:
    102
    Trophy Points:
    56
    Unless I'm 3D gaming (or competing in world championships), I'll never benefit enough from 120hz to justify the abysmal resolution. When I first played half-life 1 on a Hercules GPU, it was in 640x480 & I was LUCKY if I got 20fps. Through many iterations of pc gaming, better resolution has always offered me more...vice better refresh/fps at lower resolution. So while some can't stand 60hz, I've long been used to FPS's at or below that threshold--because I'm always wanting a higher resolution & I'm willing to crank it up until I reach minimally playable FPS. 120hz offers minimal benefit for that--at least at present. Now when MIDRANGE mobile GPU's are churning out 100FPS at 4k in ultra settings--on the most modern of games--then I can see the benefits.

    I wouldn't be surprised to find that someone out there is stuck at 900P because they insist on having a 240hz refresh rate. That seems silly to me, but I suppose everyone has their own ideas.
     
  7. HTWingNut

    HTWingNut Potato

    Reputations:
    21,580
    Messages:
    35,370
    Likes Received:
    9,878
    Trophy Points:
    931
    It just shows that there is not a single solution and we have options. I just don't like it when people say how it works for them so other people are stupid because they like something else.

    Sent from my SM-G935V using Tapatalk
     
  8. Raidriar

    Raidriar ლ(ಠ益ಠლ)

    Reputations:
    1,708
    Messages:
    5,820
    Likes Received:
    4,311
    Trophy Points:
    431
    imo 4K and higher resolutions really only make sense on larger displays. For example, FHD 60" vs 4K 60" is a gargantuan upgrade; you can very clearly see the finer details. Not so much on a 15" screen, where you really have to have amazing eyesight. I'm willing to bet the majority of the consumer base cannot distinguish that fine level of detail on a 15" panel.

    What should be fixed across all panels is color gamut... I cannot believe we still have laptops with 72% sRGB coverage. 100% sRGB should be standard by now, pushing closer to 100% Adobe RGB. FFS, this was done with RGB LED years ago. I would kill for a 120Hz RGB LED WUXGA display.
     
    HTWingNut and triturbo like this.
  9. triturbo

    triturbo Long live 16:10 and MXM-B

    Reputations:
    1,577
    Messages:
    3,845
    Likes Received:
    1,239
    Trophy Points:
    231
    Well let's say that 4K is OK on 32" monitor, but I agree 8K is for big@$$ TVs.

    You know, I own the very definition of "pleasant viewing experience" and it was created 2009 (the display, the technology a couple of years before that), released in a 2010 notebook. But that's of course in my eyes. Yours could and quite likely are different. Define me your visual pleasure.

    Your eyes aren't however.
     
    TomJGX and HTWingNut like this.
  10. inperfectdarkness

    inperfectdarkness Notebook Evangelist

    Reputations:
    100
    Messages:
    387
    Likes Received:
    102
    Trophy Points:
    56
    15" only seems small. Then you realize that FHD smartphones (even UHD ones) present a display that is clearly superior to the older 720p or whatnot offerings that used to come on these tiny 4-5" displays. And if you can see the 720p > 1080p upgrade at 5"...then dollars to doughnuts you'll be able to see a 1080p > 4k upgrade on a 15".
     
    tilleroftheearth likes this.
  11. triturbo

    triturbo Long live 16:10 and MXM-B

    Reputations:
    1,577
    Messages:
    3,845
    Likes Received:
    1,239
    Trophy Points:
    231
    Mhhhmmmm, and then you should realize that the OS has to be tweaked for the new res, and Windows is NOT!!! When are you going to realize it? I've said earlier in this thread that I have little complaint about macOS's scaling - it's still wasted pixels in my book, but at least everything looks awesome.
     
    HTWingNut likes this.
  12. inperfectdarkness

    inperfectdarkness Notebook Evangelist

    Reputations:
    100
    Messages:
    387
    Likes Received:
    102
    Trophy Points:
    56
    Even if it's scaled though, it's not wasted pixels...because it will still LOOK BETTER. That's the myth that most people who haven't used 4k--don't seem to understand.
     
    tilleroftheearth likes this.
  13. tilleroftheearth

    tilleroftheearth Wisdom listens quietly...

    Reputations:
    5,398
    Messages:
    12,692
    Likes Received:
    2,717
    Trophy Points:
    631
    HTWingNut, you should know me by now. Marketing has little to no effect on me. What I test is what I believe - even on this very forum, when everyone was shouting at me that SSD's were the bees knees (back in 2009...) and I was very firmly saying 'no, no they're not' - at that time... why? Because they were garbage (in performance and available capacity - not to mention the rip off prices...) - when compared against USB drives and programs like eBoostr (ahh, memory lane...).

    Like I noted before; I don't need a 32" monitor to know that more (equal or higher quality) pixels is better... I can see that on 5.7" screens (phones)...

    Gaming... gah... not for me, granted. But, just because the almighty GPU isn't up to the task of driving a proper resolution for our eyes doesn't mean the monitor is at fault here...

    When something is good, it is good, period. 8K on a 32" monitor would be like how that pioneering Nikon D1 looked to my eyes back in 1999... like looking at a seamless/analogue version of my images compared to anything digital before then... That is why higher resolutions are better...

    All of our eyes (at 121MP or higher equivalence) are aching for displays that match and exceed their capabilities - not just give us a good enough rendition that we will be sadly laughing at in the not too distant future. ;)

     
    tgipier likes this.
  14. tilleroftheearth

    tilleroftheearth Wisdom listens quietly...

    Reputations:
    5,398
    Messages:
    12,692
    Likes Received:
    2,717
    Trophy Points:
    631
    My 'visual pleasure' is to be able to look into a monitor for 36 hours straight (photo editing) - if required - and not feel like I had my eyes massaged with rusty razor blades. :)

    Saying that one resolution is better for a certain screen size at a certain viewing distance is a fallacy... That is just the 'good enough' spec. When (not if...) monitors surpass the human eye in resolution (~121MP to 240MP... - yeah, that is a minimum of 11Kx11K of 'pixels' required - for any size of screen/monitor...) then we can begin talking about 'overkill' in monitor resolutions.
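    A quick sanity check of the arithmetic above: a square array of N megapixels has sqrt(N * 1e6) pixels per side (purely illustrative; the megapixel figures themselves are the post's claims, not established fact):

```python
import math

def side_px(megapixels: float) -> float:
    """Pixels per side of a square array holding the given megapixel count."""
    return math.sqrt(megapixels * 1e6)

print(side_px(121))  # 11000.0 -> the "11K x 11K" figure in the post
print(side_px(240))  # ~15492  -> the upper bound would need ~15.5K per side
```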

    Just don't confuse what a monitor is able to display with what a GPU can't do... Give them time, those puny GPU's will catch up soon enough... as will Windows capabilities to handle that scaling properly... ;)

    And sure, our eyes may get dimmer and see with less acuity over time as we age - but even then they can still be used to determine if a monitor is of higher quality or not (seeing, like all of our senses, is not just about the actual organ used to perceive that quality. Rather; it is always a blend of the senses/brain interaction that gives us the final 'picture', 'sound', 'touch', 'taste' or 'smell' that we perceive).

    Don't be the ( misquoted) Bill Gates of 2017... xK resolution is not for a specific sized monitor.

    Humans don't respond to digital output in a linear fashion (and that is exactly what LCD's output...). Our view of the world is analogue and more (from our digital sources for our senses) is always better (all else being equal, of course). ;)

     
  15. triturbo

    triturbo Long live 16:10 and MXM-B

    Reputations:
    1,577
    Messages:
    3,845
    Likes Received:
    1,239
    Trophy Points:
    231
    That's subjective, but everyone and their grandpa can tell you that the Mac DOES look better. If you don't believe me, believe them:

    I'm pretty certain that they've seen a couple of 4Ks. Oh and BTW, here's another opinion on the matter, which also provides a "fix" (not really there, but WAY better than what Windows does).

    Can you do that with your laptop? I can do that on mine. That's one of the things I searched high and low for (the backlight's refresh rate), not just the gamut. Again, this is a 2009 panel with ~2007 tech - shooting at ~1200Hz, with an A-TW polarizer and a color gamut higher than 99% of the displays out there, plus refresh rates of 48, 50 and 60Hz for smoother video editing/playback (hesitant to try and see if it overclocks; when I get a spare assembly I will try). These are things you rarely find on desktop displays, let alone on laptops.

    Are you certain about that? Because there are studies that claim it's lower than that - much lower. Also, we see in color only with the center of our eyes (that's also the area with the highest "resolution", hence the wrong assumption), and the colors for the rest of our field of view are "cached". That's why if you focus on a single point for a prolonged time, the sides lose color. For a better and more professional explanation, ask an optometrist (like I did; he's a friend of mine, so we discuss eyes and stuff from time to time).

    Did I say that anywhere? I most certainly did not, and I never would. I'm a supporter of future-proofing, and as such I want to replace as few components as possible while still having a competent machine. Having a great display, motherboard, case and PSU is a great start for a future-proofed set-up. With CRTs it was way easier than now, since you could've dropped the resolution and it would still look awesome. Nowadays there's still no such thing as a future-proof display. Not in my eyes, at least. You can get an 8K display and hope that in a couple of years GPUs will catch up, but what about until then? Would it scale properly to lower resolutions usable with your current GPU(s)? Quite a few 4K displays don't scale well to 1080p, even though they should. After all, it's four pixels for one on the very same real estate where the 1080p panels existed (i.e. 15.6" or 17.3"), and yet it looks bad (worse than a standard 1080p panel).
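    The "four pixels for one" case is whole-number (integer) scaling: each 1080p pixel should simply become a 2x2 block of identical 4K pixels, with no softening at all. A minimal sketch of that idea (hypothetical helper, not any panel's actual scaler; panels that blur 1080p are using a filtering scaler instead):

```python
def upscale_2x(img):
    """Integer 2x nearest-neighbour upscale.

    img is a list of rows, each row a list of pixel values; every source
    pixel becomes a 2x2 block of identical pixels in the output.
    """
    out = []
    for row in img:
        doubled = [px for px in row for _ in range(2)]  # duplicate columns
        out.append(doubled)
        out.append(list(doubled))                       # duplicate the row
    return out

src = [[1, 2],
       [3, 4]]
print(upscale_2x(src))
# -> [[1, 1, 2, 2], [1, 1, 2, 2], [3, 3, 4, 4], [3, 3, 4, 4]]
```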

    It still vastly depends on the lens/eye/mic/ear/and etc in front, doesn't it?

    I'm saying that with current Windows scaling, x resolution is perfect @100% (because there's no scaling and 100% use of the real estate - that's the main point of resolution; smoothing is second and obviously not well done for now). So I won't be the next misquoted Bill Gates (I think he proved to be misquoted) if you stop misquoting me. So I'll quote you to prove my point - it's a "blend" of a few factors, and in the current state there is a perfect resolution for a given real estate (for the majority at least; a lot more people would be content with 2560x1600 on 17.1" @100% than with scaled 4K at the same size).

    And that's the very reason why the last CRTs were better (perfect blend of digital and analogue, just too big and heavy) than anything on the market as of now. Also why I hope that half-pixel-drawing-capable technologies (like CNT-FED) would eventually hit the market.
     
    Last edited: Jan 7, 2017
    tilleroftheearth likes this.
  16. tilleroftheearth

    tilleroftheearth Wisdom listens quietly...

    Reputations:
    5,398
    Messages:
    12,692
    Likes Received:
    2,717
    Trophy Points:
    631
    triturbo,

    you have made some excellent points. Thanks for continuing the conversation here. :)

    No, I can't do that with any notebook I currently have or have used recently - but that is more because of my age and eye health than anything else. I used to be able to, but inferior technology seems to have run my capabilities down. ;)

    That is some panel you have! Which one is it, and in what notebook do you have it installed?


    Sure, depending on whose math (or statistics) you believe, that may be true (on paper). But my real-world experience has taught me never to underestimate my (or anyone else's) physical abilities - even if it's 'proven' to me by so-called scientific research and other mind-numbing, imagination-stagnating propaganda.

    Several years ago I created a few large (17") prints at various 'resolutions' and came to the realization that I quickly ran out of vocabulary to describe what higher and higher MP images did to the final print. Yeah; much like in audio, where the current and not-so-current know-it-alls laugh at anything better than a $150 Walkman from 1979.

    What became apparent very quickly though was that even at +20K x +20K printed to a printer capable of 'only' 1440x720 print output, I did not have enough 'computer' to print at much higher res.

    Those prints though served their purpose (even the ones that printed halfway through before the system crashed (printer or O/S?, I don't know...). And increased details were not only present at normal viewing distances with the higher resolution input files; but they were present in spite of the technically limited printing device I was using at the time too. A few years later, when technology had moved significantly and my eyes hadn't changed that much; I came to the conclusion that my eyes could resolve higher than ~250MP in optimum viewing conditions. I only wish that I could state the same thing today. :)

    Btw, I've been to a few optometrists until I found one that would allow me to play with my prescription so that I could continue to use my cameras/lenses as I want. I too would ask about the theory and the state of my eyes and make jumps in my conclusions of what would (or should; if I understood everything properly...) work best for me. It took a few eyeglass pairs along with a newfound perspective on the part of the optometrist to get to where I'm at now. But on a good day? I have 20/10 vision with my customized prescription. (Compared to about 20/30 vision with my best 'regular' prescription from any of the other optometrists' I've used...).


    No, never stated you said that. Just cautioning here...

    Again; I don't want a 4K or an 8K monitor to do no scaling on... I just want the pixels to use them as I see fit. ;)

    I also agree that some panels/displays look worse even when scaled by whole units - but that just makes that display bad - it doesn't (by default) make all 4K displays bad. ;)


    No, actually it doesn't. When we are talking about our senses, we can't separate our brain interaction from the conversation. That also means that we can learn to see/hear/feel/smell and taste better. Because our 'sensors' are not the last word on what we perceive (they don't deal in absolutes - they are much better at identifying differences...) - rather; they are merely one of the processes humans use to identify, label, measure and compare/contrast all the different inputs they are able to perceive (and yeah; some of us can do more or less than others, depending on how well we've been trained on using our sense(s)...).


    The point you're trying to prove below is a case of circular reasoning, I believe. ;)

    And it doesn't matter how many billion flies like xydt (uhm, you know - the smelly brown stuff) - that is irrelevant on how good I think xydt tastes. :)


    And on your points below; we agree 100%. ;)

    Even if half pixel drawing technology is just another doubling/quadrupling of the resolution of a given panel... :)

     
  17. Predator-X

    Predator-X Notebook Consultant

    Reputations:
    11
    Messages:
    138
    Likes Received:
    23
    Trophy Points:
    31
    I remember being among the first to purchase a Fujitsu-Siemens Amilo Xi 1554 laptop with a 1920x1200 17" display, back in 2007 or 2008, and everyone was saying: what? That's overkill, pointless, why would you do that?.. Now you see you have it in your laptops.. Pathetic/naive people..
    Exactly the same (some) people here. And within 5 years these little boys who said 4K is meaningless or pointless on 17" will be saying the opposite, will have quietly bought 4K at home, and will suddenly switch opinions simply because they can afford it.. And it will be standard, and again they will be saying that 8K is pointless.. :)
     
  18. TomJGX

    TomJGX I HATE BGA!

    Reputations:
    1,456
    Messages:
    8,707
    Likes Received:
    3,315
    Trophy Points:
    431
    4k on a laptop is pointless.. Not overkill but just not practical to use.. At 100% scaling everything is too small and I hate using larger scaling in Windows, everything messes up.. I'll be happy with 1080p on my laptop for a long time..

    Sent from my LG-H850 using Tapatalk
     
  19. HTWingNut

    HTWingNut Potato

    Reputations:
    21,580
    Messages:
    35,370
    Likes Received:
    9,878
    Trophy Points:
    931
    Stop with the insults, dude. Different strokes for different folks. In 5 years we will likely have a new Windows platform with better scaling. Again, great it works for you, go have fun with it. But stop with bashing other people's opinions and what they find comfortable. Calling them pathetic and naive (among other things) is uncalled for.
     
  20. triturbo

    triturbo Long live 16:10 and MXM-B

    Reputations:
    1,577
    Messages:
    3,845
    Likes Received:
    1,239
    Trophy Points:
    231
    It is indeed. It's an HP special-order LP171WU8-SLB1, only available for DreamColor-equipped 8740w machines (it was a $550 option, and understandably RARE).

    That's a HUGE part of our conversations when we talk about this matter. To be honest I'm somewhere around you on the matter. I doubt that anyone can pinpoint exact number and I doubt that it's a fixed one for sure. Just like with displays, there's no one-size-fits-all, but there could be one-size-fits-most and so far, that's not 4K.

    Inkjets are a bit hard to have fixed resolution. They can do the half-pixel-drawing I was talking about. How it's done? Magic. Like the prices of their inks. Kidding about the first, not so for the second. Since we are way off I'll just say that it's because they spray the ink and overlapping occurs.

    With your entire field of view? Are you sure that you see things with your peripheral vision just as well as when looking directly at them? Just focus on a single word in this sentence and tell me how far you can read without moving your eyes. Granted, some people see better with their peripheral vision than others, but that doesn't mean they see the same level of detail as looking straight on (trust me, a lot of arguing went into this).

    And this is where we differ, I want to use it @100%. Firstly that's how most of the software is intended to be used in first place and secondly you make use of the entire available resolution. I'll repeat (maybe fifth time) that macs do look awesome, but I'd rather have the space than smoothness. You are right though, when displays get high enough resolution, we'll have both. Till then, I'd have my 2560x1600 120Hz 10bit IGZO RGB LED IPS 17.1" thanks (yeah, I'm dreaming). That's the perfect display (for me) with the currently available technology and it would cover Rec. 2020 and be HDR ready (because of the IGZO, my aSi panel is almost there).

    And Windows is no help either, and that's what all the fuss is about. Hopefully they'll change the way they do scaling - both Microsoft and the display manufacturers.

    It's more like what inkjets and CRTs do - they don't paint (keyword in both cases) the whole pixel, which results in smoothing and looks natural.
     
    TBoneSan and tilleroftheearth like this.
  21. inperfectdarkness

    inperfectdarkness Notebook Evangelist

    Reputations:
    100
    Messages:
    387
    Likes Received:
    102
    Trophy Points:
    56
    Well since 15" WQXGA laptops will probably never happen, I'm stuck with 3k or 4k displays. I'd be perfectly happy with WQXGA; it's just not available in the form factor I need.

    No argument about the Mac. MS has not prioritized scaling, because they know that business PCs are still the bulk of their sales. And a good chunk of the rest is from the unwashed masses who bought into the 1080p brainwashing hype. That said, if Windows 10 gets patched next week and scaling is 99% fixed, I'm not going to be left kicking myself for naively insisting on only using the 100% (native) scaling option. My desktop is 150% scaled on 4K, and at arm's length I can still see everything on this 15" just fine.
     
  22. FredSRichardson

    FredSRichardson Notebook Groundsloth

    Reputations:
    183
    Messages:
    971
    Likes Received:
    420
    Trophy Points:
    76
    I really like my 4K 15" screen. Under Win 10 everything I run looks great with the exception of some dialogue boxes. I believe this is in part due to some changes I made.

    Most games I play run at 60 fps with 4K except Witcher 3 on Ultra which runs at 30fps. I've run that at 1080p and the frame rate goes back up to 60 fps but I've found I like the higher resolution more than the higher frame rate. 4K really is nice and it's hard to go back.

    Sent from my SM-G900V using Tapatalk
     
    tilleroftheearth likes this.
  23. tilleroftheearth

    tilleroftheearth Wisdom listens quietly...

    Reputations:
    5,398
    Messages:
    12,692
    Likes Received:
    2,717
    Trophy Points:
    631
    That is a nice platform you have indeed!

    On topic; it seems we've gone full circle. If no scaling is needed or wanted, 4K notebook panels make no sense to most people here...

    If scaling is used, higher-than-1080p panels are highly desirable (all other quality aspects being equal in the panels compared)...

    Windows 10 scaling issues are a problem I don't see. Partly because I invest in newer programs as necessary (and the new ones scale properly, or at least well enough...), and partly because the platforms I use just need to do 2D 'excellent' - 3D is still not part of my main or even secondary or tertiary workflows yet, so even the iGPUs I love can still drive the 2D workloads on almost any monitor I choose to use (and in most cases, 'times' 3).

    Hang on to your platform as long as you can; I agree it is a rare and special breed.

    For anyone with the panel + platform budget you had, getting an equivalent-quality 4K panel today is just as doable, though (imo).

    And that 4K or higher resolution would be at least equal if not superior (overall) to even your panel's pinnacle of excellence of almost a decade ago.

    I've had similar products/components over the years where I too thought they would never be surpassed - but in almost every instance, I would walk into a WalMart or some other mass consumer supermarket and find better for 1/20th of the cost or less... :(

    In 1999; the Nikon D1 was crowned the king of quality pixels in a body to match (almost) anyone's shooting style (i.e. professionals). In less than 3 years, there were such highly improved options available that even the once $10K body was a burden to continue shooting with for even the next few months... And the kick to the pants was that more than a few of those options cost only a fraction of what the pioneering D1 commanded at one point.

    I loved the D1's I went through... and even have the actual D1 I first bought to test against my F5's... but defending it past its 'best before' date is not something I can be accused of... even if I was expounding its virtues early on - when it seemed everyone else around me was groaning about its astronomically high price (at intro...) :)

    Lesson learned: even at a worse, per pixel, quality level - a higher MP camera of 'this' level of quality (and higher) was/is/and always will be better than the highest quality (per pixel) camera ever mass produced. The challenge was to use the new equipment in different ways from the 'old workflow' to extract that superior quality it offered.

    In the end; what changed wasn't the equipment so much; it was me.

    (And that is how it should be (continual learning/relearning); until I finally quit breathing...).

     
  24. JinKizuite

    JinKizuite Notebook Consultant

    Reputations:
    5
    Messages:
    123
    Likes Received:
    58
    Trophy Points:
    41
    I'm sad we pretty much skipped 2K resolution completely, really. It would have been a better step up than jumping straight to 4K, IMO.
     
    triturbo likes this.
  25. triturbo

    triturbo Long live 16:10 and MXM-B

    Reputations:
    1,577
    Messages:
    3,845
    Likes Received:
    1,239
    Trophy Points:
    231
    You know what business runs? Old software. Which doesn't scale (well). I'm certain there are companies that still run a particular Windows release just because their software was written for it. Of course, that's not the rule, but it's not an exception either.

    It is and I will.

    We'll go a bit off again, but this is the funny part - there's STILL no substitute almost a decade later (actually count it as a decade, considering when the LP2480zx was released, and I'm sure it wasn't developed overnight). OK, if we are optimistic, the IGZO 4K (used in a few machines, as well as the 4K "DreamColor") is about the same (color-gamut-wise) as mine. Given the arguments I've provided about 4K, why bother then?!

    Why the quotes around 4K "DreamColor"? DreamColor panels (or rather the setup/technology, since it wasn't only the panel) used to push the envelope and were a step above everything else. Even the first notebook DreamColor (the LP171WU5-TLB1 in the 8730w), which was of the LP171WU5 (8bit RGB LED TN) family (used in other systems), was the best of that family. Then the above-mentioned LP171WU8-SLB1 was an HP exclusive, hence the spicy pricing; the next best thing at the time was an 8bit RGB LED TN (which wasn't bad in comparison - actually the 8730w's DC has better color rendition, where the 8740w is on the warm side (stock for stock)), and then a 6bit 2CCFL TN (which was still a nice option). The 16:9 panels were used, with a slight "tune", under the PremierColor brand in a couple of generations of Precisions after the DreamColor equivalents. It doesn't matter though; they brought a slight improvement (refinement of the technology, nothing ground-breaking) while cutting 120 pixels. Of course, in my eyes there's no reason to "upgrade" (yes, those 120 pixels - I find 16:10 to be the perfect blend of portability and usefulness for laptops). Compared to everything else 16:9, the gap was even wider - the next best option was a wide-gamut 6bit TN. They were two steps above for quite a while.

    And finally, what do we have now? A panel that can be found in pretty much every 4K machine (an exaggeration, but the IGZO (Sharp) 4K is pretty common) and, although nice, is NOT an upgrade over what I already have. "Lesser" machines get the same panel as the "DreamColor". IF the 4K "DreamColor" were a true 4K DreamColor, well, that would've been another story - it would've meant they were pushing once again and would bring Rec. 2020 and HDR before they become mainstream, i.e. future-ready/proof technology. But then, would you buy next year's model? So far I have 0 reasons to get 4K. HDR 4K might finally switch me to the dark side (16:9), whether or not Microsoft fixes the damn scaling. They'd better, however.

    I get your point, but sometimes there's literally no substitute. Since we are on a laptop forum, just ask HDX Dragon owners :)
     
  26. Predator-X

    Predator-X Notebook Consultant

    Reputations:
    11
    Messages:
    138
    Likes Received:
    23
    Trophy Points:
    31
    Ahah, now you're calling it insults? Just because I spoke the truth? Be honest, be a man.. It is how I said.. People whining and moaning, but later those moaners will buy the product they hated and suddenly they'll love it. But they won't say anything about how they used to rant about the 4K product, right?
    And don't tell me what to do, OK? ;)
     
    Last edited: Jan 12, 2017
  27. Predator-X

    Predator-X Notebook Consultant

    Reputations:
    11
    Messages:
    138
    Likes Received:
    23
    Trophy Points:
    31
    Because 2K isn't that much of a difference from 1080p - it's only moderately higher than FHD, and yet people think 2K is something huge.
    It isn't. 1440p is less than double FHD's pixel count, while 4K is four times FHD, and that's the huge step and difference.
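
The pixel-count arithmetic here is easy to check; a quick sketch comparing common panel resolutions against FHD:

```python
# Pixel counts of common resolutions, expressed as multiples of 1080p (FHD).
resolutions = {
    "FHD (1920x1080)": (1920, 1080),
    "QHD (2560x1440)": (2560, 1440),
    "UHD (3840x2160)": (3840, 2160),
}

fhd_pixels = 1920 * 1080
for name, (w, h) in resolutions.items():
    ratio = (w * h) / fhd_pixels
    print(f"{name}: {w * h:,} pixels = {ratio:.2f}x FHD")
```

QHD works out to about 1.78x FHD's pixels, while UHD is exactly 4x - so the jump from 1440p to 4K really is the larger one.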
     
    Last edited: Jan 12, 2017
  28. Predator-X

    Predator-X Notebook Consultant

    Reputations:
    11
    Messages:
    138
    Likes Received:
    23
    Trophy Points:
    31
    Scaling to 150%? That's a lot. You don't really need 4K then, when you're basically scaling the layout back down again. I scale to 125% at most.
     
  29. Predator-X

    Predator-X Notebook Consultant

    Reputations:
    11
    Messages:
    138
    Likes Received:
    23
    Trophy Points:
    31
    You play most games at 4K/60?
    Really? What laptop do you have?
    I really don't believe that you're playing all the new games at 4K on ultra on a little 15" laptop without problems.
    Can you prove it?
     
  30. Predator-X

    Predator-X Notebook Consultant

    Reputations:
    11
    Messages:
    138
    Likes Received:
    23
    Trophy Points:
    31
    I have 1080p on 17.3" and I still scale Windows to 125%. It's really Microsoft's fault: they haven't optimized it in the 10 or 15 years since everybody was using 1024x768. The guys at Microsoft are just a bit lazy. Apple HAS it optimized in OS X - their desktop fonts are nicer and rounder. Windows "ClearType" is sh...t.
     
    FredSRichardson likes this.
  31. FredSRichardson

    FredSRichardson Notebook Groundsloth

    Reputations:
    183
    Messages:
    971
    Likes Received:
    420
    Trophy Points:
    76
    I have a P650RS-G (6820 & 1070)

    Games are not all new. FO4, BL:TPS, D3 & W3 are all I play lately. So far only W3 isn't 60fps, but yes, the others are older games.

    EDIT: BTW, the new Tornado F5 with a 7700K and a 1080 can probably play most new games at 4K@60fps. And if that doesn't work, try a P870DM3-G with dual 1080s, though you may prefer 3K@120fps instead ;)

    Sent from my SM-G900V using Tapatalk
     
    Last edited: Jan 12, 2017
    TomJGX and TBoneSan like this.
  32. Predator-X

    Predator-X Notebook Consultant

    Reputations:
    11
    Messages:
    138
    Likes Received:
    23
    Trophy Points:
    31
    Oh yeah, that's a brand new laptop... I played Dishonored 2 and Battlefield 1 at 2K on ultra; 4K is too much.
    That definitely needs two GPUs, and even they might not be enough for new games.
     
  33. FredSRichardson

    FredSRichardson Notebook Groundsloth

    Reputations:
    183
    Messages:
    971
    Likes Received:
    420
    Trophy Points:
    76
    I am looking forward to trying out Dishonored 2 - really liked the first game. I'll report back on it when I get there but I expect it will be 4K@30fps or something (which might not be great for that type of game - W3 is pretty forgiving).
     
  34. swiftxshadow

    swiftxshadow Notebook Enthusiast

    Reputations:
    0
    Messages:
    26
    Likes Received:
    5
    Trophy Points:
    6
    This very question is an absolute nightmare. I have been researching for hours a day for the last month trying to decide between refresh rate and resolution. You would think that, with where we are with technology, high refresh plus high res would be available together. But it seems the only way to truly decide is to test them out - and that is nearly impossible without buying one of each with the sole intent of comparing them side by side and sending back the one you like least. Not all of us have that kind of money.
     
  35. swiftxshadow

    swiftxshadow Notebook Enthusiast

    Reputations:
    0
    Messages:
    26
    Likes Received:
    5
    Trophy Points:
    6
    Has anyone seen any articles or videos about when we may see 120+ Hz gaming on 4K laptops? Or good IPS panels on 1440p laptops? I have tried every search wording I can think of and can't find any articles with projected dates or companies currently working on the technology.
     
  36. magnavation

    magnavation Newbie

    Reputations:
    0
    Messages:
    7
    Likes Received:
    1
    Trophy Points:
    5
    I have been looking into this for a while. I came to the conclusion that a really good 1080p screen is the way to go for laptops over 4K. Maybe in several years a 4K panel will make sense in a laptop, but for now 1080p is just fine.
     
  37. Predator-X

    Predator-X Notebook Consultant

    Reputations:
    11
    Messages:
    138
    Likes Received:
    23
    Trophy Points:
    31
    I tried 4K60 on my 970, but it tears, and I don't want to play on medium settings.
     
  38. Predator-X

    Predator-X Notebook Consultant

    Reputations:
    11
    Messages:
    138
    Likes Received:
    23
    Trophy Points:
    31
    No chance... people play only at 1080p/120Hz.
     
  39. Galm

    Galm "Stand By, We're Analyzing The Situation!"

    Reputations:
    1,228
    Messages:
    5,696
    Likes Received:
    2,949
    Trophy Points:
    331
    @Eurocom Support or @EurocomTechspert mentioned a while ago that apparently AUO 4K IPS 120Hz GSync displays at 17.3" are on the way. Could see them this year.

    The Asus PG27UQ seems to be the first 4K 144Hz monitor on the desktop side but no eta on release yet.
     
    FredSRichardson likes this.
  40. FredSRichardson

    FredSRichardson Notebook Groundsloth

    Reputations:
    183
    Messages:
    971
    Likes Received:
    420
    Trophy Points:
    76
    I have also seen systems with 3K @ 120Hz. This one can be configured with two GTX 1080s in SLI (yes - it's insane!) and a 3K @ 120Hz screen:

    http://www.hidevolution.com/evoc-p870km1-dual-gtx-1080.html
     
  41. Galm

    Galm "Stand By, We're Analyzing The Situation!"

    Reputations:
    1,228
    Messages:
    5,696
    Likes Received:
    2,949
    Trophy Points:
    331
    Not really a 3K screen - that would mean roughly 1800p. 1440p is 2.5K. Anyway, the 1440p 120Hz TN G-Sync display is on the Clevo P775, Clevo P870, Aorus X7, and AW 17.

    (The "K" refers to the horizontal pixel count. So 2560x1440 is about 2.5K, and 3200x1800 is the closest to 3K.)
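
That "K" shorthand keys off the horizontal pixel count; a rough sketch classifying common panel widths (the round-to-nearest-half-K rule here is an assumption, since "K" labels aren't standardized):

```python
def k_class(width: int) -> str:
    """Label a resolution by its horizontal pixel count, rounded to the nearest half-K."""
    k = round(width / 1000 * 2) / 2  # nearest 0.5 (1K ~ 1000 px)
    return f"{k:g}K"

for name, width in [("FHD", 1920), ("QHD", 2560), ("QHD+", 3200), ("UHD", 3840)]:
    print(f"{name} ({width} px wide) -> {k_class(width)}")
```

Under that rule, 2560-wide QHD lands on 2.5K and 3200-wide QHD+ on 3K, matching the classification in the post above.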
     
  42. FredSRichardson

    FredSRichardson Notebook Groundsloth

    Reputations:
    183
    Messages:
    971
    Likes Received:
    420
    Trophy Points:
    76
    Oh, I didn't realize that! I'm not sure if HIDevolution does either - they are advertising this as a 3K screen:

    "17.3" 3K QHD (2560x1440) AUO 120Hz - 5ms - LED Matte Type Display - w/ nVIDIA® G-SYNC™ Technology"
     
  43. Galm

    Galm "Stand By, We're Analyzing The Situation!"

    Reputations:
    1,228
    Messages:
    5,696
    Likes Received:
    2,949
    Trophy Points:
    331
    It's a marketing buzzword, don't take it too seriously.

    The Samsung PenTile display shouldn't really qualify as 4K either - its effective resolution is more like 2.7K.
     
  44. FredSRichardson

    FredSRichardson Notebook Groundsloth

    Reputations:
    183
    Messages:
    971
    Likes Received:
    420
    Trophy Points:
    76
    Hah, yes, I hate it when marketers exploit ambiguity. At least 1080p always means the same thing. They did call it "QHD", but who can remember all these acronyms like "UWQHD+"?

    https://en.wikipedia.org/wiki/Graphics_display_resolution

    The wiki page doesn't even have a definition for 3K - so maybe that is ambiguous...
     
  45. inperfectdarkness

    inperfectdarkness Notebook Evangelist

    Reputations:
    100
    Messages:
    387
    Likes Received:
    102
    Trophy Points:
    56
    Marketing buzzword tomfoolery is why the 1080p resolution DOWNGRADE was shoved down consumers' throats so easily. And no, I'm not giving up three quarters of my pixels just to double the hertz. TV shows are still 30fps, movies are still 24fps. Yes, I get the idea of 120Hz being a "fit" for both - but I'm not going to saddle myself with piss-poor resolution on account of what I stream. I GAME with my laptop.
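
The "fit" mentioned above is simple divisibility: 120 Hz divides evenly by both 24 fps and 30 fps, so every source frame can be held for a whole number of refreshes (no 3:2-pulldown-style judder). A quick check:

```python
# If refresh % fps == 0, each source frame is shown an equal, whole number of
# times; otherwise frame durations must alternate (judder).
for refresh in (60, 120, 144):
    for fps in (24, 30):
        fit = "even" if refresh % fps == 0 else "uneven"
        print(f"{fps} fps @ {refresh} Hz: {refresh / fps:.2f} refreshes/frame ({fit})")
```

Note that 60 Hz is uneven for 24 fps film (2.5 refreshes per frame), while 120 Hz handles both 24 and 30 fps cleanly.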
     
  46. OSihota

    OSihota Notebook Enthusiast

    Reputations:
    2
    Messages:
    40
    Likes Received:
    7
    Trophy Points:
    16
    How old are you? I hate saying this but it sounds like you may need reading glasses.
     
  47. Galm

    Galm "Stand By, We're Analyzing The Situation!"

    Reputations:
    1,228
    Messages:
    5,696
    Likes Received:
    2,949
    Trophy Points:
    331
    LOL. No, my eyesight is good - kinda the problem. It's 20/20 or slightly above. I am migraine-prone (photosensitive), so the higher resolution is easier on my eyes.

    I think you kind of misinterpreted what I meant, btw. It's not that things in general look fuzzy to me; it's that I can quite easily see the fuzziness of a 1080p vs. a 2160p display. If my eyesight were poor, both would look bad, because I couldn't focus on the screen.

    I was at an eye exam like 2 weeks ago too.
     
    Last edited: Jan 23, 2017
    tilleroftheearth likes this.
  48. OSihota

    OSihota Notebook Enthusiast

    Reputations:
    2
    Messages:
    40
    Likes Received:
    7
    Trophy Points:
    16
    Ahh OK, just thought I'd ask because of the headaches. Reading glasses aren't conventional glasses, though - they have nothing to do with 20/20 vision. Even with otherwise perfect vision, you'll start needing reading glasses at some point (usually in your 30's), and a good sign is headaches when reading. I didn't know this until I started getting headaches while reading too. The minute I got reading glasses, my 1080p screen looked like it was QHD lol.
     
  49. Galm

    Galm "Stand By, We're Analyzing The Situation!"

    Reputations:
    1,228
    Messages:
    5,696
    Likes Received:
    2,949
    Trophy Points:
    331
    I'm not near my thirties, and again, I got checked for reading glasses just this month. And it's not headaches - I get ocular migraines; my vision literally stops working in sections...

    Thanks for looking out, but I'm pretty positive that's not the issue in my case. I can read my Retina-display iPhone from a foot away for hours with no difficulty, because I can't make out the pixels.
     
    Last edited: Jan 23, 2017
  50. jedisurfer1

    jedisurfer1 Notebook Deity

    Reputations:
    39
    Messages:
    785
    Likes Received:
    50
    Trophy Points:
    41
    Is anyone running 4K with no scaling on a 15.6" laptop? I run an M4800 with QHD+ (3200x1800) and I don't need scaling.

    My Lenovo P50 is 4K and I need to scale it to 125%, which defeats the purpose of the higher resolution for me.
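
Scaling trades pixel density for workspace rather than literally reverting the resolution: text and images still render at the panel's native pixels, but the logical desktop shrinks. A minimal sketch of the effective workspace at common Windows scaling factors:

```python
def effective_workspace(width_px: int, height_px: int, scale_pct: int) -> tuple:
    """Logical desktop size after display scaling (physical pixels / scale factor)."""
    factor = scale_pct / 100
    return round(width_px / factor), round(height_px / factor)

for scale in (100, 125, 150, 200):
    w, h = effective_workspace(3840, 2160, scale)
    print(f"4K at {scale}%: {w}x{h} effective workspace")
```

At 125% a 4K panel still gives a 3072x1728 workspace (well beyond both QHD+ and 1080p), and only at 200% does the layout match a 1080p desktop - though at full 4K sharpness.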
     