4K on a small screen seems like overkill to me and doesn't make sense. How can you tell the quality difference of 4K on a 15-17 inch screen? Plus, 4K can hinder battery life. Just overkill.
-
-
So don't buy one.
I have compared 1080p and 4K side by side in 17-inchers, by the way. For some, I'm sure it's a "won't notice without looking for it" difference, but I am NOT in that group.
Visual quality of small text, or anything with an aliased line or curve, looks great. Games, where the frame-rate drop doesn't compromise the overall experience, look amazing. -
Yeah it's solely for the increased dpi. The extra crispness vs 1080p should be visible.
-
John Ratsey Moderately inquisitive Super Moderator
My phone's screen is 432 ppi and I can see the benefit of the high pixel density in terms of text clarity, but I hold it about 8" from my face. On the other hand, the normal viewing distance of my notebook screen is about 24", in which case the equivalent pixel density is 144 ppi - very close to FHD on a 15.6" panel. I personally value battery time over slightly better rendering on the screen, plus there's the risk of problems with any software that doesn't scale well (a bigger risk with older packages, of which I have several).
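For anyone who wants to check those numbers, here's a minimal back-of-the-envelope sketch (Python; the helper names are made up for illustration, not any standard API):

```python
import math

def panel_ppi(h_px, v_px, diagonal_in):
    """Pixels per inch from resolution and diagonal size."""
    return math.hypot(h_px, v_px) / diagonal_in

def equivalent_ppi(actual_ppi, viewing_dist_in, reference_dist_in):
    """Density that looks equally sharp when judged at the reference distance."""
    return actual_ppi * viewing_dist_in / reference_dist_in

print(round(equivalent_ppi(432, 8, 24)))     # phone at 8", judged at 24" -> 144
print(round(panel_ppi(1920, 1080, 15.6)))    # FHD 15.6" panel -> ~141
print(round(panel_ppi(3840, 2160, 15.6)))    # 4K 15.6" panel -> ~282
```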
John -
Tinderbox (UK) BAKED BEAN KING
I cannot see the point on anything under 32", unless it's to be used in a VR headset; when the display is 2" away from your eyes, 4K might be the minimum resolution.
John. -
Some people have better eyesight than others.
-
tilleroftheearth Wisdom listens quietly...
How can you tell? All you have to do is really look at it.
Just like with (professional) digital cameras around 1999: resolution isn't the only thing that is improved with more pixels.
And just as a 4K video from YouTube looks better on a 720p screen, a 4K monitor displaying even just a desktop is easier on the eyes for far, far longer than anything with less resolution of equal quality.
Not only are individual pixel differences/errors less obnoxious to our eyes because they are physically smaller and a much more insignificant part of the whole picture... they are also higher quality than their lesser counterparts.
If you're gaming, battery life will be an issue. But I don't care one iota about gaming.
If you're simply using a computer screen/monitor for hours every day, get the highest quality screen possible (period).
Just like the keyboard and the mouse (or, my preference, the TrackPad), the screen is what you use to interact with your computer the most - don't skimp on it to save a few mere $$ or a few minutes of battery life.
-
Falkentyne Notebook Prophet
tl;dr:
4k is good on a small screen if you care about content creation and image processing (e.g. you do WORK on your computer).
4k is worthless on a small screen if you're gaming on a laptop and want FPS higher than 30.
P.S. watching a 4k video on a 720p screen is supersampling antialiasing.... -
tilleroftheearth Wisdom listens quietly...
Of course, the error in your logic is that it's the GPU that determines whether you get that 'higher than 30 FPS', not the screen. The screen size is irrelevant.
And even gamers would rather have 4K screens - all other things being equal.
-
This. So much this. It irks me that computer manufacturers think it's OK to put crappy screens in laptops. They'll put some pretty decent hardware in there, advertise it at a low price, but fail to mention the abortion of a display panel they shoved in there to save $50 (see the first shipments of the new Inspiron 15 7000 gaming for an example). The screen is the single most important part of your machine. It's how the machine communicates its output to you. When speccing out a machine, you should start by allocating as much money as you can to the screen (without compromising your overall build/objective), then work your way down.
This applies to desktops as well. My monitors each cost as much as my GPU.
Never skimp on the display. Get an i5 instead of an i7 if you must, or get slower RAM (both trade-offs where you likely won't notice much of a difference, depending on your work load).
Well... you could simply set the in-game resolution to 1080p. 4K scales perfectly to 1080p (4 screen pixels will "merge" to show one pixel). The only real downside is the battery hit from having to light up/power 4 times as many pixels. Otherwise, FPS performance should be roughly the same as on a native 1080p display, and it will still look sharper.
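To make the "merge" concrete, here's a minimal sketch of that exact 2x integer downscale (assuming numpy; a real panel scaler or GPU may use a different filter, so treat this as an illustration rather than how any particular laptop does it). The same box average is also essentially what the supersampling anti-aliasing mentioned earlier amounts to.

```python
import numpy as np

def integer_downscale_2x(image):
    """Average each 2x2 block into one output pixel - the 'merge' described above."""
    h, w = image.shape[:2]
    assert h % 2 == 0 and w % 2 == 0, "needs an exact 2:1 ratio, e.g. 2160p -> 1080p"
    return image.reshape(h // 2, 2, w // 2, 2, -1).mean(axis=(1, 3))

frame_4k = np.random.rand(2160, 3840, 3)   # stand-in for a rendered 4K frame
frame_fhd = integer_downscale_2x(frame_4k)
print(frame_fhd.shape)                     # (1080, 1920, 3)
```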
Overall, unless battery life is important or your budget is limited, the only reasons (IMO) not to get 4K are cost and quality. I'll take a good 1080p panel over a lesser-quality 4K any day, and a good 4K panel can cost $200-$300 more than a good 1080p (I don't think a 4K panel is worth that much more than an otherwise comparable 1080p). But if the 4K panel is within $50-$100 of the 1080p, I'd go for it. -
4K video from YouTube looks better at 720p because of the POS encoding they use. The highest-resolution source will result in a better-looking lower-res image. It has nothing to do with the resolution of the LCD.
Except with 15" and 17" you need to scale 150% or 200%, which negates any real benefit of having the smaller pixels.
People don't game on battery. It has little to do with gaming. For gaming, users are usually looking for the fastest response time and refresh rate, which is typically best on an FHD display currently. Battery life will suffer for anyone running a 4K LCD over 1080p. Four times as many pixels as FHD has a significant effect on battery life just by nature of the beast, even idling.
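The raw numbers behind the scaling and pixel-count points are easy to check (a rough sketch only; the actual battery impact depends on the specific panel, backlight and drivers, not just the pixel count):

```python
uhd, fhd = (3840, 2160), (1920, 1080)
print((uhd[0] * uhd[1]) / (fhd[0] * fhd[1]))   # 4.0 -> four times the pixels to drive

for scale in (1.00, 1.50, 2.00):               # common Windows scaling factors
    print(scale, int(uhd[0] / scale), int(uhd[1] / scale))
# 1.0  3840 2160  (native, tiny UI elements)
# 1.5  2560 1440  (effective desktop workspace)
# 2.0  1920 1080  (same layout as FHD, each element drawn with 4x the pixels)
```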
I agree you should get the best LCD that suits your needs. It's what you look at every time you use your laptop. But different strokes for different folks. I personally don't see the benefit of 4K on a 15" or 17" notebook. I'll take a faster refresh and response and better battery life any day. But I'll also favor IPS over TN, because while TN is great for super-fast response times and refresh, it looks like **** for any kind of development work. I can't stand the tunnel vision, low color palette and washed-out look. -
tilleroftheearth Wisdom listens quietly...
HTWingNut, we're basically saying the same thing: when you're looking at it and you see a difference (for you), then you'll need the better screen.
Thanks for the additional info/correction about YouTube videos - I know that - but sometimes I forget to properly highlight that aspect (video is not my thing...). That doesn't negate the fact that higher-resolution sources (even when they're not from YouTube) still make an otherwise inferior screen look better, though.
A faster-responding screen is always a pleasure to use - even navigating the O/S, for me - but that doesn't soothe the eyes like a wider-bandwidth (in colour/brightness/grayscale/evenness) screen does.
Each of us has to balance the things important to us, yeah.
But that is why I say 'all other things being equal' or 'of equal quality', though. There are good and bad examples of any spec'd resolution... but when the important aspects of the equation come together properly into a finished product, the higher-resolution options offer something extra that the others can't - at any price.
An example from the (pro) digital camera world is that there are fewer bits in the shadows than there are in the brighter tones (with any camera sensor, regardless of its resolution). With physically more pixels, a sensor can give images that better mimic a stepless medium (i.e. film) by giving more 'bits' in the shadows than are otherwise possible (instead of 10 pixels showing the shadow side of a small area in all its 2-bit glory... today's cameras can offer 200x the pixels or more, with a quality that some say surpasses film). What those extra pixels do is give us detail in those shadow areas and make the image as a whole more believable, pleasing and natural to the eyes.
I heard from this video that 4K is noticeable only at 40 inches or larger.
-
tilleroftheearth Wisdom listens quietly...
Better resolution is noticeable on any size display, as long as your vision is up to it and you're close enough (for you).
Are you saying 40" away or a 40" monitor? Either way, how 'sharp' something looks is only part of the equation. There are many more ways that more pixels (at the same or higher quality) do an image justice than even the best 1080p screens can.
With corrective lenses I can see as well as or better than a fighter pilot (20/15 vision), depending on how tired my eyes are. With many decades of educating my eyes as to what makes a good image, the benefits of 4K are not subtle to me.
-
My eyesight isn't perfect, but I find it easier to use higher-resolution screens even on a laptop for things like reviewing Excel models and proposals in Word/PowerPoint. It's subjective, but I feel that I can read the numbers and words when zoomed out a lot without losing my perspective on where I'm looking in Excel, or see the details more clearly and spot any anomalies which need to be fixed in a document that will be printed.
-
Right. It's somewhat subjective, down to one's personal sensitivity but also their eyesight. I don't disagree that 4K is better overall visually, but that's only one small part of the equation that you have to personally balance against other technology hurdles, as already discussed (color gamut, refresh, response, brightness, viewing angles, etc.).
-
Support.2@XOTIC PC Company Representative
If you're gaming, 4K on anything under a 1070 is pretty much a slideshow, and I haven't even liked the experience on a 1070, to be honest. I run 1440p on an overclocked 980 Ti (about the same as a 1070) on my desktop and I'd like to see higher FPS on that; it would probably be around 20 on some games. I recommend a 120Hz 1080p display on a small-screen gaming laptop over a 60Hz 4K any day of the week; colors and detail can go directly to the hot place if they conflict with refresh or FPS in my book. If someone releases a 4K 120Hz <5ms panel for laptops and it costs the same as a regular display, I'll change my tune, but not before.
On the other side, things like modeling, drafting and photo/video editing either don't require high FPS, or the extra resolution doesn't have a noticeable effect on performance. For those things, 1) you want that higher level of detail, and 2) your performance isn't limited by having it, so I say absolutely get it for that.
If you're doing a little of both, work and play, something with a 1080, 1070 SLI, 1080 SLI or a desktop with at least that would be recommended; otherwise, turn your settings way down when you're gaming. -
I'm a photographer/videographer. Higher resolution videos are usually encoded at a higher bit rate.
The amount of detail greatly depends on the bit rate. A 4K video will be encoded at a much higher bit rate than a 720p video. When a 4K video is resized to a lower resolution (such as 720p), there will be some loss of detail because 720p has a lot fewer pixels than 4K, but all the bits (all the original data) are still there.
4K video shown (resized) at 720p resolution should look much better than 720p video shown at 720p, because the 4K video will almost certainly be recorded at a much higher bit rate. 4K video should also be recorded on a camera with a much higher-resolution sensor. A video recorded at 720p uses a lot fewer pixels on the sensor, and that means it will record less detail than the 4K camera.
The detail of the original image/video is more important than the detail of the display. Highly detailed images should look good at any resolution. A low-detail image will look bad at any resolution.
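A rough way to put numbers on that bit-rate argument (the 5 Mbps and 40 Mbps figures below are ballpark assumptions, roughly in line with typical streaming bitrates, not measurements):

```python
def bits_per_pixel(bitrate_bps, width, height, fps):
    """Average encoded bits spent on each pixel of each frame."""
    return bitrate_bps / (width * height * fps)

print(bits_per_pixel( 5_000_000, 1280,  720, 30))   # ~0.18 bpp for a 720p stream
print(bits_per_pixel(40_000_000, 3840, 2160, 30))   # ~0.16 bpp for a 4K stream
# The 4K stream carries ~8x the total data per second, so even after it is
# downscaled to fit a 720p screen it tends to keep more real detail than a
# native 720p encode of the same scene.
```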
Gaming is different because, as far as I know, there is no bit rate. In the Nvidia control panel there is an image quality setting, but I don't think it depends on the display resolution. For games, I think the quality of the image mainly depends on the detail of the texture/object/image when the graphics artist saved it. If you want higher-quality graphics in games, you download a high-definition texture pack to replace all the low-detail textures/objects/etc. -
In short, 4K is much better (more important) for recording devices than for displays.
A 4K image which is recorded on a 4K device should look good on a 4K display, and also look good resized to a 1080p/720p display, as long as the image is not re-encoded to a lower resolution!
If the original 4k image is re-encoded to lower resolution, then much of the 4k detail may be lost. -
Perfect! Yes, I agree. My 1080 Ti at 3440x1440 is still a far cry from 4K resolution and it still struggles to keep 60 FPS. I had the 980 Ti, which is why I bumped to the 1080 Ti. While it's an improvement, it's still not where I want it to be, unfortunately. So for me, 4K for gaming is just not there yet.
I find it comical how the Xbox One X is being touted as a "4K gaming machine". Then again, that means lower-quality graphics and 30 FPS. -
Support.2@XOTIC PC Company Representative
I'm hoping it actually has a positive effect on 1080p high-refresh gaming, since I know a lot of users still aren't making the 4K switch and are opting for lower-res 120Hz panels instead for TVs. -
Falkentyne Notebook Prophet
This actually depends.
Some TVs will downscale to 1080p with no loss in image quality at all, just higher PPI. However, MANY monitors *WILL* still interpolate 4K down to 1080p (when there should be no interpolation at all), causing a noticeable loss of image quality. I haven't seen a PC monitor that can downscale 1440p to 1080p with no interpolation at all. Even monitors with decent scalers (e.g. BenQ) will still interpolate. -
tilleroftheearth Wisdom listens quietly...
Nah, not more important on the 'recording' end. At least equal when we're talking about computer monitors.
You're forgetting that an O/S is the 'recording'/'rendering' engine for everything we see: a 4K monitor of sufficient quality makes those long bouts on the computer that much less painful (on the eyes).
-
Recording and rendering are two very different things.
In video, recording resolution and bit rate are paramount.
Record video at 480p or 480i, and it will look "similar" on either a 1080p display or a 4k display.
Record video at 4k and it "should" look significantly more detailed when viewed on a 4k display, than on a 1080p display.
The 1080p display has a lot less pixels which means a lot of tiny details may not be visible (or less clear), when viewing 4k video on 1080p display.
I'm not just some guy who hangs out on computer forums all day. -
tilleroftheearth Wisdom listens quietly...
Maybe someone can find the proper terms. I too understand images (moving or otherwise).
Try to read my response in the context it was meant.
The O/S is presenting the desktop, the fonts and all other items at the resolution the screen is capable of - not the other way around.
A higher resolution screen is not just about pixels. It has other attributes that make it better too (again: all other things being equal).
As for how 'clear' a display is (regardless of input quality) - that comes much more from the 'quality' and intrinsic 'design' side of the equation than from mere additional pixels.
With each jump in resolution, there had to be other factors that were first 'fixed' before the resolution itself became important (or marketable). That is why a quality 4K monitor is inherently superior to anything below it. Those 'extra' enhancements were just not needed (or maybe not noticeable enough) at the lower specs.
Given a recording at 'x' resolution - and the same size and 'quality' screens, one 1080p, one 4K - the 1080p screen will be inferior, though most viewers may not be able to tell why.
Given a recording at the same 'x' resolution - and different size screens (but again; of equal quality) - with each viewed at the same effective distance (i.e. arc angle) - the 1080p screen may or may not seem inferior - depending on the eyesight of the viewer.
Our senses do not like stepped input - they are very tuned to an analogue world - even if most consumers can't tell one way or another (it's called training/educating your 'ears', 'eyes', 'touch', 'smell', 'taste'). When our digital tools become more analogue in nature (at least on the output side; for monitors) - that is when the 'tool' disappears and it seems we are interacting with our data, directly.
One thing I've found so far: we are still so far, far away from that reality. But each and every advancement made shows how clearly bad the older 'tools' really were. To me, it's a double whammy. I can only imagine being born today among such advancements as we now have, instead of having to experience them through the senses of the old, worn-out and tired body that I'm left with.
What all our digital tools do is 'copy' the nature around us. It is that 'copying' part that is so hard to do, faithfully, to the original.
-
What I am trying to tell you is that "input quality" is much, much more important than the display resolution/quality.
If "input quality" (aka the original recording) looks terrible, then what you see on any display will look terrible.
If the original recording is high resolution and high bit rate, then what you see on screen should look good, both at 1080p and at 4k.
When the resolution of the original recording exceeds 1080p, then the 4k display has an advantage.
If resolution of the recorded video is 1080p or less, then the 4k display has little (or no) advantage.
This assumes the 1080p display and 4k display are equal in all ways, except resolution.
Don't focus on the O/S.
Video can be viewed on a display, without a PC, by directly connecting the camera to the display.
By the way, cameras and microphones make recordings; computers make renderings. -
tilleroftheearth Wisdom listens quietly...
And I am telling you that 4K+ is not just about video - and neither is this thread (directly).
The 'recording' isn't the topic of this thread, the quality of 4K vs. 1080p displays is.
And when 4K or other high(er) resolutions can be seen on a handheld device with a display of 6" or less, having a 4K display on a notebook at 15/17 inches makes for a better overall experience.
I agree 100% that the input source quality is the most important part of the chain. But you keep ignoring the fact that an O/S isn't built to a specific resolution. It scales effectively infinitely with the display it is attached to. So, in this case, the input source quality (i.e. the O/S) is already above the resolution of any display device we have access to in our notebooks.
Note too that your assumption that 1080p and 4K displays are 'equal in all ways' is also in error. I've already covered why.
-
I know what the thread topic is.
Displays are not only for personal computers. Televisions are displays too.
I can use a display without a personal computer and without windows/mac/linux/android/etc.
If the O/S changes/alters the image on the display, this has nothing to do with the display itself.
O/S = input source quality??? O/S scales infinitely with the display? Huh? Sorry, I don't speak this language.
input source = original recording (i.e. photo/video from camera or image on storage drive)
quality of input source depends mainly on camera (if image is from camera).
Let's not confuse this topic by including software effects into the discussion.
I also know that 1080p and 4k displays are not the same.
For simple comparison reasons, I assumed all variables are equal except resolution.
This makes the comparison more simple and less confusing, even though it's a false assumption. -
So did anyone learn something from the video I posted? Does anyone agree with the guy in the video?
-
tilleroftheearth Wisdom listens quietly...
38 minute video? I can't learn anything from that; total waste of my time. Is there a transcript available?
Did you respond to my queries (post#15)?
-
tilleroftheearth Wisdom listens quietly...
Televisions are a different forum.
If you can't understand the points I tried to make, that's your loss. You're not even attempting to understand what I'm saying.
I thought it would be easier for you to grasp, being in the field and all, but if you wanted to understand, you would be asking for clarity.
I'm not confusing anything here; you're the one that mentions software effects, not me.
Making a false assumption by default defeats the points you're trying to make. There is such a thing as making it too simple.
In 1999 I quickly found out that image quality isn't just what the camera's sensor delivers; the software plays a huge part (yeah, especially for a RAW file/format).
Btw, 'recorded' image quality is the sum of all the steps taken to deliver an image in its intended format (screen/print/billboard/etc.). The camera sensor is only part of that equation (and in many cases the lesser part). I'm not talking about 'effects' here; I'm talking about getting a RAW image and making it come alive for the viewer; that is my 'quality' standard.
When I deliver a set of images for projection on a TV, I actually take that TV from the client and finish the images with that TV in mind. When those same images are destined to be printed, they are finished in an entirely different way. The point being that an image by itself isn't 'perfect quality' on its own. It depends (and highly) on how it will be shown, and at what distance too.
-
My bad, and yes, that video is long as hell. At 27:29 he mentions 4K needing at least 40 inches.
-
tilleroftheearth Wisdom listens quietly...
At what distance? Display size means nothing if viewing distance isn't known.
(I'll go check the video around that time stamp... thanks).
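One way to put a number on that question is the usual one-arcminute (20/20) rule of thumb - a rough sketch, not a hard perceptual cutoff:

```python
import math

ARCMINUTE = math.radians(1 / 60)   # ~20/20 acuity: about one pixel per arcminute

def max_resolvable_distance_in(h_px, v_px, diagonal_in):
    """Farthest viewing distance (inches) at which single pixels are still resolvable."""
    pixel_pitch_in = diagonal_in / math.hypot(h_px, v_px)
    return pixel_pitch_in / math.tan(ARCMINUTE)

print(round(max_resolvable_distance_in(3840, 2160, 15.6)))  # ~12" for a 15.6" 4K panel
print(round(max_resolvable_distance_in(1920, 1080, 15.6)))  # ~24" for a 15.6" FHD panel
print(round(max_resolvable_distance_in(3840, 2160, 40)))    # ~31" for a 40" 4K TV
```

By that rule of thumb the answer swings entirely on distance: a 15.6" 4K panel only out-resolves a 20/20 eye inside roughly 12", while a 40" set does so out to about 31", which is probably the kind of viewing distance the video's figure assumes.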
-
tilleroftheearth Wisdom listens quietly...
Okay, I saw that video around the 25 minute mark on to about the 30+... babble, babble... ugh!
So, he's talking about using a 4K monitor at native resolution - yeah, for him 40"+ is a good size. And maybe for 'most' people too. That doesn't mean someone else won't value a 24" or even smaller (young eyes!) at native resolution, though, with the added quality that a monitor brings over a TV.
The errors of that $500 TV greatly offset the much higher quality of the 29" he showed... Yeah, he can use it at native resolution - but it isn't doing his eyes any good.
-
If you think a television is not a display, then I have nothing more to say to you.
Have fun arguing on this forum and believe whatever you want. -
tilleroftheearth Wisdom listens quietly...
Again, I never said that. I'm here to learn and share info on notebooks and computers in general. You?
This isn't arguing, btw. This is 'trying to have a conversation'...
-
I have learned nothing from you. I don't want to chat with you.
Good bye. -
tilleroftheearth Wisdom listens quietly...
Likewise.
But on my part, it's not from lack of trying.
-
Me personally, I see the benefits on larger displays, and not so much on smaller displays, especially at 15" or smaller. Maybe I could eventually be convinced to get a 4K 18" laptop, but I hate scaling and I would hate to try to use native 4K res on a 15" panel.
-
Support.2@XOTIC PC Company Representative
As good as scaling has become with updates, it's still pretty flawed IMO. And you're definitely right, the difference becomes more noticeable with screen size. -
tilleroftheearth Wisdom listens quietly...
With more and more hi-res screens available, the scaling will need to be addressed sooner rather than later...
My eyes can't wait for when 8K+ screens are available for notebooks... (finally, a use for a GPU).
-
Support.2@XOTIC PC Company Representative
Might yet be a bit before 8K comes to notebooks, I think. Unless when it arrives it's 30Hz or something. -
tilleroftheearth Wisdom listens quietly...
I'd like to believe it will be sooner rather than later. And 30Hz refresh rates/monitors will be looked at like dinosaurs by then.
-
Support.2@XOTIC PC Company Representative
Maybe we'll see one or two drop with Volta if it's powerful enough. -
2160p looks beautiful, but gaming on it requires hardware that occupies your wallet for a long time.
1440p, on the other hand, is a noticeable upgrade from 1080p, and even though it too requires additional power, it fits nicely between the two.
4K IMHO is objectively worth it, but running 4K 120Hz in games is almost outright insane.
I am going to wait a bit before boarding that boat.
Just my $0.02. -
tilleroftheearth Wisdom listens quietly...
Just a 'note': I don't game, I don't do video directly (for the most part... even if my output is used in videos...).
I just want the image on my screens to appear as if I'm looking at a $200 (my cost) print - or better yet, the actual scene itself as viewed with my own eyes.
1440p is a good step up from 1080p. But 4K is another level (when and if it has all the bells and whistles 4K comes with: HDR, extended color, a higher-quality signal path, etc.).
-
Support.2@XOTIC PC Company Representative
I'm kind of in the same place. 1440p was a good upgrade but I don't really have the hardware to make anything higher worth it. -
Spartan@HIDevolution Company Representative
Here's my old thread about this same topic:
How does anyone in his right mind buy a 4K screen laptop? -
This thread is also relevant to this discussion: Opinions on 4k gaming on laptop
As I mentioned there, we're seeing quickly diminishing returns on resolution increases. 4K is an improvement over 1080p, but much less so than going from SD to HD was. Past 4K, there's likely to be little noticeable benefit for most consumer uses from increasing the resolution further (although things like 5K make sense for 4K content creators, and much higher resolutions make sense for VR, etc.).
I'm much more excited for advancements in OLED and especially "true QLED" (not that fake QLED crap Samsung just started pushing under a false name for marketing reasons) than I am for advancements in screen resolution. These display technologies will make things look incredibly better than 16K resolution ever could.