I disagree, though. If you look at a given font (let's say Courier New) set to 1/4" in overall height, that font will look better at 4K than at 1080p, even assuming you adjust all other factors so that the actual height on both is precisely 1/4". This is because on a 4K panel, the finer pixel pitch renders a less granular image. In short, you will likely be able to see a smaller font on 4K just as clearly as a slightly larger font on 1080p. I do benefit while gaming--for sure. But even when browsing the internet, it's much sharper too.
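A rough sketch of that pixel-pitch math (assuming a 15.6" 16:9 panel, which the post above doesn't specify, so treat the numbers as illustrative only):

```python
import math

def pixels_per_inch(diag_in, res_w, res_h):
    """Pixel density of a panel, from its diagonal size and resolution."""
    return math.hypot(res_w, res_h) / diag_in

def rows_for_glyph(glyph_height_in, diag_in, res_w, res_h):
    """How many pixel rows a glyph of a given physical height spans."""
    return glyph_height_in * pixels_per_inch(diag_in, res_w, res_h)

for name, (w, h) in {"FHD": (1920, 1080), "UHD/4K": (3840, 2160)}.items():
    ppi = pixels_per_inch(15.6, w, h)
    rows = rows_for_glyph(0.25, 15.6, w, h)
    print(f'{name:6s} on 15.6": ~{ppi:3.0f} PPI, a 1/4" glyph spans ~{rows:2.0f} pixel rows')
```

Roughly 141 PPI vs 282 PPI, so the same 1/4" character is drawn with about 35 pixel rows on FHD and about 71 on UHD - twice the detail at the same physical size.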
-
inperfectdarkness Notebook Evangelist
-
tilleroftheearth Wisdom listens quietly...
While it seems most here are resisting the urge to have the most pleasant viewing experience (overall) today... technology moves on.
Here is Dell introducing their 32" 8K monitors. Sure, a cool $5K right now - but I've paid much more for monitors (Sony FD Trinitron GDM-FW900 24") (a $5K monitor from 2000 is worth more $$$$ than a $5K 'anything' today...).
See:
http://www.tomshardware.com/news/dell-canvas-8k-monitor-soundbar,33292.html
While a few go on discussing the merits of 1080p (which is circa 22 years old, at least...), the real tech world moves on.
See:
http://www.geek.com/games/john-carmack-coded-quake-on-a-28-inch-169-1080p-monitor-in-1995-1422971/
-
Once 4K at 120Hz is a thing, then I might dip my toes in. 60Hz is the biggest deal breaker for me.
-
-
And again, it all depends on what you want. If it's primarily for gaming, what's the point? You can't drive 8K natively by any means, and the refresh is sure to be only 60Hz at best (maybe even 30Hz). Desktop work, maybe, but again, it depends on the apps you use and how well they scale.
-
inperfectdarkness Notebook Evangelist
Unless I'm 3D gaming (or competing in world championships), I'll never benefit enough from 120Hz to justify the abysmal resolution. When I first played Half-Life 1 on a Hercules GPU, it was at 640x480 and I was LUCKY if I got 20fps. Through many iterations of PC gaming, better resolution has always offered me more... rather than better refresh/fps at lower resolution. So while some can't stand 60Hz, I've long been used to frame rates at or below that threshold--because I'm always wanting a higher resolution and I'm willing to crank it up until I reach minimally playable fps. 120Hz offers minimal benefit for that--at least at present. Now when MIDRANGE mobile GPUs are churning out 100fps at 4K on ultra settings--on the most modern of games--then I can see the benefits.
I wouldn't be surprised to find that someone out there is stuck at 900p because they insist on having a 240Hz refresh rate. That seems silly to me, but I suppose everyone has their own ideas. -
It just shows that there is not a single solution and we have options. I just don't like it when people say that because something works for them, other people are stupid for liking something else.
Sent from my SM-G935V using Tapatalk
-
IMO 4K and higher resolutions really only make sense on larger displays. For example, FHD 60" vs 4K 60" is a gargantuan upgrade; you can very clearly see the finer details. Not so much on a 15" screen - you really have to have amazing eyesight for that. I'm willing to bet the majority of the consumer base cannot distinguish that fine a level of detail on a 15" panel.
What should be fixed across all panels is color gamut... I cannot believe we still have laptops with 72% sRGB coverage. 100% sRGB should be standard by now, pushing closer to 100% Adobe RGB. FFS, this was done with RGB LED backlights years ago. I would kill for a 120Hz RGB LED WUXGA display. -
Well, let's say that 4K is OK on a 32" monitor, but I agree 8K is for big@$$ TVs.
-
inperfectdarkness Notebook Evangelist
15" only seems small. Then you realize that FHD smartphones (even UHD ones) present a display that is clearly superior to the older 720p or whatnot offerings that used to come on these tiny 4-5" displays. And if you can see the 720p > 1080p upgrade at 5"...then dollars to doughnuts you'll be able to see a 1080p > 4k upgrade on a 15".
-
Mhhhmmmm, and then you should realize that the OS needs to be tweaked for the new res, which Windows is NOT!!! When are you going to realize that? I said earlier in this thread that I have little complaint about macOS' scaling - it's still wasted pixels in my book, but at least everything looks awesome.
-
inperfectdarkness Notebook Evangelist
Even if it's scaled, though, it's not wasted pixels... because it will still LOOK BETTER. That's the myth that most people who haven't used 4K don't seem to get past.
-
tilleroftheearth Wisdom listens quietly...
HTWingNut, you should know me by now. Marketing has little to no effect on me. What I test is what I believe - even on this very forum, when everyone was shouting at me that SSDs were the bee's knees (back in 2009...) and I was very firmly saying 'no, no they're not' - at that time... why? Because they were garbage (in performance and available capacity - not to mention the rip-off prices...) - when compared against USB drives and programs like eBoostr (ahh, memory lane...).
Like I noted before; I don't need a 32" monitor to know that more (equal or higher quality) pixels are better... I can see that on 5.7" screens (phones)...
Gaming... gah... not for me, granted. But, just because the almighty GPU isn't up to the task of driving a proper resolution for our eyes doesn't mean the monitor is at fault here...
When something is good, it is good, period. 8K on a 32" monitor would be like what that pioneering Nikon D1 looked like to my eyes back in 1999... like looking at a seamless/analogue version of my images compared to anything digital before then... That is why higher resolutions are better...
All of our eyes (at 121MP or higher equivalence) are aching for displays that match and exceed their capabilities - not just give us a good enough rendition that we will be sadly laughing at in the not too distant future.
-
tilleroftheearth Wisdom listens quietly...
My 'visual pleasure' is being able to look at a monitor for 36 hours straight (photo editing) - if required - and not feel like I've had my eyes massaged with rusty razor blades.
Saying that one resolution is better for a certain screen size viewed at a certain distance is a fallacy... That is just the 'good enough' spec. When (not if...) monitors surpass the human eye in resolution (~121MP to 240MP... - yeah; that is a minimum of 11Kx11K of 'pixels' required - for any size of screen/monitor...) then we can begin talking about 'overkill' in monitor resolutions.
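A quick sketch of the arithmetic behind that 11Kx11K figure (taking the quoted ~121MP-240MP equivalence at face value, and assuming a square grid purely to keep the math simple):

```python
import math

# A square grid with N pixels per side holds N*N pixels in total,
# so an X-megapixel target needs roughly sqrt(X * 1e6) pixels per side.
for megapixels in (121, 240):
    side = math.sqrt(megapixels * 1e6)
    print(f"{megapixels} MP as a square grid: ~{side:,.0f} x {side:,.0f} pixels")
```

That lands at ~11,000 x 11,000 for 121MP and ~15,500 x 15,500 for 240MP - well beyond even 8K (7680x4320, about 33MP).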
Just don't confuse what a monitor is able to display with what a GPU can't do... Give them time; those puny GPUs will catch up soon enough... as will Windows' ability to handle that scaling properly...
And sure, our eyes may get dimmer and see with less acuity over time as we age - but even then they can still be used to determine whether a monitor is of higher quality or not (seeing, like all of our senses, is not just about the actual organ used to perceive that quality. Rather, it is always a blend of the senses/brain interaction that gives us the final 'picture', 'sound', 'touch', 'taste' or 'smell' that we perceive).
Don't be the (misquoted) Bill Gates of 2017... xK resolution is not tied to a specific monitor size.
Humans don't respond to digital output in a linear fashion (and that is exactly what LCDs output...). Our view of the world is analogue, and more (from the digital sources for our senses) is always better (all else being equal, of course).
-
-
tilleroftheearth Wisdom listens quietly...
triturbo,
you have made some excellent points. Thanks for continuing the conversation here.
No, I can't do that with any notebook I currently have or have used recently - but that is more because of my age and eye health than anything else. I used to be able to, but inferior technology seems to have run my capabilities down.
That is some panel you have! Which one is it, and what notebook do you have it installed in?
Sure, depending on whose math (or statistics) you believe, that may be true (on paper). But my real-world experience has taught me to never underestimate my (or anyone else's) physical abilities - even if it's 'proven' to me by so-called scientific research and other mind-numbing and imagination-stagnating propaganda.
Several years ago I created a few large (17") prints at various 'resolutions' and came to the realization that I quickly ran out of vocabulary to describe what higher and higher MP images did to the final print. Yeah; much like in audio, where the current and not-so-current know-it-alls laugh at anything better than a $150 Walkman from 1979.
What became apparent very quickly though was that even at +20K x +20K printed to a printer capable of 'only' 1440x720 print output, I did not have enough 'computer' to print at much higher res.
Those prints, though, served their purpose (even the ones that printed halfway through before the system crashed - printer or OS? I don't know...). And the increased details were not only present at normal viewing distances with the higher-resolution input files; they were present in spite of the technically limited printing device I was using at the time too. A few years later, when technology had moved on significantly and my eyes hadn't changed that much, I came to the conclusion that my eyes could resolve higher than ~250MP in optimum viewing conditions. I only wish that I could state the same thing today.
Btw, I went to a few optometrists until I found one who would allow me to play with my prescription so that I could continue to use my cameras/lenses as I want. I too would ask about the theory and the state of my eyes and make jumps in my conclusions about what would (or should, if I understood everything properly...) work best for me. It took a few eyeglass pairs along with a newfound perspective on the part of the optometrist to get to where I'm at now. But on a good day? I have 20/10 vision with my customized prescription. (Compared to about 20/30 vision with my best 'regular' prescription from any of the other optometrists I've used...).
No, I never stated that you said that. Just cautioning here...
Again; I don't want a 4K or an 8K monitor in order to avoid scaling... I just want the pixels, to use as I see fit.
I also agree that some panels/displays look worse even when scaled by whole units - but that just makes that display bad - it doesn't (by default) make all 4K displays bad.
No, actually it doesn't. When we are talking about our senses, we can't separate our brain interaction from the conversation. That also means that we can learn to see/hear/feel/smell and taste better. Because our 'sensors' are not the last word on what we perceive (they don't deal in absolutes - they are much better at identifying differences...) - rather; they are merely one of the processes humans use to identify, label, measure and compare/contrast all the different inputs they are able to perceive (and yeah; some of us can do more or less than others, depending on how well we've been trained on using our sense(s)...).
The point you're trying to prove below is a case of circular reasoning, I believe.
And it doesn't matter how many billion flies like xydt (uhm, you know - the smelly brown stuff) - that is irrelevant to how good I think xydt tastes.
And on your points below; we agree 100%.
Even if half pixel drawing technology is just another doubling/quadrupling of the resolution of a given panel...
-
I remember I was among the first to purchase a Fujitsu-Siemens Amilo Xi 1554 laptop with a 1920x1200 17" display - that was in 2007 or 2008 - and everyone was saying: what? That's overkill, pointless, why would you do that... Now you see you have it in your own laptops... Pathetic/naive people...
Exactly - (some of) the same people here. And within 5 years, these little boys who said 4K is meaningless or pointless on 17" will be saying the opposite; they'll have quietly bought 4K at home and suddenly switch opinions, simply because they can afford it... And it will be standard, and then they'll be saying that 8K is pointless...
-
Sent from my LG-H850 using Tapatalk -
-
-
inperfectdarkness Notebook Evangelist
Well since 15" WQXGA laptops will probably never happen, I'm stuck with 3k or 4k displays. I'd be perfectly happy with WQXGA; it's just not available in the form factor I need.
No argument about Mac. MS has not prioritized scaling, because they know that business PCs are still the bulk of their sales. And a good chunk of the rest is from the unwashed masses who bought into the 1080p brainwashing hype. That said, if Windows 10 gets patched next week and scaling is 99% fixed, I'm not going to be left kicking myself because I naively insisted on only using the 100% (native) scaling option. My desktop is 150% scaled on 4K, and at arm's length, I can still see everything on this 15" just fine.
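For what it's worth, a rough sketch of what that scaling works out to in usable workspace (a simplification - real apps vary in how well they handle scaling, and the helper here is just illustrative):

```python
# Effective "logical" desktop workspace under display scaling: the UI is drawn
# scale-factor times larger, so the usable area shrinks by the same factor.
def effective_workspace(res_w, res_h, scale_pct):
    factor = scale_pct / 100
    return round(res_w / factor), round(res_h / factor)

print(effective_workspace(3840, 2160, 150))  # (2560, 1440) of workspace, drawn at 4K sharpness
print(effective_workspace(3840, 2160, 125))  # (3072, 1728)
print(effective_workspace(1920, 1080, 100))  # (1920, 1080) native FHD, for comparison
```

So even at 150%, a 4K panel still offers a 2560x1440 workspace - more room than native FHD, with everything drawn at roughly double the pixel density.
-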
FredSRichardson Notebook Groundsloth
I really like my 4K 15" screen. Under Win 10 everything I run looks great with the exception of some dialogue boxes. I believe this is in part due to some changes I made.
Most games I play run at 60 fps with 4K except Witcher 3 on Ultra which runs at 30fps. I've run that at 1080p and the frame rate goes back up to 60 fps but I've found I like the higher resolution more than the higher frame rate. 4K really is nice and it's hard to go back.
Sent from my SM-G900V using Tapatalk
-
tilleroftheearth Wisdom listens quietly...
That is a nice platform you have indeed!
On topic: it seems we've come full circle. If no scaling is needed or wanted, 4K notebook panels make no sense to most people here...
If scaling is used, higher-than-1080p panels are highly desirable (all other quality aspects being equal in the panels compared)...
Windows 10 scaling issues are a problem I don't see. Partly because I invest in newer programs as necessary (and the new ones scale properly, or at least well enough...) and partly because the platforms I use just need to do 2D 'excellently' - 3D is still not part of my main or even secondary or tertiary workflows yet, so even the iGPUs I love can still drive the 2D workloads on almost any monitor I choose to use (and in most cases, 'times' 3).
Hang on to your platform as long as you can; I agree it is a rare and special breed.
For anyone with the panel + platform budget you had, getting an equivalent-quality 4K panel today is just as doable though (IMO).
And that 4K or higher resolution would be at least equal if not superior (overall) to even your panel's pinnacle of excellence of almost a decade ago.
I've had similar products/components over the years where I too thought they would never be surpassed - but in almost every instance, I would walk into a WalMart or some other mass consumer supermarket and find better for 1/20th of the cost or less...
In 1999, the Nikon D1 was crowned the king of quality pixels in a body to match (almost) anyone's shooting style (i.e. professionals). In less than 3 years, there were such highly improved options available that even the once-$10K body was a burden to continue shooting with for even the next few months... And the kick in the pants was that more than a few of those options cost only a fraction of what the pioneering D1 commanded at one point.
I loved the D1's I went through... and I even still have the actual D1 I first bought to test against my F5's... but defending it past its 'best before' date is not something I can be accused of... even if I was expounding its virtues early on - when it seemed everyone else around me was groaning about its astronomically high price (at intro...)
Lesson learned: even at a worse, per pixel, quality level - a higher MP camera of 'this' level of quality (and higher) was/is/and always will be better than the highest quality (per pixel) camera ever mass produced. The challenge was to use the new equipment in different ways from the 'old workflow' to extract that superior quality it offered.
In the end; what changed wasn't the equipment so much; it was me.
(And that is how it should be (continual learning/relearning); until I finally quit breathing...).
-
I'm sad we pretty much skipped 2K resolution completely, really. It would have been a better step up than jumping straight to 4K, IMO.
-
-
And don't tell me what to do, OK? -
It isn't... It's only a modest step up from FHD (about a third more pixels in each dimension, roughly 1.8x the total), whereas 4K is four times the pixels of FHD, and that's the huge difference.
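For reference, the raw pixel counts (a quick sketch; nothing assumed beyond the standard resolutions themselves):

```python
# Pixel-count comparison of the resolutions discussed in this thread.
resolutions = {
    "FHD (1920x1080)": (1920, 1080),
    "QHD (2560x1440)": (2560, 1440),
    "UHD (3840x2160)": (3840, 2160),
}
base = 1920 * 1080
for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name}: {pixels / 1e6:4.1f} MP ({pixels / base:.2f}x FHD)")
```

QHD works out to about 1.78x the pixels of FHD; UHD/4K is exactly 4x.
-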
-
Really? What laptop do you have?
I really don't believe that you're playing all the newest games at 4K on Ultra on a little 15" laptop without problems...
Can you prove it? -
-
FredSRichardson Notebook Groundsloth
The games are not all new. FO4, BL:TPS, D3 & W3 are all I play lately. So far only W3 is not 60fps, but yes, the others are older games.
EDIT: BTW, the new Tornado F5 with a 7700K and a 1080 can probably play most new games at 4K@60fps. And if that doesn't work, try a P870DM3-G with dual 1080s, though you may prefer 3K@120fps instead.
Sent from my SM-G900V using Tapatalk
-
That definitely needs 2 GPUs, and even they might not be enough for new games... -
FredSRichardson Notebook Groundsloth
-
This very question is an absolute nightmare. I have been researching and researching for hours a day for the last month trying to decide refresh rate vs resolution. You would think with where we are with technology high refresh+high res would be compatible. However it would seem the only way to truly decide is to test them out. And that is nearly impossible without buying one of each with the sole intent of testing them side by side and sending the one you like least back. And not all of us have that kind of money to do that.
-
Has anyone seen any articles or videos about when we may see 120+ Hz gaming on 4K laptops? Or good IPS panels on 1440p laptops? I have tried every search wording I can think of and can't find any articles with projected dates or companies currently working on the technology.
-
I have been looking into this for a while. I came to the conclusion that a really good 1080p screen is the way to go for laptops over 4K. Maybe several years from now a 4K monitor will be good in a laptop, but for now 1080p is just fine.
-
-
-
The Asus PG27UQ seems to be the first 4K 144Hz monitor on the desktop side, but there's no ETA on release yet.
-
FredSRichardson Notebook Groundsloth
http://www.hidevolution.com/evoc-p870km1-dual-gtx-1080.html -
(The K refers to horizontal pixels. So for 2560x1440, the 2560 is about 2.5K; 3200x1800 is the closest to 3K.)
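A trivial sketch of that shorthand (it's loose marketing naming rather than a formal standard; the helper is just illustrative):

```python
# "K" labels loosely describe the horizontal pixel count, in thousands.
def k_label(width_px):
    return f"~{width_px / 1000:.1f}K"

for width in (1920, 2560, 3200, 3840):
    print(width, "->", k_label(width))
# 1920 -> ~1.9K, 2560 -> ~2.6K, 3200 -> ~3.2K, 3840 -> ~3.8K
```
-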
FredSRichardson Notebook Groundsloth
"17.3" 3K QHD (2560x1440) AUO 120Hz - 5ms - LED Matte Type Display - w/ nVIDIA® G-SYNC™ Technology" -
The Samsung PenTile display shouldn't really even qualify as 4K either; the effective resolution is more like 2.7K. -
FredSRichardson Notebook Groundsloth
https://en.wikipedia.org/wiki/Graphics_display_resolution
The wiki page doesn't even have a definition for 3K - so maybe that is ambiguous... -
inperfectdarkness Notebook Evangelist
Marketing buzzword tomfoolery is why the 1080p resolution DOWNGRADE was able to be shoved down consumers' throats so easily. And no, I'm not giving up 3/4 of my resolution just to double the hertz. TV shows are still 30fps, movies are still 24fps. Yes, I get the idea of 120Hz being a "fit" for both (it divides evenly by 24 and by 30)--but I'm not going to saddle myself with piss-poor resolution on account of what I stream. I GAME with my laptop.
-
-
I think you kind of misinterpreted what I meant, btw. It's not that things in general look fuzzy to me. It's that I can see the fuzziness of a 1080p vs a 2160p display quite easily. If my eyesight were poor, they would both look bad, because I couldn't focus on the screen.
I was at an eye exam like 2 weeks ago too.
-
-
Thanks for looking out, but I'm pretty positive that's not the issue in my case. I can read my Retina display iPhone from like a foot away for hours with no difficulty, because I can't make out the pixels.
-
Is anyone running 4K resolution with no scaling on a 15.6" laptop? I run an M4800 with QHD+ 3200x1800 and I don't need scaling.
My Lenovo P50 is 4K and I need to scale it to 125%, which defeats the purpose of the higher resolution for me.
Let's Discuss 4k vs 1080p on a laptop.
Discussion in 'Hardware Components and Aftermarket Upgrades' started by Luraundo, Oct 2, 2016.