Other than the obvious reason, being the "HD" spec of course.
<rant>
My Dell XPS m1710 is now 6 years old. I've kept it up to date with hardware upgrades, stripped it bare about once every 6 months to clean it and redo the heat transfer paste on the GPU and CPU, and I just keep waiting for a true upgrade in every sense. But the main interface, the screen, seems to be crap in comparison.
Not just the resolution, but actually "how" it looks, I suppose. Almost any of the 1080p ones I've seen from Toshiba, Sony, Dell, etc. look washed out and rather craptacular. </rant>
Am I missing something? Is asking for an 18"+ monitor attached to a laptop, with a capable resolution beyond 1920x1080, asking too much? Are there even new ones that have 1920x1200? Any that actually show images and text as well as my old laptop here (as in, not washed out and rather crappy looking)?
If it weren't for the 32-bit memory controller and the rather outdated, slow-in-comparison GeForce Go 7900GS that I'm stuck with, I'm not sure I'd even think to move on, sad as that is to say. I'm just wondering if maybe I'm the fool here for thinking that limiting computer screen resolutions to a TV standard is a bad thing.
-
1920x1080 or 1920x1200 is about as fine as I'd like to go with 17" and smaller laptops. I do agree that laptop screens these days generally suck. Most consumer laptops have a crappy 1366x768 resolution, and you can't even upgrade to a better screen in most cases.
This has been discussed ad nauseam already through several threads in the hardware section, so you are not alone.
I'm OK with 1080p; I would prefer 1200, but it is what it is. -
In the specs for the newer cards (560M, etc.), are they capable of driving this resolution (1920x1200) natively?
So perhaps I could just price out a good gaming/desktop-replacement laptop and swap out the screen (assuming I can get close to matching dimensions)? I'm having a bit of trouble finding their supported display resolutions. -
Probably because you won't see the pixels at a higher res on a tiny screen like 15".
-
The new Alienware M18x and even the Asus G74SX, 18.4" and 17.3" respectively, are both stuck at 1920x1080. That Asus I'd seriously consider if not for this. At $1700, it would still be half of what I paid for my XPS at the time. -
King of Interns Simply a laptop enthusiast
The most straightforward answer to your question would be: so that manufacturers can cut their costs slightly.
Also, if you think the new LED-backlit screens all look better than the older CCFL screens, you are out of luck there too. I say this because I have yet to see any modern laptop in the flesh that sports a better screen (not just resolution but also colour saturation/quality) than the 4-year-old one in my laptop.
The HD labelling makes it easy for manufacturers to put in inferior screens at nice low cost without the majority of customers noticing the rubbish they have just purchased until it is too late. -
Well, I believe it's because most mobility cards (especially 128-bit to 192-bit ones) choke at higher resolutions, and on laptop panels anything less than native looks terrible. So to have acceptable fps and a sharp picture at the same time, I guess it makes sense not to have too high a resolution for the displays, so IMO 1080p is fine for 17-inchers.
However, I agree that manufacturers like Alienware and Clevo, who give the option of putting in the top performance cards (and CFX/SLI), should give you the option of a higher-res screen to take advantage of your GPU(s). -
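As a rough illustration of the fill-rate point above, here is a small back-of-the-envelope sketch (Python; the resolution list and the 1080p baseline are chosen purely for illustration) of how the per-frame pixel count, and therefore roughly the shading and framebuffer-bandwidth work, grows as the panel resolution goes up:

```python
# Back-of-the-envelope: how per-frame pixel count (and hence shading work and
# framebuffer bandwidth demand) grows with panel resolution. Illustrative only.

RESOLUTIONS = {
    "1366x768":  (1366, 768),
    "1600x900":  (1600, 900),
    "1920x1080": (1920, 1080),
    "1920x1200": (1920, 1200),
    "2560x1600": (2560, 1600),
}

BASELINE = "1920x1080"

def pixel_count(name):
    w, h = RESOLUTIONS[name]
    return w * h

base = pixel_count(BASELINE)
for name in RESOLUTIONS:
    p = pixel_count(name)
    # At a fixed frame rate, per-pixel work scales roughly with pixel count.
    print(f"{name:>9}: {p / 1e6:5.2f} MP ({p / base:.2f}x the work of {BASELINE})")
```

The real fps hit depends on the game and the card, but the scaling intuition is the same: a mid-range mobile GPU that is comfortable at 1080p has noticeably less headroom at 1920x1200 or above.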
I'd imagine it's as mentioned above - there's no real market for >1080p screens, so doing the next step up would cost too much without economies of scale.
In addition, I'd imagine the benefits for most things would be marginal at best. Consolitis means that making games pretty above 1080p returns little profit, and productivity tasks for work benefit more from hooking the laptop into multiple screens than from having one mega-resolution screen.
I doubt the work side will change, but maybe the next generation of consoles will spur >1080p screens. My old laptop had a ULV CPU and a weak GPU with 720p; everything from Llano and Ivy Bridge onwards should be comfortable having 900p as a baseline (not that it will happen). When 900p is standard, maybe they'll bump 1080p up to 1200p to make the difference between the two more pronounced. -
I have been asking this question for years. The main reason is that the general public does not care. They just don't care; they see HD and they think it's good.
Also, again, as has been said, it means that manufacturers can cut costs on the screens.
I would love to see screens with the same pixel density as an iPhone (300ish PPI), but I can't see that happening, and graphics cards are definitely powerful enough to run games at resolutions higher than 1920x1080, since people play games on multi-monitor setups with a much higher combined resolution than 1920x1080.
Also, before people start posting that they can't read the text when resolutions are so high: just go and increase the DPI in the Windows screen settings. -
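For anyone who wants numbers behind the DPI advice, here is a rough sketch (Python; the panel list, the 96 DPI baseline, and the naive "keep things the same physical size" rule are all assumptions for illustration) that estimates the pixel density of a few screens and the Windows scaling factor that would roughly compensate for it:

```python
import math

# Rough sketch: pixel density (PPI) of a few panels and the Windows scaling
# factor that would keep UI elements roughly the size they would be at the
# traditional 96 DPI / 100% baseline. Panel list and rule are illustrative.

PANELS = [
    # (diagonal_inches, width_px, height_px)
    (17.3, 1920, 1080),
    (17.0, 1920, 1200),
    (15.6, 1366, 768),
    (3.5, 960, 640),     # iPhone 4-class screen, for comparison
]

for diag, w, h in PANELS:
    ppi = math.hypot(w, h) / diag
    scale = ppi / 96 * 100   # naive "keep things the same physical size" rule
    print(f'{diag:4.1f}" {w}x{h}: {ppi:5.1f} PPI -> ~{scale:.0f}% scaling')
```

In practice the scaling people actually pick (100%, 125%, 150%) comes down to taste as much as arithmetic, as the next post notes.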
Agreed with the DPI settings in Win7; I find 125% on a 17.3" 1080p screen to be more comfortable than 100%. Not that 100% is too small to read or bad, it's just a matter of personal taste.
As far as panels go, the general consumer wants the so-called HD even though they lose vertical screen real estate (I want 16:10 back), and it allows panel manufacturers to cut costs, since they can produce more panels from the same amount of material. Notebook manufacturers are going along with that, so it's not something I see changing anytime soon.
There are good panels available though, IPS and non-IPS. The one in my G73JH is actually decent for a glossy panel. You will rarely find a decent panel on a consumer laptop though.
In any case, with the way things are now, I doubt you will find a laptop with a resolution higher than 1920x1080, so you can either cope with it or not buy a new laptop. -
Another issue with high-DPI screens is that it's hard to make them large. Smaller screens like the iPhone's are simply easier to make without any dead pixels. Once you start scaling that up, though, you start running into problems. For a 17" panel, you would have what, 5-6x the incidence of bad pixels that the iPhone screens do? More? The iPhone's raw resolution is 960x640, so it is still a lower total resolution (and therefore a lower chance of bad pixels) than a normal monitor. Scaling that up would be prohibitively expensive, and the majority of people wouldn't pay significantly extra for it. So they don't make them. -
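To put a toy model behind the dead-pixel argument: assuming each pixel independently has some small chance of being defective (a big simplification of real panel manufacturing, and the defect rate and the hypothetical 326 PPI 17.3" panel below are made up for illustration), the chance of getting a perfect panel drops off sharply as the pixel count grows:

```python
# Toy yield model: if each pixel independently has probability p of being dead,
# the chance that a panel comes out with zero dead pixels is (1 - p) ** N.
# Real defect statistics are more complicated; the defect rate is made up.

P_DEFECT = 1e-7   # assumed per-pixel defect probability

PANELS = {
    "iPhone 4 (960x640)":              960 * 640,
    "1366x768 laptop panel":           1366 * 768,
    "1920x1080 laptop panel":          1920 * 1080,
    "1920x1200 laptop panel":          1920 * 1200,
    "hypothetical 326 PPI 17.3\" panel": 4920 * 2760,
}

for name, n_pixels in PANELS.items():
    perfect_yield = (1 - P_DEFECT) ** n_pixels
    print(f"{name:34s}: {n_pixels / 1e6:5.2f} MP, "
          f"P(no dead pixels) ~ {perfect_yield:.3f}")
```

Whatever the real per-pixel defect rate is, the exponent is the pixel count, which is why pushing a phone-like density out to 17" hurts yields far more than the raw size difference suggests.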
Mechanized Menace Lost in the MYST
-
1080p has "stagnated" for a variety of reasons. Formatting is one of them. Another is that for 99% of users, 1080p is the highest resolution they would use or want, barring extraordinary circumstances, games or otherwise. Those who want a higher resolution will go to two or more monitors, or a 1920x1200 display, before anything else. -
masterchef341 The guy from The Notebook
The highest-tier consumer movie format is 1080 pixels high in 16:9 format (less for wider formats, but still 1920 across).
That would be Blu-ray. That is why.
---
You also get rapidly diminishing returns in perceived detail as you increase resolution, and we're already on the far edge of that. Increasing resolution to 2880x1620 isn't going to make a game look much different, and it won't make movies any better either. The people who really need higher resolution are those who want as much screen real estate as possible for work. Games and movies just don't merit the bump, and those tend to define the trend. -
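As a toy calculation behind the diminishing-returns point, here is a sketch (Python; the ~1 arcminute acuity figure and the 20-inch viewing distance are assumptions, not measurements) comparing a 17.3" panel's pixel density at 1920x1080 and 2880x1620 against the finest detail the eye could resolve at that distance:

```python
import math

# Toy calculation: compare a 17.3" panel's pixel density at two resolutions to
# the ~1 arcminute detail limit often quoted for normal vision. The viewing
# distance and acuity figure are assumptions, not measurements.

VIEW_DIST_IN = 20.0      # roughly 50 cm, a typical laptop viewing distance
ACUITY_ARCMIN = 1.0      # rough "20/20" detail limit

# smallest pixel pitch (inches) resolvable at that distance, and the PPI it implies
resolvable_pitch = VIEW_DIST_IN * math.tan(math.radians(ACUITY_ARCMIN / 60.0))
eye_limit_ppi = 1.0 / resolvable_pitch

for w, h in [(1920, 1080), (2880, 1620)]:
    ppi = math.hypot(w, h) / 17.3
    print(f'{w}x{h} on 17.3": {ppi:5.1f} PPI '
          f'(eye limit at {VIEW_DIST_IN:.0f} in is ~{eye_limit_ppi:.0f} PPI)')
```

Whether 1080p at 17.3" already sits past that limit depends heavily on how far you actually sit, so treat this only as a rough intuition for why the bump buys less and less.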
There are a few reasons for 1080p; most of them come down to mass consumer adoption. This goes further than laptops or PCs and people who play games. This was about setting a standard for the TV market, which is the biggest money-earning screen market. This is about your father, your grandma, all of us who use HDTVs. HD programming has barely become the norm; taking 6+ years to start filming content in 1080p just for TV is crazy, and trying to push 720p also slowed down the adoption of 1080p.
Moreover, the reason for 1080p being a 16:9 aspect ratio has more to do with movie formats. Movies back in the day were being filmed in 16:9 to give a panoramic effect; this stuck, and now we make TVs to show off that content better. The next step I'd imagine would be 4K, but with the world economy hurting, I don't see a move away from 1080p in the next 10 years. After that, the next step will be augmented reality, or head-mounted displays that emulate the natural resolution of the eye. But anyway, there needed to be one standard, and they chose 1080p. -
Think about the loss due to dead pixels (as Pitabred stated). One dead pixel on a freshly produced monitor will cut its worth by half or more. With higher resolution, the probability of churning out monitors with one or more dead pixels grows almost exponentially, on top of the higher production costs.
If they find a way to economically produce screens without dead pixels (or where dead pixels can be officially fixed without much trouble), then there might be more drive to promote and sell this kind of tech. -
You may not get more real estate with a 17" screen nowadays, but you can get a 3D LCD.
That is a pretty major upgrade if you ask me. -
Star Forge Quaggan's Creed Redux!
To me? Still a gimmick, not an upgrade. -
The other thing is that even desktop monitors don't exceed 1080p until you go to 27" or 30" or larger, and even then they're specialty and expensive monitors.