I think the 280M will handle WUXGA/1080p well for 3 years and even more. In my opinion it's just a matter of preference. If you like more desktop space and you don't mind the font looking smaller, go for WUXGA. If you prefer your font size and icons a little larger, and you really don't mind a little less space, then go for WSXGA+. Gaming wise I think the performance is the same.
-
I wish there were more 1080p 15" laptops at Best Buy. I think with this kind of thing, it's best to view it in person. I do worry about everything being too darn small.
Aside from the resolution, I assume the actual quality of the screens is the same? Lighting, viewing angles, etc.? I keep thinking the lower resolution screen would be crappier, but I think I'm just being stupid. -
-
Well they will only use one screen in the review, probably the 1080p one. So I still have no way of knowing anything about the other one.
-
Once you go to a higher res screen, it's impossible to downgrade. To the OP: 180 vertical pixels is a lot of screen real estate, which you will lose if you go 1600x900. -
-
The gaming performance difference between 1080p and 1600x900 won't be that much, and 1080p is just a better screen. I could understand getting the 900p screen if you had a weak graphics card and didn't do anything other than game, but you have the fastest notebook graphics card, and not only will games look better in 1080p, but everything else you do like office/school work and especially browsing websites will look a lot better. Plus you will have more screen space for viewing windows side by side, and for 1080p movies (mostly to show off your Blu-ray reader and full HD laptop screen). I have a 1200p screen on a 17-inch laptop (so yours would probably be even sharper) and trust me, it's crystal clear. I can't even stand 1440x900 on my last notebook now. Get the 1080p screen if you can afford it; 1600x900 is nice for that size, but a higher res screen is worth it if you have the graphics power (which the W860CU will have plenty of).
-
As said before, it is not about the graphics power on its own. It is about gaming performance in the near future.
You may well get a nice 1080p screen today, but it is likely that the more demanding games will end up forcing you to lower the resolution. We all know that the GTX 280M is a very capable card, but the higher your native resolution, the more likely you are to have to lower it in the near future than if you had 1600x900. -
-
-
A few people keep mentioning 1600x900 as future proof... I would argue 1920x1080 is more future proof. If you get the 1080p screen and IF (a big if) you find everything looks too small, you can do a couple of things in Windows to make your fonts bigger. If you want to game at a non-native resolution, you can do that too. It should be a fairly easy upgrade to drop in a faster CPU, add RAM, an SSD, or a new DX11 graphics card if one is made available.
However, if you go with the smaller screen and later wish you had more screen space, it is not as easy to physically upgrade your screen. What about all the people with 768p or 1080i TV screens who wish they could go higher?
I just don't believe there will be that big of a difference in gaming performance at the two different resolutions now or even years in the future.
I myself was happy with 1440x900 on a 14.1" so I think I'll be fine with 1920x1080 on a 15.6". -
1920 x 1080 would probably have slightly better resale value.
Just think of the gamer mentality and its insistence on high res. -
I know I would just get the high res screen. RJTech has that screen as standard anyway. I agree about the resale value as well.
-
Also, as a final visual comparison, I made the following two resizes of the screens posted before, to show just how big/small everything will be depending on which screen resolution you pick.
*WARNING*: These pictures are sized for a 22-inch 16:10 1680x1050 screen, so if my computer knowledge is correct, you'll only get a true visual representation of the W860CU's 15.6-inch 16:9 screen on a monitor with the same DPI as the one I just posted.
Attached Files:
-
-
Thanks for posting those screen shots Prophex! I also have a 22" 1680x1050 monitor. I think the 1080p will look perfect!
Can't wait to see the screen comparisons! -
Aaaaaah, I'm not able to wire transfer the money for my new machine using internet banking, so now I have to wait till Friday evening when I go home and go straight to the bank.
And with all that extra time, I'm starting to sulk all day long over my screen choice (1600x900), doubting my decision every other minute... What should I dooooo?!
(I need someone to sum up the advantages & disadvantages of each screen option for me... Oh, and give a description of the target audience for each one (e.g. gamer, someone watching lots of movies, someone using programs like Photoshop, someone with low/high funds, someone (not) willing to upgrade, ...))
-
I took the 1080p... and my bank told me it'll take 5-10 days to make the bank transfer, so I still have to suffer a lot.
And I think it's better for you to read back through this whole thread instead of having someone else sum it up for you... I made my decision here -
Now with Windows 7 it's even cooler because you can change the zoom level of the whole operating system. That means you can have it look like 1440x900 res but it's actually at 1920x1200. I hope that makes sense.
It's always a dilemma with new stuff. -
If I switch to a:
- 15.6 inch 16:9 1600x900 screen, I would get 2145.69 pixels/cm², which is an increase of about 70.78% in pixels/cm².
- 15.6 inch 16:9 1920x1080 screen, I would get 3089.80 pixels/cm², which is an increase of about 145.92% in pixels/cm², OR 44% more pixels/cm² when compared to the 1600x900 resolution.
I'm not sure about this, but can I assume that if I were to view my current screen contents on either of those two monitors, everything would be smaller by the same percentage of pixels/cm² as calculated above? (Yes, I took the difference in aspect ratio into account when calculating those percentages.) 'Cause making something look 145.92% smaller seems like A LOT.
Here they are: -
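For anyone who wants to double-check those figures, here is a minimal Python sketch that derives each panel's physical area from its quoted diagonal and aspect ratio (an approximation; bezels and exact panel dimensions will differ slightly):

Code:
import math

def px_per_cm2(diag_in, aw, ah, w, h):
    # physical width/height (in inches) derived from the diagonal and aspect ratio
    k = diag_in / math.hypot(aw, ah)
    area_cm2 = (aw * k * 2.54) * (ah * k * 2.54)
    return (w * h) / area_cm2

desk    = px_per_cm2(22.0, 16, 10, 1680, 1050)   # current 22" desktop monitor
hd_plus = px_per_cm2(15.6, 16, 9, 1600, 900)
full_hd = px_per_cm2(15.6, 16, 9, 1920, 1080)

print(f"22in 1680x1050  : {desk:7.2f} px/cm2")
print(f"15.6in 1600x900 : {hd_plus:7.2f} px/cm2  (+{(hd_plus/desk - 1)*100:.2f}%)")
print(f"15.6in 1920x1080: {full_hd:7.2f} px/cm2  (+{(full_hd/desk - 1)*100:.2f}%, "
      f"or +{(full_hd/hd_plus - 1)*100:.2f}% over 1600x900)")
# Note: text and icons shrink with the *linear* pixel density (DPI), not the area
# density, so the visual shrink is the square root of these ratios (~1.31x / ~1.57x).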
Games are going to get more demanding and harder to run at 1920x1080, especially on older architecture. I think that's the point people are making when they talk about 'future proof'. -
-
I know there's no way my eyes can stand 15.6" @ 1920x1080, but I'm still not sure if they can handle 1600x900. I called Staples & they don't have a 14.1" @ 1440x900 (120 DPI). I know the 15.4" MacBook Pros @ 1440x900 are 110 DPI, so maybe that will give me a good idea. I just returned a 13.3" MacBook Pro @ 1280x800 (113 DPI) and that was pretty small but definitely not a big strain on the eyes. Still confused!
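If it helps to put those screens side by side, this is a small sketch of the same DPI arithmetic (pixels along the diagonal divided by the quoted diagonal size), with the two W860CU options added for comparison:

Code:
import math

def dpi(w, h, diag_in):
    # pixels along the diagonal / diagonal length in inches
    return math.hypot(w, h) / diag_in

screens = [
    ("13.3in 1280x800 MacBook Pro", 1280,  800, 13.3),
    ("15.4in 1440x900 MacBook Pro", 1440,  900, 15.4),
    ("14.1in 1440x900",             1440,  900, 14.1),
    ("15.6in 1600x900  (W860CU)",   1600,  900, 15.6),
    ("15.6in 1920x1080 (W860CU)",   1920, 1080, 15.6),
]
for name, w, h, d in screens:
    print(f"{name}: {dpi(w, h, d):.0f} DPI")
# Roughly 113, 110, 120, 118 and 141 DPI respectively, so the 1600x900 option
# sits right between the 15.4" and 14.1" 1440x900 panels mentioned above.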
-
You shouldn't skip the higher res screen just because the text will be small; you can just set it bigger in Windows. Once you set the text larger, it won't be a strain on your eyes, but it will be a lot sharper. I can set the text on my 17-inch 1200p screen to be huge; at the highest setting the letters are a half inch tall, but it's still 1200p and very sharp.
-
Call me an idiot but I actually would've gone for 1440x900 if only it was available...lol -
There is a picture in the (new) review posted on the board that shows the W860CU with a 1600x900 panel. What do you think?
Attached Files:
-
-
Looks fine to me but we need a unit in our hands to test.
-
EDIT - Comment moved to the more appropriate thread.
-
I think the 1/2 star is because of the short battery life...
-
If you look at the screen brightness pictures from that review, it seems that the 900p screen is a 200-nit panel. In my opinion it's not that bright, and you can notice that from the pictures. I think the 1080p panel will be much better in terms of brightness and viewing angles. If you care about the quality of the laptop's LCD, then it would be better to look into the 1080p; if you really don't mind the brightness and viewing angles, 900p is perfect.
-
I'm going to choose 1080p, and I'm going to look at it this way: the 260M so far plays well at 1680x1050 native, so going with the 280M and a Core i7 will fill in the performance difference needed to run 1080p. Does anybody think that's a fair assessment?
-
That's a fair assessment. Even if you have to scale down a game or two (having it look worse than if you had chosen a lower res) on a 1080p screen, all your other games that you can play in 1080p will look a lot better. I think it's more than likely there will be more games you can play at 1080p than games that will have to be turned down.
-
I just think that as future games become more and more graphically intensive, with more ambient occlusion, PhysX effects and much deeper environments, the GPU will already have a lot of additional demands on it without the extra load needed to run at a possibly unnecessarily high resolution. And I just can't stand how it looks when I lower the resolution even slightly. -
Do they have any resolutions in the two thousands for notebooks? Also, I keep hearing about 1200p and higher; I thought 1080p was Full HD. Guess not...
-
Lol, it just makes it harder now because everything you've been saying makes total sense.
-
You can't make a 16:9 resolution out of 1200 vertical pixels... you would have to use 1152 or 1260, which would result in 2048 or 2240 horizontal pixels respectively. Although 2048x1152 hasn't been used on a notebook yet, it has been used on a few desktop monitors from Samsung, Dell and Acer, and it has similar DPI/image quality to 1920x1200 on screens over 15". -
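The arithmetic behind that is simply that a 16:9 width is height x 16 / 9, so the line count has to be a multiple of 9; a trivial sketch:

Code:
# 16:9 width = height * 16 / 9, so the height must be a multiple of 9
for height in (1080, 1152, 1200, 1260):
    width = height * 16 / 9
    note = "exact" if width == int(width) else "not a whole number, no clean 16:9 mode"
    print(f"{height} lines -> {width:8.2f} px wide ({note})")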
This keeps going back and forth.
1. Some people ordered the NP8662 with a 1200p screen when it was available. That's 1920x1200 on a 15" with a 260M and LOVED IT.
2. People talk about getting 1600x900 to future-proof for demanding future games. Let's remember Crysis is very old and still a benchmark. Newer games have been less demanding. DX11 games are a ways away and DX10 hasn't even made the impact that DX8 & 9 did. Why limit the resolution for the comparatively limited amount of time spent gaming versus the larger amount of time spent using your computer (email, web, video, etc.)? Not to mention limiting yourself today for games that won't even be here for a few years. If you hate the smallish DPI, I can't argue with that. However, I believe I will be happier with the higher resolution, and in case I am not, I will either enlarge the font or use the zoom. In the future, if I cannot handle the new games, I will switch to a lower res or turn down the details. My OLD-as-dirt Athlon XP 2500 with an ATI 9800 Pro can play Crysis, at a low res and low details. If I get the low res, I can never go higher without replacing the whole laptop.
3. It's a personal decision in the end, and that's why there is an option for both. I just didn't want people to think the lower res was the only way to "future proof". -
Suppose I needed to lower the resolution even further, say 1366x768, would I be better off having a native 1920x1080 or 1600x900 in this case?
-
The 1080p will handle downscaling much better.
-
Because of the very basic nature of the resize method used (bilinear interpolation), having more pixels in the final image doesn't do anything to help make it 'look better'. -
Hmm... well, I was thinking what Kevin Jack said was a good reason for choosing the 1920x1080, but if it's all the same below 1600x900, then I can see why 1600x900 might be the better gaming solution for 'futureproofing'.
Another question - not that you'd want to set the resolution this low, but would gaming at 960x540 with a native 1920x1080 result in a "better" image than other intermediates (1366x768), since it's evenly divisible? I remember reading about that somewhere, but I don't really understand graphics rendering. -
The only way I can tolerate non-native resolutions is to use linear scaling rather than bilinear, so that it only has to be scaled in one direction. For example, I might run at 960x900 on my 1440x900 display. It's still blurry, but to me it's a little 'less' blurry than if it were scaled in both directions. This goes against some people's advice to keep the scaled aspect ratio exactly the same as native, but anyway, that's my experience; it looks better to me than, for example, 1280x800. Obviously you need a game that is smart enough to keep the aspect ratio correct. -
v_c, downscaled images will not look identical on the two screens, because the 1080p LED will have more tightly packed pixels, thus images will not be as "blurry" when below native resolution.
-
To keep the aspect ratio correct, it's enough to use the Nvidia control panel (no need to hope for a smart game).
-
It's like saying a 640x272 low-res video looks better on a 42" 1080p panel compared to a 42" 720p panel, simply because the panel itself is inherently sharper. It doesn't - except in cases where you are using high-quality upscaling. -
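To see the "evenly divisible" point from a few posts up for yourself, here's a rough sketch using Pillow. It's only an approximation of what a panel or GPU scaler actually does (the file names are made up, and real scalers may use different filters), but it shows why an exact 2x factor stays sharp while a fractional factor gets blended:

Code:
from PIL import Image  # pip install Pillow

NATIVE = (1920, 1080)

# Hypothetical screenshots rendered at the two lower resolutions.
frame_540p = Image.open("render_960x540.png")
frame_768p = Image.open("render_1366x768.png")

# 960x540 -> 1920x1080 is exactly 2x, so every source pixel maps to a clean
# 2x2 block; nearest-neighbour keeps it blocky but perfectly sharp.
exact = frame_540p.resize(NATIVE, Image.Resampling.NEAREST)

# 1366x768 -> 1920x1080 is roughly a 1.4x factor, so output pixels fall between
# source pixels and bilinear interpolation has to blend neighbours, smearing edges.
blended = frame_768p.resize(NATIVE, Image.Resampling.BILINEAR)

exact.save("upscaled_from_540p.png")
blended.save("upscaled_from_768p.png")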
Alright, I would take the 1920x1080 any day now. I used to think the same, that taking a lower resolution screen is better, but it's not. To be fair, it is hardly noticeable that the image becomes blurry when I move from 1920x1200 down to 1680x1050 (my friends have also noticed that). Indeed, the high pixel count matters. When I had a 1280x1024 screen and tried using a lower resolution, it always looked like crap; that is not the case with the resolution I am using now.
If you get the 1920x1080 you will not regret it, that is all I can say. -
I'm going for the bigger res screen.
-
-
Based on what everyone is saying, it sounds like - for gaming - the 1600x900 should only be superior to 1920x1080 at its native resolution. If gaming outside of native resolution, then there is either no real difference between the two, or the 1920x1080 is superior.
It comes down to a judgement call (or guess) as to whether more future games would play acceptably at 1600x900 but not at 1920x1080 (advantage WSXGA+/HD+), versus whether they would either play acceptably at 1920x1080 or only at resolutions below 1600x900 (advantage WUXGA/Full HD).
I wonder if anyone who has gamed extensively with both a WUXGA and a WSXGA+ panel can comment on their experiences? -
If gaming at 1600x900 looked bad then I could understand why someone wouldn't want to game at that resolution, but in my opinion it is the best compromise between looks and performance.
I think it would be interesting to benchmark games with the same settings, with the only difference being the native resolution.
Take the new Operation Flashpoint 2 - Dragon Rising as an example. At 1680x1050, all high settings and 4x AA, I get 65-70 fps. Someone who runs the same game with the same settings and a similar high-end GPU gets about 30 fps, with the frame rates only becoming close if they enable SLI.
With SLI enabled their frame rates are brought up to about 55 fps. This is of course no scientific benchmark lol, but even a 10 fps difference can decide whether a game is enjoyable and playable or not. -