So I'm about to purchase a new gaming laptop (Clevo M860TU), and I'm installing an Nvidia GTX 280M into it alongside a relatively good processor (possibly even the QX9300 Intel Extreme quad-core).
But I just don't know which screen to choose: WSXGA+ or WUXGA, 1680x1050 or 1920x1200. My current laptop runs at 1440x900 and at the moment it does the job and I'm very happy with it, but I'm all for higher resolutions and just cannot decide for gaming.
I just don't have any idea how the 280M handles games at 1920x1200. I do know that the 280M seems to sit, spec-wise, between the desktop 8800 GTX and 9800 GTX, and a lot of the gaming comparisons I have seen are at 1680x1050 but not many at 1920x1200. Rather than guessing from how other cards perform, I'd also like to see an actual explanation of how the 280M itself performs.
I'm talking Gears of War at 1920x1200 or 1680x1050, 4x anti-aliasing, DirectX 10, everything on highest; Mass Effect set to highest; Burnout Paradise set to highest; Call of Duty 4 and 5 with anti-aliasing and everything on highest; Race Driver: GRID on highest with anti-aliasing too. Can the card handle all that at frame rates of over 40 at least?
And does anyone know how it will handle games like Assassin's Creed 2? Call of Duty: Modern Warfare 2? Colin McRae: DiRT 2? Will I even be able to set those graphics to near highest?
Sorry, I have just been let down in the past by graphics cards. I remember buying my 6800 GS years ago, 250 I think it was, and boy was I let down. It could barely play The Sims 2 set to high graphics at the time, and now my crappy Dell laptop can play it at 1440x900 on nearly highest with a useless ATI Mobility 3450, not worth a cent.
So can anyone shed some light for me? Even an image of one game such as Gears of War or Fallout 3, anything with high graphics, on a GTX 280M at 1680x1050 and at 1920x1200, just so I can put my mind at rest before I spend my money.
-
1920x1200 is worth it
-
If you're a fan of 4x AA and frame rates of over 40, I'd go for the 1680x1050.
-
Soviet Sunrise
Why did you make another thread when you already have one going?
http://forum.notebookreview.com/showthread.php?t=394110 -
I was thinking about getting the bigger screen, but now that I have it, 1680x1050 is more than I need. It's nice in games, but for general use it's already small for me.
Nothing wrong with my sight, it's just that 1680x1050 is perfect, even in games.
Got no doubts about my decision now -
I am playing at 1920x1200 with dual 8800M GTX and a much weaker CPU. I haven't run into many issues so far, but well, it's SLI. I believe your GTX 280M is a single card?
Fallout 3 - SLI, 1920x1200, 4xAA, 10xAnisotropic, details on high, runs steady around 30-50fps, depending on area (using single card requires 2xAA, 4xAF, 1680x1050 and low water quality with no reflections)
CS:Source, TF2, L4D - single card, 1920x1200, max everything, 70+ fps
Warhammer 40k: DoW2 - SLI, 1920x1200, 2xAA, everything else on high, runs smoothly with 26-68fps, average 30 (when using only 1 card, minimum fps drop to 11, maximum to around 40, average around 23fps, games becomes choppy sometimes when many units and explosions are on screen, lower details are needed)
C&C3: Tiberium Wars - single card, 1920x1200, 4xAA, everything else on max, smooth and steady fps even when playing Scrin and spamming PACs with hundreds of small units. (higher AA setting drops fps low but i didn't really see a difference between 4xAA and 8xAA in image quality; haven't tried SLI in this game)
Assassin's Creed - SLI, 1920x1200, 2xAA, high details, runs smoothly
World of Warcraft - single card, 1920x1200, 4xAA, max details, shadows off, steady 40fps
Haven't really played any other games much, so I can't comment further. 1920x1200 is an awesome screen, much better than 1680x1050; it's just so much crisper, with lots of space on the desktop, and really awesome for HD movies. But considering my experience, I do have doubts about later games running at that resolution with just a single card. You can definitely run them, but I believe you will be forced to lower some details. 1680x1050 will most probably allow you to run games maxed out (or close to it) with at least 2xAA, but you lose some of the advantages of the higher-res screen.

I think it all comes down to what you are planning to do with your notebook. If it's going to be a pure gaming machine and nothing else, then perhaps 1680x1050 will bring you higher fps even in future games. However, if you plan to do other stuff as well, like watching movies, 1920x1200 is just better all around. You will have to sacrifice some graphics settings in games, but the versatility is much, much better. Personally I would get the better screen and sacrifice some details in games, but it all comes down to your preferences. -
What do games look like at 1920x1200 without anti-aliasing? Would 1920x1200 with no AA look better than 1680x1050 with anti-aliasing? -
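One way to put rough numbers on this trade-off: GPU fill and shading load scales roughly with the number of pixels rendered per frame (this is a simplification that ignores AA cost, CPU limits, and memory bandwidth). A quick back-of-the-envelope sketch of the pixel counts involved:

```python
# Rough pixel-count comparison for the resolutions in this thread.
# Assumption: per-frame GPU load scales roughly linearly with pixel count.
resolutions = {
    "WXGA+ (1440x900)": 1440 * 900,
    "WSXGA+ (1680x1050)": 1680 * 1050,
    "WUXGA (1920x1200)": 1920 * 1200,
}

for name, pixels in resolutions.items():
    print(f"{name}: {pixels:,} pixels")

ratio = resolutions["WUXGA (1920x1200)"] / resolutions["WSXGA+ (1680x1050)"]
print(f"WUXGA pushes about {ratio:.2f}x the pixels of WSXGA+")
# WUXGA pushes about 1.31x the pixels of WSXGA+
```

So 1920x1200 asks the card to shade roughly 31% more pixels than 1680x1050 every frame, which is why the same card that holds 40+ fps at the lower resolution can dip below it at the higher one, and why AA (which multiplies per-pixel work further) is usually the first setting to go.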
I wouldn't take the 1920x1200 option on a 15" screen if you paid me; 1680x1050 is as far as I'd want to go. For gaming it looks a dream, and for normal internet and word-processing tasks it's at the limit of the size I'd want to work at. I definitely wouldn't take the higher res on a screen smaller than 17", tbh.
But good luck with your choice. -
-
To be frank, when you are gaming you won't notice much difference in sharpness between the two. It's not like one is crazy high and the other is crazy low; the difference is not that big. They are both high resolutions, and you would be happy with either.
Personally, on a 15" screen and with the GTX 260M, I would go for 1680x1050. -
I was able to max out (max AA, AF, highest quality, etc.) all of the games I played (FEAR 2, Mirror's Edge, COD4, COD5, etc.) at 1920x1200 with just my 9800M GTS, except for Crysis, Far Cry 2 (where I only use 2xAA/4xAA), and GTA IV.
With my 280M, it'll be even easier. -
I'd take the 1680x1050, purely because on a 15" screen that "upgrade" just isn't going to be worth it, especially if you take a noticeable framerate hit for it.
If you really want to watch/play something at 1920xwhatever, then hook your laptop up to an HDTV. -
Okay, thanks guys.
Think it's 1680x1050 for me then -
Oh, I didn't even realize we were talking about a 15" screen; I thought it was 17" all along. In that case the choice is even easier: 1920x1200 will be really, really tiny on 15", wouldn't recommend it. 1680x1050 is just about right for that size, and you will get much better performance in games.
-
1680x1050 will potentially let you keep playing future games at native resolution for longer, without having to lower the resolution as much as you might have to at 1920x1200 for performance reasons.
-
1680x1050 is just fine and dandy for me on a 17". I would reserve 1920x1200 for SLI notebooks.
1680x1050 vs 1920x1200.. comparison?
Discussion in 'Gaming (Software and Graphics Cards)' started by MatthewRuddy, Jun 30, 2009.