OK, I have a question about a 1920x1200 screen resolution on a 15.4" laptop. I hear people complain about how small the icons appear on screen at this resolution. My question is whether you can just change the resolution to make the icons and other things larger?
-
ViciousXUSMC
Yeah you can, but then you're probably paying more for nothing.
-
Well yeah... but I wouldn't recommend that resolution for a 15-incher unless you are a gamer with an 8800 GTX or equivalent.
-
It would be silly to just lower the resolution to make icons appear larger. LCDs are meant to run at their native resolution, so you'll get some crappy quality if you go in that direction. Also, as Vicious said above me, you would be paying for nothing (though I am assuming you're looking at a system where that config is required to get a promotion).

At any rate, I have the 1920x1200 on my M1530, and I find that the icons aren't too small at all. You adjust to the size of everything very quickly, and it grows on you to the point where you'd have trouble going back to a standard resolution. If you did have trouble with icons being too small, you could simply increase their size in the personalization menu. Same thing with fonts: just increase your DPI setting and you'd be all set.

But run the machine at the native resolution if you get it. It would be a shame to let all that real estate go to waste. Come on, man. Global warming.
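To put some rough numbers on why the icons look small and why bumping the DPI setting compensates (a quick Python sketch; the 1280x800 baseline panel is my assumption for comparison, not something from this thread):

import math

def ppi(width_px, height_px, diagonal_in):
    # pixels per inch of a panel, from its resolution and diagonal size
    return math.hypot(width_px, height_px) / diagonal_in

wuxga = ppi(1920, 1200, 15.4)  # ~147 PPI, the panel in question
wxga = ppi(1280, 800, 15.4)    # ~98 PPI, a common 15.4" panel (assumed baseline)

print(f"{wuxga:.0f} PPI vs {wxga:.0f} PPI")
# An icon drawn at a fixed pixel size shrinks in proportion to the density:
print(f"a fixed-pixel icon appears {wxga / wuxga:.0%} of its WXGA size")
# Raising the Windows DPI setting from 96 to 120 scales the UI by
# 120/96 = 125%, winning back most (not all) of the ~150% density jump.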
-
Yeah, but for gaming, even with an 8800M GTX, you can't really play much at 1920x1200.
-
Well, it can on almost every game... minus Crysis.
-
Yes, everything is adjustable, either by increasing the DPI or by holding down CTRL and using the mouse wheel to zoom in and out.
-
Yeah, I bought Crysis like a week ago and I have been playing it every day. Works like a charm on medium settings.
-
No...
*Runs in shame*
-
Yeah, I adjust webpages constantly, zooming in and out depending on the fonts used.
-
It's a great feature if you are looking for hidden small text. See what I mean?
-
lol nice one
-
Sirmetman, I think he meant future games, not ones already a year old or more. Forgive him his broken English.
-
Well, CoH is still often used as a graphics benchmark game, as are HL2/Ep1/Ep2. I think it's reasonable to comment on the capabilities for such titles in a current context. And his implication is that current mainstream games can't be run at WUXGA, which is untrue, by my reckoning.
-
Sirmetman, you must note you are running a quad-core, which really helps smooth things out at such a resolution.
-
Quad-core makes little difference in how things scale with resolution. Scaling and rendering happen on the GPU, almost completely independent of the CPU. If I were talking about how my machine matched up against others when dealing with max unit counts, numbers of concurrent AIs, or some physics calculations, I'd agree with you, but when we isolate the variable to graphics resolution, the CPU makes very little difference.
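Here's a toy model of that point (the millisecond costs below are invented purely for illustration, not measurements from any real game; real engines also overlap CPU and GPU work more than a simple max() suggests):

# Each frame, the CPU runs game logic while the GPU renders pixels;
# the slower of the two sets the framerate. Only the GPU's share
# grows with resolution.

def fps(cpu_ms, gpu_ms_per_mpixel, width, height):
    gpu_ms = gpu_ms_per_mpixel * (width * height / 1e6)
    return 1000.0 / max(cpu_ms, gpu_ms)

CPU_MS = 8.0           # hypothetical per-frame game-logic cost
GPU_MS_PER_MP = 10.0   # hypothetical render cost per megapixel

for w, h in [(1280, 800), (1440, 900), (1920, 1200)]:
    print(f"{w}x{h}: {fps(CPU_MS, GPU_MS_PER_MP, w, h):.0f} fps")

# Halving the CPU speed (8 ms -> 16 ms) barely moves the 1920x1200
# number, because at that resolution the GPU is already the bottleneck:
print(f"slow CPU at 1920x1200: {fps(16.0, GPU_MS_PER_MP, 1920, 1200):.0f} fps")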
-
I would ask why you got a quad-core then, but no more hijacking.
-
Resolution, lighting, and graphics are tied to GPUs; damage modeling, physics (though with CUDA etc, this is far from 100%), AI, and other game logic is handled by CPU. Quad core should be better when games become more heavily threaded, and also to better support multitasking. With games in general, but especially with multithreaded games, framerate is not the only important measure.
In any case, yeah, 8800M GTX can handle all but the most demanding games at 1920x1200. That's my statement, and I'm sticking to it (until someone proves otherwise).
-
How does your system handle Race Driver: GRID? I know mine does 60+ @ 1440x900, but how does yours do @ 1920x1200?
BTW, I don't think you can play Crysis on high @ 1920x1200 with playable FPS, can you?
-
Interesting... tell you what: scale your CPU down to 1.6 GHz, go run those games again, and let us know the framerates versus the framerates while running the quad at 2.4. I bet you will find quite a difference.
-
It's like saying changing the tires will make a firetruck put a fire out faster. It can get to the fire faster maybe, but once at the fire, you have to change something other than the tires to be able to put the fire out faster. A slower CPU will make the game run slower, but not in a way that ties to resolution.
Oh, and I noted that I don't have Crysis. Note what I said: for the majority of games, it will do just fine. The key word is majority. Yes, there are some it won't handle, but most games you throw at it will do fine. I wouldn't suggest someone skip a WUXGA screen because 5-10% of the games they'll play can't run high-max at native res.
-
Not a very good example, but I see what you're trying to say.
And it's really only one game in question. You used CoH as the basis of your decision, while the rest of the gaming community uses Crysis as the basis for testing out a graphics card. Now, unless you know something that we, NVIDIA, Intel, Gateway, HP, Dell, Sager, or anyone else I forgot to mention don't know, now would be the time to share. There is no single GPU that can run that game on Very High, and your Sager with its 96 shader cores will not run it playably at 1920x1200 on Very High, let alone with AA added to the picture.
Edit: now, if you add that second 8800M GTX card... you might be able to pull it off.
-
You're right, I didn't base it on the extreme case of Crysis. I decided it was more reasonable to think about what the 90% of games you'll play can do, rather than the 10% that won't be able to perform at that level. Next time someone asks me if broccoli is good for them, I'll remember to say no, because if you are allergic or eat 4 pounds of it a day, it could do you harm. Eesh...
-
I gotta agree with John; Crysis is becoming the new benchmark. 90% of people posting in the "Can My Notebook Run It" thread are asking about Crysis. But you make a good point, Sirmetman... though Crysis is the buzz.
-
Good point. Crysis is a benchmark, but there are many other great titles coming out.
All in all, the 8800M GTX is still one beast of a GPU, and will handle anything thrown at it for quite some time.
-
Yeah, not dismissing the GTX at all, just standing up for the GTS as well. Even though the original question had nothing to do with where we are now... lol
-
Good point.
Yeah, I think his question got answered on page one, heh.
-
Haha, that's the freedom of opinion.
-
How fast a CPU do you all suggest to go with a 15" 1920x1200 screen and a GeForce 8800M GTX?
-
At least a T8300 (2.4 GHz).
-
What's the difference between the 2.4, 2.5, and 2.8 GHz CPUs?
-
.1, .3, and .4 GHz.
-
Sorry, you're right KGann. A little more insight. As long as they are part of the same series and have the same FSB speed and cache size, then the difference between 2.4 and 2.5 is ~4% perf boost, 2.5 to 2.8 is ~12% faster, and 2.4 to 2.8 is ~17% speed increase.
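For anyone checking, here's the arithmetic behind those figures (pure clock ratios in Python; real-world gains are usually a bit lower, since games rarely scale perfectly with clock speed):

# Relative speedup from clock ratio alone (same series, FSB, and cache):
clocks = [2.4, 2.5, 2.8]

for slow in clocks:
    for fast in clocks:
        if fast > slow:
            gain = fast / slow - 1
            print(f"{slow} -> {fast} GHz: ~{gain:.0%} faster")

# Prints ~4% (2.4 -> 2.5), ~17% (2.4 -> 2.8), and ~12% (2.5 -> 2.8),
# matching the figures above.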