Hi guys
I'm writing to all of those who exaggerate their graphics performance.
After reading so many threads like:
"Will my X200 / GMA / X1100 play these... games?" where people respond with:
"Oh, my GeForce 7200 Go runs CS:S at 1024x768 with high settings and AA enabled, so I guess it will run that game. Oh, and btw, the X200 plays F.E.A.R. on low settings, so you should be alright."
OK.
So the GeForce 7200 Go performs about the same as a Mobility Radeon X600, which won't run CS:S at those settings, so how the hell would a 7200 Go manage playable framerates there? My old desktop with an X700 Pro couldn't do that, so how does a 7200 Go? Because it's BS; there is no way it runs HL2-engine games at 1024x768 with high detail! Cut the crap and tell the truth! Oh, and about the X200 running F.E.A.R.: an X700 Pro lags at 640x480 on medium settings.
Thanks for listening.
(Sorry for my English.)
-
I was playing HL2 the other day on my X1300 at 1280x800, everything on high, AF at 4x. Most of the time it was around 40-60 fps. Once, during the intro video and when I first walked into the open square, it dropped to 20 fps, but it quickly picked up as textures were cached in local memory. Personally, though, I'd probably play at medium detail with no AF, just to keep the fps above 40 at all times.
FEAR I play at 1024x600, everything on low, no shadows. It still looked nice, though; top, top game. If you have trouble running it, disable shadows and watch your fps rise and rise, especially on the older ATI cards; they were not too hot at shadowing (look at the Doom 3 engine on the X-series cards). -
I know you can get good fps in small areas with high settings, but that doesn't mean you get 60 fps all the time, and that's not what I call playable.
-
If you think playable means 60 fps, no wonder you hate IGPs so much. In most titles you're only going to achieve that on high-end laptops or desktops at those resolutions. I don't know what most people around here consider 'playable', but for me, if the game isn't stuttering (i.e. 25 fps or below), I'm a happy camper.
-
Oh, so if the 7200 runs a game at high detail at 25 fps, that's considered playable? I can tell the difference between 100 and 60 fps, and 50 is as low as I will ever go.
-
Video runs at 25 fps; is that not watchable? What people consider playable is subjective, and your high standards are not shared by everyone.
-
You couldn't even run Oblivion on a GMA 900, not even at 640x480 with all settings at minimum. X3 didn't work at all, and anything newer than NWN hated my old GMA.
IGPs are good for light, very light, gaming and that's it. Anything beyond that and you need to look elsewhere. However, I'd guess anything about two years old would work on a 200M, and about three years old for a GMA. Forget most of the other options (does nVidia have an IGP? If so, count that in like the 200M). -
My old lappy, which I no longer have and so don't have to defend, did run FEAR at 1024x768 on low settings, and at 800x600 on medium with effects and some eye candy on. It had an X200M with 128 MB of shared HyperMemory, no dedicated VRAM.
All I can say is that either the 1 GB of RAM did it good, or the Athlon 64 did, or neither; no idea. As we know, every setup is different: with the stock ATI drivers it would run like a piece of ****, but with the Omega 6.3 drivers it ran well.
I think you need to take a chill pill, sit down and relax. Why would anyone need to lie about what their machine can and can't do anyway? By contrast, my X200M wouldn't run PlanetSide in busy areas, but that's just a poorly coded resource hog. -
My video card (Radeon 9100 IGP with 64 MB VRAM) runs HL2 on medium (max res), AoE3 on medium, and CS:S on medium-high (at 25 fps). All of them, with the exception of Microsoft's bloated AoE game, run smoothly with no complaints. IGPs aren't half bad.
-
Iceman0124 More news from nowhere
That's an apples-to-gall-bladder comparison. Film being played back and images rendered on the fly by a PC are two completely different things; the human eye processes these sources differently. -
Well, fps tolerance IS very subjective. If you ask me, I would take a consistent 25 fps over a framerate that swings between 40 and 20.
And the reason film gets away with such a low framerate is that film by its nature incorporates motion blur, something rendered images don't naturally do. -
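To put a rough number on the blur point above (an illustrative sketch with made-up figures, not something from the thread): film exposes each frame for a fraction of the frame interval, so fast motion smears across the frame instead of strobing, which is why 24 fps film can look smooth where a 24 fps game would not.

```python
# Illustrative sketch: how much an object smears within one film frame.
# Assumes a classic 180-degree shutter (exposure = half the frame interval).
# All numbers here are made up for illustration.

def blur_px(speed_px_per_s: float, fps: float, shutter_fraction: float = 0.5) -> float:
    """Pixels of motion blur captured during one frame's exposure."""
    exposure_s = shutter_fraction / fps
    return speed_px_per_s * exposure_s

# An object crossing the screen at 1200 px/s:
print(blur_px(1200, 24))                        # film at 24 fps -> 25.0 px of blur per frame
print(blur_px(1200, 24, shutter_fraction=0.0))  # a game frame: 0.0 px, crisp but strobing
```

The zero-exposure case is what a renderer produces by default: each frame is an instantaneous snapshot, so at low fps the object appears to jump rather than glide.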
Motion blur helps a lot. I can tell the difference between 25 fps and 50 fps in F.E.A.R., but I can run Need for Speed at top settings with 4x AA, and it gives me 25-35 fps and looks smoother than F.E.A.R. Motion blur is key to making a game look smoother.
That being said, fps tolerance is subjective, as everyone has said, and if you feel that you NEED 60 fps at all times, then that's your thing. But telling other people they are ridiculous for saying games are playable at xyz settings on abc card is nonsense, because they may find 25 fps completely playable (as a lot of people do). Besides, pretty much everyone on here agrees that IGPs are not great for gaming; hardcore gamers don't buy computers with IGPs. This is nothing new. -
Just to let people know, I wasn't trying to say movies and games are comparable with the whole 25 fps thing; I was just trying to point out that below 60 fps does not mean unplayable. It was a crappy analogy, as I was on my way out the door at the time, but trust me, I've been involved in many discussions about gaming/movie fps before! FPS is subjective; I know many gamers who won't get an LCD since they like the higher refresh rates CRTs offer, and they see 60 Hz on an LCD as a drawback for gaming. If that's what works for them, fair enough!
But from being a member here, most laptop owners will not expect their games to run at 60+ fps at all times; for most owners playing newer titles, it is simply not going to happen. It is just a case of different people having different perceptions of what playable really is. -
OK guys, I wouldn't say 25-30 fps is playable. For me to play comfortably in, for example, CS, I need a constant 100 fps even with 10 smoke grenades going off; the same goes for CS:S, where I need 100 fps while shooting in wide-open areas.
-
I'm one of those gamers who is quite happy when a game's fps stays over 25. I don't have bad eyesight (military tested, aircraft program), and beyond that I can't really tell the difference. I actually prefer not seeing the difference. It means I don't get all gee'd up, because I played FEAR at 30 fps on my X1400 and only had to spend half as much money as I would have if I were a frames-per-second junkie.
And about the 7200: I've never used one, so I'm not 100% sure of its performance. But if a Radeon 9000 IGP can play FEAR with everything low at 640x480, the 7200 should be able to play it decently. -
My desktop plays 1.6 at a stable 100 fps, and so does my F3Jc, but a smoke grenade won't cause lag only in the desktop's case; the lappy's fps drops below 30, which is basically what I would call unplayable. Source runs tweaked at native res, and luckily Source's smoke nades aren't such performance hogs.
At least in a laptop, great specs won't make it a bona fide gaming machine, because of one particular thing: the LCD. Almost every laptop LCD panel has a response time in the 16-25 ms range, which is a hell of a lot compared to modern desktop LCDs. I find playing 1.6 and Source on my 4 ms Samsung 730BF more satisfying than on the "slower" laptop LCD.
This turned out to be a rant, didn't it? -
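The response-time point above can be put in back-of-envelope terms (a rough sketch using the 4 ms and 16-25 ms figures from the post; `frame_time_ms` is just illustrative arithmetic, not any real API):

```python
# Frame time vs panel response time, back of the envelope.
# A panel whose pixels take longer to transition than the interval
# between frames is still mid-transition when the next frame lands,
# which shows up as ghosting/smearing in fast games like CS 1.6.

def frame_time_ms(fps: float) -> float:
    """Time between rendered frames, in milliseconds."""
    return 1000.0 / fps

print(frame_time_ms(100))  # 10.0 ms between frames at 100 fps
print(frame_time_ms(60))   # ~16.7 ms between frames at 60 fps

# A 4 ms desktop panel settles well inside either interval;
# a 16-25 ms laptop panel cannot keep up at 100 fps.
```

This is only a crude model (quoted response times were measured optimistically back then), but it captures why the same framerate could feel cleaner on the desktop monitor.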
There's a reason I don't use FRAPS or anything like that. I just play with the settings until the game runs as fast as it feels like it should, at the highest settings the computer can handle, and call it good.
-
lappy486portable Notebook Evangelist
Well, that is your problem. For a long time I suffered horrendous frame rates because I had an Intel GMA 900; 15 fps was like 40 to me. Now that I have the Go 7200, I'm spoiled. When I drop to 25 I still think it's playable, just not smooth. When I got 25 with my GMA 900, I thought I was in heaven. My point is that what you don't consider playable is actually incredibly smooth to most people. And yes, I can run HL2 and CS:S at 1280x800, all high, with 4x AF, and I always get 25+ fps; only once did I get lower, and that was at the very end. My Go 7200 scores 1550 in 3DMark05. That is more than an X600, X700, OR X1300, and close to an X1400. I really do get that many frames per second; I'm not just talking out of my ass. When I first saw the results I was amazed. I haven't tried FEAR yet, but I'm going to download the demo right now and see what I get. And why would I make it up, just to be an *******? Oh, and UT2004 plays at all high at 1024x768, never going below 25 fps.
edit: I just played the FEAR demo, and it runs at 800x600 with the graphics card setting on medium and no shadows. Very smooth and good looking. I'm becoming more and more surprised with this card.
edit again: I just downloaded the Far Cry demo, and I can play on high quality at 1024x768. Wow, I never knew what this card had in it. It's quite the beast, lol. Beginning to wonder if it could tackle Crysis, lol. -
And you're saying your specs, with a Pentium M and an X1300, run it at that 40-60 :/.
The only thing I can think of is the resolution difference? -
No, I wasn't joking about 100 fps, and yes, I can tell the difference between 60 and 100 fps. I can't stand fps drops or any lag, and 25 fps surely isn't playable (for me, anyway). However, I don't mind turning all the settings down and playing at 640x480 to get the extra fps for smoother gameplay.
-
I never got that. I mean, 30 fps is plenty for me. And as for telling the difference between 60 and 100 fps? That's a bit sad if you think it impacts your gaming. I have played CS:S at 35 fps and still done EXACTLY the same as I did when getting 80 fps on my desktop...
Some people have just gone totally overboard with the whole thing... -
Hell, I'd pick smooth 60 fps gaming at good detail over crap detail at 640x480 with a higher framerate.
-
You can get CRTs that go over 100 Hz; a few people at other forums I visit refuse to go LCD, as it means playing games at 60 fps. To each their own; I'd never go back to CRT myself. -
I hope you're not talking about playing on a laptop, or an LCD for that matter. Most LCDs only have a refresh rate of 60 Hz, meaning it is only possible for the screen to display 60 fps. If you're getting 100 fps, you still won't see any more than 60; the screen isn't displaying any more than that. And the highest CRT refresh rate I've ever seen is around 85-90 Hz, though there may be some gaming monitors out there that go higher. But seriously, if you're talking about laptops, there's no way you're seeing smoother performance at 100 fps over 60 fps. In fact, 100 fps would probably be less smooth, because you'll get tearing when the graphics card outputs more frames than the screen can show.
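A minimal sketch of the cap described above (`displayed_fps` is a made-up helper for illustration, not any real API): the panel's refresh rate is a hard ceiling on how many distinct frames you can actually see, no matter what the fps counter reads.

```python
# A screen refreshing at R Hz can display at most R distinct frames
# per second, regardless of how many the GPU renders; the excess
# frames are either dropped or torn across a single refresh.

def displayed_fps(rendered_fps: float, refresh_hz: float) -> float:
    """Upper bound on distinct frames per second the screen can show."""
    return min(rendered_fps, refresh_hz)

print(displayed_fps(100, 60))   # 100 fps on a 60 Hz LCD -> only 60 shown
print(displayed_fps(45, 60))    # below the refresh rate, every frame is shown
print(displayed_fps(100, 120))  # a 120 Hz CRT could show all 100
```

This ceiling is separate from the input-latency argument the Quake players make below: rendering 100 fps on a 60 Hz panel can still reduce input lag and affect engine physics, even though no extra frames reach your eyes.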
Ever played Quake 3? There is such a thing as strafe jumping, and it can't be done properly at the 90 fps Quake 3 is locked to, but as soon as you set it to 125 you'll see the difference. Also, refresh rate (Hz) and fps are different things, and CRTs can go up to 120 Hz. There are also lots of tools like Refresh Lock, etc. -
I can strafe jump at less than 60 fps. I used to play on my Aspire 3002, with SiS graphics, and could pull it off as easily as my desktop can. Just because YOU can't do it at less than 90 fps :S doesn't mean others can't.
That reminds me, I need to put Quake 3 on this thing; I need some good Linux games. -
Yes, I do run mine at 16x AF and 8x AA.
-
My 19" CRT will refresh at 100 Hz at 1024x768. I personally would rather run at 1280x1024 with some reasonable quality settings and settle for the lower framerate and refresh rate; 1280x1024 @ 85 Hz is fine for me on the CRT. I do start to get headaches with low refresh rates, though (not a problem on LCDs; they work differently).
-
Agreed. 60 Hz on a CRT gives me a huge headache. Usually anything below 75 Hz starts to get to my head, but I can stand lower than 60 fps.
-
metalneverdies Notebook Evangelist
My friend has an HP with a dedicated 128 MB X200, and he has to play Source at 640x480 with everything on low and DX level 7 to get a constant 30-40 fps. It's also a 2.2 GHz Athlon 64, so it's not slow either. IGPs just aren't that good for gaming; for everyday use they're great, because they save power and still have good-looking video for what they're meant to do. And you guys are crazy: I have an X300 that I took out of a Dell and put in my desktop, and it runs BF2 no problem on medium settings at 60 fps, and CS:S at about 80 fps with 2x AA and 4x anisotropic, detail on high or medium.
-
Yeah, overclocked, an X300 can get around 2000 3DMark05 points, which is what some stock X1400s get. Though, looking at the performance of the new X1150, and if the X1250 arrives, I'm beginning to feel I could settle for an integrated card.
HL2 doesn't look bad in DX7 mode, and being able to play it at 1280x800 on an X200 is fine by me. -
The 1150, that's integrated? What's its performance level? And there's a 1250 coming out that's also integrated? I feel like I'm only going to benefit by waiting...
-
Buy something today and it's out of date tomorrow; buy something tomorrow and you go without it today... -
Well, that's always the case, but it seems like with the new OS it'll be even more pronounced.
-
The waiting game never ends, but it might actually be worth waiting for a potential release date on that 1250. Other than that, for a low-end user, integrated breakthroughs aren't quite as common as in the high-end market, so if it's too far off, I'd go for the X1100. -
Sylvain: oh, so I'm wrong, am I? So the whole gaming community is wrong too? You can't strafe jump properly below 125 fps, and that's not just my personal opinion.
Maybe you should read this: http://ucguides.savagehelp.com/Quake3/FAQFPSJumps.html
Let me give you a few examples where fps matters.
Let's say you and I are playing CS:S. I'm running it at 640x480 with the lowest settings and get around 70-90 fps. You're playing at 1024x768, all high with AF and AA, and get 25-40 fps. You can look at the nice walls and experience a slideshow while I own you. Who do you think has the advantage? I doubt you'd be able to do anything against me. In CS, fps affects your recoil as well as the game physics. It's a fast-paced game where every frame counts.
In competitive games like CS and Quake, fps does matter, and whoever has more has the advantage. -
Hmm, reading up on the 1250, it sounds pretty nice. I wonder what new AMD or nVidia video options there will be. I'm not really in the mood to get an Intel machine (lower-end machines with good video don't seem to exist).
edit:
nVidia has their MCP61S IGP chipset coming out, and it sounds like they're planning a mobile version. Sweet. Sounds like the 1250 and the MCP61S are the answers to the beefier Vista requirements. I think I'm going to wait for those. -
I'd rather go get some hax than play at the lowest res and settings. -
And I play CS at 1280x800 @ 40 fps, and it doesn't turn into a slideshow. I'm not a very good CS player anyway (I don't have twitchy reflexes), so increasing my fps won't help me one iota. That's why I play Flashpoint and RTS games instead, where patience and planning can redress the reflex balance.
AND I play FEAR at 30 fps, which sometimes drops to 25 fps. I've never complained about the smoothness (or lack thereof) and find myself enjoying it perfectly fine. Sure, the extra fps or a boost to max GPU settings would be nice, lol, but I wouldn't get any gameplay advantage out of it.
That's the beauty of eyes: everyone's pair is different. You need to see things at 100+ frames per second? That's great. I see perfectly at 25, and I know which I'd rather be.
EDIT: I never play with AF or AA on. When running at native res, I just don't see the point; the performance hit is just not worth it. -
Hey man, I gave you a link and you should read what it says! I know how to strafe jump, trust me. At 60 fps it feels weird and slow; you don't get enough speed to reach places like the q3dm6 jump (from the rail to the bridge). I'll give you an example: run CS 1.6 and type fps_max 30 in the console, then try playing like that; try to jump and see how it feels. As I've said, competitive games like Quake and CS require high fps to play properly (look at the pro gaming scene). Ever seen a pro gamer playing at 1280x800? I don't think so. However, in single-player games like HL2 you don't need high fps to enjoy the game, though I'd still rather play at 1024 with medium detail and more fps than suffer.
CS at 1280x800 @ 40 fps: is that on your own? How many people on the server? Ever played on a 40-man server at that res? Personally, in multiplayer I play at 640x480 with the lowest settings and a high-fps config, at 80-90 fps on a 40-man server. I don't care that much about detail (no time to look at the nice things); I just can't stand lag, so that's the compromise I make.
To XIII_GT: get hax? Are you a noob or something? If you can't play the game, go do something else, or play a different game where no skill is involved, like NFS maybe. -
And as I've said before, I've played CS:S at 35-40 fps on very high settings (as Sylvain said) and done fine, as did most other people on the server. I guess they weren't part of the l33t brigade you're in, though; otherwise they'd have used the power of their 100 fps mad skillz to better effect. -
lappy486portable Notebook Evangelist
Alright, I've seen an X700 get about 1500, but it might have been underclocked. The X1300 does not rip apart the 7200; on the contrary, the 7200 hands the X1300 its ass. -
IGP my a**
Discussion in 'Gaming (Software and Graphics Cards)' started by applx, Oct 30, 2006.