I have bought a new laptop (due to arrive soon, I hope) with an NVIDIA GeForce 9800M GTS with 512MB RAM. I would be interested in knowing whether my system will be able to run 'all' games even, say, three years from now. By 'run' I don't necessarily mean 'play well', but whether it will be able to run them at all.
At this time, my system has an ATI Mobility Radeon X700 with 128MB RAM. Needless to say, it cannot run all the games being released at the moment, but for some of them the problem is not the raw power of the graphics card (if I were willing to run on very, very low settings and resolutions), but rather things like the lack of support for a new shader model, a new DirectX version and so on. Is this likely to be an issue with my new system too after three years?
-
The 9800M GTS should last you at least 3 years, assuming you don't mind turning down graphics features.
-
Red_Dragon Notebook Nobel Laureate
I'd say 3 years, but around 2010 you may end up having to play games on medium.
I have it and it's great, though.
-
Great! Thanks guys - I am happy that it should last me that long if I am willing to compromise on graphics settings as time goes on. I was worried that some new shader model or something like that might come along, new games would start to require it, and the graphics card wouldn't support it.
-
I think 3 years is too optimistic. I don't think any cards from late 2005, for example, can play today's games, except perhaps at the lowest resolution and settings, and that's not very enjoyable.
Technology changes very rapidly, especially PC games, their requirements and GPUs. -
Yeah 3 years is pushing it. Likely 2 years though.
-
yep about 2 years should be the limit....
-
2 years to run games well, I guess, but will the card 'expire' in 3 years due to stuff like new shader models that it will be incapable of supporting?
-
Late 2005? Hmm, my Go 7800 GTX is from late 2005 to early 2006. It still plays games really well today, not all of course but many, even Crysis at Medium settings, 1280x800, 33fps average, and all this with a single-core CPU. This was before my old GPU broke down.
I had it in my older XPS. It ran World in Conflict at 1920x1200 and not on any low settings at all. Look up some YouTube vids of it, username is Magnus72.
So 3 years, I think it's possible. -
I think you can still "run" games 3 years from now
-
That's what's important for me - to be able to at least 'run' them even a long time from now if I heavily compromise on the graphics.
-
I got it in either 2004 or early 2005, so it's possible, especially with the 9800 being relatively top of the line. -
Red, your system, is that 1 GB of dedicated video memory or 512? I'm looking at the P-7805u (FX Series) that says it's 1 GB of discrete (is that the same as dedicated?) video memory... at $1,100 that's got to be too good to be true... and with WUXGA to boot! -
-
When answering this question, better take a look at the gaming market. Because of piracy, most game developers want to sell their games for consoles too. More than a year after release, Crysis is still lonely at the top when it comes to graphics quality. Take a look at these numbers:
Unreal Engine 3: around 80 games created or under development!
CryEngine 2: no more than 7 games created or under development!
Nvidia and ATI are developing fast, but game developers are not! They are all just using the same engines, copying and tweaking. People buying GeForce 200 series cards these days are just trying to get Crysis running at 1920x1200 or higher. GTA4 is just terrible programming. All other games run perfectly on a 9600GT, which is roughly an Xbox 360. A 9800 GTS will run every decently programmed game until the next generation of consoles arrives. I don't think I will see a PlayStation 4 in the next 3 years. -
It'll really depend on the types of games you're playing. Companies will always try to reach as many people as they can, so they'll try to make games with broader/lower requirements. Not many PC games have Crysis-level requirements, really. Just think of all the general consumer games such as the Sims series, or even games like DMC4. On minimal requirements they're not that demanding.
I have a 3-4 year old desktop with an Athlon 3700+ and an 8800 GTS (320MB version) and I'm still able to at least run today's games without a problem >.> -
Look at the 8800GTX... how long ago was that thing released?
-
A desktop won't be as much of a problem since you can upgrade graphics/CPU/whatever at will. With a laptop... you are basically stuck with what you have.
-
Red_Dragon Notebook Nobel Laureate
As for the 7805, it is well worth the money. Go for it, you won't regret it. -
Yes, a 9800 GTS will last over 3 years easily. I don't know what people are basing their "2 years tops" comments on, since the TC said he wants to run games, not run them at any specific settings (which includes the lowest possible).
I have a laptop equipped with an ATI Mobility Radeon 9600 from 2004 that packs a decent punch even for some games of today. Hell, it is as fast as, or faster than, some IGPs! And that GPU is from 2003. It isn't even a high-end GPU, it was a mid-range GPU!
A high-end GPU like the 9800 GTS will surely show its strength in games even today. As someone else mentioned, for example, a 7800 GTX is still a formidable GPU after more than 2 years, and probably holds performance near some mid-range GPUs like the 9600GT (albeit with less tech).
So yeah, basically a high-end GPU will stand the test of time much more easily than your average GPU, especially since you don't mind playing at low settings. -
-
Have such a hard time getting that across to people these days...
-
Your GPU will be suffocated by the time its 1GB of RAM is fully used up. I laughed when I first saw the desktop 8600GT 1GB in a tech magazine. It's like giving an Olympic-class swimming pool to a newbie swimmer.
-
1) As has already been mentioned, very few PC-only AAA games are being made anymore. It just doesn't make any financial sense. The consoles are running hardware that is more than 2 years old now. Their processors were more or less futureproof (assuming that the code is specifically written for them), but their graphics cards were most certainly not. As long as you don't insist on exotic resolutions and settings, you should be fine.
2) At the moment, the GPUs are improving, but they're improving slowly. What is the difference between the 9800M GTS and 8800M GTS? The latter is a bit slower, but not by much. The new 100-series cards are promised to be what, 20% more powerful? The card makers aren't doing anything fundamentally new (like they did when they introduced the unified shader model); they're just stuffing more of the same chips into the same space and while this does increase performance, it suffers from diminishing returns.
This is not the kind of difference you see in an emerging market such as, for example, the SSD sector, where last January $500 would buy you 80GB, today $500 will buy you 256GB (which is also faster and less error-prone), and next January you'll probably be able to get a full terabyte for the same amount of money. This is an established market with exactly two competitors who don't have much incentive to innovate beyond a 20-30% improvement per year. -
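A quick, illustrative bit of arithmetic on the two trends described in the post above, using only the figures quoted there (the SSD dollars-per-GB prices, and a 20-30% yearly GPU improvement compounded over three years). A rough sketch, not measured data:

```python
# Back-of-envelope comparison using only the numbers quoted above.
ssd_then = 500 / 80    # dollars per GB last January
ssd_now = 500 / 256    # dollars per GB today
print(f"SSD: ${ssd_then:.2f}/GB -> ${ssd_now:.2f}/GB "
      f"({ssd_then / ssd_now:.1f}x cheaper in one year)")

# Compound the claimed 20-30% yearly GPU improvement over three years.
for yearly_gain in (0.20, 0.30):
    after_three_years = (1 + yearly_gain) ** 3
    print(f"GPU at {yearly_gain:.0%}/year: ~{after_three_years:.1f}x faster after 3 years")
```

Even at the optimistic end, three years of 30% gains is only about 2.2x, versus a roughly 3x drop in SSD price per GB in a single year.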
Peter Bazooka Notebook Evangelist
I've been using my 8800 GTS (basically the same performance) for a year now and it still hasn't missed a beat, except with Crysis and Supreme Commander, both of which were released in 2007 and run like crap on almost anything.
It still maxes out many new top-tier games like L4D and CoD: WaW, so I think it will be viable until a new generation of consoles comes out and the ports start killing it. The consoles are about equal to a 7950 GTX in power (correct me if I'm wrong) and I'm pretty sure that card handles most games even after having a few years on it.
This is assuming that the architecture doesn't change drastically like it did with shaders, and even then, since many games are released on consoles simultaneously, it should last as long as they do. And with companies like Blizzard/Valve that make PC-only games it should continue to run great, because they realize games are about more than just graphics. -
Just like a Pentium 4 will still run everything. Does it do it as well as an i7? No, but it can do it!
-
Your 9800 will be fine for a good while... I still play WoW, CoD: WaW, Fallout 3, etc. on a Radeon Xpress 200M (integrated!)... not on any kind of high settings, but at 1440x900 nonetheless.
-
Lots of VRAM is necessary for lots of textures, physics and high resolutions. If the GPU itself is incapable of rendering all that properly, then the extra VRAM will in a sense be "wasted" (since chances are the user will not attempt the huge workloads that would be limited by the GPU's performance), but it's not that the GPU cannot theoretically use it. -
I think it's more based on the fact (since the GTS can access it) that a game almost has to be developed with the extra VRAM in mind to make more use of it. It really doesn't offer any extra performance now, but one would think that if a game were programmed to take advantage of the extra VRAM, then it would make a difference. To an extent, it's not like 1GB cards have been around for a long time, especially in terms of market saturation (or, well, consumers owning them).
I have no facts to support this (outside of many people not owning 1GB cards), so who knows. Theoretically it makes some sense to me. -
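To put some rough numbers on the VRAM discussion above, here is a framebuffer-only estimate; the per-surface assumptions (32-bit color, three surfaces, MSAA scaling every surface) are simplifications of my own, and the sketch deliberately ignores textures, shadow maps and driver overhead, which is where most VRAM actually goes:

```python
# Rough, framebuffer-only VRAM estimate. The constants below are simplifying
# assumptions, not figures from any driver or spec; real usage is dominated
# by textures and other resources this ignores.
def framebuffer_mb(width, height, msaa=1, bytes_per_pixel=4, surfaces=3):
    """Approximate color + depth + back buffer, each scaled by the MSAA factor."""
    return width * height * bytes_per_pixel * surfaces * msaa / (1024 ** 2)

for w, h in [(1280, 800), (1680, 1050), (1920, 1200)]:
    for msaa in (1, 4):
        print(f"{w}x{h} @ {msaa}x MSAA: ~{framebuffer_mb(w, h, msaa):.0f} MB")
```

Even 1920x1200 with 4x MSAA only accounts for on the order of 100MB here, which is why the extra memory on a 1GB card mostly matters when a game ships textures big enough to fill it.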
Our 8800/9800 series laptop GPUs will last us longer than the cards made in 2005, because back then developers were pushing the game industry, making better, more intense and engaging graphics and pushing graphics cards. I remember back in the day we had to upgrade our graphics cards every 6 months. Not anymore.
Nowadays companies are playing it safe and not taking any risks, due to how badly Crysis was received. They are not willing to take the risk. The 8800 desktop cards can still max every game out easily. And those were made in 2007 or 2006, I think.
That's what I think.
Oh and... 3 years is good. -
masterchef341 The guy from The Notebook
IMO, it directly relates to the consoles. This has been the case in the past, and it will be even more so in the future.
Games are made with the consoles in mind.
If you are using tech that is equivalent to or beyond the current console tech, you will be able to run the games on the market.
The problem with the X700 (or the X800 series) was that when the new consoles were released, they came with support for a higher shader model. Soon thereafter, games were developed to target those consoles, and since those games were also being ported to the PC, cards with support for the new shader model became the point of entry.
If new consoles come out that support Shader Model 5.0, games will target that. It all depends on the consoles, in the end. Unfortunately.
That said, not every game will require the new tech, but AAA cross-platform titles will. -
I do like nice, pretty graphics, but I don't see why playing games on low settings is bad. Maybe you play crappy games that just look good. I mean, I generally buy a decent video card, but the people who buy the highest-end card and SLI/Crossfire them are out of their minds.
-
Bear in mind, you may need SLI/Crossfire for higher resolutions. Monitors are getting bigger and cheaper, and if you have anything bigger than a 20" and want to play at native resolution with high settings and eye candy like AA, you might need a multi-GPU setup.
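For what it's worth, the pressure from bigger monitors is easy to see just from the pixel counts; a tiny sketch (the resolutions are common panel sizes, and "load" here is just the raw pixel ratio, not a benchmark):

```python
# Pixel-count ratio relative to 1280x800, as a crude proxy for the extra
# fill-rate/shading work per frame at native resolution. Not a benchmark.
base_w, base_h = 1280, 800
for w, h in [(1280, 800), (1680, 1050), (1920, 1200), (2560, 1600)]:
    ratio = (w * h) / (base_w * base_h)
    print(f"{w}x{h}: {w * h:,} pixels ({ratio:.2f}x the pixels of {base_w}x{base_h})")
```

Roughly 2.25x the pixels going from 1280x800 to 1920x1200, before any AA, which is where the multi-GPU argument comes from.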