In the thread posted recently about NWN2, I made a statement that's gotten me thinking about where PC gaming is going. I've had a lot of graphics cards over the years, most of them either mid range or performance series, and I've used them for both gaming and professional apps. The list includes:
Radeon 256, GeForce MX 440, GeForce Ti 4400, Radeon 8500, GeForce 5500, Radeon 9700, FireGL X1-128, and Radeon x700 (in laptop).
All of them are mid range to performance cards and, in their day, they could run anything I threw at them at or near the highest settings with playable framerates. But recently it's gotten to where midrange isn't so great for gaming. Not too long ago, I picked up my first high performance gaming card, a GeForce 6800. Even it couldn't play everything at peak performance, but it's, for the most part, a noticeable boost over my x700.
So, what does the future hold for PC gaming? We have the new super high performance cards that can top out at $500. We have SLI and Crossfire. Combined with a motherboard for these technologies, you could pay a worst case scenario of about $1250 (two $500 cards plus a $250 board) just for the graphics cards and motherboard alone. All in the name of getting maximum settings and 60fps from your games. For some reason beyond me, hard core gamers are willing to pay this kind of cash. Even people with very little money shell out this amount!
It is becoming my opinion that hard core, high performance PC gaming is becoming a rich person's pastime. Most of the graphics cards listed above I paid for myself. I was 14-15 years old when some of them were released, and at the time, paying $100 for a graphics card was a big hit to the old wallet, but I could do it! Yes, mid ranges have been getting more expensive, but they're still very affordable for the gamer on a budget.
Now, here's me, and many other people, "poor college students" or just gamers on a budget, who love to game and can't get the same experience we got in years past. I forced myself to buy the 6800, and even that can't give me the max settings in today's games that my old 8500 could get on games in its day.
Now, I can understand the developers wanting to create something beautiful. I am a graphic artist as well, and the sensation of seeing your final work in full 3D with everything you can do to make it look as stunning as possible just gives you this awesome sense of accomplishment and beauty. And the first time you sell a print or mount one of your works on your wall is unforgettable!
However, I do believe it is necessary for the developers to consider their customers more. They could alienate an entire group of gamers by tailoring their games to function best on high priced systems. At some friends' LAN parties I am routinely mocked for gaming on an x700 or my 6800. I know they don't mean much by it, but frankly, I'm getting quite tired of it.
The worst part, in my mind, is that Nvidia has begun releasing their professional workstation cards with SLI compatibility. Could this mean my chosen profession, an already VERY expensive field, will become that much more expensive because I'd need to buy two graphics cards?
This trend has finally gotten to me, and I've broken down and I'm going to see what it's all about. Before fall, I'm planning on putting together an SLI or Crossfire desktop system. Granted, it'll probably eat into either my college funds or software funds, but hey, all us gamers have to sacrifice somewhere, lol!
Now, I know that gaming isn't all about the graphics, it's about the story and gameplay (that's a whole 'nother rant!). But graphics are a very nice thing to have. Feel free to comment if y'all wish. It's nice to know various opinions on this!
-
Notebook Solutions Company Representative NBR Reviewer
I agree with you, Twilight. Only a few people buy high-end graphics cards and PCs. I would prefer a mid-range, much cheaper notebook/PC.
Charlie -
Well, multicore seems to be the future, be it CPUs or GPUs (SLI for GPUs), and we are at the beginning of this new trend at the moment. Give it a year or two and prices will start to level out as dual core/SLI becomes standard. On top of this, at the moment games aren't optimized for dual core setups, and CPUs are holding back the capabilities of the graphics card. I think developers are also getting lazy on the coding front, which isn't helping; games seem to be made for consoles now and then ported to PC, rather the opposite of what was happening a few years ago. (id and Valve are excused; Source has really matured as an engine, and Carmack's engines always seem to evolve over time. Quake Wars looks so much better than Doom 3 from the screenshots so far.)
Personally, what annoys me more is the ridiculous number of cards ATI and Nvidia release these days. To use the GF4 launch as an example, people had the choice of going for the budget MX card, or for gamers there were the Ti4200, 4400 and 4600. Only 4 cards, but they fell into the price/performance categories perfectly. Now we have multitudes of cards with SE, LE, XT, XTX, GT, GTX, GS, Pro, XL, GX2... etc etc. The list is endless. Consumers become confused by the range, and often buy a card which is hopeless and become disillusioned with the whole process.
I agree though on the graphics over gameplay trend that seems to be emerging. I went for an X1300 in my laptop as frankly I'm tired of developers trying to use fancier graphics to sell a game. I recently got CoD2 and played at 1024x600 in DX7 mode, and I still had a great time with the game, and have replayed it often. I don't need bump mapping or Anti-Aliasing or Pixel Shader effects to enjoy a game, just good stories and good gameplay. Sure, graphics can contribute to the enjoyment of a game, but they should not be the sole reason for playing it. That's what benchmarking is for. -
I understand what you're saying. I personally think DX9 and Shader 3.0 are getting dated for what developers are wanting to create. DX9 specifically is lacking in performance. Once DX10 comes out (I think it's about 3x as powerful as DX9) along with cards supporting it, games will get a large performance boost, with little, if any, extra cost. Once DX10 cards become mainstream, with 3x+ the power of today's cards, hardware will start to catch back up with the detail level.
-
It's not THAT bad really. Well, not quite yet in my opinion. I've been using a Sapphire 256MB x800 for the past 18 months and it can still run new games at a nice level, decent res with options on mostly high.
Of course, there have always been top end expensive cards ever since the real start of proper PC gaming. I remember when I first got my first proper 3D card, a 16MB TNT, one of my friends had a better Voodoo card. When I finally got my TNT2 card, someone else had a GeForce 256.
Unless you're willing to shell out serious money, which I personally am not, there will always be someone ahead of you. But in some cases, it's not worth it. Some people pay hundreds extra just for a few extra frames in their game. Is it worth it? Only to some people.
There will always be expensive top end cards in each generation, but to be honest, most mid-range cards are up to the task unless you REALLY are the kind of person that wants 60 FPS in every game even though the human eye can supposedly only see about 25.
Besides, some of today's engines are just poorly coded. Take Half Life 2 for example: it runs beautifully even on low end cards. Compare that to FEAR, where the game looks like muck unless you have a decent card.
Things might change temporarily with DX10 cards, but it will settle back into the old routine of "What you got is fine, but you'll always know somebody with a better card than yours...."
Because when you buy a card these days, it's outdated in a matter of months. Years ago, a top end card remained king for a year or two. They are releasing wayyyy too many variants too quickly these days for my liking. -
As a gamer myself I have played many, many games (you wouldn't believe how many). Back when I was young I had 1MB of video RAM and could play StarCraft at max settings, and I was happy. For two long years I could still play games at max settings; then Silver came along and I needed 2MB of VRAM. That lasted until 2000, and after that I couldn't play anything anymore, so I switched to my PS1.
So, after more than 10 years of gaming, the next generation of games keeps coming fast, faster and faster as technology develops. New hardware means a new standard for games. Game developers think it's a race in graphics: they make more beautiful but more expensive graphics, people say it's good, so they make them even more beautiful and even more expensive.
The result is that nowadays more and more people don't play all the new games on their PC anymore. They play:
- Old games, once they have a new PC system, so they can run them at max settings
- MMOs, which are becoming more and more popular thanks to low requirements and good interaction
- Or they switch to consoles like the Xbox and PS2 for nice graphics on the cheap.
On many game sites like www.gameFAqs.com, the one I visit a lot, poll results show PC gaming is more expensive than console gaming.
The main reasons are money and quality.
You may or may not know this, but not many gamers have the money to fulfill a hobby that is becoming more and more expensive compared to 5 or 10 years ago.
Nice graphics are good, but they come at a high price, too high. Very few games are made for low or medium PCs; Titan Quest is one of them, a nice game that looks good on an old PC. New games these days are built around the top-of-the-line PC, not the medium PC.
Maybe game developers need to sit back and think about who buys their product, the rich or normal working people. Gamers are mostly ordinary people, not many of them are rich, and games are a hobby.
If it continues like this, PC gamers will leave one day, when they can't keep up anymore. -
Charles P. Jefferies Lead Moderator Super Moderator
I highly agree with the points Twilight made.
Lately, another trend seems to be to bring over games from the Xbox 360. This is a total letdown, in my opinion, for PC gamers. Our platform is different from the 360, no matter what anyone says. I understand that there may be many good games for it, but if you're going to bring one over, at least re-code it for PCs. We don't want trash that doesn't run properly (Oblivion, GRAW). Great games, but poor coding and limited gameplay. The controls are even in disarray in some games.
DirectX 10 is very iffy at the moment. There is no set date for when it will come out; rather, we have a vague idea. I doubt that it is going to take off fast, because in order for DX10 games to run, you must have Windows Vista and a DirectX 10 card. A card is either DirectX 10 or it is not; there is no in between, a la the Nvidia GeForce FX. -
PC gaming costs way too much. The prices of the graphics cards as well as the processing power are ludicrous. Using a PS2 has been way better for me as well as for my budget.
-
Notebook Solutions Company Representative NBR Reviewer
I would also recommend an Xbox 360 if you have an HDTV. I mean, it costs less than a brand new PC that can run Oblivion. But somehow a PC is different...
Charlie -
Yeah, you get tons of Oblivion mods if you have it on PC.
-
I remember my old 9600 SE was not bad for gaming; even though it was a low to mid range card I could max out most settings on UT2004. Now it seems that you won't be getting close with anything less than a top notch card. I have a 7800 and can barely max out FEAR... kind of sad
-
HavoK, I understand that someone is always going to have something better than what I own, lol, we all know that! It doesn't bother me, I just don't like this trend. I don't need super high graphics, I just want an enjoyable experience. So far it's still very enjoyable, most of the time. But playing through the GRAW demo on my laptop disappointed me greatly. The game ran like total **** and looked like it too! Mid range doesn't cut it very well.
I just ask for about 30fps from my games and a decent look, graphics-wise.
As for console vs. PC, I've always been more loyal to the PC than the console. lol, I've been a PC gamer since 1988! I've been building and maintaining my PCs since I was 9! (with dad's help at that age, lol!) With a PC you can really get into it; you pour your blood (literally, PC cuts hurt), sweat, and tears into your system. It's a much more fulfilling experience than going out to the local WalMart and opening up your new Xbox 360. But now our hard built creations aren't enough for what we're building them to do. It kinda hurts.
Don't get me wrong, I do play console games (Mario veteran!) and I've owned most major consoles since the NES. PC games have always suited me more. And graphics have usually been better on the PC as well.
I agree that these lame Xbox ports are getting old. I want a game that's well programmed for the desired platform! Oblivion is a great game, but I know that it should run better than it does. This is another thing that's cranking up the system requirements and making it harder and harder for us budget gamers to get by decently.
PC gaming is losing customers fast, and frankly, I'm afraid of losing my desired platform! -
And the mockery from my poor, misguided, console-devotee friends is unbearable. I would certainly agree with all you people have been saying; it is ridiculous and my parents think I'm crazy. Well, I hope that the good predictions of it all evening out are true.
-
I think you're forgetting that we're soon going to be hitting the transition phase between different generations of GPUs.
The emphasis on SLI etc. is just a stop gap before the 8000 series/R600 series. The mid-range versions of these GPUs will max out these "current-gen" games easily.
I also remember the days when maxing games out was basically a given; however, games are becoming exponentially more complex. I always bear this in mind. Ever looked at Morrowind after playing Oblivion? For the sake of your eyes, don't -
Gaming is now becoming a money-hungry machine. Nvidia and ATI are making riches upon riches, I'm sure, due to the constant stream of upgrades they make available. I remember when I was shopping for a video card, the 9800 Pro was just unbelievable, and I couldn't afford it. Only a couple years later, the 9800 Pro has given way to the X series and the 7000 series of GeForce. It's becoming ridiculous. I always end up buying the mid-range cards (9700 Pro and now X1400) and it's great and all, but it bugs me that I can only stay in the low/medium end of it when the game has the potential to look so much better.
In terms of the poor game coding... I completely agree.
I like Half-Life 2 for this reason. The graphics to me were amazing, yet it required less than Doom 3 at the time, and requires less than a great deal of games that came out after it and still looked worse. The companies need to spend a heck of a lot longer tweaking their games so that they can run absolutely beautifully on a Voodoo3, and not force people to buy the new 7900GTXTXTXGTXTGXTX. I mean, I just got Oblivion and am playing it at levels of graphics that just aren't as good as it would be on the Xbox 360.
I'm sure it's possible. -
If you want to donate a large sum of money to them to make up for the lost profits, be my guest...
-
I think the gaming public will still have the obsession with SLI/Crossfire. I doubt it's going away any time soon.
lol, my Morrowind doesn't look so bad. Every texture in the game has either been personally replaced or switched out for a downloaded mod.
It's funny how much staying power some of these old GPUs have. I can run Star Wars Battlefront on my old GeForce Ti at near max graphics (at least that's what it says in the menu), but that's simply because it doesn't support all the newer technologies in the game. So it runs it in old DX7 mode, doesn't apply shaders to anything, and the environments drop off to 2D drawings after a certain distance. However, I try that on my 5500 and the game is a train wreck!
Better programming is a MUST.
BTW, Xbox Oblivion runs at roughly medium graphics in PC terms. -
Yes, I know what you mean. But I really wouldn't worry about GRAW's performance; that game is, whilst a good game, an absolutely terrible port of the X360 version.
I've been an avid PC gamer since 8 or 9 years of age, starting with Quake and Doom 2, but I've also owned all the major consoles: NES, Genesis, PSX, PS2, Xbox, GC, X360... and I always come back to my PC, all the time. In fact I rarely ever play my X360 anymore; I can't even remember the last time I played it, and I've only owned it a few months.
But as I was saying, GRAW and Oblivion are just poor ports and as such are not a true test of hardware. For example, I can run FEAR, Battlefield 2, HL2 etc. at high settings, but GRAW runs like a dog, simply because it's a terrible and poorly optimised port. There are also games coming out that require really heavy hardware and don't even look particularly good! It's just laziness on the developers' part.
As I was saying, I'm still using a vanilla X800 card. It's about 2 years old at this stage, and has been passed by the X800Pro, the X850XT, X1800, 6800GT, 7600, 7800, 7900 etc., and lots more.
But I can still run the new games at 1024x768 with options on mostly high, and that provides a fairly nice looking game, and I probably won't upgrade at all until I need a new PC entirely for the next gen of games.
As for my actual laptop... see sig. A GeForce 6400 isn't going to last the mile, but at least at the moment I can run all the new games on low, maybe stretching to medium in some of the newish games such as BF2... but again, proving the point about coding, I can run HL2 on high settings, where it looks beautiful and is very smooth, yet I can only run FEAR and Starship Troopers on lowish settings and they look pretty crap, far worse than what I can run HL2 at.
The major problem seems to be coding, coding, coding. If only every engine could be as efficient as Source, we'd all be happy. But then ATI and Nvidia would lose a lot of their business.....
-
I'm an old-school console gamer. I've been gaming on consoles all my life. I had an NES, an SNES, an N64, PlayStation, PlayStation 2, PSP, and every GameBoy ever made (except GBC). Only in the past year have I gotten into PC gaming, and really just in the past week since I got my new M90. I can say this much... PC gaming is very frustrating. With consoles, you never have to worry. The game always runs fine, because it was made to run great on what you have, and everyone has the same hardware. With a PC, it all depends on each person's machine. Even two people with the same computer will get different results just because person A has a couple of applications or five more processes running that person B doesn't, and so on and so forth. And there's nothing more disappointing than paying 2 grand for a nearly top of the line laptop and not being able to play a 6 month old game at max settings (FEAR). Don't get me wrong, I love F.E.A.R. I have been playing it a lot this past week. But I'm running pretty much a 7900GS, which is like the 3rd best notebook card on the market right now not counting SLI (I put it behind the 7900GTX and the 7800GTX), and I can't get higher than 15-20 fps at HIGH settings (not even MAX). It just sucks. And what makes it worse is that I know it could look very good on my good ole $150 PS2 and I would be getting a consistent 30 fps. Or I could buy an Xbox 360 for $400 plus this game, about 1/5 of the cost of my notebook, and it would look better at higher fps. It's all very frustrating.
And I completely agree about coding and ports. I can run Quake 4 and Prey at max settings, higher res, and get consistently above 30 fps. NEVER LAG. Halo on max settings stutters now and then because it just wasn't built for PC. And as I said, F.E.A.R. stutters all the time on my rig. I would go so far as to say that Quake 4 looks better than F.E.A.R. in a lot of respects, including textures. Granted, the lighting effects aren't quite the same, but it's **** close. It just seems such a waste to be a PC gamer these days. Consoles are catching up, and with the next generation we're not only looking at the same graphics, but also downloadable patches and mods and everything PC gamers have been enjoying for years. And people say that the PS3 is too expensive...
If things don't change within the next few years, I am quite sure I will stick to just consoles and save my PC gaming for games that don't come out on the consoles. I have just been so disappointed by PC gaming. -
I'm waiting for the next PCI Express, or maybe an AGP Express, so that dual core GPUs can run at an optimal state. But what I'd like to see is a smaller GPU board that doesn't compromise on power. Then we could sneak that into laptops.
Like a 13.3 inch lappy with an X1900. Dream on.... -
Charles P. Jefferies Lead Moderator Super Moderator
The benefit of PCI-express is that its bus can have data flowing in two directions at once, versus AGP, which is unidirectional.
-
Meaker@Sager Company Representative
Mid-range means exactly that: mid range. If you want to play at the highest settings, buy high end.
-
Meaker@Sager Company Representative
There was a time when you could, and it kind of negated the high end, but it was a short time; both before it and after it, you could not.
-
Hmm, I tend to agree.
But having said that, everything I use, from AutoCAD 2006 to Oblivion to FEAR etc., has run fine; maybe not on max settings, but still on high settings, and certainly enough to have the eye candy on with good framerates.
And this is from a decent but still "midrange" laptop which didn't cost an arm and a leg.
I don't see it as a problem at this stage, especially now that SLI/Crossfire setups are becoming much cheaper along with dual core tech. -
Dustin Sklavos Notebook Deity NBR Reviewer
I think you guys are being unrealistic. In my opinion, modern cards have the most absurd surplus of power I've seen of any generation. It's not the games forcing the technology to advance, it's the technology forcing the games to advance.
You can't cite poorly coded abominations like Oblivion and F.E.A.R., neither of which is terribly attractive compared to more efficiently coded games like Far Cry, Quake 4, and Half-Life 2. For what it's worth, all Oblivion is, is a mess of pixel shaders piled on miserable art direction.
Also look at what modern graphics cards are being benched on. 1280x1024 used to be top-of-the-line high res. SLI rigs are being benched at 2048x1536 for crying out loud. And all these cards are being benched with Anti-Aliasing and Anisotropic Filtering, two technologies that weren't even mainstream until the Radeon 9700 Pro, which is a lot more recent than you think it is. And these are just shiny bonuses by and large independent of the games themselves.
Most games these days can be run at 1280x1024 with detail settings nearly maxed out on midrange hardware. Heck, look at our low end and the GeForce 7300GT. That thing's a beast.
I don't have any issues with modern graphics cards or games. It seems like you guys are asking yesterday's midrange to run modern games at high or max settings, and that's just not happening, and it's an unrealistic request. -
Pulp, thank you for your input, you put up a very persuasive argument.
Personally, I don't think this is unrealistic. Yes, I agree today's cards have an insane amount of power, but it's not a surplus. Even high end cards are taxed to no end by some of the latest games. Even well programmed ones.
And I'm not asking yesterday's midrange to run today's games at highest. I know my current best video cards are last generation and I don't expect them to max everything out on today's games. I'm quite happy with the way my x700 performs, and especially happy with my 6800.
Hell, my x700 surprised me to no end with how well it runs Oblivion even. Res is 1280x800, high textures, 2xAA, medium/lowish fades, max view distance. Frames stay steady above 25; max is around 40. The 6800 runs the same settings and gets higher peak fps (about 50), but for some reason it can drop below 25. That might be an issue with the RAM though. It's only a gig in the desktop.
My concern is about where things are going. If midrange cards are starting to struggle now, who knows what will happen in another couple of years. -
I think the thing that really sucks for me when it comes to the choice of buying new cards is the types of games I play. I choose to play games like CS and BF2 where, in my opinion, settings do matter. The guy that's running at 800x600 with all settings low and terrain distance turned all the way down is gonna have a hell of a time trying to beat the guy with all settings maxed, because of the visibility of your area. I am always tempted to buy new cards and rigs hoping it will give me a little edge when I'm playing, and at times it has. Having a game run at around 30fps and drop to 8 when you confront an opponent can be devastating, so having it run at 60 and drop to 30 helps a lot. So yeah, I agree with you that it sucks that poor guys like us sometimes have to live with mediocre gameplay, or shell out half our souls for those better rankings online.
-
I think that PC game developers will come to their senses eventually. The PC is the most owned console in the world. (And when I say PC, I also mean laptop.) The PS2 is at maybe 70 million at the most, while the PC is possibly at 500 million. Let's say only a fifth of those are 'decent' rigs; that puts PCs at 100 million. When those devs require you to have these crazy cutting edge cards just to play their games, they're only targeting maybe 1 million out of that 100 million. That's 99 million potential customers they're cutting out.
Also, look at what the highest selling PC games are and what they have in common. The Sims, Half Life 2, and WoW are all good games (except Sims imo) that don't require insane hardware to run. Any non-gamer adult can buy WoW, install it and play it without worrying about what their comp's specs are. And that's what they've been doing, making WoW the most popular MMO out there, and making Blizzard very very rich. -
Hmmm, remember the Voodoo cards... How awesome it was to play a game with that card, coming from a 4MB Matrox. Voodoo was a small transition to something better, but they set the standard for sure.
A period went on with games offering half-and-half support for the Voodoo cards. Then the TNT arrived and turned everything upside down, and in the end it killed the Voodoo. At the same time the Celeron entered the market, and the 300A clocked to 450 on a 100MHz bus made our desktops, together with a TNT2 or a GeForce 256, monster machines that we could throw anything at. For some time, I think about 3-4 years, we were spoiled consumers in a market where hardware evolved a great deal faster than the software developers could... yeah, develop games to embrace the new standard.
This new standard hasn't exactly pushed the game developers to maximize the performance of their games, and that is imho what we are seeing today:
1. A market in explosive development and lots of consumers with ready cash
2. Hardware with loads of power
3. Developers trying to give the consumer what the consumer wants
If we do not accept or agree with terrible coding by game developers, then we shouldn't buy the game. We live in a world of supply and demand, and as long as the demand is there with no real punishment for those who code badly, the supply keeps coming.
I hate when I can't play a game with eye-candy, but then again I try to distinguish between what is poor programming and what is the fault of my hardware.
At least I believe that at some point in time we will see a shift again. When either new hardware enters the market with new standards (not just clones or small improvements as we see now) or we the consumers punish those who can't make a game that will work on today's hardware, we will see changes.
And to the member who spoke about ATI and Nvidia earning loads of profit: well, they're companies and their main goal is to earn money, and if the consumer is willing to pay for crap then they'll give them that. Don't tell me that you would sell only the best quality if you knew that many are willing to buy less. Again, simply supply and demand, and a huge pile of stockholders who want to see earnings as well ;-) -
Meaker@Sager Company Representative
If you have an ultra high end setup, the only reason you can't play at the highest details is poor coding; otherwise everything runs really well.
-
Well, it depends on how you want to play; some people want to play at 1600x1200, all settings high, with 4x AA, 16x AF and such.
Personally I'm for detail, not resolution. Give me 1024x768 with all the details and effects to make it look as it should, and it'll have good fps at this res, and that's all you need. -
I liked it when games did not have graphics options; all one could do was just run and play. Now it's run - configure - measure - play.
Yet, technology's march is only matched by capitalism's march, so don't get disillusioned. Get used to it. -
I don't think you can blame the GPUs of today. My opinion is that the RAM and storage are getting too slow for a decent gaming PC. DDR2 isn't much faster than DDR because of the latencies. Hard disks must also be faster than they are now. The speeds of RAM and storage aren't improving fast enough. I think that if you played some games with the same GPU but with (not yet available) DDR3 RAM and a faster hard disk, games would run more smoothly. Of course the CPU and, more importantly, the GPU are very much needed for games, but don't neglect the rest.
-
You guys just need to start playing on crap all the time so eventually you won't know the difference. Try playing BF2 or Doom 3 on a Radeon 9000IGP. Hell, I have no idea what I'm missing!
-
Buy an Xbox 360, problem solved.
The only genres you'll be missing are RTS and MMOs, and your current laptop will manage them just fine for the foreseeable future.
To be honest though, I think your point is a little off. Apart from rare exceptions like Crysis, most games are optimised to run quite well on mid range cards, perhaps not at the absolute highest settings, but pretty close. Setting aside Oblivion, F.E.A.R. and Call of Duty 2 (which actually has a multitude of settings to get it running on any old hardware), pretty much every current game will run on high settings on the current midrange offerings (and even those games can get pretty close). For every F.E.A.R. there's a Half Life 2 that looks beautiful on any graphics card from the last 5 years. -
Yeah, the RTS thing is really the only reason I own a laptop. And as you say, a 2 year old R3000 is doing just dandy, even with AOE3.
-
Here's my take: once you get your own house or apartment after college, if you want to PC game, get a desktop! Get a Lenovo ThinkPad for work, if you need it, and just leave a desktop at home. Desktop PCs are upgradable, have a wider selection of hardware, and mainly, they're cheaper.
Also, you don't need to get SLI/Crossfire anymore, now that there's the Nvidia 7950 GX2 1GB dual GPU graphics card. It's the fastest single graphics card on the market. It's only $550, compared to $1000+ motherboard/SLI/Crossfire combos. -
I bring my laptop to go play games with my friends. I don't want to lug my desktop all over the place, and that way I have all my nifty toys with me too.
And what about two 7950 GX2's in SLI? Hmmm? Then you're not just cooking with gas, you've got some napalm mixed in there -
Man, that would be worth it... 2GB of video memory... *drool*... that's basically 4 graphics cards, top of the line graphics cards... you could do uber gaming with that thing...
-
For about a year. No seriously, I bet people were saying this about the "next generation" when the first gen Voodoos and whatnot came out. "It would be great to have this much memory, or that fast of a GPU." I don't think it's ever going to end. Maybe one day they will figure out how to re-process real life at a higher resolution... then they would have something!!!
-
Keep in mind, my rant is about desktops mostly.
The GeForce 7950 GX2 itself IS SLI; it's two cards in one. (I find it funny that everyone calls it the most powerful single card just to get back at ATI, but when you have 2 of them they're very eager to call it quad-SLI...) You have to pay quite a lot of money to get the card. You also need a motherboard to go along with it, which could add another $120 or more for a high quality one. Which plays into the "gaming is becoming a lot more expensive" rant.
Performance today is a sketchy subject, but the hard drive only affects loading times, not the overall gameplay experience. Today's CPUs also provide a huge surplus of power, and RAM is quite fast too. Granted, it's still slower than it should be (comparatively speaking, against the rest of current mainstream technology). But AMD is planning to use DDR3 memory controllers in its next generation CPUs, so we'll see what that does for performance. (Take THAT Intel, lol)
-
You have all raised excellent points in the matter, especially TwilightVampire, Chaz and Pulp.
Gone are the days when you could rely on a single GPU to max out games for years. It would be nice to see them return... I had a GeForce2 64MB card that did well for 2 years or so. I then had the GeForce4 Ti 4600, which maxed every game thrown at it for over three years until it burned out.
I'm guessing the desktop 7950 GX2 would be enough to do well for a few years, with the exception of running DX10 options (whenever they may show up). But judging by the recent trend that Twilight is pointing out, I doubt you could max anything out in 3 years even with that monster.
And to comment on the "gaming is becoming a rich man's arena" portion of the discussion: remember that crazy special-limited-edition $10,000 Dell desktop? The one with 4 GPUs (I think they were 7800 GTXs)? Obviously top-end performance is being reserved for those with disposable income.
The Current Trend of Gaming and Graphics
Discussion in 'Gaming (Software and Graphics Cards)' started by TwilightVampire, Jul 2, 2006.