The 8800 GTX can't even run Crysis with everything maxed out, and it's meant to be one of the best, if not THE best, graphics cards on the market. Why is it that people have been upgrading for this game and even their new graphics cards can't handle the load?
Is it the fault of Crytek or NVIDIA?
Does it mean we will soon be forced into buying a dual or even quad card setup? Have graphics cards reached a limit?
The 9 series is coming soon, I hear... yet there are rumours about whether even the early cards in that line will be able to handle Crysis.
-
God should give ATI an Adam or Eve (depending on whether you think ATI is M or F), so they could start evolving into the stone age.
-
The Crysis developers want to make a game that looks great now, *and* looks great next year.
The only people seeing a "fault" here are those who believe it's a problem if they can't run a game at max detail levels.
Crysis is basically aimed at next year's GPUs. It still runs on today's, but you have to compromise a bit. On the other hand, a year from now, you'll be able to play Crysis and get better graphics than you have now. Isn't that a good thing?
Would you rather they just removed those advanced features completely? So that you could get the warm fuzzy feeling of "being able to max everything now"? The graphics you'd see would still be the same. You'd just have gotten rid of the extra eyecandy that'd be possible on future cards.
You talk like someone has "failed". I'd really like to know how you arrived at that conclusion. It really is beyond me how it can possibly be a bad thing that the Crysis developers added a few extra goodies for future hardware. -
People buy the 8800 GTX because for the past year it has gotten absurdly good performance in just about every game. Crysis is extremely demanding, and rightfully so - it is one of the best-looking games to date. Crysis is also one of the only PC-exclusive AAA titles, so the developers don't have to limit the graphics for the consoles (the way, say, Bioshock, UT3, and the Orange Box do).
Crytek has said that Very High settings weren't designed to be playable on today's hardware. But everyone still acts surprised when they can't run Crysis on Very High settings at 2560x1600 on their precious 8800s (a card that is already one year old, which is an eternity in video card time). But on High settings at a more sane resolution (1680x1050, say), Crysis runs fine. It's not that the game isn't optimized; it's that 8800 owners are so used to cranking everything up to max settings that they get upset when maxed-out Crysis challenges their rig.
Crysis is designed so that 3 years from now, when everybody has their 10800 GTXs or whatever, it will scale up and still look good compared to future games. -
Sorry to inform you, but the 8000 series is a joke, more like a test run for NVIDIA and M$; the 9000 series seems like it's the real sequel to the 7000s.
http://forums.slizone.com/index.php?showtopic=6526
Compare that to the 8800, lol.
*Of course that's the desktop model, but it serves as an indication of what the mobile model will follow -
The Forerunner Notebook Virtuoso
^ Are you kidding me? A single 8800 GTS is more powerful than 7950s in SLI. And that's just the GTS; the GTX and Ultra are twice as powerful as SLI'd 7950s.
So yes, the 9 series being twice as powerful is reasonable. The higher-end 8 series blows away the 7 series. The mid-range, not as much. -
-
What graphics cards do you think Crytek tested and played Crysis on? Must have been dual 8800 GTXs, right?
-
Does it even matter...? As all of the points above suggest.
-
-
I was actually mad at you for making such a silly comment till I read...
What is with people lately? Did everyone on NBR have a run of bad days all at the same time? Did we all sync our monthlies or something? -
-
I'm not throwing a tantrum or complaining, just opening a portal for discussion about Crysis and next-gen graphics.
-
Graphics will get better as computers get better, at an almost exponential pace. It's near impossible to predict the future of computers.
"In the future, computers may only have 1000 vacuum tubes and only weigh 1 ton." -
Excuse me if I came off harsh, but the 8000 series feels like a test drive to me.
M$ forced DX10 down everyone's necks, and it hasn't even been proven to be more efficient than DX9, not to mention that DX10.1 is supposed to come out and not work with DX10 cards (I really hope I've got my facts wrong on this).
This series seems to be focused on compatibility, not innovation (not a bad thing, though) -
The Forerunner Notebook Virtuoso
Test drive? This generation's 3rd (4th now, thanks to the new GT) most powerful GPU provides more performance than an SLI pair of last generation's most powerful GPU. I don't see how that's considered a test drive.
DX10 or not, the performance of the 8 series is still there, and yes, DX10.1 will not be compatible with DX10 cards.
Elaborate on your compatibility vs. innovation comment please, I don't know what you mean exactly. -
The 8800 GT is a beast and can outperform 7950 SLI, but I'm not trying to get that.
Since I went on the market for a new notebook, I've been getting crap thrown all over me by the 8400-to-8600 range: there's the 8400G, 8400GS, and 8400GT, and they all perform nearly the same, give or take 200 in a benchmark.
What's the point in pushing out all these cards that only do slightly better than the last-gen version, and then having one super card? Why are there so many crappy mid-range cards available? Why not just push out a GeForce 7980 (lol)?
Like in Crysis: there are people saying that when you run it in DX10 mode on XP (through a rather simple hack), it runs better than in Vista... i.e. these cards are super-powered just to carry the load of a bloatware OS.
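(For what it's worth, the version of that "hack" floating around is just a config tweak, not a real DX10 unlock - roughly something like the autoexec.cfg below, dropped into the Crysis install folder, assuming I've got the retail sys_spec cvar names right:)

-- autoexec.cfg in the Crysis root folder (XP / DX9)
-- forces every quality group to 4 = "Very High"
-- first allow restricted cvars to be set:
con_restricted = 0
sys_spec_ObjectDetail = 4
sys_spec_Shading = 4
sys_spec_Shadows = 4
sys_spec_Texture = 4
sys_spec_Water = 4
sys_spec_Particles = 4
sys_spec_Physics = 4
sys_spec_PostProcessing = 4
sys_spec_VolumetricEffects = 4
sys_spec_GameEffects = 4
sys_spec_Sound = 4

(All that does is force the Very High presets under DX9; it doesn't actually enable the DX10-only effects, which is exactly the point made a couple of posts down.)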
Now I'm digressing because I don't know how to handle this dispute... maybe my anger's not so much at the 8000 series, but at the 8000 series being the *DX10 series that was supposed to be the holy grail of all GPUs (which up until a bit ago didn't have XP drivers *argh*).
*and I meant compatible with Vista -
The Forerunner Notebook Virtuoso
Well, the 8400 line are not mid-range cards; they are low-end. They are meant as a step up from integrated graphics for some light gaming, or simply to handle Aero better and make use of the PureVideo technology. The 8600M is a good card. I can play any new game well at pretty good settings and resolutions, with the exception of Crysis, which I can still play reasonably well at 1280x1024 with everything on high, medium shaders, and low shadows. Call of Duty 4, Bioshock, STALKER, the Orange Box, etc. all run great.
The XP vs. Vista question is debatable. The FPS differences are negligible, and mainly due to drivers that are still not 100% optimized for Vista. The gap is rapidly shrinking, though, and becoming close to nil. Also, those people are unlocking DX10, yes, but not really: they are unlocking the Very High settings, but they are not taking advantage of the actual DX10 features in the game, because you cannot use those without the DX10 runtime. They are missing some of the advanced physics as well as graphical features. What they are experiencing is not true DX10.
I'm gonna steer clear of the DX10 issue because I sort of agree with you. DX10 has shown me some glimmers but nothing overwhelming yet. Well, we will see, since only a few games have taken advantage of DX10 so far, Crysis being the one that has used its potential the most.
I see where you're coming from, but I just disagree on the point that NVIDIA's hardware didn't deliver. NVIDIA did deliver; it's DX10, whether the fault is Microsoft's or the developers', that did not deliver. I agree with that. -
-
-
Yeah, I'm really happy with my 8800 GTX and wouldn't call that card a test run. It still runs every game I throw at it maxed, except Crysis, which I run on High with most options on Very High in DX10, and the game plays great - not in the 40+ fps range, but the 30 fps range.
I will probably keep the card another year, or get another 8800 GTX for SLI, but as it is for now, I'm happy.
Take a look at another DX10 game, World in Conflict: it is just beautiful in full DX10 glory with maxed effects, and it's smooth as hell. Now, my screen is limited to a max of 1440x900, and I'm comfortable with that.
I don't cry because Crysis can't run with all options on Very High -
You could easily flip this argument and blame Crytek for designing a game outside the scope of available hardware, forcing people to upgrade. That is not the fault of the hardware manufacturers, is it? Doesn't Crytek have an obligation to the consumer to design a game that can be played on much lesser hardware and still be incredible? Like Blizzard, for example.
Either way, it's a give-and-take industry. If companies didn't design games like Crysis, the video card industry wouldn't be forced to design bigger and better cards and try to get them out faster. On the other hand, if NVIDIA didn't make fantastic cards like the 8800s, then game companies couldn't design much and would be forced to make substandard games. The way the industry is spinning is a perfect relationship.
So if I had an 8800 or a pair, I wouldn't be upset that I couldn't play Crysis with everything maxed... if I could, I would say that was a failure on Crytek's part. History shows that any good, new, bleeding-edge engine kills the available hardware of its time and gives the consumer something to look forward to. It's good to see a company creating such an engine, because it sets the bar for what will be required of GPUs in the future. They demand more power, and NVIDIA will provide it in due time.
I guess I would try to appreciate what we have available, because it's all we've got. It may not perform how we want, but that is no one's fault. It is just time and newer products taking their toll on the current tech.
Never bite the hand that feeds -
Your point?
Where's the "discussion" in your thread title?
For that matter, where's the discussion in your original post? Bashing NVIDIA and Crytek is not discussion.
And saying that Crysis should simply *remove* 20% of its eyecandy, just so that you'd be able to run with everything maxed, is not... sane.
Because that's what they'd have to do: reduce the graphics complexity. The graphics that *you* would see would be the same, because you have the same hardware. But the graphics seen by people two years from now would be worse. They'd see what you have now, instead of being able to enable extra eyecandy.
Where is the sense in that? Why should they artificially cripple their game?
They've intentionally added graphics that'll make it look good a year or two from now. I fail to see how that's a bad thing.
DX10 is also more efficient; there's no doubt about that. But at the moment, it's not really enough to justify developers focusing 100% on it. They still have 98% of the market on DX9 cards.
And who *cares* if DX10 was "forced down people's necks"? (Not that I think it was, because just about everyone has been *very* excited about it.) The GF8 series is *still* faster at DX9 as well. Your argument makes just as little sense as alkaeda's.
-
usapatriot Notebook Nobel Laureate
Buy two 8800 GTs and run them in SLI for the cost of one 8800 GTX and get almost double the power.
-
-
For all the great technical points mentioned, it all takes a backseat to profitability.
nVidia will gladly take a backseat to ATI year in and year out (and vice versa) as long as it makes more and more money. Usually better products mean better sales but not always.
We live in a world where profitability dictates advancement. We enthusiasts are the only people who care about performance, but if nVidia can't turn a bigger buck on a more advanced product, it ain't happenin'. So instead of saying nVidia should get out of the stone age, say "nVidia, to hell with being a profitable business!"