Other than the fact that it's one of the first four core games, with Crysis, Portal, and Half-Life 2: Episode Two being the others, what do we really know about it?
All we know is that it comes out next year, and it has amazing graphics...
http://xbox360media.ign.com/xbox360/image/article/735/735772/alan-wake-20060927011957465.jpg
http://pcmedia.ign.com/pc/image/article/706/706459/alan-wake-20060509052003297.jpg
http://pcmedia.ign.com/pc/image/article/706/706459/alan-wake-20060509052004297.jpg
I apologize. hmmmm destroying a topic that I meant to help people save money with, by buying a bigger video card at first and then not buying another video card for several years, pissed me off. Forgive me.
-
I don't know who this article is directed at.
People who want to save already know to take the most cost-effective option, and that means waiting until the insane prices for the G80 subside a little. These folks also know that buying a top-of-the-line GPU just for the eye candy of a game that will have its price slashed in half or more after three months or so is not cost effective.
Note: Of the shots you displayed, I think only the first one will strain the CPU enough to need a quad core. The rest is all hardcore GPU work. Remember, quad cores are really needed for particle FX plus physics simulation. There is no way the CPU could assist with reflection effects or the like, since it never gets a chance to access the GPU's internal framebuffer or on-the-fly texture data.
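To illustrate that point, here's a minimal sketch (my own, not from any real engine; the Particle struct, counts, and update rule are all made-up assumptions) of how a game could spread particle updates across four cores with plain pthreads:

/* Hypothetical sketch: integrating a particle system across 4 threads. */
#include <pthread.h>
#include <stddef.h>

#define NUM_THREADS 4
#define NUM_PARTICLES 100000

typedef struct { float x, y, z, vx, vy, vz; } Particle;
typedef struct { size_t begin, end; } Range;

static Particle particles[NUM_PARTICLES];
static const float dt = 1.0f / 60.0f;   /* fixed 60 Hz simulation step */

/* Each worker integrates its own slice of the particle array. */
static void *update_slice(void *arg)
{
    Range *r = (Range *)arg;
    for (size_t i = r->begin; i < r->end; ++i) {
        particles[i].vy -= 9.8f * dt;            /* gravity */
        particles[i].x  += particles[i].vx * dt;
        particles[i].y  += particles[i].vy * dt;
        particles[i].z  += particles[i].vz * dt;
    }
    return NULL;
}

int main(void)
{
    pthread_t threads[NUM_THREADS];
    Range ranges[NUM_THREADS];
    size_t chunk = NUM_PARTICLES / NUM_THREADS;

    for (int t = 0; t < NUM_THREADS; ++t) {
        ranges[t].begin = (size_t)t * chunk;
        ranges[t].end   = (t == NUM_THREADS - 1) ? NUM_PARTICLES
                                                 : (size_t)(t + 1) * chunk;
        pthread_create(&threads[t], NULL, update_slice, &ranges[t]);
    }
    for (int t = 0; t < NUM_THREADS; ++t)
        pthread_join(threads[t], NULL);
    return 0;
}

Whether a given game actually scales to four cores this way is up to the developers, of course.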
You really should read some articles or books about how 3D effects are implemented before you start handing out recipes.
And quit bashing hmmm, would you? He has gotten on my nerves more than once, but that never made me say anything about his behavior in his own thread. (Plus, many of his posts had a point.) -
The game is a combination of Oblivion's miles of grass, trees, and Qarl-style texturing, combined with Doom's stencil shading.
There's no way this will run on just any current GPU; you'll need at least a 7900, or something from DX10.
If you guys honestly think you'll be able to play on low settings, you're in for a rude awakening.
That was the entire point of my topic: preparing for the future, for games like this.
This game will be murder on any pc that it comes across.
But hey, if you'd rather ignore me and find out the hard way, what's stopping you?
I can assure you though, if you go into Alan Wake with anything less than a DX10 card, you're gonna look like a bigger fool than GW Bush going to an anti-gas march. -
usapatriot Notebook Nobel Laureate
That's OK, but this is even better.
http://www.totalcrysis.com/forums/showthread.php?t=415 -
Nice shots.
You could say you wouldn't want to play Alan Wake, but the problem is, just as with HL Episode 2 and Crysis, all games are ending up looking like this...
Dynamic lighting, huge world, realistic graphics...
It's either a big DX10 card or bust now.
Unfortunately, that's become the problem with PC gaming. Hopefully it'll calm down as we get closer to photorealism.
The only other option is to get an average PC and an Xbox 360 to play these high-res games.
But really, with games like Alan Wake coming out, you honestly do need high-powered hardware for them now.
I could possibly be entirely wrong. Alan Wake could come out and, by some miracle, be able to run on any PC.
But, considering it has the combination of Oblivion and Doom, I have a very bad feeling about it.
You may wanna wait on my guide... wait until around the time Alan Wake comes out, or close to it... By then DX10 cards, or at least the 8800 series, will cost little...
But I have a feeling it will need a quad core and DX10, at the very least an 8800. -
The stencil buffer and its operations have been around since the earliest versions of OpenGL.
To reduce the rendered depth, the good old fog tricks are still being used.
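For anyone curious, the "fog trick" is just a few fixed-function OpenGL calls; here's a minimal sketch (the distances and color are made-up numbers). Geometry fades to the fog color with distance, so the engine can quietly cull anything past the fog end:

/* Minimal fixed-function OpenGL fog setup to limit visible depth. */
#include <GL/gl.h>

void enable_distance_fog(void)
{
    const GLfloat fog_color[4] = { 0.5f, 0.5f, 0.5f, 1.0f };

    glEnable(GL_FOG);
    glFogi(GL_FOG_MODE, GL_LINEAR);    /* fade linearly with eye distance */
    glFogfv(GL_FOG_COLOR, fog_color);  /* should match the clear color */
    glFogf(GL_FOG_START, 50.0f);       /* fog begins 50 units out */
    glFogf(GL_FOG_END, 200.0f);        /* fully fogged; safe to cull past here */
}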
It depends on the makers. That is, whether they want to make it playable on older rigs or not.
IMO the answer is yes; something that is much more fun than "pretty pictures" is the gameplay. The reason for EA's losing market share is not the fact that they cannot make good graphics, but rather their lack of a clue about the gameplay experience.
You are right. The reason I didn't buy a powerful GPU (I bought a cheap 7600GS) was that I wanted a DX10-ready rig. But again, I'm going to disagree about the "murdering" part of the theory (unless you add the qualifier "maxed settings"). -
You're right, but this isn't using fog.
This is as realistic as it comes.
Once again, I have a very bad feeling about how its performance will turn out.
Around the time Alan Wake comes out, the prices of the 8800 GTS will have plummeted; it may be best to buy one then. But I fully believe it will REQUIRE a quad core and an 8800 GTS. -
Notebook Solutions Company Representative NBR Reviewer
Zellio, I just love your topics about new games. They make me want an uber PC!
Anyway: those are some of the best screenies I have ever seen. Looks like a great game. Does anyone have some info about the storyline? -
Another thing about Doom that made it run faster was its poor-as-hell modeling.
Look closely at the modeling, at the texturing, etc.
Horrid.
The only thing it had going was stencil shading.
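For reference, that stencil shading boils down to two stencil passes over shadow-volume geometry. Here's a rough fixed-function OpenGL sketch of the depth-fail variant Doom 3 popularized; draw_shadow_volumes() is a hypothetical helper that renders the extruded volumes, and the scene's depth buffer is assumed to be filled by an ambient pass first:

/* Sketch of a depth-fail ("Carmack's reverse") stencil shadow pass. */
#include <GL/gl.h>

void draw_shadow_volumes(void);  /* hypothetical: draws extruded volume geometry */

void stencil_shadow_pass(void)
{
    glEnable(GL_STENCIL_TEST);
    glEnable(GL_CULL_FACE);
    glDepthMask(GL_FALSE);   /* keep the depth buffer from the ambient pass */
    glColorMask(GL_FALSE, GL_FALSE, GL_FALSE, GL_FALSE);
    glStencilFunc(GL_ALWAYS, 0, ~0u);

    /* Back faces: increment stencil where the depth test fails. */
    glCullFace(GL_FRONT);
    glStencilOp(GL_KEEP, GL_INCR, GL_KEEP);
    draw_shadow_volumes();

    /* Front faces: decrement stencil where the depth test fails. */
    glCullFace(GL_BACK);
    glStencilOp(GL_KEEP, GL_DECR, GL_KEEP);
    draw_shadow_volumes();

    /* Pixels left with stencil != 0 are in shadow; the lighting pass
       then draws only where stencil == 0. */
    glColorMask(GL_TRUE, GL_TRUE, GL_TRUE, GL_TRUE);
    glDepthMask(GL_TRUE);
    glDisable(GL_STENCIL_TEST);
}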
If you took out the stencil shading, it would look like Quake 2 with a few more objects lying around. -
Alan Wake is targeted for a next-year release. In a year's time it will be a completely different ball game, with totally different prices for processors and GPUs.
It makes more sense to get a card that can play a game at optimum settings after the game is released than to speculate and pay a lot of $$ extra.
Remember Doom 3, which was targeted at the Nvidia FX series during development? It ran a lot faster on much, much cheaper 6xxx-series cards like the 6600GT when the game was finally released.
Anybody who upgrades GPUs/CPUs in anticipation of a game stands to lose a lot of money, I say.
Just my 2 cents... -
Dustin Sklavos Notebook Deity NBR Reviewer
I'm getting really tired of this.
First of all, Alan Wake doesn't even actually look THAT good. It looks like a trumped-up version of the Source engine (a notoriously undemanding engine). But then, I still think Oblivion looks awful, so maybe I'm not the one to ask.
Second of all, Zellio, do you ever actually use middle-of-the-road to low-end hardware? Or are you just convinced everyone needs at least the second-from-the-best hardware in the world to run a game halfway decently?
Dual core is barely utilized in games right now, and you want to tell me quad core or bust? Games aren't even CPU limited right now (though monsters like the 8800GTX can be). So now we're going to spread this middling CPU load on two more cores. The facts are:
1. There's only so much work that can be done on the CPU.
2. Coding for the high end is suicide.
Oftentimes when it gets to the highest settings in games, the difference between "better" and "best" is barely visible (see High vs. Ultra Quality in Doom 3 engine games, or High vs. Very High in Far Cry, etc.). People running 7600s or even 7300GTs aren't going to wet themselves in terror at the possibility of trying to run these next-gen games on their systems, when honestly these games'll probably run on old Radeon 9800s. Even now, the only game that won't run playably on the venerable Radeon 9600 is Rainbow Six: Vegas, and only because the 9600 doesn't have Shader Model 3 support.
OH NOES I MIGHT HAVE TO TURN DOWN SETTINGS
Give it a rest, man. Some of us can live without AA and HDR (which never struck me as being worth the performance hit it causes). -
Pulp, I know exactly what you're talking about. The thing is, most people don't want to see it as it is, and it's all about business, guys, about making money. We're being fooled by all that next-generation stuff. We should ask ourselves: is it really worth being called next generation? How do you define "next generation"?
In my opinion, the only true next-generation leap in the gaming industry was the leap from 2D to 3D graphics, and that's about it.
Sure, great graphics make it easy on the eyes, and I definitely want better graphics, but I still can't shake the feeling that we're being treated like fools.
Zellio, wait until they release it (Alan Wake) to see for yourself. Believe me, there is no way the game will look like these pictures, just like most, or dare I say all, games released before it. I was pretty disappointed with Oblivion; I mean, the game looks nowhere near as good as in the screenshots, not to mention the crappy performance.
One last word: does anyone still remember the Killzone (PS3 title) trailer, or should I say the "in-game footage"?
-
I COMPLETELY disagree! If that were the case, then there is no way the Xbox 360 would be able to run it, seeing that the 360 only has the equivalent of a 7800. So... I'm betting that on the 360 it will run at high res with high graphics settings, and it will be fine. So, the same with a computer: you just have to have at least a 1.83 GHz dual core and 2 GB of RAM to match the 360's power. So I'm thinking that you could get by with A LOT less.
-
I'm right there with ya!
-
IIIM3, actually, the GPU in the Xbox 360 is the Xenos.
It is actually quite a powerful GPU, better than the 7800, and it features unified shaders, though no DX10 support.
The PS3 is the one that has the 7800 GPU derivative.
I agree with you and Pulp though.
Especially the "OH NOES" I have to drop AA and AF or go from Ultra High to High.
Seriously, I LOLed when I saw that Zellio said I ruined his thread.
I guess when he makes a thread for World in Conflict, he'll say IIIM3, Pulp, and I ruined that one too.
-
Yes, you can take a look at "the Ultimate ..." in my siggy.
The Xbox 360 GPU is finely tuned for console graphics (it does not support huge resolutions like 2500x..., while the eDRAM makes it easier to do framebuffer-based effects like AA and HDR), and it also has unified shaders. And according to Wikipedia, the oft-quoted "1.8 TFLOPS" of GPU calculation power seems to actually be less than 400 GFLOPS.
Don't count me out.
-
Actually, I'm almost 100 percent sure that the 360 has the equivalent of a 7800, but yes, it is finely tuned just for gaming. Still, I'm sure my computer will run Alan Wake. OH NO, LOW SETTINGS!!!!! OMG, IT MAKES ME WANT TO SHOOT MYSELF BECAUSE I DON'T CARE ABOUT GAMEPLAY, I JUST WANT IT TO LOOK GOOD!!! THE GAME CAN SUCK ASS BUT IF IT LOOKS GOOD THEN IT IS A GOOD GAME!!!
Except I care about gameplay, not graphics. -
Did you even look at my siggy?
You can also look at Wikipedia's page for Xenos. In fact, they say it's more similar to the R600 due to its unified-shader approach. But the RSX is based off the G70 (a.k.a. the 7800).
http://en.wikipedia.org/wiki/RSX_%27Reality_Synthesizer'
http://en.wikipedia.org/wiki/Xenos -
Ahhh... I see. When I opened my Xbox up and saw that it ran off 512 MB of memory, I assumed the highest it could have been was the equal of a 7900 GTX, though I figured they wouldn't put something that powerful in it. But I've been proven wrong.
-
Wouldn't even think of it.
Man, the Xenos is awesome: unified shaders and the like (much more advanced than the 7800 derivative in the PS3), and the eDRAM basically means free (no impact on FPS) AA and AF.
Wish I had an Xbox 360 -
But the problem with the Xbox 360 is the quality of the components. I have heard they wear out sooner than they should. And worst of all, they overheat, which requires some of the connections to be re-soldered.