AMD has just released a report about the graphics engine Starcraft 2 will use.
http://ati.amd.com/developer/SIGGRAPH08/Chapter05-Filion-StarCraftII.pdf
I don't really know how to interpret this. Can anyone tell from this which mobile graphics cards will run the game decently?
-
It sounds like laptops without a dedicated GPU will have trouble playing, to be honest.
-
Sorry, all that doc explained was how SC2's graphics engine achieves the effects they needed. It's essentially highly technical (they used a deferred rendering algorithm... which I thought was not possible on DX9, or was it DX8.1? Either way, it allows a lot of dynamic lights in a scene, because the lighting cost grows roughly with objects + lights, whereas traditional forward rendering grows roughly with objects × lights, since every object has to be re-shaded for every light that touches it; see the rough sketch at the end of this post).
They also stressed that they are favoring GPU load over CPU load, and that they will be supporting the absolute bleeding-edge GPUs. Yet at the same time, their engine will be scalable.
They didn't give a hard number, but if I were to guesstimate, I'd say the minimum requirement would probably be around an NVIDIA 6600/6800, with a recommended spec of around an 8600. So, laptop gamers, if you have a relatively new laptop, you shouldn't have to worry about not being able to run SC2 at decent quality.
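Here's a toy comparison of the two approaches, just to show the scaling; every number below (object counts, pixel coverage, light footprints) is a made-up assumption of mine, not anything out of Blizzard's paper:

// Toy cost model (not SC2's actual engine): compares per-light shading work in a
// forward renderer (every object re-shaded for every light that touches it)
// against a deferred renderer (geometry written once to a G-buffer, then each
// light shades only the screen pixels it covers). All constants are assumptions.
#include <cstdint>
#include <iostream>

int main() {
    const std::uint64_t objects         = 500;    // e.g. ~500 units on screen
    const std::uint64_t pixelsPerObject = 2000;   // assumed average screen coverage
    const std::uint64_t lights          = 64;     // dynamic lights in view
    const std::uint64_t pixelsPerLight  = 10000;  // assumed average light footprint

    // Forward: every covered pixel of every object is shaded once per light.
    const std::uint64_t forwardWork = objects * pixelsPerObject * lights;

    // Deferred: one geometry pass fills the G-buffer, then each light reads it
    // back only for the pixels inside its screen-space bounding volume.
    const std::uint64_t deferredWork = objects * pixelsPerObject   // G-buffer fill
                                     + lights * pixelsPerLight;    // lighting passes

    std::cout << "forward shading work per frame  ~ " << forwardWork  << " pixel-light ops\n";
    std::cout << "deferred shading work per frame ~ " << deferredWork << " pixel-light ops\n";
    return 0;
}

With those made-up numbers, forward rendering does roughly 64 million pixel-light operations per frame versus under 2 million for deferred, which is why they can afford so many dynamic lights.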
-
You're not going to have the integrated Intel chipsets do any deferred rendering; it's just not possible. They don't have enough raw power for that. -
I wonder if an ATI 3470 with a score around 2600 in 3DMark06 (achieved at 1280x800) will be able to play StarCraft II at a good framerate?
-
Wow cool read. Thanks for the link (+1), even though I didn't understand half of it - lol.
But what I did gather is quite clear: scalability first. So in other words, it will work great with lower end GPU machines but with limited detail, or if you have the horsepower, it will render as much detail as it can muster. -
-
500 units at one time............release date anyone? -
-
-
It's Blizzard. I don't expect the graphics will be too taxing on lower to mid-range systems.
-
Well, as htwingnut said, and according to the link:
"A major design goal is the scalability of the engine. Blizzard games are well known for their ability to support a range of consumer hardware, including fairly outdated specifications, while always providing a great player experience."
So it seems to be designed for both ends of the GPU spectrum (or so they hope), though I'm sure IGPs will probably only run it at lower settings, depending on how graphically intensive they decide to make it. -
Before reading this, I'd always assumed that Blizzard would set the minimum requirement at the Montevina IGP to attract as many people as they can, so I'm just kinda shocked. -
Don't be shocked yet, Hahutzy. The game isn't released. Blizzard knows what they're doing, and (a) won't risk isolating a large potential fan base and (b) won't sacrifice the quality of the product to avoid doing so.
Blizzard is a rare breed, but their quality is what has allowed them to brave the rocky waters over the years. Most other publishers seem to finally be following suit; I think there's less junk being released in general and better-quality products going out the door. -
True about Blizzard, but it still looks demanding. Looks like you're not going to play it with an integrated GPU.
-
ViciousXUSMC Master Viking NBR Reviewer
It's going to require a certain level of 3D graphics power, as everything has moved to 3D now. StarCraft was pixels (and I do miss 2D/pixel games..), but now everything is polygons, so the best they can do is strip out the detail (lighting, shadows, reflections) and just render bare polygons with low textures. If at that point it's still too hard for integrated graphics, what do you want them to do? Remake the game with 2D sprites?
One thing I expect may happen is that it's going to hit some people hard on the CPU power needed to play.
All those guys that hopped on the low-CPU, high-GPU bandwagon, like the Gateway with the 8800 and the 1.6GHz CPU. I think this may be a rude awakening when they find their CPU maxed at 100% and not enough CPU power left over to feed the GPU properly. Not saying it will happen for sure, but strategy games are one of the few genres where it can happen, due to the massive amount of CPU resources needed for the AI. -
True about the CPU. If I read the article correctly, SC2 will rely more on the GPU than the CPU.
-
I think it can run on the lowest settings on an ATI or NVIDIA IGP.
No hope for Intel IGPs.
I tried to run Warcraft 3 with an X3000 and was getting less than 20 fps on the lowest settings (10 fps in battle), but that was in Vista. My old Toshiba with a 1.7GHz single core and a 950 IGP on XP ran smoother than the dual core with the X3000 in Vista.
Hate Vista.
I hope my 6831 can run it in high detail with 100+ units on one screen. -
Look in section 5.2:
"This meant supporting a wide range of hardware, from ATI RadeonTM 9800/NVIDIA GeForceTM FX’s to the ATI RadeonTM HD 4800s and NVIDIA GeForceTM G200s, targeting maximum utilization on
each different GPU platform"
So the minimum specs are either a Radeon 9800 or GeForce FX(5?XX) level GPU? -
No, it means the engine will be able to scale from lowest end GPUs to highest end GPUs, instead of having a "max settings" that doesn't utilize the full power of high end GPUs.
-
Blizzard has always made their games able to run on a broad range of GPUs. I have a good feeling that my 3200 with sideport memory will run this at decent graphics settings just fine. As for my Dell, that should max any Blizzard game for the next 2 years easy.
We have to remember that Blizzard goes for volume of sales and player base over extreme graphics. -
-
Shane@DARK. Company Representative
To answer Tony_A's new question: Possibly, but not ones that predate the 9800 too much. -
So I'm guessing an 8600M GT will run this with fairly high graphics?
-
Shane@DARK. Company Representative
@redda2: Yes, it will
-
-
Did none of you actually pay attention?!
Their design goal was to have the engine scale down to run on very outdated cards such as the GeForce FX and Radeon 9800, yet at the same time be graphically competitive with other current games and take full advantage of current hardware (GTX 200, HD 4800 series). This clearly means that the game will have settings capable of running on all hardware, and it will also have settings that push the high-end cards. Don't expect to be running maxed-out settings at 1920x1200 on your 8600GT when they just straightforwardly stated that the game's high-end graphics will require a high-end card.
You cannot assume that a game company whose titles have always run well on all hardware will always keep doing so. Their last game was World of Warcraft, a game created entirely in DirectX 7. When WoW came out, the graphics were severely outdated, and so was DX7. This new SC game will take the leap from DX7 to DX9, and DX10. I would expect it to run well on all sorts of hardware, granted that the settings are correct for that particular GPU/CPU setup. But do not expect to max the game out on your mid-range cards at high resolutions.
One more thing about that document that people might be interested in, in case no one understood it: the game's overall performance will be directly affected by how many stream processors the GPU has; the more, the better. When the game comes out, it should run MUCH better on ATI's new 4800 series GPUs than anything else out there. The engine will be using the GPU's stream processors for most of the tasks rather than the CPU. -
-
I already stated in the 3rd post that the 8600M GT will probably barely meet the recommended spec (if even that...), which means Medium at best... No way he'll hit High with his 8600M GT.
unknown*abunchofnumbers* said: ↑ This new SC game will take the leap from DX7 to DX9, and DX10.
SC2 is built on DX9, and they are NOT changing this fact any time soon (if ever). Please get your own facts straight before correcting others.
----------------
Also, if you use Pixel Shader 2.0 instead of 3.0 (i.e. you're still on a Radeon 9xxx or GeForce FX 5xxx series card), you'll see some performance hit, due to the fact that the vertex shader has to feed the pixel shader its data before any calculation can take place.
Another thing is that SC2 will be using A LOT of VRAM; they're storing multiple buffers for many different usages (traditionally, games keep one, or at most two, frame buffers in order to reduce VRAM usage). But most newer cards (even mid-range ones) are equipped with 512MB or more, so this shouldn't be a big problem for current adopters, but it might be a slight concern for earlier adopters (people with a 256MB version of the 8600M GT, like myself). Rough buffer math below.
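For a rough sense of the numbers, here's a back-of-envelope G-buffer size calculation; the render-target count and formats are my own generic deferred-shading guesses, not the actual layout from the paper:

// Back-of-envelope G-buffer VRAM estimate for a deferred renderer.
// The target count and formats below are illustrative guesses, not SC2's real layout.
#include <cstdint>
#include <iostream>

int main() {
    const std::uint64_t width  = 1280;
    const std::uint64_t height = 800;

    // Assume four RGBA16F render targets (e.g. albedo, normals, depth, misc)
    // at 8 bytes per pixel, plus a 4-byte depth/stencil buffer.
    const std::uint64_t renderTargets  = 4;
    const std::uint64_t bytesPerTarget = 8;
    const std::uint64_t depthBytes     = 4;

    const std::uint64_t pixels       = width * height;
    const std::uint64_t gBufferBytes = pixels * (renderTargets * bytesPerTarget + depthBytes);

    std::cout << "G-buffer alone ~ " << gBufferBytes / (1024.0 * 1024.0) << " MB\n";
    // Roughly 35 MB here, before textures, meshes, shadow maps and the back
    // buffer are counted, which is why 256MB cards start to feel tight.
    return 0;
}
-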
Halo360Fan said: ↑ True about the CPU. If I read the article correctly, SC2 will rely more on the GPU than the CPU.
unknown555525 said: ↑ No, it will not. -
It will run on a box of rocks; the poly count is similar to WoW's, just with some shiny effects and shadows added.
-
Warloque said: ↑ It will run on a box of rocks; the poly count is similar to WoW's, just with some shiny effects and shadows added.
That gave me a good rufflez.
Sorry pal, you must still be stuck programming the PS2's VU units in ASM, back when polies still mattered!
This is the age of shaders; polies don't matter as much as they did before. In fact, if you look at all the new PC/multi-platform engines, polies are a second thought, with shaders at the forefront (UE3.0, anyone? Gears of War and UT3 with massive shader effects, anyone?). Just look at Mass Effect, BioShock, and, to an extent, Crysis. They're all shader-bound, not poly-bound.
Except for select PS3 titles (where polies seem to fly out like rabbits out of a magician's hat), most titles on the PC and Xbox 360 are limiting their use of polies, with a heavy emphasis on shaders to increase graphical effect. You can only push so many polies before you're fill-rate bound and can't draw anything on them anyway.
Plus, deferred rendering lends itself well to shader effects (loads of lighting sources), and per-pixel rendering is most useful when you're rendering shader effects (of course, it benefits rendering lights, too).
It'll probably run on a box of rocks, with a GPU attached. Anything better than an IGP with Intel's name on it will probably run the game... just at a very low FPS.
Althernai said: ↑ However, it may very well be that the 8600M GT is near or past the point where increases in GPU power will give you diminishing returns, and then the game will look quite good on it anyway.
Eh... Thing is, SC2 isn't just a graphical showboat that's using deferred rendering for everything. On my un-OC'd stock 8600M GT (DDR2), I can have about 1.4K lights in a deferred-rendered scene before I see some slowdowns (i.e. it's probably dropping below 30-40 FPS); rough math at the end of this post. This doesn't even include rendering many polies yet. Per-pixel rendering is generally simple, but it does take more computation resources. Depending on how they integrated their forward renderer (it looks like they're rendering a separate set of RGBs for translucent objects), the performance hit could be anywhere from large to minimal.
Shadows will still be costly, and so will translucent objects, but if you avoid even one of those two features (say, by turning off "Shadows" in the video options, since you probably won't be able to fiddle with translucency), you should see a marked performance increase.
Anyways, this is just rambling from a Joe Average who's worked with a graphics rendering pipeline (deferred rendering specifically, actually), so don't quote any of my numbers as being solid.
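Since I threw out that 1.4K-lights figure, here's roughly the kind of back-of-envelope math behind it; every constant is my own assumption for illustration, nothing from Blizzard's document:

// Toy estimate of deferred lighting cost: each light only shades the pixels
// inside its screen-space footprint, so lots of small lights can stay cheap.
// Every constant here is an assumption picked for illustration only.
#include <iostream>

int main() {
    const double screenPixels = 1280.0 * 800.0; // laptop-ish resolution
    const double lightCount   = 1400.0;         // the "1.4K lights" ballpark
    const double avgCoverage  = 0.002;          // assume each light touches ~0.2% of the screen
    const double opsPerPixel  = 40.0;           // assumed shader ops per pixel per light
    const double targetFps    = 30.0;

    const double opsPerFrame  = screenPixels * avgCoverage * lightCount * opsPerPixel;
    const double opsPerSecond = opsPerFrame * targetFps;

    std::cout << "lighting ops per frame  ~ " << opsPerFrame  << "\n";
    std::cout << "lighting ops per second ~ " << opsPerSecond << "\n";
    // A few billion ops per second: within reach of a mid-range GPU, which is
    // why many small deferred lights are far cheaper than they first sound.
    return 0;
}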
-
Triple_Dude said: ↑ Shadows will still be costly, and so will translucent objects, but if you avoid even one of those two features (say, by turning off "Shadows" in the video options, since you probably won't be able to fiddle with translucency), you should see a marked performance increase.
-
Triple_Dude said: ↑ That gave me a good rufflez. Sorry pal, you must still be stuck programming the PS2's VU units in ASM, back when polies still mattered! This is the age of shaders; polies don't matter as much as they did before. [...] They're all shader-bound, not poly-bound. [...] It'll probably run on a box of rocks, with a GPU attached.
If you think that WoW and SC2 have the same poly count as UT3 and Crysis, you massively fail. -
I'm being quite optimistic here, but I hope it'll run on a 4500MHD!!!
-
I'm betting StarCraft will need an 8400GS MINIMUM
-
I have to take back what I said about needing a dedicated GPU.
Puma may be able to handle it just fine with 3GB of RAM. -
Triple_Dude said: ↑ SC2 is built on DX9, and they are NOT changing this fact any time soon (if ever). Please get your own facts straight before correcting others.
Althernai said: ↑ What it actually says is (page 136): "We chose early on to stress the GPU more than the CPU when ramping up quality levels within the game." Later on, they write that the reason for this is that players can have up to ~500 units on the screen at a time (or, presumably, only 0-10 units), so it would be a bad idea for increased graphics to stress the CPU, as this makes it impossible to use the processor fully in both high and low unit situations.
Actually, it might; it just depends on what your definition of "fairly high graphics" is. It is quite clear that it will not be possible to max out the settings at 1920x1200 on an 8600M GT, but you don't need to do that for a game to look good (in fact, I cannot do that at all, as the native resolution of my display is 1280x800). However, it may very well be that the 8600M GT is near or past the point where increases in GPU power will give you diminishing returns, and then the game will look quite good on it anyway. This is not guaranteed, but Blizzard is one of the very few companies that has gone out of their way to give a decent experience to people without high-end graphics cards, and it doesn't look like they'll stop now.
Those of you expecting to run this game like WoW are in for a major disappointment. I do believe this game will run similarly to Crysis, yet take better advantage of unified shaders.
I'll be laughing hard when all of you guys start complaining of getting 10fps with the game maxed out
Let me say this one more time: the game uses advanced effects that are more taxing than the ones used in Crysis. That is a fact; whether they optimize it better than Crysis will be known when the game is released. -
unknown555525 said: ↑ Let me say this one more time: the game uses advanced effects that are more taxing than the ones used in Crysis. That is a fact; whether they optimize it better than Crysis will be known when the game is released.
-
unknown555525 said: ↑ I'll be laughing hard when all of you guys start complaining of getting 10fps with the game maxed out
-
unknown555525 said: ↑ I'll be laughing hard when all of you guys start complaining of getting 10fps with the game maxed out
I'm pretty sure it'll be playable on medium settings with integrated graphics (yes, even Intel ones). -
I'm going to laugh when I max SC2 at 1920x1200 on my 8600M GT. That is all.
-
For those that debate support of DX10 in StarCraft 2, go FAQ yourselves at:
http://www.starcraft2.com/faq.xml -
Most likely, the integrated cards will run SC2 on low settings. But don't expect it to look as nice as what's in the trailers. I recently played the Red Alert 3 beta; it looks similar to StarCraft 2 on medium/high settings, but when I switch to low settings, it looks like a game from 2003 (RA2). I suspect the same will apply to SC2.
edit: the integrated cards I'm talking about here are recent ones like the X3100. Thanks, Triple_Dude, for correcting me. -
liquidxit2 said: ↑ I'm going to laugh when I max SC2 at 1920x1200 on my 8600M GT. That is all.
I'm guessing a 9600GT will do that; an 8600GT will probably get less-than-smooth framerates on WUXGA maxed.
-
Hahutzy said: ↑ I'm guessing a 9600GT will do that; an 8600GT will probably get less-than-smooth framerates on WUXGA maxed.
-
unknown*somerandomnumbers* said: ↑ Actually that is wrong. Last year they announced that the game will fully support DirectX 10, along with DirectX 9. Now please do not tell me that I'm wrong, and that I need to get facts that are correct, strait. It makes no sence. I think a quick google search on your part may have helped..
Haha... Shows how little you know.
1) Read the **** PDF documentation from start to finish: usage of DirectX 9 is EXPLICITLY stated.
2) DX10 is BACKWARD compatible (for the most part), which means a DX9 game WILL be "supported" under DX10 (DUH?! How else would older DX9-only games run on Vista?!)
3) Even SC2's FAQ page states that they are still debating whether or not to support exclusive DX10 features, which means they are still on DX9 (and according to the documentation, they are most likely going to stay with DX9).
Thank you for playing, but you lose. Try again next time when you get your facts straight (not "strait"; maybe buy a dictionary while you're at it).
unknown*blahblah* said: ↑ I think the strong references to SSAO are clearly signs that this game will run worse than crysis if this effect is used with the several hundreds of units on screen.
Man, SSAO with hundreds of units on screen at once... Where do people these days even get these ideas?
Serenity592 said: ↑ Most likely, the integrated cards will run SC2 on low settings. But don't expect it to look as nice as what's in the trailers. [...] I suspect the same will apply to SC2.
Deferred rendering is not going to work on integrated chips (unless you're talking about a "semi-decent" recent IGP, like a branded ATI/NVIDIA part, in which case you might be able to run the game on Low settings).
People with an Intel X3xxx or X4xxx need not apply. Hell, if you're still on a GMA 900/950, don't even start dreaming about playing SC2 on your machine.
liquidxit2 said: ↑ I'm going to laugh when I max SC2 at 1920x1200 on my 8600M GT. That is all.
The 8600M GT at best will barely even meet the minimum requirements once the game ships.
qaz333 said: ↑ I'm betting StarCraft will need an 8400GS MINIMUM
But chances are, you can probably run SC2 at the lowest settings at 640x480 with an 8400M GS.
-
edit: deleted, for some reason it posted my msg twice
-
ltcommander_data Notebook Deity
Triple_Dude said: ↑ Haha... Shows how little you know.
1) Read the **** PDF documentation from start to finish: usage of DirectX 9 is EXPLICITLY stated.
2) DX10 is BACKWARD compatible (for the most part), which means a DX9 game WILL be "supported" under DX10 (DUH?! How else would older DX9-only games run on Vista?!)
3) Even SC2's FAQ page states that they are still debating whether or not to support exclusive DX10 features, which means they are still on DX9 (and according to the documentation, they are most likely going to stay with DX9).
However, I'm pretty sure you are wrong in stating that DX10 (the API in general) is backwards compatible with DX9. The entire point of DX10 was that it is a break from DX9, to allow for a fresh start. DX9 games run on Vista because Vista includes a separate DX9 API in addition to the DX10 API. In fact, there are three DirectX APIs: DX9.0c (what XP has), DX9.0L (basically DX9.0c with Vista's graphics memory management enhancements), and DX10. -
just want sc2 out right now, playing SC to hold me over till it releases =p
-
Dude, you're ****en on crack. If the minimum requirement to play SC2 is an 8400M GS, Blizzard will lose a quarter of its potential buyers. Blizzard is way too smart to throw them away. If you can run WoW on medium settings, you will be able to run StarCraft 2. Blizzard designed SC2 so that the ATI 9800 will run the game at 1280x1024 on low. If that's the case, the GMA 4500, which is better than the ATI 9800, will run the game as well.
The game is supposed to scale nicely with the latest hardware. Which means that if you want ALL the eye candy, turning the game up to max and still having it run well with 400 units on screen will probably require a 9800M GT. But I bet that the game will scale exceptionally well at high and medium using much more fps-friendly settings, while still having some eye candy enabled.
My bet is that, like most other current-gen titles, a performance card like the 8600M GT will be able to run the game at 1280x1024 using medium-high settings.