Alright, I got my first source-powered game today, Dark Messiah of Might and Magic.
Now first off, I know this is not news - the game (and the engine) have been out a fair while - but since I couldn't use Steam until they introduced offline mode, here we are.
There are a couple of issues (most notably that I have to minimise and restore the game every time I run it before it will display anything - hopefully that's fixed in the patch), but apart from that it's truly incredible.
At max settings, 2x AA and 16x AF, I get ~45FPS +/-10FPS, and it's pretty.
So I guess my question is: let's compare it to Fallout 3 - why is it prettier, with better physics, running at far better framerates (vs 20-40FPS indoors for Fallout), when it's an older game?
Is Source just a well-optimised engine, or is there more to it than that?
-
It's pretty much just that the Source engine is incredible.
-
Source is one of those magical things one can never understand...
Well, I've heard that Source is more CPU dependent than it is GPU dependent. Not sure if it's true or not, but all that matters is that it's my favorite engine. Pretty, efficient, and full of mods. -
I didn't know Steam had added offline mode until after I was back from the States, so I missed my chance to get the Orange Box for 40 bucks...
I nearly cried =S
Ah well, hooray for ebay lol...
Something I meant to say in my first post: DMOMM is basically... well, technically it's an FP-RPG, but I really think "kicking-guards-down-stairs simulator" is a far more accurate description. Makes me giggle every time. -
Source does a lot more of its calculations on the CPU compared to other engines, and is much more CPU-intensive than other pretty games like Crysis or COD4.
However it can be a problem at times, in TF2 especially, as TF2 is very CPU-intensive but is only coded to use one core (only TF2). Because of this, many, many computers are actually bottlenecked by the CPU. I have a T8300 2.4GHz processor, and my FPS drops to 25 at times because the CPU is the bottleneck. -
I don't see what you're saying. I think the source engine is fairly generic. Fallout 3 looks better than any source game I've seen. I haven't tried Dark Messiah though.
I feel exactly the opposite. Fallout 3 runs at decent settings at a high res and looks great. However, L4D won't let me max it, and I don't think there's anything special about maxed graphics anyway. The textures aren't high-res enough to take advantage of me running the game at 1920x1200. Source engine = meh. Maybe Dark Messiah is a modified Source engine. The main reason people use the Source engine is for the physics anyway - hence the reason it's CPU-intensive. -
The Source engine is very resource-efficient, and its physics engine is amazing
-
the answer is the sandwich.
-
That's a little shortsighted isn't it?
Can something like that be patched, or would it require recoding the entire game? -
? ?
The Source engine is future-proofed to handle multi-core CPUs -
http://forums.ngemu.com/game-console-discussion/104432-team-fortress-2-multi-core-options-pc.html
Apparently it's able to multithread, but doesn't by default.
@Cathy: Try turning it on and tell us if your performance improves. -
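For reference, that thread boils down to a console variable. As a hedged sketch (the cvar and its values are from that era of the engine - defaults and behaviour may differ in your build, so check before relying on it), you could try something like this in the console or in autoexec.cfg:

```
// mat_queue_mode controls material-system threading:
// -1 = engine decides (default), 0 = synchronous single-threaded,
//  1 = queued single-threaded, 2 = queued multi-threaded
mat_queue_mode 2
```

If it causes rendering glitches or worse performance, set it back to -1 and let the engine decide.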
Whoever says the Source Engine is CPU-dependent or doesn't use 2 cores is on crack
I have an 8800M GTS and a 1.66GHz C2D... ONE POINT SIXTY SIX GIGAHERTZ
Everything plays well over 50FPS @ 1440x900, everything maxed out, 16X this and that, whatever HDR crap you want
It is very well coded, uses clever tricks and so on
And as I always said, I played HL2 @ 1280x800 medium/high settings on a GMA950 (Intel) & 1.66 C2D, which made so many people mad and got me called a troll etc...
EXTREMELY scalable is what I say -
I've noticed this too. I just got Left 4 Dead last week, and I can play the game at the highest settings possible at 1280x800, 4X AA and 16X AF, at around 45-60 FPS.
I tried to play UT2003 at the max settings, same res, same AA and AF, and it could only manage 40FPS.
This is on an Asus N50 with the T5800 and Nvidia 9650m GT, BTW. -
No-one said the Source engine doesn't multithread, just Team Fortress 2, which I'm inclined to believe, given what I've read.
Likewise, no-one said it was CPU-dependent, just more so than other engines, because of its emphasis on physics, which comes down purely to the number of CPU calculations, straight and simple.
Are you sure you don't mean UT3? (as distinct from 2003, which is ages old).
My machine can output 120FPS in a 6 player DM in UT2004 (UT2003 expansion, if you will), and yours should be at least slightly faster than mine... -
I remember you Beatsiz.
You were the one who was claiming that the GMA950 was some kind of God machine that could play Far Cry and Half Life 2 on high settings.
But as I recall, you couldn't produce any evidence for those claims. -
Here's evidence that the Source engine didn't always have multi-core support: http://www.tweaktown.com/articles/9...ulti_core_cpu_processing_in_source/index.html
OH HEY LOOK. MY COMPUTER MUST BE ON CRACK TOO SINCE IT'S BEING LIMITED BY THE CPU AND DOESN'T SEE THE SOURCE ENGINE USING 2 CORES! -
-
I don't get how everyone gets such good performance in L4D. I'm getting 45-60 FPS regularly, but as soon as I get horded I dip to the low 30s and sometimes 20s. This is at 1440x900, 2xAA, with everything on High
-
I can get 24FPS looking at a wall in CSS at 1280x800, ALL LOW. This is on a custom map with 60 players. If that's not a CPU bottleneck I don't know what is.
-
mobius1aic:
Simply put, it's highly efficient and was built specifically for x86 processors in the first place, which means us PC users are more important than those POS consoles! Second, its efficiency is coupled with being highly modular, so as time and technology progress, so does Source. My only problem with it is its rather ancient lighting systems and geometry, but Left 4 Dead made some great headway in that department with much better lighting; and while the geometry was certainly good, it was kept low in order to make tons and tons of zombies a possibility. Valve has always made sure its PC user base was happy first, before the 360
-
I DID PRODUCE EVIDENCE
But whatever
No more about that
I have a 1.66GHz C2D and play many Source games at over 50FPS constant... so I don't see the problem...
Haven't tried L4D or TF2 yet; anyone want to gift them to me on Steam so I can try? RIGHT ON -
LOL hard to produce evidence for something that never happened.
Anyway you can't get "free trials" of L4D on Steam, and the demo has been cancelled.
I THINK you can still download the L4D demo from several sites... good luck with that. It's a good game and you should seriously consider buying it. -
Don't get me wrong - if you're getting great framerates that don't seem to fit expectations, that's a great thing, and more power to ya; I'm just trying to figure it out. -
ONE POINT TWENTY ONE JIGGAWATTS? -
Yes, he's back. (And still annoying)
Everyone wave hi.
He's referring to SP, Dragunov. -
Beatsiz, you won't have anywhere near as much of an issue playing HL2, with what, a dozen actors at any one time, as you would playing TF2 or L4D, where there are far more players.
I can't say for a fact that you can't play HL the way you say you can, but I will tell you now that if you try TF2 or L4D you will be a little shocked. -
Not to mention, the Source engine has matured since then. TF2 requires a bit more from a computer hardware-wise than HL2, and Left 4 Dead actually requires a lot more than TF2. This is despite the system requirements supposedly being identical.
-
Therefore, it's stronger evidence that it's CPU-limited, since we see similar performance from different cards with a huge difference in power.
-
It's all about optimization. Lazy developers making half-assed console ports neglect this.
Look at Crysis compared to, say Unreal Tournament 3. UT3 runs at 1680x1050 with 4/5 settings at 30-40fps for me on a slightly OC'd DDR2 9600M GT (550/1300/450). It looks bloody amazing.
Crysis at medium-low settings runs at 10fps at native res, 25fps at 1280x800. It really doesn't look comparable until you get up to medium-high settings.
And don't even get me started on GTA IV. That game struggles to run on my Phenom/8800 GTX desktop.
It's all about optimization, and both the Source and Unreal engines are shining examples of well-optimized engines. All the lazy coding practices and memory leaks have been cleaned out of them.
In making their games more console-friendly, Bethesda notoriously ignore making their games run well on lower end machines, such as notebooks. -
I really wonder if we haven't brought this on ourselves...
http://www.tweakguides.com/Piracy_5.html -
Lets not even go into piracy discussion in this thread.
Pardon the abysmally small text in my on-screen display; RivaTuner's Statistics Server isn't fully supported in Windows 7. This is TF2 with my HD 4870 1GB and Q6600 overclocked to 3.6 GHz:
I'm CPU-limited with an 8MB L2 cache, 1600 FSB, 3.6 GHz quad-core Q6600; you'd better believe a T9300, T8300, T7500, or T7300 will be. -
Only L4D supports multi-core right now, which is why I get pretty poor framerates in TF2 sometimes, yet almost always average 200FPS+ in L4D on my quad-core/CrossFireX desktop.
You cannot compare game engines and simply say one is better just because it performs better, without knowing the polygon and texture data in each frame rendered. The Source engine performs like it does because of how far behind it is. Its HDR rendering method is among the best IMO, but there's so much that Source does not natively support that other engines DO support.
My favorite advancement in graphics is completely unsupported in the Source engine right now, even though it's possible to add support for it. Parallax occlusion mapping, or steep parallax mapping, adds amazing depth to textures without actually dropping performance much, and has been in games for several years now, yet never in the Source engine. This here is what I'm talking about. -
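For anyone wondering what that technique actually does: parallax occlusion mapping marches the tangent-space view ray down through a height field until the ray dips below the surface, then uses that hit point's texture coordinate. Here's a minimal 1D sketch in Python - my own illustration with made-up names, not code from Source or any real shader:

```python
def parallax_occlusion_u(height_at, u, view_x, view_z, scale=0.05, layers=32):
    """March a tangent-space view ray down through a 1D height field
    and return the shifted texture coordinate where the ray first
    passes below the surface. height_at(u) returns a height in [0, 1],
    with 1.0 meaning the surface sits at full height (zero depth)."""
    layer_depth = 1.0 / layers                # depth covered per step
    du = (view_x / view_z) * scale / layers   # UV shift per step
    cur_u, cur_depth = u, 0.0
    while cur_depth < 1.0 - height_at(cur_u):
        cur_u -= du                # step the sample point sideways...
        cur_depth += layer_depth   # ...and the ray downward
        if cur_depth >= 1.0:       # ran out of height field
            break
    return cur_u
```

In a real shader this runs per pixel in HLSL/GLSL with the height sampled from a texture, usually followed by a refinement pass (a short binary search, or interpolating between the last two samples) to smooth out the stair-stepping.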
...
I'm sure they'll put it in? *hopeful* -
That video on parallax mapping is INTENSE.
-
Yet that video's shot in the source engine...
-
Dragunov-21 said: ↑No-one said the Source engine doesn't multithread, just Team Fortress 2, which I'm inclined to believe, given what I've read.
Likewise, no-one said it was CPU-dependent, just more so than other engines, because of its emphasis on physics, which comes down purely to the number of CPU calculations, straight and simple.
Are you sure you don't mean UT3? (as distinct from 2003, which is ages old).
My machine can output 120FPS in a 6 player DM in UT2004 (UT2003 expansion, if you will), and yours should be at least slightly faster than mine...Click to expand... -
Why was Dragunov-21 banned?
-
Hep! said: ↑Why was Dragunov-21 banned?Click to expand...
I think it's a problem with the site.
I've seen a few good members here showing up as "banned", but if I refresh the page a few times, it usually disappears and goes back to normal.
-
He made a thread specifically discussing hacking wifi networks. It's a temporary ban.
It's not a problem with the site. We usually give out temporary bans before a permanent ban. -
Ha, I was wondering the same thing; saw his name turn red... thanks for the info, and I won't be discussing that topic
-
mm23 said: ↑It's all about optimization. Lazy developers making half-assed console ports neglect this.
Look at Crysis compared to, say Unreal Tournament 3. UT3 runs at 1680x1050 with 4/5 settings at 30-40fps for me on a slightly OC'd DDR2 9600M GT (550/1300/450). It looks bloody amazing.
Crysis at medium-low settings runs at 10fps at native res, 25fps at 1280x800. It really doesn't look comparable until you get up to medium-high settings.
And don't even get me started on GTA IV. That game struggles to run on my Phenom/8800 GTX desktop.
It's all about optimization, and both the Source and Unreal engines are shining examples of well-optimized engines. All the lazy coding practices and memory leaks have been cleaned out of them.
In making their games more console-friendly, Bethesda notoriously ignore making their games run well on lower end machines, such as notebooks.Click to expand...
10FPS on Crysis sounds awfully low...
You might want to download the patches, updates and custom settings (they take me from Medium-High to High/Very-High looks at the same FPS) -
I will download the L4D and/or TF2 demo once I can find them and see what's up.
All I want to say is that I have a 1.66GHz C2D and seem to have no problems... that's 2/3 of most people's CPU power nowadays
Until I get an X9000 >: ] -
Just tried L4D and it is great. Maxed out runs flawlessly.
All settings turned on and AA on 16xQ it runs good. Try it. -
rot112 said: ↑Just tried L4D and it is great. Maxed out runs flawlessly.
All settings turned on and AA on 16xQ it runs good. Try it.Click to expand...
Then again, I've got this obsession with having no less than 100FPS constant. -
LOL. I am not lying if that is what you are thinking. I was playing it maxed with full AA.
-
Nah, I'm not saying you're lying. I'm saying that's intense.
Like I said, I REQUIRE 100+ FPS.
Playable is more like 30. I'm sure you're not at the 100FPS mark on those settings. Also I do 100FPS minimum, so I'm usually playing closer to 300FPS.
Sorry if you misinterpreted that and took offense, I understand how it sounded like an accusation. -
I hope you enjoy those extra 40 FPS that make zero difference over your refresh rate.
-
Jlbrightbill said: ↑I hope you enjoy those extra 40 FPS that make zero difference over your refresh rate.Click to expand...
Nah I actually don't even use that CRT anymore, I recently upgraded to a Samsung 2243bwx.
That said, 60Hz and 60FPS do not mean perfect unison. You're right that it would make little visual difference (though there IS a difference between 60Hz + 60FPS and 60Hz + 100FPS), but back in the Call of Duty days, higher FPS also meant slightly better hit registration, which made all the difference. I've kept my obsession with keeping a minimum of 100FPS. My system can handle it, why not?
60Hz on a CRT hurts my eyes though, so there really was a time I gamed on a CRT @ 125Hz. -
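The frame-time arithmetic behind all this is simple enough to sketch (plain Python, nothing engine-specific):

```python
def frame_time_ms(fps):
    """Milliseconds spent rendering one frame at a given frame rate."""
    return 1000.0 / fps

# A 60Hz monitor refreshes every ~16.7 ms. At 60FPS the frame it scans
# out can be a full frame time stale; at 100FPS the newest frame is at
# most 10 ms old, so input can feel slightly fresher even though no
# extra frames are ever displayed on a 60Hz panel.
print(frame_time_ms(60))   # ~16.7 ms
print(frame_time_ms(100))  # 10.0 ms
print(frame_time_ms(125))  # 8.0 ms (a 125Hz CRT can actually show these)
```

This is only the display-side half of the story; the "registration" effect mentioned above also depends on how the particular game samples input and timestamps shots.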
Beatsiz said: ↑I will download L4D and/or TF2 Demo once I can find them and see whats up.Click to expand...
I got both in weekend sales, TF2 for $10, L4D for $25. Both are great; TF2 especially is so much fun. With the new content coming for both games, they are definitely worth buying.
How does the source engine manage it?!
Discussion in 'Gaming (Software and Graphics Cards)' started by Dragunov-21, Feb 22, 2009.