Saw an interesting article linked at Planethalflife. Valve seems to be introducing multicore support into the Source engine.
http://www.bit-tech.net/gaming/2006/11/02/Multi_core_in_the_Source_Engin/1.html
Guess it's good news for all of you who are waiting for quad cores to come out. Too bad it will probably be a while before we see quad cores in notebooks. But the performance increase seen in the quick test on a dual core was impressive, I thought.
Thought some of you might like the read!
-
metalneverdies Notebook Evangelist
nice find!
-
Notebook Solutions Company Representative NBR Reviewer
Great post, thanks. I also like your avatar, Cerebral.
Charlie -
Good to see Valve constantly upgrading the Source engine. I think this will pave the way for the new Havok engine in Source, hopefully getting one thread to run physics whilst the other deals with game logic, AI, sound, etc. What Remedy are doing with Alan Wake is brilliant; hopefully we'll see more game developers taking advantage of new dual core systems.
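(Just to picture the split being described, here's a rough sketch of "physics on one thread, game logic/AI/sound on another" -- entirely my own toy C++, not anything from Valve or Havok; the BodyState struct, the snapshot scheme, and the tick rates are made up for illustration.)

```cpp
// Toy sketch: a physics thread steps the world and publishes a snapshot;
// the game-logic thread copies the latest snapshot and works against it.
#include <atomic>
#include <chrono>
#include <mutex>
#include <thread>
#include <vector>

struct BodyState { float x = 0, y = 0, z = 0; };   // made-up rigid-body state

std::mutex snapMutex;
std::vector<BodyState> latestSnapshot;             // shared between the two threads
std::atomic<bool> running{true};

void physicsThread() {
    std::vector<BodyState> world(1000);            // local working copy
    while (running) {
        for (auto& b : world) b.y -= 9.8f * 0.01f; // pretend integration step
        {
            std::lock_guard<std::mutex> lock(snapMutex);
            latestSnapshot = world;                // publish the new snapshot
        }
        std::this_thread::sleep_for(std::chrono::milliseconds(10)); // ~100 Hz
    }
}

void gameThread() {
    while (running) {
        std::vector<BodyState> snap;
        {
            std::lock_guard<std::mutex> lock(snapMutex);
            snap = latestSnapshot;                 // grab the newest physics state
        }
        // ... run game logic, AI and sound against 'snap' here ...
        std::this_thread::sleep_for(std::chrono::milliseconds(16)); // ~60 Hz
    }
}

int main() {
    std::thread phys(physicsThread), game(gameThread);
    std::this_thread::sleep_for(std::chrono::seconds(1));
    running = false;
    phys.join();
    game.join();
}
```

The point is just that physics can tick at its own rate on a second core while the game thread only ever copies the most recently published state.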
It's a shame that they can never meet deadlines on their games though, Ep 2 not until next year now... I think they should just stick to "when it's done" rather than give estimates to the press. -
Anyway, we'll see how it goes. I for one hope that they've got better programmers working on it than Gabe "I've never touched a line of multithreaded code in my life" Newell... -
But of course, it's good to see developers making progress, even if it does mean those still with DX8 cards and single core processors are left out.
/looks at Pentium M....
//Looks at MR X1300.....
///Cries....
Ahh well, roll on the 360 die shrink and price reduction! -
I just knew there was a reason why Valve likes to upload my system specs for every new download and install.
I hope they release the new code before the Quads... -
mobius1aic Notebook Deity NBR Reviewer
While it's nice to see Valve taking full advantage of CPUs here and in the future, I do hope they take into account that scalability is important. CryEngine 2.0 is supposed to be able to detect how many cores and threads are present and scale the engine accordingly, making maximum use of the hardware at hand. I really like how Valve and Crytek push the limits yet don't forget about everyone else in the process. That's why I think they are the top two engine developers in the industry.
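(To make that "detect cores and scale" idea concrete, here's a generic sketch -- my own illustration, not CryEngine or Source code; the WorkerPool class and the job counts are invented. The idea is to query the hardware thread count at startup and size a worker pool to match, so the same build uses one worker on a single-core Pentium M and four on a quad.)

```cpp
// Toy job system sized from the detected hardware thread count.
#include <condition_variable>
#include <cstdio>
#include <functional>
#include <mutex>
#include <queue>
#include <thread>
#include <vector>

class WorkerPool {
public:
    explicit WorkerPool(unsigned n) {
        for (unsigned i = 0; i < n; ++i)
            workers.emplace_back([this] { run(); });
    }
    ~WorkerPool() {
        {
            std::lock_guard<std::mutex> lock(m);
            done = true;
        }
        cv.notify_all();
        for (auto& w : workers) w.join();
    }
    void submit(std::function<void()> job) {
        {
            std::lock_guard<std::mutex> lock(m);
            jobs.push(std::move(job));
        }
        cv.notify_one();
    }
private:
    void run() {
        for (;;) {
            std::function<void()> job;
            {
                std::unique_lock<std::mutex> lock(m);
                cv.wait(lock, [this] { return done || !jobs.empty(); });
                if (done && jobs.empty()) return;
                job = std::move(jobs.front());
                jobs.pop();
            }
            job();   // e.g. a particle batch, an AI tick, decal building...
        }
    }
    std::vector<std::thread> workers;
    std::queue<std::function<void()>> jobs;
    std::mutex m;
    std::condition_variable cv;
    bool done = false;
};

int main() {
    // hardware_concurrency() may return 0 if the count is unknown.
    unsigned cores = std::thread::hardware_concurrency();
    if (cores == 0) cores = 1;
    std::printf("Scaling to %u hardware thread(s)\n", cores);

    WorkerPool pool(cores);
    for (int i = 0; i < 8; ++i)
        pool.submit([i] { std::printf("job %d done\n", i); });
}
```

A real engine would also reserve threads for the render and main loops rather than grabbing every core, but the fallback when the count can't be detected is the important bit for scaling down.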
-
It's amazing what you can find browsing a mate's PC.
I'd really like to see quad core benchmarks. I think this step would make a great addition to gameplay... looks like I have to save for a notebook and a PC upgrade now.
Though I also hope that there will be proper support for the single cores that are available today; some really can still pack a punch and it would be disappointing to see them left out. -
Jeez, I can run HL2 on all of my PCs perfectly, even modded with the Cinematic Mod. HL2 Episode 2 must be huge then...
-
Dustin Sklavos Notebook Deity NBR Reviewer
Regardless of whatever middling feelings I have towards Half-Life 2 and Steam, I do have to say this:
Half-Life 2 scales WELL. Surprisingly well. That game is ridiculously undemanding on hardware; I maxed out AA and AF on my desktop and it still moves like gravy at 1680x1050. Far Cry can't do that. I can't even do that on UT2K4 (which for some odd reason runs like crap).
They coded an engine that includes freaking everyone. The only other engine I've seen that scaled as well was the Doom 3 engine (don't laugh, I've run it on an All-in-Wonder Radeon 8500DV). While you might want to say Far Cry runs well and scales well, it really doesn't. It enjoys having a surplus of rendering power, but the game looks AWFUL on medium and low settings and at low resolutions.
I do have a lot of faith in Valve's coding team, and to address what a couple people have said:
Gabe isn't directly responsible for getting the multithreading code done; it was handed off to some other guy whose name escapes me because I don't feel like opening the article.
They're coding Source for longevity, which is really awesome and almost seems like it's being made to be the DirectX of 3D engines, if that makes any sense. It's not being multithreaded for dual core, it's being multithreaded for n-core, where n = however many cores you want.
These guys are, if nothing else, not just forward thinking but looking at the whole picture, and I respect that.
This also points out one key thing to me: the way CPU cores are scaling kind of makes a dedicated PPU seem obsolete almost out of the gate, doesn't it? When you can just offload physics calculations to another CPU core, which most people would already have at that point, what the heck do you need a PhysX card for? The concept of a PPU or even a dedicated physics processor (see ATI's "third PCIe x16 slot" implementation) feels superfluous and practically stillborn. -
The PPU thing *could* work, but not in its current incarnation. The PCI bus, with its long latency and limited bandwidth, kills it. And yes, just dedicating a CPU core to it would work almost as well.
But with AMD's Torrenza plans, anyone can develop third-party chips and just plug them into your motherboard as a high-bandwidth, low-latency coprocessor. I imagine a PPU would work really well there. But that's probably still a few years off, so yeah, for all practical purposes, I'd stick with physics being handled by the CPU (eventually with GPU assistance, since both AMD and NVidia are working on solutions for this) -
A dedicated PPU exists for the same reason you have a dedicated graphics card. A CPU can do anything, but it's not as efficient at many things. Have you seen how much Folding@Home sped up when they released the GPU client? Something like 20x over the fastest CPUs. For parallelizable problems like physics, you just can't beat special-purpose hardware.
And Jalf: bandwidth isn't a huge deal there... most of the actual work is the calculations themselves. The data mostly stays on the card; it just has to get the input (very little data) and provide the output (again, very little data). A PCI physics card will still speed up many complex calculations, leaving the CPU to deal with AI, sound processing, etc.
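(A rough back-of-envelope on that, with numbers I'm assuming rather than measuring: the body count and the bytes moved per body are guesses, while 133 MB/s is the theoretical peak of classic 32-bit/33 MHz PCI. The point is just that per-frame traffic is a modest slice of the bus, which is the crux of the "compute dominates transfer" argument.)

```cpp
// Back-of-envelope: per-frame transfer for a physics card vs. PCI peak bandwidth.
#include <cstdio>

int main() {
    const double pciBandwidth    = 133e6;  // bytes/s, 32-bit/33 MHz PCI theoretical peak
    const int    bodies          = 5000;   // assumed number of simulated bodies
    const int    bytesPerBodyIn  = 32;     // assumed input per body per frame (pose + velocity)
    const int    bytesPerBodyOut = 32;     // assumed results per body per frame
    const int    fps             = 60;

    const double perFrame  = static_cast<double>(bodies) * (bytesPerBodyIn + bytesPerBodyOut);
    const double perSecond = perFrame * fps;

    std::printf("Transfer: %.1f KB/frame, %.1f MB/s (%.1f%% of PCI peak)\n",
                perFrame / 1024.0,
                perSecond / 1e6,
                100.0 * perSecond / pciBandwidth);
}
```

With those assumed sizes it works out to roughly 0.3 MB per frame and about 19 MB/s at 60 fps, i.e. around 14% of the theoretical PCI peak, while the per-frame solver math stays on the card. Latency is a separate question, of course.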
Valve doing multicore in the source engine
Discussion in 'Gaming (Software and Graphics Cards)' started by Cerebral, Nov 5, 2006.