DX10 NOT Obsolete
Discussion in 'Gaming (Software and Graphics Cards)' started by The Forerunner, Aug 17, 2007.
http://hardware.gotfrag.com/portal/story/39270/
http://www.fragland.net/news.php?id=17391
Just for those people worrying: DX10.1 is just an incremental update, hence the .1. All 8-series cards will be compatible.
-
The Forerunner Notebook Virtuoso
-
Little late on the news there, but I guess the other topics dropped off the front page.
-
The only thing that .1 provided was the usage of 4xAA w/o a performance loss. But seeing as how any recent game can be set to run 4xAA, it doesn't really matter.
-
usapatriot Notebook Nobel Laureate
-
The Forerunner Notebook Virtuoso
Yeah, I was just trying to reassure the people who were worrying; no one mentioned these official statements from Microsoft in the other threads.
-
But it was the same with DX9. And DX8. DX10.1 is backwards compatible with DX10, so if your card doesn't support 10.1, games will just use the DX10 feature set.
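For anyone wondering what "it'll just use the DX10 feature set" looks like from a game's point of view, here's a minimal sketch (my own illustration, assuming the DirectX SDK headers and with error handling trimmed) of the usual feature-level fallback when creating a device:

```cpp
// Sketch only: ask for the 10.1 feature level first, then fall back to
// 10.0. An 8-series card simply gets a 10.0-level device and keeps working.
#include <d3d10_1.h>

ID3D10Device1* CreateDx10Device()
{
    const D3D10_FEATURE_LEVEL1 levels[] = {
        D3D10_FEATURE_LEVEL_10_1,   // DX10.1 hardware
        D3D10_FEATURE_LEVEL_10_0,   // GeForce 8 series and friends
    };

    ID3D10Device1* device = NULL;
    for (int i = 0; i < 2; ++i)
    {
        if (SUCCEEDED(D3D10CreateDevice1(NULL, D3D10_DRIVER_TYPE_HARDWARE,
                                         NULL, 0, levels[i],
                                         D3D10_1_SDK_VERSION, &device)))
            return device;   // highest level the card supports
    }
    return NULL;             // no DX10-class hardware at all
}
```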
-
The Forerunner Notebook Virtuoso
Nah, just in response to the comments claiming your 8-series card is already useless, which some people actually believed.
-
Long story short, your DX10 cards are still just as good as they were, and you won't be seeing any incompatibility error messages anytime soon.
-
Thank you.
I think my head was about to explode from people whining that they were so unfortunate to have the best GPU on the market, because it wouldn't last forever...
Thank you for this pocket of sanity.
If anyone still feels unfortunate to have an 8800, PM me and I'll give you my address. You may mail it to me, and I'll send you my 6800 in return. -
The Forerunner Notebook Virtuoso
Yeah you obsolete claimers!
-
I want an 8800.
-
ViciousXUSMC Master Viking NBR Reviewer
I am glad I didn't have the money for an 8800GTX though; from what I can gather from another forum, a 9800GTX is due near the end of the year and will be 2x stronger than the 8800GTX for the same price.
We shall wait and see. My C90 took my desktop upgrade funds; I'm so busy these days I haven't played a game on my desktop in a good 4 months anyway... Oh, how I miss my Battlefield 2142.
-
I'll likely get a 9900GTS or something equivalent when I buy a desktop (late 2008 / early 2009).
-
And my prediction is that next year, another GPU will be launched which is roughly twice as fast as a 9800. And the year after... well, you get the idea.
In other words, nothing has changed. GPUs have roughly doubled in speed every year since, oh, I don't know, the Voodoo 2, possibly. -
Well, the speed up might be more if they decide to go multicore.
-
Jalf, since my 8800GTX is obsolete, I can send you mine.
-
masterchef341 The guy from The Notebook
gpu's are already essentially multicore, and they have been for a long time.
-
In what way?
-
masterchef341 The guy from The Notebook
tons of processors on a single chip. ati uses like 100+ processors on a single chip for shading. nvidia uses like 30 more powerful (individually) processors.
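To make that concrete, here's a rough plain-C++ sketch (just an illustration, not real shader code) of why graphics work splits up so easily: every pixel's result depends only on that pixel, so you can throw as many little processors at the loop as you can fit on the die.

```cpp
#include <cstddef>
#include <cstdint>
#include <vector>

struct Pixel { std::uint8_t r, g, b; };

// Toy "pixel shader": darken a pixel. Each output depends only on its
// own input, never on a neighbour's result.
Pixel shade(const Pixel& in)
{
    Pixel out = { std::uint8_t(in.r / 2),
                  std::uint8_t(in.g / 2),
                  std::uint8_t(in.b / 2) };
    return out;
}

void shade_frame(std::vector<Pixel>& frame)
{
    // Because the iterations are independent, a GPU can hand them out
    // to dozens (or hundreds) of small shader processors at once.
    for (std::size_t i = 0; i < frame.size(); ++i)
        frame[i] = shade(frame[i]);
}
```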
gpu architecture is inherently "multicore" - in the processor sense of the word, and it's been that way a long time. -
Well, think about having two of the chips. That should be MUCH more effective than SLI or Crossfire, if designed properly. And they would have a use for putting 512 MB of RAM on a mid-range card, although it would need a 256-bit bus.
-
masterchef341 The guy from The Notebook
lol.
well let me ask you to clarify, are you referring to desktops or laptops? -
Well, desktops actually. Heat issues and whatnot. It should still be less than 2 cards in SLI, though, so gaming DTR laptops could likely use them as well.
-
So no, don't count on it.
As masterchef341 said, GPUs are already, and have been for a decade, more "multicore" than CPUs could ever dream of.
The multicore implementation you see on CPUs is not efficient. It's just the best that can be done with CPU workloads, which are not particularly parallelizable.
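A small illustrative sketch of the difference (my own made-up example, not any real workload): in the first loop every step needs the previous step's result, so a second core can't help much; the second loop is the GPU-style case, where the work splits cleanly.

```cpp
#include <cstddef>
#include <vector>

// Typical CPU-style work: each iteration depends on the one before it,
// so throwing more cores at it doesn't speed it up.
double compound(double value, const std::vector<double>& rates)
{
    for (std::size_t i = 0; i < rates.size(); ++i)
        value *= (1.0 + rates[i]);       // needs the previous result
    return value;
}

// GPU-style work: every element is independent, so any number of
// processors can each take a slice of the loop.
void scale_all(std::vector<double>& values, double factor)
{
    for (std::size_t i = 0; i < values.size(); ++i)
        values[i] *= factor;             // no cross-iteration dependency
}
```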
For GPUs, why settle for this second-best solution when you have the far more efficient one at hand, and have been using it for years? -
Yeah, ok. Just went with the childish way of thinking "more is better".
-
The Forerunner Notebook Virtuoso
The future is perhaps "fusion" technology. It makes sense, but let's just see how well it gets implemented.
-
And GPGPU's
-
masterchef341 The guy from The Notebook
yeah i was going to explain this in a story.
first (not part of the story): the gpu workload happens to be very parallelizable by nature. that's why they jumped the gun and started parallelizing everything internally, whereas cpu work is very linear. that is why cpus are only now starting to become multicore (and it's a huge hurdle for developers to take advantage of it).
so back to the story.
think of the gpu as a car circa 1890. the car has a long way to go upwards and onwards in its internal design. we want to be as efficient as possible in getting passengers to their destination. making them happy. that sort of thing.
we have two choices to increase the power of the car. we can strap multiple engines to the car... 2 engines, maybe 4, or even 8.
or we could just refine the engine itself.
if you want to put two chips on the card, it's sort of like two engines in a car. it could work, but there is a serious limit on your return (similar problem with SLI).
Let's just say SLI and multichip cards are equivalent. who would buy 10x 8800gtx's for futureproof rendering when dx11 is right around the corner and chips double in power every year? Literally in 3 or 4 years your 10x 8800gtx render farm madness will be eclipsed by a single card.
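Taking the "chips double every year" rule at face value (it's only a rule of thumb, not a guarantee), the arithmetic on that 10x farm goes roughly like this:

```cpp
#include <cmath>
#include <cstdio>

int main()
{
    // Assumption from this thread, not a guarantee: single-card
    // performance doubles roughly once a year.
    const double farm_advantage = 10.0;  // ten 8800GTXs, ignoring SLI losses

    // Solve 2^years = 10  =>  years = log2(10) ~= 3.3
    const double years = std::log2(farm_advantage);
    std::printf("A single card catches the 10x farm in about %.1f years\n",
                years);
    return 0;
}
```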
They still have a lot of room to improve the internal design. That's why they do it. When that is no longer feasible (we are approaching the point where you can't manufacture on a smaller process size anymore with conventional methods - i forget when that is supposed to hit... a few years maybe) they will probably do another revision in design or two, and then come up with another means of increasing performance. -
It'll be a *very* long time before this makes sense for high-end systems.
In low-power or simply cheap systems, it's a brilliant idea, and AMD is going to provide a kick-ass product when it comes out.
The reason:
Simple: GPUs are friggin' huge, and they need a ton of memory bandwidth. On the other hand, they're not that reliant on a fast bus directly to the GPU.
Now, if you were to merge a GPU and CPU onto a single chip, here's what would change compared to the traditional setup:
- Communication between the CPU and GPU would get far faster (obviously, since they're on the same chip). But we just said this isn't so important.
- Memory bandwidth would decrease (not only would it have to be shared with the CPU, but CPUs also typically use a vastly slower memory bus. You never see 512-bit buses between system RAM and the CPU. You also don't see 2 GHz GDDR3 on system memory, which is pretty much what we get on a high-end GPU; rough numbers after this list). And oops, didn't we just say GPUs are very reliant on memory bandwidth?
- The number of transistors dedicated to the GPU would go down. (Simple: at the moment, GPUs contain something like twice as many transistors as CPUs. There's a practical limit to how big dies can be made, and GPUs are hovering right on that limit, probably a bit above - that is, it'd be impossible to make GPUs at "normal" CPU-like prices, or in "normal" CPU-like quantities. They're just too big.) Now, if they had to be integrated into a CPU, the die couldn't really afford to grow beyond what a high-end GPU is today (it'd probably have to shrink, even), and it would have to share that space with the CPU.
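The rough numbers behind that bandwidth bullet, as promised (these are the commonly quoted figures for an 8800 GTX-class card and dual-channel DDR2-800 system memory; treat them as ballpark):

```cpp
#include <cstdio>

// Peak bandwidth in GB/s = bus width in bytes * effective transfer rate in GT/s
double bandwidth_gbs(int bus_bits, double gigatransfers_per_sec)
{
    return (bus_bits / 8.0) * gigatransfers_per_sec;
}

int main()
{
    // 8800 GTX-class card: 384-bit bus, GDDR3 at ~1.8 GT/s effective
    const double gpu = bandwidth_gbs(384, 1.8);   // ~86 GB/s
    // Typical desktop: dual-channel DDR2-800, i.e. 128 bits at 0.8 GT/s
    const double cpu = bandwidth_gbs(128, 0.8);   // ~12.8 GB/s
    std::printf("GPU memory: %.1f GB/s, system memory: %.1f GB/s\n", gpu, cpu);
    return 0;
}
```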
And of course, it'd be rather tricky to upgrade your GPU without the CPU... [wink] -
That makes sense. Thanks for the explanations, Jalf and Masterchef.
-
The Forerunner Notebook Virtuoso
I meant farther down the road, in the distant future. But of course Fusion won't be able to put up a fight against discrete GPUs right away; it's a fairly new concept. It is a start, though.
With Fusion you're shifting the graphics from the northbridge to the CPU, thereby increasing the speed between the core and the memory, which also means lower power consumption.
Also, the integrated graphics will be able to access memory directly through the processor rather than through the northbridge, which will give it lower latencies for more gaming power.
For now it's a low-end gaming solution which will basically give you more performance while drawing less energy. That's what I meant by perhaps a future solution; we will see where it goes from there. -
For now, though, it's definitely a promising replacement for integrated graphics at least, but it won't stand up to discrete GPUs. -
Will it be cheaper than making an integrated gpu?
-
masterchef341 The guy from The Notebook
most likely not. it should be better than the integrated solutions available for power consumption and the potential for better performance than integrated parts is there. dedicated will still be the way to go for quite a long time. years down the road the fusion idea will be practical in the high end, as stated above.
-
I'd expect so, yes.
The CPU core might get slightly bigger (but not by much, since some functionality can be shared between the CPU and GPU parts), but on the other hand, you can remove a chip and a *lot* of wiring from the motherboard (both of which are expensive).
Also, the GPU will be able to use the same memory controller as the CPU, another nice saving. And it won't have a need for dedicated memory or complex shared-memory solutions at all. All in all, the motherboard will become a lot cheaper (as cheap as mobos without integrated graphics).
The only possible downside I can see is that the extra complexity on the CPU might drive cost up a bit on that, but I think that should be a very minor difference.
Of course, this is just my guess... We'll see when it actually comes out -
I wonder if AMD will manage to take back the performance throne in both the CPU and GPU businesses.
-
I wish we just had two chips: the CPU and the general acceleration unit which does visuals, sound, file caching, and physics.
-
The Forerunner Notebook Virtuoso
-
Yup. Intel needs competition. Otherwise we get a new P4 on our hands. And since nVidia is getting less competition from ATI, Intel will stir up the market.
-
The Forerunner Notebook Virtuoso
Nvidia-Intel merger anyone? Or perhaps the less popular Nvidia-IBM merger?
-
Won't be allowed. Anti-trust laws.
-
The Forerunner Notebook Virtuoso
Well, nVidia-IBM rumors have been looming for a bit now. How so, though? It's not really monopolization, and it's not like any real predatory pricing could occur between them.
-
Not between them. But Intel-nVidia is definitely anti-trust worthy.
-
masterchef341 The guy from The Notebook
if daamit can merge, why not intel-nvidia?
-
That may have happened to AMD and ATI; they received a letter regarding an anti-trust possibility.
-
Because today there are three graphics vendors: ATI (AMD), nVidia, and Intel. If the two largest, Intel and nVidia, were to merge, they would have an overly dominant share of the GPU market.
AMD and ATI weren't really in the same market. -
The Forerunner Notebook Virtuoso
Ah, that makes sense. Fine, nVidia-IBM then!
-
I've heard the nVidia-IBM rumors, but I highly doubt it. It's clear that the DAAMIT merger hurt them for a bit, and nVidia doesn't really need to merge. They already have excellent manufacturing plants, are doing well money-wise on their own, and, well, I don't know that IBM is all that appealing.
Though a Cell PC processor might be interesting with a 9800... IBM was one of the companies that worked on that with Sony, wasn't it? -
The Forerunner Notebook Virtuoso
Yeah, IBM worked on the Cell with Sony. I was just kidding around about the merger; nVidia is standing strong on their own two feet right now.