I find it amazing that once a computer maker purchases a graphics engine from the two leading card makers, they can configure the dedicated and shared memory (dedicated and shared combined via TurboCache or HyperMemory) however they wish, often misleading consumers with their specs. The only way to find the precise config is to get it from the actual engineers building the unit... which is not easy. My situation is this: if I had a choice between the following three notebook configurations, all with an Intel Core Duo running at 2GHz, which would you recommend?
1) Nvidia GeForce Go 7400 (with TurboCache), configured with 64MB dedicated and an additional 192MB shared.
OR
2) ATI X1400 with a straight 128MB of dedicated memory.
OR
3) ATI X1400 (with HyperMemory), with 64MB dedicated and 64MB shared.
Keep in mind that all systems would have 1GB of system RAM.
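To put the trade-off in rough numbers, here is a quick sketch (it simply assumes the shared portion comes straight out of the 1GB of system RAM and ignores how dynamically the cards actually grab it):

```python
# Rough comparison of the three configurations (figures taken from the options above).
SYSTEM_RAM_MB = 1024

configs = {
    "Go 7400 TurboCache (64MB + 192MB shared)": {"dedicated": 64, "shared": 192},
    "X1400 (128MB dedicated)": {"dedicated": 128, "shared": 0},
    "X1400 HyperMemory (64MB + 64MB shared)": {"dedicated": 64, "shared": 64},
}

for name, c in configs.items():
    total_video = c["dedicated"] + c["shared"]
    ram_left = SYSTEM_RAM_MB - c["shared"]  # shared video memory is borrowed from system RAM
    print(f"{name}: {total_video}MB video memory, ~{ram_left}MB RAM left for Windows and games")
```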
A very confused
znet
-
#2 IMO - go w/ the one that offers the most dedicated memory; it can likely still share additional memory (i.e. 256 w/ HyperMemory), but don't quote me on that.
-
Charles P. Jefferies (Lead Moderator)
The X1400 is stronger than the Go7400 in terms of performance. The Nvidia's 64-bit memory bus is what drags down its performance, although its fast memory helps compensate. ATI's X1400, on the other hand, has a full 128-bit bus, which offers more bandwidth and better performance.
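Just to illustrate the bus-width point with a back-of-the-envelope calculation (the clock speed below is an assumption for illustration only, not the exact clock any particular notebook ships with):

```python
def bandwidth_gb_s(bus_width_bits, effective_mem_clock_mhz):
    """Peak memory bandwidth = (bus width in bytes) x (effective data rate)."""
    return (bus_width_bits / 8) * effective_mem_clock_mhz * 1e6 / 1e9

# Assume both cards run the same effective memory clock (say 700 MHz)
# so that only the bus width differs:
print(bandwidth_gb_s(64, 700))   # Go 7400 (64-bit bus):  ~5.6 GB/s
print(bandwidth_gb_s(128, 700))  # X1400 (128-bit bus):  ~11.2 GB/s
```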
I wouldn't go with anything less than 128MB if you plan to game. 64MB is just too little.
Go for the X1400.
Chaz -
How can I tell how much memory is dedicated to the card and how much is shared via TurboCache or HyperMemory in either a Sony system or a ThinkPad system? Is there a way to go into the BIOS or some other resource on the computer to discover the dedicated (and, where applicable, shared) memory when there is no clear way to tell from the spec sheets?
a little less confused
znet -
The best by far is the ATI X1400 with 128MB.
It's sometimes called the 256MB HyperMemory version,
but it's the same thing, because it's 128MB onboard with an additional 128MB from RAM.
hope this helps -
Charles P. Jefferies (Lead Moderator)
I've noticed a couple of trends with the HyperMemory/TurboCache. Basically, if a given notebook has "256MB HyperMemory," then divide it in half and that's your dedicated memory (128MB in this case). Manufacturers typically double it. However, it doesn't always ring true, as I've seen a card with only 64MB of dedicated memory be advertised as a "256MB TurboCache" card.
The whole thing with shared memory is very confusing, more for marketing than anything else. It also helps the manufacturer cut costs.
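If it helps, here's that rule of thumb written out as a quick sketch (the "halve it" guess is just that, a guess; the TurboCache counter-example above shows it can be off):

```python
def guess_dedicated_mb(advertised_mb):
    """Rule of thumb: an 'XXXMB HyperMemory/TurboCache' card is usually half dedicated, half shared.
    Not guaranteed; some cards advertised as 256MB TurboCache carry only 64MB of dedicated memory."""
    return advertised_mb // 2

print(guess_dedicated_mb(256))  # -> 128 (the usual case, but verify with the notebook maker)
```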
Chaz -
Well..ATI's are all half dedicated, half shared when you see the number quoted as "HyperMemory". Unfortunately, vendors sometimes don't appropriately denote it...so go by the manufacturer's specs. The only time I know this isn't true is the X200M. There are two versions, and it doesn't correlate to the "half" theory.
Nvidia's setup seems to be more complex, and you'd have to know the specifics. -
Yes Chaz, that is usually true (that the memory is halved to 50% dedicated and 50% shared). I spoke to Nvidia, and that is what they told me they recommend to computer manufacturers....BUT you are also right that it can be configured as 64MB dedicated and 192MB shared for a total of 256MB. I had to go to hell and back to find this out, and what I thought was 128 and 128 turned out to be a huge letdown and caused much frustration and stress.
znet -
Unfortunately, TULLND, the manufacturers' specs are very unclear, and even computer company employees don't know. It seems the only people who do know are the engineers building the units, who are almost impossible to contact through all the red tape and bureaucracy.
-
i would opt for card #2..it has the most dedicated RAM, and it would be the best of the bunch for gaming. if you do not game, then the card shouldn't matter too much.
also, it isn't as hard as you make it sound..google the card, and there are quite a few links with the information you need..no need to contact the engineers and their families to find out.
pb,out. -
I have googled the cards, but because the configurations vary from company to company, there is no way to tell how much is actually dedicated. I have talked to both ATI and Nvidia, and they explicitly tell me that they make configuration recommendations and then it is up to the computer manufacturer and its system designs. And yes, unfortunately, to get a definitive answer regarding a company's configuration, the only people who actually know are the people building the units and those in touch with them...the consumer is low on the list when it comes to the red tape of getting a direct answer.
-
http://www.ati.com/products/mobilityradeonx1400/specs.html
are you looking for more than this?
Specifications
Mobility™ Radeon® X1400 Product Features
* 105 million transistors using 90nm fabrication process
* Four pixel shader processors
* Two vertex shader processors
* 128-bit 4-channel DDR1/DDR2/GDDR3 memory interface
* Native PCI Express x16 bus interface
* PowerPlay 6.0 power management technology
* Avivo Video and Display architecture
Ring Bus Memory Controller
* 256-bit internal ring bus for memory reads
* Programmable intelligent arbitration logic
* Fully associative texture, color, and Z/stencil cache designs
* Hierarchical Z-buffer with Early Z test
* Lossless Z Compression (up to 48:1)
* Fast Z-Buffer Clear
* Z/stencil cache optimized for real-time shadow rendering
Ultra-Threaded Shader Engine
* Support for Microsoft® DirectX® 9.0 Shader Model 3.0 programmable vertex and pixel shaders in hardware
* Full speed 128-bit floating point processing for all shader operations
* Up to 128 simultaneous pixel threads
* Dedicated branch execution units for high performance dynamic branching and flow control
* Dedicated texture address units for improved efficiency
* 3Dc+ texture compression
o High quality 4:1 compression for normal maps and two-channel data formats
o High quality 2:1 compression for luminance maps and single-channel data formats
* Multiple Render Target (MRT) support
* Render to vertex buffer support
* Complete feature set also supported in OpenGL® 2.0
Advanced Image Quality Features
* 64-bit floating point HDR rendering supported throughout the pipeline
o Includes support for blending and multi-sample anti-aliasing
* 32-bit integer HDR (10:10:10:2) format supported throughout the pipeline
o Includes support for blending and multi-sample anti-aliasing
* 2x/4x/6x Anti-Aliasing modes
o Multi-sample algorithm with gamma correction, programmable sparse sample patterns, and centroid sampling
o New Adaptive Anti-Aliasing feature with Performance and Quality modes
o Temporal Anti-Aliasing mode
o Lossless Color Compression (up to 6:1) at all resolutions, including widescreen HDTV resolutions
* 2x/4x/8x/16x Anisotropic Filtering modes
o Up to 128-tap texture filtering
o Adaptive algorithm with Performance and Quality options
* High resolution texture support (up to 4k x 4k)
Avivo™ Video and Display Platform
* High performance programmable video processor
o Accelerated MPEG-2, MPEG-4, DivX, WMV9, VC-1, and H.264 decoding and transcoding
o DXVA support
o De-blocking and noise reduction filtering
o Motion compensation, IDCT, DCT and color space conversion
o Vector adaptive per-pixel de-interlacing
o 3:2 pulldown (frame rate conversion)
* Seamless integration of pixel shaders with video in real time
* HDR tone mapping acceleration
o Maps any input format to 10 bit per channel output
* Flexible display support
o Dual integrated dual-link DVI transmitters
+ DVI 1.0 compliant / HDMI interoperable and HDCP ready
o Dual integrated 10 bit per channel 400 MHz DACs
o 16 bit per channel floating point HDR and 10 bit per channel DVI output
o Programmable piecewise linear gamma correction, color correction, and color space conversion (10 bits per color)
o Complete, independent color controls and video overlays for each display
o High quality pre- and post-scaling engines, with underscan support for all outputs
o Content-adaptive de-flicker filtering for interlaced displays
o Xilleon™ TV encoder for high quality analog output
o YPrPb component output for direct drive of HDTV displays
o Spatial/temporal dithering enables 10-bit color quality on 8-bit and 6-bit displays
o Fast, glitch-free mode switching
o VGA mode support on all outputs
o Drive two displays simultaneously with independent resolutions and refresh rates
* Compatible with ATI TV/Video encoder products, including Theater 550
...are you looking to find out something else? perhaps how many capacitors/resistors/transistors are used, when it was made, how long it took, what they ate while making it..etc..
sarcasm yes, but if that information isn't enough, then perhaps you need to start bribing some ATI/Nvidia engineers to get the information you desire.
pb,out. -
PB, thanks. You seem to be even less helpful than before. No need to bribe anyone at the graphics companies, as they seem more than willing to speak to consumers. What part of my original posting did you not understand? Sarcasm? Yes. I did indeed speak to ATI, and apart from the information you posted, which I had already researched a week prior to calling, they specifically told me that the graphics engine memory is configured according to the computer manufacturer's design, for which they cannot provide memory info. They can only recommend how to configure the memory, and then it is UP TO THE COMPUTER MANUFACTURER. But if you read through the info you sent and can indeed read clearly (sarcasm? yes), then you will notice that they don't state definitively that there is 128MB of onboard dedicated memory, because they can't speak for every computer company's system designs. If HyperMemory technology is incorporated into the X1400, which is sometimes the case (and not all computer companies state it), then 128MB could very well mean 64MB dedicated and 64MB shared. But then again, I know you know this, as you've spoken to ATI directly as well, and if you haven't, I am sure you just know. Sarcasm? Yes!!!
A thank you to the others who have expressed their opinions with courtesy!!!! -
if ATI told you this, then did you call the vendor of the computer that you are/were looking at? if each card is configured differently by each vendor, then you should be calling them..not ATI. ATI has a set of specs for their cards, but any vendor can change them (for instance...EVGA sells overclocked GPUs, which are not stock cards from Nvidia). there is no reason to assume anything..you have the information, and you can easily call the vendors of the notebooks. they will give you the information you want..if they don't, then i don't know what to tell you. if you buy from a smaller vendor (as compared to, say, dell), they know what they use and whether they configure it differently than stock.
also, do not take cheap shots at me with personal insults. just because we both used sarcasm does not entitle you to try to make me look bad.
pb,out. -
Charles P. Jefferies (Lead Moderator)
znet, I don't believe pb meant any harm, so let's just stop the argument there, thanks.
If you post the notebook here that you are looking at, we can tell you how much dedicated/shared memory it has.
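In the meantime, if you have a Windows machine in front of you, one quick local check is to ask WMI what the driver reports. Just a sketch, and it assumes Python with the wmi package installed; also note that the number the driver exposes can already lump dedicated and shared together, so treat it as a starting point rather than gospel:

```python
import wmi  # pip install wmi (assumes Windows with pywin32)

c = wmi.WMI()
for gpu in c.Win32_VideoController():
    # AdapterRAM is reported in bytes; depending on the driver it may include
    # shared memory, so cross-check against the vendor's spec sheet.
    if gpu.AdapterRAM:
        print(gpu.Name, gpu.AdapterRAM // (1024 * 1024), "MB reported")
```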
Chaz -
PB, first of all, I was just responding to:
""...are you looking to find out something else? perhaps how many capacitors/resistors/transistors are used, when it was made, how long it took, what they ate while making it..etc..
sarcasm yes, but if that information isnt enough, then perhaps you need to start bribing some ATI/Nvidia engineers to get the information you desire.""
which you wrote to me, and which I found equally insulting, making me sound inept when I was just posing a genuine question. You didn't even have to respond if you thought my original posting was idiotic, and you didn't have to make me seem idiotic.
Anyway, I'll leave it at that and let bygones be bygones.
As to your comments: once I spoke to the graphics card companies, I naturally spoke to the computer company, and it was ONLY when I was put in touch with a very high-level employee that he explained how the card I was looking at was actually configured. I had been given all sorts of misinformation until that call. It seems that no matter what you purchase, the configuration is up to the computer company's design. I've seen one computer manufacturer configure an ATI X700 with only 64MB (I've read the X700 is a great card, and my understanding is that it is usually configured with much more dedicated memory). Computer companies don't always make it clear, and specs from the card manufacturers are equally confusing.
Thanks for your offer Chaz and I too didn't mean any harm.
znet -
i'm sorry for posting with the sarcasm, and i really didn't mean it in a bad way...i have weird humor.
but i will say that, having some experience with desktop GPUs, it is all up to the vendor how they configure it (speeds, etc.)...that is why when you buy an EVGA card, it may have higher than stock speeds, different RAM configurations...and so on.
still, i am interested to know what notebook you are buying/have already purchased...and how you like it...details, we need details!
hope this clears it up..
pb,out. -
Hey PB,
I really don't have many details at this point. I haven't bought anything yet, but I am looking at most of the major notebook manufacturers: Lenovo, Sony, Acer, Toshiba, and HP for the most part. I would like a decent graphics card that won't be obsolete in 2 months. I've noticed that top cards are usually paired with 17-inch screens, which is just too much. A 15-inch normal or widescreen is what I am looking for, along with a Core Duo. Most of the 15" screens seem to come with low- to mid-range cards. I like to do some decent gaming every now and then, and for my purposes I think 128MB of dedicated memory is good. As I said, though, some of the computer companies are not so clear on that and often don't state whether the memory is dedicated, shared, or TurboCache/HyperMemory, and asking their tech support often doesn't clear up the situation. At least that's been my experience.
Anyway, I am still researching but I will let you know when I purchase and how the video performance is.
peace
znet