Okay, I was expecting you would say something like that.
-
+1 rep for the guide, Blacky
-
Thanx, I'd almost forgotten about this thread.
Would be nice if I could add the newer cards to the guide, like the 285M, and the M5870 and M5850 depending on the notebook model. -
These chips are rated to run at 1250MHz (.4ns) at 1.5 volts (which they don't get).
This is at least true for Clevo W860/870. -
Thanx, I've added it to the guide. I'll try to remember to rep you later.
-
Does anyone have any info on other video cards that are not included in the initial post?
That would be greatly appreciated. -
Meaker@Sager Company Representative
GDDR5 speeds can often be limited by the speed of the onboard memory controller/design of the PCB.
The GDDR5 on Nvidia cards tends to show this by being rated much higher than their stock clocks. -
For now the only real limitation has been the MXM connector, which was the main reason why MXM 2.1 cards were stuck at 800 MHz. Once MXM 3.0 came around, the memory clocks jumped to 950 MHz (for the same memory chips), which was much closer to their real specifications.
There might be other reasons as well, like power consumption and heat dissipation. -
Meaker@Sager Company Representative
No.
What I am saying is the inbuilt memory controller is incapable of running at high GDDR5 frequencies even if the chips can. If the voltage is lower it may be impeded even more.
Also, the inside of a laptop can be electrically quite noisy; the lanes going to the memory chips could be affected, so the memory clock is limited by environmental factors. -
I guess you have a point, but so far those who have tried to overclock the memories seem to go as far as their official specifications go. In other words, while the 470M will run officially at 750Mhz for the memory, those who have tried to overclock have easily reached 1250 Mhz or even more. And this is usually the case for most video cards.
Indeed, you are right to say that the lanes may limit the bandwidth (I have experienced this), as may the memory controller, but for now I don't have enough info to say how much of a limiting factor this is. Especially for high-end cards, it doesn't really seem to be one. -
Meaker@Sager Company Representative
Have they tested whether performance at lower frequencies is not actually higher? Due to error correction, GDDR5 can clock past its optimal speed, but the number of re-transmits then lowers performance.
As far as cards that show this:
Desktop GTX 470/480
Desktop 5870 beats HD6870 (1400MHz mem overclocks vs 1250) -
I know what you are saying; that applies not only to the memory but also to the core. From what I have seen until now there is little indication that performance diminishes; usually the memory becomes unstable before the point of negative returns is reached. However, the performance return from, say, an extra 50MHz of overclock diminishes the higher you go, which does prove your point. The only situation where I've seen overclocking lead to negative performance was when it went way over the frequency specifications of the memory manufacturer.
In general I would not recommend that anyone overclock past the manufacturer's specified frequency, simply because GDDR is more sensitive to heat and, in general, laptop GPUs fail due to bad video memory.
Of course, what I am saying is based on my experience with notebook video cards. To my shame I have to say I am not very experienced with desktop overclocking. -
moral hazard Notebook Nobel Laureate
Will you update this thread with GDDR5 vram, I found this thread useful.
Nice pic:
-
I would if I had more pictures of GDDR5 memories. Anyway, thanx a lot for this pic, I will update the first post in a bit. Looks like the 6970M GDDR5 is clocked at 1250 MHz or 2500 MHz (dual channel), maximum. The same as with all previous high-end notebook GDDR5 cards.
-
Anyone mind checking it out for a G73? Just interested since I want to OC.
Panther214 -
Here you see GTX 480M memory :wink:
Attached Files:
-
-
Meaker@Sager Company Representative
For instance the 5870 is dual channel, the 5970 is quad channel and the 460M is tri-channel (each uses blocks of 64bit mem controllers).
DDR stands for double data rate. SDR ram from way back when transmitted once per clock, DDR transmits twice.
GDDR5 improved on this. If you look at a clock waveform it's a square wave (well, it should be). DDR transmits on both edges, whereas GDDR5 effectively transmits twice on each edge. So over the same clock period it transmits 4 times.
1250mhz base clock = 2500mhz DDR = 5000mhz GDDR5 -
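The conversion above can be sketched as a quick helper (a minimal sketch; the function name and structure are mine, the numbers are from the post):

```python
# Effective ("marketing") clock from the real base clock, per the post above:
# plain DDR moves 2 bits per pin per cycle, GDDR5 moves 4.

def effective_clock(base_mhz, transfers_per_cycle):
    return base_mhz * transfers_per_cycle

base = 1250  # MHz, real clock of the GDDR5 chips
print(effective_clock(base, 2))  # DDR-style quoting  -> 2500
print(effective_clock(base, 4))  # GDDR5 quad-pumped  -> 5000
```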
-
Meaker@Sager Company Representative
-
For video memory, I believe the listed frequency of the memory modules is always double the actual frequency of the silicon, unlike how we report frequency for system memory, processors, etc. You will notice for instance that Newegg.com lists memory frequency as "effective memory clock." -
Why do these two programs show different clocks? What's the "real" and the "effective" clock?!
-
-
I am not sure; to be fair I am quite confused, given that AMD says the GDDR5 on its 6950M / 6970M works at 900 MHz.
As far as I know, in one place the clock is divided by 2, in the other one it's divided by 4. So the real clock would be 2400 Mhz.
If someone can shed some light on what should be the standard.... -
-
Meaker@Sager Company Representative
AMD usually quotes the real, while Nvidia quote the effective.
But SERIOUSLY guys, with GDDR5 it's 4X the real frequency to get the effective one.
So that 900mhz = 3600mhz QDR (quad data rate)
The 600MHz of the 460M = 2400MHz QDR -
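The two vendor conventions described in this exchange can be captured in a small helper. Note the conventions (AMD quoting real, Nvidia quoting doubled) are claims made in this thread, not official documentation, and the function is my own illustration:

```python
# Per the posts above (thread claims, not official documentation):
# AMD quotes the real GDDR5 clock, Nvidia quotes double the real clock,
# and GDDR5 always moves 4 bits per pin per real clock cycle.

def real_and_qdr(quoted_mhz, vendor):
    real = quoted_mhz if vendor == "amd" else quoted_mhz // 2
    return real, real * 4

print(real_and_qdr(900, "amd"))     # 6950M -> (900, 3600)
print(real_and_qdr(1200, "nvidia")) # 480M  -> (600, 2400)
```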
Ok, thanx, that's what I wanted to know.
-
-
Meaker@Sager Company Representative
Effective frequency of data.
If you have memory that transmits once per cycle at 1Ghz and memory that transmits four times per cycle at 250mhz they have the same effective data rate.
Sure there are latency differences but these tend to get ignored, especially for graphics cards. -
What they say: the 6950M has 900MHz 256-bit GDDR5 memory and 115.2GB/s of bandwidth.
These two statements are incongruent if the actual frequency of the GDDR5 modules is 900MHz.
900Mhz x 2 for DDR memory x 2 for running at half the bus frequency x 2 for two data pathways x 256 bit bus / 8bits/byte = 230GB/s which is twice the stated bandwidth of 115GB/s.
So really the GDDR5 memory on AMD's 6950M is operating at 450MHz. This is probably why Newegg.com lists their memory frequencies as "effective." But "effective compared to what" is a big question as well, since clearly Nvidia and AMD have different ideas about effective memory frequency.
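One way to check the two readings is to compute the bandwidth under each assumption and compare against AMD's published 115.2 GB/s. This is a quick sanity check using only numbers quoted in this thread; the helper function is my own:

```python
def bandwidth_gbps(clock_mhz, transfers_per_cycle, bus_bits):
    # MHz * transfers * bus bits / 8 bits-per-byte / 1000 -> GB/s
    return clock_mhz * transfers_per_cycle * bus_bits / 8 / 1000

# Reading 1: 900 MHz is the real clock, GDDR5 moves 4 bits/pin/cycle.
print(bandwidth_gbps(900, 4, 256))  # -> 115.2, matches the published figure
# Reading 2: the extra doublings argued for above.
print(bandwidth_gbps(900, 8, 256))  # -> 230.4, twice the published figure
```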
-
What I do know for certain is that while Nvidia states for instance that their 480M works at 1200 MHz, if you check Nibitor you notice that the frequency table has 600 MHz for the memory. This is why I opted to use 600MHz as the frequency, given that this thread is mainly meant to help those who want to overclock. On the other hand, then I should change the frequencies that I have posted on the first page... -
Meaker@Sager Company Representative
DDR = Double data rate
DDR transmits TWICE per cycle. Hence the name (double data rate).
GDDR5 transmits on the high as well as the low edge so doubles again.
If you look at the Nvidia website they quote all their frequencies as double their actual which IMO is just wrong, but there you go.
Go to AMD's website and they quote the actual frequency.
The 6950 has a REAL memory clock of 900mhz.
460M has a REAL memory clock of 600mhz.
These are FACTS.
No one would comment on how castrated the memory speed is on the 480M if it actually ran at 1.2ghz would they?
But the desktop Radeon 5770 only had a 128-bit bus, with GDDR5. I clocked it to 1400MHz (real) from 1200MHz; if that were the DDR frequency it would have had shocking bandwidth. ATI advertises it as having 1200MHz memory.
They don't call it QDR because true QDR would transmit 4 times on the low edge and not at all on the top edge, but really it makes little difference. -
Meaker@Sager Company Representative
Think of it this way: why would I use 450MHz GDDR5 when I could use 900MHz GDDR3? It would use less power and cost less while giving the same bandwidth.
No, I use 900MHz GDDR5 so I can keep my PCB simple and not have to use a 512-bit interface to get the same bandwidth.
The problem is that there is data going at 3600mhz down that bus, meaning that even small amounts of interference can be pretty damaging to performance, and the memory controller has to be pretty damn fast too. -
-
Ok, now I finally got it. I guess I should change the frequencies for the GDDR5s on the first page.
I need to give you more rep, but I don't have any left. -
Meaker@Sager Company Representative
I don't need to do the maths.
Using 450MHz GDDR5 instead of 900MHz GDDR3 would be so massively stupid that it would not be done.
Memory clock speed: 900 MHz
Memory data rate: 3.6 Gbps
That would be 4 bits per clock no?
Those are figures straight from AMD.
Also you have all your maths and info wrong. If you think AMD is running its DESKTOP 6970 memory at 550mhz you are just crazy.
DDR3 1333 runs at 667mhz, DDR3 1600 runs at 800mhz.
DDR = DOUBLE DATA RATE
How much clearer can I make it? The clue is in the NAME.
SDRAM = synchronous dynamic random access memory
DDR SDRAM = double data rate synchronous dynamic random access memory
SDRAM = 1x freq
DDR SDRAM = 2x freq
DDR2 SDRAM = 2x freq
DDR3 SDRAM = 2x freq
GDDR5 SDRAM = 4x freq -
Here are accurate formulas for figuring actual frequency of the memory chips:
For DDR2 and GDDR3, finding the bandwidth from frequency:
Frequency (in Mhz) x 4 x bus width (in bits) / 8 bits per byte / 1000MB per GB = bandwidth in gigabytes per second.
(or quick and dirty: frequency x bus width / 2 = bandwidth)
For DDR3 and GDDR5, finding the bandwidth from frequency:
Frequency (in Mhz) x 8 x bus width (in bits) / 8 bits per byte / 1000MB per GB = bandwidth in gigabytes per second.
(or quick and dirty: frequency x bus width = bandwidth)
For DDR2 and GDDR3, finding the frequency from bandwidth:
Bandwidth (in GB/s) x 8 bits per byte / bus width (in bits) / 4 x 1000Mhz per Ghz = memory frequency in Mhz.
(or quick and dirty: bandwidth x 2 / bus width = frequency)
For DDR3 and GDDR5, finding the frequency from bandwidth:
Bandwidth (in GB/s) x 8 bits per byte / bus width (in bits) / 8 x 1000Mhz per Ghz = memory frequency in Mhz.
(or quick and dirty: bandwidth / bus width = frequency)
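Transcribed directly into code, the formulas above look like this. Note they use this post's halved-frequency convention, which later replies dispute; the code records the post's math, nothing more:

```python
# mult = 4 for DDR2/GDDR3, 8 for DDR3/GDDR5, per the post's convention.

def bandwidth_from_freq(freq_mhz, bus_bits, mult):
    return freq_mhz * mult * bus_bits / 8 / 1000  # GB/s

def freq_from_bandwidth(bw_gbps, bus_bits, mult):
    return bw_gbps * 8 / bus_bits / mult * 1000  # MHz

print(bandwidth_from_freq(450, 256, 8))   # -> 115.2 GB/s
print(freq_from_bandwidth(115.2, 256, 8)) # -> 450.0 MHz
```

The two functions are inverses of each other, so either can be used to cross-check a vendor's quoted numbers.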
A major issue with this though is that colloquial naming conventions refer to graphics memory at twice its actual clock speed. After looking up some old specs, it seems the change in naming came about when GDDR3 first came out, and a very plausible explanation unfolds. Although GDDR3 was quite superior to DDR, at first the modules were running at a significantly lower frequency. I guess they figured they didn't want to say that memory frequency had decreased (even if performance increased), so to sort of normalize GDDR3's performance against DDR they labeled all the GDDR3 memory as twice its actual frequency. This has been going on ever since. It looks like ATI did not do the same when GDDR5 came out, although Nvidia again doubled the frequency they say the memory operates at. So really, what is the correct way to go about it? Listing memory speeds at double what they really are is disingenuous. And since Nvidia and AMD don't even agree on the same standard for frequency, it is a good idea to boil them down to the actual frequency of the memory modules to compare. -
This is the corrected version:
Code:
Table of bandwidth with respect to frequency of memory modules compared to SDRAM.
SDRAM = 1x frequency
DDR SDRAM = 2x frequency
DDR2 SDRAM = 4x frequency
GDDR3 SDRAM = 4x frequency (GDDR3 is homologous to DDR2)
DDR3 SDRAM = 8x frequency
GDDR5 SDRAM = 8x frequency (GDDR5 is homologous to DDR3)
-
Meaker@Sager Company Representative
DDR3, DDR2 and DDR at the same frequency have the same bandwidth.
DDR3 just uses a few tweaks for speed at the expense of latencies.
In fact that's why some people were against the introduction of DDR3 initially, since the frequencies were only slightly higher than fast DDR2 chips and the latencies were much worse.
Only GDDR5 quad pumps.
GDDR3 has ALWAYS been faster than DDR2 or DDR. So I have no idea what you are going on about.
BTW The short lived GDDR4 was related to DDR3. GDDR5 has nothing to do with it.
One of the things sometimes listed in shops is the bandwidth rating of the chips.
PC2-8500 and PC3-8500
PC2-8500 = 533mhz DDR2 (1066 DDR)
PC3-8500 = 533mhz DDR3 (1066 DDR) -
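The PC2-/PC3- numbers are module bandwidth in MB/s over a 64-bit DIMM, which is easy to verify (a small check; the rounding down to 8500 is the naming convention's, not the arithmetic's):

```python
def module_bandwidth_mbs(io_clock_mhz):
    # 2 transfers per I/O clock (DDR) on a 64-bit (8-byte) DIMM bus.
    return io_clock_mhz * 2 * 8

print(module_bandwidth_mbs(533))  # -> 8528, marketed as PC2-8500 / PC3-8500
```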
As for the DDR3 having worse latencies than DDR2, it was the same for DDR2 when compared to DDR originally. The reason for the shift to DDR2 was that you could (eventually) achieve higher speeds, and the same will be the case for DDR3. Note that 266 MHz is about the standard top end of currently available memory clocks, which is why DDR2 tops out with PC2-8500. Put that same clocked memory in DDR3 (PC3-17000) and you'll end up with double the data transfer rate (of PC2/3-8500).
GDDR5 is often considered to be "based" off of DDR3 because it has the same 8-bit prefetch buffers as DDR3, as well as the doubled data lines (of DDR3 compared to DDR2). The I/O is different though, (same as GDDR3 as compared to DDR2), which is what gives them the added speed.
So, yes, DDR3, DDR2, and DDR at the same I/O bus frequency have the same bandwidth. The thing is, Trottel has been talking about the memory chips, not the bus frequency... -
Righto, folks -
I've included a series of images of an HD4670 I'd picked up from eBay recently to put inside an Acer Aspire 5920G. It previously resided in an Advent 8555 gx (an MSI clone) before it was pulled.
The memory chips are from Hynix, and after much scrutiny I can determine the codes are as follows:
H5PS1G63EFR
20L ____936A
Could I have a bit of assistance in deciphering the makeup of the card?
Thanks!
Edit: According to the chart on Page 1 of this thread, the card's memory looks to be DDR2 @ 400MHz. Could anyone confirm, or prove me stupid? (which I well could be...)
Attached Files:
-
-
Meaker@Sager Company Representative
Google hynix
->
go to products
->
go to gfx ram
->
go to part numbering guide
->
look up the number 36
->
find it is 275mhz (550DDR)
Google is your friend. -
Thanks for leading me in the right direction!
So:
H5PS1G63EFR
H.......Hynix
5.......DRAM
P.......DDR2
S.......1.8V
1G.....8K REF/64ms
6.......x16
3.......8 Banks
E.......6th Gen.
F.......FBGA (Wire Bond)
R.......Lead/Halogen Free
20.....500MHz
L.......Commercial Temp & Low Power -
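For reference, here is the decode above as a small lookup table. The field meanings are this thread's reading of Hynix's part numbering guide, so treat them as illustrative rather than authoritative:

```python
# Decode of H5PS1G63EFR-20L per the breakdown in this thread (illustrative).
FIELDS = [
    ("H",  "Hynix"),
    ("5",  "DRAM"),
    ("P",  "DDR2"),
    ("S",  "1.8V"),
    ("1G", "8K REF/64ms"),
    ("6",  "x16 organization"),
    ("3",  "8 banks"),
    ("E",  "6th generation"),
    ("F",  "FBGA (wire bond)"),
    ("R",  "lead/halogen free"),
    ("20", "500 MHz"),
    ("L",  "commercial temp & low power"),
]

for code, meaning in FIELDS:
    print(f"{code:>2}  {meaning}")
```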
There you go!
-
niffcreature ex computer dyke
Can we revisit the earlier discussion? As long as the levelheadedness sustains, which seems viable.
I don't understand this completely:
Oh and also, this is nice... and you could add the exact model # of chips on HP 3700m and 770m since I happen to know them... but they come with 3 different brands 800-900mhz and it isn't related to any model on the rest of the card.
In light of that it could be entirely possible that some of these are off half the time or more, unless we get like a dozen people with the exact same card it would be hard to say whether all manufacturers do that or only HP.
Last edited by a moderator: May 7, 2015 -
Meaker@Sager Company Representative
Hey niff, they won't ship to the UK btw.
As for the RAM, graphics DDR is out of sync with DDR because it had an extra half generation: the ill-fated GDDR2, which was a MASSIVE power hog but was still based on DDR.
Also GDDR4, which was based on DDR3 IIRC, was not anything special over GDDR3 and also used quite a bit of power; it was only really used on one card IIRC. -
Does anyone know the frequency for the GTX 285M?
I currently have it in my w860cu. I could open it up and check, but that would mean repasting right? -
moral hazard Notebook Nobel Laureate
Yes it would mean repasting.
I don't know the frequency but would be interested in finding out. -
If so, I might do it later today. -
moral hazard Notebook Nobel Laureate
If anti-bac cleaner is 99% alcohol I guess it might be fine, not sure though, never heard of it before.
I use methylated spirits. Anything like cheap ethanol would do the job I guess. -
If someone can help with this I will get the frequency for the 285m
http://forum.notebookreview.com/sager-clevo/555870-help-removing-gpu-heatsink-w860cu.html
How to find your real GDDR frequency
Discussion in 'Hardware Components and Aftermarket Upgrades' started by Blacky, Apr 5, 2009.