I have a Crucial Real SSD C300 256GB and my small-transfer performance is much lower than benches I've seen. Granted, those were on a desktop with a SATA 6G port, so I'm wondering if that is the cause of the performance difference and was hoping anyone else with a similar system could confirm. I tried installing the Intel Rapid Storage drivers and the results are the same. I also tried running CrystalDiskMark in safe mode, and there the Random Read and Write 4KB (QD=1) results are about twice as fast.
Due to 3G vs. 6G the sequential and 512K results seem about right, but the random 4K seems really low?
LEFT (notebook, SATA 3G):
ASUS G53JW-XA1
Windows 7 Ultimate 64-bit
Crucial Real SSD C300 256GB, Firmware v2

RIGHT (desktop, SATA 6G):
Core i7 920, Marvell 9123 SATA 6G controller
Windows 7 Ultimate 64-bit
Crucial Real SSD C300 256GB, Firmware v1
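On the 3G vs. 6G question above, here is a minimal back-of-the-envelope sketch of the interface ceilings. The line rates and 8b/10b encoding overhead are nominal figures, not measured numbers, and controller overhead is ignored:

```python
# Back-of-the-envelope SATA throughput ceilings.
# SATA uses 8b/10b encoding, so usable payload is roughly 80% of the line rate.

def sata_ceiling_mb_s(line_rate_gbps: float) -> float:
    """Approximate usable MB/s for a given SATA line rate in Gbit/s."""
    return line_rate_gbps * 1e9 * 0.8 / 8 / 1e6  # bits/s -> payload bytes/s -> MB/s

sata_3g = sata_ceiling_mb_s(3.0)  # ~300 MB/s
sata_6g = sata_ceiling_mb_s(6.0)  # ~600 MB/s
print(f"SATA 3G ceiling ~{sata_3g:.0f} MB/s, SATA 6G ceiling ~{sata_6g:.0f} MB/s")

# Sequential reads on a C300 can exceed the 3G ceiling, so those scale with the
# link. 4K QD=1 traffic stays well under 100 MB/s on either link, so raw
# bandwidth alone doesn't explain a 2x gap there; QD=1 is latency-bound.
print(f"40 MB/s of 4K traffic is only ~{40 / sata_3g:.0%} of the 3G ceiling")
```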
-
Thank you. I did search, but with 100 threads of info it was impossible to find the relevant details quickly. I applied the power settings changes and my idle temps stayed the same, while my 4K went from 15.7/21.2 to 22.8/39.1, nearly double the write performance!
My question now is what a C300 is supposed to get on SATA 3G, as the 6G bench is still nearly double this. -
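For reference, a minimal sketch of that kind of power tweak, assuming it is the usual advice of switching to the High Performance plan and turning off PCIe Link State Power Management; the exact settings applied aren't spelled out in this thread, so treat the specifics as an assumption:

```python
# Sketch of common notebook power tweaks for SSD latency, driven via powercfg.
# ASSUMPTION: the actual changes applied may differ; these are the usual ones.
# Alias names (SCHEME_MIN, SUB_PCIEXPRESS, ASPM) are the ones powercfg -aliases reports.
import subprocess

def powercfg(*args):
    cmd = ["powercfg", *args]
    print(">", " ".join(cmd))
    subprocess.run(cmd, check=True)

# Switch to the built-in High Performance plan (alias SCHEME_MIN).
powercfg("-setactive", "SCHEME_MIN")

# Turn off PCIe Link State Power Management (ASPM) while on AC power,
# then re-apply the scheme so the change takes effect.
powercfg("-setacvalueindex", "SCHEME_CURRENT", "SUB_PCIEXPRESS", "ASPM", "0")
powercfg("-setactive", "SCHEME_CURRENT")
```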
According to benchmarks from here (which are also supposedly from a 6 Gbps setup), your numbers are fine. Don't forget that there's a big difference between running drives just as storage drives (which most review sites do) and using them as OS drives.
-
Thus, as you can see, the difference between a benchmark run on a drive used purely for storage and one run while the drive is in use as an OS drive will depend on whatever other processes are running at the time that require drive access. Review sites benchmark drives as storage drives because that gives "true" numbers: the "real" maximum rates of the drive. Even when used as an OS drive, those numbers don't actually change; it's just that a benchmark run on an OS drive can't measure the full value because part of it is already being used. -
That makes sense, but if you have everything closed and an idle system there is no way it's going to be a "big difference", so that can be ignored. The difference I am trying to figure out is this: my 4K tests on my notebook are now 21/40, where in the Legit Reviews bench they are 35/78. What I am wondering is whether that is attributable to the inherently slower notebook subsystem and SATA 3G, or whether it is something I need to look at.
-
How "idle" is idle, though? Even on XP, most "idle" systems still have 30-40 processes running. It gets worse with Vista and 7. Still, I've looked through several other benchmarks via google, and pretty much all the "real-world" (i.e. not done as a specialized review) benchmarks come out as about 30/50 for 4K... on desktops with SATA II (3.0 Gbps). Even more to the point, in terms of actual perceived performance... you're not going to notice a difference between your current scores and the "ideal" scores from the legit reviews bench. It's a little like complaining that your car can't get up to a full 200 mph, it can "only" get up to 175 mph... you'll never notice except on a dragstrip.
-
For average users I agree, but it depends on what you are doing; if you are copying a directory with 15,000 4KB files then you sure are going to notice the difference. I read 15,000-30,000 files daily as part of my work creating a data file, and the speed at which these are read does matter to me.
About the processes: it doesn't matter if you have 10 or 1,000 processes running; if they aren't using any CPU or accessing the disk, which none of them are during idle, then they aren't going to affect performance.
Thanks for the 30/50 report. If anyone else has any info, please post it.
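To put a rough number on that, here's a quick sketch of timing that kind of many-small-files read. The directory path is a placeholder, not a real path from this thread, and a warm run will be skewed by the Windows file cache, so run it after a reboot:

```python
# Time a pass over a directory of small (~4 KB) files; this workload is
# essentially serial 4K reads, so it tracks the 4K QD=1 numbers above.
# DATA_DIR is a placeholder path for illustration only.
import os
import time

DATA_DIR = r"C:\data\daily_files"

start = time.perf_counter()
total_bytes = 0
file_count = 0
for name in os.listdir(DATA_DIR):
    path = os.path.join(DATA_DIR, name)
    if os.path.isfile(path):
        with open(path, "rb") as f:
            total_bytes += len(f.read())
        file_count += 1
elapsed = time.perf_counter() - start

print(f"{file_count} files, {total_bytes / 1e6:.1f} MB in {elapsed:.2f} s "
      f"(~{total_bytes / 1e6 / max(elapsed, 1e-9):.1f} MB/s)")
# At ~21 MB/s vs ~35 MB/s 4K read, the same 15,000-file pass takes roughly
# 1.7x longer, which is noticeable on a daily job.
```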