Is there a proper way to set up an SSD as a storage/games drive? I just recently purchased a large SSD drive. I'm currently running a smaller mSATA SSD as my main OS drive. Any info would be greatly appreciated.
-
tilleroftheearth Wisdom listens quietly...
If the storage is static, make sure the drive is powered up regularly - at least every three months if the data is important or is the only copy... OP'ing in that case isn't necessary unless you need top performance, but I would still leave at least 10% free space (and jump to a higher-capacity drive if my usage of that system changed).
If the usage is anything more active than the static case described above, OP'ing is still recommended, but 33% of the actual capacity may not be the best tradeoff between capacity and performance (I would be looking at 25%, or at worst 20%, for this type of 'storage drive' usage).
I would also do the following on either drive:
- Create a 'Program Files' folder and a 'User' data folder at the root of the drive.
- Right-click each folder, select Properties, then the Security tab. Click Edit, select the Users group, and in the permissions box below check Full control. Click OK, OK.
- Do the above for both root folders you created.
- In the Users folder, create a new folder (I use the actual user's 1, 2 or 3 initials - shorter is better...). I'll use the 'FRS' folder as an example here.
- After creating the 'FRS' folder, I MOVE the corresponding Users' folder to this folder and create any other data folders I want/need***.
- If you don't want those folders located there (I know of no good reason why not though... as long as the drive isn't removable), simply create the new folders for the data that you do want there.
- For the 'Program Files' folder, simply point your installs to the new folder on this drive (tip: during installation, just change the drive letter in the default path to this drive...) and the program will be installed there.
The above should be all you need to properly set up your new 'storage drive' for most common usage modes when data and program files are to be stored on it, depending on the performance expected and how frequently the contents of that drive change.
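If you'd rather script those folder/permission steps, here's a rough sketch (my own illustration of the above, assuming the new drive is D: and using Windows' built-in icacls tool from an elevated prompt - adjust the drive letter and initials to suit):

```python
# Rough sketch of the folder/permission setup above - assumes drive D:
# and an elevated (administrator) prompt.
import os
import subprocess

DRIVE = "D:\\"             # assumed drive letter of the storage SSD
USER_INITIALS = "FRS"      # per the example above; use your own initials

for root_folder in ("Program Files", "User"):
    path = os.path.join(DRIVE, root_folder)
    os.makedirs(path, exist_ok=True)
    # Grant the Users group Full control, inherited by subfolders and files
    # ((OI)(CI)F) - mirrors the Properties > Security > Edit steps above.
    subprocess.run(["icacls", path, "/grant", "Users:(OI)(CI)F"], check=True)

# Per-user data folder inside the 'User' folder, e.g. D:\User\FRS
os.makedirs(os.path.join(DRIVE, "User", USER_INITIALS), exist_ok=True)
```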
-
Hey, thanks for the detailed reply. The drive will not be static, as I currently have all my games installed on the separate SSD and I play something pretty much every day. I also have 'Program Files' and 'User' data folders there already. I had a regular HDD which contained my 'Program Files' and 'User' data folders, and I just cloned that drive onto my new SSD.
Sorry for my ignorance but what does OP'ing mean?
Also, I had thought there were some special ways to get SSDs set up properly to perform in an optimum way, but I guess that only really matters when setting up the main OS SSD? -
OPing stands for over-provisioning, which you can read about quite a bit on the forums here, as well as SSD-related websites. Some drives come with software that will do this for you, but all you really need to do is shrink your drive's partition so there's some unallocated space left on the drive. Take some time to familiarize yourself with how it is done and what it means: you don't want to lose all of your data in the process.
An SSD needs to actually erase a cell before it can be rewritten. When you delete something, the OS just marks the space as free but doesn't erase anything on the drive; TRIM is what tells the SSD which blocks no longer hold valid data so it can erase them in the background. Without that, writing becomes a two-step process - erase first, then write the new data. If you overprovision the drive, the SSD always has a pool of space where it's guaranteed the data has already been erased. This saves a step, since the write can happen right away without needing to erase first. Fewer steps = better performance. It's further complicated by the fact that the smallest unit that can be erased on an SSD is much larger than the smallest unit you can write to, but I'll leave that out for simplicity.
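A toy sketch of that step-saving idea (made-up timings, one cell per write - real SSDs erase whole blocks, which makes the effect even bigger):

```python
# Toy model of why pre-erased (over-provisioned) space speeds up writes.
# The timings are hypothetical; the point is only that the erase step
# disappears from the write path when spare erased cells are available.
ERASE_COST = 2.0   # assumed time units to erase a cell
WRITE_COST = 1.0   # assumed time units to program a cell

def write_cost(cell_is_erased: bool) -> float:
    """Writing to a pre-erased cell skips the erase step."""
    return WRITE_COST if cell_is_erased else ERASE_COST + WRITE_COST

print(write_cost(cell_is_erased=False))  # 3.0 - erase first, then write
print(write_cost(cell_is_erased=True))   # 1.0 - write immediately
```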
To get the best performance, you want to allocate enough extra space (OPing) so that you can always write without having to delete first. The drive will have less total space, since you're trading off space for better performance. tilleroftheearth knows a lot about this, so I would follow his guidelines.
There's always tweaking you can do, some with larger gains than others. Many of the tweaks you'd see for early-generation SSDs were actually aimed at reducing the number of writes (rather than improving performance), since nobody really knew at that time how quickly a drive might wear out. This is even less of a concern on a storage/games drive, since it will be much more static overall relative to an OS drive. -
Thanks for the info. And I just read up on it. I also read up on the amount to OP drives and people seem to vary. I've seen tilleroftheearth mention 30% and up as well.
Is it too late to OP the storage/games drive that I have now? For OPing to be effective do I need to format the drive and then OP and reload all the data? -
I've read that some people recommend creating a new partition to fill the space (and then deleting that partition), and that this step declares the space to the drive controller as usable for the OP performance benefits, but I'm not 100% sure this step is needed (although it shouldn't hurt, and only takes a minute to do). -
For a new drive, it shouldn't be needed as the logical space is already marked as free. For an old drive this might help, but only if the partition tool TRIMs the deleted sections.
-
tilleroftheearth Wisdom listens quietly...
To set up OP'ing on a used drive:
Shrink the main partition as much as you can (I would leave 1GB of free space or less, if I could). Don't worry, this is only temporary.
Create a new partition on the 'unallocated' space and format it. Now, don't use the computer (and make sure you have disabled sleep and/or hibernation) for at least an hour.
If you are using Windows 7 SP1 and above, you will now have that part of the capacity Trimmed and ready to go.
Now, after leaving it for at least an hour (or better yet; overnight) delete the partition you created.
Expand the main partition to the size you want/need or to the 67% optimum value I find gives the best performance for the least capacity loss.
Do not take into account what the SSD may be using as 'built in' OP'ing from the factory. Simply take the total number of MBs that Disk Management shows when you're expanding or shrinking the partition and multiply by 0.67. I usually round that number up/down to the nearest GB size that makes sense to me and then add 200MB so Windows still shows me that capacity once formatting is accounted for.
So if I am using a 480GB (nominal) SSD, the actual total MB's might be 457,965. Multiplied by 0.67, that is 306,836 MB's. Rounded up, that is 300GB of total capacity and adding 200MB would have me at 307,400 MB's total allocated space with the remainder (150,565 MB's) left as unallocated for an approximate 33% OP level for best sustained performance over time.
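The same arithmetic as a quick sketch (numbers from the example above; the 0.67 factor and the +200MB fudge are just my rules of thumb, nothing the drive requires):

```python
# Worked version of the 480GB example above.
total_mb = 457_965                      # total MBs reported by Disk Management
keep_fraction = 0.67                    # keep ~67%, leave ~33% unallocated (OP)

keep_mb = total_mb * keep_fraction      # ~306,836 MB
keep_gb = round(keep_mb / 1024)         # round to the nearest sensible GB -> 300
partition_mb = keep_gb * 1024 + 200     # +200MB so the formatted size still reads ~300GB

unallocated_mb = total_mb - partition_mb
print(partition_mb, unallocated_mb, unallocated_mb / total_mb)
# 307400  150565  ~0.33 -> roughly 33% over-provisioning
```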
Hope this helps. -
Thanks for the tips, guys. Again, this SSD (used, and it already has my user folder and games on it) will be for storage and games. I have a separate SSD for my main drive. I will give this a shot tonight. When you say don't use the computer, you mean just leave it running all night, correct?
-
tilleroftheearth Wisdom listens quietly...
Yes, correct. Even better if you leave it at the BIOS screen or at least with no internet connection (then it has nothing or at least, less to do...)
-
Are you sure about the BIOS screen? The normal OS needs to be loaded in order to issue TRIM commands.
Once TRIM is issued, idling at the BIOS screen might be better for GC. -
tilleroftheearth Wisdom listens quietly...
At the BIOS screen, the O/S cannot (and of course, doesn't) do anything to the drive.
This is what we want so TRIM and GC can finish unimpeded by any and all of the normal interruptions from the O/S (automatic maintenance, virus scanning, indexing and Windows Updates, to name the most common background tasks that run while the O/S is sitting there 'idle').
However, at the point when the new partition is formatted, that is when TRIM is issued to the drive. The O/S from this point forward is now only interfering with the internal TRIM/GC process. That is why going to the BIOS is preferred.
If you really want to do this right:
Shrink the main partition as small as you can (before you do, if you can move some GBs' worth of data to an external drive to make it even smaller, I would - afterwards, you can copy it back).
Now, boot up with a Windows 7 SP1/8/8.1/10 Setup installer USB/DVD and proceed to the advanced drive setup screen. Format the 'unallocated' space you created earlier in Windows.
Let it sit at this screen as long as you can (at least 1 hour... overnight is better).
When you have waited enough; select that new partition (Carefully!!!) and delete it.
Cancel out of the installer. When you're back in your normal Windows O/S, expand the partition to the size you want and copy back any data you may have moved to an external drive.
At this point, I run MyDefrag on the drive (and yes, defragging does make a difference).
On normal/heavily used disks, I continue to run MyDefrag about once a month (after MS Patch Tuesdays) to keep the system at its peak.
(I used to recommend PerfectDisk Pro, but have found MyDefrag to be lighter on the system, just as effective, and free - and both are much better than no defrag at all.)
Enjoy.
-
Boot up with a Windows installer? It doesn't make a difference. A partition is a partition. Just create a partition, initiate TRIM in Windows 8/10, or use the O&O defrag tool for TRIM in Win 7 (use the pro demo, not the free version). Let it idle however you want for 15-30 minutes, then delete the partition, and re-TRIM the remaining partition(s) on the drive.
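For reference, a minimal way to kick off that retrim on Windows 8/10 without a third-party tool (a sketch, assuming the games drive is D: and an elevated prompt - defrag's /L switch asks Windows to re-send TRIM for the volume's free space, which is what the Optimize Drives GUI does for SSDs):

```python
# Minimal retrim sketch - assumes Windows 8/10 and an elevated prompt.
import subprocess

DRIVE = "D:"   # assumed drive letter of the storage/games SSD

# defrag.exe's /L switch performs a retrim of the volume's free space.
subprocess.run(["defrag", DRIVE, "/L"], check=True)
```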
Defragging only makes a difference immediately, until the garbage collection and TRIM kick in again; then it doesn't matter, because the contents are rearranged by the firmware on the fly anyhow (due to wear leveling), and it's an artificial improvement. 30% OP is overkill for most users. Unless you do heavy writing to the drive regularly and/or fill the drive to the brim, it just doesn't matter. If you're worried, just OP 10-15% and you will be fine. -
tilleroftheearth Wisdom listens quietly...
The difference is that if the drive is constantly being poked with background Windows tasks, it will never get to the TRIM we want it to do. TRIM is a 'suggestion' to the drive for when it is idle; it isn't a command that is run on demand.
Defragging makes a difference beyond GC and TRIM running (once or more). It doesn't matter how the internal arrangement of the data is stored. It is the logical fragmentation that makes a difference. A file that is logically in thousands of fragments will take longer to read than one that is in fewer or no fragments. This is what the O/S sees, not what the hardware is doing...
The improvement is not artificial or temporary unless the workload forces the data to become fragmented again. On my systems, I defrag after MS Tuesday updates are done and the computer has been left to do its background post-update cleanup routines (i.e. 'maintenance') for a couple of hours or more... and I usually do not need to do it again until next month.
What the hardware is doing internally and what the O/S sees are two different things. Defragging still helps as much as when HDD's roamed the earth. At least if you want the fastest O/S experience possible.
-
Unless you know what the actual GC and wear-leveling routines are, it's hard to know exactly what the drive is doing. It may artificially improve things depending on usage scenarios. Heavy use with little to no downtime, maybe. But for general use, the system is idling enough for TRIM to work. Easy to check: write a bunch of small files to a drive, run a performance benchmark, then let it sit for 10-15 minutes (even while browsing the web or doing some mundane tasks), and test again - with most modern SSDs it will improve.
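Something along these lines, if you want to try that check yourself (a rough sketch with arbitrary file counts/sizes - point it at a scratch folder on the SSD in question):

```python
# Rough version of the check described above: write a pile of small files,
# time it, delete them, let the drive idle 10-15 minutes, then run it again.
import os
import time

TARGET = r"D:\trim_test"          # assumed test folder on the SSD in question
COUNT, SIZE = 2_000, 64 * 1024    # 2,000 files of 64KB each (arbitrary)

os.makedirs(TARGET, exist_ok=True)
payload = os.urandom(SIZE)

start = time.perf_counter()
for i in range(COUNT):
    with open(os.path.join(TARGET, f"f{i:05d}.bin"), "wb") as f:
        f.write(payload)
elapsed = time.perf_counter() - start
print(f"Wrote {COUNT} files in {elapsed:.1f}s "
      f"({COUNT * SIZE / elapsed / 1e6:.1f} MB/s)")

# Clean up, then let the machine idle so TRIM/GC can run before re-testing.
for name in os.listdir(TARGET):
    os.remove(os.path.join(TARGET, name))
```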
If it were a good idea, then you would think that at least one major SSD maker would provide a defrag utility, but none do - only TRIM utilities. Only if you are constantly writing to the drive, shutting down and starting up, and then writing continuously again will it possibly help. But it doesn't for 99.99% of users.
And who is to say that by defragging you aren't just kicking off the GC routine that would have run on its own anyway?
All OEM's state that defragmenting your SSD is not a good idea.
Samsung: http://www.samsung.com/global/business/semiconductor/minisite/SSD/global/html/support/faqs_03.html
" Will defragmentation improve my Samsung Solid State Drive’s performance?
No, Solid State Drives do not need defragmentation because they have no moving parts and can access any location on the drive equally fast.
Please disable any defragmentation utilities on your computer because they will only wear down the performance of your SSD.
Visit the OS Optimization section of Samsung SSD Magician for help doing this."
Micron: http://www.micron.com/about/blogs/2015/july/windows-10-and-your-ssd
" Defragmentation causes a lot of controversy and consternation. Generally, defrag should not be run on SSDs, and it’s probably best to disable any regularly scheduled defrag in Windows 7. Don’t worry too much if you’ve run defrag on your SSD a few times, though. It will cause a bit of extra wear, but probably not so much that you’d notice it over the lifetime of your SSD. The good news is that in the newer Windows versions, defrag for SSDs is completely removed so you don’t need to worry about this anymore."
Kingston: http://www.kingston.com/en/ssd/upgrades
" SSDs do not require defragmentation. Since there are no physical disks, there is no need to organize the data in order to reduce seek time. Therefore defragmenting an SSD is not effective. Also, defragmenting an SSD can put undue wear on specific areas of the drive. SSDs are designed to write data as evenly as possible over the entire drive to reduce undue wear to any one location. Nonetheless defragmenting your SSD drive a couple of times will not harm it. However if it is done continuously over a long period, it may reduce the life of the drive."
I had an official statement from Sandisk but can't find it at the moment.
O&O, which makes great Windows OS products and offers defrag software itself, says it is not a good idea: https://blog.oo-software.com/en/defragmentation-and-ssds-opportunities-and-risks
" Defragmenting an SSD does not lead to any performance improvement: it can even reduce the life expectancy of an SSD. This must be avoided either through automatic recognition by the operating system or by a defragmentation software. A regular and automatic run of TRIM commands can substantially increase the performance of an SSD and extend its life expectancy."\
Piriform Defraggler only implements TRIM when defragmenting SSDs as well: https://www.piriform.com/docs/defraggler/technical-information/defraggler-and-ssds -
tilleroftheearth Wisdom listens quietly...
Doesn't matter what a manufacturer is saying. They are only doing it to cover their legal behinds and get their products past warranty anyways.
I buy my products to use as I wish. And defragging for the past 4 years or so hasn't shown anything but benefits to me and my clients' systems too.
I don't claim to know what the drive is doing internally. And I already stated that and don't really care, anyways.
What I do know is that logical file fragmentation on an SSD is just as real as physical fragmentation on a HDD. The O/S responds the same way; it starts stuttering and skipping a beat vs. what it feels like 'clean'.
Defragging for 'only' logical file fragmentation reduces seek times for both HDDs and SSDs too. Even a few ms of latency adds up to longer-than-HDD levels of seek time when a file is fragmented into the thousands.
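As a back-of-the-envelope illustration (hypothetical per-fragment overhead - the real number depends on the drive, queue depth and workload):

```python
# Hypothetical numbers - just to show how per-fragment overhead scales.
fragments = 5_000            # a heavily fragmented file
overhead_ms = 0.1            # assumed extra latency per separate read request

print(fragments * overhead_ms)  # 500.0 ms added just from issuing 5,000 reads
```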
And about the 30% OP'ing being unnecessary? The following link will show (once again...) how OP'ing gives the performance the SSD is capable of to the user, rather than to the GC/TRIM routines that won't run at a time we want them to (i.e. on demand).
See:
http://www.anandtech.com/show/9451/the-2tb-samsung-850-pro-evo-ssd-review/2
(See the first graph in the link above and select the same drive for the 25% OP graph).
-
It's not just OEMs but also makers of defrag software - but you must have missed the last part of my post.
I don't see 30% in that link you provided. What am I missing? I see QD32 performance, which means nothing to most users. I also see the 850 EVO performs marginally faster with 12% OP vs. the PRO's 7% OP, but the PRO still beats the EVO on performance consistency despite the OP advantage. -
tilleroftheearth Wisdom listens quietly...
A Chevy Chevette won't become a Corvette, no matter what gas you put in it... So, no wonder the EVO is still left gasping for air vs. a better designed SSD...
And makers of defrag software don't want to be held liable either (they have lawyers too...).
30% (or more correctly, 33%, to be exact) is my finding of where additional OP'ing stops increasing sustained performance over time enough to justify the reduction in capacity. Still, my desktop SSDs are OP'ed 50% or more. (Why? Because the reliability still increases, because I have multiple SSDs for different tasks, and because the capacity hit isn't as important as on a limited-bay mobile system - at least one drive for each 'type': O/S drive, Scratch Disk(s) (temp files) and WIP Disk(s) (work in progress, aka data files).)
See:
http://www.flashmemorysummit.com/English/Collaterals/Proceedings/2012/20120822_TE21_Smith.pdf
It is kind of ironic that the quick search I did finds the above link/PDF (from LSI - aka 'SandForce'). But it does show the state of NAND even today.
See:
http://www.infortrend.com/ImageLoader/LoadDoc/435/True/True
And here is one for Intel SSD's...
See:
http://www.anandtech.com/show/6489/playing-with-op
And of course, the Anand Lal Shimpi article that finally (back then) validated what I was saying for a long time already.
Anand recommends 25% for a 'good balance'. I recommend 30/33% for the optimum balance, from my experience.
The anandtech link above also shows how OP'ing by 50% keeps giving improvements still...
You can argue about the benefits of OP'ing all you want. The facts say otherwise. Most tests use QD32 or higher because the 'benefits' shown will be more apparent. But even at single QDs the benefits are there - I just can't find a reference for you right now.
When normal/light users get their system back and tell me a few days/weeks later that their system has never been so responsive (with 33% OP'ing in addition to less than 50% of the remaining capacity in use...), it is a testament to how effective OP'ing is for any type of workload.
I understand that not all people are as sensitive to this increased responsiveness in their systems...
But like I mention when I suggest OP'ing by the 'tiller-recommended' amount... the reason to do so is to get the most consistent performance from the SSD that a user has paid for (and it still isn't anywhere close to the advertised speeds...).
Even some people that initially choose capacity instead of sustained performance tell me later (after I finally OP their system) that they really believed that they wouldn't notice a difference (but they did).
Not trying to convince you or anyone else to OP, just because I say you should. Everyone can make their own choice.
But facts are facts, and they should be highlighted first and foremost - not a belief that OP'ing isn't useful for anyone who wants the fastest, most consistent performance possible from the still-overpriced SSDs we are stuck with in Q3 2015.
-
Starlight5 Yes, I'm a cat. What else is there to say, really?
-
tilleroftheearth Wisdom listens quietly...
But it sure isn't just an octane fuel level upgrade in this one. -
I never said OP'ing wasn't beneficial. But it has limited benefit to most users and MORE OP is useful only in *CERTAIN SPECIFIC SCENARIOS* which pertain to a small percentage of users, yet you make it seem like their computers will time warp back to 1999 performance. Even in the Intel example you gave, they only OP'd to 20% and specifically state "This application note aims to help users decide whether to adopt over-provisioning and choose its level in their environments." They don't say "It is imperative that you do this or your performance will drop through the floor." And if that were the case they would OP a lot more from the factory.
And regarding defrag (again) Intel even states it's not a good idea: http://www.intel.com/support/ssdc/hpssd/sb/CS-029623.htm
"Should I defrag my Intel® Solid-State Drive,using Windows* Disk Defragmenter, or a similar program?
No. SSD devices see no performance benefit from traditional hard disk drive defragmentation tools.
For legacy operating systems, disable any automatic or scheduled defragmentation utilities for your Intel® SSD. Using these tools adds unnecessary wear to the SSD.
Newer Windows* operating systems can detect the presence of an SSD and disable the defrag function.
Microsoft Windows 7*, Windows 8*, and Windows Server 2008* have the TRIM command enabled by default."
I've had a lot of experience with "average Joe" users and SSD's and can tell you I could OP 50% or OP 10% and they wouldn't know the difference. -
tilleroftheearth Wisdom listens quietly...
HTWingNut,
Where we differ on this issue seems to me to be:
- I want the most performance possible from my (very) expensive storage subsystems.
- Paying (effectively) 1/3 more for a given capacity is a fair tradeoff for the HUGE increase in sustained performance over time the SSD rewards me and my customers with.
- The absolute $$$ amount is not as important as getting to use the absolute maximum performance the drive can offer for the longest possible time.
- Legal restrictions and marketing concerns (aka; never build a perfect product... otherwise, you sell yourself out of business...) do not limit my testing, thinking or doing when the results of my own testing are so positive over such a long time frame. I've been saying this in one form or another since late 2009, btw. And we're still discussing this today? Shock...
- Average Joe I couldn't care less about. When someone asks for my expertise, they get it, in full. Not a watered-down version or one based on an assumption on my part. Yeah, some have asked to 'get their full SSD back' and I expand the drive and voila, it's there. But more people than not see the benefits. And more people than not use an SSD that is barely full, so OP'ing for them doesn't put them out one iota anyways.
And I have written about setups for clients where the performance was at 1999 levels... and OP'ing brought back a system from the dump it was destined for.
You obviously did not read those links I provided. Nor do you seem to be sensitive to the benefits offered by OP'ing either. No problem. Others are.
To use an SSD without OP'ing (i.e. without optimizing its setup characteristics...) is like paying for a Ferrari and putting Walmart tires on it with recycled oil and tractor fuel inside.
Yeah, it will move. But it won't run.
My clients don't just want the pretty ride. They want the bat out of hell, underneath.
-
I think there's some merit to where HTWingNut is going regarding specific use scenarios, so I wanted to have a closer look at the anandtech article ( http://www.anandtech.com/show/6489/playing-with-op). Thanks for linking it in, tilleroftheearth. The gains are indeed quite substantial, even up to an OP rate of 50%.
Having a closer look, the results are taken under two conditions:
-The drive is completely full before the test starts
-You're writing to the drive for a half hour (if someone can clarify what their 4k workload is, that would be helpful, but I couldn't find it)
Do both of these cases apply to Joe average? The first one almost surely does. Especially with a smaller or midsize SSD (128-256 GB), they tend to fill it all up. What about the second case? I can't find enough details about the workload to tell. If it's a heavy workload, then it almost surely doesn't apply. If it's a lighter workload, then it might well apply.
I think I would argue that Joe average is probably likely to benefit from 25% OPing, since it protects him from himself when he fills the drive all the way up. The performance of a completely full SSD with no OPing is pretty awful.
In my particular case, I tend to run with less OP because I usually have a substantial amount of free space. In hindsight this means I probably should add more OPing to the darn thing if I'm never using the space anyway. Strictly speaking, unallocated space > free space. Additionally, I have to go to work every day, so the drive gets a nice period of time to catch up on all of its maintenance. If I were using my computer for more intensive work with fewer rest periods, then I would be inclined to OP a bit more.
I think it's worthwhile to consider one's workload and make a reasonable choice. It's also worthwhile to OP to some degree as a safeguard so the user won't find themselves in the worst case - with their drive entirely full. In the case of tilleroftheearth, where he's managing other people's systems and promising them maximum performance, 30% doesn't seem unreasonable. -
I agree. If you have a smaller drive, OP'ing some extra will benefit the average user if they tend to fill the drive to the brim. It all depends on the drive too. Some drives have more aggressive GC and general cleanup algorithms than others.
I'm just saying that tiller always thinks everyone is running the same tasks as him and that their computer will explode if it's not OP'd 50% and defragged every other day. It's misinformation. It's not accurate. It applies to specific situations and not *required* for most users. If I were a new user, average Joe, looking to buy an SSD to bring new life to my three year old laptop, I would needlessly spend twice as much as I needed to on an SSD based on tiller's recommendations. I'm just the sanity check here to say it's not necessary if your normal "workflow" is browsing the web, playing some games, maintaining some MS Office stuff, and watching some videos like 90% of users do.
Heck, I'm a heavy user. I encode videos, I do some pretty heavy disk I/O time and again, and I OP my SSDs 20% for my OS drive and 10% for my 'scratch' or 'storage' SSD. And have nary an issue. Same with other users I work with and have helped as well. No sob stories of how their system has come to a crawl. -
tilleroftheearth Wisdom listens quietly...
Now you're doing what you say you're trying to prevent me from doing... Misinformation, inaccurate and as I've already stated, I don't care about most users or Average Joe... sigh...
I did not warn of exploding computers if they're not OP'd by 50% and defragged every other day - ever. I spelled out exactly what my recommendations are designed to achieve: a storage subsystem that slows down as little as possible, (almost) no matter how the system is used, AND with the least capacity 'sacrificed' for this benefit.
As for your actual use of OP'ing in your own systems, is it because you saw the slowdowns I caution against in your own setups?
In any event, I'll state it again for the millionth time (yeah, exaggeration to make a point...). I do use and consider workflows other than my own 'heavy' ones. I'm just more sensitive to what a few (more) points of OP'ing can do for a system than most, I guess.
-
StormJumper Notebook Virtuoso
-
Nice
Note: Just bought a new Samsung 850 Pro 256GB, great deal; will install this as my main drive and then use the 840 EVO 250GB as a USB 3.0 backup drive with Macrium Reflect backup software.
Cheers
3Fees
Randy
-
Adding: I just got and tested a StarTech USB 3.0 to 2.5" SATA HDD/SSD cable with UASP (or UAS, same thing - SCSI commands available). The StarTech cable has a tiny blinking red light, and a chip on the USB 3.0 side that enables the SCSI commands for faster input/output. Testing the StarTech cable, CrystalDiskInfo showed SATA 600/SATA 600 (Current Mode | Supported Mode). Most if not all USB 3.0 controller hubs have UASP (UAS) available. Test drive: WD Blue 2.5", 750GB, SATA 3.
Did not test the SATA 3 drive with a non-chipped (no SCSI command) cable for comparison.
My Seagate Backup Plus showed SATA 300/SATA 300 (Current Mode | Supported Mode) - a SATA 2 drive.
2-year warranty, ~$15; comes in a baggie with instructions/warranty and the cable. See Startech.com to check it out: USB3S2SAT3CB, USB 3.0A (9-pin SuperSpeed).
Samsung 850 Pro 256GB not in yet. My EVO will be using the StarTech cable; update on this later.
Cheers
3Fees
-
Starlight5 Yes, I'm a cat. What else is there to say, really?
3fees, USB = no TRIM. I'd say using an SSD for backup is a waste.
-
Not necessarily a waste. But why go from a 250GB to a 256GB drive? I can see it if there was an upgrade in space, or you have a spare SSD that you need to do something with. But going from an 850 EVO to an 850 Pro will only give marginal improvements. That being said, even without TRIM, the drive will still execute its GC and wear-leveling routines since they're independent of the OS and interface.