Over time the price of storage on the cloud becomes progressively cheaper. This translates to larger storage quotas at the same or a slightly reduced price. Cloud providers use this as a competitive selling point: a higher data storage quota makes a provider look like the better option. But is this overemphasised? Do we really need terabytes of storage for everything? Some large companies and databases will, but others would benefit more from other features offered by other cloud providers. Data storage is a factor, but not the only one.
Of course there is Parkinson’s Law: if you have more time and space then you will quickly find something to fill it with. So if you have an abundance of storage you will find yourself filling it with disorganized and random content, because you feel you have the space to spare.
And the problem with this? Sorting through all that data is time consuming, and you fall into the habit of not really structuring things, which eventually works against you. Companies end up using providers with massive amounts of storage, rather than providers, or different packages from the same provider, that offer other features.
So what are the options?
– Structure data and files so that they hold all the relevant information in a way that is easy to access.
– Question whether you need to hold all those older copies of files, or just the most recent version.
– Look at archiving, especially if you do need to keep older information. Cold storage is significantly cheaper than online storage, and access time for some options is less than a minute. If you only access the data a few times a year the cost is very low.
– Erase any obsolete data. If you cut costs by cutting staff then there is little need to keep all their old files; storing the files may be costing you money. If in doubt, archive them.
If you are in doubt about whether you will need something in the future, archive it in cold storage. If you have not needed it after about 7 years you are probably safe to delete it, but archived storage is cheap enough to keep it if you still have doubts.
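As a concrete illustration, here is a minimal sketch of what automated archiving can look like, assuming an AWS S3 bucket managed with the boto3 library; the bucket name, prefix, and exact day counts are hypothetical, and other cloud providers offer equivalent lifecycle rules.

```python
import boto3

s3 = boto3.client("s3")

# Hypothetical lifecycle rule: objects under the "archive/" prefix move to
# cold storage (Glacier) after 90 days and are deleted after about 7 years.
s3.put_bucket_lifecycle_configuration(
    Bucket="example-company-data",  # hypothetical bucket name
    LifecycleConfiguration={
        "Rules": [
            {
                "ID": "archive-then-expire",
                "Filter": {"Prefix": "archive/"},
                "Status": "Enabled",
                # Transition to cold storage once the data is out of daily use.
                "Transitions": [{"Days": 90, "StorageClass": "GLACIER"}],
                # Expire after roughly 7 years, per the rule of thumb above.
                "Expiration": {"Days": 7 * 365},
            }
        ]
    },
)
```

Once a rule like this is in place the provider applies it automatically, so archiving stops being a manual chore.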
Large amounts of data are going to be an issue if you ever have to migrate somewhere. Think about how long it will take to transfer all that data. If you keep less in the main operating storage that you use every day then the transfer will be much faster, and archived information can be transferred later without you having to halt your regular operations. Really, any storage organization should follow the principle that says you should keep a tidy desk.
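To get a feel for the numbers, here is a small back-of-the-envelope calculator; the figures in the example are hypothetical, and the efficiency factor is an assumed allowance for protocol overhead.

```python
def transfer_time_hours(data_tb: float, link_gbps: float, efficiency: float = 0.7) -> float:
    """Rough time to move data_tb terabytes over a link_gbps link.

    The efficiency factor is an assumption to account for protocol
    overhead and contention; real-world throughput varies widely.
    """
    bits_to_move = data_tb * 8e12           # 1 TB = 8 * 10^12 bits (decimal units)
    effective_bps = link_gbps * 1e9 * efficiency
    return bits_to_move / effective_bps / 3600

# Hypothetical example: 50 TB of everyday operating data over a 1 Gbps link
# takes roughly 159 hours, i.e. close to a week of continuous transfer.
print(f"{transfer_time_hours(50, 1.0):.0f} hours")
```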
There was a rule in 1980s computing: generously estimate the resources you need, then buy double that amount. This idea was specifically inspired by memory needs. But back then memory was measured in kilobytes or maybe megabytes, and it was expensive. These days storage is measured in gigabytes or terabytes, and it is one of the cheaper elements of the industry. In the 1980s you had to make sure you had enough, and used it economically. These days we can find ourselves loaded with more than we need, possibly at the expense of other facilities that we might need more. If we have at least twice the estimated amount of storage we should be comfortable. And if not, the cloud usually allows you to scale up quite easily if your business expands enough to require it.
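A simple way to apply that old rule today is to project growth and then add headroom; the sketch below is a hypothetical illustration, with the growth rate and figures made up for the example.

```python
def provisioned_storage_gb(current_gb: float, annual_growth: float,
                           years: int, headroom: float = 2.0) -> float:
    """Project storage needs with compound annual growth, then apply a
    headroom multiplier; the default of 2.0 echoes the old 'buy double
    your estimate' rule."""
    projected = current_gb * (1 + annual_growth) ** years
    return projected * headroom

# Hypothetical example: 500 GB today, growing 30% a year, planned 3 years out.
print(f"{provisioned_storage_gb(500, 0.30, 3):.0f} GB")  # ~2197 GB
```

The headroom is about comfort rather than precision; since the cloud lets you scale up later, the doubling is less critical than it was in the 1980s.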
Keep data and files that you use every day structured and easily accessible. Older files should be archived in cheaper storage options like cold storage, because there will be occasions when some of them are needed.
Look into cold storage archiving.
Look into software-defined storage to help estimate your requirements.