Over time the price of cloud storage becomes progressively cheaper, which translates into larger storage quotas at the same or a slightly reduced price. Different cloud providers use this as a competitive selling point: a higher quota for data storage makes that provider look like the better option. But is this overemphasised? Do we really need terabytes of storage for everything? Some large companies and databases will, but others would benefit more from other features offered by other cloud providers. Data storage is a factor, but not the only one.

Of course there is Parkinson’s Law: if you have more time and space, you will quickly find something to fill it with. So if you have an abundance of computer storage you will find yourself filling it with disorganized and random content, simply because you feel you have the space to spare.

And the problem with this? Sorting through all that data is time consuming, and never getting into the habit of properly structuring things eventually works against you. Companies end up choosing providers with massive amounts of storage, rather than providers, or different packages from the same provider, that offer other features.

So what are the options?
– Structure data and files so that they hold all the relevant information in a way that is easy to access.
– Question whether you need to hold all those older copies of files, or just the most recent version.
– Look at archiving, especially if you do need older information. Cold storage is significantly cheaper than online storage, and access time for some options is under a minute. If you only access the data a few times a year the cost is very low. (A lifecycle rule like the one sketched after this list can automate the move.)
– Erase any obsolete data. If you cut costs by cutting staff then there is little need to keep all their old files; storing the files may be costing you money. If in doubt, archive them.
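If your files live in object storage, this housekeeping can be automated rather than done by hand. Below is a minimal sketch using Google Cloud Storage lifecycle rules through the google-cloud-storage Python client; the bucket name and age thresholds are assumptions for illustration, not recommendations.

```python
# Minimal sketch: age files into cheaper storage automatically with
# Google Cloud Storage lifecycle rules. Bucket name and thresholds
# are hypothetical.
from google.cloud import storage

client = storage.Client()
bucket = client.get_bucket("example-company-files")  # hypothetical bucket

# Move objects untouched for a year into Nearline (cold) storage...
bucket.add_lifecycle_set_storage_class_rule("NEARLINE", age=365)
# ...and delete anything older than roughly seven years (see the
# retention note below).
bucket.add_lifecycle_delete_rule(age=7 * 365)

bucket.patch()  # apply the new lifecycle configuration
```

One rule handles the archiving step, the other enforces a retention horizon, so old copies stop costing money without anyone having to remember to tidy up.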

If you are in doubt about whether you will need something in the future, archive it in cold storage. If you haven’t needed it after about seven years you are probably safe to delete it, but archived storage is cheap enough to keep it if you still have doubts.

Large amounts of data become an issue if you ever have to migrate. Think about how long it will take to transfer all that data. If you keep less in the main operating storage that you use every day, the transfer will be much faster; archived information can be transferred later without you having to halt your regular operations. Really, any storage organization should follow the principle that says you should keep a tidy desk.
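To put rough numbers on that, here is a back-of-the-envelope sketch. The data volumes and bandwidth below are assumed figures, and real transfers will be slower once protocol overhead and throttling are counted.

```python
# Rough transfer-time estimate: time = data volume / effective bandwidth.
# The figures below are assumptions for illustration only.

def transfer_days(data_tb: float, bandwidth_mbps: float) -> float:
    """Days needed to move data_tb terabytes over a bandwidth_mbps link."""
    bits = data_tb * 1e12 * 8                 # terabytes -> bits
    seconds = bits / (bandwidth_mbps * 1e6)   # bits / (bits per second)
    return seconds / 86400                    # seconds -> days

print(transfer_days(50, 100))  # ~46 days to move everything at once
print(transfer_days(5, 100))   # ~4.6 days if 90% is already archived
```

The point is not the exact figures but the ratio: every terabyte you archive ahead of time is a terabyte that doesn’t hold up the cutover.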

There was a rule in 1980s computing: generously estimate the resources you need, then buy double that amount. The idea was specifically inspired by memory needs. But back then memory came in kilobytes or maybe megabytes, and it was expensive. These days storage comes in gigabytes or terabytes, and it is one of the cheaper elements of the industry. In the 1980s you had to make sure you had enough, and used it economically. These days we can find ourselves loaded with more than we need, possibly at the expense of other facilities that we might need more. If we have at least twice the estimated amount of storage we should be comfortable. And if not, the cloud usually allows scaling up quite easily if your business expands enough to require it.

Keep data and files that you use every day structured and easily accessible. Older files should be archived in cheaper options like cold storage, because there will be occasions when some of them are needed.
Look into cold storage archiving.
Look into software-defined storage to help estimate requirements.

Cold storage is basically the software equivalent of archiving. It is designed for data that customers and businesses only need occasional access to: think old files, photos, non-current accounting records and so on. It’s not a new concept; companies have made the distinction between live and archive storage for many years. Cold storage is considerably cheaper than the alternatives, with the disadvantage that access time is slow. That might now be changing.
Google is launching Cloud Storage Nearline. It looks to store data at about one cent per gigabyte per month, about as cheap as one could hope for; yet the access time is around three seconds, which is hardly any delay at all.
One of the previous disadvantages of archiving information was using it later for analysis. Either you kept a summary of the data, or you kept it all current, or you had to request the data in advance and wait a few hours before you could analyse it. Moving data between storage tiers, online and archive, was time consuming and potentially costly. Google Nearline changes that. As the name implies, nearline data is not far removed from online data, which rather blurs the definition of cold storage.
With this new Google system, online information is available instantly, or as fast as your internet connection allows; Nearline data will be available with only a few seconds’ delay, something most users probably won’t even notice.
It must be understood, however, that the price of storing the data is not the only cost involved. Accessing the information incurs a small charge, which varies depending on whether the information is being transferred locally or across continents. But if you access the information less than about once a month, nearline storage should still be cheaper than regular online storage.
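To see where that break-even sits, here is a sketch with assumed prices in the region of the figures above; actual rates vary by provider, region and date, so treat the constants as placeholders.

```python
# Rough monthly cost comparison: standard vs. nearline-style storage.
# All prices are assumptions for illustration, not a current price list.

STANDARD_PER_GB = 0.026   # assumed standard storage, $/GB/month
NEARLINE_PER_GB = 0.010   # assumed nearline storage, $/GB/month
RETRIEVAL_PER_GB = 0.010  # assumed per-GB retrieval charge

def monthly_cost(gb: float, retrievals_per_month: float, nearline: bool) -> float:
    """Storage cost plus (for nearline) the retrieval charge for full reads."""
    if nearline:
        return gb * NEARLINE_PER_GB + retrievals_per_month * gb * RETRIEVAL_PER_GB
    return gb * STANDARD_PER_GB

# A 1 TB archive read in full once every two months:
print(monthly_cost(1000, 0.5, nearline=True))   # ~$15/month
print(monthly_cost(1000, 0.5, nearline=False))  # ~$26/month
```

Under these assumed prices the archive tier wins comfortably at monthly-or-rarer access, and the gap narrows as reads become more frequent.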
This system also has great possibilities for backup and disaster recovery. The most recent files and updates would probably be stored elsewhere, but everything else could be backed up on Nearline. In the rare but possible case of a disaster, reasonably recent files, perhaps a month old, would be available within seconds.

REFERENCES.
http://techcrunch.com/2015/03/11/google-launches-cloud-storage-nearline-a-low-cost-storage-service-for-cold-data/

With all that’s going on across the net, it’s hard to see the trends within all the various tasks and decisions we encounter. But at a guess, in a world where people always want to cut costs and expand business operations, we tend to think the following:
– Platform as a Service and Software as a Service mean people use anything from smartphones to tablets to home computers to carry out their company business, reducing dependence on anything onsite or in-house. How security is affected as information passes to these various devices is a concern, but keeping it on the cloud is relatively safer than transferring it to other (personal consumer device) platforms.
– Office space can be smaller as people work from their portable devices. There will be less distinction between off and on duty, less concern about being physically in the office. For healthy and motivated people the blending of personal and business might well be enjoyable. For some there will be no escaping the office.
– Sophisticated software is now within reach of most businesses, so they will use these analytical tools and platforms to take their operations to new heights. Upgrades are less of a concern (as is the risk of becoming obsolete) because the cloud tends to stay current. Businesses that work on the net will not have to worry about geographical location, which has become less and less of a concern over the past generation. Unless a physical presence is required, businesses will operate either globally or within their area of relevance and legal applicability. E-commerce is within reach of moderately small businesses, with little investment necessary other than a solid website.
– Bookkeeping can be largely run online and accessed by the staff who need to know. This was previously possible with everybody in the office; now it is possible with staff working from home.
– Businesses will find the PaaS and IaaS offerings that suit their business needs. You don’t have to go for the most popular platform, though if it is popular there might be a good reason for it. At the same time, compatibility with customers’ needs is an issue. We expect compatibility will need to become smooth and seamless, so that both popular and obscure software systems can deal with whatever systems customers use.
– Small businesses can sometimes have major businesses as clients. Some major businesses have their own web developers who build sites; others hire a small but dependable group of specialists. If this is the case there is a security ‘backdoor’ issue. Some small businesses might not be worth targeting on their own, but if they work for a larger company a cybercriminal might try to hack the large business through the services of the smaller one. This is a security issue to watch out for.

The name sounds a little ominous, but shadow IT is a mixed bag. Basically, it’s the use of unsanctioned cloud and mobile apps by the employees of a company. This occurs without the knowledge or consent of those in charge (more or less) and poses a security threat if anything confidential gets passed through those systems.
You can have all the security protocols you want, but if an authorized user copies information (without malice) to their phone or laptop so they can work on the way home, then who knows how many people might get access to it. If employees’ devices follow the same setup as your office machines and access the information through the cloud, you can probably extend the same security you use on the rest of your system, and that should be fine. Otherwise, their apps are not covered by your security at all.
Incompatibility is another issue. It’s a little like things being lost in translation: concepts vary a little from one language to another, and software formats also vary to some extent. Even something as elementary as rounding to a certain number of decimal places causes problems if different apps follow different standards. Pass the information around and the numbers slowly change; the sketch below shows how small the trigger can be. Other compatibility issues can be far worse.
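As a concrete illustration, the two most common rounding conventions disagree on exact halves: ‘round half to even’ (banker’s rounding, which Python’s built-in round() and IEEE 754 use) versus ‘round half up’ (the convention many spreadsheet and accounting packages follow). Python’s decimal module can emulate both:

```python
# Two common rounding standards disagree on exact halves.
from decimal import Decimal, ROUND_HALF_EVEN, ROUND_HALF_UP

value = Decimal("2.345")
cents = Decimal("0.01")

# Banker's rounding (round half to even): 4 is even, so the 5 rounds down.
print(value.quantize(cents, rounding=ROUND_HALF_EVEN))  # 2.34

# Round half up, as many spreadsheets and accounting apps do.
print(value.quantize(cents, rounding=ROUND_HALF_UP))    # 2.35
```

Pass a ledger of such figures between two apps that disagree on this and the totals drift, a cent at a time.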
But I did say it was a mixed bag. Companies interface with people in the real world, otherwise they are not much good, and we can’t presume that everyone in the real world runs a system identical to ours. If our data and protocols are being used on a variety of platforms and systems in the company, then they are inadvertently being tested in real-world situations. Additionally, if company staff are innovative enough to develop their own applications then we are encouraging some innovative thinking, which is always a good thing when people are working on your side. Of course, if there are security issues then not everybody accessing the information actually is on your side, but we’ve already mentioned that.
How do we deal with this? Taking note of what company staff are actually using is a start. Examining individual employees’ billing information will show, in many cases, who is using what services; similarly, any interaction with anything off-premises, unless they pay with their own money or credit card, should show up on an office account somewhere. And if you do find non-company apps, services and other systems being used, you might do well to find out what their appeal is to the users and incorporate them into the company’s system, should they prove viable. This is not so much adopting shadow IT as bringing that IT out of the shadows. If you know what’s happening then you can look at enabling some appropriate security.
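As a toy illustration of that billing sweep, here is a sketch; the vendor list, the expense records and the names are all hypothetical.

```python
# Toy sketch: flag expense lines that mention known SaaS vendors.
# Vendor list and record format are hypothetical.

KNOWN_SAAS = {"dropbox", "evernote", "slack"}

expenses = [
    ("alice", "Dropbox Pro subscription", 9.99),
    ("bob",   "Office stationery",        14.50),
    ("carol", "Evernote Premium",          5.00),
]

for employee, description, amount in expenses:
    if any(vendor in description.lower() for vendor in KNOWN_SAAS):
        print(f"{employee}: possible shadow IT -> {description} (${amount})")
```

A real sweep would read exported accounting data instead of a hard-coded list, but the idea is the same: the paper trail usually reveals the shadow.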

The first groups who adopted Office 365 online ran into teething problems. Because of Office 365’s popularity there is plenty of advice around from people who have already made the transition. Some of the issues were:
Unrealistic design decisions about what goes on-premises and what goes to the cloud. No one solution fits all. You may find combinations of cloud and on-site systems work for some things but not others. And since you don’t have to move everything at once, you don’t have to decide everything at once. Latency might be an issue with cloud-based mailboxes, so on-premises Exchange servers might be better for those.
Not having the prerequisites for hybrid deployment. Exchange 2007 through 2013 can be configured to work in hybrid mode, but with 2010 and earlier there must be at least one Exchange 2013 Client Access and Mailbox server in place to run the Hybrid Configuration Wizard.
Using a small edition of Office 365 and finding it does not offer Azure AD sync, and therefore does not support hybrid deployments. Enterprise, Government, Academic and Midsize plans support it; Home and Small Business plans (apparently) do not. You may have to go for the larger package, which tends to be better in the long term.
Archived email can be an issue. After moving to the cloud you may find accessing older email is impossible. You can export the mail, but if it is archived (compressed) that can be time consuming, and a nightmare if it is corrupted in any way. If you have a legacy archive you can use the native export tools to rehydrate the stubbed messages back into the mailbox directly. But beware: if you are doing several employees’ mailboxes you may be using a huge amount of storage before you compress things again.
Data importing is subject to a daily limit of 400GB with Exchange Web Services (EWS), so large amounts of data will take some time, and internet bandwidth can also be a bottleneck; a rough estimate is sketched below. Third-party tools, deciding whether data is better kept on or off the cloud, or cloud-based solutions other than Microsoft’s can change the picture here.
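As a rough sanity check, migration time is bounded by whichever is slower: the 400GB/day EWS cap mentioned above or your own uplink. The data volume and bandwidth figures below are assumptions for illustration.

```python
# Rough migration-time estimate. The EWS daily import cap (400 GB/day)
# comes from the text above; data volume and uplink speed are assumed.

EWS_DAILY_CAP_GB = 400.0

def migration_days(data_gb: float, uplink_mbps: float) -> float:
    """Days to import data_gb gigabytes, limited by cap or bandwidth."""
    uplink_gb_per_day = uplink_mbps * 1e6 / 8 / 1e9 * 86400  # Mbps -> GB/day
    effective = min(EWS_DAILY_CAP_GB, uplink_gb_per_day)
    return data_gb / effective

print(migration_days(2000, 100))  # 100 Mbps uplink: ~5 days (EWS cap binds)
print(migration_days(2000, 20))   # 20 Mbps uplink: ~9.3 days (uplink binds)
```

Either way, a multi-terabyte mailbox estate will not move overnight, which is worth knowing before you schedule the cutover.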
You may find that features vary between server versions and cloud versions. SharePoint Online is similar to SharePoint Server, but not exactly the same. You might find that something you depend upon is not in the online version.
One issue overall is the abundance of information on the subject, and the fact that it doesn’t all fit together neatly. Everybody has an anecdote about what happened when they tried; but not all the situations were the same, and not all the individuals interpreted the problem the same way, so not all the stories fit together. It’s easy to get lost in the conflicting information.
Microsoft has a free tool called the Exchange Server Deployment Assistant. Answer its questions and the guidance it produces should be everything necessary to make the migration work.