The public sector can benefit enormously from using a vendor for its data infrastructure instead of setting up a server room. Depending on your organization's field and history, this may involve something of a cultural hurdle. The fact is, though, that today's public clouds are safe and secure, and usually more cost-efficient and easier to manage than an on-premises setup.
Once you’ve decided that a public-cloud-based infrastructure is the right choice for your organization, you’re faced with choosing a cloud provider. The criteria vary according to your priorities. In this article, we cover four of the big questions: cost control, uptime, management load and the cost of lock-in.
By choosing a public cloud, you’re saving on hardware and software costs, but still incurring operational costs. Operational costs can be divided into two types:
● cloud provider fees
● management overhead
Don’t ignore that management cost! Depending on the cloud provider, there’s a range of tasks that you still need to take care of in-house, and you’ll need to hire talent to perform them.
And then there are the networking costs. Are those included in the service fee? Make sure that, if you need to move data from one place to another, there aren’t surprise ingress and egress fees between regions.
One more question to ask is whether you’ll need to pay extra for security. SSO logins and VPC peering may not be available on the cheapest tiers, and this has significant implications when it comes to the final cost.
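To make these moving parts concrete, here is a minimal back-of-envelope sketch of how base fees, egress charges and security add-ons combine. All figures, tier names and rates are illustrative assumptions, not any real provider’s pricing; plug in quotes from the vendors you’re actually evaluating.

```python
# Back-of-envelope monthly cost model. All numbers below are
# hypothetical placeholders, not real provider prices.

def monthly_cost(base_fee, egress_gb, egress_per_gb, security_addons):
    """Total monthly fee: base tier + cross-region egress + extras
    (e.g. SSO or VPC peering sold as add-ons on cheaper tiers)."""
    return base_fee + egress_gb * egress_per_gb + sum(security_addons)

# Hypothetical comparison: a cheap tier where SSO and VPC peering
# cost extra, vs. a pricier tier where they are included.
cheap = monthly_cost(base_fee=500, egress_gb=2000, egress_per_gb=0.09,
                     security_addons=[150, 100])   # SSO, VPC peering
bundled = monthly_cost(base_fee=800, egress_gb=2000, egress_per_gb=0.05,
                       security_addons=[])

print(f"cheap tier:   ${cheap:,.2f}/month")    # $930.00/month
print(f"bundled tier: ${bundled:,.2f}/month")  # $900.00/month
```

Note how, in this made-up scenario, the “cheap” tier ends up the more expensive one once add-ons and egress are counted, which is exactly why these questions are worth asking up front.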
The greatest cost savings from using a public cloud provider, however, come from the fact that your costs are governed by usage, not ownership. You’re free at any time to shop around for a more advantageous package and migrate—if, that is, you’ve chosen the right provider. More on this below.
Make sure you define how much downtime is acceptable, and use that to vet your vendors. To do this, put yourself in the shoes of your end users, and work backwards from there. 99.9% uptime comes to about 45 minutes of downtime each and every month. The current highest industry standard is 99.99% uptime, resulting in some 4 minutes of downtime per month.
That downtime may not come monthly, either—it could be once a year, but for hours at a time.
Also take a look at whether the anticipated downtime includes regular maintenance windows or not. Regular maintenance may take some 10 minutes per month, so it really matters whether the provider’s “20 minutes” is actually 20 minutes + 10 minutes.
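The arithmetic above is simple enough to sketch. The snippet below converts an SLA percentage into monthly downtime (assuming a 30-day month) and adds any maintenance windows the SLA excludes; the 99.95% example and the 10-minute maintenance figure are illustrative assumptions.

```python
# Convert an uptime SLA into concrete monthly downtime, and add any
# maintenance windows the SLA excludes. Assumes a 30-day month.

MINUTES_PER_MONTH = 30 * 24 * 60  # 43,200

def downtime_minutes(sla_percent, maintenance_min=0):
    """Monthly downtime implied by an SLA, plus excluded maintenance."""
    return MINUTES_PER_MONTH * (1 - sla_percent / 100) + maintenance_min

print(f"99.9%  -> {downtime_minutes(99.9):.1f} min/month")   # 43.2
print(f"99.99% -> {downtime_minutes(99.99):.1f} min/month")  # 4.3
# A hypothetical 99.95% SLA that excludes a 10-minute maintenance window:
print(f"99.95% + maintenance -> {downtime_minutes(99.95, 10):.1f} min/month")
```

Running numbers like these per vendor makes the “20 minutes vs. 20 + 10 minutes” distinction easy to spot in a proposal.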
This evaluation should happen separately for all the user groups your infrastructure serves. In the public sector, it may well be fine for a primary school’s attendance sheets to not be available 24/7, but in healthcare a 30-minute break (or even three 10-minute breaks) per month may prove literally fatal.
If you pick a setup where you’re only renting server space but managing your own databases, make sure you have the in-house capability to do so. The talent you need on call (and we really mean on call, since database administration isn’t a 9-to-5 job, not even in fields where end users don’t work around the clock) must have experience not just with the software you’re using but also with the specific vendor environment you’re running in.
There are multiple reasons to choose a managed solution instead. If you haven’t already, we invite you to weigh them before taking on database management yourself.
One of the primary reasons to choose a public cloud in the first place is often the ability to stay only for as long as it’s profitable to do so. Fees change, new providers enter the market, and the needs of your organization will change over time. These are just some of the reasons to stay light on your feet and retain your readiness to switch providers if it becomes necessary.
This is why it’s crucial to consider whether the provider makes it easy to not just enter, but also to leave.
Data egress fees are one incentive that some providers use to keep hold of their customers. They can be pretty steep, and may apply not just when leaving the whole environment but even when switching regions.
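A quick estimate makes the lock-in effect tangible. This sketch computes the one-time egress cost of a full migration; the 50 TB dataset size and $0.09/GB rate are hypothetical assumptions, not any specific provider’s pricing.

```python
# Rough one-time cost of moving your data out of a provider.
# Rates and sizes are illustrative assumptions only.

def exit_cost(dataset_tb, egress_per_gb):
    """Egress cost of migrating a dataset out, in dollars."""
    return dataset_tb * 1024 * egress_per_gb

# A hypothetical 50 TB dataset at $0.09/GB egress:
print(f"${exit_cost(50, 0.09):,.0f}")  # $4,608
```

If that figure is a meaningful fraction of your annual cloud budget, the “free to leave at any time” assumption deserves a closer look before you sign.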
These days, no one should be putting data into a proprietary system that doesn’t allow exporting it in a usable format, but sadly, it still happens. Either do your homework on connectors and export paths, or make the life-changing decision to depend on open source. In the public sector, this is especially relevant, and especially important.
This has been a quick look at the most burning issues when it comes to choosing a data infrastructure provider. It’s by no means an exhaustive list; there is no exhaustive list, because each organization has its own unique needs.
To make it easier for you to investigate further, here are some links we recommend: