Data location, location, location

William Morrish - 20 November 2015

The recent ruling from the Court of Justice of the European Union invalidating the Safe Harbour agreement has raised significant questions not only about data privacy, but also about data location. In light of this change, it's perhaps time to take a more strategic look at both who serves your company's data and where it is served from.

Many businesses look to ‘The Cloud’ as a panacea to help improve business agility, lower costs and keep up with the fast paced changes in technology—but this rush to embrace new technology has meant that some sacrifices have been made, often without all of the implications being known by the business.

There is no 'The Cloud'

The first thing to understand is that there is no 'The Cloud'. Many providers offer what appear to be similar services, but they often differ significantly in delivery, scale and execution. It's often assumed that moving to the cloud means moving to one of the few hyperscale US providers. Their approach, however, is to build big blocks of compute in as few locations as possible around the world, to serve as many people as possible, all typically accessed over the public internet. It doesn't have to be this way, and for some businesses there are good reasons why it shouldn't be.

Issue One: When your data is held by a company established in a different jurisdiction, the laws of that host country may apply

When your server or cloud service is run by a company established in a country other than the one your business is registered in, that country's government, in addition to your own, may have rights of access or laws that apply to your data without your knowledge. Pick your cloud service provider judiciously.

The large US providers are moving quickly to assure European businesses that their data is safe from outside influence, promising that data won't leave a region unless it's 'to comply with applicable law'. That caveat is the crux of the issue: companies in different jurisdictions may have rights to your data, and knowing which laws apply can be difficult to determine.

Issue Two: If the data moves further away from your users, you may have to redevelop or redeploy

If you’re running enterprise IT systems then you’ve likely built your IT infrastructure over years, developing architectures suited and optimised to serve your customers, often with close proximity of users to the platforms built to serve them.

Yet most of the hyperscale cloud providers believe that individual markets within continents can be served by one or two mega-scale data centres, with caching nodes or other technologies papering over the distance. The reality is that the further away the server, the higher the latency and the slower the performance, often irrespective of the extra technologies you layer on top. Nothing beats a server being closer to your users, recreating the exact setups that you've built over years to serve them.

When you're used to your data residing close to your users, moving to 'The Cloud' often forces you to redevelop, redeploy and change architectures, because in-house setups differ so much from those of 'The Cloud', and significant sums get spent on the transition. You may also find that the public internet can't deliver the quality or speed you need, obliging you to buy supplementary networking to ensure your business-critical applications deliver the performance they must.

Issue Three: The further away the customer is from the server, the slower the customer experience

In today's digital world speed is more important than ever; the proliferation of connected devices and the expectation of access anytime, anywhere have made us all critical of even the slightest delay. Whether streaming video content, using business apps or simply browsing websites, consumers have come to expect lightning-fast delivery in both business and personal environments.

However, the world is going ever more 'digital', with an inevitable push towards the cloud. The Internet continues to change at pace, with ever-increasing bandwidth and content demands: the user no longer wants to see 'a' website, they want to see 'their' website, with content tailored right the way through the experience. Big data and other technologies give businesses huge insight into their customer base, so that when John from London logs on, he sees content specific to him at that moment rather than a generic page for whoever, wherever. All of this means the user needs an almost instantaneous response, at an ever-increasing rate of demand.

This requirement for instant response has a limitation based not on technology but on raw physics: the speed of light in glass fibre. That is the speed limit under which the Internet and all digital networks operate. We've all been on a Skype call to the other side of the world and noticed how laggy it can be; yet that is just one transaction. When you're browsing a website or using a business application, tens, hundreds or more such transactions happen simultaneously as each piece of content is requested, considered and delivered, and the further you are from the server, the slower the transfer of data and the slower the experience.
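This physical limit is easy to put numbers on. As a rough sketch, light in glass fibre travels at about two-thirds of its speed in a vacuum, roughly 200,000 km/s; the distances below are illustrative great-circle figures (real fibre routes are longer, and queuing and processing delays come on top):

```python
# Back-of-envelope propagation delay for light in glass fibre.
# Speed is ~2/3 of c in vacuum, i.e. roughly 200,000 km/s.
SPEED_IN_FIBRE_KM_S = 200_000

def min_rtt_ms(distance_km: float) -> float:
    """Theoretical minimum round-trip time in milliseconds (there and back)."""
    return 2 * distance_km / SPEED_IN_FIBRE_KM_S * 1000

# Illustrative great-circle distances, not actual fibre routes.
for route, km in [("London -> Frankfurt", 640),
                  ("London -> New York", 5_570),
                  ("London -> Sydney", 17_000)]:
    print(f"{route}: at least {min_rtt_ms(km):.1f} ms per round trip")
```

Even before any server-side work, a single round trip to the other side of the world costs well over 100 ms; multiply that by the dozens of requests a modern page makes and the case for proximity makes itself.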

Is there a different approach?

Once you’ve accepted that there is no ‘The Cloud’, you can start to look at your approach to digital, and in doing so look at cloud technologies in a new way. Not every solution is right for every application, and indeed not every provider is right for every customer.

At Interoute, we've been building hyperscale networks for over 13 years and we operate a large part of the backbone fibre infrastructure for Europe. For the last 10 years we've been providing managed hosting services to major enterprises with global operations. From this, we know a great deal about how and where data moves, and we have gained valuable insights into the requirements of global enterprises and large online retailers. This knowledge was fundamental to how we approached the design of a cloud platform. Rather than build a single mega-platform on the assumption that, in the context of data centres, 'a few would do', we built our cloud platform, Interoute Virtual Data Centre (VDC), into as many of our on-net countries as possible throughout Europe, and we are currently expanding our distributed compute cloud globally.

We have built 11 individual VDC zones across Europe (with more worldwide), bringing our cloud closer to the end customer, shortening the distance from server to user and mitigating the speed-of-light-in-glass problem. Having multiple zones in the same country, as VDC has in Germany, Switzerland and the United Kingdom, avoids the potential problems arising from cross-border transfers of personal data; this is one of the key differences between Interoute and other well-known cloud providers.

Once you've decided to move some or all of your data to the cloud, you'll need to move it around securely, which is why VDC was designed to be connected everywhere, always, on an entirely private global network that comes at no extra cost. Moving data from zone to zone, or even from your existing IT estate, is free, private and secure.

Interoute VDC: Globally built, locally delivered, totally connected

By using this more local way of delivering and serving data you'll benefit from lower latency and the performance improvements that come with it, and by integrating privately at the network layer your move to the cloud will be far more analogous to your existing setup, easing your migration and perhaps even solving some long-standing headaches on your existing platform, conveying further benefits from a move to cloud.

Once you've started to serve some users from sites more local to them, it makes sense to apply the same methodology elsewhere (getting closer to even more users), distributing your platform over multiple locations and then using that capability for scaling and redundancy, lowering your costs whilst increasing your availability.
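The routing logic this implies can be sketched very simply: measure latency from the user to each candidate zone, serve from the closest, and keep the rest ranked as failover targets. The zone names and latency figures below are hypothetical illustrations, not real VDC zone identifiers or measurements:

```python
# Hypothetical sketch: pick the lowest-latency zone for a user, keeping
# the remaining zones ordered as redundancy/failover candidates.
from typing import Dict, List

def rank_zones(latency_ms: Dict[str, float]) -> List[str]:
    """Return zone names ordered best-first by measured latency."""
    return sorted(latency_ms, key=latency_ms.get)

def pick_zone(latency_ms: Dict[str, float]) -> str:
    """Primary zone is the closest; the rest provide redundancy."""
    return rank_zones(latency_ms)[0]

# Illustrative measurements for a user in London (zone names invented).
measured = {"vdc-london": 4.1, "vdc-berlin": 18.7, "vdc-geneva": 15.2}
print(pick_zone(measured))       # the closest zone serves the user
print(rank_zones(measured)[1:])  # ordered failover candidates
```

The same ranking doubles as a redundancy plan: if the primary zone fails, traffic falls through to the next-closest zone rather than crossing the continent.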

Embracing the globally built, locally delivered, totally connected concept gets you into the cloud quicker, with the associated savings, whilst ensuring the security and location of your users’ data.

So, rather than building your own mega-site in one location to serve a region (à la 'The Cloud'), you can break down those requirements and build locally but connect globally, making the cloud your own.

