BROUGHT TO YOU BY ST TELEMEDIA
Wherever you are as a consumer on your own digital journey, you’ll almost certainly have experienced the frustration of a website crash. My own most recent example came just before Christmas, when I was trying to order some perfume for my wife. No matter how many times I tried, or how patient I was, I kept getting stuck on the same page of the ordering process on the perfume brand’s website. Eventually, I circumvented the problem by going to the website of a well-known department store and ordering the perfume that way.

A frustrating morning’s work, made worse by the fact that, in mid-January, I’m still receiving emails from the perfume company asking whether I still want to proceed with my order. Okay, so I did buy the perfume in the end (Christmas would have been somewhat frosty if I hadn’t), but we can all think of plenty of occasions when we have been put off a purchase by a poor website or digital experience. And I certainly won’t be encouraging my wife to buy anything else from the perfume company’s website.
Somewhere in the background of every bad online experience lies a (poorly run) data centre. My youngest son has taught me plenty of new words as he waits to download some obscure ‘house music’ track, cursing my internet connection, the website in question and life in general. While the retail sector still has a long way to go in perfecting its collective approach to online sales (with plenty of notable exceptions), it is, perhaps, the content/digital media sector that could benefit most from a better understanding of just what its customers want, and how best to provide it.
My guess is that, as technology natives, the people working in this sector are not only comfortable with IT, but also like to build and own their own IT infrastructure, and are reluctant to trust any third party with a vital part of the link between content and customer.
However, let’s examine the facts. Right now, there is a growing recognition of the need to bring large-file content closer to the consumer. Yes, moving to a consolidated and/or centralised cloud IT infrastructure has made sense for many organisations over the past few years, but there is a growing realisation that, when it comes to delivering, say, video content to consumers across a country or wider region, it pays to bring that content nearer to those consumers before pushing it across the network. So, by all means keep a central, national or regional data centre, but also have access to several more local facilities to host and serve content closer to the customer. You could buy and kit out these locations yourself, but it may well make more sense to take space in a range of colocation facilities. Local presence equals lower latency.
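To put a rough number on that ‘local presence equals lower latency’ point, here is a back-of-the-envelope sketch. The fibre speed and distances are illustrative assumptions, not measurements from any real network:

```python
# Rough sketch: minimum round-trip propagation delay over optical fibre.
# Light in fibre travels at roughly 200 km per millisecond (an assumption
# used here purely for illustration).
FIBRE_SPEED_KM_PER_MS = 200.0

def round_trip_ms(distance_km: float) -> float:
    """Minimum round-trip propagation delay for a given one-way distance."""
    return 2 * distance_km / FIBRE_SPEED_KM_PER_MS

# A consumer 1,500 km from a central data centre vs 50 km from a local facility:
print(round_trip_ms(1500))  # 15.0 ms of pure propagation, before any processing
print(round_trip_ms(50))    # 0.5 ms
```

Real-world latency adds routing, queuing and server time on top, but the physics alone shows why serving content from a nearby colocation facility beats serving it from the other end of the country.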
And then there’s the idea of bursting to consider. Adele releases a new single, yet another Star Wars movie is ready for streaming, a digital magazine hits publication day… If you are the organisation charged with providing this content to a substantial audience, then you know full well that there will be a massive initial interest in the new content.
So, you can either ensure that your own IT infrastructure has sufficient capacity to meet this massive demand, and then pay for that ‘extra capacity’ to sit dormant for the next month or year, or you can hire in extra bandwidth and data centre capacity for the anticipated peak period. (Concert promoters and perfume companies, please take particular note.) You pay a bit extra for some additional resource, keep your customers very happy, and then return to using your own infrastructure for ‘everyday’ levels of demand.
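The economics of bursting can be sketched with some simple arithmetic. All the figures below (unit costs, the rental premium, the length of the spike) are made-up assumptions for illustration only:

```python
# Illustrative comparison: owning peak capacity year-round vs renting
# ('bursting' to) extra capacity only during short demand spikes.

def own_peak_cost(peak_units: int, unit_cost_per_month: float) -> float:
    """Provision for peak demand permanently: pay for every unit, every month."""
    return peak_units * unit_cost_per_month * 12

def burst_cost(base_units: int, peak_units: int, unit_cost_per_month: float,
               burst_premium: float, burst_months: int) -> float:
    """Own only baseline capacity; rent the extra at a premium during spikes."""
    baseline = base_units * unit_cost_per_month * 12
    extra = (peak_units - base_units) * unit_cost_per_month * burst_premium * burst_months
    return baseline + extra

# Baseline of 10 capacity units, spiking to 50 for one month a year, with
# rented capacity costing three times the owned monthly rate:
print(own_peak_cost(50, 1000))           # 600000 per year, mostly idle
print(burst_cost(10, 50, 1000, 3.0, 1))  # 240000.0 per year
```

Even at a hefty premium on the rented capacity, paying for the spike only when it happens comes out well ahead of keeping peak-sized infrastructure idle for eleven months of the year.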
And what about test and development? Trying to develop a new idea or application, but not sure if it’s worthwhile?
Well, the chances are that it’s more likely to be (economically) viable if you can rent the necessary colocation and IT resource, rather than pay for the capital investment, get the IT department to install the kit, and then decide two weeks later (having waited three months for the infrastructure) that the idea or application isn’t a goer. Never mind those limited-window opportunities that demand you react right now, before the opportunity is gone (imagine that you’d developed some kind of Bremain application, only to discover that you needed a Brexit one ASAP instead).
Following on from your in-house DevOps, maybe you want to test a new application with a sample audience, or give loyal customers an early glimpse of what might be coming shortly. Again, you have to decide whether it is worth investing in the infrastructure to provision the new app and give customers that special experience or early access. Well, it’s highly likely that a colocation provider will be able to offer rather more flexible and cost-effective terms than your in-house IT team.
And we can’t leave the content/digital media sector without discussing one of the major successes of the sector in recent years – the emergence of IT-enabled design collaboration. Chances are that if you’re involved in this sector, any project you work on requires input from a range of individuals and/or organisations, in a variety of locations. Hosting such projects in a colocation facility could well make more sense than any one of the project partners managing the required IT infrastructure internally.
Put quite simply, it’s no longer acceptable for any organisation to suffer anything but the most minor of disruptions to its online presence. Using the services of a colocation provider will, almost certainly, be an essential part of guaranteeing such seamless continuity.