No Turning Back: The State of the European Content Delivery Market
This article first appeared in the European edition of Streaming Media magazine.
Editor’s note: Trying to get a handle on the content delivery space in the U.K. and Europe is a little like the old story in which six blind men are asked to describe what an elephant is like. It all depends, so the story goes, on where you touch it. That said, we decided to ask longtime industry veteran Dom Robinson for his perspective on the space; he in turn looked to his peers for a better sense of the trends, opportunities, and challenges facing content delivery providers in the U.K. and Europe.
As I was writing this article, Funky Porcini’s “It’s a Long Road” began playing, and the line “It’s been a long road and there’s no turning back” seemed an apt comment on the content delivery space in Europe.
While putting it together, I had an email dialogue with Streaming Media executive vice president Dan Rayburn and Eric Schumacher-Rasmussen, the editor of this magazine, about how the definition of a content delivery network (CDN) has become too “indeterminate” these days. I tend to agree, particularly after asking CDNs to contact us for input into this article. The companies that contacted us calling themselves CDNs offered such a wide range of services that it was nigh on impossible to find a common thread among them. In the most general sense, a CDN takes on the responsibility for delivering your video to a large number of end users. Digging much deeper gets us into endless technical semantics, but for the purposes of this perspective, that definition will do. In historical terms, things were quite a bit more black and white: you would “originate” the content, and the CDN would deliver it.
Before I get into the article and air my opinions on the space, let me preface it with a bit of professional background, a personal history that runs parallel to the history of the streaming space in the U.K. and Europe. While I write for Streaming Media magazine and a bit more regularly on the Streaming Media Global site (plug plug), I am also CTO at Global-MIX. We are a specialised CDN focused on multicast over the public internet (MIX = Multicast Internet eXchange). We are based in London, and although we are on an increasingly successful mission to multicast-enable lots of ISPs, we also function as a full-service provider offering everything from outside broadcast/webcast/multi-system operations to multiformat CDN services (both unicast and multicast) and hosting capacity. We are not one of the giants, but we deliver more than 110 million streams each month for a wide range of clients—mainly live.
I co-founded Global-MIX in 2001 in part because the CDN offerings in the local market here in the U.K. were very limited. I have been webcasting since 1996, and at that time I was either hit with large (often impossibly large) service fees from guys with servers in racks over here, or I had to deal with providers in the U.S., which again was prohibitively expensive for impromptu events. Until 1999, that was pretty much all I had in terms of choice.
Over time I ended up using an account on an early incarnation of the wonderful Live365.com for audio streaming. Although it was probably not the right way to do it, with a little tinkering it was possible to deliver the stream to Live365’s servers and on to a scaling audience, on a provisional basis, through my own URL—and it was in this way that I ran my first few hundred nightclub streams and concerts. Live365, in turn, covered its bills with the advertising it served when the stream was played from its portal.
What Is Past Is Prologue
The real lack of ad hoc CDN options over here was always a problem for me as a freelance webcaster. By 2001, as demand gave rise to greater video requirements, I began to contract as a live engineer to many of the new streaming Video Service Providers (let’s call them VSPs) based in and around London. The expense of paying a minimum commit for an account with one of the larger CDNs for only occasional use set the entry bar too high for an independent. By working with the VSPs as their ad hoc live engineer, I could, when I had my own clients, do deals with them to use one of their accounts. In turn, they would often back the stream off onto their CDN account (generally Akamai Technologies or Tiscali) if the requirement was larger than their own hosted capacity.
These VSP companies were generally ticking over on corporate PR and financial relations webcasts, and we had a lot of fun webcasting some big concerts and festivals. I am sure there were other clusters of companies doing other things that I didn’t know about, but in general the VSPs I was working with here in the U.K. had discovered that Energis (a telecoms carrier spun out of the U.K.’s National Grid) was providing hosting in a data centre near Watford, just outside London, where it was possible to buy 100Mbps ports with no per-gigabyte distribution fees. Simply buying a handful of these ports enabled most of the streaming companies to deliver their content directly from their own servers, removing the need for CDN services.
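To give a sense of why a handful of unmetered 100Mbps ports went so far, here is a rough back-of-the-envelope sketch. The bitrate, utilisation, and per-gigabyte price in it are my own illustrative assumptions, not figures from the period:

```python
# Back-of-the-envelope capacity and cost maths for unmetered 100Mbps hosting ports.
# All figures below are illustrative assumptions, not historical rates.

PORT_MBPS = 100          # one unmetered port of the kind sold at the Watford facility
STREAM_KBPS = 300        # assumed typical stream bitrate of the era
UTILISATION = 0.8        # run the port at 80% rather than saturate it

# How many simultaneous viewers one port can carry
concurrent_per_port = int(PORT_MBPS * 1000 * UTILISATION / STREAM_KBPS)
print(f"~{concurrent_per_port} concurrent {STREAM_KBPS}kbps streams per port")

# Monthly delivery volume if the port ran flat out at that utilisation
SECONDS_PER_MONTH = 30 * 24 * 3600
gb_per_month = PORT_MBPS * UTILISATION / 8 * SECONDS_PER_MONTH / 1000
print(f"~{gb_per_month:,.0f} GB per month with no per-gigabyte fee")

# The same volume priced on a hypothetical per-gigabyte CDN tariff
CDN_PRICE_PER_GB = 0.50  # assumed price per GB, purely for comparison
print(f"~{gb_per_month * CDN_PRICE_PER_GB:,.0f} per month at {CDN_PRICE_PER_GB}/GB on a CDN")
```

Whatever numbers you plug in, the conclusion is the same: once the steady 24/7 traffic was predictable, owning the ports was far cheaper than paying metered delivery fees.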
CDNs were therefore generally used for live services and for launches of large VOD services where the audience size was unknown; they were really for burst traffic or for those who anticipated huge audiences. In this way, most of the VSPs could launch a service with a CDN in place and then provision a suitable number of servers in their own facilities to take the average (now called long-tail) 24/7 traffic in-house rather than pay for CDN services.
Interestingly (and whether this is because the users were local to the U.K. or not is unclear), the issues with “edge caching” and dealing with “a collapsing internet flooded by video content” that are supposedly causing problems in the U.S. have never really reared their heads; centralised delivery was never an issue for most of the companies I know. Some VSPs with content targeted at large expatriate and overseas audiences continued with CDNs, but for all the logic behind a distributed architecture, it was rarely a real-world requirement when most of the clients were in the U.K., local to your ISP and server. This created a tough market cycle for CDNs in terms of retaining business beyond the initial “bursts”.