
AKAMAI

by David Fox

"The Internet is supposed to connect the right content in the right format to the right user at the right time. From the outside, it looks very simple. You produce it, and all you have to do is put it in a box which fires it off into the Internet, and it mysteriously pops out the other end and everyone has a wonderful experience and goes off happy. Unfortunately, the experience to date of the Internet has not proved that to be the case. The problem itself is the Internet," says Ian King, director of operations, Northern Europe, Akamai.

Akamai has grown in just two years from an MIT project to solve the problem of Internet congestion into a network of over 4,000 Web servers across 170 ISPs, delivering content for more than 1,000 companies, including Apple, CNN, and Yahoo!, and for nine of the Internet's ten most popular sites.

"Akamai's mission in life is to get the content from where you've finished with it to the user that actually wants to... I would say view, but I think it is better in the context of interactivity to say participate in the content," he says.

"High performance delivery of content is really driven by customers. If customers want this stuff, they go to the network. The network is increasingly becoming capable of delivering anything users want, which means that new technologies can be implemented on the network to deliver higher quality. Higher quality means more customers, more customers means a higher performance network, even higher technology, and so on. It's a virtuous circle and it will grow over time."

The problem is that users sit at the outside, a server sits in the middle, routers around the world try to work out the best way of getting content to the users, and thousands of content providers are all trying to use the same network to reach the same group of users. "It's a real mess." There are traffic jams, especially when a lot of users seek out the same big event, such as an Oasis concert.

There is also "the bandwidth constraint of the last mile. An awful lot of stuff is still delivered at 28K or 56K. That situation is changing, of course, with the arrival, for example, of DSL connectivity, which dramatically increases the speed, but it's not resolved yet," he says.

With so many potential bottlenecks in the way, how can we avoid them?

In many cases, Webcasters start out putting their content "on a great big server somewhere and you deliver it. The trouble is, it creates a hot spot. Everybody has to come to that one place and all the content has to be served out of that one place. It doesn't scale very well. If your initial business model says that you want 1,000 users, what happens if you get good at it and you get 10,000 or 100,000 or a million users? Can you still serve the content? One of the ways of solving that is to cache it. That means putting it out into a number of large boxes in other places, but, in the main, caching is not intelligent, and one of the problems here is that lack of intelligence. There's no guarantee that content on the cache is actually fresh, and broadcasters know that if you are sending something out, it has to be what you want to send out now, not something you wanted to send out ten minutes or half-an-hour ago.

"Mirroring is a very similar kind of thing, except it is fractionally more intelligent. It has the same problem of concentrating stuff in a relatively small number of big boxes and then expecting to be able to send that data out to a potentially large number of users," explains King.
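To see why plain time-based caching goes stale, here is a minimal sketch in Python (hypothetical names, not Akamai's actual logic) of a TTL cache: anything fetched within the last half-hour is served as-is, even if the origin has changed it in the meantime.

    import time

    class NaiveCache:
        """Toy cache: serves whatever it holds until a fixed TTL expires."""
        def __init__(self, ttl_seconds=1800):   # 1800s = the half-hour King mentions
            self.ttl = ttl_seconds
            self.store = {}                     # url -> (content, fetch_time)

        def get(self, url, fetch_from_origin):
            entry = self.store.get(url)
            if entry is not None:
                content, fetched_at = entry
                # Within the TTL the cache answers by itself, so an update at
                # the origin is invisible to users until the TTL runs out.
                if time.time() - fetched_at < self.ttl:
                    return content
            content = fetch_from_origin(url)
            self.store[url] = (content, time.time())
            return content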

The answer to this is edge networking. "It is a combination of clever caching, of intelligently looking at the Internet to see where all the traffic is, and routing the stuff the quickest possible way to you."
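A minimal sketch of the routing half of that idea, assuming hypothetical hostnames and already-measured round-trip times (a real system would also weigh server load and link congestion):

    def pick_edge_server(servers, rtt_ms):
        """Send the user to the server with the lowest measured round-trip time."""
        return min(servers, key=lambda s: rtt_ms.get(s, float("inf")))

    servers = ["london.example.net", "frankfurt.example.net", "newyork.example.net"]
    rtt_ms = {"london.example.net": 12, "frankfurt.example.net": 25, "newyork.example.net": 80}
    print(pick_edge_server(servers, rtt_ms))   # -> london.example.net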

He believes it is important to broadcasters, because "everybody in the business is really only interested in one thing: making money. The best way to do that is to increase your revenue and decrease your cost. So, you want scalability and you want to be able to serve 1,000 users today and 100,000 users tomorrow, without a massive increase in the expenditure you have to place on infrastructure," he says.

Users also expect to view that content with the same reliability and quality they have come to expect from their TV set. "The Internet has, until now, not reliably delivered that. Even if what you want is there, you often can't get at it. Certainly, it hasn't been at the quality levels you would expect if you are used to a television environment."

Webcasters are also probably interested in reducing their infrastructure costs. "In fact, if you can get rid of most of the infrastructure cost altogether and then start using the Internet on the basis of the number of users you have, that is probably a better method for you," says King, especially as it gives them the flexibility to cover traffic peaks. For example, Akamai delivered the Victoria's Secret fashion show from Cannes during the film festival. "At the peak we were dealing with 1,700 hits per second looking for streaming video." As the users don't go away, there were 100,000 users online after a minute. "So, the ability to scale was particularly important. You cannot simply serve the first few thousand," he says. "The first time [Victoria's Secret] tried to do this [themselves] I think their site crashed after three seconds."
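King's figures hang together: if 1,700 new viewers arrive every second and nobody leaves, the audience after a minute is roughly the 100,000 he quotes.

    hits_per_second = 1_700
    print(hits_per_second * 60)   # -> 102000 concurrent users after one minute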

Even the smallest broadcaster could bring down the Net if it used a centralised server to reach its existing audience. The Food Channel, in the US, typically has about 100,000 viewers, or one tenth of 1% of the available audience in Nielsen rating terms. If it tried to reach all those viewers on the Net at mediocre streaming quality (300Kbit/s), that would amount to 30 Gigabits per second, "which is 20% of the entire broadband capacity of the Internet. Just one programme would effectively take over the entire Internet, and the Internet does other things as well, such as email. That's a problem.
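The arithmetic behind that figure, as a quick sanity check:

    viewers = 100_000    # The Food Channel's typical audience
    stream_kbps = 300    # "mediocre" streaming quality, 300 Kbit/s per viewer
    gbps = viewers * stream_kbps / 1_000_000
    print(gbps)          # -> 30.0 Gbit/s from a single centralised server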

"The solution is to serve from the edge. That means that you want to reduce the distance that the stream has to travel. The easiest way to do that is to put the server that is delivering the stream as close as possible to the user." The centralised server takes a request for a video stream. It serves that request to one of the edge servers, which then serves all users in that area who come on line looking for that stream.

Once that content is on the local server, users don't have to access the central server to view it. "The net effect is that it is very reliable and very scalable." Even with the Victoria's Secret show, only about 7% of Akamai's network capacity was used (reaching more than a million users), and it has a policy of installing more servers once 20% of capacity is breached. "We have huge capability to deliver, so the effect is that the centralised model versus the decentralised model gives you a huge improvement in cost per user as the volume and the content richness improves," he claims. "Higher bandwidth means lower packet loss, which means higher quality and fewer disconnects. It results in higher resolution and less jerky video, better sound and continuous display."
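That provisioning rule is easy to state in code. A sketch, assuming demand stays fixed so utilisation falls in proportion to the number of servers added:

    import math

    def servers_needed(current_servers, peak_utilisation, threshold=0.20):
        """Servers required to bring peak utilisation back under the threshold."""
        if peak_utilisation <= threshold:
            return current_servers
        return math.ceil(current_servers * peak_utilisation / threshold)

    print(servers_needed(4000, 0.07))   # -> 4000: the Victoria's Secret peak left headroom
    print(servers_needed(4000, 0.25))   # -> 5000: past 20%, the policy adds servers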

Akamai distributes its servers to ISPs at no cost to the ISP, and in return gets fast access to their users. It charges Webcasters at a bandwidth rate amounting to tens of cents per user. It also offers tools for real-time reporting, identifying who and where the users are and how they compare to last time, to make it easier to tailor what you offer to users' requirements.

Of course, Akamai isn't the only one using edge servers. Germany's Internet Television Technologies, for example, is also using them. "Very close to the access point we are putting so-called NetCache appliances (from Network Appliance) which can send out at least 6,000 files simultaneously, and as demand increases, we only have to increase these appliances." It keeps just a small number of servers at base, leaving the caching machines to do the delivery, says Harald Mueller, ITT's manager of business operations and the new CEO of beTVeen.com, a joint venture between ITT and Handelsblatt Online.

RealNetworks has also recently done a deal with Madge in Europe to reduce the distance between its servers and the customer.
