Broadband News

The contentious debate on peer-to-peer management

The traditional way of sharing content on the Internet is to host a file server. Peer-to-peer (p2p) takes its name from the way the two parties interact: client-to-client rather than client-to-server. What has made p2p so popular and such a phenomenon is the spread of file-sharing software such as Napster, Morpheus, Kazaa, Gnutella and many others.

The main reason this traffic has moved from a "client-server" topology to a p2p framework is the content: traditionally illegally copied MP3 music files and pirated software, and more recently DVD movies as well. Hosting this type of content on a "server" is far riskier, because rights owners can easily track down and shut down a small number of servers, whereas taking every single file-sharing client on a p2p network to court is much more difficult (although rights holders are making a start).

ISPs are increasingly concerned about this traffic: p2p is growing quickly and, according to a report published by Sandvine earlier this year, can consume up to 70% of traffic on broadband networks. Providers have to be careful in selecting the right approach, as many if not most broadband users run p2p applications and it is one of the key drivers of broadband take-up. ISPs have several options for dealing with this phenomenon:

  • Do nothing - sooner or later the network will run out of bandwidth and users will experience severe slowdowns
  • Increase capacity - known not to work well, as p2p will simply expand to fill whatever capacity is available
  • Block p2p traffic - possible with most current protocols, but ultimately a short-term solution and very unpopular with users
  • Manage/traffic-shape p2p ports - possible with most current protocols, also unpopular with users, and in the long run workarounds such as port hopping will be created
  • Usage-based charging - again unpopular with users because of the uncertainty it introduces, but likely to take off
  • Cache content locally to cut transit costs
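The traffic-shaping option above is typically implemented with a token bucket per traffic class. A minimal Python sketch of the mechanism (the rate and burst figures are illustrative assumptions, not anything Sandvine or a particular ISP uses):

```python
import time

class TokenBucket:
    """Token-bucket rate limiter, the mechanism behind most traffic shaping.

    Tokens accrue at `rate` bytes/second up to `capacity`; a packet is
    admitted only if enough tokens are available, otherwise it must be
    queued or dropped.
    """

    def __init__(self, rate: float, capacity: float):
        self.rate = rate            # bytes per second granted to this class
        self.capacity = capacity    # maximum burst size in bytes
        self.tokens = capacity
        self.last = time.monotonic()

    def allow(self, packet_bytes: int) -> bool:
        now = time.monotonic()
        # Refill tokens for the elapsed time, never exceeding the burst size.
        self.tokens = min(self.capacity,
                          self.tokens + (now - self.last) * self.rate)
        self.last = now
        if packet_bytes <= self.tokens:
            self.tokens -= packet_bytes
            return True
        return False

# Shape an assumed p2p class to 128 kB/s with an 8 kB burst allowance.
shaper = TokenBucket(rate=128_000, capacity=8_000)
print(shaper.allow(1_500))   # a 1500-byte packet fits within the burst
```

Real shapers (such as Linux's HTB qdisc) keep a bucket like this per class in the kernel; port hopping defeats the classification step that decides which bucket a packet lands in, which is why shaping by port remains a short-term measure.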

Even with management technologies such as those offered by Sandvine, we think ISPs will face the problem the Internet has always posed: programmers will always work around any restriction. Whilst Sandvine has coped with the port hopping used to evade port-based blocking and shaping, the encryption of traffic will present significant hurdles for this technology. On the positive side, the solution does not necessarily degrade performance: in some cases it can simply redirect requests to minimise the ISP's transit costs, and each implementation is customised to the ISP's requirements. Even so, smaller providers with fewer than 10,000 users will need to look elsewhere for a solution.

Another potential commercial threat to the general p2p community is the shift from unlimited use to a usage-based "per GB" charging system. The industry trend has run from 0845 dial-up through unlimited and unmetered dial-up to unmetered broadband, so users are no longer accustomed to usage-based billing.

A few providers have already introduced transfer caps on certain product ranges to put off users who want to run their connection at 100% capacity. NTL's 1GB-a-day cap was not well received, with a campaign set up to lobby against it. The change to usage-based charging seems inevitable: as low-cost providers enter the market with baseline prices for a low-usage service, it will become financially unviable for the remaining ISPs to serve the high-bandwidth users left behind at a flat price. What triggers the move probably comes down to BT Wholesale and its pricing for baseline products.
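To make the economics concrete, a back-of-the-envelope comparison of flat-rate and per-GB tariffs (all prices and usage figures below are illustrative assumptions, not quoted tariffs from NTL, BT Wholesale or anyone else):

```python
# Illustrative comparison of flat-rate vs usage-based "per GB" billing.
# Every figure here is an assumption for the sake of the arithmetic.

FLAT_MONTHLY_FEE = 25.00   # GBP, unlimited-usage broadband product
PER_GB_BASE_FEE = 15.00    # GBP, low-usage baseline product
PER_GB_RATE = 1.00         # GBP per GB transferred

def monthly_bill_per_gb(gb_used: float) -> float:
    """Bill under a usage-based 'per GB' tariff."""
    return PER_GB_BASE_FEE + PER_GB_RATE * gb_used

# A light user (3 GB/month) is cheaper on the usage-based tariff...
light = monthly_bill_per_gb(3)    # 15 + 3  = 18.00, below the 25.00 flat fee
# ...while a heavy p2p user near a 1 GB/day cap (30 GB/month) is not.
heavy = monthly_bill_per_gb(30)   # 15 + 30 = 45.00, well above the flat fee

print(light, heavy)
```

The asymmetry is the point: light users defect to the cheaper baseline product, leaving flat-rate ISPs with only the heavy users whose transit costs the flat fee no longer covers.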

"The latest trend in worm creation is to utilize peer-to-peer (P2P) file-sharing networks, such as KaZaA or Morpheus, as a means to infect innocent victims. By exploiting the benefits of Peer-to-Peer file sharing, worms spread more efficiently and have a greater potential of exhausting service provider's networks. [..] Residential subscribers represent the weakest, most uncontrolled point in the internet. [..] It is only a matter of time until a worm with a seriously damaging payload emerges"

"Worm Mitigation on Broadband Networks", Sandvine (October 2003)

What is perhaps more worrying is the issue of Internet worms such as Nimda, Slammer, etc., which have so far had a tolerable but significant impact on the operation of the Internet. A new report published by Sandvine yesterday, entitled "Worm Mitigation on Broadband Networks" (you need to register to download it, but registration is free), looks at this growing threat as the population of Internet hosts shifts from business to consumer systems, leaving far less central protection against these kinds of attacks. It warns that future worms could not only wipe hard drives and overwrite a system's BIOS image (rendering the machine inoperable), but also broadcast sensitive information, causing untold damage to companies, and that individual protection mechanisms, although important, are inadequate. Whilst p2p management may be a contentious subject ISPs do not wish to discuss openly, few users would want to bear the costs of being party to these attacks. [seb]

