Solutions for Large-Scale Content Delivery over the Internet Protocol
Peltotalo, Jani (2010)
Peltotalo, Jani
Tampere University of Technology
2010
Tieto- ja sähkötekniikan tiedekunta - Faculty of Computing and Electrical Engineering
This publication is copyrighted. You may download, display and print it for your own personal use. Commercial use is prohibited.
The permanent address of the publication is
https://urn.fi/URN:NBN:fi:tty-201012011375
Abstract
The current trend is increasingly towards IP-based delivery for all kinds of digital content. At the same time, the increasing quality of digital media increases its size, which in turn raises the capacity requirements along the content delivery path. Traditional content delivery based on the client-server model easily overloads the delivery network, degrading the quality of service for every customer. Hence, more scalable solutions for digital content distribution are needed.
In a best-effort service such as IP datagram forwarding, the successful delivery of a packet to its receivers is not guaranteed at the network layer. In IP-based applications, failures in the content delivery path between sender and receiver therefore cause packet losses, which have to be handled in the transport or application layer. Another cause of packet losses in a multi-sender P2P environment is peer churn, which temporarily interrupts the media being sent to a receiver.
This Thesis studies large-scale content delivery over the Internet Protocol, mainly from the scalability and reliability point of view, in two separate research areas: (a) file delivery to a large user population, and (b) real-time P2P media streaming in a mobile networking environment. Multicast-based file delivery is one of the most efficient ways to deliver the same content to a large user population, but reliability becomes a concern because multicast techniques are commonly based on unreliable transport protocols in order to scale up to large and massive receiver groups. As shown in this Thesis, an FEC data carousel is in most cases the best way to provide reliability when the total amount of data transmitted in the delivery system is used as the critical factor.
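The idea behind an FEC data carousel is that the file is split into source blocks, each block is expanded into encoding symbols, and the sender cycles over all symbols indefinitely, so a receiver that joins at any time and loses arbitrary packets eventually collects enough symbols to decode every block. The following minimal Python sketch illustrates the sender side under assumptions of my own: the symbol size, the repair count, and the stand-in XOR "repair" symbols are purely illustrative and do not reproduce the FEC codes or parameters evaluated in the Thesis.

    # Minimal sketch of an FEC data carousel sender (illustrative only).
    SYMBOL_SIZE = 4          # bytes per encoding symbol (hypothetical)
    REPAIR_PER_BLOCK = 2     # extra symbols per source block (hypothetical)

    def split_into_symbols(block: bytes, symbol_size: int) -> list[bytes]:
        """Split one source block into fixed-size source symbols."""
        return [block[i:i + symbol_size] for i in range(0, len(block), symbol_size)]

    def encode_block(block: bytes) -> list[bytes]:
        """Produce encoding symbols for one block: the source symbols plus a
        few stand-in XOR repair symbols (a real system would use a proper FEC code)."""
        source = split_into_symbols(block, SYMBOL_SIZE)
        repair = [bytes(a ^ b for a, b in zip(source[0], s))
                  for s in source[1:1 + REPAIR_PER_BLOCK]]
        return source + repair

    def carousel(blocks: list[bytes], cycles: int):
        """Cycle repeatedly over all encoding symbols of all blocks, so a
        receiver joining at any time can still collect enough symbols per block."""
        for cycle in range(cycles):
            for block_id, block in enumerate(blocks):
                for symbol_id, symbol in enumerate(encode_block(block)):
                    yield (cycle, block_id, symbol_id, symbol)

    if __name__ == "__main__":
        file_data = b"large file to be delivered via multicast"
        blocks = [file_data[i:i + 16] for i in range(0, len(file_data), 16)]
        for packet in carousel(blocks, cycles=2):
            print(packet)  # in practice, carried in a multicast packet

The sketch only shows the cyclic transmission pattern; the Thesis's comparison concerns how much total data such a carousel must transmit relative to other reliability mechanisms.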
In P2P content distribution, the overlay network structure and data partitioning are very important issues from the scalability point of view. A random mesh-based overlay architecture provides flexibility for handling peer departures, but good overall connectivity between peers is better achieved with, for example, a clustered overlay architecture. In P2P delivery, a downloading client becomes a leecher peer once it holds at least one complete block, so with a small enough block size the number of alternative source peers increases faster. Data partitioning in P2P media streaming applications is even more demanding: partitioning based on fixed byte ranges, as in P2P file delivery, is not suitable for streaming continuous media, which is variable bit rate by nature.
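As a small illustration of the block-size effect, the following hypothetical Python sketch counts how many partially downloaded peers already qualify as alternative sources (leechers) under different block sizes; the peer progress values are invented for illustration and are not results from the Thesis.

    # Why a smaller block size produces alternative source peers sooner:
    # a client counts as a leecher once it holds at least one complete block.

    def leecher_count(downloaded_bytes: list[int], block_size: int) -> int:
        """Count peers holding at least one complete block, assuming each
        peer downloads blocks sequentially from the start of the file."""
        return sum(1 for d in downloaded_bytes if d >= block_size)

    # Hypothetical snapshot of how much each recently joined peer has downloaded.
    progress = [10_000, 40_000, 90_000, 250_000, 600_000]

    for block_size in (32_768, 131_072, 524_288):
        print(block_size, "->", leecher_count(progress, block_size),
              "alternative source peers")

With the smallest block size all but one of the sketched peers can already serve data, whereas with the largest only one can, which is the intuition behind preferring a small enough block size.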
In contrast to file delivery, where it does not matter whether the data parts arrive in the original order, since the viewing experience is the same once the file is fully downloaded, P2P media streaming applications require that all data is received relatively close to its playback position. Good user experience is achieved by using client-side buffers to hide network-induced delay and jitter. A bigger buffer makes it possible to smooth the variation between packet arrival times and also leaves time for packet loss recovery; on the other hand, the smaller the buffering time, the faster playback can start. The real-time P2P media streaming system presented in this Thesis contains several important improvements for mobile usage, such as a small initial buffering time of ten seconds, a ten-second reception buffer enabled by the use of RTP, and the partial RTP stream concept, which allows a single media stream to be received effectively from multiple senders simultaneously.
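The abstract does not specify exactly how the full RTP stream is divided into partial streams, so the following Python sketch assumes, purely for illustration, a round-robin split by RTP sequence number across a hypothetical number of senders, and shows how a receiver-side reception buffer could reorder packets from multiple senders back into playback order.

    # Receiving one media stream as several partial RTP streams (illustrative
    # assumption: packets are assigned to senders round-robin by sequence number).
    import heapq

    NUM_SENDERS = 3  # hypothetical number of sending peers

    def sender_for(seq: int) -> int:
        """Hypothetical assignment of an RTP packet to a partial stream."""
        return seq % NUM_SENDERS

    class ReceptionBuffer:
        """Reorders packets arriving from multiple senders before playback."""

        def __init__(self) -> None:
            self.heap: list[tuple[int, bytes]] = []
            self.next_seq = 0

        def push(self, seq: int, payload: bytes) -> None:
            heapq.heappush(self.heap, (seq, payload))

        def pop_ready(self):
            """Yield packets in playback order as soon as they are contiguous."""
            while self.heap and self.heap[0][0] == self.next_seq:
                yield heapq.heappop(self.heap)
                self.next_seq += 1

    # Packets from the partial streams arrive interleaved and out of order.
    buf = ReceptionBuffer()
    arrivals = [(1, b"p1"), (0, b"p0"), (3, b"p3"), (2, b"p2"), (4, b"p4")]
    for seq, payload in arrivals:
        buf.push(seq, payload)
        for ready in buf.pop_ready():
            print("play", ready, "from sender", sender_for(ready[0]))

In a real deployment the reception buffer would also bound how long it waits for a missing packet before requesting it from another sender or skipping it, which is where the ten-second buffering figures above come into play.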
Collections
- Doctoral dissertations [4865]