Protocol Issues

Designing a network protocol which supports streaming media can be quite involved.

The following issues are noteworthy:

Datagram protocols, such as the User Datagram Protocol (UDP), send the media stream as a series of small packets. Although this is simple and efficient, the protocol makes no guarantee that the data stream will actually be delivered. The onus is instead placed on the receiving application to detect any loss or corruption in the data, and to recover the missing data using error correction methods. If some of the data is lost, the media stream may incur what is called a dropout.
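
The sketch below illustrates why loss detection falls on the application when streaming over UDP: each datagram carries a sequence number, and the receiver watches for gaps. The port number, payload layout, and helper names are assumptions made purely for illustration.

```python
import socket
import struct

PORT = 5004  # arbitrary port chosen for this example

def send_chunks(chunks, host="127.0.0.1"):
    """Send each media chunk as one datagram, prefixed with a sequence number."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    for seq, chunk in enumerate(chunks):
        sock.sendto(struct.pack("!I", seq) + chunk, (host, PORT))
    sock.close()

def receive_chunks(expected):
    """Receive datagrams and report any sequence gaps (dropouts)."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(("", PORT))
    sock.settimeout(1.0)
    last_seq = -1
    for _ in range(expected):
        try:
            data, _addr = sock.recvfrom(2048)
        except socket.timeout:
            break  # no retransmission: anything still missing stays missing
        seq = struct.unpack("!I", data[:4])[0]
        if seq != last_seq + 1:
            print(f"dropout: packets {last_seq + 1}..{seq - 1} were lost")
        last_seq = seq
    sock.close()
```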

The following protocols were designed specifically to deliver streaming media over networks: the Real-time Streaming Protocol (RTSP), the Real-time Transport Protocol (RTP), and the Real-time Transport Control Protocol (RTCP).
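
As a rough illustration of what RTP adds on top of a plain datagram, the sketch below builds the fixed 12-byte RTP header (version, payload type, sequence number, timestamp, and source identifier) around an audio payload. The payload type and SSRC values used here are illustrative assumptions, not values required by the protocol.

```python
import struct

def make_rtp_packet(seq, timestamp, payload, payload_type=0, ssrc=0x12345678):
    version = 2                        # RTP version 2
    first_byte = version << 6          # no padding, no extension, zero CSRCs
    second_byte = payload_type & 0x7F  # marker bit left clear
    header = struct.pack("!BBHII", first_byte, second_byte,
                         seq & 0xFFFF, timestamp & 0xFFFFFFFF, ssrc)
    return header + payload

# One 20 ms frame of 8 kHz audio advances the timestamp by 160 samples.
packet = make_rtp_packet(seq=1, timestamp=160, payload=b"\x00" * 160)
```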

Reliable protocols, such as the Transmission Control Protocol (TCP), guarantee correct delivery of every bit in the media stream. However, they achieve this with a system of timeouts and retransmissions, which makes them more complex to implement. The downside is that when the network loses data, the media stream stalls while the protocol handlers detect the loss and retransmit the missing data. By buffering the data before it is played, it is possible to reduce this effect quite noticeably.
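
The sketch below shows the buffering idea in its simplest form: read ahead from the reliable stream and only start playback once a few seconds of audio are queued, so a retransmission stall drains the buffer instead of freezing playback. The `read_chunk` and `play_chunk` callables, the chunk duration, and the five-second pre-buffer are all assumptions for illustration.

```python
import collections

def buffered_playback(read_chunk, play_chunk, chunk_seconds=0.02, prebuffer_seconds=5.0):
    buffer = collections.deque()
    # Pre-fill the buffer before playback starts.
    while len(buffer) * chunk_seconds < prebuffer_seconds:
        chunk = read_chunk()
        if chunk is None:
            break
        buffer.append(chunk)
    # Refilling would normally happen on a separate thread while playback runs;
    # here the two are interleaved for brevity.
    while buffer:
        play_chunk(buffer.popleft())
        chunk = read_chunk()
        if chunk is not None:
            buffer.append(chunk)
```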

When using Unicast protocols, a separate copy of the data stream is sent from the server to each recipient. Unicast is the norm for most Internet connections, but it does not scale well when large numbers of users want to view the same program at the same time. When many recipients receive Unicast streams independently, the same data is replicated many times over, with a corresponding load on the server and the network. Even though every recipient is receiving identical content from the same streaming server, each one still requires its own Unicast connection.
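
A minimal sketch of the Unicast cost model: the server writes the same chunk once per connected client, so its upstream bandwidth grows linearly with the size of the audience. The function name and the bandwidth figures in the comment are assumptions chosen for illustration, and the client sockets are assumed to be ordinary TCP connections.

```python
def serve_unicast(client_sockets, chunk):
    # One separate copy of identical data is sent to every recipient.
    for sock in client_sockets:
        sock.sendall(chunk)

# With 1,000 listeners and a 128 kbit/s stream, the server must push roughly
# 128 Mbit/s, even though every listener receives exactly the same data.
```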

Multicast protocols, however, were developed to minimise this sort of replication. Such protocols send just one stream from the source to a group of recipients. Whether Multicast transmission is feasible depends on the type of network and its infrastructure. One possible downside of multicasting is the loss of video-on-demand functionality: with continuously streamed radio or television material, the user is unable to control playback. Caching servers, digital set-top boxes, and buffered media players are all methods which tend to limit this problem.
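
The sketch below shows the sending side of this idea over UDP: one send to a multicast group address reaches every member of the group, instead of one copy per listener. The group address 239.1.1.1 (an administratively scoped address), the port, and the TTL value are illustrative assumptions.

```python
import socket
import struct

GROUP, PORT = "239.1.1.1", 5004

def multicast_send(chunks):
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    # Limit how far the multicast packets may travel (hop count).
    sock.setsockopt(socket.IPPROTO_IP, socket.IP_MULTICAST_TTL, struct.pack("b", 4))
    for chunk in chunks:
        sock.sendto(chunk, (GROUP, PORT))  # one send reaches every group member
    sock.close()
```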

IP Multicast provides the ability to send a single data stream to a number of end users on a computer network. Since routers and firewalls must allow the flow of data destined for multicast groups, this must be taken into account when deploying IP Multicast.
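
On the receiving side, a host must join the multicast group so that the network forwards the stream onto its segment; the join is what the host signals via IGMP. The sketch below mirrors the sender above, with the same assumed group address and port.

```python
import socket
import struct

GROUP, PORT = "239.1.1.1", 5004

def multicast_receive():
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    sock.bind(("", PORT))
    # Join the group on the default interface (triggers an IGMP membership report).
    membership = struct.pack("4s4s", socket.inet_aton(GROUP), socket.inet_aton("0.0.0.0"))
    sock.setsockopt(socket.IPPROTO_IP, socket.IP_ADD_MEMBERSHIP, membership)
    data, _addr = sock.recvfrom(2048)
    return data
```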

On educational, government, and corporate intranets, where the organization serving the content has control over the network between the server and its recipients, protocols such as IGMP and PIM can be used to deliver the streamed content to multiple LAN segments.

Peer-to-peer (P2P) protocols are based on arranging for pre-recorded streams to be sent between the recipients' own computers. This keeps the server and its network connections from becoming a bottleneck. There are, however, certain matters that may need to be resolved with such an arrangement, including technical issues, effectiveness, data integrity, and various legal questions.
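
A very rough sketch of the peer-to-peer idea: each peer holds some chunks of a pre-recorded stream and serves them to its neighbours, so no single server carries the whole load. The `Peer` class and its chunk indexing are invented for illustration and do not correspond to any particular P2P protocol.

```python
class Peer:
    def __init__(self, chunks):
        self.chunks = dict(chunks)  # chunk index -> bytes held by this peer

    def request(self, index, neighbours):
        """Fetch a missing chunk from whichever neighbour already has it."""
        if index in self.chunks:
            return self.chunks[index]
        for peer in neighbours:
            if index in peer.chunks:
                # Download from a peer rather than the origin server.
                self.chunks[index] = peer.chunks[index]
                return self.chunks[index]
        return None  # no peer has it; would fall back to the origin server
```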

Audio Streaming – How To Succeed

Author's Bio: 

Peter Radford writes Articles with Websites on a wide range of subjects. Audio Streaming Articles cover History, Development, Multimedia, Protocols.

His Website has many more Audio Streaming Articles, written by others and carefully selected.

View his Website at: audio-streaming-how-to-succeed.com

View his Blog at: audio-streaming-how-to-succeed.blogspot.com