Peer-to-peer Multimedia Streaming and Caching Service
Won J. Jeon and Klara Nahrstedt
University of Illinois at Urbana-Champaign, Urbana, USA
Agenda
- Introduction
- Proposed peer-to-peer architecture
- Caching and streaming
- Simulation results
- Comparison with a server-less architecture
- Conclusion
Introduction
- Important metrics for multimedia streaming:
  - Low initial delay
  - Small delay jitter during playback
  - Minimum network bandwidth utilization
Current Solutions (1)
- Caching and pre-fetching by a media gateway (proxy):
  - Caches segments of a stream, or only its prefix
  - Located geographically close to the clients
  - Achieves a small initial delay and low delay jitter during playback
Current Solutions (2)
- Broadcasting services (e.g., Skyscraper):
  - Achieve minimum server network bandwidth utilization
  - Assume synchronous playback times across clients
  - Assume buffering at the clients
Motivation
- Motivation for the proposed peer-to-peer architecture:
  - Exploits the proximity of clients
  - Minimizes the bandwidth utilization between the server and a group of clients
- Architecture:
  - Assumes the group of peer-to-peer clients is connected via a LAN
  - Each client not only receives streams from the server, but also acts as a proxy server
Proposed Architecture
- Topology:
  - One server, one cache manager, and four nodes
  - s(i, j) represents the segment of a stream between byte i and byte j
[Figure: topology with server S, a cache manager, clients C1-C3 caching segments s(0, t1), s(t1, t2), s(t3, t4), and client Cx requesting s(0, t4)]
Caching
- Cache management:
  - Each client caches the retrieved stream and publishes its cache information to the cache manager
  - Each client monitors its own resource availability (e.g., network bandwidth) and reports updates to the cache manager
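The bookkeeping described above can be sketched in a few lines. This is a minimal illustration, not the paper's implementation; the class and method names (`CacheManager`, `publish`, `lookup`) are hypothetical.

```python
from collections import defaultdict

class CacheManager:
    """Minimal sketch of the centralized cache manager.

    Peers publish which byte ranges of a stream they have cached and
    report their resource availability; peers later query the manager
    to locate cached segments s(start, end).
    """
    def __init__(self):
        self.segments = defaultdict(list)   # stream_id -> [(peer, start, end)]
        self.resources = {}                 # peer -> available bandwidth (bps)

    def publish(self, peer, stream_id, start, end):
        """Record that `peer` has cached bytes [start, end) of the stream."""
        self.segments[stream_id].append((peer, start, end))

    def update_resources(self, peer, bandwidth):
        """Record the peer's currently available network bandwidth."""
        self.resources[peer] = bandwidth

    def lookup(self, stream_id, start, end):
        """Return peers holding cached bytes overlapping s(start, end)."""
        return [(p, s, e) for p, s, e in self.segments[stream_id]
                if s < end and e > start]
```

For example, if C1 has published s(0, 1000) and C2 has published s(1000, 2000), a lookup for s(500, 1500) returns both peers.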
Streaming
- Cache lookup:
  - Send a query message to the cache manager for information about streams cached at peer clients
- Streaming and pre-fetching:
  - Using the response from the cache manager, the client sends streaming and pre-fetching requests to the peer clients or to the server
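The lookup-then-request flow can be sketched as follows. This is an illustrative sketch under the assumption that the cache manager returns a list of (peer, start, end) hits; the function name `plan_requests` is hypothetical.

```python
def plan_requests(hits, server, start, end):
    """Plan where to request each piece of s(start, end).

    `hits` is the cache manager's response: (peer, s, e) tuples for
    cached byte ranges. Cached pieces are requested from the owning
    peer; any uncovered gap falls back to the server.
    """
    plan = []
    cursor = start
    for peer, s, e in sorted(hits, key=lambda hit: hit[1]):
        if s > cursor:                        # gap before this hit
            plan.append((server, cursor, s))  # fetch gap from server
        plan.append((peer, max(s, cursor), min(e, end)))
        cursor = max(cursor, min(e, end))
    if cursor < end:                          # trailing gap
        plan.append((server, cursor, end))
    return plan
```

For instance, with C1 caching s(0, 1000) and C2 caching s(1500, 2000), a request for s(0, 2000) is split into a C1 piece, a server piece for the gap s(1000, 1500), and a C2 piece.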
Streaming
- Timing diagram of the streaming and pre-fetching requested by Ci
[Figure: timing diagram]
Streaming Switching
- Goal: minimize the switching delay jitter
- The pre-fetching time t1* is determined by:
  - The network bandwidth Bik between Ci and Ck
  - The network delay Dik between Ci and Ck
- Case 1: Bik is larger than the service rate ri
- Case 2: Bik is smaller than ri
Streaming Switching
- Pre-fetching time t1*:
  - Case 1: [formula lost in extraction]
  - Case 2: [formula lost in extraction]
- Estimated available bandwidth between Ci and Ck over the period [t1*, t2]
- Estimated size of the stream at time t
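The formulas on this slide did not survive extraction, so the following is only a plausible sketch of the case distinction, not the paper's exact expressions; the function name and parameters (`rate`, `bandwidth`, `delay`) are assumptions.

```python
def prefetch_start_time(t1, t2, rate, bandwidth, delay):
    """Plausible latest pre-fetch start time t1* (hypothetical sketch).

    Case 1 (bandwidth >= playback rate): starting one network delay
    before the switch point t1 suffices, since data then arrives at
    least as fast as it is consumed.
    Case 2 (bandwidth < playback rate): the segment s(t1, t2) must be
    fully buffered by its last playback deadline t2, so pre-fetching
    must start early enough that bandwidth * (t2 - t1*) covers
    rate * (t2 - t1) bytes.
    """
    if bandwidth >= rate:                                 # Case 1
        return t1 - delay
    return t2 - (rate / bandwidth) * (t2 - t1) - delay    # Case 2
```

For example, with a 1.5 Mbps stream and only 1.0 Mbps of estimated available bandwidth, pre-fetching a 10-second segment must begin 15 seconds before its playback deadline.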
Simulation
- Simulator: ns-2
- Video: Jurassic Park I (1.5 Mbps)
- One server, two routers, and four clients
- Background traffic on all links between nodes and routers: Pareto distribution
Simulation
- Simulated topology
[Figure: simulated network topology]
Simulation Results
- Without pre-fetching
- With pre-fetching
[Figures: simulation results without and with pre-fetching]
Comparison
- Proposed P2P architecture:
  - The dedicated server is not eliminated
  - Assumes different segments are cached by different peers
  - Assumes a centralized cache manager
  - Single points of failure: the server and the cache manager
- Server-less architecture:
  - The dedicated server is eliminated
  - Video blocks are distributed across all nodes
  - Requires an additional encoding step for fault tolerance
Conclusion
- Proposed a peer-to-peer streaming and caching architecture
- The cache manager maintains all cache and network connection information
- Achievements:
  - Reduces the initial delay
  - Minimizes the delay jitter during playback
End of Presentation
Thank you!