ACM Multimedia 2019 Grand Challenge

-- Live Video Streaming

Total Prize: $4500



In recent years, a new breed of video services that support live video broadcast has become tremendously popular. These services allow users to broadcast live video over the Internet and interact with their viewers, and have many applications including journalism and education.

Live video streaming over HTTP-based chunked streaming protocols, such as DASH, faces many new technical challenges compared to on-demand streaming of pre-recorded video. First, it requires low end-to-end latency for real-time interaction between the broadcaster and the viewers, while still maintaining few rebuffering events and high video quality. Second, for better user experience, it is especially important to ensure the stability of transmission during the live broadcast. Third, the challenge is compounded by the fact that only a few seconds of video ahead of the playhead are available at any moment, unlike pre-recorded on-demand video streaming, which means there is less information that can be utilized to make optimal streaming decisions.

To encourage the research community to come together and address the challenges of live video streaming over DASH, we organize a new live video streaming challenge at ACM Multimedia 2019. We will provide a simulator platform, a set of video traces, a set of network traces, and a set of common evaluation metrics, which the challenge participants can use to implement and evaluate their live video streaming algorithms. We hope that the platform and dataset will serve as a common tool for researchers to benchmark their algorithms with each other and thus contribute towards reproducible research.

Grand Challenge Scenario

For this grand challenge, we consider the following scenario for live streaming. A streamer captures and generates a live video stream (either through a mobile phone or a PC). The video stream is uploaded to a transcoding server, which re-encodes the same video into multiple representations, each with a different bitrate and quality level. Each representation is then transmitted to CDN (content delivery network) nodes, which act as edge servers. The client issues pull requests to one of the CDN nodes, indicating which representation to download. The corresponding representation is then sent to the client and buffered before playback.

Figure 1: Universal framework for live broadcast scenarios

The main task for this grand challenge is to design an algorithm that runs at the client and decides which representation to download, what the playback rate should be, and whether to skip any frames.

Bitrate Control

The client decides which representation to download given the current network throughput.

Ideally, the client downloads the representation with the highest quality and bitrate. Playing a higher-bitrate representation improves the quality of experience (QoE) of the viewer. Downloading a higher-bitrate representation, however, fills the buffer more slowly, increasing the risk of the buffer being drained. An empty buffer causes a playback stall, damaging the QoE. Frequent switching between representations of different quality also negatively affects the QoE. The key issue is thus to carefully decide which representation to download so as to improve quality while reducing stalls and the number of switches, given the current throughput. This decision is especially challenging in the context of live streaming, as the buffer size is kept small to reduce the end-to-end latency, which increases the likelihood of stalls.
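As a concrete illustration of this trade-off, the following is a minimal sketch of a throughput-based bitrate controller. The bitrate ladder, the safety factor, and the low-buffer fallback threshold are all illustrative assumptions and not part of the official challenge API.

```python
# Hypothetical sketch of a throughput-based bitrate controller.
# The bitrate ladder, safety factor, and buffer threshold below are
# illustrative assumptions, not challenge-defined constants.

BITRATES_KBPS = [500, 1200, 2500, 4000]  # available representations
SAFETY_FACTOR = 0.8                      # stay below measured throughput


def select_bitrate(throughput_kbps, buffer_s, min_buffer_s=1.0):
    """Pick the highest bitrate sustainable at the current throughput.

    Falls back to the lowest bitrate when the buffer is nearly drained,
    reducing the risk of a stall at the cost of lower quality.
    """
    if buffer_s < min_buffer_s:
        return BITRATES_KBPS[0]
    budget = throughput_kbps * SAFETY_FACTOR
    chosen = BITRATES_KBPS[0]
    for rate in BITRATES_KBPS:
        if rate <= budget:
            chosen = rate
    return chosen
```

A real controller would also smooth the throughput estimate and penalize frequent representation switches; this sketch only shows the basic buffer-aware rate-capping idea.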

Latency Control

Current apps for live video broadcasting often support interaction with users and are thus delay-sensitive. Buffering too many segments increases the end-to-end delay and hurts the interaction between the viewer and the streamer. On the other hand, buffering too few segments increases the likelihood of a playback stall. To control the end-to-end latency, the client can adopt two mechanisms:

A. Playback Speed Control

The client can slow down its playback if necessary to avoid a stall, or to reduce its duration (e.g., when the buffer is about to be drained).

The client can speed up its playback if necessary to catch up and reduce the end-to-end delay.
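These two rules can be sketched as a simple buffer-driven speed controller. The buffer thresholds and the speed bounds below are illustrative assumptions; real players typically keep rate changes within a few percent so they remain inaudible after pitch correction.

```python
# Hypothetical sketch of buffer-driven playback speed control.
# Thresholds and speed bounds are illustrative assumptions.

def playback_rate(buffer_s, low_s=0.8, high_s=3.5,
                  slow=0.95, fast=1.05):
    """Slow playback when the buffer is nearly empty (to avoid a stall),
    speed it up when the buffer grows large (to cut end-to-end latency),
    and play at normal speed otherwise.
    """
    if buffer_s < low_s:
        return slow
    if buffer_s > high_s:
        return fast
    return 1.0
```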

B. Frame Skipping

A stall in playback will inevitably increase the end-to-end latency. To catch up with the live edge, the client can skip frames, discarding some buffered content in exchange for lower latency.
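A minimal sketch of such a skipping decision is shown below. The latency budget and frame duration are illustrative assumptions, not challenge-defined constants.

```python
# Hypothetical sketch of a frame-skipping decision. The latency budget
# and frame duration are illustrative assumptions.
import math


def frames_to_skip(latency_s, max_latency_s=3.0, frame_duration_s=1 / 25):
    """Once accumulated latency exceeds the budget, skip just enough
    frames to bring the latency back under max_latency_s.
    """
    excess = latency_s - max_latency_s
    if excess <= 0:
        return 0
    return math.ceil(excess / frame_duration_s)
```

In practice the skip would be aligned to decodable boundaries (e.g., the next keyframe) rather than applied frame by frame.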

