TY - GEN
T1 - Argus
T2 - 45th IEEE Real-Time Systems Symposium, RTSS 2024
AU - Chen, Qiang
AU - Li, Changlong
N1 - Publisher Copyright:
© 2024 IEEE.
PY - 2024
Y1 - 2024
N2 - Real-time high-quality (HQ) videos are increasingly popular in daily life (e.g., 4K video, AR/VR). However, due to their ultra-high definition and high frame rates, existing decoders cannot process video frames on time. Video decoding has become a bottleneck in modern computers, especially on consumer devices. To ensure low latency, the system drops frames when the decoder is under pressure, which sacrifices video quality. This paper shows that it is possible to achieve both low latency and high quality, based on the observation that computers decode video frames with customized hardware while leaving CPU resources idle. In this paper, we propose a new real-time HQ video decoding solution called Argus. The key insight is to make use of the wasted CPU resources. However, we show that scheduling improper frames to the CPU can degrade, rather than improve, performance, which is contrary to expectation. To tackle the fundamental challenges, this paper further proposes two novel schemes: (1) a lightweight neural network model to estimate decoder pressure; and (2) a frame-characteristics-aware scheduler. We have implemented Argus on both simulators and real-life consumer devices. Experimental results show that Argus reduces tail queuing latency by 43.8% on average. More importantly, with the coordination of CPUs, the smoothness of video playback is effectively improved (2.2% frame loss is avoided on average) compared to the state-of-the-art.
AB - Real-time high-quality (HQ) videos are increasingly popular in daily life (e.g., 4K video, AR/VR). However, due to their ultra-high definition and high frame rates, existing decoders cannot process video frames on time. Video decoding has become a bottleneck in modern computers, especially on consumer devices. To ensure low latency, the system drops frames when the decoder is under pressure, which sacrifices video quality. This paper shows that it is possible to achieve both low latency and high quality, based on the observation that computers decode video frames with customized hardware while leaving CPU resources idle. In this paper, we propose a new real-time HQ video decoding solution called Argus. The key insight is to make use of the wasted CPU resources. However, we show that scheduling improper frames to the CPU can degrade, rather than improve, performance, which is contrary to expectation. To tackle the fundamental challenges, this paper further proposes two novel schemes: (1) a lightweight neural network model to estimate decoder pressure; and (2) a frame-characteristics-aware scheduler. We have implemented Argus on both simulators and real-life consumer devices. Experimental results show that Argus reduces tail queuing latency by 43.8% on average. More importantly, with the coordination of CPUs, the smoothness of video playback is effectively improved (2.2% frame loss is avoided on average) compared to the state-of-the-art.
KW - Frame scheduling
KW - Low latency
KW - Real-time videos
KW - User experience
UR - https://www.scopus.com/pages/publications/85217618797
U2 - 10.1109/RTSS62706.2024.00014
DO - 10.1109/RTSS62706.2024.00014
M3 - Conference contribution
AN - SCOPUS:85217618797
T3 - Proceedings - Real-Time Systems Symposium
SP - 43
EP - 56
BT - Proceedings - 2024 IEEE Real-Time Systems Symposium, RTSS 2024
PB - Institute of Electrical and Electronics Engineers Inc.
Y2 - 10 December 2024 through 13 December 2024
ER -