
Publications (39)

John O'Sullivan, Darijo Raca, Jason J. Quinlan

In this short demo paper, we introduce godash 2.0. godash is a headless HTTP adaptive streaming (HAS) video platform written in Go, the programming language developed by Google. godash has been extensively rewritten for this release to provide ease of use and a host of new features. godash includes options for eight different state-of-the-art adaptation algorithms, five HAS profiles, four video codecs, the ability to stream audio and video segments, two transport protocols (TCP and QUIC), real-time output from five Quality of Experience (QoE) models, as well as a collaborative framework for the evaluation of cooperative HAS streaming. In this demo, we introduce each of the options available in the godash configuration file and illustrate how to use the collaborative players, both with godash on a native machine and with godash inside godashbed. godashbed is an integrated large-scale testbed framework for the evaluation of HAS streaming, which uses a virtual environment to serve video content locally (which allows setting security certificates) through the Mininet virtual emulation tool. In this manner, godash provides a framework for rapid deployment and testing of new HAS algorithms, QoE models and transport protocols.
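To make the configuration surface concrete, here is a minimal Go sketch of loading a godash-style configuration file; the struct fields and option values are illustrative assumptions, not godash's actual configuration keys.

```go
package main

import (
	"encoding/json"
	"fmt"
)

// Config mirrors the option categories listed in the abstract; every
// field name and value here is hypothetical, not godash's real schema.
type Config struct {
	Adaptation    string   `json:"adaptation"` // one of the eight algorithms
	Profile       string   `json:"profile"`    // one of the five HAS profiles
	Codec         string   `json:"codec"`      // one of the four codecs
	StreamAV      bool     `json:"streamAudioVideo"`
	Transport     string   `json:"transport"` // "tcp" or "quic"
	QoEModels     []string `json:"qoeModels"` // up to five real-time QoE models
	Collaborative bool     `json:"collaborative"`
}

func main() {
	raw := `{"adaptation":"arbiter+","profile":"full","codec":"h264",
	         "streamAudioVideo":true,"transport":"quic",
	         "qoeModels":["p1203","yin"],"collaborative":false}`
	var cfg Config
	if err := json.Unmarshal([]byte(raw), &cfg); err != nil {
		panic(err)
	}
	fmt.Printf("%+v\n", cfg)
}
```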

Darijo Raca, Dylan Leahy, C. Sreenan, Jason J. Quinlan

In this paper, we present a 5G trace dataset collected from a major Irish mobile operator. The dataset is generated from two mobility patterns (static and car) and across two application patterns (video streaming and file download). The dataset is composed of client-side cellular key performance indicators (KPIs) comprising channel-related metrics, context-related metrics, cell-related metrics and throughput information. These metrics are generated from a well-known non-rooted Android network monitoring application, G-NetTrack Pro. To the best of our knowledge, this is the first publicly available dataset that contains throughput, channel and context information for 5G networks. To supplement our real-time 5G production network dataset, we also provide a large-scale multi-cell 5G ns-3 simulation framework. The availability of the 5G/mmWave module for the ns-3 network simulator provides an opportunity to improve our understanding of the dynamic reasoning for adaptive clients in 5G multi-cell wireless scenarios. The purpose of our framework is to provide additional information (such as competing metrics for users connected to the same cell), thus providing otherwise unavailable information about the base station (eNodeB or eNB) environment and scheduling principles to the end user. Our framework permits other researchers to investigate this interaction through the generation of their own synthetic datasets.
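As a rough illustration of how such a trace can be consumed, the Go sketch below parses a simplified per-second KPI row; the column set is an assumption modelled on the metrics listed above, not the exact G-NetTrack Pro export schema.

```go
package main

import (
	"encoding/csv"
	"fmt"
	"strconv"
	"strings"
)

// Sample is one per-second row of the kind of KPIs the dataset describes.
// The columns (RSRP/RSRQ/SNR/CQI plus downlink throughput) are an assumed
// simplification, not the tool's exact export format.
type Sample struct {
	Timestamp       string
	RSRP, RSRQ, SNR float64
	CQI             int
	DLbps           float64 // downlink throughput, bit/s
}

func main() {
	trace := "time,rsrp,rsrq,snr,cqi,dl_bps\n" +
		"10:00:01,-95.0,-11.0,12.5,9,48000000\n" +
		"10:00:02,-97.5,-12.0,10.1,8,31000000\n"
	rows, err := csv.NewReader(strings.NewReader(trace)).ReadAll()
	if err != nil {
		panic(err)
	}
	for _, row := range rows[1:] { // skip the header row
		rsrp, _ := strconv.ParseFloat(row[1], 64)
		rsrq, _ := strconv.ParseFloat(row[2], 64)
		snr, _ := strconv.ParseFloat(row[3], 64)
		cqi, _ := strconv.Atoi(row[4])
		dl, _ := strconv.ParseFloat(row[5], 64)
		fmt.Printf("%+v\n", Sample{row[0], rsrp, rsrq, snr, cqi, dl})
	}
}
```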

Yusuf Sani, Darijo Raca, Jason J. Quinlan, C. Sreenan

The growth of online video-on-demand consumption continues unabated. Existing heuristic-based adaptive bit-rate (ABR) selection algorithms are typically designed to optimise video quality within a very narrow context. This may lead to video streaming providers implementing different ABR algorithms/players based on network connection, device capabilities, video content, etc., in order to serve the multitude of their users' streaming requirements. In this paper, we present SMASH: a Supervised Machine learning approach to Adaptive Streaming over HTTP, which takes a tentative step towards the goal of a one-size-fits-all approach to ABR. We utilise the streaming output from the adaptation logic of nine ABR algorithms across a variety of streaming scenarios (generating nearly one million records) and design a machine learning model, using systematically selected features, to predict the optimal choice of bitrate for the next video segment to download. Our evaluation results show that SMASH guarantees a high Quality of Experience (QoE) with consistent performance across a variety of streaming contexts.
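A minimal Go sketch of the decision step: a hand-set linear scorer stands in for SMASH's trained supervised model, and the feature set (recent throughput, buffer level, last bitrate) is an assumed simplification of the systematically selected features the paper describes.

```go
package main

import "fmt"

// Features for the next-segment decision; an assumed simplification of
// SMASH's feature set, chosen for illustration only.
type Features struct {
	ThroughputMbps float64 // smoothed recent delivery rate
	BufferSec      float64 // current buffer occupancy
	LastKbps       int     // bitrate of the previous segment
}

// predictNext stands in for the trained model: it scores each candidate
// bitrate with a hand-set linear function and returns the best. In SMASH
// this scoring is learned from nearly one million streaming records.
func predictNext(f Features, ladder []int) int {
	best, bestScore := ladder[0], -1e18
	for _, b := range ladder {
		score := -abs(float64(b)/1000-f.ThroughputMbps*0.8) - // track throughput
			0.2*abs(float64(b-f.LastKbps))/1000 // penalise switching
		if f.BufferSec < 5 && b > f.LastKbps {
			score -= 10 // avoid upswitching on a low buffer
		}
		if score > bestScore {
			best, bestScore = b, score
		}
	}
	return best
}

func abs(x float64) float64 {
	if x < 0 {
		return -x
	}
	return x
}

func main() {
	ladder := []int{235, 750, 1750, 4300} // kbps
	f := Features{ThroughputMbps: 3.0, BufferSec: 12, LastKbps: 1750}
	fmt.Println("next bitrate (kbps):", predictNext(f, ladder))
}
```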

Darijo Raca, Maëlle Manifacier, Jason J. Quinlan

In this short paper, we present goDASH, an infrastructure for headless streaming of HTTP adaptive streaming (HAS) video content, implemented in golang, an open-source programming language supported by Google. goDASH's main functionality is the ability to stream HAS content without decoding the actual video (a headless player). This results in low memory requirements and the ability to run multiple players in a large-scale evaluation setup. goDASH comes complete with numerous state-of-the-art HAS algorithms and is fully written in golang, which simplifies the implementation of new adaptation algorithms and functions. goDASH supports two transport protocols: Transmission Control Protocol (TCP) and Quick UDP Internet Connections (QUIC). QUIC is a relatively new protocol that promises performance improvements over the widely used TCP. We believe that goDASH is the first emulation-based HAS player to support QUIC. The main limitation in using the QUIC protocol is the need for a security certificate setup on both ends (client and server), as QUIC demands an encrypted connection. This limitation is eased by providing our own testbed framework, known as goDASHbed. This framework uses a virtual environment to serve video content locally (which allows setting security certificates) through the Mininet virtual emulation tool. As part of Mininet, goDASH can be used in conjunction with other traffic generators.
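The certificate requirement can be illustrated with Go's standard library alone: the sketch below trusts a local CA so a client can fetch segments from a locally served HTTPS endpoint, as goDASHbed does inside Mininet. The file path and URL are placeholders, and goDASH's actual QUIC support relies on a dedicated QUIC library rather than net/http.

```go
package main

import (
	"crypto/tls"
	"crypto/x509"
	"fmt"
	"io"
	"net/http"
	"os"
)

func main() {
	// Trust the testbed's self-signed CA; the path is a placeholder.
	pem, err := os.ReadFile("certs/ca.pem")
	if err != nil {
		panic(err)
	}
	pool := x509.NewCertPool()
	pool.AppendCertsFromPEM(pem)

	client := &http.Client{Transport: &http.Transport{
		TLSClientConfig: &tls.Config{RootCAs: pool},
	}}

	// Placeholder URL for a locally served DASH segment.
	resp, err := client.Get("https://10.0.0.1/bbb/seg_1.m4s")
	if err != nil {
		panic(err)
	}
	defer resp.Body.Close()
	n, _ := io.Copy(io.Discard, resp.Body)
	fmt.Println("fetched bytes:", n)
}
```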

Darijo Raca, A. Zahran, C. Sreenan, R. Sinha, Emir Halepovic, R. Jana, V. Gopalakrishnan

The highly dynamic wireless communication environment poses a challenge for many applications (e.g., adaptive multimedia streaming services). Providing accurate throughput prediction (TP) can significantly improve the performance of these applications. The scheduling algorithms in cellular networks consider various PHY metrics (e.g., the channel quality indicator, CQI) and throughput history when assigning resources to each user. This article explains how AI can be leveraged for accurate TP in cellular networks using PHY and application-layer metrics. We present key architectural components and implementation options, illustrating their advantages and limitations. We also highlight key design choices and investigate their impact on prediction accuracy using real data. We believe this is the first study that examines the impact of integrating network-level data and applying a deep learning technique (on PHY and application data) for TP in cellular systems. Using video streaming as a use case, we illustrate how accurate TP improves the end user's Quality of Experience (QoE). Furthermore, we identify open questions and research challenges in the area of AI-driven TP. Finally, we report on lessons learned and provide conclusions that we believe will be useful to network practitioners seeking to apply AI.
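One architectural option the article considers is feeding a history window of PHY- and application-layer metrics to a learned predictor. The Go sketch below assembles such an input window; the metric set and window length are illustrative assumptions.

```go
package main

import "fmt"

// KPI is one second of PHY- and application-layer metrics; the exact
// metric set is an assumption for illustration.
type KPI struct {
	CQI     float64
	RSRP    float64
	AppMbps float64 // application-level delivery rate
}

// window flattens the last n samples into one feature vector, the form
// of input a learned throughput predictor would consume.
func window(hist []KPI, n int) []float64 {
	if len(hist) > n {
		hist = hist[len(hist)-n:]
	}
	v := make([]float64, 0, 3*n)
	for _, k := range hist {
		v = append(v, k.CQI, k.RSRP, k.AppMbps)
	}
	return v
}

func main() {
	hist := []KPI{{9, -95, 42}, {8, -97, 30}, {10, -93, 55}, {7, -99, 21}}
	fmt.Println(window(hist, 3)) // features for the predictor
}
```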

M. Cosovic, Muhamed Delalic, Darijo Raca, D. Vukobratović

The state estimation algorithm estimates the values of the state variables based on the measurement model described as a system of equations. Prior to applying the state estimation algorithm, the existence and uniqueness of the solution of the underlying system of equations is determined through observability analysis. If a unique solution does not exist, the observability analysis defines observable islands and further defines an additional set of equations (measurements) needed to determine a unique solution. For the first time, we utilise factor graphs and the Gaussian belief propagation algorithm to define a novel observability analysis approach. The observable islands and the placement of measurements to restore observability are identified by following the evolution of variances across the iterations of the Gaussian belief propagation algorithm over the factor graph. Due to the sparsity of the underlying power network, the resulting method has linear computational complexity (assuming a constant number of iterations), making it particularly suitable for solving large-scale systems. The method can be flexibly matched to distributed computational resources, allowing for the determination of observable islands and observability restoration in a distributed fashion. Finally, we discuss the performance of the proposed observability analysis using power systems whose size ranges between 1354 and 70,000 buses.
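As a toy stand-in for the variance-tracking idea: variables that exchange information through measurement factors end up with finite variance, so grouping them recovers the observable islands. The Go sketch below does this grouping with a disjoint-set over hypothetical branch-flow measurements; the paper's actual method follows the evolution of variances across Gaussian belief propagation iterations rather than plain graph connectivity.

```go
package main

import "fmt"

// find and union implement a disjoint-set over bus indices.
func find(p []int, x int) int {
	for p[x] != x {
		p[x] = p[p[x]] // path compression
		x = p[x]
	}
	return x
}

func union(p []int, a, b int) { p[find(p, a)] = find(p, b) }

func main() {
	const buses = 6
	// Branch-flow measurements as bus pairs; in GBP terms these are the
	// factors through which finite variance propagates. Toy data.
	measured := [][2]int{{0, 1}, {1, 2}, {4, 5}}

	p := make([]int, buses)
	for i := range p {
		p[i] = i
	}
	for _, m := range measured {
		union(p, m[0], m[1])
	}

	islands := map[int][]int{}
	for b := 0; b < buses; b++ {
		r := find(p, b)
		islands[r] = append(islands[r], b)
	}
	// Each group is an observable island; singleton islands (bus 3 here)
	// mark where measurements must be placed to restore observability.
	for _, members := range islands {
		fmt.Println("island:", members)
	}
}
```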

Darijo Raca, A. Zahran, C. Sreenan, R. Sinha, Emir Halepovic, R. Jana, V. Gopalakrishnan, B. Bathula et al.

Today's HTTP adaptive streaming applications are designed to provide high levels of Quality of Experience (QoE) across a wide range of network conditions. The adaptation logic in these applications typically needs an estimate of future network bandwidth for quality decisions. This estimation, however, is challenging in cellular networks because of the inherent variability of bandwidth and latency due to factors like signal fading, variable load, and user mobility. In this paper, we exploit machine learning (ML) techniques on a range of radio channel metrics and throughput measurements from a commercial cellular network to improve estimation accuracy and, hence, streaming quality. We propose a novel summarization approach for input raw data samples. This approach reduces the 90th percentile of absolute prediction error from 54% to 13%. We evaluate our prediction engine in a trace-driven controlled lab environment using a popular Android video player (ExoPlayer) running on a stock mobile device, and also validate it in the commercial cellular network. Our results show that the three tested adaptation algorithms register improvement across all QoE metrics when using prediction, with stall reduction of up to 85% and bitrate switching reduction of up to 40%, while maintaining or improving video quality. Finally, prediction improves the video QoE score by up to 33%.
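The summarization step can be sketched as reducing a window of raw per-second samples to a few statistics before they reach the predictor. The statistic set below (mean, standard deviation, min, max) is an assumption for illustration, not necessarily the paper's exact feature set.

```go
package main

import (
	"fmt"
	"math"
)

// summarize reduces a window of raw throughput samples (Mbit/s) to
// summary statistics, the form of input the prediction engine consumes.
func summarize(w []float64) (mean, std, min, max float64) {
	min, max = w[0], w[0]
	for _, x := range w {
		mean += x
		if x < min {
			min = x
		}
		if x > max {
			max = x
		}
	}
	mean /= float64(len(w))
	for _, x := range w {
		std += (x - mean) * (x - mean)
	}
	std = math.Sqrt(std / float64(len(w)))
	return
}

func main() {
	window := []float64{12.1, 9.8, 15.3, 4.2, 11.0} // last 5 seconds
	m, s, lo, hi := summarize(window)
	fmt.Printf("mean=%.1f std=%.1f min=%.1f max=%.1f\n", m, s, lo, hi)
}
```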

Darijo Raca, Yusuf Sani, C. Sreenan, Jason J. Quinlan

Recent years have witnessed an explosion of multimedia traffic carried over the Internet, with video-on-demand and live streaming being the dominant services. To sustain this growth, many streaming providers have invested considerable time and effort in keeping pace with ever-increasing user demand for better quality and the elimination of stalls. HTTP adaptive streaming (HAS) algorithms are at the core of every major streaming provider's service, and recent years have seen sustained development in HAS algorithms. Currently, to evaluate their proposed solutions, researchers need to create a framework and implement numerous state-of-the-art algorithms. Often, these frameworks lack flexibility and scalability, covering only a limited set of scenarios. To fill this gap, in this paper we propose DASHbed, a highly customizable real-time framework for testing HAS algorithms in a wireless environment. Due to its low memory requirements, DASHbed offers a means of running large-scale experiments with a hundred competing players. Finally, we supplement the proposed framework with a dataset consisting of results for five HAS algorithms tested in various evaluation scenarios. The dataset showcases the abilities of DASHbed and presents the per-segment adaptation metrics of the generated content (such as switches, buffer level, ITU-T P.1203.1 values, delivery rate, stall duration, etc.), which can be used as a baseline when researchers compare the output of their proposed algorithms against the state of the art.
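The large-scale aspect can be sketched in a few lines of Go: each goroutine stands in for one headless player, which is what makes a hundred concurrent clients cheap to run. The segment loop is a placeholder, not DASHbed's actual orchestration code.

```go
package main

import (
	"fmt"
	"sync"
	"time"
)

// player stands in for one headless streaming client; real DASHbed
// launches goDASH instances, so this loop is only a scheduling sketch.
func player(id, segments int, wg *sync.WaitGroup) {
	defer wg.Done()
	for s := 1; s <= segments; s++ {
		time.Sleep(10 * time.Millisecond) // stand-in for a segment download
	}
	fmt.Printf("player %d finished\n", id)
}

func main() {
	const n = 100 // a hundred competing players, as in the paper
	var wg sync.WaitGroup
	for i := 1; i <= n; i++ {
		wg.Add(1)
		go player(i, 5, &wg)
	}
	wg.Wait()
}
```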

A. Zahran, Darijo Raca, C. Sreenan

Dynamic adaptive streaming over HTTP (DASH) is widely adopted for video transport by major content providers. However, the inherent high variability in both encoded video and network rates represents a key challenge for designing efficient adaptation algorithms. Accommodating such variability in the adaptation logic design is essential for achieving a high user Quality of Experience (QoE). In this paper, we present ARBITER+ as a novel adaptation algorithm for DASH. ARBITER+ integrates different components that are designed to ensure a high video QoE while accommodating inherent system variabilities. These components include a tunable adaptive target rate estimator, hybrid throughput sampling, controlled switching, and short-term actual video rate tracking. We extensively evaluate streaming performance using real video and cellular network traces. We show that ARBITER+ components work in harmony to balance temporal and visual QoE aspects. Additionally, we show that ARBITER+ enjoys a noticeable QoE margin over state-of-the-art adaptation approaches under various operating conditions. Furthermore, we show that ARBITER+ also achieves the best application-level fairness when a group of mobile video clients shares a cellular base station.
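Two of these components can be sketched in simplified form, assuming illustrative coefficients: a smoothed throughput estimator standing in for hybrid throughput sampling, and a one-rung switch limiter standing in for controlled switching. ARBITER+'s actual logic is considerably more elaborate.

```go
package main

import "fmt"

// estimator keeps a smoothed throughput sample, a simplified stand-in
// for ARBITER+'s hybrid throughput sampling; alpha is illustrative.
type estimator struct{ ewma float64 }

func (e *estimator) add(sampleMbps float64) {
	const alpha = 0.3
	if e.ewma == 0 {
		e.ewma = sampleMbps
		return
	}
	e.ewma = alpha*sampleMbps + (1-alpha)*e.ewma
}

// controlledSwitch moves at most one rung per decision, a simplified
// version of ARBITER+'s controlled-switching component.
func controlledSwitch(ladder []int, cur int, targetMbps float64) int {
	want := 0
	for i, b := range ladder {
		if float64(b)/1000 <= targetMbps {
			want = i
		}
	}
	if want > cur+1 {
		want = cur + 1 // climb gradually
	}
	if want < cur-1 {
		want = cur - 1 // and descend gradually
	}
	return want
}

func main() {
	ladder := []int{235, 750, 1750, 4300} // kbps
	e := &estimator{}
	cur := 0
	for _, s := range []float64{1.2, 2.5, 5.0, 6.1} {
		e.add(s)
		cur = controlledSwitch(ladder, cur, e.ewma)
		fmt.Printf("sample=%.1f ewma=%.2f -> %d kbps\n", s, e.ewma, ladder[cur])
	}
}
```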

Darijo Raca, Jason J. Quinlan, A. Zahran, C. Sreenan

In this paper, we present a 4G trace dataset composed of client-side cellular key performance indicators (KPIs) collected from two major Irish mobile operators, across different mobility patterns (static, pedestrian, car, bus and train). The 4G trace dataset contains 135 traces, with an average duration of fifteen minutes per trace and throughput ranging from 0 to 173 Mbit/s at a granularity of one sample per second. Our traces are generated from a well-known non-rooted Android network monitoring application, G-NetTrack Pro. This tool enables the capture of various channel-related KPIs, context-related metrics, downlink and uplink throughput, and cell-related information. To the best of our knowledge, this is the first publicly available dataset that contains throughput, channel and context information for 4G networks. To supplement our real-time 4G production network dataset, we also provide a synthetic dataset generated from a large-scale 4G ns-3 simulation that includes one hundred users randomly scattered across a seven-cell cluster. The purpose of this dataset is to provide additional information (such as competing metrics for users connected to the same cell), thus providing otherwise unavailable information about the eNodeB environment and scheduling principles to the end user. In addition to this dataset, we also provide the code and context information to allow other researchers to generate their own synthetic datasets.
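A small Go sketch of one way to use such a dataset: grouping traces by mobility pattern and averaging throughput per pattern. The values and field names are toy assumptions, not dataset contents.

```go
package main

import "fmt"

// Trace is a minimal stand-in for one dataset entry: the mobility
// pattern label and the trace's average downlink throughput in Mbit/s.
type Trace struct {
	Mobility string
	AvgMbps  float64
}

func main() {
	// Toy values; the real dataset has 135 traces across five patterns.
	traces := []Trace{
		{"static", 42.1}, {"car", 18.7}, {"train", 9.3},
		{"static", 55.0}, {"bus", 14.2}, {"pedestrian", 30.8},
	}
	sum := map[string]float64{}
	cnt := map[string]int{}
	for _, t := range traces {
		sum[t.Mobility] += t.AvgMbps
		cnt[t.Mobility]++
	}
	for m, s := range sum {
		fmt.Printf("%-10s mean throughput %.1f Mbit/s over %d trace(s)\n",
			m, s/float64(cnt[m]), cnt[m])
	}
}
```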

Darijo Raca, A. Zahran, C. Sreenan, R. Sinha, Emir Halepovic, R. Jana, V. Gopalakrishnan, B. Bathula et al.

Streaming over the wireless channel is challenging due to rapid fluctuations in available throughput. Encouraged by recent advances in cellular throughput prediction based on radio link metrics, we examine the impact on Quality of Experience (QoE) of using prediction within existing algorithms based on the DASH standard. By design, DASH algorithms estimate available throughput at the application level from chunk rates and then apply some averaging function. We investigate alternatives for modifying these algorithms, either providing the algorithms direct predictions in place of estimates or feeding predictions in place of measurement samples. In addition, we explore different prediction horizons, from one to three chunk durations. Furthermore, we introduce different levels of error into ideal prediction values to analyse the deterioration in user QoE as a function of average error. We find that by applying accurate prediction to three algorithms, user QoE can improve by up to 55%, depending on the algorithm in use. Furthermore, a longer horizon positively affects QoE metrics. Accurate predictions have the most significant impact on stall performance, eliminating stalls completely. Prediction also improves switching behaviour significantly, and longer prediction horizons enable a client to promptly reduce quality and avoid stalls when the throughput drops for long enough to deplete the buffer. For all algorithms, a 3-chunk horizon strikes the best balance between the different QoE metrics and, as a result, achieves the highest user QoE. While error-induced predictions significantly lower user QoE in certain situations, on average they provide a 15% improvement over DASH algorithms without any prediction.
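A minimal sketch of feeding predictions directly to an adaptation decision, with the horizon as a parameter: pick the highest bitrate sustainable under the worst predicted throughput across the horizon. This conservative rule is an assumption for illustration, not the exact substitution the paper evaluates, but it shows why a longer horizon lets a client reduce quality pre-emptively and avoid stalls.

```go
package main

import "fmt"

// pickBitrate chooses the highest rung sustainable under the worst
// predicted throughput across the horizon; a simple illustrative way of
// replacing application-level estimates with direct predictions.
func pickBitrate(ladder []int, predMbps []float64, horizon int) int {
	if horizon > len(predMbps) {
		horizon = len(predMbps)
	}
	worst := predMbps[0]
	for _, p := range predMbps[:horizon] {
		if p < worst {
			worst = p
		}
	}
	best := ladder[0]
	for _, b := range ladder {
		if float64(b)/1000 <= worst {
			best = b
		}
	}
	return best
}

func main() {
	ladder := []int{235, 750, 1750, 4300}                          // kbps
	pred := []float64{4.0, 2.0, 0.9}                               // next 3 chunk durations
	fmt.Println("1-chunk horizon:", pickBitrate(ladder, pred, 1))  // 1750
	fmt.Println("3-chunk horizon:", pickBitrate(ladder, pred, 3))  // 750
}
```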

Darijo Raca, A. Zahran, C. Sreenan, R. Sinha, Emir Halepovic, R. Jana, V. Gopalakrishnan
