
Publications (82)

Adriana Lipovac, Ante Mihaljevic, V. Lipovac

Large peak-to-average power ratio (PAPR) and carrier frequency offset (CFO) are dominant impairments of orthogonal frequency-division multiplexing (OFDM) symbol transmission, which is applied in state-of-the-art wireless operator networks. In this work, we deal with the consequences of the amplitude peak clipping that is commonly used at the transmitter to reduce the PAPR of the OFDM symbol and thus prevent the non-linear distortion that would otherwise be imposed by the output high-power amplifier (HPA). Regardless of whether the clipping mechanism at the transmitter is inherent (related to the HPA) or deliberate (due to PAPR reduction), the clipped OFDM symbol arriving at the receiver may degrade detection accuracy and transmission performance. However, the methods applied so far at the receiver to compensate for the non-linear distortion due to clipping are quite complex and computationally demanding. In contrast, we propose an effective receiver-side mitigation, deriving a closed-form enhanced detection criterion that requires only common measurements of the mean and rms values, as well as the autocorrelation, of the received OFDM symbol comprising both unclipped and clipped sections. Such improved detection was shown to significantly reduce the side effects of clipping and restore satisfactory transmission performance, the bit error rate (BER) in particular. The proposed analytical model was preliminarily verified by versatile Monte Carlo simulations and a professional industry-standard vector signal analysis (VSA) test system, as well as by BER testing. The evident convergence of the three methods' test results leads to the conclusion that the proposed clipped-OFDM-symbol detection method provides a clear improvement over the conventional one.
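A minimal sketch of the transmitter clipping and the receiver-side statistics that the enhanced criterion builds on is given below, assuming an illustrative 64-subcarrier QPSK symbol and an invented clipping threshold; it is not the paper's actual detector, only the measurements it relies on.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative OFDM symbol: 64 QPSK subcarriers -> time domain via IFFT
N = 64
qpsk = (rng.choice([-1, 1], N) + 1j * rng.choice([-1, 1], N)) / np.sqrt(2)
symbol = np.fft.ifft(qpsk) * np.sqrt(N)           # unit average power

# Deliberate amplitude clipping at the transmitter to reduce PAPR
clip_ratio = 1.5                                  # assumed threshold-to-rms ratio
threshold = clip_ratio * np.sqrt(np.mean(np.abs(symbol) ** 2))
mag = np.abs(symbol)
clipped = np.where(mag > threshold, threshold * symbol / mag, symbol)

# Receiver-side statistics entering the enhanced detection criterion:
# mean, rms, and autocorrelation of the (partly clipped) received symbol
mean_val = np.mean(clipped)
rms_val = np.sqrt(np.mean(np.abs(clipped) ** 2))
autocorr = np.correlate(clipped, clipped, mode="full") / N

papr_db = 10 * np.log10(np.max(np.abs(clipped) ** 2) / rms_val ** 2)
print(f"PAPR after clipping: {papr_db:.2f} dB, rms: {rms_val:.3f}")
```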

Toni Besjedica, K. Fertalj, V. Lipovac, I. Zakarija

Given the growing number of devices and their need for internet access, researchers are focusing on integrating various network technologies. Concerning indoor wireless services, a promising approach in this regard is to combine light fidelity (LiFi) and wireless fidelity (WiFi) technologies into a hybrid LiFi and WiFi network (HLWNet). Such a network benefits from LiFi's distinct capability for high-speed data transmission and from the wide radio coverage offered by WiFi technologies. In this paper, we describe the framework of the HLWNet architecture, providing an overview of the handover methods used in HLWNets and presenting the basic architecture of hybrid LiFi/WiFi networks, the optimization of cell deployment, relevant modulation schemes, illumination constraints, and backhaul device design. The survey also reviews the performance and recent achievements of HLWNets compared to legacy networks, with an emphasis on the signal-to-noise-and-interference ratio (SINR), spectral and power efficiency, and quality of service (QoS). In addition, user behaviour is discussed, considering the interference that user movement causes in a LiFi channel, handover frequency, and load balancing. Furthermore, recent advances in indoor positioning and the security of hybrid networks are presented, and finally, directions of the hybrid network's evolution in the foreseeable future are discussed.
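As a toy illustration of the SINR-driven handover decisions such surveys cover, the sketch below switches between a LiFi and a WiFi access point with a simple hysteresis margin; the threshold, dB figures, and function names are invented for illustration.

```python
# Hypothetical SINR-based LiFi/WiFi handover with hysteresis (values invented)
HYSTERESIS_DB = 3.0   # margin that suppresses ping-pong handovers

def select_ap(current: str, sinr_lifi_db: float, sinr_wifi_db: float) -> str:
    """Stay on the current AP unless the other one is better by the margin."""
    if current == "lifi" and sinr_wifi_db > sinr_lifi_db + HYSTERESIS_DB:
        return "wifi"
    if current == "wifi" and sinr_lifi_db > sinr_wifi_db + HYSTERESIS_DB:
        return "lifi"
    return current

# Example: LiFi SINR collapses when the user blocks the light path
print(select_ap("lifi", sinr_lifi_db=2.0, sinr_wifi_db=15.0))  # -> 'wifi'
```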

Jasmin Musovic, Adriana Lipovac, V. Lipovac

In this work, we adopt a stochastic-geometry analysis of a heterogeneous cellular network to estimate energy and spectral network efficiency. More specifically, practical experience has widely shown that field assessment of the Signal-to-Noise-and-Interference Ratio (SINR), the key physical-layer performance indicator, involves quite sophisticated test instrumentation that is not always available outside the lab environment. In this regard, we present a simpler test model based on the much easier-to-measure Bit Error Rate (BER), as the latter can deteriorate due to various impairments, all regarded here as equivalent additive white Gaussian noise (AWGN) that abstracts (in terms of equal BER degradation) any actual non-AWGN impairment. We validated the derived analytical model for heterogeneous two-tier networks by means of the ns-3 simulator, whose test results fit well with the corresponding analytically estimated ones, both indicating that small cells enable better energy and spectral efficiency than larger-cell networks.
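The core idea, mapping a measured BER back to an equivalent AWGN signal quality, can be sketched as follows for QPSK, where BER = Q(sqrt(2 Eb/N0)); the choice of modulation is an assumption made purely for illustration.

```python
import numpy as np
from scipy.stats import norm

def ebno_from_ber_qpsk(ber: float) -> float:
    """Equivalent AWGN Eb/N0 (linear) from a measured BER, assuming QPSK:
    BER = Q(sqrt(2 * Eb/N0))  =>  Eb/N0 = Qinv(BER)**2 / 2."""
    q_inv = norm.isf(ber)          # inverse Q-function
    return q_inv ** 2 / 2

ber_measured = 1e-3                # easy-to-measure field value
ebno = ebno_from_ber_qpsk(ber_measured)
print(f"Equivalent AWGN Eb/N0: {10 * np.log10(ebno):.1f} dB")  # ~6.8 dB
```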

Adriana Lipovac, V. Lipovac, M. Hamza, V. Batos

The optical time-domain reflectometer (OTDR) has long been, and still is, considered the main test tool for characterizing fiber-optic links, i.e., for identifying and localizing refractive and reflective events such as breaks, splices, and connectors, and measuring their insertion/return loss. Specifically, a sufficient dynamic range, and thus a correspondingly high signal-to-noise ratio (SNR), enables clear far-end visibility even of long fiber links. Moreover, under such conditions, the highest achievable optical bit-error-rate (BER) floor is to a large extent determined by major reflective events, such as the specific trace distortion caused by connectors and splices, each with a significant return loss. Realizing this has opened an opportunity to extend the standard OTDR capabilities with appropriate trace post-processing that predicts the BER floor. Accordingly, considering the SNR high, and thereby inter-symbol interference the dominant error-generating mechanism, we applied the time-dispersion channel model, which determines the BER floor from the rms delay spread of the (fiber) channel power-delay profile. We verified the BER floor prediction in an exemplar practical test situation by measuring the actual BER on the same fiber link, and found the obtained values to match the OTDR-based predictions well. Furthermore, when no dominant reflective events are identified on the OTDR trace, the time dispersion is very small, allowing the OFDM symbol cyclic prefix to always prevent inter-symbol interference. This leaves the carrier frequency offset (CFO) to solely determine the residual BER floor, and vice versa, enabling indirect estimation of the CFO-induced phase distortion by simple BER testing. In this regard, we abstracted the CFO with AWGN, which is justified by the Central Limit Theorem, to enable efficient and quite accurate short-term BER (and thus CFO phase error) predictions.
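The time-dispersion model referred to here ties the irreducible BER floor to the normalized rms delay spread; a minimal sketch follows, using the commonly cited quadratic approximation BER_floor ≈ K (τ_rms/T)², where the constant K is modulation-dependent and set to 1 below purely as a placeholder, and the power-delay profile values are invented.

```python
import numpy as np

def rms_delay_spread(delays_ns, powers_db):
    """rms delay spread of a power-delay profile (delays in ns, powers in dB)."""
    p = 10 ** (np.asarray(powers_db) / 10)
    t = np.asarray(delays_ns, dtype=float)
    mean_t = np.sum(p * t) / np.sum(p)
    return np.sqrt(np.sum(p * (t - mean_t) ** 2) / np.sum(p))

# Illustrative PDP: main pulse plus two reflective events from the OTDR trace
delays = [0.0, 5.0, 12.0]        # ns (assumed)
powers = [0.0, -15.0, -20.0]     # dB relative to the main path (assumed)

tau_rms = rms_delay_spread(delays, powers)
T_symbol = 100.0                 # ns, assumed symbol period (e.g., 10 Mbaud)

K = 1.0                          # modulation-dependent constant, placeholder
ber_floor = K * (tau_rms / T_symbol) ** 2
print(f"tau_rms = {tau_rms:.2f} ns, predicted BER floor ~ {ber_floor:.2e}")
```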

Jasmin Musovic, Adriana Lipovac, V. Lipovac

In this paper, we analyze an arbitrary heterogeneous cellular network applying stochastic geometry, and propose a modified model for assessing network spectral and energy efficiency. In this regard, we recognize that, in practice, determining the Signal-to-Noise-and-Interference Ratio (SINR), the key performance indicator, requires complex field test equipment, which might not be available or affordable. Therefore, we propose a simple model based on the relatively easily measurable Bit Error Rate (BER), whose degradation caused by various impairments is treated here as if it were due to equivalent additive white Gaussian noise (AWGN), thus abstracting any specific non-AWGN distortion. The proposed analytical model is verified by the ns-3 network simulator, whose test results are found to match the corresponding estimated values. This indicates that both the spectral and energy efficiency of small-cell networks are higher than those of larger-cell networks, even more so for heterogeneous two-tier networks.
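A back-of-the-envelope sketch of how spectral and energy efficiency can then follow from an SINR estimate, using the Shannon bound; the bandwidth, SINR, and power figures are invented for illustration.

```python
import numpy as np

def spectral_efficiency(sinr_linear: float) -> float:
    """Shannon spectral efficiency in bit/s/Hz."""
    return np.log2(1 + sinr_linear)

def energy_efficiency(sinr_linear: float, bandwidth_hz: float,
                      total_power_w: float) -> float:
    """Energy efficiency in bit/J = achievable throughput / consumed power."""
    return spectral_efficiency(sinr_linear) * bandwidth_hz / total_power_w

# Small cell vs. macro cell, with invented link budgets
for name, sinr_db, power_w in [("small cell", 15.0, 10.0),
                               ("macro cell", 5.0, 500.0)]:
    sinr = 10 ** (sinr_db / 10)
    se = spectral_efficiency(sinr)
    ee = energy_efficiency(sinr, bandwidth_hz=20e6, total_power_w=power_w)
    print(f"{name}: SE = {se:.2f} bit/s/Hz, EE = {ee / 1e6:.2f} Mbit/J")
```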

Adriana Lipovac, V. Lipovac, M. Hamza, V. Batos

The optical time-domain reflectometer (OTDR) enables simple identification and localization of a plethora of refractive and reflective events on a fiber link, including splices, connectors, and breaks, as well as measurement of insertion/return loss. Specifically, a large enough OTDR dynamic range (DR), and thus a high signal-to-noise ratio (SNR), enables clear far-end visibility of longer fibers. We point out here that, under such conditions, the optical bit-error-rate (BER) floor is dominantly determined by reflective events that introduce significant return loss. This complements the legacy OTDR tests with appropriate optical BER floor estimation in the field. As a high SNR implies inter-symbol interference as the dominant error-generating mechanism, we could apply the classical time-dispersion channel model, in which the optical BER floor is determined by the root-mean-square (rms) delay spread of the actual fiber channel power-delay profile. However, as the high-SNR condition is not always fulfilled, mostly due to insufficient DR, we propose inserting a low-noise optical preamplifier as the OTDR front-end to reduce the noise floor and amplify the backscattered signal. To verify the model in an exemplar test situation, we measured the BER on the same fiber link and found very good agreement between the measured BER floor values and the ones predicted from the OTDR trace.
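Why a high-gain, low-noise front-end restores the high-SNR condition follows from the Friis cascade formula, F = F1 + (F2 - 1)/G1: with enough preamplifier gain, the noisy receiver stage is almost masked. A small sketch with illustrative figures:

```python
import math

def cascade_noise_figure_db(nf1_db: float, gain1_db: float, nf2_db: float) -> float:
    """Friis formula for two stages: F = F1 + (F2 - 1) / G1 (linear quantities)."""
    f1 = 10 ** (nf1_db / 10)
    g1 = 10 ** (gain1_db / 10)
    f2 = 10 ** (nf2_db / 10)
    return 10 * math.log10(f1 + (f2 - 1) / g1)

# Illustrative: 5 dB NF, 30 dB gain preamp ahead of a 12 dB NF OTDR receiver
print(f"{cascade_noise_figure_db(5.0, 30.0, 12.0):.2f} dB")  # ~5 dB, preamp-dominated
```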

Adriana Lipovac, V. Lipovac, M. Hamza, V. Batos

The optical time-domain reflectometer (OTDR) is used to characterize fiber-optic links by identifying and localizing various refractive and reflective events such as breaks, splices, and connectors, and by measuring insertion/return loss and fiber length. Essentially, the OTDR launches a pulsed signal into the fiber, a small portion of which, commonly referred to as Rayleigh backscatter, is continuously reflected back; by conveniently scaling the time axis, the delays of the reflections are expressed as power loss versus distance. Specifically, for long-distance event visibility and measurement accuracy, the crucial OTDR attribute is the dynamic range, which determines how far down the fiber the strongest transmitted optical pulse can reach. As many older-generation but still operable OTDR units have insufficient dynamic range to test the far end of longer fibers, we propose a simple and cost-effective solution to reactivate such an OTDR by inserting a low-noise, high-gain optical preamplifier in front of it to lower the noise figure and thereby the noise floor. Accordingly, we developed an appropriate dynamic range and distance-span extension model, which provided exemplar prediction values of 30 dB and 75 km, respectively, for the fiber under test at 1550 nm. These values were found to closely match the dynamic range and distance-span extensions obtained, for the same values of the relevant parameters, by preliminary practical OTDR measurements conducted with the front-end EDFA optical amplifier, relative to measurements with the OTDR alone. This preliminarily verifies that the proposed concept enables a significantly longer distance span than the OTDR alone. We believe that the preliminary results reported here could serve as a hint and a framework for a more comprehensive test strategy, in terms of both test diversification and repetition rate, which can be implemented in a network operator environment or a professional lab.
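The reported 30 dB / 75 km pairing follows from simple round-trip attenuation arithmetic: since the backscattered light traverses the fiber twice, each extra decibel of dynamic range buys ΔDR/(2α) kilometers of one-way reach. A sketch assuming the typical 0.2 dB/km attenuation at 1550 nm:

```python
def span_extension_km(extra_dr_db: float, atten_db_per_km: float = 0.2) -> float:
    """One-way distance gained per extra dynamic range over a two-way path:
    delta_L = delta_DR / (2 * alpha)."""
    return extra_dr_db / (2 * atten_db_per_km)

# 30 dB of extra dynamic range at 1550 nm (alpha ~ 0.2 dB/km)
print(f"{span_extension_km(30.0):.0f} km")  # -> 75 km, matching the reported value
```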

Adriana Lipovac, V. Lipovac, B. Modlic

This work is motivated by growing evidence that the standard Cyclic Prefix (CP) length adopted in the Long Term Evolution (LTE) physical layer (PHY) specifications is oversized in propagation environments ranging from indoor to typical urban. Although this ostensibly seems to be addressed by the 5G New Radio (NR) numerology, its scalable CP length reduction is proportionally tracked by the OFDM symbol length, which preserves the relative CP overhead of LTE. Furthermore, some simple means to optimize the fixed CP length, or to introduce an adaptive one, arose from either simulations or models that take into account only the bit-oriented PHY transmission performance. In contrast, in the novel cross-layer analytical model proposed here, the closed-form expression for the optimal CP length is derived so as to minimize the effective average codeblock length, by also considering the error-recovery retransmissions through the layers above PHY, the Medium Access Control (MAC) and the Radio Link Control (RLC) in particular. It turns out that, for a given protective coding, the optimal CP length is determined by the rms delay spread of the part of the channel power delay profile remaining outside the CP span. The optimal CP length values are found to be significantly lower than the corresponding industry-standard ones, which unveils the potential for improving the net throughput.
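The paper's closed-form criterion is not reproduced here, but the underlying trade-off can be explored numerically: lengthening the CP wastes overhead, while shortening it leaves part of the power-delay profile outside the CP span, whose residual delay spread drives retransmissions. The exponential profile, retransmission weight, and objective below are illustrative stand-ins, not the paper's model.

```python
import numpy as np

# Illustrative exponential power-delay profile (taps every 0.1 us, assumed)
tap_delays = np.arange(0.0, 5.0, 0.1)          # us
tap_powers = np.exp(-tap_delays / 1.0)         # ~1 us rms delay spread

def residual_rms(cp_us: float) -> float:
    """Delay spread of the PDP energy left outside the CP span,
    normalized by the total PDP power (a modeling assumption)."""
    mask = tap_delays > cp_us
    excess = (tap_powers[mask] * (tap_delays[mask] - cp_us) ** 2).sum()
    return float(np.sqrt(excess / tap_powers.sum()))

SYMBOL_US = 66.7                               # LTE OFDM symbol duration

def effective_overhead(cp_us: float, retx_weight: float = 200.0) -> float:
    """Toy objective: relative CP overhead plus a retransmission penalty
    growing with the residual delay spread (retx_weight is invented)."""
    return cp_us / SYMBOL_US + retx_weight * (residual_rms(cp_us) / SYMBOL_US) ** 2

cp_grid = np.arange(0.0, 5.0, 0.05)
best = cp_grid[np.argmin([effective_overhead(c) for c in cp_grid])]
print(f"Toy-optimal CP ~ {best:.2f} us vs. the LTE normal CP of ~4.7 us")
```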

F. Cardoso, V. Lipovac, L. M. Correia

This Special Issue originates from the international conference EuCNC 2020 (European Conference on Networks and Communications), which was planned to be held in June 2020 in Dubrovnik, Croatia, but, due to the COVID-19 pandemic, was changed to an online conference. The Technical Programme Chairs of the conference selected the best papers and invited the authors to submit versions extended by at least one third of the original length. Only the top-ranked papers were invited to this Special Issue, in order to fulfil its purpose. The main target was to collect and present quality research contributions on the most recent activities related to systems and networks beyond 5G, already presenting ideas for 6G. Through this Special Issue, the state of the art is presented and new challenges are highlighted regarding the latest advances in systems and network perspectives that are already being positioned beyond 5G, bridging as well with the evolution of 5G, including applications and trials. Therefore, the motivation for this Special Issue is to present the latest and finest results on the evolution of mobile and wireless communications research, coming, but not exclusively (since EuCNC is a conference open to the whole research community), from projects co-financed by the European Commission within its R&D programmes.

Microwave line-of-sight radio relay (RR) systems are a constitutive part of a telecom operator's transport network, serving as an alternative to optical transmission systems where the latter are not technically feasible or rational to implement. Nowadays, RR links are quite often used in the access network for connecting mobile radio base stations, thus also enabling traffic aggregation. In this paper, we focus on a practical, real-life, five-section heterogeneous RR network, comprising classic synchronous digital hierarchy (SDH) and SDH new-generation network (NGN) architecture, hybrid parallel and mutually independent transmission of native Ethernet and TDM services, and all-IP network parts. Specifically, the main task of this work is to answer whether such a diverse RR system can satisfy the quality norms for Ethernet-based services, i.e., whether a tolerable RR unavailability necessarily implies a corresponding degradation of the Ethernet quality of service (QoS). This question is addressed by comprehensive in-service and out-of-service testing of an operational hybrid RR transmission system. After extensive practical testing and analysis of the obtained results, it turned out that the RR-level impairments determining the performance prediction affected the Ethernet QoS to the extent that BER values increased up to the acceptability threshold values. We believe that the preliminary results reported here could serve as a hint and a framework for a more comprehensive cross-layer test strategy, in terms of both test diversity and repetition rate, which contemporary network operators need to implement in order to provide the appropriate quality of experience for the users of their services.
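The cross-layer link between radio-level BER and Ethernet QoS exercised by such testing can be approximated by the frame loss ratio of an n-bit frame, FLR = 1 - (1 - BER)^n ≈ n·BER for small BER, assuming independent bit errors; the frame size below is just an example.

```python
def frame_loss_ratio(ber: float, frame_bytes: int = 1518) -> float:
    """Probability that at least one of the frame's bits is in error,
    assuming independent bit errors (a simplifying assumption)."""
    n_bits = 8 * frame_bytes
    return 1 - (1 - ber) ** n_bits

for ber in (1e-9, 1e-6, 1e-4):
    print(f"BER {ber:.0e} -> FLR {frame_loss_ratio(ber):.2e}")
```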

Adriana Lipovac, V. Lipovac, Ivan Grbavac, Ines Obradovic

As the PHY/MAC-layer IR-HARQ and RLC-layer ARQ error-recovery procedures adopted in LTE may impose additional delay when their code-block retransmissions occur, the question arises whether these significantly contribute to IP, and consequently RTP, packet delays and ultimately degrade the overall application-layer end-to-end QoE, especially when voice is transmitted over LTE. In this regard, we propose and demonstrate a VoLTE QoS and QoE test procedure based on PHY/MAC/RLC/IP/TCP-UDP/RTP cross-layer protocol analysis and perceptual speech quality QoE measurements. We identified a monotonic relationship between the paired observations, QoE and HARQ RTT, i.e., between the PESQ voice quality rating and the IP/RTP packet latency, for a given BLER of the received MAC/RLC code blocks. Specifically, we found that, for an HARQ RTT of about 8 ms, at most two HARQ retransmissions (and consequently no RLC-ARQ retransmission) are acceptable during any voice packet; otherwise, the accumulated delay might not be smoothed out by the jitter/playback buffers along the propagation path.
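The delay-budget arithmetic behind the at-most-two-retransmissions finding can be sketched as follows; the 8 ms HARQ RTT is the value quoted above, while the jitter-buffer budget and RLC RTT are assumed figures.

```python
HARQ_RTT_MS = 8.0          # round-trip time per HARQ retransmission (from the text)
JITTER_BUDGET_MS = 20.0    # assumed slack absorbable by jitter/playback buffers

def added_delay_ms(n_harq_retx: int, rlc_arq_retx: int = 0,
                   rlc_rtt_ms: float = 40.0) -> float:
    """Extra one-way delay from retransmissions (RLC RTT is an assumed value)."""
    return n_harq_retx * HARQ_RTT_MS + rlc_arq_retx * rlc_rtt_ms

for n in range(4):
    d = added_delay_ms(n)
    verdict = "ok" if d <= JITTER_BUDGET_MS else "exceeds budget"
    print(f"{n} HARQ retx -> +{d:.0f} ms ({verdict})")  # 2 retx ok, 3 retx not
```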
