Mile Šikman, Velibor Lalić

This paper analyses court cases qualified as organised crime in Bosnia and Herzegovina (B&H). The final judgments were analysed according to the following criteria: the number of defendants; the continuity of membership within the criminal organisation; the existence of a criminal structure; the existence of a developed plan of activities; the type and number of the offences committed; influence on public authorities, the judiciary, and citizens; and the sentences imposed on the defendants. This paper seeks to identify the extent to which court judgments are based on these criteria. A secondary analysis was conducted of the data related to the organised crime cases heard in the Court of Bosnia and Herzegovina between 2015 and 2018. This analysis encompassed 21 organised crime cases in which 27 judgments were pronounced. In the observed period (2015-2018), we identified two organised criminal groups that meet the criteria analysed. This number is minimal in relation to the total number of organised crime cases processed. Our findings contradict the prevailing view in public discourse that organised crime is a widespread security threat in B&H. The findings also demonstrate the existence of legal gaps, reflected in the lack of clear criteria on the basis of which organised criminal groups (OCGs) can be distinguished from other forms of criminal activity. Legal and institutional weaknesses create opportunities for OCGs to operate and create a sense of insecurity among citizens in the already complex security environment in B&H.

Haris Muhović, Almedin Salkić, Emina Melic, Neira Džananović, M. Saric, D. Jokić, S. Lale

This paper presents the implementation of the Binary Search Algorithm (BSA) for determining the Maximum Power Point (MPP) of a photovoltaic (PV) system under variable weather conditions. The conventional, well-known Perturb and Observe (P&O) algorithm is also implemented for comparison with the binary-search-based Maximum Power Point Tracking (MPPT) algorithm. Both algorithms are implemented in real time in the MATLAB/Simulink environment. The experimental study uses two 260 W series-connected PV modules, a buck converter, and a Humusoft MF 634 card for real-time operation. The duty cycle of the buck converter is updated at each step, moving the operating point closer to the MPP. The experimental results demonstrate that the binary-search-based MPPT algorithm is more efficient and accurate than the P&O MPPT algorithm.
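
To make the tracking idea concrete, the following is a minimal sketch of a binary-search MPPT loop, not the authors' Simulink implementation; the toy power curve and the measure_power interface are assumptions standing in for the real converter measurements.

```python
# Minimal sketch of binary-search MPPT (illustrative; not the paper's
# MATLAB/Simulink implementation). measure_power() is a stand-in for a
# real-time V*I measurement behind the buck converter.

def measure_power(d):
    # Toy unimodal power-vs-duty-cycle curve peaking at d = 0.5 (~520 W,
    # matching two 260 W modules); a real loop would read sensors here.
    return 2080.0 * d * (1.0 - d) if 0.0 < d < 1.0 else 0.0

def binary_search_mppt(d_lo=0.05, d_hi=0.95, tol=1e-3, eps=1e-4):
    """Shrink [d_lo, d_hi] around the MPP using the sign of dP/dd."""
    while d_hi - d_lo > tol:
        d_mid = 0.5 * (d_lo + d_hi)
        # Estimate the local slope of P(d) with a small perturbation.
        slope = measure_power(d_mid + eps) - measure_power(d_mid - eps)
        if slope > 0:
            d_lo = d_mid  # MPP lies to the right of the midpoint
        else:
            d_hi = d_mid  # MPP lies to the left of the midpoint
    return 0.5 * (d_lo + d_hi)

d_star = binary_search_mppt()
print(f"duty cycle ~ {d_star:.4f}, power ~ {measure_power(d_star):.1f} W")
```

Each iteration halves the search interval, which is why this approach can converge in far fewer steps than a fixed-step P&O perturbation sequence.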

Manuel M. Ferreira, F. Cardoso, S. Ambroziak, Mariella Särestöniemi, Kenan Turbic, L. Correia

This paper analyses depolarisation in Body Area Networks for Body-to-Infrastructure communications, based on a measurement campaign in the 5.8 GHz band in an indoor environment. Measurements were made with an off-body antenna transmitting linearly polarised signals and dual-polarised receiving antennas carried on the user's body. A Normal Distribution with a mean of 2.0 dB and a standard deviation of 4.3 dB is found to be the best fit for modelling cross-polarisation discrimination. The average correlation between the signals received by the orthogonally polarised antennas is below 0.5, showing that polarisation diversity can be exploited. A model is proposed for the average value and the standard deviation of the cross-polarisation discrimination ratio as a function of the transmitted polarisation, user mobility, and link dynamics.
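
As an illustration of the reported statistics, not the authors' processing chain, the sketch below computes XPD as the dB difference between co- and cross-polarised received powers and fits a Normal distribution; the synthetic samples are assumptions generated to mirror the reported fit.

```python
# Illustrative sketch: XPD as the dB difference between co- and
# cross-polarised received powers, fitted with a Normal distribution.
# The synthetic samples below are assumptions, generated to mirror the
# reported fit (mean 2.0 dB, standard deviation 4.3 dB).
import numpy as np

rng = np.random.default_rng(0)
p_co = rng.normal(-55.0, 5.0, 10_000)        # co-polarised port power [dBm]
p_cx = p_co - rng.normal(2.0, 4.3, 10_000)   # cross-polarised port power [dBm]

xpd_db = p_co - p_cx                         # XPD in dB is a simple difference
mu, sigma = xpd_db.mean(), xpd_db.std(ddof=1)
print(f"Normal fit: mean = {mu:.1f} dB, std = {sigma:.1f} dB")
```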

Cause-effect graphs are a commonly used black-box testing method, and many different algorithms for converting system requirements into cause-effect graph specifications and deriving test case suites have been proposed. However, in order to test the efficiency of black-box testing algorithms on a variety of cause-effect graphs containing different numbers of nodes, logical relations, and dependency constraints, a dataset containing a collection of cause-effect graph specifications created by the authors of existing papers is necessary. This paper presents CEGSet, the first collection of existing cause-effect graph specifications. The dataset contains a total of 65 graphs collected from the available relevant literature. The specifications were created using the ETF-RI-CEG graphical software tool and can be used by future authors of papers focusing on the cause-effect graphing technique. The collected graphs can be re-imported into the tool and used for the desired purposes. Where possible, the collection also includes the specification of the system requirements, in the form of natural language, from which the cause-effect graphs were derived. This will encourage future work on automating the process of converting system requirements into cause-effect graph specifications.
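
For readers unfamiliar with the format, a cause-effect graph specification of the kind collected in CEGSet can be pictured as causes wired to effects through Boolean relations, plus dependency constraints; the encoding below is a hypothetical sketch, not the ETF-RI-CEG tool's actual file format.

```python
# Hypothetical encoding of a cause-effect graph specification
# (illustrative only; not the ETF-RI-CEG tool's actual format).
from dataclasses import dataclass, field

@dataclass
class CauseEffectGraph:
    causes: set[str]
    effects: dict[str, tuple[str, list[str]]]  # effect -> (Boolean op, causes)
    constraints: list[str] = field(default_factory=list)  # e.g. "E(C1, C2)"

    def evaluate(self, assignment: dict[str, bool]) -> dict[str, bool]:
        ops = {"AND": all, "OR": any}
        return {e: ops[op](assignment[c] for c in cs)
                for e, (op, cs) in self.effects.items()}

# Toy specification: E1 fires when both causes hold; the E-constraint
# (exclusivity) is recorded but not enforced in this sketch.
g = CauseEffectGraph(causes={"C1", "C2"},
                     effects={"E1": ("AND", ["C1", "C2"])},
                     constraints=["E(C1, C2)"])
print(g.evaluate({"C1": True, "C2": False}))   # {'E1': False}
```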

Selma Opačin, Lejla Rizvanović, B. Leander, S. Mubeen, Aida Čaušević

Technical advances as well as continuously evolving business demands are reshaping the need for flexible connectivity in industrial control systems. One way to meet such needs is a service-oriented approach, where a connectivity service middleware provides controller- as well as protocol-specific interfaces. The Message Queuing Telemetry Transport (MQTT) protocol is widely used for device-to-device communication in the Internet of Things (IoT); however, it is not commonly integrated into industrial control systems. To address this gap, this paper describes the development and implementation of a prototype of a connectivity service middleware for MQTT within an industrial private control network. The prototype is implemented in the context of an industrial controller and used in a simulated modular automation system. Furthermore, various deployment scenarios are evaluated with respect to the response time and scalability of the connectivity service.
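
As a flavour of the traffic such a middleware mediates, the snippet below shows a minimal device-side MQTT exchange using the paho-mqtt client (1.x callback API); the broker address and topic name are assumptions, not values from the paper.

```python
# Minimal device-side MQTT exchange (paho-mqtt 1.x API). The broker
# address and topic below are hypothetical placeholders; a connectivity
# service middleware would sit between this traffic and the controller.
import paho.mqtt.client as mqtt

BROKER = "10.0.0.5"                  # assumed broker in the private control network
TOPIC = "plant/module1/temperature"  # assumed topic naming scheme

def on_message(client, userdata, msg):
    # A controller-specific interface would translate this payload here.
    print(f"{msg.topic}: {msg.payload.decode()}")

client = mqtt.Client()
client.on_message = on_message
client.connect(BROKER, port=1883, keepalive=60)
client.subscribe(TOPIC, qos=1)
client.publish(TOPIC, payload="21.7", qos=1)
client.loop_forever()
```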

Nina Slamnik-Kriještorac, W. Vandenberghe, Najmeh Masoudi-Dione, Stijn Van Staeyen, Xiangyu Lian, Rakshith Kusumakar, J. Márquez-Barja

As the shipping sector has been one of the major drivers of economic growth over the past decades, its digitalization is expected to bring unprecedented improvements in the safety and reliability of ship control, ultimately enabling the autonomous operation of ships. The automated control of ships will not only mitigate the risks of human error but will also improve the efficiency of operations by preventing unexpected delays, while being environmentally sustainable. With the advent of the Internet of Ships (IoS) sector, well-known and mature concepts of the Internet of Things (IoT) are being applied to ships and ports, equipping them with sensing and communication capabilities that set the ground for improved situational awareness and better decision-making. However, many challenges still need to be thoroughly studied, such as the communication between barges, ports, and services, since increased network latency and the bandwidth limitations imposed by satellite communications could introduce significant risks of accidents, ultimately affecting the overall automated operation/teleoperation of barges. In this paper, we present one of the first attempts to test the potential of 5G systems for automating barge operations, starting from teleoperation as an enabler of automation, thereby creating and validating a cellular-based automated barge control system in a real-life environment. In this system, the barge sails in a busy port area, such as the Port of Antwerp-Bruges, while connected to the 5G network. We assess the quality of the 5G communication system, and present and discuss our initial results on the enhancements that 5G could bring to the teleoperation and automation of barge control.
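
To give a sense of how such link quality might be probed, the following generic sketch, which is not the project's measurement tooling, measures UDP round-trip times against an echo server whose address is hypothetical.

```python
# Generic UDP round-trip-time probe (illustrative; not the project's
# measurement tooling). The echo server address is hypothetical.
import socket
import statistics
import time

ECHO = ("192.0.2.10", 9000)  # assumed echo server reachable over the 5G link
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.settimeout(1.0)

rtts = []
for seq in range(100):
    t0 = time.perf_counter()
    sock.sendto(seq.to_bytes(4, "big"), ECHO)
    try:
        sock.recvfrom(64)
        rtts.append((time.perf_counter() - t0) * 1000.0)
    except socket.timeout:
        pass  # lost probe: counts against reliability rather than RTT

if rtts:
    print(f"median RTT {statistics.median(rtts):.1f} ms over {len(rtts)}/100 probes")
```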

Vincent Charpentier, Nina Slamnik-Kriještorac, J. Brenes, A. Gavrielides, Marius Iordache, Georgios Tsiouris, Xiangyu Lian, J. Márquez-Barja

The proliferation of 5G technology is enabling vertical industries to improve their day-to-day operations by leveraging enhanced Quality of Service (QoS). One of the key enablers of such 5G performance is network slicing, which allows telco operators to logically split the network into various virtualized networks whose configuration, and thus performance, can be tailored to verticals and their low-latency and high-throughput requirements. However, given the end-to-end perspective of 5G ecosystems, where slicing needs to be applied to all network segments, including radio, edge, transport, and core, managing the deployment of slices is becoming excessively demanding, and various verticals have strict requirements that need to be fulfilled. Thus, in this paper, we focus on a solution for dynamic and quality-aware network slice management and orchestration that simultaneously orchestrates network slices deployed on top of three 5G testbeds built for transport and logistics use cases. The slice orchestration system interacts dynamically with the testbeds while monitoring the real-time performance of allocated slices, triggering decisions to either allocate new slices or reconfigure existing ones. We illustrate scenarios in which dynamic provisioning of slices is required in one of the testbeds, taking into account specific latency, throughput, and location requirements coming from the verticals and their end users.
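
A quality-aware slice manager of the kind described can be pictured as a monitor-decide loop; the sketch below, with assumed SLA numbers and function names, illustrates the trigger logic only, not the paper's orchestrator.

```python
# Hypothetical monitor-decide loop for quality-aware slice management
# (illustrative; not the paper's orchestrator). SLA values are assumed.
from dataclasses import dataclass

@dataclass
class SliceSLA:
    max_latency_ms: float
    min_throughput_mbps: float

def decide(slice_id: str, sla: SliceSLA,
           latency_ms: float, throughput_mbps: float) -> str:
    # A real orchestrator would call its northbound API here, e.g. to move
    # functions closer to the edge or allocate more radio resources.
    if latency_ms > sla.max_latency_ms or throughput_mbps < sla.min_throughput_mbps:
        return f"reconfigure slice {slice_id}"
    return f"keep slice {slice_id}"

print(decide("teleop", SliceSLA(max_latency_ms=30.0, min_throughput_mbps=50.0),
             latency_ms=42.0, throughput_mbps=80.0))  # -> reconfigure slice teleop
```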

Marius Iordache, Razvan Mihai, Cristian Patachia, J. Brenes, Athina Ropodi, A. Margaris, G. Suciu, Alexandru Vulpe et al.

5G Stand Alone (SA) networks are starting to be considered, designed, and implemented in multiple countries in various forms (public, private, experimental), with mass adoption expected to materialize by 2025. Large-scale deployment is anticipated due to the rich features and capabilities offered by 5G networks, including but not limited to slicing, service orchestration, and automation, bringing the benefits of 5G to industry stakeholders and verticals. The concept of Network Applications is gaining momentum as a way to ease the process of deploying industry-specific services and applications and to integrate them seamlessly with the new 5G networks and customer-specific application components. We target deploying and operating novel 5G SA testbeds, Network Applications, and related capabilities in different transport and logistics (T&L) facilities across Europe. We envision architectural advancement in terms of 5G features such as orchestration, multi-slice implementation, Quality of Service (QoS)/Quality of Experience (QoE), and an innovative end-to-end monitoring framework for network and application KPIs. In this paper, we present the advancements of the 5G open testbeds (3GPP Rel. 16 compliant) and their readiness for Network Application experiments in real-life scenarios, integrated as a unitary whole within the EU-funded VITAL-5G project.

M. Dobric, Matija Furtula, M. Tesic, Stefan Timčić, Dušan Borzanović, N. Lazarević, Mirko Lipovac, Mihajlo Farkić et al.

Assessment of the functional significance of coronary artery stenosis using invasive measurement of fractional flow reserve (FFR) or non-hyperemic indices has been shown to be safe and effective in making clinical decisions on whether to perform percutaneous coronary intervention (PCI). Despite strong evidence from clinical trials, utilization of these techniques is still relatively low worldwide. This may to some extent be attributed to factors inherent to invasive measurements, such as prolongation of the procedure, side effects of drugs that induce hyperemia, additional steps that the operator must perform, the possibility of damaging the vessel with the wire, and additional costs. During the last few years, there has been growing interest in the non-invasive assessment of coronary artery lesions, which may provide the interventionalist with important physiological information regarding lesion severity and overcome some of these limitations. Several dedicated software solutions on the market can provide an estimation of FFR using 3D reconstruction of the interrogated vessel derived from two separate angiographic projections taken during diagnostic coronary angiography. Furthermore, some of them use data about aortic pressure and frame count to calculate the pressure drop (and FFR) more accurately. The ideal non-invasive system should be integrated into the workflow of the cath lab and run online (during the diagnostic procedure), thereby not prolonging procedural time significantly, while giving the operator additional information such as vessel size, lesion length, and the possible post-PCI FFR value. Following the development of these technologies, they were all evaluated in clinical trials, where good correlation and agreement with invasive FFR (considered the gold standard) were demonstrated. Currently, only one trial with clinical outcomes (FAVOR III China) has been completed; it demonstrated that QFR-guided PCI may provide better results at 1-year follow-up compared to the angiography-guided approach. We await the results of a few other trials with clinical outcomes that test the performance of these indices in guiding PCI against either FFR- or angiography-based approaches in various clinical settings. Herein, we present an overview of the currently available data, a critical review of the major clinical trials, and further directions of development for the five most widely available non-invasive indices: QFR, vFFR, FFRangio, caFFR, and AccuFFRangio.

The paper evaluates the statistical significance of the differences in feature values necessary to differentiate signals corresponding to cardiac arrhythmia (AR) and atrial fibrillation (AF). The initial set of heart rate variability (HRV) features includes time- and frequency-domain metrics, as well as geometric metrics based on the Poincaré diagram. Due to the non-uniformity of the heart rate signal, frequency-domain features are calculated using two approaches: the Lomb-Scargle method for spectral analysis of non-uniform signals, and the Welch method for uniform signals after interpolation and resampling. The choice of statistical test depended on the distribution of feature values: for normally distributed features the parametric ANOVA test was used; otherwise, the non-parametric Wilcoxon–Mann–Whitney test was applied. The statistical tests indicated statistically significant differences between the two observed groups of signals with respect to the evaluated features. The success of classification depends on well-chosen features ranked by their importance; here, the statistical tests resulted in the selection of 27 features out of the initial 51. The proposed set of features could be used for classification between AR and AF signals to assist the diagnosis of these heart diseases.
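
The described test-selection logic can be sketched directly with SciPy; the significance level alpha = 0.05 and the synthetic samples below are assumptions, whereas the paper applies this logic across 51 HRV features.

```python
# Sketch of the test-selection logic described above (assumed alpha = 0.05;
# synthetic samples stand in for real HRV feature values).
import numpy as np
from scipy import stats

def feature_is_significant(ar_values, af_values, alpha=0.05):
    # Use parametric ANOVA only if both groups pass a normality test;
    # otherwise fall back to the Wilcoxon-Mann-Whitney test.
    normal = (stats.shapiro(ar_values).pvalue > alpha
              and stats.shapiro(af_values).pvalue > alpha)
    if normal:
        p = stats.f_oneway(ar_values, af_values).pvalue
    else:
        p = stats.mannwhitneyu(ar_values, af_values).pvalue
    return p < alpha

rng = np.random.default_rng(1)
ar = rng.normal(0.85, 0.10, 200)  # e.g. mean RR interval [s], arrhythmia group
af = rng.normal(0.70, 0.15, 200)  # e.g. mean RR interval [s], AF group
print("feature discriminates AR from AF:", feature_is_significant(ar, af))
```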

E. Silva, D. Viegas, A. Martins, J. Almeida, C. Almeida, B. Neves, P. Madureira, A. J. Wheeler et al.

TRIDENT seeks to contribute to the sustainable exploitation of seabed mineral resources by creating a dependable, transparent, and cost-effective system for forecasting and ongoing environmental impact monitoring of exploration and exploitation activities in the deep sea. This system will create and integrate new technology and innovative solutions in order to operate autonomously in remote locations under harsh conditions and to send real-time data to the authorities in charge of granting licenses and providing oversight. The resulting monitoring and inspection system will comply with national and international legal frameworks. TRIDENT will identify all pertinent physical, chemical, geological, and biological characteristics that must be monitored at the sea surface, in mid-water, and at the seabed; it will also look for data gaps and suggest procedures for addressing them. These steps are crucial for producing accurate indicators of good environmental status, statistically robust environmental baselines, and thresholds for significant impact, allowing for the standardization of methods and tools. The project consortium will then develop and test an integrated system of stationary and mobile observatory platforms outfitted with the most recent automatic sensors and samplers in order to monitor environmental parameters in mining and reference areas at representative spatial and temporal scales. The system will incorporate high-capacity data processing pipelines able to gather, transmit, process, and display monitoring data in close to real time, facilitating prompt action to prevent major harm to the environment. Finally, it will offer systemic and technological solutions for predicting the probable impacts of applying the developed monitoring and mitigation techniques.

I. Kennedy, M. Hodzic

Advances in applied mechanics have facilitated a better understanding of the recycling of heat and work in the troposphere. This goal is important to meet practical needs for better management of climate science. Achieving this objective may require the application of quantum principles in action mechanics, recently employed to analyze the reversible thermodynamics of Carnot's heat engine cycle. The testable proposals suggested here seek to solve several problems, including (i) the phenomena of decreasing temperature and molecular entropy but increasing Gibbs energy with altitude in the troposphere; (ii) a reversible system storing thermal energy to drive vortical wind flow in anticyclones while frictionally warming the Earth's surface by heat release from turbulence; (iii) vortical generation of electrical power from translational momentum in airflow in wind farms; and (iv) vortical energy in the destructive power of tropical cyclones. The scalar property of molecular action (@t ≡ ∫mv ds, J·s) is used to show how equilibrium temperatures are achieved from statistical equality of mechanical torques (mv² or mr²ω²); these are exerted by Gibbs field quanta for each kind of gas-phase molecule as rates of translational action (d@t/dt ≡ ∫mr²ω dϕ/dt ≡ mv²). These torques result from the impulsive density of resonant quantum or Gibbs fields with molecules, configuring the trajectories of gas molecules while balancing molecular pressure against the density of field energy (J/m³). Gibbs energy fields contain no resonant quanta at zero Kelvin, with this chemical potential diminishing in magnitude as the translational action of vapor molecules and the quantum field energy content increase with temperature. These cases distinguish symmetrically between causal fields of impulsive quanta (Σhν) that energize the action of matter and the resultant kinetic torques of molecular mechanics (mv²). The quanta of these different fields display mean wavelengths from 10⁻⁴ m to 10¹² m, with radial mechanical advantages many orders of magnitude greater than the corresponding translational actions, though with mean quantum frequencies (ν) similar to those of radial Brownian movement for independent particles (ω). Widespread neglect of the Gibbs field energy component of natural systems may be preventing advances in tropospheric mechanics. A better understanding of these vortical Gibbs energy fields as thermodynamically reversible reservoirs for heat can help optimize work processes on Earth, delaying the achievement of maximum entropy production from short-wave solar radiation being converted to outgoing long-wave radiation to space. This understanding may improve strategies for the management of global changes in climate.
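
The identity between the rate of translational action and the quoted torque expressions follows from a one-line calculation, sketched here under the assumption of circular motion, where ds = r dϕ and v = rω:

```latex
% Sketch: the rate of translational action is a kinetic torque.
% Assumes circular motion, so ds = r\,d\varphi and v = r\omega.
@_t \equiv \int m v \, ds = \int m v r \, d\varphi
\quad \Rightarrow \quad
\frac{d@_t}{dt} = m v r \, \frac{d\varphi}{dt} = m r^2 \omega^2 = m v^2
```

Since mv² carries units of joules, this rate matches a torque (joules per radian), consistent with the statistical equality of torques invoked above.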

This paper studies the dynamics of a class of host-parasitoid models with host refuge and a strong Allee effect on the host population. Without the parasitoid population, the Beverton–Holt equation governs the host population. A general probability function describes the portion of the hosts that are safe from parasitism. The existence and local behavior of solutions around the equilibrium points are discussed. We conclude that the extinction equilibrium always has a basin of attraction, which implies that the addition of host refuge will not save populations from extinction. By taking the host intrinsic growth rate as the bifurcation parameter, the existence of the Neimark–Sacker bifurcation can be shown. Finally, we present numerical simulations to support our theoretical findings.
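
The claim about the extinction basin can be illustrated numerically; the sketch below is one hypothetical member of the studied class, where the Allee factor, the constant refuge fraction, and all parameter values are assumptions, since the paper treats a general refuge probability function.

```python
# One hypothetical member of the studied model class (all functional forms
# and parameters below are assumptions for illustration). Host growth is
# Beverton-Holt scaled by a strong Allee factor; a fixed fraction rho of
# hosts is in refuge and safe from parasitism.
import math

def step(H, P, r=3.0, K=10.0, A=1.0, a=0.5, rho=0.5, c=1.0):
    growth = r * H / (1.0 + (r - 1.0) * H / K)     # Beverton-Holt map
    growth *= H / (A + H)                          # assumed strong Allee factor
    escape = rho + (1.0 - rho) * math.exp(-a * P)  # refuge + escape probability
    H_next = growth * escape
    P_next = c * growth * (1.0 - rho) * (1.0 - math.exp(-a * P))
    return H_next, P_next

# Initial host densities below vs. above the Allee threshold: the first orbit
# falls into the basin of the extinction equilibrium, the second persists.
for H0 in (0.3, 5.0):
    H, P = H0, 0.2
    for _ in range(100):
        H, P = step(H, P)
    print(f"H0 = {H0}: after 100 generations H = {H:.4f}, P = {P:.4f}")
```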

Jelena Lazić, Aleksandra Krstić, S. Vujnović

Social networks have become an integral part of modern society, allowing users to express their thoughts, opinions, and feelings and to engage in discussions on various topics. The vast amount of user-generated content on these platforms provides a valuable source of data for sentiment analysis (SA), the computational analysis of opinions and sentiments expressed in text. However, most existing deep learning models for SA rely on minimizing the cross-entropy loss, which does not incorporate any knowledge of the sentiment of the labels themselves. To address this limitation, we propose a novel approach that uses an optimal transport-based loss function to improve sentiment analysis performance. Optimal transport (OT) metrics are a principled tool for histogram comparison, and the proposed loss function uses the cost of the OT plan between the ground truth and the outputs of the classifier. The experimental results demonstrate that this approach can significantly reduce misdetections between the positive and negative classes, and suggest that an OT-based loss function can effectively overcome this deficiency of existing SA models and improve their performance in real-world applications.
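
To illustrate the intuition with a simplified sketch, not the paper's exact loss: when the ground truth is one-hot, the cost of the OT plan collapses to an expected ground cost, and the ordinal cost matrix over negative/neutral/positive below is an assumption.

```python
# Simplified OT-flavoured loss (illustrative). With a one-hot ground truth,
# the cost of the OT plan reduces to the expected ground cost of the
# predicted distribution; the ordinal cost matrix below is an assumption.
import numpy as np

# Ground cost between classes [negative, neutral, positive]:
C = np.array([[0.0, 1.0, 2.0],
              [1.0, 0.0, 1.0],
              [2.0, 1.0, 0.0]])

def ot_loss(probs: np.ndarray, true_class: int) -> float:
    # Expected cost of transporting the predicted mass onto the true class.
    return float(probs @ C[:, true_class])

p = np.array([0.1, 0.2, 0.7])    # classifier output (mostly "positive")
print(ot_loss(p, true_class=0))  # 1.6 -> heavy penalty: mass far from "negative"
print(ot_loss(p, true_class=2))  # 0.4 -> mild penalty: mass near "positive"
```

Cross-entropy would penalize both errors only through the probability assigned to the true class, whereas the OT cost also accounts for how far the misplaced mass sits, which is why confusions between the extreme sentiment classes shrink.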

The concept of brand personality plays a crucial role in brand literature as consumers tend to anthropomorphize brands by attributing human characteristics to them. The creation of a brand personality that resonates with consumers leads to greater customer satisfaction and loyalty over the long term. This study investigates the mediating potential of brand personality dimensions, specifically Competence and Sophistication, in the relationship between brand communication (both controlled and uncontrolled) as an antecedent and brand loyalty as an outcome. Using a sample of 340 users of a cosmetic brand, we employed structural equation modeling to analyze the data. Our results indicate that controlled communication significantly influences both the Competence and Sophistication dimensions of brand personality, and that there are significant indirect effects of both controlled and uncontrolled communication through reference groups on loyalty, mediated by personality dimensions. These findings provide valuable insights for brand managers and marketers seeking to enhance brand loyalty by developing effective communication strategies that align with the desired brand personality dimensions.
