Alem Čolaković

Recently, the need for video testing at the point of reception has become a challenge for video distributors. This paper presents a new system framework for video quality management based on degradation detection. The system relies on objective video quality assessment metrics and unsupervised machine learning techniques that apply dimensionality reduction to time series. It was demonstrated that anomalies can be detected during video streaming in soft real time. In addition, the model discovers degradations based on the visible correlation between adjacent images in the video sequence, regardless of whether the scene changes quickly or slowly. With additional hardware adjustments to the equipment on the user side, the proposed solution can be used in practical implementations wherever there is a need to monitor possible degradations during video streaming.
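The abstract does not spell out the exact detection method, but the general idea of unsupervised anomaly detection via dimensionality reduction of a quality-score time series can be sketched as follows. This is a minimal, purely illustrative PCA-based example: the function name, the window size, the threshold, and the synthetic quality scores are all assumptions, not the paper's implementation.

```python
import numpy as np

def detect_anomalies(series, window=8, k=2, thresh=2.0):
    """Flag anomalous windows in a quality-score time series using
    PCA reconstruction error (a common dimensionality-reduction approach)."""
    # Slide a window over the series to build a matrix of overlapping segments.
    X = np.array([series[i:i + window] for i in range(len(series) - window + 1)])
    Xc = X - X.mean(axis=0)
    # Principal directions via SVD; keep the top-k components.
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    P = Vt[:k]
    # Reconstruction error per window measures deviation from "normal" segments.
    err = np.linalg.norm(Xc - Xc @ P.T @ P, axis=1)
    # Flag windows whose error exceeds mean + thresh * std.
    return np.where(err > err.mean() + thresh * err.std())[0]

# Synthetic "normal" quality scores with one injected degradation.
scores = np.sin(np.linspace(0, 6, 200)) * 0.1 + 0.9
scores[120:125] -= 0.5  # sudden quality drop
print(detect_anomalies(scores.tolist()))
```

Only the windows overlapping the injected drop produce a large reconstruction error, so the flagged indices cluster around position 120; smooth segments reconstruct almost perfectly from the leading components.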

Software process improvement implies a set of complex and systematic software engineering activities. It requires theory and models established in the management, technical, and social sciences. The improvement is based on the assumption that an organization with mature and capable processes would be able to deliver quality software on time and in line with predicted costs. Maturity models were initially intended for implementation in enterprise software organizations, government organizations, and the military industry. Their complexity and size make them difficult to use in small software organizations and companies. Nevertheless, such organizations remain interested in applying these models and in the effort to build an efficient and effective organization. In this paper, the basic and derived capability maturity models are described and cases of their implementation are analyzed, along with an assessment of the results of such projects in business practice. The problem of software process improvement in small organizations is described, and the risks and recommendations for its enhancement are extracted. These recommendations are provided in order to lay a foundation for the implementation of these models in the specific managerial and organizational environment characteristic of small organizations.

Introduction: Machine learning (ML) plays a significant role in the fight against the COVID-19 pandemic (the disease caused by the SARS-CoV-2 virus). ML techniques enable the rapid detection of patterns and trends in large datasets. Therefore, ML provides efficient methods to generate knowledge from structured and unstructured data. This potential is particularly significant when the pandemic affects all aspects of human life. It is necessary to collect a large amount of data to identify methods for preventing the spread of infection, enabling early detection, reducing consequences, and finding appropriate medicines. Modern information and communication technologies (ICT) such as the Internet of Things (IoT) allow the collection of large amounts of data from various sources. Thus, we can create predictive ML-based models for assessments, predictions, and decisions. Methods: This is a review article based on previous studies and scientifically proven knowledge. In this paper, bibliometric data from authoritative databases of research publications (Web of Science, Scopus, PubMed) are combined for bibliometric analyses in the context of ML applications for COVID-19. Aim: This paper reviews some ML-based applications used for mitigating COVID-19. We aimed to identify and review ML potentials and solutions for mitigating the COVID-19 pandemic, as well as to present some of the most commonly used ML techniques, algorithms, and datasets applied in the context of COVID-19. We also provide some insights into specific emerging ideas and open issues to facilitate future research. Conclusion: ML is an effective tool for diagnosis and early detection of symptoms, predicting the spread of a pandemic, developing medicines and vaccines, etc.
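As a rough illustration of the kind of predictive ML-based model the review discusses, the following sketch trains a logistic regression classifier by plain gradient descent. The data here is entirely synthetic and the two features are hypothetical placeholders; this is not a medical model and not drawn from any dataset in the review.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic, purely illustrative data: two hypothetical numeric features
# and a binary outcome label generated from a noisy linear rule.
n = 400
X = rng.normal(size=(n, 2))
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.5, size=n) > 0).astype(float)

# Logistic regression fitted with batch gradient descent.
w = np.zeros(2)
b = 0.0
lr = 0.1
for _ in range(500):
    p = 1.0 / (1.0 + np.exp(-(X @ w + b)))   # predicted probabilities
    w -= lr * (X.T @ (p - y)) / n            # gradient step for the weights
    b -= lr * (p - y).mean()                 # gradient step for the bias

p = 1.0 / (1.0 + np.exp(-(X @ w + b)))
accuracy = ((p > 0.5) == y.astype(bool)).mean()
print(f"training accuracy: {accuracy:.2f}")
```

On this separable-with-noise data the learned boundary closely tracks the generating rule, which is the basic mechanism behind the assessment and decision models described above.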

Routing in multidomain and multilayer networks is the subject of constant theoretical research, with special emphasis on routing optimization algorithms based on several criteria. Such research results in new proposals. The basic task of an algorithm is to perform its given task in a finite and reasonable period of time and with reasonable resource requirements. When new solutions are compared with previous ones, it is necessary to consider as much information as possible about the characteristics of and differences between these algorithms, which ultimately determines the degree of success of an algorithm. Routing algorithms depend on the goals to be achieved and most often solve a certain group of problems with certain simplifications of the overall problem, at the expense of performance aspects that are not crucial for a given routing optimization problem. Therefore, it is necessary to have acceptable methods for the efficiency-complexity evaluation of routing algorithms with certain universally applicable metrics. Several theoretical frameworks, including graph theory, optimization theory, and complexity theory, make it possible to compare the algorithms and the results achieved with their help.
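A concrete baseline for the kind of efficiency-complexity comparison described above is to run two classical shortest-path algorithms on the same topology and contrast their asymptotic costs while verifying that they agree. The sketch below uses a small hypothetical network (the node names and link costs are invented for illustration) to compare Dijkstra, O((V + E) log V) with a binary heap but restricted to non-negative weights, against Bellman-Ford, O(V · E) but tolerant of negative edge weights.

```python
import heapq

def dijkstra(adj, src):
    # O((V + E) log V) with a binary heap; requires non-negative link costs.
    dist = {v: float('inf') for v in adj}
    dist[src] = 0
    pq = [(0, src)]
    while pq:
        d, u = heapq.heappop(pq)
        if d > dist[u]:
            continue  # stale queue entry
        for v, w in adj[u]:
            if d + w < dist[v]:
                dist[v] = d + w
                heapq.heappush(pq, (dist[v], v))
    return dist

def bellman_ford(adj, src):
    # O(V * E); slower, but also correct with negative edge weights.
    dist = {v: float('inf') for v in adj}
    dist[src] = 0
    for _ in range(len(adj) - 1):
        for u in adj:
            for v, w in adj[u]:
                if dist[u] + w < dist[v]:
                    dist[v] = dist[u] + w
    return dist

# A small hypothetical topology: node -> [(neighbour, link cost), ...]
net = {
    'A': [('B', 1), ('C', 4)],
    'B': [('C', 2), ('D', 5)],
    'C': [('D', 1)],
    'D': [],
}
print(dijkstra(net, 'A'))
print(bellman_ford(net, 'A'))
```

Both algorithms must return identical distance tables on this graph; the comparison is then purely about resource requirements, which is exactly the efficiency-complexity trade-off the evaluation metrics have to capture.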

...
...
...
