
Publications (31)


Embedded real-time clock systems have a large number of applications in practice. Their main issue is the accuracy of the time they display, which is why time synchronization is essential for their usability and reliability. This paper proposes an embedded real-time analogue clock that uses an AdaFruit NeoPixel LED ring for visualizing the current time. Three different colors are used for showing the hour, minute and second values, while different brightness levels encode the time down to the millisecond. An Ethernet LAN module is used for performing time synchronization via a remote NTP server. Dynamic adjustment of the synchronization interval removes the effect of microcontroller clock error on the accuracy of the displayed time. Once deployed, the system performed its functions successfully, including informing the user when the clock is out of sync.
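The following minimal sketch illustrates one way the time-to-LED mapping described above could look. The 60-pixel layout, the colour assignments (red for hours, green for minutes, blue for seconds) and the brightness formulas are illustrative assumptions, not the firmware from the paper.

```python
# A minimal sketch of mapping the current time onto a 60-pixel NeoPixel ring.
# The pixel layout, colour channels and brightness formulas are illustrative
# assumptions, not the firmware described in the paper.

NUM_PIXELS = 60

def blend(a, b):
    """Combine two colours channel-wise, clamped to 255 (overlapping hands mix)."""
    return tuple(min(255, x + y) for x, y in zip(a, b))

def time_to_frame(hours, minutes, seconds, millis):
    """Return a list of (r, g, b) tuples, one per pixel, for a single refresh."""
    frame = [(0, 0, 0)] * NUM_PIXELS

    # Hour hand (red): 5 pixels per hour on a 12-hour dial, nudged by the minutes.
    hour_pixel = (hours % 12) * 5 + minutes // 12
    frame[hour_pixel] = blend(frame[hour_pixel], (255, 0, 0))

    # Minute hand (green): brightness grows with the seconds elapsed in the minute.
    frame[minutes] = blend(frame[minutes], (0, 64 + int(191 * seconds / 60), 0))

    # Second hand (blue): brightness grows with the milliseconds elapsed in the second.
    frame[seconds] = blend(frame[seconds], (0, 0, 64 + int(191 * millis / 1000)))

    return frame

# Example refresh at 14:37:12.450; the minute pixel carries a partially lit green.
print(time_to_frame(14, 37, 12, 450)[37])
```

A single refresh would then push such a frame to the ring through the Adafruit NeoPixel driver, with the blue pixel brightening as the milliseconds advance within the current second.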

Cause-effect graphs are a commonly used black-box testing method, and many different algorithms for converting system requirements into cause-effect graph specifications and deriving test case suites have been proposed. However, in order to test the efficiency of black-box testing algorithms on a variety of cause-effect graphs containing different numbers of nodes, logical relations and dependency constraints, a dataset containing a collection of cause-effect graph specifications created by authors of existing papers is necessary. This paper presents CEGSet, the first collection of existing cause-effect graph specifications. The dataset contains a total of 65 graphs collected from the available relevant literature. The specifications were created using the ETF-RI-CEG graphical software tool and can be used by future authors of papers focusing on the cause-effect graphing technique. The collected graphs can be re-imported into the tool and used for the desired purposes. Where possible, the collection also includes the system requirements, in the form of natural language, from which the cause-effect graphs were derived. This should encourage future work on automating the process of converting system requirements into cause-effect graph specifications.

Many different methods are used for generating black-box test case suites. Test case minimization reduces the feasible test case suite size in order to minimize the cost of testing while ensuring maximum fault detection. This paper presents an optimization of the existing test case minimization algorithm based on forward-propagation in the cause-effect graphing method. The algorithm performs test case prioritization based on test case strength, a newly introduced test case selection metric. The optimized version of the minimization algorithm was evaluated on thirteen different examples from the available literature. In cases where the existing algorithm did not generate the minimum test case subsets, significant improvements of the test effect coverage metric were achieved. The metric was left unchanged only in cases where the existing algorithm had already achieved maximum optimization.
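Since the abstract does not define the test case strength metric, the sketch below assumes, purely for illustration, that strength is the number of effects a test case activates, and shows how prioritized greedy minimization over such a metric could be structured.

```python
# Illustrative greedy minimization driven by a prioritization metric.
# "Strength" is ASSUMED here to be the number of effects a test case activates;
# the actual metric introduced in the paper may be defined differently.

def minimize(test_cases):
    """test_cases: list of (test_id, set_of_activated_effects).
    Returns a subset of test ids that still covers every activatable effect."""
    uncovered = set().union(*(effects for _, effects in test_cases))

    # Prioritize by descending strength (assumed: count of activated effects).
    ranked = sorted(test_cases, key=lambda tc: len(tc[1]), reverse=True)

    selected = []
    for test_id, effects in ranked:
        if effects & uncovered:        # keep a test only if it adds new coverage
            selected.append(test_id)
            uncovered -= effects
        if not uncovered:
            break
    return selected

# Example: T1 activates {E1, E2}, T2 activates {E2}, T3 activates {E3}.
print(minimize([("T1", {"E1", "E2"}), ("T2", {"E2"}), ("T3", {"E3"})]))
# -> ['T1', 'T3']
```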

Cause-effect graphs are often used as a method for deriving test case suites for black-box testing of different types of systems. This paper presents a survey focused entirely on the cause-effect graphing technique. Different available algorithms for converting cause-effect graph specifications into test case suites are compared, and the problems which may arise when using different approaches are explained. Different types of graphical notation for describing the nodes, logical relations and constraints used when creating cause-effect graph specifications are also discussed. An overview of available tools for creating cause-effect graph specifications and deriving test case suites is given. The systematic approach of this paper is meant to aid domain experts and end users in choosing the most appropriate algorithm and, optionally, available software tools for deriving test case suites in accordance with specific system priorities. The presentation of the proposed graphical notation types should help readers gain a better understanding of the notation used for specifying cause-effect graphs. In this way, the most common mistakes in the usage of graphical notation while creating cause-effect graph specifications can be avoided.

Denial of Service (DoS) attacks and their distributed variant (DDoS) are easy to launch but hard to stop, especially in the distributed case. Their significance lies in the fact that attackers use a large number of packets, usually created with programs and scripts that craft special packet types for different attacks such as SYN flood, ICMP smurf, etc. These packets have similar or identical attributes, such as packet length, inter-arrival time, destination port and TCP flags. Skilled engineers and researchers use these packet attributes as indicators to detect anomalous packets in network traffic. For fast detection of anomalous packets within legitimate traffic, we propose Interactive Data Extraction and Analysis combined with the Newcomb-Benford law, which can detect matching first occurrences of leading digits of packet sizes that indicate the use of automated scripts for attack purposes. The law can also be used to detect matching first two or three digits, second digits, or last one or two digits of values in a data set. We used our own data set and real devices.
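As a rough illustration of the leading-digit idea, the sketch below compares the observed first-digit distribution of packet sizes with the distribution expected under the Newcomb-Benford law. The toy traffic, the choice of packet size as the analysed field and the reporting format are assumptions for this example, not the paper's IDEA-based workflow.

```python
# Sketch of a first-digit (Newcomb-Benford) check over packet sizes.
import math
from collections import Counter

def benford_report(packet_sizes):
    """Return per-digit (observed, expected) shares of the leading digits 1-9."""
    digits = [int(str(abs(size))[0]) for size in packet_sizes if size > 0]
    counts = Counter(digits)
    total = len(digits)
    report = {}
    for d in range(1, 10):
        observed = counts.get(d, 0) / total
        expected = math.log10(1 + 1 / d)   # Benford's law for the first digit
        report[d] = (observed, expected)
    return report

# Crafted flood traffic tends to repeat one packet length, so a single leading
# digit dominates and the distribution departs sharply from the Benford curve.
sizes = [60] * 950 + [1500] * 50            # toy example of uniform attack packets
for digit, (obs, exp) in benford_report(sizes).items():
    print(f"{digit}: observed={obs:.3f} expected={exp:.3f}")
```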

Cause-effect graphing is a commonly used black-box technique with many applications in practice. It is important to be able to create accurate cause-effect graph specifications from system requirements before converting them into the test case tables used for black-box testing. In this paper, a new graphical software tool for creating cause-effect graph specifications is presented. The tool uses standardized graphical notation for describing different types of nodes, logical relations and constraints, resulting in a visual representation of the desired cause-effect graph which can be exported for later use and re-imported into the tool. The purpose of this work is to make the cause-effect graph specification process easier for users, in order to address some of the problems which arise from an insufficient understanding of cause-effect graph elements. The proposed tool was successfully used for creating cause-effect graph specifications for small, medium and large graphs. It was also successfully used for performing different types of tasks by users without any prior knowledge of its functionalities, indicating that the tool is easy to use, helpful and intuitive. The results indicate that the standardized notation is easier to understand than the non-standardized approaches used by other tools.

Cause-effect graphs are a popular black-box testing technique. The most commonly used approach for generating test cases from cause-effect graph specifications relies on backward-propagation of forced effect activations through the graph in order to obtain the cause values for the desired test case. Many drawbacks have been identified when using this approach for different testing requirements. Several algorithms for automatically generating test case suites from cause-effect graph specifications have been proposed; however, many of them do not solve the main drawbacks of the initial backward-propagation approach and offer only minor improvements for specific purposes. This work proposes two new algorithms for deriving test cases from cause-effect graph representations. Forward-propagation of cause values is used for generating the full feasible test case suite, whereas multiple effect activations are taken into account for reducing the feasible test case suite size. The test case suites generated by the proposed algorithms were evaluated using the newly introduced test effect coverage and fault detection rate effectiveness metrics. The evaluation shows that the proposed algorithms work in real time even for a very large number of cause nodes. The results also indicate that the proposed algorithm for generating all feasible test cases produces a larger test case suite than the originally proposed approaches, whereas the proposed algorithm for test case suite minimization produces a smaller test case subset while ensuring maximum effect coverage, fault detection rate effectiveness and a better test effect coverage ratio.
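A minimal sketch of the forward-propagation idea follows: cause values are assigned first and effect values are computed from them, in contrast to backward-propagation from forced effects. The graph encoding (effects as Boolean functions of causes) and the exhaustive enumeration are simplifying assumptions made for this example; dependency constraints between causes are not handled here.

```python
# Sketch of forward-propagation over a toy cause-effect graph: enumerate cause
# assignments and compute the resulting effect activations for each test case.
from itertools import product

causes = ["C1", "C2", "C3"]
effects = {
    "E1": lambda v: v["C1"] and v["C2"],        # AND relation
    "E2": lambda v: v["C2"] or not v["C3"],     # OR relation with negation
}

def all_feasible_test_cases():
    """Return (cause_values, effect_values) pairs for every cause combination."""
    suite = []
    for bits in product([False, True], repeat=len(causes)):
        values = dict(zip(causes, bits))
        outcome = {name: fn(values) for name, fn in effects.items()}
        suite.append((values, outcome))
    return suite

for inputs, outputs in all_feasible_test_cases():
    print(inputs, "->", outputs)
```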

Ingmar Bešić, Z. Avdagić, K. Hodzic

Visual impairments often impose serious restrictions on a visually impaired person, and a considerable number of people, especially among the aging population, depend on assistive technology to sustain their quality of life. Development and testing of assistive technology for the visually impaired requires gathering information and conducting studies on both healthy and visually impaired individuals in a controlled environment. We propose a test setup for visually impaired persons by creating an RFID-based assistive environment, the Visual Impairment Friendly RFID Room. The test setup can be used to evaluate RFID object localization and its use by visually impaired persons. To a certain extent, every impairment has individual characteristics, as different individuals may respond better to different subsets of visual information. We use a virtual reality prototype to both simulate visual impairment and map the full visual information to the subset that a visually impaired person can perceive. Time-domain color mapping with real-time image processing is used to evaluate the virtual reality prototype targeting color vision deficiency.

Mathematical modelling to compute ground truth from 3D images is an area of research that can strongly benefit from machine learning methods. Deep neural networks (DNNs) are state-of-the-art methods designed for solving these kinds of problems. Convolutional neural networks (CNNs), as one class of DNNs, can meet the specific requirements of quantitative analysis, especially when image segmentation is needed. This article presents a system that uses a cascade of CNNs with symmetric blocks of layers in a chain, dedicated to the segmentation of 3D nuclei in microscopic images. The system is designed through eight experiments that differ in the following aspects: the number of training slices and 3D samples used for training, the usage of pre-trained CNNs, and the number of slices and 3D samples used for validation. CNN parameters are optimized using linear, brute-force, and random combinatorics, followed by voter and median operations. Data augmentation techniques such as reflection, translation and rotation are used in order to produce a sufficient training set for the CNNs. Optimal CNN parameters are reached by defining 11 standard and two proposed metrics. Finally, benchmarking demonstrates that the CNNs improve segmentation accuracy, reliability and annotation accuracy, confirming the relevance of CNNs for generating high-throughput mathematical ground truth from 3D images.
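The sketch below illustrates the reflection, rotation and translation augmentations named in the abstract, applied to a single 2D slice stored as a NumPy array. The parameter choices (rotation steps, translation offsets) are assumptions for this example, as the exact augmentation pipeline is not specified in the abstract.

```python
# Sketch of reflection / rotation / translation augmentation of a 2D slice.
import numpy as np

def augment_slice(img):
    """Yield augmented variants of a single 2D microscopy slice."""
    yield np.flip(img, axis=1)                  # horizontal reflection
    yield np.flip(img, axis=0)                  # vertical reflection
    for k in (1, 2, 3):
        yield np.rot90(img, k)                  # 90/180/270 degree rotations
    for dy, dx in ((5, 0), (0, 5), (-5, -5)):
        yield np.roll(img, shift=(dy, dx), axis=(0, 1))   # circular translations

slice_img = np.random.rand(128, 128)            # stand-in for a nuclei slice
augmented = list(augment_slice(slice_img))
print(len(augmented), "augmented slices from one original")
```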

Hydropower dam displacement is influenced by various factors (dam ageing, reservoir water level, and air, water, and concrete temperature), which cause complex nonlinear behaviour that is difficult to predict. Object deformation monitoring is a task of geodetic and civil engineers, who use different instruments and methods for measurements. Only geodetic methods have been used for the object movement analysis in this research. Although the whole object is affected by the influencing factors, different parts of the object react differently; hence, one model cannot precisely describe the behaviour of every part of the object. In this research, a localised approach is presented: two individual models are developed for every point strategically placed on the object, one for the analysis and prediction of movement along the X axis and the other for the Y axis. Additionally, the prediction of horizontal dam movement is not performed directly from the measured values of the influencing factors, but from predicted values obtained by machine learning and statistical methods. The results of this research show that accurate short-term time series prediction of dam movement is possible using machine learning and statistical methods, and that the only factor limiting the prediction horizon is the accuracy of the weather forecast.
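A minimal sketch of the per-point, per-axis modelling idea follows: one regressor for the X displacement and one for the Y displacement of a single monitored point, trained on the influencing factors. The random forest model, the synthetic data and the feature list are stand-in assumptions, since the abstract does not name the exact machine learning and statistical methods used.

```python
# Sketch of two independent models (X and Y displacement) for one monitored point.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)
n = 200
# Assumed feature columns: reservoir level, air temp, water temp, concrete temp, dam age.
factors = rng.normal(size=(n, 5))
disp_x = factors @ rng.normal(size=5) + rng.normal(scale=0.1, size=n)   # synthetic target
disp_y = factors @ rng.normal(size=5) + rng.normal(scale=0.1, size=n)   # synthetic target

# One model per axis for this point; the last 10 epochs are held out as a
# stand-in for short-term prediction from forecast influencing factors.
model_x = RandomForestRegressor(n_estimators=200).fit(factors[:-10], disp_x[:-10])
model_y = RandomForestRegressor(n_estimators=200).fit(factors[:-10], disp_y[:-10])

print(model_x.predict(factors[-10:]))
print(model_y.predict(factors[-10:]))
```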
