Deep learning techniques in computer vision (CV) tasks such as object detection, classification, and tracking can be facilitated by placing predefined markers on the tracked objects. Marker selection can noticeably affect algorithm performance: a tracking algorithm may swap visually similar markers more frequently and therefore require more training data and training time. Still, the issue of marker selection has not been explored in the literature and tends to be glossed over when designing CV solutions. This research considered the effects of symbol selection for 2D-printed markers on a neural network's performance. The study assessed over 250 ALT-code symbols readily available on most consumer PCs and provides a go-to selection for effectively tracking n objects. To this end, a neural network was trained to classify all the symbols and their augmentations, after which the confusion matrix was analysed to extract the symbols that the network distinguished best. The results showed that symbols selected in this way outperformed both a random selection and a selection of common symbols. Furthermore, the methodology presented in this paper can easily be applied to a different set of symbols and to different neural network architectures.
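The selection step described above can be sketched as follows — a minimal, hypothetical example that ranks symbol classes by how rarely the network confuses them with the others, given a confusion matrix. The symbols and matrix values are illustrative, not the paper's data:

```python
# Rank symbol classes by how little the trained network confuses them,
# using the off-diagonal mass of each row and column of the confusion
# matrix. The 4x4 matrix and symbol names below are illustrative only.

def least_confused_symbols(symbols, confusion, n):
    """Return the n symbols with the smallest off-diagonal confusion."""
    scores = []
    for i, sym in enumerate(symbols):
        row_err = sum(v for j, v in enumerate(confusion[i]) if j != i)
        col_err = sum(confusion[j][i] for j in range(len(symbols)) if j != i)
        scores.append((row_err + col_err, sym))
    return [sym for _, sym in sorted(scores)[:n]]

symbols = ["☺", "♦", "♪", "§"]   # sample ALT-code symbols
confusion = [                     # rows: true class, cols: predicted class
    [95, 3, 1, 1],
    [4, 90, 5, 1],
    [0, 6, 92, 2],
    [1, 1, 2, 96],
]
print(least_confused_symbols(symbols, confusion, 2))  # ['§', '☺']
```

A greedy ranking like this is the simplest reading of "extract the symbols the network distinguished best"; the paper's exact extraction criterion may differ.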
The number of loan requests is rapidly growing worldwide, representing a multi-billion-dollar business in the credit approval industry. Large data volumes extracted from banking transactions that represent customers' behavior are available, but processing loan applications is a complex and time-consuming task for banking institutions. In 2022, over 20 million Americans had open loans, totaling USD 178 billion in debt, while over 20% of loan applications were rejected. Numerous statistical methods have been deployed to estimate loan risks, opening the question of whether machine learning techniques can better predict the potential risks. To study the machine learning paradigm in this sector, a mental health dataset and a loan approval dataset, presenting survey results from 1991 individuals, are used as inputs to experiment with the credit risk prediction ability of the chosen machine learning algorithms. Providing a comprehensive comparative analysis, this paper shows how the chosen machine learning algorithms can distinguish between normal and risky loan customers who might never pay their debts back. The results from the tested algorithms show that XGBoost achieves the highest accuracy of 84% on the first dataset, surpassing gradient boosting (83%) and KNN (83%). On the second dataset, random forest achieved the highest accuracy of 85%, followed by decision tree and KNN with 83%. Alongside accuracy, the precision, recall, and overall performance of the algorithms were tested, and a confusion matrix analysis was performed, producing numerical results that emphasized the superior performance of XGBoost and random forest in the classification tasks on the first dataset, and of XGBoost and decision tree on the second. Researchers and practitioners can rely on these findings to inform their model selection process and enhance the accuracy and precision of their classification models.
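The evaluation described — accuracy, precision, and recall derived from a confusion matrix — can be sketched in plain Python. The labels below are made up for illustration and are not drawn from the paper's datasets:

```python
# Compute a binary confusion matrix and the derived metrics used to
# compare the classifiers (accuracy, precision, recall).
# Labels: 1 = risky loan customer, 0 = normal customer. Sample data only.

def evaluate(y_true, y_pred):
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    return {
        "accuracy": (tp + tn) / len(y_true),
        "precision": tp / (tp + fp) if tp + fp else 0.0,
        "recall": tp / (tp + fn) if tp + fn else 0.0,
        "confusion": [[tn, fp], [fn, tp]],  # rows: actual, cols: predicted
    }

y_true = [1, 0, 1, 1, 0, 0, 1, 0]   # illustrative ground truth
y_pred = [1, 0, 0, 1, 0, 1, 1, 0]   # illustrative classifier output
m = evaluate(y_true, y_pred)
print(m["accuracy"], m["precision"], m["recall"])  # 0.75 0.75 0.75
```

The same three metrics computed here are the ones the paper reports per algorithm, so a function of this shape is enough to reproduce the comparison once predictions are available.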
Air pollution is a major problem in developing countries and around the world, causing lung diseases such as asthma, chronic bronchitis, emphysema, and chronic obstructive pulmonary disease. Therefore, innovative methods and systems for predicting air pollution are needed to reduce such risks. Several Internet of Things (IoT) technologies have been developed to assess and monitor various air quality parameters. In the context of the IoT, artificial intelligence is one of the main segments of smart cities, enabling the collection of large amounts of data to make recommendations, predict future events, and support decision making. Big data, as part of artificial intelligence, greatly contributes to further decision making, determining the necessary resources, and identifying critical places thanks to the large amount of data it collects. This paper proposes a solution, with the integration of the IoT, to predict pollution for any given day, and aims to show how sensor-derived data in smart air pollution monitoring solutions can be used for intelligent pollution management. Data from an air pollution sensor are sent to the server via a .NET 6 REST API endpoint and placed in a SQL Server database together with additional weather data collected from a REST API for that part of the day; a dataset is then created through an ETL process in a Jupyter notebook. Linear regression algorithms are used for making predictions. By detecting the largest sources of air pollution, artificial intelligence solutions can proactively reduce pollution and thus improve health conditions and reduce health costs.
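The prediction step can be illustrated with an ordinary least-squares fit in pure Python. The temperature and PM2.5 values below are invented for illustration; the paper's actual dataset comes from the sensor and weather data assembled by the ETL process:

```python
# Fit y = a + b*x by ordinary least squares, the basic form of the
# linear regression used for the pollution predictions. The temperature
# and PM2.5 values are illustrative, not measured data.

def fit_linear(xs, ys):
    """Return intercept a and slope b minimising squared error."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    a = my - b * mx
    return a, b

temps = [5.0, 10.0, 15.0, 20.0, 25.0]   # daily mean temperature (°C)
pm25  = [42.0, 35.0, 30.0, 24.0, 19.0]  # PM2.5 concentration (µg/m³)
a, b = fit_linear(temps, pm25)
predicted = a + b * 12.0                 # predicted PM2.5 at 12 °C
```

In practice the paper's model would regress pollution on several weather features at once; the single-feature closed form is shown only because it makes the fitting step transparent.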
The design process of an IoT (Internet of Things) network requires adequate knowledge of the various communication technologies that make the connection of IoT modules possible. Many important factors, such as scalability, bandwidth, data rate (speed), coverage, power consumption, and security support, need to be considered to answer the needs of an IoT application with regard to the implemented radio communication technology. This paper studies three major LPWAN (Low-Power Wide-Area Network) technologies that currently lead the market of radio communication technologies. Focusing on Sigfox, LoRaWAN (Long-Range Wide-Area Network), and NB-IoT (Narrow-Band Internet of Things), this work presents the respective pros and cons of these technologies and gives a clear view of recent trends and effective choices of radio communication technologies for major smart IoT applications.
With the global transition to IPv6 (Internet Protocol version 6), IP (Internet Protocol) validation efficiency and IPv6 support from the network programming perspective are gaining importance. As global computer networks grow in the era of the IoT (Internet of Things), IP address validation is an inevitable process for assuring strong network privacy and security. The complexity of IP validation has increased due to the rather drastic change in the memory representation needed for storing IPv6 addresses. Low-level programming languages like C/C++ are a great choice for handling memory and working with simple devices connected in an IoT network. This paper analyzes some user-defined and open-source implementations of IP validation code in the Boost.Asio and POCO C++ networking libraries, as well as the IP security support provided for general networking purposes and the IoT. Considering a couple of sample codes, the paper concludes whether these C++ implementations answer the needs for flexibility and security of the upcoming era of IPv6-addressed computers.
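For comparison with the C++ library implementations discussed, the behaviour expected of a validator can be sketched with Python's standard ipaddress module. Python is used here only to illustrate the validation concept; it is not part of the paper's C++ code:

```python
# Validate IPv4/IPv6 address strings, mirroring the kind of checks that
# networking-library parsers such as Boost.Asio's and POCO's perform.
# Uses only Python's standard library.
import ipaddress

def classify_ip(text):
    """Return 'IPv4', 'IPv6', or 'invalid' for a candidate address string."""
    try:
        addr = ipaddress.ip_address(text)
    except ValueError:
        return "invalid"
    return "IPv6" if addr.version == 6 else "IPv4"

print(classify_ip("192.168.1.10"))             # IPv4
print(classify_ip("2001:db8::8a2e:370:7334"))  # IPv6
print(classify_ip("300.1.2.3"))                # invalid
```

The octet range check that rejects the last input is exactly the kind of rule whose IPv6 analogue (hex groups, `::` compression, 128-bit storage) drives the added validation complexity the paper analyzes.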
With the emerging Internet of Things (IoT) technologies, the smart city paradigm has become a reality. Low-power wide-area network (LPWAN) communication technologies are widely used for device connection in smart homes, smart lighting, smart metering, and so on. This work suggests a new approach to a smart parking solution using the benefits of narrowband Internet of Things (NB-IoT) technology. NB-IoT is an LPWAN technology dedicated to sensor communication within 5G mobile networks. This paper proposes the integration of NB-IoT into the core IoT platform, enabling sensor data to travel directly to the IoT radio stations for processing, after which they are forwarded to the user application programming interface (API). Showcasing the results of our research and experiments, this work demonstrates the ability of NB-IoT technology to support geolocation and navigation services, as well as payment and reservation services for vehicle parking, making smart parking solutions smarter.
Distributed Ledger Technologies are one of the pillars of future technologies, projected to have a great impact on many aspects of our lives, including social, economic, legal, and security aspects, among others. Bitcoin is still the most popular blockchain currency, but the opportunities for using Distributed Ledger Technologies are much broader, extending well beyond the financial applications that are best known and most popular. Besides blockchains, there are also other architectures of Distributed Ledger Technologies. This paper observes and analyses one technology that is a very strong alternative to blockchains: hashgraphs, which promise to outperform blockchains, as well as tangles. The basis of their architecture and functionality is explained, and directions and prognoses for further development are given. The main contribution of the paper is a comparison of hashgraph technology to its competing architectures, i.e. blockchains and tangles, considering different segments and different properties that define the quality of Distributed Ledgers.
Connected devices in the IoT, as well as the smartwatch market, are becoming more and more popular every year. The main mode of communication in the IoT is the easy-to-use MQTT protocol, suitable for devices with limited resources and battery power. Tizen is used on platforms such as mobile devices, smartwatches, TVs, and even Linux kernel-based IoT devices. In this paper, we explain how the MQTT protocol, the Tizen operating system, and their architectures work, and suggest one possible implementation of the MQTT protocol for smartwatches based on the Tizen operating system. We list the types of Tizen applications, develop a native application, and suggest possible future upgrades and applications in the IoT.
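MQTT's suitability for resource-constrained devices comes from its compact binary framing. As a hypothetical illustration (not the paper's Tizen implementation, which targets Tizen's native APIs), the CONNECT packet of MQTT 3.1.1 can be assembled in a few lines:

```python
# Build a minimal MQTT 3.1.1 CONNECT packet (clean session, no auth),
# illustrating the protocol's compact framing. Illustrative only; the
# paper's implementation is a native Tizen application, not Python.

def remaining_length(n):
    """Encode MQTT's variable-length 'remaining length' header field."""
    out = bytearray()
    while True:
        byte = n % 128
        n //= 128
        if n:
            byte |= 0x80   # continuation bit: more length bytes follow
        out.append(byte)
        if not n:
            return bytes(out)

def connect_packet(client_id, keepalive=60):
    var_header = (b"\x00\x04MQTT"           # protocol name "MQTT"
                  + b"\x04"                 # protocol level 4 (MQTT 3.1.1)
                  + b"\x02"                 # connect flags: clean session
                  + keepalive.to_bytes(2, "big"))
    payload = len(client_id).to_bytes(2, "big") + client_id.encode()
    body = var_header + payload
    return b"\x10" + remaining_length(len(body)) + body  # 0x10 = CONNECT

pkt = connect_packet("watch-01")  # 22 bytes for an 8-character client id
```

A complete session setup for an 8-character client id thus costs 22 bytes on the wire, which is why MQTT fits battery-powered wearables so well.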
Blockchains are established as the most widely used P2P distributed application for Distributed Ledger Technologies. Nevertheless, their successful existence, best known through the Bitcoin cryptocurrency, is attracting more and more competitors. One of those competitors is IOTA, which is based on a tangle, a Directed Acyclic Graph architecture. This paper stresses the potential of such an architecture, especially in future IoT applications. In particular, a few of the most important security issues are analyzed.
The main problem dealt with in this paper is the creation of a protocol for improved QoS-aware mobility management support in cellular all-IP networks, whereby we propose a new algorithm for QoS-aware mobility management based on multidimensional QoS metrics. An analytical framework for performance evaluation is presented as well. The proposed algorithm for QoS-aware dynamic MAP selection relies, in the decision-making process, on multidimensional QoS metrics defined in the QoS-preference spaces of the mobile node and the QoS-ability spaces of the MAP candidates. The metric is chosen to achieve the desired QoS level through three parameters: bandwidth, delay, and reliability, while retaining the balance of MAP loads across the entire network. For the performance evaluation of the proposed model, we used algorithm convergence, traffic class distribution across MAPs, and handover delay. Results showed that the standard deviation of each component of the QoS-ability vector is two orders of magnitude smaller than the deviation in the static MAP selection scenario. We achieved a decrease in total handover delay ranging from 20 ms to several hundred milliseconds by simplifying DAD procedures while preserving the simplicity of the architecture.
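The decision step can be sketched as a weighted match between the mobile node's QoS-preference vector and each MAP candidate's QoS-ability vector over the three parameters the paper names. The normalisation, weights, and candidate values below are illustrative assumptions, not the paper's exact metric:

```python
# Score MAP candidates against a mobile node's QoS preferences over
# bandwidth, delay, and reliability, then pick the best-matching MAP.
# The scoring function, weights, and candidate values are assumptions
# made for illustration; the paper defines its own multidimensional metric.

def score(preference, ability, weights):
    """Higher is better: reward bandwidth/reliability headroom, penalise delay."""
    s = 0.0
    s += weights["bandwidth"] * min(ability["bandwidth"] / preference["bandwidth"], 1.0)
    s += weights["delay"] * min(preference["delay"] / ability["delay"], 1.0)
    s += weights["reliability"] * min(ability["reliability"] / preference["reliability"], 1.0)
    return s

node_pref = {"bandwidth": 10.0, "delay": 50.0, "reliability": 0.99}  # Mbit/s, ms, ratio
candidates = {  # hypothetical QoS-ability vectors of two MAP candidates
    "MAP-A": {"bandwidth": 8.0,  "delay": 30.0, "reliability": 0.995},
    "MAP-B": {"bandwidth": 12.0, "delay": 80.0, "reliability": 0.98},
}
weights = {"bandwidth": 0.4, "delay": 0.4, "reliability": 0.2}
best = max(candidates, key=lambda m: score(node_pref, candidates[m], weights))
```

A load-balancing term over current MAP loads, which the paper's algorithm also retains, would be added to this score in the same weighted-sum fashion.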