Edin Muratspahić, David Feldman, David E. Kim, Xiangli Qu, A. Bratovianu, Paula Rivera-Sánchez, Federica Dimitri, Jason Cao et al.

G protein-coupled receptors (GPCRs) play key roles in physiology and are central targets for drug discovery and development, yet the design of protein agonists and antagonists has been challenging as GPCRs are integral membrane proteins and conformationally dynamic. Here we describe computational de novo design methods and a high throughput “receptor diversion” microscopy-based screen for generating GPCR binding miniproteins with high affinity, potency and selectivity, and the use of these methods to generate MRGPRX1 agonists and CXCR4, GLP1R, GIPR, GCGR and CGRPR antagonists. Cryo-electron microscopy data reveals atomic-level agreement between designed and experimentally determined structures for CGRPR-bound antagonists and MRGPRX1-bound agonists, confirming precise conformational control of receptor function. Our de novo design and screening approach opens new frontiers in GPCR drug discovery and development.

Hila Safi, Medina Bandic, Christoph Niedermeier, C. G. Almudever, Sebastian Feld, Wolfgang Mauerer

Design space exploration (DSE) plays an important role in optimising quantum circuit execution by systematically evaluating different configurations of compilation strategies and hardware settings. In this paper, we conduct a comprehensive investigation into the impact of various layout methods, qubit routing techniques, and optimisation levels, as well as device-specific properties such as different variants and strengths of noise and imperfections, the topological structure of qubits, connectivity densities, and back-end sizes. By spanning these dimensions, we aim to understand the interplay between compilation choices and hardware characteristics. A key question driving our exploration is whether the optimal selection of device parameters and mapping techniques, comprising initial layout strategies and routing heuristics, can mitigate device-induced errors beyond standard error mitigation approaches. Our results show that carefully selecting software strategies (e.g., mapping and routing algorithms) and tailoring hardware characteristics (such as minimising noise and leveraging topology and connectivity density) significantly improve the fidelity of circuit execution outcomes, and thus the expected correctness or success probability of the computational result. We provide estimates based on key metrics such as circuit depth, gate count, and expected fidelity. Our results highlight the importance of hardware–software co-design, particularly as quantum systems scale to larger dimensions and on the way towards fully error-corrected quantum systems: our study is based on computationally noisy simulations, but considers various implementations of quantum error correction (QEC) using the same approach as for other algorithms. The observed sensitivity of circuit fidelity to noise and connectivity suggests that co-design principles will be equally critical when integrating QEC in future systems.
Our exploration provides practical guidelines for co-optimising physical mapping, qubit routing, and hardware configurations in realistic quantum computing scenarios.
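The link between gate counts and expected fidelity mentioned above can be illustrated with a minimal sketch. The per-gate error rates and the first-order multiplicative model below are illustrative assumptions, not parameters from the study:

```python
# Sketch: first-order fidelity estimate from transpiled-circuit metrics.
# Error rates p1/p2 are assumed values, not figures from the paper.

def estimate_fidelity(n_1q_gates, n_2q_gates, p1=0.001, p2=0.01):
    """Approximate success probability as the product of per-gate
    survival probabilities (a common first-order model)."""
    return (1 - p1) ** n_1q_gates * (1 - p2) ** n_2q_gates

# Routing on a sparsely connected topology inserts SWAPs (three
# two-qubit gates each), which lowers the estimate:
direct = estimate_fidelity(n_1q_gates=40, n_2q_gates=20)
routed = estimate_fidelity(n_1q_gates=40, n_2q_gates=20 + 3 * 5)  # 5 extra SWAPs
```

This is why layout and routing choices matter: every SWAP a router inserts multiplies in another two-qubit error term.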

B. Duraković, Hazim Bašić

Maturity of companies, organizational culture, and costs are among the major limiting factors for implementing the cost-effective six sigma methodology in Bosnian companies. The purpose of this paper is to analyze the possibilities of applying the six sigma concept in a medium-sized company with 200 employees. Considering the characteristics of the company, an implementation model was proposed and tested. The project included process monitoring using statistical process control charts for two periods, before and after improvement. The dominant defect and its causes were identified, and it was found that the process was out of control. After implementation of the improvement measures, the dominant defect was eliminated, but the process remained out of control. In conclusion, the test indicated that the model is effective but that more iterations are needed to achieve the desired state.
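The before/after monitoring described above can be sketched with a Shewhart-style control chart check: limits are estimated from a baseline period, and later samples are flagged when they fall outside them. The data and the three-sigma rule here are illustrative, not the paper's measurements:

```python
def control_limits(baseline, sigma_mult=3.0):
    """Estimate lower/upper control limits from a baseline period."""
    n = len(baseline)
    mean = sum(baseline) / n
    var = sum((x - mean) ** 2 for x in baseline) / (n - 1)
    sigma = var ** 0.5
    return mean - sigma_mult * sigma, mean + sigma_mult * sigma

def out_of_control(baseline, monitored):
    """Return the monitored samples that violate the control limits."""
    lcl, ucl = control_limits(baseline)
    return [x for x in monitored if x < lcl or x > ucl]
```

A process that still produces points outside the limits after an improvement, as in the study, remains "out of control" and needs further iterations.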

Artur Hermann, Nataša Trkulja, Echo Meissner, Benjamin Erb, Frank Kargl

Vehicular communication via V2X networks increases road safety, but is vulnerable to data manipulation which can lead to serious incidents. Existing security systems, such as misbehavior detection systems, have limitations in detecting and mitigating such threats. To address these challenges, we have implemented a software prototype of a Trust Assessment Framework (TAF) that assesses the trustworthiness of received V2X data by integrating evidence from multiple trust sources. This interactive demonstration illustrates the quantification of trust for a smart traffic light system application. We demonstrate the impact of varying evidence coming from a misbehavior detection system and a security report generator on the trust assessment process. We also showcase internal processing steps within our TAF when receiving new evidence, up to and including the eventual decision making on the trustworthiness of the received V2X data.

Artur Hermann, Nataša Trkulja, Patrick Wachter, Benjamin Erb, Frank Kargl

Future vehicles and infrastructure will rely on data from external entities such as other vehicles via V2X communication for safety-critical applications. Malicious manipulation of this data can lead to safety incidents. Earlier works proposed a trust assessment framework (TAF) to allow a vehicle or infrastructure node to assess whether it can trust the data it received. Using subjective logic, a TAF can calculate trust opinions for the trustworthiness of the data based on different types of evidence obtained from diverse trust sources. One particular challenge in trust assessment is the appropriate quantification of this evidence. In this paper, we introduce different quantification methods that transform evidence into appropriate subjective logic opinions. We suggest quantification methods for different types of evidence: security reports, misbehavior detection reports, intrusion detection system alerts, GNSS spoofing scores, and system integrity reports. Our evaluations in a smart traffic light system scenario show that, when using our proposed quantification methods, the TAF detects attacks with an accuracy greater than 96% and increases intersection throughput by 42% while maintaining safety and security.
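The evidence-to-opinion step can be sketched with the standard binomial-opinion mapping from subjective logic, which turns counts of positive and negative evidence into belief, disbelief, and uncertainty masses. Whether the paper's quantification methods use exactly this mapping is an assumption; the sketch only illustrates the general idea:

```python
def opinion_from_evidence(r, s, base_rate=0.5, W=2.0):
    """Map positive (r) and negative (s) evidence counts to a binomial
    subjective logic opinion; W is the non-informative prior weight.
    Returns (belief, disbelief, uncertainty, projected_probability)."""
    k = r + s + W
    b, d, u = r / k, s / k, W / k
    return b, d, u, b + base_rate * u
```

With no evidence at all the opinion is fully uncertain (u = 1); as evidence accumulates, uncertainty shrinks and belief or disbelief grows.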

Jasmina Mušović, Danijela Tekić, Ana Jocić, Slađana Marić, Aleksandra Dimitrijević

The increasing demand for lithium-ion batteries (LIBs) and their limited lifespan emphasize the urgent need for sustainable recycling strategies. This study investigates the application of tetrabutylphosphonium-based ionic liquids (ILs) as alternative leaching agents for recovering critical metals, Li(I), Co(II), Ni(II), and Mn(II), from spent NMC cathode materials. Initial screening experiments evaluated the leaching efficiencies of nine tetrabutylphosphonium-based ILs for Co(II), Ni(II), Mn(II), and Li(I), revealing distinct metal dissolution behaviors. Three ILs containing HSO4−, EDTA2−, and DTPA3− anions exhibited the highest leaching performance and were selected for further optimization. Key leaching parameters, including IL and acid concentrations, temperature, time, and solid-to-liquid ratio, were systematically adjusted, achieving leaching efficiencies exceeding 90%. Among the tested systems, [TBP][HSO4] enabled near-complete metal dissolution (~100%) even at room temperature. Furthermore, an aqueous biphasic system (ABS) was investigated utilizing [TBP][HSO4] in combination with ammonium sulfate, enabling the complete extraction of all metals into the salt-rich phase while leaving the IL phase metal-free and potentially suitable for reuse, indicating the feasibility of integrating leaching and extraction into a continuous, interconnected process. This approach represents a promising step forward in LIB recycling, highlighting the potential for sustainable and efficient integration of leaching and extraction within established hydrometallurgical frameworks.

R. Tillaar, H. Pojskić, Håkan Andersson

Background/Objectives: This study aimed to investigate the skating determinants and differences between male and female bandy players in the spatiotemporal variables during acceleration and maximum sprint skating velocity. Methods: Seventy-four female bandy players (age: 18.9 ± 4.1 years; height: 1.67 ± 0.06 m; body mass: 63.2 ± 7.4 kg; training experience: 13.4 ± 3.9 yrs.; 26 elite and 48 junior elite) and 111 male bandy players (age: 20.7 ± 5.0 years; height: 1.80 ± 0.05 m; body mass: 76.4 ± 8.4 kg; training experience: 13.8 ± 5.0 yrs.; 47 elite and 66 junior elite players) performed linear sprint skating over 80 m. Split times were measured every ten metres by photocells to calculate velocities, while spatiotemporal skating variables for each step (glide times and length, step length, and frequency) were measured by IMUs attached to the skates. The first six steps (acceleration phase), the six steps at the highest velocity (maximal speed phase), and the average of all steps were used for analysing glide-by-glide spatiotemporal variables. Results: The analyses revealed that male players exhibited higher acceleration and maximal skating velocity than female players. The higher acceleration in men was accompanied by shorter gliding time, longer step length, and higher step frequency. When skating at maximal speed, male players had a longer step length and longer gliding time and length. The sub-group analysis revealed that step frequency did not correlate with skating velocity in either the acceleration or maximal speed phase. On the other hand, glide and step lengths significantly correlated with skating velocity in both phases (r ≥ 0.60). Conclusions: For faster skating in bandy, it is generally better to prioritise glide and step length over stride frequency. Hence, players should be encouraged to stay low and have more knee flexion to enable a longer extension length and, therefore, a longer path and more horizontal direction of applied force to enhance their acceleration ability.
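The photocell measurement described above reduces to simple arithmetic: cumulative split times at each 10 m gate give one velocity per segment. A minimal sketch, with invented example times:

```python
def segment_velocities(split_times, segment_length=10.0):
    """Cumulative photocell split times (s) -> per-segment velocities (m/s)."""
    return [segment_length / (t1 - t0)
            for t0, t1 in zip(split_times, split_times[1:])]

# e.g. gates at 0, 10 and 20 m passed at 0.0 s, 2.0 s and 3.0 s:
velocities = segment_velocities([0.0, 2.0, 3.0])
```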

Blind and visually impaired people cannot use most cutting-edge technology, which usually conveys information visually through different kinds of displays. Different solutions can help overcome this obstacle, such as sound output and tactile displays that use the Braille alphabet, composed of mechanically raised dots. However, a considerable number of visually impaired persons cannot read Braille, and an even larger number of persons without visual impairment cannot either. This paper presents an IoT-based system that uses the Arduino Uno WiFi development board to read Braille input from a 4×4 push-button matrix, two letters at a time. The system uses a 32×8 matrix display to show either the translated basic-alphabet output, which can be read by sighted users, or the Braille alphabet output. It offers a quick way for the visually impaired to convey information to sighted people by typing Braille input with both hands simultaneously. The proposed system will be used to educate sighted individuals about the Braille alphabet and help reduce their learning time. It can also be used as a quick translator of Braille for sighted individuals who wish to read written Braille documents.
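The core of such a translator is a lookup from pressed-dot patterns to letters. A minimal sketch in Python (the device itself runs Arduino code; this only illustrates the mapping, with just the first five letters of the standard Braille cell, dots 1–3 in the left column and 4–6 in the right):

```python
# Standard Braille cell patterns for the first five letters.
BRAILLE = {
    frozenset({1}): 'a',
    frozenset({1, 2}): 'b',
    frozenset({1, 4}): 'c',
    frozenset({1, 4, 5}): 'd',
    frozenset({1, 5}): 'e',
}

def decode_cell(pressed_dots):
    """Translate one set of pressed dots into a basic-alphabet letter;
    unknown patterns map to '?'."""
    return BRAILLE.get(frozenset(pressed_dots), '?')
```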

Texas Instruments development kits have a wide application in practical and scientific experiments due to their small size, processing power, available booster packs, and compatibility with different environments. The most popular integrated development environments for programming these development kits are Energia and Code Composer Studio. Unfortunately, there are no existing studies that compare the benefits and drawbacks of these environments and their performances. Conversely, the performance of the FreeRTOS environment is well explored, making it a suitable baseline for embedded-systems execution. In this paper, we performed an experimental evaluation of the performance of the Texas Instruments MSP-EXP432P401R when using Energia, Code Composer Studio, and FreeRTOS for program execution. Three different sorting algorithms (bubble sort, radix sort, merge sort) and three different search algorithms (binary search, random search, linear search) were used for this purpose. The results show that Energia sorting algorithms outperform the other environments with a maximum of 400 elements. On the other hand, FreeRTOS search algorithms far outperform the other environments with a maximum of 255,000 elements (whereas this maximum was 10,000 elements for the other environments). Code Composer Studio resulted in the largest processing time, which indicates that the low-level registry editing performed in this environment leads to significant performance issues.
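The measurement methodology above boils down to running an algorithm on a fixed workload and timing it. A desktop Python stand-in for that harness (the study's actual code runs on the board and would read a hardware timer; the 400-element reversed list mirrors the Energia sorting limit reported above):

```python
import time

def bubble_sort(a):
    """O(n^2) exchange sort, one of the three sorts benchmarked."""
    a = list(a)
    for i in range(len(a)):
        for j in range(len(a) - 1 - i):
            if a[j] > a[j + 1]:
                a[j], a[j + 1] = a[j + 1], a[j]
    return a

def time_once(fn, data):
    """Wall-clock a single run of fn on data."""
    t0 = time.perf_counter()
    result = fn(data)
    return result, time.perf_counter() - t0

result, elapsed = time_once(bubble_sort, list(range(400, 0, -1)))
```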

Computer games can be used not only for entertainment but also for education. Embedded systems can be used to improve the gamified learning process by making the interaction with intended users more interesting. Texas Instruments development kits with booster plug-in modules can improve the outcomes of gamified learning. However, there is a lack of studies that explore the benefits and drawbacks of different input methods for gamified learning purposes. In this paper, a snake game was developed on the Texas Instruments MSP-EXP432P401R development kit that uses the analog joystick of the BOOSTXL-EDUMKII plug-in module for controlling the snake. An experimental usability study was conducted on 61 third-year university students, comparing the analog joystick to the computer keyboard, computer mouse, and mobile touchscreen input methods. The results showed that the majority of students preferred the original computer keyboard input method and that more than half of the participants preferred the 90-degree rotation of the snake to the 360-degree analog joystick. However, the analog joystick improved the gaming experience by 63.6%, and many students made positive comments about its usability in general, indicating that its application for gamified learning may be possible for other types of games.
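The 90-degree versus 360-degree distinction above comes down to quantizing the joystick's raw analog reading into four directions. A minimal sketch; the centre value, dead zone, and ADC range are illustrative assumptions, not the module's actual parameters:

```python
def joystick_to_direction(x, y, centre=2048, deadzone=500):
    """Quantize a raw joystick ADC reading into one of the four
    90-degree snake directions; None means 'keep current heading'."""
    dx, dy = x - centre, y - centre
    if max(abs(dx), abs(dy)) < deadzone:
        return None                      # stick near centre: no change
    if abs(dx) >= abs(dy):
        return 'RIGHT' if dx > 0 else 'LEFT'
    return 'UP' if dy > 0 else 'DOWN'
```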

The impressive results achieved in language tasks by generative pre-trained transformers have led to divided opinions on whether the Turing test has finally been passed. After examining the working principles of GPT programs, it was observed that the tokenization concept used by GPT results in the loss of the word-to-letter relationship. Using about 36 specially prepared anagrams, each with a description of a term in verse in the languages of the South Slavs, it was shown that ChatGPT and similar programs are far more capable of understanding the semantic connection between words and allusions than of performing the relatively simple task of finding an adequate word from the offered letters.
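The letter-level bookkeeping the anagram task requires, and which token-based models lack, is trivial for a program that sees individual characters. A minimal sketch of the check:

```python
from collections import Counter

def is_exact_anagram(candidate, offered_letters):
    """True when the candidate word uses exactly the offered letters
    (case-insensitive multiset comparison)."""
    return Counter(candidate.lower()) == Counter(offered_letters.lower())
```

A solver built on this check works letter by letter; a GPT-style model instead sees multi-character tokens, which is precisely the mismatch the abstract describes.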

Sead Delalic, Samra Behić, Harun Goralija, Zenan Sabanac

Warehouse Management Systems (WMS) employ advanced optimization techniques to enhance efficiency and streamline processes, from inventory positioning to order picking and packing. Among these, order picking represents the most time-consuming and resource-intensive operation. This paper presents a novel approach for monitoring worker efficiency in warehouses, focusing on estimating the complexity and time required for order picking. A variety of factors influence these estimates, including item location, quantity, dimensions, and weight, picking sequence, and whether the location is in the stock or picking zone. Accurate estimation enables effective daily work planning, real-time monitoring of worker productivity, and overall warehouse efficiency. The proposed approach has been tested in real-world warehouse environments, demonstrating its practical applicability and potential to significantly improve worker performance, resource allocation, and operational management.
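The factors listed above can be combined into a pick-time estimate. The additive model and every coefficient below are illustrative assumptions for the sake of a sketch, not the model used in the paper:

```python
def estimate_pick_time(distance_m, quantity, weight_kg, in_stock_zone,
                       walk_speed=1.2, handling_per_item=3.0,
                       weight_factor=0.5, stock_zone_penalty=20.0):
    """Return an estimated pick time in seconds (illustrative model)."""
    t = distance_m / walk_speed          # travel to the item's location
    t += quantity * handling_per_item    # per-item grab-and-scan time
    t += weight_kg * weight_factor       # heavier items take longer
    if in_stock_zone:
        t += stock_zone_penalty          # stock zone may need extra equipment
    return t
```

Summing such estimates over a day's orders gives the baseline against which actual worker times can be monitored.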

Sead Delalic, Zinedin Kadric, Jana Jerkić, Faris Mehmedović

This paper addresses the challenge of analyzing CVs to parse their content into structured formats suitable for further processing and analysis. The proposed solution processes CVs provided as images or PDFs, handling diverse input formats, including free-form, multi-language, non-standardized layouts, and highly structured documents. Various heuristic approaches are employed for layout analysis, complemented by lightweight language models for extracting information. While multimodal models demonstrate strong performance, their cost and deployment complexity remain significant barriers. This study explores alternative methods optimized for computational efficiency, processing accuracy, and easier deployment. A comparative analysis of approaches is conducted on a standard dataset containing CVs from diverse clients and job roles, ranging from entry-level to specialized positions in various domains. The findings highlight the potential of these tailored, efficient solutions for scalable and secure CV parsing.
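One example of the kind of lightweight heuristic such a pipeline could use for layout analysis is splitting plain CV text into sections by matching short heading lines against a keyword list. The keywords and rules below are illustrative assumptions, not the heuristics from the paper:

```python
SECTION_KEYWORDS = {'education', 'experience', 'skills', 'languages'}

def split_sections(lines):
    """Group CV lines under the most recent recognized heading;
    lines before any heading land in 'header'."""
    sections = {'header': []}
    current = 'header'
    for line in lines:
        key = line.strip().lower().rstrip(':')
        # A heading is a short line (<= 2 words) matching a keyword.
        if key in SECTION_KEYWORDS and len(line.split()) <= 2:
            current = key
            sections[current] = []
        else:
            sections[current].append(line)
    return sections
```

Real CVs would need fuzzier matching (synonyms, multiple languages, multi-column layouts), which is where the lightweight language models mentioned above come in.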
