In the public safety sector, 5G offers immense opportunities for enhancing mission-critical services by provisioning virtualized service functions at the network edge, enabling high reliability and low latency. One such mission-critical service is Back Situation Awareness (BSA), which supports Emergency Vehicles (EmVs) by increasing awareness of them on the roads. In this article, we introduce an on-demand BSA application service, developed for multi-domain Multi-Access Edge Computing (MEC) systems, that enables early notification of vehicles about the Estimated Time of Arrival (ETA) of an approaching EmV. State-of-the-art approaches inform civilian vehicles about EmVs only when they are in close proximity (up to 300 m). However, in some situations (e.g., in congested areas), this may not leave civilian vehicles enough time to maneuver safely out of the lane of an EmV. Our approach is, to the best of our knowledge, a unique way to significantly extend this awareness by creating an orchestrated 5G-based MEC deployment of the BSA application service on optimally selected edges, thereby stretching over multiple edge domains and even countries. By consuming the real-time location, speed, and heading of an EmV, the application service affords drivers sufficient time to create a clear corridor, allowing the EmV to pass through unhindered and safely, thereby increasing mission success. The detailed design and performance analysis of the BSA application service, built following modern cloud-native principles on Docker and Kubernetes, are presented in terms of the impact of emergency scale on MEC system resources and service response time. Moreover, we introduce a metric called the panic indicator, which depicts how the proposed BSA service can help drivers calmly maneuver out of the path of an EmV, thereby increasing road safety.
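The core notification input described above, the ETA of an approaching EmV computed from its real-time location and speed, can be sketched as follows. This is a minimal illustration, not the paper's implementation: it assumes a straight-line (great-circle) path between the EmV and the notified vehicle, ignoring road topology and heading.

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two WGS-84 points."""
    r = 6371000.0  # mean Earth radius in metres
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def eta_seconds(emv_pos, vehicle_pos, emv_speed_mps):
    """Straight-line ETA (s) of the EmV at a civilian vehicle's position."""
    if emv_speed_mps <= 0:
        return float("inf")  # EmV stationary: no meaningful ETA
    return haversine_m(*emv_pos, *vehicle_pos) / emv_speed_mps
```

A production BSA service would of course route this over the road network and update the estimate continuously as new EmV telemetry arrives at the edge.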
The increasing prevalence of colon and lung cancer presents a considerable challenge to healthcare systems worldwide, emphasizing the critical need for early and accurate diagnosis to improve patient outcomes. Diagnostic precision relies heavily on the expertise of histopathologists, making it a demanding task. In the absence of adequately trained histopathologists, patient health and well-being are jeopardized, potentially leading to misdiagnoses, unnecessary treatments and tests, and inefficient use of healthcare resources. However, with substantial technological advancements, deep learning (DL) has emerged as a potent tool in clinical settings, particularly in medical imaging. This study leveraged the LC25000 dataset, encompassing 25,000 images of lung and colon tissue, and introduces an innovative approach employing a self-organized operational neural network (Self-ONN) to accurately detect lung and colon cancer in histopathology images. The proposed model was then compared with five pretrained convolutional neural network (CNN) models: MobileNetV2-SelfMLP, ResNet18-SelfMLP, DenseNet201-SelfMLP, InceptionV3-SelfMLP, and MobileViTv2_200-SelfMLP, in which each multilayer perceptron (MLP) was replaced with a Self-MLP. Model performance was meticulously assessed using key metrics such as precision, recall, F1 score, accuracy, and area under the receiver operating characteristic (ROC) curve. The proposed model demonstrated exceptional overall accuracy, precision, sensitivity, F1 score, and specificity, achieving 99.74%, 99.74%, 99.74%, 99.74%, and 99.94%, respectively. This underscores the potential of artificial intelligence (AI) to significantly enhance diagnostic precision in clinical settings, a promising avenue for improving patient care and outcomes.
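The evaluation metrics named in this abstract (precision, recall/sensitivity, specificity, accuracy, F1 score) all derive from the confusion-matrix counts. A minimal sketch of those definitions for the binary case, with illustrative counts rather than the study's actual results:

```python
def binary_metrics(tp, fp, tn, fn):
    """Classification metrics from binary confusion-matrix counts."""
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)              # a.k.a. sensitivity
    specificity = tn / (tn + fp)
    accuracy = (tp + tn) / (tp + fp + tn + fn)
    f1 = 2 * precision * recall / (precision + recall)
    return {"precision": precision, "recall": recall,
            "specificity": specificity, "accuracy": accuracy, "f1": f1}

# Hypothetical counts, purely for illustration:
m = binary_metrics(tp=90, fp=10, tn=85, fn=15)
```

For the five-class LC25000 setting, the study would report these per class (one-vs-rest) and average them; the per-class definition is unchanged.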
This synopsis of the literature provides a thorough examination of several DL and digital image processing methods used in cancer identification, with a primary emphasis on lung and colon cancer. The experiments use the LC25000 dataset, which consists of 25,000 images, for training and testing. Various techniques, such as CNNs, transfer learning, ensemble models, and lightweight DL architectures, have been used to achieve accurate classification of cancer tissue. Studies consistently report exceptional performance, with accuracy rates ranging from 96.19% to 99.97%. DL models such as EfficientNetV2 and DHS-CapsNet, as well as CNN-based architectures such as VGG16 and GoogLeNet variants, have achieved high levels of accuracy. In addition, methods such as SSL and lightweight DL models yield encouraging results in effectively managing large datasets. Overall, the research emphasizes the efficacy of DL methods in diagnosing cancer from histopathological images, indicating that DL has the potential to greatly improve medical diagnostic techniques.
Abstract Objective. Studies that have evaluated the correlation between body mass index (BMI) and novel lipid indices such as triglycerides (TG)/high-density lipoprotein-cholesterol (HDL-C), total cholesterol (TC)/HDL-C, and low-density lipoprotein cholesterol (LDL-C)/HDL-C in type 2 diabetes mellitus (T2DM) are scarce. Hence, the aim of the present study was to explore the correlation between BMI and novel lipid indices in Bosnian patients with T2DM. Methods. The present study included 117 patients with T2DM (mean age: 66.51 years) and 68 controls (mean age: 68.37 years). BMI was calculated as weight/height². Lipids were measured by standard methods. TG/HDL-C, TC/HDL-C, and LDL-C/HDL-C ratios were calculated separately. Differences between the groups were assessed by Student's t-test or the Mann–Whitney U test. Correlations were determined by Spearman's test. Results. Among T2DM patients, 41.0% were overweight and 44.4% were obese. In the control group, 51.5% of subjects were overweight and 25.0% were obese. In the T2DM group, a significant correlation was observed between BMI and HDL-C, LDL-C, and the TG/HDL-C, TC/HDL-C, and LDL-C/HDL-C ratios. In the control group, a significant correlation was found between BMI and HDL-C, TG, and the TG/HDL-C, TC/HDL-C, and LDL-C/HDL-C ratios. No correlation between BMI and the other lipid parameters was found in either the T2DM or the control group. Conclusion. The present study showed a significant correlation between BMI and novel lipid indices in both T2DM patients and the control group. A possible explanation for the observed results might be the prevalence of overweight and obese participants in this study sample. Since novel lipid indices are used in the prediction of cardiometabolic risk, the results obtained in the present study have valuable clinical implications.
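The two quantities the study correlates are simple to compute from the measured values: BMI as weight/height², and the novel lipid indices as ratios to HDL-C. A minimal sketch with illustrative input values (not data from the study):

```python
def bmi(weight_kg, height_m):
    """Body mass index: weight / height^2, in kg/m^2."""
    return weight_kg / height_m ** 2

def lipid_indices(tc, tg, hdl, ldl):
    """Novel lipid indices: each lipid fraction divided by HDL-C.

    All inputs are assumed to be in the same units (e.g., mmol/L),
    so the ratios are dimensionless.
    """
    return {"TG/HDL-C": tg / hdl,
            "TC/HDL-C": tc / hdl,
            "LDL-C/HDL-C": ldl / hdl}
```

With per-patient BMI and ratio values computed this way, the study's Spearman correlations can then be obtained with any standard statistics package.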
Trains move in a specific way, along a pre-determined path, i.e., rails. The wheels of railway locomotives and the rails are made of steel, which makes it possible to achieve very high speeds with relatively low resistance to movement. Analysis and research related to the movement of trains are very important from many aspects, especially for traffic safety. In this paper, a simulation of train movement under realistic conditions was performed. The simulation was created using the Python programming language. Infrastructure data, locomotive data, and movement resistances were used as input data, which the program processes for the simulation. The results of the simulation are presented both graphically and numerically. All data used in the Python simulation model were also input into the verified railway simulation software OpenTrack. The results from both tools were compared and analyzed, and a report was generated on the feasibility of using the created program in real-world scenarios.
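A train-movement simulation of the kind described combines the locomotive's tractive effort with the running resistances and integrates the equation of motion over time. The following is a minimal sketch of that idea, not the paper's program: it assumes level track, a Davis-form resistance R(v) = a + b·v + c·v², forward-Euler integration, and purely illustrative coefficient values.

```python
def simulate_run(mass_kg, max_traction_n, power_w, target_v, distance_m,
                 a=2000.0, b=30.0, c=4.0, dt=0.5):
    """Forward-Euler simulation of a train accelerating along level track.

    Running resistance follows the Davis form R(v) = a + b*v + c*v^2 (N);
    the default coefficients are illustrative, not measured values.
    Tractive effort is limited by adhesion (max_traction_n) and power (P/v).
    Returns (elapsed time in s, final speed in m/s).
    """
    t, v, s = 0.0, 0.0, 0.0
    while s < distance_m:
        traction = min(max_traction_n, power_w / max(v, 0.1))
        resistance = a + b * v + c * v * v
        acc = (traction - resistance) / mass_kg
        if v >= target_v and acc > 0:
            acc = 0.0          # hold the target (cruising) speed
            v = target_v
        v = max(v + acc * dt, 0.0)
        s += v * dt
        t += dt
    return t, v

# Illustrative run: a 400 t train, 300 kN adhesion limit, 4 MW locomotive,
# accelerating to 22 m/s (~80 km/h) over a 5 km section.
t_total, v_final = simulate_run(400e3, 300e3, 4e6, 22.0, 5000.0)
```

The real model would additionally read gradients and speed limits from the infrastructure data and use measured locomotive traction curves, which is what the comparison against OpenTrack validates.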
The capacity of a railway station is highly dependent on the interlocking system used and on the traffic management pattern. Improving an interlocking system requires significant financial resources, while the traffic pattern can be changed by applying different management rules, depending on transport demand. This means that it is desirable to determine a certain trade-off between these parameters, up to the maximum capacity utilization index achievable with the existing interlocking system. Several methods have been developed that consider these parameters for station capacity determination. In this paper, we compare some of the methods for capacity determination, using the example of the Kalenic station, which was recently taken over by TENT from another industrial system.
Background: The incidence of heart failure (HF) following acute coronary syndrome (ACS) remains unacceptably high at discharge, and several identified risk factors contribute to the development of HF in this context. Objective: This study investigated the prevalence and clinical significance of HF in patients admitted to the Clinic for Heart, Blood Vessels, and Rheumatic Diseases at the Clinical Center of the University of Sarajevo following ACS. Methods: This retrospective observational study was conducted at the Clinic for Heart, Blood Vessels, and Rheumatic Diseases of the Clinical Center of the University of Sarajevo between February 1st and April 1st, 2023, involving patients who were admitted because of ACS. Results: Patients with HF with reduced ejection fraction (HFrEF) were significantly (p=0.034) older (70.0 (62.0; 76.0) vs. 67.0 (57.5; 75.0)), had higher median LDH (p=0.046) (321.5 (222.3; 501.5) vs. 256.0 (200.0; 420.0)), fibrinogen (p=0.047) (4.5 (3.2; 5.1) vs. 3.6 (2.8; 5.0)), and NT-proBNP (p<0.001) (3705.0 (2500.0; 12559.5) vs. 500.0 (275.0; 333.0)), had an enlarged left atrium diameter (3.9 (3.4; 4.4) vs. 3.6 (3.1; 4.1)), an enlarged left ventricular diameter both in diastole (5.1 (4.5; 5.8) vs. 4.6 (4.1; 5.1)) and systole (3.7 (3.2; 4.1) vs. 3.5 (3.1; 3.7)), a thinner interventricular septum both in diastole (1.1 (1.0; 1.2) vs. 1.2 (1.1; 1.3)) and systole (1.3 (1.2; 1.5) vs. 1.4 (1.3; 1.5)), and elevated right ventricular systolic pressure (37.0 (30.0; 47.5) vs. 35.0 (28.0; 40.0)) compared to patients without HFrEF. Severe mitral regurgitation was observed more frequently in the group of patients with HFrEF (p<0.001). Conclusion: HFrEF occurred in 40% of patients following ACS; these patients had elevated LDH, fibrinogen, and NT-proBNP levels, along with distinct echocardiographic differences, including enlarged heart chambers and higher mitral regurgitation rates. Early management of HF risk factors is crucial for optimizing outcomes in ACS patients.
Background/Aim: To evaluate the effect of curing light parameters (intensity, duration, and distance of the curing tip) on the depth of cure of a conventional resin-based composite. Material and methods: Cylindrical specimens made of a nanohybrid resin-based composite were cured with 12 different curing protocols, combining 3 light intensities (300, 650, and 1100 mW/cm²), 2 curing-tip distances (0 and 8 mm), and 2 exposure times (20 and 40 seconds). The specimens were measured after scraping off the uncured composite material according to the ISO 4049 standard. The depth of cure was calculated by dividing the length of the remaining composite by 2. Data were analyzed using Levene's test and multivariate analysis of variance (MANOVA). The level of significance was set at P<0.05. Results: The highest depth of cure (3.332 mm) was observed for the curing protocol 1100 mW/cm²/0 mm/40 s. The lowest depth of cure (2.034 mm) was observed for specimens cured with the protocol 300 mW/cm²/8 mm/20 s. MANOVA showed a significant influence of the curing-tip distance (P=0.014; P=0.001) regardless of light intensity and exposure time. Exposure time was a significant factor (P=0.009) when curing from different distances. Although higher light intensity produced a greater depth of cure, light intensity was not a significant factor. Conclusions: The depth of cure can be increased by reducing the distance of the curing tip, where possible. At a distance of 8 mm, the depth of cure can be increased by a longer exposure time, regardless of curing light intensity.
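The ISO 4049 calculation described above is simple: the remaining cured length after scraping is halved. A small sketch of that computation, using the two extreme values reported in this abstract (the remaining-length inputs shown are back-calculated for illustration only):

```python
def depth_of_cure(remaining_length_mm):
    """ISO 4049 scrape test: depth of cure = remaining cured length / 2."""
    return remaining_length_mm / 2.0

# Reported extremes from the study (depth of cure, mm):
protocols = {
    "1100 mW/cm2 / 0 mm / 40 s": 3.332,   # highest
    "300 mW/cm2 / 8 mm / 20 s": 2.034,    # lowest
}
best = max(protocols, key=protocols.get)
```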
The article presents an approach to the automatic derivation of conceptual database models from heterogeneous source artifacts. The approach is based on integrating conceptual database models that existing tools derive from source artifacts of a single type; those models possess limited certainty owing to their limited completeness and correctness. The uncertainty of the automatically derived models is expressed and managed through the effectiveness measure of the generation of specific concepts in the input conceptual database models. The approach is implemented in the DBomnia tool, the first online web-based tool enabling automatic derivation of conceptual database models from heterogeneous source artifacts (business process models and textual specifications). DBomnia employs pre-existing tools to derive conceptual models from sources of the same type and then integrates those models. A case-study-based evaluation shows that the implemented approach enables effective automatic derivation of a conceptual database model from a set of heterogeneous source artifacts, and that this derivation is more effective than each independent automatic derivation from sources of a single type.