Autonomic nervous system (ANS) dysfunction has emerged as a central feature of post-infectious syndromes, including post-COVID syndrome (PCS), chronic fatigue syndrome (CFS), and late-stage Lyme disease. This cross-sectional study included 1036 patients evaluated in the Neurocardiological Laboratory of the Institute for Cardiovascular Diseases “Dedinje,” divided into four groups: PCS, CFS after COVID-19, CFS of insidious onset, and Lyme disease. All patients underwent head-up tilt testing (HUTT), and serological testing was performed in accredited institutions. The Lyme disease group demonstrated the highest prevalence of positive HUTT responses and a significantly greater frequency of orthostatic hypotension and syncope. Approximately 50–65% of patients in the PCS and Lyme groups were positive for IgM antibodies against at least one microorganism, with more than 10% showing positivity for three or more pathogens. Logistic regression analysis revealed that, beyond classical hemodynamic parameters, antibody status served as a significant predictor of HUTT outcomes, with specific associations identified for HSV-1, HHV-6, Coxiella burnetii, Toxoplasma gondii, and Borrelia spp. Multinomial regression further indicated that negative IgG antibodies, particularly to HSV-1 and VZV, predicted Lyme disease group membership. These findings support the hypothesis that ANS dysfunction in post-infectious syndromes may be driven by persistent or prior infections, highlighting the need for integrative diagnostic approaches.
Background: Testicular torsion scoring systems, based on a combination of clinical and imaging factors, have been developed to improve the diagnostic accuracy of testicular torsion in patients presenting with acute scrotum. This study aimed to validate and compare two current testicular torsion scores, the Boettcher Alert Score (BAL) and the Testicular Workup for Ischemia and Suspected Torsion (TWIST), in a retrospective cohort of pediatric patients with acute scrotum. Methods: We conducted a retrospective study of all pediatric patients admitted to our institution for acute scrotum between January 2010 and December 2022. Patients were categorized into the testicular torsion (TT) group and the non-testicular torsion (NTT) group. Collected data were used to calculate the scoring systems and perform statistical analyses. Results: A total of 241 patients were included, of whom 80 (33.2%) had testicular torsion. The mean age in the TT group was 13 years. The optimal individual cut-off value for the BAL score was >1 (sensitivity 90%, specificity 80.75%), and for the TWIST score >4 (sensitivity 82.5%, specificity 80.75%). A high-risk TWIST score >5 had a specificity of 80.75% and a negative predictive value (NPV) of 90.28%, while a BAL score of 4 showed a specificity of 98.48% and NPV of 94.2%. The area under the ROC curve was slightly higher for the BAL score (0.917; 95% CI, 0.875–0.949) than for the TWIST score (0.897; 95% CI, 0.851–0.932). The difference between the two scores was not statistically significant. Conclusion: The TWIST and BAL clinical scores have significant diagnostic value and may assist in the evaluation of testicular torsion in children. Both scores could be incorporated into a standardized approach for assessing pediatric acute scrotum, potentially reducing time to definitive diagnosis and minimizing ischemia duration.
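For orientation, the TWIST score discussed above is a simple weighted sum of five clinical findings. A minimal sketch, using the weights of the published TWIST system (swelling and hard testis weighted 2; absent cremasteric reflex, nausea/vomiting, and high-riding testis weighted 1; maximum 7) rather than anything specific to this study:

```python
def twist_score(swelling, hard_testis, absent_cremasteric_reflex,
                nausea_or_vomiting, high_riding_testis):
    """TWIST score (0-7) from its five clinical items.

    Weights follow the published TWIST scoring system:
    testicular swelling = 2, hard testis = 2, absent cremasteric
    reflex = 1, nausea/vomiting = 1, high-riding testis = 1.
    """
    return (2 * swelling + 2 * hard_testis + absent_cremasteric_reflex
            + nausea_or_vomiting + high_riding_testis)

# A patient with swelling, a hard testis, and nausea scores 5,
# above the study's optimal cut-off of >4.
print(twist_score(True, True, False, True, False))  # 5
```

Against the cut-offs reported here, a score >4 would flag suspected torsion and >5 would fall in the high-risk band.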
We investigate the neutrino sector in the framework of flavor deconstruction with an inverse-seesaw realization. This setup naturally links the hierarchical charged-fermion masses to the anarchic pattern of light-neutrino mixing. We determine the viable parameter space consistent with oscillation data and study the phenomenology of heavy neutral leptons (HNL) and lepton-flavor-violating (LFV) processes. Current bounds from direct HNL searches and LFV decays constrain the right-handed neutrino scale to a few TeV, while future $\mu \to e$ experiments will probe most of the region with $\Lambda \lesssim 10~\text{TeV}$. Among possible realizations, models deconstructing $\mathrm{SU}(2)_\mathrm{L} \times \mathrm{U}(1)_\mathrm{B-L}$ or $\mathrm{SU}(2)_\mathrm{L} \times \mathrm{U}(1)_\mathrm{R} \times \mathrm{U}(1)_\mathrm{B-L}$ are those allowing the lowest deconstruction scale.
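For reference, the inverse-seesaw mechanism invoked above gives the light-neutrino mass matrix in the standard schematic form (conventions differ between papers; this is the textbook expression, not necessarily the exact one used in this work):

```latex
m_\nu \;\simeq\; m_D \,\big(M_R^{T}\big)^{-1} \mu \, M_R^{-1}\, m_D^{T},
```

where $m_D$ is the Dirac mass matrix, $M_R$ the Dirac-type mass coupling the right-handed neutrinos to the singlet fermions, and $\mu$ a small lepton-number-violating Majorana mass. Because the light masses are suppressed by the small $\mu$ rather than by a very large $M_R$, the heavy-neutral-lepton scale can sit at a few TeV with sizable Yukawa couplings, which is what makes the HNL and LFV phenomenology discussed here experimentally accessible.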
Zeolites are particularly suitable adsorbents due to their pronounced ion-exchange capacity, high efficiency, stability, and the ability to be regenerated and reused multiple times. Their characteristic crystalline structure enables the exchange of sodium, potassium, calcium, and magnesium ions with heavy metal cations present in solution. For the successful application of zeolites under industrial conditions, a detailed understanding of the adsorption mechanisms and kinetics is essential, as it allows for process optimization and identification of key limiting factors. Experimental approaches typically involve varying the adsorbent mass and the initial concentration of the adsorbate in order to determine the optimal conditions for achieving maximum adsorption efficiency. A moisture content of 3.95% and ash content of 91.28% indicate high thermal and structural stability of the zeolite, while the presence of Na⁺ ions (0.2435 mmol g⁻¹) in the material confirms that cation exchange is the dominant mechanism. Adsorption of heavy metals was investigated in a batch reactor at initial concentrations of 10, 50, and 100 mg/L, at a constant temperature of 298 K, with stirring at 200 rpm for 60 minutes. The amount of adsorbed ions was found to increase with rising equilibrium concentrations in the solution. Metal ion concentrations were determined using atomic absorption spectrophotometry. The highest adsorption was observed for Cu(II) ions within 5 minutes, while Cr(III) and Ni(II) ions reached their maximum adsorption within 20 minutes. The experimental data fit best to the Langmuir isotherm model, and the adsorption efficiency followed the order: Cu(II) > Cr(III) > Ni(II).
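The Langmuir fit reported above can be reproduced from equilibrium data via the standard linearized form $C_e/q_e = C_e/q_{max} + 1/(K_L\,q_{max})$. A minimal sketch with synthetic (hypothetical) data, not the study's measurements:

```python
import numpy as np

def langmuir(ce, q_max, k_l):
    """Langmuir isotherm: qe = q_max * K_L * Ce / (1 + K_L * Ce)."""
    return q_max * k_l * ce / (1.0 + k_l * ce)

def fit_langmuir_linearized(ce, qe):
    """Fit (q_max, K_L) from the linearized form
    Ce/qe = Ce/q_max + 1/(K_L * q_max), a straight line in Ce."""
    slope, intercept = np.polyfit(ce, ce / qe, 1)
    q_max = 1.0 / slope
    k_l = slope / intercept
    return q_max, k_l

# Illustrative synthetic equilibrium points (mg/L); parameters assumed.
ce = np.array([1.2, 8.5, 31.0])
qe = langmuir(ce, q_max=25.0, k_l=0.3)
print(fit_langmuir_linearized(ce, qe))  # recovers (25.0, 0.3)
```

A good straight-line fit of $C_e/q_e$ against $C_e$ is the usual diagnostic that the Langmuir model describes the data, consistent with the conclusion drawn here.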
This study examines the impact of an unintended fire at the Drava International plastic processing facility near Osijek, Croatia, on soil quality and the potential human health risks associated with agricultural soils within 10 and 20 km radii. Surface soil samples (0–5 cm) were collected from ten locations within 10 km three days after the incident, and eight composite samples were taken from sites 10–20 km away 17 days post-event. Additionally, 18 control samples previously collected for soil fertility or quality monitoring were included for comparative analysis. In total, 36 agricultural soil samples were analyzed for pH, organic matter, total phosphorus, potassium, calcium, magnesium, and trace elements (Cr, Co, Ni, Cu, Zn, As, Pb). Eighteen post-fire samples were also analyzed for polycyclic aromatic hydrocarbons (PAHs), dioxins, and perfluoroalkyl substances (PFAS). Ecological risk was assessed using the pollution load index (PLI) and enrichment factor (EF), while human health risk was evaluated through the estimation of incremental lifetime cancer risk (ILCR) and individual carcinogenic risks (CRi) for As, Cr, Ni, and Pb. Results showed that concentrations of dioxins (TEQ LB and UB), dioxin-like PCBs, and non-dioxin-like PCBs in samples within 10 km were either below detection limits or present in trace amounts (4.0 × 10⁻⁶ mg/kg). PFAS compounds were not detected (<0.0005 mg/kg). The total concentration of non-dioxin-like PCBs ranged from 0.0023 to 0.0047 mg/kg, well below the maximum permissible levels. Post-fire contamination profiles revealed consistently higher PAH concentrations in the 0–10 km zone (mean 0.100 mg/kg) compared to the 10–20 km zone (mean 0.062 mg/kg). Twenty PLI values exceeded the threshold of 1 (range: 1.00–1.26), indicating moderate pollution, while the remaining values (PLI 0.82–0.99) suggested no pollution.
EF values indicated minimal to moderate enrichment (EF < 2), supporting the conclusion that metal presence was predominantly geological with limited anthropogenic influence. All ILCR values for adults and children remained below the acceptable threshold of 1 × 10⁻⁴, indicating low carcinogenic risk under both pre- and post-fire conditions. CRi values followed a consistent decreasing trend across exposure pathways: ingestion > dermal absorption > inhalation.
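The ILCR screening above follows the usual two-step calculation: a chronic daily intake (CDI) per exposure pathway, multiplied by the pathway's cancer slope factor. A minimal sketch of the soil-ingestion pathway using typical USEPA-style residential defaults (all parameter values below are illustrative assumptions, not this study's inputs):

```python
def cdi_ingestion(c_soil, ir=100.0, ef=350.0, ed=30.0, bw=70.0,
                  at=25550.0, cf=1e-6):
    """Chronic daily intake via soil ingestion (mg/kg/day):
    CDI = C * IR * EF * ED * CF / (BW * AT)
    c_soil mg/kg, IR ingestion rate mg/day, EF days/yr, ED yr,
    BW body weight kg, AT averaging time days, CF kg/mg conversion.
    Defaults are generic adult residential assumptions."""
    return c_soil * ir * ef * ed * cf / (bw * at)

def ilcr(cdi, slope_factor):
    """Incremental lifetime cancer risk = CDI * cancer slope factor."""
    return cdi * slope_factor

# Illustrative: As at 10 mg/kg with an assumed oral slope factor
# of 1.5 (mg/kg/day)^-1 stays well below the 1e-4 threshold.
risk = ilcr(cdi_ingestion(10.0), 1.5)
print(risk < 1e-4)  # True
```

Dermal and inhalation CDIs use analogous formulas with pathway-specific exposure factors, which is why the three pathways can be ranked as reported.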
Background: Artificial intelligence (AI), the overarching field that includes machine learning (ML) and its subfield deep learning (DL), is rapidly transforming clinical research by enabling the analysis of high-dimensional data and automating the output of diagnostic and prognostic tests. As clinical trials become increasingly complex and costly, ML-based approaches (especially DL for image and signal data) offer promising solutions, although they require new approaches in clinical education. Objective: Explore current and emerging AI applications in oncology and cardiology, highlight real-world use cases, and discuss the challenges and future directions for responsible AI adoption. Methods: This narrative review summarizes various aspects of AI technology in clinical research, exploring its promise, use cases, and limitations. The review was based on a literature search in PubMed covering publications from 2019 to 2025. Search terms included “artificial intelligence”, “machine learning”, “deep learning”, “oncology”, “cardiology”, “digital twin”, and “AI-ECG”. Preference was given to studies presenting validated or clinically applicable AI tools, while non-English articles, conference abstracts, and gray literature were excluded. Results: AI demonstrates significant potential in improving diagnostic accuracy, facilitating biomarker discovery, and detecting disease at an early stage. In clinical trials, AI improves patient stratification, site selection, and virtual simulations via digital twins. However, challenges remain in data harmonization, model validation, cross-disciplinary training, fairness, explainability, and the robustness of the gold standards against which AI models are built. Conclusions: The integration of AI in clinical research can enhance efficiency, reduce costs, and facilitate clinical research as well as lead the way towards personalized medicine.
Realizing this potential requires robust validation frameworks, transparent model interpretability, and collaborative efforts among clinicians, data scientists, and regulators. Interoperable data systems and cross-disciplinary education will be critical to enabling the integration of scalable, ethical, and trustworthy AI into healthcare.
Understanding meat categorization is a fundamental component of veterinary education, especially within the context of food hygiene and public health. Veterinary students must grasp legal classifications of meat, which depend on variables such as species, age, quality, and processing techniques. This knowledge is essential for accurate meat inspection, labeling, and compliance with both national and international food safety standards. Despite prior exposure to muscle anatomy in anatomy courses, students often face challenges in applying this knowledge to practical meat classification tasks. This study aimed to assess the effectiveness of three distinct instructional methods in improving veterinary students’ ability to identify meat categories and associated muscle structures: traditional classroom teaching, computer-based instruction using 3D models, and immersive virtual reality (VR). Participants included fourth-year veterinary students during the summer semester of the 2024/2025 academic year. To facilitate digital learning, a dedicated 3D model library, “3DMeat”, was developed, along with a virtual reality environment. Results indicate that technology-enhanced instructional approaches can significantly enhance student engagement and understanding of complex topics such as meat categorization. Initial test scores were highest in the group using 3D models (16.3 ± 4.1), followed by the traditional lecture-based group (15.6 ± 3.07), and the VR group (11.7 ± 5.1). However, a follow-up assessment conducted 2 weeks later revealed that the VR group demonstrated the highest retention of knowledge. These findings suggest that although immediate performance may vary, immersive learning environments such as VR can foster stronger medium-term retention of complex material.
Background: Breast cancer remains the most common cancer in women worldwide. Treatment has evolved into multimodal approaches, with pathologic complete response (pCR) after neoadjuvant chemotherapy (NAC) serving as a key prognostic marker. The aim of this study was to evaluate the value of inflammatory markers in predicting pCR to NAC in breast cancer. Methods: This cross-sectional study of 74 patients with breast cancer who underwent NAC followed by surgery included demographic, tumor, and immune-inflammatory marker data. Receiver operating characteristic curve analysis and the Youden index were used to determine optimal cutoff values. Univariate and multivariate logistic regression assessed associations between markers and pCR, adjusting for tumor stage, human epidermal growth factor receptor 2 (HER2), and estrogen receptor (ER) status. Results: Our multivariate analysis identified the pan-immune-inflammation value (PIV), HER2 status, and ER status as significant independent predictors of pCR. PIV (OR, 4.28; 95% CI, 1.59–16.88) remained significant among inflammatory markers, while the neutrophil-to-lymphocyte ratio (NLR), monocyte-to-lymphocyte ratio (MLR), and platelet-to-lymphocyte ratio (PLR) did not. HER2-positive (OR, 7.45; 95% CI, 2.30–24.15) and hormone receptor (HR)–negative (OR, 7.02; 95% CI, 2.63–18.70) statuses were also strongly associated with pCR. Conclusion: PIV is a robust predictor of pCR in patients with breast cancer receiving NAC, offering a comprehensive reflection of the immune-inflammatory state. Incorporating PIV with tumor-specific markers (e.g., receptor status, Ki-67, grade) may enhance treatment stratification. Further validation in diverse cohorts is warranted.
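The composite markers compared above are all simple ratios of routine blood-count values. A minimal sketch using the standard published definitions (PIV = neutrophils × platelets × monocytes / lymphocytes; the counts below are hypothetical):

```python
def inflammatory_markers(neutrophils, lymphocytes, monocytes, platelets):
    """PIV, NLR, MLR, PLR from absolute counts (e.g. 10^9/L).

    Standard definitions:
    PIV = neutrophils * platelets * monocytes / lymphocytes
    NLR = neutrophils / lymphocytes, MLR = monocytes / lymphocytes,
    PLR = platelets / lymphocytes.
    """
    return {
        "PIV": neutrophils * platelets * monocytes / lymphocytes,
        "NLR": neutrophils / lymphocytes,
        "MLR": monocytes / lymphocytes,
        "PLR": platelets / lymphocytes,
    }

m = inflammatory_markers(neutrophils=4.0, lymphocytes=2.0,
                         monocytes=0.5, platelets=250.0)
print(m)  # {'PIV': 250.0, 'NLR': 2.0, 'MLR': 0.25, 'PLR': 125.0}
```

Because PIV multiplies three myeloid lineages against the lymphocyte count, it aggregates more of the immune-inflammatory state than any single ratio, which is consistent with it outperforming NLR, MLR, and PLR here.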
Deformable medical image registration is a fundamental task in medical image analysis. While deep learning-based methods have demonstrated superior accuracy and computational efficiency compared to traditional techniques, they often overlook the critical role of regularization in ensuring robustness and anatomical plausibility. We propose DARE (Deformable Adaptive Regularization Estimator), a novel registration framework that dynamically adjusts elastic regularization based on the gradient norm of the deformation field. Our approach integrates strain and shear energy terms, which are adaptively modulated to balance stability and flexibility. To ensure physically realistic transformations, DARE includes a folding-prevention mechanism that penalizes regions with negative deformation Jacobian. This strategy mitigates non-physical artifacts such as folding, avoids over-smoothing, and improves both registration accuracy and anatomical plausibility.
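The folding-prevention idea mentioned above can be illustrated in 2-D: for a transform φ(x) = x + u(x), the Jacobian determinant of I + ∇u should stay positive everywhere, and pixels where it goes negative are penalized. A hypothetical NumPy sketch of that penalty, not the authors' implementation:

```python
import numpy as np

def folding_penalty(disp):
    """Mean penalty over pixels with a negative deformation Jacobian.

    disp: displacement field u of shape (H, W, 2); the transform is
    phi(x) = x + u(x), so its Jacobian is I + grad(u). A 2-D sketch
    of the folding-prevention idea, not DARE's actual code.
    """
    # np.gradient returns derivatives along axis 0, then axis 1.
    du0_d0, du0_d1 = np.gradient(disp[..., 0])
    du1_d0, du1_d1 = np.gradient(disp[..., 1])
    # det(I + grad u) at every pixel.
    jac_det = (1.0 + du0_d0) * (1.0 + du1_d1) - du0_d1 * du1_d0
    # Only negative determinants (folded regions) contribute.
    return float(np.mean(np.clip(-jac_det, 0.0, None)))

# The identity transform (zero displacement) has det = 1 everywhere.
print(folding_penalty(np.zeros((8, 8, 2))))  # 0.0
```

In a learning-based registration network this term would be added to the similarity and elastic-energy losses, pushing the optimizer away from non-invertible deformations.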
This study examines job performance among judo referees through the lens of personality traits during World Judo Tour events from 2018 to 2022. Sixty-three referees completed an online questionnaire including the Big Five Inventory (BFI) and the Conditions for Work Effectiveness Questionnaire (CWEQ-II). Data were analyzed using descriptive statistics, correlation analysis, and structural equation modeling (SEM). The measurement model showed acceptable validity and reliability, confirming the structural model. Support and resources emerged as the most influential factors affecting job satisfaction (JAS) and organizational role satisfaction (ORS). Incorporating refereeing experience at major events into the model indicated only partial model fit. Findings highlight the role of structural empowerment in mitigating job dissatisfaction among referees. Future research with larger samples should further strengthen the understanding of the relationship between personality traits, empowerment, and job performance.
Unmanned aircraft are increasingly recognized for their potential to enhance healthcare logistics, offering rapid and reliable transport solutions. Among the many envisioned use cases, emergency medical deliveries stand out as particularly promising due to their immediate societal value. This study investigates the potential of drones operating under U-space to support hospital-to-hospital emergency deliveries in Madrid. Using the GEMMA tool, we modeled and simulated operations with two drone types along direct routes between four hospitals, resulting in six hospital pairs. Drone travel times were estimated and compared against road transport times obtained from the Google Routes API, incorporating one week of traffic data to capture daily and weekend variability. The results show substantial advantages of aerial transport, with time savings ranging from 2 to 26 min, equivalent to 35–58% compared to road transport. Drones consistently ensured deliveries within 15 min, outperforming regular cars (39%) and ambulances or motorcycles in highly congested periods. Sensitivity analysis confirms their reliability in scenarios with strict time constraints, especially under 15 min. These findings demonstrate that drones reduce travel times and improve predictability, providing a robust evidence base for policymakers and regulators to advance U-space integration in healthcare logistics.