Lumbar disc herniation (LDH) often results in significant pain and disability, and histopathologic evaluation of intervertebral discs offers critical insights into treatment outcomes. This prospective observational study explored histopathologic (HP) changes in intervertebral discs and their association with clinical outcomes following surgical treatment for LDH. A cohort of 141 patients undergoing surgery for magnetic resonance imaging (MRI)-confirmed LDH was evaluated histopathologically using a semi-quantitative Histopathologic Degeneration Score (HDS). Clinical assessment, performed preoperatively and at six-month follow-up, included the Oswestry Disability Index (ODI) and the Visual Analog Scale (VAS), from which minimal clinically important differences (MCIDs) were calculated. Higher HDS was significantly associated with adverse clinical outcomes, including persistent pain and greater disability after surgery. Specifically, HDS ≥ 7 was predictive (OR = 6.25, 95% CI: 2.56-15.23) of disability measured with MCID-ODI (AUC: 0.692, 95% CI: 0.609-0.767, P < 0.001), and HDS ≥ 8 was predictive (OR = 1.72, 95% CI: 1.04-2.77) of persistent pain measured with MCID-VAS (AUC: 0.628, 95% CI: 0.598-0.737, P = 0.008), highlighting the diagnostic potential of the HDS in assessing postoperative recovery. This study underscores the potential of HP evaluation using the HDS to provide valuable insight into disease progression and outcomes in LDH patients, complementing conventional radiologic methods. The findings support personalized treatment strategies based on HP findings, while acknowledging challenges in interpretation and clinical implementation.
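The odds ratios reported above (e.g., OR = 6.25 for HDS ≥ 7 vs. MCID-ODI) are derived from 2×2 contingency tables of threshold status against outcome. As a minimal sketch of that arithmetic with a Wald-type confidence interval, using hypothetical counts (the study's actual tables are not shown in the abstract):

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio for a 2x2 table with a Wald 95% confidence interval.

    a: exposed (e.g., HDS >= 7) with the outcome, b: exposed without it,
    c: unexposed with the outcome,                d: unexposed without it.
    Counts here are illustrative only, not taken from the study.
    """
    or_ = (a * d) / (b * c)
    # Standard error of log(OR) from the four cell counts.
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lower = math.exp(math.log(or_) - z * se)
    upper = math.exp(math.log(or_) + z * se)
    return or_, lower, upper

# Hypothetical example: 20/10 outcome split above threshold, 8/25 below.
or_, lower, upper = odds_ratio_ci(20, 10, 8, 25)  # OR = 6.25
```

Wide intervals such as 2.56-15.23 typically reflect small cell counts in one or more cells of the table.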
Simple Summary This study explores hypoxia-inducible factors (HIFs) in glioblastoma development, progression, and treatment. Reviewing 104 relevant studies, it highlights diverse global contributions, with China leading at 23.1%. The most productive year was 2019, contributing 11.5% of the studies. Key factors studied included HIF1α, HIF2α, osteopontin, and caveolin-1, involving pathways such as GLUT1, GLUT3, VEGF, PI3K-Akt-mTOR, and ROS. HIF expression correlates with glioblastoma progression, survival, neovascularization, glucose metabolism, migration, and invasion. Overcoming treatment resistance and the lack of biomarkers is crucial for integrating HIF-related therapies into glioblastoma treatment to improve patient outcomes. Abstract Background: This study aims to investigate the role of hypoxia-inducible factors (HIFs) in the development, progression, and therapeutic potential of glioblastomas. Methodology: Following PRISMA guidelines, the study systematically examined hypoxia and HIFs in glioblastoma using MEDLINE (PubMed), Web of Science, and Scopus. A total of 104 relevant studies underwent data extraction. Results: Among the 104 studies, global contributions were diverse, with China leading at 23.1%. The most productive year was 2019, accounting for 11.5% of the studies. Hypoxia-inducible factor 1 alpha (HIF1α) was the most frequently studied factor, followed by hypoxia-inducible factor 2 alpha (HIF2α), osteopontin, and caveolin-1. Commonly associated factors and pathways included the glucose transporter 1 (GLUT1) and glucose transporter 3 (GLUT3) receptors, vascular endothelial growth factor (VEGF), the phosphoinositide 3-kinase (PI3K)-Akt-mechanistic target of rapamycin (mTOR) pathway, and reactive oxygen species (ROS). HIF expression correlates with various glioblastoma hallmarks, including progression, survival, neovascularization, glucose metabolism, migration, and invasion.
Conclusion: Overcoming challenges such as treatment resistance and the absence of biomarkers is critical for the effective integration of HIF-related therapies into the treatment of glioblastoma with the aim of optimizing patient outcomes.
Introduction: Industrialization and urbanization have led to a significant increase in lead in the environment. Lead inhibits the activity of numerous enzymes, triggers oxidative stress, and causes dysregulation of protein biosynthesis. Inhalation of lead particles is the most common route of intoxication associated with occupational exposure. This study aims to evaluate laboratory methods and biomarkers in the assessment of lead exposure. Methods: For this non-experimental qualitative research, available scientific articles in English published in the relevant databases (MEDLINE and ScienceDirect) were used. The database search was performed using the keywords "laboratory diagnostics", "occupational exposure", and "lead". Results: Atomic absorption spectrometry (AAS) is the gold standard in laboratory monitoring of occupational lead exposure. Inductively coupled plasma mass spectrometry is another commonly used method, described as more sensitive than AAS owing to its lower detection limit. Lead concentrations can be determined in various samples, but blood and urine are the most commonly used in laboratory practice. The most important exposure biomarker is the enzyme δ-aminolevulinic acid dehydratase (ALAD) in the blood, which is progressively inactivated by lead and correlates negatively with its concentration. The concentration of urinary δ-aminolevulinic acid (δ-ALA-U) reflects the state of impaired enzyme function in heme biosynthesis. In addition, determining blood zinc protoporphyrin and urinary coproporphyrin levels significantly aids in assessing occupational lead exposure disorders. Conclusion: The availability of the laboratory methods used, together with biomarker specificity and sensitivity, plays an important role in the adequacy of lead exposure monitoring. Accurate determination of ALAD and δ-ALA-U concentrations, along with other biomarkers, is critical for assessing individuals exposed to lead.
Introduction: Hypothyroidism is a common disorder of the endocrine system caused by insufficient biologically active hormones at the tissue level or by the inability of the tissue to utilize thyroid hormones. Iron plays a crucial role in the synthesis and metabolism of thyroid hormones, and it is stored in the body as ferritin. We aimed to evaluate the correlation between serum ferritin (SF) levels and thyroid hormone panel levels in both hypothyroid and euthyroid subjects. Methods: In 2022, a matched case–control study was conducted. The study involved participants with hypothyroidism and a control group (n = 53). The levels of thyroid-stimulating hormone (TSH), free triiodothyronine (fT3), free thyroxine (fT4), and SF were measured using chemiluminescence immunoassay on a Mindray CL-900i analyzer (Shenzhen Mindray Bio-Medical Electronics Co., China). Results: The hypothyroid group had TSH levels that were significantly higher (10.76 [8.54-18.76] vs. 1.76 [1.26-2.58]; p < 0.001) and SF concentrations that were significantly lower (39.08 [21.15-45.70] vs. 54.09 [41.41-71.82]; p < 0.001) compared to the control group. In both male and female subjects of the hypothyroid group, a strong negative correlation was found between SF concentration and TSH levels (Rho = −0.855, p < 0.01 and Rho = −0.747, p < 0.01, respectively). In female subjects of the hypothyroid group, a weak positive correlation was found between SF concentration and fT3 (Rho = 0.488; p < 0.05). In the euthyroid group, a correlation of the same strength and direction was found for fT4 (Rho = 0.366; p < 0.05). Conclusion: The results indicate a correlation between lower SF concentrations and hypothyroidism, which is of particular importance for understanding the etiopathogenesis, diagnosis, monitoring, and treatment of patients with hypothyroidism.
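The Rho values above are Spearman rank correlations, i.e., the Pearson correlation computed on the ranks of the paired measurements (SF vs. TSH, fT3, or fT4). As a minimal self-contained sketch of that calculation (illustrative only; the study's raw data are not available here):

```python
import math

def rank(values):
    """1-based ranks, assigning tied values their average rank."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    ranks = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        # Extend j over a run of tied values.
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1  # average of 1-based positions i..j
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    return ranks

def spearman_rho(x, y):
    """Spearman's rho: Pearson correlation of the rank vectors."""
    rx, ry = rank(x), rank(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = math.sqrt(sum((a - mx) ** 2 for a in rx))
    sy = math.sqrt(sum((b - my) ** 2 for b in ry))
    return cov / (sx * sy)
```

Because it operates on ranks, Spearman's rho captures any monotone association (not just linear ones), which suits skewed laboratory values such as TSH; a perfectly monotone decreasing relationship yields rho = −1.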
Background: Coronavirus disease 2019 (COVID-19) has a wide clinical spectrum, ranging from asymptomatic infection to severe disease with a high mortality rate. In view of the current pandemic and the increasing influx of patients into healthcare facilities, there is a need for simple and reliable tools for stratifying patients. Objective: The study aimed to analyze whether hemogram-derived ratios (HDRs) can be used to identify patients at risk of developing a severe clinical form and requiring hospital admission. Methods: This cross-sectional observational study included 500 patients with a confirmed diagnosis of COVID-19. Data on clinical features and laboratory parameters were collected from medical records, and 13 HDRs were calculated and analyzed. Descriptive and inferential statistics were included in the analysis. Results: Of the 500 patients, 43.8% had a severe form of the disease. Lymphocytopenia, monocytopenia, and higher C-reactive protein (CRP) and erythrocyte sedimentation rate (ESR) were found in severe patients (p < 0.05). Significantly higher neutrophil-to-lymphocyte ratio (NLR), derived NLR (dNLR), neutrophil-to-platelet ratio (NPR), neutrophil-to-lymphocyte-to-platelet ratio (NLPR), and CRP-to-lymphocyte ratio (CRP/Ly) values were found in severe patients (p < 0.001). All five ratios also showed statistically significant prognostic potential (p < 0.001). The areas under the curve (AUC) for CRP/Ly, dNLR, NLPR, NLR, and NPR were 0.693, 0.619, 0.619, 0.616, and 0.603, respectively. The sensitivity and specificity were 65.7% and 65.6% for CRP/Ly, 51.6% and 70.8% for dNLR, 61.6% and 57.3% for NLPR, 40.6% and 80.4% for NLR, and 48.8% and 69.1% for NPR. Conclusion: The results suggest that NLR, dNLR, CRP/Ly, NPR, and NLPR can be considered potentially useful markers for stratifying patients with a severe form of the disease.
HDRs derived from routine blood test results should be included in common laboratory practice, since they are readily available, easy to calculate, and inexpensive.
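To illustrate how easily these ratios follow from a routine complete blood count, the sketch below computes the five ratios highlighted above from absolute cell counts. These are commonly used definitions (the dNLR denominator is WBC minus neutrophils, and NLPR is taken here as neutrophils divided by the lymphocyte-platelet product); the study's exact formulas are not given in the abstract, so treat the definitions as assumptions:

```python
def hemogram_ratios(neutrophils, lymphocytes, wbc, platelets, crp):
    """Hemogram-derived ratios from absolute counts (x10^9/L) and CRP (mg/L).

    Definitions follow common usage and may differ from the study's:
      NLR    = neutrophils / lymphocytes
      dNLR   = neutrophils / (WBC - neutrophils)
      NPR    = neutrophils / platelets
      NLPR   = neutrophils / (lymphocytes * platelets)
      CRP/Ly = CRP / lymphocytes
    """
    return {
        "NLR": neutrophils / lymphocytes,
        "dNLR": neutrophils / (wbc - neutrophils),
        "NPR": neutrophils / platelets,
        "NLPR": neutrophils / (lymphocytes * platelets),
        "CRP/Ly": crp / lymphocytes,
    }

# Hypothetical patient: neutrophilia with lymphocytopenia and elevated CRP.
ratios = hemogram_ratios(neutrophils=6.0, lymphocytes=1.0,
                         wbc=8.0, platelets=200.0, crp=50.0)
```

No extra assays are needed beyond the blood count and CRP already collected, which is what makes these ratios attractive as inexpensive stratification tools.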
[This corrects the article DOI: 10.3389/fpubh.2022.795841.].