Sfakianakis Alexandros
Otorhinolaryngologist
Anapafseos 5, Agios Nikolaos
Crete 72100
00302841026182
00306932607174
alsfakia@gmail.com

Blog archive



Tuesday, 13 December 2022

TNFSF13B rs9514828 gene polymorphism and soluble B cell activating factor levels: association with apical periodontitis

alexandrossfakianakis shared this article with you from Inoreader

Abstract

Aim

The aim of this case-control study was to evaluate the association between the TNFSF13B rs9514828 (-871 C>T) polymorphism and soluble BAFF (sBAFF) in apical periodontitis (AP) patients.

Methodology

A total of 261 healthy subjects (HS) and 158 patients with AP were included; the patients were classified as acute apical abscess (AAA, n = 46), primary AP (pAP, n = 81) and secondary AP (sAP, n = 31). Genomic DNA (gDNA) was extracted from peripheral blood cells using the salting-out method. The TNFSF13B rs9514828 polymorphism (NC_000013.11:g.108269025C>T) was genotyped by polymerase chain reaction (PCR) followed by restriction fragment length polymorphism (RFLP) analysis. Serum sBAFF levels were measured by ELISA. The chi-square or Fisher's exact test was performed. Odds ratios (OR) and 95% confidence intervals (95% CI) were calculated to evaluate the risk of AP associated with rs9514828. The Mann–Whitney U test and Kruskal–Wallis analysis were used for non-normally distributed data. Differences were considered significant at P < 0.05.
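As a rough illustration of the risk estimate described above, an odds ratio with a Woolf-method 95% confidence interval can be computed from a 2×2 genotype-by-status table; the counts below are hypothetical placeholders, not the study's data:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio with a Woolf (log-scale) 95% CI from a 2x2 table:
    a = exposed cases, b = unexposed cases,
    c = exposed controls, d = unexposed controls."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)  # SE of ln(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# hypothetical genotype counts (not the study's data)
or_, lo, hi = odds_ratio_ci(20, 61, 30, 231)
```

A value above 1 whose confidence interval excludes 1 would, as in the pAP comparison above, indicate an elevated risk.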

Results

No differences in genotype/allele frequencies were found between HS and patients with AAA. However, the TT genotype (OR = 2.68, 95% CI: 1.10-6.53; P = 0.025) and the T allele (OR = 1.46, 95% CI: 1.00-2.12; P = 0.045) were associated with an increased risk of pAP. In contrast, the minor allele T significantly decreased the risk of sAP (OR = 0.49, 95% CI: 0.024-0.99; P = 0.043). sBAFF serum levels were increased in AAA and pAP patients compared with HS (P < 0.01 and P = 0.021, respectively). AAA patients had higher sBAFF serum levels than pAP (P = 0.034) and sAP (P < 0.01) patients.

Conclusions

These results suggest that the TNFSF13B rs9514828 (-871 C>T) polymorphism is associated with pAP susceptibility and that BAFF is a cytokine that might be involved in acute and chronic AP. Future exploration of the rs9514828 polymorphism in other AP cohorts is recommended.


Wide mismatches in the sequences of primers and probes for Monkeypox virus diagnostic assays


Abstract

Rapid and accurate diagnosis of infections is fundamental to containment of disease. Several monkeypox virus (MPV) real-time diagnostic assays have been recommended by the CDC; however, the specificity of the primers and probes in these assays for the ongoing MPV outbreak has not been investigated. We analyzed the primer and probe sequences in the CDC-recommended MPV generic real-time PCR assay by aligning them against 1,730 complete MPV genomes reported worldwide in 2022. Sequence mismatches were found in 99.08% and 97.46% of genomes for the MPV generic forward and reverse primers, respectively. Mismatch-corrected primers were synthesized and compared to the generic assay for MPV detection. The two primer-template mismatches resulted in an ~11-fold underestimation of initial template DNA in the reaction and a 4-fold increase in the 95% limit of detection (LOD). We further evaluated the specificity of seven other real-time PCR assays used for MPV and orthopoxvirus (OPV) detection and identified two assays with the highest matching score (>99.6%) against the global MPV genome database in 2022. Genetic variations in the primer-probe regions across MPV genomes could indicate the temporal and spatial emergence pattern of monkeypox disease. Our results show that the current generic MPV real-time assay may not be optimal for accurate MPV detection, and that the mismatch-corrected assay, with full complementarity between primers and current MPV genomes, could provide more sensitive and accurate detection of MPV.
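The mismatch screen described above can be sketched as a position-wise comparison of each primer against the corresponding region of each genome; the primer and target sequences below are hypothetical placeholders, not the actual CDC assay sequences:

```python
def count_mismatches(primer, target):
    """Count position-wise mismatches between a primer and an
    equal-length target region (indels not considered)."""
    assert len(primer) == len(target)
    return sum(p != t for p, t in zip(primer.upper(), target.upper()))

# hypothetical primer and genome target regions (not the CDC sequences)
primer = "GGAAAATGTAAAGACAACGAATACAG"
genomes = {
    "isolate_A": "GGAAAATGTAAAGACAACGAATACAG",  # perfect match
    "isolate_B": "GGAAAATGTAAAGACAATGAATACAG",  # one mismatch
}
mismatched = {name: count_mismatches(primer, seq) for name, seq in genomes.items()}
frac_with_mismatch = sum(m > 0 for m in mismatched.values()) / len(genomes)
```

As a side note on the reported magnitude: under ideal amplification efficiency, an ~11-fold underestimation of template corresponds to a quantification-cycle shift of about log2(11) ≈ 3.5 cycles.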

This article is protected by copyright. All rights reserved.


A Performance Comparison of Two (Electro)chemiluminescence Immunoassays for Detection and Quantitation of Serum Anti‐Spike Antibodies According to SARS‐CoV‐2 Vaccination and Infection Status


ABSTRACT

The information provided by SARS-CoV-2 Spike (S)-targeting immunoassays can be instrumental in clinical decision-making. We compared the performance of the Elecsys® Anti-SARS-CoV-2 S assay (Roche Diagnostics) and the LIAISON® SARS-CoV-2 TrimericS IgG assay (DiaSorin) using a total of 1,176 sera from 797 individuals, of which 286 were from vaccinated/SARS-CoV-2-experienced (Vac-Ex), 581 from vaccinated/naïve (Vac-N), 147 from unvaccinated/experienced (Unvac-Ex) and 162 from unvaccinated/naïve (Unvac-N) individuals. The Roche assay returned a higher number of positive results (907 vs. 790; P=0.45; overall sensitivity: 89.3% vs. 77.6%). The concordance between the results of the two immunoassays was higher for sera from Vac-N (kappa: 0.58; IQR, 0.50-0.65) than for sera from Vac-Ex (kappa: 0.19; IQR, -0.14-0.52) or Unvac-Ex (kappa: 0.18; IQR, 0.06-0.30). Discordant results occurred most frequently among sera from Unvac-Ex (34.7%), followed by Vac-N (14.6%) and Vac-Ex (2.7%). Antibody levels quantified by the two immunoassays were not significantly different when <250 (P=0.87) or <1,000 BAU/ml (P=0.13); in contrast, for sera ≥1,000 BAU/ml, the Roche assay returned significantly higher values than the DiaSorin assay (P<0.008). Neutralizing antibody titers (NtAb) were measured in 127 sera from Vac-Ex or Vac-N individuals using an S-pseudotyped virus neutralization assay against Wuhan-Hu-1, Omicron BA.1 and Omicron BA.2. The correlation between antibody levels and NtAb titers was higher for sera from Vac-N than from Vac-Ex, irrespective of the (sub)variant considered. In conclusion, neither the qualitative nor the quantitative results returned by the two immunoassays are interchangeable. The performance of both assays was greatly influenced by the vaccination and SARS-CoV-2 infection status of individuals.
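The concordance statistic used above is Cohen's kappa; for two qualitative (positive/negative) assay results it can be computed from a 2×2 agreement table. The counts below are hypothetical placeholders, not the study's data:

```python
def cohens_kappa(a, b, c, d):
    """Cohen's kappa for two binary raters from a 2x2 agreement table:
    a = both positive, b = assay1 pos / assay2 neg,
    c = assay1 neg / assay2 pos, d = both negative."""
    n = a + b + c + d
    p_obs = (a + d) / n                       # observed agreement
    p1_pos = (a + b) / n                      # assay 1 positivity rate
    p2_pos = (a + c) / n                      # assay 2 positivity rate
    p_exp = p1_pos * p2_pos + (1 - p1_pos) * (1 - p2_pos)  # chance agreement
    return (p_obs - p_exp) / (1 - p_exp)

# hypothetical counts (not the study's data)
kappa = cohens_kappa(80, 15, 5, 100)
```

Kappa is 1 for perfect agreement and 0 for agreement no better than chance, which is why the low values for Vac-Ex and Unvac-Ex sera indicate poor concordance despite both assays targeting the same antigen.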



Impact of High MELD Scores on CMV Viremia Following Liver Transplantation


ABSTRACT

Introduction

Advanced liver disease or cirrhosis is associated with an increased risk of infections; however, the impact of a high pretransplant MELD score on cytomegalovirus (CMV) viremia after liver transplantation is unknown.

Methods

This single-center, retrospective cohort study evaluated CMV high-risk (CMV IgG D+/R-) liver transplant recipients who received valganciclovir prophylaxis for 3 months between 2009 and 2019. Patients were stratified by pretransplant MELD score: < 35 (low MELD) and ≥ 35 (high MELD). The primary outcome was 12-month CMV viremia; secondary outcomes included CMV resistance and tissue-invasive disease, mortality, biopsy-proven acute rejection (BPAR), leukopenia, and thrombocytopenia. Multivariable Cox proportional-hazards modeling was used to assess the association of MELD score with time to CMV viremia.

Results

There were 162 and 79 patients in the low and high MELD groups, respectively. A pretransplant MELD score ≥ 35 was associated with an increased risk of CMV viremia (HR 1.73; CI 1.06 to 2.82, p = 0.03). CMV viremia occurred at 162 ± 61 days in the low MELD group and 139 ± 62 days in the high MELD group. Although BPAR occurred early, at 30 days (13-59) in the low MELD group and 18 days (11-66) in the high MELD group (p = 0.56), it was not associated with an increased risk of CMV viremia (HR 1.55 (0.93-2.60), p = 0.1).
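As a quick plausibility check, a reported hazard ratio and its confidence interval should be mutually consistent on the log scale, where a Wald interval is symmetric about ln(HR). The sketch below applies this to the reported HR of 1.73 (CI 1.06 to 2.82), assuming it is a 95% Wald interval:

```python
import math

def check_hr_ci(hr, lo, hi, tol=0.05):
    """A 95% Wald CI is symmetric about ln(HR) on the log scale,
    so ln(HR) should sit near the midpoint of (ln(lo), ln(hi)).
    Returns (consistent?, implied standard error of ln(HR))."""
    mid = math.exp((math.log(lo) + math.log(hi)) / 2)
    se = (math.log(hi) - math.log(lo)) / (2 * 1.96)
    return abs(mid - hr) / hr < tol, se

# reported values from the abstract: HR 1.73 (CI 1.06 to 2.82)
consistent, se = check_hr_ci(1.73, 1.06, 2.82)
```

Here the geometric midpoint of the interval lands almost exactly on 1.73, so the reported estimate and interval are internally consistent.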

Discussion

MELD scores ≥ 35 were associated with an increased hazard of CMV viremia. In liver transplant recipients with MELD scores ≥ 35 who are CMV high-risk, additional CMV intervention may be warranted.



Burden of Neutropenia and Leukopenia Among Adult Kidney Transplant Recipients


Abstract

Background

Leukopenia and neutropenia (L/N) may affect treatment decisions, potentially resulting in poor clinical and economic outcomes among kidney transplant recipients (KTRs). The burden of L/N has been poorly quantified systematically. This systematic literature review aimed to summarize the incidence of, risk factors for, and clinical and economic outcomes associated with L/N after kidney transplantation (post-KT).

Methods

We systematically searched MEDLINE, Embase, and the Cochrane Library (from database inception to June 14, 2021) and conference proceedings (past 3 years) to identify observational studies examining the epidemiology, risk factors, or outcomes associated with L/N among adult KTRs.

Results

Of 2,081 records, 82 studies met the inclusion criteria. Seventy-three studies reported the epidemiology of L/N post-KT. The pooled incidence of neutropenia, defined as an absolute neutrophil count (ANC) <1000/μL, ranged from 13% to 48% within 1 year post-transplant; for ANC <500/μL it ranged from 15% to 20%. The incidence of leukopenia, defined as a white blood cell count <3500/μL, ranged from 19% to 83%. Eleven studies reported independent risk factors associated with L/N post-KT. D+/R- cytomegalovirus (CMV) status, mycophenolic acid (MPA) use, and tacrolimus use were the most consistent risk factors across studies. Fourteen studies reported L/N-associated clinical outcomes. We noted a trend toward a positive association between neutropenia and acute rejection/opportunistic infections. Findings on the association between L/N and graft failure or mortality were mixed. Dosage modifications of valganciclovir, MPA, cotrimoxazole, and anti-thymocyte globulin, and the need for granulocyte colony-stimulating factor (G-CSF) use, were common with L/N.
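In its simplest sample-size-weighted form, a pooled incidence across studies is just total events over total participants; the study counts below are hypothetical placeholders, not taken from the review (which reported range-based summaries):

```python
def pooled_incidence(studies):
    """Sample-size-weighted pooled incidence across studies,
    each study given as a tuple (events, sample_size)."""
    events = sum(e for e, n in studies)
    total = sum(n for e, n in studies)
    return events / total

# hypothetical per-study counts (not from the review)
rate = pooled_incidence([(13, 100), (48, 100), (60, 300)])
```

Note that formal meta-analyses typically weight by inverse variance under a fixed- or random-effects model rather than raw sample size; this sketch only illustrates the basic arithmetic.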

Conclusion

These findings suggest that post-transplant L/N are common and associated with frequent modifications of immunosuppressive agents, G-CSF use, and rejection or opportunistic infections. They highlight the need for interventions to reduce the risk of L/N post-KT.



Trigeminal Sensitivity in Patients With Allergic Rhinitis and Chronic Rhinosinusitis


Allergic patients react more sensitively to trigeminal stimuli in the nose than a comparable control group. This is important because it supports the suggestion that local factors in the nasal mucosa are significantly involved in influencing the trigeminal system of the nose.


Objective

Allergic rhinitis (AR) and chronic rhinosinusitis with nasal polyps (CRSwNP) are of high importance in otorhinolaryngology. Some of their symptoms are related to changes in nasal trigeminal sensitivity. The aim of this study was to compare nasal trigeminal sensitivity in patients with AR, patients with CRSwNP, and healthy controls (HC).

Methods

A total of 75 individuals participated (age 19–78 years; 34 AR, 10 CRSwNP and 31 HC). Olfactory function was determined using the extended Sniffin' Sticks test battery. Trigeminal sensitivity was assessed with CO2 detection thresholds. Trigeminal negative mucosal potentials (NMP) and EEG-derived event-related potentials (ERP) were recorded in response to selective olfactory (phenylethyl alcohol) and trigeminal (CO2) stimuli using high-precision air-dilution olfactometry.

Results

In comparison to HC, AR patients had lower CO2 thresholds, also reflected in shorter peak latencies in NMP and trigeminal ERP measurements. CRSwNP patients had a decreased sensitivity for trigeminal stimuli, also reflected in prolonged trigeminal ERP latencies, and reduced olfactory function compared to HC.

Conclusion

AR patients seemed to be more sensitive to trigeminal stimuli than CRSwNP patients. Importantly, the differences could be shown at both psychophysical and electrophysiological levels. The changes in trigeminal sensitivity appear to be present already at the level of the respiratory epithelium. The differences between the two groups may depend on the specific inflammatory changes accompanying each disorder, the degree of inflammatory activity, or the duration of the inflammatory disorder. However, because the sample sizes are relatively small, these results need to be confirmed in future studies with larger groups.

Level of Evidence

4

Laryngoscope, 2022


Homoarginine in the Cardiovascular System: Pathophysiology and Recent Developments


Abstract

Emerging experimental and epidemiological data have identified the endogenous non-proteinogenic amino acid L-homoarginine (L-hArg) not only as a novel biomarker for cardiovascular disease but also as a direct participant in the pathogenesis of cardiac dysfunction.

The association of low L-hArg levels with adverse cardiovascular events and mortality has prompted the idea of nutritional supplementation to rescue pathways inversely associated with cardiovascular health. Subsequent clinical and experimental studies have contributed significantly to our knowledge of L-hArg's potential effects on the cardiorenal axis, whether as a biomarker or as a cardiovascularly active agent.

In this review article, we provide a comprehensive summary of L-hArg metabolism, pathophysiological aspects, and current experimental and clinical evidence in favor of protective cardiovascular effects. Establishing a reliable biomarker to identify patients at high risk of dying of cardiovascular disease is one of the main goals in tackling this disease and providing individual therapeutic guidance.


Extended‐release naltrexone for people with alcohol use disorder on therapeutic anticoagulation: A case series


Guide to XR-NTX Discussion for Patients on Therapeutic Anticoagulation.


Abstract

What is known and objective

Individuals with medication adherence challenges or a preference for long-acting medications may benefit from extended-release naltrexone (XR-NTX) for treatment of alcohol use disorder (AUD). Individuals on therapeutic anticoagulation were excluded from XR-NTX studies, and its safety in this population has not been reported.

Case summary

We conducted a structured retrospective chart review of six individuals who received XR-NTX for AUD while on therapeutic anticoagulation between November 2019 and December 2020. We found no documented complications among the six individuals, who received up to 11 doses of XR-NTX while on therapeutic anticoagulation.

What is new and conclusion

XR-NTX may be safely tolerated by patients on therapeutic anticoagulation. We need larger studies evaluating XR-NTX administration in patients on therapeutic anticoagulation and those with coagulopathies, including individuals with alcohol-related liver disease, to better quantify risks and benefits for shared decision-making.


Facial implant gingival level and thickness changes following maxillary anterior immediate tooth replacement with scarf‐connective tissue graft: A 4–13‐year retrospective study


Abstract

Objective

A scarf-shaped connective tissue graft can be placed at the facial and proximal aspect of the peri-implant soft tissue zone during immediate implant placement and provisionalization (IIPP) procedures in the esthetic zone to optimize implant esthetics without the need for flap reflection. This retrospective study evaluated soft tissue stability after scarf-connective tissue grafting (S-CTG) in conjunction with IIPP procedures in the esthetic zone.

Materials and Methods

Patients who received IIPP with S-CTG and had a minimum 1-year follow-up were evaluated. Mid-facial gingival level (MFGL) and mid-facial gingival thickness (MFGT) changes were measured and compared at pre-op (T0), IIPP + S-CTG surgery (T1), a follow-up appointment with MFGT measurement (T2), and the latest follow-up appointment (T3). Implant success rate and graft necrosis were also recorded.

Results

A total of 22 IIPP and S-CTG procedures in 20 patients were evaluated. After a mean follow-up of 8.2 years (3.9-13.4) (T3), all implants remained osseointegrated (22/22 [100%]), with a statistically non-significant mean mid-facial gingival level change of -0.19 mm (-1.5 to 0.8). A statistically significant difference in mid-facial gingival thickness (MFGT) was noted (2.5 mm [1.8-3.5 mm]) after a mean follow-up of 2.3 years (1-8.6) (T2) compared with MFGT at baseline (1.1 mm [0.6-1.3 mm]) (T1). Necrosis of the S-CTG during the initial healing phase was noted in 9% (2/22) of sites.

Conclusions

Within the confines of this study, a scarf-connective tissue graft placed at the time of immediate implant placement and provisionalization can thicken the gingiva and maintain the gingival level in the critical soft tissue zone.

Clinical Significance

Managing the soft tissue zone is as important as managing the hard tissue zone for peri-implant esthetics. A connective tissue graft is one method that can enhance the final esthetic outcome. This retrospective study demonstrated that the scarf-CTG technique is an effective treatment modality for maintaining soft tissue stability.


Malnutrition and clinical outcomes post allogeneic stem cell transplantation


Abstract

Background

Malnutrition has been linked with a higher risk of poor outcomes after allogeneic stem cell transplantation (alloSCT); however, few studies have used a validated nutrition assessment tool such as the Patient-Generated Subjective Global Assessment (PG-SGA) to measure nutritional status and investigate associations with long-term clinical outcomes. This study aimed to assess the incidence of malnutrition prior to alloSCT and determine whether there was an association between pre-transplant nutritional status and post-transplant clinical outcomes, including acute kidney injury, graft-versus-host disease, intensive care unit (ICU) admission, need for haemodialysis, and survival.

Methodology

A retrospective analysis of 362 patients (213 male, 149 female; mean age ± SD 47.8 ± 14.1 years) who underwent alloSCT from 2008 to 2013 was conducted. Data on clinical outcomes were obtained for five years post-transplant.

Results

Fifteen percent (n=56) of patients were identified as malnourished pre-admission. Malnutrition was associated with a longer hospital stay (p=0.007), an increased requirement for haemodialysis (p=0.016), and increased admissions to the ICU (p=0.003). There was no association between malnutrition and acute kidney injury, graft-versus-host disease or survival. In multivariate analyses, malnutrition remained significantly associated with increased ICU admission (OR 3.8, 95% CI 1.3-10.5, p=0.011) and a length of stay >30 days (OR 3.6, 95% CI 1.8-7.4, p<0.001).

Conclusion

These findings underscore the need for nutrition screening and assessment to be routinely undertaken before alloSCT and throughout hospitalisation, in order to provide early nutrition intervention and prevent malnutrition, poor clinical outcomes, and increased healthcare costs.


