Σφακιανάκης Αλέξανδρος
Otorhinolaryngologist (ENT)
Αναπαύσεως 5 Άγιος Νικόλαος
Κρήτη 72100
00302841026182
00306932607174
alsfakia@gmail.com

Thursday, July 7, 2022

Automated Detection, Segmentation, and Classification of Pleural Effusion From Computed Tomography Scans Using Machine Learning

Objective
This study trained and evaluated algorithms to detect, segment, and classify simple and complex pleural effusions on computed tomography (CT) scans.
Materials and Methods
For detection and segmentation, we randomly selected 160 chest CT scans out of all consecutive patients (January 2016–January 2021, n = 2659) with reported pleural effusion. Effusions were manually segmented, and a negative cohort of chest CTs from 160 patients without effusions was added. A deep convolutional neural network (nnU-Net) was trained and cross-validated (n = 224; 70%) for segmentation and tested on a separate subset (n = 96; 30%) with the same distribution of reported pleural complexity features as in the training cohort (e.g., hyperdense fluid, gas, pleural thickening, and loculation). On a separate consecutive cohort with a high prevalence of pleural complexity features (n = 335), a random forest model was implemented for classification of segmented effusions, with Hounsfield unit thresholds, density distribution, and radiomics-based features as input. As performance measures, sensitivity, specificity, and area under the curve (AUC) for detection/classifier evaluation (per-case level), and the Dice coefficient and volume analysis for the segmentation task, were used.
Results
Sensitivity and specificity for detection of effusion were excellent at 0.99 and 0.98, respectively (n = 96; AUC, 0.996, test data). Segmentation was robust (median Dice, 0.89; median absolute volume difference, 13 mL), irrespective of size, complexity, or contrast phase. The sensitivity, specificity, and AUC for classification of simple versus complex effusions were 0.67, 0.75, and 0.77, respectively.
Conclusion
Using a dataset with different degrees of complexity, a robust model was developed for the detection, segmentation, and classification of effusion subtypes. The algorithms are openly available at https://github.com/usb-radiology/pleuraleffusion.git.
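For readers who want to reproduce the two segmentation metrics reported above (Dice coefficient and absolute volume difference), the sketch below shows one straightforward way to compute them from binary effusion masks. This is a minimal illustration, not the authors' implementation (their code is at the linked repository); the function names, toy masks, and voxel spacing are assumptions.

```python
# Minimal sketch (not the authors' code): per-case Dice coefficient and
# absolute volume difference between a predicted and a reference effusion mask.
import numpy as np

def dice_coefficient(pred: np.ndarray, truth: np.ndarray) -> float:
    """Dice = 2*|A ∩ B| / (|A| + |B|) for binary masks; 1.0 if both are empty."""
    pred, truth = pred.astype(bool), truth.astype(bool)
    denom = pred.sum() + truth.sum()
    return 1.0 if denom == 0 else 2.0 * np.logical_and(pred, truth).sum() / denom

def abs_volume_difference_ml(pred: np.ndarray, truth: np.ndarray,
                             voxel_volume_mm3: float) -> float:
    """Absolute difference in segmented volume, converted from mm^3 to mL."""
    return abs(int(pred.sum()) - int(truth.sum())) * voxel_volume_mm3 / 1000.0

# Toy example (hypothetical masks, hypothetical 0.7 x 0.7 x 1.0 mm voxels)
rng = np.random.default_rng(0)
truth = rng.random((64, 64, 32)) > 0.7
pred = truth.copy()
pred[:4] = False  # simulate slight under-segmentation
print(dice_coefficient(pred, truth))
print(abs_volume_difference_ml(pred, truth, voxel_volume_mm3=0.7 * 0.7 * 1.0))
```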

New-Generation Low-Field Magnetic Resonance Imaging of Hip Arthroplasty Implants

Objectives
Despite significant progress, artifact-free visualization of the bone and soft tissues around hip arthroplasty implants remains an unmet clinical need. New-generation low-field magnetic resonance imaging (MRI) systems now include slice encoding for metal artifact correction (SEMAC), which may result in smaller metallic artifacts and better image quality than standard-of-care 1.5 T MRI. This study aims to assess the feasibility of SEMAC on a new-generation 0.55 T system, optimize the pulse protocol parameters, and compare the results with those of a standard-of-care 1.5 T MRI.
Materials and Methods
Titanium (Ti) and cobalt-chromium total hip arthroplasty implants embedded in a tissue-mimicking American Society for Testing and Materials gel phantom were evaluated using turbo spin echo, view angle tilting (VAT), and combined VAT and SEMAC (VAT + SEMAC) pulse sequences. To refine an MRI protocol at 0.55 T, the type of metal artifact reduction techniques and the effect of various pulse sequence parameters on metal artifacts were assessed through qualitative ranking of the images by 3 expert readers while taking measured spatial resolution, signal-to-noise ratios, and acquisition times into consideration. Signal-to-noise ratio efficiency and artifact size of the optimized 0.55 T protocols were compared with the 1.5 T standard and compressed-sensing SEMAC sequences.
Results
Overall, the VAT + SEMAC sequence with at least 6 SEMAC encoding steps for Ti and 9 for cobalt-chromium implants was ranked higher than other sequences for metal reduction (P
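The protocol optimization above trades signal-to-noise ratio against acquisition time. The abstract does not spell out its definition of SNR efficiency, but a commonly used convention is SNR normalized by the square root of acquisition time; the sketch below assumes that definition, with made-up numbers.

```python
# Minimal sketch (assumed definition, hypothetical numbers): SNR efficiency as
# SNR divided by the square root of acquisition time, so sequences of different
# duration can be compared on an equal footing.
import math

def snr_efficiency(snr: float, acquisition_time_s: float) -> float:
    return snr / math.sqrt(acquisition_time_s)

# e.g., a 0.55 T VAT+SEMAC protocol vs a 1.5 T compressed-sensing SEMAC protocol
print(snr_efficiency(snr=18.0, acquisition_time_s=300.0))
print(snr_efficiency(snr=30.0, acquisition_time_s=480.0))
```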

Coronary Computed Tomography Angiography-Based Calcium Scoring: In Vitro and In Vivo Validation of a Novel Virtual Noniodine Reconstruction Algorithm on a Clinical, First-Generation Dual-Source Photon Counting-Detector System

Purpose
The aim of this study was to evaluate coronary computed tomography angiography (CCTA)-based in vitro and in vivo coronary artery calcium scoring (CACS) using a novel virtual noniodine reconstruction (PureCalcium) on a clinical first-generation photon-counting detector computed tomography system, compared with virtual noncontrast (VNC) reconstructions and true noncontrast (TNC) acquisitions.
Materials and Methods
Although CACS and CCTA are well-established techniques for the assessment of coronary artery disease, they are complementary acquisitions, translating into increased scan time and patient radiation dose. Hence, accurate CACS derived from a single CCTA acquisition would be highly desirable. In this study, CACS based on PureCalcium, VNC, and TNC reconstructions was evaluated in a CACS phantom and in 67 patients (70 [59/80] years, 58.2% male) undergoing CCTA on a first-generation photon-counting detector computed tomography system. Coronary artery calcium scores were quantified for the 3 reconstructions and compared using the Wilcoxon test. Agreement was evaluated by Pearson and Spearman correlation and Bland-Altman analysis. Classification of coronary artery calcium score categories (0, 1–10, 11–100, 101–400, and >400) was compared using Cohen κ.
Results
Phantom studies demonstrated strong agreement between CACSPureCalcium and CACSTNC (60.7 ± 90.6 vs 67.3 ± 88.3, P = 0.01, r = 0.98, intraclass correlation [ICC] = 0.98; mean bias, 6.6; limits of agreement [LoA], −39.8/26.6), whereas CACSVNC showed a significant underestimation (42.4 ± 75.3 vs 67.3 ± 88.3, P
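Two of the analyses above can be expressed concretely: assignment of calcium scores to the categories 0, 1–10, 11–100, 101–400, and >400, and Bland-Altman agreement (mean bias with 95% limits of agreement). The sketch below is illustrative only; the paired scores are invented, not study data.

```python
# Minimal sketch (illustrative, not the study code): calcium score categories
# and Bland-Altman bias / limits of agreement between two scoring methods.
import numpy as np

CATEGORY_EDGES = [0, 10, 100, 400]  # categories: 0, 1-10, 11-100, 101-400, >400

def cacs_category(score: float) -> int:
    """Map a calcium score to category index 0..4 (0, 1-10, 11-100, 101-400, >400)."""
    return int(np.searchsorted(CATEGORY_EDGES, score, side="left"))

def bland_altman(a: np.ndarray, b: np.ndarray):
    """Mean bias and 95% limits of agreement (bias ± 1.96 * SD of the differences)."""
    diff = a - b
    bias = float(diff.mean())
    half_width = 1.96 * float(diff.std(ddof=1))
    return bias, (bias - half_width, bias + half_width)

# Hypothetical paired scores for illustration only
pure_calcium = np.array([0.0, 12.5, 88.0, 250.0, 900.0])
tnc = np.array([0.0, 15.0, 95.0, 230.0, 870.0])
print([cacs_category(s) for s in pure_calcium])  # -> [0, 2, 2, 3, 4]
print(bland_altman(pure_calcium, tnc))
```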

Severity of SARS-CoV-2 Infection in Pregnancy

Abstract
Background
Pregnancy represents a physiological state associated with increased vulnerability to severe outcomes from infectious diseases, both for the pregnant person and developing infant. The SARS-CoV-2 pandemic may have important health consequences for pregnant individuals, who may also be more reluctant than non-pregnant people to accept vaccination.
Methods
We sought to estimate the degree to which increased severity of SARS-CoV-2 outcomes can be attributed to pregnancy using a population-based SARS-CoV-2 case file from Ontario, Canada. Due to varying propensity to receive vaccination, and changes in dominant circulating viral strains over time, a time-matched cohort study was performed to evaluate the relative risk of severe illness in pregnant women with SARS-CoV-2 compared to other SARS-CoV-2 infected women of childbearing age (10 to 49 years old). Risk of severe SARS-CoV-2 outcomes was evaluated in pregnant women and time-matched non-pregnant controls using multivariable conditional logistic regression.
Results
Compared to the rest of the population, non-pregnant women of childbearing age had an elevated risk of infection (standardized morbidity ratio [SMR] 1.28), while risk of infection was reduced among pregnant women (SMR 0.43). After adjustment for confounding, pregnant women had a markedly elevated risk of hospitalization (adjusted OR 4.96, 95% CI 3.86 to 6.37) and ICU admission (adjusted OR 6.58, 95% CI 3.29 to 13.18). The relative increase in hospitalization risk associated with pregnancy was greater in women without comorbidities than in those with comorbidities (P for heterogeneity 0.004).
Conclusions
Given the safety of SARS-CoV-2 vaccines in pregnancy, risk-benefit calculus strongly favours SARS-CoV-2 vaccination in pregnant women.
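As a side note on the standardized morbidity ratios (SMRs) quoted in the Results, an SMR is simply observed cases divided by the cases expected if the group had experienced the reference population's stratum-specific rates. A minimal sketch with entirely made-up numbers:

```python
# Minimal sketch (made-up numbers): standardized morbidity ratio (SMR) =
# observed cases / expected cases, with expected cases built from reference
# rates applied to the group's stratified person counts.
reference_rates = {"20-29": 0.010, "30-39": 0.012, "40-49": 0.009}  # cases per person
group_counts = {"20-29": 40_000, "30-39": 55_000, "40-49": 30_000}  # persons per stratum

expected = sum(reference_rates[age] * group_counts[age] for age in group_counts)
observed = 600
print(round(observed / expected, 2))  # SMR < 1 means fewer cases than expected
```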

Antibody and T-cell responses 6 months after COVID-19 mRNA-1273 vaccination in patients with chronic kidney disease, on dialysis, or living with a kidney transplant

Abstract
Background
The immune response to COVID-19 vaccination is inferior in kidney transplant recipients (KTR), and to a lesser extent in patients on dialysis or with chronic kidney disease (CKD). We assessed the immune response 6 months after mRNA-1273 vaccination in kidney patients and compared this to controls.
Methods
152 participants with CKD stages G4/5 (eGFR <30 mL/min/1.73 m²), 145 participants on dialysis, 267 KTR, and 181 controls were included. SARS-CoV-2 Spike S1-specific IgG antibodies were measured by fluorescent bead-based multiplex immunoassay, neutralizing antibodies to ancestral, Delta and Omicron (BA.1) variants by plaque reduction, and T-cell responses by IFN-γ release assay.
Results
At 6 months after vaccination, S1-specific antibodies were detected in 100% of controls, 98.7% of CKD G4/5 patients, 95.1% of dialysis patients, and 56.6% of KTR. These figures were comparable to the response rates at 28 days, but antibody levels waned significantly. Neutralization of the ancestral and Delta variants was detected in most participants, whereas neutralization of Omicron was mostly absent. S-specific T-cell responses were detected at 6 months in 75.0% of controls, 69.4% of CKD G4/5 patients, 52.6% of dialysis patients, and 12.9% of KTR. T-cell responses at 6 months were significantly lower than responses at 28 days.
Conclusions
Although seropositivity rates at 6 months were comparable to those at 28 days after vaccination, significantly decreased antibody levels and T-cell responses were observed. The combination of low antibody levels, reduced T-cell responses, and absent neutralization of newly emerging variants indicates the need for additional boosts or alternative vaccination strategies in KTR.

Exploring the impact of anterior chest wall scars from implantable venous ports in adolescent survivors of cancer


Abstract

Background

In children with cancer, port-a-caths (ports) are commonly placed in the right anterior chest wall, leaving a visible scar when removed. The psychological impact of port scars on survivors is unknown. It is unclear whether alternative sites should be considered. We assessed the impact of port scars on pediatric cancer survivors to determine whether a change in location is indicated.

Methods

We performed a cross-sectional single-center study of pediatric cancer survivors aged 13–18 years. A questionnaire explored participants' perceptions of their port scars. Four additional validated tools were used: Fitzpatrick scale, Patient and Observer Scar Assessment Scale (POSAS), Children's Dermatology Life Quality Index, and a Distress Thermometer.

Results

Among 100 participants (median age 15.8 years [13–18], median duration since treatment 8 years [1.5–14.8]), 75 'never/occasionally' thought about their port scars, 85 were not bothered by the scar's location, and 87 would not have preferred another site. Eleven participants were highly impacted by their scars: six thought about their scar 'every day/all the time', four were highly bothered by its location, and nine would have preferred a different location. There was an association between the desire for a different scar location and how much the location bothered participants (p < 0.0001), female sex (p = 0.03), and the Patient POSAS score (p = 0.04).

Conclusion

A port scar on the anterior chest wall was not a major concern for the majority of this cohort. A minority of participants were highly impacted by the scar and its location. Advance identification of those likely to be impacted by their scars may not be possible.


Strategies for Evaluating Anosmia Therapeutics in the COVID-19 Era


Two studies in this issue of JAMA Otolaryngology–Head & Neck Surgery evaluate the use of nasal theophylline, a phosphodiesterase inhibitor that may promote neural olfactory signaling and recovery for postviral olfactory dysfunction (OD). The first, a dose-modification Research Letter by Lee et al, was conducted in patients with non-COVID-19–related hyposmia or anosmia secondary to viral infection and confirmed by the objective University of Pennsylvania Smell Identification Test. This open-label, dose-escalation trial provides an educational description of how to evaluate the appropriate dose for an emerging pharmacologic intervention. Specifically, the authors identified 400 mg of theophylline twice daily as a tolerated dosage. This was calculated as an equivalent oral dose of 20 mg. A phase 2 pilot study by Gupta et al was designed using this valuable information. Patients with suspected COVID-19–related OD completed the University of Pennsylvania Smell Identification Test and were randomized to either theophylline or placebo nasal irrigations for 6 weeks. This study was inconclusive regarding the clinical benefit of theophylline nasal irrigations, although there was suggested improvement by subjective assessments. The authors acknowledge several limitations, including small sample size, the virtual nature of the study design and subsequent inability to conduct endoscopic nasal examinations, lack of information regarding participant COVID-19 vaccination status, lack of polymerase chain reaction–confirmed COVID-19 infection, and short-term participant follow-up. These acknowledgments are justified, and many can be overcome in subsequent studies with adequate funding and time. However, the heterogeneous nature of COVID-19 and associated research makes this area of work particularly challenging. Herein, we propose several approaches to improve the rigor of OD research in the COVID-19 era.

Safety of High-Dose Nasal Theophylline Irrigation in the Treatment of Postviral Olfactory Dysfunction


This case series aims to determine the maximum tolerable dose of theophylline delivered via high-volume, low-pressure nasal saline irrigation for treatment of postviral olfactory dysfunction.

Surgeon Thyroidectomy Case Volume Impacts Disease‐free Survival in the Management of Thyroid Cancer


In this population-based cohort study involving 37,233 thyroidectomies performed in Ontario, Canada, between 1993 and 2017, we found both high-volume surgeons and high-volume hospitals to be predictors of better disease-free survival (DFS) in patients with well-differentiated thyroid cancer. DFS is higher among patients whose surgeons perform more than 40 thyroidectomies a year.


Objectives

To assess the association between surgeon thyroidectomy case volume and disease-free survival (DFS) for patients with well-differentiated thyroid cancer (WDTC). A secondary objective was to assess a surgeon volume cutoff to optimize outcomes in those with WDTC. We hypothesized that surgeon volume would be an important predictor of DFS in patients with WDTC after adjusting for hospital volume and sociodemographic and clinical factors.

Methods

In this retrospective population-based cohort study, we identified WDTC patients in Ontario, Canada, who underwent thyroidectomy, confirmed by both hospital-level and surgeon-level administrative data, between 1993 and 2017 (N = 37,233). Surgeon and hospital volumes were calculated based on the number of cases performed in the year prior by the surgeon and at the institution performing each case, respectively, and were divided into quartiles. A multilevel hierarchical Cox regression model was used to estimate the effect of volume on DFS.

Results

A crude model without patient or treatment characteristics demonstrated that both higher surgeon volume quartiles (p < 0.001) and higher hospital volume quartiles (p < 0.001) were associated with DFS. After controlling for clustering, patient/treatment covariates, and hospital volume, moderately low (18–39/year) and low (0–17/year) volume surgeons (hazard ratios [HR]: 1.23, 95% confidence interval [CI]: 1.09–1.39 and HR: 1.34, 95% CI: 1.17–1.53, respectively) remained independent, statistically significant negative predictors of DFS.

Conclusion

Both high-volume surgeons and high-volume hospitals are predictors of better DFS in patients with WDTC. DFS is higher among patients whose surgeons perform more than 40 thyroidectomies a year.

Level of Evidence

3 Laryngoscope, 2022
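The hazard ratios above come from a multilevel hierarchical Cox regression; as a rough illustration of the core survival model only (without the hierarchical/clustering structure the authors describe), below is a minimal sketch using the lifelines package on simulated data. Column names, the toy effect size, and the censoring scheme are all assumptions.

```python
# Minimal sketch (simulated data, no multilevel structure): Cox proportional
# hazards model with surgeon-volume quartile as the predictor of disease-free
# survival; hazard ratios are exp(coef) for each quartile vs the reference.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(1)
n = 400
quartile = rng.integers(0, 4, n)                           # 0 = lowest-volume, 3 = highest-volume
time = rng.exponential(8.0, n) * (1.0 + 0.15 * quartile)   # toy effect: longer DFS with higher volume
event = (rng.random(n) < 0.5).astype(int)                  # ~50% observed recurrences, rest censored

df = pd.DataFrame({"years": time, "recurrence": event, "volume_quartile": quartile})
X = pd.get_dummies(df, columns=["volume_quartile"], drop_first=True, dtype=float)

cph = CoxPHFitter()
cph.fit(X, duration_col="years", event_col="recurrence")
cph.print_summary()  # exp(coef) column gives hazard ratios vs the reference quartile
```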


Characterizing Pediatric Bilateral Vocal Fold Dysfunction: Analysis with the Pediatric Health Information System Database


This database study represents the largest cohort analysis to date characterizing bilateral vocal fold dysfunction. The majority of pediatric patients with bilateral vocal fold dysfunction (BVFD) have a complex chronic condition, with respiratory conditions being the most common, followed by gastrointestinal conditions. Prognostic indicators of improved hospital survival include gastrointestinal comorbidities and the presence of tracheostomy.


Objectives

The purpose of this study was to characterize pediatric bilateral vocal fold dysfunction and to examine the overall inpatient mortality.

Methods

Retrospective cohort analysis. Data from the Pediatric Health Information System were gathered for all pediatric patients with a diagnosis of bilateral vocal fold dysfunction between January 2008 and September 2020. Univariate and multivariate analyses were performed using Cox proportional hazards models.

Results

2395 patients accounted for 4799 hospitalizations with bilateral vocal fold dysfunction. Inpatient mortality occurred in 2.9% of the study sample. Chiari 2 was found in 2.8% of patients. The most common associated diagnoses were related to comorbid respiratory conditions (61.1%). The median adjusted ratio of cost to charges was $76,569. Aspiration was noted in 28 patients (1.2%). Gastrostomy was performed in 607 patients (25.3%). Tracheostomy was performed in 27% of patients. The overall 90-day readmission rate was 61%. On multivariate analysis, prognostic factors associated with increased hospital survival include gastrointestinal comorbidities (hazard ratio [HR]: 0.29; 95% confidence interval [CI]: 0.18–0.49) and tracheostomy (HR: 0.21; 95% CI: 0.12–0.37).

Conclusion

This database study represents the largest cohort analysis to date characterizing bilateral vocal fold dysfunction. Favorable prognostic indicators of overall hospital survival include gastrointestinal comorbidities and the presence of tracheostomy. Tracheostomy was associated with increased hospital costs, more comorbidities, gastrostomy tube placement, and a Chiari diagnosis.

Level of Evidence

4 Laryngoscope, 2022
