Articles from BioMed Central

 

Recent Submissions

  • Holopainen, Elina; Vakkilainen, Svetlana; Mäkitie, Outi (BioMed Central, 2020)
    Abstract Background Cartilage-hair hypoplasia (CHH) is a rare skeletal dysplasia characterized by disproportionate short stature, immunodeficiency, anemia and risk of malignancies. All these features can affect pregnancy and predispose to maternal and fetal complications. This study aimed to evaluate obstetric history and maternal and fetal outcomes in women with CHH. Methods Among 47 Finnish women with CHH, we identified 14 women with ICD codes related to pregnancies, childbirth and puerperium in the National Hospital Discharge Registry and obtained detailed data on gynecologic and obstetric history with a questionnaire. Offspring birth length and weight were collected and compared with population-based normal values. Results There were altogether 42 pregnancies in 14 women (median height 124 cm, range 105–139 cm; 4′1′′, range 3′5′′–4′7′′). Twenty-six pregnancies (62%), including one twin pregnancy, led to a delivery. Miscarriages, induced abortions and ectopic pregnancies complicated 9, 5, and 2 pregnancies, respectively. Severe pregnancy-related complications were rare. All women with CHH delivered by cesarean section, mostly due to evident cephalo-pelvic disproportion, and in 25/26 cases at full term. In the majority, the birth length (median 48 cm, range 45.5–50 cm; 1′7′′, range 1′6′′–1′8′′) and weight (3010 g, range 2100–3320 g; 6.6 lb, range 4.6–7.3 lb) of the offspring in full-term singleton pregnancies were normal. Conclusions Despite CHH mothers’ significant short stature and other potential CHH-related effects on pregnancy outcome, most pregnancies led to a term cesarean section delivery. Since fetal growth was generally unaffected and cephalo-pelvic disproportion was evident, planned cesarean section should be considered in term pregnancies.
  • Hakulinen, Christian; Mok, Pearl L H; Horsdal, Henriette T; Pedersen, Carsten B; Mortensen, Preben B; Agerbo, Esben; Webb, Roger T (BioMed Central, 2020)
    Abstract Background Links between parental socioeconomic position during childhood and subsequent risks of developing mental disorders have rarely been examined across the diagnostic spectrum. We conducted a comprehensive analysis of parental income level, including income mobility, during childhood and risks for developing mental disorders diagnosed in secondary care in young adulthood. Methods National cohort study of persons born in Denmark 1980–2000 (N = 1,051,265). Parental income was measured during birth year and at ages 5, 10 and 15. Follow-up began from 15th birthday until mental disorder diagnosis or 31 December 2016, whichever occurred first. Hazard ratios and cumulative incidence were estimated. Results A quarter (25.2%; 95% CI 24.8–25.6%) of children born in the lowest income quintile families will have a secondary care-diagnosed mental disorder by age 37, versus 13.5% (13.2–13.9%) of those born in the highest income quintile. Longer time spent living in low-income families was associated with higher risks of developing mental disorders. Associations were strongest for substance misuse and personality disorders and weaker for mood disorders and anxiety/somatoform disorders. An exception was eating disorders, with low parental income being associated with attenuated risk. For all diagnostic categories examined except for eating disorders, downward socioeconomic mobility was linked with higher subsequent risk and upward socioeconomic mobility with lower subsequent risk of developing mental disorders. Conclusions Except for eating disorders, low parental income during childhood is associated with subsequent increased risk of mental disorders diagnosed in secondary care across the diagnostic spectrum. Early interventions to mitigate the disadvantages linked with low income, and better opportunities for upward socioeconomic mobility could reduce social and mental health inequalities.
  • Kaivola, Karri; Salmi, Samuli J; Jansson, Lilja; Launes, Jyrki; Hokkanen, Laura; Niemi, Anna-Kaisa; Majamaa, Kari; Lahti, Jari; Eriksson, Johan G; Strandberg, Timo; Laaksovirta, Hannu; Tienari, Pentti J (BioMed Central, 2020)
    Abstract The hexanucleotide repeat expansion in intron 1 of the C9orf72 gene causes amyotrophic lateral sclerosis (ALS) and frontotemporal dementia. In addition to the effects of the pathogenic expansion, a role of intermediate-length alleles has been suggested in ALS, corticobasal degeneration and Parkinson’s disease. Due to the rarity of intermediate-length alleles with over 20 repeats and the geographical variability in their frequency, large studies that account for population stratification are needed to elucidate their effects. To this aim, we used repeat-primed PCR and confirmatory PCR assays to determine the C9orf72 repeat allele lengths in 705 ALS patients and 3958 controls from Finland. After exclusion of expansion carriers (25.5% of the ALS patients and 0.2% of the controls), we compared the frequency of intermediate-length allele carriers in 525 ALS cases and 3950 controls using several intermediate-length allele thresholds (7–45, 17–45, 21–45, 24–45 and 24–30). Carriership of an intermediate-length allele did not associate with ALS (Fisher’s test, all p ≥ 0.15), nor was there any association with survival (p ≥ 0.33) when we divided our control group into three age groups (18–65, 66–84 and 85–105 years). Carriership of two intermediate-length alleles was associated with ALS when the longer allele was ≥ 17 repeats (p = 0.002, OR 5.32, 95% CI 2.02–14.05) or ≥ 21 repeats (p = 0.00016, OR 15.21, 95% CI 3.79–61.0). Our results show that intermediate-length alleles are a risk factor for ALS when present in both alleles, whereas carrying just one intermediate-length allele was not associated with ALS or survival.
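The odds ratios and confidence intervals quoted above (e.g. OR 5.32, 95% CI 2.02–14.05) follow the standard 2×2-table calculation with a Woolf (log-scale) interval. A minimal sketch, using hypothetical counts rather than the study's actual data:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and Woolf (log-scale) 95% CI for a 2x2 table:
    a = exposed cases, b = unexposed cases,
    c = exposed controls, d = unexposed controls."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)   # standard error of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical counts for carriers of two intermediate-length alleles
# (illustration only; not the Finnish cohort's tabulated data).
or_, lo, hi = odds_ratio_ci(8, 517, 12, 3938)
print(round(or_, 2), round(lo, 2), round(hi, 2))
```

The Woolf interval is symmetric on the log scale, which is why the reported intervals above are skewed on the natural scale.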
  • Snell, Kym I E; Allotey, John; Smuk, Melanie; Hooper, Richard; Chan, Claire; Ahmed, Asif; Chappell, Lucy C; Von Dadelszen, Peter; Green, Marcus; Kenny, Louise; Khalil, Asma; Khan, Khalid S; Mol, Ben W; Myers, Jenny; Poston, Lucilla; Thilaganathan, Basky; Staff, Anne C; Smith, Gordon C S; Ganzevoort, Wessel; Laivuori, Hannele; Odibo, Anthony O; Arenas Ramírez, Javier; Kingdom, John; Daskalakis, George; Farrar, Diane; Baschat, Ahmet A; Seed, Paul T; Prefumo, Federico; da Silva Costa, Fabricio; Groen, Henk; Audibert, Francois; Masse, Jacques; Skråstad, Ragnhild B; Salvesen, Kjell Å; Haavaldsen, Camilla; Nagata, Chie; Rumbold, Alice R; Heinonen, Seppo; Askie, Lisa M; Smits, Luc J M; Vinter, Christina A; Magnus, Per; Eero, Kajantie; Villa, Pia M; Jenum, Anne K; Andersen, Louise B; Norman, Jane E; Ohkuchi, Akihide; Eskild, Anne; Bhattacharya, Sohinee; McAuliffe, Fionnuala M; Galindo, Alberto; Herraiz, Ignacio; Carbillon, Lionel; Klipstein-Grobusch, Kerstin; Yeo, Seon A; Browne, Joyce L; Moons, Karel G M; Riley, Richard D; Thangaratinam, Shakila (BioMed Central, 2020)
    Abstract Background Pre-eclampsia is a leading cause of maternal and perinatal mortality and morbidity. Early identification of women at risk during pregnancy is required to plan management. Although there are many published prediction models for pre-eclampsia, few have been validated in external data. Our objective was to externally validate published prediction models for pre-eclampsia using individual participant data (IPD) from UK studies, to evaluate whether any of the models can accurately predict the condition when used within the UK healthcare setting. Methods IPD from 11 UK cohort studies (217,415 pregnant women) within the International Prediction of Pregnancy Complications (IPPIC) pre-eclampsia network contributed to external validation of published prediction models, identified by systematic review. Cohorts that measured all predictor variables in at least one of the identified models and reported pre-eclampsia as an outcome were included for validation. We reported the model predictive performance as discrimination (C-statistic), calibration (calibration plots, calibration slope, calibration-in-the-large), and net benefit. Performance measures were estimated separately in each available study and then, where possible, combined across studies in a random-effects meta-analysis. Results Of 131 published models, 67 provided the full model equation and 24 could be validated in 11 UK cohorts. Most of the models showed modest discrimination with summary C-statistics between 0.6 and 0.7. The calibration of the predicted compared to observed risk was generally poor for most models with observed calibration slopes less than 1, indicating that predictions were generally too extreme, although confidence intervals were wide. There was large between-study heterogeneity in each model’s calibration-in-the-large, suggesting poor calibration of the predicted overall risk across populations. 
In a subset of models, the net benefit of using the models to inform clinical decisions appeared small and limited to probability thresholds between 5 and 7%. Conclusions The evaluated models had modest predictive performance, with key limitations such as poor calibration (likely due to overfitting in the original development datasets), substantial heterogeneity, and small net benefit across settings. The evidence to support the use of these prediction models for pre-eclampsia in clinical decision-making is limited. Any models that we could not validate should be examined in terms of their predictive performance, net benefit, and heterogeneity across multiple UK settings before consideration for use in practice. Trial registration PROSPERO ID: CRD42015029349.
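The C-statistic used above as the discrimination measure is the probability that a randomly chosen case receives a higher predicted risk than a randomly chosen non-case. A minimal all-pairs sketch with hypothetical risks and outcomes (not IPPIC data):

```python
# C-statistic (discrimination): probability that a randomly chosen case
# receives a higher predicted risk than a randomly chosen non-case.
def c_statistic(risks, outcomes):
    pairs = concordant = 0.0
    for ri, yi in zip(risks, outcomes):
        for rj, yj in zip(risks, outcomes):
            if yi == 1 and yj == 0:          # one case, one non-case
                pairs += 1
                if ri > rj:
                    concordant += 1
                elif ri == rj:
                    concordant += 0.5        # ties count half
    return concordant / pairs

# Hypothetical predicted risks and observed outcomes (1 = pre-eclampsia).
risks    = [0.9, 0.7, 0.6, 0.4, 0.2, 0.1]
outcomes = [1,   1,   0,   1,   0,   0]
print(c_statistic(risks, outcomes))
```

A value of 0.5 means no discrimination; the 0.6–0.7 range reported above is considered modest.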
  • Blukacz, Alice; Cabieses, Báltica; Markkula, Niina (BioMed Central, 2020)
    Abstract Mental health in a context of international migration is a particularly pressing issue, as migration is recognised as a social determinant of physical and mental health. As Chile is increasingly becoming a receiving country of South-South migration, immigrants face mental health inequities, with regards to outcomes and access to care. In order to identify and synthesize the mental healthcare inequities faced by international migrants with regards to locals in Chile, a narrative review of the literature on national mental healthcare policies in Chile and a narrative review of the literature on migrants’ mental healthcare in Chile were conducted, with a focus on describing mental health outcomes, policy environment and persisting gaps and barriers for both topics. The existing literature on mental healthcare in Chile, both for the general population and for international migrants, was reviewed following the social determinants of health framework and categorised in terms of i) Inequities in mental health outcomes; ii) Description of the mental health policy environment and iii) Identification of the main barriers to access mental healthcare. Despite incremental policy efforts to improve the reach of mental healthcare in Chile, persisting inequities are identified for both locals and international migrants: lack of funding and low prioritisation, exacerbation of social vulnerability in the context of a mixed health insurance system, and inadequacy of mental healthcare services. International migrants may experience specific layers of vulnerability linked to migration as a social determinant of health, nested in a system that exacerbates social vulnerability. Based on the findings, the article discusses how mental health is a privilege for migrant populations as well as locals experiencing layers of social vulnerability in the Chilean context.
International migrants’ access to comprehensive and culturally relevant mental healthcare in Chile and other countries is an urgent need in order to reduce social vulnerability and foster mechanisms of social inclusion. Keywords: international migration, social determinants of mental health, mental health inequities, social vulnerability, review.
  • Michalcova, Jana; Vasut, Karel; Airaksinen, Marja; Bielakova, Katarina (BioMed Central, 2020)
    Abstract Background Falls are common undesirable events for older adults in institutions. Even though the patient’s fall risk may be scored on admission, the medication-induced fall risk may be ignored. This study developed a preliminary categorization of fall-risk-increasing drugs (FRIDs) to be added as a risk factor to the existing fall risk assessment tool routinely used in geriatric care units. Methods Medication use data of older adults who had experienced at least one fall during a hospital ward or a nursing home stay within a 2-year study period were retrospectively collected from patient records. Medicines used were classified into three risk categories (high, moderate and none) according to the fall risk information in statutory summaries of product characteristics (SmPCs). The fall risk categorization incorporated the relative frequency of such adverse drug effects (ADEs) in SmPCs that were known to be connected to fall risk (sedation, orthostatic hypotension, syncope, dizziness, drowsiness, changes in blood pressure or impaired balance). The distribution of fall risk scores assessed on admission, without considering medications, was also analysed. Results The fall-experienced patients (n = 188, 128 from the hospital and 60 from nursing home records) used a total of 1748 medications, including 216 different active substances. Of the active substances, 102 (47%) were categorized as high risk (category A) for increasing fall risk. Fall-experienced patients (n = 188) received a mean of 3.8 category A medicines (n = 710), 53% (n = 375) of which affected the nervous and 40% (n = 281) the cardiovascular system. Without considering medication-related fall risk, 53% (n = 100) of the patients were scored as having a high fall risk (3 or 4 risk scores).
Conclusion It was possible to develop a preliminary categorization of FRIDs based on their adverse drug effect profile in SmPCs and frequency of use in older patients who had experienced at least one documented fall in a geriatric care unit. Even though more than half of the fall-experienced study participants had high fall risk scores on admission, their fall risk might have been underestimated, as use of high fall risk medicines, including concomitant use, was common. Further studies are needed to develop the FRID categorization and assess its impact on fall risk.
  • Laulajainen-Hongisto, Anu; Lyly, Annina; Hanif, Tanzeela; Dhaygude, Kishor; Kankainen, Matti; Renkonen, Risto; Donner, Kati; Mattila, Pirkko; Jartti, Tuomas; Bousquet, Jean; Kauppi, Paula; Toppila-Salmi, Sanna (BioMed Central, 2020)
    Abstract Genome wide association studies (GWASs) have revealed several airway disease-associated risk loci. Their role in the onset of asthma, allergic rhinitis (AR) or chronic rhinosinusitis (CRS), however, is not yet fully understood. The aim of this review is to evaluate the airway relevance of loci and genes identified in GWASs. GWASs were searched from databases, and a list of loci associating significantly (p < 10⁻⁸) with asthma, AR and CRS was created. This yielded a total of 267 significantly asthma/AR-associated loci from 31 GWASs. No significant CRS-associated loci were found in this search. A total of 170 protein coding genes were connected to these loci. Of these, 76/170 (44%) showed bronchial epithelial protein expression in stained microscopic figures of the Human Protein Atlas (HPA), and 61/170 (36%) had a literature report of having airway epithelial function. Gene ontology (GO) and Kyoto Encyclopedia of Genes and Genomes (KEGG) annotation analyses were performed, and 19 functional protein categories were found to be significantly (p < 0.05) enriched among these genes. These were related to cytokine production, cell activation and adaptive immune response, and all were strongly connected in network analysis. We also identified 15 protein pathways that were significantly (p < 0.05) enriched in these genes, related to T-helper cell differentiation, virus infection, the JAK-STAT signaling pathway, and asthma. A third of GWAS-level risk loci genes of asthma or AR seemed to have airway epithelial functions according to our database and literature searches. In addition, many of the risk loci genes were immunity related. Some risk loci genes also related to metabolism, neuro-musculoskeletal or other functions. Functions overlapped and formed a strong network in our pathway analyses and warrant further study as biomarkers and therapeutic targets.
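GO/KEGG over-representation p-values like those reported above are conventionally computed with a one-sided hypergeometric test. A minimal sketch with toy counts (not the review's actual gene sets):

```python
from math import comb

def enrichment_p(N, K, n, k):
    """P(X >= k): probability that, drawing n genes at random from a
    background of N genes of which K belong to the category, at least
    k of the drawn genes fall in the category."""
    return sum(
        comb(K, i) * comb(N - K, n - i)
        for i in range(k, min(K, n) + 1)
    ) / comb(N, n)

# Toy example: 20 background genes, 5 in a category, 6 hits, 3 in category.
print(enrichment_p(20, 5, 6, 3))
```

Real enrichment tools additionally correct these p-values for multiple testing across categories.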
  • Bousquet, Jean; Anto, Josep M; Iaccarino, Guido; Czarlewski, Wienczyslawa; Haahtela, Tari; Anto, Aram; Akdis, Cezmi A; Blain, Hubert; Walter Canonica, G.; Cardona, Victoria; Cruz, Alvaro A; Illario, Maddalena; Ivancevich, Juan C; Jutel, Marek; Klimek, Ludger; Kuna, Piotr; Laune, Daniel; Larenas‑Linnemann, Désirée; Mullol, Joaquim; Papadopoulos, Nikos G; Pfaar, Oliver; Samolinski, Boleslaw; Valiulis, Arunas; Yorgancioglu, Arzu; Zuberbier, Torsten (BioMed Central, 2020)
    An amendment to this paper has been published and can be accessed via the original article.
  • Mäkitaipale, J.; Sankari, S.; Sievänen, H.; Laitinen-Vapaavuori, O. (BioMed Central, 2020)
    Abstract Background Vitamin D deficiency and related metabolic bone diseases in pet rabbits have been intermittently debated. In human research, the parathyroid hormone concentration in relation to the 25-hydroxyvitamin D concentration is used to determine vitamin D deficiency. Thus, this study aimed to identify the breakpoint in the 25-hydroxyvitamin D concentration indicating a significant change in the parathyroid hormone concentration in 139 pet rabbits. An enzyme immunoassay kit was used for 25-hydroxyvitamin D analysis and the intact parathyroid hormone (PTH 1–84) immunoradiometric assay kit for parathyroid hormone analysis. The mid-tibial cortical bone density was measured using peripheral quantitative computed tomography. A segmented linear regression analysis was performed, with the 25-hydroxyvitamin D concentration as the independent variable, and parathyroid hormone, ionised calcium, total calcium, inorganic phosphorus concentrations and the mid-tibial cortical density as the dependent variables. Results The breakpoint for the parathyroid hormone concentration occurred at a 25-hydroxyvitamin D concentration of 17 ng/mL, whereas the cortical bone density breakpoint occurred at a 25-hydroxyvitamin D concentration of 19 ng/mL. No breakpoints were found for ionised calcium, total calcium or phosphorus. Conclusions These results suggest that a serum 25-hydroxyvitamin D concentration of 17 ng/mL serves as the threshold for vitamin D deficiency in rabbits. Nearly one-third of the rabbits had a serum 25-hydroxyvitamin D concentration below this threshold. Concerns persist regarding the high prevalence of vitamin D deficiency in pet rabbits and the possible health consequences caused by a chronic vitamin D deficiency, including the risk for metabolic bone diseases.
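Segmented (breakpoint) regression of the kind used above can be sketched as a grid search: for each candidate breakpoint, fit ordinary least-squares lines on either side and keep the split with the lowest total squared error. The data below are synthetic, shaped only to mimic the reported pattern (PTH falls steeply below the breakpoint, then plateaus); they are not the study's measurements:

```python
def fit_line(xs, ys):
    """Ordinary least-squares line; returns (intercept, slope, SSE)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    b = sxy / sxx
    a = my - b * mx
    sse = sum((y - (a + b * x)) ** 2 for x, y in zip(xs, ys))
    return a, b, sse

def find_breakpoint(xs, ys, min_pts=3):
    """Grid search: split minimizing left-SSE + right-SSE."""
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    xs = [xs[i] for i in order]
    ys = [ys[i] for i in order]
    best = None
    for k in range(min_pts, len(xs) - min_pts + 1):
        sse = fit_line(xs[:k], ys[:k])[2] + fit_line(xs[k:], ys[k:])[2]
        if best is None or sse < best[0]:
            best = (sse, xs[k - 1])   # breakpoint at last x of left segment
    return best[1]

# Synthetic 25(OH)D (x) vs PTH-like response (y): steep fall, then plateau.
x = [5, 8, 11, 14, 17, 20, 23, 26, 29, 32]
y = [60, 48, 36, 24, 12, 11, 10, 11, 10, 11]
print(find_breakpoint(x, y))
```

Dedicated packages fit the two segments jointly with a continuity constraint and give confidence intervals for the breakpoint; this grid search only illustrates the idea.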
  • Kemp, Kirsi; Alakare, Janne; Harjola, Veli-Pekka; Strandberg, Timo; Tolonen, Jukka; Lehtonen, Lasse; Castrén, Maaret (BioMed Central, 2020)
    Abstract Background The aim of the emergency department (ED) triage is to recognize critically ill patients and to allocate resources. No strong evidence exists for the accuracy of current triage instruments, especially for older adults. We evaluated the National Early Warning Score 2 (NEWS2) and a 3-level triage assessment as risk predictors for frail older adults visiting the ED. Methods This prospective, observational study was performed in a Finnish ED. The data were collected in a six-month period and included were ≥ 75-year-old residents with a Clinical Frailty Scale (CFS) score of at least four. We analyzed the predictive values of NEWS2 and the three-level triage scale for 30-day mortality, hospital admission, high dependency unit (HDU) and intensive care unit (ICU) admissions, a count of 72-h and 30-day revisits, and ED length-of-stay (LOS). Results A total of 1711 ED visits were included. Medians for age, CFS, LOS and NEWS2 were 85 years, 6 points, 6.2 h and 1 point, respectively. 30-day mortality was 96/1711. At triage, 69, 356 and 1278 patients were assessed as red, yellow and green, respectively. There were 1103 admissions, of them 31 to an HDU facility, none to ICU. With NEWS2 and triage score, AUCs for 30-day mortality prediction were 0.70 (0.64–0.76) and 0.62 (0.56–0.68); for hospital admission prediction 0.62 (0.60–0.65) and 0.55 (0.52–0.56), and for HDU admission 0.72 (0.61–0.83) and 0.80 (0.70–0.90), respectively. The NEWS2 divided into risk groups of low, medium and high did not predict the ED LOS (p = 0.095). There was a difference in ED LOS between the red/yellow and red/green patient groups (p < 0.001) but not between the yellow/green groups (p = 0.59). There were 48 and 351 revisits within 72 h and 30 days, respectively. With NEWS2, AUCs for 72-h and 30-day revisit prediction were 0.48 (95% CI 0.40–0.56) and 0.47 (0.44–0.51), respectively; with the triage score, 0.48 (0.40–0.56) and 0.49 (0.46–0.52), respectively.
Conclusions The NEWS2 and a local 3-level triage scale are statistically significant but poorly accurate predictors of 30-day mortality and HDU admission for frail older adults, but they do not predict ED LOS or revisit rates. NEWS2 also seems to predict hospital admission.
  • Lee, Yunsung; Haftorn, Kristine L; Denault, William R P; Nustad, Haakon E; Page, Christian M; Lyle, Robert; Lee-Ødegård, Sindre; Moen, Gunn-Helen; Prasad, Rashmi B; Groop, Leif C; Sletner, Line; Sommer, Christine; Magnus, Maria C; Gjessing, Håkon K; Harris, Jennifer R; Magnus, Per; Håberg, Siri E; Jugessur, Astanand; Bohlin, Jon (BioMed Central, 2020)
    Abstract Background Epigenetic clocks have been recognized for their precise prediction of chronological age, age-related diseases, and all-cause mortality. Existing epigenetic clocks are based on CpGs from the Illumina HumanMethylation450 BeadChip (450 K) which has now been replaced by the latest platform, Illumina MethylationEPIC BeadChip (EPIC). Thus, it remains unclear to what extent EPIC contributes to increased precision and accuracy in the prediction of chronological age. Results We developed three blood-based epigenetic clocks for human adults using EPIC-based DNA methylation (DNAm) data from the Norwegian Mother, Father and Child Cohort Study (MoBa) and the Gene Expression Omnibus (GEO) public repository: 1) an Adult Blood-based EPIC Clock (ABEC) trained on DNAm data from MoBa (n = 1592, age-span: 19 to 59 years), 2) an extended ABEC (eABEC) trained on DNAm data from MoBa and GEO (n = 2227, age-span: 18 to 88 years), and 3) a common ABEC (cABEC) trained on the same training set as eABEC but restricted to CpGs common to 450 K and EPIC. Our clocks showed high precision (Pearson correlation between chronological and epigenetic age (r) > 0.94) in independent cohorts, including GSE111165 (n = 15), GSE115278 (n = 108), GSE132203 (n = 795), and the Epigenetics in Pregnancy (EPIPREG) study of the STORK Groruddalen Cohort (n = 470). This high precision is unlikely due to the use of EPIC, but rather due to the large sample size of the training set. Conclusions Our ABECs predicted adults’ chronological age precisely in independent cohorts. As EPIC is now the dominant platform for measuring DNAm, these clocks will be useful in further predictions of chronological age, age-related diseases, and mortality.
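The precision metric quoted above, the Pearson correlation r between chronological and epigenetic age, reduces to the standard correlation formula. A minimal sketch with hypothetical ages and clock predictions (not MoBa or GEO data):

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

chrono = [22, 30, 41, 55, 63, 70]    # chronological ages (years)
epigen = [24, 28, 43, 53, 66, 71]    # hypothetical clock predictions
print(pearson_r(chrono, epigen))
```

Note that r measures precision (tightness of the linear relationship) but not accuracy; a clock that is systematically off by several years can still have r close to 1, which is why age-prediction papers usually also report median absolute error.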
  • Mokart, Djamel; Darmon, Michael; Schellongowski, Peter; Pickkers, Peter; Soares, Marcio; Rello, Jordi; Bauer, Philippe R; van de Louw, Andry; Lemiale, Virginie; Taccone, Fabio S; Martin-Loeches, Ignacio; Salluh, Jorge; Rusinova, Katerina; Mehta, Sangeeta; Antonelli, Massimo; Kouatchet, Achille; Barratt-Due, Andreas; Valkonen, Miia; Landburg, Precious P; Bukan, Ramin B; Pène, Frédéric; Metaxa, Victoria; Burghi, Gaston; Saillard, Colombe; Nielsen, Lene B; Canet, Emmanuel; Bisbal, Magali; Azoulay, Elie (Springer International Publishing, 2020)
    Abstract Background The impact of neutropenia in critically ill immunocompromised patients admitted in a context of acute respiratory failure (ARF) remains uncertain. The primary objective was to assess the prognostic impact of neutropenia on outcomes of these patients. The secondary objective was to assess the etiology of ARF according to neutropenia. Methods We performed a post hoc analysis of a prospective multicenter multinational study from 23 ICUs belonging to the Nine-I network. Between November 2015 and July 2016, all adult immunocompromised patients with ARF admitted to the ICU were included in the study. Adjusted analyses included: (1) a hierarchical model with center as random effect; (2) a propensity score (PS) matched cohort; and (3) adjusted analysis in the matched cohort. Results Overall, 1481 patients were included in this study, of which 165 had neutropenia at ICU admission (11%). The distribution of ARF etiologies differed significantly between neutropenic and non-neutropenic patients, the leading etiology being bacterial pneumonia (48% vs 27% in neutropenic and non-neutropenic patients, respectively). Initial oxygenation strategy was standard supplemental oxygen in 755 patients (51%), high-flow nasal oxygen in 165 (11%), non-invasive ventilation in 202 (14%) and invasive mechanical ventilation in 359 (24%). Before adjustment, hospital mortality was significantly higher in neutropenic patients (54% vs 42%; p = 0.006). After adjustment for confounders and center effect, neutropenia was no longer associated with outcome (OR 1.40, 95% CI 0.93–2.11). Similar results were observed after matching (52% vs 46%, respectively; p = 0.35) and after adjustment in the matched cohort (OR 1.04; 95% CI 0.63–1.72). Conclusion Neutropenia at ICU admission is not associated with hospital mortality in this cohort of critically ill immunocompromised patients admitted for ARF. In neutropenic patients, the main ARF etiologies are bacterial and fungal infections.
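Propensity-score matching of the kind used in the adjusted analyses above is often implemented as greedy 1:1 nearest-neighbour matching within a caliper: each treated (here, neutropenic) subject is paired with the closest unmatched control, and pairs beyond the caliper are discarded. The scores below are hypothetical, and the study's actual matching algorithm may differ:

```python
def ps_match(treated, controls, caliper=0.05):
    """Greedy 1:1 nearest-neighbour matching on propensity scores.
    Returns (treated_index, control_index) pairs within the caliper."""
    available = dict(enumerate(controls))    # unmatched controls
    pairs = []
    for ti, t in enumerate(treated):
        if not available:
            break
        cj = min(available, key=lambda j: abs(available[j] - t))
        if abs(available[cj] - t) <= caliper:
            pairs.append((ti, cj))
            del available[cj]                # each control used at most once
    return pairs

treated  = [0.31, 0.42, 0.90]               # hypothetical propensity scores
controls = [0.30, 0.44, 0.47, 0.10]
print(ps_match(treated, controls))
```

In this toy example the third treated subject (score 0.90) has no control within the caliper and is left unmatched, which illustrates how matching trades sample size for comparability.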
  • Gormley, Gerard J; Kajamaa, Anu; Conn, Richard L; O’Hare, Sarah (BioMed Central, 2020)
    Abstract Background The healthcare needs of our societies are continually changing and evolving. In order to meet these needs, healthcare provision has to be dynamic and reactive to provide the highest standards of safe care. Therefore, there is a continual need to generate new evidence and implement it within healthcare contexts. In recent times, in situ simulation has proven to be an important educational modality to accelerate individuals’ and teams’ skills and adaptability to deliver care in local contexts. However, due to the increasing complexity of healthcare, including in community settings, an expanded, theoretically informed view of in situ simulation is needed as a form of education that can drive organizational as well as individual learning. Main body Cultural-historical activity theory (CHAT) provides us with analytical tools to recognize and analyse complex health care systems. Making visible the key elements of an in situ simulation process and their interconnections, CHAT facilitates development of a system-level view of the needs of change. Conclusion In this paper, we theorize how CHAT could help guide in situ simulation processes, to generate greater insights beyond the specific simulation context and bring about meaningful transformation of an organizational activity.
  • Duplouy, Anne; Pranter, Robin; Warren-Gash, Haydon; Tropek, Robert; Wahlberg, Niklas (BioMed Central, 2020)
    Abstract Background Phylogenetically closely related strains of maternally inherited endosymbiotic bacteria are often found in phylogenetically divergent, and geographically distant insect host species. The interspecies transfer of the symbiont Wolbachia has been thought to have occurred repeatedly, facilitating its observed global pandemic. Few ecological interactions have been proposed as potential routes for the horizontal transfer of Wolbachia within natural insect communities. These routes, however, are likely to act only at the local scale, and how they may support the global distribution of some Wolbachia strains remains unclear. Results Here, we characterize the Wolbachia diversity in butterflies from the tropical forest regions of central Africa to discuss transfer at both local and global scales. We show that numerous species from both the Mylothris (family Pieridae) and Bicyclus (family Nymphalidae) butterfly genera are infected with similar Wolbachia strains, despite only minor interclade contacts across the life cycles of the species within their partially overlapping ecological niches. The phylogenetic distance and differences in resource use between these genera rule out the role of ancestry, hybridization, and shared host-plants in the interspecies transfer of the symbiont. Furthermore, we could not identify any shared ecological factors to explain the presence of the strains in other arthropod species from other habitats, or even ecoregions. Conclusion Only systematic surveys of Wolbachia strains from entire species communities may offer the material currently lacking for understanding how Wolbachia transfers between highly different and unrelated hosts, as well as across environmental scales.
  • Renko, Elina; Knittle, Keegan; Palsola, Minttu; Lintunen, Taru; Hankonen, Nelli (BioMed Central, 2020)
    Abstract Background To achieve real-world impacts, behavior change interventions need to be scaled up and broadly implemented. Implementation is challenging, however, and the factors influencing successful implementation are not fully understood. This study describes the nationwide implementation of a complex theory-based program targeting physical activity and sedentary behavior in vocational schools (Let’s Move It; LMI). The implementation primarily involved systematic, theory-based training and a user manual for school staff. We explore how the perceived acceptability of this training (in line with the Theoretical Framework of Acceptability) relates to (un)successful implementation. The study evaluates (1) the experienced acceptability of the training and the anticipated acceptability of later delivering the program; (2) reach and implementation, including adaptations and barriers; (3) whether acceptability ratings predict teachers’ intentions for implementation. Methods Upper secondary school staff from vocational and high schools (n = 194) enrolled in a two-part training, covering implementation of the LMI program and training in motivational interaction styles. One hundred fifty-one participants attended both parts of the training. Participants reported their perceived acceptability of the training and their implementation efforts in online questionnaires at baseline, after training sessions and at long-term follow-up. Qualitative data (open-ended questions) were analysed with content analysis to collate responses. Quantitative data analyses involved correlations and logistic regression. Results Participants rated the training as highly acceptable on all dimensions (average ratings exceeded 4.0 on a 5-point scale). The implementation reached at least 6100 students and 341 school classes. Most teachers intended to continue program implementation.
Acceptability ratings explained 51.7% of teachers’ intentions to implement the student program (χ² = 30.08; df = 8; p < .001), with affective attitude, perceived effectiveness and self-efficacy being the most influential. Teachers commonly reported condensing program content, and reported lack of time and collegial support as common barriers to implementation. Conclusion High acceptability and reach of the training indicate strong potential for implementation success. Multiple facets of acceptability seem important to successful implementation. Future research should explore ways to improve acceptability, thereby promoting successful implementation in real-world settings.
  • Gordevičius, Juozas; Narmontė, Milda; Gibas, Povilas; Kvederavičiūtė, Kotryna; Tomkutė, Vita; Paluoja, Priit; Krjutškov, Kaarel; Salumets, Andres; Kriukienė, Edita (BioMed Central, 2020)
    Abstract Background Massively parallel sequencing of maternal cell-free DNA (cfDNA) is widely used to test for fetal genetic abnormalities in non-invasive prenatal testing (NIPT). However, sequencing-based approaches remain costly. Building upon previous knowledge that the placenta, the main source of fetal circulating DNA, is hypomethylated in comparison to the maternal tissue counterparts of cfDNA, we propose that targeting either unmodified or 5-hydroxymethylated CG sites specifically enriches fetal genetic material and reduces the number of required sequencing reads, thereby decreasing the cost of a test. Methods We employed the uTOP-seq and hmTOP-seq approaches, which combine covalent derivatization of unmodified or hydroxymethylated CG sites, respectively, with next-generation sequencing or quantitative real-time PCR. Results We detected increased 5-hydroxymethylcytosine (5hmC) levels in fetal chorionic villi (CV) tissue samples as compared with peripheral blood. Using our previously developed uTOP-seq and hmTOP-seq approaches, we obtained whole-genome uCG and 5hmCG maps of 10 CV tissue and 38 cfDNA samples in total. Our results indicated that, in contrast to conventional whole-genome sequencing, such epigenomic analysis highly specifically enriches fetal DNA fragments from maternal cfDNA. While both our approaches yielded 100% accuracy in detecting Down syndrome in fetuses, hmTOP-seq maintained this accuracy at ultra-low sequencing depths using only one million reads. We identified 2164 and 1589 placenta-specific differentially modified and 5-hydroxymethylated regions, respectively, on chromosome 21, as well as 3490 and 2002 Down syndrome-specific differentially modified and 5-hydroxymethylated regions, respectively, which can be used as biomarkers for identification of Down syndrome or other epigenetic diseases of a fetus. 
Conclusions The uTOP-seq and hmTOP-seq approaches provide a cost-efficient and sensitive epigenetic analysis of fetal abnormalities in maternal cfDNA. The results demonstrated that trisomy 21 (T21) fetuses contain a perturbed epigenome and also indicated that fetal cfDNA might originate from fetal tissues other than the placental chorionic villi. Robust covalent derivatization followed by targeted analysis of fetal DNA by sequencing or qPCR presents an attractive strategy that could help achieve superior sensitivity and specificity in prenatal diagnostics.
  • Griffith, Evan F; Pius, Loupa; Manzano, Pablo; Jost, Christine C (Springer Berlin Heidelberg, 2020)
    Abstract COVID-19 is a global pandemic that continues to spread around the world, including to Africa, where cases are steadily increasing. The Africa Centres for Disease Control and Prevention is leading the pandemic response in Africa, with direction from the World Health Organization guidelines for critical preparedness, readiness, and response actions. These are written for national governments and lack nuance for population and local differences. In the Greater Horn of Africa, conditions unique to pastoralists, such as inherent mobility and limited health and service infrastructure, will influence the dynamics of COVID-19. In this paper, we present a One Health approach to the pandemic, consisting of interdisciplinary and intersectoral collaboration focused on the determinants of health and health outcomes amongst pastoralists. Our contextualized public health strategy includes community One Health teams and suggestions for where to implement targeted public health measures. We also analyse the interaction of COVID-19 impacts, including those caused directly by the disease and those that result from control efforts, with ongoing shocks and vulnerabilities in the region (e.g. desert locusts, livestock disease outbreaks, floods, conflict, and development displacement). We give recommendations on how to prepare for and respond to the COVID-19 pandemic and its secondary impacts on pastoral areas. Given that the full impact of COVID-19 on pastoral areas is currently unknown, our health recommendations focus on disease prevention and understanding disease epidemiology. We emphasize targeting pastoral toponymies with public health measures to secure market access and mobility while combating the direct health impacts of COVID-19. 
A contextualized approach for the COVID-19 public health response in pastoral areas in the Greater Horn of Africa, including how the pandemic will interact with existing shocks and vulnerabilities, is required for an effective response, while protecting pastoral livelihoods and food, income, and nutrition security.
  • Äärelä, Linnea; Hiltunen, Pauliina; Soini, Tea; Vuorela, Nina; Huhtala, Heini; Nevalainen, Pasi I; Heikinheimo, Markku; Kivelä, Laura; Kurppa, Kalle (BioMed Central, 2020)
    Abstract Background The introduction of nitisinone and newborn screening (NBS) has transformed the treatment of type 1 tyrosinemia, but the effects of these changes on long-term outcomes remain obscure. The predictors of later complications, the significance of drug levels and the normalization of laboratory and imaging findings are also poorly known. We investigated these issues in a nationwide study. Results Type 1 tyrosinemia was diagnosed in 22 children in 1978–2019 in Finland. Incidence was 1/90,102, with significant enrichment in South Ostrobothnia (1/9990). Median age at diagnosis was 5 (range 0.5–36) months, 55% were girls and 13 had a homozygous Trp262X mutation. Four patients were detected through screening and 18 clinically, their main findings being liver failure (50% vs. 100%, respectively, p = 0.026), ascites (0% vs. 53%, p = 0.104), renal tubulopathy (0% vs. 65%, p = 0.035), rickets (25% vs. 65%, p = 0.272), growth failure (0% vs. 66%, p = 0.029), thrombocytopenia (25% vs. 88%, p = 0.028) and anaemia (0% vs. 47%, p = 0.131). One patient was treated with diet, seven with transplantation and 14 with nitisinone. Three late-diagnosed (6–33 months) nitisinone-treated patients later needed transplantation. Kidney dysfunction (86% vs. 7%, p = 0.001), hypertension (57% vs. 7%, p = 0.025) and osteopenia/osteoporosis (71% vs. 14%, p = 0.017) were more frequent in transplanted than in nitisinone-treated patients. Blood/serum alpha-fetoprotein decreased rapidly on nitisinone in all but one patient, who later developed intrahepatic hepatocellular carcinoma. Liver values normalized within 31 months and other laboratory values, except thrombocytopenia, within 18 months. Imaging findings normalized in 3–56 months, excluding five patients with liver or splenic abnormalities. A low mean nitisinone concentration was associated with a higher risk of severe complications (r = 0.758, p = 0.003) despite undetectable urine succinylacetone. 
Conclusions Prognosis of type 1 tyrosinemia has improved in the era of nitisinone, and NBS seems to provide further benefits. Nevertheless, the long-term risk for complications remains, particularly in the case of late diagnosis and/or insufficient nitisinone levels.
  • Zhang, Zhe; Liu, Lei; Kucukoglu, Melis; Tian, Dongdong; Larkin, Robert M; Shi, Xueping; Zheng, Bo (BioMed Central, 2020)
    Abstract Background The CLV3/ESR-RELATED (CLE) gene family encodes small secreted peptides (SSPs) and plays vital roles in plant growth and development by promoting cell-to-cell communication. The prediction and classification of CLE genes are challenging because of their low sequence similarity. Results We developed a machine learning-aided method for predicting CLE genes by using a CLE motif-specific residue score matrix and a novel clustering method based on the Euclidean distance of 12 amino acid residues from the CLE motif in a site-weight-dependent manner. In total, 2156 CLE candidates, including 627 novel candidates, were predicted from 69 plant species. The results from our CLE motif-based clustering are consistent with previous reports that used the entire pre-propeptide. Characterization of the CLE candidates provided systematic statistics on protein lengths, signal peptides, relative motif positions, amino acid compositions of different parts of the CLE precursor proteins, and decisive factors for CLE prediction. The approach taken here provides information on the evolution of the CLE gene family and provides evidence that the CLE and IDA/IDL genes share a common ancestor. Conclusions Our new approach is applicable to SSPs and other proteins with short conserved domains and hence provides a useful tool for gene prediction, classification and evolutionary analysis.
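The core distance measure described above, a Euclidean distance over the 12 motif positions with per-site weights, can be sketched as follows. The abstract does not give the authors' residue encoding, weights, or score matrix, so the property scale (Kyte-Doolittle hydropathy values) and the weight vector below are hypothetical stand-ins chosen purely for illustration.

```python
import math

# Hypothetical per-residue property scores (Kyte-Doolittle hydropathy),
# standing in for the paper's motif-specific residue score matrix.
PROPERTY = {"R": -4.5, "T": -0.7, "V": 4.2, "P": -1.6, "S": -0.8,
            "G": -0.4, "D": -3.5, "L": 3.8, "H": -3.2, "N": -3.5}

def encode(motif):
    """Map a 12-residue CLE motif to a numeric vector."""
    assert len(motif) == 12
    return [PROPERTY[aa] for aa in motif]

def site_weighted_distance(motif_a, motif_b, weights):
    """Euclidean distance over the 12 motif sites, each squared
    difference scaled by that site's weight."""
    a, b = encode(motif_a), encode(motif_b)
    return math.sqrt(sum(w * (x - y) ** 2
                         for w, x, y in zip(weights, a, b)))

# Illustrative weights: conserved positions (e.g. the terminal residues
# of many CLE motifs) are up-weighted; one value per motif site.
WEIGHTS = [1.5, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 2.0, 2.0]

# CLV3's mature CLE motif versus a hypothetical variant.
d = site_weighted_distance("RTVPSGPDPLHH", "RLVPSGPNPLHN", WEIGHTS)
```

Clustering would then proceed on the resulting pairwise distance matrix; any standard algorithm (e.g. hierarchical clustering) accepts such precomputed distances.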
  • Hietikko, Ronja; Kilpeläinen, Tuomas P; Kenttämies, Anu; Ronkainen, Johanna; Ijäs, Kirsty; Lind, Kati; Marjasuo, Suvi; Oksala, Juha; Oksanen, Outi; Saarinen, Tuomas; Savolainen, Ritja; Taari, Kimmo; Tammela, Teuvo L J; Mirtti, Tuomas; Natunen, Kari; Auvinen, Anssi; Rannikko, Antti (BioMed Central, 2020)
    Abstract Background The aim of this study was to investigate the potential impact of prostate magnetic resonance imaging (MRI)-related interreader variability on a population-based randomized prostate cancer screening trial (ProScreen). Methods From January 2014 to January 2018, 100 men aged 50–63 years with clinical suspicion of prostate cancer (PCa) underwent MRI in Helsinki University Hospital. Nine radiologists in two ProScreen trial centers individually reviewed the pseudonymized MRI scans of all 100 men. All 100 men were biopsied, and PCa status was determined according to a histological composite variable comprising radical prostatectomy histology (N = 38) or biopsy result within 1 year of imaging (N = 62). Fleiss’ kappa (κ) was used to estimate the combined agreement between all individual radiologists. Sample data were subsequently extrapolated to 1000-men subgroups of the ProScreen cohort. Results Altogether, 89% of the 100-men sample were diagnosed with PCa within a median of 2.4 years of follow-up. Clinically significant PCa (csPCa) was identified in 76% of the men. For all PCa, mean sensitivity was 79% (SD ±10%, range 62–96%) and mean specificity 60% (SD ±22%, range 27–82%). For csPCa (Gleason Grade Group 2–5), MRI was equally sensitive (mean 82%, SD ±9%, range 67–97%) but less specific (mean 47%, SD ±20%, range 21–75%). Interreader agreement for any lesion was fair (κ 0.40), and for PI-RADS 4–5 lesions it was moderate (κ 0.60). Upon extrapolating the average sensitivity and specificity to a screening-positive subgroup of 1000 men from ProScreen with a 30% prevalence of csPCa, 639 men would be biopsied. Of these, 244 men would be true positive and 395 false positive. Moreover, 361 men would not be referred to biopsy, and among these, 56 csPCas would be missed. The variation among the radiologists was broad, as the least sensitive radiologist would have twice as many men biopsied and almost three times more men would undergo unnecessary biopsies. 
Although the most sensitive radiologist would miss only 2.6% of csPCa cases (false negatives), the least sensitive would miss every third. Conclusions Interreader agreement was fair to moderate. The role of MRI in the ongoing ProScreen trial is crucial and has a substantial impact on the screening process.
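The extrapolation in this abstract is a direct application of sensitivity and specificity at a fixed prevalence. A minimal sketch, plugging in the rounded mean csPCa values reported above (the abstract's exact figures of 639 biopsied, 244 true positives, 395 false positives and 56 missed differ slightly, presumably because they were derived from unrounded, per-reader estimates):

```python
def screening_counts(n, prevalence, sensitivity, specificity):
    """Expected confusion-matrix counts when a test with the given
    sensitivity and specificity is applied to n men at this prevalence."""
    diseased = n * prevalence            # men with csPCa
    healthy = n - diseased               # men without csPCa
    tp = sensitivity * diseased          # cancers correctly flagged
    fn = diseased - tp                   # cancers missed by MRI
    fp = (1 - specificity) * healthy     # men biopsied unnecessarily
    tn = healthy - fp
    return tp, fp, fn, tn

# Mean csPCa figures from the abstract: sensitivity 82%, specificity 47%,
# 30% prevalence among 1000 screening-positive men.
tp, fp, fn, tn = screening_counts(1000, 0.30, 0.82, 0.47)
biopsied = tp + fp   # about 617 men referred to biopsy with these means
```

The same function applied with each individual reader's sensitivity and specificity reproduces the spread described in the abstract, since a less specific reader inflates `fp` and a less sensitive one inflates `fn`.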
