Articles from BioMed Central


Recent Submissions

  • Pesonen, Ida; Carlson, Lisa; Murgia, Nicola; Kaarteenaho, Riitta; Sköld, Carl M; Myllärniemi, Marjukka; Ferrara, Giovanni (BioMed Central, 2018)
    Abstract Background Idiopathic pulmonary fibrosis (IPF) is characterized by progressive loss of lung function with high mortality within the first 5 years from diagnosis. Between 2011 and 2014, two drugs, pirfenidone and nintedanib, were approved worldwide for the prevention of IPF progression. National IPF registries have been established in both Finland and Sweden. Our study explored potential differences in the care of IPF between these two countries. Methods Patients included consecutively in the Finnish and Swedish IPF registries from January 1, 2014 through December 31, 2016 were included in the study. Data on demographics and lung function at the time of inclusion were collected. Access to antifibrotic drugs and data on disease outcomes, mortality and the proportion of patients who underwent lung transplantation were collected during a 3-year follow-up. Results One hundred and fifty-two patients from the Finnish and 160 patients from the Swedish IPF cohorts were included in the study. At inclusion, Finnish patients were significantly older than the Swedish patients (74.6 years vs 72.5 years, p = 0.017). The proportion of non-smokers was significantly higher in the Finnish cohort (41.7% vs 26.9%, p = 0.007). Forced vital capacity (FVC), % of predicted (78.2 vs 71.7 for Finnish and Swedish patients, respectively, p = 0.01) and diffusion capacity for carbon monoxide (DLCO), % of predicted (53.3 vs 48.2 for Finnish and Swedish patients, respectively, p = 0.002) were significantly higher in the Finnish cohort than in the Swedish cohort at the time of inclusion. During the 3-year follow-up period, 45 (29.6%) Finnish and 111 (69.4%) Swedish patients were initiated on treatment with an antifibrotic drug (pirfenidone or nintedanib) (p < 0.001). When comparing possible determinants of treatment, patients with a higher FVC % were less likely to start antifibrotic drugs (OR 0.96, 95% CI 0.93–1.00, p = 0.024).
Residence in Sweden was the main determinant of receiving antifibrotic drugs (OR 5.48, 95% CI 2.65–11.33, p < 0.0001). No significant difference in the number of deaths or lung transplantations during the follow-up period was found. Conclusions This study highlights differences in how IPF patients are treated in Finland and Sweden. How these differences will influence the long-term outcomes of these patients is unknown.
  • Markkula, Niina; Cabieses, Baltica; Lehti, Venla; Uphoff, Eleonora; Astorga, Sofia; Stutzin, Francisca (BioMed Central, 2018)
    Abstract Background Migrant children have specific health needs and may face difficulties in accessing health care, but not enough is known about their use of health services. This study aims to describe patterns of health service use among international migrant children and differences relative to the respective native populations. Methods The electronic databases PubMed and Web of Science, references of identified publications, and websites of relevant international agencies were searched. We included observational studies published between 2006 and 2016 that reported use of formal health services by migrant children (0–18 years), including first- and second-generation migrants. Data on study characteristics, study theme, main outcome and study quality were extracted. Results One hundred seven full texts were included in the review. Of the studies that reported comparable outcomes, half (50%) indicated less use of healthcare by migrants compared with non-migrants; 25% reported no difference, 18% reported greater use, and 7% did not report this outcome. Conclusions varied by theme: “less use” was most common in the categories “general access to care”, “primary care” and “oral health”, whereas for use of emergency rooms and hospitalisations the most common conclusion was “greater use”. Conclusions Migrant children appear to use most types of healthcare services less than native populations, with the exception of emergency and hospital services. Systematic review registration PROSPERO systematic review registration number: CRD42016039876.
  • Rantonen, J.; Karppinen, J.; Vehtari, A.; Luoto, S.; Viikari-Juntura, E.; Hupli, M.; Malmivaara, A.; Taimela, S. (BioMed Central, 2018)
    Abstract Background We assessed the effectiveness of three interventions aimed at reducing non-acute low back pain (LBP) related symptoms in the occupational health setting. Methods Based on a survey (n = 2480; response rate 71%) on LBP, we selected a cohort of 193 employees who reported moderate LBP (Visual Analogue Scale VAS > 34 mm) and fulfilled at least one of the following criteria during the past 12 months: sciatica, recurrence of LBP ≥ 2 times, LBP ≥ 2 weeks, or previous sickness absence. A random sample was extracted from the cohort as a control group (Control, n = 50), representing the natural course of LBP. The remaining 143 employees were invited to participate in a randomised controlled trial (RCT) with three 1:1:1 allocated parallel intervention arms: multidisciplinary rehabilitation (Rehab, n = 43), progressive exercises (Physio, n = 43) and self-care advice (Advice, n = 40). Seventeen employees declined participation in the intervention. The primary outcome measures were physical impairment (PHI), LBP intensity (Visual Analogue Scale), health-related quality of life (QoL), and accumulated sickness absence days. We imputed missing values with a multiple imputation procedure. We assessed all comparisons between the intervention groups and the Control group by analysing questionnaire outcomes at 2 years with ANOVA and sickness absence at 4 years using a negative binomial model with a logarithmic link function. Results Mean differences between the Rehab and Control groups were −3 [95% CI −5 to −1] for PHI, −13 [−24 to −1] for pain intensity, and 0.06 [0.00 to 0.12] for QoL. Mean differences between the Physio and Control groups were −3 [95% CI −5 to −1] for PHI, −13 [−29 to 2] for pain intensity, and 0.07 [0.01 to 0.13] for QoL. The main effect sizes ranged from 0.4 to 0.6. The interventions were not effective in reducing sickness absence.
Conclusions The Rehab and Physio interventions improved health-related quality of life and decreased low back pain and physical impairment in non-acute, moderate LBP, but we found no differences between the Advice and Control group results. No effect on sickness absence was observed. Trial registration Number NCT00908102
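The sickness-absence comparison above uses a negative binomial model with a logarithmic link, under which an exponentiated coefficient is a rate ratio. A minimal sketch of that interpretation, with purely hypothetical coefficients (b0 and b1 are illustrative values, not estimates from this trial):

```python
import math

# Hypothetical coefficients for log(E[absence days]) = b0 + b1 * rehab
# (illustrative values only, not estimates from the trial above).
b0 = math.log(12.0)   # baseline: ~12 expected absence days in Control
b1 = math.log(0.85)   # treatment coefficient on the log scale

mu_control = math.exp(b0)            # expected days in the Control group
mu_rehab = math.exp(b0 + b1)         # expected days in the Rehab group
rate_ratio = mu_rehab / mu_control   # exp(b1): the multiplicative effect

print(round(mu_control, 1), round(mu_rehab, 1), round(rate_ratio, 2))
```

With a log link, covariate effects are multiplicative on the expected count, which is why such models report incidence rate ratios rather than mean differences.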
  • Pakkanen, Soile A E; de Vries, Annemarie; Raekallio, Marja R; Mykkänen, Anna K; Palviainen, Mari J; Sankari, Satu M; Vainio, Outi M (BioMed Central, 2018)
    Abstract Background Romifidine, an α-2 adrenoceptor agonist, is a widely used sedative in equine medicine. Besides the desired sedative and analgesic actions, α-2 adrenoceptor agonists have side effects such as alterations in the plasma concentrations of glucose and certain stress-related hormones and metabolites in various species. Vatinoxan (previously known as MK-467), in turn, is an antagonist of α-2 adrenoceptors. Because vatinoxan does not cross the blood–brain barrier in significant amounts, it has only a minor effect on the sedation induced by α-2 adrenoceptor agonists. Vatinoxan has previously been shown to prevent the hyperglycaemia, the increase in plasma lactate concentration and the decreases in insulin and non-esterified free fatty acids (FFAs) caused by α-2 adrenoceptor agonists in different species. The aim of our study was to investigate the effects of intravenous romifidine and vatinoxan, alone and combined, on plasma concentrations of glucose and some stress-related hormones and metabolites in horses. Results Plasma glucose concentration differed between all intravenous treatments: romifidine (80 μg/kg; ROM), vatinoxan (200 μg/kg; V) and the combination of the two (ROM + V). Glucose concentration was highest after ROM and lowest after V. Serum FFA concentration was higher after V than after ROM or ROM + V. The baseline serum concentration of insulin varied widely between individual horses. No differences were detected in serum insulin, cortisol or plasma adrenocorticotropic hormone (ACTH) concentrations between the treatments. Plasma lactate, serum triglyceride and blood sodium and chloride concentrations did not differ from baseline or between the treatments. Compared with baseline, plasma glucose concentration increased after ROM and ROM + V, serum cortisol, FFA and base excess increased after all treatments, and plasma ACTH concentration increased after V. Serum insulin concentration decreased after V, and blood potassium decreased after all treatments.
Conclusions Romifidine induced hyperglycaemia, which vatinoxan partially prevented despite the variation in baseline serum insulin levels. The effects of romifidine and vatinoxan on insulin concentration in horses need further investigation.
  • Valenzuela, Daniel; Norri, Tuukka; Välimäki, Niko; Pitkänen, Esa; Mäkinen, Veli (BioMed Central, 2018)
    Abstract Background A typical human genome differs from the reference genome at 4–5 million sites. This diversity is increasingly catalogued in repositories such as ExAC/gnomAD, consisting of >15,000 whole-genome and >126,000 exome sequences from different individuals. Despite this enormous diversity, resequencing data workflows are still based on a single human reference genome. Identification and genotyping of genetic variants are typically carried out on short-read data aligned to a single reference, disregarding the underlying variation. Results We propose a new unified framework for variant calling with short-read data utilizing a representation of human genetic variation – a pan-genomic reference. We provide a modular pipeline that can be seamlessly incorporated into existing sequencing data analysis workflows. Our tool is open source and available online: . Conclusions Our experiments show that by replacing a standard human reference with a pan-genomic one, we achieve an improvement in single-nucleotide variant calling accuracy and in short indel calling accuracy over the widely adopted Genome Analysis Toolkit (GATK) in difficult genomic regions.
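The core idea of a pan-genomic reference, as described above, is to match reads against a set of known haplotypes rather than one linear genome. A toy sketch of that idea under a fixed alignment offset (the sequences and function names are illustrative, not the paper's actual pipeline, which works at genome scale):

```python
def mismatches(read, ref, pos):
    """Count mismatches of `read` laid over `ref` at offset `pos`."""
    window = ref[pos:pos + len(read)]
    return sum(a != b for a, b in zip(read, window))

def best_reference(read, references, pos):
    """Pick the haplotype from a small 'pan-genome' that the read fits best."""
    return min(references, key=lambda ref: mismatches(read, ref, pos))

# Two haplotypes of the same locus differing by one SNV (A vs G at index 4).
haplotypes = ["ACGTATTGCA", "ACGTGTTGCA"]
read = "GTGTT"  # sampled from the second haplotype, offset 2

print(best_reference(read, haplotypes, 2))  # -> ACGTGTTGCA
```

A read carrying a known variant aligns cleanly to the matching haplotype instead of accumulating spurious mismatches against a single reference, which is what improves calling in variant-dense regions.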
  • Rypdal, Veronika; Arnstad, Ellen D; Aalto, Kristiina; Berntson, Lillemor; Ekelund, Maria; Fasth, Anders; Glerup, Mia; Herlin, Troels; Nielsen, Susan; Peltoniemi, Suvi; Zak, Marek; Rygg, Marite; Rypdal, Martin; Nordal, Ellen (BioMed Central, 2018)
    Abstract Background The aim was to develop prediction rules that may guide early treatment decisions based on baseline clinical predictors of long-term unfavorable outcome in juvenile idiopathic arthritis (JIA). Methods In the Nordic JIA cohort, we assessed baseline disease characteristics as predictors of the following outcomes 8 years after disease onset: non-achievement of remission off medication according to the preliminary Wallace criteria; functional disability assessed by the Childhood Health Assessment Questionnaire (CHAQ) and the Physical Summary Score (PhS) of the Child Health Questionnaire; and articular damage assessed by the Juvenile Arthritis Damage Index-Articular (JADI-A). Multivariable models were constructed, and cross-validations were performed by repeated partitioning of the cohort into training sets for developing prediction models and validation sets for testing predictive ability. Results The total cohort comprised 423 children. Remission status was available for 410 children: 244 (59.5%) of these did not achieve remission off medication at the final study visit. Functional disability was present in 111/340 (32.7%) children assessed by CHAQ and 40/199 (20.1%) assessed by PhS, and joint damage was found in 29/216 (13.4%). Model performance was acceptable for making predictions of long-term outcome. In validation sets, the areas under the receiver operating characteristic (ROC) curves (AUCs) were 0.78 (IQR 0.72–0.82) for non-achievement of remission off medication, 0.73 (IQR 0.67–0.76) for functional disability assessed by CHAQ, 0.74 (IQR 0.65–0.80) for functional disability assessed by PhS, and 0.73 (IQR 0.63–0.76) for joint damage using JADI-A. Conclusion The feasibility of making long-term predictions of JIA outcome based on early clinical assessment is demonstrated. The prediction models have acceptable precision and require only readily available baseline variables. Further testing in other cohorts is warranted.
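The validation scheme above, repeated random partitioning with AUC computed on held-out sets, can be sketched in a few lines. Below, a toy example in which the risk score is a single baseline feature, so no model fitting is needed; the data and scoring rule are invented, and the study's actual models are multivariable:

```python
import random

def auc(labels, scores):
    """Rank-based AUC: the probability that a random positive case
    receives a higher score than a random negative case."""
    pos = [s for y, s in zip(labels, scores) if y == 1]
    neg = [s for y, s in zip(labels, scores) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

random.seed(1)
# Toy cohort: one baseline feature per child, outcome loosely tied to it.
data = [(x, 1 if x + random.gauss(0, 1.0) > 0 else 0)
        for x in [random.gauss(0, 1.0) for _ in range(200)]]

aucs = []
for _ in range(50):                      # repeated random partitioning
    random.shuffle(data)
    validation = data[:50]               # hold out a validation set
    labels = [y for _, y in validation]
    scores = [x for x, _ in validation]  # risk score = the raw feature
    if 0 < sum(labels) < len(labels):    # both classes must be present
        aucs.append(auc(labels, scores))

aucs.sort()
print("median validation AUC:", round(aucs[len(aucs) // 2], 2))
```

Reporting the spread of the held-out AUCs across repetitions, as the study does with IQRs, guards against an optimistic estimate from a single lucky split.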
  • Presseau, Justin; Mackintosh, Joan; Hawthorne, Gillian; Francis, Jill J; Johnston, Marie; Grimshaw, Jeremy M; Steen, Nick; Coulthard, Tom; Brown, Heather; Kaner, Eileen; Elovainio, Marko; Sniehotta, Falko F (BioMed Central, 2018)
    Abstract Background National diabetes audits in the UK show room for improvement in the quality of care delivered to people with type 2 diabetes in primary care. Systematic reviews of quality improvement interventions show that such approaches can be effective, but there is wide variability between trials and little understanding of what explains this variability. A national cohort study of primary care across 99 UK practices identified modifiable predictors of healthcare professionals’ prescribing, advising and foot examination. Our objective was to evaluate the effectiveness of an implementation intervention to improve six guideline-recommended health professional behaviours in managing type 2 diabetes in primary care: prescribing for blood pressure and glycaemic control, providing physical activity and nutrition advice, and providing updated diabetes education and foot examination. Methods Two-armed cluster randomised trial involving 44 general practices. Primary outcomes (at 12 months’ follow-up): from electronic medical records, the proportion of patients receiving additional prescriptions for blood pressure and insulin initiation for glycaemic control and having a foot examination; and from a patient survey of a random sample of 100 patients per practice, reported receipt of updated diabetes education and physical activity and nutrition advice. Results The implementation intervention did not lead to statistically significant improvement in any of the six clinical behaviours. 1,138,105 prescriptions were assessed. Intervention (29% to 37% of patients) and control arms (31% to 35%) increased insulin initiation relative to baseline but were not statistically significantly different at follow-up (IRR 1.18, 95% CI 0.95–1.48). Intervention (45% to 53%) and control practices (45% to 50%) increased blood pressure prescription from baseline to follow-up but were not statistically significantly different at follow-up (IRR 1.05, 95% CI 0.96 to 1.16).
Intervention (75% to 78%) and control practices (74% to 79%) increased foot examination relative to baseline; control practices increased statistically significantly more (OR 0.84, 95% CI 0.75–0.94). Fewer patients in intervention (33%) than control practices (40%) reported receiving updated diabetes education (OR = 0.74, 95% CI 0.57–0.97). No statistically significant differences were observed in patient reports of having had a discussion about nutrition (intervention = 73%; control = 72%; OR = 0.98, 95% CI 0.59–1.64) or physical activity (intervention = 57%; control = 62%; OR = 0.79, 95% CI 0.56–1.11). Development and delivery of the intervention cost £1191 per practice. Conclusions There was no measurable benefit to practices’ participation in this intervention. Despite the widespread use of outreach interventions worldwide, there is a need to better understand which techniques, at which intensity, are optimally suited to address the multiple clinical behaviours involved in improving care for type 2 diabetes. Trial registration ISRCTN, ISRCTN66498413. Registered April 4, 2013
  • Idehen, Esther E; Koponen, Päivikki; Härkänen, Tommi; Kangasniemi, Mari; Pietilä, Anna-Maija; Korhonen, Tellervo (BioMed Central, 2018)
    Abstract Background Cervical cancer is currently ranked as the fourth most commonly diagnosed cancer in women globally. A higher incidence has been reported in low- and middle-income countries, and the disease poses significant public health challenges. Evidence suggests that this disease is preventable by means of regular screening using the Papanicolaou (Pap) test. However, limited knowledge exists about disparities in cervical screening participation among immigrants compared with non-immigrants in countries with universal cervical screening programmes. We aimed to examine disparities in cervical screening participation among women of Russian, Somali, and Kurdish origin in Finland, comparing them with the general Finnish population (Finns). We controlled for differences in several socio-demographic and health-related variables as potential confounders. Methods We employed data from the Finnish Migrant Health and Well-being Study 2010–2012 and the National Health 2011 Survey. Data collection involved face-to-face interviews. Data on screening participation in the previous five years from women aged 29–60 were available from 537 immigrants (257 Russians, 113 Somalis, 167 Kurds) and from 436 Finns. For statistical analyses, we used multiple logistic regression. Results Age-adjusted screening participation rates were as follows: Russians 79% (95% CI 72.9–84.4), Somalis 41% (95% CI 31.4–50.1), and Kurds 64% (95% CI 57.2–70.8), compared with 94% (95% CI 91.4–95.9) among Finns. After additionally adjusting for socio-demographic and health-related confounders, all the immigrant groups showed a significantly lower likelihood of screening participation compared with Finns. The odds ratios were as follows: Russians 0.32 (95% CI 0.18–0.58), Somalis 0.10 (95% CI 0.04–0.23), and Kurds 0.17 (95% CI 0.09–0.35). However, when additionally accounting for interactions between country of origin and the confounders, these differences were attenuated.
Conclusions Our results indicate disparities in screening participation among these immigrant groups, with a lower likelihood of participation than in the general Finnish population. To improve equity in cervical cancer screening, appropriate culturally tailored intervention programmes for each immigrant group might be beneficial.
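Adjusted odds ratios like those above come from logistic regression, but the unadjusted version can be computed directly from participation rates. A sketch using the age-adjusted rates reported above (a crude calculation only; the study's ORs additionally adjust for confounders, so the values differ):

```python
def odds(p):
    """Odds corresponding to a probability p."""
    return p / (1 - p)

finns = 0.94                       # age-adjusted participation among Finns
rates = {"Russians": 0.79, "Somalis": 0.41, "Kurds": 0.64}

for group, p in rates.items():
    crude_or = odds(p) / odds(finns)   # crude odds ratio vs the Finnish group
    print(group, round(crude_or, 3))
```

Because the Finnish participation rate is very high, its odds are large, which is why even a moderately lower rate translates into a small odds ratio.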
  • Chen, An; Tenhunen, Henni; Torkki, Paulus; Peltokorpi, Antti; Heinonen, Seppo; Lillrank, Paul; Stefanovic, Vedran (BioMed Central, 2018)
    Abstract Background Population-based prenatal screening has become a common and widely available obstetrical practice in the majority of developed countries. Under the patient autonomy principle, women should understand the screening options, be able to take their personal preferences and situations into account, and be encouraged to make autonomous and intentional decisions. The majority of current research focuses on the prenatal screening uptake rate, women’s choice of screening tests, and the factors that influence it. However, little attention has been paid to women’s choice-making processes and experiences in prenatal screening and their influence on choice satisfaction. Understanding women’s choice-making processes and experiences in pregnancy and childbirth is a prerequisite for designing women-centered choice aids and delivering women-centered maternity care. This paper presents a pilot study that aims to investigate women’s experiences when they make choices about screening tests, quantify the choice-making experience, and identify the experiential factors that affect women’s satisfaction with the choices they made. Method We conducted a mixed-methods study at Helsinki and Uusimaa Hospital District (HUS) in Finland. First, women’s choice-making experiences were explored through semi-structured interviews. We interviewed 28 women who participated in prenatal screening. The interview data were processed by thematic analysis. Then, a cross-sectional self-completion survey was designed and implemented, assessing women’s experiences in choice-making and identifying the experiential factors that influence choice satisfaction. Of 940 distributed questionnaires, 185 responses were received. Multivariable linear regression analysis was used to detect the effects of the variables.
Results We developed a set of measurements for women’s choice-making experiences in prenatal screening with seven variables: activeness, informedness, confidence, social pressure, difficulty, positive emotion and negative emotion. Regression revealed that activeness in choice-making (β = 0.176; p = 0.023), confidence in choice-making (β = 0.388; p < 0.001), perceived social pressure (β = −0.306; p < 0.001) and perceived difficulty (β = −0.274; p < 0.001) significantly influenced women’s choice satisfaction in prenatal screening. Conclusions This study explores the experiential dimension of women’s choice-making in prenatal screening. Our results will help service providers design women-centered choice environments. Women’s willingness and capability to make active choices, their preferences, and their reliance on others should be considered in order to facilitate autonomous, confident and satisfying choices.
  • Barok, Mark; Puhka, Maija; Vereb, Gyorgy; Szollosi, Janos; Isola, Jorma; Joensuu, Heikki (BioMed Central, 2018)
    Abstract Background Trastuzumab emtansine (T-DM1) is an antibody-drug conjugate that carries a cytotoxic drug (DM1) to HER2-positive cancer. The target of T-DM1 (HER2) is present also on cancer-derived exosomes. We hypothesized that exosome-bound T-DM1 may contribute to the activity of T-DM1. Methods Exosomes were isolated from the cell culture medium of HER2-positive SKBR-3 and EFM-192A breast cancer cells, HER2-positive SNU-216 gastric cancer cells, and HER2-negative MCF-7 breast cancer cells by serial centrifugations, including two ultracentrifugations, and treated with T-DM1. T-DM1 not bound to exosomes was removed using HER2-coated magnetic beads. Exosome samples were analyzed by electron microscopy, flow cytometry and Western blotting. Binding of T-DM1-containing exosomes to cancer cells and T-DM1 internalization were investigated with confocal microscopy. Effects of T-DM1-containing exosomes on cancer cells were investigated with the AlamarBlue cell proliferation assay and the Caspase-Glo 3/7 caspase activation assay. Results T-DM1 binds to exosomes derived from HER2-positive cancer cells, but not to exosomes derived from HER2-negative MCF-7 cells. HER2-positive SKBR-3 cells accumulated T-DM1 after being treated with T-DM1-containing exosomes, and treatment of SKBR-3 and EFM-192A cells with T-DM1-containing exosomes resulted in growth inhibition and activation of caspases 3 and/or 7. Conclusion T-DM1 binds to exosomes derived from HER2-positive cancer cells, and T-DM1 may be carried to other cancer cells via exosomes, leading to reduced viability of the recipient cells. The results suggest a new mechanism of action for T-DM1, mediated by exosomes derived from HER2-positive cancer.
  • Heponiemi, Tarja; Hyppönen, Hannele; Kujala, Sari; Aalto, Anna-Mari; Vehko, Tuulikki; Vänskä, Jukka; Elovainio, Marko (BioMed Central, 2018)
    Abstract Background Among the important stress factors for physicians nowadays are poorly functioning, time-consuming and inadequate information systems. The present study examined the predictors of physicians’ stress related to information systems (SRIS) among Finnish physicians. The examined predictors were cognitive workload, staffing problems, time pressure, problems in teamwork and job satisfaction, adjusted for baseline levels of SRIS, age, gender and employment sector. Methods The study had a follow-up design with two survey data collection waves, one in 2006 and one in 2015, based on a random sample of Finnish physicians. The sample included 1109 physicians (61.9% women; mean age in 2015, 54.5 years; range 34–72) who provided data on SRIS in both waves. The effects of a) predictor variable levels in 2006 on SRIS in 2015 and b) the change in the predictor variables from 2006 to 2015 on SRIS in 2015 were analysed with linear regression analyses. Results Regression analyses showed that a higher level of cognitive workload in 2006 significantly predicted a higher level of SRIS in 2015 (β = 0.08). The reciprocity of this association was tested with cross-lagged structural equation model analyses, which showed that the direction of the association was from cognitive workload to SRIS, not from SRIS to cognitive workload. Moreover, increases in time pressure (β = 0.16) and problems in teamwork (β = 0.10) were associated with higher levels of SRIS in 2015, whereas an increase in job satisfaction was associated with lower SRIS (β = −0.06). Conclusions According to our results, physicians’ cognitive workload may have long-lasting negative ramifications for how stressful physicians find their health information systems. Thus, organisations should pay attention to physicians’ workload if they wish physicians to master all the systems they need to use.
It is also important to provide physicians with enough time and collegial support for dealing with system-related problems and for learning new systems and system updates.
  • Hewetson, Michael; Venner, Monica; Volquardsen, Jan; Sykes, Ben W; Hallowell, Gayle D; Vervuert, Ingrid; Fosgate, Geoffrey T; Tulamo, Riitta-Mari (BioMed Central, 2018)
    Abstract Background Equine gastric ulcer syndrome (EGUS) is an important cause of morbidity in weanling foals. Many foals are asymptomatic, and the development of an inexpensive screening test to enable an early diagnosis is desirable. The objective of this study was to determine the diagnostic accuracy of blood sucrose for the diagnosis of EGUS in weanling foals. Results Forty-five foals were studied 7 days before and 14 days after weaning. The diagnostic accuracy of blood sucrose for the diagnosis of gastric lesions (GL), glandular lesions (GDL), squamous lesions (SQL) and clinically significant gastric lesions (CSL) at 45 and 90 min after administration of 1 g/kg of sucrose via nasogastric intubation was assessed using receiver operating characteristic (ROC) curves and by calculating the area under the curve (AUC). For each lesion type, sucrose concentration in blood was compared to gastroscopy, and sensitivities (Se) and specificities (Sp) were calculated across a range of sucrose concentrations. Cut-off values were selected manually to optimize Se. Because of concerns over the validity of the gold standard, additional Se, Sp, and lesion prevalence data were subsequently estimated and compared using Bayesian latent class analysis. Using the frequentist approach, the prevalence of GL, GDL, SQL and CSL before weaning was 21, 9, 7 and 8%, respectively, and increased to 98, 59, 97 and 82%, respectively, after weaning. At the selected cut-off, Se ranged from 84 to 95% and Sp ranged from 47 to 71%, depending upon the lesion type and time of sampling. In comparison, estimates of Se and Sp were consistently higher when using the Bayesian approach, with Se ranging from 81 to 97% and Sp ranging from 77 to 97%, depending upon the lesion type and time of sampling. Conclusions Blood sucrose is a sensitive test for detecting EGUS in weanling foals.
Due to its poor specificity, the blood sucrose test is not expected to replace gastroscopy; however, it may represent a clinically useful screening test to identify foals that may benefit from gastroscopy. Bayesian latent class analysis represents an alternative method for evaluating the diagnostic accuracy of the blood sucrose test, in an attempt to avoid the bias associated with the assumption that gastroscopy is a perfect test.
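Sensitivity and specificity at a chosen cutoff, as computed above against the gastroscopy gold standard, reduce to simple 2×2 counts. A toy sketch (the sucrose values and findings are invented, not the study's data):

```python
def se_sp(values, diseased, cutoff):
    """Sensitivity and specificity of the rule `value >= cutoff`
    against a gold-standard diagnosis."""
    tp = sum(v >= cutoff and d for v, d in zip(values, diseased))
    fn = sum(v < cutoff and d for v, d in zip(values, diseased))
    tn = sum(v < cutoff and not d for v, d in zip(values, diseased))
    fp = sum(v >= cutoff and not d for v, d in zip(values, diseased))
    return tp / (tp + fn), tn / (tn + fp)

# Invented blood sucrose values and gastroscopy findings for eight foals.
sucrose = [5, 9, 16, 18, 22, 30, 7, 13]
ulcer = [False, False, False, True, True, True, False, True]

se, sp = se_sp(sucrose, ulcer, cutoff=12)
print(se, sp)  # a low cutoff favours sensitivity, as in a screening test
```

Lowering the cutoff catches every diseased foal at the cost of more false positives, which mirrors the study's choice to optimize Se for a screening setting.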
  • Oghabian, Ali; Greco, Dario; Frilander, Mikko J (BioMed Central, 2018)
    Abstract Background In-depth study of the intron retention levels of transcripts provides insights into the mechanisms regulating pre-mRNA splicing efficiency. Additionally, detailed analysis of retained introns can link these introns to post-transcriptional regulation or identify aberrant splicing events in human diseases. Results We present IntEREst, Intron–Exon Retention Estimator, an R package that supports rigorous analysis of non-annotated intron retention events (in addition to those annotated by RefSeq or similar databases) and supports intra-sample as well as inter-sample comparisons. It accepts binary sequence alignment/map (.bam) files as input and determines genome-wide estimates of intron retention or exon–exon junction levels. Moreover, it includes functions for comparing subsets of user-defined introns (e.g. U12-type vs U2-type), and its plotting functions allow visualization of the distribution of the retention levels of the introns. Statistical methods are adapted from the DESeq2, edgeR and DEXSeq R packages to extract the significantly more or less retained introns. Analyses can be performed either sequentially (on a single core) or in parallel (on multiple cores). We used IntEREst to investigate U12- and U2-type intron retention in human and plant RNA-seq datasets with defects in the U12-dependent spliceosome due to mutations in the ZRSR2 component of this spliceosome. Additionally, we compared the retained introns discovered by IntEREst with those of other methods and studies. Conclusion IntEREst is an R package for the analysis of intron retention and exon–exon junction levels in RNA-seq data. Both the human and plant analyses show that U12-type introns are retained at a higher level than U2-type introns already in the control samples, but the retention is exacerbated in patient or plant samples carrying a mutated ZRSR2 gene.
Intron retention events caused by ZRSR2 mutations that we discovered using IntEREst (DESeq2-based function) show considerable overlap with the retained introns discovered by other methods (e.g. IRFinder and the edgeR-based function of IntEREst). Our results indicate that increases in both the number of biological replicates and the depth of the sequencing library promote the discovery of retained introns, but the effect of library size gradually decreases beyond 35 million reads mapped to the introns.
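A retention level of the kind IntEREst estimates can be pictured as the fraction of reads supporting the intron versus the spliced exon–exon junction. A minimal sketch of that quantity (the counts are invented; IntEREst itself is an R package working from .bam alignments, so this Python fragment only mirrors the idea):

```python
def retention_level(intron_reads, junction_reads):
    """Fraction of reads supporting retention at one intron."""
    total = intron_reads + junction_reads
    return intron_reads / total if total else 0.0

# Invented counts: U12-type introns retained more than U2-type introns,
# with retention exacerbated in the ZRSR2-mutant sample.
counts = {
    ("U2", "control"): (5, 95), ("U2", "mutant"): (8, 92),
    ("U12", "control"): (20, 80), ("U12", "mutant"): (45, 55),
}
for (intron_type, sample), (i, j) in counts.items():
    print(intron_type, sample, retention_level(i, j))
```

Comparing such per-intron levels between user-defined subsets (U12-type vs U2-type) and between samples is what the package's statistical functions formalize.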
  • Tolonen, Matti; Coccolini, Federico; Ansaloni, Luca; Sartelli, Massimo; Roberts, Derek J; McKee, Jessica L; Leppaniemi, Ari; Doig, Christopher J; Catena, Fausto; Fabian, Timothy; Jenne, Craig N; Chiara, Osvaldo; Kubes, Paul; Kluger, Yoram; Fraga, Gustavo P; Pereira, Bruno M; Diaz, Jose J; Sugrue, Michael; Moore, Ernest E; Ren, Jianan; Ball, Chad G; Coimbra, Raul; Dixon, Elijah; Biffl, Walter; MacLean, Anthony; McBeth, Paul B; Posadas-Calleja, Juan G; Di Saverio, Salomone; Xiao, Jimmy; Kirkpatrick, Andrew W (BioMed Central, 2018)
    Abstract Background Severe complicated intra-abdominal sepsis (SCIAS) is a worldwide challenge with increasing incidence. Open abdomen management with enhanced clearance of fluid and biomediators from the peritoneum is a potential therapy requiring prospective evaluation. Given the complexity of powering multi-center trials, it is essential to recruit an inception cohort sick enough to benefit from the intervention; otherwise, no effect of a potentially beneficial therapy may be apparent. An evaluation of the ability of recognized predictive systems to identify SCIAS patients was conducted using an existing intra-abdominal sepsis (IAS) database. Methods All consecutive adult patients with diffuse secondary peritonitis between 2012 and 2013 were collected from a quaternary care hospital in Finland, excluding appendicitis/cholecystitis. From this retrospectively collected database, a target population (n = 93) of those with either ICU admission or mortality was selected. The performance metrics of the Third Consensus Definitions for Sepsis and Septic Shock based on both SOFA and quick SOFA, the World Society of Emergency Surgery Sepsis Severity Score (WSESSSS), the APACHE II score, the Mannheim Peritonitis Index (MPI), and the Calgary Predisposition, Infection, Response, and Organ dysfunction (CPIRO) score were all tested for their discriminant ability to identify this subgroup with SCIAS and to predict mortality. Results Predictive systems with an area under the receiver operating characteristic curve (AUC) > 0.8 included the SOFA, Sepsis-3 definitions, APACHE II, WSESSSS, and CPIRO scores, with CPIRO performing best overall. The highest identification rates were for SOFA score ≥ 2 (78.4%), followed by SOFA ≥ 3 (75.2%), WSESSSS ≥ 8 (73.1%), and APACHE II ≥ 14 (68.8%). Combining the Sepsis-3 septic-shock definition and WSESSSS ≥ 8 increased detection to 80%. Including CPIRO score ≥ 3 increased this to 82.8% (sensitivity [SN] 83%; specificity [SP] 74%).
Comparatively, SOFA ≥ 4 and WSESSSS ≥ 8 with or without septic shock had 83.9% detection (SN 84%; SP 75%; 25% mortality). Conclusions No single scoring system performs perfectly, and all are largely dominated by organ dysfunction. Utilizing combinations of the SOFA, CPIRO, and WSESSSS scores in addition to the Sepsis-3 septic-shock definition appears to offer the widest “inclusion criteria” to recognize patients with a high chance of mortality and ICU admission. Trial registration ; Registered on May 22, 2017.
  • Koponen, Mikael; Havulinna, Aki S; Marjamaa, Annukka; Tuiskula, Annukka M; Salomaa, Veikko; Laitinen-Forsblom, Päivi J; Piippo, Kirsi; Toivonen, Lauri; Kontula, Kimmo; Viitasalo, Matti; Swan, Heikki (BioMed Central, 2018)
    Abstract Background Long QT syndrome (LQTS) is an inherited cardiac disorder predisposing to sudden cardiac death (SCD). We studied factors affecting the clinical course of genetically confirmed patients, in particular those not receiving β-blocker treatment. In addition, an attempt was made to associate the risk of events with specific types of KCNQ1 and KCNH2 mutations. Methods A follow-up study covering a mean of 18.6 ± 6.1 years was conducted in 867 genetically confirmed LQT1 and LQT2 patients and 654 non-carrier relatives aged 18–40 years. Cox regression models were used to evaluate the contribution of clinical and genetic risk factors to cardiac events. Results In mutation carriers, risk factors for cardiac events before initiation of β-blocker treatment included the LQT2 genotype (hazard ratio [HR] = 2.1, p = 0.002), female gender (HR = 3.2, p < 0.001), a cardiac event before the age of 18 years (HR = 5.9, p < 0.001), and QTc ≥ 500 ms (vs < 470 ms, HR = 2.7, p = 0.001). LQT1 patients carrying the KCNQ1 D317N mutation were at higher risk (HR = 3.0–3.9, p < 0.001–0.03) than G589D, c.1129-2A > G and other KCNQ1 mutation carriers after adjusting for gender, QTc duration, and cardiac events before age 18. The KCNH2 c.453delC, L552S and R176W mutations were associated with lower risk (HR = 0.11–0.23, p < 0.001) than other KCNH2 mutations. Conclusions LQT2 (compared to LQT1), female gender, a cardiac event before age 18, and a long QT interval increased the risk of cardiac events in LQTS patients aged 18 to 40 years. The nature of the underlying mutation may be associated with risk variation in both LQT1 and LQT2. The identification of high-risk and low-risk mutations may enhance risk stratification.