Browsing by Title


Now showing items 645-664 of 1596
  • Helakorpi, Satu (Helsingin yliopisto, 2008)
    The aim of the study was to evaluate the impact of Finnish tobacco control measures on the reduction of smoking. First, trends and patterns in ever smoking among adult Finns in 1978-2001, as well as the associations of these trends with the Tobacco Control Act (TCA) of 1976, were examined. Secondly, the impact of the 1976 TCA on the proportion of ever daily smokers in different socioeconomic groups was studied. Thirdly, the impact of the 1995 amendment to the Tobacco Control Act (TCAA) on recent trends in the prevalence of daily smoking was evaluated by gender and employment status. Fourthly, trends in exposure to environmental tobacco smoke (ETS) at workplaces and homes were investigated. The study is based on data from the Health Behaviour among the Finnish Adult Population surveys. Among Finnish men, smoking initiation declined from earlier to later cohorts, whereas among women it increased in successive birth cohorts born before 1956. The lasting differences between birth cohorts in ever daily smoking reflected well the impact of the measures taken to reduce smoking in Finland in 1976. In the birth cohorts (born in 1961 or later) that were at a critical age with regard to the risk of smoking initiation when the TCA came into force, smoking initiation was less common than could be expected from the trends seen in the earlier birth cohorts. Marked socioeconomic differences in smoking were found in the different birth cohorts. Smoking was more prevalent in the lower socioeconomic groups than in the higher ones, and the differences were larger in the later birth cohorts than in the earlier ones. The differences between the birth cohorts in ever daily smoking were compatible with the hypothesized impact of the TCA in almost all socioeconomic groups, except farmers. Among men, the 1976 TCA appears to have had the greatest impact on white-collar employees. Among women, the effect of the act was highly significant in all socioeconomic groups. However, female smoking prevalence continues to show wide socioeconomic disparities. Daily smoking decreased among employees after the 1995 TCAA, supporting the hypothesis that the amendment lowered daily smoking by increasing smoking cessation. No parallel change in daily smoking was found in the population groups without direct exposure to the ETS legislation (farmers, students, housewives, pensioners and the unemployed). Exposure to ETS at work decreased markedly among non-smokers after the 1995 TCAA. The 1976 TCA and the 1995 TCAA were useful in controlling smoking initiation and cessation, but their impact was not equal across population groups. The results of this study strongly suggest that tobacco control policies contribute markedly to the decrease in smoking and in exposure to environmental tobacco smoke.
  • Penninkilampi-Kerola, Varpu (Helsingin yliopisto, 2006)
    Objective: The aim of the present study was to examine co-twin dependence and its impact on twins' social contacts, leisure-time activities, and psycho-emotional well-being. The role of co-twin dependence was also examined as a moderator of genetic and environmental influences on alcohol use in adolescence and in early adulthood. Methods: The present report is based on the Finnish Twin Cohort Study (FinnTwin16), a population-based study of five consecutive birth cohorts of Finnish twins born in the years 1975-1979. Baseline assessments were collected through mailed questionnaires within two months of the twins' sixteenth birthday, yielding replies from 5563 twin individuals. All responding twins were sent follow-up questionnaires at ages 17 and 18½, and in early adulthood, when the twins were 22-27 years old. Measures: The questionnaires included a survey of health habits and attitudes, a symptom checklist, and questions about the twins' relationships with parents, peers, and co-twin. The measures used were twins' self-reports of their own dependence and their co-twin's dependence at age 16, and reports of twins' leisure-time activities and social contacts, alcohol use, psychological distress, and somatic symptoms both in adolescence and in early adulthood. Results: In the present study, 25.6% of twins reported dependence on their co-twin. There were gender and zygosity differences in dependence: females and MZ twins were more likely to report dependence than males and DZ twins. Co-twin dependence can be viewed on the one hand as an individual characteristic, and on the other as a pattern of dyadic interaction that is mutually regulated and reciprocal; most of the twins (80.7%) were either concordantly co-twin dependent or concordantly co-twin independent. The associations of co-twin dependence with twins' social interactions and psycho-emotional characteristics were relatively consistent both in adolescence and in early adulthood. Dependence was related to a higher contact frequency and a higher proportion of shared leisure-time activities between twin siblings at baseline and at follow-up. Additionally, co-twin dependence was associated with elevated levels of psycho-emotional distress and somatic complaints, especially in adolescence. With regard to alcohol use, in the framework of gene-environment interaction these results suggest that the genetic contribution to individual differences in drinking patterns depends on the nature of the pair-wise relationship of the twin siblings. Conclusions: The results of this study indicate that co-twin dependence is a genuine feature of the co-twin relationship, and they show the importance of studying the impact of various features of co-twin relationships on individual twins' social and psycho-emotional life and well-being. Our study also offers evidence that differences in interpersonal relationships modulate the effects of genetic propensities.
  • Virtanen, Anni (Helsingin yliopisto, 2015)
    High coverage among those at risk and a high attendance rate are essential for achieving good impact in a cervical cancer screening programme. In Finland, attendance in the programme is approximately 70%, with a slight decreasing trend, and the current invitation practice varies widely between municipalities. The introduction of human papillomavirus (HPV) testing in cervical cancer screening has brought a new possible means of improving attendance rates, as HPV testing can be performed on self-collected samples. This offers the opportunity to supply sampling devices directly to women's homes (self-sampling). The aim of this study was to investigate the effects and feasibility of using self-taken samples for HPV testing in the screening of non-attendees of the Finnish cervical screening programme. The effects on attendance in the screening programme, on overall screening test coverage (including testing outside the screening programme), on the yield of precancerous lesions detected by screening, and on the costs of the screening programme were assessed, as were women's views on this new screening modality. The effects of self-sampling were first studied as a first reminder (i.e. among non-attendees after the primary invitation) in a randomized setting in comparison with a reminder letter, and then in a non-randomized setting as a second reminder after two invitation letters. As a first reminder to non-attendees after the primary invitation, a self-sampling test resulted in somewhat higher attendance than a reminder letter. The difference was small, and in terms of resulting costs (price per extra screened woman and price per detected CIN2+ lesion), a reminder letter with a pre-assigned appointment time is a more feasible choice than a self-sampling test. However, self-sampling can be used to increase screening attendance as a second reminder after two invitation letters. Overall attendance rates increased by 4-8%, and the combined effect of reminder letters and self-sampling showed a 12-23% increase. The yield of detected CIN2+ lesions increased by 25-33% with two reminders. As opportunistic screening is very common in Finland, the increase in overall test coverage remained smaller than the increase in uptake of the programme. Based on a questionnaire study conducted alongside self-sampling, self-sampling at home helps to overcome both practical and emotional barriers to traditional screening. Women who took part in screening by self-sampling reported mainly positive experiences, but negative experiences were more common among women whose mother tongue was other than Finnish or Swedish. The invitation protocol preceding the self-sampling option must be carefully arranged to achieve optimal attendance. A total attendance of well over 80% is achievable in the national programme if personal invitations and reminder letters are sent to non-attendees, scheduled appointments are used in both letters, and self-sampling tests are sent to those women who still do not attend.
  • Koski, Anniina (Helsingin yliopisto, 2012)
    Gene therapy with oncolytic adenoviruses is a promising novel treatment modality for cancer. Adenoviruses have shown excellent safety and tolerability in clinical studies, but their efficacy still needs improvement, particularly when systemic administration is used. Problems related to systemic administration include the natural tropism of adenovirus to the liver, through virus interaction with soluble coagulation factors in the circulation and through direct viral binding to cellular heparan sulfate proteoglycans (HSPGs) in the liver. The purpose of this thesis was to improve the systemic administration, efficacy, and safety of adenovirus treatments for cancer. We showed that ablation of vitamin K-dependent coagulation factors resulted in reduced liver transduction after intravenous administration to tumor-bearing mice. Further, combining this with platelet depletion and inhibition of virus uptake by liver macrophages resulted in an enhanced tumor-to-liver ratio of viral gene expression. We also constructed an adenovirus that has a mutation in the capsid fiber KKTK region, to abolish interactions with liver cells through HSPG binding, and a chimeric fiber knob from serotype 3, to enhance tumor cell transduction. The virus exhibited reduced liver tropism after systemic administration to mice. This virus was also investigated together with alterations of coagulation factor availability; these pathways of liver transduction were found to be separate and may act in an additive fashion. Preclinical studies have shown that adjuvant use of calcium channel blockers can improve the efficacy of oncolytic adenoviruses in cancer gene therapy. We investigated the calcium channel blocker verapamil in combination with oncolytic adenovirus treatments of advanced cancer patients in the context of an Advanced Therapy Access Program (ATAP). Verapamil resulted in elevated serum viral titers after treatment, compared with non-randomized, retrospectively matched controls, which suggests enhanced viral spread and release in the tumors. The frequency and severity of adverse events were not increased by verapamil. Verapamil therefore seems a safe adjuvant to oncolytic adenovirus treatments, able to enhance viral kinetics; however, its effect on treatment efficacy should be determined in a randomized clinical trial. Arming adenoviruses with immunostimulatory molecules is a promising way to boost antitumoral immune responses and thus enhance overall treatment efficacy. We constructed Ad5/3-D24-GMCSF, an oncolytic adenovirus with a 5/3 chimeric capsid, encoding the immunostimulatory cytokine GM-CSF. In preclinical experiments, Ad5/3-D24-GMCSF displayed strong oncolytic potential, good tumor selectivity, and potent antitumor efficacy. Twenty-one advanced cancer patients were treated with Ad5/3-D24-GMCSF in the context of ATAP. The treatments were well tolerated, with generally only mild or moderate adverse events, and intriguing signs of possible treatment benefit were recorded, as were signs of the induction of antiviral and antitumoral immune responses. Ad5/3-D24-GMCSF is therefore a promising agent for the treatment of cancer, and clinical Phase I and I/II trials have been initiated for its further evaluation.
  • Simojoki, Kaarlo (Helsingin yliopisto, 2013)
    The purpose of this study was to investigate whether pharmacological or clinical management methods could improve patients' adherence to treatment and reduce the resource burden, thus improving treatment effectiveness. Finland was the first country in Europe to use the buprenorphine-naloxone combination medication as part of opioid maintenance treatment (OMT); the combination was expected to have lower potential for diversion onto the drug market. The study investigated whether the transition from mono-buprenorphine to the buprenorphine-naloxone combination would cause adverse events or lower patient compliance. One way to reduce the diversion of buprenorphine medication is to crush the tablet when administering it. As this had not been studied earlier, it was investigated whether crushing mono-buprenorphine tablets would influence the kinetics and serum levels of buprenorphine, and whether patients would have adverse events following the use of crushed tablets. A major problem in OMT has been patient compliance and adherence to treatment, and a central monitoring tool has been visually supervised urine drug screens. It was therefore investigated whether a new unsupervised screening method would affect the reliability of urine testing, patient compliance, and the time and resources used by personnel in screening. The large buprenorphine abuse problem in Finland offers a good setting for studying this abuse, and a seven-year follow-up study evaluated the trends in street buprenorphine prices, intravenous abuse doses, and abuse potential in Finland. The studies showed that the new buprenorphine-naloxone combination product is as safe as mono-buprenorphine alone, and that no dose adjustments are needed during the medication change. Crushing the mono-buprenorphine tablet did not affect serum levels or buprenorphine kinetics, and the study subjects experienced neither more nor fewer adverse events than the control group. It was concluded that crushing is a safe and effective management practice for patients with a high risk of medication abuse or diversion. The study with the new marker-based urine drug screen indicated that the new method did not jeopardize the safe and reliable assessment of concomitant drug use. Both patients and medical staff found it more comfortable than the traditional visually supervised screen, and the new method saved considerable staff time previously spent on supervising the screen. It was thus concluded that the new screening method improves patient compliance, reduces the supervision burden, and may thereby increase the effectiveness of treatment. The long-term follow-up study revealed that the street price of the new combination product is significantly lower than that of the mono-buprenorphine product, and that the price difference remained unchanged during the follow-up period; the abuse potential of the combination product was thus concluded to be lower than that of mono-buprenorphine. The studies demonstrate that there are several effective methods for reducing the abuse of OMT medications, and that patient compliance, and thus treatment outcomes, can be improved. These methods should be used widely in the clinical management of OMT.
  • Kangasniemi, Lotta (Helsingin yliopisto, 2010)
    Although the treatment of most cancers has improved steadily, only a few metastatic solid tumors can be cured. Despite initial responses, resistant clones often emerge and the disease becomes refractory to the available treatment modalities. Furthermore, resistance factors are shared between different treatment regimens; loss of response therefore typically occurs rapidly, and there is a tendency towards cross-resistance between agents. New agents with novel mechanisms of action and without cross-resistance to currently available approaches are therefore needed. Modified oncolytic adenoviruses, featuring cancer-selective cell lysis and spread, constitute an interesting drug platform towards the goals of tumor specificity and the implementation of potent multimodal treatment regimens. In this work, we demonstrate the applicability of capsid-modified, transcriptionally targeted oncolytic adenoviruses in targeting gastric, pancreatic, and breast cancer. A variety of capsid-modified adenoviruses were tested for transductional specificity, first in gastric and pancreatic cancer cells and patient tissues and then in mice. Oncolytic viruses featuring the same capsid modifications were then tested to confirm that successful transductional targeting translates into enhanced oncolytic potential. Capsid-modified oncolytic viruses also prolonged survival in orthotopic models of gastric and pancreatic cancer. Taken together, oncolytic adenoviral gene therapy could be a potent treatment for gastric and pancreatic cancer, and its specificity, potency, and safety can be modulated by means of capsid modification. We also characterized a new intraperitoneal virus delivery method that improves the persistence of gene delivery to intraperitoneal gastric and pancreatic tumors. With a silica implant, steady and sustained virus release in the vicinity of the tumor improved the survival of orthotopic tumor-bearing mice. Furthermore, silica gel-based virus delivery lowered the toxicity-mediating proinflammatory cytokine response and the production of total and anti-adenovirus neutralizing antibodies (NAbs). On the other hand, silica shielded the virus against pre-existing NAbs, resulting in a more favourable biodistribution in preimmunized mice. The silica implant might therefore be of interest for treating intraperitoneally disseminated disease. Cancer stem cells are thought to be resistant to conventional cancer drugs and may play an important role in cancer relapse and the formation of metastases. We therefore examined whether transcriptionally modified oncolytic adenoviruses are able to kill these cells. Complete eradication of CD44+CD24-/low putative breast cancer stem cells was seen in vitro, and significant antitumor activity was detected in mice bearing CD44+CD24-/low-derived tumors. Thus, genetically engineered oncolytic adenoviruses have potential in destroying cancer-initiating cells, which may be relevant for the elimination of cancer stem cells in humans.
  • Pelkonen, Tuula (Helsingin yliopisto, 2011)
    Background: Acute bacterial meningitis (BM) continues to be an important cause of childhood mortality and morbidity, especially in developing countries. Prognostic scales and the identification of risk factors for adverse outcome both aid in assessing disease severity. New antimicrobial agents and adjunctive treatments, with the exception of oral glycerol, have essentially failed to improve the prognosis of BM. A retrospective observational analysis found paracetamol beneficial in adult bacteraemic patients, and some experts recommend slow β-lactam infusion; we examined these treatments in a prospective, double-blind, placebo-controlled clinical trial. Patients and methods: A retrospective analysis included 555 children treated for BM in 2004 on the infectious disease ward of the Paediatric Hospital of Luanda, Angola. Our prospective study randomised 723 children into four groups, receiving cefotaxime either as a slow infusion or as boluses every 6 hours for the first 24 hours, combined with either oral paracetamol or placebo for 48 hours. The primary endpoints were 1) death or severe neurological sequelae (SeNeSe), and 2) deafness. Results: In the retrospective study, the mortality of children who received a blood transfusion was 23% (30 of 128), versus 39% (109 of 282) in those who did not (p=0.004; this comparison is recomputed in the sketch after this entry). In the prospective study, 272 (38%) of the children died. Of the 451 survivors, 68 (15%) showed SeNeSe, and 12% (45 of 374) were deaf. Whereas no difference between the treatment groups was observable in the primary endpoints, early mortality in the infusion-paracetamol group was lower, with the difference (Fisher's exact test) from the other groups at 24, 48, and 72 hours being significant (p=0.041, 0.0005, and 0.005, respectively). Prognostic factors for adverse outcome were impaired consciousness, dyspnoea, seizures, delayed presentation, and absence of electricity at home (Simple Luanda Scale, SLS); the Bayesian Luanda Scale (BLS) also included abnormally low or high blood glucose. Conclusions: New studies concerning the possible beneficial effect of blood transfusion and of longer treatment with cefotaxime infusion and oral paracetamol, as well as a study to validate our simple prognostic scales, are warranted.
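    A minimal sketch of the 2x2 comparison above, in Python with scipy; the contingency table is reconstructed from the stated counts (30 deaths among 128 transfused children vs. 109 among 282 non-transfused). The abstract does not state which test produced the p=0.004, so Fisher's exact test here is illustrative rather than a reproduction of the thesis analysis.

        from scipy.stats import fisher_exact

        # 2x2 contingency table from the reported counts:
        # rows: transfusion yes/no; columns: died/survived
        table = [[30, 128 - 30],    # transfused: 30 of 128 died (23%)
                 [109, 282 - 109]]  # not transfused: 109 of 282 died (39%)

        odds_ratio, p_value = fisher_exact(table)
        print(f"OR = {odds_ratio:.2f}, p = {p_value:.4f}")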
  • Suvisaari, Jaana (Helsingin yliopisto, 1999)
  • Nisula, Sara (Helsingin yliopisto, 2014)
    Acute kidney injury (AKI) is a syndrome encompassing kidney damage from mild injury to total loss of function, which seriously disturbs the homeostasis of fluid and electrolyte balance. The objectives of this study were to evaluate the incidence, risk factors, and outcome of acute kidney injury in adult intensive care unit (ICU) patients in Finland, and to test the ability of two new biomarkers to predict AKI, renal replacement therapy (RRT), and 90-day mortality in ICU patients. The prospective, observational FINNAKI study was conducted in 17 Finnish ICUs, and all admitted patients were screened for eligibility during the five-month study period (2011-2012). All adult emergency admissions and elective admissions with an expected stay of over 24 hours were included. AKI was defined by the Kidney Disease: Improving Global Outcomes (KDIGO) criteria. Study I included all patients in the FINNAKI study, evaluated the incidence of and risk factors for AKI, and reported the 90-day mortality of patients with AKI. Of the 2901 patients, 1141 (39%) developed AKI during the screening period of five days. The proportions of patients in the different stages of AKI were 499 (17%) in stage 1, 232 (8%) in stage 2, and 410 (14%) in stage 3. RRT was initiated for 272 (9%) patients. The population-based incidence of AKI was 746 per million adults per year. Patients who developed AKI were older, were more severely ill, and had more chronic comorbidities than patients without AKI. Hypovolaemia prior to ICU admission, administration of diuretics or colloids (hydroxyethyl starch or gelatin) prior to ICU admission, and chronic kidney disease were independent risk factors for AKI. Of the 1141 AKI patients, 385 (34%) died within 90 days. In Study II, urine neutrophil gelatinase-associated lipocalin (NGAL) was measured in 1042 patients. NGAL predicted AKI with an AUC (95% CI) of 0.733 (0.701-0.765), RRT with an AUC of 0.839 (0.797-0.880), and 90-day mortality with an AUC of 0.634 (0.593-0.675); the sketch after this entry illustrates how such AUC values are computed. In Study III, urine interleukin-18 (IL-18) was analysed in 1439 patients. IL-18 predicted the development of AKI with an AUC (95% CI) of 0.586 (0.546-0.627), initiation of RRT with an AUC of 0.655 (0.572-0.739), and 90-day mortality with an AUC of 0.536 (0.497-0.574). Study IV included 1568 patients and evaluated 6-month mortality and the survivors' health-related quality of life (HRQoL), measured with the EQ-5D questionnaire at ICU admission and six months later. The EQ-5D index for AKI patients at six months (0.676) was lower than that of the age- and sex-matched general population (0.826) but equal to that of patients without AKI (0.690). There was no significant change in the EQ-5D over the six months for either patient group. Despite their measured lower HRQoL, on the EQ-5D visual analogue scale AKI patients evaluated their quality of life six months after ICU treatment to be as good as that of the age- and sex-matched general population. Of the 635 AKI patients in this study, 224 (35%) died within 6 months. The incidence of AKI among critically ill patients was high. Hypovolaemia, diuretics, and colloids prior to ICU admission were independently associated with the development of AKI. In this population, urine NGAL was statistically associated with the need to initiate RRT, but translating this result into clinical practice is complicated. Urine NGAL lacks the power to predict AKI or 90-day mortality, and urine IL-18 lacks adequate power to predict AKI, RRT, or 90-day mortality in critically ill adult patients. AKI was associated with significantly increased 90-day and 6-month mortality. The HRQoL of all ICU patients was lower than that of the age- and sex-matched general population already before ICU treatment, and it did not change during critical illness or during the six-month follow-up. Despite their lower HRQoL, AKI patients felt their health was equal to that of the general population.
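    A minimal illustration of how the discriminative AUC values reported above (e.g. 0.733 for NGAL predicting AKI) are computed, assuming Python with scikit-learn; the data below are synthetic stand-ins, not FINNAKI measurements.

        import numpy as np
        from sklearn.metrics import roc_auc_score

        rng = np.random.default_rng(0)

        # Synthetic outcome: 1 = developed AKI, 0 = did not.
        aki = rng.integers(0, 2, size=200)
        # Hypothetical urine biomarker, drawn slightly higher in the AKI group.
        biomarker = rng.lognormal(mean=aki * 0.8, sigma=1.0)

        # AUC of 0.5 means no discrimination; 1.0 means perfect discrimination.
        print(f"AUC = {roc_auc_score(aki, biomarker):.3f}")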
  • Linko, Rita (Helsingin yliopisto, 2012)
    Acute respiratory failure (ARF) is the most common organ failure in critically ill patients: up to 74% of patients in intensive care units (ICUs) need some kind of ventilatory support, and additional organ failures are associated with higher mortality. High morbidity and mortality, together with the high cost of mechanical ventilation, necessitate assessment of long-term outcome and an analysis of cost-effectiveness. The aim of this study was to evaluate the incidence, treatment, and outcome of patients suffering from ARF overall, and of a subset suffering from pandemic influenza A(H1N1) virus infection, in Finnish ICUs. The predictive value of serum zinc for organ failure and mortality was studied in ARF patients. One-year outcome for ARF was assessed, and health-related quality of life (HRQOL), quality-adjusted life years (QALYs) for one-year survivors, and the cost of one QALY were estimated. A total of 958 patients from 25 Finnish ICUs needed ventilatory support for more than 6 hours during an 8-week period in 2007. Serum zinc level was assessed in 551 patients. A total of 132 H1N1 patients were assessed for incidence, treatment, and short-term outcome during an outbreak between 11 October and 31 December 2009. The incidences of ARF and of acute respiratory distress syndrome (ARDS) in the adult population were 149.5/100,000 and 5.0/100,000 per year, respectively (the annualization behind such figures is sketched after this entry). Median tidal volume per predicted body weight was 8.7 ml/kg, and airway pressure was 19 cmH2O. The 90-day mortality of ARF was 31%, and one-year mortality was 35%. The incidence of H1N1 was 24.7 per million inhabitants. Corticosteroids were used frequently, and their use was not associated with mortality in these patients. Rescue therapies, except prone positioning, were rarely used. Hospital mortality of H1N1 patients was 8%. The level of serum zinc decreased with increasing severity of cardiovascular organ failure but was not associated with 30-day mortality. HRQOL at one year after ARF was lower than population values for similar age and gender. The mean estimated cost for a hospital survivor was €20,739. The mean predicted lifetime QALYs were 11.3, and the cost of one QALY for all ARF patients was €1,391. This study showed that the incidence of ARF was higher, and the incidence of ARDS lower, in Finland than reported from other countries. Tidal volumes were higher than recommended in the concept of lung-protective ventilation. Short- and long-term mortality was low. The incidence of H1N1 was similar to that previously reported, and corticosteroid treatment was frequently used. Serum zinc level was not useful in predicting 30-day mortality. The cost per hospital survivor and the lifetime cost-utility were reasonable.
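    A back-of-the-envelope sketch of how an 8-week cohort is annualized into a population incidence. The 958-patient count comes from the abstract; the adult-population denominator of about 4.2 million is an assumption (roughly Finland's adult population in 2007), used only to show that it reproduces an incidence near the reported 149.5/100,000.

        # Patients needing ventilatory support during the 8-week period (from the abstract).
        cases_8_weeks = 958

        # Scale the 8-week count to a full year of 52 weeks.
        cases_per_year = cases_8_weeks * 52 / 8        # about 6227 cases per year

        # Assumed denominator: ~4.2 million Finnish adults in 2007.
        adult_population = 4_200_000

        incidence = cases_per_year / adult_population * 100_000
        print(f"~{incidence:.1f} per 100,000 adults per year")  # ~148, near the reported 149.5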
  • Seikku, Paula (Helsingin yliopisto, 2008)
    Pediatric renal transplantation (TX) has evolved greatly during the past few decades, and today TX is considered the standard of care for children with end-stage renal disease. In Finland, 191 children had received renal transplants by October 2007, and 42% of them have already reached adulthood. Improvements in the treatment of end-stage renal disease, surgical techniques, intensive care medicine, and immunosuppressive therapy have paved the way to the current highly successful outcomes of pediatric transplantation. In children, the transplanted graft should last for decades, and normal growth and development should be guaranteed. These objectives set considerable requirements for optimizing and fine-tuning post-operative therapy. Careful optimization of immunosuppressive therapy is crucial in protecting the graft against rejection, but also in protecting the patient against the adverse effects of the medication. The present study reports the results of a retrospective investigation into individualized dosing of immunosuppressive medication, based on pharmacokinetic profiles, therapeutic drug monitoring, graft function and histology studies, and glucocorticoid biological activity determinations. Subgroups of a total of 178 patients who received renal transplants in 1988-2006 were included in the study. The mean age at TX was 6.5 years, and approximately 26% of the patients were <2 years of age. The most common diagnosis leading to renal TX was congenital nephrosis of the Finnish type (NPHS1). Pediatric patients in Finland receive standard triple immunosuppression, consisting of cyclosporine A (CsA), methylprednisolone (MP) and azathioprine (AZA), after renal TX. Optimal dosing of these agents is important to prevent rejections and preserve graft function on the one hand, and to avoid the potentially serious adverse effects on the other. CsA has a narrow therapeutic window and individually variable pharmacokinetics, so therapeutic monitoring of CsA is mandatory. Traditionally, CsA monitoring has been based on pre-dose trough levels (C0), but recent pharmacokinetic and clinical studies have revealed that the immunosuppressive effect may be related to diurnal CsA exposure and to the blood CsA concentration 0-4 hours after dosing. The two-hour post-dose concentration (C2) has proved a reliable surrogate marker of CsA exposure. Individual starting doses of CsA were analyzed in 65 patients. A recommended dose based on a pre-TX pharmacokinetic study was calculated for each patient according to the pre-TX protocol. The predicted dose was clearly higher in the youngest children than in the older ones (22.9±10.4 and 10.5±5.1 mg/kg/d in patients <2 and >8 years of age, respectively). The actually administered oral doses of CsA were collected for three weeks after TX and compared to the pharmacokinetically predicted dose. After TX, the dosing of CsA was adjusted according to clinical parameters and the blood CsA trough concentration. The pharmacokinetically predicted dose and patient age were the two significant parameters explaining post-TX doses of CsA. Accordingly, young children received significantly higher oral doses of CsA than the older ones. The correlation with the actually administered doses after TX was best in those patients whose predicted dose was clearly higher or lower (> ±25%) than the average in their age group. Due to the great individual variation in pharmacokinetics, standardized dosing of CsA (based on body mass or surface area) may not be adequate.
Pre-TX profiles are helpful in determining suitable initial CsA doses. CsA monitoring based on trough and C2 concentrations was analyzed in 47 patients who received renal transplants in 2001-2006. C0 and C2 concentrations and experienced acute rejections were recorded during the post-TX hospitalization, and again three months after TX, when the first protocol core biopsy was obtained. The patients who remained rejection-free had slightly higher C2 concentrations, especially very early after TX. However, after the first two weeks the trough level was also higher in the rejection-free patients than in those with acute rejections. Three months after TX, the trough level was higher in patients with normal histology than in those with rejection changes in the routine biopsy. Monitoring of both the trough level and C2 may thus be warranted, to guarantee a sufficient peak concentration and baseline immunosuppression on the one hand, and to avoid over-exposure on the other. Controlling rejection in the early months after transplantation is crucial, as early rejection may contribute to the development of long-term allograft nephropathy. Recently, it has become evident that immunoactivation fulfilling the histological criteria of acute rejection is possible in a well-functioning graft with no clinical signs or laboratory perturbations. The influence of treating subclinical rejection, diagnosed in the 3-month protocol biopsy, on graft function and histology 18 months after TX was analyzed in 22 patients and compared to 35 historical control patients. The incidence of subclinical rejection at three months was 43%, and the patients received a standard rejection treatment (a course of increased MP) and/or increased baseline immunosuppression, depending on the severity of rejection and graft function. Glomerular filtration rate (GFR) at 18 months was significantly better in the patients who were screened and treated for subclinical rejection than in the historical patients (86.7±22.5 vs. 67.9±31.9 ml/min/1.73m2, respectively). The improvement was most remarkable in the youngest (<2 years) age group (94.1±11.0 vs. 67.9±26.8 ml/min/1.73m2). Histological findings of chronic allograft nephropathy were also more common in the historical patients in the 18-month protocol biopsy. All pediatric renal TX patients receive MP as part of their baseline immunosuppression. Although the maintenance dose of MP is very low in the majority of patients, the well-known steroid-related adverse effects are not uncommon. A previous study in Finnish pediatric TX patients showed that steroid exposure, measured as the area under the concentration-time curve (AUC), rather than the dose, correlates with the adverse effects (a minimal AUC computation is sketched after this entry). In the present study, MP AUC was measured in sixteen stable maintenance patients, and a correlation with excess weight gain during the 12 months after TX, as well as with height deficit, was found. A novel bioassay measuring the activation of the glucocorticoid-receptor-dependent transcription cascade was also employed to assess the biological effect of MP. Glucocorticoid bioactivity was found to be related to the adverse effects, although the relationship was not as apparent as that with the serum MP concentration. The findings of this study support individualized monitoring and adjustment of immunosuppression based on pharmacokinetics, graft function and histology. Pharmacokinetic profiles are helpful in estimating drug exposure and thus in identifying the patients who might be at risk of excessive or insufficient immunosuppression.
Individualized doses and monitoring of blood concentrations should definitely be employed with CsA, and possibly also with steroids. As an alternative to complete steroid withdrawal, individualized dosing based on drug-exposure monitoring might help in avoiding the adverse effects. Early screening and treatment of subclinical immunoactivation is beneficial, as it improves the prospects of good long-term graft function.
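    A minimal sketch of estimating a drug-exposure AUC from a concentration-time profile with the trapezoidal rule, in plain Python; the sampling times and concentrations are invented for illustration and are not the MP or CsA data of the study.

        # Invented profile: sampling times in hours, concentrations in ug/l.
        times = [0, 0.5, 1, 2, 4, 6, 8, 12]
        concs = [0, 310, 480, 390, 210, 120, 70, 20]

        # Trapezoidal rule: sum the area of each time slice, treating the
        # curve between consecutive samples as a straight line.
        auc = sum((t2 - t1) * (c1 + c2) / 2
                  for (t1, c1), (t2, c2) in zip(zip(times, concs),
                                                zip(times[1:], concs[1:])))
        print(f"AUC(0-12 h) = {auc:.0f} ug*h/l")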
  • Lepäntalo, Aino (Helsingin yliopisto, 2007)
    Antiplatelet medication is known to decrease adverse events in patients with atherothrombotic disease. However, despite ongoing antiplatelet medication, a considerable number of patients suffer from atherothrombotic events. The aims of the study were 1) to evaluate individual variability in platelet function and compare the usability of different methods in detecting it, 2) to assess variability in the efficacy of antiplatelet medication with aspirin (acetylsalicylic acid) or the combination of aspirin and clopidogrel, and 3) to investigate the main genetic and clinical variables, as well as the potential underlying mechanisms, of variability in the efficacy of antiplatelet medication. In comparisons of different platelet function tests in 19 healthy individuals, PFA-100® correlated with traditional methods of measuring platelet function and was thus considered appropriate for testing individual variability in platelet activity. The efficacy of ongoing aspirin, 100 mg daily, was studied in 101 patients with coronary artery disease (CAD). Aspirin response was measured with arachidonic acid (AA)-induced platelet aggregation, which reflects cyclo-oxygenase (COX)-1-dependent thromboxane (Tx) A2 formation, and with PFA-100®, which evaluates platelet activation under high shear stress in the presence of collagen and epinephrine. Five percent of patients failed to show inhibition of AA-induced aggregation, and 21% of patients had normal PFA-100® results despite aspirin; these patients were considered non-responders to aspirin. Interestingly, the two methods of assessing aspirin efficacy, platelet aggregation and PFA-100®, identified different populations as aspirin non-responders. It could be postulated that PFA-100® actually measures enhanced platelet function, which is not directly associated with the TxA2 inhibition exerted by aspirin. Clopidogrel efficacy was assessed in 50 patients who received a 300 mg loading dose of clopidogrel 2.5 h prior to percutaneous coronary intervention (PCI) and in 51 patients who were given a 300 mg loading dose combined with a five-day treatment of 75 mg clopidogrel daily, mimicking ongoing treatment. Clopidogrel response was assessed with ADP-induced aggregations, owing to its mechanism of action as an inhibitor of ADP-induced activation. When patients received only a loading dose of clopidogrel prior to PCI, 40% did not gain measurable inhibition of their ADP-induced platelet activity (inhibition of 10% or less; percent inhibition is computed as in the sketch after this entry). Prolonging the treatment so that all patients had reached the plateau of inhibition exerted by clopidogrel decreased the incidence of non-responders to 20%. Polymorphisms of COX-1 and GP VI, as well as diabetes and female gender, were associated with decreased in vitro aspirin efficacy. Diabetes also impaired the in vitro efficacy of short-term clopidogrel. Decreased response to clopidogrel was associated with limited inhibition by ARMX, a P2Y12-receptor antagonist, suggesting that the reason for clopidogrel resistance is receptor-dependent. Conclusions: Considerable numbers of CAD patients were non-responders to aspirin, clopidogrel, or both. In the future, platelet function tests may be helpful in individually selecting effective and safe antiplatelet medication for these patients.
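    A small sketch of the percent-inhibition calculation implied by the 10% responder threshold above, assuming inhibition is expressed relative to the pre-treatment (baseline) maximal ADP-induced aggregation; the numbers are invented.

        def percent_inhibition(baseline: float, on_drug: float) -> float:
            """Relative reduction in maximal aggregation, as a % of baseline."""
            return (baseline - on_drug) / baseline * 100

        # Invented example: maximal ADP-induced aggregation falls from 62% to 57%.
        print(f"{percent_inhibition(62.0, 57.0):.1f}% inhibition")  # ~8.1%, below the 10% threshold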
  • Hellgren, Ulla-Maija (Helsingin yliopisto, 2012)
    Indoor air in hospitals is important to both employees and patients. Complaints and problems concerning indoor air quality pose an increasing challenge to occupational health and safety in hospitals. The aim of the present study was to assess the perceived indoor air quality and the prevalence of indoor-air-related symptoms among hospital workers. We also determined the relationship between these factors and the condition of hospital buildings and ventilation systems. An additional aim was to examine how the problem-solving process functions in hospitals from the occupational health perspective. We also tested the usability of nasal lavage in patient examinations. A modified questionnaire was used to collect information on the complaints and indoor-air-related symptoms of hospital employees. Construction and ventilation professionals examined the hospitals. Semi-structured interviews concerning the processes aimed at resolving indoor air problems were carried out among hospital personnel working in occupational health, occupational safety, and infection control. Nasal lavage was performed as part of the examinations of employees working on a moisture-damaged hospital ward and of a control group, both before and after the repair. Hospital employees experienced poor indoor air quality and indoor-air-related symptoms more often than office workers. Workers in moisture-damaged departments had complaints and symptoms more often than workers in departments that were in good condition. In hospitals where the ventilation systems were largely in need of repair, the workers experienced more inconvenience and symptoms than in hospitals where the ventilation systems were mostly in good condition. Workers in moisture-damaged departments showed signs of immunosuppression in their nasal lavage samples: their inflammatory cell counts and cytokine levels were lower than those of the controls. Occupational health and safety personnel considered indoor air problems difficult to tackle. The roles and responsibilities of occupational health professionals, the technical department, and the employer in solving the problems were not clear, and the flow of information between the different parties clearly needed improvement. An indoor air group had been appointed in under half of the hospitals in which the interviews were carried out. These groups were considered valuable, especially with regard to the flow of information. In conclusion, an indoor air group should be established in every hospital. Indoor air quality should be monitored by conducting regular questionnaire surveys and walk-throughs of the buildings, and by evaluating the ventilation systems. Nasal lavage needs further development before it can be added to the occupational health toolkit for examining indoor-air-related symptoms.
  • Lehto, Juho (Helsingin yliopisto, 2007)
    Infection is a major cause of mortality and morbidity after thoracic organ transplantation. The aim of the present study was to evaluate infectious complications after lung and heart transplantation, with special emphasis on the usefulness of bronchoscopy and on the detection of cytomegalovirus (CMV), human herpesvirus (HHV)-6, and HHV-7. We reviewed all consecutive bronchoscopies performed on heart transplant recipients (HTRs) from May 1988 to December 2001 (n = 44) and on lung transplant recipients (LTRs) from February 1994 to November 2002 (n = 472). To compare different assays for the detection of CMV, a total of 21 thoracic organ transplant recipients were prospectively monitored with CMV pp65-antigenemia, DNAemia (PCR), and mRNAemia (NASBA) tests; the antigenemia test was the reference assay for therapeutic intervention. In addition to CMV antigenemia, 22 LTRs were monitored for HHV-6 and HHV-7 antigenemia. The diagnostic yield of clinically indicated bronchoscopies was 41% in the HTRs and 61% in the LTRs. The utility of bronchoscopy was highest from one to six months after transplantation. In contrast, findings from the surveillance bronchoscopies performed on LTRs led to a change in the previous treatment in only 6% of cases. Pneumocystis carinii and CMV were the most commonly detected pathogens. Furthermore, 15 (65%) of the P. carinii infections in the LTRs were detected during chemoprophylaxis. None of the complications of the bronchoscopies was fatal. Antigenemia, DNAemia, and mRNAemia were present in 98%, 72%, and 43% of the CMV infections, respectively. The optimal DNAemia cut-off levels (sensitivity/specificity; the derivation of such pairs is sketched after this entry) were 400 (75.9%/92.7%), 850 (91.3%/91.3%), and 1250 (100%/91.5%) copies/ml for antigenemia levels of 2, 5, and 10 pp65-positive leukocytes/50,000 leukocytes, respectively. The sensitivities of NASBA in detecting the same cut-off levels were 25.9%, 43.5%, and 56.3%. CMV DNAemia was detected in 93% and mRNAemia in 61% of the CMV antigenemias requiring antiviral therapy. HHV-6, HHV-7, and CMV antigenemia were detected in 20 (91%), 11 (50%), and 12 (55%) of the 22 LTRs (at a median of 16, 31, and 165 days), respectively. HHV-6 appeared in 15 (79%), HHV-7 in seven (37%), and CMV in one (7%) of these patients during ganciclovir or valganciclovir prophylaxis. One case of pneumonitis and another of encephalitis were associated with HHV-6. In conclusion, bronchoscopy is a safe and useful diagnostic tool in LTRs and HTRs with a suspected respiratory infection, but the role of surveillance bronchoscopy in LTRs remains controversial. The PCR assay performs comparably to the antigenemia test in guiding pre-emptive therapy against CMV when threshold levels of over 5 pp65-antigen-positive leukocytes are used. In contrast, the low sensitivity of NASBA limits its usefulness. HHV-6 and HHV-7 activation is common after lung transplantation despite ganciclovir or valganciclovir prophylaxis, but clinical manifestations are infrequently linked to these viruses.
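    A minimal sketch of how sensitivity/specificity pairs such as those above are derived: a DNAemia cut-off is applied to paired measurements and judged against the pp65-antigenemia reference. Plain Python with invented paired data, not the study's measurements.

        # Invented pairs: (DNAemia in copies/ml, pp65-positive leukocytes/50,000).
        samples = [(150, 0), (600, 3), (2400, 12), (90, 1), (1300, 7),
                   (450, 0), (5000, 25), (250, 4), (50, 0), (950, 6)]

        DNA_CUTOFF = 400      # copies/ml, one of the evaluated cut-offs
        ANTIGEN_CUTOFF = 2    # pp65-positive leukocytes, the reference for positivity

        tp = sum(d >= DNA_CUTOFF and a >= ANTIGEN_CUTOFF for d, a in samples)
        fn = sum(d < DNA_CUTOFF and a >= ANTIGEN_CUTOFF for d, a in samples)
        tn = sum(d < DNA_CUTOFF and a < ANTIGEN_CUTOFF for d, a in samples)
        fp = sum(d >= DNA_CUTOFF and a < ANTIGEN_CUTOFF for d, a in samples)

        print(f"sensitivity = {tp/(tp+fn):.1%}, specificity = {tn/(tn+fp):.1%}")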