Browsing by Title


  • Sihvola, Elina (Helsingin yliopisto, 2010)
    The effects of early-onset psychiatric illnesses extend to academic achievement as well as to functioning in familial and social environments. From a public health point of view, depressive disorders are the most significant mental health disorders that begin in adolescence. Using a prospective, longitudinal design, this study aimed to increase the understanding of early-onset depressive disorders, related mental health disorders and developing substance use in a large population-derived sample of adolescent Finnish twins. The participants of this study, FinnTwin12, an ongoing longitudinal population-based study, came from Finnish families with twins born in 1983-87 (exhaustive of five birth cohorts, identified from Finland's Central Population Register). With follow-up ongoing at age 20-24, this thesis assessed adolescent mental health in the first three waves, from baseline at age 11-12 to follow-ups at ages 14 and 17½. Some 5,600 twins participated in questionnaire assessments of a wide range of health-related behaviors. Mental health was further assessed in an intensively studied subsample of 1,852 adolescents, who also completed professionally administered interviews at age 14, which provided data for full DSM-IV/III-R (Diagnostic and Statistical Manual of Mental Disorders, 4th and 3rd editions) diagnoses. The participation rates of the study were 87-92%. The results suggest that the diagnostic criteria for major depressive disorder (MDD) may not capture youth with clinically significant early-onset depressive conditions outside clinical settings. Milder cases of depression, namely adolescents fulfilling the diagnostic criteria for minor depressive disorder, a condition qualitatively similar to MDD but with fewer symptoms, are also associated with marked suicidal thoughts, plans and attempts, recurrences and a high degree of comorbidity. 
Prospectively and longitudinally, early-onset depressive disorders were of substantial importance in the context of other mental health disorders and substance use behaviors: these data from a large population-derived sample established a substantial overlap between early-onset depressive disorders and attention deficit hyperactivity disorder (ADHD) in adolescent females, both of which significantly predicted the development of substance use among girls. Only in females were baseline DSM-IV ADHD symptoms strong predictors of alcohol abuse and dependence and illicit drug use at age 14, and of frequent alcohol use and illicit drug use at age 17½, when conduct disorder and previous substance use were controlled for. Early-onset depressive disorders were also prospectively and longitudinally associated with daily smoking, smokeless tobacco use, frequent alcohol use, illicit drug use and eating disorders. Analysis of discordant twins suggested that these predictive associations were independent of familial confounds, such as family income, family structure and parental models. In sum, early-onset depressive disorders predict subsequent substance use involvement and psychiatric morbidity. The heightened risk for substance use is substantial also among depressed adolescents who fall below the categorical diagnosis of MDD. Whether early recognition and intervention among these young people hold potential for substance use prevention later in their lives has potential public health significance and calls for more research. Data from this population-derived sample, with a balanced representation of boys and girls, suggested that boys and girls with ADHD behaviors may differ in their vulnerability to substance use and depressive disorders: the data suggest more adverse substance use outcomes for girls, not attenuated by conduct disorder or previous substance use. 
Further, the prospective associations of early-onset depressive disorders with future elevated levels of addictive substance use are not explained by familial factors supporting substance use, which could have important implications for substance use prevention.
  • Bingham, Clarissa (Helsingin yliopisto, 2012)
    Men at the age of military service are in a transition phase between the childhood home and independent adulthood. They are starting to make their own decisions about their future and way of life, including eating habits. In Finland, all men are liable for military service and a majority (nearly 80%) complete it. The increasing prevalence of overweight and obesity, also among soldiers, has raised concerns about conscripts' eating habits. This doctoral dissertation studied the eating habits of young men before and during military service, the determinants of eating habits, the associations between diet and health risk factors, and the effects of an intervention promoting healthy eating in military conditions. Two datasets were used. In the first, data on conscripts' food use and nutrient intake were collected in garrison, encampment and leave conditions. The second belonged to the VARU intervention study, in which the supply of healthy food was increased in military eating environments, i.e. garrison canteens and soldiers' homes. Study questionnaires were collected before military service and during it, at the 8th week and 6th month of service. Conscripts' health status was also followed during service through 13 anthropometric and clinical risk factors. The main dietary outcomes were food indexes, which were formed specifically for studying young men and to suit military conditions. Prior to military service, the diet was mainly healthy, although fruit and vegetable consumption was clearly low. Upper secondary school education and healthy behaviour predicted healthy eating. During military service, nutrient intake was adequate, although salt intake was high and fibre intake low. Food eaten in the garrison met nutrition recommendations best, and fibre-containing foods in particular were characteristic of it. During free time, nutrient intake was less favourable and sugar consumption high. Nutrient-poor foods, such as soft drinks and pizza, were frequently consumed. Conscripts' health risk factor levels were low. 
During service, overweight decreased and body composition improved. Blood pressure improved, but lipid and glucose levels deteriorated. The intervention succeeded in improving conscripts' eating habits. In the intervention group, cereal foods were consumed more, and several fat- and sugar-containing foods less, than in the control group. The intervention did not increase fruit and vegetable consumption. Positively, young men's everyday diet contained several healthy foods whose consumption increased during military service. On the other hand, consumption of some unhealthy foods increased, especially during free time. Already in early adulthood, young men's eating habits, other health behaviours and health risk factors accumulate. Conscripts' healthy eating habits can be supported by improving the food supply in garrison canteens and soldiers' homes. Intervening effectively in individual choices, such as low fruit and vegetable consumption, remains more challenging.
  • Leivo, Tiina (Helsingin yliopisto, 2001)
  • Sarkola, Taisto (Helsingin yliopisto, 2001)
  • Kausto, Johanna (Helsingin yliopisto, 2014)
    In Finland, partial sickness benefit has been used since 2007 to promote recovery and return to work (RTW) to full-time employment after sickness absence. This thesis aimed to examine the effectiveness and efficacy of the benefit and the related partial sick leave on RTW, work retention and work participation, mainly in four diagnostic categories: musculoskeletal diseases, mental disorders, traumas and tumors. The first of the five substudies was a literature review which investigated the empirical evidence on the use and effects of partial sick leave on RTW in the Nordic countries. Three of the substudies were longitudinal register-based studies examining the effects of partial sick leave on return to work, work retention and work participation in working populations with prolonged sickness absence. In addition, it was assessed whether the effects differed between men and women or by age, socioeconomic position or diagnostic category. The fifth study, a randomized controlled trial (RCT), focused on the efficacy of partial sick leave on sustained RTW at an earlier phase of work disability attributable to musculoskeletal diseases. Two register-based samples (n = 38 865 and n = 68 924) of the working population with prolonged sickness absence were drawn from the sickness insurance register of the Social Insurance Institution of Finland (SII). Comprehensive prospective and retrospective register data on work participation were collected from the national registers of the SII and the Finnish Centre for Pensions. There were methodological and analytical challenges in comparing work participation between the studied groups in the register-based substudies, namely the selection of individuals into partial sick leave and the complexity of the context. This was taken into account by investigating the study questions in different study samples, with different study designs and several statistical methods. 
A counterfactual approach with propensity score and difference-in-differences methods was applied. A systematic search of the literature was carried out in 2008 and replicated in 2012. A total of five methodologically rigorous studies from other Nordic countries were identified. In four of them, partial sick leave was associated with an increased likelihood of return to regular working hours or a higher subsequent employment rate. Some of the reviewed studies suffered from methodological limitations. The register-based substudies showed that both men and women on partial sick leave, when compared with individuals on full sick leave, had their first recurrent sick leave sooner and also had more sick leave periods during the follow-up. Approximately 60% of subjects on partial sick leave and 30% of those on full sick leave had at least one recurrent sick leave during the follow-up time. The adjusted risks of the first recurrent sick leave were 1.8 and 1.7 for men and women, respectively, when subjects on partial sick leave were compared with those on full sick leave. Partial sickness benefit reduced the risk (change in absolute risk) of full disability pension by 6% but conversely increased the risk of partial disability pension by 8% compared with full sick leave. In men, the use of full disability pension was reduced by 10% and in women by 4%. Corresponding increases of 5% and 9% in the use of partial disability pension were detected. The effects were stronger in the group of mental disorders than in the group of musculoskeletal diseases. During a follow-up period of five years, the decline in work participation was 5% smaller among those on partial sick leave than in the comparison group. The favorable effect of partial sick leave on work participation was found in those aged 45 to 65 years, in those with mental disorders and among those with a higher socioeconomic position. No major difference in the effect was found between men and women. 
In the RCT, both the intervention and the control group consisted of 31 participants with early work disability due to musculoskeletal diseases. In addition to the clinical data collected by the physicians, the participants filled in six questionnaires during the follow-up year. Survey information was linked with register-based data on sickness absences and employment periods, obtained from the registers of the occupational health services and employers. The time to RTW sustained for at least four weeks was shorter in the intervention group (median 12 versus 20 days, p = 0.10), and the fully adjusted hazard ratio of RTW was 1.8 (95% CI 1.2-2.8). Compliance with the intervention was high. Overall, the results were rather consistent across the four substudies, revealing beneficial effects of partial sick leave on RTW and work participation irrespective of the methodological differences and varying outcomes. Partial sick leave was found to be an effective and efficient way of enhancing RTW and work participation. The findings of this study suggest that, even though the practice has so far mainly benefited women, the use of partial sick leave can be recommended among men as well. Partial sick leave is a relevant measure in both musculoskeletal and mental disorders, at least in the context of the Finnish societal system. More attention needs to be paid to the implementation of the measure among young workers and individuals in physically strenuous, low-paid jobs. To conclude, the overall results suggest that partial sickness benefit - if applied on a larger scale in the future - may prove to be an effective tool in increasing the work participation of the working population with long-term sickness absence.
  • Snäll, Johanna (Helsingin yliopisto, 2015)
    Background and purpose: Short-term glucocorticoids (GCs) are frequently used in association with oral and maxillofacial surgery to prevent postoperative pain, edema, and nausea. However, the anti-inflammatory and immunosuppressive properties of GCs, and their influence on tissue repair, may have an adverse impact on healing of the surgical site. The main aim of this study was to determine the occurrence of disturbance in surgical wound healing (DSWH) and pulp necrosis (PN) after surgical treatment of facial fractures, and the influence of perioperative administration of GCs on these complications. Patients: This study comprised four populations of patients (Studies I-IV) treated for facial fractures. For Study I, the medical records of 280 consecutive patients who had undergone open reduction of different types of facial fractures or reconstruction of an orbital wall fracture were assembled retrospectively. The prospective Studies II-IV consisted of patients with mandibular fractures (n=41) (Study II) and patients with simple zygomatic complex (ZC) fractures (n=64) (Study III). The fourth population (n=24) (Study IV) was extracted from the population of patients with mandibular fractures recruited for Study II. Methods: In the retrospective study (Study I), the outcome variable was DSWH, which was established when any kind of aberrant wound healing and/or sign of infection occurred at the surgical site. The primary predictor variable was the perioperative use of GCs. Patients recruited for Studies II and III were randomly assigned to one of two groups. Patients in the study group received dexamethasone (DXE) (Oradexon®), whereas patients in the control group received no GC. The main outcome variables were DSWH (Studies II-III) and PN of teeth in the area of the mandibular fracture (Study IV). The primary predictor variable was the perioperative use of DXE. 
Results: In patients with ZC fractures (Study III), DSWH was significantly associated with perioperative use of DXE as well as with an intraoral surgical approach. In patients operated on for different types of facial fractures (Study I), DSWH was significantly associated with an intraoral surgical approach. DSWH occurred more frequently in patients receiving GCs, although without statistical significance. In patients undergoing intraoral surgery for mandibular fractures (Study II), DSWH occurred more frequently in the DXE group. PN also occurred more frequently in the DXE group (Study IV). The delay until DSWH was notably longer in the DXE groups (Studies I-III). PN in particular (Study IV) was observed much later in the DXE group. Conclusions: Perioperative DXE cannot be recommended in association with surgery of ZC fractures. Moreover, GCs should be used with caution in association with surgery of other facial fractures as well, particularly when an intraoral approach is used.
  • Hirvonen, Johanna (Helsingin yliopisto, 2007)
    Although the principle of equal access to medically justified treatment has been promoted by official health policies in many Western health care systems, practices do not completely meet policy targets. Waiting times for elective surgery vary between patient groups and regions, and growing problems in the availability of services threaten equal access to treatment. Waiting times have come to the attention of decision-makers, and several policy initiatives have been introduced to ensure the availability of care within a reasonable time. In Finland, for example, a treatment guarantee came into force in 2005. However, no consensus exists on the optimal waiting time for different patient groups. The purpose of this multi-centre randomized controlled trial was to analyse health-related quality of life, pain and physical function in total hip or knee replacement patients during the waiting time and to evaluate whether the waiting time is associated with patients' health outcomes at admission. This study also assessed whether the length of waiting time is associated with social and health services utilization in patients awaiting total hip or knee replacement. In addition, patients' health-related quality of life was compared with that of the general population. Consecutive patients with a need for a primary total hip or knee replacement due to osteoarthritis were placed on the waiting list between August 2002 and November 2003. Patients were randomly assigned to a short waiting time (maximum 3 months) or a non-fixed waiting time (waiting time not fixed in advance; instead, the patient followed the hospital's routine practice). Patients' health-related quality of life was measured upon placement on the waiting list and again at hospital admission using the generic 15D instrument. Pain and physical function were evaluated using the self-report Harris Hip Score for hip patients and a scale modified from the Knee Society Clinical Rating System for knee patients. 
Utilization measures were the use of home health care, rehabilitation and social services, physician visits and inpatient care. Health and social services use was low in both waiting time groups. The most common services used while waiting were rehabilitation services and informal care, including unpaid care provided by relatives, neighbours and volunteers. Although patients suffered from clear restrictions in usual activities and physical functioning, they seemed primarily to lean on informal care and personal networks instead of professional care. While a longer waiting time did not result in poorer health-related quality of life at admission, and the use of services during the waiting time was similar to that at the time of placement on the list, those who wait longer are likely to incur higher costs of waiting simply because they use services for a longer period. In economic terms, this would represent a negative impact of waiting. Only a few reports have been published on the health-related quality of life of patients awaiting total hip or knee replacement. These findings demonstrate that, in addition to the physical dimensions of health, patients suffered from restrictions in psychological well-being, such as depression, distress and reduced vitality. This raises the question of how to support patients who suffer from psychological distress during the waiting time and how to develop strategies that improve patients' own initiative in reducing symptoms and the burden of waiting. Key words: waiting time, total hip replacement, total knee replacement, health-related quality of life, randomized controlled trial, outcome assessment, social service, utilization of health services
  • Stenberg, Jan-Henry (Helsingin yliopisto, 2014)
    Schizophrenia is a severe psychiatric illness, with neurocognitive deficits as a major component, that affects about 1% of the world population. Improving impaired neurocognitive function is one of the pivotal treatment goals in this patient population. In the treatment of schizophrenia, only a partial treatment response is typically achieved with dopamine antagonists, i.e., antipsychotics. The antidepressant mirtazapine has a unique mechanism of action with, in theory, an ability to enhance neurocognition and provide added value to antipsychotic treatment. This study explored whether adjunctive mirtazapine has the potential to improve neurocognitive performance and alleviate clinical symptoms in patients with schizophrenia who have demonstrated a suboptimal treatment response to first-generation antipsychotics (FGAs). This study was the neurocognitive arm of a single-center, randomized, add-on, double-blinded, placebo-controlled study carried out in Petrozavodsk, in the Karelian Republic of Russia. Patients with schizophrenia or schizoaffective disorder, depressive type, according to the Diagnostic and Statistical Manual of Mental Disorders, 4th edition (DSM-IV) criteria, who were receiving stable doses of an FGA with an inadequate treatment response were enrolled in the trial. Twenty patients were assigned to mirtazapine and 21 to placebo. After a one-week single-blind placebo run-in period, the participants were randomized to receive either 30 mg of mirtazapine or placebo once daily in a double-blind fashion for 6 weeks. Subsequently, those who were eligible to continue entered the following 6-week open-label phase, where they were treated with mirtazapine 30 mg daily. At study weeks 0, 6, and 12, a senior psychologist performed neuropsychological examinations to evaluate neurocognitive functioning. 
Verbal and visual memory, visuo-spatial and executive functions, verbal fluency, and both general mental and psychomotor speed were assessed with commonly used, validated neuropsychological tests for the different neurocognitive domains. Clinical examinations were conducted at week 1 (screening), week 0 (baseline), and after 1, 2, 4, 6, 7, 8, 10, and 12 weeks of treatment. Within-group and between-group differences were analyzed on a Modified Intent-to-Treat (MITT) basis with Last Observation Carried Forward (LOCF). After 6 weeks of treatment, 5 of 21 neurocognitive parameters (i.e., Wechsler Adult Intelligence Scale-Revised (WAIS-R) Block Design, p=0.021; Wechsler Memory Scale (WMS) Logical Memory, p=0.044; WMS Logical Memory Delayed, p=0.044; Stroop Dots, p=0.044; Trail Making Test Part A (TMT-A), p=0.018) improved with statistical significance in the mirtazapine group. In contrast, only 1 of the 21 parameters changed significantly (WMS Logical Memory, p=0.039) in the placebo group. Add-on 6-week mirtazapine treatment was superior to placebo in the neuropsychological domains of visuo-spatial ability and general mental speed/attentional control (Block Design mirtazapine vs. placebo and Stroop Dots mirtazapine vs. placebo, p=0.044 for both comparisons). The enhancing effect on Block Design-measured visuo-spatial functioning was mediated through changes in positive and depressive symptoms and parkinsonism-like side effects, but not via changes in negative symptoms. Moreover, higher doses of FGAs, a longer duration of illness and lower initial Block Design scores predicted this effect. During the 6-week extension phase, individuals who continued mirtazapine treatment and those who were switched from placebo to mirtazapine showed significant improvements on several neurocognitive tests. 
Those who switched from placebo to open-label mirtazapine treatment achieved similar results during the following 6 weeks as the mirtazapine group had during their first 6 weeks of mirtazapine treatment. From week 0 to week 12, the continuation group demonstrated improvements in 17 of 21 neurocognitive parameters, while the switch group improved in 8 of the 21 measured parameters. Twelve weeks of mirtazapine treatment indicated an advantage over the shorter, 6-week mirtazapine treatment on Stroop Dots time (p=0.035) and on the number of mistakes on Trail Making Test Part B (TMT-B) (p=0.043). During the 6-week open-label phase, significant improvements on several clinical parameters, including the Positive and Negative Syndrome Scale (PANSS) total score, were observed. In the total population (i.e., pooled switch and continuation groups), the effect size was 0.94 (95% CI 0.45-1.43) as determined by the PANSS total score. Conclusions: Adjunctive mirtazapine treatment might offer added value as a neurocognitive enhancer and may augment the antipsychotic effect in FGA-treated schizophrenia patients with an inadequate treatment response. The generalizability of these results to a larger population may be limited by the small sample size of the present study.
  • Vainio, Petri J. (Helsingin yliopisto, 2000)
  • Pilvi, Taru (Helsingin yliopisto, 2008)
    A diet high in dairy products is inversely associated with body mass index, risk of metabolic syndrome and prevalence of type 2 diabetes in several populations. A number of intervention studies also support the role of increased dairy intake in the prevention and treatment of obesity. Dairy calcium has been suggested to account for the effect of dairy on body weight, but it has been repeatedly shown that the effect of dairy is superior to that of supplemental calcium. Dairy proteins are postulated either to enhance the effect of calcium or to have an independent effect on body weight, but studies in the area are scarce. The aim of this study was to evaluate the potential of dairy proteins and calcium in the prevention and treatment of diet-induced obesity in C57Bl/6J mice. The effect of dairy proteins and calcium on the liver and adipose tissue was also investigated in order to characterise the potential mechanisms explaining the reduced risk of metabolic syndrome and type 2 diabetes. A high-calcium diet (1.8%) in combination with dietary whey protein inhibited body weight and fat gain and accelerated body weight and fat loss in high-fat-fed C57Bl/6J mice during long-term studies of 14 to 21 weeks. α-Lactalbumin, one of the major whey proteins, was the most effective whey protein fraction, showing significantly accelerated weight and fat loss during energy restriction and a reduced amount of visceral fat gain during ad libitum feeding after weight loss. The microarray data suggest sensitisation of insulin signalling in the adipose tissue as a result of a calcium-rich whey protein diet. Lipidomic analysis revealed that weight loss on a whey protein-based high-calcium diet was characterised by significant decreases in diabetogenic diacylglycerols and lipotoxic ceramide species. Calcium supplementation led to a small but statistically significant decrease in fat absorption, independent of the protein source of the diet. 
This augments, but does not fully explain, the effects of the studied diets on body weight. A whey protein-containing high-calcium diet had a protective effect against the high-fat diet-induced decline of β3-adrenergic receptor expression in adipose tissue. In addition, a high-calcium diet with whey protein increased adipose tissue leptin expression, which is decreased in this obesity-prone mouse strain. These changes are likely to contribute to the inhibition of weight gain. The potential sensitisation of insulin signalling in adipose tissue, together with the less lipotoxic and diabetogenic hepatic lipid profile, suggests a novel mechanistic link to explain why increased dairy intake is associated with a lower prevalence of metabolic syndrome and type 2 diabetes in epidemiological studies. Taken together, the intake of a high-calcium diet with dairy proteins has a body weight-lowering effect in high-fat-fed C57Bl/6J mice. High-calcium diets containing whey protein prevent weight gain and enhance weight loss, with α-lactalbumin being the most effective whey protein fraction. Whey proteins and calcium also have beneficial effects on the hepatic lipid profile and adipose tissue gene expression, suggesting a novel mechanistic link to explain the epidemiological findings on dairy intake and metabolic syndrome. The clinical relevance of these findings and the precise mechanisms of action remain an intriguing field for future research.
  • Penttinen, Heidi (2013)
    The main goal of the thesis was to investigate the effects of a 12-month supervised exercise intervention on breast cancer patients' quality of life (QoL) shortly after adjuvant treatment. The secondary aims were to assess the physical and psychological well-being of patients immediately after adjuvant treatment in the largest breast cancer survivor intervention study (BREX) to date, and the patients' willingness to participate in such a long intervention. In addition, the work aimed to further clarify the results of the intervention by comparing the QoL changes of the exercise study participants with those of normal follow-up patients. Patients: Of the 573 randomized patients (aged 35 to 68 years), 500 were included in the final analyses of the effects of the exercise intervention: 263 in the exercise group and 237 in the control group. A total of 73 patients were excluded from the final analyses for various reasons: not meeting the inclusion criteria (36), declining for personal reasons (20), breast cancer recurrence (14), and a new malignancy during the intervention (3). All 537 patients who met the inclusion criteria after the baseline visit and investigations were included in the baseline analysis. The patients of the additional study consisted of two separate groups: the 237 control patients of the BREX study and 108 similar breast cancer patients (who met the same inclusion criteria and were treated according to the same guidelines as the patients in the exercise study) participating in a follow-up study without any intervention. Methods: The EORTC QLQ-C30 and BR-23 questionnaires were used to evaluate QoL, FACIT-F for fatigue, and the Finnish modified version of Beck's 13-item depression scale (BDI) for depression. Physical fitness was assessed by a 2-km walking test and a figure-8 running test, and physical activity (PA) by metabolic equivalent (MET) hours per week (MET-h/wk) based on data collected with a prospective two-week physical activity diary. 
Results: Of the eligible patients, 78.3% participated in the study. At baseline, the global QoL of the study patients was lower than in the general population; 26% of them were rated as depressed, 20% as fatigued, and 82% suffered from menopausal symptoms, which seemed to impair QoL. Most scores on the EORTC QLQ-C30 improved significantly within both the exercise and the control group during the 12 months, with no significant differences between the groups. Participation in the exercise intervention study per se did not seem to improve the QoL of breast cancer survivors. At baseline, 62% of the walking test results were below the population average. Physical performance in the walking test correlated significantly with MET hours and with previous leisure-time physical activity. The amount of physical activity increased from baseline over the 12-month period in both groups; there were no significant differences between the groups. Neuromuscular performance improved significantly in the training group during the 12 months. No significant effect of the intervention was observed on cardiorespiratory fitness. However, walk time improved significantly in both groups during the intervention. Already at baseline, PA was associated with better QoL. After the 12-month follow-up there was a linear relationship between increased PA and improved QoL, irrespective of the intervention. There was a significant linear trend between higher physical activity and both improved QoL and recovery from fatigue. No significant relationship was detected between physical activity and depression, or between physical performance (figure-8 running or 2-km walking test) and QoL, fatigue or depression. Conclusions: Recruiting patients to the study succeeded excellently. With the exception of the age limit and musculoskeletal disorders, the study population represents the general Finnish breast cancer population. 
At baseline, the QoL of the patients was impaired and their physical performance poor compared with the general population; in particular, depression and fatigue were related to impaired QoL. The study did not find evidence to support the superiority of 12 months of supervised vigorous aerobic exercise and home training over the control arm in improving the patients' physical activity and QoL. Instead, both groups improved equally during the follow-up. Spontaneous recovery of QoL seems to be the most likely explanation for the observed results, at least when the intervention is timed to the rehabilitation period. Future exercise intervention studies targeting improvement of QoL should identify the groups of patients that could benefit the most from an intervention and tailor the interventions to their specific needs.
  • Xiang, Xiaoqiang (Helsingin yliopisto, 2011)
Bile acids are important steroid-derived molecules essential for fat absorption in the small intestine. They are produced in the liver and secreted into the bile. Bile acids are transported by bile flow to the small intestine, where they aid the digestion of lipids. Most bile acids are reabsorbed in the small intestine and return to the liver through the portal vein. The whole recycling process is referred to as the enterohepatic circulation, during which only a small amount of bile acids is removed from the body via faeces. The enterohepatic circulation of bile acids involves the delicate coordination of a number of bile acid transporters expressed in the liver and the small intestine. Organic anion transporting polypeptide 1B1 (OATP1B1), encoded by the solute carrier organic anion transporter family, member 1B1 (SLCO1B1) gene, mediates the sodium-independent hepatocellular uptake of bile acids. Two common single-nucleotide polymorphisms (SNPs) in the SLCO1B1 gene are well known to affect the transport activity of OATP1B1. Moreover, bile acid synthesis is an important elimination route for cholesterol, and cholesterol 7α-hydroxylase (CYP7A1) is the rate-limiting enzyme of bile acid production. The aim of this thesis was to investigate the effects of SLCO1B1 polymorphism on the fasting plasma levels of individual endogenous bile acids and a bile acid synthesis marker, and on the pharmacokinetics of exogenously administered ursodeoxycholic acid (UDCA). Furthermore, the effects of CYP7A1 genetic polymorphism and gender on the fasting plasma concentrations of individual endogenous bile acids and the bile acid synthesis marker were evaluated. Firstly, a high-performance liquid chromatography-tandem mass spectrometry (HPLC-MS/MS) method for the determination of bile acids was developed (Study I). A retrospective study then examined the effects of SLCO1B1 genetic polymorphism on the fasting plasma concentrations of individual bile acids and the bile acid synthesis marker in 65 healthy subjects (Study II). 
In another retrospective study with 143 healthy individuals, the effects of CYP7A1 genetic polymorphism, gender, and SLCO1B1 polymorphism on the fasting plasma levels of individual bile acids and the bile acid synthesis marker were investigated (Study III). The effects of SLCO1B1 polymorphism on the pharmacokinetics of exogenously administered UDCA were evaluated in a prospective genotype panel study of 27 healthy volunteers (Study IV). A robust, sensitive and simple HPLC-MS/MS method was developed for the simultaneous determination of 16 individual bile acids in human plasma. The method validation parameters for all the analytes met the requirements of the FDA (Food and Drug Administration) bioanalytical guidelines, and the method was applied in Studies II-IV. In Study II, the fasting plasma concentrations of several bile acids and the bile acid synthesis marker seemed to be affected by SLCO1B1 genetic polymorphism, but these findings were not replicated in Study III with a larger sample size. Moreover, SLCO1B1 polymorphism had no effect on the pharmacokinetic parameters of exogenously administered UDCA. Furthermore, no consistent association was observed between CYP7A1 genetic polymorphism and the fasting plasma concentrations of individual bile acids or the bile acid synthesis marker. In contrast, gender had a major effect on the fasting plasma concentrations of several bile acids, as well as on total bile acids. In conclusion, gender, but not SLCO1B1 or CYP7A1 polymorphism, has a major effect on the fasting plasma concentrations of individual bile acids. Moreover, the common genetic polymorphism of CYP7A1 is unlikely to influence the activity of CYP7A1 under normal physiological conditions, and OATP1B1 does not play an important role in the in vivo disposition of exogenously administered UDCA.