Browsing by Title


Now showing items 1360-1379 of 28486
  • Kuosmanen, Soile (2013)
    The lower respiratory infection tuberculosis (TB) has been a leading cause of death for centuries, causing millions of deaths worldwide. The development of antibiotic therapy reduced morbidity and mortality during the 20th century, at least in developed countries. However, tuberculosis is still the world's second leading cause of death from infectious diseases. Although TB can be treated and even cured with drug therapy, the treatment is extremely long, requiring 6 to 9 months of constant drug therapy. This prolonged treatment causes poor patient compliance, which is usually the reason for the selection of drug-resistant and often multidrug-resistant (MDR-TB) or even extensively drug-resistant (XDR-TB) TB bacteria. The limitations of available therapies and the emergence of drug-resistant strains have intensified the search for new drugs from natural sources. Marine micro- and macro-organisms have proven to be an excellent source of structurally unique, biologically active natural products. The EU FP7-funded MAREX project, launched in 2010, aims at identifying new biologically active compounds from marine sources. This Master's thesis was carried out as part of the MAREX project. The aim of this study was to optimize and validate a reproducible method for determining the antimicrobial activity of natural products against Mycobacterium smegmatis, a widely used non-pathogenic surrogate model for TB. In the present study, a spectrophotometric microplate assay was optimized and validated using the existing antibacterial agents ciprofloxacin and rifampicin as reference compounds. The assay was performed on a 96-well plate using two detection techniques, absorbance measurement and a colorimetric indicator, for antibacterial MIC end-point determination. The results obtained by the two methods were compared with each other in order to determine the optimal assay conditions.
The quality-control parameters S/B, S/N and Z' factor were used to determine the optimal experimental conditions for the assay. Obtaining reliable results with the turbidimetric method required incubation for two days in the case of ciprofloxacin and five days with rifampicin. Colorimetric measurement led to results similar to those of the turbidimetric measurement for both reference compounds. The method was further used for screening a group of marine extracts. None of the 21 samples tested showed significant activity against M. smegmatis.
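The quality-control parameters mentioned above have standard definitions in plate-based screening. As a minimal illustrative sketch (the control readings below are invented for the example, not taken from the thesis), they can be computed as:

```python
import statistics

def assay_quality(pos, neg):
    """Standard plate-assay quality metrics from replicate
    positive-control and negative-control signals."""
    mu_p, sd_p = statistics.mean(pos), statistics.stdev(pos)
    mu_n, sd_n = statistics.mean(neg), statistics.stdev(neg)
    return {
        "S/B": mu_p / mu_n,               # signal-to-background ratio
        "S/N": (mu_p - mu_n) / sd_n,      # signal-to-noise ratio
        "Z'": 1 - 3 * (sd_p + sd_n) / abs(mu_p - mu_n),  # Z' factor
    }

# Hypothetical absorbance readings from control wells
growth_controls = [1.02, 0.98, 1.05, 1.00]      # uninhibited growth
sterility_controls = [0.10, 0.12, 0.09, 0.11]   # no growth
print(assay_quality(growth_controls, sterility_controls))
```

By convention a Z' factor above 0.5 indicates a screening assay with good separation between controls, which is what such an optimization would target.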
  • Chandrasekaran, Sinduja (2016)
    NGS technologies and advances in bioinformatics methodologies have led to the start and success of many projects in genomics. One such project is the Caenorhabditis Genome Project, which aims to generate draft genomes for all known but not yet sequenced Caenorhabditis species. Except for C. elegans, the model organism behind discoveries such as the molecular mechanism of cell death and RNA interference, not much is known about other species in this genus. This hinders our understanding of the evolution of C. elegans and its distinct characteristics. This project is therefore an initiative to understand the Caenorhabditis genus. The aim of my project was to sequence and annotate the genome of Caenorhabditis doughertyi as a part of the Caenorhabditis Genome Project. C. doughertyi is the sister species of C. brenneri, which is known for its high level of polymorphism among eukaryotes. It was initially found in Kerala, India, by MA Felix in 2007 and has both male and female adults. The sequencing of C. doughertyi would pave the way for understanding the evolution of the high diversity levels observed in C. brenneri. The raw genome data consisted of two paired-end libraries with insert sizes of 300 and 500 bp and read lengths of 125 bp. The quality of the reads was ensured by quality-control measures such as trimming of adaptor sequences, error correction and removal of DNA from non-target organisms. The reads were then assembled using multiple assemblers, and the ABySS assembly was selected as the best based on metrics such as N50 and biological parameters such as CEGMA completeness. The draft genome was then annotated using the MAKER pipeline, and orthologs were identified using OrthoMCL. The resulting draft genome can aid preliminary comparative genomic analyses with other species in the genus. Further work may focus on improving the quality of this draft assembly towards a publication-quality genome sequence for this species.
ACM Computing Classification System (CCS): Life and medical sciences -> Computational biology; Life and medical sciences -> Bioinformatics
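N50, one of the metrics used above to rank the candidate assemblies, is the contig length such that contigs of that length or longer cover at least half of the total assembly. A minimal sketch (the contig lengths are made up for illustration):

```python
def n50(contig_lengths):
    """Return N50: the length L such that contigs of length >= L
    together cover at least half of the total assembly."""
    total = sum(contig_lengths)
    running = 0
    for length in sorted(contig_lengths, reverse=True):
        running += length
        if running * 2 >= total:
            return length

# Hypothetical contig lengths (bp) from two candidate assemblies
assembly_a = [80_000, 40_000, 20_000, 10_000, 5_000]
assembly_b = [50_000, 50_000, 30_000, 15_000, 10_000]
print(n50(assembly_a), n50(assembly_b))
```

A larger N50 indicates a more contiguous assembly, which is why it is commonly combined with biological completeness measures such as CEGMA when choosing among assemblers.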
  • Tukiainen, Jenni (Helsingin yliopisto, 2016)
    This thesis examines the grading of essays in the advanced-level English matriculation examination. The focus is on the assessment criteria by which the essays are scored, both at the level of the official instructions and in practice. In addition, the aim is to find out what kinds of texts are required of candidates in the advanced English examination, and what kinds of questions and concerns related to essay assessment candidates raise in Yle's Abitreenit programme, in which the examiners comment on the exams and answer exam-related questions in live broadcasts immediately after the examinations. The data cover the examination rounds from autumn 2003 to autumn 2013, i.e. 21 examinations, each with four alternative writing assignments. In addition to recordings of the Abitreenit programme, the data include five graded essays from each examination, i.e. 105 essays in total, minutes of the Matriculation Examination Board's meetings, and the examination-specific regulations and instructions from 2002, 2004, 2005, 2007 and 2011. The methods used in the analysis are mainly qualitative, i.e. descriptive and comparative; some quantitative analysis is also included. The analysis shows that essay assessment rests on the criteria set by the regulations and instructions, which contain many vague descriptions and considerable room for interpretation. For example, not all possible scores have separate descriptors. There is little connection between the markings made on the essays and the assessment instructions, as the markings concern almost exclusively language errors, whereas the instructions also describe proficiency levels in terms of, for example, communicativeness and content. Assessment thus appears to be based largely on the assessor's interpretation and internalized conception of what an essay at a given level looks like. This conclusion is also supported by the meeting minutes, according to which the meetings review, among other things, example cases of essays at particular score levels.
Some inconsistency in the assessment is nevertheless visible in the data. The assignments ask candidates to write texts, most of which require academic writing skills, i.e. argumentation and the description of phenomena. Rhetorical skills are also emphasized, particularly in speeches. The remaining tasks focus on presenting factual information or narrating past events. Creative writing, for example, is not required in any of the assignments. The candidates' questions and concerns relate to very concrete and often detailed matters of language and form that may lead to point deductions. The examiners cannot give unambiguous answers to most of the questions unless the answer is found directly in the instructions and regulations (e.g. word-count limits). In the light of the results, essay assessment in the advanced English matriculation examination appears to be based largely on the assessor's intuition, which has developed to such precision that it can presumably be assumed to guarantee sufficient reliability of assessment. However, many issues related to the assignments and the assessment criteria seem to cause uncertainty year after year, both among candidates and among assessors.
  • Helle, Inari (Helsingin yliopisto, 2015)
    Coastal and marine ecosystems across the globe are heavily impacted by various anthropogenic stressors, which has led to a significant loss of biodiversity and ecosystem services in recent decades. In order to find means to counteract this trend, there is a need to develop methods for assessing the environmental impacts of human activities and the effectiveness of management practices to mitigate the harmful effects. However, this is a challenging task due to the complex interactions within and between the ecosystems and human components, and various uncertainties related to them. Bayesian networks (BNs) are graphical models for reasoning under uncertainty. A BN consists of a set of probabilistic variables connected with links describing causalities within the system. As the states of the variables are described with probability distributions, uncertainty can be described in an explicit manner. BNs also enable integration of qualitative and quantitative knowledge from various sources such as observational data sets, models and expert knowledge. In this thesis I have developed BN models to study environmental risks related to anthropogenic stressors in the Gulf of Finland and the Finnish Archipelago Sea. The main aim is to quantify human impacts on the environment, and to assess the ability of different management measures to lessen these impacts. I focus especially on oil spills resulting from potential tanker accidents and I set out to fill various information gaps related to this recently emerged threat. The thesis includes five papers. In paper I, the main aim is to assess the spatial risk posed by oil spills in the Gulf of Finland and the Finnish Archipelago Sea, and identify species and habitat types with the highest risk. 
In paper II, I focus on the effectiveness of different oil-combating methods in mitigating the negative impacts of oil spills on the ecosystem, and paper III widens the approach to a probabilistic cost-benefit analysis of preventive and post-spill measures. Paper IV deals with multiple risks: in addition to oil spills, eutrophication and the harvesting of species are studied. Paper V reviews and discusses various methods that can be applied to evaluate the uncertainty related to deterministic models, which could increase their usefulness in decision-making. The results suggest that the risks related to tanker accidents are distributed unevenly between areas, habitats and species. Furthermore, the results support the current Finnish strategy of basing oil combating primarily on offshore recovery vessels instead of chemical dispersants. However, as the efficiency of mechanical recovery depends on several factors, there is also a need to develop preventive measures. Although major oil accidents are estimated to be rare events, the costs can be very high if a spill occurs. The work offers new insights into the oil spill risks in the study area and provides examples of how Bayesian networks can be applied in environmental risk assessment. The thesis is part of the work needed to develop comprehensive decision-support tools for environmental risk management in the northern Baltic Sea.
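As a minimal illustration of the modelling approach described above (the structure, states and probabilities below are invented for the example, not taken from the thesis), a two-node Bayesian network can be evaluated with nothing more than the chain rule and Bayes' rule:

```python
# Toy two-node BN: Spill -> Impact. All numbers are illustrative only.
p_spill = {"yes": 0.01, "no": 0.99}          # prior P(Spill)
p_impact_given = {                            # CPT: P(Impact | Spill)
    "yes": {"high": 0.60, "low": 0.40},
    "no":  {"high": 0.05, "low": 0.95},
}

def p_impact(state):
    """Marginal P(Impact = state), summing over the parent node."""
    return sum(p_spill[s] * p_impact_given[s][state] for s in p_spill)

def p_spill_given_impact(spill, impact):
    """Posterior P(Spill = spill | Impact = impact) via Bayes' rule."""
    return p_spill[spill] * p_impact_given[spill][impact] / p_impact(impact)

print(p_impact("high"))                      # prior belief about high impact
print(p_spill_given_impact("yes", "high"))   # belief updated by evidence
```

Real BN models of the kind used in the thesis have many more nodes and rely on dedicated inference software, but the probabilistic mechanics, propagating distributions through conditional probability tables, are exactly these.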
  • Reiman, Teemu (Helsingin yliopisto, 2007)
    Failures in industrial organizations dealing with hazardous technologies can have widespread consequences for the safety of the workers and the general population. Psychology can have a major role in contributing to the safe and reliable operation of these technologies. Most current models of safety management in complex sociotechnical systems, such as nuclear power plant maintenance, are either non-contextual or based on an overly rational image of an organization. Thus, they fail to grasp either the actual requirements of the work or the socially constructed nature of the work in question. The general aim of the present study is to develop and test a methodology for contextual assessment of organizational culture in complex sociotechnical systems. This is done by demonstrating the findings that the application of the emerging methodology produces in the domain of maintenance of a nuclear power plant (NPP). The concepts of organizational culture and organizational core task (OCT) are operationalized and tested in the case studies. We argue that as the complexity of the work, technology and social environment increases, so does the significance of the most implicit features of organizational culture as a means of coordinating the work and achieving safety and effectiveness. For this reason a cultural perspective could provide additional insight into the problem of safety management. The present study aims to determine: (1) the elements of organizational culture in complex sociotechnical systems; (2) the demands the maintenance task sets for the organizational culture; (3) how the current organizational culture at the case organizations supports the perception and fulfilment of the demands of the maintenance work; (4) the similarities and differences between the maintenance cultures at the case organizations; and (5) how organizational culture in complex sociotechnical systems can be assessed.
Three in-depth case studies were carried out at the maintenance units of three Nordic NPPs. The case studies employed an iterative and multimethod research strategy. The following methods were used: interviews, CULTURE-survey, seminars, document analysis and group work. Both cultural analysis and task modelling were carried out. The results indicate that organizational culture in complex sociotechnical systems can be characterised according to three qualitatively different elements: structure, internal integration and conceptions. All three of these elements of culture as well as their interrelations have to be considered in organizational assessments or important aspects of the organizational dynamics will be overlooked. On the basis of OCT modelling, the maintenance core task was defined as balancing between three critical demands: anticipating the condition of the plant and conducting preventive maintenance accordingly, reacting to unexpected technical faults and monitoring and reflecting on the effects of maintenance actions and the condition of the plant. The results indicate that safety was highly valued at all three plants, and in that sense they all had strong safety cultures. In other respects the cultural features were quite different, and thus the culturally-accepted means of maintaining high safety also differed. The handicraft nature of maintenance work was emphasised as a source of identity at the NPPs. Overall, the importance of safety was taken for granted, but the cultural norms concerning the appropriate means to guarantee it were little reflected. A sense of control, personal responsibility and organizational changes emerged as challenging issues at all the plants. The study shows that in complex sociotechnical systems it is both necessary and possible to analyse the safety and effectiveness of the organizational culture. 
Safety in complex sociotechnical systems cannot be understood or managed without understanding the demands of the organizational core task and managing the dynamics between the three elements of the organizational culture.
  • Andersson, Matias (2014)
    This research investigates the possibility of implementing ecological sanitation technologies in the Taita Hills in south-eastern Kenya, thereby contributing to a sustainable local development approach. The approach taken is a description and analysis of social and cultural preferences regarding sanitation and the idea of reusing human excreta in agricultural production. Poor sanitation, with the range of problems it gives rise to, is a widely acknowledged and researched issue in the field of human development, underlined by the inclusion of sanitation in both the Millennium Development Goals (MDGs) and the upcoming Sustainable Development Goals (SDGs). In addition to putting a burden of disease on affected populations, a lack of proper sanitation facilities is identified as both a cause and a consequence of poverty. Sanitation solutions also play a notable role in the interaction between settlements and the natural environment. Ecological sanitation includes a wide range of technologies and other solutions aimed at improving sanitation in a given community while diminishing the waste that is allowed to pollute the environment, most notably water bodies. A remarkable aspect of ecological sanitation in agricultural areas is the possibility of treating human waste to produce fertilizers suitable for use in local farming. This would enable communities to close the cycle of nutrient flows, as nutrients withdrawn from the soil in the form of agricultural produce would be returned as fertilizer. In addition, local, low-cost production of fertilizers is assumed to be a sustainable way of weakening dependence on international fertilizer markets, thereby improving rural livelihoods. The possibility of improved access to suitable fertilizers is also a key aspect of improved food security.
Understanding local perceptions and attitudes regarding sanitation is crucial to finding socially and culturally applicable, acceptable and sustainable ecological sanitation solutions. This study uses semi-structured stakeholder interviews and expert interviews to investigate those attitudes, as well as to gain insights into current sanitation and farming practices. Involvement of local views in the research process is enhanced by the use of participatory ranking exercises, enabling local views and preferences to find practical and specific expression. Current sanitation solutions and their connection to the environment are also included in the interview framework. The results of the fieldwork are examined with a qualitative content analysis to present a comprehensive picture of the current sanitation situation in relation to local livelihoods, to describe local attitudes towards different sanitation solutions, and to describe how ecological sanitation solutions might be implemented so as to improve local livelihoods and food security. Through this, a framework is produced that can be used for further work on ecological sanitation in the Taita Hills area. The ultimate objective of the study is to assess the feasibility and potential of using ecological sanitation to improve both food security and sanitation in the study area. The results point to the conclusion that reusing human waste cannot be considered a taboo in the Taita Hills but could be promoted through locally designed solutions as well as education and training regarding ecological sanitation.
  • Santangeli, Andrea (Helsingin yliopisto, 2013)
    Humans are the main cause of the ongoing large-scale biodiversity crisis, mostly through processes such as habitat loss, fragmentation and degradation. The recent recognition of the scale and rate of biodiversity erosion has stimulated strong political and institutional reactions, culminating in the implementation of a large number of conservation initiatives. Such efforts have been largely insufficient to revert or slow down the rapid loss of biodiversity. Commonly, conservation resources have been allocated based on decisions supported by traditional knowledge or expert opinion rather than scientific evidence. It is therefore important that interventions are evaluated, to ultimately allow learning from past actions and taking better decisions in the future. With this thesis I aim to provide the evidence needed to improve the effectiveness of different approaches to the conservation of species affected by anthropogenic activities. In doing so, I considered conservation interventions implemented mostly on private land with different underlying approaches: voluntary and inexpensive (based on the self-motivation of landowners); voluntary market-based (landowners are compensated); and compulsory land reservation or legislation (landowners have no choice). I first evaluate the effectiveness of a conservation program aimed at protecting raptor nests in private forests of North Karelia in eastern Finland. I show that here an inexpensive voluntary approach, based on the self-motivation of landowners, may represent an effective instrument for achieving conservation with very limited financial resources. This approach was effective not only at eliciting the participation of local forest owners, but it also provided ecological benefits to the raptor species considered. Relevant outcomes for practical conservation can also emerge when multiple interventions are compared. This was the case for nest protection of the Montagu's harrier breeding in the croplands of Spain and France.
In France, protection of nests from harvesting operations has been achieved on a voluntary basis. Here I show that the most effective interventions for enhancing nest productivity were those that protect not only from harvesting but also from predation, achieved by erecting a protective fence around the nest. On the other hand, some nest protection measures in Spain were more expensive due to payments to farmers. Here, temporary removal of the chicks during harvest operations or relocation of the nest to a nearby safe place, as well as harvest delay, were the most effective measures for enhancing nest productivity. Harvest delay was also the most expensive of all the measures; therefore, removal or relocation of the nest should be prioritized wherever operationally feasible. Interestingly, the most commonly employed measure, the retention of a small buffer of un-harvested crop, was also less effective than the other means. Unexpected but positive outcomes for conservation management also emerged from an evaluation of the effects of nest-site protection for breeding White-tailed eagles in south-western Finland. The species bred as often and as successfully in protected as in unprotected areas, which suggests that compulsory and expensive protection through land reservation may not be necessary under the studied conditions. The species apparently thrives also in unprotected land subject to some level of anthropogenic activity. I found the opposite result in a study on the protection of flying squirrel sites in Finland. Here I provide evidence indicating that the legislation enforced to protect the species' habitat in Finnish forests is ineffective. Species occupancy at sites protected according to the law declined strongly following tree harvest. This indicates that the primary objective of the legislation (i.e. preventing deterioration of the sites where the species occurs) is not met.
This is because conservation of the flying squirrel's habitat may conflict with forestry interests, and restrictions have thus been set largely in favour of the latter and to the detriment of the former. The case studies presented here indicate that evaluating the effectiveness of past actions is important. This step allows understanding whether past efforts have reached their initial objectives. Only with the strength of this evidence is it possible to adaptively revise current conservation plans and increase the chances of reaching the desired outcome from any given action. This is particularly relevant in the modern era, where conservation challenges are enormous and resources limited. It is therefore crucial that any implemented effort produces the best possible outcome for conservation.
  • Pagels, Max (2013)
    Productivity is an important aspect of any software development project as it has direct implications on both the cost of software and the time taken to produce it. Though software development as a field has evolved significantly during the last few decades in terms of development processes, best practices and the emphasis thereon, the way in which the productivity of software developers is measured has remained comparatively stagnant. Some established metrics focus on a sole activity, such as programming, which paints an incomplete picture of productivity given the multitude of different activities that a software project consists of. Others are more process-oriented --- purporting to measure all types of development activities --- but require the use of estimation, a technique that is both time-consuming and prone to inaccuracy. A metric that is comprehensive, accurate and suitable in today's development landscape is needed. In this thesis, we examine productivity measurement in software engineering from both theoretical and pragmatic perspectives in order to determine if a proposed metric, implicitly estimated velocity, could be a viable alternative for productivity measurement in Agile and Lean software teams. First, the theory behind measurement --- terminology, data types and levels of measurement --- is presented. The definition of the term productivity is then examined from a software engineering perspective. Based on this definition and the IEEE standard for validating software quality metrics, a set of criteria for validating productivity metrics is proposed. The motivations for measuring productivity and the factors that may impact it are then discussed and the benefits and drawbacks of established metrics --- chief amongst which is productivity based on lines of code written --- explored. 
To assess the accuracy and overall viability of implicitly estimated velocity, a case study comparing the metric to LoC-based productivity measurement was carried out at the University of Helsinki's Software Factory. Two development projects were studied, both adopting Agile and Lean methodologies. Following a linear-analytical approach, quantitative data from both project artefacts and developer surveys indicated that implicitly estimated velocity is a metric more valid than LoC-based measurement in situations where the overall productivity of an individual or team is of more importance than programming productivity. In addition, implicitly estimated velocity was found to be more consistent and predictable than LoC-based measurement in most configurations, lending credence to the theory that implicitly estimated velocity can indeed replace LoC-based measurement in Agile and Lean software development environments.
  • Hailikari, Telle (2010)
    The aim of this dissertation was to explore how different types of prior knowledge influence student achievement and how different assessment methods influence the observed effect of prior knowledge. The project started by creating a model of prior knowledge which was tested in various science disciplines. Study I explored the contribution of different components of prior knowledge on student achievement in two different mathematics courses. The results showed that the procedural knowledge components which require higher-order cognitive skills predicted the final grades best and were also highly related to previous study success. The same pattern regarding the influence of prior knowledge was also seen in Study III which was a longitudinal study of the accumulation of prior knowledge in the context of pharmacy. The study analysed how prior knowledge from previous courses was related to student achievement in the target course. The results implied that students who possessed higher-level prior knowledge, that is, procedural knowledge, from previous courses also obtained higher grades in the more advanced target course. Study IV explored the impact of different types of prior knowledge on students’ readiness to drop out from the course, on the pace of completing the course and on the final grade. The study was conducted in the context of chemistry. The results revealed again that students who performed well in the procedural prior-knowledge tasks were also likely to complete the course in pre-scheduled time and get higher final grades. On the other hand, students whose performance was weak in the procedural prior-knowledge tasks were more likely to drop out or take a longer time to complete the course. Study II explored the issue of prior knowledge from another perspective. Study II aimed to analyse the interrelations between academic self-beliefs, prior knowledge and student achievement in the context of mathematics. 
The results revealed that prior knowledge was more predictive of student achievement than were other variables included in the study. Self-beliefs were also strongly related to student achievement, but the predictive power of prior knowledge overruled the influence of self-beliefs when they were included in the same model. There was also a strong correlation between academic self-beliefs and prior-knowledge performance. The results of all the four studies were consistent with each other indicating that the model of prior knowledge may be used as a potential tool for prior knowledge assessment. It is useful to make a distinction between different types of prior knowledge in assessment since the type of prior knowledge students possess appears to make a difference. The results implied that there indeed is variation between students’ prior knowledge and academic self-beliefs which influences student achievement. This should be taken into account in instruction.
  • Jääskeläinen, Elina (Helsingin yliopisto, 2008)
    Despite improving levels of hygiene, the incidence of registered food-borne disease has remained at the same level for many years: in Finland there were 40 to 90 epidemics in which 1,000 to 9,000 persons contracted food poisoning through food or drinking water. Until 2004, salmonella and campylobacter were the most common bacterial causes of food-borne disease, but in 2005-2006 Bacillus cereus was the most common. A similar development had already been reported, e.g., in Germany in the 1990s. One reason for this can be Bacillus cereus and its emetic toxin, cereulide. Bacillus cereus is a common environmental bacterium that contaminates raw materials of food. Unlike salmonella and campylobacter, Bacillus cereus is heat resistant, capable of surviving most cooking procedures due to its production of highly thermoresistant spores. The food involved has usually been heat treated, and the surviving spores are the source of the food poisoning: the heat treatment induces germination of the spores, and the vegetative cells then produce toxins. This doctoral thesis focuses on developing methods for assessing and eliminating the risks to food safety posed by cereulide-producing Bacillus cereus. The biochemistry and physiology of cereulide production were investigated, and the results offer tools for minimizing the toxin risk during food production. I developed methods for the extraction and quantitative analysis of cereulide directly from food. A prerequisite for this is knowledge of the chemical and physical properties of the toxin. Because cereulide is practically insoluble in water, I used organic solvents (methanol, ethanol and pentane) for the extraction. For extraction from bakery products I used high temperature (100 °C) and pressure (103.4 bar). An alternative for effective extraction is to flood the plain food with ethanol, followed by stationary equilibration at room temperature.
I used this protocol for extracting cereulide from potato puree and penne. With this extraction method it is also possible to extract cereulide from liquid foods, such as milk. These extraction methods are important improvements for the study of Bacillus cereus emetic food poisonings: prior to my work, cereulide extraction was done using water, and as a result the yield was poor and variable. To investigate suspected food poisonings, it is important to demonstrate the actual toxicity of the incriminated food. Many toxins, but not cereulide, are inactivated during food processing such as heating. The next step is to identify the toxin by chemical methods. With my colleague Maria Andesson I developed a rapid assay, taking 5 to 15 minutes, for the detection of cereulide toxicity. By applying this test it is possible to rapidly detect which food caused the food poisoning. The chemical identification of cereulide was achieved using mass spectrometry. I used cereulide-specific molecular ions, m/z (+/-0.3) 1153.8 (M+H+), 1171.0 (M+NH4+), 1176.0 (M+Na+) and 1191.7 (M+K+), for reliable identification. I investigated foods to determine their amenability to accumulating cereulide. Cereulide was formed in high amounts (0.3 to 5.5 microg/g wet wt) when cereulide-producing B. cereus strains were present in beans, rice, rice pastry and meat pastry stored at non-refrigerated temperatures (21-23 °C). Rice and meat pastries are frequently consumed under conditions where no cooled storage is available, e.g. picnics and outdoor events. Bacillus cereus is a ubiquitous spore former and is therefore difficult to eliminate from foods. It is therefore important to know which conditions affect the formation of cereulide in foods. My research showed that the cereulide content was strongly affected (10- to 1000-fold differences in toxin content) by the growth environment of the bacterium. Storage of foods under a nitrogen atmosphere (> 99.5 %) prevented the production of cereulide.
However, when carbon dioxide was also present, minimizing the oxygen content (< 1 %) did not protect the food from the formation of cereulide in preliminary experiments. Food supplements also affected cereulide production, at least in the laboratory. Adding the free amino acids leucine and valine stimulated cereulide production 10- to 20-fold. In peptide-bonded form, these amino acids are natural constituents of all proteins. Interestingly, adding peptide-bonded leucine and valine had no significant effect on cereulide production. The free amino acids leucine and valine are approved food supplements and widely used as flavour modifiers in food technology. My research showed that these food supplements may increase the risk of food poisoning even though they are not toxic themselves.
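The mass spectrometric identification described above amounts to matching measured peaks against reference adduct ions within a tolerance window. A minimal sketch in Python, using the ±0.3 m/z tolerance and the four reference ions quoted in the abstract; the function and dictionary names are illustrative, not part of the published method:

```python
# Reference adduct ions for cereulide, as listed in the abstract.
CEREULIDE_ADDUCTS = {
    "[M+H]+": 1153.8,
    "[M+NH4]+": 1171.0,
    "[M+Na]+": 1176.0,
    "[M+K]+": 1191.7,
}

def identify_adduct(measured_mz, tolerance=0.3):
    """Return the adduct label whose reference m/z lies within
    the tolerance of the measured peak, or None if no match."""
    for label, ref_mz in CEREULIDE_ADDUCTS.items():
        if abs(measured_mz - ref_mz) <= tolerance:
            return label
    return None
```

Requiring several of the four adducts to match in one spectrum, rather than a single peak, is what makes the identification reliable.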
  • Klenberg, Liisa (Helsingin yliopisto, 2015)
    Executive functions (EFs) are essential for coordinating and controlling other cognitive functions and behavior. EFs are involved in all purposeful activity, and for children, they are important for learning and functioning in school environments. Difficulties in EFs are common in school-age children with developmental disabilities, such as attention deficit hyperactivity disorder (ADHD). This thesis consists of three studies addressing EFs in school-age children. The first study employed neuropsychological tests from the Developmental Neuropsychological Assessment NEPSY to examine age-related differences in EFs in a sample of 400 children. The second study investigated the methodological factors related to EF measures in a sample of 340 children using response inhibition tasks from the NEPSY-II, the second edition of the Developmental Neuropsychological Assessment. The third study aimed at constructing a new instrument, the Attention and Executive Function Rating Inventory ATTEX, for the clinical assessment of EFs, and at verifying the psychometric properties of the rating scale in a sample of 916 children. Age-related improvement in EF task performance continued throughout the school-age period, proceeding from inhibition to attention control, and further to fluency of actions. A closer examination of response inhibition showed developmental variation even within this EF domain. The developmental change was apparent at school age, but different outcome measures and the cognitive requirements of the tasks had an effect on how the development of response inhibition was depicted. In the assessment of everyday EF behaviors, the ATTEX rating scale demonstrated high internal consistency reliability and good criterion and discriminant validity for ADHD. The EF profiles differed between the ADHD subtypes, and children with predominantly inattentive symptoms showed more wide-ranging difficulties in EFs than children with combined symptoms of inattention and hyperactivity-impulsivity. 
Carefully examined, reliable, and valid measures are essential for both the scientific research and clinical assessment of EFs. Factors related to the measures, e.g., materials, stimuli, and the selected outcome measure, as well as the cognitive processes involved in the tasks, have an effect on how the development of EFs is depicted. Close examination of these factors can help to attain a more consistent account of EF development. In the clinical assessment of EF difficulties, the measures need to be sensitive to the actual difficulties that arise in everyday situations. These are best assessed with standardized rating scales. The new rating scale presented in this thesis proved to be a suitable measure both for screening and for examining the detailed EF profiles of children in school situations.
  • Gideon Neba, Shu (Helsingfors universitet, 2013)
    This study quantified above-ground biomass affected by selective logging in the tropical rainforest of South East Cameroon and also investigated the suitability of the density of logging roads, the density of log yards, and variables from MODIS 250 m data (Red, NIR, MIR, NDVI, EVI) in explaining the above-ground biomass logged. Above-ground biomass logged was quantified using allometric equations. The surface areas of logging roads and log yards were quantified and used to determine the above-ground biomass affected by these infrastructures, based on a national reference baseline value for the forest zone of Cameroon. A comparative analysis revealed that 50% of potentially exploitable commercial tree species were effectively harvested, with a harvesting intensity of 0.78 trees ha-1 representing an average above-ground biomass of 3.51 Mg ha-1. The results also indicated that 5.65 Mg ha-1 of above-ground biomass was affected by logging infrastructure, i.e., 62%, compared with the 38% of above-ground biomass that was logged. Correlation and regression analysis showed that the density of logging roads explained 66% of the variation in above-ground biomass logged, and the density of logging roads together with NDVI from MODIS data explained 73% of the variation. The density of log yards and the variables from MODIS data were generally weak in explaining the variation in above-ground biomass logged.
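Allometric biomass estimation of the kind used above typically predicts per-tree biomass from diameter at breast height (DBH) with a power-law equation, then aggregates over the harvested trees and the logged area. A minimal sketch, assuming a generic form AGB = a · DBH^b; the coefficients below are illustrative placeholders, not the equations actually used in the study:

```python
def agb_kg(dbh_cm, a=0.1, b=2.5):
    """Per-tree above-ground biomass (kg) from DBH (cm) via a
    generic power-law allometric equation. Coefficients a and b
    are hypothetical placeholders for illustration only."""
    return a * dbh_cm ** b

def stand_agb_mg_per_ha(dbh_list_cm, area_ha):
    """Sum per-tree biomass (kg) over harvested trees and convert
    to Mg per hectare, the unit used in the abstract."""
    total_kg = sum(agb_kg(d) for d in dbh_list_cm)
    return total_kg / 1000.0 / area_ha
```

With harvested-tree DBH lists per compartment, the same aggregation yields figures directly comparable to the 3.51 Mg ha-1 reported for logged biomass.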
  • Hielm-Björkman, Anna (Helsingin yliopisto, 2007)
    The series of investigations presented in this thesis examined different methods of assessing chronic pain in dogs suffering from osteoarthritis (OA) and compared the effects of three different treatments. Data were obtained from two cohorts: 41 dogs with OA due to canine hip dysplasia (CHD) (I,III) and 61 dogs with OA due to CHD or elbow dysplasia (II,IV,V). Questionnaires, veterinary evaluations, visual analog scales (VAS), plasma hormones, radiographs, and force plate evaluations were assessed as OA treatment outcome measures and/or measurements of chronic pain. The results indicated that the multidimensional pain scale, comprising 11 questions with 5-point scale responses, was a valid and reliable tool for evaluating chronic pain. This Helsinki chronic pain index (HCPI) can be applied as an outcome measure in clinical trials in which chronic pain is evaluated by owners. Of the evaluated complementary therapies for chronic pain due to OA, all three indicated a positive treatment outcome. In the first trial, gold bead implants resulted in a significant positive treatment outcome for the treatment group. However, the placebo group in this study also improved significantly. A positive effect was seen in 65% of the placebo dogs, and this exceptionally high incidence of amelioration suggests that the placebo group may have experienced an effect from unintentional needle acupuncture. The results of this study are therefore controversial, and treatment guidelines based on these findings cannot be given. The second trial tested two ingestible OA remedies, green lipped mussel and a homeopathic low-dose combination preparation. Both treatments resulted in statistically significant positive treatment outcomes compared with placebo, with the positive control (carprofen) being more effective than either of them. The results suggest that both tested treatments may be beneficial for chronic OA. 
To establish the true role of these three treatments in outcome-based animal analgesia, more clinical trials using larger cohorts should be conducted. Possible mechanisms of action should also be studied.
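A questionnaire index of the kind described, 11 items each answered on a 5-point scale, is conventionally scored by summing the item responses. A minimal sketch, assuming each item is coded 0-4 so the index ranges 0-44; the actual item wording and scoring rules are defined in the thesis, so this function is illustrative only:

```python
def hcpi_score(item_responses):
    """Sum an 11-item owner questionnaire into a single chronic
    pain index. Assumes 0-4 coding per item (total range 0-44)."""
    if len(item_responses) != 11:
        raise ValueError("the index has exactly 11 items")
    if any(not (0 <= r <= 4) for r in item_responses):
        raise ValueError("each item is scored on a 0-4 scale")
    return sum(item_responses)
```

Summing items into one index is what allows the questionnaire to be used as a single continuous outcome measure in a clinical trial.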
  • Kiiski, Kirsi (Helsingin yliopisto, 2015)
    Nemaline myopathy (NM) and related disorders constitute a heterogeneous group of congenital myopathies. Mutations in the nebulin gene (NEB) are the main cause of the recessively inherited form. NEB is one of the largest genes in the human genome, consisting of 249 kb of genomic sequence. NEB contains 183 exons and a 32 kb homologous triplicate region (TRI) in which eight exons are repeated three times. The aims of this Doctoral Thesis study were to develop and implement into diagnostics new, efficient variant analysis methods for NEB and other NM-causing genes. The first aim was to design and validate a custom copy number microarray targeting the NM-causing genes for the detection of copy number variations. MLPA (multiplex ligation-dependent probe amplification) and Sanger sequencing were also used. The second aim was to utilise whole-exome sequencing to search for novel disease-causing variants in the known NM genes and to try to identify novel NM genes. Lastly, the aim was to collect more data in order to try to find genotype-phenotype correlations of NEB-caused NM. The design and validation of the NM-CGH microarray was successful. Of the total sample cohort of 356 NM families, 196 NM families were studied using the custom-made NM-CGH array. Nine different novel large causative variants were identified in ten NM families. The size of these variants varies greatly, ranging from part of a single NEB exon up to dozens of NEB exons (72 bp to 133 kb). In addition, a novel recurrent variation of the NEB TRI region was identified in 13% of the NM families and in 10% of the 60 control samples studied. Deviations of one copy are suggested to be benign, but gains of two or more copies might be pathogenic. One novel homozygous deletion was also identified in another NM gene, TPM3, in a patient with severe NM. Furthermore, ten samples were studied using exome sequencing, and for six of those samples, novel disease-causing variant(s) were identified. 
Two variants were identified in one family in a novel, putative NM gene that is currently under further investigation. Thus far, 165 of the total cohort of 356 NM families have been identified with two pathogenic NEB variants. Altogether 220 different pathogenic variants were identified in these 165 families, accentuating that the patients in the majority (84%) of the families are compound heterozygous for two different NEB variants. Most of the variants are small single-nucleotide changes, whereas large variants are rarer. However, copy number variations are much more frequent than previously thought: pathogenic copy number variants were identified in 16% of these 356 NM families. Genotype-phenotype correlations between the type of NEB mutation and the NM subtypes remained, however, unobtainable. The NM-CGH microarray has been implemented in the molecular diagnostics of NM. Using the NM-CGH microarray followed by exome sequencing has accelerated mutation detection. This combination has increased the coverage of the NM genes and thus improved the diagnostics of NM and NM-related disorders.
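The interpretation rule stated for the TRI region can be expressed as a simple threshold check. A sketch under the assumption that the normal diploid state carries 2 × 3 = 6 copies of the eight-exon repeat block (two alleles, each with the triplicate); the function and category labels paraphrase the abstract's wording and are not the diagnostic pipeline itself:

```python
# Assumed normal diploid copy number of the NEB TRI repeat block:
# three copies per allele on each of two alleles.
NORMAL_TRI_COPIES = 6

def classify_tri_copy_number(copies):
    """Paraphrase of the rule in the text: deviations of one copy
    are suggested to be benign; gains of two or more copies might
    be pathogenic. Larger losses are left as uncertain here."""
    gain = copies - NORMAL_TRI_COPIES
    if gain >= 2:
        return "possibly pathogenic"
    if abs(gain) <= 1:
        return "likely benign"
    return "uncertain"
```

Such a rule is only a first-pass filter; the abstract itself hedges with "might be pathogenic", so flagged samples would still need confirmatory analysis.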
  • Laakso, Hanna (Helsingin yliopisto, 2015)
    Objective: Cognitive impairment as a consequence of a stroke is common. Advanced age increases the frequency of poststroke cognitive deficits. Executive dysfunction in particular has an important role in poststroke disability. Because of their complex nature, however, executive functions are difficult to measure. The Hayling test, Design fluency task and Questioning task are among the less common assessment methods for executive functions and thus have not been widely studied. The aim of the present study was to assess the feasibility of these tests in elderly patients three months after ischemic stroke. Performances on these tests were compared with conventional assessment methods of executive functions, and their predictive value for functional disability at follow-up was examined. Methods: 62 stroke patients and 39 control subjects, aged 55-85, underwent comprehensive neurological and neuropsychological examinations three months after the index stroke. Executive functions were studied with the Trail Making test, Stroop test, Wisconsin card sorting test and Verbal fluency task, as well as with the Hayling test, Design fluency task and Questioning task. The modified Rankin Scale (mRS) and Lawton's Instrumental Activities of Daily Living scale (IADL) were used to assess functional abilities at three months, and the mRS after 15 months of follow-up. Results and conclusions: The Hayling test and Questioning task and the four conventional tests of executive functions differentiated stroke patients from healthy controls. Furthermore, the executive functions predicted functional dependence in the elderly stroke patients. The Hayling test was most consistently associated with functional disability as evaluated with the mRS and IADL three months after the stroke, and predicted functional disability as evaluated with the mRS at 15 months of follow-up. Of all the executive function tests, the Hayling test proved to be the most consistent predictor of functional abilities in elderly stroke patients. 
However, there is no gold standard for measuring executive functions, and in the future, more sensitive methods are needed. Nevertheless, the present study confirms the importance of assessing executive functions in clinical populations when predicting functional disability, even in the long term.
  • Järvinen, Päivi (Helsingin yliopisto, 2011)
    The first-line medication for mild to moderate Alzheimer's disease (AD) is based on cholinesterase inhibitors, which prolong the effect of the neurotransmitter acetylcholine in cholinergic nerve synapses and thereby relieve the symptoms of the disease. Implications of cholinesterase involvement in disease-modifying processes have increased interest in this research area. The drug discovery and development process is long and expensive, taking on average 13.5 years and costing approximately 0.9 billion US dollars. Drug attrition in the clinical phases is common for several reasons, e.g., poor bioavailability of compounds leading to low efficacy or toxic effects. Thus, improvements in the early drug discovery process are needed to create highly potent, non-toxic compounds with predicted drug-like properties. Nature has been a good source for the discovery of new medicines, accounting for around half of the new drugs approved for the market during the last three decades. These compounds are direct isolates from nature, their synthetic derivatives, or natural mimics. Synthetic chemistry is an alternative way to produce compounds for drug discovery purposes. Both sources have pros and cons. The screening of new bioactive compounds in vitro is based on assaying compound libraries against targets. The assay set-up has to be adapted and validated for each screen to produce high-quality data. Depending on the size of the library, miniaturization and automation are often required to reduce solvent and compound amounts and to speed up the process. In this contribution, natural extract, natural pure compound and synthetic compound libraries were assessed as sources of new bioactive compounds. The libraries were screened primarily for acetylcholinesterase inhibitory effect and secondarily for butyrylcholinesterase inhibitory effect. 
To be able to screen the libraries, two assays were evaluated as screening tools and adapted to be compatible with the special features of each library. The assays were validated to produce high-quality data. Cholinesterase inhibitors with various potencies and selectivities were found in the natural product and synthetic compound libraries, which indicates that the two sources complement each other. It is acknowledged that natural compounds differ structurally from compounds in synthetic compound libraries, which further supports the view of complementarity, especially if high structural diversity is the criterion for selecting compounds for a library.
  • Kylmälä, Minna (Helsingin yliopisto, 2014)
    Myocardial infarct size is clinically relevant, affecting heart function and patient prognosis. After myocardial infarction (MI), it is important to establish whether myocardial dysfunction is due to permanent infarct damage, or if the myocardium is viable, in which case contraction may be improved by revascularization. Established methods for assessing infarct size and viability, such as cardiac magnetic resonance imaging with delayed contrast enhancement (DE-CMR) and myocardial perfusion imaging are neither readily available nor suitable in acute MI. In contrast, electrocardiography (ECG) and echocardiography are widely available diagnostic methods at the patient’s bedside at any hour. Echocardiographic strain rate imaging, based on measurement of myocardial velocities by tissue Doppler, is a sensitive and objective method for quantification of myocardial contraction. If myocardium contracts, it is viable. Body surface potential mapping (BSPM) records ECG with multiple leads covering the entire thorax, with a variety of ECG variables automatically calculated from its recordings. The aim of the studies for this thesis was to evaluate whether infarct size and myocardial viability can be assessed by strain rate imaging and BSPM. The studies included up to 57 patients with acute coronary syndrome, most with an infarction. BSPM with 123 leads and echocardiography were performed within 48 hours after onset of chest pain, and repeated during recovery of the infarction, at 1 to 4 weeks, and after healing, at 6 to 12 months. Global infarct size and segmental extent of infarct were determined by DE-CMR after healing. Strain rate imaging allowed assessment of viability and global infarct size in both acute and chronic MI. Strain mapping was validated, for the first time, as a semi-quantitative method for the assessment of systolic strain, and showed excellent correlation with quantitative strain values. 
In chronic MI, segments with systolic strain values less than -6% were most probably viable, having no infarct or a non-transmural infarct. Strain mapping proved as good as quantitative strain in distinguishing transmural from non-transmural infarcts. In acute MI, strain- and strain-rate variables could distinguish viable from non-viable segments, post-systolic strain having the best accuracy at predicting recovery of severe contraction abnormality (AUC 0.78). BSPM could estimate infarct size at all stages of the infarction, with Q- and R-wave variables, as well as the QRS integral having the strongest correlations with infarct size at all time-points. The repolarization variables were clearly inferior; only in chronic MI did the T-wave variables have nearly as strong correlations with infarct size as did QRS variables. In contrast, the repolarization variables proved good at predicting recovery of left ventricular (LV) function in acute MI, irrespective of MI location. The 1st QRS integral was the only depolarization variable good at predicting recovery of LV dysfunction, and the only variable able to estimate infarct size in addition to viability. In conclusion, strain rate imaging as well as computed ECG variables can predict recovery of myocardial function in acute MI and can assess infarct size in both acute and chronic MI. Strain values can be quickly and accurately estimated by the strain-mapping method, validated now for the first time in the assessment of infarct transmurality. These methods, easily performed at bedside, may help the clinician assess patient prognosis and the need for revascularization after MI.
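The segment-level viability rule reported above (systolic strain below, i.e. more negative than, -6% in chronic MI indicating probable viability) is a simple threshold classifier. A minimal sketch; the -6% cut-off comes from the text, while the function names and the strict-inequality choice at the boundary are illustrative assumptions:

```python
# Cut-off from the abstract: strain values less than -6% (more
# negative, i.e. stronger contraction) suggest a viable segment.
VIABILITY_STRAIN_THRESHOLD = -6.0  # percent

def segment_probably_viable(systolic_strain_percent):
    """True if the segment's systolic strain is more negative than
    the -6% threshold (probable viability in chronic MI)."""
    return systolic_strain_percent < VIABILITY_STRAIN_THRESHOLD

def count_viable(strains):
    """Number of probably viable segments in a per-segment list."""
    return sum(segment_probably_viable(s) for s in strains)
```

Because strain is negative for contracting myocardium, "less than -6%" means stronger contraction, which is why the comparison runs in the direction it does.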