Browsing by Title

  • Wendland, Lauri (Helsingin yliopisto, 2009)
    This thesis describes methods for the reliable identification of hadronically decaying tau leptons in the search for heavy Higgs bosons of the minimal supersymmetric standard model of particle physics (MSSM). The identification of the hadronic tau lepton decays, i.e. tau-jets, is applied to the gg->bbH, H->tautau and gg->tbH+, H+->taunu processes to be searched for in the CMS experiment at the CERN Large Hadron Collider. Of all the event selections applied in these final states, the tau-jet identification is the single most important criterion for separating the tiny Higgs boson signal from the large number of background events. The tau-jet identification is studied with methods based on a signature of a low charged track multiplicity, the containment of the decay products within a narrow cone, an isolated electromagnetic energy deposition, a non-zero tau lepton flight path, the absence of electrons, muons, and neutral hadrons in the decay signature, and a relatively small tau lepton mass compared to the mass of most hadrons. Furthermore, in the H+->taunu channel, helicity correlations are exploited to separate the signal tau jets from those originating from W->taunu decays. Since many of these identification methods rely on the reconstruction of charged particle tracks, the systematic uncertainties resulting from the mechanical tolerances of the tracking sensor positions are estimated with care. The tau-jet identification and other standard selection methods are applied to the search for the heavy neutral and charged Higgs bosons in the H->tautau and H+->taunu decay channels. For the H+->taunu channel, the tau-jet identification is redone and optimized with a more recent and more detailed event simulation than previously used in the CMS experiment. Both decay channels are found to be very promising for the discovery of the heavy MSSM Higgs bosons. The Higgs boson(s), whose existence has not yet been experimentally verified, are a part of the standard model and its most popular extensions. They are a manifestation of a mechanism which breaks the electroweak symmetry and generates masses for particles. Since the H->tautau and H+->taunu decay channels are important for the discovery of the Higgs bosons in a large region of the permitted parameter space, the analysis described in this thesis serves as a probe of the properties of particles and their interactions at energy scales beyond the standard model of particle physics.
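    The selection signatures listed above map naturally onto a cut-based filter. The sketch below only illustrates that structure and is not the CMS tau identification: every field name and threshold is invented for the example.

    ```python
    from dataclasses import dataclass

    @dataclass
    class Jet:
        """Hypothetical reconstructed jet; all field names are illustrative."""
        n_charged_tracks: int        # charged track multiplicity
        signal_cone_fraction: float  # fraction of energy inside a narrow cone
        isolation_et: float          # electromagnetic E_T in the isolation annulus (GeV)
        flight_path_sig: float       # significance of the reconstructed flight path
        has_electron: bool
        has_muon: bool
        visible_mass: float          # visible invariant mass of the decay products (GeV)

    def passes_tau_id(jet: Jet) -> bool:
        """Toy cut-based tau-jet identification; all thresholds are made up."""
        return (jet.n_charged_tracks in (1, 3)      # 1- or 3-prong decays
                and jet.signal_cone_fraction > 0.9  # decay products in a narrow cone
                and jet.isolation_et < 1.0          # isolated electromagnetic deposition
                and jet.flight_path_sig > 2.0       # non-zero tau flight path
                and not jet.has_electron            # veto electrons
                and not jet.has_muon                # veto muons
                and jet.visible_mass < 1.8)         # small mass compared to most hadrons
    ```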
  • Heinonen, Satu-Maarit (Helsingin yliopisto, 2006)
    Epidemiological studies have associated high soy intake with a lowered risk for certain hormone-dependent diseases, such as breast and prostate cancers, osteoporosis, and cardiovascular disease. Soy is a rich source of isoflavones, diphenolic plant compounds that have been shown to possess several biological activities. Soy is not part of the traditional Western diet, but many dietary supplements are commercially available in order to provide the proposed beneficial health effects of isoflavones without changing the original diet. These supplements are usually manufactured from extracts of soy or red clover, which is another important source of isoflavones. However, until recently, detailed studies of the metabolism of these compounds in humans have been lacking. The aim of this study was to identify urinary metabolites of isoflavones originating from soy or red clover using gas chromatography-mass spectrometry (GC-MS). To examine the metabolism, soy and red clover supplementation studies with human volunteers were carried out. In addition, the metabolism of isoflavones was investigated in vitro by identifying the metabolites formed during a 24-h fermentation of pure isoflavones with a human fecal inoculum. Qualitative GC-MS methods for the identification and analysis of isoflavone metabolites in urine and fecal fermentation samples were developed. Moreover, a detailed investigation of the fragmentation of isoflavonoids in electron ionization mass spectrometry (EIMS) was carried out by means of synthetic reference compounds and deuterated trimethylsilyl derivatives. After isoflavone supplementation, 18 new metabolites of isoflavones were identified in human urine samples. The most abundant urinary metabolites of the soy isoflavones daidzein, genistein, and glycitein were found to be the reduced metabolites, i.e. the analogous isoflavanones, α-methyldeoxybenzoins, and isoflavans. Metabolites having additional hydroxyl and/or methoxy substituents, or their reduced analogs, were also identified. The main metabolites of the red clover isoflavones formononetin and biochanin A were identified as daidzein and genistein. In addition, reduced and hydroxylated metabolites of formononetin and biochanin A were identified; however, they occurred at much lower levels in the urine samples than daidzein, genistein, or their reduced metabolites. The results of this study show that the metabolism of isoflavones is diverse. More studies are needed to determine whether the new isoflavonoid metabolites identified here have biological activities that contribute to the proposed beneficial effects of isoflavones on human health. Another task is to develop validated quantitative methods to determine the actual levels of isoflavones and their metabolites in biological matrices in order to assess the role of isoflavones in the prevention of chronic diseases.
  • Nurmi, Petteri (Helsingin yliopisto, 2009)
    Place identification refers to the process of analyzing sensor data in order to detect places, i.e., spatial areas that are linked with activities and associated with meanings. Place information can be used, e.g., to provide awareness cues in applications that support social interactions, to provide personalized and location-sensitive information to the user, and to support mobile user studies by providing cues about the situations the study participant has encountered. Regularities in human movement patterns make it possible to detect personally meaningful places by analyzing the location traces of a user. This thesis focuses on providing system-level support for place identification, as well as on algorithmic issues related to the place identification process. The move from location to place requires interactions between location sensing technologies (e.g., GPS or GSM positioning), algorithms that identify places from location data, and applications and services that utilize place information. These interactions can be facilitated using a mobile platform, i.e., an application or framework that runs on a mobile phone. For the purposes of this thesis, mobile platforms automate data capture and processing and provide means for disseminating data to applications and other system components. The first contribution of the thesis is BeTelGeuse, a freely available, open source mobile platform that supports multiple runtime environments. The actual place identification process can be understood as a data analysis task where the goal is to analyze (location) measurements and to identify areas that are meaningful to the user. The second contribution of the thesis is the Dirichlet Process Clustering (DPCluster) algorithm, a novel place identification algorithm. The performance of the DPCluster algorithm is evaluated using twelve different datasets that have been collected by different users, at different locations, and over different periods of time. As part of the evaluation we compare the DPCluster algorithm against other state-of-the-art place identification algorithms. The results indicate that the DPCluster algorithm generalizes better across spatial and temporal variations in location measurements.
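    The DPCluster algorithm itself is not spelled out in the abstract. As a hedged illustration of the general idea, clustering a location trace into candidate places, the sketch below uses a DP-means-style rule (a hard-assignment simplification of Dirichlet-process clustering); the distance threshold `lam` is an assumed parameter, not one from the thesis.

    ```python
    import math

    def cluster_places(points, lam):
        """DP-means-style clustering of 2-D location fixes into candidate places.

        A point joins the nearest existing cluster if it lies within `lam`;
        otherwise it seeds a new cluster. Returned centroids are the places.
        """
        centers, members = [], []
        for p in points:
            if centers:
                d, i = min((math.dist(p, c), i) for i, c in enumerate(centers))
            if not centers or d > lam:
                centers.append(p)
                members.append([p])
            else:
                members[i].append(p)
                xs, ys = zip(*members[i])  # update the centroid of cluster i
                centers[i] = (sum(xs) / len(xs), sum(ys) / len(ys))
        return centers

    # Two nearby fixes collapse into one place; the distant fix becomes another.
    trace = [(60.170, 24.940), (60.171, 24.941), (61.450, 23.850)]
    print(cluster_places(trace, lam=0.01))
    ```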
  • Kalke, Martti (Helsingin yliopisto, 2014)
    The purpose of this thesis was two-fold: firstly, to evaluate whether limited angle tomography is suitable for clinical implant planning; secondly, to improve the clinical image quality and workflow of limited angle tomography by developing new image processing algorithms. Conventional computed tomography (CT) design is not optimal in terms of cost, workflow, or dose, for two reasons. Firstly, CT devices are typically expensive and bulky because they require stable X-ray production, a rigid gantry with accurate and repeatable movements, a high scanning speed, a solid patient support, and a low-noise X-ray detector. Secondly, current non-regularized reconstruction techniques require a high dose per projection image as well as a huge number of projection images. This limits the use of CT imaging to serious trauma cases and other life-threatening conditions. To overcome these limitations, new approaches have been introduced to replace conventional CT imaging. For example, in 2007 the dental imaging technology company Palodex Group released an upgrade kit for a standard panoramic X-ray device, called Volumetric Tomography (VT), which is based on limited angle tomography. In the first article, we demonstrated that limited angle tomography is able to give clinical information similar to that of CT devices in dental implant planning. Therefore, implant planning can be executed more cost- and dose-effectively when suitable algorithms are applied throughout the reconstruction process. Since a limited angle tomography system uses a small number of X-ray images taken from a limited aperture, new image processing methods are required for clinically suitable image quality. For that reason, two novel image processing methods and one analysis method were created and documented in this work. In the second article, a new image processing method based on a modification of the constrained least-squares filter for extremely sparse situations was introduced. In this method, called the Wiener-filter based iterative reconstruction technique (WIRT), we treated the uncertainty of the interpolation as noise and applied regularization only in the regions where the uncertainty is the dominating factor. In the third article, a new sinogram estimation algorithm called the sinogram interpolation technique (SINT) was created, in which the missing sinogram columns are estimated from the known columns. In the fourth article, a method named mutual information based technology (MINT) was developed to estimate the imaging geometry directly from the projection data. With this method, the imaging angles can be estimated from the projection images without any external markers or additional constructions on the device. This simplifies the workflow and significantly improves the imaging angle accuracy.
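    SINT's actual estimation scheme is not given in the abstract; the sketch below merely illustrates the basic setting of filling missing sinogram columns from measured ones, here with plain row-wise linear interpolation (numpy assumed; the real algorithm is more elaborate).

    ```python
    import numpy as np

    def fill_missing_columns(sinogram, known_cols):
        """Estimate missing sinogram columns from the measured ones.

        sinogram:   2-D array of shape (n_detectors, n_angles); columns not in
                    known_cols may contain arbitrary placeholder values.
        known_cols: sorted indices of the columns that were actually measured.
        Each missing column is linearly interpolated, detector row by detector
        row, between its nearest measured neighbours.
        """
        filled = sinogram.astype(float).copy()
        missing = np.setdiff1d(np.arange(sinogram.shape[1]), known_cols)
        for r in range(sinogram.shape[0]):
            filled[r, missing] = np.interp(missing, known_cols, sinogram[r, known_cols])
        return filled

    # Example: 4 detector rows, 8 projection angles, only every other angle measured.
    truth = np.random.default_rng(0).random((4, 8))
    print(fill_missing_columns(truth, np.arange(0, 8, 2)).round(2))
    ```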
  • Leinonen, Lasse (Helsingin yliopisto, 2015)
    Supersymmetry is a proposed new symmetry that relates bosons and fermions. If supersymmetry is realized in nature, it could provide a solution to the hierarchy problem, and one of the new particles it predicts could explain dark matter. In this thesis, I study supersymmetric models in which the lightest supersymmetric particle can be responsible for dark matter. I discuss a scenario in which the supersymmetric partner of the top quark, called the stop, is the next-to-lightest supersymmetric particle in the constrained Minimal Supersymmetric Standard Model. Mass limits and various decay branching fractions are considered when the allowed parameter space for the scenario is determined. If the mass of the stop is close to the mass of the lightest supersymmetric particle, one can obtain the observed dark matter density. The scenario leads to a novel experimental signature consisting of high transverse momentum top jets and large missing energy, which can be used to probe the model at the LHC. I also discuss an extended supersymmetric model with spontaneous charge-parity (CP) violation and a right-handed neutrino. When CP is spontaneously violated, a light singlet scalar appears in the particle spectrum, providing new annihilation channels for the lightest supersymmetric particle. In this model, a neutralino or a right-handed sneutrino can produce the observed dark matter density. Dark matter direct detection limits are found to be especially constraining for right-handed sneutrinos.
  • Nordman, Maaria (Helsingin yliopisto, 2010)
    Accurate and stable time series of geodetic parameters can be used to help in understanding the dynamic Earth and its response to global change. The Global Positioning System, GPS, has proven to be invaluable in modern geodynamic studies. In Fennoscandia the first GPS networks were set up in 1993. These networks form the basis of the national reference frames in the area, but they also provide long and important time series for crustal deformation studies. These time series can be used, for example, to better constrain the ice history of the last ice age and the Earth's structure, via existing glacial isostatic adjustment models. To improve the accuracy and stability of the GPS time series, the possible nuisance parameters and error sources need to be minimized. We have analysed GPS time series to study two phenomena: first, the refraction of the GPS signal in the neutral atmosphere, and, second, the surface loading of the crust by environmental factors, namely the non-tidal Baltic Sea, the atmospheric load, and varying continental water reservoirs. We studied the atmospheric effects on the GPS time series by comparing the standard method to slant delays derived from a regional numerical weather model, and we present a method for correcting the atmospheric delays at the observational level. The results show that both standard atmosphere modelling and the atmospheric delays derived from a numerical weather model by ray-tracing provide a stable solution. The advantage of the latter is that the number of unknowns used in the computation decreases, and thus the computation may become faster and more robust. The computation can also be done with any processing software that allows the atmospheric correction to be turned off. The crustal deformation due to loading was computed by convolving Green's functions with surface load data, that is to say, global hydrology models, global numerical weather models, and a local model for the Baltic Sea. The result was that the loading factors can be seen in the GPS coordinate time series. Reducing the computed deformation from the vertical time series of GPS coordinates reduces the scatter of the time series; however, the long term trends are not influenced. We show that global hydrology models and the local sea surface can explain up to 30% of the variation in the GPS time series. On the other hand, the atmospheric loading admittance in the GPS time series is low, and different hydrological surface load models could not be validated in the present study. In order to be used for GPS corrections in the future, both the atmospheric loading and the hydrological models need further analysis and improvement.
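    The loading computation described above is, in discretised form, a sum u(x) = Σ_i G(|x - x_i|) m_i over load elements. The sketch below shows that sum in one dimension with a completely made-up decay kernel in place of a real elastic Green's function.

    ```python
    import numpy as np

    def loading_deformation(x_obs, x_load, load, green):
        """Deformation at each observation point: u = sum over loads of G(r) * m."""
        u = np.zeros_like(x_obs, dtype=float)
        for xl, m in zip(x_load, load):
            u += green(np.abs(x_obs - xl)) * m
        return u

    # Toy kernel decaying with distance; a real Green's function would come
    # from an Earth model, not from this formula.
    toy_green = lambda r: -1.0 / (1.0 + r**2)

    x_obs = np.linspace(0.0, 10.0, 5)   # GPS site positions (arbitrary units)
    x_load = np.array([2.0, 7.0])       # load cells, e.g. sea-level grid points
    load = np.array([1.0, 0.5])         # load magnitudes (arbitrary units)
    print(loading_deformation(x_obs, x_load, load, toy_green))
    ```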
  • Lehtonen, Miro (Helsingin yliopisto, 2006)
    XML documents are becoming more and more common in various environments. In particular, enterprise-scale document management is commonly centred around XML, and desktop applications as well as online document collections are soon to follow. The growing number of XML documents increases the importance of appropriate indexing methods and search tools in keeping the information accessible. Therefore, we focus on content that is stored in XML format as we develop such indexing methods. Because XML is used for different kinds of content, ranging all the way from records of data fields to narrative full texts, methods for Information Retrieval face a new challenge in identifying which content is subject to data queries and which should be indexed for full-text search. In response to this challenge, we analyse the relation between character content and XML tags in XML documents in order to separate the full text from the data. As a result, we are able both to reduce the size of the index by 5-6% and to improve the retrieval precision as we select the XML fragments to be indexed. Besides being challenging, XML comes with many unexplored opportunities that have received little attention in the literature. For example, authors often tag the content they want to emphasise by using a typeface that stands out. The tagged content constitutes phrases that are descriptive of the content and useful for full-text search. They are simple to detect in XML documents, but also easy to confuse with other inline-level text. Nonetheless, the search results improve when the detected phrases are given additional weight in the index. Similar improvements are reported when related content, including titles, captions, and references, is associated with the indexed full text. Experimental results show that, at least for certain types of document collections, the proposed methods help us find the relevant answers. Even when we know nothing about the document structure but the XML syntax, we are able to take advantage of the XML structure when the content is indexed for full-text search.
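    As a hedged sketch of the phrase-weighting idea (the tag set, the boost factor, and the whitespace tokenisation are all assumptions here, not the thesis's indexing model), emphasised inline content can simply be counted with extra weight when term statistics are collected:

    ```python
    import xml.etree.ElementTree as ET
    from collections import Counter

    EMPHASIS_TAGS = {"b", "i", "em", "strong"}  # assumed emphasis-style tags
    BOOST = 2.0                                 # assumed extra weight

    def weighted_terms(xml_text):
        """Count terms in an XML fragment, boosting emphasised phrases."""
        weights = Counter()
        def walk(elem, inherited):
            b = BOOST if elem.tag in EMPHASIS_TAGS else inherited
            for word in (elem.text or "").lower().split():
                weights[word] += b
            for child in elem:
                walk(child, b)
                for word in (child.tail or "").lower().split():
                    weights[word] += b  # tail text still belongs to this element
        walk(ET.fromstring(xml_text), 1.0)
        return weights

    print(weighted_terms("<p>plain text with an <b>important phrase</b> inside</p>"))
    ```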
  • Utz, Margarete (Helsingin yliopisto, 2010)
    Individual movement is highly versatile and ubiquitous in ecology. In this thesis, I investigate two kinds of movement, body-condition-dependent dispersal and small-range foraging movements resulting in quasi-local competition, and their causes and consequences on the individual, population, and metapopulation level. Body-condition-dependent dispersal is a widely evident but barely understood phenomenon. In nature, diverse relationships between body condition and dispersal are observed. I develop the first models that study the evolution of dispersal strategies that depend on individual body condition. In a patchy environment where patches differ in environmental conditions, individuals born in rich (e.g. nutritious) patches are on average stronger than their conspecifics born in poorer patches. Body condition (strength) determines competitive ability such that stronger individuals win competition with higher probability than weak individuals. Individuals compete for patches, so that kin competition selects for dispersal. I determine the evolutionarily stable strategy (ESS) for different ecological scenarios. My models offer explanations for both the dispersal of strong individuals and the dispersal of weak individuals. Moreover, I find that within-family dispersal behaviour is not always reflected at the population level. This agrees with the fact that no consistent pattern has been detected in data on body-condition-dependent dispersal, and it encourages the refining of empirical investigations. Quasi-local competition denotes interactions between adjacent populations in which one population negatively affects the growth of the other. I model a metapopulation in a homogeneous environment where adults of different subpopulations compete for resources by spending part of their foraging time in the neighbouring patches, while their juveniles feed only on the resource in their natal patch. I show that spatial patterns (different population densities in the patches) are stable only if one age class strongly depletes the resource while mainly the other age class depends on it.
  • Hussein, Tareq (Helsingin yliopisto, 2005)
  • Juurinen, Iina (Helsingin yliopisto, 2014)
    Besides being essential to life as we know it, water also behaves peculiarly in comparison to other liquids. The macroscopic anomalies of water are driven by its microscopic structure. It is generally agreed that the molecular-level structure of liquid water is highly structured due to its extensive hydrogen-bond network. However, the details of the structure remain debated. Moreover, it is surprising how little is known about the behaviour of water when it interacts with other substances, given how important water is as a solvent. Studying the structure of liquids is not straightforward, and each method has its particular sensitivity. Inelastic x-ray scattering is a versatile method for structural analyses, and thus can provide information from a new perspective. Compared to many other techniques it has several advantages when investigating liquids: for example, bulk structures can be studied and no vacuum environment is needed. In element-sensitive x-ray Raman scattering, the electronic excitations from core levels to unoccupied molecular orbitals reveal the local environment. On the other hand, information on the electronic structure is also obtained with x-ray Compton scattering, in which the ground-state electron momentum distribution is probed. In this thesis, the hydrogen-bond network of water is probed by observing the effects other components have on it with x-ray Raman and Compton scattering. The thesis includes studies on ionic, hydrophobic, and hydrophilic interactions, particularly in aqueous LiCl, water-alcohol mixtures, and the aqueous polymer poly(N-isopropylacrylamide). The information obtained from these systems both elucidates the structure of the hydrogen-bond network of water and further affirms the benefit of inelastic x-ray scattering methods in studying a wide range of disordered materials.
  • Heikkala, Ville (Helsingin yliopisto, 2002)
  • Wasenius, Niko (Helsingin yliopisto, 2014)
    The prevalence and incidence of non-communicable diseases, which have been associated with physical inactivity, are increasing worldwide. Thus, there is a great need to understand how health-enhancing physical activity can be increased. The main aims of this study were to investigate 1) the effects of a 13-day in-patient rehabilitation intervention and a 12-week exercise intervention on the intensity and volume of daily total physical activity and its subcategories, 2) the effect of the exercise intervention on risk factors for type 2 diabetes, and 3) the effect of non-structured leisure-time physical activity (LTPA) on the response to exercise training. The study consists of two separate cohorts. The first data set included subjects (n = 19, 16 women and 3 men) with chronic neck or shoulder pain who participated in active rehabilitation interventions. The second data set included 144 overweight or obese middle-aged men with impaired glucose regulation who were randomly allocated into a non-exercise control (C) group, a Nordic walking (NW) group, and a power-type resistance training (RT) group. During the 12-week intervention, the exercise groups performed structured supervised exercises three times a week for 60 minutes. In both datasets the intensity and volume of physical activity were measured in metabolic equivalents of task (MET) and MET-hours before and during the interventions with combinations of objective measurement, diaries, and questionnaires. In the second dataset, changes in glucose, lipid, and liver enzyme metabolism, adipocytokines, body composition, blood pressure, physical capacity, and dietary intake were measured with standard methods before and after the intervention. No increase in the volume of total physical activity was observed with either intervention. Both the rehabilitation and the NW intervention increased the volume of leisure-time physical activity (LTPA). The weekly increase in the volume of total LTPA (structured exercises + non-structured LTPA) was associated with a decrease in the volume of non-LTPA (activity other than structured exercise or non-structured LTPA). Compared to the control group, NW in particular had beneficial effects on body adipose tissue and on the adipocytokines (leptin and chemerin) associated with the regulation of lipid and glucose metabolism. The intensity of non-structured LTPA during the exercise intervention was found to independently explain 10%, 9%, and 7% of the variation in the change of walking speed, body weight, and BMI, respectively. This effect was observed especially above an intensity threshold of 6.3 MET (77% of maximal physical capacity). Thus, interventions aimed at increasing physical activity do not automatically increase the volume of total physical activity, because of compensation. They can, however, increase the volume of LTPA, which can subsequently have beneficial effects on risk factors for type 2 diabetes. A better understanding of how physical activity is regulated in response to training can also increase the specificity of physical activity dosage.
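    Since the volume measure used above is just intensity multiplied by duration, a minimal worked example (all activities and values invented) shows how weekly MET-hours add up:

    ```python
    # Volume of physical activity in MET-hours: intensity (MET) x duration (h).
    # The activities and numbers below are invented for illustration only.
    week = [
        ("nordic walking", 4.8, 1.0),  # (activity, MET, hours); 3 sessions/week
        ("nordic walking", 4.8, 1.0),
        ("nordic walking", 4.8, 1.0),
        ("housework",      3.0, 2.5),
    ]
    total = sum(met * hours for _, met, hours in week)
    print(f"weekly LTPA volume: {total:.1f} MET-hours")  # -> 21.9 MET-hours
    ```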
  • Väisänen, Petri (Helsingin yliopisto, 2001)
  • Sipilä, Mikko (Helsingin yliopisto, 2010)
    Atmospheric aerosol particles have a strong impact on the global climate. A deep understanding of the physical and chemical processes affecting the atmospheric aerosol-climate system is crucial in order to describe those processes properly in global climate models. Besides their climatic effects, aerosol particles can impair, for example, visibility and human health. Nucleation is a fundamental step in atmospheric new particle formation. However, the details of the atmospheric nucleation mechanisms have remained unresolved, mainly because of the lack of instruments capable of measuring neutral newly formed particles in the size range below 3 nm in diameter. This thesis aims to extend the detectable particle size range towards the close-to-molecular sizes (~1 nm) of freshly nucleated clusters, and to obtain by direct measurement the concentrations of sub-3 nm particles in the atmospheric environment and under well-defined laboratory conditions. In the work presented in this thesis, new methods and instruments for sub-3 nm particle detection were developed and tested. The selected approach comprises four different condensation-based techniques and one electrical detection scheme. All of them are capable of detecting particles with diameters well below 3 nm, some even down to ~1 nm. The developed techniques and instruments were deployed in field measurements as well as in laboratory nucleation experiments. Ambient air studies showed that in a boreal forest environment a persistent population of 1-2 nm particles or clusters exists. The observation was made using four different instruments, demonstrating a consistent capability for the direct measurement of atmospheric nucleation. The results from the laboratory experiments showed that sulphuric acid is a key species in atmospheric nucleation. The mismatch between the earlier laboratory data and ambient observations regarding the dependence of the nucleation rate on the sulphuric acid concentration was explained: it was shown to arise from the inefficient growth of the nucleated clusters and from the insufficient detection efficiency of the particle counters used in the previous experiments. Even though the exact molecular steps of nucleation remain an open question, the instrumental techniques developed in this work, as well as their application in laboratory and ambient studies, opened a new view into atmospheric nucleation and prepared the way for investigating nucleation processes with more suitable tools.
  • Mogensen, Ditte (Helsingin yliopisto, 2015)
    Forests emit biogenic volatile organic compounds (BVOCs) that, together with e.g. sulfuric acid, can act as aerosol precursor compounds when oxidised. Aerosol particles affect visibility, human health, and the Earth's radiative budget, which makes the emission inputs and oxidation mechanisms of VOCs crucial to understand. This thesis discusses the life cycle of such compounds in the atmosphere. Specifically, we studied the representation of BVOC emissions, the atmosphere's oxidation ability, and the sources and sinks of sulfuric acid. The main tool for this was numerical modelling, often compared to field observations. Additionally, we performed computational chemistry simulations in order to calculate transitions in sulfuric acid. The main findings of this thesis can be summarised as follows: (1) Biological understanding of VOC emission processes needs to be enhanced in order to predict VOC concentrations with high precision. (2) The unexplained fraction of the total OH reactivity in the boreal forest is larger than the known fraction, and the known secondary organic oxidation products of primary emitted terpenes cannot explain the missing reactivity. (3) OH is the main oxidation agent of organic compounds in the boreal atmosphere. (4) Criegee intermediates, produced from unsaturated hydrocarbons, can oxidise SO2 effectively, providing an essential source of sulfuric acid in areas with high VOC concentrations. (5) Two-photon electronic excitation did not turn out to be a significant sink of gaseous sulfuric acid in the stratosphere. This thesis closes a large part of the sulfuric acid concentration gap in VOC-rich environments. Further, it raises awareness of the fact that we still do not fully comprehend the mechanisms leading to BVOC emissions, nor the organic atmospheric chemistry of the boreal forest. Finally, this work encourages the study of alternative BVOC emission sources as well as alternative atmospheric oxidants.
  • Tomczak, Yoann (Helsingin yliopisto, 2014)
    Atomic Layer Deposition (ALD) is a thin film deposition method that allows the growth of highly conformal films with atomic-level precision in thickness and composition. For most of the ALD processes developed, the reaction mechanisms occurring at each step of the deposition remain unclear. Learning more about these reactions would help to control and optimize existing growth processes and to develop new ones more quickly. For that purpose, in situ methods such as the quartz crystal microbalance (QCM) and the quadrupole mass spectrometer (QMS) are used. These techniques present numerous advantages because they allow the thin film growth mechanisms to be monitored directly during the process. Additionally, they do not require separate experiments or large amounts of precursors to test the efficiency of new processes, and they could be very effective means of monitoring industrial processes in real time. This thesis explores the most common in situ analytical methods used to study ALD processes. A review of the ALD metal precursors possessing ligands with nitrogen bonded to the metal center, and of their reactivity, is provided. The results section reports the reaction mechanisms of ALD processes for the deposition of Nb2O5, Ta2O5, Li2SiO3, TiO2 and ZrO2. All the processes studied use metal precursors with nitrogen-bonded ligands and ozone or water for the deposition of high-k and other oxide films.
  • Knapas, Kjell (Helsingin yliopisto, 2012)