Faculty of Science


Recent Submissions

  • Marnela, Marika (Helsingin yliopisto, 2016)
    The Arctic Ocean and its exchanges with the Nordic Seas influence the northern European climate. The Fram Strait, with its 2600 m sill depth, is the only deep passage between the Arctic Ocean and the other oceans. Not only do all the deep-water exchanges between the Arctic Ocean and the rest of the world's oceans take place through the Fram Strait, but a significant amount of cold, low-salinity surface water and sea ice also exits the Arctic Ocean through the strait. Correspondingly, part of the warm and saline Atlantic water flowing northward enters the Arctic Ocean through the Fram Strait, bringing heat into the Arctic Ocean. The oceanic exchanges through the Fram Strait, as well as the water-mass properties and the changes they undergo in the Fram Strait and its vicinity, are studied from three decades of ship-based hydrographic observations collected between 1980 and 2010. The transports are estimated from geostrophic velocities. The main section, comprising hydrographic stations, lies zonally at about 79 °N. For a few years of the observed period it is possible to combine the 79 °N section with a more northern section, or with a meridional section at the Greenwich meridian, to form quasi-closed boxes and to apply conservation constraints on them in order to estimate the transports through the Fram Strait as well as the recirculation in the strait. In a similar way, zonal hydrographic sections in the Fram Strait and along 75 °N crossing the Greenland Sea are combined to study the exchanges between the Nordic Seas and the Fram Strait. The transport estimates are adjusted with drift estimates based on Argo floats in the Greenland Sea. The mean net volume transports through the Fram Strait, averaged over the various approaches, range from less than 1 Sv to about 3 Sv. The heat loss to the atmosphere from the quasi-closed boxes both north and south of the Fram Strait section is estimated at about 10 TW. 
The net freshwater transport through the Fram Strait is estimated at 60-70 mSv southward. The insufficiently known northward transport of Arctic Intermediate Water (AIW), originating in the Nordic Seas, is estimated using data from the 2002 Oden expedition. At the time of data collection, excess sulphur hexafluoride (SF6) was available as a tracer; besides an anthropogenic background, it derives from a mixing experiment carried out in the Greenland Sea in 1996. The excess SF6 can be used to distinguish AIW from the upper Polar Deep Water originating in the Arctic Ocean. It is estimated that 0.5 Sv of AIW enters the Arctic Ocean. The deep waters in the Nordic Seas and in the Arctic Ocean have become warmer, and in the Greenland Sea also more saline, during the three decades studied in this work. The temperature and salinity properties of the deep waters found in the Fram Strait, whether of Arctic Ocean or Greenland Sea origin, have become similar and continue to do so. How these changes will affect the circulation patterns remains to be seen.
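The geostrophic velocities mentioned above are derived from the density field via the thermal-wind balance; the following is a minimal illustrative sketch with invented two-station profiles (not thesis data), assuming a level of no motion at the deepest common level:

```python
import numpy as np

# Thermal-wind sketch: geostrophic velocity shear between two
# hydrographic stations from the horizontal density gradient.
# All profiles and numbers below are illustrative, not thesis data.
g = 9.81        # gravitational acceleration (m/s^2)
f = 1.43e-4     # Coriolis parameter near 79 N (1/s)
rho0 = 1027.0   # reference density (kg/m^3)
dx = 50e3       # station separation (m)

depth = np.array([0.0, 500.0, 1000.0, 2000.0])         # m, positive down
rho_west = np.array([1026.0, 1027.2, 1027.8, 1028.0])  # kg/m^3
rho_east = np.array([1026.4, 1027.4, 1027.9, 1028.0])

# Thermal wind balance: dv/dz = -(g / (f * rho0)) * drho/dx
shear = -(g / (f * rho0)) * (rho_east - rho_west) / dx  # 1/s

# Integrate the shear upward from an assumed level of no motion
# at the deepest common level (trapezoidal rule).
v = np.zeros_like(depth)
for i in range(len(depth) - 2, -1, -1):
    dz = depth[i + 1] - depth[i]
    v[i] = v[i + 1] + 0.5 * (shear[i] + shear[i + 1]) * dz
```

A transport estimate then follows by integrating the velocity over the section area; in the thesis the reference velocities are further constrained by conservation arguments over the quasi-closed boxes and by Argo-derived drift estimates.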
  • Tuomikoski, Laura (Helsingin yliopisto, 2016)
    The rapid development of different imaging modalities related to radiation therapy (RT) has strongly affected the entire RT process, from the planning phase to the final treatment delivery. Treatment planning requires accurate anatomical information, which can be provided by computed tomography (CT) and magnetic resonance imaging (MRI). Additional functional information about tissues and organs can be obtained by functional MRI or by nuclear medicine imaging techniques such as single-photon emission tomography or positron emission tomography. The introduction of cone-beam computed tomography (CBCT) imaging into the RT delivery process has also opened new possibilities for RT treatment. In the past, mainly the bony anatomy was visualized with planar imaging, which was used for localizing the treatment. With CBCT, the prevailing soft-tissue anatomy, in addition to the bones, can be verified on a daily basis. By taking advantage of the growing amount of information obtainable by imaging, RT treatment plans can be customized further to suit the individual anatomical and physiological properties of patients. The focus of this thesis is on advanced methods for taking individual variation in patients' physiology into account during RT treatment planning. Two particular cases of variation are investigated: bladder filling and deformation during the RT of urinary bladder cancer, and radiation-induced changes of salivary gland function related to the RT of head and neck cancer. In both cases, pre-treatment imaging is used to create a patient-specific model to estimate the changes that would take place during the RT. The aim is to take these predicted changes into account in the treatment planning process, with the goal of protecting normal tissues. At Helsinki University Central Hospital (HUCH), a method of adaptive radiation therapy (ART) was designed and clinically implemented for the treatment of urinary bladder cancer. 
In the applied workflow, four consecutive CT scans for RT treatment planning were used to capture the changes in bladder shape and size while the bladder was filling. Assuming that a similar bladder filling pattern applies during the course of RT, four treatment plans corresponding to the different bladder volumes were prepared and stored in a plan library. Before each treatment fraction, a CBCT scan was performed. The treatment plan that most closely matched the bladder shape and size of the day was selected from the library and delivered accordingly. The use of ART enabled better conformity of the treatment. It considerably reduced the absorbed dose to the intestinal cavity compared with the non-adaptive approach. Furthermore, the dose coverage in the urinary bladder was not compromised, while the treatment margins were substantially reduced. Overall, the method was found to be feasible, and it was rapidly taken into clinical practice. A model for predicting post-RT salivary flow was constructed and evaluated for the treatment of head and neck cancer. The model was based on pre-RT quantitative 99mTc-pertechnetate scintigraphy, direct measurement of total salivary flow, and population-based dose-response behaviour. A good correlation was found between the modelled and measured values of saliva flow rate. Hence, the model can be used as a predictive tool for risk-adapted treatment planning. One possible explanation for the remaining discrepancies between the predicted and measured saliva flow rate values may be patients' individual responses to radiation.
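The plan-of-the-day selection described above can be sketched as a nearest-match lookup against the plan library; a hypothetical illustration (the clinical selection considered bladder shape as well as size, and all names and volumes here are invented):

```python
# Hypothetical plan library: each plan corresponds to a bladder
# volume captured on one of the four consecutive planning CTs.
plan_library = {
    "plan_1_empty":  150.0,  # bladder volume at planning (cm^3)
    "plan_2_small":  250.0,
    "plan_3_medium": 350.0,
    "plan_4_full":   450.0,
}

def select_plan_of_the_day(cbct_volume_cm3):
    """Pick the library plan whose planning volume is closest to
    the bladder volume segmented on the daily CBCT."""
    return min(plan_library,
               key=lambda name: abs(plan_library[name] - cbct_volume_cm3))

print(select_plan_of_the_day(280.0))  # plan_2_small
```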
  • Kekkonen, Hanne (Helsingin yliopisto, 2016)
    My dissertation focuses on convergence rates and uncertainty quantification for continuous linear inverse problems. The problem is studied from both deterministic and stochastic points of view. In particular, I consider regularisation and Bayesian inversion with large noise in infinite-dimensional settings. The first paper in my thesis investigates convergence results for continuous Tikhonov regularisation in appropriate Sobolev spaces. The convergence rates are achieved by using microlocal analysis for pseudodifferential operators. In the second paper, variational regularisation is studied using convex analysis. In this paper we define a new kind of approximated source condition, for large noise and for the unknown solution, to guarantee the convergence of the approximated solution in Bregman distance. The third paper approaches Gaussian inverse problems from the statistical perspective. In this article we study posterior contraction rates and credible sets for Bayesian inverse problems. The frequentist confidence regions are also examined. The analysis of the small-noise limit in statistical inverse problems, also known as the theory of posterior consistency, has attracted a lot of interest in the last decade. Developing a comprehensive theory is important, since posterior consistency justifies the use of the Bayesian approach in the same way that convergence results justify the use of regularisation techniques.
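The Tikhonov regularisation studied in the first paper has a familiar finite-dimensional counterpart; a toy sketch (the operator, data and regularisation parameter are invented for illustration, not taken from the thesis):

```python
import numpy as np

# Toy discretised linear inverse problem y = A x + noise, solved by
# Tikhonov regularisation. All matrices and parameters are illustrative.
rng = np.random.default_rng(0)
n = 50
t = np.linspace(0.0, 1.0, n)
A = np.exp(-80.0 * (t[:, None] - t[None, :]) ** 2)  # smoothing, ill-conditioned operator
x_true = np.sin(2.0 * np.pi * t)
y = A @ x_true + 0.01 * rng.standard_normal(n)

# x_alpha = argmin ||A x - y||^2 + alpha * ||x||^2
#         = (A^T A + alpha I)^{-1} A^T y
alpha = 1e-2
x_alpha = np.linalg.solve(A.T @ A + alpha * np.eye(n), A.T @ y)
```

The convergence results in the thesis concern the infinite-dimensional analogue of this construction, with rates measured in Sobolev norms as the noise level and the regularisation parameter tend to zero together.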
  • Kontro, Inkeri (Helsingin yliopisto, 2016)
    Elastic X-ray scattering is a probe that provides information on the structure of matter on nanometre length scales. Structure on this size scale determines the mechanical and functional properties of materials, and in this thesis small- and wide-angle X-ray scattering (SAXS and WAXS) have been used to study the structure of biological and biomimetic materials. WAXS gives information on structures at the atomistic scale, while SAXS provides information in the important range above the atomistic but below the microscale. SAXS was used together with dynamic light scattering and zeta potential measurements to study protein and liposome structures. The S-layer protein of Lactobacillus brevis ATCC 8287 was shown to reassemble on cationic liposomes. The structure of the reassembled crystallite differed from that of the S-layer on native bacterial cell wall fragments, and the crystallite was more stable in the direction of the larger lattice constant than in the direction of the shorter one. Liposomes were also used as a biomembrane model to study the interactions of phosphonium-based ionic liquids with cell membrane mimics. All the studied ionic liquids penetrated multilamellar vesicles and caused a thinning of the lamellar distance that depended on the ionic liquid concentration. The ability of the ionic liquids to disrupt membranes was, however, dependent on the length of the hydrocarbon chains in the cation. In most cases, ionic liquids with long hydrocarbon chains in the cation induced disorder in the system, but in one case selective extraction of lipids and reassembly into lamellae was also observed. The effects depended on the ionic liquid type, its concentration, and the lipid composition of the vesicle. WAXS was used as a complementary technique to provide information on the structure-function relationship of a novel biomimicking material composed of a genetically engineered protein, chitin and calcium carbonate, and of films composed of hydroxypropylated xylan. 
The presence of calcium carbonate and its polymorph (calcite) was determined for the biomimetic material. For the xylan films, the crystallinity was assessed. In both cases, the crystallite size was also determined. These parameters influence the mechanical properties of the developed materials. In all cases, X-ray scattering provided information on the nanostructure of biological or biomimetic materials. Over a hundred years after the principle behind X-ray scattering was first explained, it still provides information about the properties of matter that is not available by other means.
  • Paolini, Gianluca (Helsingin yliopisto, 2016)
    The subject of this doctoral thesis is the mathematical theory of independence and its various manifestations in logic and mathematics. The topics covered range from model theory and combinatorial geometry to database theory, quantum logic and probability logic. This study has two intertwined centres: classification theory, independence calculi and combinatorial geometry (papers I-IV); and new perspectives in team semantics (papers V-VII). The first topic is a classical one in model theory, which we approach from different directions (implication problems, abstract elementary classes, unstable first-order theories). The second is a relatively new logical framework in which to study non-classical logical phenomena (dependence and independence, uncertainty, probabilistic reasoning, quantum foundations). Although these two centres seem to be far apart, we will see that they are linked to each other in various ways, under the guiding thread of independence.
  • Kallonen, Aki (Helsingin yliopisto, 2016)
    X-ray tomography is a widely used and powerful tool; its significance to diagnostics was recognized with a Nobel Prize, and tomographic imaging has also become a major contributor to several fields of science, from materials physics to the biological and palaeontological sciences. Current technology enables tomography on the micrometre scale, microtomography, in the laboratory. This provides a non-destructive three-dimensional microscope for probing the internal structure of radiotranslucent objects, with obvious implications for its applicability. Further, X-rays may be utilized in X-ray scattering experiments, which probe material properties on the ångström scale. Crystallographic studies on various forms of matter, not least, famously, the DNA molecule, have also been awarded the Nobel Prize. In this thesis, the construction of a combined experimental set-up for both X-ray microtomography and X-ray scattering is documented. The device may be used to characterize materials on several levels of their hierarchical structure, and the microtomography data may be used as a reference for targeting the crystallographic experiment. X-ray diffraction tomography is demonstrated. An algorithm for X-ray tomography from sparse data is presented. In many scenarios, the amount of data collected for a tomogram is not sufficient for traditional algorithms and would benefit from more robust computational schemes. Real X-ray data were used to compute a tomographic reconstruction from a data set two orders of magnitude smaller than what is conventionally used with set-ups such as the one presented in the thesis. Additionally, X-ray microtomography was utilized for morphological studies in developmental and evolutionary biology, evo-devo for short. The fossil record shows vast changes in morphology as more complex forms of life evolved, while the morphology of any given individual organism is the product of its developmental process. 
Understanding both evolution and development is essential for a comprehensive view of the history of life. In this thesis, two studies on teeth and their development are discussed. In both, dental morphology was investigated with high-resolution X-ray tomography.
  • Shubin, Mikhail (Helsingin yliopisto, 2016)
    The dissertation presents five problem-driven research articles representing three research domains related to micro-organisms causing infectious disease. Articles I and II are devoted to the A(H1N1)pdm09 influenza ('swine flu') epidemic in Finland in 2009-2011. Articles III and IV present software tools for analysing experimental data produced by Biolog phenotype microarrays. Article V studies the mismatch distribution as a summary statistic for inference about evolutionary dynamics and demographic processes in bacterial populations. All the addressed problems share two features: (1) they concern a dynamical process developing in time and space; (2) the observations of the process are partial and imprecise. The problems are generally approached using Bayesian statistics as a formal methodology for learning by confronting hypotheses with evidence. Bayesian statistics relies on modelling: constructing a generative algorithm that mimics the object, process or phenomenon of interest.
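The Bayesian workflow described above (a generative model confronted with partial, imprecise observations) can be sketched with a toy discrete example; all hypotheses and counts are invented:

```python
from math import comb

# Candidate values for an unknown infection probability, with a
# uniform prior over them (purely illustrative numbers).
hypotheses = [0.1, 0.3, 0.5, 0.7, 0.9]
prior = [0.2] * len(hypotheses)

def likelihood(p, k, n):
    """Generative model: probability of observing k cases among n exposed."""
    return comb(n, k) * p ** k * (1.0 - p) ** (n - k)

# Evidence: 7 cases observed among 20 exposed individuals.
unnorm = [pr * likelihood(p, 7, 20) for p, pr in zip(hypotheses, prior)]
posterior = [u / sum(unnorm) for u in unnorm]
```

The posterior concentrates on the hypothesis closest to the observed fraction 7/20; the articles apply the same logic with far richer generative models of epidemic and evolutionary dynamics.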
  • Lehtonen, Elina (Helsingin yliopisto, 2016)
    Archaean cratons contain the oldest parts of the Earth's crust that have survived crustal recycling processes. Archaean greenstone belts are vital parts of these cratons and preserve the oldest volcaniclastic and sedimentary rocks on Earth. They play an important part in the study of the evolution of the early Earth and the formation of stable continental crust. One of the most important questions related to Archaean greenstone belts is how they were initially formed. Various tectonic processes have been suggested, and the pertinent geological environments remain open for scientific debate. Absolute age determinations of plutonic and supracrustal rocks provide an essential tool for understanding the crustal evolution of the early Earth. This thesis focuses on the largest Archaean greenstone association in eastern Finland: the Suomussalmi-Kuhmo-Tipasjärvi complex, which belongs to the Karelia Province of the Fennoscandian shield. The main objective of the thesis is to constrain a detailed geochronology for the felsic and intermediate volcanic rocks, as well as the sedimentary rocks of the greenstone complex and the associated plutonic rocks. The ages of the selected samples were determined with secondary ion mass spectrometry (SIMS) and laser-ablation multi-collector inductively coupled plasma mass spectrometry (LA-MC-ICPMS), and whole-rock samples were analyzed for their major and trace element compositions. The felsic and intermediate volcanic rocks of the Suomussalmi-Kuhmo-Tipasjärvi greenstone complex can be divided into four distinct age groups based on pre-existing and new geochronological data: 2.94 Ga, 2.84 Ga, 2.82 Ga, and 2.80-2.79 Ga. The Suomussalmi greenstone belt contains the oldest volcanic phase (2.94 Ga), and the Tipasjärvi and Kuhmo greenstone belts the youngest volcanic phase (2.80-2.79 Ga). The new age determinations validate a chronostratigraphic interpretation for each belt. 
The updated chronostratigraphic model of the Suomussalmi greenstone belt comprises four volcanic units: ca. 2.94 Ga (Luoma unit), 2.87 Ga (age from a mafic rock; Tormua unit), 2.84 Ga (Ahvenlahti unit) and 2.82 Ga (Mesa-aho unit). The Kuhmo greenstone belt is interpreted to contain two units comprised mainly of volcanic rocks (Siivikkovaara and Nuolikangas) and a unit containing sedimentary material deposited after the end of the volcanism (Ronkaperä). Both the Nuolikangas unit (ca. 2.84 Ga) and the Siivikkovaara unit (ca. 2.80-2.79 Ga) also contain mafic and ultramafic rocks. The Tipasjärvi greenstone belt comprises three volcanic units with ages of ca. 2.84 Ga (Talassuo unit), 2.82 Ga (Hietaperä unit) and 2.80 Ga (Koivumäki unit), as well as the Kokkoniemi unit, mainly composed of sedimentary rocks deposited after the end of the volcanism. Based on the available data, the ca. 2.94 Ga volcanic rocks of the complex register an older continent. The younger volcanic rocks, with ages of ca. 2.84-2.79 Ga, formed via interaction between the older continent and oceanic lithosphere, or during rifting of the older continent.
  • Rantala, Pekka (Helsingin yliopisto, 2016)
    Volatile organic compounds (VOCs) are emitted into the atmosphere from both biogenic and anthropogenic sources. Some VOCs act as aerosol precursor compounds in the atmosphere and thus affect the Earth's radiative budget and the global climate. In this thesis, VOC exchange between the surface and the atmosphere was studied at the ecosystem scale using micrometeorological flux measurement techniques combined with proton transfer reaction mass spectrometry for VOC concentration measurements. The measurements were obtained above three different environments: a Scots pine dominated boreal forest, a Mediterranean oak-hornbeam dominated forest, and an urban area. The main results were the following: i) The direct flux measurement technique, the disjunct eddy covariance method, was found to be problematic in low-flux conditions; thus, the indirect surface layer profile method can be recommended for flux measurements in these conditions. Conversely, the eddy covariance method with a time-of-flight mass analyser was found to be a powerful tool for VOC flux measurements. ii) The total VOC flux in the boreal ecosystem was dominated by monoterpenes throughout the whole year, and oxygenated VOCs (OVOCs) also made a rather large contribution. Monoterpene emissions depended on both temperature and light. On the other hand, isoprene was the dominant flux compound above the oak-hornbeam forest, while other compounds played only minor roles compared with the isoprene flux. The OVOC exchange was found to be bi-directional, i.e. many OVOCs, especially methanol, also had significant deposition to the surface. A semi-empirical algorithm was developed to determine the total exchange of methanol. iii) The VOC emissions from anthropogenic sources were significant, and they could be determined with the disjunct eddy covariance method. The emission dynamics of VOCs were more complicated than in the forest environments. 
In urban areas, traffic was the most important source of many VOCs, whereas isoprene originated mostly from urban vegetation. The methods applied in this study can be used in multiple ecosystems, and the thesis provides some recommendations for selecting the most feasible and reliable methods. The results of the thesis, such as the determined emission potentials, are also adaptable for modelling purposes. Keywords: VOC, flux, emission, deposition, PTR-MS, boreal forest, broadleaf forest, urban landscape
  • Wilkman, Olli (Helsingin yliopisto, 2016)
    Understanding the light-scattering properties of Solar System bodies is important, especially in the case of the small bodies. For these objects, most of our data are photometric, i.e. measurements of the brightness of light in broad spectral bands in the visible and near-infrared. Though limited in many ways, these data can be used to derive physical properties that provide constraints on the structure and material composition of the objects. These atmosphereless bodies are almost always covered with a blanket of loose material called the regolith. Planetary regoliths consist of a range of grain sizes from micrometres to tens of metres, and have a complex geological history and chemical composition. We study two models for the reflectance of planetary surfaces. One is the Lommel-Seeliger model, which is mathematically simple but not truly applicable to particulate media such as regoliths. However, an analytical form exists for the integrated brightness of an ellipsoid with the Lommel-Seeliger scattering model. Ellipsoids are useful as crude shape models for asteroids. Some applications of Lommel-Seeliger ellipsoids are studied in the development of faster software for the inversion of rotational state and rough shape from sparse asteroid lightcurves. The other scattering model is a semi-numerical one, developed to model the reflectance of dark particulate surfaces, such as the lunar regolith and the surfaces of many asteroids. The model term representing the shadowing effects in the medium is computed numerically and is computationally expensive to produce, but once computed it can be saved and reused. The model is applied to disk-resolved photometry of the lunar surface, as well as to laboratory measurements of a dark volcanic sand. The lunar surface is the best-known extraterrestrial material, while volcanic sands can be used as analogues for basaltic regoliths such as the lunar mare surfaces. 
These studies are still early steps in both of the model applications mentioned above. The results show promising avenues for further research. In the case of the Lommel-Seeliger ellipsoids, a statistical inversion scheme is used to gain information on the spin and shape of sparsely observed asteroids. In the studies with the PM scattering model, it was found to provide good fits to the data, and though the interpretation of the model parameters is not clear, they are qualitatively reasonable. Some limitations of the current implementation of the model were found, with clear lines of future improvement. On the whole, the model has potential for many applications in disk-resolved photometry of regolith surfaces.
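For reference, the Lommel-Seeliger model discussed above has a simple closed form; a sketch of its single-scattering reflectance follows (the phase function is omitted and normalisation conventions vary, so the mu0/(mu0 + mu) shape, not the constants, is the point):

```python
def lommel_seeliger(mu0, mu, w=0.1):
    """Lommel-Seeliger single-scattering reflectance, phase function
    omitted: proportional to w * mu0 / (mu0 + mu), where mu0 and mu
    are the cosines of the incidence and emergence angles and w is
    the single-scattering albedo."""
    return (w / 4.0) * mu0 / (mu0 + mu)
```

It is this simple dependence on the incidence and emergence cosines that makes the disk-integrated brightness of an ellipsoid analytically tractable, which underlies the fast lightcurve-inversion software mentioned above.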
  • Merisalo, Maria (Helsingin yliopisto, 2016)
    Digitalization, the social, economic and cultural process in which individuals, organizations and societies access, adopt, use and utilize digital technologies, is expected to produce comprehensive societal benefits. Here, the spillover effects of the utilization of digital technologies such as e-government, teleworking and social media are examined in order to explore the added value that can potentially be gained from digitalization. Moreover, the study advances the conceptual understanding of how, where and to whom digitalization produces added value. The research applies Bourdieusian neo-capital theory, which emphasizes the significance of tangible and intangible forms of capital in understanding the social world. This dissertation addresses digitalization questions through four papers. The first paper is conceptual in nature. It redefines and introduces the concept of e-capital as another form of intangible capital, which emerges from the possibilities, capabilities and willingness of individuals, organizations and societies to invest in, utilize and reap benefits from digitalization and thus create added value. All forms of capital (physical, economic, human, social and cultural) are both required and produced in this process. The second paper exposes spatial and social disparities in the use of social media in the Helsinki Metropolitan Area (HMA), and the third paper shows the connection between teleworking, the knowledge intensity and creativity of work, and e-capital. Both of these papers draw on a survey of 971 inhabitants of the HMA conducted in 2010. The fourth paper examines the national e-government programme E-services and e-democracy (SADe), drawing on 15 stakeholder interviews conducted in 2012. The paper indicates that the programme was mainly driven by a technological paradigm. 
The study demonstrates that the primary motivation for advancing digitalization in societies is the fact that it matters: digitalization can provide e-capital and produce added value that could not be gained, or would be significantly more difficult to gain, without digital technologies. The benefits do not materialize solely through the production of new innovative technological solutions; rather, they arise from comprehensive implementation by individuals, organizations and societies. These actors possess varying amounts of the different forms of capital and thus vary in their possibilities, capabilities and willingness to implement new digital tools. Since different forms of capital are needed in order to create e-capital from digitalization, e-capital is most likely to emerge in the same locations as the other forms of capital. However, the conceptualisation of e-capital demonstrates that entering the e-capital conversion process gives access to other forms of capital. This should motivate individuals, organizations and societies (including the public bodies supporting them) in their digitalization processes. Keywords: e-capital, social media, teleworking, e-government, digitalization, Pierre Bourdieu
  • Peltola, Eveliina (Helsingin yliopisto, 2016)
    This thesis concerns questions motivated by two-dimensional critical lattice models of statistical mechanics, and conformal field theory (CFT). The core idea is to apply algebraic techniques to solve questions in random geometry, and reveal the algebraic structures therein. We consider the interplay between braided Hopf algebras (quantum groups) and CFT, with applications to critical lattice models and conformally invariant random curves, Schramm-Loewner evolutions (SLE). In the first article, a quantum group method is developed to construct explicit expressions for CFT correlation functions using a hidden Uq(sl2) symmetry. The quantum group method provides tools to directly read off properties of the functions from representation theoretical data. The correlation functions are analytic functions of several complex variables satisfying linear homogeneous partial differential equations known as the Benoit & Saint-Aubin PDEs. Such PDEs emerge in CFT from singular vectors in representations of the Virasoro algebra. The correlation functions of conformal field theory are believed to describe scaling limits of correlations in critical lattice models, expected to exhibit conformal invariance in the scaling limit. The second article contains applications to questions in the theory of SLEs: the pure partition functions of multiple SLEs, and the chordal SLE boundary visit probability amplitudes, also known as Green’s functions. The relevant solutions to the PDEs are found by imposing certain natural boundary conditions given by specified asymptotic behavior. Loosely speaking, the appropriate boundary conditions can be deduced from the qualitative properties of the associated stochastic processes, or alternatively, by CFT fusion arguments. More general solutions to the PDEs are constructed in the fourth article, in the spirit of fusion of CFT. 
Solutions of the above type also emerge from critical lattice models, as (conjectured) scaling limits of renormalized probabilities of crossing and boundary-visit events of interfaces. In the third article, such questions are studied for the loop-erased random walk and the uniform spanning tree. Explicit formulas for the probabilities are found, and their convergence in the scaling limit to solutions of second- and third-order PDEs of Benoit & Saint-Aubin type is proved. Furthermore, these functions are related to the conformal blocks of CFT through certain combinatorial structures.
  • Hanhijärvi, Kalle (Helsingin yliopisto, 2016)
    Viruses are the most abundant form of life on Earth. They cause serious disease, significant suffering and economic losses, but viruses are also important to the general balance of the ecosystem. Understanding the details of the viral lifecycle is therefore essential from the point of view of basic research. This thesis work expands the basis formed by traditional microbiology. Single-molecule biophysics techniques open a unique perspective into the inner workings of viruses. The physics point of view provides a quantitative, predictive and descriptive mathematical basis to help one understand the basic processes of life. Furthermore, single-molecule methods reveal heterogeneity and process variability that are unresolvable in bulk studies. This thesis work employs single-molecule biophysical experiments to study two aspects of the viral lifecycle: genome packaging and ejection. DNA ejection is a method of infection employed by many double-stranded DNA (dsDNA) bacteriophages. Their viral genome is packaged under high pressure within a small volume comparable in linear dimension to the persistence length of dsDNA. Viruses infecting archaea are a new and emerging field of study, which benefits from the single-molecule perspective. This thesis presents the first single-molecule study of dsDNA ejection from an archaeal virus, His1, which has a dsDNA genome packaged in a lemon-shaped capsid. Osmotic suppression experiments were carried out, and the results are compared to those of established dsDNA phages. Results obtained with total internal reflection fluorescence microscopy indicate that DNA ejection from His1 is modulated by the external salt concentration and osmotic pressure, as is common to many bacteriophages. These findings agree qualitatively with the predictions of the continuum theory of dsDNA packaging. In contrast to DNA ejection, genome packaging is essential to the assembly of virus particles. 
Here the focus is on the Pseudomonas phage phi6, which has a three-part dsRNA genome, of which only the positive-sense ssRNA segments are packaged into the preformed procapsid. This thesis presents the first optical-tweezers experiment on single-stranded RNA (ssRNA) packaging by phi6. The results show that packaging alternates between fast and slow sections, suggesting that the secondary structure of the ssRNA segment is opened as the RNA is packaged. Single-molecule results obtained using the two model systems reveal previously unseen heterogeneity in the ejection and packaging processes. Such results cannot be obtained by bulk methods alone.
  • Kopperi, Matias (Helsingin yliopisto, 2016)
    The flux of emerging organic contaminants into the environment is a global threat that is widely studied and monitored. However, current regulation is not able to keep up with the increasing variety of new compounds released into the environment. More efficient and comprehensive analytical methodologies are required to enable sufficient monitoring of these compounds for legislative purposes. Non-targeted analytical approaches are able to identify previously unknown contaminants, which is not possible with conventional targeted methods. Therefore, the development of novel non-target methodologies is important. The goal of the thesis was to look for new ways to utilize non-targeted data for environmental applications, with a special emphasis on wastewater analysis. The developed methodologies focused on the chemometric quantification of non-target compounds, the identification of steroidal transformation products, statistical cross-sample analysis of wastewater and atmospheric particles, as well as non-targeted approaches to quantify the selectivity of adsorbents employed in sample preparation. The samples were analyzed by comprehensive two-dimensional gas chromatography with time-of-flight mass spectrometry, utilizing mass spectral libraries and retention indices for compound identification. Different solid-phase extraction procedures were applied to aqueous samples, and ultrasound-assisted extraction to solid samples. The study also included the synthesis of novel polymeric adsorbents with increased selectivity towards steroidal compounds. Modern statistical software was utilized for data handling and chemometrics. The multidimensional system enabled the analysis of complex wastewater samples, and several steroids and their transformation products were identified from the samples. It was concluded that hydrophobic steroids were efficiently removed from wastewater by adsorption to sewage sludge. 
However, elimination from the sludge was less efficient, and steroids were also found in the processed sludge destined for agricultural use. The chemometric model for predicting the concentrations of non-target compounds with steroidal structure demonstrated good accuracy. Non-targeted approaches allowed an arithmetic comparison of adsorbent selectivity, whereas previously only relative methods had been available. Fast comparison of different wastewater and aerosol samples was possible through cross-sample analysis of non-targeted data. The non-targeted approaches presented in this thesis can also be applied to other groups of contaminants and thus expand the available knowledge of environmental pollution. The new ways to utilize non-targeted methodologies and cross-sample analyses demonstrated their value in this thesis and will hopefully inspire future studies in the field.
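The compound identification above relies on retention indices alongside mass spectral libraries. As a minimal sketch, the linear retention index commonly used in temperature-programmed GC (the van den Dool and Kratz form) interpolates a compound's retention time between the bracketing n-alkanes; the alkane ladder and analyte retention times below are hypothetical illustration values, not data from the thesis.

```python
# Linear retention index for temperature-programmed GC (van den Dool & Kratz).
# All retention times below are made-up illustration values.

def linear_retention_index(t_compound, t_alkanes):
    """t_alkanes maps the carbon number of an n-alkane to its retention time."""
    carbons = sorted(t_alkanes)
    for n, n_next in zip(carbons, carbons[1:]):
        t_lo, t_hi = t_alkanes[n], t_alkanes[n_next]
        if t_lo <= t_compound <= t_hi:
            return 100 * (n + (n_next - n) * (t_compound - t_lo) / (t_hi - t_lo))
    raise ValueError("compound elutes outside the alkane ladder")

alkanes = {14: 10.2, 15: 12.0, 16: 13.9, 17: 15.8}  # retention times (min)
ri = linear_retention_index(13.0, alkanes)           # elutes between C15 and C16
print(round(ri, 1))
```

The computed index can then be compared against library values to support or reject a tentative mass-spectral identification.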
  • Suur-Uski, Anna-Stiina (Helsingin yliopisto, 2016)
The cosmic microwave background carries a wealth of cosmological information. It originates from the roughly 380,000-year-old Universe and has traversed the entire history of the Universe since. The observed radiation contains temperature anisotropies at the level of a few times 10^−5 across the sky. By mapping these temperature anisotropies, and their polarisation, over the full sky, we can unravel some fundamental mysteries of our Universe. The cosmic microwave background was discovered by chance in 1965. Ever since its discovery, cosmologists have strived to measure its properties with increasing precision. The latest satellite mission to map the cosmic microwave background has been the European Space Agency's Planck satellite. Planck measured the microwave sky for four years, from 2009 to 2013, with an unprecedented combination of sensitivity, angular resolution and frequency coverage. Planck's observations confirm the basic ΛCDM model, which is the most elementary model explaining the observed properties of the Universe. As the precision of measurements has improved, the data volumes have grown rapidly. Modern-sized datasets require sophisticated algorithms to tackle the challenges in the data analysis. In this thesis we discuss two data analysis themes, map-making and residual noise estimation. Specifically, we answer the following question: how to produce sky maps, and low-resolution maps with corresponding noise covariance matrices, from the Planck Low Frequency Instrument data. The low-resolution maps and the noise covariance matrices are required in the analysis of the largest structures of the microwave sky. The sky maps for the Stokes I, Q, and U components from the Planck Low Frequency Instrument data are built using the Madam map-maker. Madam is based on the generalised destriping principle. In destriping, the correlated part of the instrumental noise is modelled as a sequence of constant offsets, called baselines.
Further, a generalised destriper can employ prior information on the instrumental noise properties to enhance the accuracy of noise removal. We achieved nearly optimal noise removal by destriping the data at the HEALPix resolution Nside = 1024, using 0.25 s baselines for the 30 GHz channel and 1 s baselines for the 44 and 70 GHz channels, provided that a destriping mask, horn-uniform weighting and horn-uniform flagging were applied. For the low-l analysis we also provide maps at the resolution Nside = 16. These low-resolution maps were downgraded from the high-resolution maps using a noise-weighted downgrading scheme combined with Gaussian smoothing for the temperature component. The resulting maps were adequate for the Planck 2015 data release, but the downgrading scheme will need to be revised for the final round of Planck data analysis. The estimated maps will always contain some degree of residual noise. The analysis steps after map-making, namely component separation and power spectrum estimation, require a solid understanding of those residuals. We have three complementary methods at our disposal: half-ring noise maps, noise Monte Carlo simulations, and noise covariance matrices. The half-ring noise maps characterise the residual noise directly at the map level, while the other methods rely on noise estimates. The noise covariance matrices describe the pixel-to-pixel correlations of the residual noise, and they are needed especially in the low-l likelihood analysis. Hence it is sufficient to calculate them at the highest feasible resolution, Nside = 64, and subsequently downgrade them to the target resolution, Nside = 16, using the same downgrading scheme as for the maps. The different residual noise estimates show good agreement.
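The destriping principle described above can be illustrated with a toy model. This is not Madam itself, and all sizes are invented: the time-ordered data are modelled as sky signal plus constant baseline offsets plus white noise, and the map and baselines are estimated by alternating between binning a map from baseline-subtracted data and re-fitting the offsets from map-subtracted data.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 1-D setup; all sizes are invented, not Planck parameters.
npix, nsamp, blen = 50, 2000, 100
nbase = nsamp // blen

sky = rng.normal(size=npix)             # true sky signal
pix = rng.integers(0, npix, nsamp)      # pointing: pixel hit by each sample
offsets = rng.normal(size=nbase)        # correlated noise: one offset per baseline
tod = sky[pix] + np.repeat(offsets, blen) + 0.1 * rng.normal(size=nsamp)

base_idx = np.repeat(np.arange(nbase), blen)
hits = np.maximum(np.bincount(pix, minlength=npix), 1)

# Alternate between binning a map from baseline-subtracted data
# and re-estimating the baselines from map-subtracted data.
a = np.zeros(nbase)
for _ in range(50):
    m = np.bincount(pix, weights=tod - a[base_idx], minlength=npix) / hits
    a = np.bincount(base_idx, weights=tod - m[pix], minlength=nbase) / blen

# Destriping determines the map only up to a global offset,
# so compare mean-subtracted maps; the residual should be small.
err = (m - m.mean()) - (sky - sky.mean())
print(np.abs(err).max())
```

A generalised destriper additionally applies a noise prior to the baseline amplitudes and works on real pointing and HEALPix pixelisation; the alternating scheme above only conveys the basic idea.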