Browsing by Title

Now showing items 16638-16657 of 24771
  • Molari, Juha (Helsingin yliopisto, 2009)
    In the course of my research for my thesis The Q Gospel and Psychohistory, I moved from accounts of the Cynics' ideals to psychohistorical explanations. Studying the texts dealing with the Cynics and the Q Gospel, I was struck by the fact that these texts portray people living in greater poverty than they had to. I paid particular attention to the fact that the Q Gospel was born in traumatising, warlike circumstances. Psychiatric traumatology helped me understand the Q Gospel and other ancient documents using historical approaches in a way that complies with modern behavioural science. Although I found some answers to the questions posed in my research, the main result of my work is the justification of the question itself: is there a connection between the ethos expressed in the religious language of the Q Gospel and the predominantly war-related life experiences typical of Palestine at the time? As a number of studies have convincingly shown, traumatic events contribute to the development of psychotic experiences. I approached the problematic nature, significance and complexity of the ideal of poverty and this warlike environment by clarifying the history of psychohistorical literary research and the interpretative contexts associated with Sigmund Freud, Jacques Lacan and Melanie Klein. It is justifiable to posit an abnormal mentality, but there is no reliable path back from the abnormal mentality described in any particular text to a single causal factor. The popular research tendency based on the Oedipus complex is just as controversial as the Oedipus complex itself. The sociological frameworks concerning moral panics and the political paranoia of an outer and inner danger fit the construction of the Q Gospel quite well. Jerrold M. Post, M.D., Professor of Psychiatry, Political Psychology and International Affairs at George Washington University, and founder and director of the Center for the Analysis of Personality and Political Behavior for the Central Intelligence Agency, has focused on the role played by charisma in attracting followers and detailed the psychological styles of a "charismatic" leader. His books include Political Paranoia and Leaders and Their Followers in a Dangerous World: The Psychology of Political Behavior. His psychoanalytic vocabulary was useful for understanding the minds and motivations involved in the Q Gospel's formation. The Q sect began to live in a predestined future, the reality and safety of this world having collapsed in both their experience and their fantasies. The deep and clear-cut divisions into good and evil expressed in the Q Gospel reveal the powerful nature of destructive impulses, envy and overwhelming anxiety. The people responsible for the Q Gospel's origination tried to mount an ascetic defense against anxiety: denying their own needs, focusing their efforts on another objective (God's Kingdom) and regressing to a submissive earlier phase of development (a child's carelessness). This spiritual process was primarily an ecclesiastic or group-dynamic tactic to support the power of group leaders.
  • Kolkkinen, Juha (2008)
    The aim of this study is to evaluate how the key actors of the purchaser-provider model in the City of Helsinki group perceive a sample of statements formulated about the model. The evaluation covers service quality, economic savings, the benefits of competition, and competence. The study also examines the extent to which opinions about the statement sample differ within the participating groups. The theoretical starting points of the study are evaluation theory and the naturalistic evaluation method; as an evaluation approach, the study represents post-positivist multiple constituency evaluation. The study was carried out using Q methodology. According to the results, political affiliation and gender had no effect on how the statements were valued. The study finds that an undesirable side effect of the purchaser-provider model is a polarisation of opinions between the different actors, which creates potential conflicts. In the joint Q sort of all participant groups, three distinct statement families emerged: two perceived the purchaser-provider model positively, while one considered that the model achieves none of the goals set for it. Based on the results, each stakeholder group evaluates the model subjectively, solely from its own premises. Group members do not connect the statements about the model to the service strategy that the model is used to implement. This reveals that the general debate about the model is strongly politicised and reflects the stakeholders' own interests.
  • Wallenius, Tarja (2010)
    In this study, a quality assessment method based on sampling of primary laser inventory units (microsegments) was analysed. The accuracy of a laser inventory carried out in Kuhmo was analysed as a case study. Field sample plots were measured on the sampled microsegments in the Kuhmo inventory area. Two main questions were considered: did the ALS-based inventory meet the accuracy requirements set for the service provider, and how should a reliable, cost-efficient and independent quality assessment be undertaken? The agreement between the control measurements and the ALS-based inventory was analysed in four ways: 1) the root mean squared errors (RMSEs) and bias were calculated; 2) scatter plots with 95% confidence intervals were plotted and the placing of the identity lines was checked; 3) Bland-Altman plots were drawn, in which the mean difference of attributes between the control method and the ALS method was calculated and plotted against the average value of the attributes; 4) tolerance limits were defined and combined with the Bland-Altman plots. The RMSE values were compared to a reference study from which the accuracy requirements had been set for the service provider. The accuracy requirements in Kuhmo were achieved, although the comparison of RMSE values proved difficult. Field control measurements are costly and time-consuming, but they are considered robust. However, control measurements may themselves include errors, which are difficult to take into account. In a Bland-Altman plot neither of the compared methods is assumed to be completely exact, which offers a fair way to interpret the assessment results. It was suggested that tolerance limits, specified in the inventory order and combined with Bland-Altman plots, be adopted in practice. In addition, bias should be calculated for the total area. Some other approaches to quality control were briefly examined. No method was found to fulfil all the required demands of statistical reliability, cost-efficiency, time efficiency, simplicity and speed of implementation. Some benefits and shortcomings of the studied methods were discussed.
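    The three statistics named in the abstract (RMSE, bias, and the Bland-Altman mean difference with 95% limits of agreement) can be sketched in a few lines. This is a minimal illustration of the statistics only, not the code used in the study; the variable names and sample values are invented:

    ```python
    import statistics

    def rmse_and_bias(control, als):
        """RMSE and bias of ALS-predicted values against field control values."""
        diffs = [a - c for c, a in zip(control, als)]
        bias = statistics.mean(diffs)
        rmse = (sum(d * d for d in diffs) / len(diffs)) ** 0.5
        return rmse, bias

    def bland_altman_limits(control, als):
        """Mean difference and 95% limits of agreement (mean diff +/- 1.96 SD)."""
        diffs = [a - c for c, a in zip(control, als)]
        mean_diff = statistics.mean(diffs)
        sd = statistics.stdev(diffs)
        return mean_diff - 1.96 * sd, mean_diff + 1.96 * sd

    # Invented example: stand volumes from field control vs ALS inventory
    control = [210.0, 185.0, 240.0]
    als = [221.0, 174.0, 251.0]
    rmse, bias = rmse_and_bias(control, als)  # rmse = 11.0
    low, high = bland_altman_limits(control, als)
    ```

    The tolerance-limit idea from the abstract then amounts to checking whether the Bland-Altman limits of agreement fall inside limits fixed in advance in the inventory order.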
  • Wallenius, Tarja (Helsingin yliopisto, 2010)
  • Hildén, Timo (Helsingin yliopisto, 2015)
    Gas Electron Multiplier (GEM) detectors are a special type of position-sensitive gas-filled detector used in several particle physics experiments. They are capable of sub-millimeter spatial resolution and an energy resolution (FWHM) of the order of 20%. GEM detectors can operate at rates of up to 50 kHz/mm2, withstand radiation excellently, and can be manufactured in sizes of up to a square meter. This thesis describes the Quality Assurance (QA) methods used in the assembly of 50 GEM detectors for the TOTEM T2 telescope at the LHC at CERN. Further development of the optical QA methods used in T2 detector assembly led to the development of a unique large-area scanning system capable of sub-µm resolution. The system, its capability and the software used in the analysis of the scans are described in detail. A correlation was found between one of the main characteristics of the detector, the gas gain, and the results of the optical QA method. It was shown that a qualitative estimate of the gain can be made from an accurate optical measurement of the microscopic features of the detector components. The ability to predict the performance of individual detector components is extremely useful in the large-scale production of GEM-based detectors.
  • Kankaanhuhta, Ville (Finnish Society of Forest Science, Finnish Forest Research Institute, Faculty of Agriculture and Forestry of the University of Helsinki, School of Forest Sciences of the University of Eastern Finland, 2014)
    The purpose of this thesis was to identify the main factors that must be taken into account in planning, controlling and improving the quality of forest regeneration activities. The forest regeneration services provided for the non-industrial, privately owned forests of Southern Finland by the local Forest Owners' Associations (FOAs) were used as an example. Since the original assumptions of quality management were not completely valid in this context, Lillrank's classification of production processes was used. The classification fitted this field of services well, and a tentative framework for the modelling and standardisation of forest regeneration service processes was proposed for further testing. The regeneration results and costs varied considerably between service providers at different levels. The jointly analysed inventory results and feedback provided a sound starting point for tackling the main causes of the observed statistical variation. The inventory results indicated that the selection of proper regeneration methods and the way they were executed were the most common factors influencing the quality of service outcomes. The cost-quality analysis of the two most common regeneration chains revealed potential for improving the cost-efficiency of these services. In the case of Norway spruce (Picea abies (L.) Karst.) planting, the regeneration costs were only weakly related to quality. For direct seeding of Scots pine (Pinus sylvestris L.), a significant positive correlation was found. However, selecting this regeneration chain for the MT (Myrtillus type) and more fertile site types produced poor regeneration results. In the case of Norway spruce planting, the most important factor explaining the outcomes was soil preparation: mounding produced better results than patching and disc trenching. In the FOAs, the effect of quality management interventions was observable especially in the improvement of resource allocation and of practices related to soil preparation.
  • Aaltonen, Serja (Helsingin yliopisto, 2007)
    ALICE (A Large Ion Collider Experiment) is an experiment at CERN (European Organization for Nuclear Research) in which a heavy-ion detector is dedicated to exploiting the unique physics potential of nucleus-nucleus interactions at LHC (Large Hadron Collider) energies. As part of that project, 716 so-called type V4 modules were assembled in the Detector Laboratory of the Helsinki Institute of Physics during the years 2004-2006. Altogether over a million detector strips have made this the most massive particle detector project in the science history of Finland. One ALICE SSD module consists of a double-sided silicon sensor and two hybrids containing 12 HAL25 front-end readout chips and some passive components, such as resistors and capacitors. The components are connected together by TAB (Tape Automated Bonding) microcables. The components of the modules were tested at every assembly phase with comparable electrical tests to ensure the reliable functioning of the detectors and to pinpoint possible problems. Components were accepted or rejected according to limits approved by the ALICE collaboration. This study concentrates on the test results of framed chips, hybrids and modules. The total yield of the framed chips is 90.8%, of hybrids 96.1% and of modules 86.2%. The individual test results have been investigated in the light of the known error sources that appeared during the project. After the problems arising during the learning curve of the project had been solved, material problems, such as defective chip cables and sensors, seemed to cause most of the assembly rejections. These problems were typically seen in the tests as too many individual channel failures. Bonding failures, by contrast, rarely caused the rejection of any component. One sensor type among the three different sensor manufacturers proved to be of lower quality than the others: the sensors of this manufacturer are very noisy, and their depletion voltages usually fall outside the specification given to the manufacturers. Reaching a 95% assembly yield during module production demonstrates that the assembly process has been highly successful.
  • Bärlund, Hanna-Maria (2012)
    Since the beginning of the 1990s, the emphasis on participatory democracy has grown stronger in Finnish policy- and decision-making. This development involves various stakeholders participating in negotiations, or more specifically deliberations, around current issues in order to reach consensus and enable the policy process to continue. According to research, the more consensual a democracy is, the more favourable its policy outcomes are towards environmental issues. The three case studies investigated, i.e. the Forest Biodiversity Programme for Southern Finland, the Working Group on Renewable Energy, and the Natura 2000 network of European Union nature protection areas, support this notion. The case studies focus on how the key players involved perceived the decision-making process in terms of achieved goals and degree of agreement, as well as on the specific issue context as a backdrop to the development of policy. The cases displayed significant differences in outcomes depending on the achieved level of consensus and deliberation. The outcomes are analysed within the theoretical frameworks of Arend Lijphart's 'consensus vs majoritarian model of democracy' and Martin Jänicke's 'consensual capacity for ecological modernisation'. Further, applying Joshua Cohen's theory of deliberative democracy and his suggestions for achieving "ideal deliberation", the results suggest that the connection between consensus democracy and more effective environmental conservation policy is not that clear-cut. Nevertheless, consensus democracy provides a promising point of departure for overcoming the main disputes between the stakeholders and for agreeing on common starting points and general goals, which is crucial for any progress in environmental conservation to take place.
  • Kymälainen, Hanna-Riitta (Helsingin yliopisto, 2004)
  • Kai, Zhan (2003)
    This research focuses on quality uncertainty and market efficiency in e-commerce. The purpose of this study is to analyse the economics of the 'lemons' market in electronic commerce and, in addition, to find methods of dealing with this problem. Electronic commerce presents an exciting opportunity to reduce transaction costs, but its future may depend on how non-technological but fundamentally economic issues such as the lemons problem are solved; otherwise it will essentially lead to market failure. Repeat purchases play an important role in my analysis. In my opinion, one of the main reasons why electronic commerce players are losing money is that high-quality products cannot command higher prices in high-quality markets. Owing to the lack of sufficiently informed consumers, firms have to spend on dissipative advertising to signal product quality, and consumers have to pay higher prices for high-quality products; as a result, market efficiency cannot be achieved. Thus, making consumers informed is the core of the problem of resolving the lemons problem. I suggest that electronic intermediaries may provide information about product quality to consumers and thereby reduce quality uncertainty. In fact, none of price, advertising, or intermediaries alone is a reliable signal of product quality. In order to reduce quality uncertainty and improve market efficiency, sellers are responsible for providing adequate information to buyers; similarly, buyers should communicate their preferences and tastes to sellers. My hope is that lemons could be turned into lemonade.
  • Forsman, Pia (Helsingin yliopisto, 2008)
    This thesis focuses on testing sleepiness quantitatively. The issue is relevant to policymakers concerned with traffic and occupational safety, as such testing provides a tool for safety legislation and surveillance. Sleepiness ensuing from staying awake for a mere 17 h impairs our performance as much as the legally proscribed blood alcohol concentration of 0.5 does. Sleepiness is hence a major risk factor in transportation and occupational accidents. The lack of convenient, commercial sleepiness tests precludes testing for impending sleepiness in the way that breath testing detects alcohol intoxication. Posturography is a potential sleepiness test, since clinical diurnal balance testing suggests the hypothesis that time awake could be estimated posturographically. Relying on this hypothesis, this thesis examines posturographic sleepiness testing for instrumentation purposes. Empirical results from 63 subjects, whose balance we tested with a force platform during up to 36 h of wakefulness, show that sustained wakefulness impairs balance. The results show that time awake can be estimated posturographically with 88% accuracy and 97% precision, which validates our hypothesis. Results also show that balance scores tested at 13:30 hours serve as a threshold for detecting excessive sleepiness. Analytical results show that the test length has a marked effect on estimation accuracy: an 18 s test suffices to identify sleepiness-related balance changes, but trades off some of the accuracy achieved with 30 s tests. The procedure for estimating time awake relies on comparing the subject's test score to a reference table (comprising balance scores tested during sustained wakefulness, regressed against time awake). Empirical results showed that sustained wakefulness explains 60% of the diurnal balance variation, whereas the time of day explains 40%. The latter fact implies that time-awake estimates must also rely on knowing the local times of both the test and the reference scores.
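    The estimation procedure described above, comparing a test score against a reference regression of balance scores on time awake, can be sketched as follows. This is a hypothetical illustration of the idea, not the thesis's actual model: the linear form, the function names, and all values are invented:

    ```python
    def fit_reference(hours_awake, balance_scores):
        """Least-squares line giving balance score as a function of time awake."""
        n = len(hours_awake)
        mx = sum(hours_awake) / n
        my = sum(balance_scores) / n
        sxx = sum((x - mx) ** 2 for x in hours_awake)
        sxy = sum((x - mx) * (y - my) for x, y in zip(hours_awake, balance_scores))
        slope = sxy / sxx
        intercept = my - slope * mx
        return slope, intercept

    def estimate_hours_awake(score, slope, intercept):
        """Invert the reference line: estimate time awake from a balance score."""
        return (score - intercept) / slope

    # Invented reference data: balance score worsens with sustained wakefulness
    slope, intercept = fit_reference([0, 12, 24, 36], [1.0, 2.0, 3.0, 4.0])
    estimate_hours_awake(2.5, slope, intercept)  # -> 18.0
    ```

    Because the abstract notes that time of day explains 40% of the balance variation, a real tester would also need to record the local clock times of the test and reference measurements, not just the scores.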
  • Salonen, J Sakari (Helsingin yliopisto, 2012)
    Palaeoclimatic reconstructions from fossil proxies have provided important insights into the natural variability of climate in the late Quaternary. However, major challenges remain in ensuring the robustness of these reconstructions. Multiple factors may introduce variability and biases into the palaeoclimatic estimates. For example, quantitative reconstructions use diverse modern calibration data-sets, and a wide variety of numerical calibration methods. While the choice of calibration data-set and calibration method may significantly influence the reconstructions, the comparison and analysis of these data-sets and methods have received relatively little attention. Further challenges are presented by the validation of the prepared reconstructions and the identification of climatic variables which can be robustly reconstructed from a given proxy. In this work, summer temperature reconstructions are prepared based on late-Quaternary pollen sequences from northern Finland and northern Russia, covering the Holocene and the early part of the last glacial period (Marine Isotope Stages 5d–5c). The major aim of this work is to validate these reconstructions and to identify sources of bias in them. Reconstructions are prepared using a number of different calibration methods and calibration sets, to analyse the between-reconstruction variability introduced by the choice of calibration method and calibration set. In addition, novel regression tree methods are used to test the ecological significance of different climatic factors, with the aim of identifying parameters which could feasibly be reconstructed. In the results, it is found that the choice of calibration method, calibration data-set, and fossil pollen sequence can all significantly affect the reconstruction. 
The problems in choosing calibration data are especially acute in pre-Holocene reconstructions, as it is difficult to find representative calibration data for reconstructions from non-analogue palaeoclimates which become increasingly common in the more distant past. First-order trends in the reconstructed palaeoclimates are found to be relatively robust. However, the degree of between-reconstruction variability stresses the importance of independent validation, and suggests that ensemble reconstructions using different methods and proxies should be increasingly relied on. The analysis of climatic response in northern European modern pollen samples by regression trees suggests secondary climatic determinants such as winter temperature and continentality to have major ecological influence, in addition to summer temperature which has been the most commonly reconstructed variable in palaeoclimatic studies. This suggests the potential to reconstruct the secondary parameters from fossil pollen. However, validating the robustness of secondary-parameter reconstructions remains a major challenge for future studies.
  • Schulman, Nina (Helsingin yliopisto, 2007)
    Knowing the chromosomal areas or actual genes affecting the traits under selection would add information to selection decisions, potentially leading to a higher genetic response. The first objective of this study was to map quantitative trait loci (QTL) affecting economically important traits in the Finnish Ayrshire population. The second objective was to investigate the effects of using QTL information in marker-assisted selection (MAS) on the genetic response and on the linkage disequilibrium between different parts of the genome. Whole-genome scans were carried out on a granddaughter design with 12 half-sib families and a total of 493 sons. Twelve traits were studied: milk yield, protein yield, protein content, fat yield, fat content, somatic cell score (SCS), mastitis treatments, other veterinary treatments, days open, fertility treatments, non-return rate, and calf mortality. The average spacing of the typed markers was 20 cM, with 2 to 14 markers per chromosome. Associations between markers and traits were analysed with multiple marker regression. Significance was determined by permutation, and genome-wise P-values were obtained by Bonferroni correction. The benefits from MAS were investigated by simulation: a conventional progeny-testing scheme was compared to a scheme in which QTL information was used within families to select among full sibs in the male path. Two QTL on different chromosomes were modelled. The effects of different starting frequencies of the favourable alleles and of different sizes of the QTL effects were evaluated. A large number of QTL, 48 in total, were detected at 5% or higher chromosome-wise significance. QTL for milk production were found on 8 chromosomes, for SCS on 6, for mastitis treatments on 1, for other veterinary treatments on 5, for days open on 7, for fertility treatments on 7, for calf mortality on 6, and for non-return rate on 2 chromosomes. In the simulation study the total genetic response was faster with MAS than with conventional selection, and the advantage of MAS persisted over the studied generations. The rate of response and the difference between the selection schemes clearly reflected the changes in the allele frequencies of the favourable QTL. The disequilibrium between the polygenes and the QTL was always negative, and it was larger with a larger QTL size. The disequilibrium between the two QTL was larger for QTL of large effect, and it was somewhat larger with MAS in scenarios with starting frequencies below 0.5 for QTL of moderate size and below 0.3 for large QTL. In conclusion, several QTL affecting economically important traits of dairy cattle were detected. Further studies are needed to verify these QTL, check their presence in the current breeding population, look for pleiotropy, and fine-map the most interesting QTL regions. The results of the simulation studies show that using MAS together with embryo transfer to pre-select young bulls within families is a useful approach for increasing the genetic merit of AI bulls compared to conventional selection.
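    Permutation-based significance testing of a marker-trait association, as used in the scan above, can be sketched in generic form. This is an illustration of the principle only (a single biallelic marker class and a mean-difference statistic), not the study's multiple marker regression; all names and data are invented:

    ```python
    import random

    def permutation_pvalue(genotypes, traits, n_perm=1000, seed=1):
        """Empirical p-value for a marker-trait association.

        genotypes: 0/1 marker class per individual; traits: phenotype values.
        Traits are shuffled repeatedly to build the null distribution of the
        test statistic, here the absolute difference in group means.
        """
        def mean_diff(g, t):
            g1 = [x for gi, x in zip(g, t) if gi == 1]
            g0 = [x for gi, x in zip(g, t) if gi == 0]
            return abs(sum(g1) / len(g1) - sum(g0) / len(g0))

        observed = mean_diff(genotypes, traits)
        rng = random.Random(seed)
        shuffled = list(traits)
        exceed = 0
        for _ in range(n_perm):
            rng.shuffle(shuffled)
            if mean_diff(genotypes, shuffled) >= observed:
                exceed += 1
        return (exceed + 1) / (n_perm + 1)  # add-one to avoid p = 0
    ```

    In a genome scan the same idea is applied to the regression test statistic, and the resulting chromosome-wise P-values are then corrected (e.g. by Bonferroni across chromosomes, as the abstract describes) to obtain genome-wise significance.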
  • Wang, Cong (Helsingin yliopisto, 2010)
    This thesis studies the intermolecular interactions in (i) boron-nitrogen based systems for hydrogen splitting and storage, (ii) endohedral complexes, A@C60, and (iii) aurophilic dimers. We first present an introduction to intermolecular interactions and then describe the theoretical background. The research results are summarized in the following sections. In the boron-nitrogen systems, the electrostatic interaction is found to be the leading contribution, as 'Coulomb Pays for Heitler and London' (CHL). For the endohedral complex, the intermolecular interaction is formulated by a one-center expansion of the Coulomb operator 1/r_ab. For the aurophilic attraction between two C2v monomers, a London-type formula was derived by fully accounting for the anisotropy and point-group symmetry of the monomers.
  • Kurtén, Theo (Helsingin yliopisto, 2007)
    Nucleation is the first step of the process by which gas molecules in the atmosphere condense to form liquid or solid particles. Despite the importance of atmospheric new-particle formation for both climate- and health-related issues, little information exists on its precise molecular-level mechanisms. In this thesis, potential nucleation mechanisms involving sulfuric acid together with either water and ammonia or reactive biogenic molecules are studied using quantum chemical methods. Quantum chemistry calculations are based on the numerical solution of Schrödinger's equation for a system of atoms and electrons subject to various sets of approximations, the precise details of which give rise to a large number of model chemistries. A comparison of several different model chemistries indicates that the computational method must be chosen with care if accurate results for sulfuric acid - water - ammonia clusters are desired. Specifically, binding energies are incorrectly predicted by some popular density functionals, and vibrational anharmonicity must be accounted for if quantitatively reliable formation free energies are desired. The calculations reported in this thesis show that a combination of different high-level energy corrections and advanced thermochemical analysis can quantitatively replicate experimental results concerning the hydration of sulfuric acid. The role of ammonia in sulfuric acid - water nucleation was revealed by a series of calculations on molecular clusters of increasing size with respect to all three co-ordinates: sulfuric acid, water and ammonia. As indicated by experimental measurements, ammonia significantly assists the growth of clusters in the sulfuric acid co-ordinate. The calculations presented in this thesis predict that in atmospheric conditions this effect becomes important as the number of acid molecules increases from two to three. On the other hand, small molecular clusters are unlikely to contain more than one ammonia molecule per sulfuric acid. This implies that the average NH3:H2SO4 mole ratio of small molecular clusters in atmospheric conditions is likely to be between 1:3 and 1:1. Calculations on charged clusters confirm the experimental result that the HSO4- ion is much more strongly hydrated than neutral sulfuric acid. Preliminary calculations on HSO4-·NH3 clusters indicate that ammonia is likely to play at most a minor role in ion-induced nucleation in the sulfuric acid - water system. Calculations of thermodynamic and kinetic parameters for the reaction of stabilized Criegee intermediates with sulfuric acid demonstrate that quantum chemistry is a powerful tool for investigating chemically complicated nucleation mechanisms. The calculations indicate that if the biogenic Criegee intermediates have sufficiently long lifetimes in atmospheric conditions, the studied reaction may be an important source of nucleation precursors.
  • Calsamiglia, John (Helsingin yliopisto, 2001)