Browsing by Subject "calibration"


  • Zhang, Jie (2002)
    Most countries use a Labour Force Survey (LFS) to obtain information about the characteristics of their labour force. The LFS produces statistics on the number of persons employed, unemployed and inactive. Perhaps the best-known parameters are the number of unemployed persons and the unemployment rate, for the country as a whole and for its most important geographical and other domains. In principle, the phases of the LFS, such as planning, fieldwork, and estimation and analysis, are highly co-ordinated, and the same holds across the LFSs of different countries. Still, great differences exist in how these phases are carried out in different countries. The aim of this Master's thesis is to give an overview and comparison of the LFS procedures in three countries: Finland, the United Kingdom (UK) and Canada. The main reason for choosing these countries is that they are considered among the leaders in the quality of statistics production. In addition, detailed written documentation about the LFS is readily available for these countries. The thesis consists of two major parts: an examination of the LFS sampling designs in the three countries, and an examination of their LFS estimation procedures.
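The estimation phase of a labour force survey typically relies on calibration estimators, which adjust design weights so that weighted sample totals reproduce known population totals of auxiliary variables. A minimal sketch of linear (GREG-style) calibration, with made-up design weights, auxiliaries and totals:

```python
import numpy as np

def calibrate_weights(d, X, totals):
    """Linear calibration (GREG): adjust design weights d so that the
    weighted sample totals of the auxiliaries X match the known totals."""
    # solve for the Lagrange multipliers, then scale the weights
    lam = np.linalg.solve(X.T @ (d[:, None] * X), totals - X.T @ d)
    return d * (1.0 + X @ lam)

# toy example: 5 sampled persons, auxiliaries = (intercept, age)
d = np.array([10.0, 10.0, 10.0, 10.0, 10.0])                  # design weights
X = np.array([[1, 25], [1, 40], [1, 31], [1, 58], [1, 47]], dtype=float)
t = np.array([52.0, 2000.0])          # known population size and age total
w = calibrate_weights(d, X, t)        # calibrated weights: X.T @ w == t
```

The calibrated weights reproduce the known totals exactly; real LFS estimation layers non-response adjustment and domain estimation on top of this idea.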
  • Khoramshahi, Ehsan; Campos, Mariana Batista; Tommaselli, Antonio Maria Garcia; Vilijanen, Niko; Mielonen, Teemu; Kaartinen, Harri; Kukko, Antero; Honkavaara, Eija (2019)
    Mobile mapping systems (MMS) are increasingly used for many photogrammetric and computer vision applications, encouraged especially by fast and accurate geospatial data generation. The accuracy of point positioning in an MMS depends mainly on the quality of calibration, the accuracy of sensor synchronization, the accuracy of georeferencing and the stability of the geometric configuration of space intersections. In this study, we focus on multi-camera calibration (interior and relative orientation parameter estimation) and MMS calibration (mounting parameter estimation). The objective of this study was to develop a practical scheme for rigorous and accurate system calibration of a photogrammetric mapping station equipped with a multi-projective camera (MPC), a global navigation satellite system (GNSS) and an inertial measurement unit (IMU) for direct georeferencing. The proposed technique comprises two steps. First, the interior orientation parameters of each individual camera in the MPC and the relative orientation parameters of each camera of the MPC with respect to the first camera are estimated. In the second step, the offset and misalignment between the MPC and the GNSS/IMU are estimated. The global accuracy of the proposed method was assessed using independent check points. A correspondence map for a panorama is introduced that provides metric information. Our results highlight that the proposed calibration scheme reaches centimeter-level global accuracy for 3D point positioning. This level of global accuracy demonstrates the feasibility of the proposed technique and its potential for accurate mapping purposes.
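The two-step scheme, relative orientation of each camera with respect to the first camera and then mounting parameters linking the camera block to the GNSS/IMU frame, amounts to composing rigid transforms. A minimal sketch; the rotation and offset values are invented purely for illustration:

```python
import numpy as np

def compose(R_ab, t_ab, R_bc, t_bc):
    """Compose rigid transforms: frame a <- b and b <- c give a <- c."""
    return R_ab @ R_bc, R_ab @ t_bc + t_ab

# hypothetical mounting parameters: camera 1 pose in the mapping frame
R_m1 = np.eye(3)
t_m1 = np.array([100.0, 50.0, 10.0])          # mapping <- cam1

# hypothetical relative orientation: camera 2 with respect to camera 1
Rz = np.array([[0.0, -1.0, 0.0],
               [1.0,  0.0, 0.0],
               [0.0,  0.0, 1.0]])             # 90 degree yaw
t_12 = np.array([0.1, 0.0, 0.0])              # 10 cm baseline, cam1 <- cam2

# camera 2 pose in the mapping frame by composition
R_m2, t_m2 = compose(R_m1, t_m1, Rz, t_12)
```

In a real system these transforms are estimated in a bundle adjustment rather than given; composition is just the bookkeeping that ties the two calibration steps together.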
  • Heiskanen, Ilmari (Helsingin yliopisto, 2021)
    Interest in indoor air quality has increased over several decades from a human health perspective. To evaluate indoor air quality in terms of volatile organic compound (VOC) levels, robust analytical procedures and techniques must be used for indoor air VOC measurements. Since building materials are the greatest source of indoor VOC emissions, similar procedures must be used for the analysis of emission rates from building materials and their surfaces. The theory part of this thesis reviews the background of VOCs and human health, legislation and guideline values, common building materials and their emissions, and the sampling techniques and approaches used for indoor air sampling and for surface material emission rate sampling and analysis. The discussed sampling techniques include, for example, material emission test chambers, field and laboratory test emission cells, solid phase microextraction (SPME) fibre applications and Radiello passive samplers. New, innovative approaches are also discussed. Common analysis instruments are Gas Chromatography (GC) with a Mass Spectrometer (MS) or Flame Ionization Detector (FID) for VOCs, and High-Performance Liquid Chromatography with an Ultraviolet/Visible light detector (HPLC-UV/VIS) for carbonyl VOCs (e.g. formaldehyde) after suitable derivatization. Analytical procedures remain highly oriented towards the ISO 16000 standard series, even in recent studies. In addition, the potential of the modern miniaturized sample collection devices SPME Arrow and In-tube extraction (ITEX), used in the experimental part of this thesis, is discussed as an addition to indoor air and VOC emission studies. The aim of the experimental part of this thesis was to develop calibrations for selected organic nitrogen compounds with the SPME Arrow and ITEX sampling techniques and to test the calibrations with indoor and outdoor samples.
A calibration was successfully carried out with SPME Arrow (MCM-41 sorbent), ITEX (MCM-TP sorbent) and ITEX (polyacrylonitrile (PAN) 10 % sorbent) using a permeation system combined with GC-MS for the following selected organic nitrogen compounds: triethylamine, pyridine, isobutylamine, allylamine, trimethylamine, ethylenediamine, dipropylamine, hexylamine, 1,3-diaminopropane, 1-methylimidazole, N,N-dimethylformamide, 1,2-diaminocyclohexane, 1-nitropropane and formamide. The overall quality of the calibration curves was evaluated, and the calibrations were compared in terms of linear range, relative standard deviation (RSD) % for accepted calibration levels and the obtained limit of detection (LOD) values. Ways to improve the calibrations were also discussed. The calibration curves were tested with real indoor and outdoor samples, and both quantitative and semi-quantitative results were obtained.
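The calibration-curve workflow described above, fitting a response curve, computing RSD % over replicate injections, and deriving a limit of detection, can be sketched as follows. The concentrations and peak areas are made up, and the 3.3 sigma/slope LOD estimate is one common convention, not necessarily the one used in the thesis:

```python
import numpy as np

# hypothetical calibration data: analyte amount vs GC-MS peak area
conc = np.array([0.5, 1.0, 2.0, 5.0, 10.0])       # ng
area = np.array([120.0, 230.0, 470.0, 1150.0, 2320.0])

# linear calibration curve: area = slope * conc + intercept
slope, intercept = np.polyfit(conc, area, 1)
pred = slope * conc + intercept
residual_sd = np.std(area - pred, ddof=2)          # sd about the fit

# common LOD estimate: 3.3 * (residual sd) / slope
lod = 3.3 * residual_sd / slope

# RSD % of replicate injections at one calibration level
replicates = np.array([228.0, 231.0, 235.0])
rsd_percent = 100 * np.std(replicates, ddof=1) / np.mean(replicates)
```

An unknown sample is then quantified by inverting the curve, `(area_unknown - intercept) / slope`, provided it falls within the accepted linear range.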
  • Helin, Tuukka A.; Lemponen, Marja; Lassila, Riitta; Joutsi-Korhonen, Lotta (2021)
    Background: The thrombin generation (TG) assay is a feasible but labor-intensive method for assessing global coagulation. It enables comprehensive assessment of anticoagulation, while drug-specific assays assess only exposure. Traditionally, the Calibrated Automated Thrombogram (CAT) has been used; however, the ST Genesia (Diagnostica Stago) allows automated evaluation. Objective: We aimed to observe coagulation using the ST Genesia and compare the data with those of the CAT in anticoagulated patients. Patients and methods: In total, 43 frozen-thawed samples were studied using DrugScreen to assess direct oral anticoagulants (DOACs), warfarin, and low-molecular-weight heparin. Twenty samples (nine rivaroxaban, five apixaban, three warfarin, and three heparin) were also compared using the CAT (5 pM tissue factor). Results: TG reduction in DrugScreen depended on the specific drug and correlated modestly with DOAC levels (lag time R-2 = 0.36; peak R-2 = 0.50). The best correlation was observed between peak thrombin and rivaroxaban-specific anti-activated factor X (anti-Xa) activity (R-2 = 0.60). When comparing the ST Genesia with the CAT, only the results for apixaban were concordant (R-2 = 0.97). Unlike the CAT, the ST Genesia yielded a normal endogenous thrombin potential (ETP) in 77% (24/31) of activated factor X inhibitor cases, and it failed to give readouts at international normalized ratio (INR) >= 4.5 and at anti-Xa >= 1.0 IU/mL. Conclusion: The ST Genesia data did not correlate with the CAT, but they were independently associated with INR, anti-Xa, and DOAC concentrations. The lag time and peak responses were similar; the major differences were that the ST Genesia showed no ETP effect of DOACs and failed to give readouts at high INR or anti-Xa activity.
  • Mäkelä, Jarmo (2020)
    Finnish Meteorological Institute Contributions 160
    How significant are different uncertainty sources when simulating the future state of the ecosystem in Finland? In this thesis, we examine this question and provide some answers to this broad topic by simulating 21st century ecosystem conditions with a land-ecosystem model called JSBACH. The results are also compared to similar simulations performed by another model called PREBAS. We consider four different sources of uncertainty, related to 1) the model that is used to generate the future conditions; 2) the future climate used to drive the model, represented by an ensemble of CMIP5 simulations; 3) the RCP scenarios that depict the rising atmospheric CO2 concentration; and 4) forest management actions. Before running the simulations described above, we calibrated and validated the JSBACH model extensively on different temporal resolutions and with multiple model modifications. These hindcasting calibrations were performed with two Bayesian approaches: the adaptive Metropolis algorithm and the adaptive population importance sampler. The calibrations resulted in a sufficient model setup and satisfactory parameter distributions, which were used to represent the JSBACH model uncertainty in the 21st century simulations. Canonical correlation analysis was used to glean the impact of the different uncertainty sources on multiple groups of ecosystem variables. The results are summarised using redundancy indices, which yield varied impacts. Overall, forest management actions and RCP scenarios tend to dominate the uncertainties towards the end of the century, but the effect of climate models and parameters should not be overlooked, especially since a more detailed examination revealed that their impact was not fully captured. *** How significant are the different sources of uncertainty when assessing the future of forest ecosystems in Finland? This dissertation examines this question by modelling the state of forest ecosystems through the 21st century.
The land-ecosystem model JSBACH is used for the modelling, and the estimates are compared with corresponding results from the PREBAS model. The examined sources of uncertainty can be divided into 1) the internal uncertainty of the models; 2) the climate forcings used to drive the models, based on CMIP5 simulations; 3) the RCP emission scenarios, which represent the rise in atmospheric carbon dioxide concentration; and 4) the chosen forest management plan. To carry out the future simulations described above, the JSBACH model was calibrated and validated using 10 local measurement stations in the boreal zone. Different temporal resolutions and several structural modifications of the model were included so that the transpiration, evaporation and carbon uptake produced by the model would better match the corresponding observations. Two Bayesian methods were used for the calibration: the adaptive Metropolis and the adaptive population importance sampler algorithms. The dependencies between parameters, their identifiability and their significance, as revealed during the calibration process, were analysed thoroughly. In addition, the final parameter distributions and values were compared with several literature sources. Validation was performed with observations independent of the calibration as well as with observations from separate measurement stations. Based on these analyses, a model structure and parameter distributions suitable for the future simulations were formed, representing the internal uncertainty of the JSBACH model. The influence of the different sources of uncertainty on the future simulations was assessed using canonical correlation analysis. Indicators describing the state of forest ecosystems were examined in this analysis both all at once and divided into groups according to their influence. The strongest sources of uncertainty at the end of the century were forest management plans and emission scenarios, followed in order by the climate forcings and the internal uncertainty of the models.
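The adaptive Metropolis algorithm mentioned above extends the basic random-walk Metropolis sampler by tuning the proposal covariance during sampling. A minimal non-adaptive sketch on a toy one-parameter posterior (a standard normal), not the JSBACH setup itself:

```python
import numpy as np

def metropolis(log_post, theta0, n_iter=5000, step=0.5, seed=1):
    """Random-walk Metropolis: sample from a posterior given its
    log-density (up to an additive constant)."""
    rng = np.random.default_rng(seed)
    theta = np.asarray(theta0, dtype=float)
    lp = log_post(theta)
    samples = np.empty((n_iter, theta.size))
    for i in range(n_iter):
        prop = theta + step * rng.standard_normal(theta.size)
        lp_prop = log_post(prop)
        # accept with probability min(1, posterior ratio)
        if np.log(rng.uniform()) < lp_prop - lp:
            theta, lp = prop, lp_prop
        samples[i] = theta
    return samples

# toy target: standard normal posterior for one model parameter
draws = metropolis(lambda th: -0.5 * np.sum(th**2), [3.0])
```

After discarding a burn-in period, the draws approximate the posterior; in a real calibration `log_post` would combine the prior with a likelihood comparing model output to flux observations.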
  • Chatrchyan, S.; Anttila, E.; Czellar, S.; Härkönen, J.; Heikkinen, A.; Karimäki, V.; Kinnunen, R.; Klem, J.; Kortelainen, M.; Lampen, T.; Lassila-Perini, K.; Lehti, S.; Lindén, T.; Luukka, P.; Mäenpää, T.; Nysten, J.; Tuominen, E.; Tuominiemi, J.; Ungaro, D.; Wendland, L.; CMS Collaboration (INSTITUTE OF PHYSICS PUBLISHING, 2010)
  • Zaidan, Martha Arbayani; Hossein Motlagh, Naser; Fung, Pak Lun; Lu, David; Timonen, Hilkka; Kuula, Joel; Niemi, Jarkko V; Tarkoma, Sasu; Petäjä, Tuukka; Kulmala, Markku; Hussein, Tareq (2020)
    This paper presents the development of air quality low-cost sensors (LCS) with improved accuracy features. The LCS features integrate machine learning based calibration models and virtual sensors. LCS performance is analyzed, and LCS variables with low performance are improved through intelligent field calibrations. Meteorological variables are calibrated using linear dynamic models. Fine particulate matter (PM2.5), owing to its non-linear relationship with the reference instruments, is calibrated using non-linear machine learning models. However, due to sensor drifts or faults, carbon dioxide (CO2) shows no correlation with the reference instrument, and consequently the CO2 LCS cannot feasibly be calibrated. Hence, to estimate the CO2 concentration, mathematical models are developed and integrated in the calibrated LCS, known as a virtual sensor. In addition, another virtual sensor is developed to demonstrate the capability of estimating air pollutant concentrations, e.g. black carbon, when physical sensor devices are not available. In our paper, calibration models and virtual sensors are established using the corresponding reference instruments installed at two reference stations. This strategy generalizes the calibration and virtual sensing models, which then allows the LCS to be deployed in the field independently with high accuracy. Our proposed methodology enables scaling up accurate air pollution mapping appropriate for smart cities.
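A field calibration of a meteorological variable can be as simple as a linear regression against a co-located reference instrument, and a virtual sensor is then a model that estimates an unmeasured quantity from the calibrated ones. A toy sketch with made-up co-location data; the black-carbon proxy and its coefficients are entirely hypothetical:

```python
import numpy as np

# hypothetical co-location data: low-cost sensor readings vs reference
sensor_t = np.array([19.2, 21.0, 23.5, 25.1, 27.8])   # LCS temperature, C
ref_t    = np.array([18.5, 20.4, 23.0, 24.8, 27.5])   # reference station, C

# linear calibration model: ref ~ a * sensor + b
A = np.vstack([sensor_t, np.ones_like(sensor_t)]).T
(a, b), *_ = np.linalg.lstsq(A, ref_t, rcond=None)

def calibrated(x):
    """Apply the fitted field calibration to a raw LCS reading."""
    return a * x + b

def virtual_bc(temp_c, no2_ppb, w=(0.02, 0.05), bias=0.1):
    """Hypothetical virtual sensor: a linear proxy for black carbon
    built from quantities the LCS does measure."""
    return bias + w[0] * temp_c + w[1] * no2_ppb
```

PM2.5 would get a non-linear model in place of the linear fit, and the real virtual sensors are trained against reference instruments rather than hand-set.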
  • Kassamakov, Ivan; Maconi, Göran; Penttilä, Antti; Helander, Petteri; Gritsevich, Maria; Puranen, Tuomas; Salmi, Ari; Haeggström, Edward; Muinonen, Karri (SPIE - the international society for optics and photonics, 2018)
    Proceedings of SPIE
    We present the design of a novel scatterometer for precise measurement of the angular Mueller matrix profile of a mm- to µm-sized sample held in place by sound. The scatterometer comprises a tunable multimode Argon-krypton laser (with the possibility to select one of 12 wavelengths in the visible range), linear polarizers, a reference photomultiplier tube (PMT) for monitoring the beam intensity, and a micro-PMT module mounted radially towards the sample at an adjustable radius. The measurement angle is controlled by a motor-driven rotation stage with an accuracy of 15'. The system is fully automated using LabVIEW, including the FPGA-based data acquisition and the instrument's user interface. The calibration protocol ensures accurate measurements by using a control sphere sample (diameter 3 mm, refractive index 1.5), fixed first on a static holder, followed by accurate multi-wavelength measurements of the same sample levitated ultrasonically. To demonstrate the performance of the scatterometer, we conducted detailed measurements of light scattered by a particle derived from the Chelyabinsk meteorite, as well as by planetary analogue materials. The measurements are the first of their kind, since they are obtained using controlled spectral angular scattering, including linear polarization effects, for arbitrarily shaped objects. Thus, our novel approach permits a non-destructive, disturbance-free measurement with control of the orientation and location of the scattering object.
  • Concas, Francesco; Mineraud, Julien; Lagerspetz, Eemil; Varjonen, Samu; Liu, Xiaoli; Puolamäki, Kai; Nurmi, Petteri; Tarkoma, Sasu (2021)
    The significance of air pollution and the problems associated with it are fueling deployments of air quality monitoring stations worldwide. The most common approach to air quality monitoring is to rely on environmental monitoring stations, which unfortunately are very expensive both to acquire and to maintain. Hence, environmental monitoring stations are typically sparsely deployed, resulting in limited spatial resolution for measurements. Recently, low-cost air quality sensors have emerged as an alternative that can improve the granularity of monitoring. The use of low-cost air quality sensors, however, presents several challenges: they suffer from cross-sensitivities between different ambient pollutants; they can be affected by external factors such as traffic, weather changes, and human behavior; and their accuracy degrades over time. Periodic re-calibration can improve the accuracy of low-cost sensors, particularly machine-learning-based calibration, which has shown great promise due to its capability to calibrate sensors in the field. In this article, we survey the rapidly growing research landscape of low-cost sensor technologies for air quality monitoring and their calibration using machine learning techniques. We also identify open research challenges and present directions for future research.
  • Toivonen, Mikko E.; Klami, Arto (2020)
    Knowledge of the spectral response of a camera is important in many applications, such as illumination estimation, spectrum estimation in multi-spectral camera systems, and color consistency correction for computer vision. We present a practical method for estimating the camera sensor spectral response and its uncertainty, consisting of an imaging method and an algorithm. We use only 15 images (four diffraction images and 11 images of color patches of known spectra) to obtain high-resolution spectral response estimates, and we obtain uncertainty estimates by training an ensemble of response estimation models. The algorithm does not assume any strict priors that would limit the possible spectral response estimates and is thus applicable to any camera sensor, at least in the visible range. The estimates have low errors for estimating color channel values from known spectra, and are consistent with previously reported spectral response estimates.
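Estimating a spectral response from color patches of known spectra is at its core a linear inverse problem: each channel reading is the inner product of a patch spectrum with the unknown response. A toy least-squares sketch at coarse spectral resolution; the ensemble-based uncertainty estimation in the abstract would repeat such a fit over resampled patch subsets:

```python
import numpy as np

rng = np.random.default_rng(0)
n_patches, n_bands = 30, 8      # toy resolution; real estimates are far finer

# known patch spectra (rows) sampled at n_bands wavelengths
S = rng.uniform(0.0, 1.0, (n_patches, n_bands))

# a made-up "true" single-channel response to recover
r_true = np.array([0.0, 0.1, 0.4, 0.9, 1.0, 0.6, 0.2, 0.0])

# simulated camera readings: spectrum dot response, plus measurement noise
c = S @ r_true + 0.01 * rng.standard_normal(n_patches)

# least-squares estimate of the response from the patch measurements
r_hat, *_ = np.linalg.lstsq(S, c, rcond=None)
```

With more patches than bands and modest noise, the recovered response tracks the true one closely; at high spectral resolution the problem becomes ill-posed and needs the diffraction images or other regularizing information.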
  • Lerviks, Alf-Erik (Svenska handelshögskolan, 2004)
    Research Reports
    A diffusion/replacement model for new consumer durables, designed to be used as a long-term forecasting tool, is developed. The model simulates new demand as well as replacement demand over time. The model, called DEMSIM, is built upon a counteractive adoption model specifying the basic forces affecting the adoption behaviour of individual consumers. These forces are the promoting forces and the resisting forces. The promoting forces are further divided into internal and external influences. These influences are operationalized within a multi-segmental diffusion model generating the adoption behaviour of the consumers in each segment as an expected value. This diffusion model is combined with a replacement model built upon the same segmental structure as the diffusion model. This model generates, in turn, the expected replacement behaviour in each segment. To be able to use DEMSIM as a forecasting tool in the early stages of a diffusion process, estimates of the model parameters are needed as soon as possible after product launch. However, traditional statistical techniques are not very helpful for estimating such parameters in the early stages of a diffusion process. To enable early parameter calibration, an optimization algorithm is developed by which the main parameters of the diffusion model can be estimated on the basis of very few sales observations. The optimization is carried out in iterative simulation runs. Empirical validations using the optimization algorithm reveal that the diffusion model performs well in early long-term sales forecasts, especially with regard to the timing of future sales peaks.
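Calibrating a diffusion model from very few early sales observations can be illustrated with the classic Bass model and a brute-force grid search (DEMSIM itself is multi-segmental and uses a more refined optimization; this is only a sketch of the idea):

```python
import numpy as np

def bass_sales(p, q, m, periods):
    """Discrete Bass diffusion: new adopters per period, driven by
    external influence p, internal influence q, and market size m."""
    N, out = 0.0, []
    for _ in range(periods):
        n = (p + q * N / m) * (m - N)
        out.append(n)
        N += n
    return np.array(out)

def fit_bass(sales, grid_p, grid_q, grid_m):
    """Calibrate (p, q, m) from very few observations by minimizing
    squared error over a parameter grid."""
    best, best_err = None, np.inf
    for p in grid_p:
        for q in grid_q:
            for m in grid_m:
                err = np.sum((bass_sales(p, q, m, len(sales)) - sales) ** 2)
                if err < best_err:
                    best, best_err = (p, q, m), err
    return best

# pretend these four periods are the only sales data available
obs = bass_sales(0.03, 0.4, 1000.0, 4)
p, q, m = fit_bass(obs,
                   np.linspace(0.01, 0.05, 5),
                   np.linspace(0.2, 0.6, 5),
                   np.linspace(600, 1400, 5))
```

With the true parameters on the grid, the search recovers them from just four observations; the fitted model can then be run forward to forecast the timing of the sales peak.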
  • Heikkinen, Terho; Kassamakov, I.; Viitala, T.; Järvinen, M.; Vainikka, T.; Nolvi, A.; Bermudez, C.; Artigas, R.; Martinez, P.; Korpelainen, Raija; Lassila, A.; Haeggström, E. (2020)
    Modern microscopes and profilometers, such as the coherence scanning interferometer (CSI), approach sub-nm precision in height measurements. Transfer standards at all measured size scales are needed to guarantee traceability at any scale and to utilize the full potential of these instruments; transfer standards whose reflection characteristics resemble those of the measured samples are preferred. This is currently not the case for samples featuring dimensions of less than 10 nm and for biosamples with optical characteristics different from those of silicon, silica or metals. To address the need for 3D images of biosamples with traceable dimensions, we introduce a transfer standard with dimensions guaranteed by natural self-assembly and a material that is optically similar to that in typical biosamples. We test the functionality of these transfer standards by first calibrating them using an atomic force microscope (AFM) and then using them to calibrate a CSI. We investigate whether sufficient accuracy can be reached to enable a useful calibration of the CSI system. The result is that the calibration uncertainty is only marginally increased due to the sample.
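The transfer-standard chain described above, certifying feature heights with an AFM and then using them to correct the CSI height scale, reduces in the simplest case to estimating a single amplification coefficient. A sketch with made-up step heights:

```python
import numpy as np

# hypothetical transfer: feature heights certified by AFM (nm)
# and the same features measured with the CSI before correction
afm_height = np.array([4.8, 9.6, 19.2])
csi_height = np.array([4.6, 9.2, 18.5])

# scale (amplification) correction: least-squares fit through the origin
scale = np.sum(afm_height * csi_height) / np.sum(csi_height ** 2)

# corrected CSI heights after applying the calibration
corrected = scale * csi_height
```

A full uncertainty budget would propagate the AFM calibration uncertainty and the fit residuals into the corrected values, which is where the "only marginally increased" conclusion comes from.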
  • Lintuluoto, Adelina Eleonora (Helsingin yliopisto, 2021)
    At the Compact Muon Solenoid (CMS) experiment at CERN (the European Organization for Nuclear Research), the building blocks of the Universe are investigated by analysing the observed final-state particles resulting from high-energy proton-proton collisions. However, direct detection of final-state quarks and gluons is not possible due to a phenomenon known as colour confinement. Instead, event properties with a close correspondence to their distributions are studied. These event properties are known as jets. Jets are central to particle physics analysis, and our understanding of them, and hence of our Universe, depends on our ability to accurately measure their energy. Unfortunately, current detector technology is imprecise, necessitating downstream correction of measurement discrepancies. To achieve this, the CMS experiment employs a sequential multi-step jet calibration process. The process is performed several times per year, and more often during periods of data collection. Automating the jet calibration would increase the efficiency of the CMS experiment. By automating the code execution, the workflow could be performed independently of the analyst. This, in turn, would speed up the analysis and reduce the analyst's workload. In addition, automation facilitates higher levels of reproducibility. In this thesis, a novel method for automating the derivation of jet energy corrections from simulation is presented. To achieve automation, the methodology utilises declarative programming. The analyst is simply required to express what should be executed, and no longer needs to determine how to execute it. To successfully automate the computation of jet energy corrections, it is necessary to capture detailed information concerning both the computational steps and the computational environment. The former is achieved with a computational workflow, and the latter using container technology.
This allows a portable and scalable workflow to be achieved, one that is easy to maintain and to compare to previous runs. The results of this thesis strongly suggest that capturing complex experimental particle physics analyses with declarative workflow languages is both achievable and advantageous. The productivity of the analyst was improved, and reproducibility was facilitated. However, the method is not without its challenges. Declarative programming requires the analyst to think differently about the problem at hand; as a result, there are some sociological challenges to methodological uptake. However, once its extensive benefits are understood, we anticipate widespread adoption of this approach.
  • Jaakkola, Sauli (Helsingfors universitet, 2013)
    Assessing and avoiding the environmental impact of agriculture and forestry has become increasingly important in recent years. In Finland, half of the phosphorus load and nearly 40 % of the nitrogen load in water systems are caused by agriculture and forestry. Traditionally, water quality monitoring has been carried out with manual water sampling and laboratory analyses. The problem with manual sampling is the small number of samples. Continuously operating water quality sensors have been in use for a relatively short time, which is why continuous water quality monitoring needs more research. The objective of this study is to clarify the feasibility of optical sensors for monitoring water quality and nutrient loading in an agricultural and forest management area. The study was carried out at three monitoring stations in the Savijoki catchment in Southwest Finland. Two of the stations were identically equipped and located in forested subcatchments. A third station was located at the Savijoki catchment discharge point, making it possible to study how the sensors work in different water qualities. According to the study, monitoring with continuously operating sensors results in more accurate nutrient loading estimates. With the sensors used in the study, it is also possible to draw conclusions about the dynamics between run-off and nutrient concentrations in the water. A prerequisite for successful monitoring is using the appropriate sensors in the correct locations. For example, the low nitrate levels in the water of forested areas have to be taken into consideration when choosing sensors. During monitoring, it is important to actively track the quality of the data and to check that the sensors are working properly. Water quality sensors always need good calibration and control water samples covering the entire concentration range. The sensors also have to be equipped with an automatic cleaning mechanism.