Browsing by Subject "modelling"


Now showing items 1-20 of 25
  • Bhattacharjee, Joy; Rabbil, Mehedi; Fazel, Nasim; Darabi, Hamid; Choubin, Bahram; Khan, Md. Motiur Rahman; Marttila, Hannu; Haghighi, Ali Torabi (Elsevier, 2021)
    Science of the Total Environment 797 (2021), 149034
    Lake water level fluctuation is a function of hydro-meteorological components, namely input and output to the system. The combination of these components from in-situ and remote sensing sources has been used in this study to define multiple scenarios, which are the major explanatory pathways to assess lake water levels. The goal is to analyze each scenario through the application of the water balance equation to simulate lake water levels. The largest lake in Iran, Lake Urmia, has been selected in this study as it needs a great deal of attention in terms of water management issues. We ran a monthly water balance simulation of nineteen scenarios for Lake Urmia from 2003 to 2007 by applying different combinations of data, including observed and remotely sensed water level, flow, evaporation, and rainfall. We used readily available water level data from the Hydrosat, Hydroweb, and DAHITI platforms, evapotranspiration from MODIS and rainfall from TRMM. The analysis suggests that using field data as the initial water level in the algorithm best reproduces the fluctuation of the Lake Urmia water level. The scenario that combines in-situ meteorological components is the closest match to the observed water level of Lake Urmia. Almost all scenarios showed good dynamics with the field water level, but we found that nine out of nineteen scenarios did not vary significantly in terms of dynamics. The results also reveal that, even without any field data, the proposed scenario, which consists entirely of remote sensing components, is capable of estimating water level fluctuation in a lake. The analysis also demonstrates the necessity of choosing proper data sources for water regulation and managerial decisions, and for understanding the temporal phenomenon, not only for Lake Urmia but also for other lakes in semi-arid regions.
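The scenario runs described above all reduce to the same monthly water-balance bookkeeping: the next water level is the previous one plus inflow and rainfall, minus evaporation. A minimal sketch of that update (illustrative variable names and units, not the authors' code):

```python
def simulate_lake_level(h0, inflow, rainfall, evaporation):
    # h0: initial water level (m); the three monthly series are
    # expressed as equivalent depths over the lake surface (m/month).
    levels = [h0]
    for q, p, e in zip(inflow, rainfall, evaporation):
        # monthly water balance: dh = inflow + rainfall - evaporation
        levels.append(levels[-1] + q + p - e)
    return levels

# One simulated year with constant terms: the level falls 0.01 m/month.
series = simulate_lake_level(10.0, [0.05] * 12, [0.02] * 12, [0.08] * 12)
```

Each of the nineteen scenarios then differs only in which data source (in-situ or remote sensing) supplies `h0` and the three series.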
  • Jylhä, Kirsti; Ruosteenoja, Kimmo; Räisänen, Jouni; Venäläinen, Ari; Tuomenvirta, Heikki; Ruokolainen, Leena; Saku, Seppo; Seitola, Teija (2010)
    Raportteja - Rapporter - Reports
  • Forsius, Martin; Posch, Maximilian; Holmberg, Maria; Vuorenmaa, Jussi; Kleemola, Sirpa; Augustaitis, Algirdas; Beudert, Burkhard; Bochenek, Witold; Clarke, Nicholas; de Wit, Heleen A.; Dirnböck, Thomas; Frey, Jane; Grandin, Ulf; Hakola, Hannele; Kobler, Johannes; Krám, Pavel; Lindroos, Antti-Jussi; Löfgren, Stefan; Pecka, Tomasz; Rönnback, Pernilla; Skotak, Krzysztof; Szpikowski, Józef; Ukonmaanaho, Liisa; Valinia, Salar; Váňa, Milan (Elsevier, 2021)
    Science of The Total Environment 753 (2021), 141791
    Anthropogenic emissions of nitrogen (N) and sulphur (S) compounds and their long-range transport have caused widespread negative impacts on different ecosystems. Critical loads (CLs) are deposition thresholds used to describe the sensitivity of ecosystems to atmospheric deposition. The CL methodology has been a key science-based tool for assessing the environmental consequences of air pollution. We computed CLs for eutrophication and acidification using a European long-term dataset of intensively studied forested ecosystem sites (n = 17) in northern and central Europe. The sites belong to the ICP IM and eLTER networks. The link between the site-specific calculations and time-series of CL exceedances and measured site data was evaluated using long-term measurements (1990–2017) for bulk deposition, throughfall and runoff water chemistry. Novel techniques for presenting exceedances of CLs and their temporal development were also developed. Concentrations and fluxes of sulphate, total inorganic nitrogen (TIN) and acidity in deposition substantially decreased at the sites. Decreases in S deposition resulted in statistically significant decreased concentrations and fluxes of sulphate in runoff and decreasing trends of TIN in runoff were more common than increasing trends. The temporal developments of the exceedance of the CLs indicated the more effective reductions of S deposition compared to N at the sites. There was a relation between calculated exceedance of the CLs and measured runoff water concentrations and fluxes, and most sites with higher CL exceedances showed larger decreases in both TIN and H+ concentrations and fluxes. Sites with higher cumulative exceedance of eutrophication CLs (averaged over 3 and 30 years) generally showed higher TIN concentrations in runoff. The results provided evidence on the link between CL exceedances and empirical impacts, increasing confidence in the methodology used for the European-scale CL calculations. 
The results also confirm that emission abatement actions are having their intended effects on CL exceedances and ecosystem impacts.
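The exceedance indicators described above reduce to thresholding deposition against the critical load and accumulating the excess over time. A minimal sketch (illustrative function names and unit conventions, not the European-scale CL calculation code):

```python
def cl_exceedance(deposition, critical_load):
    # Annual exceedance: deposition above the CL threshold counts,
    # deposition at or below it counts as zero (same units for both,
    # e.g. eq ha^-1 yr^-1).
    return max(deposition - critical_load, 0.0)

def cumulative_exceedance(depositions, critical_load, window):
    # Average exceedance over the last `window` years, analogous to
    # the 3- and 30-year cumulative indicators mentioned above.
    recent = depositions[-window:]
    return sum(cl_exceedance(d, critical_load) for d in recent) / window
```

Under declining deposition, the annual term drops to zero once deposition falls below the CL, while the cumulative indicator decays more slowly, which is why the two are reported separately.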
  • Rajakallio, Maria; Jyväsjärvi, Jussi; Muotka, Timo; Aroviita, Jukka (Blackwell, 2021)
    Journal of Applied Ecology 58: 7, 1523-1532
    1. Growing bioeconomy is increasing the pressure to clear-cut drained peatland forests. Yet, the cumulative effects of peatland drainage and clear-cutting on the biodiversity of recipient freshwater ecosystems are largely unknown. 2. We studied the isolated and combined effects of peatland drainage and clear-cutting on stream macroinvertebrate communities. We further explored whether the impact of these forestry-driven catchment alterations to benthic invertebrates is related to stream size. We quantified the impact on invertebrate biodiversity by comparing communities in forestry-impacted streams to expected communities modelled with a multi-taxon niche model. 3. The impact of clear-cutting of drained peatland forests exceeded the sum of the independent effects of drainage and clear-cutting, indicating a synergistic interaction between the two disturbances in small streams. Peatland drainage reduced benthic biodiversity in both small and large streams, whereas clear-cutting did the same only in small streams. Small headwater streams were more sensitive to forestry impacts than the larger downstream sites. 4. We found 11 taxa (out of 25 modelled) to respond to forestry disturbances. These taxa were mainly different from those previously reported as sensitive to forestry-driven alterations, indicating the context dependence of taxonomic responses to forestry. In contrast, most of the functional traits previously identified as responsive to agricultural sedimentation also responded to forestry pressures. In particular, taxa that live temporarily in hyporheic habitats, move by crawling, disperse actively in water, live longer than 1 year, use eggs as resistance form and obtain their food by scraping became less abundant than expected, particularly in streams impacted by both drainage and clear-cutting. 5. Synthesis and applications. Drained peatland forests in boreal areas are reaching maturity and will soon be harvested. 
Clear-cutting of these forests incurs multiple environmental hazards but previous studies have focused on terrestrial ecosystems. Our results show that the combined impacts of peatland drainage and clear-cutting may extend across ecosystem boundaries and cause significant biodiversity loss in recipient freshwater ecosystems. This information supports a paradigm shift in boreal forest management, whereby continuous-cover forestry based on partial harvest may provide the most sustainable approach to peatland forestry.
  • Lehtomaa, Jere (Helsingfors universitet, 2017)
    The incomplete global coverage of current emissions trading schemes has raised concerns about free-riding and carbon leakage. The EU ETS, the first and currently the biggest carbon market, is at the forefront of such fears. Carbon-based import tariffs have thereby been proposed to compensate domestic industries for the cost disadvantage against their rivals in non-regulating countries. This thesis uses an applied general equilibrium (AGE) model to assess the impacts of a hypothetical EU carbon tariff on the Finnish economy. The carbon content of imported goods is first estimated with an environmentally extended input-output analysis, and the tariff is levied according to the anticipated price of EU emission allowances. To examine the sensitivity of the results, five additional scenarios are then constructed by altering the key simulation parameters. The tariff is imposed on the most energy-intensive and trade-exposed industries in 2016 and simulated until 2030. The results suggest that carbon tariffs are detrimental to the Finnish economy. The negative outcome is determined by high material intensity and a growing dependence on imported materials throughout the industry sector. As a result, the tariff-induced increase in import prices adds up to a notable growth in total production costs. Moreover, the negative impact is most pronounced within the export-oriented heavy manufacturing sector that the tariff was designed to shelter in the first place. The few sectors that gain from the tariff were not directly subject to it, but benefit from the secondary impacts as the economy adapts to the shock. The findings imply that, due to the deep integration of global value chains, the case for protective tariffs, even if environmentally motivated, can rest on harmfully over-simplistic assumptions.
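The embodied-carbon step mentioned above (environmentally extended input-output analysis) computes total emission intensities m = e(I − A)⁻¹, where A is the technical coefficient matrix and e the direct emission intensities. A minimal sketch using the equivalent power series e + eA + eA² + … rather than a matrix inverse; this is an illustration of the technique, not the thesis's model:

```python
def embodied_intensities(A, e, iters=200):
    # Total (direct + indirect) emission intensities m solving
    # m = e + m A, i.e. m = e (I - A)^(-1), via fixed-point iteration.
    # A[i][j]: input from sector i per unit output of sector j;
    # e[j]: direct emissions per unit output of sector j.
    n = len(e)
    m = list(e)
    for _ in range(iters):
        m = [e[j] + sum(m[i] * A[i][j] for i in range(n)) for j in range(n)]
    return m

# Hypothetical two-sector economy: sector 2 buys from sector 1, so its
# total intensity exceeds its direct intensity.
m = embodied_intensities([[0.1, 0.2], [0.0, 0.1]], [1.0, 2.0])
```

The carbon content of an import is then the exporting sector's total intensity times the import volume, and the tariff follows from multiplying by the allowance price.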
  • Mäkinen, Ville; Oksanen, Juha; Sarjakoski, Tapani (Stichting AGILE, 2019)
    The digital elevation model (DEM) is an invaluable product in numerous geospatial applications from orthorectification of aerial photographs to hydrological modelling and advanced 3D visualisation. With the current aerial laser scanning methods, superior quality digital elevation models can be produced over land areas, but surfaces over water bodies are visually problematic, especially for streams in 3D. We present a method to generate smooth, monotonically decreasing elevation surfaces over water bodies in DEMs. The method requires the point cloud data and the polygons delineating the water bodies as input data. We show how DEM visualisations improve by applying the presented method.
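The key constraint in the method above is that a stream's water surface must never rise in the downstream direction. A one-dimensional sketch of just that monotonicity step, ordered from upstream to downstream (this omits the smoothing and the use of point clouds and water-body polygons described in the abstract):

```python
def monotone_stream_surface(elevations):
    # Clamp each centreline elevation to the minimum seen so far,
    # yielding a monotonically decreasing water surface.
    out = []
    current = float("inf")
    for z in elevations:
        current = min(current, z)
        out.append(current)
    return out

# Noisy laser-scanned water surface: the spurious rises are removed.
surface = monotone_stream_surface([5.0, 5.2, 4.8, 4.9])
```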
  • Grazhdankin, Evgeni (Helsingfors universitet, 2018)
    We have developed software for homology modelling by satisfaction of distance restraints using a MODELLER back-end. The protocols used extend the exploration of distance restraints and conformational space. We drive the models in an optimization cycle towards better structures as assessed by the metrics used: DOPE score, retrospective distance restraint realization and others. Hydrogen bond networks are optimized for their size and connectivity density. The performance of the method is evaluated for its ability to reconstruct GPCR structures and extracellular loop 2. The software is written in object-oriented Python (v.2.7) and supports easy extension with additional modules. We built a relational PostgreSQL database for the restraints to allow for data-driven machine and deep learning applications. An important part of the work was the visualization of the distance restraints with custom PyMOL scripts for three-dimensional viewing. Additionally, we automatically generate a plethora of diagnostic plots for assessing the performance of the modelling protocols. The software utilizes parallelism and is computationally practical, with compute requirements an order of magnitude lower than those typically seen in molecular dynamics simulations. The main challenges left to be solved are the evaluation of restraint goodness, the assignment of secondary structures, restraint interconditioning, and water and ligand placement.
  • Ylä-Mella, Lotta (Helsingin yliopisto, 2020)
    Terrestrial cosmogenic nuclides can be used to date glacial events. The nuclides are formed when cosmic rays interact with atoms in rocks. When the surface is exposed to the rays, the number of produced nuclides increases. Shielding, like glaciation, can prevent production. Nuclide concentration decreases with depth because the bedrock attenuates the rays. The northern hemisphere has experienced several glaciations, but typically only the latest one can be directly observed. The aim of the study was to determine if these nuclides, produced by cosmic rays, can be used to detect glaciations before the previous one by using a forward and an inverse model. The forward model predicted the nuclide concentration with depth based on a glacial history. The longer the exposure duration, the higher the number of nuclides in the rock. In the model, it was possible to use three isotopes: Be-10, C-14 and Al-26. The forward model was used to produce synthetic samples, which were then used in the inverse model. The purpose of the inverse model was to test which kinds of glacial histories produce nuclide concentrations similar to those of the sample. The inverse model produced a concentration curve which was compared with the concentration of the samples. The misfit of the inverse solution was defined with an “acceptance box”. The box was formed from the thickness of the sample and the corresponding concentrations. If the curve intersected with the box, the solution was accepted. Small misfit values were gained if the curve was close to the sample. The idea was to find concentration curves with values as similar to those of the samples as possible. The inverse model was used in several situations, where the number of limitations was varied. If the timing of the last deglaciation and the amount of erosion were known, the second last deglaciation was found relatively well.
With looser constraints, it was nearly impossible to detect the past glaciations unless a depth profile was used in the sampling. The depth profile provided a tool to estimate the amount of erosion and the total exposure duration using only one isotope.
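The forward model's core relationship combines production decaying exponentially with depth and radioactive decay over time: C(z, t) = P(z)/λ · (1 − e^(−λt)) for simple continuous exposure with no erosion. A minimal sketch; the rock density, attenuation length and Be-10 half-life below are typical textbook values assumed for illustration, not the thesis's parameters:

```python
import math

def nuclide_concentration(depth_cm, exposure_yr, P0, half_life_yr,
                          rho=2.7, Lambda=160.0):
    # Cosmogenic nuclide concentration after simple exposure.
    # Production decays exponentially with mass depth (rho * z in
    # g/cm^2, attenuation length Lambda); radioactive decay uses
    # lam = ln 2 / half-life.
    lam = math.log(2) / half_life_yr
    production = P0 * math.exp(-rho * depth_cm / Lambda)
    return production / lam * (1.0 - math.exp(-lam * exposure_yr))
```

A depth profile of such concentrations, measured for one isotope, is what lets the inverse model separate total exposure duration from erosion, as the abstract notes.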
  • Hosseinzadeh, Mahboubeh Sadat; Farhadi Qomi, Masood; Naimi, Babak; Roedder, Dennis; Kazemi, Seyed Mandi (2018)
    Species distribution models estimate the relationship between species occurrences and environmental and spatial characteristics. Herein, we used maximum entropy distribution modelling (MaxEnt) for predicting the potential distribution of the Plateau Snake Skink Ophiomorus nuchalis on the Iranian Plateau, using a small number of occurrence records (i.e. 10) and environmental variables derived from remote sensing. The MaxEnt model had a high success rate according to test AUC scores (0.912). A remotely sensed enhanced vegetation index (39.1%), and precipitation of the driest month (15.4%) were the most important environmental variables that explained the geographical distribution of O. nuchalis. Our results are congruent with previous studies suggesting that suitable habitat of O. nuchalis is limited to the central Iranian Plateau, although mountain ranges in western and eastern Iran might be environmentally suitable but not accessible.
  • JET Contributors; Eriksson, F.; Fransson, E.; Oberparleiter, M.; Nordman, H.; Strand, P.; Salmi, A.; Tala, T.; Ahlgren, T. (2019)
    Transport modelling of Joint European Torus (JET) dimensionless collisionality scaling experiments in various operational scenarios is presented. Interpretative simulations at a fixed radial position are combined with predictive JETTO simulations of temperatures and densities, using the TGLF transport model. The model includes electromagnetic effects and collisions as well as E × B shear in Miller geometry. Focus is on particle transport and the role of the neutral beam injection (NBI) particle source for the density peaking. The experimental 3-point collisionality scans include L-mode and H-mode (D and H and higher beta D plasma) plasmas in a total of 12 discharges. Experimental results presented in (Tala et al 2017 44th EPS Conf.) indicate that for the H-mode scans, the NBI particle source plays an important role for the density peaking, whereas for the L-mode scan, the influence of the particle source is small. In general, both the interpretative and predictive transport simulations support the experimental conclusions on the role of the NBI particle source for the 12 JET discharges.
  • Tanhuanpää, Topi (Helsingfors universitet, 2011)
    There is an ever-growing interest in coarse woody debris (CWD). This is because of its role in maintaining biodiversity and storing atmospheric carbon. The aim of this study was to create a model utilizing ALS data for mapping CWD and estimating its volume. The effect of grid cell size on the model's performance was also considered. The study area is located in Sonkajärvi in eastern Finland and it consisted mostly of young commercially managed forests. The study utilized low-frequency ALS-data and a precise strip-wise field inventory of CWD. The data was divided into two parts: one fourth of the data was used for modeling and the remaining three fourths for validating the models that were constructed. Both parametric and non-parametric modelling practices were used for modelling the area's CWD. Logistic regression was used to predict the probability of encountering CWD in grid cells of different sizes (0.04, 0.20, 0.32, 0.52 and 1.00 ha). The explanatory variables were chosen from among 80 ALS-based variables and their transformations in three stages. Firstly, the variables were plotted against CWD volumes. Secondly, the best variables from the first stage were examined in single-variable models. Thirdly, variables for the final multivariable model were chosen using a 95 % level of significance. The 0.20 ha model was parametrized to other grid cell sizes. In addition to the parametric model constructed with logistic regression, 0.04 ha and 1.0 ha grid cells were also classified with CART-modelling (Classification and Regression Trees). With CART-modelling, non-linear dependencies were sought between ALS-variables and CWD. CART-models were constructed for both CWD existence and volume. When the existence of CWD in the study grid cells was considered, CART-modelling resulted in better classification than logistic regression.
With the logistic model, the goodness of classification improved as the grid cell size grew from 0.04 ha (kappa 0.19) to 0.32 ha (kappa 0.38). At the 0.52 ha cell size, the kappa value of the classification started to diminish (kappa 0.32) and diminished further at the 1.0 ha cell size (kappa 0.26). The CART classification improved as the cell size grew larger. The results of CART-modelling were better than those of the logistic model at both the 0.04 ha (kappa 0.24) and 1.0 ha (kappa 0.52) cell sizes. The relative RMSE of the cellwise CWD volume predicted with CART-models diminished as the cell size was enlarged. At the 0.04 ha grid cell size the RMSE of the total CWD volume of the study area was 197.1 % and it diminished to 120.3 % as the grid cell size was enlarged to 1.0 ha. On the grounds of the results of this study, it can be stated that the link between CWD and ALS-variables is weak but becomes slightly stronger when cell size increases. However, when cell size increases, small-scale variation of CWD becomes more difficult to spot. In this study, the existence of CWD could be estimated somewhat accurately, but the mapping of small-scale patterns was not successful with the methods that were used. Accurate locating of small-scale CWD variation requires further research, particularly on the use of high-density ALS data in CWD inventories.
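The kappa values quoted above measure classification agreement beyond chance and can be computed from a 2×2 confusion matrix. A minimal sketch (not the study's code):

```python
def cohens_kappa(confusion):
    # Cohen's kappa for a 2x2 confusion matrix laid out as
    # [[true_neg, false_pos], [false_neg, true_pos]].
    (tn, fp), (fn, tp) = confusion
    n = tn + fp + fn + tp
    p_obs = (tn + tp) / n
    # chance agreement from the marginal (row and column) frequencies
    p_chance = ((tn + fp) * (tn + fn) + (fn + tp) * (fp + tp)) / n ** 2
    return (p_obs - p_chance) / (1 - p_chance)
```

Kappa is 1 for perfect agreement and 0 when the classifier does no better than the marginal frequencies would by chance, which is why it is preferred over raw accuracy for comparing the cell sizes here.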
  • Kallio, Varpu (Helsingfors universitet, 2014)
    The purpose of this study is to evaluate patients' quality of life and healthcare use before and after bariatric surgery and produce new, clinical data-based information on the cost-effectiveness of bariatric surgery. Healthcare resources are limited and expenditures have grown from year to year. Therefore it is important to make cost-effectiveness evaluations so that financial resources could be allocated properly. The research population consists of patients who have undergone gastric bypass or sleeve gastrectomy in the Hospital District of Helsinki and Uusimaa, during the years 2007-2009. The study population consists of 147 gastric bypass patients and 79 sleeve gastrectomy patients. In this study the decision analytic model, used in the Finohta study "Sairaalloisen lihavuuden leikkaushoito" was updated using actual, up-to-date information. The analysis was done using a decision tree and a Markov model with a time horizon of 10 years. The cost data in this study was based on actual data for the first two years after surgery. A forecast model was used to predict the costs for the years 3-10 after surgery. Patients' quality of life scores were based on real data for the years 1 (the year of operation) to 4. Quality of life scores for the other years were predicted. In the literature review section, international studies on the cost-effectiveness of bariatric surgery and its impacts on drug therapy were evaluated. The studies showed that the use of medicines, which were used to treat obesity-related diseases were lower in the surgery group. However, drugs used to treat vitamin deficiencies, depression and gastrointestinal diseases were higher in the surgery group. Most studies found that surgery is the most cost-effective way to treat morbid obesity. This study confirms the role of the bariatric surgery in the treatment of morbid obesity in Finland. 
Even though the healthcare costs were increased in the first two years after the operation, the conclusions of the Finohta study didn't change. Bariatric surgery is cheaper and more effective than ordinary treatment and is the most cost-effective way to treat morbid obesity. The mean costs were 30 309 € for the gastric bypass, 31 838 € for the sleeve gastrectomy and 36 482 € for ordinary treatment. The mean numbers of quality-adjusted life-years were 6.919 for the gastric bypass, 6.920 for the sleeve gastrectomy and 6.661 for ordinary treatment. However, more information is needed about the long-term effects, benefits and risks of the surgery. How much money the surgery will actually save will hopefully be clarified in the long-term follow-up study, which should also include an actual control group.
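The comparison above reduces to an incremental cost-effectiveness ratio (ICER) over the reported mean costs and QALYs. A small worked check using the figures quoted in the abstract; a negative ICER here means surgery is both cheaper and more effective, i.e. it dominates ordinary treatment:

```python
def icer(cost_a, qaly_a, cost_b, qaly_b):
    # Incremental cost-effectiveness ratio of option A over option B:
    # extra cost per extra quality-adjusted life-year gained.
    return (cost_a - cost_b) / (qaly_a - qaly_b)

# Means reported in the abstract (10-year horizon):
bypass_vs_ordinary = icer(30309, 6.919, 36482, 6.661)
sleeve_vs_ordinary = icer(31838, 6.920, 36482, 6.661)
```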
  • Laakkonen, Antti (Helsingin yliopisto, 2020)
    Understanding soil respiration behaviour in different environments is currently one of the most crucial research questions in the environmental sciences, since it is a major component of the carbon cycle. It can be divided into many source components, these being litter decomposition, soil organic matter, root respiration and respiration in the rhizosphere. Many biotic and abiotic factors control soil respiration through complicated relationship networks. Strong controlling factors include soil temperature, soil moisture, substrate supply and quality, soil nitrogen content, soil acidity and soil texture. As these relationships are biome-specific, they must be understood in order to produce more accurate assessments worldwide. In this study, annual soil respiration rates and their controlling factors were investigated globally in unmanaged and natural mature forest biomes. Observed values were extracted from the Soil respiration database (SRDB) v.5, and it was complemented with spatially and temporally linked data from remotely sensed and modelled databases to produce variables for forest productivity, meteorological conditions and soil properties. Furthermore, empirical soil respiration models and machine learning algorithms, as well as previous estimates, were compared to each other. Locally, monthly manual soil respiration measurements from a boreal forest site in Hyytiälä, Finland from the years 2010-2011, with environmental, soil temperature and soil water conditions, were investigated to identify seasonal differences in the controlling factors of the soil respiration rate. Soil respiration controls were found to differ between biomes. Furthermore, the Artificial Neural Network algorithm used was observed to outperform empirical models and previous estimates when biome-specific modelling was implemented with the continental division. Artificial neural networks and other algorithms could produce more accurate estimates globally.
Locally, soil respiration rates were observed to differ seasonally: soil temperature control was stronger during the growing season, whereas when snow depth exceeded 30 cm, soil water conditions controlled soil respiration strongly.
  • Kukkonen, Tommi (Helsingin yliopisto, 2020)
    The Arctic is warming at an increasing pace, and this can affect ecosystems, infrastructure and communities. By studying periglacial landforms and processes, and using improved methods, more knowledge on these changing environmental conditions and their impacts can be obtained. The aim of this thesis is to map the studied landforms and predict their probability of occurrence in the circumpolar region utilizing different modelling methods. Periglacial environments occur in high latitudes and other cold regions. These environments host permafrost, which is frozen ground that responds sensitively to climate warming and underlies areas that host many landform types. Therefore, landform monitoring and modelling in permafrost regions under a changing climate can provide information about the ongoing changes in the Arctic and landform distributions. Here four landform/process types were mapped and studied: patterned ground, pingos, thermokarst activity and solifluction. The study consisted of 10 study areas across the circumpolar Arctic that were mapped for their landforms. The study utilized GLM, GAM and GBM analyses in determining landform occurrences in the Arctic based on environmental variables. Model calibration utilized a logit link function, and evaluation used the explained deviance. Data were sampled into evaluation and calibration sets to assess prediction abilities. The predictive accuracy of the models was assessed using ROC/AUC values. Thermokarst activity proved to be most abundant in the studied areas, whereas solifluction activity was most scarce. Pingos were discovered evenly throughout the studied areas, and patterned ground activity was absent in some areas but rich in others. Climate variables and mean annual ground temperature had the biggest influence in explaining landform occurrence throughout the circumpolar region. GBM proved to be the most accurate and had the best predictive performance.
The results show that mapping and modelling in mesoscale is possible, and in the future, similar studies could be utilized in monitoring efforts regarding global change and in studying environmental and periglacial landform/process interactions.
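The two evaluation ingredients named above, a logit link for occurrence probability and ROC/AUC for predictive accuracy, can be sketched compactly; this is an illustration of those formulas, not the study's actual modelling workflow:

```python
import math

def predict_logit(coefs, intercept, x):
    # Occurrence probability from a fitted logit-link model:
    # p = 1 / (1 + exp(-(b0 + b . x))).
    eta = intercept + sum(b * v for b, v in zip(coefs, x))
    return 1.0 / (1.0 + math.exp(-eta))

def auc(scores, labels):
    # ROC AUC as the probability that a randomly chosen presence
    # scores higher than a randomly chosen absence (ties count half).
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))
```

An AUC of 0.5 indicates no discrimination and 1.0 perfect discrimination, which is the scale on which GLM, GAM and GBM were compared.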
  • Airola, Sofia (Helsingfors universitet, 2014)
    The subject of this thesis was to evaluate the capability of the FEMMA model to simulate the daily nitrogen load from a forested catchment. For that purpose, FEMMA was tested on a forest plot in Hyytiälä, Juupajoki. The modelling results for the concentrations of ammonium, nitrate and dissolved organic nitrogen in the runoff water were compared to the measured values. This work presents the current state of knowledge concerning the most significant nitrogen processes in forest soil, as reported in the literature. It also lists some alternative models for simulating nitrogen and critically evaluates the uncertainties in the modelling. As a result, FEMMA was found not to be suitable for simulating the daily nitrogen load from this catchment: the simulated results didn’t correspond to the measured values. The most significant factors to develop in FEMMA found in this study were the parametrization of the gaseous nitrogen losses from the system, re-examining the nitrogen uptake by plants and developing the computation of the fractions of nitrogen released in decomposition. For future research it would be important to decide whether it is meaningful to simulate daily nitrogen leaching with process-based models at all. At least at the Hyytiälä site, the amount of leached nitrogen is so small compared to the nitrogen in other processes that it’s quite challenging to simulate it accurately enough.
  • Lehtiniemi, Heidi (Helsingin yliopisto, 2020)
    Computing complex phenomena into models that provide information on causalities and future scenarios is a very topical way to present scientific information. Many claim models to be the best available tool to provide decision making with information about near-future scenarios and the action needed (Meah, 2019; Schirpke et al., 2020). This thesis studies global climate models based on objective data in comparison to local ecosystem services models combining ecological and societal data; together they offer an extensive overview of modern environmental modelling. In addition to modelling, the science-policy boundary is important when analyzing the societal usefulness of models. Useful and societally relevant modelling is analyzed with an integrative literature review (Whittemore & Knafl, 2005) on the topics of climate change, ecosystem services, modelling and the science-policy boundary, n=58. Literature from various disciplines and viewpoints is included in the material. Since the aim is to create a comprehensive understanding of the multidisciplinary phenomenon of modelling, the focus is not on its technical aspects. Based on the literature, types of uncertainty in models and strategies to manage them are identified (e.g. van der Sluijs, 2005). Characteristics of useful models and other forms of scientific information are recognized (e.g. Saltelli et al., 2020). Usefulness can be achieved when models are fit for purpose, accessible and solution-oriented, and sufficient interaction and trust are established between the model users and developers. Climate change and ecosystem services are analyzed as case studies throughout the thesis. The relationship of science and policy is especially important when solving the sustainability crisis. Because modelling is a boundary object (Duncan et al., 2020), the role of boundary work in managing and communicating the uncertainties and ensuring the usefulness of models is at the center of the analysis.
  • Miettinen, Topi (2001)
    In the thesis, I will present a formal game theoretic model on the determination of a fair social contract introduced by Binmore (1994, 1998). Binmore considers a social contract as an implicit contract that determines the rights and duties of contracting individuals. Binmore's construction is naturalistic and ethically relativistic. The driving forces are biological and social evolution. Morality is seen as an equilibrium selection mechanism to coordinate among the multiple equilibria available. Binmore wants to construct a synthesis of the theories of Rawls (1971) and Harsanyi (1977). He picks up an idea familiar to economists from the theory of the firm and presents three time intervals. In the short run, all the decisions of importance are made. In the medium run, social evolution alters the fair social contract. In the long run, genetic codes adapt to the prevailing shorter-run circumstances. In the short run, players are playing two games simultaneously: the game of life, where players' strategy choices are only restricted by physical, natural and biological constraints, and the game of morals, where side-stepping from the fair social contract strategies launches punishments on one hand and a negotiation process on the other. The negotiation process takes place behind the veil of ignorance familiar from the theories of Rawls and Harsanyi. Binmore uses Bayesian decision theory in maximizing empathetic preferences that are identical to the extended preferences of Harsanyi. This approach leads Harsanyi to utilitarianism. Binmore, however, sticks to a non-commitment approach in the negotiation process. By these means he ends up with a maximin conclusion familiar from Rawls, which Rawls thought to require abandoning Bayesian decision theory. In the medium run, the weights of empathetic preferences adapt and finally settle into an evolutionarily stable equilibrium. The solutions of the Rawlsian and utilitarian approaches coincide.
Finally long-run approach presents a theory why the market system has evolved. As far as the scope is restricted to division of market goods, the fair social contract coincides in the long run with the walrasian equilibrium. We proceed by first discussing, how moral and ethical theories fit to traditional economics. We try to shed light on some issues of dispute in economics that are essential for the theory. We will then present the essential tools of game theory necessary for the understanding of the ideas. We will shortly present theories of Rawls and Harsanyi. After presenting the predecessors, we will tackle Binmores theory. First the short and medium run processes are presented. Secondly, we dive into the deep waters of genetic adaptation of long run treatment. Finally, we will present critiques and further ideas.
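The contrast between Harsanyi's utilitarian rule and the maximin rule Binmore recovers can be illustrated with a toy decision problem (all contracts and payoffs here are invented for illustration; this sketches only the two decision rules, not Binmore's full model):

```python
# Three hypothetical social contracts, each allocating utility to
# players A and B (all payoff numbers invented for illustration).
contracts = {"C1": (10, 2), "C2": (6, 5), "C3": (8, 3)}

# Utilitarian rule (Harsanyi): choose the contract with the largest total utility.
utilitarian = max(contracts, key=lambda c: sum(contracts[c]))

# Maximin rule (Rawls): choose the contract that maximizes the worst-off player's utility.
maximin = max(contracts, key=lambda c: min(contracts[c]))

print(utilitarian)  # C1: total 12 beats totals of 11 and 11
print(maximin)      # C2: the worst-off player gets 5, versus 2 and 3
```

The two rules select different contracts here, which is precisely the divergence between Harsanyi's and Rawls's recommendations that Binmore's evolutionary argument is meant to reconcile.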
  • Mireles-Flores, Luis (2018)
    This essay is a review of the recent literature on the methodology of economics, with a focus on three broad trends that have defined the core lines of research within the discipline during the last two decades. These trends are: (a) the philosophical analysis of economic modelling and economic explanation; (b) the epistemology of causal inference, evidence diversity, and evidence-based policy; and (c) the investigation of the methodological underpinnings and public policy implications of behavioural economics. The resulting survey is inevitably not exhaustive, yet it aims to offer a fair taste of some of the most representative questions in the field, on which many philosophers, methodologists, and social scientists have recently been expending a great deal of intellectual effort. The topics and references compiled in this review should serve at least as safe introductions to some of the central research questions in the philosophy and methodology of economics.
  • Nystedt, Ari (Helsingin yliopisto, 2019)
    Modern, intensive silviculture has negatively affected grouse species, the main reasons being changes in ground vegetation and the decreasing proportion of bilberry. Key features of grouse habitats are variation in forest cover and the shelter provided by the understorey. In managed forests, such variation can be increased by retaining thickets: uncleared patches containing trees of various sizes, ranging from a couple of trees to approximately two ares. To favour grouse through game-friendly forest management, information about habitats is required both at the stand level and across the wider area. Observations of grouse at the forest site, together with information on capercaillie lekking sites, willow grouse habitats, and wintering areas, have proved useful. Information on grouse densities and population fluctuations has been gathered via wildlife triangle counts, and guidebooks on game husbandry describe grouse habitats and thicket characteristics. The aim of this study was to investigate whether suitable thickets and grouse habitats can be modelled from open GIS (Geographical Information Systems) data. The grouse species examined in the modelling were the capercaillie, black grouse, and hazel grouse. A weighted overlay analysis was performed with ArcMap software. Suitable thickets and habitats were examined both across the whole research area and within suitable forest compartments. Based on the results of the analysis, theme maps were produced to represent the research area's suitability for thickets and grouse habitats. The material needed for the thicket analysis was collected, and the GIS analyses carried out, in a research area in Hausjärvi, Tavastia Proper. For the study, 12 one-hectare squares were established, and a total of 45 suitable thicket areas were charted by field inventory. After the field inventory and GIS analyses, the results were compared.
Key figures recorded for the thickets were their number, their areas, the distance to the nearest thicket, and the corresponding averages and standard deviations. Statistical methods (one-way ANOVA and Kruskal-Wallis tests) were applied to examine possible statistically significant differences in thicket areas and in distances to the nearest thicket. The tree characteristics of grouse habitats were examined using an up-to-date forest management plan, covering 17 suitable compartments with a total area of 42.6 hectares. In the field inventory, the average number of thickets found per research square was 3.8, compared with 1.4 in the modelling. The average thicket area was 76.9 m² in the field inventory and 252 m² in the modelling, and the average distance between thickets was 12.6 metres in the field inventory and 24.8 metres in the modelling. Thickets covered approximately 2.9 percent of the research squares' total area in the field inventory and 3.6 percent in the modelling. According to the statistical analyses, the inventory method had a statistically significant effect on both the total thicket area and the distance to the nearest thicket. According to the modelling and the forest management plan, capercaillie habitats were located in mature pine stands, black grouse habitats in spruce-dominated young stands, and hazel grouse habitats in areas with a high proportion of broad-leaved trees, visible in the ecotones between forest and field. Common to the habitats of all three species were a small surface area and a mosaic-like structure. In conclusion, thickets and grouse habitats can be modelled with open GIS data, but the modelling requires knowledge of the characteristics of the thickets and of the species examined. With weighted overlay, thickets were not found in areas where canopy density and spruce volume were naturally low. Further research, for example with trail cameras, is needed to verify thicket occupancy, and the ecological impacts of retaining thickets in the research area require evaluation.
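The weighted overlay step described in the abstract can be sketched with NumPy (the layer names, weights, raster values, and threshold below are purely illustrative, not those used in the thesis; the study itself used ArcMap's Weighted Overlay tool):

```python
import numpy as np

# Toy 3x3 raster layers, each already reclassified to suitability
# scores 1-9, the input format a weighted overlay expects.
# Layer choices and all values are hypothetical.
canopy_density = np.array([[9, 7, 3],
                           [8, 6, 2],
                           [7, 5, 1]])
spruce_volume = np.array([[8, 6, 2],
                          [7, 5, 1],
                          [6, 4, 1]])
broadleaf_share = np.array([[3, 5, 7],
                            [4, 6, 8],
                            [5, 7, 9]])

layers = [canopy_density, spruce_volume, broadleaf_share]
weights = [0.5, 0.3, 0.2]  # illustrative weights; must sum to 1.0
assert abs(sum(weights) - 1.0) < 1e-9

# Weighted overlay: per-cell weighted sum of the reclassified layers.
suitability = sum(w * layer for w, layer in zip(weights, layers))

# Cells above a chosen threshold become candidate thicket locations.
candidates = suitability >= 6.0
```

This also makes the abstract's closing observation concrete: cells where canopy density and spruce volume score low (such as the bottom-right corner above) receive low weighted sums and are never selected, regardless of the remaining layer.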