Browsing by Subject "modelling"


Now showing items 1-20 of 48
  • Lehtinen, Julius (Helsingin yliopisto, 2022)
    Sortition – selecting representatives by drawing lots – has played a significant part in the history and development of democracy. In modern representative variants of democracy, however, the custom of sortition has fallen from grace and largely vanished from the main stage. But how well would it work in modern parliaments compared to the current practice of electing representatives? The thesis simulates the main functions of a plural parliament both without randomised representatives and with randomised representatives present to different extents, and ultimately contrasts the two to answer this research question. The method is to construct a deliberately simple model of a plural parliament and its functions, generating a metric of the hypothetical quality and volume of the legislation the parliament produces. The model is run and re-run thousands of times as a Monte Carlo simulation with a few randomised and many fixed variables, producing an overall dataset of simulations with different fractions of randomised legislators in the parliament. This dataset is then subjected to an analysis of variance (ANOVA) to determine the likelihood that different fractions of independent legislators produce legislation of different quantity and quality on average. The ANOVA indicates that the quantity and quality of legislation produced by different fractions of independent legislators is very probably not equal on average. The quality and quantity of legislation therefore appears to depend on the fraction of randomised legislators in a plural parliament, and both are higher on average in plural parliaments with a moderate number of randomised legislators than in a parliament without them. The thesis concludes that, as quality and quantity are on average higher with a moderate number of randomised legislators and as the true values are very probably not equal across the simulations, legislation is better in quality and quantity with a moderate amount of sortition, i.e. randomised legislators present in a plural parliament. The thesis closes with a brief discussion of how the conditions of the model could be enabled in real life and of the best ways to achieve the results the model points towards.
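    The abstract leaves the model's internals open; purely as an illustration of the workflow it describes (re-running a parliament model for different fractions of randomised legislators and testing the group means with a one-way ANOVA), a minimal Python sketch could look as follows. The quality metric, its parameters and the assumed effect of sortition are hypothetical placeholders, not the model used in the thesis.

```python
# Minimal sketch of the simulate-and-test workflow described above.
# The quality metric, its parameters and the effect of sortition are
# hypothetical placeholders, not the model used in the thesis.
import numpy as np
from scipy.stats import f_oneway

rng = np.random.default_rng(42)

def simulate_quality(frac_random, n_seats=200, n_runs=1000):
    """Return hypothetical per-run legislation-quality scores for one parliament mix."""
    n_random = int(frac_random * n_seats)
    scores = []
    for _ in range(n_runs):
        elected = rng.normal(0.50, 0.10, n_seats - n_random)   # party-bound members (placeholder)
        randomised = rng.normal(0.55, 0.20, n_random)          # sortition members (placeholder)
        scores.append(np.concatenate([elected, randomised]).mean())
    return np.array(scores)

fractions = [0.0, 0.1, 0.3, 0.5]
groups = [simulate_quality(f) for f in fractions]

# One-way ANOVA: is the mean quality equal across the different fractions?
f_stat, p_value = f_oneway(*groups)
print(f"F = {f_stat:.2f}, p = {p_value:.3g}")
```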
  • Bhattacharjee, Joy; Rabbil, Mehedi; Fazel, Nasim; Darabi, Hamid; Choubin, Bahram; Khan, Md. Motiur Rahman; Marttila, Hannu; Haghighi, Ali Torabi (Elsevier, 2021)
    Science of the Total Environment 797 (2021), 149034
    Lake water level fluctuation is a function of hydro-meteorological components, namely the inputs to and outputs from the system. The combination of these components from in-situ and remote sensing sources has been used in this study to define multiple scenarios, which are the major explanatory pathways to assess lake water levels. The goal is to analyze each scenario through the application of the water balance equation to simulate lake water levels. Lake Urmia, the largest lake in Iran, was selected for this study as it requires a great deal of attention in terms of water management. We ran a monthly water balance simulation of nineteen scenarios for Lake Urmia from 2003 to 2007 by applying different combinations of data, including observed and remotely sensed water level, flow, evaporation, and rainfall. We used readily available water level data from the Hydrosat, Hydroweb and DAHITI platforms, evapotranspiration from MODIS, and rainfall from TRMM. The analysis suggests that using field data as the initial water level in the algorithm reproduces the fluctuation of the Lake Urmia water level best. The scenario that combines in-situ meteorological components is the closest match to the observed water level of Lake Urmia. Almost all scenarios showed good dynamics with respect to the field water level, but we found that nine out of nineteen scenarios did not vary significantly in terms of dynamics. The results also reveal that, even without any field data, the proposed scenario consisting entirely of remote sensing components is capable of estimating water level fluctuation in a lake. The analysis also shows the necessity of using proper data sources for water regulation and management decisions, and for understanding the temporal behaviour not only of Lake Urmia but also of other lakes in semi-arid regions.
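    The scenario analysis rests on a monthly water balance of the kind sketched below; the variable names, units and the constant lake area are illustrative simplifications, not the Lake Urmia configuration used in the study.

```python
# Generic monthly lake water balance sketch (hypothetical inputs;
# not the Lake Urmia configuration used in the study).
def simulate_levels(level0, inflow, rainfall, evaporation, area):
    """level0 in m; monthly inflow in m^3; rainfall and evaporation in m;
    area in m^2, treated here as constant for simplicity."""
    levels = [level0]
    for q_in, p, e in zip(inflow, rainfall, evaporation):
        d_level = q_in / area + p - e   # monthly storage change expressed as depth
        levels.append(levels[-1] + d_level)
    return levels

# Example: three months of made-up data
print(simulate_levels(1274.0,
                      inflow=[2.0e8, 1.5e8, 0.8e8],
                      rainfall=[0.03, 0.02, 0.01],
                      evaporation=[0.05, 0.09, 0.12],
                      area=5.0e9))
```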
  • Jylhä, Kirsti; Ruosteenoja, Kimmo; Räisänen, Jouni; Venäläinen, Ari; Tuomenvirta, Heikki; Ruokolainen, Leena; Saku, Seppo; Seitola, Teija (2010)
    Raportteja - Rapporter - Reports
  • Forsius, Martin; Posch, Maximilian; Holmberg, Maria; Vuorenmaa, Jussi; Kleemola, Sirpa; Augustaitis, Algirdas; Beudert, Burkhard; Bochenek, Witold; Clarke, Nicholas; de Wit, Heleen A.; Dirnböck, Thomas; Frey, Jane; Grandin, Ulf; Hakola, Hannele; Kobler, Johannes; Krám, Pavel; Lindroos, Antti-Jussi; Löfgren, Stefan; Pecka, Tomasz; Rönnback, Pernilla; Skotak, Krzysztof; Szpikowski, Józef; Ukonmaanaho, Liisa; Valinia, Salar; Váňa, Milan (Elsevier, 2021)
    Science of The Total Environment 753 (2021), 141791
    Anthropogenic emissions of nitrogen (N) and sulphur (S) compounds and their long-range transport have caused widespread negative impacts on different ecosystems. Critical loads (CLs) are deposition thresholds used to describe the sensitivity of ecosystems to atmospheric deposition. The CL methodology has been a key science-based tool for assessing the environmental consequences of air pollution. We computed CLs for eutrophication and acidification using a European long-term dataset of intensively studied forested ecosystem sites (n = 17) in northern and central Europe. The sites belong to the ICP IM and eLTER networks. The link between the site-specific calculations and time-series of CL exceedances and measured site data was evaluated using long-term measurements (1990–2017) for bulk deposition, throughfall and runoff water chemistry. Novel techniques for presenting exceedances of CLs and their temporal development were also developed. Concentrations and fluxes of sulphate, total inorganic nitrogen (TIN) and acidity in deposition decreased substantially at the sites. Decreases in S deposition resulted in statistically significant decreases in sulphate concentrations and fluxes in runoff, and decreasing trends of TIN in runoff were more common than increasing trends. The temporal development of CL exceedances indicated that reductions of S deposition have been more effective than those of N at the sites. There was a relation between calculated CL exceedances and measured runoff water concentrations and fluxes, and most sites with higher CL exceedances showed larger decreases in both TIN and H+ concentrations and fluxes. Sites with higher cumulative exceedance of eutrophication CLs (averaged over 3 and 30 years) generally showed higher TIN concentrations in runoff. The results provided evidence of the link between CL exceedances and empirical impacts, increasing confidence in the methodology used for the European-scale CL calculations. The results also confirm that emission abatement actions are having their intended effects on CL exceedances and ecosystem impacts.
  • Rajakallio, Maria; Jyväsjärvi, Jussi; Muotka, Timo; Aroviita, Jukka (Blackwell, 2021)
    Journal of Applied Ecology 58: 7, 1523-1532
    1. The growing bioeconomy is increasing the pressure to clear-cut drained peatland forests. Yet the cumulative effects of peatland drainage and clear-cutting on the biodiversity of recipient freshwater ecosystems are largely unknown. 2. We studied the isolated and combined effects of peatland drainage and clear-cutting on stream macroinvertebrate communities. We further explored whether the impact of these forestry-driven catchment alterations on benthic invertebrates is related to stream size. We quantified the impact on invertebrate biodiversity by comparing communities in forestry-impacted streams to expected communities modelled with a multi-taxon niche model. 3. The impact of clear-cutting of drained peatland forests exceeded the sum of the independent effects of drainage and clear-cutting, indicating a synergistic interaction between the two disturbances in small streams. Peatland drainage reduced benthic biodiversity in both small and large streams, whereas clear-cutting did so only in small streams. Small headwater streams were more sensitive to forestry impacts than the larger downstream sites. 4. We found 11 taxa (out of 25 modelled) to respond to forestry disturbances. These taxa were mainly different from those previously reported as sensitive to forestry-driven alterations, indicating the context dependence of taxonomic responses to forestry. In contrast, most of the functional traits previously identified as responsive to agricultural sedimentation also responded to forestry pressures. In particular, taxa that live temporarily in hyporheic habitats, move by crawling, disperse actively in water, live longer than 1 year, use eggs as a resistance form and obtain their food by scraping became less abundant than expected, particularly in streams impacted by both drainage and clear-cutting. 5. Synthesis and applications. Drained peatland forests in boreal areas are reaching maturity and will soon be harvested. Clear-cutting of these forests incurs multiple environmental hazards, but previous studies have focused on terrestrial ecosystems. Our results show that the combined impacts of peatland drainage and clear-cutting may extend across ecosystem boundaries and cause significant biodiversity loss in recipient freshwater ecosystems. This information supports a paradigm shift in boreal forest management, whereby continuous-cover forestry based on partial harvest may provide the most sustainable approach to peatland forestry.
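    For reference, multi-taxon niche models of this kind typically express the biodiversity impact at a site as an observed-to-expected (O/E) taxa ratio, comparing the taxa actually found with those the model predicts for undisturbed reference conditions:

```latex
% Observed-to-expected taxa ratio commonly used with multi-taxon niche models;
% p_i is the modelled probability of occurrence of taxon i at the site.
O/E = \frac{O}{E} = \frac{\sum_{i} I(\text{taxon } i \text{ observed})}{\sum_{i} p_i}
```

    Values clearly below 1 indicate that fewer taxa occur than expected under reference conditions.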
  • Westerlund, Antti; Tuomi, Laura; Alenius, Pekka; Myrberg, Kai; Miettunen, Elina; Vankevich, Roman E.; Hord, Robinson (Taylor & Francis, 2019)
    Tellus A: Dynamic Meteorology and Oceanography, 71:1
    We studied circulation patterns in the Gulf of Finland (GoF), an estuary-like sub-basin of the Baltic Sea. Circulation patterns in the GoF are complex and vary from season to season and year to year. Estuarine circulation in the gulf is heavily modified by many factors, such as wind forcing, topography and geostrophic effects. Based on a 7-year run of the NEMO 3D hydrodynamic model with a 500 m horizontal resolution, we analysed seasonal changes of mean circulation patterns. We found that there were clear seasonal differences in the circulation patterns in the GoF. Features that moved or changed direction from season to season were damped or hidden in the averages. To further study these differences, we also carried out a self-organising map (SOM) analysis of currents for several latitudinal sections. The results of the SOM analysis emphasised the estuary-like nature of the GoF. Circulation changed rapidly from normal estuarine circulation to reversed estuarine circulation. The dominant southwesterly winds supported the reversal of the estuarine circulation. Normal and reversed estuarine circulation were roughly equally common in our data. The SOM analysis also demonstrated how the long-term cyclonic mean circulation field and the average salinity field emerged from the interaction of normal and reversed estuarine circulation.
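    A SOM of current vectors along a section can be trained with a standard SOM library; the sketch below uses the third-party minisom package on synthetic (u, v) data rather than the NEMO output analysed in the paper.

```python
# Sketch of a SOM analysis of current vectors (u, v) at points along a section.
# Uses synthetic data; the paper's analysis was based on NEMO model output.
import numpy as np
from minisom import MiniSom  # third-party package: pip install minisom

rng = np.random.default_rng(0)
n_times, n_points = 500, 30
# Each sample: (u, v) at every point of one latitudinal section -> 2*n_points features
data = rng.normal(size=(n_times, 2 * n_points))

som = MiniSom(x=3, y=4, input_len=2 * n_points, sigma=1.0, learning_rate=0.5,
              random_seed=0)
som.random_weights_init(data)
som.train_random(data, num_iteration=5000)

# Map each time step to its best-matching unit (a characteristic circulation pattern)
bmus = [som.winner(sample) for sample in data]
print(bmus[:5])
```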
  • Lehtomaa, Jere (Helsingfors universitet, 2017)
    The incomplete global coverage of current emissions trading schemes has raised concerns about free-riding and carbon leakage. The EU ETS, the first and currently the largest carbon market, is at the forefront of such concerns. Carbon-based import tariffs have therefore been proposed to compensate domestic industries for the cost disadvantage against their rivals in non-regulating countries. This thesis uses an applied general equilibrium (AGE) model to assess the impacts of a hypothetical EU carbon tariff on the Finnish economy. The carbon content of imported goods is first estimated with an environmentally extended input-output analysis, and the tariff is levied according to the anticipated price of EU emission allowances. To examine the sensitivity of the results, five additional scenarios are then constructed by altering the key simulation parameters. The tariff is imposed on the most energy-intensive and trade-exposed industries in 2016 and simulated until 2030. The results suggest that carbon tariffs are detrimental to the Finnish economy. The negative outcome is driven by high material intensity and a growing dependence on imported materials throughout the industry sector. As a result, the tariff-induced increase in import prices adds up to a notable growth in total production costs. Moreover, the negative impact is most pronounced within the export-oriented heavy manufacturing sector that the tariff was designed to shelter in the first place. The few sectors that gain from the tariff are not directly subject to it but benefit from the secondary impacts as the economy adapts to the shock. The findings imply that, owing to the deep integration of global value chains, the appeal of protective tariffs, even if environmentally motivated, can rest on a harmful over-simplification.
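    The first step, estimating the carbon content of imports with environmentally extended input-output analysis, follows the standard Leontief identity f = e(I − A)⁻¹y; a toy numerical sketch with three hypothetical sectors:

```python
# Toy environmentally extended input-output (EEIO) calculation:
# embodied emissions f = e @ inv(I - A) @ y. All numbers are illustrative only.
import numpy as np

# Technical coefficient matrix (inter-industry use per unit of output)
A = np.array([[0.10, 0.20, 0.05],
              [0.15, 0.05, 0.10],
              [0.05, 0.10, 0.15]])
e = np.array([0.8, 0.3, 0.1])       # direct emission intensity per unit output (tCO2/EUR)
y = np.array([100.0, 50.0, 200.0])  # final demand for imported goods (EUR)

L = np.linalg.inv(np.eye(3) - A)    # Leontief inverse
embodied = e @ L @ y                # total emissions embodied in the demand vector
print(f"Embodied emissions: {embodied:.1f} tCO2")
```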
  • Soimakallio, Sampo; Böttcher, Hannes; Niemi, Jari; Mosley, Fredric; Turunen, Sara; Hennenberg, Klaus Josef; Reise, Judith; Fehrenbach, Horst (Wiley, 2022)
    GCB Bioenergy
    Fossil-based emissions can be avoided by using wood in place of non-renewable raw materials for energy and materials. However, wood harvest influences forest carbon stocks. Increased harvest may significantly reduce the overall climate benefit of wood use, but this effect is widely overlooked. We reviewed selected simulation studies and compared differences in forest carbon and the amount of wood harvested between harvest scenarios of different intensities for three time perspectives: short- (1–30 years), mid- (31–70 years), and long-term (71–100 years). Out of more than 450 reviewed studies, 45 provided adequate data. Our results show that increased harvest reduces carbon stocks over 100 years in temperate and boreal forests by about 1.6 (stdev 0.9) tC per tC harvested, referred to as the carbon balance indicator (CBI). CBI proved to be robust when outliers explicitly influenced by factors other than changes in the harvest rate, such as fertilization or increases in forest area, were removed. The carbon impacts tend to be greatest in the mid-term, but no significant difference in average values was found between the short and long time horizons. CBI can be interpreted as the carbon opportunity cost of wood harvest in forests. Our results indicate that even after 100 years, CBI is significant compared to the typical GHG credits expected in the technosphere from avoiding fossil emissions through substitution and from increasing carbon stocks in harvested wood products. Our estimates provide typical values that can be included directly in GHG balances of products or in assessments of mitigation policies and measures related to wood use. However, more systematic scenarios with transparent information on the factors influencing forest carbon stocks are required to provide better-constrained estimates for specific forest types.
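    Read this way, the carbon balance indicator is the scenario difference in forest carbon stock divided by the scenario difference in cumulative harvest; with hypothetical symbols (the paper's own notation is not reproduced here):

```latex
% Carbon balance indicator as described above (our symbols): difference in forest
% carbon stock between a baseline and an increased-harvest scenario, per additional
% tonne of carbon harvested, at time horizon t.
\mathrm{CBI}(t) = \frac{C^{\mathrm{forest}}_{\mathrm{base}}(t) - C^{\mathrm{forest}}_{\mathrm{high}}(t)}
                       {H_{\mathrm{high}}(t) - H_{\mathrm{base}}(t)}
```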
  • Marttila, H.; Tammela, S.; Mustonen, K.-R.; Louhi, P.; Muotka, Timo; Mykrä, Heikki; Klove, B. (IWA Publishing, 2019)
    Hydrology Research 1 June 2019; 50 (3): 878–885
    We conducted a series of tracer test experiments in 12 outdoor semi-natural flumes to assess the effects of variable flow conditions and sand addition on hyporheic zone conditions in gravel beds, mimicking conditions in headwater streams under sediment pressure. Two tracer methods were applied in each experiment: 2–5 tracer-pulse tests were conducted in all flumes and pulses were monitored at three distances downstream of the flume inlet (0 m, 5 m and 10 m, at bed surface), and in pipes installed into the gravel bed at 5 m and 10 m distances. The tracer breakthrough curves (total of 120 tracer injections) were then analysed with a one-dimensional solute transport model (OTIS) and compared with data from the gravel pipes in point-dilution pulse tests. Sand addition had a strong negative effect on horizontal fluxes (qh), whereas the fraction of the median travel time due to transient storage (F200) was determined more by flow conditions. These results suggest that even small additions of sand can modify the hyporheic zone exchange in gravel beds, thus making headwater streams with low sediment transport capacity particularly vulnerable to sediments transported into the stream from catchment land use activities.
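    For reference, the one-dimensional transient-storage formulation that OTIS fits to such breakthrough curves is commonly written as follows (general form; the experiment-specific parameter values are not reproduced here):

```latex
% OTIS advection-dispersion equations with transient storage (general form).
\frac{\partial C}{\partial t} = -\frac{Q}{A}\frac{\partial C}{\partial x}
  + \frac{1}{A}\frac{\partial}{\partial x}\!\left(A D \frac{\partial C}{\partial x}\right)
  + \frac{q_L}{A}\left(C_L - C\right) + \alpha\left(C_S - C\right),
\qquad
\frac{\mathrm{d} C_S}{\mathrm{d} t} = \alpha \frac{A}{A_S}\left(C - C_S\right)
```

    Here C and C_S are the in-stream and storage-zone concentrations, Q the discharge, A and A_S the channel and storage-zone cross-sectional areas, D the dispersion coefficient, q_L the lateral inflow with concentration C_L, and α the storage exchange coefficient.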
  • Mäkinen, Ville; Oksanen, Juha; Sarjakoski, Tapani (Stichting AGILE, 2019)
    The digital elevation model (DEM) is an invaluable product in numerous geospatial applications from orthorectification of aerial photographs to hydrological modelling and advanced 3D visualisation. With the current aerial laser scanning methods, superior quality digital elevation models can be produced over land areas, but surfaces over water bodies are visually problematic, especially for streams in 3D. We present a method to generate smooth, monotonically decreasing elevation surfaces over water bodies in DEMs. The method requires the point cloud data and the polygons delineating the water bodies as input data. We show how DEM visualisations improve by applying the presented method.
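    The core constraint of the method is that the water surface must not rise in the downstream direction along a stream; stripped of the smoothing and the point-cloud handling described in the paper, that constraint alone can be illustrated with a running minimum over elevations sampled along the stream centreline.

```python
# Enforce a monotonically non-increasing water surface along a stream centreline.
# This illustrates only the monotonicity constraint mentioned in the abstract;
# the actual method also smooths the surface and works from point clouds and
# water-body polygons.
import numpy as np

def monotonic_water_surface(elevations_downstream):
    """elevations_downstream: water-surface samples ordered from upstream to downstream."""
    return np.minimum.accumulate(np.asarray(elevations_downstream, dtype=float))

raw = [102.4, 102.6, 101.9, 102.0, 101.5]   # noisy samples along the centreline
print(monotonic_water_surface(raw))          # -> [102.4 102.4 101.9 101.9 101.5]
```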
  • Niittynen, Pekka; Heikkinen, Risto K.; Luoto, Miska (2020)
    Proceedings of the National Academy of Sciences of the United States of America 117: 35, 21480-21487
    The Arctic is one of the least human-impacted parts of the world, but, in turn, the tundra biome is facing the most rapid climate change on Earth. These perturbations may cause a major reshuffling of Arctic species compositions, functional trait profiles and diversity, thereby affecting ecosystem processes of the whole tundra region. Earlier research has identified important drivers of change in plant functional traits under a warming climate, but studies on one key factor, snow cover, are almost entirely lacking. Here we integrate plot-scale vegetation data with detailed climate and snow information using machine learning methods to model the responsiveness of tundra communities to different scenarios of warming and snow cover duration. Our results show that decreasing snow cover, together with warming temperatures, can substantially modify biotic communities and their trait compositions, with future plant communities projected to be occupied by taller plants with larger leaves and faster resource acquisition strategies. We also show that, while local functional diversity may increase, simultaneous biotic homogenization across tundra communities is likely to occur. The manifestation of climate warming in tundra vegetation is highly dependent on the evolution of snow conditions. Given this, realistic assessments of future ecosystem functioning require acknowledging the role of snow in tundra vegetation models.
  • Boutle, Ian; Angevine, Wayne; Bao, Jian-Wen; Bergot, Thierry; Bhattacharya, Ritthik; Bott, Andreas; Ducongé, Leo; Forbes, Richard; Goecke, Tobias; Grell, Evelyn; Hill, Adrian; Igel, Adele L.; Kudzotsa, Innocent; Lac, Christine; Maronga, Bjorn; Romakkaniemi, Sami; Schmidli, Juerg; Schwenkel, Johannes; Steeneveld, Gert-Jan; Vié, Benoît (Copernicus Publ., 2022)
    Atmospheric Chemistry and Physics
    An intercomparison between 10 single-column (SCM) and 5 large-eddy simulation (LES) models is presented for a radiation fog case study inspired by the Local and Non-local Fog Experiment (LANFEX) field campaign. Seven of the SCMs represent single-column equivalents of operational numerical weather prediction (NWP) models, whilst three are research-grade SCMs designed for fog simulation, and the LESs are designed to reproduce in the best manner currently possible the underlying physical processes governing fog formation. The LES model results are of variable quality and do not provide a consistent baseline against which to compare the NWP models, particularly under high aerosol or cloud droplet number concentration (CDNC) conditions. The main SCM bias appears to be toward the overdevelopment of fog, i.e. fog which is too thick, although the inter-model variability is large. In reality there is a subtle balance between water lost to the surface and water condensed into fog, and the ability of a model to accurately simulate this process strongly determines the quality of its forecast. Some NWP SCMs do not represent fundamental components of this process (e.g. cloud droplet sedimentation) and therefore are naturally hampered in their ability to deliver accurate simulations. Finally, we show that modelled fog development is as sensitive to the shape of the cloud droplet size distribution, a rarely studied or modified part of the microphysical parameterisation, as it is to the underlying aerosol or CDNC.
  • Grazhdankin, Evgeni (Helsingfors universitet, 2018)
    We have developed software for homology modelling by satisfaction of distance restraints, using MODELLER as the back-end. The protocols used extend the exploration of distance restraints and of conformational space. In an optimization cycle, we drive the models towards better structures as assessed by the metrics used, including the DOPE score and retrospective distance-restraint realization. Hydrogen-bond networks are optimized for their size and connectivity density. The performance of the method is evaluated for its ability to reconstruct GPCR structures and extracellular loop 2. The software is written in object-oriented Python (v2.7) and supports easy extension with additional modules. We built a relational PostgreSQL database for the restraints to allow for data-driven machine and deep learning applications. An important part of the work was the visualization of the distance restraints with custom PyMOL scripts for three-dimensional viewing. Additionally, we automatically generate a plethora of diagnostic plots for assessing the performance of the modelling protocols. The software utilizes parallelism and is computationally practical, with compute requirements an order of magnitude lower than those typically seen in molecular dynamics simulations. The main challenges left to be solved are the evaluation of restraint goodness, the assignment of secondary structures, restraint interconditioning, and water and ligand placement.
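    The thesis builds its protocols on MODELLER's standard comparative-modelling machinery. A typical minimal MODELLER run of that kind is shown below; the file names and model counts are placeholders, and the thesis's custom restraint generation and optimization cycles are not reproduced here.

```python
# Typical minimal MODELLER comparative-modelling script (placeholder file names);
# the thesis adds custom distance-restraint handling and optimization cycles on top.
from modeller import *            # MODELLER's documented import idiom
from modeller.automodel import *  # provides automodel and assess

env = environ()
env.io.atom_files_directory = ['.']

a = automodel(env,
              alnfile='target_template.ali',   # alignment file (placeholder name)
              knowns='template_structure',     # template code in the alignment
              sequence='target_sequence',      # target code in the alignment
              assess_methods=(assess.DOPE,))   # score models with DOPE
a.starting_model = 1
a.ending_model = 5                             # build five candidate models
a.make()
```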
  • Bettencourt da Silva, Ricardo J.N; Saame, Jaan; Anes, Bárbara; Heering, Agnes; Leito, Ivo; Näykki, Teemu; Stoica, Daniela; Deleebeeck, Lisa; Bastkowski, Frank; Snedden, Alan; Camões, M. Filomena (Elsevier, 2021)
    Analytica Chimica Acta 1182 (2021), 338923
    The use of the unified pH concept, pHabsH2O, applicable to aqueous and non-aqueous solutions, which allows the interpretation and comparison of the acidity of different types of solutions, requires reliable and objective determination. The pHabsH2O can be determined by a single differential potentiometry measurement referenced to an aqueous reference buffer, or by a ladder of differential potentiometric measurements that allows minimisation of inconsistencies between the various determinations. This work describes and assesses bottom-up evaluations of the uncertainty of these measurements, in which uncertainty components are combined by the Monte Carlo Method (MCM) or the Taylor Series Method (TSM). The MCM allows a detailed simulation of the measurements, including the iterative process involved in minimising ladder deviations. The TSM, on the other hand, requires an approximate determination of the minimisation uncertainty. The uncertainty evaluation was successfully applied to the measurement of aqueous buffers with pH of 2.00, 4.00, 7.00, and 10.00, with a standard uncertainty of 0.01. The reference and estimated values from both approaches are metrologically compatible at the 95% confidence level, even when a negligible contribution of the liquid junction potential uncertainty is assumed. The MCM estimated pH values with an expanded uncertainty, at the 95% confidence level, of between 0.26 and 0.51, depending on the pH value and ladder inconsistencies. The minimisation uncertainty is either negligible or responsible for up to 87% of the measurement uncertainty. The TSM quantified measurement uncertainties on average only 0.05 units larger than those estimated by the MCM. Additional experimental tests should be performed to test these uncertainty models for analyses performed in other laboratories and on non-aqueous solutions.
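    The MCM referred to above is Monte Carlo propagation in the style of GUM Supplement 1: every input is sampled from its assumed distribution, the measurand is recomputed for each draw, and the spread of the results gives the uncertainty. A generic sketch follows; the measurement function and the input uncertainties are placeholders, not the differential-potentiometry ladder model of the paper.

```python
# Generic Monte Carlo (GUM Supplement 1 style) uncertainty propagation sketch.
# The measurement function and input uncertainties are placeholders, not the
# differential-potentiometry ladder model used in the paper.
import numpy as np

rng = np.random.default_rng(1)
N = 100_000

# Inputs sampled from their assumed distributions (values are illustrative)
pH_ref  = rng.normal(4.00, 0.01, N)        # aqueous reference buffer pH
delta_E = rng.normal(0.0592, 0.0010, N)    # measured potential difference, V
slope   = rng.normal(0.0592, 0.0005, N)    # electrode slope, V per pH unit
E_lj    = rng.uniform(-0.002, 0.002, N)    # residual liquid junction potential, V

# Placeholder measurement model: pH shift from the potential difference
pH_sample = pH_ref + (delta_E - E_lj) / slope

print(f"pH = {pH_sample.mean():.3f}, standard uncertainty = {pH_sample.std(ddof=1):.3f}")
```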
  • Frohn, Lise M.; Geels, Camilla; Andersen, Christopher; Andersson, Camilla; Bennet, Cecilia; Christensen, Jesper H.; Im, Ulas; Karvosenoja, Niko; Kindler, Paula Anna; Kukkonen, Jaakko; Lopez-Aparicio, Susana; Nielsen, Ole-Kenneth; Palamarchuk, Yuliia; Paunu, Ville-Veikko; Plejdrup, Marlene Smith; Segersson, David; Sofiev, Mikhail; Brandt, Jørgen (Elsevier BV, 2022)
    Atmospheric Environment
    This study presents a comprehensive evaluation of the combination of the regional-scale chemistry-transport model DEHM (Danish Eulerian Hemispheric Model) and the Gaussian plume-in-grid model UBMv10 (Urban Background Model). The study centres on the following research question: does combining an Eulerian regional-scale approach with a Gaussian high-resolution/local-scale approach improve performance, as evaluated against measurements, compared to the Eulerian regional-scale approach alone? We also investigated whether the integrated Eulerian/Gaussian approach performs similarly to Eulerian models set up with the same high spatial resolution. The DEHM/UBM model has been run for a domain covering Denmark, Finland, Norway and Sweden with a 1 km × 1 km spatial resolution, producing hourly concentration estimates for four decades, 1979–2018. The results were evaluated against rural and urban background measurements in the four countries, and the performance of the DEHM/UBM model was compared to the performance of the DEHM model based on a similar evaluation. The comparison showed that the DEHM/UBM model in general performs similarly to the DEHM model; however, DEHM/UBM captures the interannual variability better in most cases. The DEHM/UBM model results for 2015 were also compared with corresponding high-resolution results from the chemistry-transport models SILAM (System for Integrated modeLling of Atmospheric coMposition) and MATCH (Multi-scale Atmospheric Transport and CHemistry model) for the four Nordic capitals: Copenhagen, Helsinki, Oslo and Stockholm. This model comparison was carried out to evaluate and compare the performance of the relatively simple yet computationally fast approach of the DEHM/UBM model setup to the more time-consuming approach of applying regional-scale models at very high resolution. The DEHM/UBM model performed similarly to SILAM and MATCH for most components; however, the 3D models performed better with respect to capturing the differences between rural and urban settings for the four capitals.
    Highlights
    • Combining a regional-scale Eulerian model with a Gaussian plume-in-grid model enables long-term, high-resolution, local-scale modelling of air pollution for a large geographic area.
    • Ambient concentrations of NO2, O3 and PM2.5 in the continental Nordic countries were modelled with 1 km × 1 km spatial and hourly temporal resolution for 1979–2018.
    • The Eulerian-Gaussian approach performs better than the Eulerian approach for PM2.5 in general and with respect to year-to-year variation for NO2 and O3.
    • The Eulerian-Gaussian approach performs similarly to high-resolution, local-scale Eulerian models.
  • Ylä-Mella, Lotta (Helsingin yliopisto, 2020)
    Terrestrial cosmogenic nuclides can be used to date glacial events. The nuclides are formed when cosmic rays interact with atoms in rocks. When the surface is exposed to the rays, the number of produced nuclides increases. Shielding, such as glaciation, can prevent production. Nuclide concentration decreases with depth because the bedrock attenuates the rays. The northern hemisphere has experienced several glaciations, but typically only the latest one can be directly observed. The aim of the study was to determine whether these cosmic-ray-produced nuclides can be used to detect glaciations before the previous one, by using a forward and an inverse model. The forward model predicted the nuclide concentration with depth based on a glacial history. The longer the exposure duration, the higher the number of nuclides in the rock. Three isotopes could be used in the model: Be-10, C-14 and Al-26. The forward model was used to produce synthetic samples, which were then used in the inverse model. The purpose of the inverse model was to test which kinds of glacial histories produce nuclide concentrations similar to those of the sample. The inverse model produced a concentration curve which was compared with the concentration of the samples. The misfit of the inverse solution was defined with an “acceptance box” formed from the thickness of the sample and the corresponding concentrations. If the curve intersected the box, the solution was accepted. Small misfit values were obtained if the curve was close to the sample; the idea was to find concentration curves with values as close as possible to those of the samples. The inverse model was used in several situations in which the number of constraints was varied. If the timing of the last deglaciation and the amount of erosion were known, the second-last deglaciation was found relatively well. With looser constraints, it was nearly impossible to detect past glaciations unless a depth profile was used in the sampling. The depth profile provided a tool to estimate the amount of erosion and the total exposure duration using only one isotope.
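    Such forward models rest on the standard expression for cosmogenic nuclide build-up at depth, attenuated by overburden and radioactive decay; in its common single-production-pathway form, for one exposure episode of duration t followed by burial of duration t_b,

```latex
% Common single production-pathway form of in-situ cosmogenic nuclide concentration
% at depth z after exposure t and subsequent burial t_b (surface erosion rate \varepsilon).
N(z) = \frac{P_0\, e^{-\rho z/\Lambda}}{\lambda + \rho\varepsilon/\Lambda}
       \left(1 - e^{-(\lambda + \rho\varepsilon/\Lambda)\, t}\right) e^{-\lambda t_b}
```

    where P_0 is the surface production rate, ρ the rock density, Λ the attenuation length, λ the decay constant (zero for a stable nuclide), and ε the surface erosion rate. A full glacial-history model chains such exposure and burial episodes together for each isotope.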
  • Jyväsjärvi, Jussi; Lehosmaa, Kaisa; Aroviita, Jukka; Turunen, Jarno; Rajakallio, Maria; Marttila, Hannu; Tolkkinen, Mikko; Mykrä, Heikki; Muotka, Timo (Elsevier, 2021)
    Ecological Indicators 121 (2021), 106986
    Degradation of freshwater ecosystems requires efficient tools for assessing the ecological status of freshwater biota and identifying potential cause(s) of their biological degradation. While diatoms and macroinvertebrates are widely used in stream bioassessment, the potential utility of microbial communities has not been fully harnessed. Using data from 113 Finnish streams, we assessed the performance of aquatic leaf-associated fungal decomposers, relative to benthic macroinvertebrates and diatoms, in modelling-based bioassessment. We built multi-taxon niche-type predictive models for fungal assemblages using genus-based and sequence-based identification levels. We then compared the models’ precision and accuracy in the prediction of reference conditions (number of native taxa) to corresponding models for macroinvertebrates and diatoms. The genus-based fungal model nearly equalled the accuracy and precision of our best model (macroinvertebrates), whereas the sequence-based model was less accurate and tended to overestimate the number of taxa. However, when the models were applied to streams disturbed by anthropogenic stressors (nutrient enrichment, sedimentation and acidification), alone or in combination, the sequence-based fungal assemblages were more sensitive than the other taxonomic groups, especially when multiple stressors were present. Microbial leaf decomposition rates were elevated in sediment-stressed streams, whereas decomposition attributable to leaf-shredding macroinvertebrates was accelerated by nutrients and decelerated by sedimentation. Comparison of the leaf decomposition results to model output suggested that leaf decomposition rates do not effectively detect the presence of multiple simultaneous disturbances. The rapid development of global microbial databases may soon enable species-level identification of leaf-associated fungi, facilitating more precise and accurate modelling of reference conditions in streams using fungal communities. This development, combined with the sensitivity of aquatic fungi in detecting the presence of multiple human disturbances, makes leaf-associated fungal assemblages an indispensable addition to the stream ecologist’s toolbox.
  • Laakom, Firas; Raitoharju, Jenni; Passalis, Nikolaos; Iosifidis, Alexandros; Gabbouj, Moncef (Institute of Electrical and Electronics Engineers (IEEE), 2022)
    IEEE Access
    Spectral-based subspace learning is a common data preprocessing step in many machine learning pipelines. The main aim is to learn a meaningful low-dimensional embedding of the data. However, most subspace learning methods do not take into consideration possible measurement inaccuracies or artifacts that can lead to data with high uncertainty. Thus, learning directly from raw data can be misleading and can negatively impact accuracy. In this paper, we propose to model artifacts in training data using probability distributions; each data point is represented by a Gaussian distribution centered at the original data point and having a variance modeling its uncertainty. We reformulate the Graph Embedding framework to make it suitable for learning from distributions, and we study as special cases the Linear Discriminant Analysis and the Marginal Fisher Analysis techniques. Furthermore, we propose two schemes for modeling data uncertainty based on pair-wise distances in unsupervised and supervised contexts.
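    The key technical ingredient is that, for a training point modelled as a Gaussian N(μ_i, Σ_i), the expected outer product satisfies E[(x_i − m)(x_i − m)ᵀ] = (μ_i − m)(μ_i − m)ᵀ + Σ_i for a fixed m, so scatter matrices pick up an additive per-point covariance term. The sketch below applies this to a plain LDA-style projection; it illustrates the general idea only and is not the paper's exact Graph Embedding formulation.

```python
# Sketch of LDA on Gaussian-distributed training points N(mu_i, Sigma_i):
# the expected within-class scatter gains the sum of per-point covariances.
# Illustrates the general idea only; not the paper's exact formulation.
import numpy as np

def uncertain_lda(mus, sigmas, labels, n_components=1):
    mus = np.asarray(mus, dtype=float)
    labels = np.asarray(labels)
    d = mus.shape[1]
    overall_mean = mus.mean(axis=0)
    Sw = np.zeros((d, d))
    Sb = np.zeros((d, d))
    for c in np.unique(labels):
        idx = labels == c
        class_mean = mus[idx].mean(axis=0)
        diff = mus[idx] - class_mean
        # Expected within-class scatter: usual mean-based term + per-point covariances
        Sw += diff.T @ diff + sigmas[idx].sum(axis=0)
        Sb += idx.sum() * np.outer(class_mean - overall_mean, class_mean - overall_mean)
    # Leading eigenvectors of Sw^{-1} Sb give the discriminant directions
    eigvals, eigvecs = np.linalg.eig(np.linalg.solve(Sw, Sb))
    order = np.argsort(eigvals.real)[::-1]
    return eigvecs[:, order[:n_components]].real

# Two classes in 2D, each point carrying its own (made-up) covariance
mus = [[0.0, 0.0], [1.0, 0.2], [4.0, 4.0], [5.0, 4.2]]
sigmas = np.stack([0.1 * np.eye(2)] * 4)
print(uncertain_lda(mus, sigmas, labels=[0, 0, 1, 1]))
```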
  • Hosseinzadeh, Mahboubeh Sadat; Farhadi Qomi, Masood; Naimi, Babak; Roedder, Dennis; Kazemi, Seyed Mandi (2018)
    Species distribution models estimate the relationship between species occurrences and environmental and spatial characteristics. Herein, we used maximum entropy distribution modelling (MaxEnt) for predicting the potential distribution of the Plateau Snake Skink Ophiomorus nuchalis on the Iranian Plateau, using a small number of occurrence records (i.e. 10) and environmental variables derived from remote sensing. The MaxEnt model had a high success rate according to test AUC scores (0.912). A remotely sensed enhanced vegetation index (39.1%), and precipitation of the driest month (15.4%) were the most important environmental variables that explained the geographical distribution of O. nuchalis. Our results are congruent with previous studies suggesting that suitable habitat of O. nuchalis is limited to the central Iranian Plateau, although mountain ranges in western and eastern Iran might be environmentally suitable but not accessible.
  • Janssen, Annette B. G.; Janse, Jan H.; Beusen, Arthur H. W.; Chang, Manqi; Harrison, John A.; Huttunen, Inese; Kong, Xiangzhen; Rost, Jasmijn; Teurlincx, Sven; Troost, Tineke A.; van Wijk, Dianneke; Mooij, Wolf M. (Elsevier, 2019)
    Current Opinion in Environmental Sustainability 36 (2019), 1-10
    Algal blooms increasingly threaten lake and reservoir water quality at the global scale, caused by ongoing climate change and nutrient loading. To anticipate these algal blooms, models to project future algal blooms worldwide are required. Here we present the state-of-the-art in algal projection modelling and explore the requirements of an ideal algal projection model. Based on this, we identify current challenges and opportunities for such model development. Since most building blocks are present, we foresee that algal projection models for any lake on earth can be developed in the near future. Finally, we think that algal bloom projection models at a global scale will provide a valuable contribution to global policymaking, in particular with respect to SDG 6 (clean water and sanitation).