Browsing by Subject "Uncertainty"


Now showing items 1-10 of 10
  • Quist, Liina-Maija; Nygren, Anja (2019)
    Marine extraction accounts for one third of the world's hydrocarbon production. Several analyses suggest that seismic surveys employed in oil exploration harm marine life; however, their long-term impacts have not been extensively studied. We examine debates between fishers, the oil industry, and governmental authorities over the effects of oil exploration in Tabasco, Mexico. The study employs ideas from historical ontology in tracing the contested production of truth-claims about exploration in the context of scientific uncertainty. It shows how actors, through their different engagements with the sea, and with different degrees of power, frame claims about the relations between exploration and fish. We argue that fishers, through their efforts to "think like fish", produce situated knowledges about the effects of oil exploration. They explain the disappearance of fish by their understanding that seismic surveys disturb fish migration, impair the hearing of fish, and cause fish death. Oil company and governmental representatives frame the impacts of oil exploration as insignificant by separating environmental and social dimensions, by isolating individual exploration events, and by arguing that possible effects are transitory. Due to scientific indeterminacy, oil exploration remains malleable in the hands of powerful political representations that understate its possible impacts on marine socio-environments.
  • Vauhkonen, Jari (2020)
    Key Message Tree-level forest inventory data are becoming increasingly available, which motivates the use of these data for decision-making. However, airborne inventories carried out tree-by-tree typically include systematic errors, which can propagate to the objective function variables used to determine optimal forest management. The effects of under-detection concentrated on the smallest trees on predicted immediate harvest profits and future expectation values were assessed assuming different sites and interest rates. Management decisions based on the erroneous information caused losses of 0-17% of the total immediate and future expected income of Scots pine stands. Context Optimal decisions on how to manage forest stands can depend on the absence or presence of intermediate and understory trees. Yet, these tree strata are likely to be prone to inventory errors. Aims The aim of this study is to examine the implications of making stand management decisions based on data that include systematic errors resembling those typically observed in airborne inventories carried out tree-by-tree. Methods Stand management instructions were developed based on theoretical diameter distribution functions simulated with different shape, scale, and frequency parameters corresponding to various degrees of under-detection concentrated on the smallest trees. Immediate harvest income and future expectation value were derived for the various simulated management alternatives. Results Errors in diameter distributions affected the predicted harvest profits and future expectation values differently between the simulated alternatives, depending on site type and interest rate assumptions. As a result, different alternatives were identified as optimal management compared with those based on the error-free reference distributions. In particular, the no-management or most intensive management alternatives became preferred over alternatives with intermediate harvesting intensities. Certain harvesting types, such as thinning from below, became preferred more often than was optimal. The errors did not affect the selection of the management alternative in 71% of the simulations; in the remaining cases, relying on the erroneous information would have caused a loss of 2% on average, and 17% at maximum, of the total immediate and future expected income. Conclusion The effects above might not have been discovered if the results had been validated against inventory totals instead of separately considering the immediate and future income and the losses produced by the erroneous decisions. For well-informed decisions, it is recommended that inventory and planning systems be integrated rather than kept separate.
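The decision logic described above (immediate harvest income plus a discounted future expectation value, compared across management alternatives) can be sketched as follows. This is a minimal illustration with invented per-hectare figures, interest rate, and horizon; it is not the study's simulator, only a demonstration of how a systematic inventory error can flip the apparently optimal alternative and cause an economic loss.

```python
# Hypothetical sketch: a biased inventory shifts the predicted value of one
# management alternative, changing which alternative looks optimal.
# All numbers are invented for illustration.

def total_value(immediate_income, future_expectation, interest_rate, years):
    """Immediate harvest income plus future expectation value discounted to today."""
    return immediate_income + future_expectation / (1 + interest_rate) ** years

# name: (immediate harvest income EUR/ha, future expectation value EUR/ha)
alternatives = {
    "no_management":       (0.0, 9500.0),
    "thinning_from_below": (2500.0, 9000.0),
    "heavy_harvest":       (5200.0, 3600.0),
}

def best_alternative(values, rate=0.03, years=20):
    return max(values, key=lambda k: total_value(*values[k], rate, years))

# With error-free data the planner picks one alternative...
choice_true = best_alternative(alternatives)

# ...but under-detection of the smallest trees biases the predicted future
# expectation value of the thinning alternative downward, which can change
# the apparently optimal choice toward a more extreme alternative.
biased = dict(alternatives)
biased["thinning_from_below"] = (2500.0, 6800.0)
choice_biased = best_alternative(biased)

# The realized loss: true value of the truly optimal alternative minus the
# true value of the alternative chosen under erroneous information.
loss = (total_value(*alternatives[choice_true], 0.03, 20)
        - total_value(*alternatives[choice_biased], 0.03, 20))
```

Note how the error pushes the choice away from the intermediate-intensity alternative, mirroring the paper's finding that no-management or the most intensive alternatives became preferred under biased data.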
  • Parviainen, Tuuli; Goerlandt, Floris; Helle, Inari; Haapasaari, Päivi; Kuikka, Sakari (2021)
    The risk of a large-scale oil spill remains significant in marine environments as international maritime transport continues to grow. The environmental as well as the socio-economic impacts of a large-scale oil spill could be substantial. Oil spill models and modeling tools for Pollution Preparedness and Response (PPR) can support effective risk management. However, there is a lack of integrated approaches that consider oil spill risks comprehensively, learn from all information sources, and treat the system uncertainties in an explicit manner. Recently, the use of the international ISO 31000:2018 risk management framework has been suggested as a suitable basis for supporting oil spill PPR risk management. Bayesian networks (BNs) are graphical models that express uncertainty in a probabilistic form and can thus support decision-making processes when risks are complex and data are scarce. While BNs have increasingly been used for oil spill risk assessment (OSRA) for PPR, no link between the BN literature and the ISO 31000:2018 framework has previously been made. This study explores how Bayesian risk models can be aligned with the ISO 31000:2018 framework by offering a flexible approach to integrate various sources of probabilistic knowledge. In order to gain insight into the current utilization of BNs for oil spill risk assessment and management (OSRA-BNs) for maritime oil spill preparedness and response, a literature review was performed. The review focused on articles presenting BN models that analyze the occurrence of oil spills, consequence mitigation in terms of offshore and shoreline oil spill response, and impacts of spills on the variables of interest. Based on the results, the study discusses the benefits of applying BNs to the ISO 31000:2018 framework as well as the challenges and further research needs.
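To make the BN idea concrete, here is a toy discrete network for spill risk, evaluated by brute-force enumeration. The structure (spill occurrence, response effectiveness, impact severity) and all probabilities are invented for illustration and are far simpler than the OSRA-BN models the paper reviews.

```python
# Illustrative sketch only: a three-node discrete Bayesian network
# (spill -> response -> impact), queried by summing over the joint distribution.

from itertools import product

# P(spill), P(effective_response | spill), P(severe_impact | spill, response)
p_spill = {True: 0.01, False: 0.99}
p_resp = {True: {True: 0.7, False: 0.3},    # given a spill occurred
          False: {True: 0.0, False: 1.0}}   # no spill -> no response mounted
p_impact = {(True, True):   {True: 0.2, False: 0.8},
            (True, False):  {True: 0.8, False: 0.2},
            (False, True):  {True: 0.0, False: 1.0},
            (False, False): {True: 0.0, False: 1.0}}

def joint(spill, resp, impact):
    """Joint probability factorized along the network structure."""
    return p_spill[spill] * p_resp[spill][resp] * p_impact[(spill, resp)][impact]

def posterior_severe(given_spill=True):
    """P(severe impact | spill evidence), summing out the response node."""
    num = sum(joint(given_spill, r, True) for r in (True, False))
    den = sum(joint(given_spill, r, i) for r, i in product((True, False), repeat=2))
    return num / den
```

Conditioning on evidence and summing out unobserved nodes in this way is exactly what makes BNs useful when data are scarce: expert-elicited conditional tables and observed evidence combine into an explicit, probabilistic risk estimate.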
  • Cai, Runlong; Jiang, Jingkun; Mirme, Sander; Kangasluoma, Juha (2019)
    Measuring aerosol size distributions accurately down to ~1 nm is key to nucleation studies, and it requires developments and improvements in instruments such as the electrical mobility spectrometers in use today. The key factors characterizing the performance of an electrical mobility spectrometer for sub-3 nm particles are discussed in this study. A parameter named Pi is proposed as a figure of merit for the performance of an electrical mobility spectrometer in the sub-3 nm size range, instead of the overall detection efficiency. Pi includes the overall detection efficiency, the measurement time in each size bin, the aerosol flow rate passing through the detector, and the aerosol-to-sheath flow ratio of the differential mobility analyzer. The raw particle count recorded by the detector can be estimated using Pi for a given aerosol size distribution function, dN/dlogDp. The limit of detection of the spectrometer and the statistical uncertainty of the measured aerosol size distribution can also be readily estimated using Pi. In addition to Pi, the size resolution of an electrical mobility analyzer is another factor characterizing the systematic errors originating from particle sizing. Four existing electrical mobility spectrometers designed for measuring sub-3 nm aerosol size distributions, including three scanning/differential mobility particle spectrometers and one differential mobility analyzer train, are examined. Their optimal performance is evaluated using Pi and the size resolution. For example, the Pi value and the size resolution of a diethylene-glycol differential mobility particle spectrometer for 1.5 nm particles are 8.0 × 10^-4 cm^3 and 5.7, respectively. The corresponding relative uncertainty of the measured size distribution is approximately 9.6% during an atmospheric new particle formation event with a dN/dlogDp of 5 × 10^5 cm^-3.
Assuming an adjustable sheath flow rate of the differential mobility analyzer, the optimal size resolution is approximately 5-9 when measuring atmospheric new particle formation events.
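A rough back-of-the-envelope sketch of how such a figure of merit drives statistical uncertainty: if the raw count scales as Pi × dN/dlogDp (dimensionally consistent, cm^3 × cm^-3) and counting follows Poisson statistics, the relative uncertainty is about 1/sqrt(counts). This is an assumption for illustration; the paper's exact expression is not given in the abstract and may include additional factors, so the 9.6% quoted above need not be reproduced by this simplification.

```python
import math

# Hedged sketch: Poisson counting uncertainty from a figure of merit Pi.
# The count model (counts = Pi * dN/dlogDp) is an assumed simplification.

def estimated_counts(pi_cm3, dndlogdp_per_cm3):
    """Approximate raw particle count for a given size distribution value."""
    return pi_cm3 * dndlogdp_per_cm3

def poisson_relative_uncertainty(counts):
    """Relative statistical uncertainty of a Poisson-distributed count."""
    return 1.0 / math.sqrt(counts)

counts = estimated_counts(8.0e-4, 5.0e5)   # values quoted in the abstract
rel_unc = poisson_relative_uncertainty(counts)
```

The point the abstract makes survives the simplification: a larger Pi means more raw counts per size bin, hence a lower detection limit and a smaller statistical uncertainty, independently of sizing accuracy (which the size resolution captures).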
  • Urraca, Ruben; Huld, Thomas; Javier Martinez-de-Pison, Francisco; Sanz-Garcia, Andres (2018)
    The major sources of uncertainty in short-term assessment of global horizontal radiation (G) are the pyranometer type and its operating conditions for measurements, whereas the modeling approach and the geographic location are critical for estimations. The influence of all these factors on the uncertainty of the data has rarely been compared. At the same time, solar radiation data users are increasingly demanding more accurate uncertainty estimations. Here we compare the annual bias and uncertainty of all the mentioned factors using 732 weather stations located in Spain, two satellite-based products, and three reanalyses. The largest uncertainties were associated with operational errors such as shading (bias = -8.0%) or soiling (bias = -9.4%), which occur frequently in low-quality monitoring networks but are rarely detected because they pass conventional QC tests. Uncertainty in estimations varied greatly from reanalyses to satellite-based products, ranging from the gross accuracy of ERA-Interim (bias +6.1%, uncertainty -6.7/+18.8%) to the high quality and spatial homogeneity of SARAH-1 (bias +1.4%, uncertainty -5.3/+5.6%). Finally, photodiodes from the Spanish agricultural network SIAR showed an uncertainty of +6.9/-5.4%, which is far greater than that of secondary standards (±1.5%) and similar to SARAH-1. This is probably caused by the presence of undetectable operational errors and the use of uncorrected photodiodes. Photodiode measurements from low-quality monitoring networks such as SIAR should be used with caution, because the chances of adding extra uncertainty due to poor maintenance or inadequate calibration increase considerably.
  • Zhou, Yanli; Acerbi, Luigi; Ma, Wei Ji (2020)
    Perceptual organization is the process of grouping scene elements into whole entities. A classic example is contour integration, in which separate line segments are perceived as continuous contours. Uncertainty in such grouping arises from scene ambiguity and sensory noise. Some classic Gestalt principles of contour integration, and more broadly, of perceptual organization, have been re-framed in terms of Bayesian inference, whereby the observer computes the probability that the whole entity is present. Previous studies that proposed a Bayesian interpretation of perceptual organization, however, have ignored sensory uncertainty, despite the fact that accounting for the current level of perceptual uncertainty is one of the main signatures of Bayesian decision making. Crucially, trial-by-trial manipulation of sensory uncertainty is a key test of whether humans perform near-optimal Bayesian inference in contour integration, as opposed to using some manifestly non-Bayesian heuristic. We distinguish between these hypotheses in a simplified form of contour integration, namely judging whether two line segments separated by an occluder are collinear. We manipulate sensory uncertainty by varying retinal eccentricity. A Bayes-optimal observer would take the level of sensory uncertainty into account, in a very specific way, when deciding whether a measured offset between the line segments is due to non-collinearity or to sensory noise. We find that people deviate slightly but systematically from Bayesian optimality, while still performing "probabilistic computation" in the sense that they take into account sensory uncertainty via a heuristic rule. Our work contributes to an understanding of the role of sensory uncertainty in higher-order perception. Author summary Our percept of the world is governed not only by the sensory information we have access to, but also by the way we interpret this information.
When presented with a visual scene, our visual system undergoes a process of grouping visual elements together to form coherent entities so that we can interpret the scene more readily and meaningfully. For example, when looking at a pile of autumn leaves, one can still perceive and identify a whole leaf even when it is partially covered by another leaf. While Gestalt psychologists have long described perceptual organization with a set of qualitative laws, recent studies offered a statistically optimal (Bayesian, in statistical jargon) interpretation of this process, whereby the observer chooses the scene configuration with the highest probability given the available sensory inputs. However, these studies drew their conclusions without considering a key factor in this kind of statistically optimal computation: the role of sensory uncertainty. One can easily imagine that our decision on whether two contours belong to the same leaf or to different leaves is likely to change when we move from viewing the pile of leaves at a great distance (high sensory uncertainty) to viewing it very closely (low sensory uncertainty). Our study examines whether and how people incorporate uncertainty into contour integration, an elementary form of perceptual organization, by varying sensory uncertainty from trial to trial in a simple contour integration task. We found that people indeed take sensory uncertainty into account, although in a way that subtly deviates from optimal behavior.
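A generic Bayes-optimal decision rule for the collinearity task can be sketched as follows. This is a standard Bayesian formulation for illustration, not the authors' fitted model: segment offsets are assumed drawn from N(0, sig_s^2), shared under collinearity (C=1) and independent otherwise (C=0), and the observer sees each offset corrupted by measurement noise N(0, sig^2), with sig growing with retinal eccentricity.

```python
import math

# Hedged sketch of Bayes-optimal collinearity judgment under sensory noise.
# Model assumptions (priors, Gaussian noise) are illustrative, not the paper's.

def log_mvn2(y1, y2, v, c):
    """Log-density of a zero-mean bivariate normal with variance v, covariance c."""
    det = v * v - c * c
    quad = (v * y1 * y1 - 2 * c * y1 * y2 + v * y2 * y2) / det
    return -math.log(2 * math.pi) - 0.5 * math.log(det) - 0.5 * quad

def p_collinear(y1, y2, sig, sig_s, prior=0.5):
    """Posterior probability that two measured offsets share one true offset."""
    v = sig_s ** 2 + sig ** 2
    ll1 = log_mvn2(y1, y2, v, sig_s ** 2)  # C=1: shared offset -> correlated
    ll0 = log_mvn2(y1, y2, v, 0.0)         # C=0: independent offsets
    log_odds = (ll1 - ll0) + math.log(prior / (1 - prior))
    return 1 / (1 + math.exp(-log_odds))
```

The uncertainty-dependence the study tests falls out of this rule: for a fixed measured offset between the segments, increasing the noise level sig (e.g. larger eccentricity) makes the offset more attributable to noise, so the posterior probability of collinearity rises.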
  • Rahikainen, Mika; Helle, Inari; Haapasaari, Paivi; Oinonen, Soile; Kuikka, Sakari; Vanhatalo, Jarno; Mantyniemi, Samu; Hoviniemi, Kirsi-Maaria (2014)
  • Tao, Fulu; Palosuo, Taru; Rötter, Reimund P.; Díaz-Ambrona, Carlos Gregorio Hernández; Inés Mínguez, M.; Semenov, Mikhail A.; Kersebaum, Kurt Christian; Cammarano, Davide; Specka, Xenia; Nendel, Claas; Srivastava, Amit Kumar; Ewert, Frank; Padovan, Gloria; Ferrise, Roberto; Martre, Pierre; Rodríguez, Lucía; Ruiz-Ramos, Margarita; Gaiser, Thomas; Höhn, Jukka G.; Salo, Tapio; Dibari, Camilla; Schulman, Alan H. (2020)
    Robust projections of climate impact on crop growth and productivity by crop models are key to designing effective adaptations to cope with future climate risk. However, current crop models diverge strongly in their climate impact projections. Previous studies tried to compare or improve crop models regarding the impact of a single climate variable. However, this approach is insufficient, considering that crop growth and yield are affected by the interactive impacts of multiple climate change factors and multiple interrelated biophysical processes. Here, a new comprehensive analysis was conducted to look holistically at the reasons why crop models diverge substantially in climate impact projections and to investigate which biophysical processes and knowledge gaps are the key factors affecting this uncertainty and should be given the highest priority for improvement. First, eight barley models and eight climate projections for the 2050s were applied to investigate the uncertainty from crop model structure in climate impact projections for barley growth and yield at two sites: Jokioinen, Finland (Boreal) and Lleida, Spain (Mediterranean). Sensitivity analyses were then conducted on the responses of major crop processes to major climatic variables including temperature, precipitation, irradiation, and CO2, as well as their interactions, for each of the eight crop models. The results showed that the temperature and CO2 relationships in the models were the major sources of the large discrepancies among the models in climate impact projections. In particular, the impacts of increases in temperature and CO2 on leaf area development were identified as the major causes of the large uncertainty in simulating changes in evapotranspiration, above-ground biomass, and grain yield.
Our findings highlight that advancements in understanding the basic processes and thresholds by which climate warming and CO2 increases will affect leaf area development, crop evapotranspiration, photosynthesis, and grain formation in contrasting environments are needed for modeling their impacts.