Browsing by Subject "decision making"


Now showing items 1-19 of 19
  • Sandström, Saana (Helsingfors universitet, 2015)
    Manufacturing execution systems (MES) are computer systems used for controlling and automating manufacturing processes, and they are increasingly adopted in the pharmaceutical industry. Implementation solutions differ, however, and no single solution is optimal for all facilities. Each manufacturing facility has its own unique properties and needs, which have to be reflected in the implementation. A successful MES project brings many benefits, such as more efficient use of resources and automated data transfer, but the roll-out phase may prove problematic if the organization's processes have not been analysed thoroughly enough at the decision-making stage. This creates the need for a systematic analysis of possible to-be implementation scenarios that is based on the value drivers of the organization and considers the decision from multiple viewpoints. This study presents a holistic value driver-based framework with a mathematical weighting method that allows a systematic and scientifically justified identification of the optimal implementation depth of equipment management (EQM) in MES. A Delphi study method was utilized to create the framework. The framework was developed based on literature and brainstorming sessions with experts, and validated by means of a Delphi questionnaire round with an expert panel consisting of professionals representing the major stakeholders of the MES system in a pharmaceutical manufacturing facility. The classical additive weighting method was applied to create a mathematical basis for the valuation and comparison of the scenarios, and the robustness of the mathematical method was tested by means of a sensitivity analysis. A benchmarking survey was conducted to obtain information on current implementation solutions and the decisions leading to them. The presented method not only addresses costs but also takes into account intangible factors.
Intangible factors include aspects such as good manufacturing practice (GMP) quality and user acceptance, which are not directly transferable into quantitative units but are crucial both for the pharmaceutical industry and for the success of the implementation project. The framework describes the decision in the form of a value tree with three main branches, namely GMP, cost, and process & organization, which cover the main viewpoints important for the decision. The presented method also allows the weighting of different factors according to the current needs of the facility and the decision in question. Hence, the framework leads the decision maker through a systematic and comprehensive analysis of different to-be scenarios for EQM implementation. The benchmarking survey identified three major factors of a successful MES implementation: effort in the design phase, well-defined processes, and close discussion with production. The value drivers rated highest by the expert panelists were related to GMP quality. As a use case, the framework was applied in a parenterals clinical manufacturing facility to evaluate six different to-be scenarios; based on the results, one of them was selected by the management to be implemented. The results from the use case indicate that the framework is a valuable tool in a decision-making process and encourage its further utilization in future implementation decisions.
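The classical additive weighting step described in the abstract can be sketched as follows. The criteria names mirror the value tree's three main branches, but the weights and scenario scores are invented for illustration and are not those of the study.

```python
# Illustrative sketch of classical additive weighting (SAW) for comparing
# to-be implementation scenarios. All numbers below are hypothetical.

def saw_score(weights, scores):
    """Weighted sum of normalized criterion scores for one scenario."""
    assert abs(sum(weights.values()) - 1.0) < 1e-9, "weights must sum to 1"
    return sum(weights[c] * scores[c] for c in weights)

# Hypothetical criterion weights reflecting the facility's priorities.
weights = {"GMP": 0.5, "cost": 0.2, "process_org": 0.3}

# Hypothetical normalized scores (0..1) for two candidate scenarios.
scenarios = {
    "A": {"GMP": 0.9, "cost": 0.4, "process_org": 0.6},
    "B": {"GMP": 0.6, "cost": 0.8, "process_org": 0.7},
}

# Rank scenarios by their total weighted value, best first.
ranked = sorted(scenarios, key=lambda s: saw_score(weights, scenarios[s]),
                reverse=True)
```

A sensitivity analysis of the kind the abstract mentions would then perturb the weights and check whether the ranking changes.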
  • Kangas, Jyrki; Loikkanen, Teppo; Pukkala, Timo; Pykäläinen, Jouni (The Society of Forestry in Finland - The Finnish Forest Research Institute, 1996)
    The paper examines the needs, premises and criteria for effective public participation in tactical forest planning. A method for participatory forest planning utilizing the techniques of preference analysis, professional expertise and heuristic optimization is introduced. The techniques do not cover the whole process of participatory planning, but are applied as a tool constituting the numerical core of decision support. The complexity of multi-resource management is addressed by hierarchical decision analysis, which assesses public values, preferences and decision criteria with respect to the planning situation. An optimal management plan is sought using heuristic optimization. The plan can be further improved through mutual negotiations, if necessary. The use of the approach is demonstrated with an illustrative example, its merits and challenges for participatory forest planning and decision making are discussed, and a model for applying it in a general forest planning context is depicted. By using the approach, valuable information can be obtained about public preferences and about the effects of taking them into consideration on the choice of the combination of standwise treatment proposals for a forest area. Participatory forest planning calculations carried out with the approach presented in the paper can be utilized in conflict management and in developing compromises between competing interests.
  • Laine, Valtteri; Goerlandt, Floris; Banda, Osiris Valdez; Baldauf, Michael; Koldenhof, Yvonne; Rytkönen, Jorma (Elsevier, 2021)
    Marine Pollution Bulletin 171 (2021), 112724
    Several risk management frameworks have been introduced in the literature for maritime Pollution Preparedness and Response (PPR). However, in light of the actual needs of the competent authorities, there is still a lack of a framework that is established on a sound conceptual basis of risk, addresses the different risk management decision-making contexts of organizations, and provides tools for the various risk management questions of this field. To address the limitations of existing approaches, this paper introduces a new risk management framework for this purpose, developed in cooperation with the competent authorities and other maritime experts. The framework adopts the risk-informed decision-making strategy and includes three aligned components. The first component provides a unified theoretical risk concept to the framework through an interpretation of the Society for Risk Analysis risk approach. The second consists of four ISO 31000:2018 standard-based processes focused on the different risk management decision-making contexts of PPR organizations. The third comprises a set of practical risk assessment tools to generate the needed information. A case study provides an example of the functionality of the framework with integrated data from the northern Baltic Sea. In conclusion, a risk concept is provided for the PPR authorities and their stakeholders, together with processes for managing risk and tools for its assessment.
  • Maeda, Eduardo; Haapasaari, Päivi; Helle, Inari; Lehikoinen, Annukka; Voinov, Alexey; Kuikka, Sakari (2021)
    Modeling is essential for modern science, and science-based policies are directly affected by the reliability of model outputs. Artificial intelligence has improved the accuracy and capability of model simulations, but often at the expense of a rational understanding of the systems involved. The lack of transparency in black box models, artificial intelligence-based ones among them, can potentially affect trust in science-driven policy making. Here, we suggest that a broader discussion is needed to address the implications of black box approaches for the reliability of scientific advice used in policy making. We argue that participatory methods can bridge the gap between increasingly complex scientific methods and the people affected by their interpretations.
  • Räihä, Jouni; Ruokamo, Enni (Elsevier, 2021)
    Energy and Buildings 251 (2021), 111366
    Detached house owners can improve energy efficiency in heating by adding a supplementary heating system alongside the primary mode. Whereas research on primary heating mode adoption is wide, studies focusing solely on the determinants of supplementary heating system adoption is limited. This study examines the determinants of supplementary heating system adoption and consideration in Finland with a survey data collected from a sample of newly built detached house owners. We employ discrete choice modeling to investigate the homeowners’ supplementary heating system choices and interpret the results vis-à-vis the diffusion of innovations literature. The supplementary heating systems under study are solar panel, solar thermal heater, air-source heat pump and water-circulating fireplace. Overall, the findings indicate that homeowners are generally receptive to supplementary heating in Finland. The analyses show that several factors such as age, education, primary heating mode, heating system attributes, location, environmental attitudes and information channels impact the supplementary heating system adoption decision.
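The discrete choice modeling mentioned in the abstract is typically built on logit-type choice probabilities. A minimal multinomial-logit sketch is shown below; the alternatives match those in the abstract, but the utility values are invented for illustration and are not estimates from the study.

```python
import math

# Multinomial logit: the probability of choosing alternative i is
# exp(V_i) / sum_j exp(V_j), where V is the deterministic utility.
def logit_probs(utilities):
    exps = {alt: math.exp(v) for alt, v in utilities.items()}
    total = sum(exps.values())
    return {alt: e / total for alt, e in exps.items()}

# Hypothetical utilities for the four supplementary heating options.
V = {"solar_panel": 1.2, "solar_thermal": 0.4,
     "air_source_heat_pump": 1.5, "water_circulating_fireplace": 0.8}

probs = logit_probs(V)
most_likely = max(probs, key=probs.get)
```

In an actual study the utilities would be linear functions of the homeowner and system attributes (age, education, location, and so on), with coefficients estimated from the survey data.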
  • Tötterman, Henrik (Svenska handelshögskolan, 2008)
    Economics and Society
    This study focuses on self-employed industrial designers and how they develop new venture ideas. More specifically, it strives to determine what design entrepreneurs do when they create new venture ideas, how venture ideas are nurtured into being, and how the processes are organized to bring such ideas to the market in the given industrial context. At a time when concern for the creative class is peaking, the research and business communities need more insight of the kind this study provides, namely how professionals may contribute to their own entrepreneurial processes and to other agents' business processes. On the one hand, the interviews underlying this study suggest that design entrepreneurs may act as reactive service providers who are appointed by producers or marketing parties to generate product-related ideas on their behalf. On the other hand, the interviews suggest that proactive behaviour aimed at generating their own venture ideas may force design entrepreneurs to take considerable responsibility for organizing their entrepreneurial processes. Another option is that they strive to bring venture ideas to the market in collaboration, or by passing them on to other agents' product development processes. Design entrepreneurs' venture ideas typically emerge from design-related starting points and observations. Product developers are mainly engaged in creating their own ideas, whereas service providers refer mainly to the development of other agents' venture ideas. In contrast with design entrepreneurs, external actors commonly emphasize customer demand as their primary source of new venture ideas, as well as the development of these in close interaction with the available means of production and marketing. Consequently, design entrepreneurs need to address market demand, since without sales their venture ideas may as well be classified as art.
If they want to experiment with creative ideas, there should be another source of income to support this typically uncertain and extensive process. Currently, it appears that many good venture ideas and resources are being wasted when venture ideas do not suit available production or business procedures. Sufficient communication between design entrepreneurs and other agents would assist all parties in developing production-efficient and distributable venture ideas. Overall, the findings suggest that design entrepreneurs are often involved simultaneously in several processes that aim at creating new product-related ventures. Consequently, design entrepreneurship is conceptualized in this study as a dual process. This implies that design entrepreneurs can simultaneously be in charge of their own entrepreneurial processes while operating as resources in other agents' business processes. The interconnection between activities and agents suggests that these kinds of processes tend to be both complex and multifaceted in nature.
  • Rosengren, L.M.; Raymond, C.M.; Sell, M.; Vihinen, H. (2020)
    Leverage points from systems research are increasingly important to understand how to support transformations towards sustainability, but few studies have considered leverage points in strengthening adaptive capacity to climate change. The existing literature mainly considers strengthening adaptive capacity as a steady and linear process. This article explores possibilities to fast track positive adaptive capacity trajectories of small-scale farmers in the Northern Region of Ghana. Leverage points were identified by triangulating data from semi-structured interviews with farmers (n=72), key informant interviews (n=7) and focus group discussions (FG1 n=17; FG2 n=20). The results present two ways to approach adaptation planning: 1) using four generic leverage points (gender equality, social learning, information and knowledge, and access to finance) or 2) combining the adaptive capacity and leverage point frameworks, thereby creating 15 associations. The generic points provide a set of topics as a starting point for policy and intervention planning activities, while the 15 associations support the identification of place-specific leverage points. Four benefits of using leverage points for adaptive capacity in adaptation planning were identified: guidance on where to intervene in a system, ability to deal with complex systems, inclusion of both causal and teleological decision-making, and a possibility to target deep, transformative change. © 2021 The Author(s). Published by Informa UK Limited, trading as Taylor & Francis Group.
  • Orenius, Olli (Helsingfors universitet, 2015)
    People tend to first look evenly at both objects when making a decision between them. Gaze then starts to be directed more towards the object which is eventually chosen, before the conscious selection is made. One explanation for the phenomenon is the gaze cascade model, which states that the directing of gaze is related to making decisions based on preference. It also states that the gaze bias is influenced by the cognitive models people have about the perceived stimulus: gaze bias should be greater the less previous experience, i.e. the fewer cognitive models, one has about the objects. This study evaluates these two assumptions of the gaze cascade model. 64 subjects participated in the experiment, of whom 54 subjects' data were used in the final analysis (average age 27.7, range 18–47 years). Stimuli consisted of images of cheese packages sold in Finland, images of cheese packages sold abroad, and abstract images. The assumption was that the subjects would have the most cognitive models of the packages sold in Finland, the second most of the packages sold abroad, and the least of the abstract images. Subjects made choices about the stimuli by preference, size and ecology. The direction of gaze during decision making was recorded with gaze tracking goggles. The likelihood that the subjects were looking at the stimulus they chose was estimated for 53 sampling points covering the 1.77 seconds before the conscious decision. The likelihood scores were fitted to sigmoid functions by the least-squares method. The amount of gaze bias under different decision making instructions and stimuli was compared with a two-sample Kolmogorov-Smirnov test. Gaze bias was greater the less previous experience the subjects had of the stimuli. A large gaze bias was observed also in tasks other than preference selection; for this reason the gaze cascade model does not seem to provide a good explanation for gaze bias during decision making.
An interaction was also found between the decision criterion used and the stimulus type. Earlier studies have usually focused on examining either the effect of the decision criterion or of the stimulus type on the gaze bias. This study shows that the interaction between decision criterion and stimulus type should be taken into consideration when examining gaze bias during decision making. One possible explanation for the interaction might be the difficulty of the choice: gaze bias during decision making might be especially related to situations where the difference between two stimuli, evaluated by the given criterion, is very small.
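The two-sample Kolmogorov-Smirnov comparison used in the study above can be sketched in a few lines: the KS statistic is the maximum absolute difference between the two empirical cumulative distribution functions. The gaze-bias samples below are invented for illustration.

```python
import bisect

def ks_statistic(sample_a, sample_b):
    """Two-sample Kolmogorov-Smirnov statistic: the maximum absolute
    difference between the two empirical CDFs."""
    a, b = sorted(sample_a), sorted(sample_b)

    def ecdf(sorted_sample, x):
        # Fraction of observations <= x.
        return bisect.bisect_right(sorted_sample, x) / len(sorted_sample)

    # The supremum is attained at one of the observed values.
    return max(abs(ecdf(a, x) - ecdf(b, x)) for x in a + b)

# Invented gaze-bias likelihood samples from two task conditions.
preference_task = [0.55, 0.60, 0.70, 0.80, 0.90]
size_task = [0.50, 0.52, 0.55, 0.58, 0.60]
d = ks_statistic(preference_task, size_task)
```

A large statistic indicates that the two gaze-bias distributions differ; in practice the statistic would be compared against a critical value or converted to a p-value.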
  • Lindén, Carl-Gustav (Helsinki University Press, 2021)
    Kingdom of Nokia tells a fascinating story of corporatism in Finland. How did the mobile phone giant Nokia make the Finnish elite willing to serve the interests of the company? Nokia became a global player in mobile communications in the 1990s and helped establish Anglo-Saxon capitalism in Finland. Through its success and strong lobbying, the company managed to capture the attention of Finnish politicians, civil servants, and journalists nationwide. With concrete, detailed examples, Kingdom of Nokia illustrates how Nokia organised lavish trips for journalists and paid direct campaign funding to politicians to establish its role at the core of Finnish decision-making. As a result, the company influenced important political decisions such as joining the European Union and adopting the euro, and Nokia even drafted its own law to serve its special interests. All this in a country considered one of the least corrupt in the world. Carl-Gustav Lindén is an Associate Professor of Data Journalism at the University of Bergen and Associate Professor (Docent) at the University of Helsinki. Lindén's background is in journalism, and he was a business journalist working for newspapers, magazines, and television until 2012, when he turned to academia.
  • Vanhatalo, Kalle O. (Helsingfors universitet, 2012)
    Reliable forest inventory data creates the foundation for quality forest planning. The quality of forest inventory data is emphasized when planning is made as close to optimal as possible with respect to the aims. For the choice and timing of measures, it is important that the forest variables used in decision making are as accurate as possible. Unreliable starting-point information, and the wrong conclusions drawn from it, may lead to inoptimality losses; additional forest surveys also bring unnecessary expenses. Forest inventory is a financial investment for forest owners. One should not be content with information that is inexpensive but too inaccurate, for the inoptimality losses can rise higher than the investment expenses. This thesis studied the significance of quality in forest inventory information. The aim was to find out how different levels of error in forest inventory data affect both the choice and the timing of cuttings in forests of various structures. The thesis aimed to find limits for the quality requirements of forest inventory information which enable forest planning that is compatible with the aims. The quality aims of the planning were set according to the decision maker and the commissioner of the research, UPM-Kymmene plc. The research material consisted of a set of 337 stands provided by UPM Forest. The development of the stands was simulated with the SIMO software. The simulation was made first by assuming that the forest inventory data were flawless, and then by adding error to them. The forest variables into which error was added were the basal area, average diameter, average height and site index. The simulations were made with single errors and with combined errors. In the single-error case, error was added systematically to one variable, per cent by per cent, over the range −30% to +30%. In the combined case, the error in one variable was added systematically and in the others randomly.
The errors in the different variables were assumed not to correlate. The measures and their timings planned for each stand with the reference data were compared to the plans obtained with the inaccurate starting-point information. The planning period was ten years and the planned actions were thinning, clearcutting and no action. Error in the initial data clearly lessened the quality of planning. Over- or underestimations of more than 10% simulated into the basal area or average diameter alone caused the average accuracy of cutting measures to fall below the target level of 90%. As the error grew, the planning result weakened further. Errors in average height had only a minor effect on the quality of planning, whereas errors in the site index were significant. The site index error was clearly more damaging in pine forests than in spruce forests; the likely reason is that pine forests have three different thinning models and renewal limits (ct, vt, mt), while spruce forests have only two (mt, omt). Based on the results, relatively small errors in basal area, average diameter and site index can cause a deviation of several years in cutting planning. The significance of the error varied a great deal between forests of different structure. For measure planning, the forest inventory data gathered from well-managed young forests need not be especially accurate, for the next measure usually lies further in the future than the next inventory. The accuracy demands are also not great in overly dense forests or in forests which have clearly surpassed the renewal limit, since in these cases there is no uncertainty about the next measure. In inventory accuracy, the most attention should be paid to stands with young and more mature forest cover whose last forestry measure was carried out at least ten years ago.
The research problem was approached from the point of view of measure accuracy. In the future, however, it would be useful to study the effect of error in forest inventory information from the point of view of economic gain. That would allow more factors to be included in the research, such as timber logistics and the planning of stands to be cut, for which more accurate forest inventory data would be useful.
  • Pesonen, Mauno; Kettunen, Arto; Räsänen, Petri (The Society of Forestry in Finland - The Finnish Forest Research Institute, 1995)
    The factors affecting non-industrial, private forest landowners' (hereafter referred to by the acronym NIPF) strategic decisions in management planning are studied. A genetic algorithm is used to induce a set of rules predicting the potential cut of the landowners' preferred timber management strategies. The rules are based on variables describing the characteristics of the landowners and their forest holdings. The predictive ability of the genetic algorithm is compared to linear regression analysis using identical data sets. The data are cross-validated seven times, applying both genetic algorithm and regression analyses, in order to examine the data sensitivity and robustness of the generated models. The optimal rule set derived from the genetic algorithm analyses included the following variables: mean initial volume, landowner's positive price expectations for the next eight years, landowner being classified as a farmer, and preference for the recreational use of the forest property. When tested with previously unseen test data, the optimal rule set resulted in a relative root mean square error of 0.40. In the regression analyses, the optimal regression equation consisted of the following variables: mean initial volume, proportion of forestry income, intention to cut extensively in the future, and positive price expectations for the next two years. The R2 of the optimal regression equation was 0.34, and the relative root mean square error obtained from the test data was 0.38. In both models, mean initial volume and positive stumpage price expectations entered as significant predictors of the potential cut of the preferred timber management strategy. When tested with the complete data set of 201 observations, both the optimal rule set and the optimal regression model achieved the same level of accuracy.
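Both models in the abstract above are scored with a relative root mean square error. The abstract does not spell out the exact formula, so the sketch below assumes one common definition: the RMSE of the predictions divided by the mean of the observed values. The data are invented for illustration.

```python
import math

def relative_rmse(observed, predicted):
    """RMSE divided by the mean of the observed values (one common
    definition of relative RMSE; assumed here, not taken from the paper)."""
    n = len(observed)
    rmse = math.sqrt(sum((o - p) ** 2 for o, p in zip(observed, predicted)) / n)
    return rmse / (sum(observed) / n)

# Invented potential-cut observations and model predictions (m3/ha).
obs = [100.0, 150.0, 200.0]
pred = [110.0, 140.0, 190.0]
rr = relative_rmse(obs, pred)
```

With this definition, the reported values of 0.40 and 0.38 would mean prediction errors of roughly 40% of the mean observed cut.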
  • Pietilä, Ilona (Helsingfors universitet, 2009)
    Forest planning decision making needs information about stands and their future development. This information is collected through inventories. In general, an inventory is repeated at intervals set in advance, irrespective of the method. Between inventories, the information is updated with growth models. Both the inventory and the use of growth models cause errors in forest planning results, for example in management options. Erroneous predictions can lead to wrong conclusions and inoptimal decisions. If the optimal result is known, the economic losses caused by wrong conclusions can be described with so-called inoptimality losses. The aim of this study was to answer the question of how long forest inventory information, updated with growth models, can be used for forest planning purposes. The approach of the study was economic, so the evaluation of the information's usefulness was based on the inoptimality losses which arise when the development of a stand is predicted incorrectly with growth models. The study material included 99 stands. Their development was simulated with the SIMO software for 60 years from the present. Within this 60-year period, the influence of growth prediction errors was studied with inventory periods of 5, 10, 15, 20, 30 and 60 years. It was assumed that new, error-free forest inventory information was received at the beginning of each inventory period. In order to study the effects of the different inventory periods, it was assumed that the growth models were able to predict the true development of the stands. Erroneous developments were produced with an error model which was developed for this study and added to the growth models. The inoptimality losses were calculated from the information derived from the optimization of the stands' true and erroneous developments. The inoptimality losses increased as the inventory period became longer.
The absolute inoptimality loss was approximately 230 eur/ha when the inventory period was 5 years and approximately 860 eur/ha when the inventory period was 60 years. The relative inoptimality loss was 3.3% when the inventory period was 5 years and 11.6% when it was 60 years. The average inoptimality losses differed between development classes, site classes and main tree species. The results show that the length of the updating period has an effect on the economic losses that develop. It also seems that the inventory period should differ, for example, between development classes. However, it is difficult to specify the optimal updating period, because the total losses are a sum of the losses from inventory errors, the losses from growth prediction errors and the losses caused by other sources of uncertainty. The effects of both inventory errors and growth prediction errors differ in different kinds of stands, so estimating the total losses and the inoptimality losses caused by the different error sources requires more research.
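The inoptimality-loss idea used in the two forest planning theses above can be sketched as follows: the loss is the value forgone when the plan chosen on the basis of erroneous data is executed in the true forest. The candidate plans and their net present values below are invented for illustration.

```python
# Inoptimality loss: difference between the best achievable value under
# the TRUE stand data and the true-data value of the plan actually
# chosen using erroneous data. All numbers are hypothetical (eur/ha).

def inoptimality_loss(true_values, chosen_plan):
    """true_values: NPV of each candidate plan under the true stand data.
    chosen_plan: the plan picked when planning on erroneous data."""
    optimal = max(true_values.values())
    return optimal - true_values[chosen_plan]

true_npv = {"thin_now": 5200.0, "clearcut_now": 5600.0, "no_action": 5000.0}

# Suppose an underestimated basal area made "no_action" look best:
loss = inoptimality_loss(true_npv, "no_action")
```

If the erroneous data happen to point at the truly optimal plan, the loss is zero, which is why small inventory errors do not always matter.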
  • Lehtomäki, Joona Aleksi; Moilanen, Atte Jaakko; Toivonen, Tuuli Kaarina; Leathwick, John (University of Helsinki, 2016)
    Zonation is decision support software for land use planning, including uses such as the traditional design of conservation area networks or spatial impact avoidance. It is capable of data-rich, large-scale, high-resolution spatial prioritization. Whether using Zonation for scientific research or in real-life planning, running a successful project involves several project stages that often proceed in a somewhat iterative fashion. This document provides an overview of what those stages are and what types of issues should be considered when planning to use Zonation. This information is intended for any individual or organization that is considering making use of Zonation. The topics addressed here are those encountered before and after the Zonation prioritization analysis itself: budgeting for time and resources; setting objectives and planning how to meet them; building a model for spatial prioritization; data requirements and pre-processing; setting up and organizing Zonation input files; visualizing and interpreting the results; and creating planning products. Zonation projects are also demonstrated through a set of case studies that range from national to global scales.
  • Sääksvuori, Lauri (2007)
    Markets are a necessary prerequisite for human development. The freehold of property and the freedom of exchange are the bedrock of individual and societal well-being. However, economic research has shown that markets do not allocate goods efficiently under asymmetric information. Affluence achieved through free markets depends on others whose behavior we do not know or even fully understand. Conventionally, attempts to solve the problems of imperfect information have relied on jurisdiction and the establishment of hierarchical organizations. The rise of the Internet has lately revolutionized the customs of social and economic exchange. Electronic marketplaces span the boundaries of culturally and juridically inconsistent territories; as a result, the prevailing contract monitoring turns out to be inadequate. Should virtual exchange obey existing laws, the transaction costs may exceed the benefits of trade and thus prevent otherwise mutually valuable transactions. In this study, we examine the conditions for endogenously emerging markets based on trust and reputation. The analysis focuses on the effects of different forms of feedback information in markets that suffer from moral hazard due to sequential trading. The study presents data-oriented evidence on why and when people trust each other in economic transactions. Electronic markets, particularly electronic auctions, are presented as the primary application context for the feedback system based on trust and reputation. The experimental data for the research were collected in a laboratory experiment taking advantage of a newly designed and implemented computer application. The participants in the experimental sessions were all students at the University of Helsinki. The contribution of the thesis is threefold. Firstly, we further develop the idea of tailored trustworthiness aggregates. Secondly, we introduce a novel extensive-form game to model trust decisions with endogenous payoff formation.
This game design unites the ordinary Trust Game with auctions. Thirdly, based on the unique data from the experiment, we examine the motivation behind the individual's trust decision. The experimental results in this study demonstrate that, in an economic exchange, the economic agent behaves simultaneously both fairly and selfishly. Furthermore, the expression of mixed motives appears to be sensitive to variations in the flow of information. The data collected for this study clearly indicate that the augmentation of information improves the economic efficiency of endogenously organized marketplaces. Market efficiency does not require a large number of participants, complete information or full economic understanding, but incentives to trust each other.
  • Hahtola, Kauko (Suomen metsätieteellinen seura, 1973)
  • Koski, Vilja; Kotamäki, Niina; Hämäläinen, Heikki; Meissner, Kristian; Karvanen, Juha; Kärkkäinen, Salme (Elsevier, 2020)
    Science of the Total Environment 726 (2020), 138396
    Uncertainty in the information obtained through monitoring complicates decision making about management actions for aquatic ecosystems. We suggest using the value of information (VOI) to assess the profitability of paying for additional monitoring information, taking into account the costs and benefits of monitoring and management actions, as well as the associated uncertainty. Estimating the monetary value of the ecosystem needed for deriving the VOI is challenging; therefore, instead of considering a single value, we evaluate the sensitivity of the VOI to a varying monetary value. We also extend the VOI analysis to the more realistic context where additional information results not in perfect but in imperfect information on the true state of the environment. First, we analytically derive the value of perfect information in the case of two alternative decisions and two states of uncertainty. Second, we describe a Monte Carlo type of approach to evaluate the value of imperfect information about a continuous classification variable. Third, we determine confidence intervals for the VOI with a percentile bootstrap method. The results for our case study on 144 Finnish lakes suggest that, in general, the value of monitoring exceeds its cost. It is particularly profitable to monitor lakes that meet the quality standards a priori, to ascertain that expensive and unnecessary management can be avoided. The VOI analysis provides a novel tool for lake and other environmental managers to estimate the value of additional monitoring data for a particular single case, e.g. a lake, when an additional benefit is attainable through remedial management actions.
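The two-decision, two-state value of perfect information described in the abstract above can be sketched in a few lines. The payoffs (net benefits of managing versus not managing a lake in a good or poor state) and the prior probability are invented for illustration.

```python
# Expected value of perfect information (EVPI) for two actions and two
# states: the gain from learning the true state before acting, compared
# with acting on prior probabilities alone. All numbers are hypothetical.

def evpi(p_poor, payoffs):
    """payoffs[action][state] = net benefit; states are 'good'/'poor'."""
    p_good = 1.0 - p_poor
    # Best expected value achievable on prior information alone.
    ev_prior = max(
        p_good * payoffs[a]["good"] + p_poor * payoffs[a]["poor"]
        for a in payoffs
    )
    # With perfect information, the best action is taken in each state.
    ev_perfect = (p_good * max(payoffs[a]["good"] for a in payoffs)
                  + p_poor * max(payoffs[a]["poor"] for a in payoffs))
    return ev_perfect - ev_prior

payoffs = {
    "manage":    {"good": -50.0, "poor": 100.0},  # costly, pays off if poor
    "no_action": {"good":   0.0, "poor": -80.0},  # free, bad if lake is poor
}
value_of_monitoring = evpi(0.3, payoffs)
```

Monitoring is worth buying only if its cost is below this value; the imperfect-information case in the paper replaces the certain state signal with a noisy one.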
  • Suurnäkki, Jessi (Helsingin yliopisto, 2019)
    The Internet plays a major role in facilitating consumers' information search. What visitors do online, and how their behaviour can be predicted and influenced, are among the most important questions that website developers and marketing specialists try to answer. A comprehensive understanding of online behaviour has become a necessity for the success of websites. As more and more websites offer similar products to consumers, it is important to increase a website's usability and make it easy to find. While some consumers go directly to websites, some use search engines to reach the desired websites, and others browse websites via referral links. The main research question of this thesis was what kind of insight web analytics reveals about the characteristics of website visitors. The aim of this case study was to highlight, with different metrics, the amount of traffic the current marketing tactics have generated and to discuss possible improvements. The theoretical framework of the study was based on the key aspects of the consumer decision process, which focuses on the search for information and the comparison of alternatives before the actual purchase transaction. The study of online purchase behaviour and information search by Comegys et al. (2006) played a central role in the theoretical framework. As understanding and analysing key metrics was pivotal in this research, Plaza's (2009) views on the importance of adopting key metrics served as the basis for the main research question. A pharmaceutical company was the cooperating company of this thesis. The research was based on a case study inspecting the web analytics data of the case company's Finnish and Danish websites. The characteristics of website visitors were investigated through Google Analytics, and visitors' actions on the sites were also examined according to their traffic source.
All the relevant data concerning consumer behaviour and the decision process were collected from Google Analytics. The case study examined eleven different web metrics collected from the two websites over a period of twelve months. With the Google Analytics data, the number of visits to a website and the source of traffic, including organic results from search engines, links from referral web pages and direct access by entering the URL into the web browser, were analysed. These three traffic sources were viewed as reflecting different kinds of consumer behaviour. The findings showed that for the case company's Finnish website, visits generated by search engines brought the most traffic among the selected traffic sources, followed by direct traffic and referral traffic. For the Danish website, organic traffic remained the largest source, while referral visits formed the second-largest group, leaving direct traffic as the smallest. As the theoretical literature proposed, return visitors spent a longer time on the site and viewed more pages than new visitors. Both visitor groups, new and returning, spent more time on the website when they arrived via direct traffic. The findings show that the case company's websites were highly comparable: key findings relating to traffic sources and demographics showed only minor differences, so the two sites can be compared with confidence.
  • Ravaja, Niklas; Bente, Gary; Kätsyri, Jari; Salminen, Mikko; Takala, Tapio (2018)
    We examined the effects of the emotional facial expressions of a virtual character (VC) on human frontal electroencephalographic (EEG) asymmetry (putatively indexing approach/withdrawal motivation), facial electromyographic (EMG) activity (emotional expressions), and social decision making (cooperation/defection). In a within-subjects design, the participants played the Iterated Prisoner's Dilemma game with VCs with different dynamic facial expressions (predefined or dependent on the participant's electrodermal and facial EMG activity). In general, VC facial expressions elicited congruent facial muscle activity. However, both frontal EEG asymmetry and facial EMG activity elicited by an angry VC facial expression varied as a function of preceding interactional events (human collaboration/defection). Pre-decision inner emotional-motivational processes and emotional facial expressions were dissociated, suggesting that human goals influence pre-decision frontal asymmetry, whereas display rules may affect (pre-decision) emotional expressions in human-VC interaction. An angry VC facial expression, high pre-decision corrugator EMG activity, and relatively greater left frontal activation predicted the participant's decision to defect. Both post-decision frontal asymmetry and facial EMG activity were related to reciprocal cooperation. The results suggest that the justifiability of VC emotional expressions and the perceived fairness of VC actions influence human emotional responses.