Browsing by Subject "cloud computing"


Now showing items 1-6 of 6
  • Münch, Jürgen; Department of Computer Science; Software Systems Engineering research group / Jürgen Münch; Empirical Software Engineering research group (2013)
    The seminar on cloud-based software engineering in 2013 covered many interesting topics related to cloud computing and software engineering. These proceedings focus on decision support for moving to the cloud, on the opportunities that cloud computing offers software engineering, and on the security aspects associated with cloud computing.

    Moving to the Cloud – Options, Criteria, and Decision Making: Cloud computing can enable or facilitate software engineering activities through the use of computational, storage, and other resources over the network. Organizations and individuals interested in cloud computing must balance the potential benefits against the associated risks. It might not always be worthwhile to transfer existing services and content to external or internal, public or private clouds, for a number of reasons. Standardized information and metrics from cloud service providers can help in deciding which provider to choose. The decision deserves care, as switching from one provider to another can be burdensome due to incompatibilities between providers. Hardware in data centers is not infallible: the equipment that powers cloud computing services is as prone to failure as any computing equipment put under high stress, which can affect the availability of services.

    Software Engineering – New Opportunities with the Cloud: Public and private clouds can serve as platforms for the services that third parties produce, but cloud computing resources and services can also be helpful during software development itself. Tasks like testing or compiling, which might take a long time to complete on a single local workstation, can be shifted to run on network resources for improved efficiency. Collaborative tools that take advantage of cloud computing features can also boost communication in software development projects spread across the globe.

    Security in the Cloud – Overview and Recommendations: In an environment where resources can be shared with other parties and are controlled by a third party, security is a matter that needs to be addressed. Without encryption, data stored in third-party-owned network storage is vulnerable, so secure mechanisms are needed to keep the data safe.

    The student seminar was held during the 2013 spring semester, from January 16th to May 24th, at the Department of Computer Science of the University of Helsinki. There were a total of 16 papers in the seminar, of which 11 were selected for the proceedings based on their suitability to the three themes. In some cases, papers were excluded so that they could be published elsewhere. A full list of all the seminar papers can be found in the appendix. We wish you an interesting and enjoyable reading experience with the proceedings.
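    The idea of shifting long-running development tasks such as testing off the local workstation can be illustrated with a short sketch. This is a minimal local stand-in in Python with hypothetical test module names; a real cloud setup would dispatch each job to a remote worker rather than a local process:

        # Sketch: fan a slow test suite out across a worker pool instead of
        # running it serially on one workstation. Module names are hypothetical;
        # a cloud-backed variant would replace the local process pool with
        # remote executors (VMs or containers).
        import subprocess
        from concurrent.futures import ProcessPoolExecutor

        TEST_MODULES = ["tests/test_api.py", "tests/test_db.py", "tests/test_ui.py"]

        def run_suite(module):
            # Each suite runs in its own process; on a cloud backend this call
            # would submit the job over the network instead.
            proc = subprocess.run(["python", "-m", "pytest", module])
            return module, proc.returncode

        if __name__ == "__main__":
            with ProcessPoolExecutor() as pool:
                for module, code in pool.map(run_suite, TEST_MODULES):
                    print(f"{module}: {'OK' if code == 0 else 'FAILED'}")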
  • Sarapalo, Joonas (Helsingin yliopisto, 2020)
    The page hit counter system processes, counts, and stores page hit counts gathered from page hit events on a news media company's websites and mobile applications. The system serves a public application interface that can be queried over the internet for page hit count information. In this thesis I describe the process of replacing a legacy page hit counter system with a modern implementation in the Amazon Web Services ecosystem utilizing serverless technologies. The process covers the background, the project requirements, the design and comparison of different options, the implementation details, and the results. Finally, I show that the new system, implemented with Amazon Kinesis, AWS Lambda, and Amazon DynamoDB, has running costs less than half those of the old one.
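    As a rough illustration of the architecture named in the abstract (not the thesis's actual code), a minimal AWS Lambda handler in Python could consume hit events from a Kinesis stream and accumulate them in DynamoDB with an atomic ADD. The table name, key schema, and event payload shape below are assumptions:

        # Hypothetical Lambda handler: count page hits arriving on a Kinesis
        # stream and accumulate them in DynamoDB. Table name "PageHitCounts",
        # the "page_id" key, and the event payload shape are all assumed.
        import base64
        import json
        import boto3

        dynamodb = boto3.resource("dynamodb")
        table = dynamodb.Table("PageHitCounts")

        def handler(event, context):
            # Pre-aggregate hits per page within the batch to reduce writes.
            counts = {}
            for record in event["Records"]:
                payload = json.loads(base64.b64decode(record["kinesis"]["data"]))
                counts[payload["page_id"]] = counts.get(payload["page_id"], 0) + 1
            # ADD is atomic and creates the attribute if it does not exist yet,
            # so concurrent invocations cannot lose counts.
            for page_id, hits in counts.items():
                table.update_item(
                    Key={"page_id": page_id},
                    UpdateExpression="ADD hit_count :n",
                    ExpressionAttributeValues={":n": hits},
                )

    Pre-aggregating within each batch keeps DynamoDB write costs proportional to distinct pages rather than raw events, which matters in a cost-driven redesign like the one described.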
  • Kovala, Jarkko (Helsingin yliopisto, 2020)
    The Internet of Things (IoT) has the potential to transform many domains of human activity, enabled by the collection of data from the physical world at a massive scale. As the projected growth of IoT data exceeds that of available network capacity, transferring it to centralized cloud data centers is infeasible. Edge computing aims to solve this problem by processing data at the edge of the network, enabling applications with specialized requirements that cloud computing cannot meet. The current market of platforms that support building IoT applications is very fragmented, with offerings available from hundreds of companies and no common architecture. This threatens the realization of IoT's potential: with more interoperability, a new class of applications that combine the collected data and use it in new ways could emerge. In this thesis, promising IoT platforms for edge computing are surveyed. First, an understanding of current challenges in the field is gained by studying the available literature on the topic. Second, the IoT edge platforms with the most potential to meet these challenges are chosen and reviewed for their capabilities. Finally, the platforms are compared against each other, with a focus on their potential to meet the challenges identified in the first part. The work shows that AWS IoT for the edge and Microsoft Azure IoT Edge have mature feature sets. However, these platforms are tied to their respective cloud platforms, limiting interoperability and the possibility of switching providers. On the other hand, the open source EdgeX Foundry and KubeEdge have the potential to bring more standardization and interoperability to IoT but are limited in functionality for building practical IoT applications.
  • Lindell, Rony (Helsingfors universitet, 2016)
    Next-generation sequencing (NGS) has evolved over the past 10 years to become the go-to method for genome-wide analysis projects. Based on parallelizable PCR methods adopted from traditional Sanger sequencing, NGS platforms can produce massive amounts of genetic information in a single run and read an entire DNA molecule within a day. The immense amount of nucleotide sequence data produced by a single sample has brought us to an era of algorithmic optimization for analysis and of figuring out parallelization schemes. For cohort projects, cloud-based systems are generally used due to vast computing power requirements. Anduril is an integration and parallelization framework well suited for NGS analysis, as this study shows. After a brief review of the gold-standard methods of NGS analysis, we describe the incorporation of the main tools into the new sequencing bundle for Anduril. Tools for alignment (BWA, Bowtie), recalibration (GATK, Picard tools) and variant calling (GATK, Samtools, VarScan) are the main focus. The Best Practices of the Broad Institute, creators of the Genome Analysis Toolkit (GATK), have been a big inspiration in the creation of our sequencing pipeline. The evolution of the sequencing bundle tools into a pipeline is discussed through three separate project examples. First, a small group of 8 chronic myeloid leukemia patient samples was analysed after implementation of the main tools of the pipeline. The results were consistent with previous results, but no novel relevant mutations were found. Second, exome sequencing data from 180 breast cancers with controls, available in TCGA (The Cancer Genome Atlas), were processed for use in various projects in our lab. The example showed the power of Anduril in large cohort analysis projects, enabled by automatic parallelization and an intelligent workflow management system. Third, we analysed exome data from 330 TCGA ovarian cancers with controls and created a prototypical set of database components for building a database of annotated variants for use in analytical queries. Compared to other integration frameworks (e.g. GATK, Crossbow and Hadoop), Anduril is a robust contender for the programming-oriented scientist. As cloud computing is increasingly becoming a requirement in large genome-wide analysis projects, Anduril provides an effective, generalizable framework for adding tools, creating pipelines and executing entire workflows on multi-node computing servers. As technology advances and available computational resources grow, fast multi-processor analysis can be incorporated into health care more and more, for detection of disease-causing genes, polymorphisms that alter medication kinetics, and cancer-driving mutations in an everyday setting.
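    For readers unfamiliar with the tools named above, the core alignment-to-variant-calling path can be sketched as shell commands driven from Python. File names are hypothetical, recalibration (GATK/Picard) and Anduril's own component wrappers and parallelization are omitted, and the mpileup syntax shown is the Samtools-era form in use around the time of this thesis:

        # Illustrative sketch of the BWA -> Samtools -> variant-calling stages;
        # inputs are hypothetical and this is not the thesis's actual pipeline.
        import subprocess

        def run(cmd):
            print("+", cmd)
            subprocess.run(cmd, shell=True, check=True)

        ref, r1, r2 = "ref.fa", "sample_1.fq", "sample_2.fq"  # hypothetical inputs

        run(f"bwa index {ref}")                        # build the reference index once
        run(f"bwa mem {ref} {r1} {r2} > sample.sam")   # paired-end alignment
        run("samtools sort -o sample.bam sample.sam")  # coordinate-sort to BAM
        run("samtools index sample.bam")               # index for random access
        run(f"samtools mpileup -uf {ref} sample.bam | bcftools call -mv > sample.vcf")

    A framework like Anduril wraps each such stage as a component so that independent samples can be fanned out automatically across compute nodes instead of being scripted serially as here.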
  • Harri, Ari-Matti; Schmidt, Walter; Romero, Pilar; Vázquez, Luis; Barderas, Gonzalo; Kemppinen, Osku; Aguirre, Carlos; Vázquez-Poleti, Jose Luis; Llorente, Ignacio M.; Haukka, Harri; Paton, Mark (2012)
    Raportteja - Rapporter - Reports 2012:2
    We present a general approach to the study of solar eclipses by Phobos on Mars: their parameterization and prediction. The model and the involved parameters are validated against eclipses already observed by previous Mars missions. Eclipse prediction is applied to the past Mars lander missions Viking, Pathfinder and Phoenix, as well as to the future Mars MetNet Precursor Mission. Successful detection of eclipses could be used for the localization of landers and to study atmospheric properties. We also consider the data analysis, with special emphasis on the tomographic method for identifying events that are very localized in space and time. The implemented methods have large computational requirements; for this purpose, an efficient cloud computing network infrastructure has been used.
  • Mohan, Nitinder; Corneo, Lorenzo; Zavodovski, Aleksandr; Bayhan, Suzan; Wong, Walter; Kangasharju, Jussi (ACM, 2020)
    Edge computing has gained attention from both academia and industry by addressing two significant challenges: 1) moving latency-critical services closer to the users, and 2) saving network bandwidth by aggregating large flows before sending them to the cloud. While the rationale appeared sound at its inception almost a decade ago, several current trends are impacting it. Clouds have spread geographically, reducing end-user latency; mobile phones' computing capabilities are improving; and network bandwidth at the core keeps increasing. In this paper, we scrutinize edge computing, examining its outlook and future in the context of these trends. We perform extensive client-to-cloud measurements using RIPE Atlas, and show that latency reduction as a motivation for edge is not as persuasive as once believed; for most applications the cloud is already 'close enough' for the majority of the world's population. This implies that edge computing may only be applicable for certain application niches, as opposed to being a general-purpose solution.
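    The kind of client-to-cloud latency measurement described can be illustrated against the public RIPE Atlas REST API. In this Python sketch the measurement ID is hypothetical, and the result fields follow the API's documented ping format:

        # Sketch: fetch the results of one RIPE Atlas ping measurement and
        # summarize round-trip times. Each result entry is one probe's ping,
        # with "avg" the mean RTT in milliseconds (negative when all packets
        # were lost).
        import json
        import statistics
        import urllib.request

        MEASUREMENT_ID = 12345678  # hypothetical ping measurement toward a cloud region
        url = f"https://atlas.ripe.net/api/v2/measurements/{MEASUREMENT_ID}/results/"

        with urllib.request.urlopen(url) as resp:
            results = json.load(resp)

        # Keep only probes that got replies, then summarize.
        rtts = [r["avg"] for r in results if r.get("avg", -1) > 0]
        print(f"probes: {len(rtts)}, median RTT: {statistics.median(rtts):.1f} ms")

    Repeating such a query over many probes and cloud regions is what lets one argue, as the paper does, about how 'close' the cloud already is for most of the world's population.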