Browsing by Issue Date


Now showing items 1-20 of 40
  • Lindman, Juho (Universidad de Talca, 2014)
  • Eränti, Veikko; Lindman, Juho (Valtiotieteellinen yhdistys, 2014)
  • Samuelson, Olov; Björk, Bo-Christer (Elsevier BV, 2014)
    The rapid development of IT has in the past three decades created opportunities for faster and more efficient processes, as well as innovative new working methods, in the building industry. This paper presents the results of a longitudinal survey-based study (the “IT barometer”) of IT use in the Swedish building industry, conducted at several intervals over the period 1998-2011. The results show a rapid increase in general IT use as well as in the use of sector-specific tools. Improving communication and information sharing is a strong driving force for adopting IT, for instance technologies such as EDM and EDI, although the adoption of more complex applications (i.e. BIM) is slower. Interestingly, “demands from employees” has over the years become a very important reason for companies to increase their IT use. Leading areas for planned IT investments include document handling and mobile equipment, with BIM technology rather low on the list.
  • Nyman, Linus Morten (2013)
    While significant factors that affect the open source community’s interest to participate in a development project have been studied, there has been little focus on the motivating factors that can cause a contributor to become a competitor by exercising the right to fork a program, i.e., to copy an existing program’s code base and use it to begin a separate development. This right is guaranteed by all open source licenses, yet it is rarely exercised. Indeed, there is strong social pressure against code forking, stemming from its negative side effects, such as conflict and duplicated efforts among developers. This paper details the events that led Widenius, the founder of the MySQL project, to fork MariaDB from MySQL. Our findings confirm the previously held notion that there is a high threshold for starting a competing fork. While the few existing studies of competitive forks attribute them to disagreement among developers, in the case of MariaDB the fork was caused by Widenius’ concerns regarding the uncertainty of the future freedom and openness of the MySQL codebase. This article makes three contributions. Firstly, it further validates the existing notion that there is a strong threshold to starting a competing fork. Secondly, it offers an in-depth analysis of the events and motivations behind the birth of a fork. Thirdly, it contributes to theory by introducing the freedom factor hypothesis: limiting either developers’ freedoms to contribute to a project or the freedom inherent in a project's license increases the likelihood of a fork.
  • Lindman, Juho; Rossi, Matti; Tuunainen, Virpi (2013)
  • Laakso, Mikael; Björk, Bo-Christer (John Wiley & Sons, Inc., 2013)
    Delayed open access (OA) refers to scholarly articles in subscription journals made openly available on the web directly through the publisher at the expiry of a set embargo period. Though a substantial number of journals have practiced delayed OA since they started publishing e-versions, empirical studies of open access have often overlooked this body of literature. This study provides comprehensive quantitative measurements by identifying delayed OA journals and collecting data concerning their publication volumes, embargo lengths, and citation rates. Altogether 492 journals were identified, publishing a combined total of 111,312 articles in 2011. Of these articles, 77.8% were made open access within 12 months of publication, and 85.4% within 24 months. A journal impact factor analysis revealed that delayed OA journals have on average twice the citation rates of closed subscription journals, and three times those of immediate OA journals. Overall, the results demonstrate that delayed OA journals constitute an important segment of the openly available scholarly journal literature, both by their sheer article volume and by including a substantial proportion of high-impact journals.
  • Lindman, Juho; Riepula, Mikko; Rossi, Matti; Marttiin, Pentti (Springer, 2013)
  • Björk, Bo-Christer; Solomon, David (Association of Learned and Professional Society Publishers, 2012)
    The article processing charge (APC) is currently the primary method of funding Open Access peer-reviewed journals. The pricing principles of 77 OA publishers publishing over 1,000 journals using APCs were studied and classified. The most common model is the fixed single fee, which can either be the same for all of a publisher’s journals or determined individually for each journal. Fees are usually levied only for the publication of accepted papers, but some journals also charge for submission. Instead of fixed prices, many publishers charge by the page or have multi-tiered fees depending on the length of articles. The country of origin of the author can also influence the pricing, in order to facilitate publishing for authors from developing countries.
  • Laakso, Mikael; Kiviniemi, Arto (International Council for Research and Innovation in Building and Construction, 2012)
    IFC (Industry Foundation Classes) is an open and standardized data model intended to enable interoperability between building information modeling software applications in the AEC/FM industry. IFC has been in development by an industry consortium since 1994, and since the start of the effort, the evolving industry context, standardization organization, resource availability, and technology development have exposed the standardization process to a dynamic environment. While the overarching mission of IFC standardization has always been to enable interoperability between AEC/FM software applications, the approach for how best to operationalize that mission has changed over the years. Through a literature review supported by the general theory on IT standardization, this study follows the development process of the IFC standard from its origins in the early 1990s to its latest activities in 2012. The end result is both a descriptive review of the history of IFC standardization and the establishment of an initial connection to IT standardization research for the IFC standard by profiling the effort in accordance with existing IT standardization theories and typologies. The review highlights the evolution of IFC standardization through several distinct phases, and its gradual movement from emphasizing technical architecture development towards growing involvement in specifying the processes facilitating its use. The organization behind the standard has also seen changes in its modus operandi, from initially being a closed and loosely coupled alliance to evolving into a consortium incorporating open hybrid standardization, where a formal standards body publishes the standards prepared by the consortium. 
    The consortium has faced many challenges compiling an ambitious interoperability standard with few resources, and were it not for the growing demand for the standard from public actors, momentum and enthusiasm for the effort might have petered out due to slow market uptake and the low use of the data standard in actual construction projects thus far. While this paper does not investigate the adoption phenomenon in depth, the moderate uptake of the standard can perhaps be explained as a symptom of the slow adoption of collaborative model-based construction processes and industry reluctance to switch to new IT tools, which in turn are prerequisites for demand for an open interoperability standard.
  • Lindman, Juho; Rossi, Matti; Puustell, Anna (I E E E, 2011)
    Choosing Open Source Software License and Corresponding Business Model
  • Samuelson, Olle; Björk, Bo-Christer (2011)
    Three strategically important uses of IT in the construction industry are the storage and management of project documents on web servers (EDM), the electronic handling of orders and invoices between companies (EDI) and the use of 3-D models including non-geometrical attributes for integrated design and construction (BIM). In a broad longitudinal survey study of IT use in the Swedish construction industry, the extent of use of these techniques was measured in 1998, 2000 and 2007. The results showed that EDM and EDI are already well-established techniques, whereas BIM, although it promises the biggest potential benefits to the industry, only seems to be at the beginning of adoption. In a follow-up to the quantitative studies, the factors affecting the decisions to implement EDM, EDI and BIM, as well as the actual adoption processes, were studied using semi-structured interviews with practitioners. The theoretical basis for the interview studies was informed by frameworks from IT-adoption theory, with the UTAUT model in particular providing the main basis for the analyses presented here. The results showed that the decisions to adopt the above technologies are made on three different levels: the individual level, the organizational level in the form of a company, and the organizational level in the form of a project. The different patterns in adoption can in part be explained by where the decisions are mainly taken. EDM is driven from the organisation/project level, EDI mainly from the organisation/company level, and BIM is driven by individuals pioneering the technique.
  • Aaltonen, Aleksi (2011)
    How does a new medium create its audience? This study takes the business model of commercial media as its starting point and identifies industrial audience measurement as a constitutive operation in creating the sellable asset of advertising-funded companies. The study employs a qualitative case study design to analyse how a mobile virtual network operator (MVNO) company harnesses digital behavioural records generated by computational network infrastructure to turn network subscribers into an advertising audience product. The empirical evidence is based on three months of intensive fieldwork at the company office. The analysis reveals comprehensiveness, openness and granularity as the historically new attributes of computational data vis-à-vis traditional audience measurement arrangements. These attributes are then juxtaposed with four kinds of business analytical operations (automatic data aggregation procedures, the use of software reporting tools, organizational reporting practices and custom analyses) observed at the research site to assess how the computational media environment shapes key audiencemaking practices. Finally, the implications of this analytical infrastructure are reflected upon in relation to three sets of organizational practices. The theoretical framework for the analysis is composed by critically assessing constructivist approaches (SCOT, ANT and sociomateriality) for studying technology and by discussing an approach inspired by critical realism to overcome their limitations with respect to the objectives of the study. The findings contribute toward innovating new digital services, information systems (IS) theory and the study of media audiences. The case opens up the considerable complexity involved in establishing a new kind of advertising audience and, more generally, a platform business. Sending out advertisements is easy compared to demonstrating that somebody is actually receiving them. The three computational attributes both extend and provide summative validity for mid-range theorizing on how computational objects mediate organizational practices and processes. Finally, the analysis reveals the interactive nature of the digital audience, stemming from the direct and immediate behavioural feedback in an audiencemaking cycle.
  • Aaltonen, Aleksi (2011)
    In this paper we propose a theoretical framework to understand the governance of internet-mediated social production. Focusing on one of the most popular websites and reference tools, Wikipedia, we undertake an exploratory theoretical analysis to clarify the structure and mechanisms driving the endogenous change of a large-scale social production system. We argue that the popular transaction costs approach underpinning many of the analyses is an insufficient framework for unpacking the evolutionary character of governance. The evolution of Wikipedia and its shifting modes of governance can be better framed as a process of building a collective capability, namely the capability of editing and managing a new kind of encyclopedia. We understand the evolution of Wikipedia as a learning phenomenon that gives rise over time to governance mechanisms and structures as endogenous responses to the problems and conditions that the ongoing development of Wikipedia itself has produced over the years. Finally, we put forward five empirical hypotheses to test the theoretical framework.
  • Nyman, Linus Morten; Mikkonen, Tommi (2011)
    A project fork occurs when software developers take a copy of source code from one software package and use it to begin independent development work that is maintained separately from its origin. Although forking in open source software does not require the permission of the original authors, the new version nevertheless competes for the attention of the same developers that have worked on the original version. The motivations developers have for performing forks are many, but in general they have received little attention. In this paper, we present the results of a study of forks performed on SourceForge and list the developers’ motivations for their actions. The main motivation, seen in close to half of the cases of forking, was content modification: either adding content to the original program or focusing the content on the needs of a specific segment of users. In a quarter of the cases the motivation was technical modification: either porting the program to new hardware or software, or improving the original.
  • Björk, Bo-Christer; Laakso, Mikael (Elsevier, 2010)
    There has been demand for uniform CAD standards in the construction industry ever since the large-scale introduction of computer-aided design systems in the late 1980s. While some standards have been widely adopted without much formal effort, others have failed to gain support even though considerable resources have been allocated for the purpose. Establishing a standard for building information modeling has been a particularly active area of industry development and scientific interest in recent years. In this paper, four different standards are discussed as cases: the IGES and DXF/DWG standards for representing the graphics in 2D drawings, the ISO 13567 standard for the structuring of building information on layers, and the IFC standard for building product models. Based on a literature study combined with two qualitative interview studies with domain experts, a process model is proposed to describe and interpret the contrasting histories of past CAD standardisation processes.
  • Björk, Bo-Christer; Welling, Patrik; Laakso, Mikael; Majlender, Peter; Hedlund, Turid; Gudnason, Gudni (Public Library of Science, 2010)
    Background: The Internet has recently made possible the free global availability of scientific journal articles. Open Access (OA) can occur either via OA scientific journals, or via authors posting manuscripts of articles published in subscription journals in open web repositories. So far there have been few systematic studies showing the extent of OA, in particular studies covering all fields of science. Methodology/Principal Findings: The proportion of peer-reviewed scholarly journal articles available openly in full text on the web was studied using a random sample of 1837 titles and a web search engine. Of articles published in 2008, 8.5% were freely available at the publishers’ sites. For an additional 11.9%, free manuscript versions could be found using search engines, making the overall OA percentage 20.4%. Chemistry (13%) had the lowest overall share of OA, Earth Sciences (33%) the highest. In medicine, biochemistry and chemistry, publishing in OA journals was more common. In all other fields author-posted manuscript copies dominated the picture. Conclusions/Significance: The results show that OA already has a significant positive impact on the availability of the scientific journal literature and that there are big differences between scientific disciplines in the uptake. Due to the lack of awareness of OA publishing among scientists in most fields outside physics, the results should be of general interest to all scholars. The results should also interest academic publishers, who need to take OA into account in their business strategies and copyright policies, as well as research funders, who like the NIH are starting to require OA availability of results from research projects they fund. The method and search tools developed also offer a good basis for more in-depth as well as longitudinal studies.
  • Björk, Bo-Christer (CIB, 2009)
    In smaller countries, where the key players in construction IT development tend to know each other personally and where public R&D funding is concentrated in a few channels, IT roadmaps and strategies would seem to have a better chance of influencing development than in the bigger industrial countries. In this paper Finland and the RATAS project are presented as a historical case illustrating such impact. RATAS was initiated as a construction IT roadmap project in 1985, involving many of the key organisations and companies active in construction sector development. Several of the individuals who took an active part in the project have played an important role in later developments both in Finland and on the international scene. The central result of RATAS was the identification of what is nowadays called Building Information Modelling (BIM) technology as the central issue in getting IT into efficient use in the construction sector. BIM, which was earlier referred to as building product modelling, has been a key ingredient in many roadmaps since and the subject of international standardisation efforts such as STEP and IAI/IFCs. In hindsight, the RATAS project can be seen as a forerunner with an impact that also transcended national borders.
  • Laakso, Mikael (CRC Press, 2009)
    The industry foundation classes (IFC) file format is one of the most complex and ambitious IT standardization projects currently being undertaken in any industry, focusing on the development of an open and neutral standard for exchanging building model data. Scientific literature related to the IFC standard has so far been predominantly technical; research looking at the IFC standard from an industry standardization perspective could offer valuable new knowledge for both theory and practice. This paper proposes the use of IT standardization and IT adoption theories, supported by studies done within construction IT, to lay a theoretical foundation for further empirical analysis of the standardization process of the IFC file format.
  • Björk, Bo-Christer; Öörni, Anssi (Elsevier, 2009)
    When authors of scholarly articles decide where to submit their manuscripts for peer review and eventual publication, they often base their choice of journals on very incomplete information about how well the journals serve the authors’ purposes of informing about their research and advancing their academic careers. The purpose of this study was to develop and test a new method for benchmarking scientific journals, providing more information to prospective authors. The method estimates a number of journal parameters, including readership, scientific prestige, time from submission to publication, acceptance rate and service provided by the journal during the review and publication process. The method uses data directly obtainable from the web, data that can be calculated from such data, data obtained from publishers and editors, and data obtained through author surveys; it has been tested on three different sets of journals, each from a different discipline. We found a number of problems with the different data acquisition methods, which limit the extent to which the method can be used. Publishers and editors are reluctant to disclose important information they have at hand (e.g. journal circulation, web downloads, acceptance rate). The calculation of some important parameters (for instance average time from submission to publication, or regional spread of authorship) is possible but requires quite a lot of work. It can be difficult to get reasonable response rates to author surveys. All in all, we believe that the proposed method, taking a “service to authors” perspective as a basis for benchmarking scientific journals, is useful and can provide information valuable to prospective authors in selected scientific disciplines.
  • Björk, Bo-Christer; Roos, Annikki; Lauri, Mari (University of Lund Library, 2009)
    Introduction. We estimate the total yearly volume of peer-reviewed scientific journal articles published world-wide, as well as the share of these articles available openly on the web, either directly or as copies in e-print repositories. Method. We rely on data from two commercial databases (ISI and Ulrich's Periodicals Directory) supplemented by sampling and Google searches. Analysis. A central finding is that ISI-indexed journals publish far more articles per year (111) than non-ISI-indexed journals (26), which means that the total figure we obtain is much lower than many earlier estimates. Our method of analysing the number of repository copies (green open access) differs from several earlier studies, which have counted the copies in identified repositories, since we start from a random sample of articles and then test whether copies can be found by a web search engine. Results. We estimate that in 2006 the total number of articles published was approximately 1,350,000. Of this number, 4.6% became immediately openly available, and an additional 3.5% after an embargo period of, typically, one year. Furthermore, usable copies of 11.3% could be found in subject-specific or institutional repositories or on the home pages of the authors. Conclusions. We believe our results are the most reliable so far published and, therefore, should be useful in the ongoing debate about Open Access among both academics and science policy makers. The method is replicable and also lends itself to longitudinal studies in the future.