Information Systems Science


Recently published

  • Aaltonen, Aleksi (Yhdyskuntasuunnittelun seura (YSS) ry, 2006)
  • Samuelson, Olov; Björk, Bo-Christer (Elsevier BV, 2014)
    The rapid development of IT has in the past three decades created opportunities for faster and more efficient processes, as well as innovative new working methods, in the building industry. This paper presents the results of a longitudinal survey-based study (the “IT barometer”) of IT use in the Swedish building industry, conducted at several intervals over the period 1998-2011. The results show a rapid increase in general IT use as well as in the use of sector-specific tools. Improving communication and information sharing is a strong driving force for taking IT into use, for instance through technologies such as EDM and EDI, although the adoption of more complex applications (i.e. BIM) is slower. Interestingly, “demands from employees” has over the years become a very important reason for companies to increase their IT use. Leading areas for planned IT investments include document handling and mobile equipment, with BIM technology rather low on the list.
  • Nyman, Linus Morten (2013)
    While significant factors that affect the open source community’s interest to participate in a development project have been studied, there has been little focus on the motivating factors that can cause a contributor to become a competitor by utilizing the right to fork a program, i.e., to copy an existing program’s code base and use it to begin a separate development. This right is guaranteed by all open source licenses, yet it is rarely exercised. Indeed, there is strong social pressure against code forking, stemming from its negative side effects, such as conflict and duplicated effort among developers. This paper details the events that led Widenius, the founder of the MySQL project, to decide to fork MariaDB from MySQL. Our findings confirm the previously held notion that there is a high threshold for starting a competing fork. While the few existing studies of competitive forks attribute them to disagreements among developers, in the case of MariaDB the fork was caused by Widenius’ concerns regarding the uncertainty of the future freedom and openness of the MySQL codebase. This article makes three contributions. Firstly, it further validates the existing notion that there is a high threshold for starting a competing fork. Secondly, it offers an in-depth analysis of the events and motivations behind the birth of a fork. Thirdly, it contributes to theory by introducing the freedom factor hypothesis: limiting either developers’ freedoms to contribute to a project or the freedom inherent in a project’s license increases the likelihood of a fork.
  • Laakso, Mikael; Björk, Bo-Christer (John Wiley & Sons, Inc., 2013)
    Delayed open access (OA) refers to scholarly articles in subscription journals that are made openly available on the web directly by the publisher when a set embargo period expires. Though a substantial number of journals have practiced delayed OA since they started publishing e-versions, empirical studies of open access have often overlooked this body of literature. This study provides comprehensive quantitative measurements by identifying delayed OA journals and collecting data on their publication volumes, embargo lengths, and citation rates (an illustrative embargo calculation follows after this list). Altogether 492 journals were identified, publishing a combined total of 111,312 articles in 2011. Of these articles, 77.8% were made open access within 12 months of publication, and 85.4% within 24 months. A journal impact factor analysis revealed that delayed OA journals have, on average, citation rates twice as high as those of closed subscription journals and three times as high as those of immediate OA journals. Overall, the results demonstrate that delayed OA journals constitute an important segment of the openly available scholarly journal literature, both through their sheer article volume and by including a substantial proportion of high-impact journals.
  • Aaltonen, Aleksi (2011)
    In this paper we propose a theoretical framework to understand the governance of internet-mediated social production. Focusing on one of the most popular websites and reference tools, Wikipedia, we undertake an exploratory theoretical analysis to clarify the structure and mechanisms driving the endogenous change of a large-scale social production system. We argue that the popular transaction cost approach underpinning many of the analyses is an insufficient framework for unpacking the evolutionary character of governance. The evolution of Wikipedia and its shifting modes of governance can be better framed as a process of building a collective capability, namely the capability of editing and managing a new kind of encyclopedia. We understand the evolution of Wikipedia as a learning phenomenon that over time gives rise to governance mechanisms and structures as endogenous responses to the problems and conditions that the ongoing development of Wikipedia itself has produced over the years. Finally, we put forward five empirical hypotheses to test the theoretical framework.
  • Aaltonen, Aleksi (2011)
    How does a new medium create its audience? This study takes the business model of commercial media as its starting point and identifies industrial audience measurement as a constitutive operation in creating the sellable asset of advertising-funded companies. The study employs a qualitative case study design to analyse how a mobile virtual network operator (MVNO) company harnesses digital behavioural records generated by computational network infrastructure to turn network subscribers into an advertising audience product. The empirical evidence is based on three months of intensive fieldwork at the company office. The analysis reveals comprehensiveness, openness and granularity as the historically new attributes of computational data vis-à-vis traditional audience measurement arrangements. These attributes are then juxtaposed with four kinds of business analytical operations observed at the research site (automatic data aggregation procedures, the use of software reporting tools, organizational reporting practices and custom analyses) to assess how the computational media environment shapes key audiencemaking practices. Finally, the implications of this analytical infrastructure are reflected upon in relation to three sets of organizational practices. The theoretical framework for the analysis is composed by critically assessing constructivist approaches to studying technology (SCOT, ANT and sociomateriality) and by discussing an approach inspired by critical realism to overcome their limitations with respect to the objectives of the study. The findings contribute to the innovation of new digital services, to information systems (IS) theory and to the study of media audiences. The case opens up the considerable complexity involved in establishing a new kind of advertising audience and, more generally, a platform business. Sending out advertisements is easy compared to demonstrating that somebody is actually receiving them. The three computational attributes both extend and provide summative validity for mid-range theorizing on how computational objects mediate organizational practices and processes. Finally, the analysis reveals the interactive nature of the digital audience, stemming from the direct and immediate behavioural feedback in an audiencemaking cycle.
  • Nyman, Linus Morten; Mikkonen, Tommi (2011)
    A project fork occurs when software developers take a copy of source code from one software package and use it to begin independent development work that is maintained separately from the original. Although forking in open source software does not require the permission of the original authors, the new version nevertheless competes for the attention of the same developers who have worked on the original version. The motivations developers have for performing forks are many, but in general they have received little attention. In this paper, we present the results of a study of forks performed on SourceForge (http://sourceforge.net/) and list the developers’ motivations for their actions. The main motivation, seen in close to half of the cases of forking, was content modification: either adding content to the original program or focusing the content on the needs of a specific segment of users. In a quarter of the cases the motivation was technical modification: either porting the program to new hardware or software, or improving the original.
  • Laakso, Mikael; Kiviniemi, Arto (International Council for Research and Innovation in Building and Construction, 2012)
    IFC (Industry Foundation Classes) is an open and standardized data model intended to enable interoperability between building information modeling software applications in the AEC/FM industry. IFC has been in development by an industry consortium since 1994, and since the start of the effort, the evolving industry context, standardization organization, resource availability, and technology development have exposed the standardization process to a dynamic environment. While the overarching mission of IFC standardization has always been to enable interoperability between AEC/FM software applications, the approach for how best to operationalize that mission has changed over the years. Through a literature review supported by the general theory on IT standardization, this study follows the development process of the IFC standard from its origins in the early 1990s to its latest activities in 2012. The end result is both a descriptive review of the history of IFC standardization and the establishment of an initial connection to IT standardization research for the IFC standard by profiling the effort in accordance with existing IT standardization theories and typologies. The review highlights the evolution of IFC standardization through several distinct phases, and its gradual movement from emphasizing technical architecture development towards growing involvement in specifying the processes facilitating its use. The organization behind the standard has also seen changes in its modus operandi, from initially being a closed and loosely coupled alliance to evolving into a consortium practicing open hybrid standardization, in which a formal standards body publishes the standards prepared by the consortium. The consortium has faced many challenges in compiling an ambitious interoperability standard with few resources, and were it not for the growing demand for the standard from public actors, momentum and enthusiasm for the effort might have petered out due to slow market uptake and the low use of the data standard in actual construction projects thus far. While this paper does not investigate the adoption phenomenon in depth, the moderate uptake of the standard can perhaps be explained as a symptom of the slow adoption of collaborative model-based construction processes and industry reluctance to switch over to new IT tools, which in turn are prerequisites for demand for an open interoperability standard.
  • Björk, Bo-Christer; Solomon, David (Association of Learned and Professional Society Publishers, 2012)
    The article processing charge (APC) is currently the primary method of funding Open Access peer reviewed journals. The pricing principles of 77 OA publishers publishing over 1000 journals using APCs were studied and classified. The most common method is the fixed single fee, which can either be the same for all of a publisher’s journals or individually determined for each journal. Fees are usually only levied for publication of accepted papers, but some journals also charge for submission. Instead of fixed prices, many publishers charge by the page or have multi-tiered fees depending on the length of articles. The country of origin of the author can also influence the pricing, in order to facilitate publishing for authors from developing countries. (A short sketch illustrating these pricing structures follows after this list.)
  • Samuelson, Olle; Björk, Bo-Christer (2011)
    Three strategically important uses of IT in the construction industry are the storage and management of project documents on web servers (EDM), the electronic handling of orders and invoices between companies (EDI) and the use of 3-D models including non-geometrical attributes for integrated design and construction (BIM). In a broad longitudinal survey study of IT use in the Swedish construction industry, the extent of use of these techniques was measured in 1998, 2000 and 2007. The results showed that EDM and EDI are already well-established techniques, whereas BIM, although it promises the biggest potential benefits to the industry, only seems to be at the beginning of adoption. In a follow-up to the quantitative studies, the factors affecting the decisions to implement EDM, EDI and BIM, as well as the actual adoption processes, were studied using semi-structured interviews with practitioners. The interview studies were informed by theoretical frameworks from IT-adoption theory, with the UTAUT model in particular providing the main basis for the analyses presented here. The results showed that the decisions to take the above technologies into use are made on three different levels: the individual level, the organizational level in the form of a company, and the organizational level in the form of a project. The different patterns in adoption can in part be explained by where the decisions are mainly taken: EDM is driven from the organisation/project level, EDI mainly from the organisation/company level, and BIM by individuals pioneering the technique.
  • Björk, Bo-Christer; Turk, Ziga (Michigan University Press, 2000)
    The current mainstream scientific-publication process has so far been only marginally affected by the possibilities offered by the Internet, despite some pioneering attempts with free electronic-only journals and electronic preprint archives. Additional electronic versions of traditional subscription-based paper journals are not a solution. A clear trend, for young researchers in particular, is to bypass subscription barriers (both for paper and electronic material) and rely almost exclusively on what they can find free on the Internet, which often includes working versions posted on the home pages of the authors. A survey of how scientists retrieve publications was conducted in February 2000, aimed at measuring to what extent the opportunities offered by the Internet are already changing scientific information exchange and how researchers feel about this. This paper presents the results based on 236 replies to an extensive Web-based questionnaire, which was announced to around 3,000 researchers in the domains of construction information technology and construction management. The questions dealt with how researchers find, access, and read different sources; how many and what publications they read; how often and to which conferences they travel; how much they publish; and the criteria by which they eventually decide where to publish. Some of the questions contrasted traditional and electronic publishing, with one final section dedicated to opinions about electronic publishing. According to the survey, researchers already download from the Web half of the material that they read. The most popular method for retrieving an interesting publication is downloading it for free from the author’s or publisher’s Web site. Researchers are not particularly willing to pay for electronic scientific publications. There is much support for a scenario of electronic journals available freely in their entirety on the Web, where the costs could be covered by, for instance, professional societies or the publishing university.
  • Björk, Bo-Christer; Turk, Ziga (University of Lund Library, 2006)
    Introduction: This case study is based on the experiences with the Electronic Journal of Information Technology in Construction (ITcon), founded in 1995. Development: This journal is an example of a particular category of open access journals, which use neither author charges nor subscriptions to finance their operations, but rely largely on unpaid voluntary work in the spirit of the open source movement. The journal has, after some initial struggle, survived its first decade and is now established as one of half-a-dozen peer reviewed journals in its field. Operations: The journal publishes articles as they become ready, but creates virtual issues through alerting messages to “subscribers”. It has also started to publish special issues, since this helps in attracting submissions and in sharing the workload of review management. From the start the journal adopted a rather traditional layout for its articles. After the first few years the HTML version was dropped, and papers are now published only in PDF format. Performance: The journal has recently been benchmarked against the competing journals in its field. Its acceptance rate of 53% is slightly higher, and its average turnaround time of seven months almost a year faster, than those of the journals in the sample for which data could be obtained. The server log files for the past three years have also been studied. Conclusions: Our overall experience demonstrates that it is possible to publish this type of OA journal, with a yearly publishing volume equal to that of a quarterly journal and involving the processing of some fifty submissions a year, using a networked volunteer-based organization.
  • Björk, Bo-Christer (Arnolds, 2002)
    A model of the information and material activities that comprise the overall construction process is presented, using the SADT activity modelling methodology. The basic model is further refined into a number of generic information handling activities, such as creation of new information, information search and retrieval, information distribution and person-to-person communication. The viewpoint could be described as information logistics. This model is then combined with a more traditional building process model, consisting of phases such as design and construction. The resulting two-dimensional matrix can be used for positioning different types of generic IT tools or construction-specific applications (a short sketch illustrating such a matrix follows after this list). The model can thus provide a starting point for a discussion of the application of information and communication technology in construction and for measurements of the impacts of IT on the overall process and its related costs.
  • Björk, Bo-Christer; Hedlund, Turid (University of Michigan, 2009)
    The Internet has made possible the cost-effective dissemination of scientific journals in the form of electronic versions, usually in parallel with the printed versions. At the same time the electronic medium also makes possible totally new open access (OA) distribution models, funded by author charges, sponsorship, advertising, voluntary work, etc., where the end product is free in full text to the readers. Although more than 2,000 new OA journals have been founded in the last 15 years, the uptake of open access has been rather slow, with currently around 5% of all peer-reviewed articles published in OA journals. The slow growth can to a large extent be explained by the fact that open access has predominantly emerged via newly founded journals and startup publishers. Established journals and publishers have not had strong enough incentives to change their business models, and the commercial risks in doing so have been high. In this paper we outline and discuss two different scenarios for how scholarly publishers could change their operating model to open access. The first is based on an instantaneous change and the second on a gradual change. We propose a way to manage the gradual change by bundling traditional “big deal” licenses and author charges for opening access to individual articles.
  • Björk, Bo-Christer; Welling, Patrik; Laakso, Mikael; Majlender, Peter; Hedlund, Turid; Gudnason, Gudni (Public Library of Science, 2010)
    Background: The Internet has recently made possible the free global availability of scientific journal articles. Open Access (OA) can occur either via OA scientific journals, or via authors posting manuscripts of articles published in subscription journals in open web repositories. So far there have been few systematic studies showing how large the extent of OA is, in particular studies covering all fields of science. Methodology/Principal Findings: The proportion of peer reviewed scholarly journal articles which are available openly in full text on the web was studied using a random sample of 1837 titles and a web search engine. Of articles published in 2008, 8.5% were freely available at the publishers’ sites. For an additional 11.9%, free manuscript versions could be found using search engines, making the overall OA percentage 20.4% (the arithmetic is illustrated in a short sketch after this list). Chemistry (13%) had the lowest overall share of OA, Earth Sciences (33%) the highest. In medicine, biochemistry and chemistry, publishing in OA journals was more common. In all other fields, author-posted manuscript copies dominated the picture. Conclusions/Significance: The results show that OA already has a significant positive impact on the availability of the scientific journal literature and that there are large differences between scientific disciplines in its uptake. Due to the lack of awareness of OA publishing among scientists in most fields outside physics, the results should be of general interest to all scholars. The results should also interest academic publishers, who need to take OA into account in their business strategies and copyright policies, as well as research funders, who, like the NIH, are starting to require OA availability of results from the research projects they fund. The method and search tools developed also offer a good basis for more in-depth studies as well as longitudinal studies.
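
Illustrative sketches

The first sketch relates to the delayed OA entry (Laakso & Björk, 2013) above. It is a minimal Python illustration of the kind of embargo calculation that study describes: given per-journal embargo lengths and annual article volumes, compute the share of a year's output that becomes openly available within a given window. The journal names, embargo lengths and volumes below are invented for illustration and are not the study's data.

# Hypothetical data: (journal name, embargo length in months, articles in one year)
journals = [
    ("Journal A", 6, 420),
    ("Journal B", 12, 310),
    ("Journal C", 18, 150),
    ("Journal D", 24, 95),
    ("Journal E", 36, 60),
]

total_articles = sum(n for _, _, n in journals)

def share_within(months: int) -> float:
    """Fraction of articles whose embargo has expired within `months` of publication."""
    available = sum(n for _, embargo, n in journals if embargo <= months)
    return available / total_articles

print(f"Available within 12 months: {share_within(12):.1%}")
print(f"Available within 24 months: {share_within(24):.1%}")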
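
The next sketch relates to the APC pricing entry (Björk & Solomon, 2012) above. It encodes, under assumed fee levels, the pricing structures the abstract classifies: a fixed single fee, per-page pricing, length-based tiers, and a country-of-origin discount. All amounts and the apc function itself are invented for illustration.

def apc(scheme: str, pages: int = 0, developing_country: bool = False) -> float:
    """Return a hypothetical article processing charge in USD."""
    if scheme == "fixed":
        fee = 1500.0                              # one fee for every article
    elif scheme == "per_page":
        fee = 100.0 * pages                       # price scales with article length
    elif scheme == "tiered":
        fee = 1000.0 if pages <= 10 else 2000.0   # two length-based tiers
    else:
        raise ValueError(f"unknown scheme: {scheme}")
    if developing_country:
        fee *= 0.5                                # country-of-origin reduction
    return fee

print(apc("fixed"))                                      # 1500.0
print(apc("per_page", pages=12))                         # 1200.0
print(apc("tiered", pages=12, developing_country=True))  # 1000.0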
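
The third sketch relates to the construction process model entry (Björk, 2002) above. It shows how the two-dimensional matrix of generic information-handling activities versus building process phases can position IT tools; the cell contents are illustrative guesses, not the model's actual assignments.

activities = [
    "creation of new information",
    "information search and retrieval",
    "information distribution",
    "person-to-person communication",
]
phases = ["design", "construction"]

# Cells hold example tool types; the placements are assumptions for illustration.
matrix = {
    ("creation of new information", "design"): "CAD/modelling software",
    ("information search and retrieval", "design"): "product data libraries",
    ("information distribution", "construction"): "project web server (EDM)",
    ("person-to-person communication", "construction"): "e-mail, mobile telephony",
}

for activity in activities:
    for phase in phases:
        print(f"{activity} x {phase}: {matrix.get((activity, phase), '-')}")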
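
Finally, a worked version of the arithmetic reported in the Björk et al. (2010) entry above: the overall OA share is the sum of the share freely available at publishers' sites and the additional share of author-posted manuscript versions found via search engines. The variable names are ours, not the paper's.

gold_share = 0.085    # freely available at publishers' sites (articles from 2008)
green_share = 0.119   # additional free manuscript versions found on the web

overall_oa_share = gold_share + green_share
print(f"Overall OA share: {overall_oa_share:.1%}")   # 20.4%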