Browsing by Subject "CERN"


Now showing items 1-4 of 4
  • Veteli, Peitsa (Helsingin yliopisto, 2020)
    There is a perceived gap between the worlds of teaching and research, which can be considered a contributing factor to the commonly observed low student motivation towards the natural sciences. In the same context, the concepts of authenticity and relevance arise; they describe the "genuineness" or meaningfulness of activities carried out in different ways. This thesis presents meaningful-programming tools developed in the Open Data in Education project of the Helsinki Institute of Physics (HIP), which make use of, among other sources, the open particle-physics datasets of the CMS (Compact Muon Solenoid) experiment at CERN. The adoption of these materials by teachers is supported through training courses, and the feedback collected from them is analysed in this thesis within the wider discussion on authenticity in science education and the educational use of open data. The educational use and study of open data are very young fields, at whose forefront this work is positioned. Data were collected from both Finnish (n = 64) and international (n = 12) upper secondary school teachers, and comments from student workshops (n = 62) serve as a point of comparison. The method is thematic analysis, whose results are comparable with the wider science-education research literature. The research question is: How does authenticity appear in teachers' feedback from courses on the educational use of particle-physics open data, and how does it compare with the science-education research literature? The results show that the teachers' views align closely with what the reference literature would suggest, with the most common connections to authenticity emphasising working methods comparable to those of researchers and "real world" challenges. The almost unanimously positive feedback strongly indicates the usefulness of the opportunities the project offers and supports the value of the further research the field calls for.
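    As an illustration of the kind of classroom programming exercise built on CMS open data that the abstract describes, a minimal sketch of computing a dimuon invariant mass from reconstructed four-vectors is shown below; the function name and the per-muon energy/momentum inputs are assumptions for illustration, not the project's actual teaching material:

    ```python
    import math

    def invariant_mass(e1, px1, py1, pz1, e2, px2, py2, pz2):
        """Invariant mass of a two-particle system in GeV (natural units, c = 1):
        m^2 = (E1 + E2)^2 - |p1 + p2|^2."""
        e = e1 + e2
        px = px1 + px2
        py = py1 + py2
        pz = pz1 + pz2
        m2 = e * e - (px * px + py * py + pz * pz)
        return math.sqrt(max(m2, 0.0))  # guard against negative rounding error

    # Two back-to-back muons of ~45.6 GeV each reconstruct to the Z mass (~91.2 GeV):
    m = invariant_mass(45.594, 0.0, 0.0, 45.5939,
                       45.594, 0.0, 0.0, -45.5939)
    ```

    In a workshop setting the same function would typically be applied row by row to an open-data CSV of muon pairs and the resulting mass spectrum histogrammed.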
  • Pirttikoski, Antti (Helsingin yliopisto, 2021)
    The LHC is the highest-energy particle collider ever built, and it is used to study elementary particles by colliding protons together. One intriguing study subject at the LHC is the stability of the electroweak vacuum of our universe. The current prediction suggests that the vacuum is in a metastable state. The stability of the vacuum depends on the mass of the top quark, and a more precise measurement of the mass could shift the prediction to the border between the metastable and stable states. To measure the mass of the top quark more precisely, we need to measure the bottom (b) quarks from its decay with high precision, as the top quark decays predominantly into a W boson and a b quark. Due to the phenomenon called hadronisation, we cannot measure the quarks directly, but rather as sprays of collimated particles called jets. Jets originating from b quarks (b jets) can be identified by b-tagging. Precise measurement and calibration of the b jet energy is crucial for the top quark mass measurement. This thesis studies b jets and their energy calibration at CMS, one of the general-purpose detectors at the LHC. In particular, the b jet energy scale (bJES) and the various phenomena affecting it are investigated. For example, a large fraction of b jets contain neutrinos, which cannot be measured directly; this increases the uncertainties of the energy measurement. There are also open questions about how precisely the formation and evolution of b jets can be modelled by Monte Carlo event generators, such as Pythia8, which was used in this thesis. The aim of this thesis is to evaluate how large an effect the various phenomena that presumably weaken the precision of b jet measurements have on the bJES.
The studied phenomena are the semileptonic branching ratios of b hadrons, the branching ratios of b hadron to c hadron decays, the b hadron production fractions, and the parameterization of the b quark fragmentation function. The combined effect of the four rescaling features mentioned above suggests that the bJES is known at the 0.2% level. A small shift of -0.1% in the missing transverse energy projection fraction (MPF) response scale is detected at low pT values, which vanishes as pT increases. This is a marked improvement on the 0.4-0.5% JES accuracy achieved at CMS during Run 1 of the LHC. However, there are still many ways to improve the performance presented here. The rescaling methods clearly require further study before the results can be used in bJES corrections for a precision measurement of the top quark mass.
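The quoted combined 0.2% figure suggests summing independent uncertainty sources in quadrature. A minimal sketch of that combination, assuming four equal and independent 0.1% contributions (the equal split is purely illustrative, not the thesis's actual breakdown across the four rescaling features):

```python
import math

def combine_in_quadrature(components):
    """Total relative uncertainty (in %) from independent components,
    total = sqrt(sum of squares)."""
    return math.sqrt(sum(c * c for c in components))

# Hypothetical per-source bJES contributions in percent -- illustrative only:
sources = [0.10, 0.10, 0.10, 0.10]
total = combine_in_quadrature(sources)  # ~0.2
```

Quadrature addition is only valid for uncorrelated sources; correlated shifts would instead add linearly, which is why the per-source breakdown matters.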
  • Ballof, J.; Ramos, J. P.; Molander, A.; Johnston, K.; Rothe, S.; Stora, T.; Duellmann, Ch. E. (2020)
    At the CERN-ISOLDE facility, a variety of radioactive ion beams is available to users. The number of extractable isotopes estimated from the yield database exceeds 1000 and is still increasing. Because of high demand and the scarcity of available beam time, precise experiment planning is required. The yield database stores information about radioactive beam yields and the combination of target material and ion source needed to extract a given beam, along with their respective operating conditions. It allows users to investigate the feasibility of an experiment and to estimate the required beam time. With the increasing demand for ever more exotic beams, the need arises to extend the functionality of the database and website to provide not only experimentally determined yields but also yield predictions for isotopes that can only be measured with sophisticated setups. To predict yields, the in-target production and the release properties of the target materials must be known. While the former were estimated in a simulation campaign using the FLUKA and ABRABLA codes, the latter are available from measurement data already stored in the database. We have compiled the information necessary to predict yields and made a yield-prediction tool available as a web application. It is currently undergoing extensive testing and will become a powerful tool for the ISOLDE user community.
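A yield prediction of this kind can be thought of as factorising the extracted beam intensity into an in-target production rate and a chain of efficiencies. A minimal sketch under that assumption; the function and its split into release, ionisation and transport efficiencies are illustrative, not the actual ISOLDE web tool or its model:

```python
def predicted_yield(in_target_rate, release_efficiency,
                    ionisation_efficiency, transport_efficiency=1.0):
    """Predicted extracted yield (ions/s) as the product of the in-target
    production rate (atoms/s) and the efficiencies of release from the
    target material, ionisation in the ion source, and beam transport.
    Each efficiency is a fraction in [0, 1]."""
    return (in_target_rate * release_efficiency
            * ionisation_efficiency * transport_efficiency)

# e.g. 1e8 atoms/s produced in-target, 50% released, 10% ionised:
rate = predicted_yield(1e8, 0.5, 0.1)  # ~5e6 ions/s
```

In practice the release efficiency is isotope- and half-life-dependent (short-lived species decay before diffusing out of the target), which is why measured release properties per target material are needed alongside the simulated production rates.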
  • Niemi, Tapio; Nurminen, Jukka K.; Liukkonen, Juha-Matti; Hameri, Ari-Pekka (2018)
    High-energy physics studies collisions of particles traveling near the speed of light. For statistically significant results, physicists need to analyze a huge number of such events. One analysis job can take days and process tens of millions of collisions. Today the experiments of the Large Hadron Collider (LHC) create 10 GB of data per second, and a future upgrade will cause a ten-fold increase in data. The data analysis requires not only massive hardware but also a lot of electricity. In this article, we discuss energy efficiency in scientific computing and review a set of intermixed approaches we have developed in our Green Big Data project to improve the energy efficiency of CERN computing. These approaches include making energy consumption visible to developers and users, architectural improvements, smarter management of computing jobs, and the benefits of cloud technologies. The open and innovative environment at CERN is an excellent playground for energy efficiency ideas which can later find use in mainstream computing.
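Making energy consumption visible to developers and users can start from a simple per-job estimate. A minimal sketch, assuming an average node power draw and job runtime are known; the function and the example numbers are illustrative, not measurements from the article:

```python
def job_energy_kwh(avg_power_watts, runtime_hours):
    """Electrical energy consumed by one analysis job in kWh:
    energy = average power (W) x runtime (h) / 1000."""
    return avg_power_watts * runtime_hours / 1000.0

# Hypothetical job: a 200 W worker node busy for 48 hours:
energy = job_energy_kwh(200.0, 48.0)  # 9.6 kWh
```

Reporting such a figure next to each job's CPU time is one way to make the electricity cost of an analysis as visible as its runtime.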