Browsing by Subject "Monte Carlo"


Now showing items 1-6 of 6
  • Tikkanen, J.; Zink, K.; Pimpinella, M.; Teles, P.; Borbinha, J.; Ojala, J.; Siiskonen, T.; Goma, C.; Pinto, M. (2020)
    The beam quality correction factor, k_Q, which corrects for the difference in ionization chamber response between the reference and clinical beam qualities, is an integral part of radiation therapy dosimetry. The uncertainty of k_Q is one of the most significant sources of uncertainty in dose determination. To improve the accuracy of the available data, four partners calculated k_Q factors for 10 ionization chamber models in linear accelerator beams with accelerating voltages ranging from 6 MV to 25 MV, including flattening-filter-free (FFF) beams. The software used in the calculations was EGSnrc and PENELOPE, and the ICRU Report 90 cross-section data for water and graphite were included in the simulations. Volume averaging correction factors were calculated to correct for the dose averaging in the chamber cavities. A comparison calculation between the partners showed good agreement, as did a comparison with the literature. The k_Q values from TRS-398 were higher than our values for every chamber for which data were available. The k_Q values for the FFF beams did not follow the same k_Q versus beam quality relation as beams with a flattening filter (values for 10 MV FFF beams were on average 0.3% below fits made to the other data), although our FFF sources were only for Varian linacs. (A minimal sketch of the k_Q computation appears after this listing.)
  • Suomi, J.; Tuominen, P.; Niinistö, S.; Virtanen, S.M.; Savela, K. (2019)
    AIMS Agriculture and Food 2019: Vol. 4, No. 3, pp. 778-793
    The exposure of Finnish 1-year-olds to cadmium, lead and inorganic arsenic via food and drinking water was determined. The food consumption data consisted of 3-day records from 1010 children aged 12 months, collected between 2002 and 2005 in Southwest Finland. One fifth of these children were still breastfed when the consumption data were collected, and their exposure was assessed separately from that of the non-breastfed children. The heavy metal concentration data for foodstuffs were mainly analysis results from national authorities, mostly from the years 2005 to 2012. The dietary exposure assessment was performed probabilistically using the online program MCRA (a generic Monte Carlo sketch of this type of assessment appears after this listing). With middle-bound estimates, 89% of the non-breastfed and 56% of the breastfed children exceeded the tolerable weekly intake of cadmium. The benchmark dose (BMDL01) for neurological damage caused by lead was exceeded by 60% of the non-breastfed and by 50% of the breastfed children, while the lowest BMDL01 for the increase in cancer risk caused by inorganic arsenic was exceeded by 77% of the non-breastfed and by 61% of the breastfed children. The assessment did not include the unknown heavy metal exposure from breast milk. Differences in heavy metal exposure between boys and girls were also assessed: breastfed girls had significantly higher exposure relative to their body weight than breastfed boys, while in the non-breastfed group there were no differences by sex.
  • Magdoom, Kulam Najmudeen; Komlosh, Michal E.; Saleem, Kadharbatcha; Gasbarra, Dario; Basser, Peter J. (2022)
    Neural tissue microstructure plays a key role in developmental, physiological and pathophysiological processes. In the continuing quest to characterize it at ever finer length scales, we use a novel diffusion tensor distribution (DTD) paradigm to probe microstructural features much smaller than the nominal MRI voxel size. We first assume the DTD is a normal tensor-variate distribution constrained to lie on the manifold of positive definite matrices, characterized by a mean and a covariance tensor. We then estimate the DTD using Monte Carlo signal inversion combined with a parsimonious model selection framework that exploits a hierarchy of symmetries of the mean and covariance tensors (a sketch of the underlying forward signal model appears after this listing). High-resolution multiple pulsed field gradient (mPFG) MRI measurements were performed on a homogeneous isotropic diffusion phantom (PDMS) as a control, and on the excised visual cortex and spinal cord of a macaque monkey, to investigate the capabilities of DTD MRI in revealing neural tissue microstructural features using strong gradients not typically available in clinical MRI scanners. DTD-derived stains and glyphs, which disentangle the size, shape, and orientation heterogeneities of microscopic diffusion tensors, are presented for all samples along with the distribution of the mean diffusivity (MD) within each voxel. We also present a new glyph to visualize the symmetric (kurtosis) and asymmetric parts of the fourth-order covariance tensor. An isotropic mean diffusion tensor and a zero covariance tensor were found for the isotropic PDMS phantom, as expected, while the covariance tensor (both its symmetric and asymmetric parts) was non-zero for neural tissue, indicating that the kurtosis tensor may not be sufficient to fully describe the microstructure. Cortical layers were clearly delineated in the higher moments of the MD spectrum, consistent with histology, and microscopic anisotropy was detected in both the gray and white matter of neural tissue. DTD MRI captures crossing and splaying white matter fibers penetrating into the cortex, and skewed fiber diameter distributions in the white matter tracts within the cortex and spinal cord. DTD MRI was also shown to subsume diffusion tensor imaging (DTI) while providing additional microstructural information about tissue heterogeneity and microscopic anisotropy within each voxel.
  • Bexelius, Tobias; Sohlberg, Antti (2018)
    Statistical SPECT reconstruction can be very time-consuming, especially when compensations for collimator and detector response, attenuation, and scatter are included in the reconstruction. This work proposes an accelerated SPECT reconstruction algorithm based on graphics processing unit (GPU) processing. An ordered subset expectation maximization (OSEM) algorithm with CT-based attenuation modelling, depth-dependent Gaussian convolution-based collimator-detector response modelling, and Monte Carlo-based scatter compensation was implemented using OpenCL (a bare OSEM sketch appears after this listing). The OpenCL implementation was compared against the existing multi-threaded OSEM implementation running on a central processing unit (CPU) in terms of scatter-to-primary ratios, standardized uptake values (SUVs), and processing speed, using mathematical phantoms and clinical multi-bed bone SPECT/CT studies. The differences in scatter-to-primary ratios, visual appearance, and SUVs between the GPU and CPU implementations were minor. At its best, the GPU implementation was 24 times faster than the multi-threaded CPU version on a typical 128 x 128 matrix, 3-bed bone SPECT/CT data set with compensations for collimator and detector response, attenuation, and scatter included. GPU SPECT reconstruction shows great promise as an everyday clinical reconstruction tool.
  • Nordlund, K. (Elsevier Ltd., 2020)
    Frontiers of Nanoscience
  • Penttinen, Jussi (Helsingin yliopisto, 2021)
    HMC is a computational method built to sample efficiently from a high-dimensional distribution. Sampling from a distribution is typically a statistical problem, and hence much of the literature on Hamiltonian Monte Carlo is written in the mathematical language of probability theory, which is perhaps not ideally suited to HMC, since HMC is, at its core, differential geometry. The purpose of this text is to present the differential-geometric tools needed in HMC and then to build the algorithm methodically. Since there is an excellent introductory book on smooth manifolds by Lee, and not wanting to simply reproduce Lee's work, some basic knowledge of differential geometry is left to the reader. Similarly, since the author is more comfortable with the notions of differential geometry, and to keep the length of this text down, most theorems connected to measure and probability theory are omitted from this work. The first chapter is an introduction that covers the bare minimum of measure theory needed to motivate Hamiltonian Monte Carlo. The bulk of the text is in the second and third chapters. The second chapter presents the concepts of differential geometry needed to understand the abstract construction of Hamiltonian Monte Carlo. Those familiar with differential geometry can skip the second chapter, though it may be worthwhile to at least flip through it to pick up the notation used in the text. The third chapter is the core of the text: there the algorithm is built methodically using the groundwork laid in the previous chapters. The most important part, and the theoretical heart of the algorithm, is presented in the sections discussing the lift of the target measure. The fourth chapter provides brief practical insight into implementing HMC and briefly discusses how HMC is currently being improved. (A minimal HMC sketch appears after this listing.)
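
The beam quality correction factor in the first item (Tikkanen et al.) reduces to a ratio of Monte Carlo dose estimates, k_Q = [D_w(Q)/D_ch(Q)] / [D_w(Q0)/D_ch(Q0)]. The Python sketch below is a minimal illustration of that final step only, not the partners' EGSnrc/PENELOPE workflow: it assumes the simulations have already produced dose-to-water and cavity-averaged dose estimates at the clinical quality Q and the reference quality Q0 (typically Co-60), and all names are hypothetical.

    import math

    def k_q(d_w_q, d_ch_q, d_w_q0, d_ch_q0):
        # k_Q = [D_w(Q) / D_ch(Q)] / [D_w(Q0) / D_ch(Q0)], where D_w is the
        # Monte Carlo dose to water at the reference point and D_ch is the
        # dose averaged over the chamber cavity. Volume averaging corrections
        # are assumed to have been applied to these inputs already.
        return (d_w_q / d_ch_q) / (d_w_q0 / d_ch_q0)

    def combined_rel_uncertainty(rel_uncs):
        # Independent relative (1-sigma) uncertainty components combined in
        # quadrature, e.g. statistical and cross-section contributions.
        return math.sqrt(sum(u * u for u in rel_uncs))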
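The exposure assessment in the second item (Suomi et al.) was performed with the MCRA program; the sketch below is only a generic Monte Carlo version of the same idea, not MCRA itself. The food names, consumption records, and concentration values are made up for illustration; the threshold used is the EFSA tolerable weekly intake for cadmium of 2.5 ug per kg body weight per week.

    import random

    # Hypothetical inputs: per-food daily consumption (kg/day, one value per
    # recorded day) and measured concentrations (ug/kg food).
    consumption = {"porridge": [0.12, 0.10, 0.15],
                   "root_vegetables": [0.05, 0.08, 0.06]}
    concentration = {"porridge": [2.1, 3.4, 1.8],
                     "root_vegetables": [12.0, 9.5, 15.2]}

    def weekly_intakes(body_weight_kg, n_iter=100_000, seed=0):
        # Resample consumption and concentration values to propagate their
        # variability into a distribution of weekly intakes (ug/kg bw/week).
        rng = random.Random(seed)
        out = []
        for _ in range(n_iter):
            daily = sum(rng.choice(consumption[f]) * rng.choice(concentration[f])
                        for f in consumption)
            out.append(7.0 * daily / body_weight_kg)
        return out

    intakes = weekly_intakes(body_weight_kg=10.0)
    twi_cd = 2.5  # EFSA tolerable weekly intake for cadmium, ug/kg bw/week
    share_exceeding = sum(i > twi_cd for i in intakes) / len(intakes)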
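The Monte Carlo signal inversion in the third item (Magdoom et al.) rests on a forward model in which the diffusion MRI signal is an average over a distribution of diffusion tensors, S/S0 = E[exp(-b g^T D g)]. The sketch below illustrates only that forward step under simplified assumptions: a single b-value, and rejection sampling of symmetric Gaussian perturbations as a crude stand-in for the paper's constrained normal tensor-variate DTD; the inversion and model selection machinery is not reproduced.

    import numpy as np

    def sample_dtd(mean, noise_scale, n, rng):
        # Draw positive definite diffusion tensors: perturb the mean tensor
        # with symmetric Gaussian noise and keep only positive definite draws.
        out = []
        while len(out) < n:
            a = rng.normal(scale=noise_scale, size=(3, 3))
            d = mean + (a + a.T) / 2.0
            if np.all(np.linalg.eigvalsh(d) > 0):
                out.append(d)
        return np.array(out)

    def mc_signal(tensors, b, g):
        # Monte Carlo estimate of S/S0 = E[exp(-b g^T D g)] for direction g.
        g = g / np.linalg.norm(g)
        return np.mean(np.exp(-b * np.einsum('i,nij,j->n', g, tensors, g)))

    rng = np.random.default_rng(0)
    mean_d = np.diag([1.0e-3, 0.7e-3, 0.4e-3])  # mm^2/s, illustrative values
    tensors = sample_dtd(mean_d, 1.0e-4, 2000, rng)
    s = mc_signal(tensors, b=1000.0, g=np.array([1.0, 0.0, 0.0]))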
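The reconstruction in the fourth item (Bexelius and Sohlberg) is OSEM with attenuation, collimator-detector response, and Monte Carlo scatter modelling folded into the projector. The NumPy sketch below shows only the bare OSEM update, with a small dense system matrix standing in for that projector; it is not the paper's OpenCL implementation.

    import numpy as np

    def osem(y, A, subsets, n_iter=4, eps=1e-12):
        # Ordered-subsets EM: y are measured projections (n_bins,), A is the
        # system matrix (n_bins, n_voxels), subsets is a list of index arrays
        # partitioning the projection bins.
        x = np.ones(A.shape[1])
        for _ in range(n_iter):
            for s in subsets:
                As = A[s]
                sens = As.sum(axis=0)                   # subset sensitivity
                ratio = y[s] / np.maximum(As @ x, eps)  # measured / modelled
                x *= (As.T @ ratio) / np.maximum(sens, eps)
        return x

    # Tiny random example: 64 projection bins, 16 voxels, 4 subsets.
    rng = np.random.default_rng(0)
    A = rng.random((64, 16))
    x_true = rng.random(16)
    y = rng.poisson(100.0 * (A @ x_true)) / 100.0
    x_hat = osem(y, A, np.array_split(np.arange(64), 4))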
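Finally, as a concrete counterpart to the Penttinen thesis in the last item, here is a minimal Euclidean HMC transition with a leapfrog integrator and an identity mass matrix, in plain NumPy. It deliberately ignores the differential-geometric construction the thesis develops; this is a sketch of the standard algorithm, not the thesis's formulation.

    import numpy as np

    def hmc_step(q, log_prob, grad_log_prob, step_size=0.1, n_leapfrog=20,
                 rng=np.random.default_rng(0)):
        # One HMC transition: resample momentum, integrate Hamilton's
        # equations with leapfrog, then Metropolis accept/reject.
        p = rng.normal(size=q.shape)
        q_new, p_new = q.copy(), p.copy()
        p_new += 0.5 * step_size * grad_log_prob(q_new)  # half momentum step
        for _ in range(n_leapfrog - 1):
            q_new += step_size * p_new                   # full position step
            p_new += step_size * grad_log_prob(q_new)    # full momentum step
        q_new += step_size * p_new
        p_new += 0.5 * step_size * grad_log_prob(q_new)  # final half step
        h_old = -log_prob(q) + 0.5 * p @ p               # H = -log pi + KE
        h_new = -log_prob(q_new) + 0.5 * p_new @ p_new
        return q_new if rng.uniform() < np.exp(h_old - h_new) else q

    # Example: sample a 2-D standard normal.
    log_prob = lambda q: -0.5 * float(q @ q)
    grad_log_prob = lambda q: -q
    q, draws = np.zeros(2), []
    for _ in range(1000):
        q = hmc_step(q, log_prob, grad_log_prob)
        draws.append(q)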