Scaling from gauge and scalar radiation in Abelian-Higgs string networks

Hindmarsh, M., Lizarraga, J., Urrestilla, J., Daverio, D. & Kunz, M. 2017, 'Scaling from gauge and scalar radiation in Abelian-Higgs string networks', Physical Review D, vol. 96, no. 2, 023525.

Title: Scaling from gauge and scalar radiation in Abelian-Higgs string networks
Author: Hindmarsh, Mark; Lizarraga, Joanes; Urrestilla, Jon; Daverio, David; Kunz, Martin
Contributor organization: Department of Physics
Helsinki Institute of Physics
Date: 2017-07-21
Language: eng
Number of pages: 20
Belongs to series: Physical Review D
ISSN: 2470-0010
Abstract: We investigate cosmic string networks in the Abelian Higgs model using data from a campaign of large-scale numerical simulations on lattices of up to 4096³ grid points. We observe scaling or self-similarity of the networks over a wide range of scales, and estimate the asymptotic values of the mean string separation in horizon length units ξ and of the mean square string velocity v̄² in the continuum and large time limits. The scaling occurs because the strings lose energy into classical radiation of the scalar and gauge fields of the Abelian Higgs model. We quantify the energy loss with a dimensionless radiative efficiency parameter, and show that it does not vary significantly with lattice spacing or string separation. This implies that the radiative energy loss underlying the scaling behaviour is not a lattice artefact, and justifies the extrapolation of measured network properties to large times for computations of cosmological perturbations. We also show that the core growth method, which increases the defect core width with time to extend the dynamic range of simulations, does not introduce significant systematic error. We compare ξ and v̄² to values measured in simulations using the Nambu-Goto approximation, finding that the latter underestimate the mean string separation by about 25%, and overestimate v̄² by about 10%. The scaling of the string separation implies that string loops decay by the emission of massive radiation within a Hubble time in field theory simulations, in contrast to the Nambu-Goto scenario which neglects this energy loss mechanism. String loops surviving for only one Hubble time emit much less gravitational radiation than in the Nambu-Goto scenario and are consequently subject to much weaker gravitational wave constraints on their tension.
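The scaling statement in the abstract can be made concrete with the standard definitions used in Abelian-Higgs network simulations (the precise conventions here are assumptions, not quoted from the paper): the mean string separation ξ is built from the total string length ℓ in a comoving volume V, and scaling means ξ grows linearly with time,

\[
\xi \;\equiv\; \sqrt{\frac{\mathcal{V}}{\ell}}, \qquad
\lim_{t\to\infty} \frac{\xi(t)}{t} \;=\; x \;=\; \text{const},
\]

so that the network remains self-similar: its only relevant length scale tracks the horizon. The mean square velocity v̄² is the corresponding length-weighted average of the string velocity over the network, and in the scaling regime it too approaches a constant.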
Subject: 114 Physical sciences
Peer reviewed: Yes
Rights: cc_by
Usage restriction: openAccess
Self-archived version: publishedVersion

Files in this item:
PhysRevD.96.023525.pdf (1.252 MB, PDF)
