## Publications

Munshi D, Jung G, Kitching TD,
et al., 2023, Position-dependent correlation function of weak-lensing convergence, *Physical Review D*, Vol: 107, ISSN: 2470-0010

Loureiro A, Whiteaway L, Sellentin E,
et al., 2023, Almanac: Weak Lensing power spectra and map inference on the masked sphere, *The Open Journal of Astrophysics*, Vol: 6, Pages: 1-18, ISSN: 2565-6120

We present a field-based signal extraction of weak lensing from noisy observations on the curved and masked sky. We test the analysis on a simulated Euclid-like survey, using a Euclid-like mask and noise level. To make optimal use of the information available in such a galaxy survey, we present a Bayesian method for inferring the angular power spectra of the weak lensing fields, together with an inference of the noise-cleaned tomographic weak lensing shear and convergence (projected mass) maps. The latter can be used for field-level inference with the aim of extracting cosmological parameter information including non-gaussianity of cosmic fields. We jointly infer all-sky E-mode and B-mode tomographic auto- and cross-power spectra from the masked sky, and potentially parity-violating EB-mode power spectra, up to a maximum multipole of ℓmax=2048. We use Hamiltonian Monte Carlo sampling, inferring simultaneously the power spectra and denoised maps with a total of ∼16.8 million free parameters. The main output and natural outcome is the set of samples of the posterior, which does not suffer from leakage of power from E to B unless reduced to point estimates. However, such point estimates of the power spectra, the mean and most likely maps, and their variances and covariances, can be computed if desired.
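The Hamiltonian Monte Carlo machinery underlying this inference can be illustrated with a minimal numpy sketch (this is not the Almanac code; a toy two-dimensional Gaussian posterior stands in for the ∼16.8-million-parameter problem, and all names and step sizes are hypothetical):

```python
import numpy as np

def leapfrog(q, p, grad_logp, step, n_steps):
    """Leapfrog integration of Hamiltonian dynamics (volume-preserving, reversible)."""
    p = p + 0.5 * step * grad_logp(q)
    for _ in range(n_steps - 1):
        q = q + step * p
        p = p + step * grad_logp(q)
    q = q + step * p
    p = p + 0.5 * step * grad_logp(q)
    return q, p

def hmc_sample(logp, grad_logp, q0, n_samples, step=0.1, n_steps=20, seed=0):
    """Draw samples from logp with Hamiltonian Monte Carlo."""
    rng = np.random.default_rng(seed)
    q, samples = np.asarray(q0, float), []
    for _ in range(n_samples):
        p = rng.standard_normal(q.shape)        # resample momenta
        q_new, p_new = leapfrog(q, p, grad_logp, step, n_steps)
        # Metropolis accept/reject on the change in total energy
        dH = (logp(q_new) - 0.5 * p_new @ p_new) - (logp(q) - 0.5 * p @ p)
        if np.log(rng.uniform()) < dH:
            q = q_new
        samples.append(q.copy())
    return np.array(samples)

# Toy target: a standard 2D Gaussian posterior
logp = lambda q: -0.5 * q @ q
grad = lambda q: -q
chain = hmc_sample(logp, grad, np.zeros(2), 2000)
```

The same accept/reject structure applies however many parameters the posterior has; the gradient is what lets HMC scale to millions of dimensions.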

Prideaux-Ghee J, Leclercq F, Lavaux G,
et al., 2023, Field-based physical inference from peculiar velocity tracers, *Monthly Notices of the Royal Astronomical Society*, Vol: 518, Pages: 4191-4213, ISSN: 0035-8711

We present a Bayesian hierarchical modelling approach to reconstruct the initial cosmic matter density field constrained by peculiar velocity observations. As our approach features a model for the gravitational evolution of dark matter to connect the initial conditions to late-time observations, it reconstructs the final density and velocity fields as natural byproducts. We implement this field-based physical inference approach by adapting the Bayesian Origin Reconstruction from Galaxies (BORG) algorithm, which explores the high-dimensional posterior through the use of Hamiltonian Monte Carlo sampling. We test the self-consistency of the method using random sets of mock tracers, and assess its accuracy in a more complex scenario where peculiar velocity tracers are non-linearly evolved mock haloes. We find that our framework self-consistently infers the initial conditions, density and velocity fields, and shows some robustness to model mis-specification. As compared to the state-of-the-art approach of constrained Gaussian random fields/Wiener filtering, our method produces more accurate final density and velocity field reconstructions. It also allows us to constrain the initial conditions by peculiar velocity observations, complementing in this aspect previous field-based approaches based on other cosmological observables.

Heavens A, Makinen TL, Lemos P,
et al., 2022, The cosmic graph: optimal information extraction from large-scale structure using catalogues, *Open Journal of Astrophysics*, Vol: 5, Pages: 1-16

We present an implicit likelihood approach to quantifying cosmological information over discrete catalogue data, assembled as graphs. To do so, we explore cosmological parameter constraints using mock dark matter halo catalogues. We employ Information Maximising Neural Networks (IMNNs) to quantify Fisher information extraction as a function of graph representation. We a) demonstrate the high sensitivity of modular graph structure to the underlying cosmology in the noise-free limit, b) show that graph neural network summaries automatically combine mass and clustering information through comparisons to traditional statistics, c) demonstrate that networks can still extract information when catalogues are subject to noisy survey cuts, and d) illustrate how nonlinear IMNN summaries can be used as asymptotically optimal compressed statistics for Bayesian simulation-based inference. We reduce the area of joint Ωm,σ8 parameter constraints with small (∼100 object) halo catalogues by a factor of 42 over the two-point correlation function, and demonstrate that the networks automatically combine mass and clustering information. This work utilises a new IMNN implementation over graph data in Jax, which can take advantage of either numerical or auto-differentiability. We also show that graph IMNNs successfully compress simulations away from the fiducial model at which the network is fitted, indicating a promising alternative to n-point statistics in catalogue simulation-based analyses.
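The Fisher information that IMNNs are trained to maximise can be sketched for a toy linear summary model (a hypothetical illustration, not the paper's Jax/graph implementation; the summaries, derivative simulations, and parameter are all assumed):

```python
import numpy as np

def fisher_from_sims(sims_fid, sims_plus, sims_minus, dtheta):
    """One-parameter Fisher information of a set of summary statistics,
    F = dmu^T C^{-1} dmu, with the mean derivative from finite differences
    and the covariance C estimated from fiducial simulations."""
    C = np.cov(sims_fid, rowvar=False)
    dmu = (sims_plus.mean(axis=0) - sims_minus.mean(axis=0)) / (2 * dtheta)
    return dmu @ np.linalg.solve(C, dmu)

# Toy Gaussian summaries with mean theta * a and unit covariance:
# the true Fisher information is a @ a = 14
rng = np.random.default_rng(0)
a, dtheta = np.array([1.0, 2.0, 3.0]), 0.1
noise = lambda n: rng.standard_normal((n, 3))
F = fisher_from_sims(noise(4000),                    # fiducial, theta = 1
                     1.1 * a + noise(4000),          # sims at theta = 1.1
                     0.9 * a + noise(4000),          # sims at theta = 0.9
                     dtheta)
```

In an IMNN the summaries would be network outputs rather than hand-picked statistics, with this same F (via its determinant) as the training objective.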

Abdalla E, Abellán GF, Aboubrahim A,
et al., 2022, Cosmology intertwined: A review of the particle physics, astrophysics, and cosmology associated with the cosmological tensions and anomalies, *Journal of High Energy Astrophysics*, Vol: 34, Pages: 49-211, ISSN: 2214-4048

The standard Cold Dark Matter (CDM) cosmological model provides a good description of a wide range of astrophysical and cosmological data. However, there are a few big open questions that make the standard model look like an approximation to a more realistic scenario yet to be found. In this paper, we list a few important goals that need to be addressed in the next decade, taking into account the current discordances between the different cosmological probes, such as the disagreement in the value of the Hubble constant H0, the σ8–S8 tension, and other less statistically significant anomalies. While these discordances can still be in part the result of systematic errors, their persistence after several years of accurate analysis strongly hints at cracks in the standard cosmological scenario and the necessity for new physics or generalisations beyond the standard model. In this paper, we focus on the 5.0σ tension between the Planck CMB estimate of the Hubble constant H0 and the SH0ES collaboration measurements. After showing the H0 evaluations made from different teams using different methods and geometric calibrations, we list a few interesting new physics models that could alleviate this tension and discuss how the next decade's experiments will be crucial. Moreover, we focus on the tension of the Planck CMB data with weak lensing measurements and redshift surveys, about the value of the matter energy density Ωm, and the amplitude or rate of the growth of structure (σ8, fσ8). We list a few interesting models proposed for alleviating this tension, and we discuss the importance of trying to fit a full array of data with a single model and not just one parameter at a time. Additionally, we present a wide range of other less discussed anomalies at a statistical significance level lower than the H0–S8 tensions which may also constitute hints towards new physics, and we discuss possible generic theoretical approaches that can collectively explain them.

Percival WJ, Friedrich O, Sellentin E,
et al., 2022, Matching Bayesian and frequentist coverage probabilities when using an approximate data covariance matrix, *Monthly Notices of the Royal Astronomical Society*, Vol: 510, Pages: 3207-3221, ISSN: 0035-8711

Observational astrophysics consists of making inferences about the Universe by comparing data and models. The credible intervals placed on model parameters are often as important as the maximum a posteriori probability values, as the intervals indicate concordance or discordance between models and with measurements from other data. Intermediate statistics (e.g. the power spectrum) are usually measured and inferences are made by fitting models to these rather than the raw data, assuming that the likelihood for these statistics has multivariate Gaussian form. The covariance matrix used to calculate the likelihood is often estimated from simulations, such that it is itself a random variable. This is a standard problem in Bayesian statistics, which requires a prior to be placed on the true model parameters and covariance matrix, influencing the joint posterior distribution. As an alternative to the commonly used independence Jeffreys prior, we introduce a prior that leads to a posterior that has approximately frequentist matching coverage. This is achieved by matching the covariance of the posterior to that of the distribution of true values of the parameters around the maximum likelihood values in repeated trials, under certain assumptions. Using this prior, credible intervals derived from a Bayesian analysis can be interpreted approximately as confidence intervals, containing the truth a certain proportion of the time for repeated trials. Linking frequentist and Bayesian approaches that have previously appeared in the astronomical literature, this offers a consistent and conservative approach for credible intervals quoted on model parameters for problems where the covariance matrix is itself an estimate.
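A toy numpy illustration of the underlying problem, why a simulation-estimated covariance cannot simply be inverted, using the well-known Hartlap bias factor for Gaussian-distributed simulations (a standard textbook result, not the matching prior proposed in this paper; the dimensions and counts below are hypothetical):

```python
import numpy as np

rng = np.random.default_rng(1)
p, n, trials = 5, 20, 2000           # data dimension, sims per covariance estimate
inv_trace = 0.0
for _ in range(trials):
    sims = rng.standard_normal((n, p))           # true covariance = identity
    C_hat = np.cov(sims, rowvar=False)           # sample covariance (a random matrix)
    inv_trace += np.trace(np.linalg.inv(C_hat)) / p
inv_trace /= trials                              # average diagonal of C_hat^{-1}
hartlap = (n - 1) / (n - p - 2)                  # expected bias factor, here 19/13
# the naive inverse overshoots the true inverse covariance by ~ the Hartlap factor
```

It is this randomness in the estimated covariance, propagated into parameter credible intervals, that the paper's choice of prior is designed to handle with frequentist matching coverage.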

Hu L, Heavens A, Bacon D, 2022, Light bending by the cosmological constant, Publisher: ArXiv

We revisit the question of whether the cosmological constant $\Lambda$ affects the cosmological gravitational bending of light, by numerical integration of the geodesic equations for a Swiss cheese model consisting of a point mass and a compensated vacuole, in a Friedmann-Robertson-Walker background. We find that there is virtually no dependence of the light bending on the cosmological constant that is not already accounted for in the angular diameter distances of the standard lensing equations, plus small modifications that arise because the bending is restricted to a finite region covered by the hole. The residual $\Lambda$ dependence for a $10^{13}\,M_{\odot}$ lens is at the level of 1 part in $10^7$, and even this might be accounted for by small changes in the hole size evolution as the photon crosses. We therefore conclude that there is no need for modification of the standard cosmological lensing equations in the presence of a cosmological constant.

Mootoovaloo A, Jaffe AH, Heavens AF,
et al., 2022, Kernel-based emulator for the 3D matter power spectrum from CLASS, *Astronomy and Computing*, Vol: 38, Pages: 100508-100508, ISSN: 2213-1337

The 3D matter power spectrum is a fundamental quantity in the analysis of cosmological data such as large-scale structure, 21 cm observations, and weak lensing. Existing computer models (Boltzmann codes) such as CLASS can provide it at the expense of immoderate computational cost. In this paper, we propose a fast Bayesian method to generate the 3D matter power spectrum for a given set of wavenumbers and redshifts. Our code allows one to calculate the following quantities: the linear matter power spectrum at a given redshift (the default is set to 0); the non-linear 3D matter power spectrum with/without baryon feedback; and the weak lensing power spectrum. The gradient of the 3D matter power spectrum with respect to the input cosmological parameters is also returned, which is useful for Hamiltonian Monte Carlo samplers and for Fisher matrix calculations. In our application, the emulator is accurate when evaluated at a set of cosmological parameters drawn from the prior, with the fractional uncertainty centred on 0. It is also considerably faster than CLASS, hence making the emulator amenable to sampling cosmological and nuisance parameters in a Monte Carlo routine. In addition, once the 3D matter power spectrum is calculated, it can be used with a specific redshift distribution to calculate the weak lensing and intrinsic alignment power spectra, which can then be used to derive constraints on cosmological parameters in a weak lensing data analysis problem. The software (emuPK) can be trained with any set of points and is distributed on GitHub; it comes with a pre-trained set of Gaussian Process (GP) models, based on 1000 Latin Hypercube (LH) samples, which roughly follow the priors for current weak lensing analyses.

Porqueres N, Heavens A, Mortlock D,
et al., 2021, Lifting weak lensing degeneracies with a field-based likelihood, *Monthly Notices of the Royal Astronomical Society*, Vol: 509, Pages: 3194-3202, ISSN: 0035-8711

We present a field-based approach to the analysis of cosmic shear data to infer jointly cosmological parameters and the dark matter distribution. This forward modelling approach samples the cosmological parameters and the initial matter fluctuations, using a physical gravity model to link the primordial fluctuations to the non-linear matter distribution. Cosmological parameters are sampled and updated consistently through the forward model, varying (1) the initial matter power spectrum, (2) the geometry through the distance-redshift relationship, and (3) the growth of structure and light-cone effects. Our approach extracts more information from the data than methods based on two-point statistics. We find that this field-based approach lifts the strong degeneracy between the cosmological matter density, Ωm, and the fluctuation amplitude, σ8, providing tight constraints on these parameters from weak lensing data alone. In the simulated four-bin tomographic experiment we consider, the field-based likelihood yields marginal uncertainties on σ8 and Ωm that are, respectively, a factor of 3 and 5 smaller than those from a two-point power spectrum analysis applied to the same underlying data.

Bahr-Kalus B, Bertacca D, Verde L,
et al., 2021, The Kaiser-Rocket effect: three decades and counting, *Journal of Cosmology and Astroparticle Physics*, Vol: 2021, Pages: 1-41, ISSN: 1475-7516

The peculiar motion of the observer, if not accurately accounted for, is bound to induce a well-defined clustering signal in the distribution of galaxies. This signal is related to the Kaiser rocket effect. Here we examine the amplitude and form of this effect, both analytically and numerically, and discuss possible implications for the analysis and interpretation of forthcoming cosmological surveys. For an idealistic cosmic variance dominated full-sky survey with a Gaussian selection function peaked at z ∼ 1.5 it is a > 5σ effect and it can in principle bias very significantly the inference of cosmological parameters, especially for primordial non-Gaussianity. For forthcoming surveys, with realistic masks and selection functions, the Kaiser rocket is not a significant concern for cosmological parameter inference except perhaps for primordial non-Gaussianity studies. However, it is a systematic effect, whose origin, nature and imprint on galaxy maps are well known and thus should be subtracted or mitigated. We present several approaches to do so.

Berera A, Brahma S, Brandenberger R,
et al., 2021, Quantum coherence of photons to cosmological distances, *Physical Review D: Particles, Fields, Gravitation and Cosmology*, Vol: 104, ISSN: 1550-2368

We identify potential sources of decoherence for U(1) gauge bosons from a cosmological standpoint. Besides interactions with different species in the cosmological medium, we also consider effects due to the expansion of the Universe, which can produce particles (especially scalars) that can potentially interact with the photon in a quantum state. We look in particular at the case of axionlike particles and their predicted decay channels in our analysis. These interactions are shown to have a negligible effect as far as decoherence goes. Interaction rates with cosmic microwave background radiation or through Thomson scattering are small, so that the interstellar medium remains the biggest decoherence factor. Thus, quantum teleportation experiments with photon energies in the range 1–10 keV should be feasible at cosmological distances up to the galaxy formation epoch or beyond (z∼100).

Leclercq F, Heavens A, 2021, On the accuracy and precision of correlation functions and field-level inference in cosmology, *Monthly Notices of the Royal Astronomical Society*, Vol: 506, Pages: L85-L90, ISSN: 0035-8711

We present a comparative study of the accuracy and precision of correlation function methods and full-field inference in cosmological data analysis. To do so, we examine a Bayesian hierarchical model that predicts lognormal (LN) fields and their two-point correlation function. Although a simplified analytic model, the LN model produces fields that share many of the essential characteristics of the present-day non-Gaussian cosmological density fields. We use three different statistical techniques: (i) a standard likelihood-based analysis of the two-point correlation function; (ii) a likelihood-free (simulation-based) analysis of the two-point correlation function; (iii) a field-level analysis, made possible by the more sophisticated data assimilation technique. We find that (a) standard assumptions made to write down a likelihood for correlation functions can cause significant biases, a problem that is alleviated with simulation-based inference; and (b) analysing the entire field offers considerable advantages over correlation functions, through higher accuracy, higher precision, or both. The gains depend on the degree of non-Gaussianity, but in all cases, including for weak non-Gaussianity, the advantage of analysing the full field is substantial.
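A minimal sketch of the lognormal field model at the heart of this comparison, δ = exp(g − σ²/2) − 1 with g Gaussian (the one-dimensional geometry, squared-exponential correlation, and parameter values here are illustrative assumptions, not the paper's setup):

```python
import numpy as np

def lognormal_field(n, corr_len, sigma_g, seed=0):
    """Draw a 1D lognormal density contrast: delta = exp(g - sigma_g^2/2) - 1,
    where g is a Gaussian field with a squared-exponential correlation.
    The shift makes <delta> = 0 while enforcing delta > -1, mimicking a
    physical density contrast."""
    rng = np.random.default_rng(seed)
    x = np.arange(n)
    cov = sigma_g**2 * np.exp(-0.5 * ((x[:, None] - x[None, :]) / corr_len)**2)
    cov += 1e-8 * np.eye(n)              # jitter for numerical stability
    g = rng.multivariate_normal(np.zeros(n), cov)
    return np.exp(g - 0.5 * sigma_g**2) - 1.0

delta = lognormal_field(512, 10.0, 0.5)
```

Raising sigma_g increases the skewness of δ, which is how the degree of non-Gaussianity is dialled up in such comparisons.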

Jung G, Namikawa T, Liguori M,
et al., 2021, The integrated angular bispectrum of weak lensing, *Journal of Cosmology and Astroparticle Physics*, Vol: 2021, Pages: 1-22, ISSN: 1475-7516

We investigate three-point statistics in weak lensing convergence, through the integrated bispectrum. This statistic involves measuring power spectra in patches, and is thus easy to measure, and avoids the complexity of estimating the very large number of possible bispectrum configurations. The integrated bispectrum principally probes the squeezed limit of the bispectrum. To be useful as a set of summary statistics, accurate theoretical predictions of the signal are required, and, assuming Gaussian sampling distributions, the covariance matrix. In this paper, we investigate through simulations how accurate the theoretical formulae for both the integrated bispectrum and its covariance are, finding that there are small inaccuracies in the theoretical signal, and more serious deviations in the covariance matrix, which may need to be estimated using simulations.
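The patch-based measurement described here can be sketched in one dimension: split the field into patches, measure each patch's mean and power spectrum, and correlate the two (a schematic estimator under assumed conventions, not the paper's spherical-sky implementation; the toy quadratic non-Gaussianity is hypothetical):

```python
import numpy as np

def integrated_bispectrum_1d(field, n_patches):
    """Position-dependent power spectrum estimate of the integrated
    bispectrum: correlate the patch mean with the patch power, per k-bin."""
    patches = np.array_split(field, n_patches)
    means = np.array([p.mean() for p in patches])
    # power spectrum of each mean-subtracted patch
    powers = np.array([np.abs(np.fft.rfft(p - p.mean()))**2 / len(p)
                       for p in patches])
    # correlate patch-mean fluctuation with patch-power fluctuation
    return ((means[:, None] - means.mean())
            * (powers - powers.mean(axis=0))).mean(axis=0)

rng = np.random.default_rng(3)
g = rng.standard_normal(4096)
delta = g + 0.5 * (g**2 - 1)     # toy non-Gaussian field with squeezed coupling
ib = integrated_bispectrum_1d(delta, 32)   # one value per wavenumber bin
```

Because only patch power spectra are measured, the estimator inherits their simplicity while isolating the squeezed (long-wavelength) bispectrum modes.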

Di Valentino E, Anchordoqui LA, Akarsu O,
et al., 2021, Snowmass2021-Letter of interest cosmology intertwined I: Perspectives for the next decade, *Astroparticle Physics*, Vol: 131, ISSN: 0927-6505

Di Valentino E, Anchordoqui LA, Akarsu O,
et al., 2021, Snowmass2021-Letter of interest cosmology intertwined II: The Hubble constant tension, *Astroparticle Physics*, Vol: 131, ISSN: 0927-6505

Di Valentino E, Anchordoqui LA, Akarsu O,
et al., 2021, Cosmology intertwined III: fσ8 and S8, *Astroparticle Physics*, Vol: 131, ISSN: 0927-6505

Di Valentino E, Anchordoqui LA, Akarsu O,
et al., 2021, Snowmass2021-Letter of interest cosmology intertwined IV: The age of the universe and its curvature, *Astroparticle Physics*, Vol: 131, ISSN: 0927-6505

Porqueres N, Heavens A, Mortlock D,
et al., 2021, Bayesian forward modelling of cosmic shear data, *Monthly Notices of the Royal Astronomical Society*, Vol: 502, Pages: 3035-3044, ISSN: 0035-8711

We present a Bayesian hierarchical modelling approach to infer the cosmic matter density field, and the lensing and the matter power spectra, from cosmic shear data. This method uses a physical model of cosmic structure formation to infer physically plausible cosmic structures, which accounts for the non-Gaussian features of the gravitationally evolved matter distribution and light-cone effects. We test and validate our framework with realistic simulated shear data, demonstrating that the method recovers the unbiased matter distribution and the correct lensing and matter power spectrum. While the cosmology is fixed in this test, and the method employs a prior power spectrum, we demonstrate that the lensing results are sensitive to the true power spectrum when this differs from the prior. In this case, the density field samples are generated with a power spectrum that deviates from the prior, and the method recovers the true lensing power spectrum. The method also recovers the matter power spectrum across the sky, but as currently implemented, it cannot determine the radial power since isotropy is not imposed. In summary, our method provides physically plausible inference of the dark matter distribution from cosmic shear data, allowing us to extract information beyond the two-point statistics and exploiting the full information content of the cosmological fields.

Heavens A, Sellentin E, Jaffe A, 2020, Extreme data compression while searching for new physics, *Monthly Notices of the Royal Astronomical Society*, Vol: 498, Pages: 3440-3451, ISSN: 0035-8711

Bringing a high-dimensional dataset into science-ready shape is a formidable challenge that often necessitates data compression. Compression has accordingly become a key consideration for contemporary cosmology, affecting public data releases, and reanalyses searching for new physics. However, data compression optimized for a particular model can suppress signs of new physics, or even remove them altogether. We therefore provide a solution for exploring new physics *during* data compression. In particular, we store additional agnostic compressed data points, selected to enable precise constraints of non-standard physics at a later date. Our procedure is based on the maximal compression of the MOPED algorithm, which optimally filters the data with respect to a baseline model. We select additional filters, based on a generalised principal component analysis, which are carefully constructed to scout for new physics at high precision and speed. We refer to the augmented set of filters as MOPED-PC. They enable an analytic computation of Bayesian evidences that may indicate the presence of new physics, and fast analytic estimates of best-fitting parameters when adopting a specific non-standard theory, without further expensive MCMC analysis. As there may be large numbers of non-standard theories, the speed of the method becomes essential. Should no new physics be found, then our approach preserves the precision of the standard parameters. As a result, we achieve very rapid and maximally precise constraints of standard and non-standard physics, with a technique that scales well to large dimensional datasets.
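The baseline MOPED compression this builds on can be sketched for a single parameter: the filter b = C⁻¹μ′/√(μ′ᵀC⁻¹μ′) reduces the data to one number while, for a linear model with fixed covariance, losing no Fisher information (a textbook sketch of baseline MOPED, not the MOPED-PC extension; the toy covariance and mean derivative are hypothetical):

```python
import numpy as np

def moped_filter(C, dmu):
    """MOPED filter for one parameter: b = C^{-1} mu' (normalised so b^T C b = 1)."""
    w = np.linalg.solve(C, dmu)
    return w / np.sqrt(dmu @ w)

# Toy linear model: data mean mu(theta) = theta * a, known covariance C
rng = np.random.default_rng(2)
d = 50
A = rng.standard_normal((d, d))
C = A @ A.T + d * np.eye(d)          # a well-conditioned covariance
a = rng.standard_normal(d)           # d(mu)/d(theta)

b = moped_filter(C, a)
F_full = a @ np.linalg.solve(C, a)   # Fisher information of the full dataset
F_comp = (b @ a)**2 / (b @ C @ b)    # Fisher information of the single compressed datum
# F_comp equals F_full: the compression is lossless for this model
```

MOPED-PC's extra filters would be further vectors of this kind, chosen to retain sensitivity to departures from the baseline model.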

Jimenez R, Heavens AF, 2020, The distribution of dark galaxies and spin bias, *Monthly Notices of the Royal Astronomical Society*, Vol: 498, Pages: L93-L97, ISSN: 0035-8711

In the light of the discovery of numerous (almost) dark galaxies from the ALFALFA and LITTLE THINGS surveys, we revisit the predictions of Jimenez et al. 1997, based on the Toomre stability of rapidly-spinning gas disks. We have updated the predictions for ΛCDM with parameters given by Planck18, computing the expected number densities of dark objects, and their spin parameter and mass distributions. Comparing with the data is more challenging, but where the spins are more reliably determined, they lie close to the threshold for disks to be stable according to the Toomre criterion, where the expected number density is highest. This reinforces the concept that there is a bias in the formation of luminous galaxies based on the spin of their parent halo.
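The Toomre criterion invoked here, Q = c_s κ/(πGΣ) for a gas disk, is straightforward to evaluate; the numbers below are illustrative order-of-magnitude choices, not values taken from the paper:

```python
import numpy as np

G = 6.674e-11    # gravitational constant, m^3 kg^-1 s^-2

def toomre_q(c_s, kappa, sigma):
    """Toomre stability parameter for a gas disk: Q = c_s * kappa / (pi * G * Sigma).
    Q > 1 means the disk is stable against local gravitational collapse,
    and so (on this picture) can remain dark."""
    return c_s * kappa / (np.pi * G * sigma)

# Illustrative (hypothetical) numbers for a low-surface-density gas disk
c_s = 8e3        # sound speed, m/s (~8 km/s, warm neutral gas)
kappa = 1e-15    # epicyclic frequency, 1/s (orbital period of a few hundred Myr)
sigma = 0.01     # gas surface density, kg/m^2 (~ a few Msun/pc^2)
q = toomre_q(c_s, kappa, sigma)   # comes out a few: stable, hence dark
```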

Mootoovaloo A, Heavens AF, Jaffe AH,
et al., 2020, Parameter Inference for Weak Lensing using Gaussian Processes and MOPED, *Monthly Notices of the Royal Astronomical Society*, Vol: 497, Pages: 2213-2226, ISSN: 0035-8711

In this paper, we propose a Gaussian Process (GP) emulator for the calculation both of tomographic weak lensing band powers, and of coefficients of summary data massively compressed with the MOPED algorithm. In the former case cosmological parameter inference is accelerated by a factor of ∼10–30 compared with the Boltzmann solver CLASS applied to KiDS-450 weak lensing data. Much larger gains of order 10³ will come with future data, and MOPED with GPs will be fast enough to permit the Limber approximation to be dropped, with acceleration in this case of ∼10⁵. A potential advantage of GPs is that an error on the emulated function can be computed and this uncertainty incorporated into the likelihood. However, it is known that the GP error can be unreliable when applied to deterministic functions, and we find, using the Kullback–Leibler divergence between the emulator and CLASS likelihoods, and from the uncertainties on the parameters, that agreement is better when the GP uncertainty is not used. In future, weak lensing surveys such as Euclid and the Legacy Survey of Space and Time will have up to ∼10⁴ summary statistics, and inference will be correspondingly more challenging. However, since the speed of MOPED is determined not by the number of summary data, but by the number of parameters, MOPED analysis scales almost perfectly, provided that a fast way to compute the theoretical MOPED coefficients is available. The GP provides such a fast mechanism.
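The core of such an emulator is plain Gaussian Process regression, sketched here with a fixed squared-exponential kernel (a minimal sketch, not the paper's trained emulator; the target function, training grid, and hyperparameters are all hypothetical):

```python
import numpy as np

def rbf(x1, x2, amp=1.0, ell=1.0):
    """Squared-exponential (RBF) kernel between two 1D point sets."""
    return amp**2 * np.exp(-0.5 * (x1[:, None] - x2[None, :])**2 / ell**2)

def gp_predict(x_train, y_train, x_test, noise=1e-6, amp=1.0, ell=1.0):
    """GP posterior mean and variance with fixed hyperparameters."""
    K = rbf(x_train, x_train, amp, ell) + noise * np.eye(len(x_train))
    Ks = rbf(x_test, x_train, amp, ell)
    alpha = np.linalg.solve(K, y_train)
    mean = Ks @ alpha
    v = np.linalg.solve(K, Ks.T)
    var = amp**2 - np.einsum('ij,ji->i', Ks, v)   # predictive variance
    return mean, var

# Emulate a smooth 'expensive' function from 12 training evaluations
x_tr = np.linspace(0, 5, 12)
y_tr = np.sin(x_tr)                 # stand-in for a costly Boltzmann-code output
x_te = np.linspace(0, 5, 50)
mean, var = gp_predict(x_tr, y_tr, x_te)
```

In a real emulator the training outputs would be band powers or MOPED coefficients over a Latin Hypercube of cosmological parameters, and the hyperparameters would be optimised; the predictive variance `var` is the GP uncertainty whose use in the likelihood the paper examines.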

Leclercq F, Faure B, Lavaux G,
et al., 2020, Perfectly parallel cosmological simulations using spatial comoving Lagrangian acceleration, *Astronomy and Astrophysics: a European journal*, Vol: 639, ISSN: 0004-6361

Context: Existing cosmological simulation methods lack a high degree of parallelism due to the long-range nature of the gravitational force, which limits the size of simulations that can be run at high resolution. Aims: To solve this problem, we propose a new, perfectly parallel approach to simulate cosmic structure formation, which is based on the spatial COmoving Lagrangian Acceleration (sCOLA) framework. Methods: Building upon a hybrid analytical and numerical description of particles' trajectories, our algorithm allows for an efficient tiling of a cosmological volume, where the dynamics within each tile is computed independently. As a consequence, the degree of parallelism is equal to the number of tiles. We optimised the accuracy of sCOLA through the use of a buffer region around tiles and of appropriate Dirichlet boundary conditions around sCOLA boxes. Results: We show that cosmological simulations at the degree of accuracy required for the analysis of the next generation of surveys can be run in drastically reduced wall-clock times and with very low memory requirements. Conclusions: The perfect scalability of our algorithm unlocks profoundly new possibilities for computing larger cosmological simulations at high resolution, taking advantage of a variety of hardware architectures.

Leclercq F, Enzi W, Jasche J,
et al., 2019, Primordial power spectrum and cosmology from black-box galaxy surveys, *Monthly Notices of the Royal Astronomical Society*, Vol: 490, Pages: 4237-4253, ISSN: 0035-8711

We propose a new, likelihood-free approach to inferring the primordial matter power spectrum and cosmological parameters from arbitrarily complex forward models of galaxy surveys where all relevant statistics can be determined from numerical simulations, i.e. black boxes. Our approach builds upon approximate Bayesian computation using a novel effective likelihood, and upon the linearisation of black-box models around an expansion point. Consequently, we obtain simple "filter equations" for an effective posterior of the primordial power spectrum, and a straightforward scheme for cosmological parameter inference. We demonstrate that the workload is computationally tractable, fixed a priori, and perfectly parallel. As a proof of concept, we apply our framework to a realistic synthetic galaxy survey, with a data model accounting for physical structure formation and incomplete and noisy galaxy observations. In doing so, we show that the use of non-linear numerical models allows the galaxy power spectrum to be safely fitted up to at least $k_\mathrm{max} = 0.5\,h$/Mpc, outperforming state-of-the-art backward-modelling techniques by a factor of $\sim 5$ in the number of modes used. The result is an unbiased inference of the primordial matter power spectrum across the entire range of scales considered, including a high-fidelity reconstruction of baryon acoustic oscillations. It translates into an unbiased and robust inference of cosmological parameters. Our results pave the path towards easy applications of likelihood-free simulation-based inference in cosmology.

Jones DM, Heavens AF, 2019, Gaussian mixture models for blended photometric redshifts, *Monthly Notices of the Royal Astronomical Society*, Vol: 490, Pages: 3966-3986, ISSN: 0035-8711

Future cosmological galaxy surveys such as the Large Synoptic Survey Telescope (LSST) will photometrically observe very large numbers of galaxies. Without spectroscopy, the redshifts required for the analysis of these data will need to be inferred using photometric redshift techniques that are scalable to large sample sizes. The high number density of sources will also mean that around half are blended. We present a Bayesian photometric redshift method for blended sources that uses Gaussian mixture models to learn the joint flux–redshift distribution from a set of unblended training galaxies, and Bayesian model comparison to infer the number of galaxies comprising a blended source. The use of Gaussian mixture models renders both of these applications computationally efficient and therefore suitable for upcoming galaxy surveys.
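The Gaussian-mixture machinery can be sketched with a small EM fit to mock joint flux–redshift data (a minimal numpy sketch, not the authors' implementation; the two mock galaxy populations and all parameter values are hypothetical):

```python
import numpy as np

def log_gauss(X, mu, cov):
    """Row-wise log-density of a multivariate Gaussian."""
    d = X.shape[1]
    dx = X - mu
    sol = np.linalg.solve(cov, dx.T).T
    return -0.5 * (np.einsum('ij,ij->i', dx, sol)
                   + np.log(np.linalg.det(cov)) + d * np.log(2 * np.pi))

def gmm_em(X, K, n_iter=100):
    """Fit a K-component Gaussian mixture (full covariances) by EM."""
    n, d = X.shape
    # deterministic init: spread initial means along the first feature
    order = np.argsort(X[:, 0])
    mu = X[order[[(2 * k + 1) * n // (2 * K) for k in range(K)]]].copy()
    cov = np.array([np.cov(X, rowvar=False) for _ in range(K)])
    w = np.full(K, 1.0 / K)
    for _ in range(n_iter):
        # E-step: posterior responsibility of each component for each point
        logr = np.stack([np.log(w[k]) + log_gauss(X, mu[k], cov[k])
                         for k in range(K)], axis=1)
        logr -= logr.max(axis=1, keepdims=True)
        r = np.exp(logr)
        r /= r.sum(axis=1, keepdims=True)
        # M-step: update weights, means and covariances
        nk = r.sum(axis=0)
        w = nk / n
        mu = (r.T @ X) / nk[:, None]
        for k in range(K):
            dx = X - mu[k]
            cov[k] = (r[:, k, None] * dx).T @ dx / nk[k] + 1e-6 * np.eye(d)
    return w, mu, cov

# Mock joint (flux, redshift) training data from two galaxy populations
rng = np.random.default_rng(1)
lo = rng.multivariate_normal([1.0, 0.5], [[0.05, 0.02], [0.02, 0.04]], 500)
hi = rng.multivariate_normal([3.0, 1.5], [[0.05, -0.02], [-0.02, 0.04]], 500)
w, mu, cov = gmm_em(np.vstack([lo, hi]), K=2)
```

Once fitted, the mixture gives an analytic joint density p(flux, z), from which p(z | flux) follows by conditioning each Gaussian component; this closed form is what makes the method computationally efficient.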

Jimenez R, Maartens R, Khalifeh AR,
et al., 2019, Measuring the homogeneity of the universe using polarization drift, *Journal of Cosmology and Astroparticle Physics*, Vol: 2019, ISSN: 1475-7516

We propose a method to probe the homogeneity of a general universe, without assuming symmetry. We show that isotropy can be tested at remote locations on the past lightcone by comparing the line-of-sight and transverse expansion rates, using the time dependence of the polarization of Cosmic Microwave Background photons that have been inverse-Compton scattered by the hot gas in massive clusters of galaxies. This probes a combination of remote transverse and parallel components of the expansion rate of the metric, and we may use radial baryon acoustic oscillations or cosmic clocks to measure the parallel expansion rate. Thus we can test remote isotropy, which is a key requirement of a homogeneous universe. We provide explicit formulas that connect observables and properties of the metric.

Schmit CJ, Heavens AF, Pritchard JR, 2019, The gravitational and lensing-ISW bispectrum of 21 cm radiation, *Monthly Notices of the Royal Astronomical Society*, Vol: 483, Pages: 4259-4275, ISSN: 0035-8711

Cosmic microwave background experiments from COBE to Planck have launched cosmology into an era of precision science, where many cosmological parameters are now determined to the per cent level. Next-generation telescopes, focusing on the cosmological 21 cm signal from neutral hydrogen, will probe enormous volumes in the low-redshift Universe, and have the potential to determine dark energy properties and test modifications of Einstein’s gravity. We study the 21 cm bispectrum due to gravitational collapse as well as the contribution by line-of-sight perturbations in the form of the lensing-ISW bispectrum at low redshifts (z ∼ 0.35−3), targeted by upcoming neutral hydrogen intensity mapping experiments. We compute the expected bispectrum amplitudes and use a Fisher forecast model to compare power spectrum and bispectrum observations of intensity mapping surveys by Canadian Hydrogen Intensity Mapping Experiment (CHIME), MeerKAT, and SKA-mid. We find that combined power spectrum and bispectrum observations have the potential to decrease errors on the cosmological parameters by an order of magnitude compared to Planck. Finally, we compute the contribution of the lensing-ISW bispectrum, and find that, unlike for the cosmic microwave background analyses, it can safely be ignored for 21 cm bispectrum observations.

Jones DM, Heavens AF, 2019, Bayesian photometric redshifts of blended sources, *Monthly Notices of the Royal Astronomical Society*, Vol: 483, Pages: 2487-2505, ISSN: 0035-8711

Photometric redshifts are necessary for enabling large-scale multicolour galaxy surveys to interpret their data and constrain cosmological parameters. While the increased depth of future surveys such as the Large Synoptic Survey Telescope (LSST) will produce higher precision constraints, it will also increase the fraction of sources that are blended. In this paper, we present a Bayesian photometric redshift (BPZ) method for blended sources with an arbitrary number of intrinsic components. This method generalizes existing template-based BPZ methods, and produces joint posterior distributions for the component redshifts that allow uncertainties to be propagated in a principled way. Using Bayesian model comparison, we infer the probability that a source is blended and the number of components that it contains. We extend our formalism to the case where sources are blended in some bands and resolved in others. Applying this to the combination of LSST- and Euclid-like surveys, we find that the addition of resolved photometry results in a significant improvement in the reduction of outliers over the fully blended case. We make available blendz, a Python implementation of our method.
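The model-comparison step described above reduces to weighting each candidate component count by its Bayesian evidence. A toy sketch of that bookkeeping, with invented log-evidence values (the real pipeline, blendz, computes these from template fits to the photometry):

```python
import numpy as np

# Hypothetical per-model log-evidences ln Z_k for a source modelled with
# k = 1, 2, 3 intrinsic components (values invented for illustration).
log_evidence = {1: -120.4, 2: -118.1, 3: -119.0}

# Prior over the number of components; flat here as a simplifying assumption.
log_prior = {k: np.log(1.0 / len(log_evidence)) for k in log_evidence}

# Posterior P(k | data) is proportional to Z_k * P(k), normalized over k.
log_post = {k: log_evidence[k] + log_prior[k] for k in log_evidence}
norm = np.logaddexp.reduce(list(log_post.values()))
post = {k: np.exp(lp - norm) for k, lp in log_post.items()}

# Probability that the source is blended = posterior mass on k > 1.
p_blended = sum(p for k, p in post.items() if k > 1)
```

Working with log-evidences and `logaddexp` avoids underflow, since raw evidences for real photometric likelihoods are astronomically small numbers.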

Amendola L, Appleby S, Avgoustidis A,
et al., 2018, Cosmology and fundamental physics with the Euclid satellite, *Living Reviews in Relativity*, Vol: 21, Pages: 1-345, ISSN: 1433-8351

Euclid is a European Space Agency medium-class mission selected for launch in 2020 within the Cosmic Vision 2015–2025 programme. The main goal of Euclid is to understand the origin of the accelerated expansion of the universe. Euclid will explore the expansion history of the universe and the evolution of cosmic structures by measuring shapes and redshifts of galaxies as well as the distribution of clusters of galaxies over a large fraction of the sky. Although the main driver for Euclid is the nature of dark energy, Euclid science covers a vast range of topics, from cosmology to galaxy evolution to planetary research. In this review we focus on cosmology and fundamental physics, with a strong emphasis on science beyond the current standard models. We discuss five broad topics: dark energy and modified gravity, dark matter, initial conditions, basic assumptions and questions of methodology in the data analysis. This review has been planned and carried out within Euclid’s Theory Working Group and is meant to provide a guide to the scientific themes that will underlie the activity of the group during the preparation of the Euclid mission.

Jeffrey N, Heavens AF, Fortio PD, 2018, Fast sampling from Wiener posteriors for image data with dataflow engines, *Astronomy and Computing*, Vol: 25, Pages: 230-237, ISSN: 2213-1337

We use Dataflow Engines (DFE) to construct an efficient Wiener filter of noisy and incomplete image data, and to quickly draw probabilistic samples of the compatible true underlying images from the Wiener posterior. Dataflow computing is a powerful approach using reconfigurable hardware, which can be deeply pipelined and is intrinsically parallel. The unique Wiener-filtered image is the minimum-variance linear estimate of the true image (if the signal and noise covariances are known) and the most probable true image (if the signal and noise are Gaussian distributed). However, many images are compatible with the data with different probabilities, given by the analytic posterior probability distribution referred to as the Wiener posterior. The DFE code also draws large numbers of samples of true images from this posterior, which allows for further statistical analysis. Naive computation of the Wiener-filtered image is impractical for large datasets, as it scales as O(N³), where N is the number of pixels. We use a messenger field algorithm, which is well suited to a DFE implementation, to draw samples from the Wiener posterior, that is, with the correct probability we draw samples of noiseless images that are compatible with the observed noisy image. The Wiener-filtered image can be obtained by a trivial modification of the algorithm. We demonstrate a lower bound on the speed-up, from drawing [Formula presented] samples of a [Formula presented] image, of 11.3 ± 0.8 with 8 DFEs in a 1U MPC-X box when compared with a 1U server presenting 32 CPU threads. We also discuss a potential application in astronomy, to provide better dark matter maps and improved determination of the parameters of the Universe.
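The messenger field algorithm sidesteps the dense solve by introducing an auxiliary field whose covariance T = τI is diagonal in every basis, splitting the noise covariance as N = N̄ + T and alternating two diagonal solves: one in pixel space (where N is diagonal) and one in Fourier space (where the stationary signal covariance S is diagonal). A toy 1-D sketch of the Wiener-filter variant, with an invented power spectrum and noise level rather than anything from the paper (the sampling variant adds random fluctuation terms to each sub-step):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 64
k = np.fft.fftfreq(n) * n
Sk = 1.0 / (1.0 + (np.abs(k) / 8.0) ** 2)  # assumed signal power spectrum
noise_var = rng.uniform(0.5, 1.0, n)       # inhomogeneous pixel-noise variance
d = rng.standard_normal(n)                 # stand-in for an observed noisy image

# Split N = Nbar + T with T = tau*I, tau = min_i N_ii, so Nbar >= 0.
tau = noise_var.min()
Nbar = noise_var - tau

s = np.zeros(n)
for _ in range(200):
    # Pixel space: messenger field given data and current signal estimate.
    t = (tau * d + Nbar * s) / (tau + Nbar)
    # Fourier space: Wiener step of signal given messenger field
    # (unitary FFT, so white noise of variance tau stays variance tau).
    s = np.fft.ifft(Sk / (Sk + tau) * np.fft.fft(t, norm='ortho'),
                    norm='ortho').real

# Direct dense Wiener filter s_WF = S (S + N)^{-1} d for comparison,
# built from the unitary DFT matrix -- the O(N^3) route the iteration avoids.
F = np.fft.fft(np.eye(n), norm='ortho')
S_pix = (F.conj().T @ np.diag(Sk) @ F).real
s_direct = S_pix @ np.linalg.solve(S_pix + np.diag(noise_var), d)
```

Each iteration costs only elementwise operations plus FFTs, O(N log N), and the fixed point of the iteration is exactly the Wiener solution, which is what makes the scheme attractive for deeply pipelined hardware like a DFE.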

Heavens AF, Di Valentino E, Melchiorri A,
et al., 2018, Bayesian Evidence against Harrison-Zel'dovich spectrum in tension cosmology, *Physical Review D - Particles, Fields, Gravitation and Cosmology*, Vol: 98, ISSN: 1550-2368

Current cosmological constraints on the scalar spectral index of primordial fluctuations ns in the Λ cold dark matter (ΛCDM) model have excluded the minimal scale-invariant Harrison-Zel’dovich model (ns=1; hereafter HZ) at high significance, providing support for inflation. In recent years, however, some tensions have emerged between different cosmological data sets that, if not due to systematics, could indicate the presence of new physics beyond the ΛCDM model. In light of these developments, we evaluate the Bayesian evidence against HZ in different data combinations and model extensions. Considering only the Planck temperature data, we find inconclusive evidence against HZ when including variations in the neutrino number Neff and/or the helium abundance YHe. Adding the Planck polarization data, on the other hand, yields strong evidence against HZ in the extensions we considered. Perhaps most interestingly, Planck temperature data combined with local measurements of the Hubble parameter [A. G. Riess et al., Astrophys. J. 826, 56 (2016); A. G. Riess et al., Astrophys. J. 861, 126 (2018)] give as the most probable model an HZ spectrum, with additional neutrinos. However, with the inclusion of polarization, standard ΛCDM is once again preferred, but the HZ model with extra neutrinos is not strongly disfavored. The possibility of fully ruling out the HZ spectrum is therefore ultimately connected with the solution to current tensions between cosmological data sets. If these tensions are confirmed by future data, then new physical mechanisms could be at work and an HZ spectrum could still offer a valid alternative.
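The comparisons above boil down to differences of log-evidences, read off against a Jeffreys-type scale. A toy sketch with invented log-evidence values (one common convention, following Trotta's modified Jeffreys scale, uses thresholds at |ln B| = 1, 2.5 and 5; the paper's exact conventions may differ):

```python
# Hypothetical log-evidences (numbers invented for illustration) for the
# HZ model (ns = 1) and for a model with ns free, under some data set.
ln_Z_HZ = -5231.7
ln_Z_free_ns = -5227.2

# ln Bayes factor; positive favours the free-ns model over HZ.
ln_B = ln_Z_free_ns - ln_Z_HZ

def jeffreys(ln_b):
    """Rough Jeffreys-scale reading of |ln B| (Trotta-style thresholds)."""
    b = abs(ln_b)
    if b < 1.0:
        return "inconclusive"
    if b < 2.5:
        return "weak"
    if b < 5.0:
        return "moderate"
    return "strong"

verdict = jeffreys(ln_B)
```

Note the evidence automatically penalizes the extra freedom of the free-ns model through its prior volume, which is why a Bayes factor comparison, unlike a raw likelihood-ratio, can favour the simpler HZ model.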

This data is extracted from the Web of Science and reproduced under a licence from Thomson Reuters. You may not copy or re-distribute this data in whole or in part without the written consent of the Science business of Thomson Reuters.