Sitemap

A list of all the posts and pages found on the site. For you robots out there, an XML version is available for digesting as well.

Pages

Posts

Constraining dark matter annihilation and decay in large-scale structures

6 minute read

Published:

The identification of dark matter is a crucial task of modern physics. We present a full-sky, field-level search for dark matter annihilation and decay in the large-scale structure of the nearby universe, exploiting more information than conventional analyses targeting specific objects. We find no evidence for such effects, placing new constraints on the rates of dark matter interactions.

Is the speed of light energy dependent?

7 minute read

Published:

High energy astrophysical transients at cosmological distances allow us to test the fundamental assumptions of the standard models of cosmology and particle physics, such as the Weak Equivalence Principle, Lorentz Invariance or the massless nature of the photon. A violation of any of these would result in energy-dependent arrival times for photons from distant sources. We forward model these time delays for gamma ray bursts using the BORG SDSS-III/BOSS reconstruction and compare to data to constrain the quantum gravity energy scale, the mass of the photon, and violations of the Weak Equivalence Principle.
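For concreteness, the delays being modelled are usually parameterised as follows for a source at redshift $z$ (these are the standard expressions from the literature rather than anything quoted in the post; $E_h > E_l$ are observed photon energies, $\nu_h > \nu_l$ the corresponding frequencies, $h$ is Planck's constant, $n$ the order of the Lorentz-violating dispersion, and $E_{\rm QG}$ and $m_\gamma$ the quantum gravity energy scale and photon mass; the weak-equivalence-principle case depends on the line-of-sight gravitational potential and is omitted here):

$$\Delta t_{\rm LIV} = \frac{1+n}{2H_0}\,\frac{E_h^{\,n}-E_l^{\,n}}{E_{\rm QG}^{\,n}}\int_0^z \frac{(1+z')^{\,n}\,\mathrm{d}z'}{\sqrt{\Omega_{\rm m}(1+z')^3+\Omega_\Lambda}}$$

$$\Delta t_{m_\gamma} = \frac{m_\gamma^2 c^4}{2h^2 H_0}\left(\nu_l^{-2}-\nu_h^{-2}\right)\int_0^z \frac{\mathrm{d}z'}{(1+z')^2\sqrt{\Omega_{\rm m}(1+z')^3+\Omega_\Lambda}}$$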

portfolio

publications

Exhaustive Symbolic Regression

Published in IEEE Transactions on Evolutionary Computation 28, 950, 2023

Download paper here

Recommended citation: D.J. Bartlett, H. Desmond and P.G. Ferreira (2023). "Exhaustive Symbolic Regression." In IEEE Transactions on Evolutionary Computation 28, 950.

Priors for symbolic regression

Published in The Genetic and Evolutionary Computation Conference (GECCO) 2023 Workshop on Symbolic Regression, 2023

Download paper here

Recommended citation: D.J. Bartlett, H. Desmond and P.G. Ferreira (2023). "Priors for symbolic regression." In Proceedings of the Companion Conference on Genetic and Evolutionary Computation, Association for Computing Machinery, New York, NY, USA, 2402–2411.

talks

Constraining the photon’s dispersion relation

Published:

Non-standard physics, such as Lorentz Invariance Violation in Quantum Gravity models or a non-zero photon mass, can lead to an energy-dependent propagation speed for photons, such that photons of different energies from a distant source would arrive at different times even if they were emitted simultaneously. The short durations of, and large distances to, high energy astrophysical transients therefore allow us to test the fundamental assumptions of the standard models of cosmology and particle physics by considering the energy-dependent difference in photon arrival times (the spectral lag) of Gamma Ray Bursts (GRBs). Many previous attempts to place constraints on such theories use only a handful of GRBs, do not propagate uncertainties in the redshifts of the sources, or suffer from uncertain systematics in the model for other contributions to the spectral lag (noise). In this talk I will discuss recent work in which we overcame these challenges and constrained the quantum gravity energy scale and the photon mass by constructing probabilistic source-by-source forward models of the time delays for a large sample of GRBs, and I will demonstrate that these constraints are robust to the choice of noise model.
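As a rough illustration of what such a source-by-source forward model might look like, here is a minimal toy sketch, not the actual analysis code: the function names, the Gaussian intrinsic-lag noise model and all numbers are assumptions made purely for illustration.

```python
# Toy sketch: for one GRB, combine a linear-LIV time delay with a simple
# Gaussian "intrinsic lag" noise model and evaluate the likelihood of an
# observed spectral lag. All names, priors and numbers are illustrative.
import numpy as np
from scipy.integrate import quad

H0 = 70.0 / 3.086e19           # Hubble constant in s^-1 (70 km/s/Mpc)
Om, OL = 0.3, 0.7              # flat LCDM background (assumed)

def liv_delay(z, E_high_GeV, E_low_GeV, E_QG_GeV):
    """Linear (n = 1) Lorentz-invariance-violating time delay in seconds."""
    integrand = lambda zp: (1.0 + zp) / np.sqrt(Om * (1 + zp)**3 + OL)
    K, _ = quad(integrand, 0.0, z)
    return (E_high_GeV - E_low_GeV) / E_QG_GeV * K / H0

def log_likelihood(lag_obs, lag_err, z, E_high, E_low, E_QG,
                   mu_noise=0.0, sigma_noise=0.5):
    """Gaussian likelihood for one burst, marginalised analytically over a
    Gaussian intrinsic-lag (noise) component with assumed mean mu_noise and
    width sigma_noise (seconds)."""
    model = liv_delay(z, E_high, E_low, E_QG) + mu_noise
    var = lag_err**2 + sigma_noise**2
    return -0.5 * ((lag_obs - model)**2 / var + np.log(2 * np.pi * var))

# Example: one burst at z = 1 with a 0.2 +/- 0.1 s lag between 10 GeV and 0.1 GeV photons
print(log_likelihood(lag_obs=0.2, lag_err=0.1, z=1.0,
                     E_high=10.0, E_low=0.1, E_QG=1e19))
```

In the real analysis such terms would be summed over the full GRB sample, with per-source redshift uncertainties propagated and alternative noise models compared; the sketch is only meant to show the delay-plus-noise structure.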

Galactic-Scale Tests of Fundamental Physics

Published:

Conventional probes of fundamental physics tend to consider one of three regimes: small scales, cosmological scales or the strong-field regime. Since LCDM is known to have several galactic-scale issues and novel physics (modified gravity, non-cold dark matter etc.) can alter galactic dynamics and morphology, tests of fundamental physics on astrophysical scales can provide tight constraints which are complementary to traditional techniques. By forward-modelling observational signals on a source-by-source basis and marginalising over models describing other astrophysical and observational processes, it is possible to harness the constraining power of galaxies whilst accounting for their complexity. In this talk I will demonstrate how these Bayesian Monte Carlo-based forward models can be used to constrain a variety of gravitational theories and outline ways to assess their robustness to baryonic effects.

Testing Fundamental Physics with Gamma Ray Bursts

Published:

High energy astrophysical transients at cosmological distances allow us to test the fundamental assumptions of the standard models of cosmology and particle physics, such as Lorentz invariance, the massless nature of the photon or the weak equivalence principle. If any of these assumptions are incorrect, photons of different energies propagate differently through spacetime, which could be observable in the spectral lags of Gamma Ray Bursts. Constraining such violations can be challenging in the presence of uncertainties in the redshifts of sources, uncertain systematics in the model for other contributions to the spectral lag, and the long range of the gravitational potential. In this talk I will discuss how one can overcome these hurdles by constructing probabilistic source-by-source forward models and by combining constrained realisations of the local density field with unconstrained large-scale modes to obtain some of the tightest constraints on these models to date.

Learning Equations from Data: Exhaustive Symbolic Regression

Published:

Symbolic Regression (SR) algorithms learn analytic expressions which fit data accurately and in a highly interpretable manner. As such, these methods can be used to help uncover “physical laws” from data or provide simple and interpretable effective descriptions of complex, non-linear phenomena. Conventional SR suffers from two fundamental issues which I address here. First, typical SR methods search the space stochastically and hence do not necessarily find the best function. Second, the criteria used to select the equation optimally balancing accuracy with simplicity have been variable and poorly motivated. I will introduce a new method for SR, Exhaustive Symbolic Regression (ESR), which addresses both of these issues. To illustrate the power of ESR, I will apply it to a catalogue of cosmic chronometers and the Pantheon+ sample of supernovae to learn the Hubble rate as a function of redshift, finding ~40 functions (out of 5.2 million considered) that fit the data more economically than the Friedmann equation. I will then employ ESR to learn the form of the radial acceleration relation (RAR) of galaxy dynamics and therefore assess the claim that its asymptotic limits provide evidence for a new law of nature, namely Modified Newtonian Dynamics.
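To make the exhaustive idea concrete, here is a toy sketch in the spirit of ESR rather than the ESR code itself: the candidate list, the mock data and the crude accuracy-plus-complexity score are all simplifying assumptions; the real method enumerates every function up to a given complexity and ranks them with a principled description-length criterion.

```python
# Toy illustration of an exhaustive, rank-by-parsimony search:
# enumerate a hand-written list of candidate functions, fit their free
# parameters to mock H(z) data, and rank them by a crude
# "misfit + complexity" score.
import numpy as np
from scipy.optimize import curve_fit

# Mock cosmic-chronometer-like data: H(z) in km/s/Mpc with 5% errors
rng = np.random.default_rng(0)
z = np.linspace(0.1, 2.0, 20)
H_true = 70.0 * np.sqrt(0.3 * (1 + z)**3 + 0.7)
H_err = 0.05 * H_true
H_obs = H_true + rng.normal(0.0, H_err)

# A (tiny) exhaustive list of candidate forms and their parameter counts
candidates = {
    "a":                     (lambda z, a: a + 0 * z, 1),
    "a*(1+z)^b":             (lambda z, a, b: a * (1 + z)**b, 2),
    "a*sqrt(b*(1+z)^3+1-b)": (lambda z, a, b: a * np.sqrt(b * (1 + z)**3 + 1 - b), 2),
}

results = []
for name, (f, n_par) in candidates.items():
    popt, _ = curve_fit(f, z, H_obs, sigma=H_err, p0=np.ones(n_par))
    chi2 = np.sum(((H_obs - f(z, *popt)) / H_err)**2)
    # Crude stand-in for a description-length criterion:
    # data misfit plus a penalty per free parameter
    score = 0.5 * chi2 + 0.5 * n_par * np.log(len(z))
    results.append((score, name, popt))

for score, name, popt in sorted(results):
    print(f"{name:25s} score={score:7.2f} params={np.round(popt, 2)}")
```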

Towards Velocity Field Reconstructions with BORG

Published:

Accurate peculiar velocity field maps are critical for various cosmological analyses, including Hubble constant determinations and density field reconstructions. In this talk, I will discuss the challenges faced when reconstructing the peculiar velocity field from distance tracers, as well as the collective efforts of the Aquila Consortium in developing a physical, Bayesian hierarchical model for this task. We employ Bayesian hierarchical models, connecting the initial matter density with peculiar velocity data to reconstruct the final density and velocity fields. Utilising the BORG algorithm, this approach outperforms traditional methods, even in the face of model mis-specification. I will discuss the importance of the inhomogeneous Malmquist bias for obtaining unbiased velocity field reconstructions and will present results from simulations and peculiar velocity datasets demonstrating our model’s accuracy. In the latter part of the presentation, I will introduce a novel unified pipeline, which facilitates cosmological parameter sampling. This integration allows for the inclusion of peculiar velocity data in initial condition reconstructions which produce accurate and self-consistent density and velocity fields. This advancement holds significant implications for cosmology and astrophysics and could provide valuable insights into the S8 tension.
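For context, the link between the two fields is the standard linear-theory relation (textbook perturbation theory, not something stated in the abstract):

$$\mathbf{v}(\mathbf{k}) = i\,aHf\,\delta(\mathbf{k})\,\frac{\mathbf{k}}{k^2}, \qquad f \equiv \frac{\mathrm{d}\ln D}{\mathrm{d}\ln a}$$

so an inference of the density field immediately predicts peculiar velocities, while the data constrain only the line-of-sight component through noisy distance indicators, which is where the inhomogeneous Malmquist bias enters.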

The terms Eisenstein and Hu missed

Published:

The matter power spectrum of cosmology, $P(k)$, is of fundamental importance in cosmological analyses, yet solving the Boltzmann equations can be computationally prohibitive if it must be done several thousand times, e.g. in an MCMC. Emulators for $P(k)$ as a function of cosmology have therefore become popular, whether they be neural network or Gaussian process based. Yet one of the oldest emulators we have is an analytic, physics-informed fit proposed by Eisenstein and Hu (E&H). Given this is already accurate to within a few percent, does one really need a large, black-box, numerical method for calculating $P(k)$, or can one simply add a few terms to E&H? In this talk I demonstrate that Symbolic Regression can obtain such a correction, yielding sub-percent level predictions for $P(k)$.
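To illustrate the structure of such a correction, here is a minimal sketch only: the short BBKS fit stands in for the longer Eisenstein & Hu formula, and the correction term is a made-up placeholder, not the symbolic-regression result from the talk.

```python
# Sketch of the idea: take an analytic transfer-function fit and multiply
# it by a simple symbolic correction F(k; theta) whose form and
# coefficients one would learn with symbolic regression.
import numpy as np

def bbks_transfer(k, Om=0.3, h=0.7):
    """BBKS (1986) transfer function; k in h/Mpc, shape parameter Gamma = Om*h."""
    q = k / (Om * h)
    return (np.log(1 + 2.34 * q) / (2.34 * q)
            * (1 + 3.89 * q + (16.1 * q)**2
               + (5.46 * q)**3 + (6.71 * q)**4) ** -0.25)

def pk_analytic(k, A=2.0e4, ns=0.96):
    """Un-normalised linear P(k) from the analytic fit: A * k^ns * T(k)^2."""
    return A * k**ns * bbks_transfer(k)**2

def pk_corrected(k, theta=(0.05, 0.2)):
    """Analytic fit times an illustrative symbolic correction F(k; theta)."""
    a, b = theta
    F = 1 + a * np.sin(np.log(k)) * np.exp(-b * k)   # placeholder form, not the fitted result
    return pk_analytic(k) * F

k = np.logspace(-3, 0, 5)
print(pk_corrected(k) / pk_analytic(k))   # the correction factor itself
```

The point is simply that the learned expression enters as a cheap, analytic multiplicative factor on top of the existing physics-informed fit.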

teaching

Teaching experience 1

Undergraduate course, University 1, Department, 2014

This is a description of a teaching experience. You can use markdown like any other post.

Teaching experience 2

Workshop, University 1, Department, 2015

This is a description of a teaching experience. You can use markdown like any other post.