# triplesystem - testing Einstein's gravity with PSR J0337+17

This repository holds code, notes, and perhaps data for studying the millisecond pulsar PSR J0337+17. The pulsar sits in a stellar triple system, and its motion is sensitive to possible violations of Einstein's Strong Equivalence Principle: if gravitational binding energy falls differently from other kinds of energy, we might see the effect in the pulse arrival times from the pulsar.

## Code

To study this system, we need to be able to describe the motion of the bodies in the system. Fundamentally, this is just an n-body gravitational code for n=3. There are some complexities:

- We need to evaluate the solution at the precise times at which our observations occur, not just at the ends of integrator steps: we need "event detection" (see the dense-output sketch below).
- We need to integrate for millions of steps and obtain positions, themselves on the order of a hundred light-seconds, that are good at the light-nanosecond level. The accumulation of round-off error means we need long doubles, possibly with quad precision as a check.
- We need equations of motion that take into account post-Newtonian effects that may not be captured by general relativity. Although the interactions between the bodies happen at first post-Newtonian order, the fields within the bodies - at least within the pulsar - may well not be first-order. We need a 1PPN Lagrangian and symbolic computation to derive equations of motion from it (see the sympy sketch below).
- We need to keep track of relativistic time dilation, both special- and general-relativistic: we need components in the integrator that are not symplectic.
- We need a fitting algorithm that works in spite of the numerical roughness introduced into the solution space by round-off error and by branching within the integrator and root-finders. The well-established MINUIT copes with this to some extent, and MCMC algorithms are not troubled by roughness.
- We need to evaluate many candidate orbits, and each integration takes several seconds, so we need to parallelize our MCMC algorithm (see the pool-based sampling sketch below).
- We need to fit many parameters, some of which (the spin period, for example) are very tightly constrained and strongly correlated with other parameters. We need to analytically marginalize over all the parameters that can be fit by a linear least-squares process (see the marginalization sketch below).
- We need to understand the dependence of our residuals on each of our many parameters: we need a numerical differentiation algorithm that is robust in the presence of noise and works in very high dimensions (see the slope-fit sketch below).
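For event detection, here is a minimal sketch using SciPy's dense output as a stand-in for the Boost C++ solver used in code-v1: the integrator takes whatever adaptive steps it likes, and a continuous interpolant is then evaluated exactly at the observation epochs. The toy right-hand side and the epochs are invented for illustration.

```python
import numpy as np
from scipy.integrate import solve_ivp

def rhs(t, y):
    # Toy Newtonian one-body problem with GM = 1; y = [x, y, vx, vy].
    x, y_, vx, vy = y
    r3 = np.hypot(x, y_) ** 3
    return [vx, vy, -x / r3, -y_ / r3]

t_obs = np.array([0.3, 1.7, 2.9, 4.1])  # invented "observation" epochs
sol = solve_ivp(rhs, (0.0, 5.0), [1.0, 0.0, 0.0, 1.0],
                dense_output=True, rtol=1e-12, atol=1e-12)
positions = sol.sol(t_obs)[:2].T  # interpolated exactly at the epochs
```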
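For deriving equations of motion symbolically, a sketch of the same sympy machinery applied to a toy Lagrangian (a 1D Newtonian two-body system stands in for the full 1PPN Lagrangian; the real notebook emits the resulting expressions as C++ via sympy's code printers):

```python
import sympy

t = sympy.symbols('t')
G, m1, m2 = sympy.symbols('G m1 m2', positive=True)
x1, x2 = sympy.Function('x1')(t), sympy.Function('x2')(t)

# Toy Lagrangian: two point masses on a line, with x2 > x1 assumed.
L = m1 * x1.diff(t)**2 / 2 + m2 * x2.diff(t)**2 / 2 + G * m1 * m2 / (x2 - x1)

# Euler-Lagrange equations, solved for the accelerations.
eqs = [sympy.Eq(L.diff(q.diff(t)).diff(t) - L.diff(q), 0) for q in (x1, x2)]
acc = sympy.solve(eqs, [x1.diff(t, 2), x2.diff(t, 2)])

# The C printer turns the result into code for the right-hand-side file.
a1 = acc[x1.diff(t, 2)].subs({x1: sympy.Symbol('x1'), x2: sympy.Symbol('x2')})
print(sympy.ccode(a1))  # prints a C expression like G*m2/pow(x1 - x2, 2)
```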
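For parallelizing the MCMC step, one standard pattern - a sketch assuming emcee's ensemble sampler, not necessarily the scheme code-v1 uses - is to hand the sampler a multiprocessing pool so that each walker's several-second orbit integration runs in its own worker process. The log-probability below is a trivial stand-in:

```python
import numpy as np
import emcee
from multiprocessing import Pool

def log_prob(theta):
    # Stand-in for: integrate the orbit for parameters theta, compare the
    # predicted arrival times to the data, return the log-likelihood.
    return -0.5 * np.dot(theta, theta)

if __name__ == "__main__":
    ndim, nwalkers = 14, 64
    p0 = 1e-3 * np.random.randn(nwalkers, ndim)
    with Pool() as pool:
        sampler = emcee.EnsembleSampler(nwalkers, ndim, log_prob, pool=pool)
        sampler.run_mcmc(p0, 500)
```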
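For the analytic marginalization, the key fact is that parameters entering the model linearly have a closed-form weighted least-squares solution, so the nonlinear fitter never has to search over them. A minimal sketch (the function name and calling convention are invented, not the threebody API):

```python
import numpy as np

def profile_linear(residuals, design, uncertainties):
    """Remove the best-fit linear combination of basis functions (the
    columns of `design`) from the residuals, returning both the cleaned
    residuals and the fitted amplitudes."""
    w = 1.0 / uncertainties
    amps, *_ = np.linalg.lstsq(design * w[:, None], residuals * w, rcond=None)
    return residuals - design @ amps, amps
```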
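For the numerical differentiation, a sketch of one noise-robust approach (illustrative, not necessarily the algorithm used here): evaluate the function at several offsets along one parameter axis and take the slope of a least-squares line, which averages down the round-off roughness that defeats two-point finite differences.

```python
import numpy as np

def noisy_partial(f, x, i, scale, n=7):
    """Estimate df/dx[i] at x by fitting a line to n evaluations of f
    spread over +/- scale along parameter axis i."""
    offsets = np.linspace(-scale, scale, n)
    values = np.empty(n)
    for k, h in enumerate(offsets):
        xk = np.array(x, dtype=float)
        xk[i] += h
        values[k] = f(xk)
    slope, _intercept = np.polyfit(offsets, values, 1)
    return slope
```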

The first pass at the code works, managing all these complexities, but it is messy, hard to understand, and not at all portable. It lives in `code-v1`. I plan to gradually migrate the portable and usable bits into a new, tidier implementation in `code-v2`, ultimately providing general-purpose tools.

The general structure of the code is a series of IPython notebooks that provide a user interface of sorts. These call into a Python module, `threebody`, that implements the fitting-related tools. That in turn calls into a Cython module that handles root-finding and drives the C++ ODE solver (from Boost). The right-hand side of the ODE is a C++ file generated by one of the notebooks running sympy.

## Data

The raw data has now been collected on dop263; it lives in `/psr_archive/hessels/archibald/0337+17/raw`. A slightly processed version, in which each observation has been collected into a single directory, lives at `/psr_archive/hessels/archibald/0337+17/obs`. That filesystem is meant for permanent archival storage, so processed data lives on local disk at `/data/archibald/0337+1715/`; this includes a symlinked copy of the raw data. Among the processing tools are tools to manage this (possibly ill-advised) symlink scheme, sketched below.
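A minimal sketch of what such a symlink-management tool might do (the function name and the destination subdirectory are invented): mirror the archival raw tree into the working area as symlinks, so processing never writes to the archive.

```python
import os

def mirror_as_symlinks(raw_root, work_root):
    """Recreate raw_root's directory tree under work_root, with each file
    replaced by a symlink back to the archival copy."""
    for dirpath, _dirnames, filenames in os.walk(raw_root):
        rel = os.path.relpath(dirpath, raw_root)
        target_dir = os.path.join(work_root, rel)
        os.makedirs(target_dir, exist_ok=True)
        for name in filenames:
            link = os.path.join(target_dir, name)
            if not os.path.lexists(link):  # don't clobber existing links
                os.symlink(os.path.join(dirpath, name), link)

# e.g. mirror_as_symlinks("/psr_archive/hessels/archibald/0337+17/raw",
#                         "/data/archibald/0337+1715/raw")
```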

## Processing

The raw data must be processed in order to yield pulse arrival time measurements. This involves several steps (a skeleton of the flow follows the list):

- Calibration - apply telescope-based calibration files to correct polarization gains and zap known-bad frequencies
- RFI zapping - automatic detection and removal of interference
- Alignment - all the data was taken with approximate ephemerides for the pulsar's motion; this must be corrected before any downsampling can be done
- Downsampling - averaging in time and frequency down to suitable intervals
- TOA generation - compute the phase shifts between the downsampled files and a standard profile to obtain pulse arrival times
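A skeleton of the per-observation flow, with stub functions standing in for the real external tools (all names here are invented placeholders, not the actual pipeline API); the point is the ordering, in particular that alignment must precede downsampling:

```python
# Placeholder stages; each real stage calls out to external pulsar software.
def calibrate(path):   return path  # apply telescope calibration files
def zap_rfi(path):     return path  # automatically excise interference
def align(path):       return path  # install a corrected ephemeris first...
def downsample(path):  return path  # ...then average in time and frequency
def make_toas(path):   return path  # cross-correlate with a standard profile

def process_observation(path):
    for step in (calibrate, zap_rfi, align, downsample, make_toas):
        path = step(path)
    return path
```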

The existing tool `nanopipe` is designed to carry out most of this process automatically for large numbers of observations from multiple pulsars and multiple telescopes. Unfortunately, it is written as Python that generates Makefiles, and this architecture is not really flexible enough to accommodate many of the things I wanted the pipeline to do.

The pipeline, which includes tools to import new observations, process and reprocess existing observations, and manage multiple processing runs, lives in `processing`.

## Write-up

For the moment, I am writing mostly Markdown documentation, in files like this one and in IPython notebooks. Ultimately, of course, we will be writing one or more papers for publication, and this seems like as good a place as any to keep them while in progress, including plots and perhaps tables generated by the notebooks.
