Abstract

In this paper we discuss a simple method of testing for the presence of energy-dependent dispersion in high-energy datasets. It uses the minimisation of the Kolmogorov distance between the cumulative distributions of two probability functions as the statistical metric for estimating the magnitude of any spectral dispersion within transient features in a lightcurve, and we show that it performs well in the presence of the modest energy resolutions (∼20%) typical of gamma-ray observations. After presenting the method in detail, we apply it to a parameterised simulated lightcurve based on the extreme VHE gamma-ray flare of PKS 2155-304 observed with H.E.S.S. in 2006, in order to illustrate its potential through the concrete example of setting constraints on quantum-gravity-induced Lorentz invariance violation (LIV) effects. We obtain limits comparable to those of the most advanced techniques used in LIV searches applied to similar datasets, while the present method has the advantage of being particularly straightforward to use. Although the development of the method was motivated by LIV searches, it is also applicable to other astrophysical situations where energy-dependent dispersion is expected, such as spectral lags arising from the acceleration and cooling of particles in relativistic outflows.
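As a rough illustration of the core idea only, and not of the implementation described later in the paper, the following Python sketch shows how a trial dispersion parameter could be scanned and the Kolmogorov distance between the arrival-time distributions of two energy bands minimised. The split of the event list into a low- and a high-energy band, the assumption of a linear (first-order) dispersion correction t → t − τE, and all function and variable names are assumptions made for this example.

```python
import numpy as np

def kolmogorov_distance(t_a, t_b):
    """Maximum vertical distance between the empirical CDFs of two samples."""
    grid = np.sort(np.concatenate([t_a, t_b]))
    cdf_a = np.searchsorted(np.sort(t_a), grid, side="right") / len(t_a)
    cdf_b = np.searchsorted(np.sort(t_b), grid, side="right") / len(t_b)
    return np.max(np.abs(cdf_a - cdf_b))

def scan_dispersion(times, energies, e_split, taus):
    """Scan trial dispersion parameters tau (e.g. s/TeV) and return the value
    that minimises the Kolmogorov distance between the de-dispersed
    arrival-time distributions of the low- and high-energy photons.
    Assumes a linear dispersion, i.e. the correction t -> t - tau * E."""
    low = energies < e_split
    distances = np.empty(len(taus))
    for i, tau in enumerate(taus):
        t_corr = times - tau * energies          # undo the trial dispersion
        distances[i] = kolmogorov_distance(t_corr[low], t_corr[~low])
    return taus[np.argmin(distances)], distances

# Example usage with hypothetical photon lists (times in s, energies in TeV):
# best_tau, d = scan_dispersion(times, energies, e_split=0.8,
#                               taus=np.linspace(-200, 200, 401))
```

The minimising value of τ gives the estimate of the spectral dispersion, and its uncertainty can then be assessed, for instance, from the shape of the distance curve or from simulated realisations of the lightcurve.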