Abstract

Gamma-ray bursts (GRBs) are promising tools for tracing the formation of high-redshift stars, including the first generation. At very high redshifts the reverse shock emission lasts longer in the observer frame, and its importance for detection and analysis purposes relative to the forward shock increases. We consider two different models for the GRB environment, based on current ideas about the redshift dependence of gas properties in galaxies and primordial star formation. We calculate the observed flux as a function of redshift and observer time for typical GRB afterglows, taking into account intergalactic photoionization and Lyα absorption opacity, as well as extinction by the Milky Way. The fluxes in the X-ray and near-IR bands are compared with the sensitivities of different detectors such as Chandra, XMM, Swift XRT, and the James Webb Space Telescope (JWST). Using standard assumptions, we find that Chandra, XMM, and Swift XRT can potentially detect GRBs in the X-ray band out to very high redshifts, z ≳ 30. In the K and M bands, the JWST and ground-based telescopes are potentially able to detect GRBs even 1 day after the trigger out to z ∼ 16 and 33, respectively, if such bursts exist. While the X-ray band is insensitive to the external density and to reverse shocks, the near-IR bands provide a sensitive tool for diagnosing both the environment and the reverse shock component.
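As a rough illustration of why the K and M bands correspond to these redshift limits, the sketch below estimates the maximum redshift at which the Lyα break (rest-frame 1216 Å) still falls blueward of a band's short-wavelength edge. The assumed band edges (~2.0 μm for K, ~4.1 μm for M) are illustrative values, not taken from the paper.

```python
# Rough estimate of the maximum redshift at which a band remains unabsorbed
# by intergalactic Lyman-alpha opacity: the redshifted Lyman-alpha break,
# lambda_Lya * (1 + z), must stay blueward of the band's short-wavelength edge.
# Band edges below are assumed illustrative values, not figures from the paper.

LYA_REST_UM = 0.1216  # rest-frame Lyman-alpha wavelength in microns (1216 Angstrom)

band_blue_edge_um = {
    "K": 2.0,  # assumed blue edge of the K band (microns)
    "M": 4.1,  # assumed blue edge of the M band (microns)
}

for band, edge in band_blue_edge_um.items():
    z_max = edge / LYA_REST_UM - 1.0
    print(f"{band} band: Lyman-alpha break reaches {edge} um at z ~ {z_max:.0f}")

# Prints z ~ 15 for K and z ~ 33 for M, broadly consistent with the quoted
# detection limits of z ~ 16 and 33.
```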

  • Publication date: 2004-04-01