Theories for Cosmological Bursts

Theories for bursts at cosmological distances were developed after the BATSE observations invalidated the local Galactic neutron star hypothesis. Here I provide only a schematic outline.

The burst originates with the release of a large amount of energy in a small volume. The resulting processes erase most of the memory of the origin of this energy. The necessary energy release of more than 10^51 ergs (if bursts radiate isotropically) suggests a source related to the binding energy of a stellar mass, such as the merger of a neutron star-neutron star binary or an unusual, extremely energetic supernova (a "hypernova"). The necessary rate is approximately once per 10^5 years per L* galaxy (again assuming bursts radiate isotropically).
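To illustrate the energy scale, the isotropic-equivalent energy implied by an observed fluence S at luminosity distance d_L is E_iso = 4π d_L^2 S. The short sketch below uses an assumed fluence and distance (both hypothetical, chosen only for illustration) to show that fluences of order 10^-5 erg cm^-2 at Gpc distances imply energies above 10^51 ergs.

```python
import math

# Order-of-magnitude estimate of the isotropic-equivalent burst energy,
# E_iso = 4 * pi * d_L**2 * S, for an assumed fluence and distance.
# The numbers below are illustrative, not drawn from a specific burst.

CM_PER_GPC = 3.086e27          # centimeters per gigaparsec

fluence = 1e-5                 # assumed gamma-ray fluence [erg cm^-2]
d_L_gpc = 3.0                  # assumed luminosity distance [Gpc]

d_L = d_L_gpc * CM_PER_GPC     # distance in cm
E_iso = 4.0 * math.pi * d_L**2 * fluence   # isotropic-equivalent energy [erg]

print(f"E_iso ~ {E_iso:.1e} erg")   # ~1e52 erg for these inputs
```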

Neutron star-neutron star binaries are observed to exist and to decay by gravitational radiation, and the merger rate per galaxy should be sufficient. Since a comprehensive calculation has not yet been feasible, it is unknown whether a neutron star-neutron star merger will release sufficient energy for an observable burst. Davies et al. (1994) used a Newtonian smoothed particle hydrodynamics code to model the merger. Because the merged object is close to the maximum mass for a spinning neutron star, a disk of material is left in the equatorial plane, and of order 10^53 ergs is released in various forms that can be used to power the burst. Using a piecewise-parabolic hydrodynamics code, Ruffert and colleagues perform Newtonian calculations of a merger which include gravitational radiation and its back-reaction; they find that insufficient energy is released. Mathews and Wilson (1997, 1998) calculate the fully relativistic inspiral of the binary; their numerical methodology does not allow them to follow the binary to the final merger. However, they find that as a consequence of general relativistic effects the two neutron stars collapse to black holes before merging; before this collapse, the neutron stars heat up and radiate a large neutrino flux (~10^53 ergs). These various calculations include different physical processes, and thus reach divergent conclusions.
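For orientation, the ~10^53 erg figure quoted above is a modest fraction of the Newtonian gravitational binding energy of the merged remnant, roughly GM^2/R. The sketch below uses typical textbook values (my assumptions, not numbers from the calculations cited above) to show that even a small fraction of this binding energy exceeds the 10^51 ergs a burst requires.

```python
# Newtonian order-of-magnitude check of the ~1e53 erg scale quoted for
# neutron star mergers: the binding energy of the merged remnant is
# roughly G * M**2 / R.  The mass and radius below are typical values.

G = 6.674e-8        # gravitational constant [cgs]
M_SUN = 1.989e33    # solar mass [g]

M = 2.8 * M_SUN     # total mass of a ~1.4 + 1.4 solar-mass binary [g]
R = 1.5e6           # radius of the merged remnant, ~15 km [cm]

E_bind = G * M**2 / R
print(f"E_bind ~ {E_bind:.1e} erg")   # ~1e54 erg; even a few percent of
                                      # this exceeds the ~1e51 erg needed
```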

Hypernovae are currently only a theoretical construct, and therefore can simply be postulated to occur sufficiently frequently. Paczynski (1997) proposed a model in which a massive rotating star collapses to a black hole, leaving behind a disk of material which then accretes onto the black hole, releasing energy. Fuller and Shi (1998) suggest that the supernova of a supermassive star (M > 5×10^4 M☉) may emit a large enough neutrino flux to power the burst.
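As a rough check on the accretion scenario, the energy released is of order the disk mass times c^2 times an accretion efficiency. The disk mass and efficiency in the sketch below are assumptions chosen only to show the scale, not values taken from Paczynski (1997).

```python
# Rough energy budget for a disk of mass M_disk accreting onto a black
# hole with radiative efficiency eta.  Both numbers are assumptions
# chosen only to illustrate the energy scale.

C = 2.998e10            # speed of light [cm s^-1]
M_SUN = 1.989e33        # solar mass [g]

M_disk = 0.1 * M_SUN    # assumed disk mass [g]
eta = 0.1               # assumed accretion efficiency

E_acc = eta * M_disk * C**2
print(f"E_acc ~ {E_acc:.1e} erg")   # ~2e52 erg, comfortably above 1e51
```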

The simplest models assume that binary mergers and hypernovae, as endpoints of stellar evolution, occur in galaxies, and therefore that bursts occur in galaxies as well. Of course, alternatives are possible. A neutron star-neutron star binary may be ejected from its galaxy and may not burst until it has traveled a fair distance from the host galaxy. The supermassive star (M > 5×10^4 M☉) which might power the burst may form outside of a galaxy.

If the gamma-ray energy density is sufficiently large, the source region will be optically thick to pair creation. A pair plasma should result which will expand relativistically; the Lorentz factor of the fireball depends on the ratio of the energy to the rest-mass energy of the baryons swept up by the plasma. The original fireball models attributed the gamma-ray emission to the moment when the fireball becomes optically thin. However, this should produce a single short spike with a quasi-blackbody spectrum, and is therefore insufficient to reproduce the observed spectra and temporal structures. In the next generation of models, the "external" shocks which form when the fireball collides with the surrounding medium radiate the observed gamma-rays. However, the external shocks are not thought to be capable of producing the rich temporal structure unless the shocks radiate with very low efficiency. Consequently, in the current fireball model, inhomogeneities within the relativistic outflow result in "internal" shocks which radiate the observed gamma-ray emission. However, the "external" shocks should still radiate at lower energies on timescales much longer than the gamma-ray burst; this is the origin of the recently observed afterglows.
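The baryon-loading argument can be made concrete: when the fireball energy greatly exceeds the rest-mass energy of the entrained baryons, the terminal Lorentz factor is roughly Γ ≈ E / (M_b c^2). The sketch below uses an assumed energy and baryon load (illustrative numbers, not drawn from a particular model) to show that a load of only a few times 10^-6 M☉ yields the Lorentz factors of a few hundred or more that burst models require.

```python
# Terminal Lorentz factor of a baryon-loaded fireball: when the energy E
# greatly exceeds the baryon rest-mass energy, Gamma ~ E / (M_b * c**2).
# The energy and baryon load below are assumptions for illustration only.

C = 2.998e10        # speed of light [cm s^-1]
M_SUN = 1.989e33    # solar mass [g]

E = 1e52            # assumed fireball energy [erg]
M_b = 5e-6 * M_SUN  # assumed baryon load [g]

gamma = E / (M_b * C**2)
print(f"Gamma ~ {gamma:.0f}")   # ~1e3 for these inputs; bursts are thought
                                # to require Gamma of at least a few hundred
```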

