It has been suggested that the prompt emission in gamma-ray bursts (GRBs) could be described by radiation from the photosphere in a hot fireball. Such models must be tested by directly fitting them to data. In this work we use data from the Fermi Gamma-ray Space Telescope and consider a specific photospheric model, in which the kinetic energy of a low-magnetization outflow is dissipated locally by internal shocks below the photosphere. We construct a table model with a physically motivated parameter space and fit it to time-resolved spectra of the 36 brightest Fermi GRBs with a known redshift. We find that about two-thirds of the examined spectra cannot be described by the model, as it typically underpredicts the observed flux. However, since the sample is strongly biased towards bright GRBs, we argue that this fraction will be significantly lower in the full population. From the successful fits we find that the model can reproduce the full range of spectral slopes present in the sample. For these cases we also find that the dissipation consistently occurs at a radius of ~10^12 cm and that an efficiency of only a few per cent is required. Furthermore, we find a positive correlation between the fireball luminosity and the Lorentz factor; such a correlation has previously been reported by independent methods. We conclude that if GRB spectra are of photospheric origin, the dissipation cannot occur solely through the specific scenario we consider here.
Cone search capability for table J/MNRAS/485/474/table2 (Best-fitting parameter values for accepted fits with DREAM1.2)