Matt Hilton, A. Kathy Romer, Scott T. Kay, Nicola Mehrtens, E. J. Lloyd-Davies, Peter A. Thomas, Chris J. Short, Julian A. Mayers, Philip J. Rooney, John P. Stott, Chris A. Collins, Craig D. Harrison, Ben Hoyle, Andrew R. Liddle, Robert G. Mann, Christopher J. Miller, Martin Sahlen, Pedro T. P. Viana, Michael Davidson, Mark Hosmer, Robert C. Nichol, Kivanc Sabirli, S. A. Stanford, Michael J. West
We measure the evolution of the X-ray luminosity-temperature (L_X-T) relation since z~1.5 using a sample of 211 serendipitously detected galaxy clusters with spectroscopic redshifts drawn from the XMM Cluster Survey first data release (XCS-DR1). This is the first study spanning this redshift range to use a single, large, homogeneous cluster sample. Using an orthogonal regression technique, we find no evidence for evolution in the slope or intrinsic scatter of the relation since z~1.5; both are consistent with previous measurements at z~0.1. However, the normalisation evolves negatively with respect to the self-similar expectation: we find E(z)^{-1} L_X = 10^{44.67 ± 0.09} (T/5)^{3.04 ± 0.16} (1+z)^{-1.5 ± 0.5}, which is within 2 sigma of the zero-evolution case. We see milder, but still negative, evolution with respect to self-similar when using a bisector regression technique. We compare our results to numerical simulations, fitting the simulated cluster samples with the same methods applied to the XCS data. Our data favour models in which the majority of the excess entropy required to explain the slope of the L_X-T relation is injected at high redshift. Simulations in which AGN feedback is implemented using prescriptions from current semi-analytic galaxy formation models predict positive evolution of the normalisation and differ from our data at more than 5 sigma, suggesting that more efficient feedback at high redshift may be needed in these models.
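As a rough illustration of the quoted best-fit relation, the sketch below evaluates E(z)^{-1} L_X = 10^{44.67} (T/5)^{3.04} (1+z)^{-1.5}, solved for L_X. The flat Lambda-CDM parameters (Omega_m = 0.3, Omega_Lambda = 0.7), the assumption that T is in keV with a 5 keV pivot, and the assumption that L_X is in erg/s are illustrative choices, not taken from the abstract; the function names are hypothetical.

```python
import numpy as np

# Illustrative flat Lambda-CDM parameters (assumed here, not quoted in the abstract).
OMEGA_M, OMEGA_L = 0.3, 0.7

def hubble_E(z):
    """Dimensionless Hubble parameter E(z) = H(z)/H0 for a flat Lambda-CDM model."""
    return np.sqrt(OMEGA_M * (1.0 + z) ** 3 + OMEGA_L)

def lx_best_fit(T_keV, z, logA=44.67, slope=3.04, evo=-1.5):
    """Evaluate the quoted orthogonal-fit relation,
    E(z)^-1 L_X = 10^logA * (T / 5 keV)^slope * (1 + z)^evo,
    solved for L_X (assumed to be in erg/s)."""
    return hubble_E(z) * 10.0 ** logA * (T_keV / 5.0) ** slope * (1.0 + z) ** evo

# Example: predicted luminosity of a 5 keV cluster at two redshifts.
for z in (0.1, 1.0):
    print(f"z = {z}: L_X ~ {lx_best_fit(5.0, z):.2e} erg/s")
```

Note that under the self-similar expectation the residual (1+z) exponent would be zero once the E(z) factor is divided out, so the fitted value of -1.5 ± 0.5 is what the abstract describes as negative evolution of the normalisation.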
View original: http://arxiv.org/abs/1205.5570