Monday, July 9, 2012

1207.1588 (R. C. Keenan et al.)

Testing for a large local void by investigating the Near-Infrared Galaxy Luminosity Function    [PDF]

R. C. Keenan, A. J. Barger, L. L. Cowie, W.-H. Wang, I. Wold, L. Trouille
Recent cosmological modeling efforts have shown that a local underdensity on scales of a few hundred Mpc (out to $z \sim 0.1$) could produce the apparent acceleration of the expansion of the universe observed via Type Ia supernovae. Several studies of galaxy counts in the near-infrared (NIR) have found that the local universe appears underdense by $\sim 25-50\%$ compared with regions a few hundred Mpc distant. Galaxy counts at low redshifts sample primarily $L \sim L^*$ galaxies. Thus, if the local universe is underdense, then the normalization of the NIR galaxy luminosity function (LF) at $z > 0.1$ should be higher than that measured for $z < 0.1$. Here we present a highly complete ($> 90\%$) spectroscopic sample of 1436 galaxies selected in the $H$-band ($1.6\,\mu$m) to study the normalization of the NIR LF at $0.1$ …

View original: http://arxiv.org/abs/1207.1588
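The argument in the abstract is that, since low-redshift counts are dominated by $L \sim L^*$ galaxies, a local under-density should show up as a lower normalization $\phi^*$ of the luminosity function nearby than at $z > 0.1$. A minimal sketch of this, using the standard Schechter form $\phi(L)\,dL = \phi^* (L/L^*)^\alpha e^{-L/L^*}\,d(L/L^*)$ with purely illustrative parameter values (none of the numbers below are the paper's measurements):

```python
import math

def schechter(L_over_Lstar, phi_star, alpha=-1.0):
    """Schechter luminosity function phi(L) in galaxies per Mpc^3 per
    unit L/L*. phi_star sets the overall normalization; alpha is the
    faint-end slope (alpha = -1.0 is an illustrative choice)."""
    x = L_over_Lstar
    return phi_star * x**alpha * math.exp(-x)

# Hypothetical normalizations: a "local" value and a value ~33% higher
# beyond z ~ 0.1, mimicking a ~25% local under-density.
phi_local, phi_distant = 0.006, 0.008   # Mpc^-3, illustrative only

# Near L ~ L* (x = 1), where low-redshift counts are concentrated, the
# predicted ratio of galaxy densities reduces to the ratio of phi_star:
ratio = schechter(1.0, phi_distant) / schechter(1.0, phi_local)
print(round(ratio, 3))  # -> 1.333
```

The shape terms cancel at fixed $L/L^*$, so comparing LF normalizations in redshift shells is a direct density probe — which is why the paper measures $\phi^*$ above and below $z \sim 0.1$ rather than raw counts alone.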
