Gopolang M. Mohlabeng, John P. Ralston
We find a highly significant correlation in the Type Ia supernova magnitudes of the Union 2.1 compilation of 580 sources. The correlation between magnitude residuals relative to the \Lambda CDM model and color \times redshift has a significance equivalent to 13 standard deviations, as evaluated by randomly shuffling the data. We generalize the standard B-V color correction to include a Taylor series in redshift z. The goodness-of-fit \chi^{2} decreases by more than 50 units using one additional parameter linear in color \times redshift. The new parameter shifts the supernova best-fit cosmological dark energy density parameter from \Omega_{\Lambda} = 0.71 \pm 0.02 to \Omega_{\Lambda} = 0.74 \pm 0.02, assuming a flat universe. Varying \Omega_{m} and \Omega_{\Lambda} separately produces \Omega_{m} + \Omega_{\Lambda} = 1 within errors. The color-redshift correlation is quite robust: it cannot be attributed to outliers, and it passes several tests indicating that it does not originate in data selection or systematic error assignments. One physical interpretation is that supernovae or their environments evolve significantly with increasing redshift. The previously known rule that bluer supernovae have larger absolute luminosity tends to flatten out observationally with increasing redshift.
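The shuffle-based significance quoted in the abstract is a standard permutation test: correlate the Hubble-diagram residuals with color \times redshift, then repeatedly shuffle the residuals to build a null distribution. The sketch below illustrates the idea with placeholder Gaussian data; the array names and values are assumptions for illustration, not the Union 2.1 catalog, and the closing comment notes how promoting the constant color coefficient \beta to \beta + \beta_{z} z adds the single extra parameter linear in color \times redshift described above.

```python
import numpy as np

# Placeholder arrays standing in for the Union 2.1 catalog (assumed names):
#   mu_resid : magnitude residuals relative to the best-fit Lambda-CDM model
#   c        : B-V style color parameter
#   z        : redshift
rng = np.random.default_rng(0)
n = 580
mu_resid = rng.normal(0.0, 0.15, n)
c = rng.normal(0.0, 0.1, n)
z = rng.uniform(0.015, 1.4, n)

# The combination tested in the paper: color times redshift.
x = c * z
r_obs = np.corrcoef(x, mu_resid)[0, 1]

# Permutation ("shuffle") test: destroy any real association by randomly
# permuting the residuals, and count how often the shuffled correlation
# is at least as strong as the observed one.
n_shuffles = 10_000
r_null = np.array([np.corrcoef(x, rng.permutation(mu_resid))[0, 1]
                   for _ in range(n_shuffles)])
p_value = np.mean(np.abs(r_null) >= np.abs(r_obs))
print(f"observed r = {r_obs:+.3f}, shuffle p-value = {p_value:.3g}")

# The generalized color correction amounts to replacing the constant color
# coefficient beta in the distance-modulus fit by beta + beta_z * z, which
# introduces exactly one extra fit parameter multiplying c * z.
```

On these placeholder data the correlation is of course consistent with zero; on the real compilation the authors report a significance equivalent to 13 standard deviations.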
View original:
http://arxiv.org/abs/1303.0580