Laura Chomiuk, Matthew S. Povich
The star formation rate (SFR) of the Milky Way remains poorly known, with
often-quoted values ranging from 1 to 10 solar masses per year. This situation
persists despite the potential for the Milky Way to serve as the ultimate SFR
calibrator for external galaxies. We show that various estimates for the
Galactic SFR are consistent with one another once they have been normalized to
the same initial mass function (IMF) and massive star models, converging to 1.9
+/- 0.4 M_sun/yr. However, standard SFR diagnostics are vulnerable to
systematics rooted in the use of indirect observational tracers sensitive only
to high-mass stars. We find that absolute SFRs measured using resolved
low/intermediate-mass stellar populations in Galactic H II regions are
systematically higher by factors of ~2-3 as compared with calibrations for SFRs
measured from mid-IR and radio emission. We discuss some potential explanations
for this discrepancy and conclude that it could be resolved if (1) the power-law
slope of the IMF for intermediate-mass (1.5 M_sun < m < 5 M_sun) stars were
steeper than the Salpeter slope, or (2) a correction factor were applied to the
extragalactic 24 micron SFR calibrations to account for the duration of star
formation in individual mid-IR-bright H II regions relative to the lifetimes of
O stars. Finally, we present some approaches for testing if a Galactic SFR of
~2 M_sun/yr is consistent with what we would measure if we could view the Milky
Way as external observers. Using luminous radio supernova remnants and X-ray
point sources, we find that the Milky Way deviates from expectations at the 1-3
sigma level, hinting that perhaps the Galactic SFR is overestimated or
extragalactic SFRs need to be revised upwards.
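As a rough illustration of the IMF-normalization point (not taken from the paper; the slopes, mass limits, and SFR value below are placeholder assumptions), the sketch shows why SFR estimates based on high-mass tracers must assume an IMF: the tracer fixes the mass formed above some threshold, and the IMF sets how far that is extrapolated to the total mass formed.

```python
# Minimal sketch, assuming a single power-law IMF dN/dm ~ m**(-slope)
# between illustrative mass limits. Not the paper's calculation.
from scipy.integrate import quad

def mass_fraction_above(m_trace, slope, m_lo=0.1, m_hi=120.0):
    """Fraction of total stellar mass formed in stars with m > m_trace."""
    mass_weighted = lambda m: m * m**(-slope)     # unnormalized mass-weighted IMF
    total, _ = quad(mass_weighted, m_lo, m_hi)
    high, _ = quad(mass_weighted, m_trace, m_hi)
    return high / total

# A high-mass tracer (e.g. ionizing photons) fixes the mass formed above ~8 M_sun,
# so the inferred total SFR scales as 1 / (mass fraction above 8 M_sun).
f_salpeter = mass_fraction_above(8.0, slope=2.35)   # Salpeter slope
f_steeper = mass_fraction_above(8.0, slope=2.70)    # hypothetical steeper slope
sfr_salpeter = 1.0                                   # M_sun/yr, placeholder value
print(sfr_salpeter * f_salpeter / f_steeper)         # same tracer, different IMF
```

Changing the assumed slope by a few tenths moves the inferred SFR by tens of percent, which is why the abstract stresses normalizing all estimates to the same IMF before comparing them.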
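And as a hedged back-of-envelope sketch of the second proposed fix (all numbers here are placeholder assumptions, not the paper's values): a resolved census converts a YSO count into an SFR by dividing the mass formed by the star-forming duration, while a 24 micron calibration is tied to the O stars powering the emission; if star formation in a region has run for less time than an O star lives, the calibrated SFR can come out low by roughly the ratio of those timescales.

```python
# Illustrative comparison of the two approaches; masses and timescales are assumptions.

def sfr_from_yso_counts(n_yso, mean_mass_msun=0.5, sf_duration_yr=2e6):
    """SFR from a resolved census: total stellar mass formed over the SF duration."""
    return n_yso * mean_mass_msun / sf_duration_yr

def correct_midir_sfr(sfr_midir, sf_duration_yr=2e6, o_star_lifetime_yr=4e6):
    """Duration correction: scale a calibration averaged over the O-star lifetime
    up to the shorter star-forming duration of an individual H II region."""
    return sfr_midir * o_star_lifetime_yr / sf_duration_yr

# Hypothetical region: 2000 YSOs versus a mid-IR-calibrated 2.5e-4 M_sun/yr.
print(sfr_from_yso_counts(2000))     # ~5e-4 M_sun/yr from the resolved census
print(correct_midir_sfr(2.5e-4))     # correction moves the mid-IR value toward it
```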
View original: http://arxiv.org/abs/1110.4105