Astrophysicists have solved the fifty-year-old mystery of star formation in the Milky Way

(ORDO NEWS) — American and Korean theorists have developed a new model of the Milky Way showing that properly accounting for the various factors that limit the birth of new stars can explain the "paradoxically low" star-formation rates actually observed in the Galaxy.

The expected rate of star birth is strongly influenced by the magnetic fields threading interstellar space, by gas turbulence, and by the presence of massive stars that radiate enormous amounts of energy and disperse the gas clouds around them.

The problem of suspiciously low star-formation rates in the Milky Way and its galactic neighbors (150-180 times lower than theoretical estimates) has confronted science for 50 years. Explaining such rates seemed to require unrealistically strong magnetic fields and constant, powerful turbulence.

A team of researchers led by Neil Evans at the University of Texas at Austin approached this long-standing problem in a new way, focusing on the two main parameters that set the star-formation rate: the abundance of molecular gas clouds and the efficiency with which those clouds form stars.

The total mass of a molecular cloud determines whether the cloud is held together by gravitational forces and how long it takes for it to contract, eventually collapsing into a star.
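The contraction timescale mentioned above is usually characterized by the gravitational free-fall time, t_ff = sqrt(3π / (32 G ρ)). A minimal sketch of this standard textbook formula (not the article's actual calculation; the cloud density used is a typical illustrative value):

```python
import math

G = 6.674e-8          # gravitational constant in cgs units (cm^3 g^-1 s^-2)
M_H = 1.6726e-24      # mass of a hydrogen atom, g
SEC_PER_MYR = 3.156e13  # seconds in one million years

def free_fall_time_myr(n_h2: float, mu: float = 2.8) -> float:
    """Free-fall time t_ff = sqrt(3*pi / (32*G*rho)) in Myr for a cloud
    with mean H2 number density n_h2 (cm^-3); mu = 2.8 is the mean mass
    per H2 molecule in hydrogen-atom units, including helium."""
    rho = n_h2 * mu * M_H                            # mass density, g cm^-3
    t_ff = math.sqrt(3 * math.pi / (32 * G * rho))   # seconds
    return t_ff / SEC_PER_MYR

# A typical giant molecular cloud (n_H2 ~ 100 cm^-3) collapses on a
# timescale of a few million years:
print(round(free_fall_time_myr(100.0), 1))  # ~3 Myr
```

Denser clouds collapse faster (t_ff scales as 1/sqrt(ρ)), which is why the mass and density of a cloud jointly control how quickly it can turn gas into stars.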

Molecular clouds are composed mostly of hydrogen, but molecular hydrogen is difficult to measure directly, so astrophysicists infer the total cloud mass from the abundance of tracer molecules of other substances present in the clouds.

The authors of the article used maps of carbon monoxide (CO) emission in the Milky Way to estimate the masses of star-forming clouds, applying a conversion factor that depends on the abundance of metals (in astrophysicists' jargon, all elements heavier than helium are called metals) and varies with distance from the center of the Galaxy.
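The CO-to-mass conversion described above can be sketched as follows. This is an illustrative toy, not the article's pipeline: it assumes the commonly quoted Galactic conversion factor X_CO ≈ 2×10^20 cm^-2 (K km s^-1)^-1 and a deliberately simple inverse scaling with metallicity (X_CO rises where metals are scarce, since CO is harder to form and shield there):

```python
PC_CM = 3.086e18      # one parsec in cm
M_H = 1.6726e-24      # hydrogen atom mass, g
M_SUN = 1.989e33      # solar mass, g

def cloud_mass_msun(w_co: float, area_pc2: float, z_ratio: float) -> float:
    """Estimate a cloud's H2 mass in solar masses from its mean CO
    line intensity w_co (K km/s), projected area (pc^2), and metallicity
    relative to solar (z_ratio). The 1/z_ratio scaling of X_CO is a
    crude stand-in for the metallicity dependence the article invokes."""
    x_co = 2.0e20 / z_ratio          # assumed conversion factor, cm^-2 (K km/s)^-1
    n_h2 = x_co * w_co               # H2 column density, cm^-2
    area_cm2 = area_pc2 * PC_CM**2   # projected area in cm^2
    mu = 2.8                         # mean mass per H2 molecule (incl. He), in m_H
    return n_h2 * area_cm2 * mu * M_H / M_SUN

# A modest cloud at solar metallicity:
mass = cloud_mass_msun(w_co=10.0, area_pc2=100.0, z_ratio=1.0)
print(f"{mass:.0f} solar masses")  # a few thousand solar masses
```

The same CO brightness implies a larger hidden gas mass in metal-poor regions, which is why getting the metallicity gradient of the Galaxy right changes the total mass budget of star-forming clouds.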

Using a set of input values obtained from real observations, the authors calculated that the star-formation rate in the Milky Way should lie between 0.50 and 5.93 solar masses per year, a range that comfortably contains the actually observed 1.65-1.90 solar masses per year.

The new work thus shows that the mystery of the Milky Way's slow star formation can be unraveled by correctly accounting for the effect of metallicity on the mass estimates of molecular clouds and by imposing the appropriate constraints on star-formation efficiency.

