Abstract:
High surface density, rapidly star-forming galaxies are observed to have ≈50–100 km s⁻¹ line-of-sight velocity
dispersions, which are much higher than expected from supernova driving alone, but may arise from large-scale
gravitational instabilities. Using three-dimensional simulations of local regions of the interstellar medium, we
explore the impact of high velocity dispersions that arise from these disk instabilities. Parametrizing disks by their
surface densities and epicyclic frequencies, we conduct a series of simulations that probe a broad range of
conditions. Turbulence is driven purely horizontally and on large scales, neglecting any energy input from
supernovae. We find that such motions lead to strong global outflows in the highly compact disks that were
common at high redshifts, but weak or negligible mass loss in the more diffuse disks that are prevalent today.
Substantial outflows are generated if the one-dimensional horizontal velocity dispersion exceeds ≈35 km s⁻¹, as
occurs in the dense disks that have star-formation rate (SFR) densities above ≈0.1 M☉ yr⁻¹ kpc⁻². These outflows
are triggered by a thermal runaway, arising from the inefficient cooling of hot material coupled with successive
heating from turbulent driving. Thus, even in the absence of stellar feedback, a critical value of the SFR density for
outflow generation can arise due to a turbulent heating instability. This suggests that in strongly self-gravitating
disks, outflows may be enhanced by, but need not be caused by, energy input from supernovae.
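
As a rough numerical illustration of the outflow criterion summarized above, the following Python sketch encodes the two quoted thresholds (≈35 km s⁻¹ for the horizontal velocity dispersion and ≈0.1 M☉ yr⁻¹ kpc⁻² for the SFR density); the function name, argument names, and example values are hypothetical additions for illustration, not part of the paper:

    def expect_strong_outflow(sigma_h_kms=None, sfr_density=None):
        """Check whether a disk patch crosses the outflow thresholds quoted in the abstract.

        sigma_h_kms : one-dimensional horizontal velocity dispersion [km/s]
        sfr_density : SFR density [M_sun / yr / kpc^2]
        """
        SIGMA_CRIT = 35.0  # km/s, threshold quoted in the abstract
        SFR_CRIT = 0.1     # M_sun / yr / kpc^2, threshold quoted in the abstract
        if sigma_h_kms is not None and sigma_h_kms > SIGMA_CRIT:
            return True
        if sfr_density is not None and sfr_density > SFR_CRIT:
            return True
        return False

    # Hypothetical examples: a compact high-redshift disk vs. a diffuse local disk
    print(expect_strong_outflow(sigma_h_kms=50.0))   # True: strong outflow expected
    print(expect_strong_outflow(sfr_density=0.003))  # False: weak or negligible mass loss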