Deep Learning of the Eddington Tensor in Core-collapse Supernova Simulation

Akira Harada*, Shota Nishikawa, Shoichi Yamada

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review


Abstract

We trained deep neural networks (DNNs) to reproduce the Eddington tensor for neutrinos, taking as inputs the neutrino energy density, the neutrino flux, and the fluid velocity obtained in our first-principles core-collapse supernova simulation. Although the moment method, one of the most popular approximations for neutrino transport, requires a closure relation, none of the analytical closure relations commonly employed in the literature captures all aspects of the neutrino angular distribution in momentum space. In this paper, we develop a closure relation by using DNNs that take the neutrino energy density, flux, and the fluid velocity as inputs and return the Eddington tensor as the output. We consider two kinds of DNNs: a conventional DNN, named a component-wise neural network (CWNN), and a tensor-basis neural network (TBNN). We find that the diagonal components of the Eddington tensor are reproduced better by the DNNs than by the M1 closure relation, especially at low to intermediate energies. For the off-diagonal components, the DNNs agree with the Boltzmann solver better than the M1 closure relation does at large radii. Between the two DNNs, the TBNN performs slightly better than the CWNN. These new DNN-based closure relations, which reproduce the Eddington tensor well at much lower computational cost, open up a new possibility for the moment method.
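The tensor-basis idea described in the abstract can be sketched minimally as follows. This is a hypothetical illustration, not the authors' code: a tiny MLP outputs scalar coefficients that weight symmetric basis tensors built from the flux direction and the fluid velocity, and the result is symmetrized and rescaled to unit trace, as the Eddington tensor requires. The choice of basis set, the network sizes, and all function names here are assumptions for illustration only.

```python
import numpy as np

rng = np.random.default_rng(0)

def basis_tensors(f_hat, v):
    """A hypothetical minimal set of symmetric basis tensors built from
    the unit flux direction f_hat and the fluid velocity v."""
    I = np.eye(3)
    T1 = np.outer(f_hat, f_hat)
    T2 = np.outer(v, v)
    T3 = 0.5 * (np.outer(f_hat, v) + np.outer(v, f_hat))
    return [I, T1, T2, T3]

def mlp(x, params):
    """Tiny two-layer perceptron returning one scalar coefficient
    per basis tensor (stand-in for a trained TBNN)."""
    W1, b1, W2, b2 = params
    h = np.tanh(x @ W1 + b1)
    return h @ W2 + b2

def init_params(n_in, n_hidden, n_out):
    """Random, untrained weights; a real closure would be trained
    against Boltzmann-solver data."""
    return (rng.normal(size=(n_in, n_hidden)) * 0.1,
            np.zeros(n_hidden),
            rng.normal(size=(n_hidden, n_out)) * 0.1,
            np.zeros(n_out))

def eddington_tensor(E, F, v, params):
    """Tensor-basis prediction k = sum_i c_i T_i, symmetrized and
    rescaled so that trace(k) = 1."""
    f = F / E                                   # flux factor vector
    f_norm = np.linalg.norm(f)
    f_hat = f / f_norm if f_norm > 0 else np.zeros(3)
    x = np.concatenate([[E], f, v])             # network inputs
    coeffs = mlp(x, params)
    k = sum(c * T for c, T in zip(coeffs, basis_tensors(f_hat, v)))
    k = 0.5 * (k + k.T)                         # enforce symmetry
    return k / np.trace(k)                      # enforce unit trace

params = init_params(7, 16, 4)
k = eddington_tensor(E=1.0, F=np.array([0.3, 0.0, 0.0]),
                     v=np.array([0.0, 0.01, 0.0]), params=params)
```

Building the output as a coefficient-weighted sum of basis tensors (rather than predicting each component independently, as a component-wise network would) bakes the tensor structure into the architecture, which is the design distinction the abstract draws between the TBNN and the CWNN.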

Original language: English
Article number: 117
Journal: Astrophysical Journal
Volume: 925
Issue number: 2
DOIs
Publication status: Published - 2022 Feb 1

ASJC Scopus subject areas

  • Astronomy and Astrophysics
  • Space and Planetary Science

