TY - JOUR
T1 - A new method of Bayesian causal inference in non-stationary environments
AU - Shinohara, Shuji
AU - Manome, Nobuhito
AU - Suzuki, Kouta
AU - Chung, Ung Il
AU - Takahashi, Tatsuji
AU - Okamoto, Hiroshi
AU - Gunji, Yukio Pegio
AU - Nakajima, Yoshihiro
AU - Mitsuyoshi, Shunji
N1 - Funding Information:
This research is supported by the Center of Innovation Program from the Japan Science and Technology Agency (JST) and by JSPS KAKENHI Grant Numbers JP16K01408 and JP17H04696.
Publisher Copyright:
© 2020 Shinohara et al. This is an open access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.
PY - 2020/5
Y1 - 2020/5
N2 - Bayesian inference is the process of narrowing down the hypotheses (causes) to the one that best explains the observational data (effects). Accurately estimating a cause requires observing as much data as possible. However, the object of inference is not always constant. In such cases, a method such as the exponential moving average (EMA) with a discounting rate is used; to improve responsiveness to sudden changes, the discounting rate must be increased. This establishes a trade-off: increasing the discounting rate improves followability but reduces accuracy. Here, we propose an extended Bayesian inference (EBI) that incorporates human-like causal inference. We show that incorporating causal inference introduces both learning and forgetting effects into Bayesian inference. We evaluate the estimation performance of the EBI on the task of learning a dynamically changing Gaussian mixture model, comparing it with the EMA and a sequential discounting expectation-maximization algorithm. The EBI is shown to mitigate the trade-off observed in the EMA.
AB - Bayesian inference is the process of narrowing down the hypotheses (causes) to the one that best explains the observational data (effects). Accurately estimating a cause requires observing as much data as possible. However, the object of inference is not always constant. In such cases, a method such as the exponential moving average (EMA) with a discounting rate is used; to improve responsiveness to sudden changes, the discounting rate must be increased. This establishes a trade-off: increasing the discounting rate improves followability but reduces accuracy. Here, we propose an extended Bayesian inference (EBI) that incorporates human-like causal inference. We show that incorporating causal inference introduces both learning and forgetting effects into Bayesian inference. We evaluate the estimation performance of the EBI on the task of learning a dynamically changing Gaussian mixture model, comparing it with the EMA and a sequential discounting expectation-maximization algorithm. The EBI is shown to mitigate the trade-off observed in the EMA.
UR - http://www.scopus.com/inward/record.url?scp=85085276070&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85085276070&partnerID=8YFLogxK
U2 - 10.1371/journal.pone.0233559
DO - 10.1371/journal.pone.0233559
M3 - Article
C2 - 32442220
AN - SCOPUS:85085276070
SN - 1932-6203
VL - 15
JO - PLOS ONE
JF - PLOS ONE
IS - 5
M1 - e0233559
ER -