ERN: Bayesian Analysis (Topic) | 2021

Combining a DSGE Model with Variational Bayesian Neural Networks

 

Abstract


In the spirit of the DSGE-VAR approach, I employ the Temporal Difference Variational Auto-Encoder (TDVAE) proposed by Gregor et al. (2019) as a reduced-form time-series model on which the theoretical restrictions of a DSGE model are imposed. TDVAE is essentially a general-form stochastic state-space model implemented with neural networks via variational Bayesian inference (VI). Through the flexibility of neural networks, this DSGE-TDVAE approach addresses the criticism of DSGE-VAR by Consolo et al. (2009) that an unrestricted VAR may fail to represent the data correctly because of its limited expressiveness. The empirical results show that forecast performance clearly improves when the standard New Keynesian DSGE model is combined with TDVAE on actual macroeconomic data for Japan. This confirms that some support for the DSGE model is still found in the data even when it is evaluated against the more general statistical benchmark of TDVAE rather than a VAR. At the same time, the small optimal intensity of the DSGE restrictions indicates that the DSGE model carries a considerable degree of misspecification. Furthermore, DSGE-TDVAE achieves superior forecasting performance to DSGE-VAR.
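As background on the machinery the abstract refers to: a variational state-space model of this kind is trained by maximizing an evidence lower bound (ELBO) whose per-step terms trade off reconstruction accuracy against a KL penalty between the approximate posterior and the latent transition prior. The sketch below is a generic, heavily simplified illustration of one such term with diagonal Gaussians, not the paper's actual TDVAE objective; all function names and the linear decoder are hypothetical.

```python
import numpy as np

def gauss_kl(mu_q, var_q, mu_p, var_p):
    # Closed-form KL( N(mu_q, var_q) || N(mu_p, var_p) ) for diagonal Gaussians.
    return 0.5 * np.sum(var_q / var_p + (mu_q - mu_p) ** 2 / var_p
                        - 1.0 + np.log(var_p / var_q))

def gauss_loglik(x, mu, var):
    # Diagonal-Gaussian log-likelihood log N(x; mu, var).
    return -0.5 * np.sum(np.log(2.0 * np.pi * var) + (x - mu) ** 2 / var)

def elbo_step(x_t, mu_q, var_q, mu_prior, var_prior, decode, rng):
    # Single-sample Monte Carlo estimate of one sequential ELBO term:
    #   E_q[log p(x_t | z_t)] - KL( q(z_t | data) || p(z_t | z_{t-1}) )
    # where q is the encoder's posterior and p(z_t | z_{t-1}) the transition prior.
    z = mu_q + np.sqrt(var_q) * rng.standard_normal(mu_q.shape)  # reparameterisation trick
    mu_x, var_x = decode(z)  # hypothetical decoder: latent state -> observation density
    return gauss_loglik(x_t, mu_x, var_x) - gauss_kl(mu_q, var_q, mu_prior, var_prior)

# Toy usage: identity decoder with unit observation noise.
rng = np.random.default_rng(0)
decode = lambda z: (z, np.ones_like(z))
term = elbo_step(np.zeros(3), np.zeros(3), np.ones(3), np.zeros(3), np.ones(3), decode, rng)
```

In a DSGE-TDVAE setting, the DSGE restrictions would shape the prior over these dynamics rather than the generic standard-normal transition used in this toy.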

DOI 10.2139/ssrn.3857010
Language English
