Publication: Methods for Converging Solutions of Differential Equations: Applying Imaginary Time Propagation to Density Functional Theory and Unsupervised Neural Networks to Dynamical Systems
Date
2020-05-14
Authors
Flamant, Cedric Wen
Published Version
The Harvard community has made this article openly available.
Citation
Flamant, Cedric Wen. 2020. Methods for Converging Solutions of Differential Equations: Applying Imaginary Time Propagation to Density Functional Theory and Unsupervised Neural Networks to Dynamical Systems. Doctoral dissertation, Harvard University, Graduate School of Arts & Sciences.
Abstract
Reliable and robust convergence to the electronic ground state within density functional theory (DFT) Kohn-Sham (KS) calculations remains a thorny issue in many systems of interest. Here, we use an approach based on transforming the time-dependent DFT equations to imaginary time, followed by imaginary-time evolution, as a reliable alternative to the self-consistent field (SCF) procedure for determining the KS ground state. We discuss the theoretical and technical aspects of this approach and show that the KS ground state should be expected to be the long-imaginary-time output of the evolution, independent of the exchange-correlation functional or the level of theory used to simulate the system. By maintaining self-consistency between the single-particle wavefunctions (orbitals) and the electronic density throughout the determination of the stationary state, our method avoids the typical difficulties encountered in SCF. To demonstrate the dependability of our approach, we apply it to selected systems that struggle to converge with SCF schemes. In addition, through the van Leeuwen theorem, we affirm the physical meaningfulness of imaginary-time TDDFT (it-TDDFT), justifying its use in areas of statistical mechanics such as the computation of imaginary-time path integrals.
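The core mechanism of imaginary-time propagation can be illustrated on a problem far simpler than Kohn-Sham DFT: applying e^{-Hτ} to an arbitrary starting wavefunction suppresses excited states exponentially faster than the ground state, so repeated small imaginary-time steps with renormalization converge to the ground state. The sketch below demonstrates this for a 1D harmonic oscillator on a grid; the grid size, explicit-Euler stepping, and step count are illustrative choices for this toy example, not the dissertation's numerical scheme.

```python
import numpy as np

# Grid for a 1D harmonic oscillator (hbar = m = omega = 1, exact E0 = 0.5).
n, L = 400, 16.0
x = np.linspace(-L / 2, L / 2, n)
dx = x[1] - x[0]

def apply_H(psi):
    """Apply H = -0.5 d^2/dx^2 + 0.5 x^2 with a 3-point finite difference."""
    lap = np.zeros_like(psi)
    lap[1:-1] = (psi[2:] - 2 * psi[1:-1] + psi[:-2]) / dx**2
    return -0.5 * lap + 0.5 * x**2 * psi

# Initial guess: an off-center Gaussian with nonzero ground-state overlap.
psi = np.exp(-(x - 1.0)**2)
psi /= np.sqrt(np.sum(psi**2) * dx)

# Each step approximates e^{-H dtau}; renormalizing keeps the iteration a
# power-method-like projection onto the lowest eigenstate of H.
dtau = 0.001  # explicit Euler step, chosen below the stability limit
for _ in range(10000):
    psi = psi - dtau * apply_H(psi)
    psi /= np.sqrt(np.sum(psi**2) * dx)

energy = np.sum(psi * apply_H(psi)) * dx  # <psi|H|psi> -> approx 0.5
```

After a total imaginary time of τ = 10, contributions from excited states are damped by roughly e^{-τΔE}, and the measured energy lands within finite-difference error of the exact value 0.5. In it-TDDFT the Hamiltonian itself depends on the density, which is why maintaining self-consistency during the evolution matters; this linear toy omits that feedback.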
The time evolution of dynamical systems is frequently described by ordinary differential equations (ODEs), which must be solved for given initial conditions. Most standard approaches numerically integrate the ODEs, producing a solution whose values are computed at discrete times. For every set of initial conditions and system parameters, the calculation has to be repeated from scratch, adding significant computational overhead to methods that require varied solutions to the ODE. We extend the Lagaris method, in which a neural network is trained to approximate the solution of a set of differential equations, proposing instead that a neural network be used as a solution bundle: a collection of solutions to an ODE for various initial states and system parameters. The neural network solution bundle is trained with an unsupervised loss that does not require any prior knowledge of the sought solutions, and the resulting object is differentiable in initial conditions and system parameters. The solution bundle exhibits fast, parallelizable evaluation of the system state, facilitating the use of Bayesian inference for parameter or trajectory estimation in real dynamical systems.
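The ingredients of a solution bundle can be sketched in a few lines. Below, a tiny network takes (t, x0, θ) as input for the simple decay ODE dx/dt = -θx (my choice of ODE, network size, and training procedure are illustrative, not the dissertation's). The Lagaris-style trial form x̂ = x0 + t·N(t, x0, θ) satisfies the initial condition exactly, and the unsupervised loss is just the mean squared ODE residual over sampled times, initial conditions, and parameters; no reference solutions are needed. To keep the sketch dependency-free, the derivative uses a central finite difference and training uses derivative-free accept-if-better updates, where a real implementation would use automatic differentiation and a gradient-based optimizer.

```python
import numpy as np

rng = np.random.default_rng(0)

# Tiny MLP N(t, x0, theta) with one tanh hidden layer (sizes are illustrative).
h = 8
params = [rng.normal(0, 0.5, (3, h)), np.zeros(h),
          rng.normal(0, 0.5, (h, 1)), np.zeros(1)]

def net(inp, params):
    W1, b1, W2, b2 = params
    return (np.tanh(inp @ W1 + b1) @ W2 + b2).ravel()

def x_hat(t, x0, theta, params):
    # Lagaris-style trial form: x(0) = x0 holds exactly by construction.
    return x0 + t * net(np.stack([t, x0, theta], axis=1), params)

def loss(params, t, x0, theta, eps=1e-4):
    # Unsupervised residual of dx/dt = -theta * x; derivative by central difference.
    dxdt = (x_hat(t + eps, x0, theta, params)
            - x_hat(t - eps, x0, theta, params)) / (2 * eps)
    return np.mean((dxdt + theta * x_hat(t, x0, theta, params))**2)

# Training points sampled across time, initial-condition, and parameter ranges:
# this is what makes the result a *bundle* rather than a single trajectory.
t = rng.uniform(0.0, 2.0, 256)
x0 = rng.uniform(0.5, 1.5, 256)
th = rng.uniform(0.5, 1.5, 256)

start_loss = loss(params, t, x0, th)
best = start_loss
for _ in range(500):
    trial = [p + rng.normal(0.0, 0.05, p.shape) for p in params]
    l = loss(trial, t, x0, th)
    if l < best:
        params, best = trial, l
```

Once trained, `x_hat` evaluates the state for any (t, x0, θ) in its training range in a single vectorized call, which is what makes sweeping parameters inside a Bayesian inference loop cheap compared with re-integrating the ODE for each sample.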
Keywords
DFT, neural network, unsupervised, it-TDDFT, TDDFT, imaginary time propagation, density functional theory, Kohn-Sham, Lagaris, solution bundle, differential equation, Bayesian, convergence
Terms of Use
This article is made available under the terms and conditions applicable to Other Posted Material (LAA), as set forth in the Terms of Service.