Publication:
Formulation and properties of a divergence used to compare probability measures without absolute continuity and its application to uncertainty quantification

Date

2020-09-09

Published Version

Citation

Mao, Yixiang. 2020. Formulation and properties of a divergence used to compare probability measures without absolute continuity and its application to uncertainty quantification. Doctoral dissertation, Harvard University Graduate School of Arts and Sciences.

Abstract

This thesis develops a new divergence that generalizes relative entropy and can be used to compare probability measures without requiring absolute continuity. We establish properties of the divergence and, in particular, derive and exploit a representation as an infimum convolution of an optimal transport cost and relative entropy. We include examples of computing and approximating the divergence, and its applications to uncertainty quantification in discrete models and Gauss-Markov models.
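The infimum-convolution representation mentioned above can be sketched in generic notation; the symbols below are illustrative assumptions for the general form of such a representation, not necessarily the thesis's own notation.

```latex
% Hedged sketch: a divergence written as the infimum convolution of an
% optimal transport cost and relative entropy. Here c is a transport
% cost function, \Pi(\mu,\gamma) the set of couplings of \mu and \gamma,
% and R(\cdot\,\|\,\cdot) relative entropy (KL divergence).
D_c(\mu \,\|\, \nu)
  \;=\; \inf_{\gamma \in \mathcal{P}(\mathcal{X})}
        \Bigl\{\, C_c(\mu, \gamma) \;+\; R(\gamma \,\|\, \nu) \,\Bigr\},
\qquad
C_c(\mu, \gamma)
  \;=\; \inf_{\pi \in \Pi(\mu, \gamma)} \int_{\mathcal{X}\times\mathcal{X}} c \, d\pi .
```

Because the transport term $C_c(\mu,\gamma)$ is finite without any density requirement, a divergence of this shape can remain finite even when $\mu$ is not absolutely continuous with respect to $\nu$, unlike relative entropy alone.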

Keywords

convex duality, KL divergence, model uncertainty, optimal transport theory, relative entropy, Mathematics, Information science

Terms of Use

This article is made available under the terms and conditions applicable to Other Posted Material (LAA), as set forth at Terms of Service
