Person: Mumford, David

Last Name: Mumford
First Name: David
Name: Mumford, David

Search Results

Now showing 1 - 10 of 47
  • Publication
    Remembering Raoul Bott (1923–2005)
    (American Mathematical Society (AMS), 2013) Tu, Loring W.; Gurdian, Rodolfo; Smale, Stephen; Mumford, David; Jaffe, Arthur; Yau, Shing-Tung
  • Publication
    The Canonical Ring of an Algebraic Surface
    (1962) Mumford, David
  • Publication
    An Elementary Theorem in Geometric Invariant Theory
    (American Mathematical Society, 1961) Mumford, David
  • Publication
    Further Pathologies in Algebraic Geometry
    (Johns Hopkins University Press, 1962) Mumford, David
  • Publication
    Two Fundamental Theorems on Deformations of Polarized Varieties
    (Johns Hopkins University Press, 1964) Matsusaka, T.; Mumford, David
  • Publication
    Learning Generic Prior Models for Visual Computation
    (Institute of Electrical and Electronics Engineers, 1997) Zhu, Song Chun; Mumford, David
    This paper presents a novel theory for learning generic prior models from a set of observed natural images, based on the minimax entropy theory that the authors studied in modeling textures. We start by studying the statistics of natural images, including their scale-invariant properties; generic prior models are then learned to duplicate the observed statistics. The learned Gibbs distributions confirm and improve the forms of existing prior models. More interestingly, inverted potentials are found to be necessary; such potentials form patterns and enhance preferred image features. The learned model is compared with existing prior models in image restoration experiments.
  • Publication
    Prior Learning and Gibbs Reaction-Diffusion
    (Institute of Electrical and Electronics Engineers, 1997) Zhu, Song Chun; Mumford, David
    This article addresses two important themes in early visual computation: it presents a novel theory for learning the universal statistics of natural images, and it proposes a general framework for designing reaction-diffusion equations for image processing. We study the statistics of natural images, including their scale-invariant properties; generic prior models are then learned to duplicate the observed statistics, based on minimax entropy theory. The resulting Gibbs distributions have potentials of the form \(U(I; \Lambda, S) = \sum_{\alpha=1}^{K} \sum_{x,y} \lambda^{(\alpha)}\!\big((F^{(\alpha)} * I)(x, y)\big)\), where \(S = \{F^{(1)}, F^{(2)}, \ldots, F^{(K)}\}\) is a set of filters and \(\Lambda = \{\lambda^{(1)}(\cdot), \lambda^{(2)}(\cdot), \ldots, \lambda^{(K)}(\cdot)\}\) the potential functions. The learned Gibbs distributions confirm and improve the form of existing prior models such as the line process, but, in contrast to all previous models, inverted potentials are found to be necessary. We find that the partial differential equations given by gradient descent on \(U(I; \Lambda, S)\) are essentially reaction-diffusion equations, where the usual energy terms produce anisotropic diffusion, while the inverted energy terms produce reaction associated with pattern formation, enhancing preferred image features. We illustrate how these models can be used for texture pattern rendering, denoising, image enhancement, and clutter removal by careful choice of both prior and data models of this type, incorporating the appropriate features.
  • Publication
    GRADE: Gibbs Reaction and Diffusion Equations
    (Narosa, 1998) Zhu, Song Chun; Mumford, David
    Recently there has been increasing interest in using nonlinear PDEs for applications in computer vision and image processing. In this paper, we propose a general statistical framework for designing a new class of PDEs. For a given application, a Markov random field model \(p(I)\) is learned according to the minimax entropy principle so that \(p(I)\) characterizes the ensemble of images in the application. \(p(I)\) is a Gibbs distribution whose energy terms can be divided into two categories. The partial differential equations given by gradient descent on the Gibbs potential are then essentially reaction-diffusion equations, where the energy terms in one category produce anisotropic diffusion while the inverted energy terms in the second category produce reaction associated with pattern formation. We call this new class of PDEs the Gibbs Reaction And Diffusion Equations (GRADE), and we demonstrate experiments where GRADE are used for texture pattern formation, denoising, image enhancement, and clutter removal. (A minimal numerical sketch of this gradient-descent flow is given after the search results below.)
  • Publication
    On the Equations Defining Abelian Varieties. II
    (Springer Verlag, 1967) Mumford, David
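
The Zhu-Mumford abstracts above all describe the same mechanism: learn a Gibbs potential \(U(I; \Lambda, S)\) over filter responses, then run gradient descent on it, which takes the form of a reaction-diffusion flow. Below is a minimal, illustrative Python sketch of that flow, not the authors' implementation: the two-filter bank, the potential \(\log(1 + r^2)\), and the step size are placeholder assumptions chosen only to make the example runnable (in the papers, the filters and potential functions are learned by the minimax entropy principle).

# Minimal sketch (not the authors' code) of gradient descent on a Gibbs potential
#     U(I; Lambda, S) = sum_alpha sum_{x,y} lambda^(alpha)((F^(alpha) * I)(x, y)),
# the form described in the Zhu-Mumford abstracts above. Filters and potentials
# here are placeholder assumptions; the papers learn them by minimax entropy.

import numpy as np
from scipy.signal import convolve2d

# Placeholder filter bank S = {F^(1), F^(2)}: horizontal and vertical differences.
FILTERS = [
    np.array([[1.0, -1.0]]),    # F^(1): horizontal gradient
    np.array([[1.0], [-1.0]]),  # F^(2): vertical gradient
]

def phi_prime(r):
    # Derivative of a heavy-tailed "diffusion" potential phi(r) = log(1 + r^2).
    return 2.0 * r / (1.0 + r ** 2)

# lambda'^(alpha) for each filter; the negated copy stands in for the
# "inverted" (reaction) potentials discussed in the abstracts.
POTENTIAL_DERIVS = [phi_prime, lambda r: -0.2 * phi_prime(r)]

def grade_step(image, step=0.05):
    """One gradient-descent step on U(I); the update has reaction-diffusion form:
    I <- I - step * sum_alpha F^(alpha)- * lambda'^(alpha)(F^(alpha) * I),
    where F^- denotes the filter rotated by 180 degrees."""
    grad = np.zeros_like(image)
    for filt, lam_prime in zip(FILTERS, POTENTIAL_DERIVS):
        response = convolve2d(image, filt, mode="same", boundary="symm")
        grad += convolve2d(lam_prime(response), filt[::-1, ::-1],
                           mode="same", boundary="symm")
    return image - step * grad

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    noisy = rng.normal(size=(64, 64))  # synthetic noisy image
    img = noisy.copy()
    for _ in range(100):               # iterate the reaction-diffusion flow
        img = grade_step(img)
    print(f"initial std {noisy.std():.3f} -> smoothed std {img.std():.3f}")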