Person: Gelbart, Michael Adam
Last Name: Gelbart
First Name: Michael Adam
Name: Gelbart, Michael Adam
Search Results
Showing 1 - 2 of 2 results
Publication: Constrained Bayesian Optimization and Applications (2015-05-17)
Gelbart, Michael Adam; Doshi-Velez, Finale; Parkes, David C.; Singer, Yaron; Hogle, James M.
Bayesian optimization is an approach for globally optimizing black-box functions that are expensive to evaluate, non-convex, and possibly noisy. Recently, Bayesian optimization has been used with great effectiveness for applications like tuning the hyperparameters of machine learning algorithms and automatic A/B testing for websites. This thesis considers Bayesian optimization in the presence of black-box constraints. Prior work on constrained Bayesian optimization consists of a variety of methods that can be used with some efficacy in specific contexts. Here, by forming a connection with multi-task Bayesian optimization, we formulate a more general class of constrained Bayesian optimization problems that we call Bayesian optimization with decoupled constraints. In this general framework, the objective and constraint functions are divided into tasks that can be evaluated independently of each other, and resources with which these tasks can be performed. We then present two methods for solving problems in this general class. The first method, an extension to a constrained variant of expected improvement, is fast and straightforward to implement but performs poorly in some circumstances and is not sufficiently flexible to address all varieties of decoupled problems. The second method, Predictive Entropy Search with Constraints (PESC), is highly effective and sufficiently flexible to address all problems in the general class of decoupled problems without any ad hoc modifications. The two weaknesses of PESC are its implementation difficulty and slow execution time. We address these issues by, respectively, providing a publicly available implementation within the popular Bayesian optimization software Spearmint, and developing an extension to PESC that achieves greater speed without significant performance losses. We demonstrate the effectiveness of these methods on real-world machine learning meta-optimization problems.

Publication: Segmentation fusion for connectomics (IEEE, 2011)
Vazquez-Reina, Amelio; Gelbart, Michael Adam; Huang, Daniel Eachern; Lichtman, Jeff; Miller, Eric; Pfister, Hanspeter
We address the problem of automatic 3D segmentation of a stack of electron microscopy sections of brain tissue. Unlike previous efforts, where the reconstruction is usually done on a section-to-section basis, or by the agglomerative clustering of 2D segments, we leverage information from the entire volume to obtain a globally optimal 3D segmentation. To do this, we formulate the segmentation as the solution to a fusion problem. We first enumerate multiple possible 2D segmentations for each section in the stack, and a set of 3D links that may connect segments across consecutive sections. We then identify the fusion of segments and links that provide the most globally consistent segmentation of the stack. We show that this two-step approach of pre-enumeration and posterior fusion yields significant advantages and provides state-of-the-art reconstruction results. Finally, as part of this method, we also introduce a robust rotationally-invariant set of features that we use to learn and enumerate the above 2D segmentations. Our features outperform previous connectomic-specific descriptors without relying on a large set of heuristics or manually designed filter banks.
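
The thesis abstract above mentions a constrained variant of expected improvement as its fast baseline method. As a minimal illustrative sketch only (not the Spearmint or PESC implementation described in the thesis), constrained expected improvement is commonly computed by weighting standard expected improvement by the probability that the constraint is satisfied, with independent Gaussian-process surrogates for the objective and constraint; the function names, the use of scikit-learn's GaussianProcessRegressor, and the toy objective and constraint below are all assumptions made for the sketch.

# Sketch of constrained expected improvement (EIc), assuming independent
# Gaussian-process surrogates for the objective f and one constraint c <= 0.
# Illustrative only; not the implementation from the thesis.
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor

def expected_improvement(mu, sigma, best_feasible):
    # Standard EI for minimization: E[max(best_feasible - f(x), 0)]
    sigma = np.maximum(sigma, 1e-12)
    z = (best_feasible - mu) / sigma
    return (best_feasible - mu) * norm.cdf(z) + sigma * norm.pdf(z)

def constrained_ei(x_cand, gp_obj, gp_con, best_feasible):
    # EIc(x) = EI(x) * Pr[c(x) <= 0], using the constraint GP's predictive distribution.
    mu_f, sd_f = gp_obj.predict(x_cand, return_std=True)
    mu_c, sd_c = gp_con.predict(x_cand, return_std=True)
    prob_feasible = norm.cdf((0.0 - mu_c) / np.maximum(sd_c, 1e-12))
    return expected_improvement(mu_f, sd_f, best_feasible) * prob_feasible

# Toy usage: pick the next evaluation point on a 1-D grid.
rng = np.random.default_rng(0)
X = rng.uniform(-2, 2, size=(8, 1))
y = (X[:, 0] - 1.0) ** 2                # hypothetical objective observations
c = np.sin(3 * X[:, 0])                 # hypothetical constraint observations (feasible if <= 0)
gp_obj = GaussianProcessRegressor(normalize_y=True).fit(X, y)
gp_con = GaussianProcessRegressor(normalize_y=True).fit(X, c)
best = y[c <= 0].min() if np.any(c <= 0) else y.max()
grid = np.linspace(-2, 2, 401).reshape(-1, 1)
next_x = grid[np.argmax(constrained_ei(grid, gp_obj, gp_con, best))]
print("next evaluation point:", next_x)

A decoupled setting, as defined in the thesis, would additionally allow the objective GP and the constraint GP to be updated from separate, independently scheduled evaluations rather than from a single joint query.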
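
The connectomics paper above describes a two-step "enumerate, then fuse" pipeline: several candidate 2D segmentations are proposed per section, candidate 3D links may join segments across consecutive sections, and a global optimization selects the most consistent combination. The toy sketch below only illustrates that selection idea; the candidate structures, scores, and brute-force search are invented for the example, whereas the paper solves a much larger fusion problem with a global optimization formulation.

# Toy sketch of segment/link fusion: choose one candidate 2-D segmentation per
# section plus the 3-D links consistent with that choice, maximizing total score.
# All data here are hypothetical; the paper's method is a global optimization, not brute force.
from itertools import product

# Candidate 2-D segmentations per section, each with a (hypothetical) quality score.
candidates = {
    0: [("A0", 0.8), ("A1", 0.6)],   # section 0
    1: [("B0", 0.7), ("B1", 0.9)],   # section 1
}
# Candidate 3-D links: (candidate id in section 0, candidate id in section 1, score).
links = [("A0", "B0", 0.5), ("A0", "B1", 0.2), ("A1", "B1", 0.6)]

def fuse(candidates, links):
    """Exhaustively pick one candidate per section; count only links whose endpoints were chosen."""
    best_choice, best_score = None, float("-inf")
    for choice in product(*(candidates[s] for s in sorted(candidates))):
        chosen_ids = {seg_id for seg_id, _ in choice}
        seg_score = sum(score for _, score in choice)
        link_score = sum(s for a, b, s in links if a in chosen_ids and b in chosen_ids)
        total = seg_score + link_score
        if total > best_score:
            best_choice, best_score = choice, total
    return best_choice, best_score

print(fuse(candidates, links))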