Show simple item record

dc.contributor.author: Gelbart, Michael Adam [en_US]
dc.date.accessioned: 2015-07-17T16:53:13Z
dc.date.created: 2015-05 [en_US]
dc.date.issued: 2015-05-17 [en_US]
dc.date.submitted: 2015 [en_US]
dc.identifier.citation: Gelbart, Michael Adam. 2015. Constrained Bayesian Optimization and Applications. Doctoral dissertation, Harvard University, Graduate School of Arts & Sciences. [en_US]
dc.identifier.uri: http://nrs.harvard.edu/urn-3:HUL.InstRepos:17467236
dc.description.abstract: Bayesian optimization is an approach for globally optimizing black-box functions that are expensive to evaluate, non-convex, and possibly noisy. Recently, Bayesian optimization has been used with great effectiveness for applications like tuning the hyperparameters of machine learning algorithms and automatic A/B testing for websites. This thesis considers Bayesian optimization in the presence of black-box constraints. Prior work on constrained Bayesian optimization consists of a variety of methods that can be used with some efficacy in specific contexts. Here, by forming a connection with multi-task Bayesian optimization, we formulate a more general class of constrained Bayesian optimization problems that we call Bayesian optimization with decoupled constraints. In this general framework, the objective and constraint functions are divided into tasks that can be evaluated independently of each other, and resources with which these tasks can be performed. We then present two methods for solving problems in this general class. The first method, an extension to a constrained variant of expected improvement, is fast and straightforward to implement but performs poorly in some circumstances and is not sufficiently flexible to address all varieties of decoupled problems. The second method, Predictive Entropy Search with Constraints (PESC), is highly effective and sufficiently flexible to address all problems in the general class of decoupled problems without any ad hoc modifications. The two weaknesses of PESC are its implementation difficulty and slow execution time. We address these issues by, respectively, providing a publicly available implementation within the popular Bayesian optimization software Spearmint, and developing an extension to PESC that achieves greater speed without significant performance losses. We demonstrate the effectiveness of these methods on real-world machine learning meta-optimization problems. [en_US]
dc.description.sponsorship: Biophysics [en_US]
dc.format.mimetype: application/pdf [en_US]
dc.language.iso: en [en_US]
dash.license: LAA [en_US]
dc.subject: Computer Science [en_US]
dc.title: Constrained Bayesian Optimization and Applications [en_US]
dc.type: Thesis or Dissertation [en_US]
dash.depositing.author: Gelbart, Michael Adam [en_US]
dc.date.available: 2015-07-17T16:53:13Z
thesis.degree.date: 2015 [en_US]
thesis.degree.grantor: Graduate School of Arts & Sciences [en_US]
thesis.degree.level: Doctoral [en_US]
thesis.degree.name: Doctor of Philosophy [en_US]
dc.contributor.committeeMember: Doshi-Velez, Finale [en_US]
dc.contributor.committeeMember: Parkes, David C. [en_US]
dc.contributor.committeeMember: Singer, Yaron [en_US]
dc.contributor.committeeMember: Hogle, James M. [en_US]
dc.type.material: text [en_US]
thesis.degree.department: Biophysics [en_US]
dash.identifier.vireo: http://etds.lib.harvard.edu/gsas/admin/view/479 [en_US]
dc.description.keywords: Bayesian optimization [en_US]
dash.author.email: michael.gelbart@gmail.com [en_US]
dash.identifier.drs: urn-3:HUL.DRS.OBJECT:25164423 [en_US]
dash.contributor.affiliated: Gelbart, Michael Adam
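The abstract mentions "a constrained variant of expected improvement" as the first, simpler method. A common form of that idea (used, e.g., in the constrained Bayesian optimization literature) weights the standard expected-improvement acquisition by the posterior probability that the black-box constraint is satisfied. The sketch below is an illustrative, self-contained version of that acquisition function only; the Gaussian-process posterior means and standard deviations (`mu_f`, `sigma_f`, `mu_c`, `sigma_c`) are assumed inputs, not part of the thesis record, and this is not a reproduction of the dissertation's actual implementation.

```python
# Sketch of constrained expected improvement: EI on the objective,
# multiplied by the probability that the constraint c(x) <= 0 holds.
# All posterior quantities (mu/sigma) are hypothetical placeholders
# standing in for a GP model's predictions at a candidate point x.
import math


def normal_pdf(z):
    # Standard normal density.
    return math.exp(-0.5 * z * z) / math.sqrt(2.0 * math.pi)


def normal_cdf(z):
    # Standard normal cumulative distribution via the error function.
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))


def expected_improvement(mu, sigma, best):
    # EI for minimization: E[max(best - f(x), 0)] when f(x) ~ N(mu, sigma^2).
    if sigma <= 0.0:
        return max(best - mu, 0.0)
    z = (best - mu) / sigma
    return (best - mu) * normal_cdf(z) + sigma * normal_pdf(z)


def constrained_ei(mu_f, sigma_f, best_feasible, mu_c, sigma_c):
    # Weight EI by the posterior probability of feasibility,
    # Pr[c(x) <= 0] under c(x) ~ N(mu_c, sigma_c^2).
    prob_feasible = normal_cdf((0.0 - mu_c) / sigma_c)
    return expected_improvement(mu_f, sigma_f, best_feasible) * prob_feasible
```

In this formulation, candidate points that the constraint model predicts to be infeasible have their expected improvement driven toward zero, so the search is steered to regions that are both promising and likely feasible; as the abstract notes, this heuristic is fast but does not by itself handle fully decoupled objective and constraint evaluations.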

