
Authors: Kevin G. Jamieson, Robert D. Nowak, Benjamin Recht
This paper provides lower bounds on the convergence rate of Derivative-Free Optimization (DFO) with noisy function evaluations, exposing a fundamental and unavoidable gap between the performance of algorithms with access to gradients and those with access only to function evaluations. However, there are situations in which DFO is unavoidable, and for such situations we propose a new DFO algorithm that is proved to be near-optimal for the class of strongly convex objective functions. A distinctive feature of the algorithm is that it uses only Boolean-valued function comparisons, rather than function evaluations. This makes the algorithm useful in an even wider range of applications, such as optimization based on paired comparisons from human subjects. We also show that the convergence rate is the same regardless of whether DFO is based on noisy function evaluations or Boolean-valued function comparisons.
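To illustrate the comparison-oracle setting the abstract describes, here is a minimal, hypothetical sketch (not the paper's actual algorithm): a coordinate-search method that never evaluates f numerically, only asks a Boolean oracle "is f(y) < f(x)?", accepting a trial step when the oracle says it improves and shrinking the step size otherwise. The function names and step-shrinking schedule are illustrative assumptions.

```python
import random

def comparison_oracle(f, x, y):
    """Boolean-valued comparison: is f(x) < f(y)?
    (Noiseless here; the paper's setting allows noisy, possibly
    incorrect answers to such comparisons.)"""
    return f(x) < f(y)

def comparison_coordinate_search(f, x0, n_iters=200, step0=1.0):
    """Hypothetical sketch of DFO driven only by pairwise comparisons:
    try a step along a random coordinate in both directions, keep the
    move only if the oracle reports an improvement, and shrink the
    step size when neither direction helps."""
    x = list(x0)
    d = len(x)
    step = step0
    for _ in range(n_iters):
        i = random.randrange(d)  # pick a random coordinate to probe
        improved = False
        for sign in (+1.0, -1.0):
            y = list(x)
            y[i] += sign * step
            if comparison_oracle(f, y, x):  # accept only if f(y) < f(x)
                x = y
                improved = True
                break
        if not improved:
            step *= 0.9  # no improvement either way: refine the step
    return x
```

On a strongly convex test function such as f(x) = (x1 - 1)^2 + (x2 - 1)^2, starting from the origin, the sketch drives f toward its minimum using comparisons alone, which is the regime where the paper shows the attainable rate matches that of noisy function evaluations.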
Last updated: 12/09/2012, 09:38 PM
