Please use this identifier to cite or link to this item:
http://hdl.handle.net/123456789/2084
Full metadata record
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Sahasrabudhe, Neeraja | - |
dc.date.accessioned | 2020-11-24T04:30:23Z | - |
dc.date.available | 2020-11-24T04:30:23Z | - |
dc.date.issued | 2018 | - |
dc.identifier.citation | Journal of Machine Learning Research, 18, pp. 1-27 | en_US |
dc.identifier.uri | https://jmlr.csail.mit.edu/papers/v18/15-592.html | - |
dc.identifier.uri | http://hdl.handle.net/123456789/2084 | - |
dc.description | Only IISERM authors are available in the record. | - |
dc.description.abstract | We propose a scheme for finding a “good” estimator for the gradient of a function on a high-dimensional space with few function evaluations, for applications where function evaluations are expensive and the function under consideration is not locally sensitive in all coordinates, making its gradient almost sparse. Exploiting the latter aspect, our method combines ideas from Spall’s Simultaneous Perturbation Stochastic Approximation with compressive sensing. We theoretically justify its computational advantages and illustrate them empirically by numerical experiments. In particular, applications to estimating the gradient outer product matrix as well as to standard optimization problems are illustrated via simulations. | en_US |
dc.language.iso | en | en_US |
dc.publisher | Microtome Publishing | en_US |
dc.subject | Gradient estimation | en_US |
dc.subject | Compressive sensing | en_US |
dc.subject | Gradient descent | en_US |
dc.subject | Gradient outer product matrix | en_US |
dc.title | Gradient estimation with simultaneous perturbation and compressive sensing | en_US |
dc.type | Article | en_US |
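The abstract describes two ingredients: SPSA-style simultaneous perturbations, which turn a pair of function evaluations into one linear measurement of the gradient, and compressive sensing, which recovers an almost-sparse gradient from far fewer such measurements than the dimension. The sketch below illustrates that general idea only; it is not the paper's algorithm. The function names, parameter values, and the use of ISTA as a stand-in sparse-recovery solver are all illustrative assumptions.

```python
import numpy as np

def spsa_measurements(f, x, m, delta=1e-4, rng=None):
    """Collect m SPSA-style directional derivatives of f at x (2m evaluations).

    Each row of Delta is a random +/-1 (Rademacher) direction, and
        y[i] = (f(x + delta*Delta[i]) - f(x - delta*Delta[i])) / (2*delta)
             ~ Delta[i] . grad f(x),
    i.e. a linear measurement of the (almost sparse) gradient.
    """
    rng = np.random.default_rng(rng)
    Delta = rng.choice([-1.0, 1.0], size=(m, x.size))
    y = np.array([(f(x + delta * D) - f(x - delta * D)) / (2 * delta)
                  for D in Delta])
    return Delta, y

def ista_recover(A, b, lam=0.01, n_iter=2000):
    """Sparse recovery of g from b ~ A g via l1-regularised least squares (ISTA).

    Illustrative stand-in for a compressive-sensing solver.
    """
    g = np.zeros(A.shape[1])
    L = np.linalg.norm(A, 2) ** 2              # step size 1/L (Lipschitz const.)
    for _ in range(n_iter):
        g = g - A.T @ (A @ g - b) / L          # gradient step on the LS term
        g = np.sign(g) * np.maximum(np.abs(g) - lam / L, 0.0)  # soft-threshold
    return g

# Toy example: f is sensitive in only 3 of 100 coordinates, so grad f is sparse.
d, m = 100, 30
w = np.zeros(d); w[[3, 40, 77]] = [2.0, -1.5, 1.0]
f = lambda x: float(w @ x)                     # grad f = w everywhere
Delta, y = spsa_measurements(f, np.zeros(d), m, rng=0)   # only 2*m = 60 evals
g_hat = ista_recover(Delta / np.sqrt(m), y / np.sqrt(m))
# g_hat approximates the 100-dimensional gradient from 30 measurements.
```

Note that a plain coordinate-wise finite-difference estimate would need on the order of 2d = 200 evaluations here; exploiting sparsity reduces this to 60 in this toy setting.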
Appears in Collections: Research Articles
Files in This Item:
File | Description | Size | Format
---|---|---|---
Need to add pdf.odt | - | 8.63 kB | OpenDocument Text
Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.