Abstract
Zeroth-order optimization algorithms have recently emerged as a popular research theme in optimization and machine learning, playing important roles in many deep-learning-related tasks such as black-box adversarial attacks, deep reinforcement learning, and hyper-parameter tuning. Mainstream zeroth-order optimization algorithms, however, concentrate on exploiting zeroth-order estimates of the first-order gradient of the objective landscape. In this paper, we propose a novel meta-algorithm called the Hessian-Aware Zeroth-Order (ZOHA) optimization algorithm, which utilizes several canonical variants of zeroth-order-estimated second-order Hessian information of the objective: power-method-based and Gaussian-smoothing-based. We show theoretically that ZOHA enjoys an improved convergence rate compared with existing work that does not incorporate second-order Hessian information into zeroth-order optimization. Empirical studies on logistic regression and black-box adversarial attacks validate the effectiveness of ZOHA: it achieves improved success rates with reduced query complexity of the zeroth-order oracle.
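The paper defines its own estimators; as a rough illustration of the Gaussian-smoothing idea the abstract refers to, the sketch below estimates the gradient and Hessian of a black-box objective from function evaluations only. The function names, sample counts, and smoothing radius `mu` are our illustrative choices, not the paper's.

```python
import numpy as np

def zo_gradient(f, x, mu=0.01, n=10000, rng=None):
    # Gaussian-smoothing zeroth-order gradient estimate via central
    # differences: E[(f(x + mu*u) - f(x - mu*u)) / (2*mu) * u] ~= grad f(x)
    # for u ~ N(0, I).  Each sample costs two oracle queries.
    rng = np.random.default_rng(rng)
    d = x.shape[0]
    g = np.zeros(d)
    for _ in range(n):
        u = rng.standard_normal(d)
        g += (f(x + mu * u) - f(x - mu * u)) / (2 * mu) * u
    return g / n

def zo_hessian(f, x, mu=0.1, n=10000, rng=None):
    # Stein-type Gaussian-smoothing Hessian estimate:
    # E[(f(x+mu*u) + f(x-mu*u) - 2*f(x)) / (2*mu^2) * (u u^T - I)]
    # ~= Hess f(x) for u ~ N(0, I); it is exact in expectation for
    # quadratic objectives, up to Monte Carlo noise.
    rng = np.random.default_rng(rng)
    d = x.shape[0]
    fx = f(x)
    H = np.zeros((d, d))
    I = np.eye(d)
    for _ in range(n):
        u = rng.standard_normal(d)
        coef = (f(x + mu * u) + f(x - mu * u) - 2 * fx) / (2 * mu ** 2)
        H += coef * (np.outer(u, u) - I)
    return H / n
```

On a quadratic test objective `f(z) = 0.5 * z @ A @ z`, both estimates converge to `A @ x` and `A` respectively as the sample count grows; the accuracy/query trade-off is exactly the query-complexity concern the abstract highlights.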
| Original language | English |
|---|---|
| Pages (from-to) | 4869-4877 |
| Number of pages | 9 |
| Journal | IEEE Transactions on Pattern Analysis and Machine Intelligence |
| Volume | 47 |
| Issue number | 6 |
| DOIs | |
| Publication status | Published - 2025 |
| Externally published | Yes |
Bibliographical note
Publisher Copyright: © 1979-2012 IEEE.
Keywords
- Hessian-aware algorithm
- randomized algorithm
- zeroth-order optimization
Fingerprint
Dive into the research topics of 'Hessian-Aware Zeroth-Order Optimization'. Together they form a unique fingerprint.