Hessian-Aware Zeroth-Order Optimization

Haishan Ye*, Zhichao Huang, Cong Fang, Chris Junchi Li, Tong Zhang

*Corresponding author for this work

Research output: Contribution to journal › Journal Article › peer-review

Abstract

Zeroth-order optimization algorithms have recently emerged as a popular research theme in optimization and machine learning, playing important roles in many deep-learning-related tasks such as black-box adversarial attacks, deep reinforcement learning, and hyper-parameter tuning. Mainstream zeroth-order optimization algorithms, however, concentrate on exploiting zeroth-order estimates of first-order gradient information about the objective landscape. In this paper, we propose a novel meta-algorithm, the Hessian-Aware Zeroth-Order (ZOHA) optimization algorithm, which utilizes several canonical variants of zeroth-order-estimated second-order Hessian information of the objective: power-method-based and Gaussian-smoothing-based. We show theoretically that ZOHA enjoys an improved convergence rate compared with existing work that does not incorporate second-order Hessian information into zeroth-order optimization. Empirical studies on logistic regression as well as black-box adversarial attacks validate its effectiveness and improved success rates with reduced query complexity of the zeroth-order oracle.
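As a minimal illustration of the Gaussian-smoothing estimators the abstract refers to, the sketch below implements the standard zeroth-order gradient estimator and a Stein-type zeroth-order Hessian estimator using only function evaluations. This is a generic textbook-style sketch, not the paper's ZOHA algorithm itself; the quadratic test objective, sample counts, and function names are illustrative assumptions.

```python
import numpy as np

# Illustrative quadratic objective f(x) = 0.5 * x^T A x, chosen because
# grad f(x) = A x and Hess f(x) = A can be checked exactly.
A = np.diag([1.0, 2.0, 3.0])

def f(x):
    return 0.5 * x @ A @ x

def zo_gradient(f, x, mu=1e-3, num_samples=5000, seed=0):
    """Gaussian-smoothing zeroth-order gradient estimate:
    E_u[ (f(x + mu*u) - f(x)) / mu * u ],  u ~ N(0, I)."""
    rng = np.random.default_rng(seed)
    fx = f(x)
    g = np.zeros_like(x)
    for _ in range(num_samples):
        u = rng.standard_normal(x.size)
        g += (f(x + mu * u) - fx) / mu * u
    return g / num_samples

def zo_hessian(f, x, mu=1e-2, num_samples=20000, seed=1):
    """Stein-type Gaussian-smoothing zeroth-order Hessian estimate:
    E_u[ (f(x+mu*u) + f(x-mu*u) - 2 f(x)) / (2 mu^2) * (u u^T - I) ]."""
    rng = np.random.default_rng(seed)
    d = x.size
    fx = f(x)
    H = np.zeros((d, d))
    for _ in range(num_samples):
        u = rng.standard_normal(d)
        delta = (f(x + mu * u) + f(x - mu * u) - 2.0 * fx) / (2.0 * mu * mu)
        H += delta * (np.outer(u, u) - np.eye(d))
    return H / num_samples

x0 = np.array([1.0, 1.0, 1.0])
g_est = zo_gradient(f, x0)   # Monte Carlo estimate of A @ x0 = [1, 2, 3]
H_est = zo_hessian(f, x0)    # Monte Carlo estimate of A
```

Both estimators query only function values, which is exactly the black-box setting described above; the Hessian estimate is what a second-order-aware method can exploit for better-conditioned updates, at the cost of extra oracle queries per iteration.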

Original language: English
Pages (from-to): 4869-4877
Number of pages: 9
Journal: IEEE Transactions on Pattern Analysis and Machine Intelligence
Volume: 47
Issue number: 6
DOIs
Publication status: Published - 2025
Externally published: Yes

Bibliographical note

Publisher Copyright:
© 1979-2012 IEEE.

Keywords

  • Hessian-aware algorithm
  • randomized algorithm
  • zeroth-order optimization
