Abstract
This paper considers the problem of high-dimensional sparse precision matrix estimation under Laplacian constraints. We prove that the Laplacian constraints bring favorable properties for estimation: the Gaussian maximum likelihood estimator exists and is unique almost surely from a single observation, irrespective of the dimension. We establish the optimal rate of convergence under the Frobenius norm by deriving minimax lower and upper bounds. The minimax lower bound is obtained by applying the Le Cam-Assouad method with a novel construction of a subparameter space of multivariate normal distributions. The minimax upper bound is established by designing an adaptive ℓ1-norm regularized maximum likelihood estimation method and quantifying its rate of convergence. We prove that the proposed estimator attains the optimal rate of convergence with overwhelming probability. Numerical experiments demonstrate the effectiveness of the proposed estimator.
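To make the setting concrete, the following is a minimal, hypothetical sketch of ℓ1-regularized Gaussian maximum likelihood estimation over Laplacian-constrained precision matrices. It is not the paper's adaptive estimator: it assumes a fixed penalty weight `alpha`, uses the standard identity that the pseudo-determinant of a Laplacian Θ equals det(Θ + J) with J = (1/p)11ᵀ, and optimizes by plain projected gradient descent on the nonnegative edge weights that parametrize Θ.

```python
import numpy as np

# Hypothetical illustration of l1-regularized Gaussian MLE under Laplacian
# constraints; the solver and parameters (alpha, eta) are illustrative,
# not the adaptive estimator proposed in the paper.

rng = np.random.default_rng(0)

def laplacian(w, p, edges):
    """Build the graph Laplacian Theta from edge weights w >= 0."""
    Theta = np.zeros((p, p))
    for k, (i, j) in enumerate(edges):
        Theta[i, j] -= w[k]
        Theta[j, i] -= w[k]
        Theta[i, i] += w[k]
        Theta[j, j] += w[k]
    return Theta

def objective(w, S, J, p, edges, alpha):
    """Penalized negative log-likelihood.

    Since Theta @ 1 = 0, the pseudo-determinant of Theta equals
    det(Theta + J) with J = (1/p) 1 1^T, and the off-diagonal l1 norm
    equals 2 * sum(w) because every off-diagonal entry is nonpositive.
    """
    Theta = laplacian(w, p, edges)
    _, logdet = np.linalg.slogdet(Theta + J)
    return -logdet + np.trace(S @ Theta) + 2 * alpha * w.sum()

p, n, alpha, eta = 5, 200, 0.05, 0.005
edges = [(i, j) for i in range(p) for j in range(i + 1, p)]
X = rng.standard_normal((n, p))          # toy data; any centered sample works
X -= X.mean(axis=0)
S = X.T @ X / n                          # sample covariance
J = np.ones((p, p)) / p

w = np.ones(len(edges))                  # start from the complete graph
obj0 = objective(w, S, J, p, edges, alpha)
for _ in range(500):
    Theta = laplacian(w, p, edges)
    G = S - np.linalg.inv(Theta + J)     # gradient of the smooth part w.r.t. Theta
    # chain rule through Theta = sum_e w_e (e_i - e_j)(e_i - e_j)^T
    g = np.array([G[i, i] + G[j, j] - 2 * G[i, j] for i, j in edges]) + 2 * alpha
    w = np.maximum(w - eta * g, 1e-6)    # project onto (strictly) positive weights

Theta = laplacian(w, p, edges)           # estimated Laplacian precision matrix
```

By construction every iterate satisfies the Laplacian constraints exactly (symmetry, nonpositive off-diagonals, zero row sums), which is why the optimization is carried out in the edge-weight parametrization rather than over the full matrix.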
| Original language | English |
|---|---|
| Pages (from-to) | 3736-3744 |
| Number of pages | 9 |
| Journal | Proceedings of Machine Learning Research |
| Volume | 130 |
| Publication status | Published - 2021 |
| Event | 24th International Conference on Artificial Intelligence and Statistics, AISTATS 2021 |
| Location | Virtual, Online, United States |
| Duration | 13 Apr 2021 → 15 Apr 2021 |
Bibliographical note
Publisher Copyright: © 2021 by the author(s)