Hessian and concavity of mutual information, differential entropy, and entropy power in linear vector Gaussian channels

Miguel Payaró*, Daniel P. Palomar

*Corresponding author for this work

Research output: Contribution to journal › Journal article › peer-review

Abstract

Within the framework of linear vector Gaussian channels with arbitrary signaling, this paper calculates the Jacobians of the minimum mean square error (MMSE) and Fisher information matrices with respect to arbitrary parameters of the system. Capitalizing on prior research in which the MMSE and Fisher information matrices were linked to information-theoretic quantities through differentiation, the Hessians of the mutual information and the differential entropy are derived. These expressions are then used to assess the concavity properties of mutual information and differential entropy under different channel conditions, and to derive a multivariate version of an entropy power inequality due to Costa.
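For context, the abstract builds on two known first-order results whose second-order counterparts the paper develops. A minimal sketch in LaTeX, assuming standard notation (the symbols y, H, x, n, E, and N(·) are illustrative choices, not taken from this record): the channel model and MMSE matrix, the gradient identity of Palomar and Verdú linking mutual information to the MMSE matrix through differentiation, and Costa's scalar entropy power inequality, of which the paper derives a multivariate version.

% Linear vector Gaussian channel with Gaussian noise independent of the input
\[ \mathbf{y} = \mathbf{H}\mathbf{x} + \mathbf{n}, \qquad \mathbf{n} \sim \mathcal{N}(\mathbf{0},\mathbf{I}) \]
% MMSE matrix: covariance of the (nonlinear) estimation error of x given y
\[ \mathbf{E} = \mathbb{E}\!\big[(\mathbf{x}-\mathbb{E}[\mathbf{x}\mid\mathbf{y}])(\mathbf{x}-\mathbb{E}[\mathbf{x}\mid\mathbf{y}])^{\top}\big] \]
% Gradient identity (Palomar--Verd\'u, 2006): differentiating the mutual
% information with respect to the channel matrix yields the MMSE matrix
\[ \nabla_{\mathbf{H}}\, I(\mathbf{x};\mathbf{y}) = \mathbf{H}\,\mathbf{E} \]
% Costa's entropy power inequality (scalar perturbation): for a standard
% Gaussian vector z independent of x, the entropy power of x + sqrt(t) z
% is concave in t >= 0, where for n-dimensional x
\[ N(\mathbf{x}) = \frac{1}{2\pi e}\, e^{\frac{2}{n} h(\mathbf{x})}, \qquad t \mapsto N(\mathbf{x} + \sqrt{t}\,\mathbf{z}) \ \text{is concave on } t \ge 0 \]

Per the abstract, the paper's contribution is the second-order analogue of the gradient identity: Hessians of the mutual information and the differential entropy, obtained via Jacobians of the MMSE and Fisher information matrices.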

Original language: English
Pages (from-to): 3613-3628
Number of pages: 16
Journal: IEEE Transactions on Information Theory
Volume: 55
Issue number: 8
DOIs
Publication status: Published - 2009

Keywords

  • Concavity properties
  • Differential entropy
  • Entropy power
  • Fisher information matrix
  • Gaussian noise
  • Hessian matrices
  • Linear vector Gaussian channels
  • Minimum mean-square error (MMSE)
  • Mutual information
  • Nonlinear estimation
