Entropy estimation via uniformization

Ziqiao Ao, Jinglai Li*

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review


Abstract

Entropy estimation is of practical importance in information theory and statistical science. Many existing entropy estimators suffer from estimation bias that grows rapidly with dimensionality, rendering them unsuitable for high-dimensional problems. In this work we propose a transform-based method for high-dimensional entropy estimation, which consists of two main ingredients. First, we provide a modified k-nearest neighbors (k-NN) entropy estimator with reduced bias for samples that closely resemble a uniform distribution. Second, we design a normalizing-flow-based mapping that pushes samples toward the uniform distribution, and we derive the relation between the entropy of the original samples and that of the transformed ones. The entropy of a given set of samples is thus estimated by first transforming them toward the uniform distribution and then applying the proposed estimator to the transformed samples. The performance of the proposed method is compared against several existing entropy estimators on both mathematical examples and real-world applications.
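To make the two ingredients concrete, the sketch below illustrates the uniformization idea under stated assumptions: it implements the classical Kozachenko–Leonenko k-NN entropy estimator (the paper's modified, bias-reduced variant is not specified in the abstract), and it uses the exact Gaussian CDF as a stand-in for a trained normalizing flow. The correction term follows from the change-of-variables relation H(X) = H(T(X)) − E[log |det J_T(X)|] for an invertible map T.

```python
import numpy as np
from scipy.special import digamma, gammaln
from scipy.spatial import cKDTree
from scipy.stats import norm

def knn_entropy(x, k=5):
    """Classical Kozachenko-Leonenko k-NN entropy estimate, in nats.

    Note: this is the standard estimator, not the paper's modified version,
    whose exact bias correction is not given in the abstract.
    """
    n, d = x.shape
    tree = cKDTree(x)
    # Query k+1 neighbors: the query point itself is returned at distance 0.
    eps = tree.query(x, k=k + 1)[0][:, -1]
    # Log volume of the unit d-ball: pi^(d/2) / Gamma(d/2 + 1).
    log_cd = (d / 2) * np.log(np.pi) - gammaln(d / 2 + 1)
    return digamma(n) - digamma(k) + log_cd + d * np.mean(np.log(eps))

# Uniformization pipeline: map samples X to (near-)uniform U = T(X),
# estimate the entropy of U with the k-NN estimator, then undo the
# change of variables:  H(X) = H(T(X)) - E[log |det J_T(X)|].
rng = np.random.default_rng(0)
x = rng.standard_normal((4000, 1))        # samples with known entropy
u = norm.cdf(x)                           # uniformizing map (CDF transform)
log_det_j = norm.logpdf(x).sum(axis=1)    # log |det J_T| for the CDF map
h_x = knn_entropy(u) - log_det_j.mean()
# h_x approximates the Gaussian entropy 0.5 * log(2 * pi * e) ~ 1.4189
```

Because the transformed samples are nearly uniform, the k-NN estimator operates in the regime where its bias is smallest, which is the motivation for the paper's approach; with a real normalizing flow, the exact CDF above would be replaced by the learned transport map and its log-determinant.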
Original language: English
Article number: 103954
Number of pages: 34
Journal: Artificial Intelligence
Volume: 322
Early online date: 12 Jun 2023
DOIs
Publication status: Published - Sept 2023

Keywords

  • Entropy estimation
  • k Nearest neighbor estimator
  • Normalizing flow
  • Uniformization

