
Title: Unbiased Estimating Equation on Inverse Divergence and Its Conditions

Abstract: This paper focuses on the Bregman divergence defined by the reciprocal function, called the inverse divergence. For the loss function defined by a monotonically increasing function $f$ and the inverse divergence, we clarify the conditions on the statistical model and the function $f$ under which the estimating equation is unbiased. Specifically, we characterize two types of statistical models, an inverse Gaussian type and a mixture of generalized inverse Gaussian type distributions, and show that the conditions on the function $f$ differ between the two models. We also define a Bregman divergence as a linear sum of the inverse divergence over the dimensions and extend the results to the multi-dimensional case.
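The abstract describes the inverse divergence as the Bregman divergence defined by the reciprocal function, with the multi-dimensional version formed as a linear sum over dimensions. A minimal sketch, assuming the generator is $\phi(t) = 1/t$ (convex on $t > 0$), so that $D(x, y) = \phi(x) - \phi(y) - \phi'(y)(x - y) = 1/x - 1/y + (x - y)/y^2$; the function names are illustrative, not from the paper:

```python
def inverse_divergence(x: float, y: float) -> float:
    """Bregman divergence generated by phi(t) = 1/t (assumed generator),
    valid for x, y > 0:
        D(x, y) = 1/x - 1/y + (x - y) / y**2
    Nonnegative, and zero iff x == y.
    """
    return 1.0 / x - 1.0 / y + (x - y) / y**2


def inverse_divergence_nd(xs, ys) -> float:
    """Multi-dimensional inverse divergence as a linear (coordinate-wise)
    sum over dimensions, as described in the abstract."""
    return sum(inverse_divergence(x, y) for x, y in zip(xs, ys))
```

For example, `inverse_divergence(2.0, 2.0)` is exactly zero, while `inverse_divergence(1.0, 2.0)` is positive, consistent with the defining properties of a Bregman divergence on the positive reals.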
Comments: Accepted to the 2024 IEEE International Symposium on Information Theory (ISIT 2024)
Subjects: Information Theory (cs.IT); Machine Learning (cs.LG); Statistics Theory (math.ST)
Cite as: arXiv:2404.16519 [cs.IT]
  (or arXiv:2404.16519v1 [cs.IT] for this version)

Submission history

From: Masahiro Kobayashi [view email]
[v1] Thu, 25 Apr 2024 11:22:48 GMT (14kb)
