Condensed Matter > Statistical Mechanics

Title: Jaynes & Shannon's Constrained Ignorance and Surprise

Abstract: In this short article, with possible applications in theoretical and applied physics, we suggest an original way to derive the expression of Shannon's entropy from a purely variational approach, using constraints. Based on the work of Edwin T. Jaynes, our results are not fundamentally new, but the context in which they are derived may nevertheless lead to a remarkably consistent formalism in which the maximum entropy principle appears naturally. After giving a general definition of "ignorance" in this framework, we derive the expected general expression for the entropy in two ways. In the first, one is biased and has a vague idea of the shape of the entropy function; in the second, we consider the general case, where nothing is known a priori. The merits of both approaches are compared.
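The abstract invokes the maximum entropy principle without reproducing the paper's derivation. As a minimal numerical sketch (not the paper's actual variational argument): under the sole normalization constraint, the Lagrange-multiplier solution that maximizes Shannon's entropy is the uniform distribution, which can be checked by perturbing it.

```python
import math

def shannon_entropy(p):
    """Shannon's entropy H(p) = -sum_i p_i log p_i (natural log)."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

# Maximum entropy subject only to sum(p) = 1: the variational
# (Lagrange-multiplier) solution is the uniform distribution on n states.
n = 4
uniform = [1.0 / n] * n
H_uniform = shannon_entropy(uniform)  # equals log(n)

# Any normalized perturbation away from uniformity lowers the entropy,
# consistent with the maximum entropy principle.
perturbed = [0.35, 0.15, 0.30, 0.20]  # still sums to 1
H_perturbed = shannon_entropy(perturbed)
```

Here `H_uniform` equals log(4), and `H_perturbed` is strictly smaller; adding further constraints (e.g. a fixed mean) would instead yield exponential-family distributions, which is the setting Jaynes considered.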
Subjects: Statistical Mechanics (cond-mat.stat-mech); Information Theory (cs.IT); Probability (math.PR); Statistics Theory (math.ST)
Cite as: arXiv:2107.05008 [cond-mat.stat-mech]
  (or arXiv:2107.05008v1 [cond-mat.stat-mech] for this version)

Submission history

From: Thomas Cailleteau
[v1] Sun, 11 Jul 2021 10:20:38 GMT (18kb)