Title: A Mathematical Theory of Semantic Communication

Abstract: The year 1948 witnessed the historic moment of the birth of classic information theory (CIT). Guided by CIT, modern communication techniques have approached their theoretical limits, such as the entropy function $H(U)$, the channel capacity $C=\max_{p(x)}I(X;Y)$, and the rate-distortion function $R(D)=\min_{p(\hat{x}|x):\mathbb{E}d(x,\hat{x})\leq D} I(X;\hat{X})$. Semantic communication opens a new direction for future communication techniques, but a guiding theory is still missing. In this paper, we attempt to establish a systematic framework of semantic information theory (SIT). We investigate the behavior of semantic communication and find that synonymy is its basic feature, so we define a synonymous mapping between semantic information and syntactic information. Stemming from this core concept, the synonymous mapping $f$, we introduce measures of semantic information, such as the semantic entropy $H_s(\tilde{U})$, the up/down semantic mutual information $I^s(\tilde{X};\tilde{Y})$ ($I_s(\tilde{X};\tilde{Y})$), the semantic capacity $C_s=\max_{f_{xy}}\max_{p(x)}I^s(\tilde{X};\tilde{Y})$, and the semantic rate-distortion function $R_s(D)=\min_{\{f_x,f_{\hat{x}}\}}\min_{p(\hat{x}|x):\mathbb{E}d_s(\tilde{x},\hat{\tilde{x}})\leq D}I_s(\tilde{X};\hat{\tilde{X}})$. Furthermore, we prove three coding theorems of SIT by using random coding and (jointly) typical decoding/encoding: the semantic source coding theorem, the semantic channel coding theorem, and the semantic rate-distortion coding theorem. We find that the limits of SIT are extended by synonymous mapping, that is, $H_s(\tilde{U})\leq H(U)$, $C_s\geq C$, and $R_s(D)\leq R(D)$. Together, these results constitute the basis of semantic information theory. In addition, we discuss the semantic information measures in the continuous case. For the band-limited Gaussian channel, we obtain a new channel capacity formula, $C_s=B\log\left[S^4\left(1+\frac{P}{N_0B}\right)\right]$.
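To make two of the abstract's headline claims concrete, here is a minimal numeric sketch (not code from the paper). It coarse-grains a toy syntactic distribution through a hypothetical synonymous partition, which can only lower the entropy, matching $H_s(\tilde{U})\leq H(U)$; and it compares the abstract's Gaussian formula against Shannon's $C=B\log(1+P/(N_0B))$, which the formula recovers when the synonymous parameter $S=1$. The distribution, the partition, and the values of $B$, $P$, $N_0$, and $S$ are illustrative assumptions, not figures from the paper.

```python
import math

def entropy(p):
    """Shannon entropy (bits) of a probability vector."""
    return -sum(q * math.log2(q) for q in p if q > 0)

# Hypothetical syntactic source with four symbols (e.g. {car, auto, cat, kitten}).
p_syntactic = [0.4, 0.2, 0.3, 0.1]

# Hypothetical synonymous mapping f: symbols 0 and 1 share one meaning,
# symbols 2 and 3 another. Each semantic symbol's probability is the sum
# over its synonymous set, so the semantic distribution is a coarse-graining.
synonymous_sets = [[0, 1], [2, 3]]
p_semantic = [sum(p_syntactic[i] for i in s) for s in synonymous_sets]

H = entropy(p_syntactic)   # H(U)   ~ 1.846 bits
H_s = entropy(p_semantic)  # H_s(~U) ~ 0.971 bits, <= H(U) as the paper claims
print(f"H(U)    = {H:.3f} bits")
print(f"H_s(~U) = {H_s:.3f} bits")

# Band-limited Gaussian channel: Shannon's capacity versus the abstract's
# semantic capacity C_s = B log2[S^4 (1 + P/(N0 B))] (log taken base 2 here).
# With S = 1 the two coincide; for S > 1, C_s exceeds C. B, P, N0, and S
# below are arbitrary illustrative values.
def shannon_capacity(B, P, N0):
    return B * math.log2(1 + P / (N0 * B))

def semantic_capacity(B, P, N0, S):
    return B * math.log2(S**4 * (1 + P / (N0 * B)))

B, P, N0 = 1.0e6, 1.0, 1.0e-9  # 1 MHz bandwidth, illustrative power and noise
print(f"C   = {shannon_capacity(B, P, N0) / 1e6:.3f} Mbit/s")
print(f"C_s = {semantic_capacity(B, P, N0, S=2) / 1e6:.3f} Mbit/s")
```

Merging outcomes can never increase entropy, since $-(p+q)\log(p+q) \leq -p\log p - q\log q$ for any $p,q>0$; this is why every synonymous coarse-graining satisfies $H_s(\tilde{U})\leq H(U)$.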
Comments: (version 2.0) 96 pages, 18 figures. Submitted to IEEE Transactions on Information Theory (TIT)
Subjects: Information Theory (cs.IT)
Cite as: arXiv:2401.13387 [cs.IT]
  (or arXiv:2401.13387v2 [cs.IT] for this version)

Submission history

From: Kai Niu
[v1] Wed, 24 Jan 2024 11:35:42 GMT (1493kb,D)
[v2] Wed, 27 Mar 2024 03:37:17 GMT (1496kb,D)
