
Title: Polynomial Approximations of Conditional Expectations in Scalar Gaussian Channels

Abstract: We consider a channel $Y=X+N$ where $X$ is a random variable satisfying $\mathbb{E}[|X|]<\infty$ and $N$ is an independent standard normal random variable. We show that the minimum mean-square error estimator of $X$ from $Y,$ which is given by the conditional expectation $\mathbb{E}[X \mid Y],$ is a polynomial in $Y$ if and only if it is linear or constant; these two cases correspond to $X$ being Gaussian or a constant, respectively. We also prove that the higher-order derivatives of $y \mapsto \mathbb{E}[X \mid Y=y]$ are expressible as multivariate polynomials in the functions $y \mapsto \mathbb{E}\left[ \left( X - \mathbb{E}[X \mid Y] \right)^k \mid Y = y \right]$ for $k\in \mathbb{N}.$ These expressions yield bounds on the $2$-norm of the derivatives of the conditional expectation. These bounds imply that, if $X$ has a compactly-supported density that is even and decreasing on the positive half-line, then the error in approximating the conditional expectation $\mathbb{E}[X \mid Y]$ by polynomials in $Y$ of degree at most $n$ decays faster than any polynomial in $n.$
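As a numerical illustration of the Gaussian case mentioned in the abstract (a standard fact about scalar Gaussian channels, not a result of this paper): if $X \sim \mathcal{N}(0, \sigma^2)$ and $Y = X + N$ with $N \sim \mathcal{N}(0,1)$ independent of $X$, then $\mathbb{E}[X \mid Y] = \frac{\sigma^2}{\sigma^2 + 1} Y$, i.e., the conditional expectation is linear. The sketch below (function names and parameters are illustrative) computes $\mathbb{E}[X \mid Y = y]$ via Bayes' rule and the trapezoid rule, and checks it against this closed form:

```python
import math

def phi(z):
    # Standard normal density.
    return math.exp(-z * z / 2) / math.sqrt(2 * math.pi)

def cond_mean(y, prior_pdf, lo=-10.0, hi=10.0, n=20000):
    # E[X | Y=y] = (integral of x * p(x) * phi(y-x) dx)
    #            / (integral of     p(x) * phi(y-x) dx),
    # approximated on [lo, hi] with the trapezoid rule.
    h = (hi - lo) / n
    num = den = 0.0
    for i in range(n + 1):
        x = lo + i * h
        w = 0.5 if i in (0, n) else 1.0
        f = prior_pdf(x) * phi(y - x)
        num += w * x * f
        den += w * f
    return num / den

sigma2 = 2.0
# Density of X ~ N(0, sigma2).
gauss_prior = lambda x: phi(x / math.sqrt(sigma2)) / math.sqrt(sigma2)

y = 1.3
est = cond_mean(y, gauss_prior)
linear = sigma2 / (sigma2 + 1.0) * y  # closed-form linear MMSE estimate
print(est, linear)
```

For a non-Gaussian, non-constant prior the paper shows the estimator is never exactly a polynomial; the same `cond_mean` routine can be reused with any integrable `prior_pdf` to observe the curvature of $y \mapsto \mathbb{E}[X \mid Y = y]$ numerically.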
Comments: A short version of this paper has been submitted to the 2021 IEEE International Symposium on Information Theory (ISIT)
Subjects: Information Theory (cs.IT); Probability (math.PR)
Cite as: arXiv:2102.05970 [cs.IT]
  (or arXiv:2102.05970v1 [cs.IT] for this version)

Submission history

From: Wael Alghamdi [view email]
[v1] Thu, 11 Feb 2021 12:26:43 GMT (23kb)
