In this, the first part of a two-part paper, we establish a theorem concerning the entropy of a certain sequence of binary random variables. In the sequel we will apply this result to the solution of three problems in multi-user communication, two of which have been open for some time. Specifically, we show the following. Let $X$ and $Y$ be binary random $n$-vectors, which are the input and output, respectively, of a binary symmetric channel with "crossover" probability $p_0$. Let $H(X)$ and $H(Y)$ be the entropies of $X$ and $Y$, respectively. Then
$$
\frac{1}{n} H(X) \geq h(\alpha_0), \quad 0 \leq \alpha_0 \leq \tfrac{1}{2} \;\Rightarrow\; \frac{1}{n} H(Y) \geq h\bigl(\alpha_0(1 - p_0) + (1 - \alpha_0)p_0\bigr),
$$
where $h(\lambda) = -\lambda \log \lambda - (1 - \lambda) \log (1 - \lambda)$ is the binary entropy function.
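As a numerical sanity check (not from the paper), the bound is tight for i.i.d. inputs: if $X$ is i.i.d. Bernoulli($\alpha_0$), then $\frac{1}{n}H(X) = h(\alpha_0)$ exactly, and the BSC output is i.i.d. Bernoulli($\alpha_0(1-p_0) + (1-\alpha_0)p_0$), so $\frac{1}{n}H(Y)$ equals the stated lower bound. A minimal sketch, assuming base-2 logarithms and illustrative values $\alpha_0 = 0.11$, $p_0 = 0.2$:

```python
import math

def h(x):
    """Binary entropy in bits; h(0) = h(1) = 0 by convention."""
    if x in (0.0, 1.0):
        return 0.0
    return -x * math.log2(x) - (1 - x) * math.log2(1 - x)

def star(a, p):
    """Binary convolution: a(1 - p) + (1 - a)p, the output bias of a
    BSC(p) driven by a Bernoulli(a) input."""
    return a * (1 - p) + (1 - a) * p

# Illustrative parameters (assumptions, not values from the paper).
a0, p0 = 0.11, 0.2
in_rate = h(a0)              # (1/n) H(X) for the i.i.d. input
out_rate = h(star(a0, p0))   # (1/n) H(Y), meeting the bound with equality
print(in_rate, out_rate)     # out_rate >= in_rate, as the theorem requires
```

Since $h$ is increasing on $[0, \tfrac{1}{2}]$ and $\alpha_0 \leq \alpha_0(1-p_0) + (1-\alpha_0)p_0 \leq \tfrac{1}{2}$ whenever $\alpha_0, p_0 \leq \tfrac{1}{2}$, the i.i.d. case makes the direction of the inequality plain; the theorem's content is that it persists for arbitrary (possibly dependent) input vectors.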

- Page(s): 769 - 772
- ISSN: 0018-9448
- DOI: 10.1109/TIT.1973.1055107

- Date of Publication: Nov 1973
- Date of Current Version: 06 January 2003
- Issue Date: Nov 1973
- Sponsored by: IEEE Information Theory Society
- Publisher: IEEE