In mathematics, an information source is a sequence of random variables ranging over a finite alphabet Γ, having a stationary distribution.
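For instance, taking the alphabet Γ = {0, 1}, a sequence of independent fair coin flips is an information source: every variable in the sequence has the same distribution

\[ \Pr(X_i = 0) = \Pr(X_i = 1) = \tfrac{1}{2} \quad \text{for all } i, \]

so the distribution is stationary.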
The uncertainty, or entropy rate, of an information source is defined as

\[ H\{\mathbf{X}\} = \lim_{n \to \infty} H(X_n \mid X_0, X_1, \dots, X_{n-1}), \]

where

\[ X_0, X_1, \dots, X_n \]

is the sequence of random variables defining the information source, and

\[ H(X_n \mid X_0, X_1, \dots, X_{n-1}) \]

is the conditional information entropy of the sequence of random variables. Equivalently, one has

\[ H\{\mathbf{X}\} = \lim_{n \to \infty} \frac{H(X_0, X_1, \dots, X_{n-1}, X_n)}{n+1}. \]
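For instance, if the variables X_i are independent and identically distributed (a memoryless source), conditioning on the past leaves each term unchanged, so H(X_n | X_0, X_1, ..., X_{n-1}) = H(X_0) for every n, and the entropy rate reduces to the entropy of a single symbol:

\[ H\{\mathbf{X}\} = H(X_0) = -\sum_{\gamma \in \Gamma} \Pr(X_0 = \gamma) \log \Pr(X_0 = \gamma). \]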
References
- Robert B. Ash, Information Theory (1965), Dover Publications. ISBN 0-486-66521-6