chnentropy prints the entropy of channelfile (CHN format) to stdout. Option -o gives the order of the entropy calculation; valid values are 1 and 2. If order 1 is specified, the first-order entropy (sometimes called the zero-order entropy), H(X(t)), is calculated, where X(t) is the random process represented by the channel. If order 2 is specified, the second-order conditional entropy, H(X(t) | X(t-1)), is calculated; this is the average entropy of the current symbol conditioned on the preceding symbol and gives an estimate of the entropy rate of a first-order Markov source. In both cases, X(t) is assumed to be stationary and ergodic.
If option -d is given, the entropy is reported in bits per vector component; that is, the first- or second-order entropy of the channel symbols divided by vector_dimension. Otherwise, the entropy printed is simply the entropy of the channel symbols. The -d option thus gives a convenient way to calculate the bit rate, in bits per original source symbol, when the channel contains the indices output by a vector quantizer (see vqencode(1)).
The -vo option indicates that only the value of the entropy is to be printed (terse output).
chnentropy uses QccChannelEntropy(3) to perform the entropy calculation.
T. M. Cover and J. A. Thomas, Elements of Information Theory. New York: John Wiley & Sons, Inc., 1991.
A. Gersho and R. M. Gray, Vector Quantization and Signal Compression. Norwell, MA: Kluwer Academic Publishers, 1992.