double QccChannelEntropy(const QccChannel *channel, int order);
If order 1 is specified, the first-order entropy (sometimes called the zeroth-order entropy) is calculated as H(X(t)), where X(t) is the random process represented by the channel symbols, channel->channel_symbols. If order 2 is specified, the second-order conditional entropy, H(X(t) | X(t-1)), is calculated; this is the average entropy of the current symbol conditioned on the preceding symbol and is an estimate of the entropy rate of a first-order Markov source. In both cases, X(t) is assumed to be stationary and ergodic.
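In the standard notation (see, e.g., Cover and Thomas), the two quantities are

```latex
H(X(t)) = -\sum_{x} p(x) \log_2 p(x)
\qquad
H(X(t) \mid X(t-1)) = -\sum_{x',\,x} p(x', x) \log_2 p(x \mid x')
```

where p(x) is the marginal probability of symbol x, p(x', x) is the joint probability of the symbol pair (previous, current), and p(x | x') is the corresponding conditional probability.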
For the first-order entropy calculation, QccChannelEntropy() estimates the probability of occurrence of each symbol in the current block of symbols. For the second-order entropy calculation, a probability matrix is calculated giving the probability of each symbol conditioned on the previous symbol. QccENTConditionalEntropy(3) is used to perform the entropy calculation in each case.
T. M. Cover and J. A. Thomas, Elements of Information Theory. New York: John Wiley & Sons, Inc., 1991.
A. Gersho and R. M. Gray, Vector Quantization and Signal Compression. Norwell, MA: Kluwer Academic Publishers, 1992.