int QccENTAdaptiveGolombEncodeChannel(const QccChannel *channel, QccBitBuffer *output_buffer);
int QccENTAdaptiveGolombDecodeChannel(QccBitBuffer *input_buffer, QccChannel *channel);
QccENTAdaptiveGolombDecodeChannel() decodes the bits in the bitstream input_buffer, producing a stream of binary channel symbols that is stored in channel. channel must be allocated prior to calling QccENTAdaptiveGolombDecodeChannel(); QccChannelGetBlockSize(3) is called to determine how many channel symbols are to be decoded from input_buffer. QccENTAdaptiveGolombDecodeChannel() calls QccENTAdaptiveGolombDecode(3) to perform the actual adaptive Golomb decoding.
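The routine below is a minimal sketch of the decoding call sequence, not part of QccPack itself; it assumes the buffer-setup routines QccBitBufferInitialize(3), QccBitBufferStart(3), and QccBitBufferEnd(3) and the QccBitBuffer fields filename and type (QCCBITBUFFER_INPUT) as used elsewhere in QccPack -- check these names against the QccPack headers:

    #include <string.h>
    #include "libQccPack.h"

    int DecodeChannelFromFile(const char *filename, QccChannel *channel)
    {
      QccBitBuffer input_buffer;

      /* Open the bitstream for reading (setup calls assumed above) */
      QccBitBufferInitialize(&input_buffer);
      strcpy(input_buffer.filename, filename);
      input_buffer.type = QCCBITBUFFER_INPUT;
      if (QccBitBufferStart(&input_buffer))
        return(1);

      /* channel must already be allocated; the number of symbols to
         decode is taken from QccChannelGetBlockSize(channel) */
      if (QccENTAdaptiveGolombDecodeChannel(&input_buffer, channel))
        return(1);

      return(QccBitBufferEnd(&input_buffer));
    }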
QccENTAdaptiveGolombEncode() will fail if it encounters an invalid symbol (i.e., a symbol that is neither 0 nor 1).
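It can therefore be worthwhile to validate a channel before encoding it. A minimal sketch, assuming the QccChannel fields channel_length and channel_symbols as used elsewhere in QccPack:

    #include "libQccPack.h"

    int EncodeChannelChecked(const QccChannel *channel,
                             QccBitBuffer *output_buffer)
    {
      int symbol;

      /* Reject any symbol that is neither 0 nor 1 before encoding */
      for (symbol = 0; symbol < channel->channel_length; symbol++)
        if ((channel->channel_symbols[symbol] != 0) &&
            (channel->channel_symbols[symbol] != 1))
          return(1);

      return(QccENTAdaptiveGolombEncodeChannel(channel, output_buffer));
    }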
Golomb coding originated in Golomb's 1966 paper; the adaptive variant implemented here is due to Langdon (1983). Apparently, this adaptive Golomb coding is also known as run-length/Rice coding.
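The standalone sketch below illustrates the run-length/Rice idea in isolation; it is not QccPack's implementation, and its adaptation rule (grow k after a complete run, shrink it after an interrupted one) is only one common variant. A complete run of 2^k zeros is coded as a single 0 bit, while a run cut short by a 1 is coded as a 1 bit followed by the k-bit run length:

    #define K_MAX 16

    static void put_bit(unsigned char *buf, long *pos, int bit)
    {
      if (bit)
        buf[*pos / 8] |= (unsigned char)(1 << (*pos % 8));
      (*pos)++;
    }

    static int get_bit(const unsigned char *buf, long *pos)
    {
      int bit = (buf[*pos / 8] >> (*pos % 8)) & 1;
      (*pos)++;
      return(bit);
    }

    /* Encode num_symbols binary symbols; returns the number of bits
       written. out must be zeroed and large enough for the worst case. */
    static long AdaptiveRunLengthEncode(const int *symbols,
                                        long num_symbols,
                                        unsigned char *out)
    {
      long pos = 0, run = 0, i;
      int k = 0, b;

      for (i = 0; i < num_symbols; i++)
        if (symbols[i] == 0)
          {
            if (++run == (1L << k))
              {          /* complete run of 2^k zeros: one 0 bit, k grows */
                put_bit(out, &pos, 0);
                run = 0;
                if (k < K_MAX) k++;
              }
          }
        else
          {              /* run ended by a 1: 1 bit + k-bit remainder */
            put_bit(out, &pos, 1);
            for (b = k - 1; b >= 0; b--)
              put_bit(out, &pos, (int)((run >> b) & 1));
            run = 0;
            if (k > 0) k--;
          }

      if (run > 0)
        {                /* flush a trailing partial run of zeros; the
                            decoder stops after num_symbols symbols */
          put_bit(out, &pos, 1);
          for (b = k - 1; b >= 0; b--)
            put_bit(out, &pos, (int)((run >> b) & 1));
        }
      return(pos);
    }

    static void AdaptiveRunLengthDecode(const unsigned char *in,
                                        long num_symbols, int *symbols)
    {
      long pos = 0, i = 0, run, j;
      int k = 0, b;

      while (i < num_symbols)
        if (get_bit(in, &pos) == 0)
          {              /* complete run of 2^k zeros */
            for (j = 0; j < (1L << k) && i < num_symbols; j++)
              symbols[i++] = 0;
            if (k < K_MAX) k++;
          }
        else
          {              /* k-bit run of zeros, then a terminating 1 */
            for (run = 0, b = 0; b < k; b++)
              run = (run << 1) | get_bit(in, &pos);
            for (j = 0; j < run && i < num_symbols; j++)
              symbols[i++] = 0;
            if (i < num_symbols)
              symbols[i++] = 1;
            if (k > 0) k--;
          }
    }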
G. G. Langdon, Jr., "An adaptive run-length coding algorithm," IBM Technical Disclosure Bulletin, vol. 26, no. 7B, pp. 3783-3785, December 1983.
S. W. Golomb, "Run-length encodings," IEEE Transactions on Information Theory, vol. 12, pp. 399-401, July 1966.
Copyright (C) 1997-2021 James E. Fowler