int QccENTAdaptiveGolombEncode(QccBitBuffer *output_buffer, const int *symbols, int num_symbols);

int QccENTAdaptiveGolombDecode(QccBitBuffer *input_buffer, int *symbols, int num_symbols);
QccENTAdaptiveGolombEncode() performs adaptive Golomb (Langdon) encoding of the binary symbols in the array symbols, writing the resulting bits to the bitstream output_buffer. The output_buffer must be opened for writing prior to calling QccENTAdaptiveGolombEncode().

QccENTAdaptiveGolombDecode() performs adaptive Golomb (Langdon) decoding of the bits in the bitstream input_buffer, producing an output stream of symbols that are stored in symbols. The symbols array must be allocated with space sufficient to hold num_symbols integers; this allocation must be done prior to calling QccENTAdaptiveGolombDecode(). Additionally, input_buffer must be opened for reading prior to calling QccENTAdaptiveGolombDecode().
QccENTAdaptiveGolombEncode() will fail if it encounters an invalid symbol (i.e., a symbol that is neither 0 nor 1).
Golomb coding originated in Golomb's 1966 paper; the adaptive variant implemented here is due to Langdon. This adaptive Golomb coding is also known as run-length/Rice coding.
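The adaptive run-length scheme can be sketched in self-contained C as follows. This is an illustrative implementation under assumed conventions (the BitBuf type, the K_MAX cap, and MSB-first run counts are choices made here, not QccPack's actual internals): a completed run of 2^k zero symbols costs a single '0' bit and k grows, while a '1' symbol costs a '1' bit plus the k-bit partial run length and k shrinks.

```c
#include <stdint.h>

/* Minimal bit buffer for illustration; QccPack's QccBitBuffer plays
   an analogous role in the actual library. */
typedef struct { uint8_t bytes[1024]; int pos; } BitBuf;

static void put_bit(BitBuf *b, int bit) {
    if (bit) b->bytes[b->pos >> 3] |= (uint8_t)(1 << (b->pos & 7));
    b->pos++;
}
static int get_bit(BitBuf *b) {
    int bit = (b->bytes[b->pos >> 3] >> (b->pos & 7)) & 1;
    b->pos++;
    return bit;
}

#define K_MAX 8  /* cap on the adaptive parameter (an assumption) */

/* Encode binary symbols with a Langdon-style adaptive run-length code. */
static void ag_encode(BitBuf *out, const int *symbols, int n) {
    int k = 0, run = 0, i, j;
    for (i = 0; i < n; i++) {
        if (symbols[i] == 0) {
            if (++run == (1 << k)) {       /* full run: one '0' bit */
                put_bit(out, 0);
                run = 0;
                if (k < K_MAX) k++;        /* longer runs get cheaper */
            }
        } else {
            put_bit(out, 1);               /* run interrupted by a '1' */
            for (j = k - 1; j >= 0; j--)   /* k-bit run length, MSB first */
                put_bit(out, (run >> j) & 1);
            run = 0;
            if (k > 0) k--;
        }
    }
    if (run > 0) {                         /* flush a pending partial run */
        put_bit(out, 1);
        for (j = k - 1; j >= 0; j--)
            put_bit(out, (run >> j) & 1);
    }
}

/* Decode exactly n symbols, mirroring the encoder's adaptation of k. */
static void ag_decode(BitBuf *in, int *symbols, int n) {
    int k = 0, count = 0, run, j;
    while (count < n) {
        if (get_bit(in) == 0) {            /* a full run of 2^k zeros */
            for (j = 0; j < (1 << k) && count < n; j++)
                symbols[count++] = 0;
            if (k < K_MAX) k++;
        } else {                           /* partial run, then a '1' */
            run = 0;
            for (j = 0; j < k; j++)
                run = (run << 1) | get_bit(in);
            for (j = 0; j < run && count < n; j++)
                symbols[count++] = 0;
            if (count < n) symbols[count++] = 1;  /* drop flush-only '1' */
            if (k > 0) k--;
        }
    }
}
```

Because the decoder knows num_symbols, it simply stops after producing that many symbols, which is why the encoder's end-of-stream flush (a '1' bit plus the leftover run count) never yields a spurious trailing symbol.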
G. G. Langdon, Jr., "An adaptive run-length coding algorithm," IBM Technical Disclosure Bulletin, vol. 26, no. 7B, pp. 3783-3785, December 1983.
S. W. Golomb, "Run-Length Encodings," IEEE Transactions on Information Theory, vol. 12, pp. 399-401, July 1966.
Copyright (C) 1997-2021 James E. Fowler