sqlloyd designs a scalar quantizer to minimize the mean-square error (MSE) for the given probability density and number of quantizer levels. The designed quantizer is output to quantizer_file. The range of the quantizer is min_value through max_value. The number of quantization levels is num_levels.
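For illustration only (this is not the sqlloyd implementation), the Python sketch below evaluates the MSE that the design minimizes, for a given set of decision thresholds and output levels under a density; the function name, the midpoint-rule integration, and the example quantizer are assumptions made for brevity.

import math

def quantizer_mse(thresholds, levels, pdf, n_sub=1000):
    """MSE = sum over partitions of the integral of (x - level)^2 p(x) dx,
    approximated here with a simple midpoint rule inside each partition."""
    mse = 0.0
    for i, y in enumerate(levels):
        a, b = thresholds[i], thresholds[i + 1]
        dx = (b - a) / n_sub
        for k in range(n_sub):
            x = a + (k + 0.5) * dx          # midpoint of each sub-interval
            mse += (x - y) ** 2 * pdf(x) * dx
    return mse

# Example: a 4-level uniform quantizer on [-4, 4] against a unit-variance Gaussian
gauss = lambda x: math.exp(-x * x / 2.0) / math.sqrt(2.0 * math.pi)
print(quantizer_mse([-4.0, -2.0, 0.0, 2.0, 4.0], [-3.0, -1.0, 1.0, 3.0], gauss))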
If the -l option is specified, a Laplacian density is used. Otherwise, a Gaussian density is used. In either case, variance is the variance of the density to be used. The mean of the density is assumed to be (max_value+min_value)/2.
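The following Python sketch shows the two model densities, parameterized by the specified variance and with the mean placed at (max_value+min_value)/2; the function names and example values are hypothetical and not part of sqlloyd.

import math

def gaussian_pdf(x, mean, variance):
    return math.exp(-(x - mean) ** 2 / (2.0 * variance)) / math.sqrt(2.0 * math.pi * variance)

def laplacian_pdf(x, mean, variance):
    # A Laplacian density with variance sigma^2 has scale b = sigma / sqrt(2)
    b = math.sqrt(variance / 2.0)
    return math.exp(-abs(x - mean) / b) / (2.0 * b)

min_value, max_value, variance = -4.0, 4.0, 1.0
mean = (max_value + min_value) / 2.0
print(gaussian_pdf(0.5, mean, variance), laplacian_pdf(0.5, mean, variance))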
The Lloyd algorithm involves finding the first moment of each partition and the probability of each partition. Numerical integration using Simpson's Rule is used in each of these calculations. The value of integration_intervals gives the number of intervals to use in Simpson's Rule. The more intervals used, the more accurate the integration, but the longer the computation time. Between 100 and 1000 intervals is probably sufficient for most applications.
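As a sketch of these per-partition quantities, the Python fragment below computes one partition's probability and first moment with composite Simpson's Rule and forms the centroid (conditional mean) from them; the partition limits, interval count, and function names are illustrative assumptions, not sqlloyd's code.

import math

def simpson(f, a, b, n):
    """Composite Simpson's Rule over [a, b]; n must be even."""
    if n % 2:
        n += 1
    h = (b - a) / n
    total = f(a) + f(b)
    for k in range(1, n):
        total += (4 if k % 2 else 2) * f(a + k * h)
    return total * h / 3.0

gauss = lambda x: math.exp(-x * x / 2.0) / math.sqrt(2.0 * math.pi)

a, b, intervals = 0.0, 1.0, 500                            # one partition, e.g. [0, 1)
prob = simpson(gauss, a, b, intervals)                     # partition probability
moment = simpson(lambda x: x * gauss(x), a, b, intervals)  # partition first moment
centroid = moment / prob                                   # conditional mean of the partition
print(prob, moment, centroid)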
The design process stops when the average distortion of the quantizer changes by less than stop_threshold between iterations. Smaller values yield better quantizers but require longer computation time. Values in the range 0.001 to 0.000001 are probably sufficient for most applications.
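Putting the pieces together, the minimal Python sketch below shows one way the Lloyd iteration and its stopping rule can be organized: output levels are replaced by partition centroids, thresholds by midpoints between adjacent levels, and iteration stops when the distortion changes by less than stop_threshold. It assumes an absolute change-in-distortion test and a uniform initial codebook; it is a sketch of the technique, not the sqlloyd implementation.

import math

def simpson(f, a, b, n=500):
    # Composite Simpson's Rule; the default n is even as required.
    h = (b - a) / n
    return h / 3.0 * (f(a) + f(b) +
                      sum((4 if k % 2 else 2) * f(a + k * h) for k in range(1, n)))

def lloyd(pdf, min_value, max_value, num_levels, stop_threshold=1e-5):
    # Start from uniformly spaced output levels over [min_value, max_value].
    step = (max_value - min_value) / num_levels
    levels = [min_value + (i + 0.5) * step for i in range(num_levels)]
    prev_distortion = float("inf")
    while True:
        # Nearest-neighbor thresholds: midpoints between adjacent output levels.
        thresholds = ([min_value] +
                      [(levels[i] + levels[i + 1]) / 2.0 for i in range(num_levels - 1)] +
                      [max_value])
        # Centroid condition: each level becomes the conditional mean of its partition.
        distortion = 0.0
        for i in range(num_levels):
            a, b = thresholds[i], thresholds[i + 1]
            prob = simpson(pdf, a, b)
            moment = simpson(lambda x: x * pdf(x), a, b)
            if prob > 0.0:
                levels[i] = moment / prob
            distortion += simpson(lambda x, y=levels[i]: (x - y) ** 2 * pdf(x), a, b)
        # Stop when the distortion changes by less than stop_threshold (absolute change here).
        if abs(prev_distortion - distortion) < stop_threshold:
            return thresholds, levels
        prev_distortion = distortion

gauss = lambda x: math.exp(-x * x / 2.0) / math.sqrt(2.0 * math.pi)
print(lloyd(gauss, -4.0, 4.0, 4))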
A. Gersho and R. Gray, Vector Quantization and Signal Compression. Norwell, MA: Kluwer Academic Publishers, 1992, pp. 187-194.
A. K. Jain, Fundamentals of Digital Image Processing. Englewood Cliffs, NJ: Prentice Hall, 1989, pp. 101-112.