From: Jarosław Duda
Subject: bug#16471: New entropy coding: faster than Huffman, compression rate like arithmetic
Date: Thu, 16 Jan 2014 18:47:36 -0500
User-agent: Mozilla/5.0 (Windows NT 6.2; WOW64; rv:17.0) Gecko/20130801 Thunderbird/17.0.8


Since I haven't found a gzip development mailing list, I am sending this here.
There is a new approach to entropy coding: asymmetric numeral systems (ANS). It has some connection with arithmetic coding, but is simpler: it uses a single natural number as the state, instead of two numbers representing a range. Thanks to this, the entire coding behavior for a given large-alphabet probability distribution can be put into a relatively small table: a few kilobytes for a 256-symbol alphabet. This turns out to be about 50% faster than Huffman decoding, while still providing accuracy (compression rate) close to arithmetic coding.
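To illustrate the single-number-state idea, here is a minimal sketch of the unbounded-state "rANS" variant (a toy, not the table-based FSE implementation linked below; the frequency table and message are made up for the example). Encoding folds each symbol into one growing natural number; decoding peels symbols back off with modular arithmetic.

```python
# Toy rANS: the whole coder state is ONE natural number x.
# freqs maps each symbol to an integer frequency; M is their sum.

def build_tables(freqs):
    M = sum(freqs.values())
    cum, c = {}, 0
    for s in sorted(freqs):          # cumulative frequency (interval start) per symbol
        cum[s] = c
        c += freqs[s]
    return M, cum

def encode(msg, freqs):
    M, cum = build_tables(freqs)
    x = 1                            # initial state
    for s in reversed(msg):          # encode in reverse so decoding reads forward
        f = freqs[s]
        x = (x // f) * M + cum[s] + (x % f)
    return x                         # the entire message as one big integer

def decode(x, n, freqs):
    M, cum = build_tables(freqs)
    slot2sym = {}                    # invert the cumulative table: slot -> symbol
    for s, base in cum.items():
        for i in range(freqs[s]):
            slot2sym[base + i] = s
    out = []
    for _ in range(n):
        slot = x % M                 # the slot identifies the symbol
        s = slot2sym[slot]
        out.append(s)
        x = freqs[s] * (x // M) + slot - cum[s]   # exact inverse of the encode step
    return ''.join(out)

freqs = {'a': 3, 'b': 1}
msg = "aababaaa"
assert decode(encode(msg, freqs), len(msg), freqs) == msg
```

A real implementation keeps the state in a fixed-width register by renormalizing (streaming out bits when x grows too large), and the tANS/FSE variant precomputes these transitions into the small table mentioned above.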

Here is an implementation: https://github.com/Cyan4973/FiniteStateEntropy
Just by replacing Huffman with it, we get an improvement in both speed and compression rate, as recently shown in Yann Collet's Zhuff: http://fastcompression.blogspot.fr/2013/12/zhuff-v09-or-first-fse-application.html
Could it also improve gzip's DEFLATE in the same way?


-- dr Jarosław Duda
Center for Science of Information, Purdue University, USA
