Around 2001, I found an interesting algorithm that could be useful for several applications. I called it "Recursive Range Reduction" ("3R" for short) and it is described in some depth here (EN). It is an encoder, not a transform, which means that it requires correctly pre-processed input data to be efficient (so, like most encoders, 3R is expected to expand "random" data).
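To give a feel for the idea, here is a minimal, hedged sketch of the sum-tree principle usually associated with 3R: the encoder sends the total sum, then recursively sends the sum of the left half of each range; the right half's sum is implied, and each transmitted sum fits in a number of bits bounded by its parent's sum, so ranges shrink as the values do. This is only an illustration under my assumptions (non-negative integers, halving splits), not the exact implementation on this page, and it ignores the phasing-in/phasing-out refinements discussed below.

```python
from math import ceil, log2

def bits_for(maximum):
    # Number of bits needed to represent any value in [0, maximum].
    if maximum <= 0:
        return 0
    return max(1, ceil(log2(maximum + 1)))

def rrr_encode(values, total, out):
    # Emit the sum of the left half of each range; the right half's sum
    # is implied (parent sum minus left sum), so only one child is sent.
    if len(values) == 1:
        return  # a single value equals the known sum: nothing to send
    mid = len(values) // 2
    left_sum = sum(values[:mid])
    out.append((left_sum, bits_for(total)))  # (value, bit budget)
    rrr_encode(values[:mid], left_sum, out)
    rrr_encode(values[mid:], total - left_sum, out)

def rrr_decode(n, total, sums):
    # Rebuild n values whose sum is 'total' from the stream of left sums.
    if n == 1:
        return [total]
    mid = n // 2
    left_sum = next(sums)
    return (rrr_decode(mid, left_sum, sums)
            + rrr_decode(n - mid, total - left_sum, sums))
```

Note how the bit budget of each transmitted sum is derived from the parent sum: a range full of small values (or zeros) collapses to very few bits, which is where the compression comes from.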
You can find a dedicated page on Hackaday, and the algorithm gets a few pages in the 4th edition of David Salomon's book "Data Compression: The Complete Reference", described at http://www.davidsalomon.name/DC4advertis/DComp4Ad.html (Google Books).
Sometimes 3Rpi is shorter than 3Rpo, so this page implements a combined version called "3Rpio": the results of 3Rpi and 3Rpo are compared, the shorter one is kept, and one bit is added to signal to the decoder which encoder was chosen. You can keep hitting the "random" button tens or hundreds of times: the difference is very small, but 3Rpo wins anyway! Of course, "real" data are not random, so check with your own.
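The selection step itself is trivial; a sketch, with hypothetical names (the bit streams are represented as lists of 0/1, and the selector convention 0 = phasing-in, 1 = phasing-out is my own assumption):

```python
def combine_3rpio(bits_pi, bits_po):
    # Keep the shorter of the two candidate streams and prepend one
    # selector bit so the decoder knows which variant was used:
    # 0 = phasing-in (3Rpi), 1 = phasing-out (3Rpo).
    if len(bits_po) <= len(bits_pi):
        return [1] + bits_po
    return [0] + bits_pi
```

The one-bit overhead guarantees the combined stream is never more than one bit longer than the better of the two encoders.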
In July 2010, I finally implemented yet another variation of the compressor: I noticed that in some specific cases, the algorithm generates long runs of consecutive 1s or 0s. The new "-x" variations of the algorithms simply XOR consecutive bits of the output stream (this is computationally insignificant and easily reversed by the decoder), and we count the number of 1s. In some cases (many "spikes" and 0s), there is enough bias for an arithmetic or range coder to exploit. But again, your mileage will vary with your data, and I have yet to find an arithmetic encoder that does not expand the output when given equiprobable (random) bits.
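The XOR-of-consecutive-bits trick can be sketched as follows (bit streams as lists of 0/1; the function names are mine): a run of identical bits becomes a run of zeros with a single 1 at each run boundary, and the decoder recovers the original with a prefix XOR.

```python
def xor_adjacent(bits):
    # b'[0] = b[0]; b'[i] = b[i] ^ b[i-1].
    # Runs of identical bits turn into runs of zeros, so the 1s only
    # mark run boundaries -- the bias an entropy coder can exploit.
    return [bits[0]] + [bits[i] ^ bits[i - 1] for i in range(1, len(bits))]

def undo_xor(xbits):
    # Prefix XOR: each recovered bit depends on the previous recovered bit.
    out = [xbits[0]]
    for b in xbits[1:]:
        out.append(out[-1] ^ b)
    return out
```

For example, the run-heavy stream 11110001 becomes 10001001 after the XOR: fewer 1s, hence a skewed 0/1 ratio for an arithmetic or range coder downstream.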
Todo: design an efficient offset-removal algorithm (low-order linear approximation?)
Output data (normal):
Output with phasing-in:
Output with phasing-out:
Legend: Size prefix, Total sum without MSB, Partial sum, Some of the terminal values
Accumulate bias (1 vs 0) over runs: