I just compressed the ASSD 1 GB test file down to 16.3 MB, which is a factor of 0.016, i.e. 1.6%! While most compression methods fail on this test file, LZMA obviously does a pretty good job. ASSD needs a dictionary size of exactly 16 MB (= 192 MB RAM needed for compression): anything bigger won't compress further than the 16 MB result, and anything smaller won't compress at all. So ASSD's "random" data fits exactly into those 16 MB.

Furthermore, I compressed a CDM 2 GB test file down to 1.3 MB, which is a factor of 0.0006, i.e. 0.06%! CDM's test file is easier for other compression algorithms, too: while WinRAR fails to get anything out of the ASSD file, it has no problem whatsoever with the CDM one (less than 2.9 MB for the 2 GB test file).

LZMA is public domain, btw, but it's also somewhat "demanding", i.e. slower than others, so it may or may not be used by SandForce. But even if not, how do we know that SandForce (now or in future revisions) really doesn't manage to compress the ASSD and CDM test files during benchmarks?
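The dictionary-size effect described above is easy to reproduce yourself. Here's a minimal Python sketch (using the standard-library `lzma` module, not whatever SandForce actually uses) with synthetic stand-in data: a pseudo-random block repeated several times, which is incompressible unless the dictionary is large enough to span one full period — analogous to how the ASSD data only yields once the dictionary reaches 16 MB. Block sizes here are scaled down for speed; they are illustrative, not the actual ASSD/CDM parameters.

```python
import lzma
import os

def ratio(data: bytes, dict_size: int) -> float:
    """Compress with LZMA2 at a given dictionary size; return output/input size ratio."""
    filters = [{"id": lzma.FILTER_LZMA2, "dict_size": dict_size}]
    out = lzma.compress(data, format=lzma.FORMAT_XZ, filters=filters)
    return len(out) / len(data)

# Pseudo-random 1 MiB block repeated 4 times: looks random locally,
# but is highly redundant once the dictionary spans a full period.
block = os.urandom(1 << 20)
data = block * 4

small = ratio(data, 1 << 16)  # 64 KiB dictionary: too small to "see" the repeats
large = ratio(data, 1 << 22)  # 4 MiB dictionary: the repeated blocks become matches

print(f"64 KiB dict: {small:.3f}")  # near 1.0 -> essentially no compression
print(f"4 MiB dict:  {large:.3f}")  # roughly 0.25 -> only the first copy costs anything
```

With the small dictionary the compressor never finds a match, so the output is about as big as the input; with a dictionary at least one period wide, copies two through four collapse to back-references, which mirrors the sharp 16 MB threshold observed with the ASSD file.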