I wanted to see how long it takes to compress large data in memory (I thought it would be efficient). So I tried ICSharpCode's SharpZipLib and compared it against the standard GZipStream/DeflateStream classes. I used the TimeAndExecute() method I posted earlier and an 80 MB buffer of random data.
My observations are:
1. There is no CompressionLevel setting to be used with GZipStream/DeflateStream.
2. The CompressionLevel setting in SharpZipLib has a real impact: BEST_SPEED was about 2 seconds faster than the two standard BCL classes.
3. DeflateStream is faster than GZipStream.
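To make the comparison concrete, here is a minimal sketch of how the two BCL streams can be timed against each other. It stands in for the TimeAndExecute() helper mentioned above (which I don't reproduce here); the buffer size and the use of Stopwatch are my assumptions, and actual timings will vary by machine.

```csharp
using System;
using System.Diagnostics;
using System.IO;
using System.IO.Compression;

class BclCompressCompare
{
    // Time how long it takes to push the whole buffer through a compressor.
    static long Time(Func<Stream, Stream> wrap, byte[] data)
    {
        var sw = Stopwatch.StartNew();
        using (var output = new MemoryStream())
        using (var compressor = wrap(output))
        {
            compressor.Write(data, 0, data.Length);
        }
        return sw.ElapsedMilliseconds;
    }

    static void Main()
    {
        // 80 MB of random data, as in the test above.
        // Note: random bytes are essentially incompressible.
        var data = new byte[80 * 1024 * 1024];
        new Random(42).NextBytes(data);

        Console.WriteLine($"GZipStream:    {Time(s => new GZipStream(s, CompressionMode.Compress), data)} ms");
        Console.WriteLine($"DeflateStream: {Time(s => new DeflateStream(s, CompressionMode.Compress), data)} ms");
    }
}
```

DeflateStream tends to come out slightly ahead because GZipStream wraps the same deflate output in a gzip header, trailer, and CRC.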
Using SharpZipLib with Deflater.BEST_SPEED
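The SharpZipLib side looks like the sketch below. The BEST_SPEED constant lives on the Deflater class, which is passed to a DeflaterOutputStream; the seed and buffer size are my assumptions, not the exact code from the test run.

```csharp
using System;
using System.Diagnostics;
using System.IO;
using ICSharpCode.SharpZipLib.Zip.Compression;
using ICSharpCode.SharpZipLib.Zip.Compression.Streams;

class SharpZipSpeedDemo
{
    static void Main()
    {
        // 80 MB of random data, as in the BCL test.
        var data = new byte[80 * 1024 * 1024];
        new Random(42).NextBytes(data);

        // Deflater.BEST_SPEED trades compression ratio for throughput.
        var sw = Stopwatch.StartNew();
        using (var output = new MemoryStream())
        using (var deflate = new DeflaterOutputStream(output, new Deflater(Deflater.BEST_SPEED)))
        {
            deflate.Write(data, 0, data.Length);
            deflate.Finish();
            sw.Stop();
            Console.WriteLine($"SharpZipLib BEST_SPEED: {sw.ElapsedMilliseconds} ms, {output.Length} bytes out");
        }
    }
}
```

Swapping in Deflater.BEST_COMPRESSION shows the other end of the speed/ratio trade-off that the BCL classes don't let you choose.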
So clearly, it is not a good idea to keep frequently accessed in-memory data compressed: you pay to compress and again to decompress. I did not bother measuring decompression, though! Extensions later.