The term data compression refers to reducing the number of bits of information that has to be stored or transmitted. This can be done with or without data loss: what is removed during compression is either redundant data (lossless compression) or less important data (lossy compression). When the data is uncompressed afterwards, in the first case it is identical to the original, whereas in the second case its quality is lower. There are various compression algorithms, each more efficient for a particular type of data. Compressing and uncompressing data often takes considerable processing time, which means that the server performing the operation needs sufficient resources to process the data quickly enough. One simple example of how information can be compressed is run-length encoding: instead of storing the actual 1s and 0s of binary data, you store how many consecutive positions hold a 1 and how many hold a 0.
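The run-length idea described above can be sketched in a few lines of Python. This is a minimal illustration of the concept, not an implementation of any real-world compressor; the function names are made up for the example:

```python
def rle_encode(bits: str) -> list[tuple[str, int]]:
    # Store each bit value once, together with how many times it repeats.
    runs = []
    for bit in bits:
        if runs and runs[-1][0] == bit:
            runs[-1] = (bit, runs[-1][1] + 1)
        else:
            runs.append((bit, 1))
    return runs

def rle_decode(runs: list[tuple[str, int]]) -> str:
    # Expand each (bit, count) pair back into the original sequence.
    return "".join(bit * count for bit, count in runs)

data = "111110000011"
encoded = rle_encode(data)
print(encoded)  # [('1', 5), ('0', 5), ('1', 2)]
assert rle_decode(encoded) == data  # lossless round-trip
```

For data with long runs of identical bits, storing three (bit, count) pairs clearly takes less space than twelve individual bits; for data with no repetition, run-length encoding can actually make the output larger, which is why different algorithms suit different types of data.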

Data Compression in Cloud Website Hosting

The ZFS file system that runs on our cloud Internet hosting platform uses a compression algorithm named LZ4. It is considerably faster than comparable algorithms, particularly at compressing and uncompressing non-binary data such as web content. LZ4 even uncompresses data faster than it can be read from a hard disk drive, which improves the performance of Internet sites hosted on ZFS-based platforms. Because the algorithm compresses data well and does so quickly, we can generate several daily backups of all the content kept in the cloud website hosting accounts on our servers. Both your content and its backups take up less space, and since both ZFS and LZ4 work very fast, generating the backups will not affect the performance of the web servers where your content is kept.
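The lossless round-trip that makes this possible can be illustrated with Python's built-in zlib module. Python's standard library does not ship LZ4, so zlib stands in here; the principle is the same for any lossless algorithm: the stored form is smaller, and decompression restores the data byte for byte:

```python
import zlib

# Repetitive web content, such as HTML markup, compresses very well.
page = b"<div class='item'></div>" * 200

compressed = zlib.compress(page)
restored = zlib.decompress(compressed)

# Lossless: the restored data is identical to the original.
assert restored == page
print(f"{len(page)} bytes -> {len(compressed)} bytes")
```

The compressed size depends on the data and the algorithm, but for markup-like content the savings are substantial, which is why both live content and its backups need less space on a compressed file system.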