The term data compression refers to reducing the number of bits needed to store or transmit data. This can be done with or without loss of information: what gets removed during compression is either redundant data or data deemed unnecessary. In the first case, when the data is uncompressed later, the content and quality are identical to the original; in the second case, the quality is lower. Different compression algorithms suit different kinds of information. Compressing and uncompressing data ordinarily takes considerable processing time, so the server performing the operation needs sufficient resources to handle your information quickly enough. One simple example of how information can be compressed is to store how many consecutive positions in the binary code should hold 1s and how many should hold 0s, instead of storing the actual 1s and 0s.

Data Compression in Cloud Web Hosting

The ZFS file system that runs on our cloud web hosting platform employs a compression algorithm called LZ4. It is considerably faster than the other algorithms on the market, particularly at compressing and uncompressing non-binary data such as web content. LZ4 can even uncompress data faster than it can be read from a hard drive, which improves the performance of websites hosted on ZFS-based platforms. Because the algorithm compresses data well and does so quickly, we are able to generate several daily backups of all the content kept in the cloud web hosting accounts on our servers. Both your content and its backups take up less space, and since both ZFS and LZ4 work very fast, generating the backups does not affect the performance of the servers where your content is stored.
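For readers who administer a ZFS system themselves, LZ4 compression is enabled per dataset with the standard `zfs` command. A minimal sketch, assuming a dataset named `tank/web` (a placeholder name, not one from our platform):

```shell
# Enable LZ4 compression on a ZFS dataset (tank/web is a placeholder name).
zfs set compression=lz4 tank/web

# Verify the setting and inspect the achieved compression ratio.
zfs get compression tank/web
zfs get compressratio tank/web
```

ZFS compresses data transparently as it is written, so applications and websites need no changes; note that only data written after the property is set is compressed, while existing files keep their on-disk form until they are rewritten.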