Data compression is the process of encoding information using fewer bits than the original representation, so that it takes up less space when stored or transmitted. Because of this, compressed data occupies substantially less disk space than the original, which means more content can be stored in the same amount of space. Compression algorithms work in different ways: lossless algorithms remove only redundant bits, so there is no loss of quality once the data is uncompressed, while lossy algorithms discard bits deemed unnecessary, so uncompressing the data afterwards results in reduced quality compared to the original. Compressing and uncompressing content requires a considerable amount of system resources, in particular CPU processing time, so any web hosting platform that employs compression in real time must have sufficient power to support this feature. A simple example of how information can be compressed is to replace a binary sequence such as 111111 with 6x1, i.e. "remembering" how many consecutive 1s or 0s there are instead of storing the whole sequence; this technique is known as run-length encoding.
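
To make the idea concrete, here is a minimal sketch of run-length encoding written in Python. It is illustrative only and not the exact scheme any particular compressor uses:

# Run-length encoding sketch: store each run of repeated characters
# as a (count, character) pair instead of writing out the whole run.
def rle_encode(data: str) -> list[tuple[int, str]]:
    runs = []
    for char in data:
        if runs and runs[-1][1] == char:
            runs[-1] = (runs[-1][0] + 1, char)   # extend the current run
        else:
            runs.append((1, char))               # start a new run
    return runs

def rle_decode(runs: list[tuple[int, str]]) -> str:
    # Expand each (count, char) pair back into the original sequence.
    return "".join(char * count for count, char in runs)

bits = "111111000011"
encoded = rle_encode(bits)
print(encoded)                       # [(6, '1'), (4, '0'), (2, '1')]
print(rle_decode(encoded) == bits)   # True - a lossless round trip

Since the decoded output is identical to the input, this is an example of lossless compression: only the redundancy is removed, never the information itself.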

Data Compression in Cloud Hosting

The ZFS file system that runs on our cloud Internet hosting platform employs a compression algorithm called LZ4. It is considerably faster than most alternative algorithms, particularly when compressing and decompressing text-based data such as web content. LZ4 can even decompress data faster than it can be read from a hard disk drive, which improves the overall performance of websites hosted on ZFS-based platforms. Because the algorithm compresses data effectively and does so very quickly, we are able to generate several backups of all the content kept in the cloud hosting accounts on our servers every day. Both your content and its backups take up less space, and since both ZFS and LZ4 work extremely fast, the backup generation does not affect the performance of the servers where your content is stored.
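
For those curious how such compression looks in practice, below is a minimal sketch in Python using the third-party python-lz4 package (installable with "pip install lz4"); the package choice is an assumption made for illustration, since ZFS applies LZ4 transparently at the file system level, for instance when an administrator enables it with "zfs set compression=lz4 <dataset>" (the dataset name being a placeholder):

# Illustrative LZ4 round trip using the python-lz4 package (assumed
# installed); ZFS performs this kind of compression transparently.
import lz4.frame

# Repetitive, text-based content of the kind LZ4 compresses well.
original = b"<html><body>" + b"Hello, web content! " * 500 + b"</body></html>"

compressed = lz4.frame.compress(original)
restored = lz4.frame.decompress(compressed)

print(f"original:   {len(original)} bytes")
print(f"compressed: {len(compressed)} bytes")
print(f"lossless round trip: {restored == original}")   # True

The more repetitive the content, the better the compression ratio, which is why text-based web content in particular benefits from this algorithm.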