Data compression is the process of encoding information using fewer bits, so that it takes up less space when stored or transmitted and more content fits in the same amount of disk space. There are various compression algorithms that work in different ways. With some of them, known as lossless algorithms, only redundant bits are removed, so when the data is uncompressed it is identical to the original. Others, known as lossy algorithms, discard bits deemed unnecessary, so uncompressing the data later yields lower quality than the original. Compressing and uncompressing content consumes significant system resources, in particular CPU time, so any web hosting platform that compresses data in real time must have enough processing power to support the feature. A simple example of how information can be compressed is to substitute a binary sequence such as 111111 with 6x1, i.e. "remembering" how many consecutive 1s or 0s there are instead of storing the entire sequence.
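The "6x1" idea above is known as run-length encoding. A minimal sketch in Python (the function names here are illustrative, not from any particular library) shows how a string of bits can be stored as (count, symbol) pairs and losslessly recovered:

```python
def rle_encode(bits: str) -> list[tuple[int, str]]:
    """Run-length encode a string of '0'/'1' characters into (count, symbol) pairs."""
    runs = []
    i = 0
    while i < len(bits):
        j = i
        # advance j to the end of the current run of identical characters
        while j < len(bits) and bits[j] == bits[i]:
            j += 1
        runs.append((j - i, bits[i]))
        i = j
    return runs

def rle_decode(runs: list[tuple[int, str]]) -> str:
    """Reverse the encoding: expand each (count, symbol) pair."""
    return "".join(symbol * count for count, symbol in runs)

encoded = rle_encode("111111")
print(encoded)  # [(6, '1')] — the "6x1" from the text
assert rle_decode(encoded) == "111111"  # lossless: original recovered exactly
```

Because decoding reproduces the input bit for bit, this is a lossless scheme; it only saves space when the data contains long runs of repeated symbols.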
Data Compression in Shared Hosting
The compression algorithm used by the ZFS file system, which runs on our cloud web hosting platform, is called LZ4. It can improve the performance of any website hosted in a shared hosting account on our end, because it not only compresses data more efficiently than the algorithms used by other file systems, but also uncompresses data faster than a hard disk can read it. This comes at the cost of considerable CPU time, which is not a problem for our platform because it uses clusters of powerful servers working together. A further advantage of LZ4 is that it lets us create backups much faster and on less disk space, so we can keep several daily backups of your files and databases without their generation affecting server performance. As a result, we can always restore content that you may have deleted by accident.
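LZ4, like the run-length example earlier, is a lossless codec: decompression always reproduces the original data exactly, which is what makes it safe for file systems and backups. LZ4 itself requires a third-party Python binding, so as a sketch the same round-trip guarantee can be shown with Python's built-in zlib module (a different lossless codec, used here only as an illustration):

```python
import zlib

# Highly repetitive data, the kind that compresses well under any lossless codec
original = b"AAAA" * 1000

compressed = zlib.compress(original)
restored = zlib.decompress(compressed)

# Lossless: the restored data is bit-for-bit identical to the original,
# while the compressed form occupies far fewer bytes on disk.
assert restored == original
print(len(original), len(compressed))
```

The same round trip holds for LZ4; the difference the section describes is one of speed, since LZ4 trades some compression ratio for very fast decompression.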