Data compression is the process of reducing the number of bits needed to store or transmit data. Compressed data takes up considerably less disk space than the original, so more content fits in the same amount of storage. Compression algorithms work in different ways: some remove only redundant bits, so there is no loss of quality when the information is uncompressed (lossless compression), while others discard bits deemed unnecessary, and uncompressing such data yields lower quality than the original (lossy compression). Compressing and uncompressing content requires a significant amount of system resources, in particular CPU time, so any web hosting platform that compresses data in real time must have adequate processing power to support the feature. A simple example of how information can be compressed is to substitute a binary sequence such as 111111 with 6x1, i.e. to "remember" the number of consecutive 1s or 0s instead of storing the whole sequence.
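The 111111 → 6x1 substitution described above is known as run-length encoding. A minimal sketch in Python (the function names and the "6x1" text format are illustrative choices, not part of any particular product):

```python
def rle_encode(bits: str) -> str:
    """Replace each run of repeated characters with "<count>x<char>"."""
    if not bits:
        return ""
    runs = []
    count = 1
    for prev, cur in zip(bits, bits[1:]):
        if cur == prev:
            count += 1
        else:
            runs.append(f"{count}x{prev}")
            count = 1
    runs.append(f"{count}x{bits[-1]}")  # flush the final run
    return " ".join(runs)

def rle_decode(encoded: str) -> str:
    """Expand "<count>x<char>" runs back into the original string."""
    return "".join(
        ch * int(n) for n, ch in (run.split("x") for run in encoded.split())
    )

print(rle_encode("111111"))  # → 6x1
```

Because only redundant repetition is removed, decoding reproduces the input exactly, which is what makes this a lossless scheme.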
Data Compression in Shared Web Hosting
The ZFS file system that runs on our cloud web hosting platform uses a compression algorithm called LZ4. It is considerably faster than comparable algorithms, particularly at compressing and uncompressing non-binary data, i.e. web content. LZ4 can even uncompress data faster than it can be read from a hard disk drive, which improves the overall performance of sites hosted on ZFS-based platforms. Because the algorithm compresses data well and does so very quickly, we are able to generate several daily backups of all the content stored in the shared web hosting accounts on our servers. Both your content and its backups require less space, and since both ZFS and LZ4 work very fast, backup generation does not affect the performance of the servers where your content is stored.
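The key property described above, that compressed web content shrinks substantially yet decompresses back to a byte-identical original, can be demonstrated with a short sketch. LZ4 is not part of Python's standard library, so this example uses zlib (DEFLATE) purely as a stand-in to illustrate the lossless round trip; it says nothing about LZ4's actual speed or ratio:

```python
import zlib

# Repetitive markup is typical of web content and compresses well.
html = b"<p>Hello, world!</p>" * 500

compressed = zlib.compress(html)       # stand-in for a lossless compressor
restored = zlib.decompress(compressed)

assert restored == html                # byte-identical: no loss of quality
print(f"{len(html)} bytes -> {len(compressed)} bytes")
```

The same guarantee holds for any lossless algorithm, including LZ4: whatever the compression ratio, the uncompressed data is always identical to the original.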