The term data compression refers to reducing the number of bits of information that has to be stored or transmitted. This can be done with or without losing data, so what is removed during the compression is either redundant data or unneeded data. When the data is uncompressed afterwards, in the first case the information and its quality will be identical to the original, while in the second case the quality will be lower. There are various compression algorithms that work better for different kinds of data. Compressing and uncompressing data often takes a lot of processing time, so the server performing the operation should have sufficient resources to process your information quickly enough. A simple example of how information can be compressed is to store how many consecutive positions in the binary code should have a 1 and how many should have a 0, instead of storing the actual 1s and 0s.
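The counting technique described above is known as run-length encoding. The sketch below is a minimal, purely illustrative Python version of it; the names rle_encode and rle_decode are hypothetical and not part of any hosting platform or library.

```python
# Minimal run-length encoding sketch (illustrative only).

def rle_encode(bits: str) -> list[tuple[str, int]]:
    """Collapse runs of identical bits into (bit, run_length) pairs."""
    runs = []
    for bit in bits:
        if runs and runs[-1][0] == bit:
            runs[-1] = (bit, runs[-1][1] + 1)  # extend the current run
        else:
            runs.append((bit, 1))              # start a new run
    return runs

def rle_decode(runs: list[tuple[str, int]]) -> str:
    """Expand (bit, run_length) pairs back into the original bit string."""
    return "".join(bit * count for bit, count in runs)

original = "1111100000000110"
encoded = rle_encode(original)          # [('1', 5), ('0', 8), ('1', 2), ('0', 1)]
assert rle_decode(encoded) == original  # lossless: decoding restores the data
```

Because only the counts are kept, long runs of identical bits take far less space, and decoding reproduces the data exactly, which is what makes this a lossless method.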
Data Compression in Web Hosting
The compression algorithm that we employ on the cloud web hosting platform where your new web hosting account will be created is called LZ4, and it is used by the state-of-the-art ZFS file system that powers the platform. The algorithm is superior to the ones other file systems use, as its compression ratio is higher and it processes data much faster. The speed is most noticeable when content is being uncompressed, since this happens even faster than the data can be read from a hard disk. As a result, LZ4 improves the performance of any website hosted on a server that uses the algorithm. We take advantage of LZ4 in one more way: its speed and compression ratio allow us to generate several daily backups of the entire content of all accounts and keep them for one month. Not only do these backups take up less space, but their generation does not slow the servers down, as often happens with other file systems.
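As a rough illustration of how LZ4 behaves, here is a small sketch using the third-party python-lz4 package (an assumption for demonstration only; on the platform itself ZFS applies LZ4 transparently at the file-system level, so no application code is required).

```python
# Illustrative sketch using the third-party "lz4" Python package (pip install lz4).
# On the hosting platform, ZFS compresses and decompresses data transparently;
# this snippet only demonstrates the lossless, high-speed nature of LZ4.
import lz4.frame

data = b"A" * 100_000  # highly repetitive data compresses extremely well

compressed = lz4.frame.compress(data)
restored = lz4.frame.decompress(compressed)

assert restored == data  # LZ4 is lossless: decompression restores the data exactly
print(f"original: {len(data)} bytes, compressed: {len(compressed)} bytes")
```

The same lossless round trip happens behind the scenes every time a file is written to or read from the platform, which is why neither regular site operation nor the daily backups are affected by the compression.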