From: Yann Collet
Date: Mon, 7 Nov 2016 20:25:28 +0000 (-0800)
Subject: Fix #444 images addresses
X-Git-Url: http://git.ipfire.org/cgi-bin/gitweb.cgi?a=commitdiff_plain;h=dba6cd84a1b29f63841ee22a03b0ad82816b8975;p=thirdparty%2Fzstd.git

Fix #444 images addresses
---

diff --git a/index.html b/index.html
index 0bab43929..847271167 100644
--- a/index.html
+++ b/index.html
@@ -118,10 +118,10 @@ compiled with gcc 5.2.1, on the [Silesia compression corpus].

 | Compression Speed vs Ratio | Decompression Speed |
 | ---------------------------|-------------------- |
-| Compression Speed vs Ratio | Decompression Speed
+| Compression Speed vs Ratio | Decompression Speed

 Several algorithms can produce higher compression ratio but at slower speed, falling outside of the graph.
-For a larger picture including very slow modes, [click on this link](https://raw.githubusercontent.com/facebook/zstd/master/images/DCspeed5.png) .
+For a larger picture including very slow modes, [click on this link](https://raw.githubusercontent.com/facebook/zstd/master/doc/images/DCspeed5.png) .

### The case for Small Data compression

@@ -132,7 +132,7 @@ This problem is common to any compression algorithm. The reason is, compression

 To solve this situation, Zstd offers a __training mode__, which can be used to tune the algorithm for a selected type of data, by providing it with a few samples. The result of the training is stored in a file called "dictionary", which can be loaded before compression and decompression. Using this dictionary, the compression ratio achievable on small data improves dramatically :

-Compressing Small Data
+Compressing Small Data

 These compression gains are achieved while simultaneously providing faster compression and decompression speeds.
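
The training workflow described in the diff context above can be sketched with the `zstd` command line; the sample directory and dictionary file names here are hypothetical placeholders, not part of the commit:

```shell
# Minimal sketch of zstd's dictionary workflow, assuming a directory
# "samples/" of small files of the same type (names are hypothetical).

# 1) Training mode: build a dictionary from the sample set.
zstd --train samples/* -o my.dict

# 2) Load the dictionary at compression time with -D.
zstd -D my.dict small_file

# 3) Load the same dictionary at decompression time.
zstd -D my.dict --decompress small_file.zst
```

The same dictionary file must be available on both sides; compressed frames produced with `-D` cannot be decoded without it.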