From: 94b1 <94b1@users.noreply.github.com>
Date: Wed, 31 Aug 2016 17:50:21 +0000 (-0600)
Subject: Changed URLs
X-Git-Url: http://git.ipfire.org/cgi-bin/gitweb.cgi?a=commitdiff_plain;h=refs%2Fpull%2F313%2Fhead;p=thirdparty%2Fzstd.git
Changed URLs
Changed URLs to reflect the change in repository ownership from Cyan4973 to facebook, and to reduce zstd.net page load times caused by multiple redirects
---
diff --git a/index.html b/index.html
index 2da5a3dee..84833e6bd 100644
--- a/index.html
+++ b/index.html
@@ -43,7 +43,7 @@
@@ -118,10 +118,10 @@ compiled with gcc 5.2.1, on the [Silesia compression corpus].
| Compression Speed vs Ratio | Decompression Speed |
| ---------------------------|-------------------- |
-| [image: Compression Speed vs Ratio, raw.githubusercontent.com/Cyan4973/zstd URL] | [image: Decompression Speed, raw.githubusercontent.com/Cyan4973/zstd URL] |
+| [image: Compression Speed vs Ratio, raw.githubusercontent.com/facebook/zstd URL] | [image: Decompression Speed, raw.githubusercontent.com/facebook/zstd URL] |
Several algorithms can produce higher compression ratios, but at slower speeds that fall outside of the graph.
-For a larger picture including very slow modes, [click on this link](https://raw.githubusercontent.com/Cyan4973/zstd/master/images/DCspeed5.png) .
+For a larger picture including very slow modes, [click on this link](https://raw.githubusercontent.com/facebook/zstd/master/images/DCspeed5.png) .
### The case for Small Data compression
@@ -132,7 +132,7 @@ This problem is common to any compression algorithm. The reason is, compression
To address this, Zstd offers a __training mode__, which can be used to tune the algorithm for a selected type of data by providing it with a few samples. The result of the training is stored in a file called a "dictionary", which can be loaded before compression and decompression. Using this dictionary, the compression ratio achievable on small data improves dramatically:
-[image: small data compression gains with dictionary, raw.githubusercontent.com/Cyan4973/zstd URL]
+[image: small data compression gains with dictionary, raw.githubusercontent.com/facebook/zstd URL]
These compression gains are achieved while simultaneously providing faster compression and decompression speeds.
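
To make the training-mode workflow above concrete, here is a minimal sketch using zstd's public C API (`ZDICT_trainFromBuffer` from `zdict.h`, and `ZSTD_compress_usingDict` / `ZSTD_decompress_usingDict` from `zstd.h`). The synthetic JSON-like records, dictionary capacity, and compression level are illustrative assumptions, not values taken from this page; real training should use many genuine samples (or the `zstd --train` command line).

```c
/* A minimal sketch of zstd's dictionary workflow (train, then compress/decompress
 * small records with the dictionary). The synthetic records, dictionary capacity,
 * and compression level below are placeholders for illustration only. */
#include <stdio.h>
#include <stdlib.h>
#include <string.h>
#include <zstd.h>   /* ZSTD_compress_usingDict, ZSTD_decompress_usingDict */
#include <zdict.h>  /* ZDICT_trainFromBuffer */

#define NB_SAMPLES 1000

int main(void)
{
    /* 1) Build a training set of many small, similar records.
     *    In real use these would be actual samples of the target data. */
    size_t sampleSizes[NB_SAMPLES];
    size_t totalSize = 0;
    char* samplesBuffer = malloc(NB_SAMPLES * 64);
    if (samplesBuffer == NULL) return 1;
    for (int i = 0; i < NB_SAMPLES; i++) {
        char rec[64];
        int len = snprintf(rec, sizeof(rec),
                           "{\"user\":\"user%04d\",\"score\":%d,\"level\":%d}",
                           i, (i * 7) % 100, i % 10);
        memcpy(samplesBuffer + totalSize, rec, (size_t)len);
        sampleSizes[i] = (size_t)len;
        totalSize += (size_t)len;
    }

    /* 2) Train a dictionary from the concatenated samples. */
    char dict[2048];
    size_t const dictSize = ZDICT_trainFromBuffer(dict, sizeof(dict),
                                                  samplesBuffer, sampleSizes, NB_SAMPLES);
    if (ZDICT_isError(dictSize)) {
        fprintf(stderr, "training failed: %s\n", ZDICT_getErrorName(dictSize));
        return 1;
    }

    /* 3) Compress one small record using the dictionary ... */
    const char record[] = "{\"user\":\"newuser\",\"score\":42,\"level\":5}";
    size_t const cBound = ZSTD_compressBound(sizeof(record));
    void* const cBuf = malloc(cBound);
    ZSTD_CCtx* const cctx = ZSTD_createCCtx();
    size_t const cSize = ZSTD_compress_usingDict(cctx, cBuf, cBound,
                                                 record, sizeof(record),
                                                 dict, dictSize, 3 /* level */);
    if (ZSTD_isError(cSize)) { fprintf(stderr, "compression failed\n"); return 1; }

    /* 4) ... and decompress it with the same dictionary. */
    char rBuf[128];
    ZSTD_DCtx* const dctx = ZSTD_createDCtx();
    size_t const rSize = ZSTD_decompress_usingDict(dctx, rBuf, sizeof(rBuf),
                                                   cBuf, cSize, dict, dictSize);
    if (ZSTD_isError(rSize)) { fprintf(stderr, "decompression failed\n"); return 1; }

    printf("dictionary: %zu bytes, record: %zu -> %zu bytes\n",
           dictSize, sizeof(record), cSize);

    ZSTD_freeCCtx(cctx); ZSTD_freeDCtx(dctx);
    free(cBuf); free(samplesBuffer);
    return 0;
}
```

Built against libzstd (e.g. `cc example.c -lzstd`), this prints the trained dictionary size and the compressed size of one small record; without a dictionary, records this small typically compress poorly, which is the gap the training mode is meant to close.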