If you are serving static files, but not compressing them, you are being
irresponsible with your bandwidth. If you are serving static files, but
compressing them on the fly, you are being irresponsible with your CPU. So, how
do you deliver statically compressed files using nginx?

(If you don't use nginx, you probably should... you might know better, of
course.)
It's really simple and consists of just 2 steps:

- Compress files
- Configure nginx
1. Compress files
You could use gzip, if your version has a flag not to delete the original file
(newer GNU gzip has `-k`/`--keep`). Mine didn't, so I whipped up a quick Python
script to do it for me.
```python
#!/usr/bin/python3
# Usage: gzip.py <file>
import gzip
import shutil
import sys

def gzip_file(path):
    # Write <path>.gz alongside the original, leaving the original in place.
    with open(path, 'rb') as f_in:
        with gzip.open('{}.gz'.format(path), 'wb') as f_out:
            shutil.copyfileobj(f_in, f_out)

if __name__ == "__main__":
    gzip_file(sys.argv[1])
```
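If your gzip does keep originals, a one-liner over the whole site works too. A sketch, assuming GNU gzip 1.6 or later for the `-k`/`--keep` flag and a hypothetical `public/` output directory:

```shell
# Demo setup: a stand-in site directory with one stylesheet (hypothetical path).
mkdir -p public
echo 'body { margin: 0; }' > public/style.css

# Pre-compress every text asset, leaving the originals next to the .gz files.
# -k keeps the input (GNU gzip >= 1.6), -f overwrites stale .gz files,
# -9 picks maximum compression.
find public -type f \( -name '*.html' -o -name '*.css' -o -name '*.js' \) \
    -exec gzip -kf9 {} \;
```

Run from a post-build step, this leaves each asset and its `.gz` twin side by side, which is exactly the layout nginx will expect below.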
2. Configure nginx
Configuring nginx is really easy, but ngx_http_gzip_static_module must be
enabled at compile time; this can be done with the
--with-http_gzip_static_module option. Just add the following lines to your
http/server/location block in nginx.conf (or an included file).
```nginx
gzip_static on;
gzip_http_version 1.1;
gzip_proxied expired no-cache no-store private auth;
gzip_disable "MSIE [1-6]\.";
gzip_vary on;
```
You can find more details on the Nginx Wiki. Once the configuration has
been reloaded, nginx will automatically serve the compressed files, as long as:

- You have created a compressed file in the same folder and with the same name,
  with a .gz suffix.
- The user-agent supports gzip compression.
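The first condition is easy to get wrong when a build regenerates files, so it can be worth sanity-checking that each `.gz` sibling really decompresses to the original bytes. A minimal sketch, assuming a hypothetical `style.css` in the current directory:

```python
import gzip
from pathlib import Path

# Demo setup: an original file and the .gz sibling nginx expects
# (hypothetical name and contents).
original = Path("style.css")
original.write_bytes(b"body { margin: 0; }")
with gzip.open(str(original) + ".gz", "wb") as f_out:
    f_out.write(original.read_bytes())

# gzip_static looks for <name>.gz in the same folder; check that it exists
# and that it round-trips to identical bytes.
sibling = Path(str(original) + ".gz")
assert sibling.exists()
with gzip.open(sibling, "rb") as f_in:
    assert f_in.read() == original.read_bytes()
```

If the round-trip fails, a stale `.gz` file is the usual culprit: nginx will happily serve the outdated compressed copy.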
Notes
I deploy this blog using git and githooks on the server. When my post-receive
hook has built the site, it then compresses every file.