
Optimize images of your website manually



While optimizing your website, you might be using the PageSpeed Insights testing tool. We mentioned earlier that you should not trust PageSpeed Insights or GTmetrix for real-world performance testing. However, these tools are not totally useless. One great use is finding out which images require optimization. In PageSpeed Insights, you would see “Optimize images” with messages like these:

Compressing and resizing https://domain.com/some/image.jpg could save 7.7KiB (85% reduction).
Losslessly compressing https://www.domain.com/img/slide-1.jpg could save 6.5KiB (2% reduction).

Optimize images by converting them to a better format

To optimize images for size, you will use different tools for each image type. However, there is one fine catch that is not obvious at first. Sometimes you may optimize the same images over and over again, yet PageSpeed Insights will still report that they require further resizing and compressing. It doesn’t say so explicitly, but all it really wants from you is to have the image converted to a completely different format, in order to achieve a smaller size.

For instance, a background .png image that doesn’t really need transparency might be much smaller when saved in JPEG format.


WebP has a special mode for lossless compression (-m 6 -q 100) which can reduce a file to its smallest size by exploring all parameter combinations. It’s an order of magnitude slower, but worth it for static assets.
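As a sketch, this is what such an invocation could look like with the cwebp tool from libwebp; the input and output filenames here are hypothetical:

```shell
# Sketch, assuming the cwebp binary from libwebp and a hypothetical
# slide-1.png input. -lossless preserves every pixel; -q 100 combined
# with -m 6 makes the encoder explore the most parameter combinations
# (slowest run, smallest output).
if command -v cwebp >/dev/null 2>&1 && [ -f slide-1.png ]; then
    cwebp -lossless -q 100 -m 6 slide-1.png -o slide-1.webp
fi
```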

To convert an image to a different format on the command line, you can use the ImageMagick convert tool:

convert -background white -flatten product-shadow.png product-shadow.jpg

I found that this requirement of converting to a different format comes up quite often. For one of our reference websites, it produced a 1 KB .jpg file that saved over a hundred kilobytes (the original .png file was 120 KB even when optimized with optipng).
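Since the converted file is not guaranteed to be smaller, it makes sense to compare sizes before committing to the new format. A minimal sketch of that check, using a hypothetical helper and stand-in files rather than real images:

```shell
# keep_smaller: hypothetical helper (not from the article) that keeps
# the smaller of two candidate files and deletes the other.
keep_smaller() {
    a=$1; b=$2
    if [ "$(wc -c < "$a")" -le "$(wc -c < "$b")" ]; then
        rm -f "$b"; echo "$a"
    else
        rm -f "$a"; echo "$b"
    fi
}

# Demo with stand-in files (real use: product-shadow.png vs product-shadow.jpg):
printf 'xxxxxxxxxx' > big.img    # 10 bytes
printf 'x' > small.img           # 1 byte
keep_smaller big.img small.img   # prints small.img
```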

Optimize images by using the right tool for each format

PNG images


The following command losslessly optimizes all .png files in the current directory:

find . -type f -iname '*.png' -exec optipng -o7 -clobber -backup -preserve {} \;


To achieve lossy compression of .png files in the current directory using 8 CPU cores:

find . -name '*.png' -print0 | xargs -0 -P8 -L1 pngquant --ext .png --force 256

To run the same job in the background, immune to hangups, wrap the pipeline in a subshell so nohup covers both find and xargs (applying nohup to find alone would leave xargs unprotected):

nohup sh -c "find . -name '*.png' -print0 | xargs -0 -P8 -L1 pngquant --ext .png --force 256 --verbose" &

A simpler, single-process alternative:

find . -name '*.png' -exec pngquant --ext .png --force 256 {} \;

However, there is a better tool for PNG files: zopflipng. So:

find . -type f -iname '*.png' -exec zopflipng -m -y {} {} \;

The -m switch runs the file through more iterations, while -y auto-confirms writing over the same file.

JPEG files

There are numerous tools and ways to optimize JPEG images. Whether you’re fine with lossy or lossless compression, you need to pick an optimization program that suits you best.

However, it’s rather pointless to compare tools like jpegoptim and jpegtran if they use the same encoder – the standard libjpeg-turbo. They will produce byte-for-byte identical results when you use equivalent command-line switches.
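The byte-for-byte claim is easy to check yourself with cmp, which exits 0 only for identical files. A sketch with stand-in files in place of real optimizer outputs:

```shell
# Sketch: cmp is how you can confirm that two optimizers built on the
# same encoder produce identical output. The two files here are
# stand-ins for real jpegoptim / jpegtran results.
printf 'abc' > out-jpegoptim.bin
printf 'abc' > out-jpegtran.bin
if cmp -s out-jpegoptim.bin out-jpegtran.bin; then
    echo "identical"
fi
```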

It makes sense to compare tools that use different encoders, like jpegtran with the standard library vs. jpegtran with the MozJPEG encoder. To save you time: tools compiled with the MozJPEG library ALWAYS win by 2–5% in compression over the regular library.

To save you even more time, here is how to install jpegoptim powered by MozJPEG on CentOS/RHEL.

Lossless compression

find . -type f -iname '*.jpg' -exec jpegoptim --strip-all --force --all-progressive {} \;

Lossy compression

The following command performs lossy optimization of all .jpg files in the current directory.

find . -type f -iname '*.jpg' -exec jpegoptim -m85 --strip-all --force --all-progressive {} \;

The --force --all-progressive switches ensure that the optimized files are progressive JPEGs, which allows for quicker visual display: images turn from a blurred to a clear state while loading, instead of showing an empty spot.

This is good for a one-time optimization. Running it again will reduce file sizes (and quality!) further.

MagenX, the fake speed optimizer, does this to automate things:

echo "0 1 * * 1 find ${MAGE_WEB_ROOT_PATH}/pub/ -name '*\.jpg' -type f -mtime -7 -exec jpegoptim -q -s -p --all-progressive -m 65 {} \; >/dev/null 2>&1" >> rootcron

Why this is wrong, and how to automate image optimization properly – read our post on an automatic optimization solution for images.
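One simple way to avoid the repeated-recompression problem is to record when the last run happened and only touch files modified since then. A minimal sketch of that idea, with a hypothetical web root path; the jpegoptim switches match the lossy command shown earlier:

```shell
# Sketch (hypothetical paths): optimize each .jpg at most once by
# tracking a stamp file, instead of re-compressing (and degrading)
# the same files every week as the cron above does.
WEB_ROOT="${WEB_ROOT:-./pub}"
STAMP="$WEB_ROOT/.jpegoptim.stamp"
mkdir -p "$WEB_ROOT"
# On the first run, backdate the stamp so every file qualifies.
[ -f "$STAMP" ] || touch -t 197001010000 "$STAMP"
# Only files modified since the last run are passed to jpegoptim.
find "$WEB_ROOT" -type f -iname '*.jpg' -newer "$STAMP" \
    -exec jpegoptim -m85 --strip-all --all-progressive {} \;
touch "$STAMP"
```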
