
Optimize images of your website manually

While optimizing your website, you might be using the PageSpeed Insights testing tool. We mentioned earlier that you should not trust PageSpeed Insights or GTmetrix for real-world performance testing. However, these tools are not totally useless. One great use is finding out which images require optimization. In PageSpeed Insights, you would see “Optimize images” with messages like the following:

Compressing and resizing https://domain.com/some/image.jpg could save 7.7KiB (85% reduction).
Losslessly compressing https://www.domain.com/img/slide-1.jpg could save 6.5KiB (2% reduction).

Optimize images by converting them to a better format

To optimize images for size, you will use different tools for each image type. However, there is one fine catch that is not obvious at first. Sometimes you may be optimizing the same images over and over again, yet PageSpeed Insights keeps reporting that they require further resizing and compressing. It doesn’t say so explicitly, but all it really wants from you is to have the image converted to a completely different format in order to achieve a smaller size.

For instance, a background .png image that doesn’t really need to be transparent may end up much smaller when it is saved in JPEG format.

WebP

WebP has a special mode for lossless compression (-lossless -m 6 -q 100) which can reduce a file to its smallest possible size by exploring all parameter combinations. It is an order of magnitude slower, but worth it for static assets.
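For example, a single image can be converted with the cwebp tool like this (a minimal sketch; image.png and image.webp are placeholder file names):

cwebp -lossless -m 6 -q 100 image.png -o image.webp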

To convert an image to a different format on the command line, you can use the ImageMagick convert tool:

convert -background white -flatten product-shadow.png product-shadow.jpg

I run into this need to convert to a different format quite often. For one of the reference websites, it produced a 1 KB .jpg file and saved over a hundred kilobytes (the original .png file was 120 KB even when optimized with optipng).
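If many images need this treatment, the conversion can be batched with find. This is just a sketch, assuming lowercase .png extensions and that none of the matched images actually need transparency:

find . -type f -name '*.png' -exec sh -c 'convert -background white -flatten "$1" "${1%.png}.jpg"' _ {} \;

Note that it writes a .jpg next to each .png and leaves the originals in place; removing them and updating references in your pages is up to you.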

Optimize images by using the right tool for each format

PNG images

OptiPNG

The following command losslessly optimizes all .png files in the current directory and its subdirectories:

find . -type f -iname '*.png' -exec optipng -o7 -clobber -backup -preserve "{}" \;

The command utilizes the find command along with optipng for optimizing PNG images. Here’s a breakdown of each part of the command:

  1. find .: This initiates the find command in the current directory (.). The find command is used to search for files and directories within a file system.
  2. -type f: This option tells find to look for files only, not directories.
  3. -iname '*.png': The -iname option allows find to search for files case-insensitively matching the given pattern. Here, it’s looking for files with the .png extension. The asterisk (*) is a wildcard that matches any character sequence. This means it will match all PNG files, regardless of their names.
  4. -exec optipng -o7 -clobber -backup -preserve "{}" \;: This part of the command executes (-exec) the optipng command on each file found. The {} is a placeholder for the current file name being processed. The \; indicates the end of the command to execute for each file found.

In summary, this command searches for all PNG files in the current directory and its subdirectories, optimizes them with optipng using a high level of optimization, overwrites the originals while creating backups of them, and preserves their original file attributes.
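To see how much was actually saved, you can compare the total size of all PNG files before and after running the command. A quick sketch, assuming GNU coreutils:

find . -type f -iname '*.png' -print0 | du -cb --files0-from=- | tail -n1

Run it once before and once after the optimization pass and compare the totals.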

PngQuant

Use 8 CPU cores to perform lossy compression of all .png files in the current directory:

find . -name '*.png' -print0 | xargs -0 -P8 -L1 pngquant --ext .png --force 256

The same, run in the background with nohup and verbose output so it keeps going after you close the terminal:

nohup sh -c "find . -name '*.png' -print0 | xargs -0 -P8 -L1 pngquant --ext .png --force 256 --verbose" &

Or a simpler single-process variant using find -exec:

find . -name '*.png' -exec pngquant --ext .png --force 256 "{}" \;

However, for lossless PNG compression there is a better tool than OptiPNG: zopflipng. So:

find . -type f -iname '*.png' -exec zopflipng -m -y "{}" "{}" \;

The -m switch will run the file through more iterations, while -y will auto-confirm writing to the same file.

JPEG files

There are numerous tools and ways to optimize JPEG images. Whether you’re fine with lossy or lossless compression, you need to pick an optimization program that suits you best.

However, it’s rather pointless to compare tools like jpegoptim and jpegtran if they use the same encoder – the standard libjpeg-turbo. They will produce byte-for-byte identical results when you use equivalent command-line switches.

It makes sense to compare tools that use different encoders, like jpegtran with the standard library vs. jpegtran with the MozJPEG encoder. To save you time: tools compiled with the MozJPEG library always win by 2–5% in compression over the regular library.

To save you even more time, here is how to install jpegoptim powered by MozJPEG on CentOS/RHEL.

Lossless compression

find . -type f -iname '*.jpg' -exec jpegoptim --strip-all --force --all-progressive "{}" \;

Lossy compression

The following command will perform lossy optimization of all .jpg files in the current directory and its subdirectories.

find . -type f -iname '*.jpg' -exec jpegoptim -m85 --strip-all --force --all-progressive "{}" \;

The --force --all-progressive combination ensures that the optimized files are progressive JPEGs, which allows for a quicker visual display: images render from a blurred to a clear state while loading, instead of appearing as an empty spot.
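To verify that a file actually came out progressive, you can check its interlace mode with ImageMagick (image.jpg is a placeholder name):

identify -verbose image.jpg | grep -i interlace

A progressive JPEG reports a JPEG interlace scheme, while a baseline one reports None.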

This is good for a one-time optimization. Running it again will reduce file sizes (and quality!) further.
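If you re-run the lossy command by hand from time to time, a simple guard is to only process files that changed since the last run. A minimal sketch, using a hypothetical .jpegoptim-stamp marker file:

# .jpegoptim-stamp is a hypothetical marker file; create it on the very first run
test -f .jpegoptim-stamp || touch -t 197001010000 .jpegoptim-stamp
# optimize only files modified since the last run, then update the marker
find . -type f -iname '*.jpg' -newer .jpegoptim-stamp -exec jpegoptim -m85 --strip-all --force --all-progressive "{}" \;
touch .jpegoptim-stamp

Only files modified after the previous pass get optimized, so already-compressed images are not degraded again.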

MagenX, the fake speed optimizer, does this to automate things:

echo "0 1 * * 1 find ${MAGE_WEB_ROOT_PATH}/pub/ -name '*\.jpg' -type f -mtime -7 -exec jpegoptim -q -s -p --all-progressive -m 65 "{}" \; >/dev/null 2>&1" >> rootcron

To learn why this is wrong and how to automate image optimization properly, read our post on the automatic optimization solution for images.
