Following Google’s announcement of Guetzli (“cookie” in Swiss German), a horde of online publications rushed to regurgitate the tech giant’s news of high-quality JPEG images with reduced file sizes.  Guetzli is Google’s new algorithm for encoding high-quality JPEG images, reducing file sizes by a whopping 35% compared with current methods.  Let’s face it, we all want to deliver content quickly, but not at the price of using low-quality images. Here at 1point21, we know that faster website load times are good for user experience, conversion rates and search engine optimization, so naturally we were very interested in testing Guetzli out ourselves.

How does Guetzli work?

The algorithm is described as ‘psychovisual’ in the same way MP3 encoding is ‘psychoacoustic’: it works directly with what science understands about how our brains and senses perceive data, and, most importantly, it makes intelligent choices about which information to drop.  In Google’s words, “Guetzli uses Butteraugli [bless you], our perceptual distance metric, as the source of feedback in its optimization process.” The icing on the cake: in Google’s tests, users preferred Guetzli-encoded JPEG images over same-sized libjpeg images (what most of us use and see today).

Guetzli image test with eye

20×24 pixel zoomed areas from a picture of a cat’s eye. Uncompressed original on the left. Guetzli (on the right) shows fewer ringing artifacts than libjpeg (middle) without requiring a larger file size.

I’m sold.  Encode all the images on our sites with this magic sauce immediately!

Yeah… no. This is your spoiler alert: skip the next paragraph if you want to keep the image-compression suspense.

Unfortunately, these publications (including Google’s own press release) failed to mention that, as with any study or theory, Guetzli is still “a proof-of-concept” and still “extremely slow” to use.  It’s actually well documented here in an arXiv pre-print paper if you want to geek out a bit.

Game-on with Guetzli high-quality reduced-sized images

Who would we be if we let a “proof-of-concept” label stop us from trying new technology? We wanted to see the actual numbers from our own input data, so we set out immediately to find out whether this was good news, too-good-to-be-true news, or something in between.  The results told us all we needed to know.

In fairness, we used a real-world testing environment attainable by most computer users. Once installed, Guetzli is a single-file-input executable with very limited options.  It accepts JPEG and PNG input and produces only JPEG output.
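The command line really is that minimal: per the project README, the only notable flags are --quality (which Guetzli refuses to set below 84) and --verbose. A sketch of an invocation, assuming the binary sits at the path from our build and using placeholder filenames:

```shell
# Sketch of a Guetzli invocation; the binary path and filenames are
# placeholders matching our test setup, not requirements.
GUETZLI=/usr/local/src/guetzli/bin/Release/guetzli
# --quality targets a libjpeg-style quality level (Guetzli rejects values
# below 84); --verbose prints progress while it grinds away.
CMD="$GUETZLI --quality 90 --verbose input.png output.jpg"
echo "$CMD"
```

Dropping --quality leaves Guetzli at its default of 95, which is what we used for the tests below.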

Guetzli image compression test environment

We worked together with the legal marketing division of our company, iLawyerMarketing.com, for the test, as they were also highly interested in the results. Each image tested was processed in isolation on a fresh DigitalOcean server running Ubuntu 16.04 with two provisioned virtual processors and 2 GB of RAM ($20/month droplet at the time of writing).  We cloned the GitHub repository and only needed to install the GNU make/compile tools and libpng-dev to build the program.
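For anyone following along, the setup boils down to something like this (package names assume Ubuntu 16.04; the clone location under /usr/local/src is our choice, not a requirement):

```shell
# Build Guetzli from source on a fresh Ubuntu 16.04 droplet.
apt-get update
apt-get install -y build-essential git libpng-dev  # make/compile tools + libpng headers
cd /usr/local/src
git clone https://github.com/google/guetzli.git
cd guetzli
make                                               # binary lands in bin/Release/guetzli
```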

We used the UNIX time command to output the amount of time and system effort required per run.

Command format example:

root@guetzli:~# time /usr/local/src/guetzli/bin/Release/guetzli input.jpg output.jpg

Our Guetzli encoding test results

Guetzli results image 1

Test 1 – After getting the program installed, we were a little surprised that the Google-provided test image (bees.jpg) took 13 wall-clock seconds to compress, given that it is only 177 KB to begin with.  The output looks excellent, and it is only 21.37% of the size of the original.  This seemed too good to be true!
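The percentage figures we quote are simply output size divided by input size. A minimal sketch of that arithmetic in POSIX shell, with illustrative byte counts standing in for bees.jpg (not our exact measurements):

```shell
# Compute "output is X.XX% of input" from two file sizes.
in_bytes=181248    # illustrative: ~177 KB input
out_bytes=38731    # illustrative: Guetzli output
pct=$(( out_bytes * 10000 / in_bytes ))  # percentage scaled by 100 for two decimals
printf 'output is %d.%02d%% of input\n' $(( pct / 100 )) $(( pct % 100 ))
# prints: output is 21.36% of input
```

With real files on Linux, in_bytes=$(stat -c%s input.jpg) gives the actual byte count.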

Guetzli test results image #2

Test 2 – We then processed a high-resolution portrait provided directly by a professional photographer.  After churning for nearly 19 wall-clock minutes, Guetzli knocked 2.5 MB down to 1.6 MB, ~65% of the original size.  The Guetzli process ran at 100% CPU and used more than 70% of RAM for much of that time.  Ouch.

Guetzli test results image #3

Test 3 – Finally, we decided to throw an oddball into the mix: a screenshot of the terminal window showing the CPU/RAM usage from the prior image in top.  This time the Guetzli output was SIZED UP to 172% of the original file size! After running more than 2.5 minutes on a 348 KB file, it produced a 600 KB file.  No thanks!

Guetzli compression test image #4

Our second set of three images came from actual production websites; we were testing for a potential gain even if we had to feed images in one at a time and let runs go for hours.  The results of these compressions were much less impressive, but bear in mind that these images had already been compressed with Photoshop to begin with.

Adopting Guetzli to increase pagespeed and decrease page load

Relatively speaking, Guetzli is a slow resource hog.  Don’t try to feed it images that have already been compressed, because the results are not great. This software is not yet in a state to be used in production of any kind.  There is no guarantee that the output file will be smaller than the input, it eats resources heavily, the results are unpredictable, and unless you’re prepared to wait over an hour to compress some images, it’s simply not ready for mass adoption.

The “cookie” recipe is still undercooked and not yet perfected, but in the shared goal of improving user experience, here is Guetzli for you to use, experiment with, and ultimately help develop further: https://github.com/google/guetzli. At least for now, we will continue to use Photoshop and other more efficient image compression tools.