Pagespeed Optimizations

Getting a 100 Google PageSpeed score without a CDN or a server

This blog is currently hosted on GitLab Pages using Hugo.

Problem

So after I set up my site, which I documented here, I went over to Google PageSpeed Insights to check the score. For mobile, it gave me a 69/100, which I was not happy with. Although this might have been premature optimization, I’ve been sold on the learning value of practicing micro-optimizations. Thus began my quest to achieve that 100/100.

“Possible Optimizations”

  • “Eliminate render-blocking JavaScript and CSS in above-the-fold content”
  • “Leverage browser caching”
  • “Enable compression”
  • “Minify CSS”
  • “Minify HTML”
  • “Minify JavaScript”

Gzip

It was simple to add gzip compression: I just added one line, gzip -k -9 -r public, to my .gitlab-ci.yml. The -k flag keeps the original files alongside the compressed .gz copies, which GitLab Pages serves to clients that accept gzip.

image: monachus/hugo

before_script:
  - git submodule init
  - git submodule update --force

pages:
  script:
  - hugo
  - gzip -k -9 -r public
  artifacts:
    paths:
    - public
  only:
  - master

I found various websites recommending a compression level of 5 or 6 to balance the CPU time spent compressing and decompressing files against the resulting file size, so I used Pingdom Tools to benchmark the options myself.

Gzip Level   Load Time (ms)   Page Size (kB)
============================================
0            388              22.1
5            290               9.2
9            153               5.6

The highest compression level, 9, was the clear winner. I don’t know whether Pingdom took decompression time into account, but since compression happens once per deploy on the GitLab Runner and my builds were still finishing in under 30 seconds, I decided to use level 9 for now.

Note: The page sizes and load times above were measured after I had moved my site to the After Dark theme, which I talk about below.

Webpack + Gulp

I used the JavaScript packages uglify-css, minify-html, and uglify-js to knock out the last three optimizations, but after rerunning PageSpeed Insights, I noticed that my score didn’t improve.
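For illustration, the minification side of a gulp pipeline can be as small as the sketch below. This is a rough sketch rather than my actual setup: it uses the gulp-htmlmin plugin, and the paths are assumptions.

const gulp = require('gulp');
const htmlmin = require('gulp-htmlmin');

// Minify Hugo's generated HTML in place.
gulp.task('minify-html', () =>
  gulp.src('public/**/*.html')
    .pipe(htmlmin({ collapseWhitespace: true, removeComments: true }))
    .pipe(gulp.dest('public'))
);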

I then tried to set up cache busting, which lets you cache static assets for a long time: each filename embeds a fingerprint of the file’s contents, so the URL changes whenever the file does and browsers never serve a stale copy. After a bunch of twiddling, I eventually set that up with gulp and webpack. The source code can be found here.
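The webpack half of that boils down to content-hashed output filenames. A minimal sketch, using current webpack syntax with assumed entry and output paths rather than my actual build:

const path = require('path');

module.exports = {
  entry: './src/index.js', // assumed entry point
  output: {
    // [contenthash] changes whenever the bundle's contents change, which is
    // what makes long-lived caching safe.
    filename: 'bundle.[contenthash].js',
    path: path.resolve(__dirname, 'public'),
  },
};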

However, using webpack and gulp increased the build times significantly, to consistently over 2 minutes on the GitLab Runner. In addition, I still wasn’t “leveraging browser caching” correctly. It seems I would need access to the web server serving my site in order to set cache headers, or I would need to serve the site over a CDN.
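For context, this is roughly the server-side piece that GitLab Pages doesn’t let you touch; a hypothetical Node/Express sketch, not anything from my setup:

const express = require('express');
const app = express();

// Fingerprinted assets are safe to cache for a year and mark immutable,
// since cache busting gives changed files new URLs.
app.use(express.static('public', { maxAge: '365d', immutable: true }));

app.listen(8080);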

After Dark

Fortunately, I found a better solution: just use a better Hugo theme. After Dark was designed to be fast, which made getting a 100/100 much easier. The only optimization I needed was to disable BPG image support, since I didn’t have any images yet.

I did this by adding two lines to my config.toml.

[params.seo]
  disable_bpg = true

This temporarily got my PageSpeed score to 100/100.

Google Analytics

Unfortunately, when I added Google Analytics to my config.toml, I was back to the complaint that I wasn’t leveraging browser caching. I found the library ga-lite, which solves the problem by serving a slimmed-down analytics script from the jsDelivr CDN with proper cache headers. However, there is currently a bug in Hugo which meant that even though I added a layouts/_default/baseof.html file, Hugo would not use it in place of the one in my theme directory, themes/after-dark/layouts/_default/baseof.html.

To work around this issue, I created a temporary hack. I added a ga.txt file that I would append to all my HTML files:

<script>
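// Loader: asynchronously injects the ga-lite script from jsDelivr and
// queues galite() calls until it has loaded.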
(function(e,t,n,i,s,a,c){e[n]=e[n]||function(){(e[n].q=e[n].q||[]).push(arguments)}
;a=t.createElement(i);c=t.getElementsByTagName(i)[0];a.async=true;a.src=s
;c.parentNode.insertBefore(a,c)
})(window,document,"galite","script","https://cdn.jsdelivr.net/npm/ga-lite@2/dist/ga-lite.min.js");

galite('create', 'UA-102040397-1', 'auto');
galite('send', 'pageview');
</script>

Then I created a script, ga-hack.sh, to append that file to every HTML file:

#!/bin/sh
# Append the ga-lite snippet to every HTML file Hugo generated.
FILES="$(find public -name '*.html')"
for f in $FILES
do
  echo "cat ga.txt >> $f"
  cat ga.txt >> $f
done

Finally, I added this script to my GitLab CI configuration file:

image: monachus/hugo

before_script:
  - git submodule init
  - git submodule update --force

pages:
  script:
  - hugo
  - gzip -k -9 -r public
  - sh ga-hack.sh
  artifacts:
    paths:
    - public
  only:
  - master

Conclusion

Some may argue that this was premature optimization, but one of my mentors stresses the importance of micro-optimizations for becoming a better programmer. While these weren’t code micro-optimizations in the usual sense, the exercise still taught me a lot about website performance, like how cache busting works.