Optimised

Tips, insights, and articles to help boost your website's performance.

Issue #21

July 23, 2021

Building a fast, sustainable personal website - Part 2

This week I'll continue reflecting on the redesign and redevelopment of my personal website. Part 2 of this case study explores some of the key code and build-step implementations that help deliver perfect Lighthouse scores and improved website sustainability.


In the last issue I wrote about some of the important decisions I made when planning out my new website. Some of those decisions (like statically generating my site with Eleventy) set me up to implement things in my code that would ensure a fast, sustainable website.

Legend

Throughout this article you'll see these two icons next to some of the headings and bullet points.

  • 💚 signifies something that helps with website sustainability
  • 🚀 indicates something that helps with website performance

💚🚀 Minify all the things!

I include a few code minification steps as part of the build process for this site. As the name suggests, minification makes code smaller and results in less data needing to be transferred. Minification steps on this site include:

  • Minify JS
  • Minify CSS (both inline CSS & external stylesheets)
  • Minify HTML
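
As a rough illustration, here's a minimal sketch of how the HTML minification step can be wired into an Eleventy build as a transform. It assumes the html-minifier-terser package and Eleventy 2.x-style transforms; the transform name and option values are illustrative rather than my exact configuration.

```js
// .eleventy.js — minimal sketch of an HTML minification transform.
// Assumes Eleventy (2.x-style transforms) and html-minifier-terser; options are illustrative.
const htmlmin = require("html-minifier-terser");

module.exports = function (eleventyConfig) {
  eleventyConfig.addTransform("htmlmin", async function (content, outputPath) {
    // Only touch files that Eleventy writes out as HTML.
    if (outputPath && outputPath.endsWith(".html")) {
      return htmlmin.minify(content, {
        collapseWhitespace: true,
        removeComments: true,
        minifyCSS: true, // also minifies inline <style> blocks
        minifyJS: true,  // and inline <script> blocks
      });
    }
    return content;
  });
};
```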

🚀 Critical & non-critical CSS

During my site's build process, all the CSS is dumped into one temporary file. I then have a build step that looks at the HTML output of each page individually and uses the Critical package to inline any critical CSS directly into the <head> of that page. This process allows the browser to quickly find and parse the minimum CSS required to render the content that a visitor will first see on any given page.
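
A minimal sketch of that step, assuming the Critical package's generate() API; the paths and viewport dimensions below are placeholders, not my exact setup.

```js
// Sketch of inlining critical CSS with the `critical` package.
// Paths and viewport sizes are placeholders.
const { generate } = require("critical");

async function inlineCriticalCss(page) {
  await generate({
    base: "dist/",                  // directory holding the built site
    src: page,                      // e.g. "index.html"
    css: ["tmp/all.css"],           // the single temporary stylesheet from the build
    target: { html: page },         // overwrite the page with critical CSS inlined in <head>
    inline: true,
    dimensions: [
      { width: 360, height: 640 },  // small viewport
      { width: 1280, height: 800 }, // desktop viewport
    ],
  });
}
```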

What this also allows me to do is take the remaining (non-critical) CSS and perform a further optimisation step. Once the critical CSS for a page has been inlined into the HTML, I pass the remaining non-critical CSS through PurgeCSS. PurgeCSS looks at the HTML output of the page and works out which non-critical CSS rules are actually needed. Those rules are then saved to a file with a hashed filename, and a <link> tag referencing that file is added to the page.
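
Sketched out, that purge-and-hash step could look something like this. The PurgeCSS API call is real; the file paths and helper name are illustrative.

```js
// Sketch of purging non-critical CSS per page and saving it under a hashed filename.
const { PurgeCSS } = require("purgecss");
const crypto = require("crypto");
const fs = require("fs");

async function buildNonCriticalCss(htmlPath, cssPath, outDir) {
  // Keep only the rules actually used in this page's HTML output.
  const [result] = await new PurgeCSS().purge({
    content: [htmlPath],
    css: [cssPath],
  });

  // Hash the purged CSS so the filename changes whenever the content does.
  const hash = crypto.createHash("md5").update(result.css).digest("hex").slice(0, 8);
  const fileName = `styles.${hash}.css`;
  fs.writeFileSync(`${outDir}/${fileName}`, result.css);

  // A <link rel="stylesheet"> pointing at this file is then added to the page.
  return fileName;
}
```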

This entire process has several benefits:

  • The page renders sooner since the critical CSS comes along with the initial HTML document.
  • Purging the remaining CSS of rules that are not required for a given page significantly reduces the amount of CSS shipped. The smaller files are faster to download and parse.
  • Since the non-critical CSS is saved with a hashed filename it automatically becomes immutable. This allows me to set very aggressive caching rules knowing that any change to a page's CSS will bust the cache by changing the filename.

Image optimisations

I tend not to use video in my content, so images are the heaviest assets on my site. Optimising them goes a long way towards ensuring each page on this site loads fast and has as small an environmental footprint as possible.

💚🚀 Serving modern formats

Using modern image formats like WebP or AVIF is arguably the easiest way to reduce image size. For my site, I use Cloudinary (affiliate link) to do most of the heavy lifting. With very little effort I'm able to serve AVIF images to browsers that support it, WebP to those that don't, and JPEG as a fallback. Most of the traffic to my site comes from Google Chrome users, so they're being served AVIF. Since these images are much smaller than their JPEG equivalents, they download faster. Once again, their small size also reduces the energy required to transfer them over the network.

As a further step, I upload and process images using Cloudinary only as and when they are needed. Using Cloudinary's remote image fetch URL I'm able to keep a local version of each image, which is only uploaded & processed by Cloudinary when it is first requested. Subsequent requests for that image are then served from Cloudinary's cache. I've also added an extra layer of caching using Cloudflare. This helps keep my usage well within Cloudinary's free tier.
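
As a rough sketch, a fetch URL is just the Cloudinary base URL, a set of transformation parameters, and the address of the original image. The cloud name and origin below are placeholders; f_auto and q_auto are the Cloudinary parameters that negotiate format (AVIF/WebP/JPEG) and quality automatically.

```js
// Sketch of building a Cloudinary remote-fetch URL.
// Cloud name and origin are placeholders; f_auto/q_auto handle format and quality negotiation.
function cloudinaryFetchUrl(imagePath, width) {
  const cloudName = "my-cloud";                 // placeholder Cloudinary cloud name
  const origin = "https://example.com";         // where the local originals are hosted
  const transforms = `f_auto,q_auto,w_${width}`;
  return `https://res.cloudinary.com/${cloudName}/image/fetch/${transforms}/${origin}${imagePath}`;
}

// cloudinaryFetchUrl("/images/hero.jpg", 640)
// → "https://res.cloudinary.com/my-cloud/image/fetch/f_auto,q_auto,w_640/https://example.com/images/hero.jpg"
```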

💚🚀 Delivering appropriately sized images

Presenting users with images that are appropriately sized for their viewport/device/screen is another performance and sustainability gain. Rather than making a mobile user download a 1600px wide hero image, they're served a 320px wide image instead (640px for retina displays). Again, this reduces the size of the files downloaded by the user. And yep, you guessed it, this approach also requires less electricity to transfer over the network.

Delivering responsive images can be rather tedious to do by hand. Thankfully the Eleventy Image plugin handles all of this for me. By passing in the original image alongside some sizing parameters, the plugin creates local, resized copies of each image. It also injects the required HTML directly into each page.
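
For reference, here's a minimal sketch of an image shortcode built on @11ty/eleventy-img. The widths, formats, and output paths are illustrative rather than my exact settings.

```js
// Sketch of a responsive image shortcode using @11ty/eleventy-img.
// Widths, formats, and paths are illustrative.
const Image = require("@11ty/eleventy-img");

async function imageShortcode(src, alt, sizes = "100vw") {
  const metadata = await Image(src, {
    widths: [320, 640, 1280, 1600],     // resized copies generated locally at build time
    formats: ["avif", "webp", "jpeg"],  // modern formats with a JPEG fallback
    outputDir: "./dist/img/",
    urlPath: "/img/",
  });

  // generateHTML outputs a <picture> element with srcset/sizes,
  // width/height attributes, and the loading/decoding hints below.
  return Image.generateHTML(metadata, {
    alt,
    sizes,
    loading: "lazy",
    decoding: "async",
  });
}
```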

Other image optimisations

  • 💚🚀 I lazy-load images as much as possible using the native loading="lazy" attribute. This means users aren't forced to download images further down the page until they need to see them. For some images that I know will always appear "above-the-fold", I use loading="eager" to help the browser prioritise them.
  • 🚀 Using the decoding="async" attribute on all images means they're processed off the main thread by the browser. This means they won't block JavaScript from executing, and shouldn't impact page usability while images load.
  • 💚🚀 The Eleventy Image Plugin generates resized images with immutable file names. This means that any time the image is changed, the name will too. With this I'm able to set very aggressive caching for images.
  • 🚀 On each image, I include height and width attributes. These allow modern browsers to infer the space that should be reserved for the image before it loads, which helps to prevent Cumulative Layout Shift.

💚🚀 Cloudflare Workers

Last issue I mentioned the decision to use Cloudflare as my website host, as well as my desire to ship as little JavaScript as possible for each page of the site. With this in mind, I've leaned on Cloudflare Workers to help inject dynamic content into pages before they are served to the user. This means the dynamic parts of the page are built on Cloudflare's edge network. As a result, I avoid shipping additional JavaScript files to visitors on those pages. I've used this approach in a few places across the site.

This is a small performance boost on low-end devices. Its sustainability impact is minimal too, since the code runs at the edge and only when it's requested. With Cloudflare neutralising the CO2 impact of their operations, I'm comfortable using Workers for this use case.
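
As an illustration, here's a minimal sketch of the pattern: a Worker fetches the static page, then uses HTMLRewriter to fill in a placeholder element at the edge. The selector and injected value are hypothetical stand-ins for whatever dynamic content a page needs.

```js
// Sketch of injecting dynamic content at the edge with a Cloudflare Worker.
// The selector and injected value are placeholders for the page's dynamic content.
export default {
  async fetch(request) {
    // Fetch the static page from the cache/origin.
    const response = await fetch(request);

    // Stream-rewrite the HTML, filling the placeholder before it reaches the visitor.
    return new HTMLRewriter()
      .on("span.dynamic-placeholder", {
        element(el) {
          el.setInnerContent("value computed at the edge");
        },
      })
      .transform(response);
  },
};
```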

Still room to improve

I still feel there's room for me to improve this site - especially from a sustainability perspective. Performance-wise, the site scores 100s on Google Lighthouse tests and passes all Core Web Vitals metrics.

On the sustainability front, there are a few areas I can still address that would lead to reduced data and energy usage. These are:

  • 💚 Search - Adding search functionality to this site would make it much, much easier for visitors to find information, and would mean fewer wasted page loads along the way. This is especially true for the Writing section, which has several pages' worth of posts at this stage.
  • 💚 Dark mode - A dark mode would help give visitors a slight energy saving, especially on mobile devices. That said, there is research out there that points to light text on dark backgrounds being harder for readers to focus on. Since my site is mostly text content, it's something I'm still considering.
  • 💚🚀 Removing more unused JS - Even though I'm already shipping very little JavaScript, there's still a bit more I can shave off. Ideally, most pages would eventually have around 10kB of JavaScript.
  • 💚 More accurate CO2 measurement - Including lazy-loaded images and scripts in the CO2 measurements on each page is something that I'm looking into.
  • 💚 Green hosting - Moving website hosting to a green hosting provider, while keeping Cloudflare as my CDN. If you know a green web host that can provide CI/CD for a static website, let me know!

Articles

WebPageTest on Learn with Jason
Tim Kadlec from WebPageTest joined Jason Lengstorf on his weekly "Learn with Jason" stream to go through how WebPageTest can be used to run a simple website performance audit. There are a lot of nuggets in this video, even if you're only just getting started with web perf.


The next issue of Optimised will be in your inbox on August 6th. The website has an archive of all previous emails. It's a good place to recap anything we've covered, and it's also handy to share with friends or colleagues.

As always if you've got any feedback or specific topics you want to be covered then just reply to this email.

Keep safe, stay well.
Fershad.

Subscribe

Receive fortnightly website optimisation tips, insights, and articles directly to your inbox.
Alternatively, you can receive updates using RSS.

This newsletter is powered by Buttondown.