In this guide, I will show you how I rebuilt my portfolio/blog with the Hugo static site generator and scored 100% for Performance, Accessibility, SEO, and Best Practices.

rebuilding my portfolio with hugo

hugo fast speed optimization

Yeah, as a frontend developer I built a portfolio when I was starting my career. It wasn't great, and it wasn't very responsive either. Then, recently, I came across Hugo.

Hugo is a fast and flexible open-source static site generator built with love in Go. Hugo also makes building websites fun again. There are other static site generators like Gridsome, Gatsby, and Next.js.

Well, I'm not good enough at Go to create a template from scratch, so I used an existing template called Hugo Friend, hoping to push some pull requests to the original repo for the things I fixed on my portfolio while redesigning.

Let's get started:

  1. Install Hugo CLI
  2. Create a new site
  3. Deploy To Netlify
  4. How I worked on the portfolio SEO
  5. How I improved my portfolio site speed
  6. How I Configured Better Web Site Security with Cloudflare and Netlify on my Hugo Portfolio

First, you need to install the Hugo CLI in your dev environment.

Install Hugo CLI

Install Hugo on macOS

brew install hugo


Install Hugo on Ubuntu and Debian

sudo apt-get install hugo

Install Hugo on Windows

choco install hugo -confirm


Install Hugo on Linux

snap install hugo --channel=extended

You can get more guidance on the installation process via the Hugo installation docs.
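Whichever platform you are on, you can confirm the install worked by checking the version from your terminal:

hugo version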

Creating a new site

Now that you have Hugo installed on your system, you can create a new Hugo project using the following command:

hugo new site <site-name>

e.g. hugo new site HugoSite

Now you will have a new folder named HugoSite with the following subfolders:

├── archetypes
├── assets
├── config
├── content
├── data
├── layouts
├── static
└── themes

If you need a better explanation of Hugo's files and directories, check here.

Now you can choose to use a ready-made template, or modify files from a ready-made template to fit your own site. You can use the starter template Hugo Friend NG, or better still use any of the ready-made Hugo templates that are available here.

If you will be using a ready-made template, you have to download the template folder into your own site's themes folder. Here is what the structure will look like:

HugoSite/themes/template-folder
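If you go the themes-folder route, you also need to point Hugo at the theme in your site config. A minimal sketch, using the placeholder folder name from above (swap in your actual theme folder name and details):

# config.toml
baseURL = "https://example.com/"
title = "My Portfolio"
theme = "template-folder"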

But if you want to edit the templates directly in your site without using the /themes folder, you need to copy the layouts folder from the template and use it to replace the layouts folder in your own site.

Well, one of the advantages of not using the theme directly is that you get to optimise things properly: remove CSS you don't need, add things that you do need, and much more.

So, with your template ready in your themes folder (or your own layouts folder replaced with the template's layouts), let's start the Hugo server:

hugo server -D

Now you can access your site at http://localhost:1313.

Deploying to Netlify

I was using Firebase Hosting initially, but I found Netlify a better fit for this. Netlify is promising, and I love their platform and the solutions they offer.

To deploy on Netlify,

#1 Head over to app.netlify.com and select your preferred signup method.

deploy hugo to netlify

#2 Create a new site by selecting the "New site from Git" button

create a new netlify site

#3 Select your Git provider

pick git provider

#4 Pick the repository that you pushed your portfolio code to.

choose a repository

#5 You can set the publish directory to public or leave it at the default, and let Netlify handle the whole process for you.

deploy netlify
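Optionally, you can also pin the build settings in a netlify.toml file at the root of the repo rather than in the UI. Here is a minimal sketch; the Hugo version value is just an example, so match it to the version you build with locally:

[build]
command = "hugo"
publish = "public"

[build.environment]
HUGO_VERSION = "0.74.3"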

How I worked on the portfolio SEO

Well, being new to Hugo made it a bit hard for me, but based on my SEO experience, I fixed and implemented a few things.

The template layout I used was rendering a summary of the first lines of each blog post as the meta description, which is bad practice for SEO.

Every page and section needs to have a unique meta description.

Here is the code snippet that I used to fix the issue:

<meta name="description" content="{{ if .IsHome }}{{ .Site.Params.Description }}{{ else }}{{ .Description }}{{ end }}" />
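With this snippet, the home page falls back to the site-wide description, and every other page uses its own .Description, which Hugo reads from the page's front matter. So each post just needs a description of its own; the values below are placeholders:

# config.toml (site-wide fallback used on the home page)
[params]
description = "Frontend developer portfolio and blog."

# content/posts/my-post.md (per-page description in the front matter)
+++
title = "Rebuilding My Portfolio With Hugo"
description = "How I rebuilt my portfolio with Hugo and optimised it for speed and SEO."
+++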

How I improved my portfolio site speed

I'm obsessed with web page speed optimisation, so here is what I did to optimise my Hugo site's speed.

#1 Removing Unused CSS, Minifying the CSS and Adding a Fingerprint:

Yeah, I started by looking closely for CSS that wasn't actually used on the website and removing it.

Then I minified the CSS using the code below and added a fingerprint. Wondering what the fingerprint does?

The fingerprint helps ensure that fresh content is served; a hash such as MD5 is the most common implementation of this idea.

An MD5 hash is calculated and appended to the CSS file name, giving you something like styles.min.c1783285df9c580224cb4b14efea9ff1.css.

The browser caches this file and keeps serving it to users from that cache.

Now say you make a few changes to the CSS and redeploy: a new MD5 hash is generated, so the file gets a new name.

The old styles.min.c1783285df9c580224cb4b14efea9ff1.css is no longer referenced, and it will show up as a 404 if you look for it in Chrome DevTools, so every browser fetches the freshly named file instead of a stale cached copy.

So fingerprinting your JS and CSS assets helps you deliver fresh content to users.

Below is the code to minify and add an MD5 fingerprint on your Hugo site. Add it to your header partial, and change the path in resources.Get "/styles.css" to the path of your own CSS.

{{ $style := resources.Get "/styles.css" | minify | fingerprint "md5" }}
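That line only creates the processed resource; you still need to reference it in your head section. A minimal sketch of what that can look like:

{{ $style := resources.Get "/styles.css" | minify | fingerprint "md5" }}
{{/* RelPermalink points at the minified, fingerprinted file, e.g. styles.min.<hash>.css */}}
<link rel="stylesheet" href="{{ $style.RelPermalink }}">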

#2 Compress, Minify JS Files, Optimize and Add Fingerprints:

JavaScript tends to account for most of the render-blocking assets flagged by PageSpeed Insights, and I had more than one JavaScript file.

So I needed to combine all the JavaScript files into one, to reduce the number of HTTP requests made to the server.

With the JS files concatenated into a single bundle.js, let's minify it to remove unnecessary spaces and comments using resources.Minify.

We've combined the JS into one file and minified it; now let's add the fingerprint. Below are the lines of code that perform everything explained above:

{{ $main := resources.Get "js/main.js" }}
{{ $menu := resources.Get "js/menu.js" }}
{{ $prism := resources.Get "js/prism.js" }}
{{ $theme := resources.Get "js/theme.js" }}
{{ $secureJS := slice $main $menu $prism $theme | resources.Concat "bundle.js" | resources.Minify | resources.Fingerprint "sha512" }}
<script defer type="text/javascript" src="{{ $secureJS.RelPermalink }}" integrity="{{ $secureJS.Data.Integrity }}"></script>

If you check the above code well, you will see I added defer to the <script> tag.

Yeah, loading JavaScript files without blocking the HTML parser helps your page speed and improves website load time.

You might be wondering why I didn't use async instead of defer; well, defer seems to be the better practice here.

You can also look into this guide and make your own decision; the right choice can vary based on the project you are working on.

To implement it, create a javascript.html file in your layouts/partials/ folder, paste the code above into it, and call javascript.html wherever you want it using:

{{ partial "javascript.html" . }}

We aren't done yet. When you build a web application, you will want metrics on who visits the website, which brings in analytics tools; the common one is Google Analytics, and it is a third-party tool.

Third-party assets are known to slow down web pages because you have no control over them and can't set caching rules for them, but we can still do something about it.

In Hugo, here is the code that adds Google Analytics to your site and also loads it asynchronously:

{{- if .Site.GoogleAnalytics }}
{{ template "_internal/google_analytics_async.html" . }}
{{- end}}

To set your analytics tracking code, just add this to your config.toml file:

googleAnalytics = "Your Tracking Code"

That’s all for JS files.

#3 Optimize your images

Images often account for around half of a web page's weight, so stop the habit of not optimising images before uploading them to your sites. web.dev advises webmasters and web developers to adopt the WebP image format.

But it seems not every browser supports WebP yet, so as an alternative it's better to use JPG images instead of heavy PNG images.

Also, using inline images like SVG or Base64 can help your page speed.

Base64-encoded images do increase your page's size in KB, but they reduce the number of HTTP requests.

The best places to use inline images are logos, the favicon if possible, and images above the fold.

So next time you work on a project, always optimise the images you use. You can use squoosh.app to compress images, change image formats, and make other edits, and you can also convert images to Base64 for inlining.
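If you do want to serve WebP where the browser supports it and fall back to JPG everywhere else, the picture element handles that negotiation for you. A minimal sketch with placeholder file names:

<picture>
  <!-- Browsers that understand WebP pick this source -->
  <source srcset="/images/portfolio-hero.webp" type="image/webp">
  <!-- Everyone else falls back to the JPG -->
  <img src="/images/portfolio-hero.jpg" alt="Portfolio hero image" loading="lazy">
</picture>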

#4 Host Fonts Locally:

When building a website, we all want to use fancy fonts, and the option everyone reaches for is Google Fonts. Well, Google Fonts can really hurt your LCP (Largest Contentful Paint), because you can't control fonts that aren't hosted on your site.

Well, since we can host them locally, why don't we do that? Download all the font weights you will be using on your web pages.

Here is a correct font declaration when hosting your fonts locally,

@font-face {
  font-family: geomanist;
  font-weight: 300;
  font-style: normal;
  font-display: swap;
  src: url(../fonts/geomanist-light-webfont.woff2) format("woff2"),
       url(../fonts/geomanist-light-webfont.woff) format("woff");
}

The font-display: swap; line tells the browser that text using this font should be displayed immediately using a system font; once the custom font is ready, the system font is swapped out.

Preloading or prefetching fonts is also good, e.g.:

<link rel="preload" as="font" href="../fonts/geomanist-bold-webfont.woff2" crossorigin="anonymous">
<link rel="preload" as="font" href="../fonts/geomanist-regular-webfont.woff2" crossorigin="anonymous">

The rel="preload" as="font" attributes ask the browser to start downloading the required resource as soon as possible. They also tell the browser that this is a font, so it can prioritise it appropriately in its resource queue.

#5 Preconnect to External Resources

Well, we can't do without external resources on our web apps, and since they are external, we have little control over them. Scripts like Google Analytics, Google Fonts, and the many other external scripts common on web apps need their connections set up early so they don't have too much impact on page speed.

This can be done using dns-prefetch, preconnect, and prefetch itself. Below is what I used:

<link rel="dns-prefetch" href="https://www.google-analytics.com">
<link href="https://www.google-analytics.com" rel="preconnect" crossorigin>

Robin has a simple and straightforward guide on preload, prefetch, and prebrowsing.

#6 Using a CDN

Cloudflare to the rescue again. Netlify does provide a CDN, called Netlify Edge, and I must say it was awesome when I used it; my page speed improved by about 20%. However, it broke some functionality on my website, so I dropped it and went with Cloudflare.

Cloudflare, on the other hand, has data centres almost all over the world. Think about it: that's an awesome package, and even free Cloudflare users get access to it.

A CDN works like this: it caches your CSS, JS, and image files in storage at all of its available data centres, so when a user from Australia visits your web page, the cached copies of your site's resources at Cloudflare's Australian data centre get delivered to them.

with cdn

The page loads fast because your files are served from a data centre near the visitor.

But when you don't use a CDN and your hosting provider doesn't have data centres around the world, some regions will experience slow page loads, because your website will be serving files from, say, a US data centre to users in Russia, which makes it a bit slower.

no cdn

#7 Configure your cache settings well:

This might be tricky at first. Every website deployed on Netlify gets a default cache-control header of public, max-age=0, must-revalidate, which means content can be cached but must be revalidated with the server before reuse.

The rationale, per Netlify, is that this favours you as a content creator, especially with Netlify Edge: you can change any of your content and see the results instantly.

But I'm using Cloudflare's CDN and I won't be changing or reworking content all the time, so that cache policy isn't good enough for me; I want my files stored in Cloudflare's cache, with an expiry period of up to a year.

So I decided to override the Netlify default cache using a netlify.toml file, and I set the following:

[[headers]]
for = "/*"
[headers.values]
Cache-Control = "public"

[[headers]]
for = "/posts/*" # This defines which paths this specific [[headers]] block will cover.
[headers.values]
Cache-Control = "public, s-maxage=604800"

If you notice, I didn't set max-age or s-maxage on the "/*" header; I only set "public". I did that because I already set the max-age value in Cloudflare.

You could also choose to set the whole thing in netlify.toml; then all you have to do in your Cloudflare dashboard is set Browser Cache TTL to Respect Existing Headers.

The [[headers]] block for "/posts/*" covers the path to my blog posts, where frequent changes can occur, so I set s-maxage to 604800 (a week); s-maxage adjusts the cache setup for shared caches like Cloudflare on that particular path.

You can learn more about cache controls and Cloudflare cache with the below resources.

Configuring Better Web Site Security with Cloudflare and Netlify

I'm an infosec guy, so when building a web app I always perform security checks and hardening; this is just a portfolio, though, so I did a few setups.

I started by checking my HTTP headers, and my score was awful. Here are the tools I used:

securityheaders.io to check my site headers
Qualys SSL Labs to check my site cryptography

So how did I fix those awful results? I used Cloudflare, and since my portfolio site is hosted on Netlify, I leveraged that as well.

So let’s start with Cloudflare setup,

1# Enabling DNSSEC:

DNSSEC helps protect your domain from DNS hijacking. To enable it, I turned DNSSEC on in my Cloudflare account and then imported the DS record created by Cloudflare into the registrar where I bought my domain.

dnssec prevent dns hijacking cloudflare

If you can’t find the DNSSEC settings in your domain registrar dashboard, contact them to help you on that.

Also, if you need a good .me domain provider, check out Porkbun; they have the cheapest .me renewal charges.

2# Configure HTTPS always:

By enabling the Always Use HTTPS option, whenever a user visits the non-HTTPS version of my site, they get redirected to the HTTPS version every time; it also helps fix mixed content.

always use https cloudflare

3# Enabling the HTTP Strict Transport Security (HSTS):

You can choose to set it up via Cloudflare, but I set mine up using the netlify.toml file, which I will discuss with you shortly.

hsts cloudflare

4# Minimum TLS Versions:

Navigate to the SSL/TLS settings in your Cloudflare dashboard and pick TLS 1.2. This means visitors whose TLS version is below 1.2 won't be able to connect to your domain; likewise, if TLS 1.1 is selected, users on TLS 1.0 won't be able to connect.

Why would I specify a minimum TLS version?

Some security standards, such as PCI DSS 3.2, enforce strong cryptographic standards, where strong cryptography is defined as TLS 1.1 or newer. As an example, specifying TLS 1.1 as the minimum can help your domain become compliant with PCI DSS 3.2.

tls settings cloudflare

Also, enable the TLS 1.3 option, which is the latest version of the TLS protocol, for improved security and performance.

5# SSL settings:

Since my website is hosted on Netlify, Netlify provides an SSL certificate from Let's Encrypt, so I already have a valid certificate and that green HTTPS padlock. But that alone doesn't mean there is full end-to-end encryption between visitors and my site; there could still be sniffers in between.

So I set Cloudflare SSL to Full (Strict), which encrypts traffic the whole way: from the moment visitors type my blog URL into their browser, through Cloudflare, to my origin server, with the origin certificate validated.

ssl settings cloudflare

The reason I could pick SSL Full (Strict) is that I already have a valid SSL certificate provided by Netlify via Let's Encrypt.

Cloudflare has done its part; let's move on to netlify.toml. I could also have set the HTTP headers via Hugo's config, but I chose netlify.toml.

Here we go,

If you don't have a netlify.toml file in the root of your Hugo site folder, create one now.

Now open the file and paste the following headers configuration:

[[headers]]
for = "/*" # This defines which paths this specific [[headers]] block will cover.
[headers.values]
X-Frame-Options = "DENY"
X-XSS-Protection = "1; mode=block"
X-Content-Type-Options = "nosniff"
Content-Security-Policy = "form-action https:"
Referrer-Policy = "no-referrer-when-downgrade"
Feature-Policy = "accelerometer 'none'; camera 'none'; geolocation 'none'; gyroscope 'none'; magnetometer 'none'; microphone 'none'; payment 'none'; usb 'none'"
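# HSTS, mentioned earlier as being set via netlify.toml; the values below are a suggestion, adjust max-age to your own policy
Strict-Transport-Security = "max-age=31536000; includeSubDomains"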

When you are done, redeploy to Netlify and recheck your HTTP headers and your SSL Labs score again; you should get something like this:

qualys ssl lab test


Conclusion

Learning never stops, and the internet will keep evolving, so don't build solutions or web pages for yourself alone; build user-optimised pages. Nobody wants to wait an extra 5 seconds for a web page to load.

Clients want SEO-optimised web pages that help them get listed alongside competitors in search results.

So the next time you are building something, keep users in mind and search bots in mind, then tweak your way around to implement the client's requirements.

Stay inspired and keep learning; being a developer isn't a speedy process, it's a steady journey. You can also contribute to the Hugo open-source community.

If you have any issues, feel free to comment below.
