Using S3 to serve the blog images

After the successful setup of Hugo that was done in the first post, I was thinking about how to serve images. It’s pretty common to use S3 to serve static content for blogs, so I figured I’d give it a try.

We will do this in a couple of steps:

  1. Set up an S3 bucket on AWS
  2. Add a DNS CNAME to hide the S3 links at your registrar
  3. Set up a GitHub Action step to upload images to S3
  4. And then it’s done!

1. Set up an S3 bucket on AWS

First off, we will log in to the AWS Management Console and go to the S3 menu. The console can be a bit difficult to navigate, but hopefully you will find it.

Here you create a bucket. Name it to match the hostname you plan to serve images from (an images subdomain of your own domain) — this is what will let you hide that you are serving images from S3 later. Make sure the bucket is public (so that you can serve its contents to users browsing your site). AWS will give you warnings in this step, but it’s fine.
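To make the bucket contents publicly readable, you can attach a bucket policy along these lines (a sketch — `your-bucket-name` is a placeholder for the actual bucket name):

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "PublicReadGetObject",
      "Effect": "Allow",
      "Principal": "*",
      "Action": "s3:GetObject",
      "Resource": "arn:aws:s3:::your-bucket-name/*"
    }
  ]
}
```

This grants anonymous read access to every object in the bucket, which is exactly what you want for public blog images.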

Next step is to create an IAM user to be able to upload to your bucket with the GitHub Action. Select IAM in the services menu at the top. The steps to actually create a user with permissions are a bit convoluted, so please follow this link and come back when you’re done.
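For reference, the permissions the user needs can be kept quite narrow. A sketch of such a policy (again, `your-bucket-name` is a placeholder):

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": [
        "s3:ListBucket",
        "s3:PutObject",
        "s3:PutObjectAcl",
        "s3:DeleteObject"
      ],
      "Resource": [
        "arn:aws:s3:::your-bucket-name",
        "arn:aws:s3:::your-bucket-name/*"
      ]
    }
  ]
}
```

Note that `s3:ListBucket` applies to the bucket itself while the object actions apply to `/*`; `s3:PutObjectAcl` is needed because the sync in step 3 sets a public-read ACL on uploads.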

Now we hopefully have a user with upload permissions, an access key, and a secret. These will be used in step 3.

2. Add a DNS CNAME to hide the S3 links at your registrar

To have the images you have on S3 served up looking like they are coming from your own domain, you need to set up a DNS CNAME for images at your registrar (Cloudflare, Namecheap, Google Domains, or whatever you might have) pointing to your bucket’s S3 endpoint.

This will cause your images subdomain to resolve to the bucket.
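In zone-file terms, the record looks something like this (both hostnames here are placeholders — use your own subdomain and your bucket’s actual S3 endpoint):

```
images.example.com.  3600  IN  CNAME  example-bucket.s3.amazonaws.com.
```

For the CNAME trick to work with S3, the bucket name needs to match the subdomain, which is why the naming in step 1 matters.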

3. Set up a GitHub Action step to upload images to S3

Now for some more magic. We want any images pushed to the s3images directory in our Hugo site sources repo to be copied to S3. This can be accomplished by adding some more code to the main.yml that we created in the first post. Add this to the bottom:

	  - name: Images
	    uses: jakejarvis/s3-sync-action@master
	    with:
	      args: --acl public-read --follow-symlinks --delete
	    env:
	      AWS_S3_BUCKET: ${{ secrets.AWS_S3_BUCKET }}
	      AWS_ACCESS_KEY_ID: ${{ secrets.AWS_ACCESS_KEY_ID }}
	      AWS_SECRET_ACCESS_KEY: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
	      SOURCE_DIR: 's3images'

For this to work properly, you will need to add the access key and the secret from step 1 as repository secrets in GitHub (Settings → Secrets), named AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY, along with AWS_S3_BUCKET set to your bucket name.

4. And then it’s done!

Now, whenever you add an image to or remove one from the s3images directory and push to GitHub, it will be synced to your S3 bucket.

And then you can just use Markdown image links or Hugo figure shortcodes to reference them.
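For example (the subdomain and image path are placeholders — substitute your own):

```markdown
![A coffee mug](https://images.example.com/coffee-mug.jpg)

{{< figure src="https://images.example.com/coffee-mug.jpg" title="A coffee mug" >}}
```

The first line is a plain Markdown image; the second uses Hugo’s built-in figure shortcode, which also renders a caption from the title.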

Take care!
