Alex’s Blog

I try to be smart, sometimes

Docker Container Building via GitHub Actions with Diun.

My last post about GitHub Actions left off with a Docker container being built whenever a commit was pushed to the repo. That's great if you are building a container for a project that lives in that same repo. For me, though, that's not the case. At least not for my Caddy image. My build process takes the official image builder and builds a custom Caddy binary with a few extra plugins. So when I discovered Diun I was like, OH.. YES!

What is Diun?

Docker Image Update Notifier is a CLI application written in Go and delivered as a single executable (and a Docker image) to receive notifications when a Docker image is updated on a Docker registry.

I came across this service called Diun while reading through the Discord server. It basically lets you configure which Docker images you want to monitor for updates and then notifies you when those updates land. Very useful if you want update notifications for a bunch of images you use.

What really piqued my interest about Diun was the fact that it can execute scripts when it detects an update. If you are familiar with the GitHub API, you might already see the gears spinning here. This post is going to get a bit technical, so bear with me.

Setting up Diun

First things first. We need to get Diun up and running so we can make sure it can see the Docker image we want to monitor. For this post I will be doing exactly what I did to monitor the official Caddy image on Docker Hub.

Before we begin, make sure you have the following tasks done and ready:

  • A working workflow created in your GitHub repository
  • A machine running Docker (preferably Linux) with docker-compose installed and updated.
  • A text editor either on the Docker host or local to you. I recommend nano for CLI and Visual Studio Code for desktop.

I am going to assume you are using SSH to access your machine, which you should be using anyway.

Step 1 – Create Diun files.

ssh to your Docker host and create a directory for Diun’s configuration and data files.

ssh user@your-docker-host
mkdir -p /diun/data

Verify the directory creation:

ls /diun && ls /diun/data

Once we have successfully created the directory for Diun, we can create the config files.

Step 2 – Configuring Diun

The first file we will tackle is the main configuration file. This file tells Diun where to send your notifications, among other things. I will be creating the file using VSC, but you can do it via a CLI editor. Just follow basic YAML syntax.

Create a file called diun.yml in the /diun directory, drop the below code block into it, and save the file.

    watch:
      workers: 20
      schedule: "0 */6 * * *"

    providers:
      file:
        filename: /data/images.yml

    notif:
      script:
        cmd: "sh"
        args:
          - "/data/dispatch.sh" # point this at whatever you name your script

Basically this file tells Diun when to check for updates (watch), where to find the images to check (providers), and how to notify you when an update is found (notif). In this case, we are providing an external file called images.yml where we specify the images to watch, and when an update is found we want Diun to execute a script located in /data (I will be calling mine dispatch.sh). Diun has lots of good docs, so check them out if you want some more advanced configurations.

Now that we have the primary config file created we can create our outside images file.

Create a new YAML file in /diun/data (which the container sees as /data). For this example I will be calling it images.yml.

touch /diun/data/images.yml

Edit images.yml and specify the Docker image(s) to monitor. Again, for this example it will be the official Caddy image hosted on Docker Hub.

- name: caddy:latest

You can list as many images as you want inside the file, and each entry supports some additional properties.
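If you are curious, a sketch of a file with a few of those extra properties might look like this (property names taken from Diun's file provider docs; double-check them against the version you are running):

```yaml
- name: caddy:latest
- name: crazymax/diun
  watch_repo: true   # watch the repository's tags, not just this one
  max_tags: 10       # cap how many tags Diun analyzes
  include_tags:
    - ^4\..*         # only consider tags starting with "4."
```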

Once the images.yml file is updated, we can deploy Diun.

Step 3 – Deploying Diun

Copy the below docker-compose.yml file into the /diun directory.

version: "3.5"

services:
  diun:
    image: crazymax/diun:latest
    volumes:
      - "/path/to/diun/data:/data"
      - "/path/to/diun/diun.yml:/diun.yml:ro"
      - "/var/run/docker.sock:/var/run/docker.sock"
    environment:
      - "TZ=America/New_York"
      - "LOG_LEVEL=info"
      - "LOG_JSON=false"
    restart: always

Edit the volume mappings to point at your diun/data directory and diun.yml file. Save the file and run the following in the /diun directory:

docker-compose up -d

Once you see a container ID, tail the container log to make sure it picked up the image file and settings correctly.

docker logs -f diun_diun_1

With Diun running, we can now switch gears and get our workflow trigger setup.

Creating a Workflow Dispatch Trigger

If you are not familiar with dispatches on GitHub, you can read more about them here.

Step 1 – Update the Workflow

First thing we need to do is update the workflow YAML file to support the dispatch trigger. Edit your workflow config file in .github/workflows/yourfile.yml and add the following information to the on: section.

    repository_dispatch:
      types: [<customvalue>]

Replace <customvalue> with the value you want to use to trigger the workflow. Mine matches the name of the workflow.
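Putting it together, the on: section of a workflow that builds on pushes and on dispatch events would look roughly like this (the push trigger is just an example here; keep whatever triggers your workflow already has):

```yaml
on:
  push:
    branches:
      - master
  repository_dispatch:
    types: [<customvalue>]
```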

Step 2 – Creating a Personal Access Token.

In order to call the workflow via the GitHub API, we need to use a PAT (Personal Access Token). Navigate to your GitHub profile's settings/tokens page.

Click on Generate new token.

On the New personal access token page, edit the note field and name the token something memorable. Then check the repo scope box, scroll to the bottom, and click Generate token.

After you click generate you will be returned to the Personal access tokens page with your new token highlighted in green. Copy the token somewhere safe as it can’t be accessed once you leave this page.

By the way, the token shown in the script below was a real access token. It was deleted right after creating this post. 🙂

Step 3 – Creating the Script

Now that we have the workflow setup and a PAT, we can create the script that will run when Diun detects an image update. To do this, return to your Docker host and create the file in the /diun/data folder. I will be calling the script dispatch.sh.

cd /diun/data && nano dispatch.sh

Paste the following into the script file:

curl -H "Accept: application/vnd.github.v3+json" \
    -H "Authorization: token 2fead8b03246f8f7315d9a8dd64f8130d8be72c7" \
    --request POST \
    --data '{"event_type": "<your custom value here>"}' \
    https://api.github.com/repos/USERNAME/REPOSITORYNAME/dispatches

Then replace the following values:
  • 2fead8b03246f8f7315d9a8dd64f8130d8be72c7 with your PAT that you created earlier.
  • <your custom value here> with the value you specified earlier in the on: types: section of your workflow yaml file.
  • USERNAME with your GitHub username.
  • REPOSITORYNAME with the name of the repository where your workflow file is located.
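As a side note, hardcoding a PAT in a script always makes me a little nervous. A variation I'd suggest (my own sketch, not anything Diun mandates; the filenames are placeholders) is to keep the token in a separate file and fail fast if it's missing:

```shell
#!/bin/sh
# dispatch() posts a repository_dispatch event to GitHub.
# Hypothetical hardened variant of the script above: the PAT lives in
# its own file instead of being pasted into the script body.
dispatch() {
    token_file="$1"   # e.g. /data/github_token, containing only the PAT
    event_type="$2"   # the value from your workflow's on: types: list

    if [ ! -r "$token_file" ]; then
        echo "error: token file $token_file is not readable" >&2
        return 1
    fi

    # -w '%{http_code}' prints the HTTP status; GitHub answers 204 on success
    curl -s -o /dev/null -w '%{http_code}' \
        -H "Accept: application/vnd.github.v3+json" \
        -H "Authorization: token $(cat "$token_file")" \
        --request POST \
        --data "{\"event_type\": \"$event_type\"}" \
        "https://api.github.com/repos/USERNAME/REPOSITORYNAME/dispatches"
}
```

You would then call it at the bottom of the script as, say, `dispatch /data/github_token "<your custom value here>"`, and `chmod 600` the token file so only root can read it.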

Save and close the file. Now set the file to be executable by Diun (dispatch.sh here being whatever you named your script).

chmod +x ./dispatch.sh
chown root:root ./dispatch.sh

Now since Diun’s image is built using Alpine as its base, we need to install curl as it doesn’t come with it by default. To do this first enter the container’s run environment.

docker exec -it diun_diun_1 /bin/sh

Now install bash and curl.

apk add --no-cache --upgrade bash
apk add --no-cache curl

When the commands are done, you can type exit to leave the container environment.

Note: You will have to do this each time you recreate the container unless you build your own image of Diun with curl included. Stay tuned as I might end up doing this.
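If you do want to bake curl in, a minimal Dockerfile layered on top of the official image should do it (an untested sketch on my part):

```dockerfile
# Hypothetical custom Diun image with bash and curl preinstalled
FROM crazymax/diun:latest
RUN apk add --no-cache bash curl
```

Build it, then point the image: line in docker-compose.yml at your custom tag instead of crazymax/diun:latest.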

Step 4 – Testing the Script

Now that we have everything ready, let's test the script to make sure it's working. Run the following command in the /diun/data folder.

sudo sh ./dispatch.sh

If you have your Actions tab open on your repository, refresh and you should see the workflow was kicked off after calling the script.


Once the script is confirmed working, all we have to do now is wait for an image update from Caddy's official repo and see if Diun plus the script do their thing. I will update this post as soon as an update happens, and make changes if it doesn't work, though it should be working!

If you have any questions, feel free to reach out via email! Also, if you like the blog consider supporting me!

Now off to migrate more things from my local Jenkins instance…

Written by Alexander Henderson

I do IT things sometimes.
