
miniq

A bare-bones, CRON-driven job queue.

I wanted a simple way to queue jobs to be run in the background without having to run a separate daemon. I also wanted it to be super flexible so I could queue any type of job (shell, Python, Node, &c.).

MiniQ is what I came up with. With just 3 files, MiniQ lets you queue jobs using the miniq.py executable, while the process_jobs.py script runs periodically and executes the queued jobs in the background.

How it works

When you log a job with the miniq.py executable, it parses the job name and any additional arguments that follow.

Then, it creates a job tracking file in the ${MINIQ_ROOT}/queue directory. This JSON file contains the job to be executed and the arguments to be passed.
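For illustration only, a queued notify job might leave behind a tracking file along these lines (the exact file name and field layout are determined by miniq.py, so treat this as an assumption rather than the actual format):

$ cat "${MINIQ_ROOT}/queue/<some-job-id>.json"
{"type": "notify", "args": ["Title", "This is a message!"]}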

Periodically, the process_jobs.py script is run as a CRON job. This script searches the ${MINIQ_ROOT}/queue directory for job tracking files. If it finds any, it marks them as claimed and executes them one-by-one.

When a job is executed, its executable is found at ${MINIQ_ROOT}/jobs/${job_type}.job. This file is executed and passed the arguments stored in the job file. The job executable can be any type of executable, as long as the OS can run it as a command.
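For example, assuming the notify job defined later in this README, the processor would effectively run something like:

"${MINIQ_ROOT}/jobs/notify.job" "Title" "This is a message!"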

Once the executable finishes, the job tracking file is erased.

Installation

Since MiniQ is file-system driven, you can run it on a single machine or spread it across multiple machines using a network file system like NFS or SSHFS.
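For instance, if the queue lives on another host, you could mount it locally with SSHFS before pointing MiniQ at it via MINIQ_ROOT (described below). The host name and paths here are placeholders:

sshfs user@queue-host:/srv/miniq /srv/miniq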

To get started, clone this repo to a safe location:

git clone https://code.garrettmills.dev/garrettmills/miniq

On the client side, you can make the executable easier to use by wrapping it in a shell script.

Create the file $HOME/.local/bin/miniq with the following contents:

#!/bin/sh

exec /path/to/miniq/src/miniq.py "$@"

Now, make it executable:

chmod +x $HOME/.local/bin/miniq

Now, you should be able to log jobs using the miniq command. However, in order for the jobs to be run, we need to configure the system to run the process_jobs.py executable periodically.

This can be done using whatever timer you prefer. In my case, I'm going to set it up as a CRON job, but you can use systemd timers or anything else. For example, I added the following CRON entry with crontab -e:

* * * * * /path/to/miniq/src/process_jobs.py

This will run the job processor every minute.
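If you'd rather use systemd, a roughly equivalent setup is a oneshot service plus a timer. This is a minimal sketch; the unit names and the install path are assumptions:

# /etc/systemd/system/miniq.service
[Unit]
Description=MiniQ job processor

[Service]
Type=oneshot
ExecStart=/path/to/miniq/src/process_jobs.py

# /etc/systemd/system/miniq.timer
[Unit]
Description=Run the MiniQ job processor every minute

[Timer]
OnCalendar=*-*-* *:*:00
Persistent=true

[Install]
WantedBy=timers.target

Enable it with systemctl enable --now miniq.timer.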

Changing the MiniQ Root Dir

You can tell MiniQ where to look for job definitions and tracking files by setting the MINIQ_ROOT environment variable wherever the miniq.py or process_jobs.py executables will be run.
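For example, to keep everything under /srv/miniq (a placeholder path), you might export the variable in your shell profile for the miniq command, and set it inline on the CRON entry for the processor:

export MINIQ_ROOT=/srv/miniq

* * * * * MINIQ_ROOT=/srv/miniq /path/to/miniq/src/process_jobs.py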

Example

Let's define a job to send a push notification. Create the file ${MINIQ_ROOT}/jobs/notify.job:

#!/bin/bash

NOTIFY_TITLE="$1"
NOTIFY_MESSAGE="$2"
NOTIFY_PRIORITY="${3:-5}"
NOTIFY_TOKEN="some-gotify-token"

curl -X POST --data "title=$NOTIFY_TITLE" --data "message=$NOTIFY_MESSAGE" --data "priority=$NOTIFY_PRIORITY" "https://notifications.my.domain/message?token=$NOTIFY_TOKEN"

The contents of the job executable aren't important; this is just one I had on hand as an example.

Now, make sure the job is executable:

chmod +x "${MINIQ_ROOT}/jobs/notify.job"

Once we've done that, we can log jobs using the miniq executable we set up earlier:

miniq notify Title "This is a message!"
miniq notify Title "This is also a message!"
miniq notify Title "This is a message with optional priority argument!" 9

Once the job processor runs, it will execute our notify.job script 3 times, passing in the different arguments.