A bare-bones, CRON-driven job queue.

I wanted a simple way to queue jobs to be run in the background without having to run a separate daemon. I also wanted it to be super flexible so I could queue any type of job (shell, Python, Node, &c.).

MiniQ is what I came up with. With just 3 files, MiniQ lets you queue jobs with the miniq.py executable, and the process_jobs.py script runs periodically to execute the queued jobs in the background.

How it works

When you log a job with the miniq.py executable, it parses the job name and any additional arguments that follow.

Then, it creates a job tracking file in the ${MINIQ_ROOT}/queue directory. This JSON file contains the job to be executed and the arguments to be passed.
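The exact schema of the tracking file is an implementation detail of miniq.py, but conceptually it pairs a job type with the arguments to pass. Here is a sketch of what one might contain (the field names and the /tmp/miniq-demo root are assumptions for illustration, not the actual format):

```shell
#!/bin/sh
# Sketch only: the real tracking-file layout is defined by miniq.py;
# the field names below are assumptions for illustration.
MINIQ_ROOT=/tmp/miniq-demo   # demo root; normally your real MINIQ_ROOT
mkdir -p "${MINIQ_ROOT}/queue"

cat > "${MINIQ_ROOT}/queue/example.json" <<'EOF'
{
  "job": "notify",
  "args": ["Title", "This is a message!"]
}
EOF
```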

Periodically, the process_jobs.py script runs as a CRON job. This script searches the ${MINIQ_ROOT}/queue directory for job tracking files. If it finds any, it marks them as claimed and executes them one-by-one.

When a job is executed, its executable is looked up as ${MINIQ_ROOT}/jobs/${job_type}.job. This file is executed and passed the arguments stored in the job file. The job executable can be any type of executable, as long as the OS can run it as a command.
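In shell terms, the dispatch step amounts to the following sketch (process_jobs.py is the real implementation; the stub job and the /tmp/miniq-demo root here are stand-ins for illustration):

```shell
#!/bin/sh
# Simplified sketch of the dispatch step; process_jobs.py is the real
# implementation. The demo root and stub job below are for illustration.
MINIQ_ROOT=/tmp/miniq-demo
job_type="notify"

# Create a stub job executable so the sketch is self-contained:
mkdir -p "${MINIQ_ROOT}/jobs"
printf '#!/bin/sh\necho "notify ran with: $@"\n' > "${MINIQ_ROOT}/jobs/${job_type}.job"
chmod +x "${MINIQ_ROOT}/jobs/${job_type}.job"

# Resolve the executable from the job type and pass along the stored args:
"${MINIQ_ROOT}/jobs/${job_type}.job" Title "This is a message!"
```

Running this prints `notify ran with: Title This is a message!`, which is exactly the shape of call the processor makes for each claimed tracking file.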

Once the executable finishes, the job tracking file is erased.


Since MiniQ is file-system driven, you can run it on a single machine, or spread across multiple machines using a network file system like NFS or SSHFS.

To get started, clone this repo to a safe location:

git clone https://code.garrettmills.dev/garrettmills/miniq

On the client side, you can make the executable easier to use by wrapping it in a shell script.

Create the file $HOME/.local/bin/miniq with the following contents:


#!/bin/sh
/path/to/miniq/src/miniq.py "$@"

Now, make it executable:

chmod +x $HOME/.local/bin/miniq

Now, you should be able to log jobs using the miniq command. However, in order for the jobs to be run, we need to configure the system to run the process_jobs.py executable periodically.

This can be done using whatever timer you prefer. In my case, I'm going to set it up as a CRON job, but you can use SystemD timers, or anything else. For example, I added the following CRON entry with crontab -e:

* * * * * /path/to/miniq/src/process_jobs.py

This will run the job processor every minute.

Changing the MiniQ Root Dir

You can tell MiniQ where to look for job definitions and tracking files by setting the MINIQ_ROOT environment variable wherever the miniq.py or process_jobs.py executables will be run.
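For example, to keep everything under /srv/miniq (a path chosen here purely for illustration), set the variable for both the client shell and the CRON entry:

```shell
# In your shell profile (e.g. ~/.profile), for the miniq client:
export MINIQ_ROOT=/srv/miniq

# And in the crontab entry, so the processor uses the same root:
# * * * * * MINIQ_ROOT=/srv/miniq /path/to/miniq/src/process_jobs.py
```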


Let's define a job to send a push notification. Create the file ${MINIQ_ROOT}/jobs/notify.job:

#!/bin/sh
# Map the positional arguments passed by the job processor to named
# variables. NOTIFY_TOKEN is assumed to be set elsewhere (e.g. in the
# environment); the default priority of 5 is just an example.
NOTIFY_TITLE="$1"
NOTIFY_MESSAGE="$2"
NOTIFY_PRIORITY="${3:-5}"

curl -X POST --data "title=$NOTIFY_TITLE" --data "message=$NOTIFY_MESSAGE" --data "priority=$NOTIFY_PRIORITY" "https://notifications.my.domain/message?token=$NOTIFY_TOKEN"

The contents of the job executable are not important. This is just one I had as an example.

Now, make sure the job is executable:

chmod +x "${MINIQ_ROOT}/jobs/notify.job"

Once we've done that, we can log jobs using the miniq executable we set up earlier:

miniq notify Title "This is a message!"
miniq notify Title "This is also a message!"
miniq notify Title "This is a message with optional priority argument!" 9

Once the job processor runs, it will execute our notify.job script 3 times, passing in the different arguments.