<?xml version="1.0" encoding="UTF-8"?>
<?xml-stylesheet href="/vendor/feed/atom.xsl" type="text/xsl"?>
<feed xmlns="http://www.w3.org/2005/Atom" xml:lang="en-US">
    <id>https://mackhankins.com/feed</id>
    <link href="https://mackhankins.com/feed" rel="self" />
    <title><![CDATA[Stuff & Things — Mack Hankins]]></title>
    <subtitle>Thoughts on development, tools, and building stuff.</subtitle>
    <updated>2026-04-16T11:32:09+00:00</updated>
    <entry>
        <title><![CDATA[Cheap Cold Storage Backups with AWS Glacier Deep Archive and TrueNAS]]></title>
        <link rel="alternate" href="https://mackhankins.com/blog/cheap-cold-storage-backups-with-aws-glacier-deep-archive-and-truenas" />
        <id>https://mackhankins.com/4</id>
        <author>
            <name><![CDATA[Mack Hankins]]></name>
        </author>
        <summary type="html">
            <![CDATA[How to set up a monthly off-site backup from TrueNAS to AWS S3 Glacier Deep Archive for about $1/TB/month using Cloud Sync Tasks.]]>
        </summary>
        <updated>2026-04-16T11:32:09+00:00</updated>
    </entry>
    <entry>
        <title><![CDATA[A Simple Bash Script for Multi-Server MySQL Backups]]></title>
        <link rel="alternate" href="https://mackhankins.com/blog/a-simple-bash-script-for-multi-server-mysql-backups" />
        <id>https://mackhankins.com/8</id>
        <author>
            <name><![CDATA[Mack Hankins]]></name>
        </author>
        <summary type="html">
            <![CDATA[A Bash script that automates MySQL backups across multiple servers using mysqldump, with daily and monthly snapshots, JSON configuration, retention cleanup, and simple restore workflows.]]>
        </summary>
        <updated>2026-04-14T07:10:21+00:00</updated>
    </entry>
    <entry>
        <title><![CDATA[Automated Multi-Server PostgreSQL Backups with a Simple Bash Script]]></title>
        <link rel="alternate" href="https://mackhankins.com/blog/automated-multi-server-postgresql-backups-with-a-simple-bash-script" />
        <id>https://mackhankins.com/5</id>
        <author>
            <name><![CDATA[Mack Hankins]]></name>
        </author>
        <summary type="html">
            <![CDATA[A JSON-configured Bash script that backs up every database on every PostgreSQL server you own, with daily/monthly rotation and automatic retention cleanup.]]>
        </summary>
        <updated>2026-04-09T05:00:00+00:00</updated>
    </entry>
    <entry>
        <title><![CDATA[Dynamic OG Images in Laravel with Intervention Image]]></title>
        <link rel="alternate" href="https://mackhankins.com/blog/dynamic-og-images-in-laravel-with-intervention-image" />
        <id>https://mackhankins.com/7</id>
        <author>
            <name><![CDATA[Mack Hankins]]></name>
        </author>
        <summary type="html">
            <![CDATA[Generate branded social cards for your blog posts on the fly using Intervention Image v4, with caching, custom fonts, and zero external services.]]>
        </summary>
        <updated>2026-04-05T09:51:32+00:00</updated>
    </entry>
    <entry>
        <title><![CDATA[Replacing Two Legacy Systems with a Single Python Service]]></title>
        <link rel="alternate" href="https://mackhankins.com/blog/replacing-two-legacy-systems-with-a-single-python-service" />
        <id>https://mackhankins.com/1</id>
        <author>
            <name><![CDATA[Mack Hankins]]></name>
        </author>
        <summary type="html">
            <![CDATA[For years, the Mississippi Department of Archives and History relied on two separate systems to track changes on the preservation storage array: a bash script (`pres-audit.sh`) that logged file changes, and a multi-server ZFS snapshot pipeline (`pres_snap`) that involved TrueNAS cron jobs, a Redis queue, and a dedicated Ubuntu processing server. They worked, mostly, but the complexity was becoming a liability. When something broke, debugging meant jumping between three different machines and piecing together what happened.]]>
        </summary>
        <updated>2026-04-02T09:19:06+00:00</updated>
    </entry>
    <entry>
        <title><![CDATA[Process Files 4.0 — Batch File Processing for Digital Archives]]></title>
        <link rel="alternate" href="https://mackhankins.com/blog/process-files-40-batch-file-processing-for-digital-archives" />
        <id>https://mackhankins.com/2</id>
        <author>
            <name><![CDATA[Mack Hankins]]></name>
        </author>
        <summary type="html">
            <![CDATA[How I built a modular Bash toolkit to convert messy archive collections into clean, web-ready access copies.]]>
        </summary>
        <updated>2026-03-26T10:02:19+00:00</updated>
    </entry>
</feed>
