
File the following under: “Mac Nerd”

Automator publishing to S3

December 20, 2016

UPDATE (2017-12-13): I wrote about a new strategy, using AWS's own command-line interface to update the S3 bucket. It's much better, and this approach had some problems anyway.

I use Harp JS for this site. (Harp is a static site generator with almost the perfect balance between simplicity and customization, and I hope it lives for a long time. Go check it out.) I then host the static HTML files in an Amazon S3 bucket, a fairly common setup these days for static sites.

The publishing flow involves running Harp's compile command to generate the static files, then uploading all the HTML—and sometimes a few static assets—to S3 via an FTP client. It's not terribly tedious, but I'd like to make the publishing step as frictionless as possible to remove any excuses my lazy brain has against writing more.

What I'm going to try here is to use ExpanDrive to mount my S3 bucket as a volume on my Mac, then use Automator to rsync the local Harp-generated files to the mounted S3 volume. It does remove an extra manual step, but perhaps more importantly, it removes the need to think about which files changed. Rsync can figure that out and upload only the changed files. So my full flow will be:

  • Write the post in Markdown and update the data.json file in Harp
  • Run harp compile from the command line within the Harp directory
  • Double-click the desktop Automator script to trigger the sync to S3

My hope is that this is easy enough that I’ll write a little more.

As for Automator, here’s the action I used:

[Screenshot: My super basic Automator script for publishing to S3]

Nothing much to it. Once the S3 bucket is mounted on my iMac via ExpanDrive (and it usually is), I can use it as the target in the rsync command. The www directory is where Harp outputs the static site files after a compile.
