I replaced my privately used MacBook Air with an iPad Pro earlier this year, as the Mac wasn’t a good fit for the things I use a computer for in private. At home I mostly use it to watch videos or read articles, with the very occasional game (I very much prefer dedicated gaming consoles). Most of its time, though, is spent managing family photos and videos – and for that the iPad works much faster and more comfortably than my previous Mac.

I haven’t blogged much about all of these changes. Why? Because I set up my blog to be generated from a set of Markdown files using the popular Jekyll static site generator, which only runs on “real” computers: the last issue I had to solve for a true iPad-only life.

I considered moving back to a dynamic system like WordPress or Ghost, which generate pages on demand. I decided not to, for two main reasons. Firstly, software like WordPress needs to be updated regularly because new vulnerabilities keep being discovered. Secondly, since such blog systems run whenever a visitor requests a page (fetching the data from a database and rendering the page), they need at least a virtual machine to run in, and that comes with running costs as well.

Even without a lot of traffic I’d have to pay for a system that is mostly idling along, and it would require constant maintenance. All of this seems unreasonable to me. Additionally, should traffic suddenly spike, such a setup won’t even scale, most likely taking the blog offline. Static HTML files hosted on Amazon S3 have none of these drawbacks. Perfect!
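On the S3 side, all that is needed is a bucket configured for static website hosting. A minimal sketch with the AWS CLI, using the same placeholders as the pipeline configuration further down; index.html and 404.html are assumptions about what your Jekyll build produces, and the bucket additionally needs a policy that allows public reads:

aws s3 mb s3://bucket_name_production --region region_string
aws s3 website s3://bucket_name_production --index-document index.html --error-document 404.html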

For everybody interested in handling their own blog the same way, here is roughly how it works for me: I write a post on the iPad and push it to a Bitbucket Git repository, from where it gets published automatically.

Prerequisites

A couple of web services and applications are necessary to get started.

Services: Bitbucket (its Pipelines feature does the building), Amazon S3 for hosting, Docker Hub and Dropbox.

iPad Apps: Editorial for writing and Working Copy as the Git client.

macOS Apps: Docker Toolbox, needed once to build the Docker image.

Bitbucket Repository

Enable Pipelines for your repository and configure it by adding this bitbucket-pipelines.yml file. Replace docker_username, docker_repo_name, bucket_name_production, bucket_name_staging and region_string with the values from your setup.

# The Docker image the build runs in (see the next section)
image: docker_username/docker_repo_name
pipelines:
  # Any other branch: only verify that the site still builds
  default:
    - step:
        script:
          - jekyll build
  branches:
    # Drafts pushed to develop are synced to the staging bucket
    develop:
      - step:
          script:
            - jekyll build
            - aws s3 sync /opt/atlassian/pipelines/agent/build/_site s3://bucket_name_staging --region region_string
    # Everything merged to master goes to the production bucket
    master:
      - step:
          script:
            - jekyll build
            - aws s3 sync /opt/atlassian/pipelines/agent/build/_site s3://bucket_name_production --region region_string
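One detail the YAML above does not show is how the aws s3 sync steps authenticate. The usual approach, and an assumption on my part rather than something the file itself defines, is to add the standard AWS variables as secured repository variables in the Bitbucket Pipelines settings; they are exposed to the build as environment variables and the AWS CLI picks them up automatically, so no credentials land in the repository:

AWS_ACCESS_KEY_ID       # access key of an IAM user allowed to write to both buckets
AWS_SECRET_ACCESS_KEY   # the corresponding secret key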

I distinguish between the develop and master branches. This gives me the option to commit a draft to develop and preview how it renders via a second S3 bucket. If I am satisfied with it, I just merge it to master to publish it to the production blog. This staging environment is not mandatory; if you are confident about your writing straight away, you can also commit directly to master and push.
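Expressed as plain Git commands (Working Copy does the equivalent through its UI), a draft-then-publish cycle looks roughly like this; the file names are made up:

git checkout develop
git add _posts/2018-01-15-example-post.md assets/example-post/
git commit -m "Draft: example post"
git push origin develop            # Pipelines builds and syncs to the staging bucket

git checkout master
git merge develop
git push origin master             # Pipelines builds and syncs to the production bucket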

Docker Image

To create the necessary Docker image to build the Jekyll site, I used a MacBook I had at hand. If the iPad really is the only available computer, a temporary Amazon EC2 instance could be used instead (Linode or DigitalOcean are fine as well). Since I didn’t explore this option, I won’t explain it here.

To create the Docker image, we install the Docker Toolbox and start it. A Terminal window opens, in which we create our Dockerfile with our favourite editor. All we need to do is install the AWS command line tools and Jekyll.

The configuration installs Jekyll from RubyGems even though it is also available as a Debian package. The reason is that the Debian package is quite old (version 3.0 instead of 3.6 at the time of writing). The extra parameters for apt-get reduce the number of additional packages pulled in, to keep the Docker image small; big images can negatively affect the build time in Pipelines.

# The base image
FROM debian:jessie-slim

# Additionally required software to build the Jekyll site.
# Install and clean up in the same layer to keep the image small.
RUN apt-get update \
 && apt-get install --no-install-recommends --no-install-suggests -y \
      xvfb awscli ruby ruby-dev bundler build-essential \
 && apt-get clean

# Jekyll comes from RubyGems because the Debian package is outdated
RUN gem install jekyll:3.6.2 jekyll-paginate:1.1.0 --no-document \
 && gem sources --clear-all

With this we can build the Docker image, but we still need to make it accessible to the Bitbucket Pipeline. A good option is creating an account on Docker Hub, to which we can conveniently push the finished image. After having done so, we log in with our Docker account, then build and push the image to Docker Hub. Since I keep my image public, you can also use mine, but I can’t guarantee it will work for you: I might change it in the future to improve it.

docker login
docker build -t docker_username/docker_repo_name .
docker push docker_username/docker_repo_name
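Before relying on the image in Pipelines, it can be sanity-checked locally by running a build inside it. This is only a quick test and assumes the blog sources are in the current directory:

docker run --rm -v "$PWD":/build -w /build docker_username/docker_repo_name jekyll build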

Editorial and Workflow

With the pieces in place, all we need to do is write a post in Editorial. Adding images is as easy as dropping them into the assets/ folder that lives alongside the Editorial Markdown files in Dropbox and referencing them in the text. When done, we share the post with Working Copy to commit and push the change to Bitbucket. I started off with an existing workflow for sending a post via Working Copy and modified it to better suit my needs: Jekyll Publish.
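Referencing such an image in the post body is then plain Markdown; the path is just an example and depends on how the site is laid out:

![A photo taken on the trip](/assets/a-photo-taken-on-the-trip.jpg)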

My workflow prompts for tags, creates the Jekyll front matter and writes the article and its image files to the repository; the final step pushes everything to Bitbucket. Once this is done, the Bitbucket Pipeline kicks in, runs Jekyll to build the site and syncs the result to Amazon S3, where it is available immediately.
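The generated front matter itself is nothing exotic; for a typical Jekyll post it looks roughly like this (the exact fields depend on the theme and configuration):

---
layout: post
title: "Blogging from the iPad"
date: 2018-01-15 20:30:00 +0100
tags: [ipad, jekyll, aws]
---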

On the Bitbucket free tier, the available build time per account is limited to 50 minutes per month. Even if you write a post every day, this should be enough as long as the site isn’t huge. A build of my blog currently takes about 20 seconds, most of which is spent downloading the Docker image, so there is still room for improvement by creating a smaller image.

If you find this setup useful, I encourage you to adopt it. I’d also be very happy to hear about any improvements you make. Happy blogging without the downsides of a dynamic publishing system.