How the blog was built part 4 - migration

Photo by Joshua Woroniecki / Unsplash

I finally did it. I finally went and bought a MacBook, and the M1 Air at that. The first three parts in this series document how I built the static Ghost blog in a Windows environment, and now I have macOS to contend with. In this part I will show how the code has changed and the issues I encountered in rebuilding the blog on a MacBook.

Let's look at the tech installed first and then the changes required to the blog code.

Git

A new platform needs Git. First I had to install Homebrew, https://brew.sh/, and then install Git for macOS, https://git-scm.com/download/mac.
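
For reference, the installation boils down to two commands, the first being the standard Homebrew install one-liner from brew.sh:

# Install Homebrew, then use it to install Git
/bin/bash -c "$(curl -fsSL https://raw.githubusercontent.com/Homebrew/install/HEAD/install.sh)"
brew install git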

Docker

I needed Docker for Mac for my local Ghost blog. To test it, I thought I would build the Docker tutorial image (docker build -t docker101tutorial .).

I had this Jinja package version issue, AttributeError: module 'jinja2' has no attribute 'contextfilter'. To fix it, you set the Jinja version number to 3.0.3 in the requirements.txt file.

jinja2==3.0.3
requirements.txt

Running the Docker voting sample

There was also an issue when running the Docker example voting app, something like Ports are not available: listen tcp 0.0.0.0:5000: bind: address already in use.

To solve this I followed this advice to find out what was using port 5000: lsof -i:5000.

Then to be honest I don't recall what I disabled. I wish I had noted that - Gah! If you do have this issue please let me know your solution so I can record it here.
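
If you do hit it, the check itself is simple. On macOS Monterey the usual suspect for port 5000 is the AirPlay Receiver (the ControlCenter process), which can be switched off under System Preferences > Sharing, though I can't confirm that was what I disabled:

# Show which process is listening on port 5000
lsof -i :5000
# On macOS Monterey this often lists ControlCe (Control Center, the AirPlay Receiver)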

GitHub token

I use VSCode for my blog code and needed to connect it to my GitHub account. To do this you must set up a GitHub token; just follow the steps in the GitHub documentation.

Now for the changes required when running the code.

The Dockerfile

Issue Service 'app' failed to build when running docker compose

I ran docker compose up and received this error:

 => ERROR [ 2/24] RUN yum -y -q install which curl wget gettext patch gcc-c++ make git-core bzip2 unzip gcc python3-devel python3-setuptools redhat-rpm-config sudo  &&     yum -y -q clean all   0.8s
------
 > [ 2/24] RUN yum -y -q install which curl wget gettext patch gcc-c++ make git-core bzip2 unzip gcc python3-devel python3-setuptools redhat-rpm-config sudo  &&     yum -y -q clean all:
#5 0.575 Error: Failed to download metadata for repo 'appstream': Cannot prepare internal mirrorlist: No URLs in mirrorlist
------
executor failed running [/bin/sh -c yum -y -q install which curl wget gettext patch gcc-c++ make git-core bzip2 unzip gcc python3-devel python3-setuptools redhat-rpm-config sudo  &&     yum -y -q clean all]: exit code: 1
ERROR: Service 'app' failed to build : Build failed

This was because CentOS 8 had reached end of life and its mirrors were taken offline, so I added the following snippet to the Dockerfile. The changes repoint the package installs to the vault archive until I eventually move to a different Linux distro.

# Comment out the dead mirrorlist entries and point baseurl at the vault archive.
# (A separate `RUN cd` has no effect on later RUN steps, so absolute paths are used.)
RUN sed -i 's/mirrorlist/#mirrorlist/g' /etc/yum.repos.d/CentOS-* && \
    sed -i 's|#baseurl=http://mirror.centos.org|baseurl=http://vault.centos.org|g' /etc/yum.repos.d/CentOS-*

Python install issue

Installing Python in the Dockerfile used the default python3 packages, which give version 3.6 and are now out of date. This meant I had to upgrade to Python 3.8 in the package install.

RUN yum -y -q install which curl ... python38-devel ...

docker-compose.yml

Issue platform required on macOS

I don't recall the exact issue, but as I'm running on a MacBook M1 (an arm64 machine) and the image is built for x86_64, I had to specify the platform explicitly so Docker knows to emulate it. The composition should still work on a Windows machine.

services:
  app:
    platform: linux/x86_64 
    image: ...
docker-compose.yml
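
If you want to confirm the emulation is in effect once the service is up, a quick check is to print the container's architecture:

docker compose exec app uname -m
# prints x86_64 when the container is emulated on an M1 (arm64) host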

Issue config could not be found

The backup script needs to sync files to S3, and I was getting the message The config [ghost-blogger] could not be found. This meant I had to add an entry to the volumes list mapping where the .aws folder lives on macOS into the container.

Warning: composing down and then up will reset your local Ghost installation and you will need to restore your last backup.

    volumes:
      ...
      - ~/.aws:/root/.aws:ro
      ...
docker-compose.yml
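
To check the profile is then visible inside the container, something like this should work, assuming the AWS CLI is installed in the app container:

docker compose exec app aws configure list --profile ghost-blogger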

backup_website.sh

As I was now using macOS, I converted the Windows-only backup_website.bat file to a shell script that can be used on any platform.

The notable changes were getting the datetime, the Cypress UI test and copying the JSON files. I also changed the name of the profile in the .as-a.ini file (which holds the Cypress password) to binarydreams-blog.

Not related to using macOS, but I also separated the content and profile UI tests into their own files because otherwise the Export content download would not happen.

Then those exported data files needed renaming with the same datetime before being copied to the backup folder.

#!/bin/bash

GHOST_DOMAIN=binarydreams.biz
AWS_REGION=eu-west-1
# Where Cypress saves the exported JSON files (assumed: the default Cypress downloads folder)
GHOST_OUTPUT=./cypress/downloads/

# Get the current date time as GMT - No daylight savings time.
DATETIME=`date +"%Y-%m-%d-%H-%M-%S"`

echo "${DATETIME}"

echo "Back up content folder first"
docker compose exec -T app /usr/local/bin/backup.sh ${DATETIME}

echo "Run the UI test to export the content as a JSON file and return to this process"
npx as-a binarydreams-blog cypress run --spec "cypress/e2e/ghost_export_content.cy.js,cypress/e2e/ghost_export_profile.cy.js" --env timestamp=${DATETIME}

echo "Rename the exported JSON files with a timestamp"
find $GHOST_OUTPUT -iname 'binarydreams.ghost*' -exec mv {} ${GHOST_OUTPUT}content.ghost.$DATETIME.json \;
find $GHOST_OUTPUT -iname 'profile.ghost*' -exec mv {} ${GHOST_OUTPUT}profile.ghost.$DATETIME.json \;

echo "Copy the JSON files to the backup folder"
cp ./cypress/downloads/*.json ./data/backup/$DATETIME/

echo "Sync back up files to S3"
aws s3 sync ./data/backup/ s3://$GHOST_DOMAIN-backup --region $AWS_REGION --profile ghost-blogger
backup_website.sh

Then to run it in a terminal you enter sh ./backup_website.sh.

While testing the Cypress script I also found a major version of Cypress had been released, which changed quite a few things.

Issue testing the backup script Cannot find file with env setting

This message appeared because I didn't have the as-a.ini file saved with the full-stop prefix, even though that worked on Windows. It must be at /Users/jon-paul.flood/.as-a.ini or /Users/jon-paul.flood/.as-a/.as-a.ini.

Issue running Cypress script spawn cypress ENOENT

I ran the backup script and received this error:

Error: spawn cypress ENOENT
    at ChildProcess._handle.onexit (node:internal/child_process:283:19)
    at onErrorNT (node:internal/child_process:476:16)
    at process.processTicksAndRejections (node:internal/process/task_queues:82:21) {
  errno: -2,
  code: 'ENOENT',
  syscall: 'spawn cypress',
  path: 'cypress',
  spawnargs: [ 'run' ]
}
cypress exit code -2
Error: spawn cypress ENOENT
    at ChildProcess._handle.onexit (node:internal/child_process:283:19)
    at onErrorNT (node:internal/child_process:476:16)
    at process.processTicksAndRejections (node:internal/process/task_queues:82:21) {
  errno: -2,
  code: 'ENOENT',
  syscall: 'spawn cypress',
  path: 'cypress',
  spawnargs: [ 'run' ]
}

The cause was that I had the profile in the .as-a.ini file set to . (a full stop) and NOT the actual profile name of binarydreams-blog.
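
For reference, the .as-a.ini ends up looking something like this; the variable name here is only an illustration of whatever the Cypress tests actually read:

; ~/.as-a.ini - the section name must match the profile passed to `npx as-a`
[binarydreams-blog]
GHOST_PASSWORD=your-password-here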

update_website.sh

This also replaced a Windows-only batch file, update_website.bat, but the only differences are the bash shebang and echo replacing rem.

#!/bin/bash

echo "Generate static files"
docker-compose exec -T app /usr/local/bin/generate_static_content.sh

echo "Sync code to S3"
docker-compose exec -T app /usr/local/bin/upload_static_content.sh
update_website.sh

Also run in a terminal as sh ./update_website.sh.

Issue testing the update script permission denied

Running the update script, I received this error:

OCI runtime exec failed: exec failed: unable to start container process: exec: "/usr/local/bin/upload_static_content.sh": permission denied: unknown

I had to use the command chmod -R +x * in the bin folder to make all the shell script files executable, as found on this website.
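
A longer-term alternative, since the scripts live in git, is to store the executable bit in the repository itself so a fresh clone keeps it; a hypothetical example for one of the scripts:

# Mark the script as executable in git's index (path assumed from the bin folder above)
git update-index --chmod=+x bin/upload_static_content.sh
git commit -m "Make upload script executable"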

Set up a Ghost backup

During all these changes I had to restore a past Ghost backup, particularly when composing Docker down and up again, this time with the volume location of the AWS profile, ghost-blogger. And thank goodness I did have my backups!

I would like to automate more of this, but for now these are the manual steps.

  1. Run docker compose up.
  2. Download the last backup file from S3 and uncompress the ZIP.
  3. Replace the content folder with the backup.
  4. Browse to the site and start the blog as new.
  5. Set up the basic user (again!) with all the text and the profile picture.
  6. Go to settings and import the backed-up JSON file.
  7. Suspend the Ghost user account as it is not used.
  8. Unpublish or delete the default Ghost articles by the Ghost author.

When I think about it, steps 2-3 could be automated as a shell script (see the sketch below) and steps 5-8 could be done by Cypress.
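
As a starting point, here is a minimal sketch of steps 2-3, assuming the bucket layout created by backup_website.sh above and a local content folder at ./data/content (both assumptions):

#!/bin/bash
# Sketch: fetch the most recent backup from S3 and replace the content folder.

GHOST_DOMAIN=binarydreams.biz
AWS_REGION=eu-west-1

# The newest key in the backup bucket (keys begin with the timestamped folder, so they sort).
LATEST=$(aws s3 ls s3://$GHOST_DOMAIN-backup --recursive --region $AWS_REGION --profile ghost-blogger | sort | tail -n 1 | awk '{print $4}')

# Step 2: download the last backup file and uncompress the ZIP.
aws s3 cp s3://$GHOST_DOMAIN-backup/$LATEST ./latest-backup.zip --region $AWS_REGION --profile ghost-blogger
unzip -o ./latest-backup.zip -d ./latest-backup

# Step 3: replace the content folder with the backup (path assumed).
rm -rf ./data/content
mv ./latest-backup ./data/content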

Issue importing the backed up JSON file

I received these warnings while importing the JSON backup file.

Import successful with warnings

User: Entry was not imported and ignored. Detected duplicated entry.
{
   "id":"62572eeb179d0a0018299745",
   "name":"Jon-Paul",
   "slug":"jp",
   "password":"*************************************************",
   "email":"*******@binarydreams.biz",
   "profile_image":"https://binarydreams.biz/content/images/2021/09/ProfilePicture.png",
   "cover_image":"https://binarydreams.biz/content/images/2021/09/background.jpg",
   "bio":"AWS cloud developer",
   "website":null,
   "location":null,
   "facebook":null,
   "twitter":null,
   "accessibility":"{\"navigation\":{\"expanded\":{\"posts\":false}},\"launchComplete\":true}",
   "status":"active",
   "locale":null,
   "visibility":"public",
   "meta_title":null,
   "meta_description":null,
   "tour":null,
   "last_seen":"2021-10-30 17:20:35",
   "created_at":"2021-09-06 21:44:13",
   "updated_at":"2021-10-30 17:20:35",
   "roles":[
      "Administrator"
   ]
}

User: Entry was not imported and ignored. Detected duplicated entry.
{
   "id":"62572eeb179d0a0018299746",
   "name":"Ghost",
   "slug":"ghost",
   "password":"**************************************************",
   "email":"[email protected]",
   "profile_image":"https://static.ghost.org/v4.0.0/images/ghost-user.png",
   "cover_image":null,
   "bio":"You can delete this user to remove all the welcome posts",
   "website":"https://ghost.org",
   "location":"The Internet",
   "facebook":"ghost",
   "twitter":"@ghost",
   "accessibility":null,
   "status":"inactive",
   "locale":null,
   "visibility":"public",
   "meta_title":null,
   "meta_description":null,
   "tour":null,
   "last_seen":null,
   "created_at":"2021-09-06 21:44:14",
   "updated_at":"2021-09-06 21:53:50",
   "roles":[
      "Contributor"
   ]
}

Settings: Theme not imported, please upload in Settings - Design
{
   "id":"61368bb4b3a69e001a3618e6",
   "group":"theme",
   "key":"active_theme",
   "value":"casper",
   "type":"string",
   "flags":"RO",
   "created_at":"2021-09-06 21:44:21",
   "updated_at":"2021-09-06 21:44:21"
}

For the first User warning, I probably should have created a completely different user to avoid it.
The second warning was about the default Ghost user, so I'm not bothered about that.
The Theme warning though? Hmm.

The import results were:

  • The Ghost original pages:
    - Contribute page was created but my import didn't replace it. I manually deleted it.
    - Privacy page was created but my import didn't replace it. I manually deleted it.
    - Contact page was created but my import didn't replace it. I manually deleted it.
    - About this site should be unpublished or deleted but was published on the new site.
  • My cover photo was missing.
  • My staff details (cover, name, bio, slug) were not set as I had configured them.

This all means I need a backup import script. On the to-do list!

Transferring my domain

Not required for my migration to macOS, but I might as well note the transition here. My original domain host's (EUKHost) costs were about £12 and I knew Cloudflare was cheaper, basically at cost. I already use Cloudflare for the domain routing, so it made sense to have all my domain management in one place.

This YouTube video was a help in doing this, even though I had a different domain host.