So in part 2 we backed up the Ghost files. Now we are going to generate the static files and upload them to S3.
Hosting the website
I'm using an AWS S3 bucket to host the website with Cloudflare routing, and this article was instrumental in setting it up, so I highly recommend a read. I won't repeat it all here, but I will summarise what you do:
Assuming you have a domain of mywebsite.biz and a subdomain of www, you will create 2 separate buckets:
- Subdomain bucket named www.mywebsite.biz, configured for static website hosting with public access to the bucket enabled.
- Domain bucket named mywebsite.biz. Requests to this domain bucket will need redirecting to the subdomain bucket.
- Bucket policies will need configuring to allow the Cloudflare IP addresses, ensuring that only requests coming through the Cloudflare proxy will be responded to.
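As an illustration, the policy on the subdomain bucket can allow `s3:GetObject` only from Cloudflare's published IP ranges. This is a sketch, not my exact policy - the two CIDR blocks shown are examples from Cloudflare's published list, and you should use the full, current list:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "AllowCloudflareOnly",
      "Effect": "Allow",
      "Principal": "*",
      "Action": "s3:GetObject",
      "Resource": "arn:aws:s3:::www.mywebsite.biz/*",
      "Condition": {
        "IpAddress": {
          "aws:SourceIp": [
            "173.245.48.0/20",
            "103.21.244.0/22"
          ]
        }
      }
    }
  ]
}
```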
Then you need to set up your site on Cloudflare. It provides caching and some DDoS protection, but not the end-to-end encryption you might want. As this is just a static website and I don't make requests to anything else, I'm not overly worried about that, even though I'd like 100% encryption as a matter of principle.
- You will create 2 CNAME records, one for your root domain and one for your subdomain. Each record will have the domain as its name and the relevant S3 bucket endpoint as its value.
- Change your domain nameservers to Cloudflare.
- To use HTTPS for the traffic between visitors and Cloudflare, this article describes how. On your Cloudflare dashboard, the encryption mode on the SSL/TLS Overview tab should be Flexible, and Always Use HTTPS should be enabled on the Edge Certificates tab.
- Use this online tester to verify your setup.
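For reference, the two CNAME records end up looking something like this. The endpoint hostnames are assumptions based on the S3 website-endpoint naming for eu-west-1; yours will vary by region:

```
Type   Name             Value (S3 website endpoint)
CNAME  mywebsite.biz    mywebsite.biz.s3-website-eu-west-1.amazonaws.com
CNAME  www              www.mywebsite.biz.s3-website-eu-west-1.amazonaws.com
```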
Once you have uploaded the static files, visitors will be able to reach the website using either the subdomain or your root domain.
The only differences from the original GitHub file generate_static_content.sh are that I added sudo when running the gssg tool, and that my version does not upload the static files to the cloud. Note the gssg tool does have prerequisites that need installing first.
It works by extracting the files and swapping https://binarydreams.biz for your configured domain.
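At its core that swap is a find-and-replace across the generated files. A rough sketch of the idea using sed - an illustration only, not gssg's actual implementation:

```shell
# Create a throwaway HTML file containing the source domain (demo only).
mkdir -p /tmp/static-demo
cat > /tmp/static-demo/index.html <<'EOF'
<a href="https://binarydreams.biz/about/">About</a>
EOF

# Swap the source domain for the configured one in every HTML file.
find /tmp/static-demo -name '*.html' \
  -exec sed -i 's|https://binarydreams.biz|https://www.mywebsite.biz|g' {} +

# The link now points at the configured domain.
grep -o 'https://www.mywebsite.biz/about/' /tmp/static-demo/index.html
```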
To create the static files, run
docker-compose exec -T app /usr/local/bin/generate_static_content.sh.
So rather than have the file upload in the static generation script, I wanted it in its own brand new file - this helped with testing it on its own too.
- This script takes an argument for the AWS region, defaulting to eu-west-1.
- It takes the generated files in the /static folder, deletes the existing files, uploads them to the S3 domain bucket and makes them publicly readable, all using the configured AWS profile.
To run this script, execute
docker-compose exec -T app /usr/local/bin/upload_static_content.sh.
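My upload script isn't reproduced here, but a minimal sketch of its shape might look like this - the bucket name and folder are assumptions carried over from the setup above, and the actual aws s3 sync call is commented out so the sketch runs without AWS access:

```shell
#!/bin/sh
# Sketch of an upload script (assumed names; the real file may differ).
# First argument is the AWS region, defaulting to eu-west-1.
REGION="${1:-eu-west-1}"
BUCKET="mywebsite.biz"   # hypothetical bucket name from the earlier setup
SOURCE_DIR="/static"

echo "Uploading ${SOURCE_DIR} to s3://${BUCKET} in ${REGION}"

# The actual upload: --delete removes files no longer present and
# --acl public-read makes the objects publicly readable.
# aws s3 sync "$SOURCE_DIR" "s3://${BUCKET}" \
#   --region "$REGION" --delete --acl public-read
```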
All together now
Now let's bring it all together with the script executions in a batch file.
The AWS profile needs to be checked so this command can run; the check happens before the actual script starts.
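As a sketch, the wrapper (written here as a shell script, with assumed script names and a hypothetical profile check) might look like:

```shell
#!/bin/sh
# Hypothetical wrapper: verify the AWS profile first, then run both steps.
PROFILE="${AWS_PROFILE:-default}"

# The check succeeds only if the aws CLI is installed and the profile
# has valid credentials.
if command -v aws >/dev/null 2>&1 \
   && aws sts get-caller-identity --profile "$PROFILE" >/dev/null 2>&1; then
  docker-compose exec -T app /usr/local/bin/generate_static_content.sh
  docker-compose exec -T app /usr/local/bin/upload_static_content.sh
else
  # A real script would exit non-zero here.
  echo "AWS profile '$PROFILE' is not usable; skipping deploy." >&2
fi
```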
When I run that file it will now create the static files and upload them to S3 - producing what you see before you now!
Looking at my costs: my domain (£10 per year), the S3 bucket (currently a few pence per month) and Cloudflare free as an individual user. Awesome! Cloudflare also:
- Easily gives me SSL certification
- Provides CDN caching to reduce requests to the bucket
Hopefully this has been of some help, any feedback always welcome!