Today I switched my website from GitHub Pages and Cloudflare back to my own server. It wasn’t that hard, but there are some drawbacks I want to report.
## Reasons for Switching
It’s super easy to set up GitHub Pages, and since they have Ruby installed, it’s even easier to run your Jekyll-based site on it. You can also add custom domains, which I did. At some point in the past years I wanted to encrypt the user’s session and [enabled SSL via Cloudflare]({% post_url 2014-10-16-ssl %}). But while the whole workflow is super easy, it has some issues:
- GitHub owns your data. If it’s down (e.g. due to a DDoS attack), your site is, too.
- GitHub has a [problem with A-record domains using no www]({% post_url 2014-04-08-github-pages-redirect-performance %}).
- GitHub has only unofficial HTTPS support on GitHub Pages and none for custom domains.
- Cloudflare wants to handle your nameservers and DNS entirely.
- Cloudflare has only free shared SSL certificates (which have some drawbacks).
- Cloudflare only terminates your HTTPS connection at its own edge servers. No one knows whether traffic is encrypted all the way back to GitHub.
- Both GitHub and Cloudflare are vulnerable to DDoS and might be down.
- If you need to move your site because one of the companies has an outage or shuts down, it can be completely unreachable for up to 48hrs (depending on your Cloudflare settings).
- Cloudflare does not allow 3rd party CDNs anymore (at least in the free plan).
## Preparations
First I needed to set up the project infrastructure for my Jekyll project. I wanted my workflow to stay the same, so I needed `git` on the server. In fact, I now prefer to run `jekyll build` on my local machine instead of on the server, so I don’t necessarily need Ruby on the server, even though my shared hosting package at Uberspace supports that as well.
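Such a build-locally-then-deploy step can be sketched with `rsync` over SSH; the domain `mysite.example`, the SSH target `user@myhost` and the document root `~/html/` are placeholders, not my actual setup:

```shell
# Build the site locally; the generated files land in _site/ by default.
jekyll build

# Push the generated files to the shared host's document root.
# --delete removes files on the server that no longer exist locally.
rsync -avz --delete _site/ user@myhost:~/html/
```

A bare git repository on the server with a post-receive hook would keep the `git push` feel of GitHub Pages, but plain `rsync` is the simplest way to keep the workflow on the local machine.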
Second, I needed my own SSL/TLS certificate. I used sslmate, which creates Comodo certificates in a super easy way. I created them locally, so I then had to push them to the server via SSH (please be very careful here and keep everything secure and private in this step) and notify my hoster so they could configure the server to use the certificate.
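Roughly, that step looks like the sketch below; the domain, file names and the target path `~/certs/` are placeholders, and the exact layout your hoster expects may differ:

```shell
# Buy the certificate via the sslmate CLI; it downloads the key,
# certificate and chain files into the current directory.
sslmate buy mysite.example

# The private key must stay readable only by you.
chmod 600 mysite.example.key

# Copy the files to the server over SSH.
scp mysite.example.key mysite.example.crt mysite.example.chain.crt \
    user@myhost:~/certs/
```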
The domain was already allowed and configured for the server, so now I only had to switch the nameservers and DNS to the Uberspace host. This can take up to 48hrs to propagate to everyone. Ready. My site runs on my own hosting service again, the certificate is my own which I can fully trust, and I still have an easy workflow to push content to my site.
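Whether the change has propagated can be checked with `dig`; again, `mysite.example` stands in for the real domain:

```shell
# Ask which nameservers the world currently sees for the domain.
dig NS mysite.example +short

# Ask a specific public resolver for the A record, to compare its
# answer with what your local resolver still has cached.
dig @8.8.8.8 A mysite.example +short
```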
## Trouble
But not everything went fine for me. In fact, there are a couple of drawbacks:
- My site might be slower for visitors far away from Germany, as it now runs from a single server in Germany again (GitHub previously served it from the nearest server worldwide). This can be fixed, too, but for now it’s too much effort.
- My site was down for a couple of hours because I got the DNS change wrong and my hoster hadn’t told me that the IP serving SSL is a different one than the IP serving HTTP.
- There is no easy integration for creating and configuring SSL certificates at shared hosting services yet.
- You really need to know about the potential impacts of DNS, SSL, HSTS and changing nameservers.
- My GitHub website repository kept serving all content (solved manually by support staff).
- I needed to redirect the gh-pages site at its github.io domain to my server.
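One of the fixes on the new host — forcing HTTPS and sending the HSTS header mentioned above — can be done in the site’s `.htaccess`. This is a sketch assuming an Apache-based shared host with `mod_rewrite` and `mod_headers` enabled, with `~/html/` as a placeholder document root:

```shell
# Create the document root if it does not exist yet (placeholder path).
mkdir -p ~/html

# Append an HTTPS redirect and an HSTS header to the site's .htaccess.
# Assumes Apache with mod_rewrite and mod_headers enabled.
cat >> ~/html/.htaccess <<'EOF'
RewriteEngine On
RewriteCond %{HTTPS} !=on
RewriteRule ^(.*)$ https://%{HTTP_HOST}/$1 [R=301,L]
# Only send HSTS once HTTPS is known to work; it is hard to roll back.
Header always set Strict-Transport-Security "max-age=31536000"
EOF
```

Be careful with HSTS: once browsers have seen the header, they will refuse plain HTTP for the whole `max-age`, which is exactly why you need to understand it before flipping DNS.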
This experience again showed me how hard it still is for a normal user (and I’d call myself tech-experienced and CLI-aware already) to configure HTTPS properly.
It made me question the browser vendors’ hard push toward HTTPS with regard to people who have no way to configure their webspace like this (I know a lot of them for whom this wouldn’t have been possible at all without switching to a way more expensive package) or who simply don’t have the skills and endurance to try it out.