Notes on using a custom domain to host the site
More notes on my journey towards static site domination.
So far, in my minimal web development experience, I’ve never set up a true website with its own domain. I’ve owned at least one domain in the past, but I only used that for email services. However, I’ve heard one can set up a GitLab (and GitHub, etc.) site with a custom domain. Since I already had a custom domain, wetlandscapes.com, I thought I’d give setting one up a shot with the new site.
Since I’ve never done this before, and GitLab’s documentation, as it pertains to Amazon Web Services, was not super obvious to me, I thought I’d document all the steps I went through to go from having a site hosted via wetlandscapes.gitlab.io to wetlandscapes.com. That said, I’m really not sure how many of these steps were truly necessary. Ye have been warned.
Setting up a custom domain with GitLab and Amazon Web Services
- Registered my domain with Amazon Route 53. I did this a while ago (2017), so I don’t really remember the details of that transaction.
- Started the custom domain process with GitLab.
- Went to my repo.
- Settings (lower left) -> Pages
- On the New Pages Domain page
- Added my domain: wetlandscapes.com
- Turned on some automatic security feature (not really sure what that was, but I couldn’t get to the next page without turning it on).
- On the Pages Domain page
- Copied my DNS and Verification Status
- My SSL certificate was inactive, since I hadn’t verified my domain, yet.
- Set up my DNS records for Pages
- Added a bucket to my root domain via AWS
- From the dashboard -> S3 -> Create bucket
- Bucket names:
- wetlandscapes.com
- www.wetlandscapes.com
- Added a key, “wetlandscapes”, so I could track any costs I might incur by doing this stuff.
- I kept the rest of the default options.
- Configure my “main” bucket, wetlandscapes.com
- Click the name of the bucket from the Amazon S3 console.
- Click Properties.
- Click Static Website Hosting
- Selected “Use this bucket to host a website”.
- Index document: from the public/ directory of my site, it looks like index.html works (sketched just below).
- Error document: same, but it looks like 404.html is the error document.
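I did all of this by clicking through the console, but the static website hosting settings above boil down to a small piece of configuration. If you prefer the AWS CLI, I believe something like the following (saved as, say, website.json and passed to aws s3api put-bucket-website --bucket wetlandscapes.com --website-configuration file://website.json) does the same thing; the file name is just my own placeholder:

```json
{
  "IndexDocument": { "Suffix": "index.html" },
  "ErrorDocument": { "Key": "404.html" }
}
```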
- Configure my “ancillary” bucket, www.wetlandscapes.com
- Same as above, but when I get to Static Website Hosting I select “Redirect requests” (sketched below).
- Target bucket or domain: wetlandscapes.com
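The redirect version of that configuration is even smaller. Again, I set this through the console, but my understanding is that the equivalent website configuration for the www.wetlandscapes.com bucket looks roughly like this:

```json
{
  "RedirectAllRequestsTo": { "HostName": "wetlandscapes.com" }
}
```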
- Set up web traffic logging (just because I’m curious)
- Create another bucket called logs.wetlandscapes.com
- Using the same options (and defaults) as I did with the other buckets.
- Once the bucket is created, click on it.
- Go to Overview.
- Create folders (use bucket settings):
- root
- cdn
- Go back to the S3 console page.
- Click the wetlandscapes.com bucket.
- Click Properties.
- Server access logging.
- Target bucket: logs.wetlandscapes.com
- Target prefix: root/
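For what it’s worth, the server access logging settings above amount to just a target bucket and a prefix. If you were scripting this, I think the equivalent would be a small file (say, logging.json) handed to aws s3api put-bucket-logging --bucket wetlandscapes.com --bucket-logging-status file://logging.json:

```json
{
  "LoggingEnabled": {
    "TargetBucket": "logs.wetlandscapes.com",
    "TargetPrefix": "root/"
  }
}
```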
- Allow public access
- Click my main bucket.
- Click Permissions.
- On the Block Public Access tab click the Edit button.
- Uncheck Block All Public Access and then save.
- Confirm the change.
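Those “Block public access” checkboxes map onto a public access block configuration. As a sketch (I only used the console, so take the CLI version with a grain of salt), the following passed to aws s3api put-public-access-block should be equivalent to unchecking everything:

```json
{
  "BlockPublicAcls": false,
  "IgnorePublicAcls": false,
  "BlockPublicPolicy": false,
  "RestrictPublicBuckets": false
}
```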
- Attach a bucket policy
- In my main bucket click Permissions.
- Click Bucket Policy.
- Copy and paste the following into the policy editor (note the reference to wetlandscapes.com):
{ "Version":"2012-10-17", "Statement":[ { "Sid":"PublicReadGetObject", "Effect":"Allow", "Principal":"*", "Action":[ "s3:GetObject" ], "Resource":[ "arn:aws:s3:::wetlandscapes.com/*" ] } ] }
Note that the above step comes with a scary warning about never granting public access to an S3 bucket.
Test that my “domain endpoint” works. NOTE: ENDPOINTS MAY BE SUPERFLUOUS AS MY SITE IS ROUTED THROUGH GITLAB.
1. For this I ended up adding a dummy site to my main bucket. That is, I uploaded an index.html file with the following content:
```html
<html xmlns="http://www.w3.org/1999/xhtml">
<head>
  <title>My Website Home Page</title>
</head>
<body>
  <h1>Welcome to my website</h1>
  <p>Now hosted on Amazon S3!</p>
</body>
</html>
```
- I then browsed to my endpoint: http://wetlandscapes.com.s3-website-us-west-2.amazonaws.com/
Add alias records
1. Go to Route 53.
2. Click Hosted zones (left margin).
3. Click my domain name, wetlandscapes.com.
4. Click Create Record Set.
5. The first thing I did was apply my GitLab verification code. This uses a TXT type record. I didn’t use aliases or anything, but I did have to ensure that the Value was in quotes.
6. Second, I had to make a Type A record with an IP value of 35.185.44.232 (I found this on GitLab). Both records are sketched below.

* Note 1: There were already NS and SOA types in my record set. I’m not sure what those are, but they were there.
* Note 2: I thought I had to include a CNAME type, but everything was fine with just an A and TXT type record.
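For anyone who prefers to script their DNS changes, here is roughly what those two records look like as a Route 53 change batch (the kind of JSON you hand to aws route53 change-resource-record-sets --hosted-zone-id &lt;your zone id&gt; --change-batch file://records.json). The verification value is a placeholder; the real one comes from the GitLab Pages settings page, and the 300-second TTL is just my assumption:

```json
{
  "Changes": [
    {
      "Action": "UPSERT",
      "ResourceRecordSet": {
        "Name": "wetlandscapes.com.",
        "Type": "TXT",
        "TTL": 300,
        "ResourceRecords": [
          { "Value": "\"gitlab-pages-verification-code=YOUR-CODE-HERE\"" }
        ]
      }
    },
    {
      "Action": "UPSERT",
      "ResourceRecordSet": {
        "Name": "wetlandscapes.com.",
        "Type": "A",
        "TTL": 300,
        "ResourceRecords": [
          { "Value": "35.185.44.232" }
        ]
      }
    }
  ]
}
```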
Confirm DNS
1. On GitLab navigate: Settings -> Pages -> Domains: Details -> click the Verify button.
Boom! The site showed up!
- Setting up the subdomain.
- One last thing: we need to set up the subdomain “www” of wetlandscapes.com.
- In AWS, go to Route 53.
- Hosted Zones.
- wetlandscapes.com
- Create Record Set.
- Add “www” to the Name.
- Alias: Yes.
- Here is where those S3 buckets come into play. Alias Target: s3-website-us-west-2.amazonaws.com.
- It’s this link between the “www” subdomain, the redirect bucket above, my “main” bucket, and my “main” site that makes this all work (see the sketch below). Pretty neat.
- Save Record Set.
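Sketched as a change batch, that alias record looks something like the following. Note that the HostedZoneId here is not my own hosted zone; it’s the fixed, region-specific zone ID AWS publishes for S3 website endpoints (I believe Z3BJ6K6RIION7M is the one for us-west-2, but double-check against the S3 website endpoints table):

```json
{
  "Changes": [
    {
      "Action": "UPSERT",
      "ResourceRecordSet": {
        "Name": "www.wetlandscapes.com.",
        "Type": "A",
        "AliasTarget": {
          "HostedZoneId": "Z3BJ6K6RIION7M",
          "DNSName": "s3-website-us-west-2.amazonaws.com.",
          "EvaluateTargetHealth": false
        }
      }
    }
  ]
}
```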
The subdomain, www.wetlandscapes.com, was re-routing traffic immediately.