
Notes on using a custom domain to host the site

More notes on my journey towards static site domination.

Jason Mercer

5 minute read

So far, in my minimal web development experience, I’ve never set up a true website with its own domain. I’ve owned at least one domain in the past, but I only used that for email services. However, I’ve heard one can set up a GitLab (and GitHub, etc.) site with a custom domain. Since I already had a custom domain, wetlandscapes.com, I thought I’d give it a shot with the new site.

Since I’ve never done this before, and GitLab’s documentation, as it pertains to Amazon Web Services, was not super obvious to me, I thought I’d document all the steps I went through to go from having a site hosted via wetlandscapes.gitlab.io to wetlandscapes.com. That said, I’m really not sure how many of these steps were truly necessary. Ye have been warned.

Setting up a custom domain with GitLab and Amazon Web Services

  1. Registered my domain with Amazon Route 53. I did this a while ago (2017), so I don’t really remember the details of that transaction.
  2. Started the custom domain process with GitLab.
    1. Went to my repo.
    2. Settings (lower left) -> Pages
    3. On the New Pages Domain page
      1. Added my domain: wetlandscapes.com
      2. Turned on some automatic security feature (not really sure what it was, but I couldn’t get to the next page without turning it on).
    4. On the Pages Domain page
      1. Copied my DNS and Verification Status
      2. My SSL certificate was inactive, since I hadn’t verified my domain yet.
  3. Set up my DNS records for Pages (a scripted sketch of the bucket steps below follows this list)
    1. Added a bucket to my root domain via AWS
      1. From the dashboard -> S3 -> Create bucket
      2. Bucket names:
        • wetlandscapes.com
        • www.wetlandscapes.com
      3. Added a key, “wetlandscapes”, so I could track any costs I might incur by doing this stuff.
      4. I kept the rest of the default options.
    2. Configure my “main” bucket, wetlandscapes.com
      1. Click the name of the bucket from the Amazon S3 console.
      2. Click Properties.
      3. Click Static Website Hosting
      4. Selected “Use this bucket to host a website”.
        1. Index document: from the public/ directory of my site, it looks like index.html works.
        2. Error document: same idea, but it looks like 404.html is the error document.
    3. Configure my “ancillary” bucket, www.wetlandscapes.com
      1. Same as above, but when I got to Static Website Hosting I selected “Redirect requests”.
        1. Target bucket or domain: wetlandscapes.com
    4. Set up web traffic logging (just because I’m curious)
      1. Create another bucket called logs.wetlandscapes.com
      2. Using the same options (and defaults) as I did with the other buckets.
      3. Once the bucket is created, click on it.
      4. Go to Overview.
      5. Create folders (use bucket settings):
        • root
        • cdn
      6. Go back to the S3 console page.
      7. Click the wetlandscapes.com bucket.
      8. Click Properties.
      9. Server access logging.
        1. Target bucket: logs.wetlandscapes.com
        2. Target prefix: root/
    5. Allow public access
      1. Click my main bucket.
      2. Click Permissions.
      3. On the Block Public Access tab click the Edit button.
      4. Uncheck Block All Public Access and then save.
      5. Confirm the change.
    6. Attach a bucket policy
      1. In my main bucket click Permissions

      2. Click Bucket Policy

      3. Copy and paste the following into the policy editor (note the reference to wetlandscapes.com):

        {
          "Version":"2012-10-17",
          "Statement":[
              {
                  "Sid":"PublicReadGetObject",
                  "Effect":"Allow",
                  "Principal":"*",
                  "Action":[
                      "s3:GetObject"
                  ],
                  "Resource":[
                      "arn:aws:s3:::wetlandscapes.com/*"
                  ]
              }
          ]
        }
        

Note that the above step comes with a scary warning about never granting public access to an S3 bucket.
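
For reference, here’s roughly what the bucket setup above might look like as a script using boto3 (the AWS SDK for Python). I did all of this through the console, so treat this as an illustrative sketch rather than exactly what I ran; the tag value is arbitrary, and enabling logging programmatically may need an extra log-delivery permission that the console normally handles for you.

```python
import json
import boto3

REGION = "us-west-2"  # the region my buckets live in
s3 = boto3.client("s3", region_name=REGION)

# Create the main, redirect, and logging buckets with default options.
for name in ["wetlandscapes.com", "www.wetlandscapes.com", "logs.wetlandscapes.com"]:
    s3.create_bucket(
        Bucket=name,
        CreateBucketConfiguration={"LocationConstraint": REGION},
    )

# Tag the main bucket so any costs are easy to track (the value is arbitrary).
s3.put_bucket_tagging(
    Bucket="wetlandscapes.com",
    Tagging={"TagSet": [{"Key": "wetlandscapes", "Value": "site"}]},
)

# Main bucket: host a static website with index.html and 404.html.
s3.put_bucket_website(
    Bucket="wetlandscapes.com",
    WebsiteConfiguration={
        "IndexDocument": {"Suffix": "index.html"},
        "ErrorDocument": {"Key": "404.html"},
    },
)

# Ancillary bucket: redirect all requests to the main bucket.
s3.put_bucket_website(
    Bucket="www.wetlandscapes.com",
    WebsiteConfiguration={"RedirectAllRequestsTo": {"HostName": "wetlandscapes.com"}},
)

# "Folders" in the logs bucket are just zero-byte keys ending in "/".
for prefix in ["root/", "cdn/"]:
    s3.put_object(Bucket="logs.wetlandscapes.com", Key=prefix)

# Server access logging from the main bucket into logs.wetlandscapes.com/root/.
# (Scripting this may also require granting S3's log-delivery group access to
# the target bucket; the console sets that up for you.)
s3.put_bucket_logging(
    Bucket="wetlandscapes.com",
    BucketLoggingStatus={
        "LoggingEnabled": {
            "TargetBucket": "logs.wetlandscapes.com",
            "TargetPrefix": "root/",
        }
    },
)

# Turn off "Block all public access" on the main bucket...
s3.put_public_access_block(
    Bucket="wetlandscapes.com",
    PublicAccessBlockConfiguration={
        "BlockPublicAcls": False,
        "IgnorePublicAcls": False,
        "BlockPublicPolicy": False,
        "RestrictPublicBuckets": False,
    },
)

# ...and attach the public-read policy shown above.
policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Sid": "PublicReadGetObject",
        "Effect": "Allow",
        "Principal": "*",
        "Action": ["s3:GetObject"],
        "Resource": ["arn:aws:s3:::wetlandscapes.com/*"],
    }],
}
s3.put_bucket_policy(Bucket="wetlandscapes.com", Policy=json.dumps(policy))
```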

  1. Test that my “domain endpoint” works. NOTE: ENDPOINTS MAY BE SUPERFLUOUS AS MY SITE IS ROUTED THROUGH GITLAB.
    1. For this I ended up adding a dummy site to my main bucket. That is, I uploaded an index.html file with the following content (a scripted version of this check appears further down):

     ```html
     <html xmlns="http://www.w3.org/1999/xhtml">
       <head>
         <title>My Website Home Page</title>
       </head>
       <body>
         <h1>Welcome to my website</h1>
         <p>Now hosted on Amazon S3!</p>
       </body>
     </html>
     ```
    2. I then browsed to my endpoint: http://wetlandscapes.com.s3-website-us-west-2.amazonaws.com/
  2. Add alias records (a boto3 sketch of these records follows this list)
    1. Go to Route 53.
    2. Click Hosted zones (left margin).
    3. Click my domain name, wetlandscapes.com.
    4. Click Create Record Set.
    5. The first thing I did was apply my GitLab verification code. This uses a TXT type. I didn’t use aliases or anything, but I did have to ensure that the Value was in quotes.
    6. Second, I had to make a Type A connection with an IP value of 35.185.44.232 (I found this on GitLab).
      • Note 1: There were already NS and SOA types in my record set. I’m not sure what those are, but they were there.
      • Note 2: I thought I had to include a CNAME type, but everything was fine with just an A and TXT Type record.

  3. Confirm DNS
    1. On GitLab navigate: Settings -> Pages -> Domains: Details -> Click the Verify button.
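
For reference, the TXT and A records above could also be created with boto3 instead of clicking around in the console. A rough sketch; the hosted zone ID and verification string are placeholders for the values Route 53 and GitLab give you, and the IP is the GitLab Pages address mentioned above.

```python
import boto3

r53 = boto3.client("route53")

HOSTED_ZONE_ID = "ZXXXXXXXXXXXXXX"  # placeholder: the zone ID Route 53 lists for wetlandscapes.com
VERIFICATION = "placeholder-gitlab-verification-code"  # placeholder: the TXT value from GitLab's Pages settings
GITLAB_PAGES_IP = "35.185.44.232"   # the Pages IP I found on GitLab

r53.change_resource_record_sets(
    HostedZoneId=HOSTED_ZONE_ID,
    ChangeBatch={
        "Changes": [
            {
                # TXT record with GitLab's verification code; the value has to be quoted.
                "Action": "UPSERT",
                "ResourceRecordSet": {
                    "Name": "wetlandscapes.com.",
                    "Type": "TXT",
                    "TTL": 300,
                    "ResourceRecords": [{"Value": f'"{VERIFICATION}"'}],
                },
            },
            {
                # A record pointing the apex domain at GitLab Pages.
                "Action": "UPSERT",
                "ResourceRecordSet": {
                    "Name": "wetlandscapes.com.",
                    "Type": "A",
                    "TTL": 300,
                    "ResourceRecords": [{"Value": GITLAB_PAGES_IP}],
                },
            },
        ]
    },
)
```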

Boom! The site showed up!
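
To double-check things outside the browser, here’s a small sketch of the same tests: upload the placeholder index.html from earlier, then request both the S3 website endpoint and the custom domain. It assumes the requests package is installed and the placeholder file sits in the working directory.

```python
import boto3
import requests

s3 = boto3.client("s3", region_name="us-west-2")

# Upload the placeholder page used to test the bucket's website endpoint.
with open("index.html", "rb") as f:
    s3.put_object(
        Bucket="wetlandscapes.com",
        Key="index.html",
        Body=f,
        ContentType="text/html",
    )

# Hit the regional S3 website endpoint and the custom domain (now routed through GitLab).
for url in [
    "http://wetlandscapes.com.s3-website-us-west-2.amazonaws.com/",
    "http://wetlandscapes.com/",
]:
    resp = requests.get(url, timeout=10)
    print(url, resp.status_code)
```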

  1. Setting up the subdomain.
    1. One last thing: We need to set up the subdomain “www” of wetlandscapes.com.
    2. In AWS, go to Route 53.
    3. Hosted Zones.
    4. wetlandscapes.com
    5. Create Record Set.
    6. Add “www” to the Name.
    7. Alias: Yes.
    8. Here is where those S3 buckets come into play. Alias Target: s3-website-us-west-2.amazonaws.com.
      1. It’s this link between the subdomain “www”, the bucket above, my “main” bucket, and my “main” site that allows this all to work. Pretty neat.
    9. Save Record Set.
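
As with the other records, the www alias could be scripted with boto3. Note that an alias pointing at an S3 website endpoint uses a fixed, region-specific hosted zone ID that AWS publishes for s3-website-us-west-2 (not the ID of my own hosted zone); both IDs are left as placeholders in this sketch.

```python
import boto3

r53 = boto3.client("route53")

HOSTED_ZONE_ID = "ZXXXXXXXXXXXXXX"    # placeholder: my wetlandscapes.com hosted zone
S3_WEBSITE_ZONE_ID = "ZYYYYYYYYYYYY"  # placeholder: AWS's fixed zone ID for s3-website-us-west-2

r53.change_resource_record_sets(
    HostedZoneId=HOSTED_ZONE_ID,
    ChangeBatch={
        "Changes": [{
            "Action": "UPSERT",
            "ResourceRecordSet": {
                "Name": "www.wetlandscapes.com.",
                "Type": "A",
                "AliasTarget": {
                    # The www bucket redirects to the main bucket, which is what ties
                    # the subdomain back to the main site.
                    "DNSName": "s3-website-us-west-2.amazonaws.com.",
                    "HostedZoneId": S3_WEBSITE_ZONE_ID,
                    "EvaluateTargetHealth": False,
                },
            },
        }]
    },
)
```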

The subdomain, www.wetlandscapes.com, was re-routing traffic immediately.
