Lately I have been building more and more websites with static site generators like Jekyll and Hugo. Not only are they quick to develop and cheap to host, but often clients cannot tell the difference.

One issue that people often overlook when migrating a site from something like WordPress to a static site generator like Hugo is dynamic content. Something as simple as a contact form, or in this case an automatically updating Twitter feed, requires a backend to process form submissions, send emails or authenticate with an API using a private key.

One problem I’ve faced is displaying a live Twitter feed. Twitter used to let you query their API for a particular user and get back JSON you could embed in your website, but not anymore. They now require OAuth authentication for API calls, which makes displaying your Twitter feed on a static website a little more involved. Of course, you could just use Twitter’s first-party JavaScript widget, but all the tracking code they stuff in there comes with a moderate performance hit. Not to mention that it is near impossible to style the widget to match the look and feel of your website.

Instead, I wanted to be able to make a simple AJAX request and receive back a JSON structure representing my latest tweets, which I could render into HTML and inject into the page any way I wish. A static website hosted on S3 differs from a typical shared host running a LAMP stack, where you can just drop a .php file in your public directory and have it executed on every request. That’s not to say you cannot add dynamic functionality; you most certainly can, but the approach is different. To achieve my goal I would create an AWS Lambda function loaded up with my Twitter API keys and authenticate that way.
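To make that goal concrete, here is a minimal sketch of the client side. The URL and element id are hypothetical placeholders; the JSON shape (an array of tweet objects with a text property) is what the Twitter user timeline endpoint returns:

```javascript
// Turn the raw Twitter API JSON into an HTML string.
// (A sketch; real markup would include dates, links, etc.)
function renderTweets (tweets) {
  return tweets.map(function (tweet) {
    return '<li class="tweet">' + tweet.text + '</li>'
  }).join('')
}

// In the browser, fetch the cached feed and inject it into the page.
// The URL and the #twitter-feed element are placeholders.
if (typeof document !== 'undefined') {
  fetch('https://example.com/tweets.json')
    .then(function (response) { return response.json() })
    .then(function (tweets) {
      document.getElementById('twitter-feed').innerHTML = renderTweets(tweets)
    })
}
```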

Getting Started with Node Lambda

If you’re going to be doing any Node.js development on AWS Lambda whatsoever, then I highly recommend node-lambda. It is a Node package for easily testing Lambda functions locally and deploying them.
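For reference, the workflow looks roughly like this; the commands below come from the node-lambda documentation, and the run and deploy steps require valid AWS credentials in your environment:

```shell
# Install node-lambda as a dev dependency of the project
npm install node-lambda --save-dev

# Generate skeleton .env, event.json and deploy configuration files
./node_modules/.bin/node-lambda setup

# Invoke the handler locally against the sample event in event.json
./node_modules/.bin/node-lambda run

# Package the function and upload it to AWS Lambda
./node_modules/.bin/node-lambda deploy
```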

Now, I could just call the Lambda function directly from my website, but to maximize performance and adhere to Twitter’s API rate limiting policy I cache the response to a static file on S3, which I will call tweets.json. Serving a static file from S3 or CloudFront is also much cheaper than invoking a Lambda function on every page view.

I am using a few NPM packages to make this work.

  • Twitter package to handle authentication
  • Official AWS SDK
  • Moment to format timestamps in a nice human readable format
  • Node Lambda to make testing and deployment a breeze
  • Dotenv to keep my private keys out of my code and safe in an environment file excluded from source control
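That environment file deserves a quick illustration. Here is a sketch of what it might contain; every value is a placeholder, and the variable names are the ones the code below reads from process.env:

```shell
# .env - excluded from source control (add it to .gitignore)
TWITTER_CONSUMER_KEY=your-consumer-key
TWITTER_CONSUMER_SECRET=your-consumer-secret
TWITTER_ACCESS_TOKEN_KEY=your-access-token
TWITTER_ACCESS_TOKEN_SECRET=your-access-token-secret
TWITTER_SCREEN_NAME=your-screen-name
AWS_S3_BUCKET=your-bucket-name
AWS_S3_KEY=tweets.json
AWS_S3_CACHE_CONTROL=max-age=3600
AWS_CLOUDFRONT_DISTRIBUTION=your-distribution-id
```

Calling require('dotenv').config() at the top of the Lambda source loads these values into process.env.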

Fetching Tweets from Twitter

On AWS Lambda, whichever function you assign to exports.handler becomes the entry point for that Lambda function. Here, I give the Twitter package my API keys so it can authenticate my user timeline request. If there is an error in the response, or if the last tweet I receive back is more than an hour old, I know that my S3 bucket is already up to date and I don’t have to continue. If those checks pass, I call the upload() and invalidate() functions, which is where I actually interact with the AWS APIs.

exports.handler = function (event, context, callback) {
  // Authenticate using the API keys loaded from the environment by Dotenv
  let twitter = new Twitter({
    consumer_key: process.env.TWITTER_CONSUMER_KEY,
    consumer_secret: process.env.TWITTER_CONSUMER_SECRET,
    access_token_key: process.env.TWITTER_ACCESS_TOKEN_KEY,
    access_token_secret: process.env.TWITTER_ACCESS_TOKEN_SECRET
  })

  let params = {
    screen_name: process.env.TWITTER_SCREEN_NAME
  }

  // Fetch the latest tweets and hand them off for validation
  twitter.get('statuses/user_timeline', params, tweetHandler)
}

Twitter Response Validation

function tweetHandler (error, tweets, response) {
  // Bail out if the Twitter API request failed
  if (error) {
    console.log(error, error.stack)
    return
  }

  // Compare the newest tweet's timestamp against one hour ago
  let latestTweet = moment(new Date(tweets[0].created_at))
  let lastHour = moment().subtract(1, 'hours')

  if (latestTweet.isBefore(lastHour)) {
    console.log('No new tweets to upload.')
    return
  }

  upload(tweets)
  invalidate()
}

Uploading to S3

The S3.upload() function takes a number of parameters, notably ContentType, which should be set to application/json so that your client-side code has no trouble interpreting the AJAX response. (You will sometimes see text/json in the wild, but application/json is the registered MIME type.)

function upload (data) {
  let s3 = new AWS.S3()

  let params = {
    Bucket: process.env.AWS_S3_BUCKET,
    Key: process.env.AWS_S3_KEY,
    Body: JSON.stringify(data),
    ContentType: 'application/json',
    CacheControl: process.env.AWS_S3_CACHE_CONTROL
  }

  s3.upload(params, function (error, data) {
    if (error) {
      console.log(error, error.stack)
    } else {
      console.log('Successfully uploaded tweets.json to S3')
    }
  })
}

The next step is only applicable if you are using CloudFront to serve your S3 website. CloudFront is a CDN, which means that for the benefit of your visitors your site files are hosted on many servers around the world. If you change files in your S3 bucket, it takes some time for this global network of servers to become aware of the change. Telling CloudFront to discard its cached copies is called invalidation, and it does not happen automatically when you make changes through the API, so we need to create a cache invalidation ourselves.

Invalidating the CloudFront Distribution

function invalidate () {
  const cloudfront = new AWS.CloudFront()

  let params = {
    DistributionId: process.env.AWS_CLOUDFRONT_DISTRIBUTION,
    InvalidationBatch: {
      // CallerReference must be unique for each invalidation request
      CallerReference: '' + new Date().getTime(),
      Paths: {
        Quantity: 1,
        // CloudFront invalidation paths must begin with a leading slash
        Items: ['/' + process.env.AWS_S3_KEY]
      }
    }
  }

  cloudfront.createInvalidation(params, function (error, data) {
    if (error) {
      console.log(error, error.stack)
    } else {
      console.log('Successfully invalidated tweets.json')
    }
  })
}

In case you were wondering about the log messages, they are very helpful during development and also for diagnosing problems in production.

Congratulations, you can now build a highly performant static website with a dynamic Twitter block. I’ve published the code on my GitHub account and created an NPM package, but hopefully this article explains a bit about the reasons for this approach and a little about how it works.