
Hosting a Webhook Listener in AWS

As part of a continuing series on leveraging the public cloud, we present this walkthrough for hosting an Alma webhook listener using the Amazon Web Services platform. If you’re looking for an introduction to webhooks and a basic listener implementation, see this previous blog post.

Implement a Webhook Receiver

In order to use webhooks, we need a receiver that listens on HTTP and responds to the events sent by Alma. Since we don’t want to set up a web server at our own institution, we’ll leverage the public cloud. In this example, we will use two AWS services: Lambda and the API Gateway. With Lambda, we can run microservices in the cloud without worrying about the underlying infrastructure, and the API Gateway exposes those Lambda-hosted microservices as an HTTP endpoint. The flow is straightforward: Alma calls the API Gateway endpoint, which in turn invokes our Lambda function.

API Gateway

Alma expects an HTTP endpoint which accepts both GET and POST requests. The GET method is used for the challenge phase of the initial setup: Alma sends a term, and the endpoint is expected to echo it back, allowing Alma to confirm that a listener is in place and expecting its messages. The POST method accepts the actual webhook calls for each event defined in Alma.

The AWS API Gateway provides some helpful getting started guides. At a high level, we need to set up a new API, create a resource, add two methods to the resource, and configure the methods.

For each method, we need to configure the following aspects:

  • Method Request: define parameters to be recognized by the API
  • Integration Request: define the underlying service and the data passed to it
  • Integration Response: map the service response to a status code
  • Method Response: specify the possible HTTP status codes

For the GET method, we configure “challenge” as a URL query string parameter in the Method Request. In the Integration Request, we specify a “Mock Integration”, since we don’t need a real service in the background simply to echo a reply back to the caller. In the Integration Response, we select the Mapping Template for the 200 status code and, under the application/json content type, create a simple Mapping Template which echoes back the challenge parameter:

{
   "challenge":"$input.params('challenge')"
}

With this configuration, the API Gateway replies with the “challenge” query string parameter whenever called as a GET method.
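
For example, once the API is deployed, a challenge request and its reply might look like this (the host name, stage, and resource name are illustrative):

GET https://abc123.execute-api.us-east-1.amazonaws.com/prod/webhook?challenge=my-challenge-token

{"challenge":"my-challenge-token"}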

The POST method is a bit more complex as we need to call a Lambda function behind the scenes to do the actual work. We start by specifying Lambda as the integration type. We’ll want to pass in the information we receive from Alma, so we need to create a mapping template for the application/json content type. The mapping template takes the signature from the HTTP headers and wraps the rest of the input body as a JSON object called “body”:

{
  "signature":"$input.params().header.get('X-Exl-Signature')",
  "body":$input.json('$')
}

This configuration allows the entire payload to be accessed by the underlying function, along with the signature to be verified.

Now we can work on defining the Lambda service which will accept the method calls from the API Gateway.

Lambda Webhook Service

The code in our Lambda function must do the following:

  • Accept the input provided by Alma and proxied via the API Gateway
  • Validate the signature
  • Route the message to a relevant handler depending on the event type
  • Return a success or error response based on the outcome

We’ll create our example in Node.js (Lambda also supports Python and Java as of this writing). Our service receives the message sent by the API Gateway in JSON format with the following fields:

{
  "signature": "BASE64-HMACSHA256",
  "body": {
    "id": "1234567890",
    "action": "some_action",
    "some_object": {
      "field1": "value",
      "field2": "value"
    }
  }
}

First, we validate the signature, which ensures the message came from Alma. Alma signs the body payload with a Base64-encoded HMAC SHA256 hash. We make the same calculation and compare the result to the signature Alma sent; it’s important that the same secret configured in Alma is used to validate the signature in our code. If the hashes match, we continue processing.

var crypto = require('crypto');

// 'event' (the incoming message) and 'secret' (the shared secret configured in Alma)
// are assumed to be defined in the enclosing scope of the Lambda function.
function validateSignature(next) {
  var body = event.body;
  // Recompute the Base64-encoded HMAC SHA256 of the body using the shared secret
  var hash = crypto.createHmac('SHA256', secret)
    .update(JSON.stringify(body))
    .digest('base64');
  if (hash != event.signature) next("Signature invalid");
  else next(null, body);
}

Next we route the message based on the event type. We assume there is a file with the action name which exposes a method called “process”. We send the message body to the handler for processing.

function routeMessage(data, next) {
  try {
    // Dynamically load a handler module named after the action, e.g. job_end.js
    var processor = require("./" + data.action.toLowerCase() + ".js");
    processor.process(data, next);
  } catch(err) {
    if (err.code == 'MODULE_NOT_FOUND')
      next("Invalid action.");
    else next(err);
  }
}

We then call “done” on the context object to tell Lambda that we’ve finished processing, passing back any error that we received along the way.
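
To show how the pieces fit together, here is a minimal sketch of the handler itself (the variable names and the way the secret is provided are placeholders; the full sample in the repository may be structured differently):

// Shared state used by validateSignature above; the values here are illustrative.
var secret = "same-secret-configured-in-alma";
var event;

exports.handler = function(message, context) {
  event = message;  // make the incoming message visible to the helper functions
  validateSignature(function(err, body) {
    if (err) return context.done(err);
    routeMessage(body, function(err, result) {
      // Report the outcome back to Lambda; the API Gateway maps any error to an HTTP status code
      context.done(err, result);
    });
  });
};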

HTTP Responses and Errors

We now need to translate the outcome of the Lambda function into an HTTP status code and return that code to Alma. To do that, we go back to the API Gateway and configure the Method Response to allow 200, 400, 401, and 500. Then we edit the Integration Response to map the error message returned from Lambda to the appropriate HTTP status code.
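
For example, the Integration Response for the POST method could use Lambda error patterns along these lines (the mappings are illustrative and should match the error strings your function returns):

  • Default mapping (no error returned) → 200
  • Error matching “Signature invalid” → 401
  • Error matching “Invalid action.” → 400
  • Any other error → 500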

Configure Alma

To complete our end-to-end testing, we need to deploy the API in the AWS API Gateway. Once deployed, we can go to the URL provided by AWS and test that it responds in a browser. Then we’re ready to configure Alma to call our webhook.

We need to set up an integration profile in order for Alma to call our webhook. We create a new Webhook Integration Profile and use the URL exposed by the AWS API Gateway as the webhook listener URL. We also add a random secret to the profile and configure the same secret in our Lambda function. Assuming we configured the API Gateway correctly, our endpoint will answer Alma’s challenge and we will be able to activate the listener.

End-to-end Testing

As of this writing, Alma supports sending a webhook when a job is completed. Upon completion, Alma will POST an object with an action of “JOB_END” and a “job_instance” property which contains all of the information about the job which was run.
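
Stripped down to the fields used in this example, the payload looks roughly like the following (the values shown are illustrative):

{
  "id": "9876543210",
  "action": "JOB_END",
  "job_instance": {
    "name": "Some Scheduled Job",
    "submitted_by": {
      "value": "user_id"
    }
  }
}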

In our Lambda function, we’ll create a file called “job_end.js” which exposes a “process” function. Of course, you can do whatever business logic is required inside the process function. In our example, we will get the details of the user who executed the job and send an SMS message to let them know the job was completed. Our function looks as follows:

// Helper modules bundled with the sample code (paths are illustrative)
var alma = require('./alma');
var utils = require('./utils');

exports.process = function(data, callback) {
  var job = data.job_instance;

  // Get user phone & send SMS
  alma.get("/users/" + job.submitted_by.value,
    function(err, user) {
      if (err) return callback(err);
      var phone;
      if (user.contact_info.phone &&
          (phone = user.contact_info.phone.find(p => p.preferred_sms))) {
        utils.sendSms(phone.phone_number,
          "The job " + job.name + " has completed.",
          callback);
      } else {
        console.log("No preferred SMS number found.");
        callback(null);
      }
    });
}

Now when a job completes, the user is notified immediately.

This is just the beginning for webhooks. As Alma continues to release support for additional events, we’ll see even more opportunities for seamless integration with other systems.

The code for this sample is available in this GitHub repository.

