
POSTED Jan 2019 · IN Serverless

What Node.js Custom Runtime Brings and How We Developed It?

İbrahim Gürses


Ex-Software Engineer @Thundra


“Serverless monitoring with no code change and zero overhead” is our mantra at Thundra. I know it is a bold statement, but we have been working hard to fulfill this claim. First, we achieved “zero overhead” by leveraging async monitoring, and even today Thundra is still the only agent-based solution that supports async monitoring. Then, we brought automatic monitoring support to our NodeJS agent: our serverless plugin automatically wraps your AWS Lambda functions, so you can deploy them without changing anything in your code. Even with these convenient options, what if there were an even simpler way to enable Thundra's monitoring features on an already deployed AWS Lambda function, with just a few changes in the AWS Lambda Console? AWS Lambda Layers and Custom Runtimes come to the rescue.


Thundra NodeJS Custom Runtime lets you monitor your AWS Lambda Functions with no code change. Check Thundra Docs for detailed instructions. You can also check out this video which showcases how you can integrate Thundra with your AWS Lambda function in less than a minute.

AWS Lambda Layers and Custom Runtime

AWS re:Invent 2018 has come and gone with exciting announcements for the serverless community. We were particularly excited about the Layers and Bring Your Own Runtime support because they could enable us to deliver the Thundra NodeJS agent with a literal “no code change” approach. Thundra was one of the launch partners for this announcement, which was a thrilling moment for us because we were able to play with this technology beforehand. For those who do not know what Layers and Custom Runtimes are:

  • Lambda Layers, a way to centrally manage code and data that is shared across multiple functions.

  • Lambda Runtime API, a simple interface to use any programming language, or a specific language version, for developing your functions. 

So, How Did We Develop It?

So, in order to develop a Custom Runtime for NodeJS, we needed to learn how the current NodeJS runtime of AWS Lambda works. For those who are curious like Janaka Bandara, you already know you can run shell commands in AWS Lambda and learn about the underlying OS-level details. With a few lines of NodeJS code like the snippet below, you can run a shell command inside AWS Lambda and get its output.


const { exec } = require('child_process');

exports.handler = (event, context, callback) => {
  exec(event.cmd, (err, stdout, stderr) => {
    // Log stderr on failure, stdout on success
    if (err) console.log(stderr);
    else console.log(stdout);
    callback(null, { statusCode: 200 });
  });
};


You can run the code above on AWS Lambda: it is a simple Lambda function that receives a command, forks a child process to execute it, and finally logs the output to the console. For example, if you invoke this Lambda function with the JSON payload {"cmd": "pwd"}, it will log /var/task to the console. On Linux systems, the pwd command prints the current working directory, so we can see that the AWS Lambda NodeJS runtime runs user code under the /var/task folder.

Now, with the help of these simple shell commands, we could see how the current AWS NodeJS runtime operates: how it parses the handler, what it does for initialization, and which environment variables exist in the container. We extracted the index.js file which AWS Lambda uses to run user functions. It taught us a lot about how AWS builds the Lambda Context object, which environment variables are used, and how the execution loop is constructed. Sure, there were some native calls to C++ add-ons impairing our ability to analyze the code, but it gave us a general idea for developing our custom runtime. You can take a look at this code from here.

Then we started developing our custom runtime, but we had a major problem: the Custom Runtime environment only included NodeJS version 4.3, an archaic version that our agent was not designed to work on. So, we needed our own NodeJS binary. AWS Lambda runs Custom Runtimes on Amazon Linux with this AMI specifically, so we started an EC2 instance with that AMI and built NodeJS from source on that instance. We created a node binary for NodeJS version 8.10 for the Thundra Custom Runtime. We could have included the latest version of NodeJS, but we did not have time to test our agent against it, so we shipped with 8.10; we can provide a layer with the latest versions of NodeJS in the future.
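Building Node from source on such an instance roughly boils down to the standard configure-and-make dance. A sketch of the steps (package names and flags are illustrative; adjust for your instance and desired version):

```shell
# On an EC2 instance launched from the Amazon Linux AMI that Lambda uses:
sudo yum groupinstall -y "Development Tools"

# Fetch and unpack the Node 8.10 source
curl -LO https://nodejs.org/dist/v8.10.0/node-v8.10.0.tar.gz
tar xzf node-v8.10.0.tar.gz
cd node-v8.10.0

# Configure and compile; the resulting out/Release/node binary is what
# gets copied into the layer as the bundled runtime binary.
./configure
make -j"$(nproc)"
```

Building on the same AMI matters because the binary must link against the libraries present in the Lambda execution environment.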

After finishing the implementation of our Custom Runtime, we packaged everything as an AWS Lambda Layer so that users can simply add our runtime as a layer and start using Thundra. The final version of the layer includes these files:


|-- bootstrap  # entry point for custom runtime, starts index.js
|-- node
| |-- index.js # Custom Runtime Implementation         
| |-- node # Node 8.10 binary, used by Custom Runtime              
| |-- node_modules # Contains Thundra NodeJS agent dependencies     
| |-- package-lock.json # Contains Thundra NodeJS agent build info
| |-- package.json # Contains Thundra NodeJS agent build info
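The bootstrap file is the entry point AWS Lambda looks for in any custom runtime. A minimal sketch of what such a script can look like (paths follow the layer layout above; Thundra's actual bootstrap may differ):

```shell
#!/bin/sh
# AWS Lambda extracts layers under /opt, so the layout above becomes
# /opt/bootstrap, /opt/node/index.js, /opt/node/node, and so on.
# Start the bundled Node 8.10 binary with the runtime implementation.
exec /opt/node/node /opt/node/index.js
```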

So, how did we implement the custom runtime? We followed the steps in the official AWS doc, with one extra step: we automatically wrap your functions with Thundra before running your code, so you don't need to wrap your code with our handler at all. Our custom runtime implementation consists of the steps below.

 1. Get an event – Call the next invocation API to get the next event. The response body contains the event data. Response headers contain the request ID and other information.

 2. Create a context object – Create an object with context information from environment variables and headers in the API response.

 3. Wrap the user function with the Thundra agent – The Thundra agent relies on wrapping for monitoring; with the custom runtime, we do that wrapping for you.

 4. Invoke the wrapped function handler – Pass the event and context object to the handler wrapped with Thundra.

 5. Handle the response – Call the invocation response API to post the response from the handler.

 6. Handle errors – If an error occurs, call the invocation error API.

 7. Cleanup – Release unused resources, send data to other services, or perform additional tasks before getting the next event.

 8. Wait for the next invocation – The Custom Runtime freezes the container; when a new request comes, AWS thaws the container, and we go back to step 1 and continue the loop.
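The steps above revolve around a handful of Runtime API endpoints. The endpoint paths below come from the AWS Lambda Runtime API (version 2018-06-01); the function names and the toy wrapper are illustrative, not Thundra's actual implementation:

```javascript
const API_VERSION = '2018-06-01';

// Step 1 – the next-invocation endpoint (long-polls until an event arrives).
function nextInvocationUrl(runtimeApi) {
  return `http://${runtimeApi}/${API_VERSION}/runtime/invocation/next`;
}

// Step 5 – where the handler's result is posted back.
function invocationResponseUrl(runtimeApi, requestId) {
  return `http://${runtimeApi}/${API_VERSION}/runtime/invocation/${requestId}/response`;
}

// Step 6 – where handler errors are reported.
function invocationErrorUrl(runtimeApi, requestId) {
  return `http://${runtimeApi}/${API_VERSION}/runtime/invocation/${requestId}/error`;
}

// Step 3 – a toy stand-in for the Thundra wrapper: the real agent records
// traces, metrics, and logs around the user handler instead of just timing it.
function wrapHandler(handler) {
  return (event, context, callback) => {
    const start = Date.now();
    handler(event, context, (err, result) => {
      console.log(`handler took ${Date.now() - start} ms`);
      callback(err, result);
    });
  };
}
```

The runtime loop GETs nextInvocationUrl (the AWS_LAMBDA_RUNTIME_API environment variable holds the host and port), builds the context from the response headers, invokes the wrapped handler, and POSTs the result or error back; between iterations the container is frozen by Lambda itself, not by the runtime code.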

To visualize things better, the diagram below shows what happens during the first invocation that reaches the Custom Runtime container, when the Runtime API initializes everything. The part annotated Cold Start happens each time AWS Lambda creates a brand new container. The part annotated Invoke Phase runs inside a loop: each new invocation that reaches an already-alive container runs the invoke phase again.


Figure 1: Sequence diagram of what happens when the first request comes

No code change

Layers and the Runtime API are two powerful features offered by AWS Lambda. They enabled us to run the Thundra monitoring agent with no code change on a serverless platform. In this blog post, I tried to walk you through how we developed the Thundra Custom Runtime by explaining its inner workings and how we use a Layer to deliver our NodeJS agent. The value of AWS Lambda is its simplicity and how it abstracts away the underlying platform, and Custom Runtimes may seem to go against those values. At the end of the day, the beauty of serverless is to focus on our business problems, not the platforms or runtimes they run on. But for those who, like us, are developing services for AWS Lambda, these are fantastic technologies that will bring many cool things to the serverless ecosystem.

You can try the Thundra NodeJS Custom Runtime by following the simple instructions in the Thundra Docs. You can also check out this video, which showcases how you can integrate Thundra with your AWS Lambda function in less than a minute. Thundra is free to use, and setup takes only a few minutes. Get your account and start using Thundra today.