Javascript Asynchronicity Patterns


In this article, we discuss asynchronicity in Javascript, its evolution over time, and some of the patterns for dealing with various asynchronous challenges. We'll discuss callbacks, promises, and the present-day async/await approach to asynchronicity and when and how each of these are relevant in modern applications.

What is Asynchronicity?

Asynchronous code is when pieces of your code execute at different times, often with no guarantee of execution order. This comes up in different forms and in a lot of different languages with things like threading, subprocesses, and, yes, asynchronous functions.

Javascript is asynchronous by default. Where in other languages like Python or Java you choose when to do things asynchronously, JS behaves that way out of the box. This introduces some inherent frustration when you don't know when a piece of code will execute.

function doSomething() {
  var delay = 1000;
  setTimeout(function() {
    console.log('I am asynchronous');
  }, delay);
}

console.log('I am first');
doSomething();
console.log('I am last');

When this runs, we get the unintuitive result:

I am first
I am last
I am asynchronous

Now what happens if the delay in our doSomething() function is unpredictable? Maybe it's a call to an external API with many other consumers, maybe it's a database query, or maybe it's a file read from disk. In any of those scenarios, the timing is unpredictable.

Javascript deals with this with something called the event loop. The event loop kicks off asynchronous operations, continues executing the rest of the code, and then handles the results as the asynchronous operations complete. In fact, this is one of the reasons Javascript is particularly good at I/O-heavy workloads.
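To make the ordering concrete, here's a minimal sketch (my own, not from the example above) showing that all synchronous code runs to completion before even a 0 ms timer callback fires:

```javascript
// All synchronous code runs to completion before any timer callback fires,
// even one scheduled with a 0ms delay.
const order = [];

order.push('first');
setTimeout(() => order.push('timer callback'), 0);
order.push('last synchronous line');

// At this point `order` is ['first', 'last synchronous line'];
// 'timer callback' is only appended once the call stack is empty.
```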

Async Approaches

Callbacks

Asynchronicity in Javascript begins with the concept of callbacks. A callback is a function passed as an argument to an asynchronous function and called when the asynchronous work is done. In one of the better analogies I've heard, David Malan describes callbacks as analogous to a fast-food line in Harvard's CS164.

As an example, consider how we use the Node.js fs.readFile function, which takes a callback as its final argument.

const fs = require('fs');
const filePath = 'README.md';
const options = {};

function myCallback(err, data) {
  console.log('Err ', err);
  console.log('Data ', data);
}

fs.readFile(filePath, options, myCallback);

A shorter way of writing this is to use an anonymous function in the readFile call. It's worth noting that the body of the arrow function below is inside the callback; this has implications for variable scope and control flow that we'll discuss more later.

fs.readFile(filePath, options, (err, data) => {
  console.log('Err ', err);
  console.log('Data ', data);
});

In this case the callback is passed two arguments, err and data. While this isn't enforced by the language, this "error-first" pattern, where the first argument is an error and the second argument is data, is a very common convention. We'll see more of this later.
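As a sketch of what this convention looks like from the implementer's side (divideAsync here is a made-up example, not part of Node.js):

```javascript
// A made-up asynchronous function following the error-first convention:
// on failure, the callback gets an Error as its first argument;
// on success, it gets null followed by the data.
function divideAsync(a, b, callback) {
  setTimeout(() => {
    if (b === 0) {
      callback(new Error('Cannot divide by zero'));
    } else {
      callback(null, a / b);
    }
  }, 10);
}

divideAsync(10, 2, (err, result) => {
  if (err) return console.log('Err ', err.message);
  console.log('Data ', result);
});
```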

The Problem with Callbacks

The biggest problem that comes up with callbacks is something colloquially known as callback hell. This is what happens when you use anonymous functions whose bodies call other functions that take callbacks, and so on. Let's look at it visually. Suppose I need to read from a file and then need to make additional calls to asynchronous functions.

fs.readFile(filePath, options, (err, data) => {
  const nextFile = data.toString();
  fs.readFile(nextFile, options, (err, data) => {
    fs.readFile(data.toString(), options, (err, data) => {
      fs.readFile(data.toString(), options, (err, data) => {
        // do more things, maybe more callbacks?
      });
    });
  });
});

The problem is that this quickly becomes very hard to read and understand, making your code nearly unmaintainable due to something called cognitive complexity.

Cognitive Complexity

Cognitive Complexity is a measure defined by SonarSource as a way of measuring understandability of source code. It builds upon the concept of cyclomatic complexity, which is effectively a measure of the number of branches in a piece of code. Cognitive complexity extends this concept by also taking into account the role indentation and nested code plays in understandability.

For example, highly nested code with complex logic is much harder to understand than flat code. Consider the following two programs, each of which prints all leap years from year 1 to 10,000:

leap_year_complex.js

for (let i = 1; i < 10000; i++) {
  let isLeapYear = false;
  // If year is divisible by 4, it's a leap year
  if (i % 4 == 0) {
    isLeapYear = true;
    // unless it's divisible by 100
    if (i % 100 == 0) {
      isLeapYear = false;
      // but not if it's divisible by 400
      if (i % 400 == 0) {
        isLeapYear = true;
      }
    }
  }
  if (isLeapYear) {
    console.log(i);
  }
}
leap_year_simple.js

// loop over every 4 years
for (let i = 4; i < 10000; i += 4) {
  // if year is divisible by 100 and not 400, it's not a leap year
  if (i % 100 == 0 && i % 400 != 0) {
    continue;
  }
  // Otherwise, it's a leap year
  console.log(i);
}

Which of these is easier to understand? The second is not only shorter, but less nested and easier to read. Imagine how much harder this would be to understand if the nested content were dozens of lines of code.

Promises

One way the Javascript language began to address this is with promises. A promise represents an asynchronous operation that completes by either resolving with data or rejecting with an error.

The function passed into the Promise constructor has two arguments: resolve and reject. These are functions to be called on success or failure respectively in a way that is not that dissimilar to a callback.

let aPromise = new Promise(function(resolve, reject) {
  setTimeout(function() {
    let randomNumber = Math.random();

    // Succeed most of the time
    if (randomNumber > 0.1)
      resolve(randomNumber);
    // fail ~10% of the time
    else
      reject({message: 'Random number was too small', number: randomNumber});
  }, 1000);
});

// This defines what to do with the completed promise
aPromise.then(
  function(value) { console.log('INFO: ', value) },
  function(error) { console.log('ERROR: ', error) }
);

We can make functions return promises rather than accepting callbacks as input. Additionally, rather than passing two functions to the .then(), we can use .catch() to handle errors. In the example below, we define a function that returns a promise and uses both .then and .catch to handle the results.

function aPromiseFunction() {
  return new Promise(function(resolve, reject) {
    setTimeout(function() {
      let randomNumber = Math.random();

      // Succeed most of the time
      if (randomNumber > 0.1)
        resolve(randomNumber);
      // fail ~10% of the time
      else
        reject({message: 'Random number was too small', number: randomNumber});
    }, 1000);
  });
}

// Use our promise function and handle results
aPromiseFunction()
  .then((result) => { console.log('INFO: ', result) })
  .catch((error) => { console.log('ERROR: ', error) });

You might look at this (much like I did when I first saw promises) and wonder "how is this better than callbacks?" After all, the promise function is very indented, there are functions inside functions, and it's confusing to read. A callback version of the same code might look something like:

function aCallbackEquivalent(callback) {
  setTimeout(function() {
    let randomNumber = Math.random();

    // Succeed most of the time
    if (randomNumber > 0.1)
      callback(null, randomNumber);
    // fail ~10% of the time
    else
      callback('Random number was too small', randomNumber);
  }, 1000);
}

This is arguably cleaner. The value of promises comes when we're consuming them and when we need to chain many asynchronous function calls together. Let's look at an example that makes multiple database calls before returning a result. In this code, we use mongoose to look up a user from a database, then get all of the user's projects, and finally return a list of project IDs.

User.findOne({username: 'foo'})
  .then((user) => {
    // When the user is found, find all projects where the user is a member
    return Project.find({members: user._id});
  })
  .then((projects) => {
    // When projects are found, return a 200 with the list of project IDs
    let results = projects.map(project => project._id);
    return res.send(results);
  })
  .catch((error) => {
    // If an error occurs, return a 500 and the error message
    res.status(500).send(error.message);
  });

Compare this to having several nested callbacks. This code is much flatter and much more readable. We can keep making more and more asynchronous calls and simply add more .then statements to the chain without nesting the code any deeper.

The MDN guide on using promises gets into some additional detail on the benefits of promises including guarantees, some of the details around chaining, and error propagation to a single catch block.
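As a small illustration of that last point, a rejection (or thrown error) anywhere in a chain skips the remaining .then handlers and lands in the single .catch. This is a standalone sketch, not code from the examples above:

```javascript
// A throw in any .then handler propagates past the remaining handlers
// to the single .catch at the end of the chain.
function chainExample() {
  return Promise.resolve(1)
    .then((n) => n + 1)
    .then((n) => {
      throw new Error('failed at step ' + n);
    })
    .then(() => 'never reached')
    .catch((error) => 'Caught: ' + error.message);
}

chainExample().then((result) => console.log(result)); // Caught: failed at step 2
```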

Promisify

We mentioned earlier the common callback convention where a callback takes two arguments: error and data. This pattern is so common that Node.js 8.0.0 added the util.promisify function.

util.promisify takes as input a function whose last argument is a callback following this convention, and returns a new function that returns a promise instead. We can convert our file read example above into a promise like this:

const util = require('util');
const fs = require('fs');
const filePath = 'README.md';
const options = {};
const readFile = util.promisify(fs.readFile);

readFile(filePath, options)
  .then((data) => { console.log('Data ', data); })
  .catch((err) => { console.log('Error ', err); });

This allows us to start shifting existing code away from callbacks and towards promises.

Async / Await

Promises still have their limitations. One of the biggest is that control flow still enters the anonymous functions defined in the .then and .catch blocks, which affects variable scope and the ability to return cleanly from the top-level function. This often means the calling function also needs to return a promise.

Javascript improved on asynchronous handling again in ECMAScript 2017 (this corresponds roughly to Node.js 8) by introducing the async and await syntax. This addition to the language allows promises to be handled in a flatter, seemingly more synchronous style that makes them easier to work with.

To do this, our code needs to be in a function that's explicitly declared asynchronous using the async keyword. Once inside an async function, we can use the await keyword to wait for the result of a resolved promise, setting the resolved result to a variable. We then handle errors using a try-catch block much more reminiscent of synchronous code. Let's look at an example using our promisified file read.

const util = require('util');
const fs = require('fs');
const filePath = 'README.md';
const options = {};
const readFile = util.promisify(fs.readFile);

async function main() {
  try {
    let data = await readFile(filePath, options);
    console.log('Data: ', data.toString());
  }
  catch (error) {
    console.log('Error: ', error);
  }
}

main();

Now we can immediately see a syntax and a control flow that we're likely already comfortable with from languages like Java or Python. There are a number of benefits to this including limiting nested code, avoiding variable scope changes introduced by anonymous functions, and allowing our top-level function to return a result rather than wrapping the function body in a promise definition.
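On that last point: because an async function always returns a promise, the value we return inside it becomes the resolved value that callers can await or .then. A minimal sketch:

```javascript
// The return value of an async function becomes the resolved value
// of the promise it returns.
async function getAnswer() {
  const value = await Promise.resolve(21); // stand-in for real async work
  return value * 2;
}

getAnswer().then((answer) => console.log(answer)); // 42
```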

Patterns

The reality of working in Javascript is it's a language that has evolved a lot in recent years and, as a result, we often need to know how to work with all of the asynchronous approaches above for one reason or another.

Managing Callbacks

Sometimes callbacks are unavoidable. One example is jQuery's ajax function, which doesn't take a callback directly, but instead takes a settings object with properties such as success and error that can be set to callback functions.

There are two useful techniques to manage the complexity introduced by callbacks:

  1. Wrap it in a promise where you can. It makes the immediate code a bit more complex, but it allows you to re-use that code using promises and async/await.

  2. Avoid anonymous functions. Name your functions and define them separately to avoid highly nested code. One scenario where this makes sense is when your code is wrapped in a promise and needs to call the resolve function from inside the callback.

To convert callbacks into promises, let's look at a couple of examples. The example below takes a sample function that uses a non-conventional callback form and converts it to a promise.

// Original
function wrapperFunction(successCb, failureCb) {
  // ... here is some code ...

  someAsyncFunction(options, function(param1, param2, param3, param4) {
    // do more things
  }, moreOptions);

  // ... here is some more code ...
}

// As a Promise
function wrapperFunction2() {
  return new Promise((resolve, reject) => {
    someAsyncFunction(options, function(param1, param2, param3, param4) {
      // Do some logic to check for errors
      if (param1 != 'foo' && param2 != 'bar') {
        return reject('An error occurred');
      }
      resolve([param1, param2, param3, param4]);
    }, moreOptions);
  });
}

Let's look at a more tangible example using jQuery. We begin with a function that we've implemented to make an API call to a server. The function takes two arguments, a success handler callback and a failure handler callback.

function makeApiCall(successHandler, failureHandler) {
  $.ajax({
    url: '/api/objects/1',
    method: 'GET',
    success: successHandler,
    error: failureHandler
  });
}

There are a few problems with this. Beyond its non-conventional form, the caller of makeApiCall needs to know the function signatures of successHandler and failureHandler. By turning this into a promise, we convert it into a standardized form for the caller.

function makeApiCall() {
  return new Promise((resolve, reject) => {
    $.ajax({
      url: '/api/objects/1',
      method: 'GET',
      success: (data) => { resolve(data) },
      error: (error) => { reject(error) }
    });
  });
}

With this new form, we can now use this in promise chains or with async/await.
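For example, a caller could now consume it with async/await. The stub below stands in for the jQuery-based makeApiCall so the sketch is self-contained:

```javascript
// Stub standing in for the promise-returning makeApiCall above;
// the object shape ({ id: 1 }) is made up for illustration.
function makeApiCall() {
  return new Promise((resolve) => {
    setTimeout(() => resolve({ id: 1 }), 10);
  });
}

async function loadObject() {
  try {
    const data = await makeApiCall();
    console.log('Loaded object ', data.id);
    return data;
  } catch (error) {
    console.log('Request failed ', error);
  }
}

loadObject();
```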

Promisify

When you have functions that use the standard error/data callback convention, use promisify. You can then use promise chaining or async/await to keep your code clean.

const { promisify } = require('util');
const someFunction = promisify(someFunctionWithCallback);

// Promise chaining
someFunction()
  .then(...)
  .catch(...)

// Async/await (inside an async function)
try {
  var result = await someFunction();
}
catch (err) {
  console.log(err);
}

Scope Problems

One of the major problems with callbacks and promises is the scope change of nested anonymous functions. The example below is based on a real example that relied on the AWS SDK for Javascript to make calls to Amazon Cognito.

This code snippet implements a doSignUp() function that makes a call to Amazon Cognito using the AWS SDK's signUp function.

function doSignUp() {
  return new Promise((resolve, reject) => {
    sdk.signUp(options, async (error, result) => {
      // I want to return something from doSignUp here ...
      return result;
    });
  });
}

The problem is, this doesn't work. To see why, let's first rewrite this by naming all the anonymous functions.

function doSignUp() {
  return new Promise(function myPromiseFunction(resolve, reject) {
    sdk.signUp(options, async function signUpCallback(error, result) {
      // I want to return something from doSignUp here ...
      return result;
    });
  });
}

Now we can see a little more clearly that the return statement doesn't return from doSignUp; it returns from signUpCallback. To properly handle this, we'd need to call resolve(result) rather than return result. But we can do better.
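That resolve-based fix would look something like this (with a mock sdk object standing in for the AWS SDK so the sketch runs on its own; the result value is made up):

```javascript
// Mock sdk standing in for the AWS SDK's Cognito client.
const sdk = {
  signUp(opts, callback) {
    setTimeout(() => callback(null, { userSub: 'abc-123' }), 10);
  }
};
const options = {};

function doSignUp() {
  return new Promise((resolve, reject) => {
    sdk.signUp(options, (error, result) => {
      if (error) return reject(error); // surface errors through the promise
      resolve(result); // resolve instead of return
    });
  });
}

doSignUp().then((result) => console.log(result));
```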

Let's rewrite this again using util.promisify to remove the callback and use async/await to remove the promise constructor entirely.

async function doSignUp() {
  // bind so signUp keeps its `this` context when called by the promisified wrapper
  const signUpPromisified = util.promisify(sdk.signUp.bind(sdk));
  return signUpPromisified(options);
}

In this case our signUpPromisified returns a promise and our code is much cleaner.

Promise.all

Another common pattern is when you need to execute a loop whose body does something asynchronous. I've found this occurs on occasion with database calls or similar. The following example shows the .then block of a promise that looks up all of a user's projects in a database and then manipulates those projects. This might occur when a user is deleted from the database and also needs to be removed from associated projects.

Project.find(query)
  .then((projects) => {
    projects.forEach((project) => {
      project.foo = bar; // make some changes to the project
      project.save(); // .save is asynchronous
    });

    // This is problematic...what happens if the save operations aren't done?
    return projects;
  })
  .then(projects => { /* do more stuff */ })

We can address this with the Promise.all function. Promise.all takes an array of promises and returns a single promise that resolves to an array of the resolved values of all the input promises.

Project.find(query)
  .then((projects) => {
    // We'll store an array of promises
    const promises = [];

    projects.forEach((project) => {
      project.foo = bar; // make some changes to the project
      promises.push(project.save()); // .save is asynchronous
    });

    // Return a single promise with aggregated results.
    return Promise.all(promises);
  })
  .then(projects => { /* do more stuff */ })

When any of the promises rejects, the aggregated promise also rejects. For slightly different behavior, we can use Promise.allSettled to inspect the result of each promise rather than rejecting the whole batch when one fails.
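A quick sketch of the difference, with stand-in promises rather than real database saves:

```javascript
// Promise.allSettled reports the status of every promise instead of
// rejecting the whole batch on the first failure.
const saves = [
  Promise.resolve('project-1 saved'),
  Promise.reject(new Error('project-2 failed')),
  Promise.resolve('project-3 saved')
];

Promise.allSettled(saves).then((results) => {
  results.forEach((result) => {
    if (result.status === 'fulfilled') {
      console.log('OK ', result.value);
    } else {
      console.log('FAILED ', result.reason.message);
    }
  });
});
```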

Await Performance

One of the problems we run into with async/await is performance, particularly when handling a situation similar to the Promise.all case above. Let's explore what happens in a similar situation, but using async/await instead.

Project.find(query)
  .then(async (projects) => {
    const savedProjects = [];
    for (const project of projects) {
      project.foo = bar;
      let savedProject = await project.save();
      savedProjects.push(savedProject);
    }
    return savedProjects;
  })

The problem here arises when we await project.save(). This blocks the code after it from executing until the .save() promise is resolved. Thus, each iteration of the loop becomes blocking and for a large list of projects, this could introduce a major performance hit.

How do we fix it?

Let's start by looking at a simpler example that we can try and test without any dependencies on an external database. We begin by introducing a simple asynchronous delay function. There is no doubt this is an overly simplistic and utterly useless function, but it should illustrate what's going on.

function delay(ms) {
  return new Promise((resolve, reject) => {
    setTimeout(function() {
      resolve(true);
    }, ms);
  });
}

Now suppose we need to call this asynchronous function multiple times.

async function main() {
  await delay(1000);
  await delay(1000);
  await delay(1000);
}

// time it
console.time('main');
main().then(() => console.timeEnd('main'));

We find that this takes a little over 3 seconds to execute. Why? Well, when the first delay is called, we await the result before calling the next delay function. Thus each delay is effectively synchronous and blocking.

To allow them to execute in parallel, we need to call the delay function without awaiting the result and therefore allow the body of the promise to start executing. We can store the promise that is returned and then, only when we need to synchronize the results, do we explicitly await the promises.

async function mainFast() {
  const delayPromise1 = delay(1000);
  const delayPromise2 = delay(1000);
  const delayPromise3 = delay(1000);
  await delayPromise1;
  await delayPromise2;
  await delayPromise3;
}

// time it
console.time('mainFast');
mainFast().then(() => console.timeEnd('mainFast'));

Unlike our first example, the mainFast example above executes in a little over 1 second (rather than 3s).
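The same synchronization can be expressed more compactly with the Promise.all approach from earlier. This sketch redefines delay so it runs on its own:

```javascript
function delay(ms) {
  return new Promise((resolve) => setTimeout(() => resolve(true), ms));
}

async function mainFastAll() {
  // Start all three timers immediately, then wait for them all at once.
  await Promise.all([delay(1000), delay(1000), delay(1000)]);
}

console.time('mainFastAll');
mainFastAll().then(() => console.timeEnd('mainFastAll')); // ~1s, not ~3s
```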

A More Realistic Example

Let's look at one more example of this. Consider a scenario where we need to make N database calls that take 5-6 ms each. We can mock this, very similar to our delay example.

function mockDbCall() {
  return new Promise((resolve, reject) => {
    setTimeout(function() {
      resolve(true);
    }, 5 + Math.random());
  });
}

Now let's use both of the above approaches and compare their performance for N operations.

var N = 1000;

async function main() {
  for (let i = 0; i < N; i++) {
    await mockDbCall();
  }
}

async function mainFast() {
  const promises = [];
  for (let i = 0; i < N; i++) {
    promises.push(mockDbCall());
  }
  await Promise.all(promises);
}

// time main
console.time('main');
main().then(res => console.timeEnd('main'));

// time mainFast
console.time('mainFast');
mainFast().then(res => console.timeEnd('mainFast'));

Our results are pretty astonishing. There is some randomness in this test, but in my first run, main took 6.556s and mainFast took 0.536s. This is an order of magnitude difference and can have a significant impact when this pattern is used (or used poorly) to implement an API endpoint.

Summary

Javascript is an asynchronous language, and working in JS means that asynchronous code is an inevitability. Wrapping my head around asynchronous programming was one of the harder professional learning curves I've had to endure; it's not easy.

The patterns above aren't all-inclusive; they're just some of the ones I've encountered most often. Learning how to write asynchronous code (in any language, not just JS) and how to deal with some of the common situations can save you a lot of headache. So take the time to learn and practice.