Writing asynchronous JavaScript without using the Promise
object is a lot like baking a cake with your eyes closed. It can be done, but it’s gonna be messy and you’ll probably end up burning yourself.
I won’t say it’s strictly necessary, but you get the idea: the Promise object is real nice. Sometimes, though, it needs a little help to solve some unique challenges, like when you’re trying to resolve a bunch of promises sequentially, one after the other. A trick like this is handy, for example, when you’re doing some sort of batch processing via AJAX. You want the server to process a bunch of things, but not all at once, so you space the processing out over time.
Ruling out packages that help make this task easier (like Caolan McMahon’s async library), the most commonly suggested solution for sequentially resolving promises is to use Array.prototype.reduce(). You might’ve heard of this one. Take a collection of things, and reduce them to a single value, like this:
let result = [1, 2, 5].reduce((accumulator, item) => {
  return accumulator + item;
}, 0); // <-- Our initial value.

console.log(result); // 8
But, when using reduce() for our purposes, the setup looks more like this:
let userIDs = [1, 2, 3];

userIDs.reduce((previousPromise, nextID) => {
  return previousPromise.then(() => {
    return methodThatReturnsAPromise(nextID);
  });
}, Promise.resolve());
Or, in a more modern format:
let userIDs = [1, 2, 3];

userIDs.reduce(async (previousPromise, nextID) => {
  await previousPromise;
  return methodThatReturnsAPromise(nextID);
}, Promise.resolve());
This is neat! But for the longest time, I just swallowed this solution and copied that chunk of code into my application because it “worked.” This post is me taking a stab at understanding two things:
- Why does this approach even work?
- Why can’t we use other Array methods to do the same thing?
Why does this even work?
Remember, the main purpose of reduce() is to “reduce” a bunch of things into one thing, and it does that by storing up the result in the accumulator as the loop runs. But that accumulator doesn’t have to be numeric. The loop can return whatever it wants (like a promise), and recycle that value through the callback every iteration. Notably, no matter what the accumulator value is, the loop itself never changes its behavior — including its pace of execution. It just keeps rolling through the collection as fast as the thread allows.

This is huge to understand because it probably goes against what you think is happening during this loop (at least, it did for me). When we use it to sequentially resolve promises, the reduce() loop isn’t actually slowing down at all. It’s completely synchronous, doing its normal thing as fast as it can, just like always.
Look at the following snippet and notice how the progress of the loop isn’t hindered at all by the promises returned in the callback.
function methodThatReturnsAPromise(nextID) {
  return new Promise((resolve, reject) => {
    setTimeout(() => {
      console.log(`Resolve! ${dayjs().format('hh:mm:ss')}`);
      resolve();
    }, 1000);
  });
}

[1, 2, 3].reduce((accumulatorPromise, nextID) => {
  console.log(`Loop! ${dayjs().format('hh:mm:ss')}`);
  return accumulatorPromise.then(() => {
    return methodThatReturnsAPromise(nextID);
  });
}, Promise.resolve());
In our console:
"Loop! 11:28:06"
"Loop! 11:28:06"
"Loop! 11:28:06"
"Resolve! 11:28:07"
"Resolve! 11:28:08"
"Resolve! 11:28:09"
The promises resolve in order as we expect, but the loop itself is quick, steady, and synchronous. After looking at the MDN polyfill for reduce(), this makes sense. There’s nothing asynchronous about a while() loop triggering the callback() over and over again, which is what’s happening under the hood:
while (k < len) {
  if (k in o) {
    value = callback(value, o[k], k, o);
  }
  k++;
}
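To make that concrete, here’s a simplified, hypothetical stand-in for reduce() (not the actual polyfill) showing that nothing in the loop waits on the accumulator, even when the accumulator is a promise:

```javascript
// A simplified, hypothetical stand-in for reduce(): just a plain loop.
// Whatever the callback returns becomes the next accumulator, and the
// loop itself never pauses, no matter what that value is.
function simpleReduce(array, callback, initialValue) {
  let accumulator = initialValue;
  for (let k = 0; k < array.length; k++) {
    accumulator = callback(accumulator, array[k], k, array);
  }
  return accumulator;
}

// Plain reduction works as usual...
console.log(simpleReduce([1, 2, 5], (acc, item) => acc + item, 0)); // 8

// ...and so does promise chaining, because the promise is just
// another accumulator value being handed back to the callback.
simpleReduce(
  [1, 2, 3],
  (previousPromise, nextID) => previousPromise.then(() => Promise.resolve(nextID)),
  Promise.resolve()
);
```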
With all that in mind, the real magic occurs in this piece right here:
return previousPromise.then(() => {
  return methodThatReturnsAPromise(nextID);
});
Each time our callback fires, we return a promise that resolves to another promise. And while reduce() doesn’t wait for any resolution to take place, the advantage it does provide is the ability to pass something back into the same callback after each run, a feature unique to reduce(). As a result, we’re able to build a chain of promises that resolve into more promises, making everything nice and sequential:
new Promise((resolve, reject) => {
  // Promise #1
  resolve();
}).then((result) => {
  // Promise #2
  return result;
}).then((result) => {
  // Promise #3
  return result;
}); // ... and so on!
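Concretely, for [1, 2, 3], the reduce() loop unrolls into the equivalent of this hand-written chain. The helper below is a hypothetical stand-in that records the order in which IDs actually resolve:

```javascript
// Hypothetical helper: resolves an ID after a short delay and
// records the order in which resolutions actually happen.
const resolved = [];

function methodThatReturnsAPromise(nextID) {
  return new Promise((resolve) => {
    setTimeout(() => {
      resolved.push(nextID);
      resolve(nextID);
    }, 100);
  });
}

// What the reduce() loop effectively builds for [1, 2, 3]:
const chain = Promise.resolve()
  .then(() => methodThatReturnsAPromise(1))
  .then(() => methodThatReturnsAPromise(2))
  .then(() => methodThatReturnsAPromise(3));

chain.then(() => console.log(resolved)); // [ 1, 2, 3 ]
```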
All of this should also reveal why we can’t just return a single, new promise each iteration. Because the loop runs synchronously, each promise will be fired immediately, instead of waiting for those created before it.
[1, 2, 3].reduce((previousPromise, nextID) => {
  console.log(`Loop! ${dayjs().format('hh:mm:ss')}`);
  return new Promise((resolve, reject) => {
    setTimeout(() => {
      console.log(`Resolve! ${dayjs().format('hh:mm:ss')}`);
      resolve(nextID);
    }, 1000);
  });
}, Promise.resolve());
In our console:
"Loop! 11:31:20"
"Loop! 11:31:20"
"Loop! 11:31:20"
"Resolve! 11:31:21"
"Resolve! 11:31:21"
"Resolve! 11:31:21"
Is it possible to wait until all processing is finished before doing something else? Yes. The synchronous nature of reduce() doesn’t mean you can’t throw a party after every item has been completely processed. Look:
function methodThatReturnsAPromise(id) {
  return new Promise((resolve, reject) => {
    setTimeout(() => {
      console.log(`Processing ${id}`);
      resolve(id);
    }, 1000);
  });
}

let result = [1, 2, 3].reduce((accumulatorPromise, nextID) => {
  return accumulatorPromise.then(() => {
    return methodThatReturnsAPromise(nextID);
  });
}, Promise.resolve());

result.then(e => {
  console.log("Resolution is complete! Let's party.");
});
Since all we’re returning in our callback is a chained promise, that’s all we get when the loop is finished: a promise. After that, we can handle it however we want, even long after reduce() has run its course.
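Rejections travel down that same chain, so a single catch() at the end handles a failure anywhere in the sequence. A sketch, using a hypothetical helper that fails on one ID:

```javascript
// Hypothetical helper: succeeds for every ID except 2.
function methodThatReturnsAPromise(id) {
  return id === 2
    ? Promise.reject(new Error(`ID ${id} failed`))
    : Promise.resolve(id);
}

const result = [1, 2, 3].reduce((accumulatorPromise, nextID) => {
  return accumulatorPromise.then(() => methodThatReturnsAPromise(nextID));
}, Promise.resolve());

// One catch() at the end of the chain handles a rejection from any
// link. Note that once an ID fails, the IDs after it are skipped.
result.catch((error) => {
  console.log(`Something broke: ${error.message}`);
});
```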
Why won’t any other Array methods work?
Remember, under the hood of reduce(), we’re not waiting for our callback to complete before moving onto the next item. It’s completely synchronous. The same goes for all of these other methods:
Array.prototype.map()
Array.prototype.forEach()
Array.prototype.filter()
Array.prototype.some()
Array.prototype.every()
But reduce() is special.
We found that the reason reduce() works for us is because we’re able to return something right back to our same callback (namely, a promise), which we can then build upon by having it resolve into another promise. With all of these other methods, however, we just can’t pass an argument to our callback that was returned from our callback. Instead, each of those callback arguments is predetermined, making it impossible for us to leverage them for something like sequential promise resolution.
[1,2,3].map((item, [index, array]) => [value]);
[1,2,3].filter((item, [index, array]) => [boolean]);
[1,2,3].some((item, [index, array]) => [boolean]);
[1,2,3].every((item, [index, array]) => [boolean]);
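To see this in action, here’s a sketch (with a hypothetical delayed helper) showing that map() fires every promise immediately; Promise.all() keeps the results in array order, but the work itself runs in parallel:

```javascript
// Hypothetical helper: earlier IDs get longer delays. If execution were
// sequential, IDs would still finish in order 1, 2, 3; in parallel, the
// shortest delay (ID 3) finishes first.
const finished = [];

function methodThatReturnsAPromise(id) {
  return new Promise((resolve) => {
    setTimeout(() => {
      finished.push(id);
      resolve(id);
    }, (4 - id) * 100);
  });
}

// map() invokes the callback for every item right away, so all three
// timers start immediately.
const allDone = Promise.all([1, 2, 3].map((id) => methodThatReturnsAPromise(id)));

allDone.then((results) => {
  console.log(results);  // [ 1, 2, 3 ]  (Promise.all() preserves array order)
  console.log(finished); // [ 3, 2, 1 ]  (but execution was parallel)
});
```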
I hope this helps!
At the very least, I hope this helps shed some light on why reduce() is uniquely qualified to handle promises in this way, and maybe give you a better understanding of how common Array methods operate under the hood. Did I miss something? Get something wrong? Let me know!
This could be misleading. Sequentially means you will get Loop... then Resolve before proceeding to the next Promise. But this makes no difference with a regular Promise.all(...).

Hi, Rong — I get the confusion, but there’s actually a notable difference between what this post is about and what Promise.all() does. Promise.all() is designed to do something after a collection of promises have all resolved, regardless of the order in which they do so (they could all resolve in parallel and Promise.all() would be satisfied). This post is more about making those promises all resolve in order, one at a time, and never in parallel. As far as I’ve seen, there’s no way to do that with something like Promise.all(). Hope that makes sense and that I understood your comment correctly.

I’ve never actually found a situation where reduce() was cleaner than a simple loop. In your example,
let promise = Promise.resolve();

for (const nextID of userIDs) {
  promise = promise.then(() => methodThatReturnsAPromise(nextID));
}
is much easier to read and understand. Even worse is when people used reduce() to mutate an object and the reducer function just returns its first argument. It significantly harms readability for the benefit of maybe avoiding a temporary variable.
These are some great points. The for...of approach is definitely more legible; I’m curious if the draw to reduce() goes beyond saving a variable declaration. Speaking of, an exploration of why THAT approach works for sequential resolution would be super interesting. A for...of loop isn’t always synchronous?

Or if you can use async/await it’s just:
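That async/await version can be sketched like this (the methodThatReturnsAPromise helper below is a hypothetical stand-in):

```javascript
// Hypothetical helper: resolves an ID after a short delay.
function methodThatReturnsAPromise(nextID) {
  return new Promise((resolve) => setTimeout(() => resolve(nextID), 100));
}

// Awaiting inside a plain for...of loop pauses each iteration until the
// previous promise settles, so resolution is sequential by construction.
async function processSequentially(userIDs) {
  const results = [];
  for (const nextID of userIDs) {
    results.push(await methodThatReturnsAPromise(nextID));
  }
  return results;
}

processSequentially([1, 2, 3]).then((results) => console.log(results)); // [ 1, 2, 3 ]
```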
Perfect! To the point. Enjoyed reading.
Perform a map inside of a Promise.all. If the function performed inside of the map uses async/await, then it will complete before the next iteration (in order, not in parallel). The bonus is that the Promise.all resolves all of your promise values into an array. I’m posting this from my iPad, otherwise I’d give a code example.
Why not just use rxjs?
I’ve gotta say this is a less common way of handling promises which you can’t find much around the web, and it’s always nice to explore new ways, so thank you for writing this article Alex!

I can’t really understand why there are so many “hate” comments; it’s not like there’s a universal way of handling anything in programming –> right tool for the right job, depending on the circumstances.
If I need to handle multiple promises, I generally use Promise.all like some people had written; I’ve never had a situation where I’d need an approach like this (even though it’s nice to know). Today, with HTTP/2, browsers can easily handle multiple simultaneous requests, and in case you have that many, your users would definitely wait a lot for sequential requests to resolve one by one.

Also, if we have a situation where we need to do promises sequentially, like having a factory method which pings some server so it could build up a global config object, wouldn’t that code be a bit tightly-coupled?
Can you give me an example of a real-world usage for this?
I wouldn’t recommend this approach at all. 1. Why should I use a sync approach with async? Better, then, to not use promises. 2. It leads to confusion, and might introduce hidden problems. I had a situation where someone, hoping to tackle slowness in code, had coded exactly this with reduce, resolving promises in order (and the funny thing was that the promise callback was a normal function with a setTimeout of 1). The piece of code was not only hard to debug, but was hiding the real slowness that was somewhere else.

If there is a real-world case where you need to do this, then think through your code again; you might be doing something wrong.

My advice is to avoid this sort of cheat anti-pattern in general.
You do you, bro! This post isn’t prescriptive at all. It’s just interested in explaining, as common as it is, why the reduce() approach works to begin with. The more familiar I’ve become with loops like these, I’m honestly not sure if I’d reach for it myself. I suppose it strongly depends on the use case.

Well, as soon as you need to make a network request, you now have async code… so you have no choice there. If you need things done sequentially (e.g. guarantee that things are entered into a database in a sequential order), then you will need it to behave synchronously. I could give more examples, but the main thing here is that this is a valid solution for certain situations.
@Mike A network request can be done in sync mode eventually (sync AJAX), but that’s not the use case, because it will freeze the user UI.

The second statement you claim is a wrong use case, because you can never guarantee sequential actions on the front end (I can always cheat with custom JS). All validation should happen on the backend no matter what the front end guarantees.

If you give me more examples I can show you whether they are wrong or right. The database example is a bad use case.
Ok another example. You are in Node. You have a database as a service (let’s say Firebase). You interact with the database using the official Firebase Node library which returns promises. You need to perform operations in sequential order.
@Mike Again, the database example is not correct; it does not matter who is the client and who is the server. In your example: simply send the index of your element in the array and store that as the order, no need to run sequentially.

Even if you send them sequentially, what will be the order on Firebase? The id, which is a random string on Firebase?

What if a promise fails?

If you really need the order, then send the array index of your element and resolve the promises in parallel.

I can prove to you that every use case of this method can be done in much easier, better ways.
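The index-based idea mentioned here might look something like this sketch, where the save() helper is a hypothetical stand-in for a database write:

```javascript
// Hypothetical stand-in for a database write: stores the item along
// with the array index it was given.
function save(item, order) {
  return Promise.resolve({ item, order });
}

const items = ['a', 'b', 'c'];

// All writes run in parallel; the original ordering survives as data,
// so nothing needs to resolve sequentially.
const saved = Promise.all(items.map((item, index) => save(item, index)));

saved.then((records) => {
  console.log(records);
  // [ { item: 'a', order: 0 }, { item: 'b', order: 1 }, { item: 'c', order: 2 } ]
});
```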
Has anyone noticed that the last Promise is not resolved? So if we have 3 userIds, we will process only the first two of them.
Cheers!
I agree, Alex: this is just an interesting use case, but very, very rare as a real use case. Actually, I can’t think of one where this is the best alternative.
I like your post! So I want to ask you if I can translate your post into Chinese and post it on my blog. Of course I will add a link to your post and indicate that my post is translated from yours at the beginning of my blog.
Hey there! Sure, you’re free to use anything on the site: https://css-tricks.com/license/