JavaScript reduce function performance enhancement
The time complexity of the algorithms you implement deserves real consideration when a site handles large amounts of data. With that in mind, let's test the performance of the mother and father of all of JavaScript's higher-order functions: Array.prototype.reduce.
Problem description: Our API provides us with an array of objects that looks something like:
const objects = [
{ id: 1, name: "name 1" },
{ id: 2, name: "name 2" },
{ id: 3, name: "name 3" },
]
and we want to create a lookup table out of it, like this:
{
1: "name 1",
2: "name 2",
3: "name 3",
}
One way to build it is reduce with the object spread operator:
const lookup = objects.reduce((lookup, object) => ({
...lookup,
[object.id]: object.name
}), {})
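A quick sanity check, using the three-element objects array from above:
console.log(lookup[2]) // "name 2"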
It works, but the time complexity of this algorithm is (perhaps unexpectedly) quadratic, O(n²), because the object spread creates a brand-new object and copies every property accumulated so far into it on each and every iteration.
This is roughly what JavaScript does under the hood (note the fresh object that gets created and re-filled on every pass):
objects.reduce((lookup, object) => Object.assign(
    Object.assign({}, lookup),
    { [object.id]: object.name }
  ),
  {}
);
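To make the quadratic blow-up concrete, we can instrument the spread version and count property copies. This is an illustrative sketch, not what the engine literally executes; the copies counter is hypothetical instrumentation:
let copies = 0
const lookup = objects.reduce((lookup, object) => {
  copies += Object.keys(lookup).length // the spread re-copies every existing key
  return { ...lookup, [object.id]: object.name }
}, {})
// For n records this sums 0 + 1 + ... + (n - 1) = n * (n - 1) / 2 copies,
// i.e. O(n²) total work.
console.log(copies)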
Time taken:
100 records: ~0.08ms
1,000 records: ~1.45ms
10,000 records: ~18ms
100,000 records: ~2,600ms
1,000,000 records: breaks (the run doesn't finish)
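Timings in this ballpark are straightforward to reproduce. Here is a minimal benchmark sketch, assuming an environment where performance.now() is available (modern browsers and recent Node.js); makeObjects and the record counts are illustrative choices:
const makeObjects = (n) =>
  Array.from({ length: n }, (_, i) => ({ id: i, name: `name ${i}` }))

for (const n of [100, 1_000, 10_000, 100_000]) {
  const objects = makeObjects(n)
  const start = performance.now()
  objects.reduce((lookup, object) => ({
    ...lookup,
    [object.id]: object.name
  }), {})
  console.log(`${n} records: ${(performance.now() - start).toFixed(2)}ms`)
}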
Performance boost: mutate the accumulator in place instead of copying it on every iteration:
const lookup = objects.reduce((lookup, object) => {
lookup[object.id] = object.name
return lookup
}, {})
Time taken:
100 records: ~0.045ms
1,000 records: ~0.15ms
10,000 records: ~1.5ms
100,000 records: ~3.2ms
1,000,000 records: ~20ms
A great performance improvement: by avoiding the per-iteration copying and new object creation, each step does constant work and the whole reduction runs in linear time, O(n).
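If you would rather not mutate the accumulator inside the callback, a standard-library alternative with the same linear behavior is Object.fromEntries over a mapped array of [key, value] pairs (a general technique, not specific to reduce):
const lookup = Object.fromEntries(
  objects.map((object) => [object.id, object.name])
)
It allocates one intermediate array of pairs, but each record is still touched only a constant number of times.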