Local mutations are all right
Published in Clean code and JavaScript
For example, mutating a reducer function's accumulator array can be better than religiously avoiding mutations and returning only new arrays from the reducer.
Example of mutating code
I recently published a blog post about how to find all possible combinations of array items in JS. Here's the code:
const getCombinations = (items) =>
  items.reduce((combos, item) => {
    const newCombos = combos.map((combo) => combo.concat(item))
    combos.push([item], ...newCombos)
    return combos
  }, [])
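For example (the sample input is my own), the function produces every non-empty combination:

getCombinations(['a', 'b', 'c'])
// → [
//     ['a'],
//     ['b'],
//     ['a', 'b'],
//     ['c'],
//     ['a', 'c'],
//     ['b', 'c'],
//     ['a', 'b', 'c'],
//   ]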
Notice how I'm mutating the accumulator array (combos) in the reducer function by pushing items to it?
Gasp!
Mutations can easily lead to confusing code. However, not all mutations are bad.
In this case, the mutations are local – mutations of combos happen only inside the reducer function. Keeping track (mentally) of the mutations is easy.
Avoiding that mutation
The mutations could be avoided, for example, like this:
const getCombinations = (items) =>
  items.reduce(
    (combos, item) =>
      combos.concat([[item], ...combos.map((combo) => combo.concat(item))]),
    []
  )
Look ma, no mutation! (combos.concat() returns a new array instead of mutating the existing array.)
Or like this:
const getCombinations = (items) =>
  items.reduce(
    (combos, item) => [
      ...combos,
      [item],
      ...combos.map((combo) => combo.concat(item)),
    ],
    []
  )
But both non-mutating versions are actually horrible:
Inefficient
A new accumulator array is created on each iteration. It doesn't matter with short input arrays, but it does matter with longer input arrays.
In my solution to the Advent of Code 2015/03 puzzle, I found that using the spread syntax made my code a hundred times slower (on my machine) compared to mutating the accumulator array – 10ms vs 1,000ms. (Advent of Code puzzles typically have large inputs.)
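To see the copying cost in isolation, here's a minimal benchmark sketch (not from my puzzle solution – the input size and the doubling operation are arbitrary assumptions):

// The spread version copies the whole accumulator on every iteration,
// turning a linear reduce into a quadratic one.
const input = Array.from({ length: 50_000 }, (_, i) => i)

console.time('mutating push')
input.reduce((acc, n) => {
  acc.push(n * 2)
  return acc
}, [])
console.timeEnd('mutating push')

console.time('spread copy')
input.reduce((acc, n) => [...acc, n * 2], [])
console.timeEnd('spread copy')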
Less readable
In the first version (using combos.concat()), it's difficult to see at a glance what the non-mutating one-liner is doing.
Prettier also insists on formatting the reducer function's body on a single line. The following would be clearer (but still not as clear as the mutating version):
const getCombinations = (items) =>
  items.reduce(
    (combos, item) =>
      // prettier-ignore
      combos.concat([
        [item],
        ...combos.map((combo) => combo.concat(item)),
      ]),
    []
  )
The second version (no combos.concat()) is better but still not great (debatable, I guess).
Unnecessary
The accumulator value is pretty much meant to be mutated – you create an array or an object into which you accumulate values on every iteration.
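A typical instance of this pattern (my own illustration, counting word occurrences):

// Count word occurrences by mutating the object created right here
// as the initial value.
const countWords = (words) =>
  words.reduce((counts, word) => {
    counts[word] = (counts[word] ?? 0) + 1
    return counts
  }, {})

countWords(['a', 'b', 'a']) // → { a: 2, b: 1 }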
Saying that the accumulator is meant to be mutated is a bold statement, so I checked what MDN says about array reducers in JS (emphasis added by me):
reduce() is a central concept in functional programming, where it's not possible to mutate any value, so in order to accumulate all values in an array, one must return a new accumulator value on every iteration. This convention propagates to JavaScript's reduce(): you should use spreading or other copying methods where possible to create new arrays and objects as the accumulator, rather than mutating the existing one. [...]
Eh... I respect the MDN docs, but I don't buy that. If you are aiming for maximum purity etc., then maybe yeah. But I would prefer performance and clarity to a rigid functional programming mindset.
I wanted more opinions (:D), so I googled "js reduce mutate accumulator" and found Is mutating accumulator in reduce function considered bad practice? on Stack Overflow. The currently highest-scored answer by "customcommander" has a nice summary:
It isn't [a bad practice] if you own the accumulator.
And that's pretty much what I mean by local mutations:
- If you own the accumulator, the mutations are local (restricted to the reducer function's scope) and thus safe.
- If you don't own the accumulator – if it's coming from elsewhere – the mutations won't be local anymore, so you need to be very careful with the mutations (see the sketch below).
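Here's what the second case looks like (all the values are hypothetical):

// existingArray is owned by some other part of the program.
const existingArray = ['x']
const items = ['y', 'z']

const combined = items.reduce((acc, item) => {
  acc.push(item)
  return acc
}, existingArray)

console.log(existingArray) // → ['x', 'y', 'z'] – the outside array was mutated too!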
If the accumulator is coming from elsewhere, it's easy to shallow-copy it to avoid non-local mutations:
array.reduce(
  (acc, val) => {
    // ...
    return acc
  },
  { ...objectComingFromElsewhere }
)

array.reduce(
  (acc, val) => {
    // ...
    return acc
  },
  [...arrayComingFromElsewhere]
)
But be wary of deep mutations, or do a deep clone instead.
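A sketch of the pitfall (the nested/count shape is made up; structuredClone is available in modern browsers and in Node.js 17+):

// Hypothetical nested object coming from elsewhere.
const objectComingFromElsewhere = { nested: { count: 0 } }

// A shallow copy shares nested values, so this still mutates the original:
const shallow = { ...objectComingFromElsewhere }
shallow.nested.count += 1
console.log(objectComingFromElsewhere.nested.count) // → 1 – oops!

// A deep clone keeps the original untouched:
const deep = structuredClone(objectComingFromElsewhere)
deep.nested.count += 1
console.log(objectComingFromElsewhere.nested.count) // → still 1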
Conclusion
I'll conclude by quoting the last two paragraphs of a relevant article by the author of the aforementioned Stack Overflow answer (two commas added by me for clarity):
Uncontrolled access to shared state can lead to complex bugs, which is why immutability is so dear to functional programmers. However, immutability isn't only achieved through the cloning of data, in fact this is most likely to be a naive approach to it. Immutability is a quality of a data structure, not a process per se.
In this particular case the initial value of the reducing function is safe to mutate. The spread operator offers no additional protection against side effects.
In other words: local mutations are all right.