← Home

Map in terms of Reduce

12 October, 2025

I recently encountered a blog post describing three types of programmers. This is a post of type 1.


Rich Hickey's "Reducers" talk

You can find a 2012 talk by Rich Hickey called "Reducers" about the thinking and design behind Clojure's reducers module. Hickey motivates Clojure's implementation of the canonical "sequence" operations map, filter, and reduce by construing all of those operations in terms of reduce. Along the way he says,

If we want to reach parallelism, we have to break away from sequentiality, and if we want to break away from sequentiality, we can't define everything in terms of "let's meet at 'sequence'". So the first thing we need to do is to make map and those other functions collection-ignorant. And so one idea is to build them on reduce.

The talk goes on for another 30 minutes, but let's stop right here. This idea of map and filter in terms of reduce. Interesting, right? When I heard it the first time, I wanted to pause and write a simple version of this myself just to get a feel for it. It doesn't have to be production-ready or as flexible and fast as what Clojure eventually implements. We are just trying it for the sake of trying it.

So that's what this post will do. We will write some cheap implementations of map and filter in terms of reduce. There will be a little bit of background info about how these higher-order functions are "conceived" but there are plenty of ways to supplement your understanding of these routines elsewhere on the web.

Reduce

reduce is a function with signature reduce: fn sequence initial -> value. It builds a value by applying a binary operation fn successively over the sequence, each time accumulating the result into a single value that is returned. Conventionally the caller also provides an initial value to seed the accumulator (which also covers the empty-sequence case).
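To make the order of application concrete, here is how a reduction over a three-element list unfolds, using Python's functools.reduce (which we will lean on for the rest of the post):

```python
from functools import reduce
from operator import add

# reduce folds left-to-right: each step feeds the running
# accumulator and the next element into the binary function
result = reduce(add, [1, 2, 3], 0)

# the call above is equivalent to this fully unrolled expression
unrolled = add(add(add(0, 1), 2), 3)

print(result, unrolled)

##[out] 6 6
```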

Simple Python example: summing a list of numbers is the same as reducing an add function over the list. We need the binary function that underlies sum, and it's easy to see that function is add.

# or you could use operator.add
def add(a, b):
    return a + b

And then sum (I will define the function sum2 to distinguish it from the built-in sum) can be implemented as...

from functools import reduce
from typing import Iterable


def sum2(sequence: Iterable):
    return reduce(add, sequence, 0)

If we call sum2 passing some list of numbers, we get their sum.

seq = [1, 2, 3]

print(sum2(seq))

##[out] 6

We can validate that it's the same result as the built-in sum function.

print(sum(seq))

##[out] 6

And we can, for the sake of education, hack together our own version of reduce just to make the internals explicit.

# implementing with iteration over sequence
def reduce2(fn, sequence: Iterable, initial):
    for val in sequence:
        initial = fn(initial, val)
    return initial


print(reduce2(add, seq, 0))

##[out] 6

Map

map is a higher-order function that applies a function to each element in a "collection" of some kind, returning a new collection of the same type.1 For example, the expression (map double [1 2 3]) would evaluate to [2 4 6].

The classical implementation of map is a recursive function. We will use Python to demonstrate the idea, but you wouldn't actually do the work of map this way in Python.2

def map_rec(fn, sequence: list):
    if len(sequence) == 0:
        return []
    else:
        return [fn(sequence[0]), *map_rec(fn, sequence[1:])]

The base case of the recursion is that mapping a function over an empty list returns an empty list. If the list is not empty, apply the function to the first element and prepend the result to the list produced by recursively mapping over the rest of the list. Here is an example using that double idea.

def double(x):
    return 2 * x

print(map_rec(double, [1, 2, 3]))

##[out] [2, 4, 6]

Map as Reduce

How is it possible to implement map as reduce? The key insight is that reductions do not have to return scalar values. A list is a value, and it is a perfectly valid value to accumulate during a reduction. Likewise, "append fn(x) to the accumulator list" is a perfectly valid binary operation to pass into reduce.

So here is map as a reduction.

def map_as_reduce(fn, sequence: list):
    return reduce(_binary_op_map(fn), sequence, list())


# we only need a way to construe the "core of map" as a binary operation
# on a list and an argument
def _binary_op_map(fn):
    def binary_op(accum: list, x):
        accum.append(fn(x))
        return accum
    return binary_op


print(map_as_reduce(double, [1, 2, 3]))

##[out] [2, 4, 6]

The "empty list" case, which was previously the base case of the recursive implementation, is handled by initializing the reduction with an empty list(). If the sequence is empty, that empty list is returned without any modifications.
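As an aside, the binary operation doesn't have to mutate the accumulator. Here is the same reduction written with an inline lambda that builds a fresh list at each step (a non-mutating variant of map_as_reduce), confirming the empty-sequence behavior:

```python
from functools import reduce


def double(x):
    return 2 * x


# same idea as map_as_reduce, but the binary operation returns a
# new list instead of appending to the accumulator in place
def map_as_reduce2(fn, sequence: list):
    return reduce(lambda accum, x: accum + [fn(x)], sequence, [])


print(map_as_reduce2(double, []))

##[out] []
```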

Filter

We can do this with filter as well. filter is a sequence operation that takes a function and a sequence, returning a new sequence containing only the elements where function(element) is true. Simple example: filtering a list for even numbers only.

def is_even(x):
    return (x % 2) == 0

print(list(filter(is_even, [0, 1, 2, 3, 4, 5])))

##[out] [0, 2, 4]

Filter as Reduce

We do a similar conceptual trick as we did with map. What binary operation do we pass to reduce to achieve filter? A function that appends an element x to the accumulator list only if fn(x) is true.

def filter_as_reduce(fn, sequence: list):
    return reduce(_binary_op_filter(fn), sequence, list())


def _binary_op_filter(fn):
    def binary_op(accum: list, x):
        if fn(x):
            accum.append(x)
        return accum
    return binary_op


print(filter_as_reduce(is_even, [0, 1, 2, 3, 4, 5]))

##[out] [0, 2, 4]

Conclusion

The Hickey talk and Clojure's eventual implementation of these ideas are much smarter and solve more problems in sequence programming than what we have discussed here. But that's ok. All we really wanted to do was a little mental exercise to explore the proposition of map-as-reduce.

1

In Python, map returns a lazy iterator (a map object), and the caller can materialize it into whatever result type they want. Let's ignore this detail for now.

2

This would be a bad memory allocation pattern. We are creating N lists only to destructure N-1 of them. In a language with persistent linked lists the tail would be shared rather than copied; Python's own answer is lazy iterators.
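For a sketch of what that lazy approach looks like, here is a generator-based map (map_gen is an illustrative name, not a standard-library function); it yields each mapped value on demand instead of allocating intermediate lists:

```python
def map_gen(fn, sequence):
    # yields fn(x) one element at a time; nothing is computed
    # until the caller iterates
    for x in sequence:
        yield fn(x)


def double(x):
    return 2 * x


print(list(map_gen(double, [1, 2, 3])))

##[out] [2, 4, 6]
```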