Any limitations are your own
Others point out that you're making a mistake with the types. Each of your functions expects [k,v] input, but neither of them outputs that form, so neither compose(f,g) nor compose(g,f) will work in this case.
Anyway, transducers are generic and need not know anything about the types of data they handle
const flip = ([ key, value ]) =>
  [ value, key ]

const double = ([ key, value ]) =>
  [ key, value * 2 ]

const pairToObject = ([ key, value ]) =>
  ({ [key]: value })

const entriesToObject = (iterable) =>
  Transducer ()
    .log ('begin:')
    .map (double)
    .log ('double:')
    .map (flip)
    .log ('flip:')
    .map (pairToObject)
    .log ('obj:')
    .reduce (Object.assign, {}, Object.entries (iterable))
console.log (entriesToObject ({one: 1, two: 2}))
// begin: [ 'one', 1 ]
// double: [ 'one', 2 ]
// flip: [ 2, 'one' ]
// obj: { 2: 'one' }
// begin: [ 'two', 2 ]
// double: [ 'two', 4 ]
// flip: [ 4, 'two' ]
// obj: { 4: 'two' }
// => { 2: 'one', 4: 'two' }
Of course we have the standard boring possibility too: take an array of numbers and return a boring array of numbers
const main = nums =>
  Transducer ()
    .log ('begin:')
    .filter (x => x > 2)
    .log ('greater than 2:')
    .map (x => x * x)
    .log ('square:')
    .filter (x => x < 30)
    .log ('less than 30:')
    .reduce ((acc, x) => [...acc, x], [], nums)
console.log (main ([ 1, 2, 3, 4, 5, 6, 7 ]))
// begin: 1
// begin: 2
// begin: 3
// greater than 2: 3
// square: 9
// less than 30: 9
// begin: 4
// greater than 2: 4
// square: 16
// less than 30: 16
// begin: 5
// greater than 2: 5
// square: 25
// less than 30: 25
// begin: 6
// greater than 2: 6
// square: 36
// begin: 7
// greater than 2: 7
// square: 49
// [ 9, 16, 25 ]
More interestingly, we can take an array of objects as input and return a Set
const main2 = (people = []) =>
  Transducer ()
    .log ('begin:')
    .filter (p => p.age > 13)
    .log ('age over 13:')
    .map (p => p.name)
    .log ('name:')
    .filter (name => name.length > 3)
    .log ('name is long enough:')
    .reduce ((acc, x) => acc.add (x), new Set, people)
const data =
  [ { name: "alice", age: 55 }
  , { name: "bob", age: 16 }
  , { name: "alice", age: 12 }
  , { name: "margaret", age: 66 }
  , { name: "alice", age: 91 }
  ]
console.log (main2 (data))
// begin: { name: 'alice', age: 55 }
// age over 13: { name: 'alice', age: 55 }
// name: alice
// name is long enough: alice
// begin: { name: 'bob', age: 16 }
// age over 13: { name: 'bob', age: 16 }
// name: bob
// begin: { name: 'alice', age: 12 }
// begin: { name: 'margaret', age: 66 }
// age over 13: { name: 'margaret', age: 66 }
// name: margaret
// name is long enough: margaret
// begin: { name: 'alice', age: 91 }
// age over 13: { name: 'alice', age: 91 }
// name: alice
// name is long enough: alice
// => Set { 'alice', 'margaret' }
See? We can perform any kind of transformation you want. You just need a Transducer that fits the bill
const identity = x =>
  x

const Transducer = (t = identity) => ({
  // transform each value with f before handing it to the next step
  map: (f = identity) =>
    Transducer (k =>
      t ((acc, x) => k (acc, f (x))))
  // forward only the values that satisfy the predicate f
, filter: (f = identity) =>
    Transducer (k =>
      t ((acc, x) => f (x) ? k (acc, x) : acc))
  // run a side effect for each value without changing it
, tap: (f = () => undefined) =>
    Transducer (k =>
      t ((acc, x) => (f (x), k (acc, x))))
  // a tap that logs each value with a label
, log: (s = "") =>
    Transducer (t) .tap (x => console.log (s, x))
  // wrap the reducer f with the composed transform t, then fold xs
, reduce: (f = (a,b) => a, acc = null, xs = []) =>
    xs.reduce (t (f), acc)
})
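Note that tap is only used indirectly above, via log. A minimal sketch of using it directly (the saw: label is just for illustration):
console.log (
  Transducer ()
    .tap (x => console.log ('saw:', x))
    .map (x => x + 1)
    .reduce ((acc, x) => [ ...acc, x ], [], [ 1, 2 ]))
// saw: 1
// saw: 2
// => [ 2, 3 ]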
Full program demonstration - .log is added just so you can see things happening in the correct order
const identity = x =>
  x

const flip = ([ key, value ]) =>
  [ value, key ]

const double = ([ key, value ]) =>
  [ key, value * 2 ]

const pairToObject = ([ key, value ]) =>
  ({ [key]: value })

const Transducer = (t = identity) => ({
  map: (f = identity) =>
    Transducer (k =>
      t ((acc, x) => k (acc, f (x))))
, filter: (f = identity) =>
    Transducer (k =>
      t ((acc, x) => f (x) ? k (acc, x) : acc))
, tap: (f = () => undefined) =>
    Transducer (k =>
      t ((acc, x) => (f (x), k (acc, x))))
, log: (s = "") =>
    Transducer (t) .tap (x => console.log (s, x))
, reduce: (f = (a,b) => a, acc = null, xs = []) =>
    xs.reduce (t (f), acc)
})
const entriesToObject = (iterable) =>
  Transducer ()
    .log ('begin:')
    .map (double)
    .log ('double:')
    .map (flip)
    .log ('flip:')
    .map (pairToObject)
    .log ('obj:')
    .reduce (Object.assign, {}, Object.entries (iterable))
console.log (entriesToObject ({one: 1, two: 2}))
// begin: [ 'one', 1 ]
// double: [ 'one', 2 ]
// flip: [ 2, 'one' ]
// obj: { 2: 'one' }
// begin: [ 'two', 2 ]
// double: [ 'two', 4 ]
// flip: [ 4, 'two' ]
// obj: { 4: 'two' }
// => { 2: 'one', 4: 'two' }
functional programming vs functional programs
JavaScript doesn't include functional utilities like map, filter, or reduce for other iterables like Generator, Map, or Set. When writing a function that enables functional programming, we can do so in a variety of ways. Consider the varying implementations of reduce:
// possible implementation 1
const reduce = (f = (a,b) => a, acc = null, xs = []) =>
  xs.reduce (f, acc)

// possible implementation 2
// Empty is a sentinel used to detect exhausted input (assumed helper)
const Empty = Symbol ()
const isEmpty = x => x === Empty
const reduce = (f = (a,b) => a, acc = null, [ x = Empty, ...xs ]) =>
  isEmpty (x)
    ? acc
    : reduce (f, f (acc, x), xs)

// possible implementation 3
const reduce = (f = (a,b) => a, acc = null, xs = []) =>
{
  for (const x of xs)
    acc = f (acc, x)
  return acc
}
Each implementation of reduce above enables functional programming; however, only one implementation is itself a functional program.

#1 is just a wrapper around the native Array.prototype.reduce. It has the same disadvantage as Array.prototype.reduce: it only works for arrays. Here we are happy that we can now write reduce expressions using a normal function, and creating the wrapper was easy. But if we call reduce (add, 0, new Set ([ 1, 2, 3 ])), it fails because Sets do not have a reduce method, and this makes us sad.
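To make that failure concrete, here's a minimal check of implementation #1, assuming a small add helper:
const add = (a, b) => a + b
console.log (reduce (add, 0, [ 1, 2, 3 ]))
// => 6
console.log (reduce (add, 0, new Set ([ 1, 2, 3 ])))
// TypeError: xs.reduce is not a function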
#2 works on any iterable, but the recursive definition means it will overflow the stack if xs is significantly large - at least until JavaScript interpreters add support for tail call elimination. Here we are happy about our representation of reduce, but wherever we use it in our program we are sad about its Achilles heel.
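A sketch of that heel using implementation #2 and the same add helper; the exact threshold varies by engine:
console.log (reduce (add, 0, [ 1, 2, 3 ]))
// => 6
const big = Array.from (Array (100000), (_, i) => i)
console.log (reduce (add, 0, big))
// RangeError: Maximum call stack size exceeded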
#3 works on any iterable just like #2, but we must trade the elegant recursive expression for an imperative-style for loop, which ensures stack safety. The ugly details make us sad about reduce, but they make us happy wherever we use it in our program.
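And a quick sketch of implementation #3 accepting non-array iterables, same add helper assumed:
console.log (reduce (add, 0, new Set ([ 1, 2, 3 ])))
// => 6
const gen = function* () { yield 1; yield 2; yield 3 }
console.log (reduce (add, 0, gen ()))
// => 6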
Why is this important? Well, in the Transducer I shared, the reduce method I included is:
const Transducer = (t = identity) =>
  ({ ...
   , reduce: (f = (a,b) => a, acc = null, xs = []) =>
       xs.reduce (t (f), acc)
   })
This particular implementation is closest to our reduce #1 above; it's a quick and dirty wrapper around Array.prototype.reduce. Sure, our Transducer can perform transformations on arrays containing values of any type, but it means our Transducer can only accept arrays as input. We traded flexibility for an easier implementation.
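To see that limitation concretely, a small check using the Transducer above; the Set call fails for the same reason as reduce #1:
console.log (
  Transducer ()
    .map (x => x * 2)
    .reduce ((acc, x) => acc + x, 0, [ 1, 2, 3 ]))
// => 12
console.log (
  Transducer ()
    .map (x => x * 2)
    .reduce ((acc, x) => acc + x, 0, new Set ([ 1, 2, 3 ])))
// TypeError: xs.reduce is not a function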
We could write it closer to style #2, but then we inherit the stack vulnerability wherever we use our transducer module on big data sets - which is where transducers are meant to excel in the first place. An implementation closer to #3 is itself not a functional program, but it enables functional programming. The result is a module that necessarily utilizes some of JavaScript's imperative style in order to let the user write functional-style programs in an unburdened fashion:
const Transducer = (t = identity) =>
  ({ ...
   , reduce: (f = (a,b) => a, acc = null, xs = []) =>
     {
       const reducer = t (f)
       for (const x of xs)
         acc = reducer (acc, x)
       return acc
     }
   })
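For example, assuming the full Transducer module above with its reduce swapped for this for..of version, any iterable now works as input; a generator sketches the point:
const values = function* () { yield 1; yield 2; yield 3 }
console.log (
  Transducer ()
    .map (x => x * 2)
    .reduce ((acc, x) => acc + x, 0, values ()))
// => 12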
The idea here is that you get to write your own Transducer module and invent any other data types and utilities to support it. Familiarizing yourself with the trade-offs enables you to choose whatever is best for your program.
There are many ways around the "problem" presented in this section. So how can one really write functional programs in JavaScript if we're constantly having to revert to imperative style in various parts of our program? There's no silver bullet answer, but I have spent considerable time exploring various solutions. If you're this deep in the post and interested, I share some of that work here
Possibility #4
Yep, you can leverage Array.from, which converts any iterable to an array and lets us plug directly into Array.prototype.reduce. Now we have transducers that accept any iterable input, a functional style, and an easy implementation. A drawback of this approach is that it creates an intermediate array of values (wasted memory) instead of handling the values one at a time as they come out of the iterable. Note that even solution #2 shares this non-trivial drawback:
const Transducer = (t = identity) =>
  ({ ...
   , reduce: (f = (a,b) => a, acc = null, xs = []) =>
       Array.from (xs)
         .reduce (t (f), acc)
   })
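As a final sanity check, assuming the module above with this Array.from-based reduce, a Set works as input, at the cost of the intermediate array:
console.log (
  Transducer ()
    .filter (x => x > 1)
    .reduce ((acc, x) => [ ...acc, x ], [], new Set ([ 1, 2, 3 ])))
// => [ 2, 3 ]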