alt.hn

4/19/2026 at 10:56:20 AM

Clojure: Transducers

https://clojure.org/reference/transducers

by tosh

4/21/2026 at 5:14:34 PM

Transducers work even better with a Clojure library called Injest. It has macros similar to the standard Clojure threading macros except Injest’s macros will recognize when you’re using transducers and automatically compose them correctly. You can even mix and match transducers and non-transducer functions and Injest will do its best to optimize the sequence of operations. And wait, there’s more! Injest has a parallelizing macro that will use transducers with the Clojure reducers library for simple and easy use of all your cores. Get it here: https://github.com/johnmn3/injest

Note: I’m not the author of Injest, just a satisfied programmer.

by drob518

4/21/2026 at 4:13:14 PM

I made srfi-171 [0], transducers for Scheme. If you have any questions about them in general I can probably answer them. My version is pretty similar to the Clojure version, judging by the talks Rich Hickey gave on them.

I know a lot of people find them confusing.

0: https://srfi.schemers.org/srfi-171/srfi-171.html

by bjoli

4/21/2026 at 9:21:38 PM

thanks. this is going in my scheme.

by matrix12

4/21/2026 at 6:39:33 PM

Transducers are IMHO one of the most under-appreciated features of Clojure. Once you get to know them, building transducer pipelines becomes second nature. Then you realize that a lot of data processing can be expressed as a pipeline of transformations, and you end up with reusable components that can be applied in any context.

The fact that transducers are fast (you don't incur the cost of handling intermediate data structures, nor the GC costs afterwards) is icing on the cake at this point.

Much of the code I write begins with (into ...).

And in Clojure, as with anything that has been added to the language, transducers are first-class citizens, so you can reasonably expect library functions to have all the additional arities.

[but don't try to write stateful transducers until you feel really comfortable with the concepts, they are really tricky and hard to get right]

by jwr

4/21/2026 at 4:35:38 PM

May I offer a little code riff slicing FizzBuzz using transducers, as one would do in practice, in real code (as in not a screening interview round).

Demo One: Computation and Output format pulled apart

  (def natural-nums (rest (range)))

  (def fizz-buzz-xform ;; basic-buzz (not shown) maps n to n, "Fizz", "Buzz", or "FizzBuzz"
    (comp (map basic-buzz)
          (take 100))) ;; early termination

  (transduce fizz-buzz-xform ;; calculate each step
             conj ;; and use this output method
             []   ;; to pour output into this data structure
             natural-nums)

  (transduce fizz-buzz-xform ;; calculate each step
             str ;; and use this output method
             ""  ;; to catenate output into this string
             natural-nums) ;; given this input

  (defn suffix-comma  [s]  (str s ","))

  (transduce (comp fizz-buzz-xform
                   (map suffix-comma)) ;; calculate each step
             str ;; and use this output method
             ""  ;; to catenate output into this string
             natural-nums) ;; given this input
Demos two and three for your further entertainment are here: https://www.evalapply.org/posts/n-ways-to-fizzbuzz-in-clojur...
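For readers wondering how `(take 100)` can stop an infinite input under `transduce`: the `take` transducer signals early termination by wrapping the accumulator in a sentinel (Clojure's `reduced`). A minimal Python sketch of the mechanism, with illustrative names (`Reduced`, `taking`, `transduce` here are not any library's API):

```python
import itertools

class Reduced:
    """Sentinel meaning: stop the reduction, the result is complete."""
    def __init__(self, value):
        self.value = value

def taking(n):
    """Transducer: let at most n items through, then halt (like Clojure's take)."""
    def xform(rf):
        remaining = n  # private state, one counter per pipeline instance
        def step(acc, x):
            nonlocal remaining
            remaining -= 1
            acc = rf(acc, x)
            return Reduced(acc) if remaining <= 0 else acc
        return step
    return xform

def transduce(xform, rf, init, coll):
    step = xform(rf)
    acc = init
    for x in coll:  # coll may be infinite; Reduced breaks us out
        acc = step(acc, x)
        if isinstance(acc, Reduced):
            return acc.value
    return acc

naturals = itertools.count(1)  # infinite, like (rest (range))
result = transduce(taking(5), lambda acc, x: acc + [x], [], naturals)
assert result == [1, 2, 3, 4, 5]
```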

(edit: fix formatting, and kill dangling paren)

by adityaathalye

4/21/2026 at 5:18:44 PM

Nowadays you can make use of some transducer ideas via Gatherers in Java (Stream::gather), though it isn't as straightforward as in plain Clojure.

by pjmlp

4/21/2026 at 5:34:07 PM

When I first read about transducers I was wowed. For example, if I want to walk all the files on my computer and find the duplicate photos in the whole file system, transducers provide a conveyor-belt approach. Whether there are savings in terms of memory or anything else, maybe. But the big win for me was to think about the problem as pipes instead of loops. And then if you can add conditionals and branches it is even easier to think about. At least I find it so.

I tried to implement transducers in JavaScript using yield and generators, and that worked. That was before async/await, but now you can just `await readdir("/")`. I'm unclear as to whether transducers offer significant advantages over async/await.

[[Note: I have a personal grudge against Java and since Clojure requires Java I just find myself unable to go down that road]]

by talkingtab

4/21/2026 at 6:51:37 PM

I think, like with the rest of Clojure, none of this is "revolutionary" in itself. Clojure doesn't try to be revolutionary, it's a bunch of existing ideas implemented together in a cohesive whole that can be used to build real complex systems (Rich Hickey said so himself).

Transducers are not new or revolutionary. The ideas have been around for a long time, I still remember using SERIES in Common Lisp to get more performance without creating intermediate data structures. You can probably decompose transducers into several ideas put together, and each one of those can be reproduced in another way in another language. What makes them nice in Clojure is, like the rest of Clojure, the fact that they form a cohesive whole with the rest of the language and the standard library.

by jwr

4/21/2026 at 7:12:35 PM

https://series.sourceforge.net/ Is this the SERIES package you're referring to? Sorry, mostly a CL newb here; it's the first time I've read about it.

by jnpnj

4/21/2026 at 6:12:14 PM

You could always try ClojureScript

by justinhj

4/21/2026 at 4:16:25 PM

From (2016) at least.

https://web.archive.org/web/20161219045343/https://clojure.o...

by thih9

4/21/2026 at 5:10:26 PM

It's a blessing and a curse that zero innovation has occurred in the Clojure space since 2016. Pretty sure the only big things have been clojure.spec becoming more mainstream and the introduction of deps.edn to supplant lein, although I am still partial to lein.

by whalesalad

4/21/2026 at 8:39:55 PM

I know others already pointed out a ton of things, but having worked with Clojure in 2016 and doing active Clojure development for my startup now I feel like I have to chime in too.

In 2016, Clojure was not great for serious data science. That has changed substantially and not just via Java Interop.

- It now has cross-ecosystem GPU support via libraries like Neanderthal, which in benchmarks outperform some serious Java libraries in this space.

- It has columnar, indexed, JIT-optimized data science libraries from cnuernber and techascent, part of the Clojure ecosystem. In benchmarks they've outperformed libraries like NumPy.

- The ecosystem around data science is also better. The projects aren't siloed like they used to be. The ecosystem is making things interoperate.

- You can now use Python from Clojure via the libpython-clj bindings. In general, the C FFI is a lot better, not just for Python.

- The linters are way better than they used to be. The REPL support too.

Clojure already had one of the best efficiency scores in terms of code written relative to what is accomplished, but now you also get REPL integration, and LLMs have been increasingly capable of leveraging that. There are things like yogthos's mycelium experiments to take advantage of that with RLLM calls. So it's innovating in interesting new ways too, like cutting bugs in LLM-generated code.

It just doesn't feel true to me that innovation isn't occurring. Clojure really has this `import antigravity` feel to it; things other languages would have to do a new release for are just libraries that you can grab and try out (or maybe that's the Python).

by JoshCole

4/21/2026 at 9:40:22 PM

Can you talk more about why you chose CLJ for data science / ML?

Are there any benefits of using it over Python?

And how is the interop with Python libs?

by uxcolumbo

4/21/2026 at 11:41:05 PM

> Can you talk more about why you chose CLJ for data science / ML?

I use Python for a lot of machine learning. My vision transformers, for example, are in Python. There is a lot to like about the Python ecosystem. Throwing away libraries like Albumentations and PyTorch because you move to a different ecosystem is a real loss. You probably ought to be using Python if you're doing machine learning of the sort that one immediately thinks of when they see "ML".

That said, data science and machine learning are words that cover a lot of ground.

Python often works because it serves as glue code for more optimized libraries. Sometimes, though, it is annoying to use as glue code. For example, when you're working on computational game theory problems, the underlying data model tends to be a tree structure and the exploration algorithm explores that tree. There is a lot of branching. Vanilla Python in such a case is horrifically slow.

I was looking at progress bars in tqdm reporting 10,000 years until the computation was done. I had already reached for Numba and done some optimizations. Computational game theory is quite brutal. You're very often reminded that there are fewer atoms in the universe than objects of interest for correctly calculating what you want to calculate.

Most people use C, C++, and CUDA kernels for the sort of program I was writing. Some people have tried to do things in Python.

> Are there any benefits of using it over Python?

There is an open source implementation of a thing I built. It solves the same problem I solved, but in Python, worse than I solved it, and with a lot of missing features. It has a comment in it discussing that the universe will end before the code would finish, were it to be used at the non-trivial size. The code I wrote worked at the non-trivial size. Clojure, for me, finished. The universe hasn't ended yet, so I can't yet tell you how much faster my code was than the Python code I'm talking about.

> And how is the interop with Python libs?

Worked for me without issue, but I eventually got annoyed that I had to wait for two rounds of dependency resolution in some builds. Conda builds can sometimes have issues with dependency resolution taking an unreasonable amount of time. I was hitting that despite using very few libraries.

by JoshCole

4/22/2026 at 7:45:20 AM

Note that enough people have tried to do things in Python that writing CUDA kernels in Python is now also a supported path; still a WIP, but NVIDIA is quite serious about it.

Basically their GPU JIT builds on top of MLIR, so in the end it is no different from anything else built on top of LLVM.

by pjmlp

4/22/2026 at 8:45:13 AM

I like Clojure and want to get more into it, but I wondered what folks are doing when it comes to building AI-powered apps. So thanks for sharing your experience.

And nice site btw :)

by uxcolumbo

4/21/2026 at 5:58:41 PM

Clojure 1.9: Spec.

Clojure 1.10: datafy/nav + tap> which has spawned a whole new set of tooling for exploring data.

Clojure 1.11: portable math (clojure.math, which also works on ClojureScript).

Clojure 1.12: huge improvements in Java interop.

And, yes, the new CLI and deps.edn, and tools.build to support "builds as programs".

by seancorfield

4/21/2026 at 7:08:01 PM

And we can look forward to Jank https://jank-lang.org/

by vaylian

4/22/2026 at 7:47:02 AM

Yes, although if one cares about Jank, they can also use a traditional Common Lisp or Scheme compiler, if compatibility with existing Clojure code isn't a requirement.

by pjmlp

4/21/2026 at 6:15:35 PM

Things have surely happened and the language has improved, but would you consider any of this to be innovative?

by whalesalad

4/21/2026 at 6:42:00 PM

Hmm. I'm not sure what you are looking for — myself, I write software that supports my living, and I'm not looking for thrills. What I get with Clojure is new concepts every couple of years or so, thought through and carefully implemented by people much smarter than me, in a way that doesn't break anything. This lets me concentrate on my work and deliver said software that supports my living. And pay the bills.

by jwr

4/21/2026 at 7:27:06 PM

Babashka is definitely innovative and useful

by waffletower

4/21/2026 at 9:35:20 PM

Agreed, that is huge for the ecosystem. I actually have a side project with a unified codebase: the central library and API server are in clj, and the CLI client is Babashka.

by whalesalad

4/21/2026 at 6:21:36 PM

> zero innovation has occurred in the Clojure space since 2016.

Oh, really? Zero, eh?

clojure.spec, deps.edn, Babashka, nbb, tap>, requiring-resolve, add-libs, method values, interop improvements, Malli, Polylith, Portal, Clerk, hyperfiddle/electric, SCI, flowstorm ...

Maybe you should've started the sentence with "I stopped paying attention in 2016..."?

by iLemming

4/21/2026 at 9:25:58 PM

> clojure.spec

Tape-patches for self-inflicted language design issues isn't innovation, lol

by instig007

4/21/2026 at 6:18:52 PM

its Common Lisp cousin: https://github.com/fosskers/transducers/

by vindarel

4/21/2026 at 6:52:02 PM

I'd say SERIES is its older cousin.

by jwr

4/21/2026 at 8:14:44 PM

SERIES would be the grandfather, no?

by BoingBoomTschak

4/21/2026 at 4:47:35 PM

The key insight behind transducers is that a ton of performance is lost not to bad algorithms or slow interpreters but to copying things around needlessly in memory, specifically through intermediate collections.

While the mechanics of transducers are interesting, the bottom line is that they allow you to fuse functions and basic conditional logic together in such a way that you transform a collection exactly once instead of n times, meaning new allocation happens only once. Once you start using them you begin to see intermediate collections everywhere.

Of course, in any language you can theoretically do everything in one hyperoptimized loop; transducers get you this loop without much of a compromise on keeping your program broken into simple, composable parts where intent is very clear. In fact your code ends up looking nearly identical (especially once you learn about eductions… cough).
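The fusion described above can be sketched outside Clojure too. A minimal Python sketch (the names `mapping`, `filtering`, and `compose` are illustrative, not any library's API): chained comprehensions allocate an intermediate list per step, while the composed reducing function walks the input once and still keeps each transform a simple, separate part.

```python
data = range(10)

# Chained transformations: each step allocates a full intermediate list.
step1 = [x + 1 for x in data]
step2 = [x for x in step1 if x % 2 == 0]
chained = [x * 10 for x in step2]

# Transducer-style: each transform wraps a reducing function rf(acc, x).
def mapping(f):
    return lambda rf: lambda acc, x: rf(acc, f(x))

def filtering(pred):
    return lambda rf: lambda acc, x: rf(acc, x) if pred(x) else acc

def compose(*xforms):
    def xform(rf):
        for xf in reversed(xforms):  # right-to-left wrapping = left-to-right data flow
            rf = xf(rf)
        return rf
    return xform

pipeline = compose(mapping(lambda x: x + 1),
                   filtering(lambda x: x % 2 == 0),
                   mapping(lambda x: x * 10))

step = pipeline(lambda acc, x: acc + [x])  # fuse once with the final step fn
fused = []
for x in data:
    fused = step(fused, x)  # one pass, one output collection

assert fused == chained == [20, 40, 60, 80, 100]
```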

by eduction

4/21/2026 at 5:03:15 PM

These sound wild in terms of promise but I never understood them in a practical way.

by fud101

4/21/2026 at 5:08:32 PM

They're not really that interesting. They're "reduce transformers". So, take a reduction operation, turn it into an object, define a way to convert one reduction operation into another and you're basically done. 99% of the time they're basically mapcat.

The real thing to learn is how to express things in terms of reduce. Once you've understood that, just take a look at e.g. the map and filter transducers and it should be pretty obvious. But it doesn't work until you've grasped the fundamentals.
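The point above ("express things in terms of reduce") can be made concrete with a short Python sketch; `my_map`, `my_filter`, and `mapping` are illustrative names, not a real library:

```python
from functools import reduce

# map and filter are both just reduce with a particular step function:
def my_map(f, coll):
    return reduce(lambda acc, x: acc + [f(x)], coll, [])

def my_filter(pred, coll):
    return reduce(lambda acc, x: acc + [x] if pred(x) else acc, coll, [])

# A transducer factors out the only part that varies: it takes the inner
# step ("what to do with a produced item") and returns a new step.
def mapping(f):
    return lambda step: lambda acc, x: step(acc, f(x))

append = lambda acc, x: acc + [x]

assert my_map(str, [1, 2, 3]) == ["1", "2", "3"]
assert my_filter(lambda x: x > 1, [1, 2, 3]) == [2, 3]
# The same reduce, now with a transformed step function:
assert reduce(mapping(str)(append), [1, 2, 3], []) == ["1", "2", "3"]
```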

by moomin

4/21/2026 at 5:43:38 PM

Canonical example is rewriting a non transducing set of collection transformations like

   (->> posts
        (map with-user)
        (filter authorized?)
        (map with-friends)
        (into []))
That’s five collections, this is two, using transducers:

    (into []
          (comp
            (map with-user)
            (filter authorized?)
            (map with-friends))
          posts)
A transducer is returned by comp, and each item within comp is itself a transducer. You can see how the flow is exactly like the double threading macro.

map, for example, is called with one arg here; this means it will return a transducer, unlike in the first example, where it has a second argument (the coll posts) and so runs over that and returns a new coll.

The composed transducer returned by comp is passed to into as the second of three arguments. In three argument form, into applies the transducer to each item in coll, the third argument. In two argument form, as in the first example, it just puts coll into the first argument (also a coll).

by eduction

4/21/2026 at 5:57:20 PM

That does not sound like a good example. The two-argument form of `map` already returns a lazy sequence. Same for `filter`. I thought lazy sequences were already supposed to get rid of the performance problem of materializing the entire collection.

by kccqzy

4/21/2026 at 6:04:09 PM

Lazy sequences reduce the size of intermediate collections, but they “chunk”: you get 32 items at a time; multiply that by however many transformations you have, and obviously by the size of the items.

There are some additional inefficiencies in terms of context capturing at each lazy transformation point. The problem gets worse outside of a tidy immediate set of transformations like you’ll see in any example.

This article gives a good overview of the inefficiencies, search on “thunk” for tldr. https://clojure-goes-fast.com/blog/clojures-deadly-sin/ (I don’t agree with its near condemnation of the whole lazy pattern (laziness is quite useful - we can complain about it because we have it, it would suck if we didn’t).)

by eduction

4/21/2026 at 7:18:34 PM

So what’s your coding style in Clojure? Do you eschew lazy sequences as much as possible and only use either non-lazy manipulation functions like mapv or transducers?

I liked using lazy sequences because they're more amenable to breaking larger functions into smaller ones and decrease coupling. One part of my program uses map, and a distant part of it uses filter on the result of the map. With transducers it seems like the way to do it is eductions, but I avoided them because each time an eduction is consumed it re-evaluates each item, so it's sacrificing time for less space, which is not usually what I want.

I should add that I almost always write my code with lazy sequences first because it’s intuitive. Then maybe one time out of five I re-read my code after it’s done and realize I could refactor it to use transduce. I don’t think I’ve ever used eduction at all.

by kccqzy

4/21/2026 at 8:44:52 PM

It's evolving, and I'm using transducers more over time, but I still regularly am in situations where a simple map or mapv is all I need.

Lazy sequences can be a good fit for a lot of use cases. For example, I have some scenarios where I'm selecting from a web page DOM and most of the time I only want the first match, but sometimes I want them all; laziness is great there. Or walking directories in a certain order, where the number of items they contain varies, so I don't know how many I'll need to walk, but I know it's usually a small fraction of the total. Laziness is great there.

This can still work with transducers - you can either pass a lazy thing in as the coll to an eager transducing context (maybe with a "take n" along the way) or use the "sequence" transducing context which is lazy.

I tend to reach for transducers in places in my code where I'm combining multiple collection transformations, usually with literal map/filter/take/whatever right there in the code. Easy wins.

Recently I've started building more functions that return either transducers or eductions (depending on whether I want to "set" / couple in the base collection, which is what eduction is good for) so I can compose disparate functions at different points in the code and combine them efficiently. I did this in the context of a web pipeline, where I was chaining a request through different functions to come up with a response. Passing an eduction along, I could just nest it inside other eductions when I wanted to add transducers, then realize the whole thing at the end with an into and render.

Mentally it took me some time to wrap my head around transducers and when and how to use them, so I'm still figuring it out, but I could see myself ending up using them for most things. Rich Hickey, who created Clojure, has said that if he had thought of them near the beginning he'd have built the whole language around them. But I don't worry about it too much; I mostly just want to get sh-t done, and I use them when I can see the opportunity to do so.

by eduction

4/21/2026 at 6:16:01 PM

This, by the way, is why the lead example in the original linked post on clojure.org is very much like mine.

by eduction

4/21/2026 at 6:36:02 PM

Thanks. So is this not an optimization the Clojure runtime can do for you automatically? I find the first one simpler to read and understand.

by fud101

4/21/2026 at 6:47:13 PM

Performance is one of the niceties of transducers, but the real benefits are from better code abstractions.

For example, transducers decouple the collection type from data-processing functions. So you can write (into #{} ...) (a set), (into [] ...) (a vector) or (into {} ...) (a map) — and you don't have to modify the functions that process your data, or convert a collection at the end. The functions don't care about your target data structure, or the source data structure. They only care about what they process.

The fact that no intermediate structures have to be created is an additional nicety, not really an optimization.

It is true that for simple examples the (->> ...) version is easier to read and understand. But you get used to the (into) syntax quickly, and you can do so much more this way (composable pipelines built on demand!).

by jwr

4/21/2026 at 9:31:27 PM

I'd argue that for most people performance is the single best reason to use them. The exception is if you regularly use streams/channels and benefit from transforming inside of them.

To take your example, there isn't much abstraction difference between (into #{} (map inc ids)) vs (into #{} (map inc) ids), nor is there a flexibility difference. The non transducer version has the exact same benefit of allowing specification of an arbitrary destination coll and accepting just as wide range of things as the source (any seqable). Whether in a transducer or not, inc doesn't care about where its argument is coming from or going. The only difference between those two invocations is performance.

Functions already provide a ton of abstractability and the programmer will rightly ask, "why should I bother with transducers instead of just using functions?" (aka other, arbitrary functions not of the particular transducer shape) The answer is usually going to be performance.

For a literal core.async pipeline, of course, there is no replacing transducers, because they are built to be used there, and there is a big abstraction benefit to being able to just hand a transducer to the pipeline or chan versus building a function that reads from one channel, transforms, and puts on another channel. I never had the impression these pipelines were widely used, but I'd love to be wrong!

by eduction

4/21/2026 at 7:02:13 PM

I never understood what was so special about Clojure's Transducers. Isn't it essentially just applying a transformation on the lambda applied to a fold?

by solomonb

4/21/2026 at 9:45:47 PM

Fundamentally, there are two ways of representing iteration pipelines: source driven, and drain driven. This almost always maps to the idea of _internal_ iteration and _external_ iteration, because the source is wrapped inside the transforms. Transducers are unusual in being source driven but also external iterators.

Most imperative languages choose one of two things: internal iteration that doesn't support composable flow control, or external iteration that does. This is why you see pause/resume style iteration in Python, Rust, Java, and even JavaScript. If that's your experience, transducers are a pretty novel place in the trade-off space: you keep most of the composability, but you get to drive it from things like event sources.

But the gap is a bit smaller than it might appear. Rust's iterators are conceptually external iterators, but they actually do support internal iteration through `try_fold`, and even in languages that don't, you can 'just' convert external to internal iterators.

Then all you have to do to recover what transducers give you is pass the object to the source, let it run `try_fold` whenever it has data, and check for early termination via `size_hint`. There's one more trick for the rare case of iterators with buffering, but you don't have to change the Iterator interface for that, you just need to pass one bit of shared state to the objects on construction.

Not all Iterators are strictly valid to be source-driven, and while most are, not everything works nicely when iterated this way (e.g. Skip could but doesn't handle this case correctly, because it's not required to). But I don't think transducers can actually do anything this setup can't. It's just an API difference after that point.

by Veedrac

4/22/2026 at 1:24:47 AM

> If that's your experience, transducers are a pretty novel place in the trade-off space

That is not my experience and TBH I don't know what a lot of your terminology specifically means.

by solomonb

4/22/2026 at 5:30:15 AM

I wasn't saying you would have that experience, I was saying that the reason people act like transducers are unique is that transducers are an unconventional place on well worn ground.

Ultimately, yes, everything bottoms out, most special tricks seem less special the more you understand about them, because it's programming and Turing Equivalence is the bedrock the whole field rests on. But the average person learning about transducers is not going to spot how closely related it is to other things that already exist.

I'm happy to elaborate on any part of the terminology if you're curious, but tbh I mostly wrote it for myself because I thought the framing was novel and wanted it noted down somewhere.

by Veedrac

4/21/2026 at 7:11:16 PM

That is a bit reductive. You can consider these implementations in other languages: https://github.com/hypirion/haskell-transducers -- https://github.com/ruuda/transducers

by waffletower

4/21/2026 at 7:29:24 PM

It seems like a messy abstraction whose results could be achieved through a variety of other tools. :/

by solomonb

4/21/2026 at 7:31:56 PM

It isn't messy in Clojure

by waffletower

4/21/2026 at 3:47:14 PM

transducers and async flow are :chefkiss

by mannycalavera42

4/21/2026 at 7:14:36 PM

I am a fan of Christophe Grand's xforms library -- https://github.com/cgrand/xforms -- I find the transducer nexus function, by-key, to be particularly useful for eliminating clojure.core destructuring dances when one needs group-by with post-processing.

by waffletower

4/21/2026 at 7:21:59 PM

A not too contrived example:

  (require '[net.cgrand.xforms :as x])

  (into {}
        (x/by-key :name :size (comp (x/reduce +) (map str)))
        example-mapseq)

by waffletower

4/21/2026 at 4:44:16 PM

[dead]

by faraway9911

4/21/2026 at 5:57:23 PM

You get this for free in Haskell, and you also save on not having to remember useless terminology for something that has no application of its own outside Foldables anyway.

by instig007

4/21/2026 at 6:06:34 PM

>...you also save on not having to remember useless terminology...

It may be true in this particular case, but in my admittedly brief experience using Haskell you absolutely end up having to remember a hell of a lot of useless terminology for incredibly trivial things.

by Maxatar

4/21/2026 at 6:20:28 PM

Terminology doesn't bother me nearly as much as people defining custom operators.

I used to think it was cute that you could make custom operators in Haskell, but as I've worked more with the language, I wish the community would just accept that "words" are actually a pretty useful tool.

by tombert

4/21/2026 at 6:32:46 PM

> You get this for free in Haskell,

Oh, my favorite part of the orange site, that's why we come here, that's the 'meat of HN': language tribalism with a technical veneer. Congratulations: not only did you say something as lame as "French doesn't need the subjunctive mood because German has word order rules that already express uncertainty", but you're also factually incorrect.

Haskell's laziness gives you fusion-like memory behavior on lists for free. But transducers solve a broader problem - portable, composable, context-independent transformations over arbitrary reducing processes - and that you don't get for free in Haskell either.

Transducers exist because Clojure is strict, has a rich collection library, and needed a composable abstraction over reducing processes that works uniformly across collections, channels, streams, and anything else that can be expressed as a step function. They're a solution to a specific problem in a specific context.

Haskell's laziness exists because the language chose non-strict semantics as a foundational design decision, with entirely different consequences - both positive (fusion, elegant expression of infinite structures) and negative (space leaks, reasoning difficulty about resource usage).

by iLemming

4/21/2026 at 7:16:35 PM

> Haskell's laziness gives you fusion-like memory behavior on lists for free.

Haskell's laziness and fusion aren't limited to lists: you can fuse any lawful composition of functions applied over data, given the required lawful instances for that composition. There's no difference from what transducers are designed for.

> But transducers solve a broader problem - portable, composable, context-independent transformations over arbitrary reducing processes - and that you don't get for free in Haskell either.

Transducers don't solve a broader problem; it's the same problem of reducing the complexity of your algorithms by eliminating transient data representations. If you think otherwise, I invite you to provide a practical example of the broader scope, especially the part about "context-independent transformations" that would be different from what Haskell provides you without that separate notion.

> and negative (space leaks, reasoning difficulty about resource usage).

which is mostly FUD spread by the internet crowd who don't know the basics of call-by-need semantics, such as where not to bind your intermediate evaluations, and which language constructs implicitly force evaluations for you.

by instig007

4/21/2026 at 7:26:50 PM

> you can fuse any lawful composition of functions

each of those requires manually written rewrite rules or specific library support. It's not a universal property that falls out of laziness - it's careful engineering per data type. Transducers work over any reducing function by construction, not by optimization rules that may or may not fire.

> it's the same problem

It is not. Take a transducer like `(comp (filter odd?) (map inc) (take 5))`. You can apply this to a vector, a lazy seq, a core.async channel, or a custom step function you wrote five minutes ago. The transformation is defined once, independent of source and destination. In Haskell, fusing over a list is one thing. Applying that same composed transformation to a conduit, a streaming pipeline, an io-streams source, and a pure fold requires different code or different typeclass machinery for each. You can absolutely build this abstraction in Haskell (the foldl library gets close), but it's not free - it's a library with design choices, just like transducers are.
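The context-independence claim above can also be illustrated outside Clojure; here is a Python sketch (all names illustrative, not a real library) applying one composed transformation both to an eager fold and to a push-based sink:

```python
def mapping(f):
    return lambda rf: lambda acc, x: rf(acc, f(x))

def filtering(pred):
    return lambda rf: lambda acc, x: rf(acc, x) if pred(x) else acc

def comp(*xforms):
    def xform(rf):
        for xf in reversed(xforms):
            rf = xf(rf)
        return rf
    return xform

# The transformation, defined once, with no source or destination in sight:
xform = comp(filtering(lambda x: x % 2 == 1), mapping(lambda x: x + 1))

# Context 1: an eager fold building a list.
step = xform(lambda acc, x: acc + [x])
out = []
for x in range(10):
    out = step(out, x)
assert out == [2, 4, 6, 8, 10]

# Context 2: a push-based sink (think channel or event source) reusing
# the very same xform; the "accumulator" is just a side-effecting log.
log = []
push = xform(lambda _, x: log.append(x))
for event in range(10):
    push(None, event)
assert log == [2, 4, 6, 8, 10]
```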

Your third claim is basically the "skill issue" defense. The two Haskell Simons, Marlow and Peyton Jones, and also Edward Kmett, have all written and spoken about the difficulty of reasoning about space behavior in lazy Haskell. If the people who build the compiler and its core libraries acknowledge it as a real trade-off, dismissing it as FUD from people who "don't know the basics" is not an argument. It's gatekeeping.

Come on, how can you fail to see the difference between "Haskell can express similar things" and "Haskell gives you this for free"?

by iLemming

4/21/2026 at 9:07:04 PM

Why do you eliminate a library-based solution from the equation if it can actually prove the point that there's no difference in intent as long as my runtime is already lazy by default?

> It is not. Take a transducer like `(comp (filter odd?) (map inc) (take 5))`. You can apply this to a vector, a lazy seq, a core.async channel, or a custom step function you wrote five minutes ago. In Haskell, fusing over a list is one thing. Applying that same composed transformation to a conduit, a streaming pipeline, an io-streams source, and a pure fold requires different code or different typeclass machinery for each.

You can do that only because Clojure doesn't care whether the underlying iterable is to be processed by a side-effectful evaluation. That doesn't negate the fact that the notion of a "transducer" is useless to the underlying evaluation. I said "fuse" in my previous comment to demonstrate that further compile-time optimisations are possible that eliminate some transient steps altogether. If you don't need that, you can just rely on generic lazy composition of functions that you define once over type classes' constraints.

`IsList` + `OverloadedLists` already exist. Had Haskell had a single type class for all iterable, implicitly side-effectful data, you would have got the same singly-written algorithm without a single notion of a transducer. Let that sink in: it's not the transducer that's useful, it's the differentiation between pure and side-effectful evaluations that allows your compiler to perform even better optimisations, with out-of-order evaluation of pure stuff, as well as elimination of parts of inner steps within the composed step function, as opposed to focusing just on the reducing step-function during the composition. It's not a useful abstraction to have if you care about better precision and advanced optimisations coming from the ability to distinguish pure stuff from non-pure stuff.

Haskell aside, if your goal is just to compose reusable algorithms, a call-by-need runtime + currying + pointfree notation have you covered; you don't need a notion of transducers that exists on its own (outside of the notion of foldable interfaces) to be able to claim exactly the same benefits.

> Two Haskell Simons - Marlow, and Jones, and also Edward Kmett have all written and spoken about the difficulty of reasoning about space behavior in lazy Haskell.

There's a difference between what those people actually said, and what the crowd claims they meant about laziness and space leaks. We can go over the individual statements and see whether they carry the same "negative" meaning that you say is there.

by instig007

4/21/2026 at 6:22:08 PM

It goes beyond foldables; transducers can be applied to streams. Clojure had foldables, called reducers; this was generalized further when core.async came along - transducers can be attached to core.async channels and also used in places where reducers were used. The terminology is used to document the thing that various contexts accept (chan, into, sequence, eduction, etc.). They exist to make the language simpler and more general. They could actually allow a bunch of old constructs to be dispensed with, but came along too late to build the whole language around.
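For instance, a sketch of the channel case (onto-chan! and the exact arities are per current core.async, so treat the details as illustrative):

  (require '[clojure.core.async :as a])

  ;; A channel that applies the transducer to every value put on it:
  (def c (a/chan 10 (comp (filter odd?) (map inc))))

  (a/onto-chan! c (range 10))  ;; put 0..9 on the channel, then close it
  (a/<!! (a/into [] c))        ;; => [2 4 6 8 10]

The same (comp (filter odd?) (map inc)) would work unchanged with into, sequence, or transduce.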

by eduction

4/21/2026 at 7:01:41 PM

> It goes beyond a foldable, can be applied to streams.

> Clojure had foldables, called reducers, this was generalized further when core.async came along - transducers can be attached to core async channels and also used in places where reducers were used.

Ok, you mean there's a distinction between foldables and effectful and/or infinite streams, so there's a natural divide between them in terms of interfaces such as (for instance) `Foldable f` and `Stream f e`, where `e` is the effect context. It's a fair distinction; however, I guess my overall point is that they all have applicability within the same kind of folding algorithms, which don't need a separate notion of "a composing object that's called a transducer" if you hop your Clojure practice onto a Haskell runtime where transformations are lazy by default.

by instig007

4/21/2026 at 7:22:07 PM

Is there a gain of Clojure transducers over JS-style iterators? - https://developer.mozilla.org/en-US/docs/Web/JavaScript/Refe...

https://developer.mozilla.org/en-US/docs/Web/JavaScript/Refe...

Both solve the copying problem, and neither relies on concrete types.

by css_apologist

4/22/2026 at 10:49:19 AM

I think the misunderstanding is about iterators "not relying on concrete types". Rather, iterators are the concrete type. Consider the example transformation from the transducers page:

  (def xf
    (comp
      (filter odd?)
      (map inc)
      (take 5)))
You'll see there's no notion of a concrete type that the transformation operates on. It can work with vectors, seqs, core.async channels, etc. Now consider how that could be written in JavaScript such that it works on arrays, sets, generators, iterators, etc., without having to first convert to another type (such as an iterator). That is what's meant by transducers not being tied to a concrete type.

by joe-user

4/21/2026 at 8:23:52 PM

They compose, and they can be passed around while staying completely oblivious to how they will be reduced, whether with conj or + or whatever the caller wants. And you can extend them at any point, at either end.

They are like map, filter and friends, but they compose. I think of iterators as an iterator protocol and transducers as a streaming protocol. An iterator just describes how to iterate over a collection. Transducers are transformations that can be plugged in at any point where data flows in one direction.
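A small sketch of what I mean by extending and reducing (take's early termination makes the infinite range safe here):

  (def xf (comp (filter odd?) (map inc)))

  ;; Extend at either end just by composing again:
  (def xf2 (comp (drop 10) xf (take 3)))

  ;; The same pipeline, oblivious to how it is reduced:
  (transduce xf2 conj [] (range))  ;; => [12 14 16]
  (transduce xf2 + 0 (range))      ;; => 42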

by bjoli

4/22/2026 at 3:26:07 AM

JS iterators work over lazy streams.

by css_apologist