instead? pmap being defined as "Like map, except f is applied in parallel" [1]
That said, I'm too ignorant to understand the deeper benefits of the parallel approach. Other comments in this thread suggest it enables nifty things, but so far I haven't been able to work out why.
The deeper benefits come from having a language design where almost everything expects to work with arrays and array indices, so that idiom can be combined with many other operators and combinators.
Quick example in K, J's cousin, which I know far better: consider "{[d;p]d@&p d}". That's a two-argument lambda which takes a data set (d) and a predicate function (p), applies the predicate to the data set's values (p d), yielding a boolean vector of results, filters that down to just the indices of the true/1 values (&), then indexes the data set by those indices (@). The predicate can be applied to the data set in parallel, and the data set can be sliced by the resulting indices in parallel. d can be a small data set in memory, or a terabyte of memory-mapped data on disk; it doesn't matter. (Really, it could just be "{x@&y x}" or "{x[&y[x]]}", but I named the parameters.) 'where' ("&") is just one of many operators that combines with array slicing like this.
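To make that concrete, here's roughly how it plays out step by step in a K session (the data and predicate are made up for illustration):

    d: 3 1 4 1 5 9 2 6    / a small data set
    p: {x>3}              / predicate: is the value greater than 3?
    p d                   / 0 0 1 0 1 1 0 1  (boolean vector of results)
    &p d                  / 2 4 5 7          (indices where true)
    d@&p d                / 4 5 9 6          (slice d by those indices)
    {[d;p]d@&p d}[d;p]    / 4 5 9 6          (same thing via the lambda)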
In pseudo-Lisp it might look something like "(lambda (d p) (pmap d (where (pmap p d))))", where the data set d doubles as an indexing function, the way a list does in K. 'where' is a function that converts e.g. [1 2 3 4 5] to [0 1 1 2 2 2 3 3 3 3 4 4 4 4 4], and [0 1 0 1 0] to [1 3]. This works for both boolean masks and reshaping data sets, but the underlying implementation is the same.
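That dual behavior is exactly what K's "&" does when applied to an integer vector, replicating each index by its count:

    &1 2 3 4 5    / 0 1 1 2 2 2 3 3 3 3 4 4 4 4 4
    &0 1 0 1 0    / 1 3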
Okay. I think I've figured out how to integrate Clojure-style lazy streams with an array-oriented language, but it's a work in progress, currently shelved until I get my Strange Loop presentation done.