> It's only magic if you don't understand how it works.
That’s why I used scare quotes around the term ;)
> To me it is awkward to describe but simple to understand.
It’s not awkward to describe though. It’s literally just syntactic sugar for chaining functions.
It’s probably one of the easiest programming concepts to describe.
From our conversation, I'm not sure you do understand it, because you keep bringing other tangential topics into the fold. Granted, I don't think the article does a great job of explaining what pipelining is, but then I think its point was more to demonstrate the cool syntactic tricks you can pull off when writing functions as a pipeline.
edit: just realised you aren't the same person who wrote the original comment claiming pipelining was about iteration. Apologies for getting you mixed together.
If you're going to be snarky then at least get your facts right.
`Iter` is a method of `data`. And do you know what a method is? It's a function attached to an object. A FUNCTION. Pipelining is just syntactic sugar around chaining functions.
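To make "a method is just a function attached to an object" concrete, here's a minimal sketch (the `Wrapper` type and `double` function are hypothetical, not from the article) showing that method chaining desugars to nested function application:

```rust
// Hypothetical example: a method and a free function doing the same thing.
struct Wrapper(i32);

impl Wrapper {
    // As a method: callable with dot syntax.
    fn double(self) -> Wrapper {
        Wrapper(self.0 * 2)
    }
}

// The same operation as a free function taking the "object" as its argument.
fn double(w: Wrapper) -> Wrapper {
    Wrapper(w.0 * 2)
}

fn main() {
    // Method chaining reads left to right...
    let a = Wrapper(3).double().double();
    // ...but it's just sugar for nested function application, inside out:
    let b = double(double(Wrapper(3)));
    assert_eq!(a.0, b.0);
    assert_eq!(a.0, 12);
}
```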
You even proved my point when you quoted the article:
Literally the only thing changing is the syntax of the code. You've got all of the same functions being called, with the same parameters and in the same order.
The article itself makes no mention of this affecting how the code is executed either. Instead, it talks about code readability.
In fact the article further proves my point when it says:
> You can, of course, just assign the result of every filter and map call to a helper variable, and I will (begrudgingly) acknowledge that that works, and is significantly better than trying to do absurd levels of nesting.
What it means by this is something like the following:
list = iter(data)
list = map(list, |w| w.toWingding())
list = filter(list, |w| w.alive)
list = map(list, |w| w.id)
result = collect(list)
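For comparison, here's a runnable Rust sketch of both forms. The `Widget` struct and its fields are hypothetical stand-ins for whatever the article's types are, and I've dropped the `toWingding` step since there's no such type; the point is only that the two versions call the same functions in the same order:

```rust
// Hypothetical data type standing in for the article's example.
struct Widget { alive: bool, id: u32 }

fn main() {
    let data = vec![
        Widget { alive: true, id: 1 },
        Widget { alive: false, id: 2 },
        Widget { alive: true, id: 3 },
    ];

    // Helper-variable style: each step assigned to a name.
    let iter = data.iter();
    let alive = iter.filter(|w| w.alive);
    let ids = alive.map(|w| w.id);
    let result: Vec<u32> = ids.collect();

    // Pipelined style: same functions, same arguments, same order --
    // only the syntax changes.
    let result2: Vec<u32> = data.iter()
        .filter(|w| w.alive)
        .map(|w| w.id)
        .collect();

    assert_eq!(result, vec![1, 3]);
    assert_eq!(result, result2);
}
```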
While I do have some experience in this field (having written a pipeline-oriented programming language from scratch), I'll cite some other sources too, so it's not just my word against yours:
The reason the article focuses on map/reduce type functions is because it's a common idiom for nesting commands. In fact you'll be familiar with this in Bash:
cat largefile.txt | sort | uniq --count
(before you argue about "useless use of `cat`" and other optimisations that could be made, this is just an example to demonstrate my point).
Here, each command is a process, but it's analogous to a function in general-purpose programming languages like Rust, LISP, JavaScript, etc. Those UNIX processes might internally loop through the contents of STDIN as an LF-delimited list, but that happens transparently to the pipeline. Bash, when piping each command to the next, doesn't know how each process will internally operate. Likewise, in the world of general-purpose programming languages, pipelines in LISP, Rust, JavaScript (et al) don't know nor care how each function behaves internally with its passed parameters, just so long as the output data type is compatible with the input data type of the next function -- and if it isn't, then that's an error in the code (i.e. a compile-time error in Rust or a runtime error in JavaScript).
So to summarise, pipelining has nothing to do with iteration. It's just syntactic sugar to make nested functions easier to read. And if the examples seem to focus on map/reduce, it's just because that's a common set of functions you'd want to chain and which are particularly ugly to read in nested form. i.e. they're an example of functions called in a pipeline, not the reason pipelines exist, nor proof that pipelines themselves have any internal logic around iteration.
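To underline that last point, here's a sketch of a pipeline with no iteration anywhere: just real `std` string methods chained on a single value. The chained form and the nested free-function form are the same calls:

```rust
fn main() {
    // Nested function-call form: str::to_uppercase applied to the
    // result of trim(), inside out.
    let nested = str::to_uppercase("  hello  ".trim());

    // Pipelined (method-chained) form: same two calls, left to right.
    let piped = "  hello  ".trim().to_uppercase();

    assert_eq!(nested, piped);
    assert_eq!(piped, "HELLO");
}
```

No loops, no map/reduce: the pipeline is purely a readability transformation.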
Yeah, you are right, it's about syntactic sugar; I didn't read the article beyond the first two examples.
Pipelines are about iteration, of course. And they do have internal logic around iteration.
cat largefile.txt | sort | uniq --count
is an excellent example. While cat and count iterate on each character sequentially, sort and uniq require buffering and allocating additional structures.
The pipeline isn’t doing any of that though. The commands are. The pipeline is just connecting the output of one command to the input of the other.
Iteration is about looping, and if the pipeline in the above example were some kind of secret sauce for iteration, then the commands above would fork multiple times, but they don't.
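You can see this in miniature by doing the shell's job by hand. This sketch (assuming a Unix-like system with `sort` and `uniq` on the PATH) wires stdout of one process to stdin of the next, which is all the pipeline itself does; any buffering or looping happens inside the commands:

```rust
use std::io::Write;
use std::process::{Command, Stdio};

fn main() {
    // Spawn `sort` with both ends of its I/O as pipes.
    let mut sort = Command::new("sort")
        .stdin(Stdio::piped())
        .stdout(Stdio::piped())
        .spawn()
        .expect("spawn sort");

    // Feed it some input; dropping the handle closes the pipe (EOF).
    sort.stdin.take().unwrap().write_all(b"b\na\nb\n").unwrap();

    // Spawn `uniq -c` with its stdin connected directly to sort's stdout.
    // This wiring is the entire job of the "pipeline" -- no iteration here.
    let uniq = Command::new("uniq")
        .arg("-c")
        .stdin(sort.stdout.take().unwrap())
        .stdout(Stdio::piped())
        .spawn()
        .expect("spawn uniq");

    let out = uniq.wait_with_output().expect("wait uniq");
    sort.wait().expect("wait sort");

    // Normalise the leading-space padding uniq -c emits before checking.
    let lines: Vec<String> = String::from_utf8_lossy(&out.stdout)
        .lines()
        .map(|l| l.split_whitespace().collect::<Vec<_>>().join(" "))
        .collect();
    assert_eq!(lines, vec!["1 a", "2 b"]);
}
```

Whether `sort` buffers everything or `uniq` streams line by line is invisible to the code doing the plumbing, which is exactly the point.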
To me it is awkward to describe but simple to understand. Lucky me I have no intention of describing it.