Although some devs do write C code the way they would write K code.
Including Arthur Whitney himself; this is one famous example of that, and is worth close inspection if you're curious about the whole concept of array languages in general:
https://code.jsoftware.com/wiki/Essays/Incunabulum
Observe how quickly it "climbs the abstraction ladder", making heavy use of macros --- if you try to read it like "normal" C and just jump into the middle it's definitely not going to be readable, but if you start at the top and "consume" the definitions first, it becomes surprisingly easy to understand. One thing that stands out is how it makes almost all the loops implicit, via the DO macro.
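For reference, the DO macro in question is a one-liner that hides the loop counter and bound; what follows is a sketch of the idea rather than a verbatim quote from the linked code, using plain int and a made-up example:

#include <stdio.h>

/* DO-style macro: the counter and bound live inside the macro, so the
   call site reads as "evaluate expression x, n times". The loop index i
   is deliberately visible to x. (Sketch, not the exact original.) */
#define DO(n,x) {int i=0,_n=(n);for(;i<_n;++i){x;}}

int main(void){
  int v[5]={1,2,3,4,5},sum=0;
  DO(5,sum+=v[i])               /* implicit loop: sum the array */
  DO(5,printf("%d ",v[i]))      /* implicit loop: print the array */
  printf("\nsum=%d\n",sum);
  return 0;
}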
I am not a professional programmer; I use programming as a tool, and most of my stuff is written in high-level languages, though I have some C in there. Having said that, I feel like I am a competent enough programmer that if I look at someone's low-level code I get the gist of it almost immediately. I looked at this and it was not clear to me at all. I don't see the reason to shorten stuff like printf and main. Does reducing keystrokes add something?
> if you try to read it like "normal" C and just jump into the middle it's definitely not going to be readable, but if you start at the top and "consume" the definitions first, it becomes surprisingly easy to understand.
I will take your word for this, as you might know more about programming than I do, but I feel like at least half of my colleagues need to get the gist of what is happening, not exactly what is happening word for word. I make simulators for very complex machines. If we went around "climbing the abstraction ladder" every time something needed to be modified or added, I don't think I would personally get any work done.
> I don't see the reason to shorten stuff like printf and main. Does reducing keystrokes add something?
Bugs hide in the extra chars.
On a more serious note, there are several (debatable) benefits to brevity (in programming & in general communication):
- less code to keep in your head
- less chance of stupid typos (e.g. prinft instead of printf)
- more "expressive" in that each char has more meaning
- to someone who is fluent in the "shorthand", it is quicker to both write & read
None of these seem like benefits to me. I don't want to learn a new shorthand language when I already had to learn an extra language, English, just to learn programming. Also, I don't see expressiveness in terms of keystrokes: you don't store bytes in your head, you store tokens. I can have very different words for a character (char, "", etc.) but in my head they take exactly the same amount of space.
Are simple typos that annoying? Writing a wrong function call will lead to either a compile-time error or wrong behavior, and the latter only happens when the function names are not chosen smartly. Also, we have IDEs now, and they are pretty good at handling typos.
I don't understand how P("hello world") is quicker to remember than printf("hello world").
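To be concrete about what I mean: as far as I can tell, the shortening is just a preprocessor alias, something along these lines (an illustrative sketch, not a quote of the actual linked code):

#include <stdio.h>

#define P printf   /* one-letter alias for printf, Whitney-style */
#define R return   /* likewise for return */

int main(void){
  P("hello world\n");   /* expands to printf("hello world\n") */
  R 0;                  /* expands to return 0; */
}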
> I can have very different words for a character (char, "", etc.) but in my head they take exactly the same amount of space.
How sure are you? Given how little we know about the brain, I'd be cautious about asserting something like that as a fact. While I don't think we store bytes in our heads, I'm not certain that the length of a "symbol" doesn't matter either.
Maybe different people do it differently; after all, some have photographic memory, and I don't. But I certainly don't think about all the letters involved in a variable, but rather the concept of the variable, or what that variable represents. As such, it doesn't matter whether it's called "numberOfBurgersPerCustomer" or "nb". But in case I forget, "numberOfBurgersPerCustomer" immediately tells me what is going on, compared to just "nb".
I guess the point is that I don't want to wade through definitions just to get the gist of the code. I want to jump in there and see immediately what's going on and where the problem might be.
Another thing about that code that I think irks some folks, but also reminds me of reading math, is that it's much easier if you are aware of some common conventions and notation. For instance, the lines:
#define V1(f) A f(w)A w;
#define V2(f) A f(a,w)A a,w;
Are much more quickly recognizable if you know that in APL the left and right arguments to a function are alpha (α) and omega (ω), so you'd immediately recognize these macros as defining 1-argument and 2-argument functions.
For folks in the know, conventions like this allow quicker communication and expression, but they also raise the barrier somewhat to the uninitiated.
Side note for folks used to "modern" (post-1989) C code: this is the old K&R style of function definition. This SO post (https://stackoverflow.com/a/3092074) shows a comparison.
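To make that concrete, here is an illustrative expansion (the function names are made up, and A is stubbed out so the sketch stands on its own):

/* Stand-in for the array type so this compiles in isolation. */
typedef struct a *A;
#define V1(f) A f(w)A w;
#define V2(f) A f(a,w)A a,w;

V1(neg){ return w; }
/* expands to the K&R-style definition
     A neg(w) A w; { return w; }
   which in post-1989 prototype style would read
     A neg(A w) { return w; }                        */

V2(plus){ return a; }
/* expands to
     A plus(a,w) A a,w; { return a; }
   i.e. modern style: A plus(A a, A w) { return a; } */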