Writers do something different from programmers. What programmers do is often more akin to mathematics, especially the type of stuff language creators do. And mathematical notation likes to be short.
I rather strongly disagree when it comes to programming. Programming is as much communication between humans and domain modeling as it is instructing a computer to do something. Shortening names seems to be a cultural holdover from a time when name length had a real effect or constraint, and now the constraints are highly subjective and debatable: being shorter, faster to type, length symmetry, etc.
Mathematics uses symbols because there is usually no specific context for an abstract concept, and in other cases it relies on convention to provide context. Shorter names are additionally a side effect of the density of information and of the need to write complex expressions on a chalkboard and in LaTeX. The symbols are the meaning, not some shortened version of it.
`π` is a mathematical symbol, but you could also write `CircumferenceOfCircleWithUnitDiameter`. The reason everyone prefers `π` should be clear. The same is true for a less extreme case like `fst` versus `first`: you don't need to think about what `fst` means; if you are fluent in the programming language it is clear, and it is more concise than `first`. `fst` becomes an atomic symbol in the programmer's mind, not something you need to interpret like `first`. This is not about communicating with other humans or even the computer, it is about efficient representation within your own mind.
so much so that I think "differently" in good old car/cdr Lisp. even though car/cdr mean nothing on any modern CPU architecture and are as cryptic as can be, they embody the structural inductive step perfectly in my mind. when in Clojure or SML I write slower :)
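To make the "structural inductive step" concrete: here is a minimal sketch in Python, assuming cons cells modeled as nested `(head, tail)` tuples with `None` as the empty list (the names `car`, `cdr`, and `cons_sum` are illustrative, not any real API):

```python
# Illustrative car/cdr-style accessors over Python tuples used as cons cells.
def car(cell):
    return cell[0]  # head of the cell

def cdr(cell):
    return cell[1]  # rest of the list

def cons_sum(cell):
    # The inductive step reads directly off the structure:
    # sum of a list = car of the list + sum of its cdr.
    if cell is None:
        return 0
    return car(cell) + cons_sum(cdr(cell))

lst = (1, (2, (3, None)))  # the list [1, 2, 3] as cons cells
print(cons_sum(lst))  # prints 6
```

Once `car` and `cdr` are atomic symbols in your head, the recursion above is the whole definition; the names carry no English meaning to decode, which is arguably the point.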
No writer does this. Why are programmers obsessed with word length to such a degree?