For the sorts of things I do, a (hierarchical and potentially even non-DAG) symbolic formula* is both much smaller and more easily reproduced than a (necessarily linear) sequence of physical actions. Therefore I tend to write myself CLI tools, with the occasional output via a graphical window or a browser.
Maybe this is why I don't have any voluntarily-installed apps on my phones? The phone form factor is suitable for entertainment, tolerable for consuming information, but horribly tedious for manipulating or generating it.
* it is a commonly held belief that "avoiding syntax errors", an activity most typically found only at the start of the learning curve, is worth spending person-decades on. Who (since Engelbart, Iverson, et al. in the 1960s?) is attempting to tackle "avoiding semantic errors"? We have billions of USD spent on making more approachable vuvuzelas; where are the violins? (or better yet, pianos: both initially more approachable and ultimately more powerful than any of the strings)
This got me thinking: I would totally learn APL on a phone app. If the custom keyboard could accommodate all the symbols, the "symbolic compression" vastly lowers resistance.
They don't have to be large programs, either. Just enough to describe basic constructs, and maybe include some "programming pearls," puzzles, or poetry.
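Even a tiny phrase shows the compression. If I have the Dyalog train rules right, the mean of a vector is just three symbols:

    avg ← +⌿÷≢         ⍝ a fork: (+⌿⍵)÷(≢⍵), i.e. sum divided by tally
    avg 1 2 3 4        ⍝ 2.5

Three keys on that custom keyboard and it already says something.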
The app could have ASCII symbols pop up, toast-like, as input is entered. Over time, the mind may associate "/\" with the corresponding single key.
A cursory glance for "apl programming" doesn't turn up much, but appropriate artistry and game mechanics could turn a new generation to a form of programming where symbols--not just alphanumerics--are both arcane and wondrous (yet also pragmatic).
Replit's mobile app has a decent virtual trackpoint-like cursor widget. If you added an APL keyboard as you suggested, it might be a nice combination if you leave out their AI autocomplete.
If you want to get extreme about input method minimalism, you might be able to remap Applejak[1]'s WASD / E input approach to swipe and tap gestures.
Interesting, Replit's widget does demonstrate how custom a UI component could be. Applejak's swipe and tap is definitely max minimal. Thank you for these.
Between this and the comments¹ from 082349872349872², there's plenty of inspiration to go on.
You're welcome. https://dl.acm.org/doi/pdf/10.1145/1283920.1283935 has some phrases that may be interesting to start with, but note that this was Iverson's Turing Award lecture, and he uses a somewhat-functional definition style that —as far as I know— was a proposal but never shipped by any commercial APL.
Another source of interesting phrases: http://dfns.dyalog.com/sindx.htm (but note that here they're embedded in Dyalog's functional definition style; I don't have any clue if that site supports it even as an option)
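As a taste of the dfn style (from memory, so treat it as a sketch rather than gospel), an ascending sort reads as:

    {⍵[⍋⍵]} 3 1 4 1 5      ⍝ index the argument by its own grade-up: 1 1 3 4 5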
Does type theory fit into whatever definition of semantics you're using? It is certainly reaching some incredible heights of usability with tools like Lean, and closer to the mainstream with functional languages like Haskell or compilers for Rust. Verification as a topic certainly expands beyond that; 'semantic solving' has been around for a long time and produces some fascinating things.
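For instance (roughly, in Lean 4), the claim itself is what gets checked, not just its spelling:

    -- only compiles because the statement is actually true
    -- (here by leaning on the library lemma Nat.add_comm)
    theorem my_add_comm (a b : Nat) : a + b = b + a := Nat.add_comm a b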
Yeah, when I was trying to think of more recent examples, type theory came to mind, but I think "structured programming" has had more bang for the buck so far...
Same here. I'm a heavy CLI user, even though I'm also a Windows user. Luckily, we have Cygwin; kudos to its developers.
Anyway, the CLI also fits well with the UNIX philosophy and reusability. Once you're comfortable with one CLI tool, learning a new one is easy. And then there's the power of the pipe, where you can combine several small tools to get interesting results. Now try that with a GUI ;)