Hacker News

What makes APL different from, say, Lisp or Haskell? Do you have tutorials to recommend?


It's very hard to find good tutorials on APL: it's not very popular, and most of its implementations are closed-source and incompatible with each other's language extensions. The language is most recognizable for its extreme use of non-standard codepoints. Every primitive function in APL is a single character, but those characters range from . to most of the Greek alphabet (taking meanings similar to their use in abstract math) to things like ⍋ (sort ascending). Wikipedia has a few fun examples if you just want a very brief taste; you can also read a tutorial from MicroAPL at http://www.microapl.com/apl/tutorial_contents.html

It's mostly good for being able to express mathematical formulas with very little translation from the math world - "executable proofs," I think the quote is - and having matrices of arbitrary dimension as first-class values is unusual if not unique. But for any practical purpose it's to Haskell what Haskell is to Java.


> But for any practical purpose it's to Haskell what Haskell is to Java.

Can you elaborate on this? As I understand, the core strengths of APL are succinct notation, built-in verbs which operate on vectors/matrices, and a requirement to program in a point-free style. All of this can be done in Haskell.
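To make the "this can be done in Haskell" claim concrete, here's a sketch (our own illustration, not from any APL implementation) of the classic APL average train +/÷≢ (sum divided by tally), written point-free in Haskell using the function Applicative:

```haskell
-- The APL "average" train  +/ ÷ ≢  (sum divided by tally),
-- sketched point-free via the ((->) r) Applicative, where
-- (f <*> g) x = f x (g x), so avg xs = sum xs / length xs.
avg :: [Double] -> Double
avg = (/) <$> sum <*> (fromIntegral . length)

main :: IO ()
main = print (avg [1, 2, 3, 4])  -- prints 2.5
```

The shape of the definition even mirrors the APL train: two functions applied to the same argument, combined by a third.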


A Java programmer unfamiliar with Haskell looks at a Haskell program and shouts, "I can't make even the slightest bit of sense out of this!"

A Haskell programmer unfamiliar with APL looks at an APL program and...


Most Haskell programmers should be familiar with right-to-left point-free style, and should be able to infer that symbols stand in for names.

Of course, understanding the individual symbols is a different matter, but hardly requiring a conceptual leap.
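As a small illustration of that point (the APL in the comment is our own example, not from the thread): Haskell's (.) already composes right-to-left the way an APL expression reads.

```haskell
-- Right-to-left composition, in the spirit of the APL
-- expression  +/ ×⍨ ⍳ n  (sum of the squares of 1..n).
sumSquares :: Int -> Int
sumSquares = sum . map (^ 2) . enumFromTo 1

main :: IO ()
main = print (sumSquares 4)  -- 1 + 4 + 9 + 16 = 30
```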


>A Haskell programmer unfamiliar with APL looks at an APL program and...

And says "what's the big deal?". That's exactly the question, what is the big deal. APL isn't scary, I'm not shouting "I can't make sense of this", I am asking "how is this better than haskell in the same way haskell is better than java?".


I'm not really interested in debating the reaction of an imagined Haskell programmer. I was just restating what the grandparent's analogy meant.

Your question is fine, but not what he meant by the analogy.


I'm not imagined, I am real. I know you were restating the analogy, the problem is that the analogy is wrong. I can't find anything about APL that a haskell developer would find new or interesting or frightening or anything like that.


Ok.


More esoteric organization/concepts for anyone coming from the C family (which is basically everyone), more out-there notation, more deserving of the title "write-only," and less ability to do anything you might want to do with a real computer beyond using it as a calculator. I wouldn't want to do much work with Haskell's GTK bindings, but at least they exist.


That tutorial is deeply unimpressive. It seems very excited about APL having functions, and not directly mapping to machine-level constructs. In 1962 I can imagine that being impressive (if you weren't familiar with Lisp or ALGOL); today, not so much. The one thing that does seem somewhat interesting is the emphasis it puts on "operators" (i.e., second-order functions). This is obviously not new to anyone familiar with functional programming, but I do like the way that tutorial jumps in quite quickly to the practical utility of a few simple second-order functions (reduce, product, map).
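For comparison, the second-order functions the tutorial leads with have direct Haskell analogues; a rough sketch (the outer helper and its name are our own, not a standard library function):

```haskell
-- APL's plus-reduce  +/  is just a fold; its outer product
-- ∘.×  can be sketched as a nested list comprehension.
plusReduce :: Num a => [a] -> a
plusReduce = foldr (+) 0

outer :: (a -> b -> c) -> [a] -> [b] -> [[c]]
outer f xs ys = [[f x y | y <- ys] | x <- xs]

main :: IO ()
main = do
  print (plusReduce [1, 2, 3, 4])       -- 10
  print (outer (*) [1, 2, 3] [10, 20])  -- [[10,20],[20,40],[30,60]]
```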


Like I said, it's hard to find good ones; I didn't say I had succeeded. I learned a bit of it for a programming language design course, but I never got beyond the basic concepts.


Definitely watch this video http://www.youtube.com/watch?v=a9xAKttWgP4


APL has its own codepage? I have to say, that's a better and simpler way of avoiding success at all costs than Haskell ever found.

Not that I dislike the idea -- on the contrary, I'm inclined to conclude from my excitement over this and Haskell that I dislike success...


Well, in the end it doesn't matter whether your language is looking for popularity or not. What matters is what you can do with it. You think a language with weird symbols all around can't win? Just look at Perl.

On a related note, if one plans to sell the Language of The Future Of Programming, I swear it will meet the same fate as Planner, NLS, Sketchpad, Prolog, Smalltalk and whatnot if it can't help me with the problems I have to solve tomorrow.


Try J. Or Kona (open source K). All ascii characters.


Haskell has a rule - to avoid popularity at all costs


You should parse it as "avoid (popularity at all costs)" rather than "(avoid popularity) at all costs".


Well of course it does. Popularity is a side effect.


All the decent tutorials that I know of were in book form. Unless someone's scanned them they're gone. I know mine got destroyed in a flooded basement.

Now, I didn't learn APL from a tutorial, I learned it (in 1976) from a book. This book: http://www.jsoftware.com/papers/APL.htm from 1962.

If my memory hasn't been completely corrupted by background radiation, I've seen papers as early as the mid 1950s about this notation.

APL started out as a notation for expressing computation (this is not precise but good enough). As far as I'm concerned it's sitting at a level of abstraction higher than Haskell (arguably like a library overtop Haskell).

Now, in the theme of this thread, APL was able to achieve all of this given the constraints at the time.

The MCM/70 was a microprocessor-based portable computer that shipped in 1974 (demonstrated in 1972, with some prototypes delivered to customers in 1973) and ran APL on an 80 kHz (that's kilo) 8008 (with a whole 8 bytes of stack) with 2 kB (that's kilo) of RAM, or 8 kB (again, that's kilo) maxed out. This was a small, slow machine that still ran APL (and nothing else). IEEE Annals of the History of Computing lists it as the earliest commercial, non-kit personal computer (IEEE Annals of the History of Computing, 2003: pp. 62-75). And, I say again, it ran APL exclusively.

Control Data dominated the supercomputer market in the 70s. The CDC 7600 (designed by Cray himself; 36.4 MHz, 65 kWords of memory (a word was some multiple of 12 bits, probably 60 bits, but I'm fuzzy on that), and about 36 MFLOPS according to Wikipedia) was normally programmed in FORTRAN. In fact, this would be a classic machine for FORTRAN. However, the available APL implementation was often able to outperform the FORTRAN code, almost always when the code was written by an engineer (and I mean a civil, mechanical, industrial, etc. engineer, not a software engineer) rather than someone specialising in writing fast software.

I wish everyone would think about what these people accomplished given those constraints. And think about this world and think again about Bret Victor's talk.


Thank you. Please consider writing a blog so that this knowledge doesn't disappear.

Were those destroyed tutorials published books?


The ones I remember were all books. At the time, I thought this was one of the best books available: http://www.amazon.com/APL-Interactive-Approach-Leonard-Gilma... -- but I don't know if I'd pay $522 for it... actually I do know, and I wouldn't. The paper covered versions are just fine, and a much better price :-)

EDIT: I just opened the drop down on the paper covered versions. Prices between $34.13 and $1806.23!!! Is that real?!? Wow, I had five or six copies of something that seems to be incredibly valuable. Too late for an insurance claim on that basement flood.


Probably Amazon bot bidding wars: http://www.michaeleisen.org/blog/?p=358


I'd say abstraction and notation. Start here:

http://www.jdl.ac.cn/turing/pdf/p444-iverson.pdf



