Studying SML should be part of the entry examination for programming language designers. It's old and has its warts, but it's still vastly superior to most things that came later. (Type inference? Check! Pattern matching? Check! TCO? Check! Performance? Excellent! ...)
F# doesn't have many of OCaml's more advanced features. In particular, no functors and no polymorphic variants. Also the OO system is basically .NET, which isn't that surprising (but means that you don't get the neat row typing with inference that OCaml does).
OTOH some of the choices it makes are a bit more pragmatic - e.g. arithmetic is overloaded for ints and floats, locals can be marked as mutable similar to record fields, and fields are scoped to record types (so different types can use the same field name).
It’s incredible how much of an understatement it is to say that F# and OCaml were “inspired” by Standard ML. They’re practically step-siblings, sharing more similarities than I could have ever imagined.
Edit: Obviously, F# is the step-sibling here, given its half-dotnet parentage. However, they’re all solid choices. Their spectacular type inference makes coding in them a very gratifying experience. I can only think of TypeScript as the closest to them among the more popular modern languages.
Barry Revzin has written a paper to try to give C++ expressions like this. It's ugly, even by the standards of C++, but it would work. The rationale is that C++29 or C++32 will probably get pattern matching, and pattern matching doesn't have great ergonomics if everything is a statement rather than an expression. There are plenty of other things C++ will need to fix once it gets pattern matching, but the direction makes sense if you insist on trying to teach this particular old dog new tricks.
Also, I think Rust would consider itself to be a mainstream language, though they aren't technically "making this switch" because the language has always been expression oriented from the outset. The Book has about two paragraphs about the statements in the language and then a whole chapter on expressions, because almost everything is an expression.
They do, but the fact that there is even a distinction at this point is rather baffling. If your type system has the unit type and the bottom type, all statements should just be expressions having one of those two types.
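In Rust's terms, `()` is the unit type and `!` (the "never" type) is the bottom type, so a sketch of how "statements as expressions" would type out might look like this (the values and branches here are illustrative, not from any real API):

```rust
fn main() {
    // A "statement-like" call is really an expression of the unit type ().
    let a: () = println!("side effect");

    // Diverging expressions like panic!, return, and break have the bottom
    // type `!`, which coerces to any other type — so they can sit in
    // expression position without breaking the type of the surrounding code.
    let b: i32 = if true { 1 } else { panic!("unreachable") };

    assert_eq!(a, ());
    assert_eq!(b, 1);
}
```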
Well, Rust barely has statements [1]. Nearly everything in Rust is an expression, and AFAIK statements /are/ essentially expressions that yield `()` [2].
> AFAIK statements /are/ essentially expressions that yield `()`
This isn't true. The converse is true: "expression statements" can turn an expression into a statement.
What you're seeing in the playground is just how blocks are defined[1]:
> The syntax for a block is {, then any inner attributes, then any number of statements, then an optional expression, called the final operand, and finally a }.
In this case, you have one statement, and no optional expression.
And so:
> The type of a block is the type of the final operand, or () if the final operand is omitted.
So that's how this works.
Now, that being said, I don't think it's too terrible of a mental model to think of this situation in that way. But if we're getting into nitty-gritty details, that's not actually how it works.
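A minimal sketch of the two quoted rules, assuming nothing beyond the reference text above — a block with a final operand takes its type, and a block ending in a statement has type `()`:

```rust
fn main() {
    // Block whose last item is an expression: that final operand
    // (no trailing semicolon) gives the block its type, here i32.
    let x = {
        let a = 2; // a statement
        a + 3      // the final operand
    };
    assert_eq!(x, 5);

    // Block whose last item is a statement: the optional final operand
    // is omitted, so the block's type is ().
    let y = {
        let _a = 2; // a statement; nothing follows it
    };
    assert_eq!(y, ());
}
```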
No problem! This really only comes up in very edge cases anyway, which is partially why I don’t think it’s the worst mental model for most people. As long as you know that you can’t put let in expression position, that’s really the only place it goes wrong. Most people who haven’t used a language like Ruby wouldn’t even think of items as being expressions in the first place.
> Does it interact with the borrow checker somehow?
Nope. The borrow checker cares about the control flow graph, expression vs statement doesn't matter.
> why does Rust make the expression/statement distinction?
I am not 100% sure. If I had to guess, older Rust was much less expression oriented, and then became more so over time.
But also, I think it kinda just makes sense in general for the kind of language Rust is. Like, in Ruby, where everything truly is an expression, importing a file and then evaluating it has side effects. Whereas in Rust, 95% of statements are declarations, and of those, the only ones you really use in a normal execution context are let statements. The rest are stuff like "declaring functions" and "declaring structs", and those don't really get evaluated in a language like Rust.
let being a statement is nice because it means it can only happen at the "top level" of a block, and not say, inside a loop condition.
> Like, in Ruby, where everything truly is an expression, importing a file and then evaluating it has side effects.
In the context of ML, I think it's a more useful baseline. So declarations are still declarations, but e.g. ; is just a sequential evaluation operator.
> let being a statement is nice because it means it can only happen at the "top level" of a block, and not say, inside a loop condition.
I would argue that it's actually a downside - it means that a loop condition cannot have common subexpressions (that nevertheless need to be evaluated on every iteration) factored out.
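To make that tradeoff concrete, here's a sketch of the usual workaround (the `next_chunk` helper is hypothetical): since `let` can't appear inside a `while` condition, you desugar the loop into `loop` so the shared subexpression can be bound at the top of the block:

```rust
// Stand-in for some expensive computation the condition and body both need.
fn next_chunk(n: u32) -> u32 {
    n / 2
}

fn main() {
    let mut n = 100;
    // `while next_chunk(n) > 3 { n -= next_chunk(n); }` would evaluate
    // next_chunk(n) twice per iteration, and `let` isn't allowed in the
    // condition. So instead: desugar to `loop` with an explicit break.
    loop {
        let chunk = next_chunk(n); // factored-out common subexpression
        if chunk <= 3 {
            break;
        }
        n -= chunk;
    }
    assert_eq!(n, 7);
}
```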
This was the language they taught in our functional programming course while I was doing my CS degree, around 15 years ago now. I wonder if they still use it in the course.
I was taught Standard ML of New Jersey at university (which, as a non-American, I did not realise is a joke: it's referring to the company now known as Exxon, the Standard Oil of New Jersey).
I strongly believe that this is the best way to teach CS to undergraduates, although my home institution no longer teaches an ML as the first language. An ML has all the fundamental ideas you will need to teach this discipline, and (so long as you choose e.g. SML/NJ or similar, not Rust or something) it won't be a language the average teenager you recruited might already know. That means the week 1 exercise showing they've understood what they're doing is actual work for all of your students, averting a scenario where some of them drift away only to realise at exam time that they haven't learned a thing.
Agreed - we wrote an ML-style language in my uni compilers course, and it was taught by the primary maintainer of MLton (Dr Fluet is a great guy). Since MLton is a whole-program optimizing compiler, compilation took a while. He told us to give SML/NJ a try for faster compilation but slower execution. It was a headache and only marginally faster to compile for our use case.
That said, it’s the OG so I give it some slack. I did enjoy MLton though, but it’s easier to do when the instructor wrote it.
Not everyone; I'm the kind of person who wishes posts about machine learning would be prefixed with these kinds of clarifications ("watch out, this is not related to the programming language but a family of stochastic algorithms referred to as 'machine learning'"), because I consistently fall for it.
My favorite bit of SML trivia: the infix function composition operator is “o” — lowercase letter o — so you can write ‘(f o g)(x)’ just like mathematical notation.
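Most mainstream languages have no built-in equivalent of SML's `o`, but the idea translates. A rough sketch in Rust (the `compose` helper is hypothetical, not a standard-library function):

```rust
// SML writes (f o g)(x), meaning f(g(x)). Rust has no composition
// operator, but a generic helper captures the same idea.
fn compose<A, B, C>(f: impl Fn(B) -> C, g: impl Fn(A) -> B) -> impl Fn(A) -> C {
    move |x| f(g(x))
}

fn main() {
    let double = |x: i32| x * 2;
    let inc = |x: i32| x + 1;
    let h = compose(double, inc); // h(x) = double(inc(x))
    assert_eq!(h(3), 8);
}
```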
Definitely worth studying SML imo. Pattern matching is a cool feature, although it's not as comprehensive as most of the pattern matchers in Lisp. You can't match on bitfields, comparisons other than equality by value, etc.
Datatypes are just ok. Classes would be better. It's sort of strange to represent lists (and everything else) as enumerations. It's not really essential or fundamental but I guess that's what Lisp is for.
I suppose by enumerations you mean sum types. I would argue that these are pretty fundamental? you have product types (structs/records/tuples) - a value is made up of x and y - and sum types - a value can be either X or Y. I think the combination of these is what you need to precisely express any concrete data type.
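A sketch of that combination in Rust terms (the `Point` and `List` types here are illustrative): structs are product types ("x AND y"), enums are sum types ("X OR Y"), and a list is just a recursive sum type:

```rust
// Product type: a value has BOTH an x and a y.
struct Point {
    x: i32,
    y: i32,
}

// Sum type: a value is EITHER empty OR an element plus a tail.
enum List {
    Nil,
    Cons(i32, Box<List>),
}

// Pattern matching is how you consume a sum type: one arm per case.
fn sum(l: &List) -> i32 {
    match l {
        List::Nil => 0,
        List::Cons(head, tail) => head + sum(tail),
    }
}

fn main() {
    let l = List::Cons(1, Box::new(List::Cons(2, Box::new(List::Nil))));
    assert_eq!(sum(&l), 3);

    let p = Point { x: 1, y: 2 };
    assert_eq!(p.x + p.y, 3);
}
```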
I did mean sum types, variants, etc. It's not really clear what I meant by representing the data but I'm referring to type inference. SML can't solve the problem, and Lisp doesn't have it.
My CS degree course used SML as the first language to teach everyone in the first semester. Put everyone on the same level, as it was unlikely people would have prior experience. Also made it easy to teach things like recursion.
Ha, same here! It really helped my imposter syndrome, as I overheard a couple of guys talking about the ARM assembly they were doing on their Archimedes on the first day…and I hadn't written anything fancier than QuickBASIC at the time…