Well, the idea of building as much as possible into the language is old and has repeatedly failed. Languages like PL/I, APL, and to a lesser extent PHP have all tried this and failed. People do not need huge languages with as much stuff as possible crammed into them. What they need are languages where you can be reasonably certain of, and maybe even formally prove, things about the execution of your code (putting everything in The Cloud (tm) also hampers this), and where they can grow the language to fit their needs (e.g. Scheme, Haskell, Clojure, etc. are all pretty good at this).
The whole idea of claiming "moar data" solves everything comes across as incredibly naive and ignorant of what people like Tony Hoare, Dijkstra, or Alan Perlis have said about computing.
There are two ways of constructing a software design: One way is to make it so simple that there are obviously no deficiencies, and the other way is to make it so complicated that there are no obvious deficiencies. The first method is far more difficult -- Tony Hoare
> People do not need huge languages with as much stuff as possible crammed into them.
I think you should replace "people" with "systems programmers" in order not to express an opinion that will in time (if not already) prove to be anachronistic and narrow-minded.
And the kinds of things we're "cramming in" (in reality, very slowly and methodically adding) have little to do with what you'd find in the languages you mention.
Think of it like this: standard libraries are great, but they give you functionality that grows only linearly with the amount of stuff you learn, because their pieces cannot be too tightly coupled if they are to remain modular in the usual way.
With the Wolfram Language, you have more of a network effect: the more you learn, the more useful what you already know becomes, because the topic modelling you did over there can produce a network you can cluster over here and that you can then decorate with the graphics primitives you just learned about.
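To make that concrete, here is a minimal sketch. The data is a toy stand-in, and the pipeline is simplified to plain string clustering rather than real topic modelling, but every function is a built-in, and each one's output feeds directly into the next:

    (* Toy strings standing in for real data; the point is that each
       built-in's output is directly usable as the next one's input. *)
    words = {"network", "networks", "cluster", "clusters", "graphic", "graphics"};

    (* Group similar strings with a built-in string metric ... *)
    FindClusters[words, DistanceFunction -> EditDistance]

    (* ... reuse the same metric to build a nearest-neighbour graph ... *)
    g = NearestNeighborGraph[words, 2, DistanceFunction -> EditDistance];

    (* ... and hand that graph straight to the graphics layer. *)
    CommunityGraphPlot[g]

The clustering, the graph construction, and the plotting each become more useful because the others are already there.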
> The whole idea of claiming "moar data" solves everything comes across as incredibly naive and ignorant of what people like Tony Hoare, Dijkstra, or Alan Perlis have said about computing.
What is the "everything" that needs to be "solved"? Writing systems software?
I'm sure having graphs and images and time series built into your language isn't tremendously useful for writing a server. But people have already written servers. We don't need more languages to write servers (other than Rust or Go, perhaps).
We "need" the next generation of smart apps, things you couldn't even contemplate if you didn't already have access to curated data and algorithms that all work together. Or rather, that's what Wolfram Research is trying to make possible.
So in conclusion, it's not a systems programming language.
I think we got that.
So to break it down, as I understand it as a user: it's a collection of algorithms, Lego bricks of math, and a library of data sources, glued together with a functional REPL that is semantically represented as a notebook.
Can you stick that on the web site somewhere, without the millions of bullet points, marketoid spew and weaseling?
Maybe system programmers are “doing it wrong” now?
Even when you write a simple VGA driver you have to solve some equations. They are simple, so what does an average [good] C programmer do? He writes them down in his favourite inexpressive specialized language. Consequences: 1) a simple concept immediately becomes an incomprehensible mess; 2) approaches using more advanced ideas rarely get the proper attention, because a system programmer is unable even to write them down, let alone reason about them and tweak them. There is no reason why a VGA driver for a certain microprocessor could not be generated by a high-level language, with the basic concepts behind it expressed more clearly, and nevertheless formally.
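As a hedged illustration of the point (the numbers are the standard 640x480 @ 60 Hz VGA timings; the symbol names are made up for the sketch), the equations a driver usually buries as magic register constants can simply be written down:

    (* Sketch only: VGA mode timing expressed as equations rather than
       as magic numbers hard-coded in a C register-poking routine. *)
    hTotal = hVisible + hFrontPorch + hSyncPulse + hBackPorch;
    vTotal = vVisible + vFrontPorch + vSyncPulse + vBackPorch;
    pixelClock = hTotal vTotal refreshRate;

    (* Standard 640x480 @ 60 Hz figures plugged into the same equations *)
    pixelClock /. {hVisible -> 640, hFrontPorch -> 16, hSyncPulse -> 96, hBackPorch -> 48,
                   vVisible -> 480, vFrontPorch -> 10, vSyncPulse -> 2, vBackPorch -> 33,
                   refreshRate -> 60}
    (* 25200000 -- a ~25.2 MHz dot clock, i.e. the familiar nominal 25.175 MHz *)

The point is not this particular snippet but that the reasoning stays visible and checkable, instead of collapsing into a table of precomputed constants.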
Agree with you entirely (especially after recently writing an SSDP and UPnP stack), but unfortunately the academics with all the supposed solutions have managed to deliver nothing with any momentum, so we're stuck with a 40-year-old programming system and 30-year-old abstractions.
Then again, you know what? It works pretty well, so perhaps we're not doing it wrong.
Out of curiosity, would you classify this in the 4GL camp?
Not to denigrate the technology and functionality that the Wolfram language is exposing, but it sure sounds a lot like the 4GL pitches from the 90s. Is that a fair comparison?