It's not every day that we see Oz mentioned here! I was very involved in writing the Mozart/Oz 2.0 VM.
I also wrote a "toy" (read: for school) dialect of Scala compiling to Oz and therefore turning every local variable or field into what Scala calls a Future, for free. Performance was abysmal, though! But in terms of language idioms, it was quite nice.
---
Unrelated: about Wasm, none of what it does is new, obviously. What's interesting about it is that
a) browser vendors agree to do it together, and
b) the design grows to accommodate many source languages. That wasn't always the case, but the eventual arrival of WasmGC significantly reshuffled the deck.
Relevant background here: I'm the author of the Scala to JavaScript compiler, and now co-author of the Scala to Wasm compiler.
I'm a bit hesitant to describe $NEW_CONCEPT/TECH as just $OLD_CONCEPT/TECH. Echoes of older things in a new context can really amount to something different. Yes, VMware didn't create the idea of virtualization and Docker et al didn't create containerization but the results were pretty novel.
I'd rather say that good ideas keep on returning, no matter whether they are remembered or getting reinvented.
It's not that those who reapplied the old concept in new circumstances are not innovators; they are! Much like the guy who rearranged the well known thread, needle, and needle eye and invented the sewing machine, completely transforming the whole industry.
But seeing the old idea resurfacing again (and again) in a new form gives you that feeling of a wheel being reinvented, in a newer and usually better form, but still very recognizable.
About WASM: it is not the first sandboxed bytecode interpreter, but it is the first that runs in the browser and has usable toolchains for compiling languages that weren't designed browser-first. I'd argue that that's where the novelty is.
Maybe you know this better than me. Were languages other than Java available for Java applets 20 years ago?
My understanding is that applets were pretty much Java-only (with Clojure and Scala also available in the later years, before applets were deprecated?). Is that understanding wrong?
Java, Flash, Silverlight, ActiveX? There were loads of technologies to run different languages in a browser, but they were all proprietary to a point; none of them were a web standard, they all needed separate installation or a specific browser, and they were all basically black boxes in the browser. Whereas (from what I understand) wasm is a built-in browser standard.
There was (is?) also asm.js, which IIRC was a subset of JS that removed all the dynamic features so it would be a lot faster than vanilla JS. But again, not a broadly adopted / W3C standard.
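For anyone who never saw it: asm.js was just a stylised subset of plain JavaScript in which every value is coerced to an integer or a double, so an engine that recognises the pattern can compile it ahead of time instead of interpreting it dynamically. A minimal sketch (illustrative only; a real asm.js module has a few more structural requirements, and the type annotations below are just TypeScript dressing on what is otherwise plain JS):

```typescript
// The "use asm" pragma plus the |0 coercions tell the engine that every
// value in here is a 32-bit integer, so no dynamic dispatch is needed.
function AsmModule(stdlib: unknown, foreign: unknown, heap: ArrayBuffer) {
  "use asm";
  function add(x: number, y: number): number {
    x = x | 0;            // parameter is an int32
    y = y | 0;
    return (x + y) | 0;   // result is an int32 too
  }
  return { add: add };
}

const mod = AsmModule(globalThis, {}, new ArrayBuffer(0x10000));
console.log(mod.add(2, 3)); // 5, with or without an asm.js-aware engine
```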
Just one of the many articles that are slowly surfacing, now that WebAssembly is interesting enough as a possible attack vector.
While there is a sandbox, you can attack WASM modules the same way as a traditional process reached via OS IPC: by misusing the public API in a way that corrupts internal memory state (accesses within linear memory aren't bounds-checked against the module's own data structures), thus fooling future calls into following execution paths they shouldn't. With enough luck, you get an execution path that e.g. validates an invalid credential as good.
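To make that concrete, here is a minimal sketch using only the standard WebAssembly JS API (the buffer layout and offsets are invented for illustration): linear memory is one flat byte array, so the VM only checks that accesses stay inside the memory as a whole, not inside any particular buffer the module keeps there.

```typescript
// Linear memory is a single flat byte array shared by everything the
// module stores; there are no guard pages between its internal buffers.
const memory = new WebAssembly.Memory({ initial: 1 }); // one 64 KiB page
const bytes = new Uint8Array(memory.buffer);

const USERNAME_OFFSET = 0;  // hypothetical 16-byte username buffer
const IS_ADMIN_OFFSET = 16; // hypothetical flag stored right after it

bytes[IS_ADMIN_OFFSET] = 0; // the module "initialises" the flag to false

// A copy that doesn't respect the 16-byte limit spills into the flag.
// No trap: nothing is out of bounds as far as the sandbox is concerned.
const attackerInput = new TextEncoder().encode("AAAAAAAAAAAAAAAA\x01");
bytes.set(attackerInput, USERNAME_OFFSET);

console.log(bytes[IS_ADMIN_OFFSET]); // 1 -- "is_admin" is now true
```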
A properly implemented JVM should not have security issues. The class library, however... (i.e. it's a lot easier to sandbox things if you start without any classes that interact with anything outside the sandbox).
The JVM is fairly good at sandboxing, as these things go. Turns out sandboxing arbitrary software is an extremely hard problem (as the WASM folks are starting to encounter in the wild).
At least in so far as the higher level (DOM, browser runtime) and lower level (memory access, to the extent that it's mediated by the WASM VM) have no security issues...
The VM itself is pretty tight, but abstractions have a nasty habit of being leaky.
Garbage collection in every high-level language: Java was the first mainstream language to do it -- people were seriously using C++ for high-level business logic at the time and were suspicious of GC because of its performance.
But Java itself got it from LISP, which had introduced GC decades prior without it ever going mainstream.
(2)
NoSQL had already been tried as hierarchical databases in the 70s or 80s, IIRC. The relational model won because it was far more powerful. Then in the early 2010s, due to a sudden influx of fresh grads and boot campers etc., who often had a poor grasp of SQL, schemaless stuff became very popular... And thankfully the trend died back down as people rediscovered the same thing. Today's highly scalable databases like Spanner and Cassandra don't ostentatiously abandon relational calculus; they reimplement a similar model even if it isn't officially SQL.
(3)
And then there's the entire cycle that's gone back and forth several times of client based vs server based:
First there were early ENIAC-type computers that were big single units. I would consider that similar to a thick client.
Then, as those developed, we had a long era of something more similar to the cloud, in that a single computer ran batch processing for many partitioned users who submitted punch-card batches.
That developed even further into what was then the apex of cloud-style computing: terminal systems like ITS, Multics, and finally, in the 70s, UNIX.
Then the PC revolution of the 80s turned that totally on its head and we went back to very very thick client, in fact often no servers at all (having a modem was an optional accessory)
We stuck with that through the 90s, the golden age of desktop software.
A lot of attempts were made to go back to thinner clients but the tech wasn't there yet.
Then of course came the webapp revolution, started by Gmail's decision to exploit a weird, little-used API called XMLHttpRequest (sketched below). The PC rapidly transformed over the next decade from a thick client into a thin vessel for a web browser, as best exemplified by the Chromebook, where everything happens in the cloud -- just like it did in the mainframe and terminal days 50 years ago...
The trend could stay that way or turn around -- it has always depended on shifts in the hardware performance balance.
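For readers who weren't doing web work then, the XMLHttpRequest trick that kicked off the webapp era amounts to this (a bare-bones sketch; the endpoint and element id are invented):

```typescript
// Fetch data in the background, then patch the page in place
// instead of navigating to a whole new document.
const xhr = new XMLHttpRequest();
xhr.open("GET", "/inbox/unread-count"); // hypothetical endpoint
xhr.onload = () => {
  const counter = document.getElementById("unread"); // hypothetical element
  if (counter) counter.textContent = xhr.responseText;
};
xhr.send();
```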
To be honest, NoSQL makes sense where the stream of writes is very intense, so ACID guarantees are impossible to enforce along with relational guarantees, like referential integrity. See stuff like Cassandra.
Schemaless has its place for document storage and the like, but it requires a much more careful approach, else it can devolve into insanity.
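As a concrete (invented) example of the "devolves into insanity" failure mode: once documents in the same collection have drifted apart, every reader has to re-derive the schema defensively.

```typescript
// Two "user" documents written by different versions of the app;
// nothing in a schemaless store stops this kind of drift.
const docs: any[] = [
  { name: "Ada", email: "ada@example.com" },
  { full_name: "Grace", contact: { email: "grace@example.com" } },
];

// Every reader now carries the accumulated history of the schema.
function emailOf(doc: any): string | undefined {
  return doc.email ?? doc.contact?.email;
}

for (const doc of docs) {
  console.log(doc.name ?? doc.full_name, emailOf(doc));
}
```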
I think you missed my point that what you're describing as the "old" idea were actually the "new" idea with a corresponding "old" idea. For instance, you mention OCaml's typing, but OCaml is from 1996 and Milner's type-inference work (which was for an early version of ML) is from 1982. And ML itself is from 1973 according to https://en.wikipedia.org/wiki/ML_(programming_language) ...
[My personal experience from doing related-work searches for research papers is that there was often an at least somewhat relevant reference from the 60s...]
About 15 years ago the joke was `cat /etc/services | mail apply@ycombinator`, since at the time it seemed like startups were just doing file transfer, email, network file systems, etc. It wasn't far off, as Unix is file-based and the internet is also file-based.
And to a point they were correct; file transfer 15 years ago was closely linked to piracy and dodgy websites that scammed you into clicking an ad instead of the download button. It's only thanks to e.g. Dropbox / cloud file storage providers, WeTransfer, etc. that that bit has been resolved.
Dunno about email though; the last real innovation in that space that I can remember with lasting impact was Gmail. There were a few more tidbits like Inbox (RIP), the inbox-zero methodology, and Airmail (?), but none of them really took off.
There's always a push and pull between old and new tech and I agree some of the hot new tech is regurgitated old tech, but most of your examples aren't really comparable.
I would say that my examples are rhymes, different developments of the same theme. They are not literal repetitions, of course; comparable, not identical.
Basically all of Tailwind CSS. Inline styles are nothing new, and neither are utility classes, nor the scalability issues of inline styles that led Tailwind to reinvent classes with its `@apply` macro for creating component classes (sketch below).
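To spell out the "nothing new" point, here are the three forms side by side (the markup is illustrative, and the `@apply` rule is shown as a comment since it lives in a CSS file):

```typescript
// 1. Inline styles -- the old thing Tailwind is usually contrasted with.
const inline = `<button style="padding: 0.5rem 1rem; font-weight: 700">Save</button>`;

// 2. Utility classes -- Tailwind's default mode of use.
const utilities = `<button class="px-4 py-2 font-bold">Save</button>`;

// 3. A component class, which Tailwind recreates via @apply:
//      .btn { @apply px-4 py-2 font-bold; }
//    i.e. we have reinvented writing a CSS class.
const component = `<button class="btn">Save</button>`;

console.log(inline, utilities, component);
```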
Edit for another: RPC calls are really old and went out of style maybe 15 or 20 years ago in most codebases. Most of the modern JavaScript metaframeworks are now using RPC calls obscured by the build/bundling process.
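A rough sketch of that pattern, not any specific framework's API (the names and the `/__rpc/` route are invented): you write what looks like a plain async function, and the bundler splits it into a server handler plus a client stub that does the HTTP round trip.

```typescript
// What the application developer writes: looks like a local call.
// (Real frameworks mark such "server functions" with a directive
// or a special file location; this is only a sketch.)
async function getCart(userId: string): Promise<string[]> {
  // ...runs on the server, talks to the database...
  return ["book", "coffee"];
}

// What the client bundle effectively contains instead: a stub with the
// same signature whose body performs an RPC over HTTP.
async function getCartClientStub(userId: string): Promise<string[]> {
  const res = await fetch("/__rpc/getCart", {
    method: "POST",
    headers: { "content-type": "application/json" },
    body: JSON.stringify([userId]),
  });
  return res.json();
}
```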
Thank you for mentioning Tailwind. Every time some young dev talks about how Tailwind is "forward thinking" I just want to scream into a pillow. This is also the case now that SSR is becoming popular again.
I can deal with the SSR becoming hip again, but can we please settle on either back or front-end rendering? Either was good, but trying to combine the two is evil.
SSR is the most mindblowing of the lot, it's gone full circle.
I mean, granted, I've worked with e.g. Gatsby for a while, which is SSR on one side but a hydrated SPA with preloading etc. on the other, making for really fast, low-bandwidth websites. But still.
Sure, I'm not saying RPC isn't used today or that it doesn't solve specific problems.
It is a reinvention of an old idea, though. There were around 15 years during which RPC rotted on the vine until Google brought it back for (mostly) the enterprise scale, and another 6 or 7 before JavaScript frameworks rediscovered it for fullstack web applications.
… Eh? The predecessor to gRPC seems to have started internally at Google in 2001, and Google open-sourced it in 2015. In 2001, CORBA was all the rage; by the mid-noughties this had been replaced with SOAP, and maybe Thrift rpc in trendier places. I gather there was a whole parallel Microsoft ecosystem with DCOM and things, though that wasn’t my world and I don’t know much about it. But the point is that there hasn’t been a time where some form of RPC wasn’t in fairly common use since at least the early 90s.
The details change, and each one tries to solve the problems of the past (typically by inventing exciting new problems), but conceptually none of these things are _that_ different.
I may have completely missed a generation of RPC tooling. I was thinking specifically about web development in this context, but in general I don't remember hearing anything about RPC use between the early 2000s and mid to late teens (other than legacy systems maybe).
Web frameworks barely even abstract much. You still spend so much time marshaling things in and out of strings everywhere, and cramming information into URLs.
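For example, the usual ritual, sketched here with invented names: every value crosses the framework boundary as a string and has to be parsed back by hand on the other side.

```typescript
// Route and query parameters all arrive as strings, so each handler
// re-does the same marshaling by hand before it can do any real work.
function handleListOrders(url: URL) {
  const page = parseInt(url.searchParams.get("page") ?? "1", 10);
  const activeOnly = url.searchParams.get("active") === "true";
  const ids = (url.searchParams.get("ids") ?? "")
    .split(",")
    .filter(Boolean)
    .map(Number);
  return { page, activeOnly, ids }; // finally, typed values again
}

handleListOrders(new URL("https://example.com/orders?page=2&active=true&ids=3,5,8"));
```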
Mono-repos are now coming back with a "hipster" shine to them, with fancy in-repo build systems and what not.
What's funny about this example is that it's arguably not even that much of a time-difference between the two epochs of forgetting and re-learning. It's just that everyone jumped on the microservices bandwagon so much that they couldn't deal with it in a mono-repo context, so they dumped it and convinced the world that many smaller repos was "better". Then they learnt the hard lessons of distributed and complicated version dependencies and coordinating that across many teams and deployments. Their answer to this? Not back to mono-repos, no no no, semantic versioning dude, it's the hip new thing! When that was a bust and no one could get around to being convinced of using it "the right way", they were forced to begrudgingly acknowledge the value of mono-repos. But not before they made a whole little mini-industry of new build or dependency systems to "support" mono-repos as if they're just lots of little repos all under a single version-controlled repo.
These days I get this kind of stuff: "Hey you guys wrote this neat module as part of your project, can you separate it out and we can both share it as a dependency? Because, you know, it's a separate little mini-something inside of their codebase." ...Only to then be told that separating it out would "ruin" their "developer experience" and people would have to, gasp, manage it as a dependency instead of having it in their repo.
/rant. It's really hard not to be shocked and disgusted at this level of industry-level brain rot. I never thought I'd be "that guy" complaining about my lawn, but seriously, our industry is messed up and driven by way too many JS hipsters and their github-resume-based-development.
This is kinda why I really, really dislike the "social coding" meme that went around in the 2010s.
I get it, it's a team sport. It's just that the more people you put on your "team", the less agency everyone feels, because responsibility gets diffused and it becomes more about the "team" and less about actually doing the thing.
It's not the cycling that's the problem but that one can do nothing to stop it. People (perhaps including me) are dumb and insist on learning by making the mistake.
> Only to then be told that separating it out would "ruin" their "developer experience" and people would have to, gasp, manage it as a dependency instead of having it in their repo.
I hate having to do this, because then I have to get Nexus working with whatever the package manager in question is (Maven, npm, pip, and NuGet all have different ways of publishing packages), set up CI for the publishing, and god forbid I also need to manage the Nexus credentials for local installs, and possibly even end up with a Git submodule somewhere in specific cases, which also confuses some tooling like GitKraken sometimes.
It does prove your point, but honestly dependency management is a pain and I wish it wasn’t so; separating a module from your main codebase and publishing it as a package should be no harder than renaming a class file.