Well, there must be a reason why research keeps discovering new ways of computing and programming while the industry is stuck with the outdated methods.
I loved that movie, but I don't think it's very relevant here. I mean, you can rediscover and read any literature written in the 1920s or the 1890s, which is exactly what our field is not doing.
It's simple: industry and academia are too far apart today.
Look at the languages that come out of academia, then look at the languages invented over the last few decades that have actually gained traction. The latter list includes a lot of crazy items: things like Perl, PHP, JavaScript, Ruby, and Python.
Some of them have their merits, but for the most part they are hugely flawed, in some cases bordering on fundamentally broken. But what do they have in common? They were all invented by people who needed to solve immediate problems, and they are all designed to solve practical problems. Interestingly, Python was invented while its author was working for a research institute, but it was a side project.
The point being: languages invented by research organizations tend to be too distanced from the real-world needs of everyday programmers to be even remotely practical. Which is why almost all of the new languages invented over the past three decades that have become popular were created either by a single person or by industry.
LLVM and Scala come to mind as PL projects born in academia that have enjoyed wide adoption. Not all researchers are interested in solving the "real problems out there", but some are, and are successful at it.