
> We've been doing concurrency for many years now; it's called pthreads.

Sometimes, the major advances come when fresh ideas are infused from the outside. In Darwin's case it was his geological work that inspired his theory. In concurrency maybe it will be ideas from neuroscience.

> No radically different solution to concurrency is magically going to appear tomorrow: programmers _need_ to understand concurrency, and work with existing systems.

The environment is changing. Around 2007 the oxygen levels started rising, so to speak: single-threaded CPU scaling hit the wall, going from doubling every 2 years to a few % of improvement per year.

We are only at the beginning of this paradigm shift to massively multi-core CPUs. Both the tools and the theory are still in their infancy. In HW there are many promising advances being explored, such as GPUs, Intel's Xeon Phi, new FPGAs, and projects like Parallella.

The software side also requires new tools to drive these new technologies. Maybe a radical new idea will appear, but more likely some evolved form of the CSP, functional, flow-based, and/or reactive programming models from the 70s, ones that didn't fit the HW environment of their time, will fill this new niche.
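
To make the CSP lineage concrete, here's a minimal sketch in Go, the most mainstream CSP descendant today. The worker/squaring setup is invented purely for illustration, not taken from anything above:

  package main

  import "fmt"

  // worker is CSP in miniature: it communicates over channels
  // instead of sharing memory, so no locks are needed.
  func worker(jobs <-chan int, results chan<- int) {
    for j := range jobs {
      results <- j * j // stand-in for real work
    }
  }

  func main() {
    jobs := make(chan int)
    results := make(chan int)

    // A handful of workers; goroutines are cheap enough that
    // this could just as well be thousands.
    for w := 0; w < 4; w++ {
      go worker(jobs, results)
    }

    go func() {
      for j := 1; j <= 8; j++ {
        jobs <- j
      }
      close(jobs)
    }()

    for i := 0; i < 8; i++ {
      fmt.Println(<-results)
    }
  }

The primitives are straight out of Hoare's 1978 CSP paper; what changed is the hardware underneath them.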

For a more radical example, one of the smartest guys I know is working in neuromorphic engineering, building an ASIC with thousands of cores now that may evolve to millions, or even billions. If this trilobite emerges on top, whatever language is used to program it might have been terrible in the 70s, or for your "existing systems", but it may be the future of programming.



> Sometimes, the major advances come when fresh ideas are infused from the outside.

I agree with this largely; over-specialization leads to myopia (often accompanied by emotional attachment to one's work).

> In Darwin's case it was his geological work that inspired his theory.

If you read On the Origin of Species, you'll see that Darwin started from very simple observations about cross-pollination leading to hybrid plant strains. He spent years studying various species of animals. In the book, he begins very modestly, proceeding step by step from his Christian foundations, without making any outrageous claims. The fossils he collected on his Beagle expedition sparked his interest in the field and served as good evidence for his theory.

> In concurrency maybe it will be ideas from neuroscience.

Unlikely, considering how little we know about the neocortex. The brain is not primarily a computation machine at all; it's a hierarchical memory system that makes mild extrapolations. There is some interest in applying what we know to computer science, but I've not seen anything concrete so far (read: code, not abstract papers).

> We are only at the beginning of this paradigm shift to massively multi-core CPUs.

From the manufacturing point of view, it makes the most sense. It's probably too expensive to design and manufacture a single core in which all the transistors dance to a very high clock frequency, not to mention the power consumption, heat dissipation, and failures. With multiple cores, you have the flexibility to switch off a few to save power, run them at different clock speeds, and cope with failures. Even from Linux's point of view, scheduling tons of routines on one core can get very complicated.
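
A toy way to see that last point from userland, if you have Go handy (the spin workload is made up, and the numbers vary wildly by machine): cap the scheduler at one core, then let it use them all, and time the same batch of goroutines.

  package main

  import (
    "fmt"
    "runtime"
    "sync"
    "time"
  )

  // spin burns CPU so the scheduler has real work to place on cores.
  func spin(n int) int {
    s := 0
    for i := 0; i < n; i++ {
      s += i
    }
    return s
  }

  // run times a fixed batch of goroutines with the scheduler
  // limited to the given number of cores.
  func run(cores int) time.Duration {
    runtime.GOMAXPROCS(cores)
    start := time.Now()
    var wg sync.WaitGroup
    for g := 0; g < 8; g++ {
      wg.Add(1)
      go func() {
        defer wg.Done()
        spin(100000000)
      }()
    }
    wg.Wait()
    return time.Since(start)
  }

  func main() {
    fmt.Println("1 core: ", run(1))
    fmt.Println(runtime.NumCPU(), "cores:", run(runtime.NumCPU()))
  }

On a CPU-bound batch like this, the multi-core run should finish several times faster; the per-core clock barely matters anymore.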

> In HW there are many promising advances being explored, such as GPUs, Intel's Xeon Phi, new FPGAs, and projects like Parallella.

Of course, but I don't speculate much about the distant future. The fact of the matter is that silicon-based x86 CPUs will rule commodity hardware for the foreseeable future.

> [...]

All this speculation is fine. Nothing is going to happen overnight; in the best case, we'll see an announcement about a new concurrent language on HN tomorrow, which might turn into a real language with users after 10 years of work ;) I'll probably participate and write patches for it.

For the record, Go (which is considered "new") is over 5 years old now.


I think you missed my point about Darwin. Darwin was inspired by the geological theory of gradualism, in which small changes accumulate over long time periods. It was this outside theory, applied to biology, that helped him shape his radical new theory.

Right now threads are the only game in town, and I think you're right: for existing hardware, there probably won't be any magic solution, at least not one without a major tradeoff like the performance hit you take with Erlang.

I was thinking about neuromorphic hardware when I mentioned neuroscience. From what I hear, the software side there is more analogous to an HDL.

Go is a great stopgap for existing thread-based HW. But if the goal is to achieve strong AI, we're going to need some outside inspiration, possibly from a hierarchical memory system, a massively parallel one.

I wish I could offer less speculation and more solid ideas. Hopefully someone here on HN will. I think that was the point of the video: to inspire.



