fraber's comments

Did you know Buddhism describes a “layer stack” of the mind, complete with a loss/reward function—much like AI and machine learning?

Back in 2015, my friend Brian Fenton and I published an AI article (https://link.springer.com/chapter/10.1007/978-3-319-21365-1_...) at the AGI conference in Berlin on a cognitive architecture with self-models at multiple levels—allowing an AI to reflect on its own states and capabilities. But since we wrote it in highly technical language for an AI audience, it flew under the radar.

Today, I have published an article on the same topic—this time from a Buddhist perspective (https://medium.com/@frankber/buddhas-layer-stack-software-ar...). This approach lets me explore introspection and consciousness, ideas that don’t always fit within the constraints of scientific papers.


AOLServer (now Naviserver again) is still the base for the largest open-source project management system: ]project-open[ (https://www.project-open.com/)


Interesting, didn't expect it to still be around.


There was a massively parallel implementation of Prolog running on literally 256 processors: BA-Prolog (http://fraber.de/bap/). Unfortunately, the hardware platform was abandoned some years later by Inmos.


Prolog is very, very dead. I love Prolog with all my heart, but it excels at problems that are solved today much more efficiently using neural networks. So it's utterly obsolete.

The issue with Prolog is that you have to code your rules manually. Doing ML with Prolog is possible, but very clumsy. Better to stick to Python.
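To illustrate what "coding your rules manually" means: every fact and every rule has to be written out by hand, whereas an ML model would learn the equivalent from data. A toy sketch (all predicate names here are invented for illustration):

```prolog
% Hand-coded rules: the programmer, not the data, supplies the logic.
mammal(X) :- has_fur(X), warm_blooded(X).
bird(X)   :- has_feathers(X), lays_eggs(X).

% Hand-coded facts for each individual case.
has_fur(dog).        warm_blooded(dog).
has_feathers(duck).  lays_eggs(duck).

% ?- mammal(dog).   succeeds
% ?- bird(duck).    succeeds
```

This scales fine for crisp domains, but becomes unmanageable when the rules are fuzzy or numerous, which is exactly where statistical ML took over.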

Speed is irrelevant, because most problems suitable for Prolog are exponential anyway. The choice of implementation is also irrelevant: SWI-Prolog does everything you need, with good integrations, except that it's a bit slower. But that's irrelevant too, see above.

Learning Prolog is a great experience for any advanced computer science student. It's amazing, isn't it?


Prolog was never good at the things people thought it would be good at, like AI, which is better done today with ML—specifically, as you said, often with NNs. But it turned out to be good for other things, and those use cases are still alive today, even though there are many competitors. Look at the TIOBE index: Prolog's usage has held constant at just under 1 percent for decades. So it's good for something.


Prolog was never designed for function approximation, like neural nets, so there is no comparison. Machine learning with Prolog is perfectly possible and not at all clumsy. In fact, these days we can even say it is done elegantly, by raising everything to second-order logic, where deduction and induction become one and the same.

Let me know if you need links and refs, but please try to keep your knowledge up-to-date before making big, splashy statements like "Prolog is very, very dead".


The following post is from October 2020, but I remember having used an interface to PostgreSQL more than 10 years ago: https://swi-prolog.discourse.group/t/swi-prolog-connecting-t... For those who don't get it: side effects are what give Prolog its power—just don't use them if you don't want them...
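For the curious, a minimal sketch of what such a connection looks like with SWI-Prolog's ODBC package, assuming an ODBC data source pointing at PostgreSQL has already been configured (the DSN name 'pg_demo', the credentials, and the table/column names here are all invented for illustration):

```prolog
% Query PostgreSQL from SWI-Prolog via library(odbc).
:- use_module(library(odbc)).

list_projects :-
    odbc_connect('pg_demo', Conn, [user(frank), password(secret)]),
    % odbc_query/3 enumerates result rows on backtracking.
    forall(odbc_query(Conn, 'SELECT name FROM projects', row(Name)),
           format('~w~n', [Name])),
    odbc_disconnect(Conn).
```

The SELECT itself is of course a side effect from Prolog's point of view, which is exactly the point being made above.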

